Sep 30 20:15:06 localhost kernel: Linux version 5.14.0-617.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-11), GNU ld version 2.35.2-67.el9) #1 SMP PREEMPT_DYNAMIC Mon Sep 15 21:46:13 UTC 2025
Sep 30 20:15:06 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Sep 30 20:15:06 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-617.el9.x86_64 root=UUID=d6a81468-b74c-4055-b485-def635ab40f8 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Sep 30 20:15:06 localhost kernel: BIOS-provided physical RAM map:
Sep 30 20:15:06 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Sep 30 20:15:06 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Sep 30 20:15:06 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Sep 30 20:15:06 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Sep 30 20:15:06 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Sep 30 20:15:06 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Sep 30 20:15:06 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Sep 30 20:15:06 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Sep 30 20:15:06 localhost kernel: NX (Execute Disable) protection: active
Sep 30 20:15:06 localhost kernel: APIC: Static calls initialized
Sep 30 20:15:06 localhost kernel: SMBIOS 2.8 present.
Sep 30 20:15:06 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Sep 30 20:15:06 localhost kernel: Hypervisor detected: KVM
Sep 30 20:15:06 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Sep 30 20:15:06 localhost kernel: kvm-clock: using sched offset of 4504160270 cycles
Sep 30 20:15:06 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Sep 30 20:15:06 localhost kernel: tsc: Detected 2800.000 MHz processor
Sep 30 20:15:06 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Sep 30 20:15:06 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Sep 30 20:15:06 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Sep 30 20:15:06 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Sep 30 20:15:06 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Sep 30 20:15:06 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Sep 30 20:15:06 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Sep 30 20:15:06 localhost kernel: Using GB pages for direct mapping
Sep 30 20:15:06 localhost kernel: RAMDISK: [mem 0x2d7d0000-0x32bdffff]
Sep 30 20:15:06 localhost kernel: ACPI: Early table checksum verification disabled
Sep 30 20:15:06 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Sep 30 20:15:06 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Sep 30 20:15:06 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Sep 30 20:15:06 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Sep 30 20:15:06 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Sep 30 20:15:06 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Sep 30 20:15:06 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Sep 30 20:15:06 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Sep 30 20:15:06 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Sep 30 20:15:06 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Sep 30 20:15:06 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Sep 30 20:15:06 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Sep 30 20:15:06 localhost kernel: No NUMA configuration found
Sep 30 20:15:06 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Sep 30 20:15:06 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd3000-0x23fffdfff]
Sep 30 20:15:06 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Sep 30 20:15:06 localhost kernel: Zone ranges:
Sep 30 20:15:06 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Sep 30 20:15:06 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Sep 30 20:15:06 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Sep 30 20:15:06 localhost kernel:   Device   empty
Sep 30 20:15:06 localhost kernel: Movable zone start for each node
Sep 30 20:15:06 localhost kernel: Early memory node ranges
Sep 30 20:15:06 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Sep 30 20:15:06 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Sep 30 20:15:06 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Sep 30 20:15:06 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Sep 30 20:15:06 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Sep 30 20:15:06 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Sep 30 20:15:06 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Sep 30 20:15:06 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Sep 30 20:15:06 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Sep 30 20:15:06 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Sep 30 20:15:06 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Sep 30 20:15:06 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Sep 30 20:15:06 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Sep 30 20:15:06 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Sep 30 20:15:06 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Sep 30 20:15:06 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Sep 30 20:15:06 localhost kernel: TSC deadline timer available
Sep 30 20:15:06 localhost kernel: CPU topo: Max. logical packages:   8
Sep 30 20:15:06 localhost kernel: CPU topo: Max. logical dies:       8
Sep 30 20:15:06 localhost kernel: CPU topo: Max. dies per package:   1
Sep 30 20:15:06 localhost kernel: CPU topo: Max. threads per core:   1
Sep 30 20:15:06 localhost kernel: CPU topo: Num. cores per package:     1
Sep 30 20:15:06 localhost kernel: CPU topo: Num. threads per package:   1
Sep 30 20:15:06 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Sep 30 20:15:06 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Sep 30 20:15:06 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Sep 30 20:15:06 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Sep 30 20:15:06 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Sep 30 20:15:06 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Sep 30 20:15:06 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Sep 30 20:15:06 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Sep 30 20:15:06 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Sep 30 20:15:06 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Sep 30 20:15:06 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Sep 30 20:15:06 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Sep 30 20:15:06 localhost kernel: Booting paravirtualized kernel on KVM
Sep 30 20:15:06 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Sep 30 20:15:06 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Sep 30 20:15:06 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Sep 30 20:15:06 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Sep 30 20:15:06 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Sep 30 20:15:06 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Sep 30 20:15:06 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-617.el9.x86_64 root=UUID=d6a81468-b74c-4055-b485-def635ab40f8 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Sep 30 20:15:06 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-617.el9.x86_64", will be passed to user space.
Sep 30 20:15:06 localhost kernel: random: crng init done
Sep 30 20:15:06 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Sep 30 20:15:06 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Sep 30 20:15:06 localhost kernel: Fallback order for Node 0: 0 
Sep 30 20:15:06 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Sep 30 20:15:06 localhost kernel: Policy zone: Normal
Sep 30 20:15:06 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 30 20:15:06 localhost kernel: software IO TLB: area num 8.
Sep 30 20:15:06 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Sep 30 20:15:06 localhost kernel: ftrace: allocating 49329 entries in 193 pages
Sep 30 20:15:06 localhost kernel: ftrace: allocated 193 pages with 3 groups
Sep 30 20:15:06 localhost kernel: Dynamic Preempt: voluntary
Sep 30 20:15:06 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 30 20:15:06 localhost kernel: rcu:         RCU event tracing is enabled.
Sep 30 20:15:06 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Sep 30 20:15:06 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Sep 30 20:15:06 localhost kernel:         Rude variant of Tasks RCU enabled.
Sep 30 20:15:06 localhost kernel:         Tracing variant of Tasks RCU enabled.
Sep 30 20:15:06 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 30 20:15:06 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Sep 30 20:15:06 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Sep 30 20:15:06 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Sep 30 20:15:06 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Sep 30 20:15:06 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Sep 30 20:15:06 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Sep 30 20:15:06 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Sep 30 20:15:06 localhost kernel: Console: colour VGA+ 80x25
Sep 30 20:15:06 localhost kernel: printk: console [ttyS0] enabled
Sep 30 20:15:06 localhost kernel: ACPI: Core revision 20230331
Sep 30 20:15:06 localhost kernel: APIC: Switch to symmetric I/O mode setup
Sep 30 20:15:06 localhost kernel: x2apic enabled
Sep 30 20:15:06 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Sep 30 20:15:06 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Sep 30 20:15:06 localhost kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Sep 30 20:15:06 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Sep 30 20:15:06 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Sep 30 20:15:06 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Sep 30 20:15:06 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Sep 30 20:15:06 localhost kernel: Spectre V2 : Mitigation: Retpolines
Sep 30 20:15:06 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Sep 30 20:15:06 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Sep 30 20:15:06 localhost kernel: RETBleed: Mitigation: untrained return thunk
Sep 30 20:15:06 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Sep 30 20:15:06 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Sep 30 20:15:06 localhost kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Sep 30 20:15:06 localhost kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Sep 30 20:15:06 localhost kernel: x86/bugs: return thunk changed
Sep 30 20:15:06 localhost kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Sep 30 20:15:06 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Sep 30 20:15:06 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Sep 30 20:15:06 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Sep 30 20:15:06 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Sep 30 20:15:06 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Sep 30 20:15:06 localhost kernel: Freeing SMP alternatives memory: 40K
Sep 30 20:15:06 localhost kernel: pid_max: default: 32768 minimum: 301
Sep 30 20:15:06 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Sep 30 20:15:06 localhost kernel: landlock: Up and running.
Sep 30 20:15:06 localhost kernel: Yama: becoming mindful.
Sep 30 20:15:06 localhost kernel: SELinux:  Initializing.
Sep 30 20:15:06 localhost kernel: LSM support for eBPF active
Sep 30 20:15:06 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Sep 30 20:15:06 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Sep 30 20:15:06 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Sep 30 20:15:06 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Sep 30 20:15:06 localhost kernel: ... version:                0
Sep 30 20:15:06 localhost kernel: ... bit width:              48
Sep 30 20:15:06 localhost kernel: ... generic registers:      6
Sep 30 20:15:06 localhost kernel: ... value mask:             0000ffffffffffff
Sep 30 20:15:06 localhost kernel: ... max period:             00007fffffffffff
Sep 30 20:15:06 localhost kernel: ... fixed-purpose events:   0
Sep 30 20:15:06 localhost kernel: ... event mask:             000000000000003f
Sep 30 20:15:06 localhost kernel: signal: max sigframe size: 1776
Sep 30 20:15:06 localhost kernel: rcu: Hierarchical SRCU implementation.
Sep 30 20:15:06 localhost kernel: rcu:         Max phase no-delay instances is 400.
Sep 30 20:15:06 localhost kernel: smp: Bringing up secondary CPUs ...
Sep 30 20:15:06 localhost kernel: smpboot: x86: Booting SMP configuration:
Sep 30 20:15:06 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Sep 30 20:15:06 localhost kernel: smp: Brought up 1 node, 8 CPUs
Sep 30 20:15:06 localhost kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Sep 30 20:15:06 localhost kernel: node 0 deferred pages initialised in 20ms
Sep 30 20:15:06 localhost kernel: Memory: 7765412K/8388068K available (16384K kernel code, 5784K rwdata, 13988K rodata, 4072K init, 7304K bss, 616488K reserved, 0K cma-reserved)
Sep 30 20:15:06 localhost kernel: devtmpfs: initialized
Sep 30 20:15:06 localhost kernel: x86/mm: Memory block size: 128MB
Sep 30 20:15:06 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 30 20:15:06 localhost kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Sep 30 20:15:06 localhost kernel: pinctrl core: initialized pinctrl subsystem
Sep 30 20:15:06 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 30 20:15:06 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Sep 30 20:15:06 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Sep 30 20:15:06 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Sep 30 20:15:06 localhost kernel: audit: initializing netlink subsys (disabled)
Sep 30 20:15:06 localhost kernel: audit: type=2000 audit(1759263304.584:1): state=initialized audit_enabled=0 res=1
Sep 30 20:15:06 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Sep 30 20:15:06 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 30 20:15:06 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Sep 30 20:15:06 localhost kernel: cpuidle: using governor menu
Sep 30 20:15:06 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 30 20:15:06 localhost kernel: PCI: Using configuration type 1 for base access
Sep 30 20:15:06 localhost kernel: PCI: Using configuration type 1 for extended access
Sep 30 20:15:06 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Sep 30 20:15:06 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 30 20:15:06 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Sep 30 20:15:06 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 30 20:15:06 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Sep 30 20:15:06 localhost kernel: Demotion targets for Node 0: null
Sep 30 20:15:06 localhost kernel: cryptd: max_cpu_qlen set to 1000
Sep 30 20:15:06 localhost kernel: ACPI: Added _OSI(Module Device)
Sep 30 20:15:06 localhost kernel: ACPI: Added _OSI(Processor Device)
Sep 30 20:15:06 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Sep 30 20:15:06 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 30 20:15:06 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 30 20:15:06 localhost kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Sep 30 20:15:06 localhost kernel: ACPI: Interpreter enabled
Sep 30 20:15:06 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Sep 30 20:15:06 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Sep 30 20:15:06 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Sep 30 20:15:06 localhost kernel: PCI: Using E820 reservations for host bridge windows
Sep 30 20:15:06 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Sep 30 20:15:06 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Sep 30 20:15:06 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [3] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [4] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [5] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [6] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [7] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [8] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [9] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [10] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [11] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [12] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [13] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [14] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [15] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [16] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [17] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [18] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [19] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [20] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [21] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [22] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [23] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [24] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [25] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [26] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [27] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [28] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [29] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [30] registered
Sep 30 20:15:06 localhost kernel: acpiphp: Slot [31] registered
Sep 30 20:15:06 localhost kernel: PCI host bridge to bus 0000:00
Sep 30 20:15:06 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Sep 30 20:15:06 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Sep 30 20:15:06 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Sep 30 20:15:06 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Sep 30 20:15:06 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Sep 30 20:15:06 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Sep 30 20:15:06 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Sep 30 20:15:06 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Sep 30 20:15:06 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Sep 30 20:15:06 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Sep 30 20:15:06 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Sep 30 20:15:06 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Sep 30 20:15:06 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Sep 30 20:15:06 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Sep 30 20:15:06 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Sep 30 20:15:06 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Sep 30 20:15:06 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Sep 30 20:15:06 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Sep 30 20:15:06 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Sep 30 20:15:06 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Sep 30 20:15:06 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Sep 30 20:15:06 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Sep 30 20:15:06 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Sep 30 20:15:06 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Sep 30 20:15:06 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Sep 30 20:15:06 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Sep 30 20:15:06 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Sep 30 20:15:06 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Sep 30 20:15:06 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Sep 30 20:15:06 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Sep 30 20:15:06 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Sep 30 20:15:06 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Sep 30 20:15:06 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Sep 30 20:15:06 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Sep 30 20:15:06 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Sep 30 20:15:06 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Sep 30 20:15:06 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Sep 30 20:15:06 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Sep 30 20:15:06 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Sep 30 20:15:06 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Sep 30 20:15:06 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Sep 30 20:15:06 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Sep 30 20:15:06 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Sep 30 20:15:06 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Sep 30 20:15:06 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Sep 30 20:15:06 localhost kernel: iommu: Default domain type: Translated
Sep 30 20:15:06 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Sep 30 20:15:06 localhost kernel: SCSI subsystem initialized
Sep 30 20:15:06 localhost kernel: ACPI: bus type USB registered
Sep 30 20:15:06 localhost kernel: usbcore: registered new interface driver usbfs
Sep 30 20:15:06 localhost kernel: usbcore: registered new interface driver hub
Sep 30 20:15:06 localhost kernel: usbcore: registered new device driver usb
Sep 30 20:15:06 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Sep 30 20:15:06 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Sep 30 20:15:06 localhost kernel: PTP clock support registered
Sep 30 20:15:06 localhost kernel: EDAC MC: Ver: 3.0.0
Sep 30 20:15:06 localhost kernel: NetLabel: Initializing
Sep 30 20:15:06 localhost kernel: NetLabel:  domain hash size = 128
Sep 30 20:15:06 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Sep 30 20:15:06 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Sep 30 20:15:06 localhost kernel: PCI: Using ACPI for IRQ routing
Sep 30 20:15:06 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Sep 30 20:15:06 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Sep 30 20:15:06 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Sep 30 20:15:06 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Sep 30 20:15:06 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Sep 30 20:15:06 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Sep 30 20:15:06 localhost kernel: vgaarb: loaded
Sep 30 20:15:06 localhost kernel: clocksource: Switched to clocksource kvm-clock
Sep 30 20:15:06 localhost kernel: VFS: Disk quotas dquot_6.6.0
Sep 30 20:15:06 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Sep 30 20:15:06 localhost kernel: pnp: PnP ACPI init
Sep 30 20:15:06 localhost kernel: pnp 00:03: [dma 2]
Sep 30 20:15:06 localhost kernel: pnp: PnP ACPI: found 5 devices
Sep 30 20:15:06 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Sep 30 20:15:06 localhost kernel: NET: Registered PF_INET protocol family
Sep 30 20:15:06 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Sep 30 20:15:06 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Sep 30 20:15:06 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Sep 30 20:15:06 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Sep 30 20:15:06 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Sep 30 20:15:06 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Sep 30 20:15:06 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Sep 30 20:15:06 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Sep 30 20:15:06 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Sep 30 20:15:06 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Sep 30 20:15:06 localhost kernel: NET: Registered PF_XDP protocol family
Sep 30 20:15:06 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Sep 30 20:15:06 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Sep 30 20:15:06 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Sep 30 20:15:06 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Sep 30 20:15:06 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Sep 30 20:15:06 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Sep 30 20:15:06 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Sep 30 20:15:06 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Sep 30 20:15:06 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x140 took 99638 usecs
Sep 30 20:15:06 localhost kernel: PCI: CLS 0 bytes, default 64
Sep 30 20:15:06 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Sep 30 20:15:06 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Sep 30 20:15:06 localhost kernel: ACPI: bus type thunderbolt registered
Sep 30 20:15:06 localhost kernel: Trying to unpack rootfs image as initramfs...
Sep 30 20:15:06 localhost kernel: Initialise system trusted keyrings
Sep 30 20:15:06 localhost kernel: Key type blacklist registered
Sep 30 20:15:06 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Sep 30 20:15:06 localhost kernel: zbud: loaded
Sep 30 20:15:06 localhost kernel: integrity: Platform Keyring initialized
Sep 30 20:15:06 localhost kernel: integrity: Machine keyring initialized
Sep 30 20:15:06 localhost kernel: Freeing initrd memory: 86080K
Sep 30 20:15:06 localhost kernel: NET: Registered PF_ALG protocol family
Sep 30 20:15:06 localhost kernel: xor: automatically using best checksumming function   avx       
Sep 30 20:15:06 localhost kernel: Key type asymmetric registered
Sep 30 20:15:06 localhost kernel: Asymmetric key parser 'x509' registered
Sep 30 20:15:06 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Sep 30 20:15:06 localhost kernel: io scheduler mq-deadline registered
Sep 30 20:15:06 localhost kernel: io scheduler kyber registered
Sep 30 20:15:06 localhost kernel: io scheduler bfq registered
Sep 30 20:15:06 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Sep 30 20:15:06 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Sep 30 20:15:06 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Sep 30 20:15:06 localhost kernel: ACPI: button: Power Button [PWRF]
Sep 30 20:15:06 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Sep 30 20:15:06 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Sep 30 20:15:06 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Sep 30 20:15:06 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Sep 30 20:15:06 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Sep 30 20:15:06 localhost kernel: Non-volatile memory driver v1.3
Sep 30 20:15:06 localhost kernel: rdac: device handler registered
Sep 30 20:15:06 localhost kernel: hp_sw: device handler registered
Sep 30 20:15:06 localhost kernel: emc: device handler registered
Sep 30 20:15:06 localhost kernel: alua: device handler registered
Sep 30 20:15:06 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Sep 30 20:15:06 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Sep 30 20:15:06 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Sep 30 20:15:06 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Sep 30 20:15:06 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Sep 30 20:15:06 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Sep 30 20:15:06 localhost kernel: usb usb1: Product: UHCI Host Controller
Sep 30 20:15:06 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-617.el9.x86_64 uhci_hcd
Sep 30 20:15:06 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Sep 30 20:15:06 localhost kernel: hub 1-0:1.0: USB hub found
Sep 30 20:15:06 localhost kernel: hub 1-0:1.0: 2 ports detected
Sep 30 20:15:06 localhost kernel: usbcore: registered new interface driver usbserial_generic
Sep 30 20:15:06 localhost kernel: usbserial: USB Serial support registered for generic
Sep 30 20:15:06 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Sep 30 20:15:06 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Sep 30 20:15:06 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Sep 30 20:15:06 localhost kernel: mousedev: PS/2 mouse device common for all mice
Sep 30 20:15:06 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Sep 30 20:15:06 localhost kernel: rtc_cmos 00:04: registered as rtc0
Sep 30 20:15:06 localhost kernel: rtc_cmos 00:04: setting system clock to 2025-09-30T20:15:05 UTC (1759263305)
Sep 30 20:15:06 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Sep 30 20:15:06 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Sep 30 20:15:06 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Sep 30 20:15:06 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Sep 30 20:15:06 localhost kernel: usbcore: registered new interface driver usbhid
Sep 30 20:15:06 localhost kernel: usbhid: USB HID core driver
Sep 30 20:15:06 localhost kernel: drop_monitor: Initializing network drop monitor service
Sep 30 20:15:06 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Sep 30 20:15:06 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Sep 30 20:15:06 localhost kernel: Initializing XFRM netlink socket
Sep 30 20:15:06 localhost kernel: NET: Registered PF_INET6 protocol family
Sep 30 20:15:06 localhost kernel: Segment Routing with IPv6
Sep 30 20:15:06 localhost kernel: NET: Registered PF_PACKET protocol family
Sep 30 20:15:06 localhost kernel: mpls_gso: MPLS GSO support
Sep 30 20:15:06 localhost kernel: IPI shorthand broadcast: enabled
Sep 30 20:15:06 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Sep 30 20:15:06 localhost kernel: AES CTR mode by8 optimization enabled
Sep 30 20:15:06 localhost kernel: sched_clock: Marking stable (1224001600, 149801530)->(1522455940, -148652810)
Sep 30 20:15:06 localhost kernel: registered taskstats version 1
Sep 30 20:15:06 localhost kernel: Loading compiled-in X.509 certificates
Sep 30 20:15:06 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: bb2966091bafcba340f8183756023c985dcc8fe9'
Sep 30 20:15:06 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Sep 30 20:15:06 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Sep 30 20:15:06 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Sep 30 20:15:06 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Sep 30 20:15:06 localhost kernel: Demotion targets for Node 0: null
Sep 30 20:15:06 localhost kernel: page_owner is disabled
Sep 30 20:15:06 localhost kernel: Key type .fscrypt registered
Sep 30 20:15:06 localhost kernel: Key type fscrypt-provisioning registered
Sep 30 20:15:06 localhost kernel: Key type big_key registered
Sep 30 20:15:06 localhost kernel: Key type encrypted registered
Sep 30 20:15:06 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Sep 30 20:15:06 localhost kernel: Loading compiled-in module X.509 certificates
Sep 30 20:15:06 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: bb2966091bafcba340f8183756023c985dcc8fe9'
Sep 30 20:15:06 localhost kernel: ima: Allocated hash algorithm: sha256
Sep 30 20:15:06 localhost kernel: ima: No architecture policies found
Sep 30 20:15:06 localhost kernel: evm: Initialising EVM extended attributes:
Sep 30 20:15:06 localhost kernel: evm: security.selinux
Sep 30 20:15:06 localhost kernel: evm: security.SMACK64 (disabled)
Sep 30 20:15:06 localhost kernel: evm: security.SMACK64EXEC (disabled)
Sep 30 20:15:06 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Sep 30 20:15:06 localhost kernel: evm: security.SMACK64MMAP (disabled)
Sep 30 20:15:06 localhost kernel: evm: security.apparmor (disabled)
Sep 30 20:15:06 localhost kernel: evm: security.ima
Sep 30 20:15:06 localhost kernel: evm: security.capability
Sep 30 20:15:06 localhost kernel: evm: HMAC attrs: 0x1
Sep 30 20:15:06 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Sep 30 20:15:06 localhost kernel: Running certificate verification RSA selftest
Sep 30 20:15:06 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Sep 30 20:15:06 localhost kernel: Running certificate verification ECDSA selftest
Sep 30 20:15:06 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Sep 30 20:15:06 localhost kernel: clk: Disabling unused clocks
Sep 30 20:15:06 localhost kernel: Freeing unused decrypted memory: 2028K
Sep 30 20:15:06 localhost kernel: Freeing unused kernel image (initmem) memory: 4072K
Sep 30 20:15:06 localhost kernel: Write protecting the kernel read-only data: 30720k
Sep 30 20:15:06 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 348K
Sep 30 20:15:06 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Sep 30 20:15:06 localhost kernel: Run /init as init process
Sep 30 20:15:06 localhost kernel:   with arguments:
Sep 30 20:15:06 localhost kernel:     /init
Sep 30 20:15:06 localhost kernel:   with environment:
Sep 30 20:15:06 localhost kernel:     HOME=/
Sep 30 20:15:06 localhost kernel:     TERM=linux
Sep 30 20:15:06 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-617.el9.x86_64
Sep 30 20:15:06 localhost systemd[1]: systemd 252-55.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Sep 30 20:15:06 localhost systemd[1]: Detected virtualization kvm.
Sep 30 20:15:06 localhost systemd[1]: Detected architecture x86-64.
Sep 30 20:15:06 localhost systemd[1]: Running in initrd.
Sep 30 20:15:06 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Sep 30 20:15:06 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Sep 30 20:15:06 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Sep 30 20:15:06 localhost kernel: usb 1-1: Manufacturer: QEMU
Sep 30 20:15:06 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Sep 30 20:15:06 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Sep 30 20:15:06 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Sep 30 20:15:06 localhost systemd[1]: No hostname configured, using default hostname.
Sep 30 20:15:06 localhost systemd[1]: Hostname set to <localhost>.
Sep 30 20:15:06 localhost systemd[1]: Initializing machine ID from VM UUID.
Sep 30 20:15:06 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Sep 30 20:15:06 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Sep 30 20:15:06 localhost systemd[1]: Reached target Local Encrypted Volumes.
Sep 30 20:15:06 localhost systemd[1]: Reached target Initrd /usr File System.
Sep 30 20:15:06 localhost systemd[1]: Reached target Local File Systems.
Sep 30 20:15:06 localhost systemd[1]: Reached target Path Units.
Sep 30 20:15:06 localhost systemd[1]: Reached target Slice Units.
Sep 30 20:15:06 localhost systemd[1]: Reached target Swaps.
Sep 30 20:15:06 localhost systemd[1]: Reached target Timer Units.
Sep 30 20:15:06 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Sep 30 20:15:06 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Sep 30 20:15:06 localhost systemd[1]: Listening on Journal Socket.
Sep 30 20:15:06 localhost systemd[1]: Listening on udev Control Socket.
Sep 30 20:15:06 localhost systemd[1]: Listening on udev Kernel Socket.
Sep 30 20:15:06 localhost systemd[1]: Reached target Socket Units.
Sep 30 20:15:06 localhost systemd[1]: Starting Create List of Static Device Nodes...
Sep 30 20:15:06 localhost systemd[1]: Starting Journal Service...
Sep 30 20:15:06 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Sep 30 20:15:06 localhost systemd[1]: Starting Apply Kernel Variables...
Sep 30 20:15:06 localhost systemd[1]: Starting Create System Users...
Sep 30 20:15:06 localhost systemd[1]: Starting Setup Virtual Console...
Sep 30 20:15:06 localhost systemd[1]: Finished Create List of Static Device Nodes.
Sep 30 20:15:06 localhost systemd[1]: Finished Apply Kernel Variables.
Sep 30 20:15:06 localhost systemd-journald[310]: Journal started
Sep 30 20:15:06 localhost systemd-journald[310]: Runtime Journal (/run/log/journal/5322d6ed1aa0443a8ad6efe9323295c5) is 8.0M, max 153.5M, 145.5M free.
Sep 30 20:15:06 localhost systemd[1]: Started Journal Service.
Sep 30 20:15:06 localhost systemd-sysusers[313]: Creating group 'users' with GID 100.
Sep 30 20:15:06 localhost systemd-sysusers[313]: Creating group 'dbus' with GID 81.
Sep 30 20:15:06 localhost systemd-sysusers[313]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Sep 30 20:15:06 localhost systemd[1]: Finished Create System Users.
Sep 30 20:15:06 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Sep 30 20:15:06 localhost systemd[1]: Starting Create Volatile Files and Directories...
Sep 30 20:15:06 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Sep 30 20:15:06 localhost systemd[1]: Finished Create Volatile Files and Directories.
Sep 30 20:15:07 localhost systemd[1]: Finished Setup Virtual Console.
Sep 30 20:15:07 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Sep 30 20:15:07 localhost systemd[1]: Starting dracut cmdline hook...
Sep 30 20:15:07 localhost dracut-cmdline[329]: dracut-9 dracut-057-102.git20250818.el9
Sep 30 20:15:07 localhost dracut-cmdline[329]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-617.el9.x86_64 root=UUID=d6a81468-b74c-4055-b485-def635ab40f8 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Sep 30 20:15:07 localhost systemd[1]: Finished dracut cmdline hook.
Sep 30 20:15:07 localhost systemd[1]: Starting dracut pre-udev hook...
Sep 30 20:15:07 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Sep 30 20:15:07 localhost kernel: device-mapper: uevent: version 1.0.3
Sep 30 20:15:07 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Sep 30 20:15:07 localhost kernel: RPC: Registered named UNIX socket transport module.
Sep 30 20:15:07 localhost kernel: RPC: Registered udp transport module.
Sep 30 20:15:07 localhost kernel: RPC: Registered tcp transport module.
Sep 30 20:15:07 localhost kernel: RPC: Registered tcp-with-tls transport module.
Sep 30 20:15:07 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Sep 30 20:15:07 localhost rpc.statd[449]: Version 2.5.4 starting
Sep 30 20:15:07 localhost rpc.statd[449]: Initializing NSM state
Sep 30 20:15:07 localhost rpc.idmapd[454]: Setting log level to 0
Sep 30 20:15:07 localhost systemd[1]: Finished dracut pre-udev hook.
Sep 30 20:15:07 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Sep 30 20:15:07 localhost systemd-udevd[467]: Using default interface naming scheme 'rhel-9.0'.
Sep 30 20:15:07 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Sep 30 20:15:07 localhost systemd[1]: Starting dracut pre-trigger hook...
Sep 30 20:15:07 localhost systemd[1]: Finished dracut pre-trigger hook.
Sep 30 20:15:07 localhost systemd[1]: Starting Coldplug All udev Devices...
Sep 30 20:15:07 localhost systemd[1]: Created slice Slice /system/modprobe.
Sep 30 20:15:07 localhost systemd[1]: Starting Load Kernel Module configfs...
Sep 30 20:15:07 localhost systemd[1]: Finished Coldplug All udev Devices.
Sep 30 20:15:07 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 30 20:15:07 localhost systemd[1]: Finished Load Kernel Module configfs.
Sep 30 20:15:07 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Sep 30 20:15:07 localhost systemd[1]: Reached target Network.
Sep 30 20:15:07 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Sep 30 20:15:07 localhost systemd[1]: Starting dracut initqueue hook...
Sep 30 20:15:07 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Sep 30 20:15:07 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Sep 30 20:15:07 localhost kernel:  vda: vda1
Sep 30 20:15:07 localhost kernel: libata version 3.00 loaded.
Sep 30 20:15:07 localhost systemd[1]: Mounting Kernel Configuration File System...
Sep 30 20:15:07 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Sep 30 20:15:07 localhost systemd-udevd[498]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 20:15:07 localhost kernel: scsi host0: ata_piix
Sep 30 20:15:07 localhost kernel: scsi host1: ata_piix
Sep 30 20:15:07 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Sep 30 20:15:07 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Sep 30 20:15:07 localhost systemd[1]: Mounted Kernel Configuration File System.
Sep 30 20:15:07 localhost systemd[1]: Found device /dev/disk/by-uuid/d6a81468-b74c-4055-b485-def635ab40f8.
Sep 30 20:15:07 localhost systemd[1]: Reached target Initrd Root Device.
Sep 30 20:15:07 localhost systemd[1]: Reached target System Initialization.
Sep 30 20:15:07 localhost systemd[1]: Reached target Basic System.
Sep 30 20:15:07 localhost kernel: ata1: found unknown device (class 0)
Sep 30 20:15:07 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Sep 30 20:15:07 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Sep 30 20:15:07 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Sep 30 20:15:08 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Sep 30 20:15:08 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Sep 30 20:15:08 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Sep 30 20:15:08 localhost systemd[1]: Finished dracut initqueue hook.
Sep 30 20:15:08 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Sep 30 20:15:08 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Sep 30 20:15:08 localhost systemd[1]: Reached target Remote File Systems.
Sep 30 20:15:08 localhost systemd[1]: Starting dracut pre-mount hook...
Sep 30 20:15:08 localhost systemd[1]: Finished dracut pre-mount hook.
Sep 30 20:15:08 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/d6a81468-b74c-4055-b485-def635ab40f8...
Sep 30 20:15:08 localhost systemd-fsck[559]: /usr/sbin/fsck.xfs: XFS file system.
Sep 30 20:15:08 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/d6a81468-b74c-4055-b485-def635ab40f8.
Sep 30 20:15:08 localhost systemd[1]: Mounting /sysroot...
Sep 30 20:15:08 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Sep 30 20:15:08 localhost kernel: XFS (vda1): Mounting V5 Filesystem d6a81468-b74c-4055-b485-def635ab40f8
Sep 30 20:15:08 localhost kernel: XFS (vda1): Ending clean mount
Sep 30 20:15:08 localhost systemd[1]: Mounted /sysroot.
Sep 30 20:15:08 localhost systemd[1]: Reached target Initrd Root File System.
Sep 30 20:15:08 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Sep 30 20:15:08 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Sep 30 20:15:08 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Sep 30 20:15:08 localhost systemd[1]: Reached target Initrd File Systems.
Sep 30 20:15:08 localhost systemd[1]: Reached target Initrd Default Target.
Sep 30 20:15:08 localhost systemd[1]: Starting dracut mount hook...
Sep 30 20:15:08 localhost systemd[1]: Finished dracut mount hook.
Sep 30 20:15:08 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Sep 30 20:15:08 localhost rpc.idmapd[454]: exiting on signal 15
Sep 30 20:15:08 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Sep 30 20:15:08 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Sep 30 20:15:08 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Sep 30 20:15:08 localhost systemd[1]: Stopped target Network.
Sep 30 20:15:08 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Sep 30 20:15:08 localhost systemd[1]: Stopped target Timer Units.
Sep 30 20:15:08 localhost systemd[1]: dbus.socket: Deactivated successfully.
Sep 30 20:15:08 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Sep 30 20:15:09 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Sep 30 20:15:09 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Sep 30 20:15:09 localhost systemd[1]: Stopped target Initrd Default Target.
Sep 30 20:15:09 localhost systemd[1]: Stopped target Basic System.
Sep 30 20:15:09 localhost systemd[1]: Stopped target Initrd Root Device.
Sep 30 20:15:09 localhost systemd[1]: Stopped target Initrd /usr File System.
Sep 30 20:15:09 localhost systemd[1]: Stopped target Path Units.
Sep 30 20:15:09 localhost systemd[1]: Stopped target Remote File Systems.
Sep 30 20:15:09 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Sep 30 20:15:09 localhost systemd[1]: Stopped target Slice Units.
Sep 30 20:15:09 localhost systemd[1]: Stopped target Socket Units.
Sep 30 20:15:09 localhost systemd[1]: Stopped target System Initialization.
Sep 30 20:15:09 localhost systemd[1]: Stopped target Local File Systems.
Sep 30 20:15:09 localhost systemd[1]: Stopped target Swaps.
Sep 30 20:15:09 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Sep 30 20:15:09 localhost systemd[1]: Stopped dracut mount hook.
Sep 30 20:15:09 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Sep 30 20:15:09 localhost systemd[1]: Stopped dracut pre-mount hook.
Sep 30 20:15:09 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Sep 30 20:15:09 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Sep 30 20:15:09 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Sep 30 20:15:09 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Sep 30 20:15:09 localhost systemd[1]: Stopped dracut initqueue hook.
Sep 30 20:15:09 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Sep 30 20:15:09 localhost systemd[1]: Stopped Apply Kernel Variables.
Sep 30 20:15:09 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Sep 30 20:15:09 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Sep 30 20:15:09 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Sep 30 20:15:09 localhost systemd[1]: Stopped Coldplug All udev Devices.
Sep 30 20:15:09 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Sep 30 20:15:09 localhost systemd[1]: Stopped dracut pre-trigger hook.
Sep 30 20:15:09 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Sep 30 20:15:09 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 30 20:15:09 localhost systemd[1]: Stopped Setup Virtual Console.
Sep 30 20:15:09 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Sep 30 20:15:09 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Sep 30 20:15:09 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Sep 30 20:15:09 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Sep 30 20:15:09 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Sep 30 20:15:09 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Sep 30 20:15:09 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Sep 30 20:15:09 localhost systemd[1]: Closed udev Control Socket.
Sep 30 20:15:09 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Sep 30 20:15:09 localhost systemd[1]: Closed udev Kernel Socket.
Sep 30 20:15:09 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Sep 30 20:15:09 localhost systemd[1]: Stopped dracut pre-udev hook.
Sep 30 20:15:09 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Sep 30 20:15:09 localhost systemd[1]: Stopped dracut cmdline hook.
Sep 30 20:15:09 localhost systemd[1]: Starting Cleanup udev Database...
Sep 30 20:15:09 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Sep 30 20:15:09 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Sep 30 20:15:09 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Sep 30 20:15:09 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Sep 30 20:15:09 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Sep 30 20:15:09 localhost systemd[1]: Stopped Create System Users.
Sep 30 20:15:09 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Sep 30 20:15:09 localhost systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Sep 30 20:15:09 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Sep 30 20:15:09 localhost systemd[1]: Finished Cleanup udev Database.
Sep 30 20:15:09 localhost systemd[1]: Reached target Switch Root.
Sep 30 20:15:09 localhost systemd[1]: Starting Switch Root...
Sep 30 20:15:09 localhost systemd[1]: Switching root.
Sep 30 20:15:09 localhost systemd-journald[310]: Journal stopped
Sep 30 20:15:10 localhost systemd-journald[310]: Received SIGTERM from PID 1 (systemd).
Sep 30 20:15:10 localhost kernel: audit: type=1404 audit(1759263309.301:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Sep 30 20:15:10 localhost kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 20:15:10 localhost kernel: SELinux:  policy capability open_perms=1
Sep 30 20:15:10 localhost kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 20:15:10 localhost kernel: SELinux:  policy capability always_check_network=0
Sep 30 20:15:10 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 20:15:10 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 20:15:10 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 20:15:10 localhost kernel: audit: type=1403 audit(1759263309.451:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Sep 30 20:15:10 localhost systemd[1]: Successfully loaded SELinux policy in 153.426ms.
Sep 30 20:15:10 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 27.505ms.
Sep 30 20:15:10 localhost systemd[1]: systemd 252-55.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Sep 30 20:15:10 localhost systemd[1]: Detected virtualization kvm.
Sep 30 20:15:10 localhost systemd[1]: Detected architecture x86-64.
Sep 30 20:15:10 localhost systemd-rc-local-generator[640]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:15:10 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Sep 30 20:15:10 localhost systemd[1]: Stopped Switch Root.
Sep 30 20:15:10 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Sep 30 20:15:10 localhost systemd[1]: Created slice Slice /system/getty.
Sep 30 20:15:10 localhost systemd[1]: Created slice Slice /system/serial-getty.
Sep 30 20:15:10 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Sep 30 20:15:10 localhost systemd[1]: Created slice User and Session Slice.
Sep 30 20:15:10 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Sep 30 20:15:10 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Sep 30 20:15:10 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Sep 30 20:15:10 localhost systemd[1]: Reached target Local Encrypted Volumes.
Sep 30 20:15:10 localhost systemd[1]: Stopped target Switch Root.
Sep 30 20:15:10 localhost systemd[1]: Stopped target Initrd File Systems.
Sep 30 20:15:10 localhost systemd[1]: Stopped target Initrd Root File System.
Sep 30 20:15:10 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Sep 30 20:15:10 localhost systemd[1]: Reached target Path Units.
Sep 30 20:15:10 localhost systemd[1]: Reached target rpc_pipefs.target.
Sep 30 20:15:10 localhost systemd[1]: Reached target Slice Units.
Sep 30 20:15:10 localhost systemd[1]: Reached target Swaps.
Sep 30 20:15:10 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Sep 30 20:15:10 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Sep 30 20:15:10 localhost systemd[1]: Reached target RPC Port Mapper.
Sep 30 20:15:10 localhost systemd[1]: Listening on Process Core Dump Socket.
Sep 30 20:15:10 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Sep 30 20:15:10 localhost systemd[1]: Listening on udev Control Socket.
Sep 30 20:15:10 localhost systemd[1]: Listening on udev Kernel Socket.
Sep 30 20:15:10 localhost systemd[1]: Mounting Huge Pages File System...
Sep 30 20:15:10 localhost systemd[1]: Mounting POSIX Message Queue File System...
Sep 30 20:15:10 localhost systemd[1]: Mounting Kernel Debug File System...
Sep 30 20:15:10 localhost systemd[1]: Mounting Kernel Trace File System...
Sep 30 20:15:10 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Sep 30 20:15:10 localhost systemd[1]: Starting Create List of Static Device Nodes...
Sep 30 20:15:10 localhost systemd[1]: Starting Load Kernel Module configfs...
Sep 30 20:15:10 localhost systemd[1]: Starting Load Kernel Module drm...
Sep 30 20:15:10 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Sep 30 20:15:10 localhost systemd[1]: Starting Load Kernel Module fuse...
Sep 30 20:15:10 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Sep 30 20:15:10 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Sep 30 20:15:10 localhost systemd[1]: Stopped File System Check on Root Device.
Sep 30 20:15:10 localhost systemd[1]: Stopped Journal Service.
Sep 30 20:15:10 localhost systemd[1]: Starting Journal Service...
Sep 30 20:15:10 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Sep 30 20:15:10 localhost systemd[1]: Starting Generate network units from Kernel command line...
Sep 30 20:15:10 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Sep 30 20:15:10 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Sep 30 20:15:10 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 30 20:15:10 localhost systemd[1]: Starting Apply Kernel Variables...
Sep 30 20:15:10 localhost systemd[1]: Starting Coldplug All udev Devices...
Sep 30 20:15:10 localhost systemd[1]: Mounted Huge Pages File System.
Sep 30 20:15:10 localhost kernel: fuse: init (API version 7.37)
Sep 30 20:15:10 localhost systemd[1]: Mounted POSIX Message Queue File System.
Sep 30 20:15:10 localhost systemd[1]: Mounted Kernel Debug File System.
Sep 30 20:15:10 localhost systemd[1]: Mounted Kernel Trace File System.
Sep 30 20:15:10 localhost systemd[1]: Finished Create List of Static Device Nodes.
Sep 30 20:15:10 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 30 20:15:10 localhost systemd[1]: Finished Load Kernel Module configfs.
Sep 30 20:15:10 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 30 20:15:10 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Sep 30 20:15:10 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Sep 30 20:15:10 localhost systemd-journald[681]: Journal started
Sep 30 20:15:10 localhost systemd-journald[681]: Runtime Journal (/run/log/journal/21983c68f36a73745cc172a394ebc51d) is 8.0M, max 153.5M, 145.5M free.
Sep 30 20:15:09 localhost systemd[1]: Queued start job for default target Multi-User System.
Sep 30 20:15:09 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Sep 30 20:15:10 localhost systemd[1]: Started Journal Service.
Sep 30 20:15:10 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Sep 30 20:15:10 localhost systemd[1]: Finished Load Kernel Module fuse.
Sep 30 20:15:10 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Sep 30 20:15:10 localhost systemd[1]: Finished Generate network units from Kernel command line.
Sep 30 20:15:10 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Sep 30 20:15:10 localhost systemd[1]: Finished Apply Kernel Variables.
Sep 30 20:15:10 localhost kernel: ACPI: bus type drm_connector registered
Sep 30 20:15:10 localhost systemd[1]: Mounting FUSE Control File System...
Sep 30 20:15:10 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Sep 30 20:15:10 localhost systemd[1]: Starting Rebuild Hardware Database...
Sep 30 20:15:10 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Sep 30 20:15:10 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 30 20:15:10 localhost systemd[1]: Starting Load/Save OS Random Seed...
Sep 30 20:15:10 localhost systemd[1]: Starting Create System Users...
Sep 30 20:15:10 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 30 20:15:10 localhost systemd[1]: Finished Load Kernel Module drm.
Sep 30 20:15:10 localhost systemd[1]: Mounted FUSE Control File System.
Sep 30 20:15:10 localhost systemd-journald[681]: Runtime Journal (/run/log/journal/21983c68f36a73745cc172a394ebc51d) is 8.0M, max 153.5M, 145.5M free.
Sep 30 20:15:10 localhost systemd-journald[681]: Received client request to flush runtime journal.
Sep 30 20:15:10 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Sep 30 20:15:10 localhost systemd[1]: Finished Load/Save OS Random Seed.
Sep 30 20:15:10 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Sep 30 20:15:10 localhost systemd[1]: Finished Create System Users.
Sep 30 20:15:10 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Sep 30 20:15:10 localhost systemd[1]: Finished Coldplug All udev Devices.
Sep 30 20:15:10 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Sep 30 20:15:10 localhost systemd[1]: Reached target Preparation for Local File Systems.
Sep 30 20:15:10 localhost systemd[1]: Reached target Local File Systems.
Sep 30 20:15:10 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Sep 30 20:15:10 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Sep 30 20:15:10 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 30 20:15:10 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Sep 30 20:15:10 localhost systemd[1]: Starting Automatic Boot Loader Update...
Sep 30 20:15:10 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Sep 30 20:15:10 localhost systemd[1]: Starting Create Volatile Files and Directories...
Sep 30 20:15:10 localhost bootctl[699]: Couldn't find EFI system partition, skipping.
Sep 30 20:15:10 localhost systemd[1]: Finished Automatic Boot Loader Update.
Sep 30 20:15:10 localhost systemd[1]: Finished Create Volatile Files and Directories.
Sep 30 20:15:10 localhost systemd[1]: Starting Security Auditing Service...
Sep 30 20:15:10 localhost systemd[1]: Starting RPC Bind...
Sep 30 20:15:10 localhost systemd[1]: Starting Rebuild Journal Catalog...
Sep 30 20:15:10 localhost auditd[705]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Sep 30 20:15:10 localhost auditd[705]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Sep 30 20:15:10 localhost systemd[1]: Finished Rebuild Journal Catalog.
Sep 30 20:15:10 localhost systemd[1]: Started RPC Bind.
Sep 30 20:15:10 localhost augenrules[710]: /sbin/augenrules: No change
Sep 30 20:15:10 localhost augenrules[725]: No rules
Sep 30 20:15:10 localhost augenrules[725]: enabled 1
Sep 30 20:15:10 localhost augenrules[725]: failure 1
Sep 30 20:15:10 localhost augenrules[725]: pid 705
Sep 30 20:15:10 localhost augenrules[725]: rate_limit 0
Sep 30 20:15:10 localhost augenrules[725]: backlog_limit 8192
Sep 30 20:15:10 localhost augenrules[725]: lost 0
Sep 30 20:15:10 localhost augenrules[725]: backlog 4
Sep 30 20:15:10 localhost augenrules[725]: backlog_wait_time 60000
Sep 30 20:15:10 localhost augenrules[725]: backlog_wait_time_actual 0
Sep 30 20:15:10 localhost augenrules[725]: enabled 1
Sep 30 20:15:10 localhost augenrules[725]: failure 1
Sep 30 20:15:10 localhost augenrules[725]: pid 705
Sep 30 20:15:10 localhost augenrules[725]: rate_limit 0
Sep 30 20:15:10 localhost augenrules[725]: backlog_limit 8192
Sep 30 20:15:10 localhost augenrules[725]: lost 0
Sep 30 20:15:10 localhost augenrules[725]: backlog 4
Sep 30 20:15:10 localhost augenrules[725]: backlog_wait_time 60000
Sep 30 20:15:10 localhost augenrules[725]: backlog_wait_time_actual 0
Sep 30 20:15:10 localhost augenrules[725]: enabled 1
Sep 30 20:15:10 localhost augenrules[725]: failure 1
Sep 30 20:15:10 localhost augenrules[725]: pid 705
Sep 30 20:15:10 localhost augenrules[725]: rate_limit 0
Sep 30 20:15:10 localhost augenrules[725]: backlog_limit 8192
Sep 30 20:15:10 localhost augenrules[725]: lost 0
Sep 30 20:15:10 localhost augenrules[725]: backlog 0
Sep 30 20:15:10 localhost augenrules[725]: backlog_wait_time 60000
Sep 30 20:15:10 localhost augenrules[725]: backlog_wait_time_actual 0
Sep 30 20:15:10 localhost systemd[1]: Started Security Auditing Service.
Sep 30 20:15:10 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Sep 30 20:15:10 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Sep 30 20:15:10 localhost systemd[1]: Finished Rebuild Hardware Database.
Sep 30 20:15:10 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Sep 30 20:15:10 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Sep 30 20:15:10 localhost systemd[1]: Starting Update is Completed...
Sep 30 20:15:10 localhost systemd[1]: Finished Update is Completed.
Sep 30 20:15:10 localhost systemd-udevd[733]: Using default interface naming scheme 'rhel-9.0'.
Sep 30 20:15:10 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Sep 30 20:15:10 localhost systemd[1]: Reached target System Initialization.
Sep 30 20:15:10 localhost systemd[1]: Started dnf makecache --timer.
Sep 30 20:15:10 localhost systemd[1]: Started Daily rotation of log files.
Sep 30 20:15:10 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Sep 30 20:15:10 localhost systemd[1]: Reached target Timer Units.
Sep 30 20:15:10 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Sep 30 20:15:10 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Sep 30 20:15:10 localhost systemd[1]: Reached target Socket Units.
Sep 30 20:15:10 localhost systemd[1]: Starting D-Bus System Message Bus...
Sep 30 20:15:10 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Sep 30 20:15:10 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Sep 30 20:15:10 localhost systemd[1]: Starting Load Kernel Module configfs...
Sep 30 20:15:10 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 30 20:15:10 localhost systemd[1]: Finished Load Kernel Module configfs.
Sep 30 20:15:10 localhost systemd-udevd[738]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 20:15:10 localhost systemd[1]: Started D-Bus System Message Bus.
Sep 30 20:15:10 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Sep 30 20:15:10 localhost systemd[1]: Reached target Basic System.
Sep 30 20:15:10 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Sep 30 20:15:10 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Sep 30 20:15:10 localhost dbus-broker-lau[760]: Ready
Sep 30 20:15:10 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Sep 30 20:15:10 localhost systemd[1]: Starting NTP client/server...
Sep 30 20:15:10 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Sep 30 20:15:10 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Sep 30 20:15:10 localhost systemd[1]: Starting IPv4 firewall with iptables...
Sep 30 20:15:10 localhost systemd[1]: Started irqbalance daemon.
Sep 30 20:15:10 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Sep 30 20:15:10 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Sep 30 20:15:10 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Sep 30 20:15:10 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Sep 30 20:15:10 localhost systemd[1]: Reached target sshd-keygen.target.
Sep 30 20:15:10 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Sep 30 20:15:10 localhost systemd[1]: Reached target User and Group Name Lookups.
Sep 30 20:15:11 localhost systemd[1]: Starting User Login Management...
Sep 30 20:15:11 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Sep 30 20:15:11 localhost chronyd[798]: chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Sep 30 20:15:11 localhost chronyd[798]: Loaded 0 symmetric keys
Sep 30 20:15:11 localhost chronyd[798]: Using right/UTC timezone to obtain leap second data
Sep 30 20:15:11 localhost chronyd[798]: Loaded seccomp filter (level 2)
Sep 30 20:15:11 localhost systemd[1]: Started NTP client/server.
Sep 30 20:15:11 localhost systemd-logind[792]: Watching system buttons on /dev/input/event0 (Power Button)
Sep 30 20:15:11 localhost systemd-logind[792]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Sep 30 20:15:11 localhost systemd-logind[792]: New seat seat0.
Sep 30 20:15:11 localhost systemd[1]: Started User Login Management.
Sep 30 20:15:11 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Sep 30 20:15:11 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Sep 30 20:15:11 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Sep 30 20:15:11 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Sep 30 20:15:11 localhost kernel: Console: switching to colour dummy device 80x25
Sep 30 20:15:11 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Sep 30 20:15:11 localhost kernel: [drm] features: -context_init
Sep 30 20:15:11 localhost kernel: [drm] number of scanouts: 1
Sep 30 20:15:11 localhost kernel: [drm] number of cap sets: 0
Sep 30 20:15:11 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Sep 30 20:15:11 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Sep 30 20:15:11 localhost kernel: Console: switching to colour frame buffer device 128x48
Sep 30 20:15:11 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Sep 30 20:15:11 localhost kernel: kvm_amd: TSC scaling supported
Sep 30 20:15:11 localhost kernel: kvm_amd: Nested Virtualization enabled
Sep 30 20:15:11 localhost kernel: kvm_amd: Nested Paging enabled
Sep 30 20:15:11 localhost kernel: kvm_amd: LBR virtualization supported
Sep 30 20:15:11 localhost iptables.init[783]: iptables: Applying firewall rules: [  OK  ]
Sep 30 20:15:11 localhost systemd[1]: Finished IPv4 firewall with iptables.
Sep 30 20:15:11 localhost cloud-init[842]: Cloud-init v. 24.4-7.el9 running 'init-local' at Tue, 30 Sep 2025 20:15:11 +0000. Up 7.32 seconds.
Sep 30 20:15:11 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Sep 30 20:15:11 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Sep 30 20:15:11 localhost systemd[1]: run-cloud\x2dinit-tmp-tmp9newxha6.mount: Deactivated successfully.
Sep 30 20:15:11 localhost systemd[1]: Starting Hostname Service...
Sep 30 20:15:12 localhost systemd[1]: Started Hostname Service.
Sep 30 20:15:12 np0005463580.novalocal systemd-hostnamed[856]: Hostname set to <np0005463580.novalocal> (static)
Sep 30 20:15:12 np0005463580.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Sep 30 20:15:12 np0005463580.novalocal systemd[1]: Reached target Preparation for Network.
Sep 30 20:15:12 np0005463580.novalocal systemd[1]: Starting Network Manager...
Sep 30 20:15:12 np0005463580.novalocal NetworkManager[860]: <info>  [1759263312.2900] NetworkManager (version 1.54.1-1.el9) is starting... (boot:94c9591a-3f00-4303-bfc6-d623b91048fe)
Sep 30 20:15:12 np0005463580.novalocal NetworkManager[860]: <info>  [1759263312.2907] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Sep 30 20:15:12 np0005463580.novalocal NetworkManager[860]: <info>  [1759263312.3144] manager[0x557d7deb2080]: monitoring kernel firmware directory '/lib/firmware'.
Sep 30 20:15:12 np0005463580.novalocal NetworkManager[860]: <info>  [1759263312.3221] hostname: hostname: using hostnamed
Sep 30 20:15:12 np0005463580.novalocal NetworkManager[860]: <info>  [1759263312.3222] hostname: static hostname changed from (none) to "np0005463580.novalocal"
Sep 30 20:15:12 np0005463580.novalocal NetworkManager[860]: <info>  [1759263312.3227] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Sep 30 20:15:12 np0005463580.novalocal NetworkManager[860]: <info>  [1759263312.3342] manager[0x557d7deb2080]: rfkill: Wi-Fi hardware radio set enabled
Sep 30 20:15:12 np0005463580.novalocal NetworkManager[860]: <info>  [1759263312.3343] manager[0x557d7deb2080]: rfkill: WWAN hardware radio set enabled
Sep 30 20:15:12 np0005463580.novalocal NetworkManager[860]: <info>  [1759263312.3430] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Sep 30 20:15:12 np0005463580.novalocal NetworkManager[860]: <info>  [1759263312.3431] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Sep 30 20:15:12 np0005463580.novalocal NetworkManager[860]: <info>  [1759263312.3432] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Sep 30 20:15:12 np0005463580.novalocal NetworkManager[860]: <info>  [1759263312.3432] manager: Networking is enabled by state file
Sep 30 20:15:12 np0005463580.novalocal NetworkManager[860]: <info>  [1759263312.3433] settings: Loaded settings plugin: keyfile (internal)
Sep 30 20:15:12 np0005463580.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Sep 30 20:15:12 np0005463580.novalocal NetworkManager[860]: <info>  [1759263312.3468] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Sep 30 20:15:12 np0005463580.novalocal NetworkManager[860]: <info>  [1759263312.3492] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Sep 30 20:15:12 np0005463580.novalocal NetworkManager[860]: <info>  [1759263312.3519] dhcp: init: Using DHCP client 'internal'
Sep 30 20:15:12 np0005463580.novalocal NetworkManager[860]: <info>  [1759263312.3522] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Sep 30 20:15:12 np0005463580.novalocal NetworkManager[860]: <info>  [1759263312.3537] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 20:15:12 np0005463580.novalocal NetworkManager[860]: <info>  [1759263312.3552] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Sep 30 20:15:12 np0005463580.novalocal NetworkManager[860]: <info>  [1759263312.3559] device (lo): Activation: starting connection 'lo' (81f1075e-2cab-4304-8039-837dd8b444d5)
Sep 30 20:15:12 np0005463580.novalocal NetworkManager[860]: <info>  [1759263312.3567] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Sep 30 20:15:12 np0005463580.novalocal NetworkManager[860]: <info>  [1759263312.3570] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 20:15:12 np0005463580.novalocal NetworkManager[860]: <info>  [1759263312.3596] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Sep 30 20:15:12 np0005463580.novalocal NetworkManager[860]: <info>  [1759263312.3600] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Sep 30 20:15:12 np0005463580.novalocal NetworkManager[860]: <info>  [1759263312.3602] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Sep 30 20:15:12 np0005463580.novalocal NetworkManager[860]: <info>  [1759263312.3603] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Sep 30 20:15:12 np0005463580.novalocal NetworkManager[860]: <info>  [1759263312.3604] device (eth0): carrier: link connected
Sep 30 20:15:12 np0005463580.novalocal NetworkManager[860]: <info>  [1759263312.3606] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Sep 30 20:15:12 np0005463580.novalocal NetworkManager[860]: <info>  [1759263312.3611] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Sep 30 20:15:12 np0005463580.novalocal NetworkManager[860]: <info>  [1759263312.3617] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Sep 30 20:15:12 np0005463580.novalocal NetworkManager[860]: <info>  [1759263312.3620] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Sep 30 20:15:12 np0005463580.novalocal NetworkManager[860]: <info>  [1759263312.3621] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 20:15:12 np0005463580.novalocal NetworkManager[860]: <info>  [1759263312.3622] manager: NetworkManager state is now CONNECTING
Sep 30 20:15:12 np0005463580.novalocal NetworkManager[860]: <info>  [1759263312.3623] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 20:15:12 np0005463580.novalocal NetworkManager[860]: <info>  [1759263312.3628] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 20:15:12 np0005463580.novalocal NetworkManager[860]: <info>  [1759263312.3630] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Sep 30 20:15:12 np0005463580.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Sep 30 20:15:12 np0005463580.novalocal systemd[1]: Started Network Manager.
Sep 30 20:15:12 np0005463580.novalocal systemd[1]: Reached target Network.
Sep 30 20:15:12 np0005463580.novalocal systemd[1]: Starting Network Manager Wait Online...
Sep 30 20:15:12 np0005463580.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Sep 30 20:15:12 np0005463580.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Sep 30 20:15:12 np0005463580.novalocal NetworkManager[860]: <info>  [1759263312.3838] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Sep 30 20:15:12 np0005463580.novalocal NetworkManager[860]: <info>  [1759263312.3840] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Sep 30 20:15:12 np0005463580.novalocal NetworkManager[860]: <info>  [1759263312.3849] device (lo): Activation: successful, device activated.
Sep 30 20:15:12 np0005463580.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Sep 30 20:15:12 np0005463580.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Sep 30 20:15:12 np0005463580.novalocal systemd[1]: Reached target NFS client services.
Sep 30 20:15:12 np0005463580.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Sep 30 20:15:12 np0005463580.novalocal systemd[1]: Reached target Remote File Systems.
Sep 30 20:15:12 np0005463580.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Sep 30 20:15:12 np0005463580.novalocal NetworkManager[860]: <info>  [1759263312.8689] dhcp4 (eth0): state changed new lease, address=38.102.83.69
Sep 30 20:15:12 np0005463580.novalocal NetworkManager[860]: <info>  [1759263312.8709] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Sep 30 20:15:12 np0005463580.novalocal NetworkManager[860]: <info>  [1759263312.8740] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 20:15:12 np0005463580.novalocal NetworkManager[860]: <info>  [1759263312.8787] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 20:15:12 np0005463580.novalocal NetworkManager[860]: <info>  [1759263312.8788] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 20:15:12 np0005463580.novalocal NetworkManager[860]: <info>  [1759263312.8791] manager: NetworkManager state is now CONNECTED_SITE
Sep 30 20:15:12 np0005463580.novalocal NetworkManager[860]: <info>  [1759263312.8794] device (eth0): Activation: successful, device activated.
Sep 30 20:15:12 np0005463580.novalocal NetworkManager[860]: <info>  [1759263312.8799] manager: NetworkManager state is now CONNECTED_GLOBAL
Sep 30 20:15:12 np0005463580.novalocal NetworkManager[860]: <info>  [1759263312.8802] manager: startup complete
Sep 30 20:15:12 np0005463580.novalocal systemd[1]: Finished Network Manager Wait Online.
Sep 30 20:15:12 np0005463580.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Sep 30 20:15:13 np0005463580.novalocal cloud-init[924]: Cloud-init v. 24.4-7.el9 running 'init' at Tue, 30 Sep 2025 20:15:13 +0000. Up 8.89 seconds.
Sep 30 20:15:13 np0005463580.novalocal cloud-init[924]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Sep 30 20:15:13 np0005463580.novalocal cloud-init[924]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Sep 30 20:15:13 np0005463580.novalocal cloud-init[924]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Sep 30 20:15:13 np0005463580.novalocal cloud-init[924]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Sep 30 20:15:13 np0005463580.novalocal cloud-init[924]: ci-info: |  eth0  | True |         38.102.83.69         | 255.255.255.0 | global | fa:16:3e:09:b1:22 |
Sep 30 20:15:13 np0005463580.novalocal cloud-init[924]: ci-info: |  eth0  | True | fe80::f816:3eff:fe09:b122/64 |       .       |  link  | fa:16:3e:09:b1:22 |
Sep 30 20:15:13 np0005463580.novalocal cloud-init[924]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Sep 30 20:15:13 np0005463580.novalocal cloud-init[924]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Sep 30 20:15:13 np0005463580.novalocal cloud-init[924]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Sep 30 20:15:13 np0005463580.novalocal cloud-init[924]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Sep 30 20:15:13 np0005463580.novalocal cloud-init[924]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Sep 30 20:15:13 np0005463580.novalocal cloud-init[924]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Sep 30 20:15:13 np0005463580.novalocal cloud-init[924]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Sep 30 20:15:13 np0005463580.novalocal cloud-init[924]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Sep 30 20:15:13 np0005463580.novalocal cloud-init[924]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Sep 30 20:15:13 np0005463580.novalocal cloud-init[924]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Sep 30 20:15:13 np0005463580.novalocal cloud-init[924]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Sep 30 20:15:13 np0005463580.novalocal cloud-init[924]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Sep 30 20:15:13 np0005463580.novalocal cloud-init[924]: ci-info: +-------+-------------+---------+-----------+-------+
Sep 30 20:15:13 np0005463580.novalocal cloud-init[924]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Sep 30 20:15:13 np0005463580.novalocal cloud-init[924]: ci-info: +-------+-------------+---------+-----------+-------+
Sep 30 20:15:13 np0005463580.novalocal cloud-init[924]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Sep 30 20:15:13 np0005463580.novalocal cloud-init[924]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Sep 30 20:15:13 np0005463580.novalocal cloud-init[924]: ci-info: +-------+-------------+---------+-----------+-------+
Sep 30 20:15:14 np0005463580.novalocal useradd[991]: new group: name=cloud-user, GID=1001
Sep 30 20:15:14 np0005463580.novalocal useradd[991]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Sep 30 20:15:14 np0005463580.novalocal useradd[991]: add 'cloud-user' to group 'adm'
Sep 30 20:15:14 np0005463580.novalocal useradd[991]: add 'cloud-user' to group 'systemd-journal'
Sep 30 20:15:14 np0005463580.novalocal useradd[991]: add 'cloud-user' to shadow group 'adm'
Sep 30 20:15:14 np0005463580.novalocal useradd[991]: add 'cloud-user' to shadow group 'systemd-journal'
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: Generating public/private rsa key pair.
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: The key fingerprint is:
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: SHA256:0L0FfXLZssQQnFMrQ+6t6JIhRvf052WynVuUJSu44fI root@np0005463580.novalocal
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: The key's randomart image is:
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: +---[RSA 3072]----+
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: |          .o+*.o |
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: |       . . +* B..|
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: |      . . . =*oo.|
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: |      ... .= +.oo|
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: |     . .So+.o o..|
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: |      o ...+.oo.o|
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: |     . ..o+ .o *o|
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: |        o+    o.o|
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: |         .E    ..|
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: +----[SHA256]-----+
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: Generating public/private ecdsa key pair.
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: The key fingerprint is:
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: SHA256:zu5OgI2SRXwd1dyGVQCGBUZ3ST8me7aUFVdmG6zpoUA root@np0005463580.novalocal
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: The key's randomart image is:
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: +---[ECDSA 256]---+
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: |   ..  ..+=*=***B|
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: |   .. . .E.o+.+==|
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: |    ..  .    oo+o|
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: |   o +   .   ++ +|
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: |  o o o S . o..= |
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: |   .   +   . .+ .|
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: |        +      . |
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: |       o         |
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: |       o+        |
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: +----[SHA256]-----+
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: Generating public/private ed25519 key pair.
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: The key fingerprint is:
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: SHA256:2xTJjYMaKVtGbTKcGbXzK5kD62xxUQ+TnZMCXaKc3tI root@np0005463580.novalocal
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: The key's randomart image is:
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: +--[ED25519 256]--+
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: |     .o*oo.+.o   |
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: |     .*oo*B==    |
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: |    . =+B.*=..   |
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: |     = +.= o.    |
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: |    . o S.E      |
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: |      .o.B .     |
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: |      .o* o      |
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: |     o.  o       |
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: |     .o          |
Sep 30 20:15:14 np0005463580.novalocal cloud-init[924]: +----[SHA256]-----+
Sep 30 20:15:14 np0005463580.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Sep 30 20:15:14 np0005463580.novalocal systemd[1]: Reached target Cloud-config availability.
Sep 30 20:15:14 np0005463580.novalocal systemd[1]: Reached target Network is Online.
Sep 30 20:15:14 np0005463580.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Sep 30 20:15:14 np0005463580.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Sep 30 20:15:14 np0005463580.novalocal systemd[1]: Starting System Logging Service...
Sep 30 20:15:14 np0005463580.novalocal systemd[1]: Starting OpenSSH server daemon...
Sep 30 20:15:14 np0005463580.novalocal sm-notify[1006]: Version 2.5.4 starting
Sep 30 20:15:14 np0005463580.novalocal systemd[1]: Starting Permit User Sessions...
Sep 30 20:15:14 np0005463580.novalocal systemd[1]: Started Notify NFS peers of a restart.
Sep 30 20:15:14 np0005463580.novalocal sshd[1008]: Server listening on 0.0.0.0 port 22.
Sep 30 20:15:14 np0005463580.novalocal sshd[1008]: Server listening on :: port 22.
Sep 30 20:15:14 np0005463580.novalocal systemd[1]: Started OpenSSH server daemon.
Sep 30 20:15:14 np0005463580.novalocal systemd[1]: Finished Permit User Sessions.
Sep 30 20:15:14 np0005463580.novalocal systemd[1]: Started Command Scheduler.
Sep 30 20:15:14 np0005463580.novalocal systemd[1]: Started Getty on tty1.
Sep 30 20:15:14 np0005463580.novalocal systemd[1]: Started Serial Getty on ttyS0.
Sep 30 20:15:14 np0005463580.novalocal crond[1010]: (CRON) STARTUP (1.5.7)
Sep 30 20:15:14 np0005463580.novalocal crond[1010]: (CRON) INFO (Syslog will be used instead of sendmail.)
Sep 30 20:15:14 np0005463580.novalocal crond[1010]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 87% if used.)
Sep 30 20:15:14 np0005463580.novalocal crond[1010]: (CRON) INFO (running with inotify support)
Sep 30 20:15:14 np0005463580.novalocal systemd[1]: Reached target Login Prompts.
Sep 30 20:15:14 np0005463580.novalocal rsyslogd[1007]: [origin software="rsyslogd" swVersion="8.2506.0-2.el9" x-pid="1007" x-info="https://www.rsyslog.com"] start
Sep 30 20:15:14 np0005463580.novalocal rsyslogd[1007]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Sep 30 20:15:14 np0005463580.novalocal systemd[1]: Started System Logging Service.
Sep 30 20:15:14 np0005463580.novalocal systemd[1]: Reached target Multi-User System.
Sep 30 20:15:14 np0005463580.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Sep 30 20:15:14 np0005463580.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Sep 30 20:15:14 np0005463580.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Sep 30 20:15:14 np0005463580.novalocal rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 20:15:15 np0005463580.novalocal cloud-init[1020]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Tue, 30 Sep 2025 20:15:15 +0000. Up 10.81 seconds.
Sep 30 20:15:15 np0005463580.novalocal sshd-session[1019]: Connection reset by 38.102.83.114 port 59966 [preauth]
Sep 30 20:15:15 np0005463580.novalocal sshd-session[1022]: Unable to negotiate with 38.102.83.114 port 59980: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Sep 30 20:15:15 np0005463580.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Sep 30 20:15:15 np0005463580.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Sep 30 20:15:15 np0005463580.novalocal sshd-session[1027]: Unable to negotiate with 38.102.83.114 port 59984: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Sep 30 20:15:15 np0005463580.novalocal sshd-session[1029]: Unable to negotiate with 38.102.83.114 port 59986: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Sep 30 20:15:15 np0005463580.novalocal sshd-session[1031]: Connection closed by 38.102.83.114 port 59992 [preauth]
Sep 30 20:15:15 np0005463580.novalocal sshd-session[1024]: Connection closed by 38.102.83.114 port 59982 [preauth]
Sep 30 20:15:15 np0005463580.novalocal sshd-session[1035]: Unable to negotiate with 38.102.83.114 port 60002: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Sep 30 20:15:15 np0005463580.novalocal sshd-session[1037]: Unable to negotiate with 38.102.83.114 port 60018: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Sep 30 20:15:15 np0005463580.novalocal sshd-session[1033]: Connection closed by 38.102.83.114 port 59996 [preauth]
Sep 30 20:15:15 np0005463580.novalocal cloud-init[1041]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Tue, 30 Sep 2025 20:15:15 +0000. Up 11.21 seconds.
Sep 30 20:15:15 np0005463580.novalocal cloud-init[1043]: #############################################################
Sep 30 20:15:15 np0005463580.novalocal cloud-init[1044]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Sep 30 20:15:15 np0005463580.novalocal cloud-init[1046]: 256 SHA256:zu5OgI2SRXwd1dyGVQCGBUZ3ST8me7aUFVdmG6zpoUA root@np0005463580.novalocal (ECDSA)
Sep 30 20:15:15 np0005463580.novalocal cloud-init[1048]: 256 SHA256:2xTJjYMaKVtGbTKcGbXzK5kD62xxUQ+TnZMCXaKc3tI root@np0005463580.novalocal (ED25519)
Sep 30 20:15:15 np0005463580.novalocal cloud-init[1050]: 3072 SHA256:0L0FfXLZssQQnFMrQ+6t6JIhRvf052WynVuUJSu44fI root@np0005463580.novalocal (RSA)
Sep 30 20:15:15 np0005463580.novalocal cloud-init[1051]: -----END SSH HOST KEY FINGERPRINTS-----
Sep 30 20:15:15 np0005463580.novalocal cloud-init[1052]: #############################################################
Sep 30 20:15:15 np0005463580.novalocal cloud-init[1041]: Cloud-init v. 24.4-7.el9 finished at Tue, 30 Sep 2025 20:15:15 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 11.42 seconds
Sep 30 20:15:15 np0005463580.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Sep 30 20:15:15 np0005463580.novalocal systemd[1]: Reached target Cloud-init target.
Sep 30 20:15:15 np0005463580.novalocal systemd[1]: Startup finished in 1.573s (kernel) + 3.404s (initrd) + 6.527s (userspace) = 11.505s.
Sep 30 20:15:17 np0005463580.novalocal chronyd[798]: Selected source 162.159.200.1 (2.centos.pool.ntp.org)
Sep 30 20:15:17 np0005463580.novalocal chronyd[798]: System clock TAI offset set to 37 seconds
Sep 30 20:15:19 np0005463580.novalocal chronyd[798]: Selected source 50.43.156.177 (2.centos.pool.ntp.org)
Sep 30 20:15:21 np0005463580.novalocal irqbalance[784]: Cannot change IRQ 25 affinity: Operation not permitted
Sep 30 20:15:21 np0005463580.novalocal irqbalance[784]: IRQ 25 affinity is now unmanaged
Sep 30 20:15:21 np0005463580.novalocal irqbalance[784]: Cannot change IRQ 31 affinity: Operation not permitted
Sep 30 20:15:21 np0005463580.novalocal irqbalance[784]: IRQ 31 affinity is now unmanaged
Sep 30 20:15:21 np0005463580.novalocal irqbalance[784]: Cannot change IRQ 28 affinity: Operation not permitted
Sep 30 20:15:21 np0005463580.novalocal irqbalance[784]: IRQ 28 affinity is now unmanaged
Sep 30 20:15:21 np0005463580.novalocal irqbalance[784]: Cannot change IRQ 32 affinity: Operation not permitted
Sep 30 20:15:21 np0005463580.novalocal irqbalance[784]: IRQ 32 affinity is now unmanaged
Sep 30 20:15:21 np0005463580.novalocal irqbalance[784]: Cannot change IRQ 30 affinity: Operation not permitted
Sep 30 20:15:21 np0005463580.novalocal irqbalance[784]: IRQ 30 affinity is now unmanaged
Sep 30 20:15:21 np0005463580.novalocal irqbalance[784]: Cannot change IRQ 29 affinity: Operation not permitted
Sep 30 20:15:21 np0005463580.novalocal irqbalance[784]: IRQ 29 affinity is now unmanaged
Sep 30 20:15:22 np0005463580.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Sep 30 20:15:41 np0005463580.novalocal sshd-session[1056]: Accepted publickey for zuul from 38.102.83.114 port 41080 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Sep 30 20:15:41 np0005463580.novalocal systemd[1]: Created slice User Slice of UID 1000.
Sep 30 20:15:41 np0005463580.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Sep 30 20:15:41 np0005463580.novalocal systemd-logind[792]: New session 1 of user zuul.
Sep 30 20:15:41 np0005463580.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Sep 30 20:15:41 np0005463580.novalocal systemd[1]: Starting User Manager for UID 1000...
Sep 30 20:15:41 np0005463580.novalocal systemd[1060]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 20:15:41 np0005463580.novalocal systemd[1060]: Queued start job for default target Main User Target.
Sep 30 20:15:41 np0005463580.novalocal systemd[1060]: Created slice User Application Slice.
Sep 30 20:15:41 np0005463580.novalocal systemd[1060]: Started Mark boot as successful after the user session has run 2 minutes.
Sep 30 20:15:41 np0005463580.novalocal systemd[1060]: Started Daily Cleanup of User's Temporary Directories.
Sep 30 20:15:41 np0005463580.novalocal systemd[1060]: Reached target Paths.
Sep 30 20:15:41 np0005463580.novalocal systemd[1060]: Reached target Timers.
Sep 30 20:15:41 np0005463580.novalocal systemd[1060]: Starting D-Bus User Message Bus Socket...
Sep 30 20:15:41 np0005463580.novalocal systemd[1060]: Starting Create User's Volatile Files and Directories...
Sep 30 20:15:41 np0005463580.novalocal systemd[1060]: Listening on D-Bus User Message Bus Socket.
Sep 30 20:15:41 np0005463580.novalocal systemd[1060]: Reached target Sockets.
Sep 30 20:15:41 np0005463580.novalocal systemd[1060]: Finished Create User's Volatile Files and Directories.
Sep 30 20:15:41 np0005463580.novalocal systemd[1060]: Reached target Basic System.
Sep 30 20:15:41 np0005463580.novalocal systemd[1060]: Reached target Main User Target.
Sep 30 20:15:41 np0005463580.novalocal systemd[1060]: Startup finished in 161ms.
Sep 30 20:15:41 np0005463580.novalocal systemd[1]: Started User Manager for UID 1000.
Sep 30 20:15:41 np0005463580.novalocal systemd[1]: Started Session 1 of User zuul.
Sep 30 20:15:41 np0005463580.novalocal sshd-session[1056]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 20:15:42 np0005463580.novalocal python3[1142]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:15:42 np0005463580.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Sep 30 20:15:44 np0005463580.novalocal python3[1172]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:15:53 np0005463580.novalocal python3[1230]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:15:54 np0005463580.novalocal python3[1270]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Sep 30 20:15:56 np0005463580.novalocal python3[1296]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCzZ/XL27VGgCAJBiqlI6HKkfnRsk1V+FDeDk6XlKIPmzzSPxdfYbN8SC3V4Szi81J+zjdz4In1dc3Hx+bw/GZY+bYflSKUGZLyYdkaicR/Y3qYu3tj4Nt71877eD1Tim1HczV9VrNXM2AjNdeTYpQfp80ysnEqPHeMdwWGq2tFVRcOzH7JYXxSqZDmklwzzrohmXY5x2oEcELxovEfbdEGP0NpGKVEKSW8ITqvpBXJ1dg5OAgwT5JcxE2tGjt91ndS681iepQpWzRNkEDvLhWMSB8YqJIf03rzB9tOSiWnPvuBxvq1mNiHUpIquweTpseBqrt5m5kH371FTgjx086ATpjRmXxmMapRtjOtqIOzMfjuiG0R5W4Y+WFnrvWupXF3THo6Tio85Fxqu6JV/IhfmF0dq9gJSsAeOPPx+Th7fExTwDPRz3zIKNDocBZJAskdrkMYEhr4/6qaKlASm1WPxRQ4aJKGpWZFLWdRVz9MjDwfLt/dYpYQS0a+AeTQ5HU= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:15:56 np0005463580.novalocal python3[1320]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:15:57 np0005463580.novalocal python3[1419]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 20:15:57 np0005463580.novalocal python3[1490]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759263356.7486491-251-203747569080766/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=7f1af54a767e4de98c60b8f1f39de2f1_id_rsa follow=False checksum=50656a9e54d1b7e57b54f7402d0dd79ba3dcaae0 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:15:58 np0005463580.novalocal python3[1613]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 20:15:58 np0005463580.novalocal python3[1684]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759263357.8222117-306-78566976153994/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=7f1af54a767e4de98c60b8f1f39de2f1_id_rsa.pub follow=False checksum=9ff1c1ce825bc16e71c494cc60ffaad2f27c83d2 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:16:00 np0005463580.novalocal python3[1733]: ansible-ping Invoked with data=pong
Sep 30 20:16:01 np0005463580.novalocal python3[1757]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:16:03 np0005463580.novalocal python3[1815]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Sep 30 20:16:04 np0005463580.novalocal python3[1847]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:16:04 np0005463580.novalocal python3[1871]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:16:04 np0005463580.novalocal python3[1895]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:16:05 np0005463580.novalocal python3[1919]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:16:05 np0005463580.novalocal python3[1943]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:16:05 np0005463580.novalocal python3[1967]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:16:07 np0005463580.novalocal sudo[1991]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgrggqborowsbbrwjnzrctfqvljbhzoj ; /usr/bin/python3'
Sep 30 20:16:07 np0005463580.novalocal sudo[1991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:16:07 np0005463580.novalocal python3[1993]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:16:07 np0005463580.novalocal sudo[1991]: pam_unix(sudo:session): session closed for user root
Sep 30 20:16:08 np0005463580.novalocal sudo[2069]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpohkjfuxykydqbwvhrkfbkempmikclq ; /usr/bin/python3'
Sep 30 20:16:08 np0005463580.novalocal sudo[2069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:16:08 np0005463580.novalocal python3[2071]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 20:16:08 np0005463580.novalocal sudo[2069]: pam_unix(sudo:session): session closed for user root
Sep 30 20:16:08 np0005463580.novalocal sudo[2142]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-doflhivofyatyeoptipfagcioqbkcent ; /usr/bin/python3'
Sep 30 20:16:08 np0005463580.novalocal sudo[2142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:16:08 np0005463580.novalocal python3[2144]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759263367.7772036-31-60352263865983/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:16:08 np0005463580.novalocal sudo[2142]: pam_unix(sudo:session): session closed for user root
Sep 30 20:16:09 np0005463580.novalocal python3[2192]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:09 np0005463580.novalocal python3[2216]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:09 np0005463580.novalocal python3[2240]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:10 np0005463580.novalocal python3[2264]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:10 np0005463580.novalocal python3[2288]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:10 np0005463580.novalocal python3[2312]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:11 np0005463580.novalocal python3[2336]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:11 np0005463580.novalocal python3[2360]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:11 np0005463580.novalocal python3[2384]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:11 np0005463580.novalocal python3[2408]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:12 np0005463580.novalocal python3[2432]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:12 np0005463580.novalocal python3[2456]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:12 np0005463580.novalocal python3[2480]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:13 np0005463580.novalocal python3[2504]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:13 np0005463580.novalocal python3[2528]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:13 np0005463580.novalocal python3[2552]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:14 np0005463580.novalocal python3[2576]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:14 np0005463580.novalocal python3[2600]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:14 np0005463580.novalocal python3[2624]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:15 np0005463580.novalocal python3[2648]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:15 np0005463580.novalocal python3[2672]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:15 np0005463580.novalocal python3[2696]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:15 np0005463580.novalocal python3[2720]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:16 np0005463580.novalocal python3[2744]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:16 np0005463580.novalocal python3[2768]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:16 np0005463580.novalocal python3[2792]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:16:18 np0005463580.novalocal sudo[2816]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jblhvdexwhlhlqoupyaftpmhvojflbda ; /usr/bin/python3'
Sep 30 20:16:18 np0005463580.novalocal sudo[2816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:16:19 np0005463580.novalocal python3[2818]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Sep 30 20:16:19 np0005463580.novalocal systemd[1]: Starting Time & Date Service...
Sep 30 20:16:19 np0005463580.novalocal systemd[1]: Started Time & Date Service.
Sep 30 20:16:19 np0005463580.novalocal systemd-timedated[2820]: Changed time zone to 'UTC' (UTC).
Sep 30 20:16:19 np0005463580.novalocal sudo[2816]: pam_unix(sudo:session): session closed for user root
Sep 30 20:16:20 np0005463580.novalocal sudo[2847]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqhelalbacepqecrxlfcjkmlrtbresfj ; /usr/bin/python3'
Sep 30 20:16:20 np0005463580.novalocal sudo[2847]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:16:20 np0005463580.novalocal python3[2849]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:16:20 np0005463580.novalocal sudo[2847]: pam_unix(sudo:session): session closed for user root
Sep 30 20:16:21 np0005463580.novalocal python3[2925]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 20:16:21 np0005463580.novalocal python3[2996]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1759263380.936036-251-34546646726957/source _original_basename=tmp3ozhfkwy follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:16:22 np0005463580.novalocal python3[3096]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 20:16:22 np0005463580.novalocal python3[3167]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1759263381.9500449-301-177699916328362/source _original_basename=tmpkr3tmsug follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:16:23 np0005463580.novalocal sudo[3267]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lyfeftxtvoriulrhgprikstpicxkfqak ; /usr/bin/python3'
Sep 30 20:16:23 np0005463580.novalocal sudo[3267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:16:23 np0005463580.novalocal python3[3269]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 20:16:23 np0005463580.novalocal sudo[3267]: pam_unix(sudo:session): session closed for user root
Sep 30 20:16:23 np0005463580.novalocal sudo[3340]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzhzdapaimdktennrnciirkaqcpmefmn ; /usr/bin/python3'
Sep 30 20:16:23 np0005463580.novalocal sudo[3340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:16:23 np0005463580.novalocal python3[3342]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1759263383.2736256-381-8260858005228/source _original_basename=tmpa9ekqudu follow=False checksum=e65bd606f60198d92f186a6106bc29e41af0cb65 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:16:24 np0005463580.novalocal sudo[3340]: pam_unix(sudo:session): session closed for user root
Sep 30 20:16:24 np0005463580.novalocal python3[3390]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:16:24 np0005463580.novalocal python3[3416]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:16:25 np0005463580.novalocal sudo[3494]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vthsytptxlmnosnwskbwydfxrkfplyth ; /usr/bin/python3'
Sep 30 20:16:25 np0005463580.novalocal sudo[3494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:16:25 np0005463580.novalocal python3[3496]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 20:16:25 np0005463580.novalocal sudo[3494]: pam_unix(sudo:session): session closed for user root
Sep 30 20:16:25 np0005463580.novalocal sudo[3567]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffsbdhomtvdjwtltzkokrfxpwnlftxax ; /usr/bin/python3'
Sep 30 20:16:25 np0005463580.novalocal sudo[3567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:16:25 np0005463580.novalocal python3[3569]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1759263385.1756167-451-185537851309875/source _original_basename=tmpn6azwui1 follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:16:25 np0005463580.novalocal sudo[3567]: pam_unix(sudo:session): session closed for user root
Sep 30 20:16:26 np0005463580.novalocal sudo[3618]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yugmzwzdvwzqvtlhlwrtmljoalyhdrnv ; /usr/bin/python3'
Sep 30 20:16:26 np0005463580.novalocal sudo[3618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:16:26 np0005463580.novalocal python3[3620]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163e3b-3c83-48f2-6e73-00000000001f-1-compute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:16:26 np0005463580.novalocal sudo[3618]: pam_unix(sudo:session): session closed for user root
Sep 30 20:16:27 np0005463580.novalocal python3[3648]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                                       _uses_shell=True zuul_log_id=fa163e3b-3c83-48f2-6e73-000000000020-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Sep 30 20:16:28 np0005463580.novalocal python3[3676]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:16:46 np0005463580.novalocal sudo[3700]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtvwpchcrgkkrkqdialteuksjjiridty ; /usr/bin/python3'
Sep 30 20:16:46 np0005463580.novalocal sudo[3700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:16:46 np0005463580.novalocal python3[3702]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:16:46 np0005463580.novalocal sudo[3700]: pam_unix(sudo:session): session closed for user root
Sep 30 20:16:49 np0005463580.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Sep 30 20:17:26 np0005463580.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Sep 30 20:17:26 np0005463580.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Sep 30 20:17:26 np0005463580.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Sep 30 20:17:26 np0005463580.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Sep 30 20:17:26 np0005463580.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Sep 30 20:17:26 np0005463580.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Sep 30 20:17:26 np0005463580.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Sep 30 20:17:26 np0005463580.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Sep 30 20:17:26 np0005463580.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Sep 30 20:17:26 np0005463580.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Sep 30 20:17:26 np0005463580.novalocal NetworkManager[860]: <info>  [1759263446.8406] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Sep 30 20:17:26 np0005463580.novalocal systemd-udevd[3706]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 20:17:26 np0005463580.novalocal NetworkManager[860]: <info>  [1759263446.8669] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 20:17:26 np0005463580.novalocal NetworkManager[860]: <info>  [1759263446.8724] settings: (eth1): created default wired connection 'Wired connection 1'
Sep 30 20:17:26 np0005463580.novalocal NetworkManager[860]: <info>  [1759263446.8736] device (eth1): carrier: link connected
Sep 30 20:17:26 np0005463580.novalocal NetworkManager[860]: <info>  [1759263446.8741] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Sep 30 20:17:26 np0005463580.novalocal NetworkManager[860]: <info>  [1759263446.8755] policy: auto-activating connection 'Wired connection 1' (f3f92eec-e382-3400-9db1-93c4109b9992)
Sep 30 20:17:26 np0005463580.novalocal NetworkManager[860]: <info>  [1759263446.8763] device (eth1): Activation: starting connection 'Wired connection 1' (f3f92eec-e382-3400-9db1-93c4109b9992)
Sep 30 20:17:26 np0005463580.novalocal NetworkManager[860]: <info>  [1759263446.8767] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 20:17:26 np0005463580.novalocal NetworkManager[860]: <info>  [1759263446.8777] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 20:17:26 np0005463580.novalocal NetworkManager[860]: <info>  [1759263446.8788] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 20:17:26 np0005463580.novalocal NetworkManager[860]: <info>  [1759263446.8798] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Sep 30 20:17:27 np0005463580.novalocal python3[3732]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163e3b-3c83-ae5f-fbcf-000000000128-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:17:34 np0005463580.novalocal sudo[3810]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nstmgjuuggeinwrartkdjmrmyrlforew ; OS_CLOUD=vexxhost /usr/bin/python3'
Sep 30 20:17:34 np0005463580.novalocal sudo[3810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:17:34 np0005463580.novalocal python3[3812]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 20:17:34 np0005463580.novalocal sudo[3810]: pam_unix(sudo:session): session closed for user root
Sep 30 20:17:35 np0005463580.novalocal sudo[3883]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drrkizsyquznrjkeyvcgfencpfcayniq ; OS_CLOUD=vexxhost /usr/bin/python3'
Sep 30 20:17:35 np0005463580.novalocal sudo[3883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:17:35 np0005463580.novalocal python3[3885]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759263454.4538517-104-60492933614534/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=13b43c58d475ce8581828d7e433f941286ba15a8 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:17:35 np0005463580.novalocal sudo[3883]: pam_unix(sudo:session): session closed for user root
Sep 30 20:17:35 np0005463580.novalocal sudo[3933]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iimuxhkslqdvfxjionuzplkyufnwdcvs ; OS_CLOUD=vexxhost /usr/bin/python3'
Sep 30 20:17:35 np0005463580.novalocal sudo[3933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:17:35 np0005463580.novalocal python3[3935]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 20:17:35 np0005463580.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Sep 30 20:17:35 np0005463580.novalocal systemd[1]: Stopped Network Manager Wait Online.
Sep 30 20:17:35 np0005463580.novalocal systemd[1]: Stopping Network Manager Wait Online...
Sep 30 20:17:35 np0005463580.novalocal systemd[1]: Stopping Network Manager...
Sep 30 20:17:35 np0005463580.novalocal NetworkManager[860]: <info>  [1759263455.9913] caught SIGTERM, shutting down normally.
Sep 30 20:17:35 np0005463580.novalocal NetworkManager[860]: <info>  [1759263455.9923] dhcp4 (eth0): canceled DHCP transaction
Sep 30 20:17:35 np0005463580.novalocal NetworkManager[860]: <info>  [1759263455.9923] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Sep 30 20:17:35 np0005463580.novalocal NetworkManager[860]: <info>  [1759263455.9924] dhcp4 (eth0): state changed no lease
Sep 30 20:17:35 np0005463580.novalocal NetworkManager[860]: <info>  [1759263455.9927] manager: NetworkManager state is now CONNECTING
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[860]: <info>  [1759263456.0016] dhcp4 (eth1): canceled DHCP transaction
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[860]: <info>  [1759263456.0016] dhcp4 (eth1): state changed no lease
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[860]: <info>  [1759263456.0064] exiting (success)
Sep 30 20:17:36 np0005463580.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Sep 30 20:17:36 np0005463580.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Sep 30 20:17:36 np0005463580.novalocal systemd[1]: Stopped Network Manager.
Sep 30 20:17:36 np0005463580.novalocal systemd[1]: NetworkManager.service: Consumed 1.280s CPU time, 9.9M memory peak.
Sep 30 20:17:36 np0005463580.novalocal systemd[1]: Starting Network Manager...
Sep 30 20:17:36 np0005463580.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263456.0580] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:94c9591a-3f00-4303-bfc6-d623b91048fe)
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263456.0583] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263456.0644] manager[0x558813235070]: monitoring kernel firmware directory '/lib/firmware'.
Sep 30 20:17:36 np0005463580.novalocal systemd[1]: Starting Hostname Service...
Sep 30 20:17:36 np0005463580.novalocal systemd[1]: Started Hostname Service.
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263456.1723] hostname: hostname: using hostnamed
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263456.1725] hostname: static hostname changed from (none) to "np0005463580.novalocal"
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263456.1730] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263456.1737] manager[0x558813235070]: rfkill: Wi-Fi hardware radio set enabled
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263456.1738] manager[0x558813235070]: rfkill: WWAN hardware radio set enabled
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263456.1770] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263456.1770] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263456.1770] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263456.1771] manager: Networking is enabled by state file
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263456.1773] settings: Loaded settings plugin: keyfile (internal)
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263456.1778] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263456.1800] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263456.1807] dhcp: init: Using DHCP client 'internal'
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263456.1809] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263456.1813] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263456.1817] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263456.1824] device (lo): Activation: starting connection 'lo' (81f1075e-2cab-4304-8039-837dd8b444d5)
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263456.1829] device (eth0): carrier: link connected
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263456.1832] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263456.1835] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263456.1835] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263456.1840] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263456.1844] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263456.1848] device (eth1): carrier: link connected
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263456.1851] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263456.1854] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (f3f92eec-e382-3400-9db1-93c4109b9992) (indicated)
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263456.1854] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263456.1858] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263456.1863] device (eth1): Activation: starting connection 'Wired connection 1' (f3f92eec-e382-3400-9db1-93c4109b9992)
Sep 30 20:17:36 np0005463580.novalocal systemd[1]: Started Network Manager.
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263456.1868] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263456.1872] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263456.1884] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263456.1887] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263456.1889] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263456.1892] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263456.1894] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263456.1896] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263456.1899] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263456.1906] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263456.1908] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263456.1916] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263456.1918] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263456.1945] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263456.1948] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Sep 30 20:17:36 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263456.1953] device (lo): Activation: successful, device activated.
Sep 30 20:17:36 np0005463580.novalocal systemd[1]: Starting Network Manager Wait Online...
Sep 30 20:17:36 np0005463580.novalocal sudo[3933]: pam_unix(sudo:session): session closed for user root
Sep 30 20:17:36 np0005463580.novalocal python3[4000]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163e3b-3c83-ae5f-fbcf-0000000000bd-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:17:37 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263457.7336] dhcp4 (eth0): state changed new lease, address=38.102.83.69
Sep 30 20:17:37 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263457.7346] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Sep 30 20:17:37 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263457.7411] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Sep 30 20:17:37 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263457.7450] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Sep 30 20:17:37 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263457.7451] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Sep 30 20:17:37 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263457.7455] manager: NetworkManager state is now CONNECTED_SITE
Sep 30 20:17:37 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263457.7465] device (eth0): Activation: successful, device activated.
Sep 30 20:17:37 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263457.7470] manager: NetworkManager state is now CONNECTED_GLOBAL
Sep 30 20:17:47 np0005463580.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Sep 30 20:17:54 np0005463580.novalocal sshd-session[4022]: Invalid user 0 from 185.217.1.246 port 22238
Sep 30 20:17:54 np0005463580.novalocal sshd-session[4022]: Disconnecting invalid user 0 185.217.1.246 port 22238: Change of username or service not allowed: (0,ssh-connection) -> (user13,ssh-connection) [preauth]
Sep 30 20:17:56 np0005463580.novalocal sshd-session[4024]: Invalid user user13 from 185.217.1.246 port 34695
Sep 30 20:17:57 np0005463580.novalocal sshd-session[4024]: Disconnecting invalid user user13 185.217.1.246 port 34695: Change of username or service not allowed: (user13,ssh-connection) -> (useradmin,ssh-connection) [preauth]
Sep 30 20:17:59 np0005463580.novalocal sshd-session[4026]: Invalid user useradmin from 185.217.1.246 port 57913
Sep 30 20:18:00 np0005463580.novalocal sshd-session[4026]: Disconnecting invalid user useradmin 185.217.1.246 port 57913: Change of username or service not allowed: (useradmin,ssh-connection) -> (bnb,ssh-connection) [preauth]
Sep 30 20:18:01 np0005463580.novalocal systemd[1060]: Starting Mark boot as successful...
Sep 30 20:18:01 np0005463580.novalocal systemd[1060]: Finished Mark boot as successful.
Sep 30 20:18:01 np0005463580.novalocal sshd-session[4028]: Invalid user bnb from 185.217.1.246 port 12658
Sep 30 20:18:02 np0005463580.novalocal sshd-session[4028]: Disconnecting invalid user bnb 185.217.1.246 port 12658: Change of username or service not allowed: (bnb,ssh-connection) -> (ubuntu,ssh-connection) [preauth]
Sep 30 20:18:06 np0005463580.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Sep 30 20:18:07 np0005463580.novalocal sshd-session[4031]: Invalid user ubuntu from 185.217.1.246 port 28998
Sep 30 20:18:10 np0005463580.novalocal sshd-session[4031]: error: maximum authentication attempts exceeded for invalid user ubuntu from 185.217.1.246 port 28998 ssh2 [preauth]
Sep 30 20:18:10 np0005463580.novalocal sshd-session[4031]: Disconnecting invalid user ubuntu 185.217.1.246 port 28998: Too many authentication failures [preauth]
Sep 30 20:18:17 np0005463580.novalocal sshd-session[4035]: Invalid user ubuntu from 185.217.1.246 port 46251
Sep 30 20:18:17 np0005463580.novalocal sshd-session[4035]: Disconnecting invalid user ubuntu 185.217.1.246 port 46251: Change of username or service not allowed: (ubuntu,ssh-connection) -> (rob,ssh-connection) [preauth]
Sep 30 20:18:20 np0005463580.novalocal sshd-session[4037]: Invalid user rob from 185.217.1.246 port 23053
Sep 30 20:18:21 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263501.3139] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Sep 30 20:18:21 np0005463580.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Sep 30 20:18:21 np0005463580.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Sep 30 20:18:21 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263501.3513] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Sep 30 20:18:21 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263501.3516] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Sep 30 20:18:21 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263501.3524] device (eth1): Activation: successful, device activated.
Sep 30 20:18:21 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263501.3531] manager: startup complete
Sep 30 20:18:21 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263501.3534] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Sep 30 20:18:21 np0005463580.novalocal NetworkManager[3942]: <warn>  [1759263501.3540] device (eth1): Activation: failed for connection 'Wired connection 1'
Sep 30 20:18:21 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263501.3547] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Sep 30 20:18:21 np0005463580.novalocal systemd[1]: Finished Network Manager Wait Online.
Sep 30 20:18:21 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263501.3736] dhcp4 (eth1): canceled DHCP transaction
Sep 30 20:18:21 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263501.3737] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Sep 30 20:18:21 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263501.3737] dhcp4 (eth1): state changed no lease
Sep 30 20:18:21 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263501.3760] policy: auto-activating connection 'ci-private-network' (1694c4d7-a3f7-50e2-b10a-6b7d951bb318)
Sep 30 20:18:21 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263501.3768] device (eth1): Activation: starting connection 'ci-private-network' (1694c4d7-a3f7-50e2-b10a-6b7d951bb318)
Sep 30 20:18:21 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263501.3771] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 20:18:21 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263501.3777] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 20:18:21 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263501.3790] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 20:18:21 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263501.3804] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 20:18:21 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263501.4053] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 20:18:21 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263501.4057] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 20:18:21 np0005463580.novalocal NetworkManager[3942]: <info>  [1759263501.4067] device (eth1): Activation: successful, device activated.
Sep 30 20:18:21 np0005463580.novalocal sshd-session[4037]: Disconnecting invalid user rob 185.217.1.246 port 23053: Change of username or service not allowed: (rob,ssh-connection) -> (joseph,ssh-connection) [preauth]
Sep 30 20:18:26 np0005463580.novalocal sshd-session[4062]: Invalid user joseph from 185.217.1.246 port 10703
Sep 30 20:18:27 np0005463580.novalocal sshd-session[4062]: Disconnecting invalid user joseph 185.217.1.246 port 10703: Change of username or service not allowed: (joseph,ssh-connection) -> (lido,ssh-connection) [preauth]
Sep 30 20:18:31 np0005463580.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Sep 30 20:18:31 np0005463580.novalocal sshd-session[4064]: Invalid user lido from 185.217.1.246 port 62056
Sep 30 20:18:32 np0005463580.novalocal sshd-session[4064]: Disconnecting invalid user lido 185.217.1.246 port 62056: Change of username or service not allowed: (lido,ssh-connection) -> (root;yuchen!))>,ssh-connectio [preauth]
Sep 30 20:18:36 np0005463580.novalocal sshd-session[1069]: Received disconnect from 38.102.83.114 port 41080:11: disconnected by user
Sep 30 20:18:36 np0005463580.novalocal sshd-session[1069]: Disconnected from user zuul 38.102.83.114 port 41080
Sep 30 20:18:36 np0005463580.novalocal sshd-session[1056]: pam_unix(sshd:session): session closed for user zuul
Sep 30 20:18:36 np0005463580.novalocal systemd-logind[792]: Session 1 logged out. Waiting for processes to exit.
Sep 30 20:18:37 np0005463580.novalocal sshd-session[4066]: Invalid user root;yuchen!))> from 185.217.1.246 port 15745
Sep 30 20:18:38 np0005463580.novalocal sshd-session[4066]: Disconnecting invalid user root;yuchen!))> 185.217.1.246 port 15745: Change of username or service not allowed: (root;yuchen!))>,ssh-connection) -> (ether,ssh-connecti [preauth]
Sep 30 20:18:41 np0005463580.novalocal sshd-session[4068]: Invalid user ether from 185.217.1.246 port 64431
Sep 30 20:18:41 np0005463580.novalocal sshd-session[4068]: Disconnecting invalid user ether 185.217.1.246 port 64431: Change of username or service not allowed: (ether,ssh-connection) -> (user_1,ssh-connection) [preauth]
Sep 30 20:18:46 np0005463580.novalocal sshd-session[4070]: Invalid user user_1 from 185.217.1.246 port 27736
Sep 30 20:18:46 np0005463580.novalocal sshd-session[4070]: Disconnecting invalid user user_1 185.217.1.246 port 27736: Change of username or service not allowed: (user_1,ssh-connection) -> (validate,ssh-connection) [preauth]
Sep 30 20:18:50 np0005463580.novalocal sshd-session[4072]: Invalid user validate from 185.217.1.246 port 8754
Sep 30 20:18:50 np0005463580.novalocal sshd-session[4072]: Disconnecting invalid user validate 185.217.1.246 port 8754: Change of username or service not allowed: (validate,ssh-connection) -> (binance,ssh-connection) [preauth]
Sep 30 20:18:53 np0005463580.novalocal sshd-session[4074]: Invalid user binance from 185.217.1.246 port 41897
Sep 30 20:18:53 np0005463580.novalocal sshd-session[4074]: Disconnecting invalid user binance 185.217.1.246 port 41897: Change of username or service not allowed: (binance,ssh-connection) -> (user1,ssh-connection) [preauth]
Sep 30 20:18:59 np0005463580.novalocal sshd-session[4076]: Invalid user user1 from 185.217.1.246 port 18361
Sep 30 20:19:01 np0005463580.novalocal sshd-session[4076]: error: maximum authentication attempts exceeded for invalid user user1 from 185.217.1.246 port 18361 ssh2 [preauth]
Sep 30 20:19:01 np0005463580.novalocal sshd-session[4076]: Disconnecting invalid user user1 185.217.1.246 port 18361: Too many authentication failures [preauth]
Sep 30 20:19:04 np0005463580.novalocal sshd-session[4078]: Invalid user user1 from 185.217.1.246 port 60197
Sep 30 20:19:07 np0005463580.novalocal sshd-session[4078]: error: maximum authentication attempts exceeded for invalid user user1 from 185.217.1.246 port 60197 ssh2 [preauth]
Sep 30 20:19:07 np0005463580.novalocal sshd-session[4078]: Disconnecting invalid user user1 185.217.1.246 port 60197: Too many authentication failures [preauth]
Sep 30 20:19:11 np0005463580.novalocal sshd-session[4080]: Invalid user user1 from 185.217.1.246 port 41516
Sep 30 20:19:12 np0005463580.novalocal sshd-session[4080]: error: maximum authentication attempts exceeded for invalid user user1 from 185.217.1.246 port 41516 ssh2 [preauth]
Sep 30 20:19:12 np0005463580.novalocal sshd-session[4080]: Disconnecting invalid user user1 185.217.1.246 port 41516: Too many authentication failures [preauth]
Sep 30 20:19:15 np0005463580.novalocal sshd-session[4082]: Invalid user user1 from 185.217.1.246 port 19041
Sep 30 20:19:21 np0005463580.novalocal sshd-session[4082]: Disconnecting invalid user user1 185.217.1.246 port 19041: Change of username or service not allowed: (user1,ssh-connection) -> (dspace,ssh-connection) [preauth]
Sep 30 20:19:23 np0005463580.novalocal sshd-session[4084]: Invalid user dspace from 185.217.1.246 port 34427
Sep 30 20:19:23 np0005463580.novalocal sshd-session[4086]: Accepted publickey for zuul from 38.102.83.114 port 50754 ssh2: RSA SHA256:N3BSvNcfUiE1OsFBeXsHWduICOCfoShxma1BAooRE2o
Sep 30 20:19:23 np0005463580.novalocal systemd-logind[792]: New session 3 of user zuul.
Sep 30 20:19:23 np0005463580.novalocal systemd[1]: Started Session 3 of User zuul.
Sep 30 20:19:23 np0005463580.novalocal sshd-session[4086]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 20:19:24 np0005463580.novalocal sudo[4165]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emknjfyecoekrlxqotcypohhhzkcemeq ; OS_CLOUD=vexxhost /usr/bin/python3'
Sep 30 20:19:24 np0005463580.novalocal sudo[4165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:19:24 np0005463580.novalocal python3[4167]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 20:19:24 np0005463580.novalocal sudo[4165]: pam_unix(sudo:session): session closed for user root
Sep 30 20:19:24 np0005463580.novalocal sudo[4238]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqztcoeomsuzawwnvxxyzryympgpsxrl ; OS_CLOUD=vexxhost /usr/bin/python3'
Sep 30 20:19:24 np0005463580.novalocal sudo[4238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:19:24 np0005463580.novalocal python3[4240]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759263564.0472372-365-26756468492568/source _original_basename=tmpbdi96ol7 follow=False checksum=fce3c23802257d751453fb1742d790946575ef6f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:19:24 np0005463580.novalocal sudo[4238]: pam_unix(sudo:session): session closed for user root
Sep 30 20:19:25 np0005463580.novalocal sshd-session[4084]: error: maximum authentication attempts exceeded for invalid user dspace from 185.217.1.246 port 34427 ssh2 [preauth]
Sep 30 20:19:25 np0005463580.novalocal sshd-session[4084]: Disconnecting invalid user dspace 185.217.1.246 port 34427: Too many authentication failures [preauth]
Sep 30 20:19:29 np0005463580.novalocal sshd-session[4089]: Connection closed by 38.102.83.114 port 50754
Sep 30 20:19:29 np0005463580.novalocal sshd-session[4086]: pam_unix(sshd:session): session closed for user zuul
Sep 30 20:19:29 np0005463580.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Sep 30 20:19:29 np0005463580.novalocal systemd-logind[792]: Session 3 logged out. Waiting for processes to exit.
Sep 30 20:19:29 np0005463580.novalocal systemd-logind[792]: Removed session 3.
Sep 30 20:19:30 np0005463580.novalocal sshd-session[4265]: Invalid user dspace from 185.217.1.246 port 14259
Sep 30 20:19:32 np0005463580.novalocal sshd-session[4265]: Disconnecting invalid user dspace 185.217.1.246 port 14259: Change of username or service not allowed: (dspace,ssh-connection) -> (ftp,ssh-connection) [preauth]
Sep 30 20:19:45 np0005463580.novalocal sshd-session[4267]: error: maximum authentication attempts exceeded for ftp from 185.217.1.246 port 5600 ssh2 [preauth]
Sep 30 20:19:45 np0005463580.novalocal sshd-session[4267]: Disconnecting authenticating user ftp 185.217.1.246 port 5600: Too many authentication failures [preauth]
Sep 30 20:19:52 np0005463580.novalocal sshd-session[4269]: Disconnecting authenticating user ftp 185.217.1.246 port 52529: Change of username or service not allowed: (ftp,ssh-connection) -> (trx,ssh-connection) [preauth]
Sep 30 20:19:54 np0005463580.novalocal sshd-session[4271]: Invalid user trx from 185.217.1.246 port 21881
Sep 30 20:19:54 np0005463580.novalocal sshd-session[4271]: Disconnecting invalid user trx 185.217.1.246 port 21881: Change of username or service not allowed: (trx,ssh-connection) -> (user15,ssh-connection) [preauth]
Sep 30 20:19:57 np0005463580.novalocal sshd-session[4273]: Invalid user user15 from 185.217.1.246 port 36815
Sep 30 20:19:59 np0005463580.novalocal sshd-session[4275]: Connection closed by authenticating user root 78.128.112.74 port 41204 [preauth]
Sep 30 20:19:59 np0005463580.novalocal sshd-session[4273]: Disconnecting invalid user user15 185.217.1.246 port 36815: Change of username or service not allowed: (user15,ssh-connection) -> (cirros,ssh-connection) [preauth]
Sep 30 20:20:03 np0005463580.novalocal sshd-session[4277]: Invalid user cirros from 185.217.1.246 port 18269
Sep 30 20:20:04 np0005463580.novalocal sshd-session[4277]: Disconnecting invalid user cirros 185.217.1.246 port 18269: Change of username or service not allowed: (cirros,ssh-connection) -> (USER1,ssh-connection) [preauth]
Sep 30 20:20:10 np0005463580.novalocal sshd-session[4279]: Invalid user USER1 from 185.217.1.246 port 4658
Sep 30 20:20:11 np0005463580.novalocal sshd-session[4279]: Disconnecting invalid user USER1 185.217.1.246 port 4658: Change of username or service not allowed: (USER1,ssh-connection) -> (eth,ssh-connection) [preauth]
Sep 30 20:20:15 np0005463580.novalocal sshd-session[4281]: Invalid user eth from 185.217.1.246 port 59086
Sep 30 20:20:15 np0005463580.novalocal sshd-session[4281]: Disconnecting invalid user eth 185.217.1.246 port 59086: Change of username or service not allowed: (eth,ssh-connection) -> (david,ssh-connection) [preauth]
Sep 30 20:20:16 np0005463580.novalocal sshd-session[4283]: Invalid user david from 185.217.1.246 port 9881
Sep 30 20:20:16 np0005463580.novalocal sshd-session[4283]: Disconnecting invalid user david 185.217.1.246 port 9881: Change of username or service not allowed: (david,ssh-connection) -> (oracle,ssh-connection) [preauth]
Sep 30 20:20:21 np0005463580.novalocal sshd-session[4285]: Invalid user oracle from 185.217.1.246 port 26054
Sep 30 20:20:23 np0005463580.novalocal sshd-session[4285]: Disconnecting invalid user oracle 185.217.1.246 port 26054: Change of username or service not allowed: (oracle,ssh-connection) -> (miner,ssh-connection) [preauth]
Sep 30 20:20:26 np0005463580.novalocal sshd-session[4287]: Invalid user miner from 185.217.1.246 port 18472
Sep 30 20:20:26 np0005463580.novalocal sshd-session[4287]: Disconnecting invalid user miner 185.217.1.246 port 18472: Change of username or service not allowed: (miner,ssh-connection) -> (admin,ssh-connection) [preauth]
Sep 30 20:20:32 np0005463580.novalocal sshd-session[4289]: Invalid user admin from 185.217.1.246 port 64234
Sep 30 20:20:36 np0005463580.novalocal sshd-session[4289]: error: maximum authentication attempts exceeded for invalid user admin from 185.217.1.246 port 64234 ssh2 [preauth]
Sep 30 20:20:36 np0005463580.novalocal sshd-session[4289]: Disconnecting invalid user admin 185.217.1.246 port 64234: Too many authentication failures [preauth]
Sep 30 20:20:39 np0005463580.novalocal sshd-session[4291]: Invalid user admin from 185.217.1.246 port 50063
Sep 30 20:20:40 np0005463580.novalocal sshd-session[4291]: error: maximum authentication attempts exceeded for invalid user admin from 185.217.1.246 port 50063 ssh2 [preauth]
Sep 30 20:20:40 np0005463580.novalocal sshd-session[4291]: Disconnecting invalid user admin 185.217.1.246 port 50063: Too many authentication failures [preauth]
Sep 30 20:20:44 np0005463580.novalocal sshd-session[4294]: Invalid user admin from 185.217.1.246 port 26761
Sep 30 20:20:45 np0005463580.novalocal sshd-session[4294]: Disconnecting invalid user admin 185.217.1.246 port 26761: Change of username or service not allowed: (admin,ssh-connection) -> (xbmc,ssh-connection) [preauth]
Sep 30 20:20:50 np0005463580.novalocal sshd-session[4296]: Invalid user xbmc from 185.217.1.246 port 62902
Sep 30 20:20:51 np0005463580.novalocal sshd-session[4296]: Disconnecting invalid user xbmc 185.217.1.246 port 62902: Change of username or service not allowed: (xbmc,ssh-connection) -> (user21,ssh-connection) [preauth]
Sep 30 20:20:53 np0005463580.novalocal sshd-session[4298]: Invalid user user21 from 185.217.1.246 port 42438
Sep 30 20:20:55 np0005463580.novalocal sshd-session[4298]: Disconnecting invalid user user21 185.217.1.246 port 42438: Change of username or service not allowed: (user21,ssh-connection) -> (user-1,ssh-connection) [preauth]
Sep 30 20:21:01 np0005463580.novalocal systemd[1060]: Created slice User Background Tasks Slice.
Sep 30 20:21:01 np0005463580.novalocal systemd[1060]: Starting Cleanup of User's Temporary Files and Directories...
Sep 30 20:21:01 np0005463580.novalocal systemd[1060]: Finished Cleanup of User's Temporary Files and Directories.
Sep 30 20:21:02 np0005463580.novalocal sshd-session[4300]: Invalid user user-1 from 185.217.1.246 port 36487
Sep 30 20:21:03 np0005463580.novalocal sshd-session[4300]: Disconnecting invalid user user-1 185.217.1.246 port 36487: Change of username or service not allowed: (user-1,ssh-connection) -> (authority,ssh-connection) [preauth]
Sep 30 20:21:05 np0005463580.novalocal sshd-session[4305]: Invalid user authority from 185.217.1.246 port 13597
Sep 30 20:21:05 np0005463580.novalocal sshd-session[4305]: Disconnecting invalid user authority 185.217.1.246 port 13597: Change of username or service not allowed: (authority,ssh-connection) -> (guest,ssh-connection) [preauth]
Sep 30 20:21:07 np0005463580.novalocal sshd-session[4307]: Invalid user guest from 185.217.1.246 port 33420
Sep 30 20:21:10 np0005463580.novalocal sshd-session[4307]: error: maximum authentication attempts exceeded for invalid user guest from 185.217.1.246 port 33420 ssh2 [preauth]
Sep 30 20:21:10 np0005463580.novalocal sshd-session[4307]: Disconnecting invalid user guest 185.217.1.246 port 33420: Too many authentication failures [preauth]
Sep 30 20:21:12 np0005463580.novalocal sshd-session[4309]: Invalid user guest from 185.217.1.246 port 12340
Sep 30 20:21:12 np0005463580.novalocal sshd-session[4309]: Disconnecting invalid user guest 185.217.1.246 port 12340: Change of username or service not allowed: (guest,ssh-connection) -> (operator,ssh-connection) [preauth]
Sep 30 20:21:16 np0005463580.novalocal sshd-session[4311]: error: maximum authentication attempts exceeded for operator from 185.217.1.246 port 29801 ssh2 [preauth]
Sep 30 20:21:16 np0005463580.novalocal sshd-session[4311]: Disconnecting authenticating user operator 185.217.1.246 port 29801: Too many authentication failures [preauth]
Sep 30 20:21:20 np0005463580.novalocal sshd-session[4314]: Disconnecting authenticating user operator 185.217.1.246 port 58948: Change of username or service not allowed: (operator,ssh-connection) -> (web3,ssh-connection) [preauth]
Sep 30 20:21:31 np0005463580.novalocal sshd-session[4316]: Invalid user web3 from 185.217.1.246 port 40583
Sep 30 20:21:31 np0005463580.novalocal sshd-session[4316]: Disconnecting invalid user web3 185.217.1.246 port 40583: Change of username or service not allowed: (web3,ssh-connection) -> (james,ssh-connection) [preauth]
Sep 30 20:21:33 np0005463580.novalocal sshd-session[4318]: Invalid user james from 185.217.1.246 port 56035
Sep 30 20:21:34 np0005463580.novalocal sshd-session[4318]: Disconnecting invalid user james 185.217.1.246 port 56035: Change of username or service not allowed: (james,ssh-connection) -> (robert,ssh-connection) [preauth]
Sep 30 20:21:36 np0005463580.novalocal sshd-session[4320]: Invalid user robert from 185.217.1.246 port 18500
Sep 30 20:21:37 np0005463580.novalocal sshd-session[4320]: Disconnecting invalid user robert 185.217.1.246 port 18500: Change of username or service not allowed: (robert,ssh-connection) -> (monitor,ssh-connection) [preauth]
Sep 30 20:21:39 np0005463580.novalocal sshd-session[4322]: Invalid user monitor from 185.217.1.246 port 32517
Sep 30 20:21:39 np0005463580.novalocal sshd-session[4322]: Disconnecting invalid user monitor 185.217.1.246 port 32517: Change of username or service not allowed: (monitor,ssh-connection) -> (michael,ssh-connection) [preauth]
Sep 30 20:21:43 np0005463580.novalocal sshd-session[4324]: Invalid user michael from 185.217.1.246 port 3107
Sep 30 20:21:44 np0005463580.novalocal sshd-session[4324]: Disconnecting invalid user michael 185.217.1.246 port 3107: Change of username or service not allowed: (michael,ssh-connection) -> (pi,ssh-connection) [preauth]
Sep 30 20:21:47 np0005463580.novalocal sshd-session[4326]: Invalid user pi from 185.217.1.246 port 44891
Sep 30 20:21:48 np0005463580.novalocal sshd-session[4326]: Disconnecting invalid user pi 185.217.1.246 port 44891: Change of username or service not allowed: (pi,ssh-connection) -> (User,ssh-connection) [preauth]
Sep 30 20:21:53 np0005463580.novalocal sshd-session[4328]: Invalid user User from 185.217.1.246 port 6057
Sep 30 20:21:56 np0005463580.novalocal sshd-session[4328]: Disconnecting invalid user User 185.217.1.246 port 6057: Change of username or service not allowed: (User,ssh-connection) -> (william,ssh-connection) [preauth]
Sep 30 20:22:02 np0005463580.novalocal sshd-session[4330]: Invalid user william from 185.217.1.246 port 9057
Sep 30 20:22:02 np0005463580.novalocal sshd-session[4330]: Disconnecting invalid user william 185.217.1.246 port 9057: Change of username or service not allowed: (william,ssh-connection) -> (debian,ssh-connection) [preauth]
Sep 30 20:22:05 np0005463580.novalocal sshd-session[4332]: Invalid user debian from 185.217.1.246 port 53655
Sep 30 20:22:06 np0005463580.novalocal sshd-session[4332]: Disconnecting invalid user debian 185.217.1.246 port 53655: Change of username or service not allowed: (debian,ssh-connection) -> (usdt,ssh-connection) [preauth]
Sep 30 20:22:10 np0005463580.novalocal sshd-session[4334]: Invalid user usdt from 185.217.1.246 port 26997
Sep 30 20:22:11 np0005463580.novalocal sshd-session[4334]: Disconnecting invalid user usdt 185.217.1.246 port 26997: Change of username or service not allowed: (usdt,ssh-connection) -> (user14,ssh-connection) [preauth]
Sep 30 20:22:15 np0005463580.novalocal sshd-session[4336]: Invalid user user14 from 185.217.1.246 port 6995
Sep 30 20:22:16 np0005463580.novalocal sshd-session[4336]: Disconnecting invalid user user14 185.217.1.246 port 6995: Change of username or service not allowed: (user14,ssh-connection) -> (user12,ssh-connection) [preauth]
Sep 30 20:22:17 np0005463580.novalocal sshd-session[4338]: Invalid user user12 from 185.217.1.246 port 36328
Sep 30 20:22:17 np0005463580.novalocal sshd-session[4338]: Disconnecting invalid user user12 185.217.1.246 port 36328: Change of username or service not allowed: (user12,ssh-connection) -> (usdc,ssh-connection) [preauth]
Sep 30 20:22:19 np0005463580.novalocal sshd-session[4340]: Invalid user usdc from 185.217.1.246 port 43419
Sep 30 20:22:19 np0005463580.novalocal sshd-session[4340]: Disconnecting invalid user usdc 185.217.1.246 port 43419: Change of username or service not allowed: (usdc,ssh-connection) -> (mina,ssh-connection) [preauth]
Sep 30 20:22:22 np0005463580.novalocal sshd-session[4344]: Invalid user manager from 185.156.73.233 port 55588
Sep 30 20:22:22 np0005463580.novalocal sshd-session[4344]: Connection closed by invalid user manager 185.156.73.233 port 55588 [preauth]
Sep 30 20:22:23 np0005463580.novalocal sshd-session[4342]: Invalid user mina from 185.217.1.246 port 54164
Sep 30 20:22:23 np0005463580.novalocal sshd-session[4342]: Disconnecting invalid user mina 185.217.1.246 port 54164: Change of username or service not allowed: (mina,ssh-connection) -> (sync,ssh-connection) [preauth]
Sep 30 20:22:28 np0005463580.novalocal sshd-session[4346]: Disconnecting authenticating user sync 185.217.1.246 port 27222: Change of username or service not allowed: (sync,ssh-connection) -> (ethereumdocker,ssh-connection) [preauth]
Sep 30 20:22:33 np0005463580.novalocal sshd-session[4348]: Invalid user ethereumdocker from 185.217.1.246 port 16546
Sep 30 20:22:34 np0005463580.novalocal sshd-session[4348]: Disconnecting invalid user ethereumdocker 185.217.1.246 port 16546: Change of username or service not allowed: (ethereumdocker,ssh-connection) -> (tazos,ssh-connection) [preauth]
Sep 30 20:22:37 np0005463580.novalocal sshd-session[4350]: Invalid user tazos from 185.217.1.246 port 63258
Sep 30 20:22:37 np0005463580.novalocal sshd-session[4350]: Disconnecting invalid user tazos 185.217.1.246 port 63258: Change of username or service not allowed: (tazos,ssh-connection) -> (jenkins,ssh-connection) [preauth]
Sep 30 20:22:42 np0005463580.novalocal sshd-session[4352]: Invalid user jenkins from 185.217.1.246 port 19652
Sep 30 20:22:45 np0005463580.novalocal sshd-session[4352]: error: maximum authentication attempts exceeded for invalid user jenkins from 185.217.1.246 port 19652 ssh2 [preauth]
Sep 30 20:22:45 np0005463580.novalocal sshd-session[4352]: Disconnecting invalid user jenkins 185.217.1.246 port 19652: Too many authentication failures [preauth]
Sep 30 20:22:47 np0005463580.novalocal sshd-session[4354]: Invalid user jenkins from 185.217.1.246 port 16146
Sep 30 20:22:47 np0005463580.novalocal sshd-session[4354]: Disconnecting invalid user jenkins 185.217.1.246 port 16146: Change of username or service not allowed: (jenkins,ssh-connection) -> (teamspeak,ssh-connection) [preauth]
Sep 30 20:22:49 np0005463580.novalocal sshd-session[4356]: Invalid user teamspeak from 185.217.1.246 port 35925
Sep 30 20:22:49 np0005463580.novalocal sshd-session[4356]: Disconnecting invalid user teamspeak 185.217.1.246 port 35925: Change of username or service not allowed: (teamspeak,ssh-connection) -> (uniswap,ssh-connection) [preauth]
Sep 30 20:22:53 np0005463580.novalocal sshd-session[4358]: Invalid user uniswap from 185.217.1.246 port 55082
Sep 30 20:22:54 np0005463580.novalocal sshd-session[4358]: Disconnecting invalid user uniswap 185.217.1.246 port 55082: Change of username or service not allowed: (uniswap,ssh-connection) -> (volumio,ssh-connection) [preauth]
Sep 30 20:22:57 np0005463580.novalocal sshd-session[4361]: Invalid user volumio from 185.217.1.246 port 29919
Sep 30 20:22:57 np0005463580.novalocal sshd-session[4361]: Disconnecting invalid user volumio 185.217.1.246 port 29919: Change of username or service not allowed: (volumio,ssh-connection) -> (fa,ssh-connection) [preauth]
Sep 30 20:22:59 np0005463580.novalocal sshd-session[4363]: Invalid user fa from 185.217.1.246 port 54735
Sep 30 20:23:00 np0005463580.novalocal sshd-session[4363]: Disconnecting invalid user fa 185.217.1.246 port 54735: Change of username or service not allowed: (fa,ssh-connection) -> (thomas,ssh-connection) [preauth]
Sep 30 20:23:03 np0005463580.novalocal sshd-session[4365]: Invalid user thomas from 185.217.1.246 port 6306
Sep 30 20:23:03 np0005463580.novalocal sshd-session[4365]: Disconnecting invalid user thomas 185.217.1.246 port 6306: Change of username or service not allowed: (thomas,ssh-connection) -> (mine,ssh-connection) [preauth]
Sep 30 20:23:07 np0005463580.novalocal sshd-session[4367]: Invalid user mine from 185.217.1.246 port 45277
Sep 30 20:23:07 np0005463580.novalocal sshd-session[4367]: Disconnecting invalid user mine 185.217.1.246 port 45277: Change of username or service not allowed: (mine,ssh-connection) -> (ltc,ssh-connection) [preauth]
Sep 30 20:23:09 np0005463580.novalocal sshd-session[4369]: Invalid user ltc from 185.217.1.246 port 10527
Sep 30 20:23:09 np0005463580.novalocal sshd-session[4369]: Disconnecting invalid user ltc 185.217.1.246 port 10527: Change of username or service not allowed: (ltc,ssh-connection) -> (terraform,ssh-connection) [preauth]
Sep 30 20:23:11 np0005463580.novalocal sshd-session[4371]: Invalid user terraform from 185.217.1.246 port 20471
Sep 30 20:23:12 np0005463580.novalocal sshd-session[4371]: Disconnecting invalid user terraform 185.217.1.246 port 20471: Change of username or service not allowed: (terraform,ssh-connection) -> (postgres,ssh-connection) [preauth]
Sep 30 20:23:17 np0005463580.novalocal sshd-session[4373]: Invalid user postgres from 185.217.1.246 port 58767
Sep 30 20:23:18 np0005463580.novalocal sshd-session[4373]: Disconnecting invalid user postgres 185.217.1.246 port 58767: Change of username or service not allowed: (postgres,ssh-connection) -> (kraken,ssh-connection) [preauth]
Sep 30 20:23:19 np0005463580.novalocal sshd-session[4375]: Invalid user kraken from 185.217.1.246 port 31972
Sep 30 20:23:20 np0005463580.novalocal sshd-session[4375]: Disconnecting invalid user kraken 185.217.1.246 port 31972: Change of username or service not allowed: (kraken,ssh-connection) -> (alarm,ssh-connection) [preauth]
Sep 30 20:23:25 np0005463580.novalocal sshd-session[4377]: Invalid user alarm from 185.217.1.246 port 9482
Sep 30 20:23:25 np0005463580.novalocal sshd-session[4377]: Disconnecting invalid user alarm 185.217.1.246 port 9482: Change of username or service not allowed: (alarm,ssh-connection) -> (hpc-riscv,ssh-connection) [preauth]
Sep 30 20:23:28 np0005463580.novalocal sshd-session[4379]: Invalid user hpc-riscv from 185.217.1.246 port 43178
Sep 30 20:23:29 np0005463580.novalocal sshd-session[4379]: Disconnecting invalid user hpc-riscv 185.217.1.246 port 43178: Change of username or service not allowed: (hpc-riscv,ssh-connection) -> (user20,ssh-connection) [preauth]
Sep 30 20:23:30 np0005463580.novalocal sshd-session[4381]: Invalid user user20 from 185.217.1.246 port 57230
Sep 30 20:23:30 np0005463580.novalocal sshd-session[4381]: Disconnecting invalid user user20 185.217.1.246 port 57230: Change of username or service not allowed: (user20,ssh-connection) -> (ethos,ssh-connection) [preauth]
Sep 30 20:23:33 np0005463580.novalocal sshd-session[4383]: Invalid user ethos from 185.217.1.246 port 2908
Sep 30 20:23:33 np0005463580.novalocal sshd-session[4383]: Disconnecting invalid user ethos 185.217.1.246 port 2908: Change of username or service not allowed: (ethos,ssh-connection) -> (okx,ssh-connection) [preauth]
Sep 30 20:23:38 np0005463580.novalocal sshd-session[4385]: Invalid user okx from 185.217.1.246 port 38769
Sep 30 20:23:39 np0005463580.novalocal sshd-session[4385]: Disconnecting invalid user okx 185.217.1.246 port 38769: Change of username or service not allowed: (okx,ssh-connection) -> (sol,ssh-connection) [preauth]
Sep 30 20:23:43 np0005463580.novalocal sshd-session[4387]: Invalid user sol from 185.217.1.246 port 22430
Sep 30 20:23:43 np0005463580.novalocal sshd-session[4387]: Disconnecting invalid user sol 185.217.1.246 port 22430: Change of username or service not allowed: (sol,ssh-connection) -> (cs2sv,ssh-connection) [preauth]
Sep 30 20:23:45 np0005463580.novalocal sshd-session[4389]: Invalid user cs2sv from 185.217.1.246 port 52496
Sep 30 20:23:46 np0005463580.novalocal sshd-session[4389]: Disconnecting invalid user cs2sv 185.217.1.246 port 52496: Change of username or service not allowed: (cs2sv,ssh-connection) -> (ada,ssh-connection) [preauth]
Sep 30 20:23:49 np0005463580.novalocal sshd-session[4391]: Invalid user ada from 185.217.1.246 port 11675
Sep 30 20:23:49 np0005463580.novalocal sshd-session[4391]: Disconnecting invalid user ada 185.217.1.246 port 11675: Change of username or service not allowed: (ada,ssh-connection) -> (git,ssh-connection) [preauth]
Sep 30 20:23:51 np0005463580.novalocal sshd-session[4393]: Invalid user git from 185.217.1.246 port 31639
Sep 30 20:23:54 np0005463580.novalocal sshd-session[4393]: error: maximum authentication attempts exceeded for invalid user git from 185.217.1.246 port 31639 ssh2 [preauth]
Sep 30 20:23:54 np0005463580.novalocal sshd-session[4393]: Disconnecting invalid user git 185.217.1.246 port 31639: Too many authentication failures [preauth]
Sep 30 20:23:57 np0005463580.novalocal sshd-session[4395]: Invalid user git from 185.217.1.246 port 9115
Sep 30 20:23:58 np0005463580.novalocal sshd-session[4395]: Disconnecting invalid user git 185.217.1.246 port 9115: Change of username or service not allowed: (git,ssh-connection) -> (user123,ssh-connection) [preauth]
Sep 30 20:24:00 np0005463580.novalocal sshd-session[4397]: Invalid user user123 from 185.217.1.246 port 47494
Sep 30 20:24:01 np0005463580.novalocal sshd-session[4397]: Disconnecting invalid user user123 185.217.1.246 port 47494: Change of username or service not allowed: (user123,ssh-connection) -> (kali,ssh-connection) [preauth]
Sep 30 20:24:06 np0005463580.novalocal sshd-session[4399]: Invalid user kali from 185.217.1.246 port 26649
Sep 30 20:24:06 np0005463580.novalocal sshd-session[4399]: Disconnecting invalid user kali 185.217.1.246 port 26649: Change of username or service not allowed: (kali,ssh-connection) -> (tomcat,ssh-connection) [preauth]
Sep 30 20:24:10 np0005463580.novalocal sshd-session[4401]: Invalid user tomcat from 185.217.1.246 port 52212
Sep 30 20:24:12 np0005463580.novalocal sshd-session[4401]: error: maximum authentication attempts exceeded for invalid user tomcat from 185.217.1.246 port 52212 ssh2 [preauth]
Sep 30 20:24:12 np0005463580.novalocal sshd-session[4401]: Disconnecting invalid user tomcat 185.217.1.246 port 52212: Too many authentication failures [preauth]
Sep 30 20:24:16 np0005463580.novalocal sshd-session[4403]: Invalid user tomcat from 185.217.1.246 port 47197
Sep 30 20:24:17 np0005463580.novalocal sshd-session[4403]: Disconnecting invalid user tomcat 185.217.1.246 port 47197: Change of username or service not allowed: (tomcat,ssh-connection) -> (user19,ssh-connection) [preauth]
Sep 30 20:24:19 np0005463580.novalocal sshd-session[4405]: Invalid user user19 from 185.217.1.246 port 5678
Sep 30 20:24:19 np0005463580.novalocal sshd-session[4405]: Disconnecting invalid user user19 185.217.1.246 port 5678: Change of username or service not allowed: (user19,ssh-connection) -> (delegate,ssh-connection) [preauth]
Sep 30 20:24:22 np0005463580.novalocal sshd-session[4407]: Invalid user delegate from 185.217.1.246 port 27950
Sep 30 20:24:23 np0005463580.novalocal sshd-session[4407]: Disconnecting invalid user delegate 185.217.1.246 port 27950: Change of username or service not allowed: (delegate,ssh-connection) -> (backup,ssh-connection) [preauth]
Sep 30 20:24:26 np0005463580.novalocal sshd-session[4409]: Invalid user backup from 185.217.1.246 port 2772
Sep 30 20:24:31 np0005463580.novalocal sshd-session[4409]: error: maximum authentication attempts exceeded for invalid user backup from 185.217.1.246 port 2772 ssh2 [preauth]
Sep 30 20:24:31 np0005463580.novalocal sshd-session[4409]: Disconnecting invalid user backup 185.217.1.246 port 2772: Too many authentication failures [preauth]
Sep 30 20:24:37 np0005463580.novalocal sshd-session[4411]: Invalid user backup from 185.217.1.246 port 4194
Sep 30 20:24:40 np0005463580.novalocal sshd-session[4411]: Disconnecting invalid user backup 185.217.1.246 port 4194: Change of username or service not allowed: (backup,ssh-connection) -> (mysql,ssh-connection) [preauth]
Sep 30 20:24:43 np0005463580.novalocal sshd-session[4413]: Invalid user mysql from 185.217.1.246 port 4340
Sep 30 20:24:46 np0005463580.novalocal sshd-session[4413]: Disconnecting invalid user mysql 185.217.1.246 port 4340: Change of username or service not allowed: (mysql,ssh-connection) -> (ethereum,ssh-connection) [preauth]
Sep 30 20:24:49 np0005463580.novalocal sshd-session[4415]: Invalid user ethereum from 185.217.1.246 port 60362
Sep 30 20:24:50 np0005463580.novalocal sshd-session[4415]: Disconnecting invalid user ethereum 185.217.1.246 port 60362: Change of username or service not allowed: (ethereum,ssh-connection) -> (doge,ssh-connection) [preauth]
Sep 30 20:24:51 np0005463580.novalocal sshd-session[4417]: Invalid user doge from 185.217.1.246 port 21880
Sep 30 20:24:51 np0005463580.novalocal sshd-session[4417]: Disconnecting invalid user doge 185.217.1.246 port 21880: Change of username or service not allowed: (doge,ssh-connection) -> (user2,ssh-connection) [preauth]
Sep 30 20:24:53 np0005463580.novalocal sshd-session[4419]: Invalid user user2 from 185.217.1.246 port 29241
Sep 30 20:24:54 np0005463580.novalocal sshd-session[4419]: Disconnecting invalid user user2 185.217.1.246 port 29241: Change of username or service not allowed: (user2,ssh-connection) -> (root,ssh-connection) [preauth]
Sep 30 20:25:02 np0005463580.novalocal sshd-session[4421]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 63816 ssh2 [preauth]
Sep 30 20:25:02 np0005463580.novalocal sshd-session[4421]: Disconnecting authenticating user root 185.217.1.246 port 63816: Too many authentication failures [preauth]
Sep 30 20:25:09 np0005463580.novalocal sshd-session[4423]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 59416 ssh2 [preauth]
Sep 30 20:25:09 np0005463580.novalocal sshd-session[4423]: Disconnecting authenticating user root 185.217.1.246 port 59416: Too many authentication failures [preauth]
Sep 30 20:25:15 np0005463580.novalocal sshd-session[4428]: Accepted publickey for zuul from 38.102.83.114 port 42342 ssh2: RSA SHA256:N3BSvNcfUiE1OsFBeXsHWduICOCfoShxma1BAooRE2o
Sep 30 20:25:15 np0005463580.novalocal systemd-logind[792]: New session 4 of user zuul.
Sep 30 20:25:15 np0005463580.novalocal systemd[1]: Started Session 4 of User zuul.
Sep 30 20:25:15 np0005463580.novalocal sshd-session[4428]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 20:25:15 np0005463580.novalocal sudo[4455]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znuuzkjrehzqmoivfikukmlkduncikom ; /usr/bin/python3'
Sep 30 20:25:15 np0005463580.novalocal sudo[4455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:25:15 np0005463580.novalocal python3[4457]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda _uses_shell=True zuul_log_id=fa163e3b-3c83-e819-e42a-000000000ca0-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:25:15 np0005463580.novalocal sudo[4455]: pam_unix(sudo:session): session closed for user root
Sep 30 20:25:16 np0005463580.novalocal sudo[4484]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wuioxvqcyglpqnbqbfsizhxwxhwpuozf ; /usr/bin/python3'
Sep 30 20:25:16 np0005463580.novalocal sudo[4484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:25:16 np0005463580.novalocal python3[4486]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:25:16 np0005463580.novalocal sudo[4484]: pam_unix(sudo:session): session closed for user root
Sep 30 20:25:16 np0005463580.novalocal sudo[4510]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xlpqxoxkkldxxtsrtuysqpmwounlaxgo ; /usr/bin/python3'
Sep 30 20:25:16 np0005463580.novalocal sudo[4510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:25:16 np0005463580.novalocal python3[4512]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:25:16 np0005463580.novalocal sudo[4510]: pam_unix(sudo:session): session closed for user root
Sep 30 20:25:16 np0005463580.novalocal sudo[4536]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsvfnhvivdsylxuxcwwaeernwmqdtswl ; /usr/bin/python3'
Sep 30 20:25:16 np0005463580.novalocal sudo[4536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:25:16 np0005463580.novalocal python3[4538]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:25:16 np0005463580.novalocal sudo[4536]: pam_unix(sudo:session): session closed for user root
Sep 30 20:25:17 np0005463580.novalocal sudo[4562]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esbggihsbvjlydukmdgvcofjraopfbhu ; /usr/bin/python3'
Sep 30 20:25:17 np0005463580.novalocal sudo[4562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:25:17 np0005463580.novalocal python3[4564]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:25:17 np0005463580.novalocal sudo[4562]: pam_unix(sudo:session): session closed for user root
Sep 30 20:25:17 np0005463580.novalocal sudo[4588]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mitxjbedjxyqrqioeoupphstjquexjpk ; /usr/bin/python3'
Sep 30 20:25:17 np0005463580.novalocal sudo[4588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:25:17 np0005463580.novalocal python3[4590]: ansible-ansible.builtin.lineinfile Invoked with path=/etc/systemd/system.conf regexp=^#DefaultIOAccounting=no line=DefaultIOAccounting=yes state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:25:17 np0005463580.novalocal python3[4590]: ansible-ansible.builtin.lineinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Sep 30 20:25:17 np0005463580.novalocal sudo[4588]: pam_unix(sudo:session): session closed for user root
Sep 30 20:25:18 np0005463580.novalocal sudo[4614]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bcyuautudmvfphwoytpmhnmcajtpfyan ; /usr/bin/python3'
Sep 30 20:25:18 np0005463580.novalocal sudo[4614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:25:18 np0005463580.novalocal python3[4616]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 20:25:18 np0005463580.novalocal systemd[1]: Reloading.
Sep 30 20:25:18 np0005463580.novalocal systemd-rc-local-generator[4635]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:25:19 np0005463580.novalocal sudo[4614]: pam_unix(sudo:session): session closed for user root
Sep 30 20:25:19 np0005463580.novalocal sshd-session[4425]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 58937 ssh2 [preauth]
Sep 30 20:25:19 np0005463580.novalocal sshd-session[4425]: Disconnecting authenticating user root 185.217.1.246 port 58937: Too many authentication failures [preauth]
Sep 30 20:25:20 np0005463580.novalocal sudo[4671]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtlisngzmpyxzrdurrryltzjwnqhnrnf ; /usr/bin/python3'
Sep 30 20:25:20 np0005463580.novalocal sudo[4671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:25:20 np0005463580.novalocal python3[4673]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Sep 30 20:25:20 np0005463580.novalocal sudo[4671]: pam_unix(sudo:session): session closed for user root
Sep 30 20:25:20 np0005463580.novalocal sudo[4697]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilawgcsgyktwyszeywefuzxeggnrnqdy ; /usr/bin/python3'
Sep 30 20:25:20 np0005463580.novalocal sudo[4697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:25:20 np0005463580.novalocal python3[4699]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:25:20 np0005463580.novalocal sudo[4697]: pam_unix(sudo:session): session closed for user root
Sep 30 20:25:20 np0005463580.novalocal sudo[4726]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgrfskafeogcfghbkbzcdccxsvwwyrrr ; /usr/bin/python3'
Sep 30 20:25:20 np0005463580.novalocal sudo[4726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:25:21 np0005463580.novalocal python3[4728]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:25:21 np0005463580.novalocal sudo[4726]: pam_unix(sudo:session): session closed for user root
Sep 30 20:25:21 np0005463580.novalocal sudo[4754]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxmewccasvhwreuawkdxdlyyargztwxg ; /usr/bin/python3'
Sep 30 20:25:21 np0005463580.novalocal sudo[4754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:25:21 np0005463580.novalocal python3[4756]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:25:21 np0005463580.novalocal sudo[4754]: pam_unix(sudo:session): session closed for user root
Sep 30 20:25:21 np0005463580.novalocal sudo[4782]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yoyqngwgntgzoqrtbmkbmwnrurcbixww ; /usr/bin/python3'
Sep 30 20:25:21 np0005463580.novalocal sudo[4782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:25:21 np0005463580.novalocal python3[4784]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:25:21 np0005463580.novalocal sudo[4782]: pam_unix(sudo:session): session closed for user root
Sep 30 20:25:22 np0005463580.novalocal python3[4811]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max; _uses_shell=True zuul_log_id=fa163e3b-3c83-e819-e42a-000000000ca6-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:25:22 np0005463580.novalocal python3[4841]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 20:25:24 np0005463580.novalocal sshd-session[4647]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 6762 ssh2 [preauth]
Sep 30 20:25:24 np0005463580.novalocal sshd-session[4647]: Disconnecting authenticating user root 185.217.1.246 port 6762: Too many authentication failures [preauth]
Sep 30 20:25:25 np0005463580.novalocal sshd-session[4431]: Connection closed by 38.102.83.114 port 42342
Sep 30 20:25:25 np0005463580.novalocal sshd-session[4428]: pam_unix(sshd:session): session closed for user zuul
Sep 30 20:25:25 np0005463580.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Sep 30 20:25:25 np0005463580.novalocal systemd[1]: session-4.scope: Consumed 3.773s CPU time.
Sep 30 20:25:25 np0005463580.novalocal systemd-logind[792]: Session 4 logged out. Waiting for processes to exit.
Sep 30 20:25:25 np0005463580.novalocal systemd-logind[792]: Removed session 4.
Sep 30 20:25:27 np0005463580.novalocal sshd-session[4846]: Accepted publickey for zuul from 38.102.83.114 port 47832 ssh2: RSA SHA256:N3BSvNcfUiE1OsFBeXsHWduICOCfoShxma1BAooRE2o
Sep 30 20:25:27 np0005463580.novalocal systemd-logind[792]: New session 5 of user zuul.
Sep 30 20:25:27 np0005463580.novalocal systemd[1]: Started Session 5 of User zuul.
Sep 30 20:25:27 np0005463580.novalocal sshd-session[4846]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 20:25:27 np0005463580.novalocal sudo[4874]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elrrcjdfeulwvqgebyozozzmvaheyxsn ; /usr/bin/python3'
Sep 30 20:25:27 np0005463580.novalocal sudo[4874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:25:27 np0005463580.novalocal python3[4876]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Sep 30 20:25:32 np0005463580.novalocal sshd-session[4844]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 61162 ssh2 [preauth]
Sep 30 20:25:32 np0005463580.novalocal sshd-session[4844]: Disconnecting authenticating user root 185.217.1.246 port 61162: Too many authentication failures [preauth]
Sep 30 20:25:37 np0005463580.novalocal sshd-session[4887]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 46384 ssh2 [preauth]
Sep 30 20:25:37 np0005463580.novalocal sshd-session[4887]: Disconnecting authenticating user root 185.217.1.246 port 46384: Too many authentication failures [preauth]
Sep 30 20:25:42 np0005463580.novalocal sshd-session[4918]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 20245 ssh2 [preauth]
Sep 30 20:25:42 np0005463580.novalocal sshd-session[4918]: Disconnecting authenticating user root 185.217.1.246 port 20245: Too many authentication failures [preauth]
Sep 30 20:25:50 np0005463580.novalocal kernel: SELinux:  Converting 363 SID table entries...
Sep 30 20:25:50 np0005463580.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 20:25:50 np0005463580.novalocal kernel: SELinux:  policy capability open_perms=1
Sep 30 20:25:50 np0005463580.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 20:25:50 np0005463580.novalocal kernel: SELinux:  policy capability always_check_network=0
Sep 30 20:25:50 np0005463580.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 20:25:50 np0005463580.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 20:25:50 np0005463580.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 20:25:51 np0005463580.novalocal sshd-session[4921]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 39442 ssh2 [preauth]
Sep 30 20:25:51 np0005463580.novalocal sshd-session[4921]: Disconnecting authenticating user root 185.217.1.246 port 39442: Too many authentication failures [preauth]
Sep 30 20:26:01 np0005463580.novalocal kernel: SELinux:  Converting 363 SID table entries...
Sep 30 20:26:01 np0005463580.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 20:26:01 np0005463580.novalocal kernel: SELinux:  policy capability open_perms=1
Sep 30 20:26:01 np0005463580.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 20:26:01 np0005463580.novalocal kernel: SELinux:  policy capability always_check_network=0
Sep 30 20:26:01 np0005463580.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 20:26:01 np0005463580.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 20:26:01 np0005463580.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 20:26:02 np0005463580.novalocal sshd-session[4930]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 18929 ssh2 [preauth]
Sep 30 20:26:02 np0005463580.novalocal sshd-session[4930]: Disconnecting authenticating user root 185.217.1.246 port 18929: Too many authentication failures [preauth]
Sep 30 20:26:05 np0005463580.novalocal sshd-session[4939]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 35704 ssh2 [preauth]
Sep 30 20:26:05 np0005463580.novalocal sshd-session[4939]: Disconnecting authenticating user root 185.217.1.246 port 35704: Too many authentication failures [preauth]
Sep 30 20:26:13 np0005463580.novalocal kernel: SELinux:  Converting 363 SID table entries...
Sep 30 20:26:13 np0005463580.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 20:26:13 np0005463580.novalocal kernel: SELinux:  policy capability open_perms=1
Sep 30 20:26:13 np0005463580.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 20:26:13 np0005463580.novalocal kernel: SELinux:  policy capability always_check_network=0
Sep 30 20:26:13 np0005463580.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 20:26:13 np0005463580.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 20:26:13 np0005463580.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 20:26:13 np0005463580.novalocal sshd-session[4942]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 61035 ssh2 [preauth]
Sep 30 20:26:13 np0005463580.novalocal sshd-session[4942]: Disconnecting authenticating user root 185.217.1.246 port 61035: Too many authentication failures [preauth]
Sep 30 20:26:16 np0005463580.novalocal setsebool[4953]: The virt_use_nfs policy boolean was changed to 1 by root
Sep 30 20:26:16 np0005463580.novalocal setsebool[4953]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Sep 30 20:26:18 np0005463580.novalocal sshd-session[4954]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 64718 ssh2 [preauth]
Sep 30 20:26:18 np0005463580.novalocal sshd-session[4954]: Disconnecting authenticating user root 185.217.1.246 port 64718: Too many authentication failures [preauth]
Sep 30 20:26:22 np0005463580.novalocal sshd-session[4963]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 36586 ssh2 [preauth]
Sep 30 20:26:22 np0005463580.novalocal sshd-session[4963]: Disconnecting authenticating user root 185.217.1.246 port 36586: Too many authentication failures [preauth]
Sep 30 20:26:27 np0005463580.novalocal kernel: SELinux:  Converting 366 SID table entries...
Sep 30 20:26:27 np0005463580.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 20:26:27 np0005463580.novalocal kernel: SELinux:  policy capability open_perms=1
Sep 30 20:26:27 np0005463580.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 20:26:27 np0005463580.novalocal kernel: SELinux:  policy capability always_check_network=0
Sep 30 20:26:27 np0005463580.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 20:26:27 np0005463580.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 20:26:27 np0005463580.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 20:26:30 np0005463580.novalocal sshd-session[4966]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 14792 ssh2 [preauth]
Sep 30 20:26:30 np0005463580.novalocal sshd-session[4966]: Disconnecting authenticating user root 185.217.1.246 port 14792: Too many authentication failures [preauth]
Sep 30 20:26:35 np0005463580.novalocal sshd-session[4981]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 13912 ssh2 [preauth]
Sep 30 20:26:35 np0005463580.novalocal sshd-session[4981]: Disconnecting authenticating user root 185.217.1.246 port 13912: Too many authentication failures [preauth]
Sep 30 20:26:45 np0005463580.novalocal sshd-session[5675]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 54218 ssh2 [preauth]
Sep 30 20:26:45 np0005463580.novalocal sshd-session[5675]: Disconnecting authenticating user root 185.217.1.246 port 54218: Too many authentication failures [preauth]
Sep 30 20:26:48 np0005463580.novalocal sshd-session[5677]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 4341 ssh2 [preauth]
Sep 30 20:26:48 np0005463580.novalocal sshd-session[5677]: Disconnecting authenticating user root 185.217.1.246 port 4341: Too many authentication failures [preauth]
Sep 30 20:26:52 np0005463580.novalocal dbus-broker-launch[778]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Sep 30 20:26:52 np0005463580.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Sep 30 20:26:52 np0005463580.novalocal systemd[1]: Starting man-db-cache-update.service...
Sep 30 20:26:52 np0005463580.novalocal systemd[1]: Reloading.
Sep 30 20:26:52 np0005463580.novalocal systemd-rc-local-generator[5722]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:26:52 np0005463580.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Sep 30 20:26:53 np0005463580.novalocal sshd-session[5679]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 32849 ssh2 [preauth]
Sep 30 20:26:53 np0005463580.novalocal sshd-session[5679]: Disconnecting authenticating user root 185.217.1.246 port 32849: Too many authentication failures [preauth]
Sep 30 20:26:53 np0005463580.novalocal systemd[1]: Starting PackageKit Daemon...
Sep 30 20:26:53 np0005463580.novalocal PackageKit[6444]: daemon start
Sep 30 20:26:53 np0005463580.novalocal systemd[1]: Starting Authorization Manager...
Sep 30 20:26:53 np0005463580.novalocal polkitd[6535]: Started polkitd version 0.117
Sep 30 20:26:53 np0005463580.novalocal polkitd[6535]: Loading rules from directory /etc/polkit-1/rules.d
Sep 30 20:26:53 np0005463580.novalocal polkitd[6535]: Loading rules from directory /usr/share/polkit-1/rules.d
Sep 30 20:26:53 np0005463580.novalocal polkitd[6535]: Finished loading, compiling and executing 3 rules
Sep 30 20:26:53 np0005463580.novalocal systemd[1]: Started Authorization Manager.
Sep 30 20:26:53 np0005463580.novalocal polkitd[6535]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Sep 30 20:26:53 np0005463580.novalocal systemd[1]: Started PackageKit Daemon.
Sep 30 20:26:53 np0005463580.novalocal sudo[4874]: pam_unix(sudo:session): session closed for user root
Sep 30 20:26:55 np0005463580.novalocal sshd-session[6755]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 8417 ssh2 [preauth]
Sep 30 20:26:55 np0005463580.novalocal sshd-session[6755]: Disconnecting authenticating user root 185.217.1.246 port 8417: Too many authentication failures [preauth]
Sep 30 20:27:07 np0005463580.novalocal python3[12793]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot" _uses_shell=True zuul_log_id=fa163e3b-3c83-d4e4-86ab-00000000000c-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:27:08 np0005463580.novalocal sshd-session[8387]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 25379 ssh2 [preauth]
Sep 30 20:27:08 np0005463580.novalocal sshd-session[8387]: Disconnecting authenticating user root 185.217.1.246 port 25379: Too many authentication failures [preauth]
Sep 30 20:27:09 np0005463580.novalocal kernel: evm: overlay not supported
Sep 30 20:27:10 np0005463580.novalocal systemd[1060]: Starting D-Bus User Message Bus...
Sep 30 20:27:10 np0005463580.novalocal dbus-broker-launch[13509]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Sep 30 20:27:10 np0005463580.novalocal dbus-broker-launch[13509]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Sep 30 20:27:10 np0005463580.novalocal systemd[1060]: Started D-Bus User Message Bus.
Sep 30 20:27:10 np0005463580.novalocal dbus-broker-lau[13509]: Ready
Sep 30 20:27:10 np0005463580.novalocal systemd[1060]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Sep 30 20:27:10 np0005463580.novalocal systemd[1060]: Created slice Slice /user.
Sep 30 20:27:10 np0005463580.novalocal systemd[1060]: podman-13258.scope: unit configures an IP firewall, but not running as root.
Sep 30 20:27:10 np0005463580.novalocal systemd[1060]: (This warning is only shown for the first unit using IP firewalling.)
Sep 30 20:27:10 np0005463580.novalocal systemd[1060]: Started podman-13258.scope.
Sep 30 20:27:10 np0005463580.novalocal systemd[1060]: Started podman-pause-c5804ea8.scope.
Sep 30 20:27:10 np0005463580.novalocal sudo[13762]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbigamwxwyxchyftwtgaofvyzpyyzfkl ; /usr/bin/python3'
Sep 30 20:27:10 np0005463580.novalocal sudo[13762]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:27:10 np0005463580.novalocal python3[13777]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]
                                                       location = "38.102.83.98:5001"
                                                       insecure = true path=/etc/containers/registries.conf block=[[registry]]
                                                       location = "38.102.83.98:5001"
                                                       insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:27:11 np0005463580.novalocal sudo[13762]: pam_unix(sudo:session): session closed for user root
Sep 30 20:27:11 np0005463580.novalocal sshd-session[4850]: Connection closed by 38.102.83.114 port 47832
Sep 30 20:27:11 np0005463580.novalocal sshd-session[4846]: pam_unix(sshd:session): session closed for user zuul
Sep 30 20:27:11 np0005463580.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Sep 30 20:27:11 np0005463580.novalocal systemd[1]: session-5.scope: Consumed 1min 1.641s CPU time.
Sep 30 20:27:11 np0005463580.novalocal systemd-logind[792]: Session 5 logged out. Waiting for processes to exit.
Sep 30 20:27:11 np0005463580.novalocal systemd-logind[792]: Removed session 5.
Sep 30 20:27:15 np0005463580.novalocal sshd-session[14077]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 26052 ssh2 [preauth]
Sep 30 20:27:15 np0005463580.novalocal sshd-session[14077]: Disconnecting authenticating user root 185.217.1.246 port 26052: Too many authentication failures [preauth]
Sep 30 20:27:21 np0005463580.novalocal irqbalance[784]: Cannot change IRQ 27 affinity: Operation not permitted
Sep 30 20:27:21 np0005463580.novalocal irqbalance[784]: IRQ 27 affinity is now unmanaged
Sep 30 20:27:24 np0005463580.novalocal sshd-session[15063]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 56828 ssh2 [preauth]
Sep 30 20:27:24 np0005463580.novalocal sshd-session[15063]: Disconnecting authenticating user root 185.217.1.246 port 56828: Too many authentication failures [preauth]
Sep 30 20:27:29 np0005463580.novalocal sshd-session[17826]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 6648 ssh2 [preauth]
Sep 30 20:27:29 np0005463580.novalocal sshd-session[17826]: Disconnecting authenticating user root 185.217.1.246 port 6648: Too many authentication failures [preauth]
Sep 30 20:27:33 np0005463580.novalocal sshd-session[20647]: Connection closed by 38.102.83.65 port 58832 [preauth]
Sep 30 20:27:33 np0005463580.novalocal sshd-session[20649]: Connection closed by 38.102.83.65 port 58834 [preauth]
Sep 30 20:27:33 np0005463580.novalocal sshd-session[20648]: Unable to negotiate with 38.102.83.65 port 58846: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Sep 30 20:27:33 np0005463580.novalocal sshd-session[20650]: Unable to negotiate with 38.102.83.65 port 58858: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Sep 30 20:27:33 np0005463580.novalocal sshd-session[20651]: Unable to negotiate with 38.102.83.65 port 58874: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Sep 30 20:27:36 np0005463580.novalocal sshd-session[20348]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 6500 ssh2 [preauth]
Sep 30 20:27:36 np0005463580.novalocal sshd-session[20348]: Disconnecting authenticating user root 185.217.1.246 port 6500: Too many authentication failures [preauth]
Sep 30 20:27:38 np0005463580.novalocal sshd-session[22603]: Accepted publickey for zuul from 38.102.83.114 port 44918 ssh2: RSA SHA256:N3BSvNcfUiE1OsFBeXsHWduICOCfoShxma1BAooRE2o
Sep 30 20:27:38 np0005463580.novalocal systemd-logind[792]: New session 6 of user zuul.
Sep 30 20:27:38 np0005463580.novalocal systemd[1]: Started Session 6 of User zuul.
Sep 30 20:27:38 np0005463580.novalocal sshd-session[22603]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 20:27:39 np0005463580.novalocal python3[22694]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKP9FKOWW9m598mLycZPRqlsVgMJVsU+fwfdzM9eBK+TwOcguPrf/EtkmgDfbYrAdUCisCCNyF1sODVm7Os50jE= zuul@np0005463579.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:27:39 np0005463580.novalocal sudo[22828]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blejtrykctajsmoutbiknnbuegfaoysw ; /usr/bin/python3'
Sep 30 20:27:39 np0005463580.novalocal sudo[22828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:27:39 np0005463580.novalocal python3[22837]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKP9FKOWW9m598mLycZPRqlsVgMJVsU+fwfdzM9eBK+TwOcguPrf/EtkmgDfbYrAdUCisCCNyF1sODVm7Os50jE= zuul@np0005463579.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:27:39 np0005463580.novalocal sudo[22828]: pam_unix(sudo:session): session closed for user root
Sep 30 20:27:40 np0005463580.novalocal sudo[23171]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpfwggondmnfufvkdayhwltscfsiqmwk ; /usr/bin/python3'
Sep 30 20:27:40 np0005463580.novalocal sudo[23171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:27:40 np0005463580.novalocal python3[23177]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005463580.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Sep 30 20:27:40 np0005463580.novalocal useradd[23251]: new group: name=cloud-admin, GID=1002
Sep 30 20:27:40 np0005463580.novalocal useradd[23251]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Sep 30 20:27:40 np0005463580.novalocal sudo[23171]: pam_unix(sudo:session): session closed for user root
Sep 30 20:27:40 np0005463580.novalocal sudo[23393]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqedttdafliatomlytdxrjdlyrztgtoh ; /usr/bin/python3'
Sep 30 20:27:40 np0005463580.novalocal sudo[23393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:27:41 np0005463580.novalocal python3[23401]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKP9FKOWW9m598mLycZPRqlsVgMJVsU+fwfdzM9eBK+TwOcguPrf/EtkmgDfbYrAdUCisCCNyF1sODVm7Os50jE= zuul@np0005463579.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 20:27:41 np0005463580.novalocal sudo[23393]: pam_unix(sudo:session): session closed for user root
Sep 30 20:27:41 np0005463580.novalocal sshd-session[22707]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 63811 ssh2 [preauth]
Sep 30 20:27:41 np0005463580.novalocal sshd-session[22707]: Disconnecting authenticating user root 185.217.1.246 port 63811: Too many authentication failures [preauth]
Sep 30 20:27:41 np0005463580.novalocal sudo[23682]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uronfcmqrcrwyekwnbhxzwpghcpkbonq ; /usr/bin/python3'
Sep 30 20:27:41 np0005463580.novalocal sudo[23682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:27:41 np0005463580.novalocal python3[23691]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 20:27:41 np0005463580.novalocal sudo[23682]: pam_unix(sudo:session): session closed for user root
Sep 30 20:27:41 np0005463580.novalocal sudo[23966]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lohtyvhgcogvgboiujyxltfvitrixzxx ; /usr/bin/python3'
Sep 30 20:27:41 np0005463580.novalocal sudo[23966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:27:42 np0005463580.novalocal python3[23975]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1759264061.267961-167-249013684448302/source _original_basename=tmpiint1ptn follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:27:42 np0005463580.novalocal sudo[23966]: pam_unix(sudo:session): session closed for user root
Sep 30 20:27:42 np0005463580.novalocal sudo[24391]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpgdadqidrhcjkfdqvshtdfbrehemueu ; /usr/bin/python3'
Sep 30 20:27:42 np0005463580.novalocal sudo[24391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:27:43 np0005463580.novalocal python3[24401]: ansible-ansible.builtin.hostname Invoked with name=compute-0 use=systemd
Sep 30 20:27:43 np0005463580.novalocal systemd[1]: Starting Hostname Service...
Sep 30 20:27:43 np0005463580.novalocal systemd[1]: Started Hostname Service.
Sep 30 20:27:43 np0005463580.novalocal systemd-hostnamed[24503]: Changed pretty hostname to 'compute-0'
Sep 30 20:27:43 compute-0 systemd-hostnamed[24503]: Hostname set to <compute-0> (static)
Sep 30 20:27:43 compute-0 NetworkManager[3942]: <info>  [1759264063.1656] hostname: static hostname changed from "np0005463580.novalocal" to "compute-0"
Sep 30 20:27:43 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Sep 30 20:27:43 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Sep 30 20:27:43 compute-0 sudo[24391]: pam_unix(sudo:session): session closed for user root
Sep 30 20:27:43 compute-0 sshd-session[22641]: Connection closed by 38.102.83.114 port 44918
Sep 30 20:27:43 compute-0 sshd-session[22603]: pam_unix(sshd:session): session closed for user zuul
Sep 30 20:27:43 compute-0 systemd[1]: session-6.scope: Deactivated successfully.
Sep 30 20:27:43 compute-0 systemd[1]: session-6.scope: Consumed 2.217s CPU time.
Sep 30 20:27:43 compute-0 systemd-logind[792]: Session 6 logged out. Waiting for processes to exit.
Sep 30 20:27:43 compute-0 systemd-logind[792]: Removed session 6.
Sep 30 20:27:49 compute-0 sshd-session[24055]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 26266 ssh2 [preauth]
Sep 30 20:27:49 compute-0 sshd-session[24055]: Disconnecting authenticating user root 185.217.1.246 port 26266: Too many authentication failures [preauth]
Sep 30 20:27:49 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Sep 30 20:27:49 compute-0 systemd[1]: Finished man-db-cache-update.service.
Sep 30 20:27:49 compute-0 systemd[1]: man-db-cache-update.service: Consumed 55.755s CPU time.
Sep 30 20:27:49 compute-0 systemd[1]: run-r1aa67c8e308f436cb4398a68f61b6795.service: Deactivated successfully.
Sep 30 20:27:53 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Sep 30 20:27:55 compute-0 sshd-session[26764]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 34665 ssh2 [preauth]
Sep 30 20:27:55 compute-0 sshd-session[26764]: Disconnecting authenticating user root 185.217.1.246 port 34665: Too many authentication failures [preauth]
Sep 30 20:28:04 compute-0 sshd-session[26766]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 14224 ssh2 [preauth]
Sep 30 20:28:04 compute-0 sshd-session[26766]: Disconnecting authenticating user root 185.217.1.246 port 14224: Too many authentication failures [preauth]
Sep 30 20:28:09 compute-0 sshd-session[26768]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 25586 ssh2 [preauth]
Sep 30 20:28:09 compute-0 sshd-session[26768]: Disconnecting authenticating user root 185.217.1.246 port 25586: Too many authentication failures [preauth]
Sep 30 20:28:13 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Sep 30 20:28:16 compute-0 sshd-session[26770]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 13714 ssh2 [preauth]
Sep 30 20:28:16 compute-0 sshd-session[26770]: Disconnecting authenticating user root 185.217.1.246 port 13714: Too many authentication failures [preauth]
Sep 30 20:28:22 compute-0 sshd-session[26775]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 8687 ssh2 [preauth]
Sep 30 20:28:22 compute-0 sshd-session[26775]: Disconnecting authenticating user root 185.217.1.246 port 8687: Too many authentication failures [preauth]
Sep 30 20:28:27 compute-0 sshd-session[26777]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 49671 ssh2 [preauth]
Sep 30 20:28:27 compute-0 sshd-session[26777]: Disconnecting authenticating user root 185.217.1.246 port 49671: Too many authentication failures [preauth]
Sep 30 20:28:35 compute-0 sshd-session[26779]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 33152 ssh2 [preauth]
Sep 30 20:28:35 compute-0 sshd-session[26779]: Disconnecting authenticating user root 185.217.1.246 port 33152: Too many authentication failures [preauth]
Sep 30 20:28:38 compute-0 sshd-session[26781]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 26136 ssh2 [preauth]
Sep 30 20:28:38 compute-0 sshd-session[26781]: Disconnecting authenticating user root 185.217.1.246 port 26136: Too many authentication failures [preauth]
Sep 30 20:28:47 compute-0 sshd-session[26783]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 16652 ssh2 [preauth]
Sep 30 20:28:47 compute-0 sshd-session[26783]: Disconnecting authenticating user root 185.217.1.246 port 16652: Too many authentication failures [preauth]
Sep 30 20:28:54 compute-0 sshd-session[26785]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 62732 ssh2 [preauth]
Sep 30 20:28:54 compute-0 sshd-session[26785]: Disconnecting authenticating user root 185.217.1.246 port 62732: Too many authentication failures [preauth]
Sep 30 20:28:58 compute-0 sshd-session[26787]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 3512 ssh2 [preauth]
Sep 30 20:28:58 compute-0 sshd-session[26787]: Disconnecting authenticating user root 185.217.1.246 port 3512: Too many authentication failures [preauth]
Sep 30 20:29:01 compute-0 systemd[1]: Starting dnf makecache...
Sep 30 20:29:01 compute-0 dnf[26791]: Failed determining last makecache time.
Sep 30 20:29:01 compute-0 dnf[26791]: CentOS Stream 9 - BaseOS                         45 kB/s | 7.0 kB     00:00
Sep 30 20:29:01 compute-0 dnf[26791]: CentOS Stream 9 - BaseOS                         13 kB/s | 3.9 kB     00:00
Sep 30 20:29:01 compute-0 dnf[26791]: Errors during downloading metadata for repository 'baseos':
Sep 30 20:29:01 compute-0 dnf[26791]:   - Downloading successful, but checksum doesn't match. Calculated: c33587e16099063711748728ddc86ab9a79cc317439cb6505b08596cab68833db56125e364221ebf02cbd6e811e95a16b6342660429b1420843c6481886c67df(sha512)  Expected: 560fcf5558314ba5fd3ab618810735643afd0e6cd44e2ddea354a4e6343cfa9a67c773dae3000fe6d96f56eb046582e723eeae98dc6121fc4692400eee660278(sha512)
Sep 30 20:29:01 compute-0 dnf[26791]: Error: Failed to download metadata for repo 'baseos': Cannot download repomd.xml: Downloading successful, but checksum doesn't match. Calculated: c33587e16099063711748728ddc86ab9a79cc317439cb6505b08596cab68833db56125e364221ebf02cbd6e811e95a16b6342660429b1420843c6481886c67df(sha512)  Expected: 560fcf5558314ba5fd3ab618810735643afd0e6cd44e2ddea354a4e6343cfa9a67c773dae3000fe6d96f56eb046582e723eeae98dc6121fc4692400eee660278(sha512)
Sep 30 20:29:02 compute-0 systemd[1]: dnf-makecache.service: Main process exited, code=exited, status=1/FAILURE
Sep 30 20:29:02 compute-0 systemd[1]: dnf-makecache.service: Failed with result 'exit-code'.
Sep 30 20:29:02 compute-0 systemd[1]: Failed to start dnf makecache.
Sep 30 20:29:02 compute-0 sshd-session[26789]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 29122 ssh2 [preauth]
Sep 30 20:29:02 compute-0 sshd-session[26789]: Disconnecting authenticating user root 185.217.1.246 port 29122: Too many authentication failures [preauth]
Sep 30 20:29:11 compute-0 sshd-session[26796]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 10219 ssh2 [preauth]
Sep 30 20:29:11 compute-0 sshd-session[26796]: Disconnecting authenticating user root 185.217.1.246 port 10219: Too many authentication failures [preauth]
Sep 30 20:29:19 compute-0 sshd-session[26799]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 24331 ssh2 [preauth]
Sep 30 20:29:19 compute-0 sshd-session[26799]: Disconnecting authenticating user root 185.217.1.246 port 24331: Too many authentication failures [preauth]
Sep 30 20:29:24 compute-0 sshd-session[26802]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 34668 ssh2 [preauth]
Sep 30 20:29:24 compute-0 sshd-session[26802]: Disconnecting authenticating user root 185.217.1.246 port 34668: Too many authentication failures [preauth]
Sep 30 20:29:27 compute-0 sshd-session[26804]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 58517 ssh2 [preauth]
Sep 30 20:29:27 compute-0 sshd-session[26804]: Disconnecting authenticating user root 185.217.1.246 port 58517: Too many authentication failures [preauth]
Sep 30 20:29:32 compute-0 sshd-session[26806]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 14583 ssh2 [preauth]
Sep 30 20:29:32 compute-0 sshd-session[26806]: Disconnecting authenticating user root 185.217.1.246 port 14583: Too many authentication failures [preauth]
Sep 30 20:29:44 compute-0 sshd-session[26808]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 20598 ssh2 [preauth]
Sep 30 20:29:44 compute-0 sshd-session[26808]: Disconnecting authenticating user root 185.217.1.246 port 20598: Too many authentication failures [preauth]
Sep 30 20:29:48 compute-0 sshd-session[26810]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 32945 ssh2 [preauth]
Sep 30 20:29:48 compute-0 sshd-session[26810]: Disconnecting authenticating user root 185.217.1.246 port 32945: Too many authentication failures [preauth]
Sep 30 20:29:53 compute-0 sshd-session[26812]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 6093 ssh2 [preauth]
Sep 30 20:29:53 compute-0 sshd-session[26812]: Disconnecting authenticating user root 185.217.1.246 port 6093: Too many authentication failures [preauth]
Sep 30 20:29:57 compute-0 sshd-session[26814]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 45474 ssh2 [preauth]
Sep 30 20:29:57 compute-0 sshd-session[26814]: Disconnecting authenticating user root 185.217.1.246 port 45474: Too many authentication failures [preauth]
Sep 30 20:30:03 compute-0 sshd-session[26816]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 8857 ssh2 [preauth]
Sep 30 20:30:03 compute-0 sshd-session[26816]: Disconnecting authenticating user root 185.217.1.246 port 8857: Too many authentication failures [preauth]
Sep 30 20:30:08 compute-0 sshd-session[26818]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 53429 ssh2 [preauth]
Sep 30 20:30:08 compute-0 sshd-session[26818]: Disconnecting authenticating user root 185.217.1.246 port 53429: Too many authentication failures [preauth]
Sep 30 20:30:11 compute-0 systemd[1]: Starting Cleanup of Temporary Directories...
Sep 30 20:30:11 compute-0 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Sep 30 20:30:11 compute-0 systemd[1]: Finished Cleanup of Temporary Directories.
Sep 30 20:30:11 compute-0 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Sep 30 20:30:17 compute-0 sshd-session[26824]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 54867 ssh2 [preauth]
Sep 30 20:30:17 compute-0 sshd-session[26824]: Disconnecting authenticating user root 185.217.1.246 port 54867: Too many authentication failures [preauth]
Sep 30 20:30:24 compute-0 sshd-session[26826]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 52938 ssh2 [preauth]
Sep 30 20:30:24 compute-0 sshd-session[26826]: Disconnecting authenticating user root 185.217.1.246 port 52938: Too many authentication failures [preauth]
Sep 30 20:30:32 compute-0 sshd-session[26828]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 40164 ssh2 [preauth]
Sep 30 20:30:32 compute-0 sshd-session[26828]: Disconnecting authenticating user root 185.217.1.246 port 40164: Too many authentication failures [preauth]
Sep 30 20:30:40 compute-0 sshd-session[26830]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 16497 ssh2 [preauth]
Sep 30 20:30:40 compute-0 sshd-session[26830]: Disconnecting authenticating user root 185.217.1.246 port 16497: Too many authentication failures [preauth]
Sep 30 20:30:45 compute-0 sshd-session[26832]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 49288 ssh2 [preauth]
Sep 30 20:30:45 compute-0 sshd-session[26832]: Disconnecting authenticating user root 185.217.1.246 port 49288: Too many authentication failures [preauth]
Sep 30 20:30:49 compute-0 sshd-session[26834]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 24472 ssh2 [preauth]
Sep 30 20:30:49 compute-0 sshd-session[26834]: Disconnecting authenticating user root 185.217.1.246 port 24472: Too many authentication failures [preauth]
Sep 30 20:30:51 compute-0 sshd-session[26836]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 52339 ssh2 [preauth]
Sep 30 20:30:51 compute-0 sshd-session[26836]: Disconnecting authenticating user root 185.217.1.246 port 52339: Too many authentication failures [preauth]
Sep 30 20:30:58 compute-0 sshd-session[26838]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 5850 ssh2 [preauth]
Sep 30 20:30:58 compute-0 sshd-session[26838]: Disconnecting authenticating user root 185.217.1.246 port 5850: Too many authentication failures [preauth]
Sep 30 20:31:08 compute-0 sshd-session[26840]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 4981 ssh2 [preauth]
Sep 30 20:31:08 compute-0 sshd-session[26840]: Disconnecting authenticating user root 185.217.1.246 port 4981: Too many authentication failures [preauth]
Sep 30 20:31:11 compute-0 sshd-session[26842]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 24443 ssh2 [preauth]
Sep 30 20:31:11 compute-0 sshd-session[26842]: Disconnecting authenticating user root 185.217.1.246 port 24443: Too many authentication failures [preauth]
Sep 30 20:31:19 compute-0 sshd-session[26844]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 52487 ssh2 [preauth]
Sep 30 20:31:19 compute-0 sshd-session[26844]: Disconnecting authenticating user root 185.217.1.246 port 52487: Too many authentication failures [preauth]
Sep 30 20:31:21 compute-0 sshd-session[26846]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 44165 ssh2 [preauth]
Sep 30 20:31:21 compute-0 sshd-session[26846]: Disconnecting authenticating user root 185.217.1.246 port 44165: Too many authentication failures [preauth]
Sep 30 20:31:25 compute-0 sshd-session[26848]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 59290 ssh2 [preauth]
Sep 30 20:31:25 compute-0 sshd-session[26848]: Disconnecting authenticating user root 185.217.1.246 port 59290: Too many authentication failures [preauth]
Sep 30 20:31:29 compute-0 sshd-session[26850]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 33210 ssh2 [preauth]
Sep 30 20:31:29 compute-0 sshd-session[26850]: Disconnecting authenticating user root 185.217.1.246 port 33210: Too many authentication failures [preauth]
Sep 30 20:31:37 compute-0 sshd-session[26852]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 64587 ssh2 [preauth]
Sep 30 20:31:37 compute-0 sshd-session[26852]: Disconnecting authenticating user root 185.217.1.246 port 64587: Too many authentication failures [preauth]
Sep 30 20:31:42 compute-0 sshd-session[26855]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 59929 ssh2 [preauth]
Sep 30 20:31:42 compute-0 sshd-session[26855]: Disconnecting authenticating user root 185.217.1.246 port 59929: Too many authentication failures [preauth]
Sep 30 20:31:47 compute-0 sshd-session[26857]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 43631 ssh2 [preauth]
Sep 30 20:31:47 compute-0 sshd-session[26857]: Disconnecting authenticating user root 185.217.1.246 port 43631: Too many authentication failures [preauth]
Sep 30 20:31:54 compute-0 sshd-session[26859]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 12116 ssh2 [preauth]
Sep 30 20:31:54 compute-0 sshd-session[26859]: Disconnecting authenticating user root 185.217.1.246 port 12116: Too many authentication failures [preauth]
Sep 30 20:31:56 compute-0 sshd-session[26863]: Accepted publickey for zuul from 38.102.83.65 port 46220 ssh2: RSA SHA256:N3BSvNcfUiE1OsFBeXsHWduICOCfoShxma1BAooRE2o
Sep 30 20:31:56 compute-0 systemd-logind[792]: New session 7 of user zuul.
Sep 30 20:31:56 compute-0 systemd[1]: Started Session 7 of User zuul.
Sep 30 20:31:56 compute-0 sshd-session[26863]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 20:31:57 compute-0 python3[26939]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:31:57 compute-0 sshd-session[26861]: Connection closed by authenticating user root 80.94.95.115 port 42652 [preauth]
Sep 30 20:31:59 compute-0 PackageKit[6444]: daemon quit
Sep 30 20:31:59 compute-0 systemd[1]: packagekit.service: Deactivated successfully.
Sep 30 20:31:59 compute-0 sudo[27055]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qiitpvecarwcuqreqyyliighaikexgkc ; /usr/bin/python3'
Sep 30 20:31:59 compute-0 sudo[27055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:31:59 compute-0 python3[27057]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 20:31:59 compute-0 sudo[27055]: pam_unix(sudo:session): session closed for user root
Sep 30 20:31:59 compute-0 sudo[27129]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ayrowylqibkrwawvqqqriphpdhdshjgu ; /usr/bin/python3'
Sep 30 20:31:59 compute-0 sudo[27129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:31:59 compute-0 python3[27131]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759264319.2399125-30949-153729302939461/source mode=0755 _original_basename=delorean.repo follow=False checksum=fdbc451c7e16efca2444f90fdb72f8eb1c12a1b5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:31:59 compute-0 sudo[27129]: pam_unix(sudo:session): session closed for user root
Sep 30 20:32:00 compute-0 sudo[27155]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgfhorvgmaainfewniljexdmofhfoolf ; /usr/bin/python3'
Sep 30 20:32:00 compute-0 sudo[27155]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:32:00 compute-0 python3[27157]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 20:32:00 compute-0 sudo[27155]: pam_unix(sudo:session): session closed for user root
Sep 30 20:32:00 compute-0 sudo[27228]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxlicwiawibhpgagcyyotlwrswhjdjlm ; /usr/bin/python3'
Sep 30 20:32:00 compute-0 sudo[27228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:32:00 compute-0 python3[27230]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759264319.2399125-30949-153729302939461/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=0bdbb813b840548359ae77c28d76ca272ccaf31b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:32:00 compute-0 sudo[27228]: pam_unix(sudo:session): session closed for user root
Sep 30 20:32:00 compute-0 sudo[27254]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygjysebbwcljscerktghcezlkfzlhrfb ; /usr/bin/python3'
Sep 30 20:32:00 compute-0 sudo[27254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:32:00 compute-0 python3[27256]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 20:32:00 compute-0 sudo[27254]: pam_unix(sudo:session): session closed for user root
Sep 30 20:32:01 compute-0 sudo[27327]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwzqgmrfhcvmxmdtzckmkfsbfcncvtum ; /usr/bin/python3'
Sep 30 20:32:01 compute-0 sudo[27327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:32:01 compute-0 python3[27329]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759264319.2399125-30949-153729302939461/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:32:01 compute-0 sudo[27327]: pam_unix(sudo:session): session closed for user root
Sep 30 20:32:01 compute-0 sudo[27353]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjwzxhewwrydhwbdsfdgnvfiundysgnr ; /usr/bin/python3'
Sep 30 20:32:01 compute-0 sudo[27353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:32:01 compute-0 python3[27355]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 20:32:01 compute-0 sudo[27353]: pam_unix(sudo:session): session closed for user root
Sep 30 20:32:01 compute-0 sudo[27426]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktmkyxbollsbjmwrrwpveaekpzovjypf ; /usr/bin/python3'
Sep 30 20:32:01 compute-0 sudo[27426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:32:01 compute-0 python3[27428]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759264319.2399125-30949-153729302939461/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:32:01 compute-0 sudo[27426]: pam_unix(sudo:session): session closed for user root
Sep 30 20:32:02 compute-0 sudo[27452]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgswcxgkoabtgsppffwidcpwcxgctpvc ; /usr/bin/python3'
Sep 30 20:32:02 compute-0 sudo[27452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:32:02 compute-0 python3[27454]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 20:32:02 compute-0 sudo[27452]: pam_unix(sudo:session): session closed for user root
Sep 30 20:32:02 compute-0 sudo[27525]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-neegqtidgwwqbztvpkgdmtrmeqpvbsol ; /usr/bin/python3'
Sep 30 20:32:02 compute-0 sudo[27525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:32:02 compute-0 python3[27527]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759264319.2399125-30949-153729302939461/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:32:02 compute-0 sudo[27525]: pam_unix(sudo:session): session closed for user root
Sep 30 20:32:02 compute-0 sudo[27551]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ficrasmvywjydurvdbskzxkwlnomargu ; /usr/bin/python3'
Sep 30 20:32:02 compute-0 sudo[27551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:32:02 compute-0 python3[27553]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 20:32:02 compute-0 sudo[27551]: pam_unix(sudo:session): session closed for user root
Sep 30 20:32:03 compute-0 sudo[27624]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-libeyamlcesiudnxkyznsviuwgnasyro ; /usr/bin/python3'
Sep 30 20:32:03 compute-0 sudo[27624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:32:03 compute-0 python3[27626]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759264319.2399125-30949-153729302939461/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:32:03 compute-0 sudo[27624]: pam_unix(sudo:session): session closed for user root
Sep 30 20:32:03 compute-0 sshd-session[26978]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 42691 ssh2 [preauth]
Sep 30 20:32:03 compute-0 sshd-session[26978]: Disconnecting authenticating user root 185.217.1.246 port 42691: Too many authentication failures [preauth]
Sep 30 20:32:03 compute-0 sudo[27650]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-peamxzsnpvyqbnskkodrvwuszttlhoij ; /usr/bin/python3'
Sep 30 20:32:03 compute-0 sudo[27650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:32:03 compute-0 python3[27652]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 20:32:03 compute-0 sudo[27650]: pam_unix(sudo:session): session closed for user root
Sep 30 20:32:03 compute-0 sudo[27724]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oitcbqqhsfaslvheblzomadftxisqjnu ; /usr/bin/python3'
Sep 30 20:32:03 compute-0 sudo[27724]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:32:03 compute-0 python3[27726]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759264319.2399125-30949-153729302939461/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=3193b2329e025492c2ae01f1388d5694c4facea6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:32:03 compute-0 sudo[27724]: pam_unix(sudo:session): session closed for user root
Sep 30 20:32:06 compute-0 sshd-session[27752]: Connection closed by 192.168.122.11 port 37952 [preauth]
Sep 30 20:32:06 compute-0 sshd-session[27753]: Connection closed by 192.168.122.11 port 37968 [preauth]
Sep 30 20:32:06 compute-0 sshd-session[27754]: Unable to negotiate with 192.168.122.11 port 37972: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Sep 30 20:32:06 compute-0 sshd-session[27756]: Unable to negotiate with 192.168.122.11 port 37974: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Sep 30 20:32:06 compute-0 sshd-session[27757]: Unable to negotiate with 192.168.122.11 port 37990: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Sep 30 20:32:08 compute-0 sshd-session[27673]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 13185 ssh2 [preauth]
Sep 30 20:32:08 compute-0 sshd-session[27673]: Disconnecting authenticating user root 185.217.1.246 port 13185: Too many authentication failures [preauth]
Sep 30 20:32:11 compute-0 sshd-session[27762]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 58228 ssh2 [preauth]
Sep 30 20:32:11 compute-0 sshd-session[27762]: Disconnecting authenticating user root 185.217.1.246 port 58228: Too many authentication failures [preauth]
Sep 30 20:32:12 compute-0 python3[27787]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:32:18 compute-0 sshd-session[27789]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 23288 ssh2 [preauth]
Sep 30 20:32:18 compute-0 sshd-session[27789]: Disconnecting authenticating user root 185.217.1.246 port 23288: Too many authentication failures [preauth]
Sep 30 20:32:24 compute-0 sshd-session[27791]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 11990 ssh2 [preauth]
Sep 30 20:32:24 compute-0 sshd-session[27791]: Disconnecting authenticating user root 185.217.1.246 port 11990: Too many authentication failures [preauth]
Sep 30 20:32:38 compute-0 sshd-session[27793]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 15457 ssh2 [preauth]
Sep 30 20:32:38 compute-0 sshd-session[27793]: Disconnecting authenticating user root 185.217.1.246 port 15457: Too many authentication failures [preauth]
Sep 30 20:32:43 compute-0 sshd-session[27795]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 58541 ssh2 [preauth]
Sep 30 20:32:43 compute-0 sshd-session[27795]: Disconnecting authenticating user root 185.217.1.246 port 58541: Too many authentication failures [preauth]
Sep 30 20:32:49 compute-0 sshd-session[27797]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 21362 ssh2 [preauth]
Sep 30 20:32:49 compute-0 sshd-session[27797]: Disconnecting authenticating user root 185.217.1.246 port 21362: Too many authentication failures [preauth]
Sep 30 20:33:01 compute-0 sshd-session[27799]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 36020 ssh2 [preauth]
Sep 30 20:33:01 compute-0 sshd-session[27799]: Disconnecting authenticating user root 185.217.1.246 port 36020: Too many authentication failures [preauth]
Sep 30 20:33:05 compute-0 sshd-session[27801]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 52007 ssh2 [preauth]
Sep 30 20:33:05 compute-0 sshd-session[27801]: Disconnecting authenticating user root 185.217.1.246 port 52007: Too many authentication failures [preauth]
Sep 30 20:33:11 compute-0 sshd-session[27803]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 13020 ssh2 [preauth]
Sep 30 20:33:11 compute-0 sshd-session[27803]: Disconnecting authenticating user root 185.217.1.246 port 13020: Too many authentication failures [preauth]
Sep 30 20:33:16 compute-0 sshd-session[27805]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 3384 ssh2 [preauth]
Sep 30 20:33:16 compute-0 sshd-session[27805]: Disconnecting authenticating user root 185.217.1.246 port 3384: Too many authentication failures [preauth]
Sep 30 20:33:24 compute-0 sshd-session[27807]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 56916 ssh2 [preauth]
Sep 30 20:33:24 compute-0 sshd-session[27807]: Disconnecting authenticating user root 185.217.1.246 port 56916: Too many authentication failures [preauth]
Sep 30 20:33:28 compute-0 sshd-session[27809]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 38136 ssh2 [preauth]
Sep 30 20:33:28 compute-0 sshd-session[27809]: Disconnecting authenticating user root 185.217.1.246 port 38136: Too many authentication failures [preauth]
Sep 30 20:33:33 compute-0 sshd-session[27811]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 12992 ssh2 [preauth]
Sep 30 20:33:33 compute-0 sshd-session[27811]: Disconnecting authenticating user root 185.217.1.246 port 12992: Too many authentication failures [preauth]
Sep 30 20:33:36 compute-0 sshd-session[27813]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 55182 ssh2 [preauth]
Sep 30 20:33:36 compute-0 sshd-session[27813]: Disconnecting authenticating user root 185.217.1.246 port 55182: Too many authentication failures [preauth]
Sep 30 20:33:43 compute-0 sshd-session[27815]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 12318 ssh2 [preauth]
Sep 30 20:33:43 compute-0 sshd-session[27815]: Disconnecting authenticating user root 185.217.1.246 port 12318: Too many authentication failures [preauth]
Sep 30 20:33:51 compute-0 sshd-session[27817]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 7987 ssh2 [preauth]
Sep 30 20:33:51 compute-0 sshd-session[27817]: Disconnecting authenticating user root 185.217.1.246 port 7987: Too many authentication failures [preauth]
Sep 30 20:33:56 compute-0 sshd-session[27819]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 20954 ssh2 [preauth]
Sep 30 20:33:56 compute-0 sshd-session[27819]: Disconnecting authenticating user root 185.217.1.246 port 20954: Too many authentication failures [preauth]
Sep 30 20:34:00 compute-0 sshd-session[27821]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 42490 ssh2 [preauth]
Sep 30 20:34:00 compute-0 sshd-session[27821]: Disconnecting authenticating user root 185.217.1.246 port 42490: Too many authentication failures [preauth]
Sep 30 20:34:06 compute-0 sshd-session[27823]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 19509 ssh2 [preauth]
Sep 30 20:34:06 compute-0 sshd-session[27823]: Disconnecting authenticating user root 185.217.1.246 port 19509: Too many authentication failures [preauth]
Sep 30 20:34:14 compute-0 sshd-session[27825]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 57263 ssh2 [preauth]
Sep 30 20:34:14 compute-0 sshd-session[27825]: Disconnecting authenticating user root 185.217.1.246 port 57263: Too many authentication failures [preauth]
Sep 30 20:34:25 compute-0 sshd-session[27827]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 26881 ssh2 [preauth]
Sep 30 20:34:25 compute-0 sshd-session[27827]: Disconnecting authenticating user root 185.217.1.246 port 26881: Too many authentication failures [preauth]
Sep 30 20:34:27 compute-0 sshd-session[27829]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 26070 ssh2 [preauth]
Sep 30 20:34:27 compute-0 sshd-session[27829]: Disconnecting authenticating user root 185.217.1.246 port 26070: Too many authentication failures [preauth]
Sep 30 20:34:35 compute-0 sshd-session[27831]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 45956 ssh2 [preauth]
Sep 30 20:34:35 compute-0 sshd-session[27831]: Disconnecting authenticating user root 185.217.1.246 port 45956: Too many authentication failures [preauth]
Sep 30 20:34:40 compute-0 sshd-session[27833]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 5207 ssh2 [preauth]
Sep 30 20:34:40 compute-0 sshd-session[27833]: Disconnecting authenticating user root 185.217.1.246 port 5207: Too many authentication failures [preauth]
Sep 30 20:34:44 compute-0 sshd-session[27835]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 24796 ssh2 [preauth]
Sep 30 20:34:44 compute-0 sshd-session[27835]: Disconnecting authenticating user root 185.217.1.246 port 24796: Too many authentication failures [preauth]
Sep 30 20:34:50 compute-0 sshd-session[27837]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 2875 ssh2 [preauth]
Sep 30 20:34:50 compute-0 sshd-session[27837]: Disconnecting authenticating user root 185.217.1.246 port 2875: Too many authentication failures [preauth]
Sep 30 20:35:01 compute-0 sshd-session[27839]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 54864 ssh2 [preauth]
Sep 30 20:35:01 compute-0 sshd-session[27839]: Disconnecting authenticating user root 185.217.1.246 port 54864: Too many authentication failures [preauth]
Sep 30 20:35:04 compute-0 sshd-session[27841]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 22688 ssh2 [preauth]
Sep 30 20:35:04 compute-0 sshd-session[27841]: Disconnecting authenticating user root 185.217.1.246 port 22688: Too many authentication failures [preauth]
Sep 30 20:35:06 compute-0 sshd-session[27843]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 44613 ssh2 [preauth]
Sep 30 20:35:06 compute-0 sshd-session[27843]: Disconnecting authenticating user root 185.217.1.246 port 44613: Too many authentication failures [preauth]
Sep 30 20:35:11 compute-0 sshd-session[27845]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 3228 ssh2 [preauth]
Sep 30 20:35:11 compute-0 sshd-session[27845]: Disconnecting authenticating user root 185.217.1.246 port 3228: Too many authentication failures [preauth]
Sep 30 20:35:19 compute-0 sshd-session[27848]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 53943 ssh2 [preauth]
Sep 30 20:35:19 compute-0 sshd-session[27848]: Disconnecting authenticating user root 185.217.1.246 port 53943: Too many authentication failures [preauth]
Sep 30 20:35:27 compute-0 sshd-session[27850]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 7733 ssh2 [preauth]
Sep 30 20:35:27 compute-0 sshd-session[27850]: Disconnecting authenticating user root 185.217.1.246 port 7733: Too many authentication failures [preauth]
Sep 30 20:35:35 compute-0 sshd-session[27852]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 47657 ssh2 [preauth]
Sep 30 20:35:35 compute-0 sshd-session[27852]: Disconnecting authenticating user root 185.217.1.246 port 47657: Too many authentication failures [preauth]
Sep 30 20:35:40 compute-0 sshd-session[27854]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 51828 ssh2 [preauth]
Sep 30 20:35:40 compute-0 sshd-session[27854]: Disconnecting authenticating user root 185.217.1.246 port 51828: Too many authentication failures [preauth]
Sep 30 20:35:46 compute-0 sshd-session[27856]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 31493 ssh2 [preauth]
Sep 30 20:35:46 compute-0 sshd-session[27856]: Disconnecting authenticating user root 185.217.1.246 port 31493: Too many authentication failures [preauth]
Sep 30 20:35:51 compute-0 sshd-session[27858]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 9726 ssh2 [preauth]
Sep 30 20:35:51 compute-0 sshd-session[27858]: Disconnecting authenticating user root 185.217.1.246 port 9726: Too many authentication failures [preauth]
Sep 30 20:35:59 compute-0 sshd-session[27860]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 54513 ssh2 [preauth]
Sep 30 20:35:59 compute-0 sshd-session[27860]: Disconnecting authenticating user root 185.217.1.246 port 54513: Too many authentication failures [preauth]
Sep 30 20:36:04 compute-0 sshd-session[27862]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 5180 ssh2 [preauth]
Sep 30 20:36:04 compute-0 sshd-session[27862]: Disconnecting authenticating user root 185.217.1.246 port 5180: Too many authentication failures [preauth]
Sep 30 20:36:15 compute-0 sshd-session[27864]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 38775 ssh2 [preauth]
Sep 30 20:36:15 compute-0 sshd-session[27864]: Disconnecting authenticating user root 185.217.1.246 port 38775: Too many authentication failures [preauth]
Sep 30 20:36:23 compute-0 sshd-session[27866]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 8697 ssh2 [preauth]
Sep 30 20:36:23 compute-0 sshd-session[27866]: Disconnecting authenticating user root 185.217.1.246 port 8697: Too many authentication failures [preauth]
Sep 30 20:36:34 compute-0 sshd-session[27868]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 15779 ssh2 [preauth]
Sep 30 20:36:34 compute-0 sshd-session[27868]: Disconnecting authenticating user root 185.217.1.246 port 15779: Too many authentication failures [preauth]
Sep 30 20:36:38 compute-0 sshd-session[27870]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 22382 ssh2 [preauth]
Sep 30 20:36:38 compute-0 sshd-session[27870]: Disconnecting authenticating user root 185.217.1.246 port 22382: Too many authentication failures [preauth]
Sep 30 20:36:44 compute-0 sshd-session[27872]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 62918 ssh2 [preauth]
Sep 30 20:36:44 compute-0 sshd-session[27872]: Disconnecting authenticating user root 185.217.1.246 port 62918: Too many authentication failures [preauth]
Sep 30 20:36:48 compute-0 sshd-session[27874]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 44411 ssh2 [preauth]
Sep 30 20:36:48 compute-0 sshd-session[27874]: Disconnecting authenticating user root 185.217.1.246 port 44411: Too many authentication failures [preauth]
Sep 30 20:36:51 compute-0 sshd-session[27876]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 9050 ssh2 [preauth]
Sep 30 20:36:51 compute-0 sshd-session[27876]: Disconnecting authenticating user root 185.217.1.246 port 9050: Too many authentication failures [preauth]
Sep 30 20:37:01 compute-0 sshd-session[27878]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 37927 ssh2 [preauth]
Sep 30 20:37:01 compute-0 sshd-session[27878]: Disconnecting authenticating user root 185.217.1.246 port 37927: Too many authentication failures [preauth]
Sep 30 20:37:07 compute-0 sshd-session[27881]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 56392 ssh2 [preauth]
Sep 30 20:37:07 compute-0 sshd-session[27881]: Disconnecting authenticating user root 185.217.1.246 port 56392: Too many authentication failures [preauth]
Sep 30 20:37:11 compute-0 sshd-session[26866]: Received disconnect from 38.102.83.65 port 46220:11: disconnected by user
Sep 30 20:37:11 compute-0 sshd-session[26866]: Disconnected from user zuul 38.102.83.65 port 46220
Sep 30 20:37:11 compute-0 sshd-session[26863]: pam_unix(sshd:session): session closed for user zuul
Sep 30 20:37:11 compute-0 systemd[1]: session-7.scope: Deactivated successfully.
Sep 30 20:37:11 compute-0 systemd[1]: session-7.scope: Consumed 4.979s CPU time.
Sep 30 20:37:11 compute-0 systemd-logind[792]: Session 7 logged out. Waiting for processes to exit.
Sep 30 20:37:11 compute-0 systemd-logind[792]: Removed session 7.
Sep 30 20:37:12 compute-0 sshd-session[27883]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 46247 ssh2 [preauth]
Sep 30 20:37:12 compute-0 sshd-session[27883]: Disconnecting authenticating user root 185.217.1.246 port 46247: Too many authentication failures [preauth]
Sep 30 20:37:17 compute-0 sshd-session[27885]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 25068 ssh2 [preauth]
Sep 30 20:37:17 compute-0 sshd-session[27885]: Disconnecting authenticating user root 185.217.1.246 port 25068: Too many authentication failures [preauth]
Sep 30 20:37:20 compute-0 sshd-session[27887]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 1355 ssh2 [preauth]
Sep 30 20:37:20 compute-0 sshd-session[27887]: Disconnecting authenticating user root 185.217.1.246 port 1355: Too many authentication failures [preauth]
Sep 30 20:37:30 compute-0 sshd-session[27889]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 15605 ssh2 [preauth]
Sep 30 20:37:30 compute-0 sshd-session[27889]: Disconnecting authenticating user root 185.217.1.246 port 15605: Too many authentication failures [preauth]
Sep 30 20:37:41 compute-0 sshd-session[27891]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 5832 ssh2 [preauth]
Sep 30 20:37:41 compute-0 sshd-session[27891]: Disconnecting authenticating user root 185.217.1.246 port 5832: Too many authentication failures [preauth]
Sep 30 20:37:46 compute-0 sshd-session[27893]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 64983 ssh2 [preauth]
Sep 30 20:37:46 compute-0 sshd-session[27893]: Disconnecting authenticating user root 185.217.1.246 port 64983: Too many authentication failures [preauth]
Sep 30 20:37:54 compute-0 sshd-session[27895]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 48721 ssh2 [preauth]
Sep 30 20:37:54 compute-0 sshd-session[27895]: Disconnecting authenticating user root 185.217.1.246 port 48721: Too many authentication failures [preauth]
Sep 30 20:38:02 compute-0 sshd-session[27897]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 45777 ssh2 [preauth]
Sep 30 20:38:02 compute-0 sshd-session[27897]: Disconnecting authenticating user root 185.217.1.246 port 45777: Too many authentication failures [preauth]
Sep 30 20:38:07 compute-0 sshd-session[27899]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 1132 ssh2 [preauth]
Sep 30 20:38:07 compute-0 sshd-session[27899]: Disconnecting authenticating user root 185.217.1.246 port 1132: Too many authentication failures [preauth]
Sep 30 20:38:15 compute-0 sshd-session[27901]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 21822 ssh2 [preauth]
Sep 30 20:38:15 compute-0 sshd-session[27901]: Disconnecting authenticating user root 185.217.1.246 port 21822: Too many authentication failures [preauth]
Sep 30 20:38:20 compute-0 sshd-session[27903]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 29847 ssh2 [preauth]
Sep 30 20:38:20 compute-0 sshd-session[27903]: Disconnecting authenticating user root 185.217.1.246 port 29847: Too many authentication failures [preauth]
Sep 30 20:38:23 compute-0 sshd-session[27905]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 61755 ssh2 [preauth]
Sep 30 20:38:23 compute-0 sshd-session[27905]: Disconnecting authenticating user root 185.217.1.246 port 61755: Too many authentication failures [preauth]
Sep 30 20:38:28 compute-0 sshd-session[27907]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 37019 ssh2 [preauth]
Sep 30 20:38:28 compute-0 sshd-session[27907]: Disconnecting authenticating user root 185.217.1.246 port 37019: Too many authentication failures [preauth]
Sep 30 20:38:32 compute-0 sshd-session[27909]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 12595 ssh2 [preauth]
Sep 30 20:38:32 compute-0 sshd-session[27909]: Disconnecting authenticating user root 185.217.1.246 port 12595: Too many authentication failures [preauth]
Sep 30 20:38:36 compute-0 sshd-session[27911]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 32362 ssh2 [preauth]
Sep 30 20:38:36 compute-0 sshd-session[27911]: Disconnecting authenticating user root 185.217.1.246 port 32362: Too many authentication failures [preauth]
Sep 30 20:38:43 compute-0 sshd-session[27913]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 2465 ssh2 [preauth]
Sep 30 20:38:43 compute-0 sshd-session[27913]: Disconnecting authenticating user root 185.217.1.246 port 2465: Too many authentication failures [preauth]
Sep 30 20:38:46 compute-0 sshd-session[27915]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 51776 ssh2 [preauth]
Sep 30 20:38:46 compute-0 sshd-session[27915]: Disconnecting authenticating user root 185.217.1.246 port 51776: Too many authentication failures [preauth]
Sep 30 20:38:55 compute-0 sshd-session[27917]: error: maximum authentication attempts exceeded for root from 185.217.1.246 port 26284 ssh2 [preauth]
Sep 30 20:38:55 compute-0 sshd-session[27917]: Disconnecting authenticating user root 185.217.1.246 port 26284: Too many authentication failures [preauth]
Sep 30 20:38:59 compute-0 sshd-session[27919]: Disconnecting authenticating user root 185.217.1.246 port 38438: Change of username or service not allowed: (root,ssh-connection) -> (polkadot,ssh-connection) [preauth]
Sep 30 20:39:03 compute-0 sshd-session[27921]: Invalid user polkadot from 185.217.1.246 port 13382
Sep 30 20:39:04 compute-0 sshd-session[27921]: Disconnecting invalid user polkadot 185.217.1.246 port 13382: Change of username or service not allowed: (polkadot,ssh-connection) -> (partimag,ssh-connection) [preauth]
Sep 30 20:39:11 compute-0 sshd-session[27923]: Invalid user partimag from 185.217.1.246 port 63888
Sep 30 20:39:11 compute-0 sshd-session[27923]: Disconnecting invalid user partimag 185.217.1.246 port 63888: Change of username or service not allowed: (partimag,ssh-connection) -> (stake,ssh-connection) [preauth]
Sep 30 20:39:16 compute-0 sshd-session[27925]: Invalid user stake from 185.217.1.246 port 42352
Sep 30 20:39:17 compute-0 sshd-session[27925]: Disconnecting invalid user stake 185.217.1.246 port 42352: Change of username or service not allowed: (stake,ssh-connection) -> (test,ssh-connection) [preauth]
Sep 30 20:39:20 compute-0 sshd-session[27927]: Invalid user test from 185.217.1.246 port 29157
Sep 30 20:39:21 compute-0 sshd-session[27927]: Disconnecting invalid user test 185.217.1.246 port 29157: Change of username or service not allowed: (test,ssh-connection) -> (dot,ssh-connection) [preauth]
Sep 30 20:39:24 compute-0 sshd-session[27929]: Invalid user dot from 185.217.1.246 port 60487
Sep 30 20:39:24 compute-0 sshd-session[27929]: Disconnecting invalid user dot 185.217.1.246 port 60487: Change of username or service not allowed: (dot,ssh-connection) -> (xrp,ssh-connection) [preauth]
Sep 30 20:39:28 compute-0 sshd-session[27931]: Invalid user xrp from 185.217.1.246 port 40292
Sep 30 20:39:29 compute-0 sshd-session[27931]: Disconnecting invalid user xrp 185.217.1.246 port 40292: Change of username or service not allowed: (xrp,ssh-connection) -> (btc,ssh-connection) [preauth]
Sep 30 20:39:32 compute-0 sshd-session[27933]: Invalid user btc from 185.217.1.246 port 61206
Sep 30 20:39:32 compute-0 sshd-session[27933]: Disconnecting invalid user btc 185.217.1.246 port 61206: Change of username or service not allowed: (btc,ssh-connection) -> (riscv,ssh-connection) [preauth]
Sep 30 20:39:37 compute-0 sshd-session[27935]: Invalid user riscv from 185.217.1.246 port 40427
Sep 30 20:39:38 compute-0 sshd-session[27935]: Disconnecting invalid user riscv 185.217.1.246 port 40427: Change of username or service not allowed: (riscv,ssh-connection) -> (user,ssh-connection) [preauth]
Sep 30 20:39:40 compute-0 sshd-session[27937]: Invalid user user from 185.217.1.246 port 11408
Sep 30 20:39:42 compute-0 sshd-session[27937]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 11408 ssh2 [preauth]
Sep 30 20:39:42 compute-0 sshd-session[27937]: Disconnecting invalid user user 185.217.1.246 port 11408: Too many authentication failures [preauth]
Sep 30 20:39:44 compute-0 sshd-session[27939]: Invalid user user from 185.217.1.246 port 37449
Sep 30 20:39:46 compute-0 sshd-session[27939]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 37449 ssh2 [preauth]
Sep 30 20:39:46 compute-0 sshd-session[27939]: Disconnecting invalid user user 185.217.1.246 port 37449: Too many authentication failures [preauth]
Sep 30 20:39:49 compute-0 sshd-session[27941]: Invalid user user from 185.217.1.246 port 15544
Sep 30 20:39:53 compute-0 sshd-session[27941]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 15544 ssh2 [preauth]
Sep 30 20:39:53 compute-0 sshd-session[27941]: Disconnecting invalid user user 185.217.1.246 port 15544: Too many authentication failures [preauth]
Sep 30 20:39:55 compute-0 sshd-session[27943]: Invalid user user from 185.217.1.246 port 7304
Sep 30 20:39:57 compute-0 sshd-session[27943]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 7304 ssh2 [preauth]
Sep 30 20:39:57 compute-0 sshd-session[27943]: Disconnecting invalid user user 185.217.1.246 port 7304: Too many authentication failures [preauth]
Sep 30 20:40:01 compute-0 sshd-session[27945]: Invalid user user from 185.217.1.246 port 46787
Sep 30 20:40:02 compute-0 sshd-session[27945]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 46787 ssh2 [preauth]
Sep 30 20:40:02 compute-0 sshd-session[27945]: Disconnecting invalid user user 185.217.1.246 port 46787: Too many authentication failures [preauth]
Sep 30 20:40:05 compute-0 sshd-session[27948]: Invalid user user from 185.217.1.246 port 22492
Sep 30 20:40:08 compute-0 sshd-session[27948]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 22492 ssh2 [preauth]
Sep 30 20:40:08 compute-0 sshd-session[27948]: Disconnecting invalid user user 185.217.1.246 port 22492: Too many authentication failures [preauth]
Sep 30 20:40:12 compute-0 sshd-session[27951]: Invalid user user from 185.217.1.246 port 10351
Sep 30 20:40:17 compute-0 sshd-session[27951]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 10351 ssh2 [preauth]
Sep 30 20:40:17 compute-0 sshd-session[27951]: Disconnecting invalid user user 185.217.1.246 port 10351: Too many authentication failures [preauth]
Sep 30 20:40:20 compute-0 sshd-session[27953]: Invalid user user from 185.217.1.246 port 34603
Sep 30 20:40:22 compute-0 sshd-session[27953]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 34603 ssh2 [preauth]
Sep 30 20:40:22 compute-0 sshd-session[27953]: Disconnecting invalid user user 185.217.1.246 port 34603: Too many authentication failures [preauth]
Sep 30 20:40:27 compute-0 sshd-session[27955]: Invalid user user from 185.217.1.246 port 11488
Sep 30 20:40:30 compute-0 sshd-session[27955]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 11488 ssh2 [preauth]
Sep 30 20:40:30 compute-0 sshd-session[27955]: Disconnecting invalid user user 185.217.1.246 port 11488: Too many authentication failures [preauth]
Sep 30 20:40:34 compute-0 sshd-session[27957]: Invalid user user from 185.217.1.246 port 17633
Sep 30 20:40:36 compute-0 sshd-session[27957]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 17633 ssh2 [preauth]
Sep 30 20:40:36 compute-0 sshd-session[27957]: Disconnecting invalid user user 185.217.1.246 port 17633: Too many authentication failures [preauth]
Sep 30 20:40:40 compute-0 sshd-session[27960]: Invalid user Sujan from 185.156.73.233 port 39816
Sep 30 20:40:40 compute-0 sshd-session[27959]: Invalid user user from 185.217.1.246 port 57646
Sep 30 20:40:40 compute-0 sshd-session[27960]: Connection closed by invalid user Sujan 185.156.73.233 port 39816 [preauth]
Sep 30 20:40:43 compute-0 sshd-session[27959]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 57646 ssh2 [preauth]
Sep 30 20:40:43 compute-0 sshd-session[27959]: Disconnecting invalid user user 185.217.1.246 port 57646: Too many authentication failures [preauth]
Sep 30 20:40:46 compute-0 sshd-session[27963]: Invalid user user from 185.217.1.246 port 42607
Sep 30 20:40:50 compute-0 sshd-session[27963]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 42607 ssh2 [preauth]
Sep 30 20:40:50 compute-0 sshd-session[27963]: Disconnecting invalid user user 185.217.1.246 port 42607: Too many authentication failures [preauth]
Sep 30 20:40:55 compute-0 sshd-session[27965]: Invalid user user from 185.217.1.246 port 48040
Sep 30 20:40:59 compute-0 sshd-session[27965]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 48040 ssh2 [preauth]
Sep 30 20:40:59 compute-0 sshd-session[27965]: Disconnecting invalid user user 185.217.1.246 port 48040: Too many authentication failures [preauth]
Sep 30 20:41:01 compute-0 sshd-session[27967]: Invalid user user from 185.217.1.246 port 50994
Sep 30 20:41:04 compute-0 sshd-session[27967]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 50994 ssh2 [preauth]
Sep 30 20:41:04 compute-0 sshd-session[27967]: Disconnecting invalid user user 185.217.1.246 port 50994: Too many authentication failures [preauth]
Sep 30 20:41:08 compute-0 sshd-session[27969]: Invalid user user from 185.217.1.246 port 28109
Sep 30 20:41:09 compute-0 sshd-session[27969]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 28109 ssh2 [preauth]
Sep 30 20:41:09 compute-0 sshd-session[27969]: Disconnecting invalid user user 185.217.1.246 port 28109: Too many authentication failures [preauth]
Sep 30 20:41:11 compute-0 sshd-session[27971]: Invalid user user from 185.217.1.246 port 7788
Sep 30 20:41:16 compute-0 sshd-session[27971]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 7788 ssh2 [preauth]
Sep 30 20:41:16 compute-0 sshd-session[27971]: Disconnecting invalid user user 185.217.1.246 port 7788: Too many authentication failures [preauth]
Sep 30 20:41:17 compute-0 sshd-session[27973]: Invalid user user from 185.217.1.246 port 2828
Sep 30 20:41:18 compute-0 sshd-session[27975]: banner exchange: Connection from 91.238.181.93 port 65233: invalid format
Sep 30 20:41:19 compute-0 sshd-session[27973]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 2828 ssh2 [preauth]
Sep 30 20:41:19 compute-0 sshd-session[27973]: Disconnecting invalid user user 185.217.1.246 port 2828: Too many authentication failures [preauth]
Sep 30 20:41:21 compute-0 sshd-session[27976]: Invalid user user from 185.217.1.246 port 27387
Sep 30 20:41:23 compute-0 sshd-session[27976]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 27387 ssh2 [preauth]
Sep 30 20:41:23 compute-0 sshd-session[27976]: Disconnecting invalid user user 185.217.1.246 port 27387: Too many authentication failures [preauth]
Sep 30 20:41:26 compute-0 sshd-session[27978]: Invalid user user from 185.217.1.246 port 9031
Sep 30 20:41:30 compute-0 sshd-session[27978]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 9031 ssh2 [preauth]
Sep 30 20:41:30 compute-0 sshd-session[27978]: Disconnecting invalid user user 185.217.1.246 port 9031: Too many authentication failures [preauth]
Sep 30 20:41:34 compute-0 sshd-session[27980]: Invalid user user from 185.217.1.246 port 8550
Sep 30 20:41:35 compute-0 sshd-session[27980]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 8550 ssh2 [preauth]
Sep 30 20:41:35 compute-0 sshd-session[27980]: Disconnecting invalid user user 185.217.1.246 port 8550: Too many authentication failures [preauth]
Sep 30 20:41:38 compute-0 sshd-session[27982]: Invalid user user from 185.217.1.246 port 39151
Sep 30 20:41:43 compute-0 sshd-session[27982]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 39151 ssh2 [preauth]
Sep 30 20:41:43 compute-0 sshd-session[27982]: Disconnecting invalid user user 185.217.1.246 port 39151: Too many authentication failures [preauth]
Sep 30 20:41:46 compute-0 sshd-session[27984]: Invalid user user from 185.217.1.246 port 50215
Sep 30 20:41:47 compute-0 sshd-session[27984]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 50215 ssh2 [preauth]
Sep 30 20:41:47 compute-0 sshd-session[27984]: Disconnecting invalid user user 185.217.1.246 port 50215: Too many authentication failures [preauth]
Sep 30 20:41:51 compute-0 sshd-session[27986]: Invalid user user from 185.217.1.246 port 30698
Sep 30 20:41:55 compute-0 sshd-session[27986]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 30698 ssh2 [preauth]
Sep 30 20:41:55 compute-0 sshd-session[27986]: Disconnecting invalid user user 185.217.1.246 port 30698: Too many authentication failures [preauth]
Sep 30 20:41:59 compute-0 sshd-session[27988]: Invalid user user from 185.217.1.246 port 30313
Sep 30 20:42:01 compute-0 sshd-session[27988]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 30313 ssh2 [preauth]
Sep 30 20:42:01 compute-0 sshd-session[27988]: Disconnecting invalid user user 185.217.1.246 port 30313: Too many authentication failures [preauth]
Sep 30 20:42:03 compute-0 sshd-session[27990]: Invalid user user from 185.217.1.246 port 10356
Sep 30 20:42:07 compute-0 sshd-session[27990]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 10356 ssh2 [preauth]
Sep 30 20:42:07 compute-0 sshd-session[27990]: Disconnecting invalid user user 185.217.1.246 port 10356: Too many authentication failures [preauth]
Sep 30 20:42:10 compute-0 sshd-session[27992]: Invalid user user from 185.217.1.246 port 54737
Sep 30 20:42:11 compute-0 sshd-session[27992]: error: maximum authentication attempts exceeded for invalid user user from 185.217.1.246 port 54737 ssh2 [preauth]
Sep 30 20:42:11 compute-0 sshd-session[27992]: Disconnecting invalid user user 185.217.1.246 port 54737: Too many authentication failures [preauth]
Sep 30 20:42:13 compute-0 sshd-session[27994]: Invalid user user from 185.217.1.246 port 28459
Sep 30 20:42:13 compute-0 sshd-session[27994]: Disconnecting invalid user user 185.217.1.246 port 28459: Change of username or service not allowed: (user,ssh-connection) -> (gwei,ssh-connection) [preauth]
Sep 30 20:42:15 compute-0 sshd-session[27996]: Invalid user gwei from 185.217.1.246 port 36272
Sep 30 20:42:15 compute-0 sshd-session[27996]: Connection closed by invalid user gwei 185.217.1.246 port 36272 [preauth]
Sep 30 20:45:43 compute-0 sshd-session[27999]: Accepted publickey for zuul from 192.168.122.30 port 51126 ssh2: ECDSA SHA256:SmCicXXyU0CyMnob1MNtb+B3Td3Ord5lbeuM/VGGA5o
Sep 30 20:45:43 compute-0 systemd-logind[792]: New session 8 of user zuul.
Sep 30 20:45:43 compute-0 systemd[1]: Started Session 8 of User zuul.
Sep 30 20:45:43 compute-0 sshd-session[27999]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 20:45:44 compute-0 python3.9[28152]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:45:46 compute-0 sudo[28331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fclmuscfztycxxzleojxudgdxrquohmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265145.6393232-61-106663061488856/AnsiballZ_command.py'
Sep 30 20:45:46 compute-0 sudo[28331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:45:46 compute-0 python3.9[28333]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:45:53 compute-0 sudo[28331]: pam_unix(sudo:session): session closed for user root
Sep 30 20:45:53 compute-0 sshd-session[28002]: Connection closed by 192.168.122.30 port 51126
Sep 30 20:45:53 compute-0 sshd-session[27999]: pam_unix(sshd:session): session closed for user zuul
Sep 30 20:45:53 compute-0 systemd[1]: session-8.scope: Deactivated successfully.
Sep 30 20:45:53 compute-0 systemd[1]: session-8.scope: Consumed 8.061s CPU time.
Sep 30 20:45:53 compute-0 systemd-logind[792]: Session 8 logged out. Waiting for processes to exit.
Sep 30 20:45:53 compute-0 systemd-logind[792]: Removed session 8.
Sep 30 20:46:10 compute-0 sshd-session[28390]: Accepted publickey for zuul from 192.168.122.30 port 47920 ssh2: ECDSA SHA256:SmCicXXyU0CyMnob1MNtb+B3Td3Ord5lbeuM/VGGA5o
Sep 30 20:46:10 compute-0 systemd-logind[792]: New session 9 of user zuul.
Sep 30 20:46:10 compute-0 systemd[1]: Started Session 9 of User zuul.
Sep 30 20:46:10 compute-0 sshd-session[28390]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 20:46:11 compute-0 python3.9[28543]: ansible-ansible.legacy.ping Invoked with data=pong
Sep 30 20:46:12 compute-0 python3.9[28717]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:46:13 compute-0 sudo[28867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvzdbnxpucgswiexqufifyezktykpryp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265172.7796943-98-242728830296251/AnsiballZ_command.py'
Sep 30 20:46:13 compute-0 sudo[28867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:46:13 compute-0 python3.9[28869]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:46:13 compute-0 sudo[28867]: pam_unix(sudo:session): session closed for user root
Sep 30 20:46:14 compute-0 sudo[29020]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zlqinhojpboydtrqzgjffwckntgqjvla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265173.8968291-134-101961261947473/AnsiballZ_stat.py'
Sep 30 20:46:14 compute-0 sudo[29020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:46:14 compute-0 python3.9[29022]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 20:46:14 compute-0 sudo[29020]: pam_unix(sudo:session): session closed for user root
Sep 30 20:46:15 compute-0 sudo[29172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nyjkxzxgvjokcogjzpbbjovnazditsmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265174.8357613-158-108850985818903/AnsiballZ_file.py'
Sep 30 20:46:15 compute-0 sudo[29172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:46:15 compute-0 python3.9[29174]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:46:15 compute-0 sudo[29172]: pam_unix(sudo:session): session closed for user root
Sep 30 20:46:16 compute-0 sudo[29324]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aovhsabxgfjnxwcmmfbljfvjfggaific ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265175.8067307-182-38705517729097/AnsiballZ_stat.py'
Sep 30 20:46:16 compute-0 sudo[29324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:46:16 compute-0 python3.9[29326]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:46:16 compute-0 sudo[29324]: pam_unix(sudo:session): session closed for user root
Sep 30 20:46:16 compute-0 sudo[29447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isolzkfumafldlszbzcibwfrkmomctvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265175.8067307-182-38705517729097/AnsiballZ_copy.py'
Sep 30 20:46:16 compute-0 sudo[29447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:46:17 compute-0 python3.9[29449]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1759265175.8067307-182-38705517729097/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:46:17 compute-0 sudo[29447]: pam_unix(sudo:session): session closed for user root
Sep 30 20:46:17 compute-0 sudo[29599]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxqeafyjsndaqdzasdqhouuybnccmstn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265177.4602258-227-264702771532435/AnsiballZ_setup.py'
Sep 30 20:46:17 compute-0 sudo[29599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:46:18 compute-0 python3.9[29601]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:46:18 compute-0 sudo[29599]: pam_unix(sudo:session): session closed for user root
Sep 30 20:46:18 compute-0 sudo[29755]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sawwwcnjkutzlrfszanyvdhutxblspry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265178.6199174-251-7707731607800/AnsiballZ_file.py'
Sep 30 20:46:18 compute-0 sudo[29755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:46:19 compute-0 python3.9[29757]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:46:19 compute-0 sudo[29755]: pam_unix(sudo:session): session closed for user root
Sep 30 20:46:20 compute-0 python3.9[29907]: ansible-ansible.builtin.service_facts Invoked
Sep 30 20:46:27 compute-0 python3.9[30162]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:46:28 compute-0 python3.9[30312]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:46:30 compute-0 python3.9[30466]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:46:31 compute-0 sudo[30622]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jidfbqvmbdednlsuuctmwgjdscaokxcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265190.6552794-395-245248816412877/AnsiballZ_setup.py'
Sep 30 20:46:31 compute-0 sudo[30622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:46:31 compute-0 python3.9[30624]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 20:46:31 compute-0 sudo[30622]: pam_unix(sudo:session): session closed for user root
Sep 30 20:46:32 compute-0 sudo[30706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzjwawkorviiptbjnynajdvzezblqbsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265190.6552794-395-245248816412877/AnsiballZ_dnf.py'
Sep 30 20:46:32 compute-0 sudo[30706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:46:32 compute-0 python3.9[30708]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 20:47:07 compute-0 sshd-session[30900]: Invalid user seekcy from 49.64.169.153 port 49625
Sep 30 20:47:07 compute-0 sshd-session[30900]: Received disconnect from 49.64.169.153 port 49625:11: Bye Bye [preauth]
Sep 30 20:47:07 compute-0 sshd-session[30900]: Disconnected from invalid user seekcy 49.64.169.153 port 49625 [preauth]
Sep 30 20:47:26 compute-0 systemd[1]: Reloading.
Sep 30 20:47:26 compute-0 systemd-rc-local-generator[30976]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:47:27 compute-0 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Sep 30 20:47:27 compute-0 systemd[1]: Reloading.
Sep 30 20:47:27 compute-0 systemd-rc-local-generator[31011]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:47:27 compute-0 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Sep 30 20:47:27 compute-0 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Sep 30 20:47:27 compute-0 systemd[1]: Reloading.
Sep 30 20:47:27 compute-0 systemd-rc-local-generator[31060]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:47:27 compute-0 systemd[1]: Listening on LVM2 poll daemon socket.
Sep 30 20:47:28 compute-0 dbus-broker-launch[760]: Noticed file-system modification, trigger reload.
Sep 30 20:47:28 compute-0 dbus-broker-launch[760]: Noticed file-system modification, trigger reload.
Sep 30 20:48:35 compute-0 kernel: SELinux:  Converting 2714 SID table entries...
Sep 30 20:48:35 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 20:48:35 compute-0 kernel: SELinux:  policy capability open_perms=1
Sep 30 20:48:35 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 20:48:35 compute-0 kernel: SELinux:  policy capability always_check_network=0
Sep 30 20:48:35 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 20:48:35 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 20:48:35 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 20:48:36 compute-0 dbus-broker-launch[778]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Sep 30 20:48:36 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Sep 30 20:48:36 compute-0 systemd[1]: Starting man-db-cache-update.service...
Sep 30 20:48:36 compute-0 systemd[1]: Reloading.
Sep 30 20:48:36 compute-0 systemd-rc-local-generator[31377]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:48:36 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Sep 30 20:48:37 compute-0 systemd[1]: Starting PackageKit Daemon...
Sep 30 20:48:37 compute-0 PackageKit[31761]: daemon start
Sep 30 20:48:37 compute-0 systemd[1]: Started PackageKit Daemon.
Sep 30 20:48:38 compute-0 sudo[30706]: pam_unix(sudo:session): session closed for user root
Sep 30 20:48:38 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Sep 30 20:48:38 compute-0 systemd[1]: Finished man-db-cache-update.service.
Sep 30 20:48:38 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.476s CPU time.
Sep 30 20:48:38 compute-0 systemd[1]: run-r357693588a9f4f78b81c41faf4e56b75.service: Deactivated successfully.
Sep 30 20:49:01 compute-0 irqbalance[784]: Cannot change IRQ 26 affinity: Operation not permitted
Sep 30 20:49:01 compute-0 irqbalance[784]: IRQ 26 affinity is now unmanaged
Sep 30 20:49:04 compute-0 sshd-session[32169]: Invalid user  from 66.240.192.82 port 59753
Sep 30 20:49:04 compute-0 sshd-session[32169]: Connection closed by invalid user  66.240.192.82 port 59753 [preauth]
Sep 30 20:49:09 compute-0 sudo[32296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qutoeekdgxlhvshrloqnwutkrvdrefrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265348.8601186-431-233507489016758/AnsiballZ_command.py'
Sep 30 20:49:09 compute-0 sudo[32296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:49:09 compute-0 python3.9[32298]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:49:10 compute-0 sudo[32296]: pam_unix(sudo:session): session closed for user root
Sep 30 20:49:11 compute-0 sudo[32577]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjyvqmmaqgnfgnoklswyfxtroiwatsan ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265350.8710246-455-117318673486536/AnsiballZ_selinux.py'
Sep 30 20:49:11 compute-0 sudo[32577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:49:11 compute-0 python3.9[32579]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Sep 30 20:49:11 compute-0 sudo[32577]: pam_unix(sudo:session): session closed for user root
Sep 30 20:49:12 compute-0 sudo[32729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddktxnjipcolovcqtaaqkmvwwllnzmki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265352.4621677-488-232601370415759/AnsiballZ_command.py'
Sep 30 20:49:12 compute-0 sudo[32729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:49:13 compute-0 python3.9[32731]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Sep 30 20:49:14 compute-0 sudo[32729]: pam_unix(sudo:session): session closed for user root
Sep 30 20:49:14 compute-0 sudo[32882]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eofxjhsmdxptfmmnrfxyaudsfxkiphzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265354.4050689-512-32923606108461/AnsiballZ_file.py'
Sep 30 20:49:14 compute-0 sudo[32882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:49:15 compute-0 python3.9[32884]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:49:15 compute-0 sudo[32882]: pam_unix(sudo:session): session closed for user root
Sep 30 20:49:18 compute-0 sudo[33034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqrtthnxzfcydvkriawzccqfszwypotu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265358.1881862-536-239111967559035/AnsiballZ_mount.py'
Sep 30 20:49:18 compute-0 sudo[33034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:49:18 compute-0 python3.9[33036]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Sep 30 20:49:18 compute-0 sudo[33034]: pam_unix(sudo:session): session closed for user root
Sep 30 20:49:20 compute-0 sudo[33186]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yolrekntciuuxilvgtgdgzrbhgoffmvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265359.9484966-620-222594067147269/AnsiballZ_file.py'
Sep 30 20:49:20 compute-0 sudo[33186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:49:22 compute-0 python3.9[33188]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:49:22 compute-0 sudo[33186]: pam_unix(sudo:session): session closed for user root
Sep 30 20:49:23 compute-0 sudo[33338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqixacxmnqgnwyfsadfyvayaovyrxcgk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265362.918189-644-109105723853046/AnsiballZ_stat.py'
Sep 30 20:49:23 compute-0 sudo[33338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:49:23 compute-0 python3.9[33340]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:49:23 compute-0 sudo[33338]: pam_unix(sudo:session): session closed for user root
Sep 30 20:49:23 compute-0 sudo[33461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evcywfurdbexkdwhpznobavzyvbzzfkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265362.918189-644-109105723853046/AnsiballZ_copy.py'
Sep 30 20:49:23 compute-0 sudo[33461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:49:24 compute-0 python3.9[33463]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265362.918189-644-109105723853046/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=686f1a4d8f59010e2c99342c60da63269ac3f94e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:49:24 compute-0 sudo[33461]: pam_unix(sudo:session): session closed for user root
Sep 30 20:49:25 compute-0 sudo[33613]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vimclzxemxdnbnhdptrpvxqrsjteoxuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265365.1488976-725-169000288581318/AnsiballZ_getent.py'
Sep 30 20:49:25 compute-0 sudo[33613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:49:25 compute-0 python3.9[33615]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Sep 30 20:49:25 compute-0 sudo[33613]: pam_unix(sudo:session): session closed for user root
Sep 30 20:49:25 compute-0 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 20:49:26 compute-0 sudo[33767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-valxttutkzsgzzifdoiblkbezuhzcrqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265366.1210818-749-80001520572409/AnsiballZ_group.py'
Sep 30 20:49:26 compute-0 sudo[33767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:49:26 compute-0 python3.9[33769]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Sep 30 20:49:27 compute-0 groupadd[33770]: group added to /etc/group: name=qemu, GID=107
Sep 30 20:49:27 compute-0 groupadd[33770]: group added to /etc/gshadow: name=qemu
Sep 30 20:49:27 compute-0 groupadd[33770]: new group: name=qemu, GID=107
Sep 30 20:49:27 compute-0 sudo[33767]: pam_unix(sudo:session): session closed for user root
Sep 30 20:49:28 compute-0 sudo[33925]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slurcnhurmppklwksqfkieahvkxdizsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265367.607779-773-68155291732991/AnsiballZ_user.py'
Sep 30 20:49:28 compute-0 sudo[33925]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:49:28 compute-0 python3.9[33927]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Sep 30 20:49:28 compute-0 useradd[33929]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Sep 30 20:49:28 compute-0 sudo[33925]: pam_unix(sudo:session): session closed for user root
Sep 30 20:49:29 compute-0 sudo[34085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzptsbeaqmsxxizsqfvrhckwfgvsypdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265368.7301152-797-57693815432957/AnsiballZ_getent.py'
Sep 30 20:49:29 compute-0 sudo[34085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:49:29 compute-0 python3.9[34087]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Sep 30 20:49:29 compute-0 sudo[34085]: pam_unix(sudo:session): session closed for user root
Sep 30 20:49:29 compute-0 sudo[34238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhayitiuupiavcxgmomiqplbwgphfcwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265369.565652-821-229370212465608/AnsiballZ_group.py'
Sep 30 20:49:29 compute-0 sudo[34238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:49:30 compute-0 python3.9[34240]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Sep 30 20:49:30 compute-0 groupadd[34241]: group added to /etc/group: name=hugetlbfs, GID=42477
Sep 30 20:49:30 compute-0 groupadd[34241]: group added to /etc/gshadow: name=hugetlbfs
Sep 30 20:49:30 compute-0 groupadd[34241]: new group: name=hugetlbfs, GID=42477
Sep 30 20:49:30 compute-0 sudo[34238]: pam_unix(sudo:session): session closed for user root
Sep 30 20:49:30 compute-0 sudo[34396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckbbxacbldvmtgnjrjmgakeoluqltpsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265370.4417584-848-9384906855211/AnsiballZ_file.py'
Sep 30 20:49:30 compute-0 sudo[34396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:49:30 compute-0 python3.9[34398]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Sep 30 20:49:30 compute-0 sudo[34396]: pam_unix(sudo:session): session closed for user root
Sep 30 20:49:31 compute-0 sudo[34548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbweqyupiamcwpimekzuvzdqqzzzfltj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265371.5072472-881-279273059544165/AnsiballZ_dnf.py'
Sep 30 20:49:31 compute-0 sudo[34548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:49:32 compute-0 python3.9[34550]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 20:49:36 compute-0 sudo[34548]: pam_unix(sudo:session): session closed for user root
Sep 30 20:49:37 compute-0 sudo[34701]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmskmkmoaggdoyrnxvjqwxvwwubpzpvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265377.025574-905-163149378194432/AnsiballZ_file.py'
Sep 30 20:49:37 compute-0 sudo[34701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:49:37 compute-0 python3.9[34703]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:49:37 compute-0 sudo[34701]: pam_unix(sudo:session): session closed for user root
Sep 30 20:49:38 compute-0 sudo[34853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clxnowvcafzyglomwswiycoadrxtjzvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265378.0348773-929-278282107449133/AnsiballZ_stat.py'
Sep 30 20:49:38 compute-0 sudo[34853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:49:38 compute-0 python3.9[34855]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:49:38 compute-0 sudo[34853]: pam_unix(sudo:session): session closed for user root
Sep 30 20:49:39 compute-0 sudo[34976]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhyikemofegpywzorxnmkcbprawjydqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265378.0348773-929-278282107449133/AnsiballZ_copy.py'
Sep 30 20:49:39 compute-0 sudo[34976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:49:39 compute-0 python3.9[34978]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759265378.0348773-929-278282107449133/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:49:39 compute-0 sudo[34976]: pam_unix(sudo:session): session closed for user root
Sep 30 20:49:40 compute-0 sudo[35128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqmifdhgpwrhfkuiabnedxqyakwvvisd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265379.5231216-974-181805710728054/AnsiballZ_systemd.py'
Sep 30 20:49:40 compute-0 sudo[35128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:49:40 compute-0 python3.9[35130]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 20:49:40 compute-0 systemd[1]: Starting Load Kernel Modules...
Sep 30 20:49:40 compute-0 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Sep 30 20:49:40 compute-0 kernel: Bridge firewalling registered
Sep 30 20:49:40 compute-0 systemd-modules-load[35134]: Inserted module 'br_netfilter'
Sep 30 20:49:40 compute-0 systemd[1]: Finished Load Kernel Modules.
Sep 30 20:49:40 compute-0 sudo[35128]: pam_unix(sudo:session): session closed for user root
Sep 30 20:49:41 compute-0 sudo[35287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vebqdahllctnoqukxbicmljapuwyvwmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265380.9044018-998-44032070353519/AnsiballZ_stat.py'
Sep 30 20:49:41 compute-0 sudo[35287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:49:41 compute-0 python3.9[35289]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:49:41 compute-0 sudo[35287]: pam_unix(sudo:session): session closed for user root
Sep 30 20:49:42 compute-0 sudo[35410]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usqpzwlmlllvizstnivkwyaipqzvrpuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265380.9044018-998-44032070353519/AnsiballZ_copy.py'
Sep 30 20:49:42 compute-0 sudo[35410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:49:42 compute-0 python3.9[35412]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759265380.9044018-998-44032070353519/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:49:42 compute-0 sudo[35410]: pam_unix(sudo:session): session closed for user root
Sep 30 20:49:43 compute-0 sudo[35562]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkwxvevzlupicxdqjaxvtmkbrrgbhbgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265382.8114672-1052-267283855096429/AnsiballZ_dnf.py'
Sep 30 20:49:43 compute-0 sudo[35562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:49:43 compute-0 python3.9[35564]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 20:49:48 compute-0 dbus-broker-launch[760]: Noticed file-system modification, trigger reload.
Sep 30 20:49:48 compute-0 dbus-broker-launch[760]: Noticed file-system modification, trigger reload.
Sep 30 20:49:49 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Sep 30 20:49:49 compute-0 systemd[1]: Starting man-db-cache-update.service...
Sep 30 20:49:49 compute-0 systemd[1]: Reloading.
Sep 30 20:49:49 compute-0 systemd-rc-local-generator[35628]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:49:49 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Sep 30 20:49:50 compute-0 sudo[35562]: pam_unix(sudo:session): session closed for user root
Sep 30 20:49:51 compute-0 python3.9[36992]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 20:49:52 compute-0 python3.9[37904]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Sep 30 20:49:52 compute-0 python3.9[38585]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 20:49:53 compute-0 sudo[39531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txongymiwlwdcxtlsrriobhwgmfpxqtr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265393.3788009-1169-208192808399295/AnsiballZ_command.py'
Sep 30 20:49:53 compute-0 sudo[39531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:49:53 compute-0 python3.9[39549]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:49:54 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Sep 30 20:49:54 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Sep 30 20:49:54 compute-0 systemd[1]: Finished man-db-cache-update.service.
Sep 30 20:49:54 compute-0 systemd[1]: man-db-cache-update.service: Consumed 5.968s CPU time.
Sep 30 20:49:54 compute-0 systemd[1]: run-r577bf0aa7efc4402be9c5befb6d960c0.service: Deactivated successfully.
Sep 30 20:49:54 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Sep 30 20:49:54 compute-0 sudo[39531]: pam_unix(sudo:session): session closed for user root
Sep 30 20:49:55 compute-0 sudo[40145]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrjtbvobjbarwklmszitoahahyhiryie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265395.1932273-1196-234025789212244/AnsiballZ_systemd.py'
Sep 30 20:49:55 compute-0 sudo[40145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:49:55 compute-0 python3.9[40147]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 20:49:55 compute-0 systemd[1]: Stopping Dynamic System Tuning Daemon...
Sep 30 20:49:55 compute-0 systemd[1]: tuned.service: Deactivated successfully.
Sep 30 20:49:55 compute-0 systemd[1]: Stopped Dynamic System Tuning Daemon.
Sep 30 20:49:55 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Sep 30 20:49:56 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Sep 30 20:49:56 compute-0 sudo[40145]: pam_unix(sudo:session): session closed for user root
Sep 30 20:49:57 compute-0 python3.9[40308]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Sep 30 20:50:00 compute-0 sudo[40458]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nveycjxiintkytuighnwthffawywjmye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265399.8315005-1367-266366210249822/AnsiballZ_systemd.py'
Sep 30 20:50:00 compute-0 sudo[40458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:00 compute-0 python3.9[40460]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 20:50:00 compute-0 systemd[1]: Reloading.
Sep 30 20:50:00 compute-0 systemd-rc-local-generator[40490]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:50:00 compute-0 sudo[40458]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:01 compute-0 sudo[40647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzncssxltggbevzdcvdnogwlclnmblfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265400.8966484-1367-229417594144547/AnsiballZ_systemd.py'
Sep 30 20:50:01 compute-0 sudo[40647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:01 compute-0 python3.9[40649]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 20:50:01 compute-0 systemd[1]: Reloading.
Sep 30 20:50:01 compute-0 systemd-rc-local-generator[40675]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:50:01 compute-0 sudo[40647]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:02 compute-0 sudo[40836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rprzddbrqduddnrceuztkhfjzzljscpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265402.4024432-1415-166476840776633/AnsiballZ_command.py'
Sep 30 20:50:02 compute-0 sudo[40836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:02 compute-0 python3.9[40838]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:50:02 compute-0 sudo[40836]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:03 compute-0 sudo[40989]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trmybrrxjnhjdzofthxgvdravulgfbog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265403.2311435-1439-10112926261052/AnsiballZ_command.py'
Sep 30 20:50:03 compute-0 sudo[40989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:03 compute-0 python3.9[40991]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:50:03 compute-0 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Sep 30 20:50:03 compute-0 sudo[40989]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:04 compute-0 sudo[41142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnkhhyzabrbkwczgwrxqptgxtcyugifr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265404.0795224-1463-249583217782649/AnsiballZ_command.py'
Sep 30 20:50:04 compute-0 sudo[41142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:04 compute-0 python3.9[41144]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:50:06 compute-0 sudo[41142]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:06 compute-0 sudo[41304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xufxxmzjakwdbyzuxlwoyjhkecpjnlhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265406.540934-1487-230411954562140/AnsiballZ_command.py'
Sep 30 20:50:06 compute-0 sudo[41304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:07 compute-0 python3.9[41306]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:50:07 compute-0 sudo[41304]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:07 compute-0 sudo[41457]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfvwdkfvydzuhmohshvxqeiqrsfpkfge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265407.3635547-1511-142505360976550/AnsiballZ_systemd.py'
Sep 30 20:50:07 compute-0 sudo[41457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:08 compute-0 python3.9[41459]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 20:50:08 compute-0 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Sep 30 20:50:08 compute-0 systemd[1]: Stopped Apply Kernel Variables.
Sep 30 20:50:08 compute-0 systemd[1]: Stopping Apply Kernel Variables...
Sep 30 20:50:08 compute-0 systemd[1]: Starting Apply Kernel Variables...
Sep 30 20:50:08 compute-0 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Sep 30 20:50:08 compute-0 systemd[1]: Finished Apply Kernel Variables.
Sep 30 20:50:08 compute-0 sudo[41457]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:08 compute-0 sshd-session[28393]: Connection closed by 192.168.122.30 port 47920
Sep 30 20:50:08 compute-0 sshd-session[28390]: pam_unix(sshd:session): session closed for user zuul
Sep 30 20:50:08 compute-0 systemd[1]: session-9.scope: Deactivated successfully.
Sep 30 20:50:08 compute-0 systemd[1]: session-9.scope: Consumed 2min 20.251s CPU time.
Sep 30 20:50:08 compute-0 systemd-logind[792]: Session 9 logged out. Waiting for processes to exit.
Sep 30 20:50:08 compute-0 systemd-logind[792]: Removed session 9.
Sep 30 20:50:13 compute-0 sshd-session[41490]: Accepted publickey for zuul from 192.168.122.30 port 36434 ssh2: ECDSA SHA256:SmCicXXyU0CyMnob1MNtb+B3Td3Ord5lbeuM/VGGA5o
Sep 30 20:50:13 compute-0 systemd-logind[792]: New session 10 of user zuul.
Sep 30 20:50:13 compute-0 systemd[1]: Started Session 10 of User zuul.
Sep 30 20:50:13 compute-0 sshd-session[41490]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 20:50:14 compute-0 python3.9[41643]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:50:16 compute-0 python3.9[41797]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:50:17 compute-0 sudo[41951]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmamxoyrqcxvbvtqbsbdrcwhwywoptzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265417.3447196-115-37709568953786/AnsiballZ_command.py'
Sep 30 20:50:17 compute-0 sudo[41951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:18 compute-0 python3.9[41953]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:50:18 compute-0 sudo[41951]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:19 compute-0 python3.9[42104]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:50:20 compute-0 sudo[42258]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zuvwsthpxfafyhdcjfbqiglpjaoosylc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265419.706898-175-191243433179228/AnsiballZ_setup.py'
Sep 30 20:50:20 compute-0 sudo[42258]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:20 compute-0 python3.9[42260]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 20:50:20 compute-0 sudo[42258]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:21 compute-0 sudo[42342]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-watghwkawcnvmeajrtvcdrvijjldbsgp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265419.706898-175-191243433179228/AnsiballZ_dnf.py'
Sep 30 20:50:21 compute-0 sudo[42342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:21 compute-0 python3.9[42344]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 20:50:22 compute-0 sudo[42342]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:23 compute-0 sudo[42495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nstlaxhhujdhygajrlewysgxqraetvhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265422.6973257-211-50194331228518/AnsiballZ_setup.py'
Sep 30 20:50:23 compute-0 sudo[42495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:23 compute-0 python3.9[42497]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 20:50:23 compute-0 sudo[42495]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:24 compute-0 sudo[42666]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjzygmkebrgfrzkkrxvaccpwoemwbrnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265423.9262197-244-96537518065581/AnsiballZ_file.py'
Sep 30 20:50:24 compute-0 sudo[42666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:24 compute-0 python3.9[42668]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:50:24 compute-0 sudo[42666]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:25 compute-0 sudo[42818]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djvnwanzlsdqfbbdfdqjhcnovxgystqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265424.8755987-268-61534250669641/AnsiballZ_command.py'
Sep 30 20:50:25 compute-0 sudo[42818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:25 compute-0 python3.9[42820]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:50:25 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat4263973531-merged.mount: Deactivated successfully.
Sep 30 20:50:25 compute-0 podman[42821]: 2025-09-30 20:50:25.459239378 +0000 UTC m=+0.078126809 system refresh
Sep 30 20:50:25 compute-0 sudo[42818]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:26 compute-0 sudo[42981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjkwqnwvearzgjlyxmxkaowxkaidzvgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265425.8308349-292-3024156984802/AnsiballZ_stat.py'
Sep 30 20:50:26 compute-0 sudo[42981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:26 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:50:26 compute-0 python3.9[42983]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:50:26 compute-0 sudo[42981]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:27 compute-0 sudo[43104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkljfmrfrwtoubgbbgsmbbtefuxovrws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265425.8308349-292-3024156984802/AnsiballZ_copy.py'
Sep 30 20:50:27 compute-0 sudo[43104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:27 compute-0 python3.9[43106]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265425.8308349-292-3024156984802/.source.json follow=False _original_basename=podman_network_config.j2 checksum=f3a74a1ee989afdb89b09c793cd92e6ae510c83a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:50:27 compute-0 sudo[43104]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:27 compute-0 sudo[43256]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-beohixmeryiwbryhvopwmdeqbmjvbzwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265427.5838046-337-233169868003418/AnsiballZ_stat.py'
Sep 30 20:50:27 compute-0 sudo[43256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:28 compute-0 python3.9[43258]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:50:28 compute-0 sudo[43256]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:28 compute-0 sudo[43379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wptyspadtxnkihbnxhsoxfndcrzkifli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265427.5838046-337-233169868003418/AnsiballZ_copy.py'
Sep 30 20:50:28 compute-0 sudo[43379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:28 compute-0 python3.9[43381]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759265427.5838046-337-233169868003418/.source.conf follow=False _original_basename=registries.conf.j2 checksum=74ad3fdf1c9c551f4957cab58c04bb2f8b0dc3e4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:50:28 compute-0 sudo[43379]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:29 compute-0 sudo[43531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysoinhgynpuwwscqrypphrvltmotkgpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265429.2730367-385-36749443631985/AnsiballZ_ini_file.py'
Sep 30 20:50:29 compute-0 sudo[43531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:29 compute-0 python3.9[43533]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:50:29 compute-0 sudo[43531]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:30 compute-0 sudo[43683]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubevdacxyrthlvxhhbxunrfhilewdbiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265430.1166558-385-257250885891597/AnsiballZ_ini_file.py'
Sep 30 20:50:30 compute-0 sudo[43683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:30 compute-0 python3.9[43685]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:50:30 compute-0 sudo[43683]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:31 compute-0 sudo[43835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtvwobsuhtoyklafdpusqidoexcmxezg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265430.938213-385-128396372356543/AnsiballZ_ini_file.py'
Sep 30 20:50:31 compute-0 sudo[43835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:31 compute-0 python3.9[43837]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:50:31 compute-0 sudo[43835]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:32 compute-0 sudo[43987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zouftjgkitjbkdseiwlnboegdlrhdkzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265431.7178648-385-22536996867262/AnsiballZ_ini_file.py'
Sep 30 20:50:32 compute-0 sudo[43987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:32 compute-0 python3.9[43989]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:50:32 compute-0 sudo[43987]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:33 compute-0 python3.9[44139]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:50:34 compute-0 sudo[44292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukrtgvegzyhmfpzqgaffxbygiofryefk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265433.671777-505-198709986302107/AnsiballZ_dnf.py'
Sep 30 20:50:34 compute-0 sudo[44292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:34 compute-0 python3.9[44294]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Sep 30 20:50:35 compute-0 sudo[44292]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:36 compute-0 sudo[44446]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hsoptwotvtmqbypvqiqnnlkrwuwtpndj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265435.7769701-529-79205682508880/AnsiballZ_dnf.py'
Sep 30 20:50:36 compute-0 sudo[44446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:36 compute-0 sshd-session[44140]: Connection closed by authenticating user sshd 185.156.73.233 port 29516 [preauth]
Sep 30 20:50:36 compute-0 python3.9[44448]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Sep 30 20:50:38 compute-0 sudo[44446]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:38 compute-0 sudo[44606]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwvfzzzpumtrwfmliqmpuwivodhdgzwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265438.6565118-559-248818936251731/AnsiballZ_dnf.py'
Sep 30 20:50:39 compute-0 sudo[44606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:39 compute-0 python3.9[44608]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Sep 30 20:50:40 compute-0 sudo[44606]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:41 compute-0 sudo[44759]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lputbnuozzjikxsxbhgxevjhkbklsxql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265440.8633425-586-157927816439741/AnsiballZ_dnf.py'
Sep 30 20:50:41 compute-0 sudo[44759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:41 compute-0 python3.9[44761]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Sep 30 20:50:42 compute-0 sudo[44759]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:43 compute-0 sudo[44912]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-derpijtucayobprmjmbgkjydnmilyvek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265443.1769433-619-37353806746330/AnsiballZ_dnf.py'
Sep 30 20:50:43 compute-0 sudo[44912]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:43 compute-0 python3.9[44914]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Sep 30 20:50:45 compute-0 sudo[44912]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:45 compute-0 sudo[45068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nycfbddgrmkaxrbuixgmydwztawjjikv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265445.534217-643-241406999982704/AnsiballZ_dnf.py'
Sep 30 20:50:45 compute-0 sudo[45068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:46 compute-0 python3.9[45070]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Sep 30 20:50:48 compute-0 sudo[45068]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:49 compute-0 sudo[45237]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctkdojsozprwpygmbxwvpksbhwiuxaxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265449.2716513-670-41157064602805/AnsiballZ_dnf.py'
Sep 30 20:50:49 compute-0 sudo[45237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:49 compute-0 python3.9[45239]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Sep 30 20:50:51 compute-0 sudo[45237]: pam_unix(sudo:session): session closed for user root
Sep 30 20:50:51 compute-0 sudo[45390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yibfhtraaebxpgzngfhhlgvpdthuxpfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265451.437851-697-232220979738785/AnsiballZ_dnf.py'
Sep 30 20:50:51 compute-0 sudo[45390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:50:51 compute-0 python3.9[45392]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Sep 30 20:50:56 compute-0 sshd-session[45400]: Invalid user seekcy from 49.64.169.153 port 49056
Sep 30 20:51:19 compute-0 sudo[45390]: pam_unix(sudo:session): session closed for user root
Sep 30 20:51:20 compute-0 sudo[45729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkkbepkoxlntjmlptqkkdxzeyunyolab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265480.1720724-730-265623422816590/AnsiballZ_file.py'
Sep 30 20:51:20 compute-0 sudo[45729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:51:20 compute-0 python3.9[45731]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:51:20 compute-0 sudo[45729]: pam_unix(sudo:session): session closed for user root
Sep 30 20:51:21 compute-0 sudo[45904]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhdzcqgdmkowbuzegkprsbbqlogndggs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265480.9178162-754-89542939223965/AnsiballZ_stat.py'
Sep 30 20:51:21 compute-0 sudo[45904]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:51:21 compute-0 python3.9[45906]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:51:21 compute-0 sudo[45904]: pam_unix(sudo:session): session closed for user root
Sep 30 20:51:21 compute-0 sudo[46027]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-koaxovcsqrrlrghmukjlkojcnmywvrih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265480.9178162-754-89542939223965/AnsiballZ_copy.py'
Sep 30 20:51:21 compute-0 sudo[46027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:51:22 compute-0 python3.9[46029]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1759265480.9178162-754-89542939223965/.source.json _original_basename=.ezs0zdv8 follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:51:22 compute-0 sudo[46027]: pam_unix(sudo:session): session closed for user root
Sep 30 20:51:23 compute-0 sudo[46179]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcdrwmhvarzdntwrpzcvnyhmygyfuqmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265482.5796099-808-155230084097327/AnsiballZ_podman_image.py'
Sep 30 20:51:23 compute-0 sudo[46179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:51:23 compute-0 python3.9[46181]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Sep 30 20:51:23 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:51:25 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat1460057144-lower\x2dmapped.mount: Deactivated successfully.
Sep 30 20:51:30 compute-0 podman[46191]: 2025-09-30 20:51:30.098347144 +0000 UTC m=+6.611699797 image pull 4c2cf735485aec82560a51e8042a9e65bbe194a07c6812512d6a5e2ed955852b quay.io/podified-antelope-centos9/openstack-iscsid:current-podified
Sep 30 20:51:30 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:51:30 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:51:30 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:51:30 compute-0 sudo[46179]: pam_unix(sudo:session): session closed for user root
Sep 30 20:51:31 compute-0 sudo[46490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qinynnbudesfntnohvcfcnbtduqyehrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265491.0095537-835-268301566888645/AnsiballZ_podman_image.py'
Sep 30 20:51:31 compute-0 sudo[46490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:51:31 compute-0 python3.9[46492]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Sep 30 20:51:31 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:51:34 compute-0 podman[46503]: 2025-09-30 20:51:34.071798552 +0000 UTC m=+2.442919206 image pull 7ffac6b06b247caf26cf673b775a5f070f2fa1a6008cf0b0964af7e905ba86a5 quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Sep 30 20:51:34 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:51:34 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:51:34 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:51:34 compute-0 sudo[46490]: pam_unix(sudo:session): session closed for user root
Sep 30 20:51:35 compute-0 sudo[46760]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihovffdrlwngcgeyqoqgxvidknigbuhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265495.0106206-874-71076099779392/AnsiballZ_podman_image.py'
Sep 30 20:51:35 compute-0 sudo[46760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:51:35 compute-0 python3.9[46762]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Sep 30 20:51:35 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:51:36 compute-0 podman[46774]: 2025-09-30 20:51:36.657435488 +0000 UTC m=+1.118532113 image pull 80aeb93432d60c5f52c5325081f51dbf5658fe1615083ed284852e8f6df43250 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Sep 30 20:51:36 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:51:36 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:51:36 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:51:36 compute-0 sudo[46760]: pam_unix(sudo:session): session closed for user root
Sep 30 20:51:37 compute-0 sudo[47007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hssxgnmeyahfnjwagtynuwknvwauvmvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265497.4060566-901-164423329550927/AnsiballZ_podman_image.py'
Sep 30 20:51:37 compute-0 sudo[47007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:51:37 compute-0 python3.9[47009]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Sep 30 20:51:37 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:51:48 compute-0 podman[47022]: 2025-09-30 20:51:48.637765626 +0000 UTC m=+10.619242438 image pull 613e2b735827096139e990f475c5ac5de0e55d8048941a4521c0c17a4351e975 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Sep 30 20:51:48 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:51:48 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:51:48 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:51:48 compute-0 sudo[47007]: pam_unix(sudo:session): session closed for user root
Sep 30 20:51:51 compute-0 sudo[47301]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blbhmrbjzyexdaltapenriybgzwrxpgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265511.7040153-934-171912869487742/AnsiballZ_podman_image.py'
Sep 30 20:51:51 compute-0 sudo[47301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:51:52 compute-0 python3.9[47303]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Sep 30 20:51:52 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:51:56 compute-0 podman[47315]: 2025-09-30 20:51:56.163632718 +0000 UTC m=+3.904290729 image pull c1fbb3a9fe801a81492a24a592ec5927cb36487bb102738c2047084bd3d79886 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Sep 30 20:51:56 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:51:56 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:51:56 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:51:56 compute-0 sudo[47301]: pam_unix(sudo:session): session closed for user root
Sep 30 20:51:56 compute-0 sudo[47570]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjjmhhwuxwxcqmclfzutjmnkemqlcqid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265516.579559-934-80856742055528/AnsiballZ_podman_image.py'
Sep 30 20:51:56 compute-0 sudo[47570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:51:57 compute-0 python3.9[47572]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Sep 30 20:51:58 compute-0 podman[47585]: 2025-09-30 20:51:58.53976028 +0000 UTC m=+1.379734033 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Sep 30 20:51:58 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:51:58 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:51:58 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:51:58 compute-0 sudo[47570]: pam_unix(sudo:session): session closed for user root
Sep 30 20:51:59 compute-0 sshd-session[41493]: Connection closed by 192.168.122.30 port 36434
Sep 30 20:51:59 compute-0 sshd-session[41490]: pam_unix(sshd:session): session closed for user zuul
Sep 30 20:51:59 compute-0 systemd[1]: session-10.scope: Deactivated successfully.
Sep 30 20:51:59 compute-0 systemd[1]: session-10.scope: Consumed 1min 34.562s CPU time.
Sep 30 20:51:59 compute-0 systemd-logind[792]: Session 10 logged out. Waiting for processes to exit.
Sep 30 20:51:59 compute-0 systemd-logind[792]: Removed session 10.
Sep 30 20:52:04 compute-0 sshd-session[47730]: Accepted publickey for zuul from 192.168.122.30 port 60036 ssh2: ECDSA SHA256:SmCicXXyU0CyMnob1MNtb+B3Td3Ord5lbeuM/VGGA5o
Sep 30 20:52:04 compute-0 systemd-logind[792]: New session 11 of user zuul.
Sep 30 20:52:04 compute-0 systemd[1]: Started Session 11 of User zuul.
Sep 30 20:52:04 compute-0 sshd-session[47730]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 20:52:05 compute-0 python3.9[47883]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:52:07 compute-0 sudo[48037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grnzqaqxzxfywsyigeegaqaxjbhtboik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265526.7894537-73-236211953177009/AnsiballZ_getent.py'
Sep 30 20:52:07 compute-0 sudo[48037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:52:07 compute-0 python3.9[48039]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Sep 30 20:52:07 compute-0 sudo[48037]: pam_unix(sudo:session): session closed for user root
Sep 30 20:52:08 compute-0 sudo[48190]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usmmqqnkrirzlxhufvjlgtiiyowgynbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265527.7327242-97-250301630318678/AnsiballZ_group.py'
Sep 30 20:52:08 compute-0 sudo[48190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:52:08 compute-0 python3.9[48192]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Sep 30 20:52:08 compute-0 groupadd[48193]: group added to /etc/group: name=openvswitch, GID=42476
Sep 30 20:52:08 compute-0 groupadd[48193]: group added to /etc/gshadow: name=openvswitch
Sep 30 20:52:08 compute-0 groupadd[48193]: new group: name=openvswitch, GID=42476
Sep 30 20:52:08 compute-0 sudo[48190]: pam_unix(sudo:session): session closed for user root
Sep 30 20:52:09 compute-0 sudo[48348]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eoutmlutwiiutkdmzrsjetudwmnhecfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265529.307522-121-167475305436455/AnsiballZ_user.py'
Sep 30 20:52:09 compute-0 sudo[48348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:52:10 compute-0 python3.9[48350]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Sep 30 20:52:10 compute-0 useradd[48352]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Sep 30 20:52:10 compute-0 useradd[48352]: add 'openvswitch' to group 'hugetlbfs'
Sep 30 20:52:10 compute-0 useradd[48352]: add 'openvswitch' to shadow group 'hugetlbfs'
Sep 30 20:52:10 compute-0 sudo[48348]: pam_unix(sudo:session): session closed for user root
Sep 30 20:52:12 compute-0 sudo[48508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvqahaytuffvoyhhgfgxeizqsntcnlci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265531.6542091-151-261957243012932/AnsiballZ_setup.py'
Sep 30 20:52:12 compute-0 sudo[48508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:52:12 compute-0 python3.9[48510]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 20:52:12 compute-0 sudo[48508]: pam_unix(sudo:session): session closed for user root
Sep 30 20:52:13 compute-0 sudo[48592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axlaulojjzqtfnlhspetwdoiflphricv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265531.6542091-151-261957243012932/AnsiballZ_dnf.py'
Sep 30 20:52:13 compute-0 sudo[48592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:52:13 compute-0 python3.9[48594]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Sep 30 20:52:14 compute-0 sudo[48592]: pam_unix(sudo:session): session closed for user root
Sep 30 20:52:16 compute-0 sudo[48753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfumphdbgabexozoeaauczfnygswuavc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265536.1529562-193-38216544672636/AnsiballZ_dnf.py'
Sep 30 20:52:16 compute-0 sudo[48753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:52:16 compute-0 python3.9[48755]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 20:52:28 compute-0 kernel: SELinux:  Converting 2725 SID table entries...
Sep 30 20:52:28 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 20:52:28 compute-0 kernel: SELinux:  policy capability open_perms=1
Sep 30 20:52:28 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 20:52:28 compute-0 kernel: SELinux:  policy capability always_check_network=0
Sep 30 20:52:28 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 20:52:28 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 20:52:28 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 20:52:28 compute-0 groupadd[48778]: group added to /etc/group: name=unbound, GID=993
Sep 30 20:52:28 compute-0 groupadd[48778]: group added to /etc/gshadow: name=unbound
Sep 30 20:52:28 compute-0 groupadd[48778]: new group: name=unbound, GID=993
Sep 30 20:52:28 compute-0 useradd[48785]: new user: name=unbound, UID=993, GID=993, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Sep 30 20:52:28 compute-0 dbus-broker-launch[778]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Sep 30 20:52:28 compute-0 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Sep 30 20:52:29 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Sep 30 20:52:29 compute-0 systemd[1]: Starting man-db-cache-update.service...
Sep 30 20:52:30 compute-0 systemd[1]: Reloading.
Sep 30 20:52:30 compute-0 systemd-sysv-generator[49286]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 20:52:30 compute-0 systemd-rc-local-generator[49283]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:52:30 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Sep 30 20:52:30 compute-0 sudo[48753]: pam_unix(sudo:session): session closed for user root
Sep 30 20:52:30 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Sep 30 20:52:30 compute-0 systemd[1]: Finished man-db-cache-update.service.
Sep 30 20:52:30 compute-0 systemd[1]: run-rc4953a224c1c4fd9967e1f64e186509f.service: Deactivated successfully.
Sep 30 20:52:43 compute-0 sudo[49857]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpdvvmwzowjjbwavwuuhfmgftziohnfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265562.3964798-217-50867056976612/AnsiballZ_systemd.py'
Sep 30 20:52:43 compute-0 sudo[49857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:52:43 compute-0 python3.9[49859]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Sep 30 20:52:43 compute-0 systemd[1]: Reloading.
Sep 30 20:52:43 compute-0 systemd-rc-local-generator[49886]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:52:43 compute-0 systemd-sysv-generator[49893]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 20:52:43 compute-0 systemd[1]: Starting Open vSwitch Database Unit...
Sep 30 20:52:43 compute-0 chown[49901]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Sep 30 20:52:43 compute-0 ovs-ctl[49906]: /etc/openvswitch/conf.db does not exist ... (warning).
Sep 30 20:52:43 compute-0 ovs-ctl[49906]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Sep 30 20:52:43 compute-0 ovs-ctl[49906]: Starting ovsdb-server [  OK  ]
Sep 30 20:52:43 compute-0 ovs-vsctl[49955]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Sep 30 20:52:44 compute-0 ovs-vsctl[49975]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"3b817c7f-1137-4e8f-8263-8c5e6eddafa4\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Sep 30 20:52:44 compute-0 ovs-ctl[49906]: Configuring Open vSwitch system IDs [  OK  ]
Sep 30 20:52:44 compute-0 ovs-ctl[49906]: Enabling remote OVSDB managers [  OK  ]
Sep 30 20:52:44 compute-0 systemd[1]: Started Open vSwitch Database Unit.
Sep 30 20:52:44 compute-0 ovs-vsctl[49981]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Sep 30 20:52:44 compute-0 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Sep 30 20:52:44 compute-0 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Sep 30 20:52:44 compute-0 systemd[1]: Starting Open vSwitch Forwarding Unit...
Sep 30 20:52:44 compute-0 kernel: openvswitch: Open vSwitch switching datapath
Sep 30 20:52:44 compute-0 ovs-ctl[50026]: Inserting openvswitch module [  OK  ]
Sep 30 20:52:44 compute-0 ovs-ctl[49995]: Starting ovs-vswitchd [  OK  ]
Sep 30 20:52:44 compute-0 ovs-vsctl[50043]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Sep 30 20:52:44 compute-0 ovs-ctl[49995]: Enabling remote OVSDB managers [  OK  ]
Sep 30 20:52:44 compute-0 systemd[1]: Started Open vSwitch Forwarding Unit.
Sep 30 20:52:44 compute-0 systemd[1]: Starting Open vSwitch...
Sep 30 20:52:44 compute-0 systemd[1]: Finished Open vSwitch.
Sep 30 20:52:44 compute-0 sudo[49857]: pam_unix(sudo:session): session closed for user root
Sep 30 20:52:46 compute-0 python3.9[50195]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:52:47 compute-0 sudo[50345]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxaniufvirggfiaibpounpouwemcoutu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265566.9585483-271-7145539228329/AnsiballZ_sefcontext.py'
Sep 30 20:52:47 compute-0 sudo[50345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:52:47 compute-0 python3.9[50347]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Sep 30 20:52:49 compute-0 kernel: SELinux:  Converting 2739 SID table entries...
Sep 30 20:52:49 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 20:52:49 compute-0 kernel: SELinux:  policy capability open_perms=1
Sep 30 20:52:49 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 20:52:49 compute-0 kernel: SELinux:  policy capability always_check_network=0
Sep 30 20:52:49 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 20:52:49 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 20:52:49 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 20:52:49 compute-0 sudo[50345]: pam_unix(sudo:session): session closed for user root
Sep 30 20:52:50 compute-0 python3.9[50502]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:52:51 compute-0 sudo[50658]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfcybgledcxovddykehowayjvetyqosa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265571.1861775-325-142034653209071/AnsiballZ_dnf.py'
Sep 30 20:52:51 compute-0 dbus-broker-launch[778]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Sep 30 20:52:51 compute-0 sudo[50658]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:52:51 compute-0 python3.9[50660]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 20:52:53 compute-0 sudo[50658]: pam_unix(sudo:session): session closed for user root
Sep 30 20:52:53 compute-0 sudo[50811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wemltziwcanhjwmrdqouwrdwwgjlbdds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265573.3406737-349-210510106244259/AnsiballZ_command.py'
Sep 30 20:52:53 compute-0 sudo[50811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:52:53 compute-0 python3.9[50813]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:52:54 compute-0 sudo[50811]: pam_unix(sudo:session): session closed for user root
Sep 30 20:52:55 compute-0 sudo[51098]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybwppcfpvgicnjxrqcmfklsbzbbffskz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265575.1062982-373-45911252463691/AnsiballZ_file.py'
Sep 30 20:52:55 compute-0 sudo[51098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:52:55 compute-0 python3.9[51100]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Sep 30 20:52:55 compute-0 sudo[51098]: pam_unix(sudo:session): session closed for user root
Sep 30 20:52:56 compute-0 python3.9[51250]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 20:52:57 compute-0 sudo[51402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-myxspyjgwoxvstbphmlpbyljxszsdyux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265577.0208426-421-141093460078924/AnsiballZ_dnf.py'
Sep 30 20:52:57 compute-0 sudo[51402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:52:57 compute-0 python3.9[51404]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 20:52:59 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Sep 30 20:52:59 compute-0 systemd[1]: Starting man-db-cache-update.service...
Sep 30 20:52:59 compute-0 systemd[1]: Reloading.
Sep 30 20:52:59 compute-0 sshd[1008]: Timeout before authentication for connection from 49.64.169.153 to 38.102.83.69, pid = 45400
Sep 30 20:52:59 compute-0 systemd-rc-local-generator[51444]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:52:59 compute-0 systemd-sysv-generator[51447]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 20:52:59 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Sep 30 20:53:00 compute-0 sudo[51402]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:00 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Sep 30 20:53:00 compute-0 systemd[1]: Finished man-db-cache-update.service.
Sep 30 20:53:00 compute-0 systemd[1]: run-r7f6305e029be496eaf882745e15c650a.service: Deactivated successfully.
Sep 30 20:53:00 compute-0 sudo[51720]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afkszcgjzmqdbeubpezuzyqdfbkdoydx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265580.462449-445-50973001622261/AnsiballZ_systemd.py'
Sep 30 20:53:00 compute-0 sudo[51720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:01 compute-0 python3.9[51722]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 20:53:01 compute-0 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Sep 30 20:53:01 compute-0 systemd[1]: Stopped Network Manager Wait Online.
Sep 30 20:53:01 compute-0 systemd[1]: Stopping Network Manager Wait Online...
Sep 30 20:53:01 compute-0 systemd[1]: Stopping Network Manager...
Sep 30 20:53:01 compute-0 NetworkManager[3942]: <info>  [1759265581.1383] caught SIGTERM, shutting down normally.
Sep 30 20:53:01 compute-0 NetworkManager[3942]: <info>  [1759265581.1404] dhcp4 (eth0): canceled DHCP transaction
Sep 30 20:53:01 compute-0 NetworkManager[3942]: <info>  [1759265581.1404] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Sep 30 20:53:01 compute-0 NetworkManager[3942]: <info>  [1759265581.1405] dhcp4 (eth0): state changed no lease
Sep 30 20:53:01 compute-0 NetworkManager[3942]: <info>  [1759265581.1407] manager: NetworkManager state is now CONNECTED_SITE
Sep 30 20:53:01 compute-0 NetworkManager[3942]: <info>  [1759265581.1481] exiting (success)
Sep 30 20:53:01 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Sep 30 20:53:01 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Sep 30 20:53:01 compute-0 systemd[1]: NetworkManager.service: Deactivated successfully.
Sep 30 20:53:01 compute-0 systemd[1]: Stopped Network Manager.
Sep 30 20:53:01 compute-0 systemd[1]: NetworkManager.service: Consumed 15.536s CPU time, 4.0M memory peak, read 0B from disk, written 26.5K to disk.
Sep 30 20:53:01 compute-0 systemd[1]: Starting Network Manager...
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.2168] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:94c9591a-3f00-4303-bfc6-d623b91048fe)
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.2170] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.2227] manager[0x559cc5b04090]: monitoring kernel firmware directory '/lib/firmware'.
Sep 30 20:53:01 compute-0 systemd[1]: Starting Hostname Service...
Sep 30 20:53:01 compute-0 systemd[1]: Started Hostname Service.
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3114] hostname: hostname: using hostnamed
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3115] hostname: static hostname changed from (none) to "compute-0"
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3122] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3129] manager[0x559cc5b04090]: rfkill: Wi-Fi hardware radio set enabled
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3129] manager[0x559cc5b04090]: rfkill: WWAN hardware radio set enabled
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3148] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3158] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3158] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3159] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3160] manager: Networking is enabled by state file
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3164] settings: Loaded settings plugin: keyfile (internal)
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3168] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3192] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3202] dhcp: init: Using DHCP client 'internal'
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3204] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3209] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3214] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3220] device (lo): Activation: starting connection 'lo' (81f1075e-2cab-4304-8039-837dd8b444d5)
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3226] device (eth0): carrier: link connected
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3230] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3234] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3235] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3241] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3248] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3254] device (eth1): carrier: link connected
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3257] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3262] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (1694c4d7-a3f7-50e2-b10a-6b7d951bb318) (indicated)
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3262] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3268] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3274] device (eth1): Activation: starting connection 'ci-private-network' (1694c4d7-a3f7-50e2-b10a-6b7d951bb318)
Sep 30 20:53:01 compute-0 systemd[1]: Started Network Manager.
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3286] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3295] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3307] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3309] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3312] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3315] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3320] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3322] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3326] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3336] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3339] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3356] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3374] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3387] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3388] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3396] device (lo): Activation: successful, device activated.
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3404] dhcp4 (eth0): state changed new lease, address=38.102.83.69
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3411] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Sep 30 20:53:01 compute-0 systemd[1]: Starting Network Manager Wait Online...
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3483] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3489] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3491] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3492] manager: NetworkManager state is now CONNECTED_LOCAL
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3495] device (eth1): Activation: successful, device activated.
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3505] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3507] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3509] manager: NetworkManager state is now CONNECTED_SITE
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3511] device (eth0): Activation: successful, device activated.
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3514] manager: NetworkManager state is now CONNECTED_GLOBAL
Sep 30 20:53:01 compute-0 NetworkManager[51733]: <info>  [1759265581.3516] manager: startup complete
Sep 30 20:53:01 compute-0 sudo[51720]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:01 compute-0 systemd[1]: Finished Network Manager Wait Online.
Sep 30 20:53:01 compute-0 sudo[51946]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vupdgsridzjyjhnwncddihrhednpyyrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265581.6827426-469-144772401674468/AnsiballZ_dnf.py'
Sep 30 20:53:01 compute-0 sudo[51946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:02 compute-0 python3.9[51948]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 20:53:07 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Sep 30 20:53:07 compute-0 systemd[1]: Starting man-db-cache-update.service...
Sep 30 20:53:07 compute-0 systemd[1]: Reloading.
Sep 30 20:53:07 compute-0 systemd-rc-local-generator[52001]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:53:07 compute-0 systemd-sysv-generator[52004]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 20:53:08 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Sep 30 20:53:08 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Sep 30 20:53:08 compute-0 systemd[1]: Finished man-db-cache-update.service.
Sep 30 20:53:08 compute-0 systemd[1]: run-rf93ec46eb33e4204b0fb11a9490f2401.service: Deactivated successfully.
Sep 30 20:53:08 compute-0 sudo[51946]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:09 compute-0 sudo[52412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxtpbmzhkurxvijivtgsxldpplmccpcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265589.3541486-505-244022074185926/AnsiballZ_stat.py'
Sep 30 20:53:09 compute-0 sudo[52412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:09 compute-0 python3.9[52414]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 20:53:09 compute-0 sudo[52412]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:10 compute-0 sudo[52564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snzmkgevdvarmbpzqhmdchyrcdwrgwyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265590.2277772-532-211546051041535/AnsiballZ_ini_file.py'
Sep 30 20:53:10 compute-0 sudo[52564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:11 compute-0 python3.9[52566]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:53:11 compute-0 sudo[52564]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:11 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Sep 30 20:53:11 compute-0 sudo[52718]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thuvcgwvamslnzszgnltoulozztsglic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265591.439769-562-26336884373726/AnsiballZ_ini_file.py'
Sep 30 20:53:11 compute-0 sudo[52718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:11 compute-0 python3.9[52720]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:53:11 compute-0 sudo[52718]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:12 compute-0 sudo[52870]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcybzzryoqqmdqpcpkqcpzumquumssqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265592.0784564-562-255091396890637/AnsiballZ_ini_file.py'
Sep 30 20:53:12 compute-0 sudo[52870]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:12 compute-0 python3.9[52872]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:53:12 compute-0 sudo[52870]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:13 compute-0 sudo[53022]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvzdlvjqqtydokxqjeaejfnjwfmfbtwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265593.0000238-607-66849942136948/AnsiballZ_ini_file.py'
Sep 30 20:53:13 compute-0 sudo[53022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:13 compute-0 python3.9[53024]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:53:13 compute-0 sudo[53022]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:14 compute-0 sudo[53174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nuotccmoaoprtjihnnsnvvawvznxirmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265593.7454116-607-147458686066889/AnsiballZ_ini_file.py'
Sep 30 20:53:14 compute-0 sudo[53174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:14 compute-0 python3.9[53176]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:53:14 compute-0 sudo[53174]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:14 compute-0 sudo[53326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbgehucqlrlsikfianavoxohakjmsewd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265594.564442-652-50006055242294/AnsiballZ_stat.py'
Sep 30 20:53:14 compute-0 sudo[53326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:15 compute-0 python3.9[53328]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:53:15 compute-0 sudo[53326]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:15 compute-0 sudo[53449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-damuxmfqlpwwinizlesjcwogtpioqdiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265594.564442-652-50006055242294/AnsiballZ_copy.py'
Sep 30 20:53:15 compute-0 sudo[53449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:15 compute-0 python3.9[53451]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1759265594.564442-652-50006055242294/.source _original_basename=.islv8eln follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:53:15 compute-0 sudo[53449]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:16 compute-0 sudo[53601]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztmpihplqfkrglpacprilceqamzihwxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265596.0873666-697-4541820646230/AnsiballZ_file.py'
Sep 30 20:53:16 compute-0 sudo[53601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:16 compute-0 python3.9[53603]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:53:16 compute-0 sudo[53601]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:17 compute-0 sudo[53753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sedlrlhmmnbnammbjxwxbjkierewnyoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265596.8911073-721-25352959811622/AnsiballZ_edpm_os_net_config_mappings.py'
Sep 30 20:53:17 compute-0 sudo[53753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:17 compute-0 python3.9[53755]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Sep 30 20:53:17 compute-0 sudo[53753]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:18 compute-0 sudo[53905]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efuiodehjmgoipnzhomjwdiluvbbeufw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265597.8965297-748-34799439297564/AnsiballZ_file.py'
Sep 30 20:53:18 compute-0 sudo[53905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:18 compute-0 python3.9[53907]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:53:18 compute-0 sudo[53905]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:19 compute-0 sudo[54057]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqaumqbgexwcrbwtqqqvfeujxxhkpqom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265598.903399-778-200678147225735/AnsiballZ_stat.py'
Sep 30 20:53:19 compute-0 sudo[54057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:19 compute-0 sudo[54057]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:19 compute-0 sudo[54180]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsurynftbxxehuwtzpcmuhslzvqjxvvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265598.903399-778-200678147225735/AnsiballZ_copy.py'
Sep 30 20:53:19 compute-0 sudo[54180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:20 compute-0 sudo[54180]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:21 compute-0 sudo[54332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nnyxtdvinyagjosjptqxhtjaxrvadjxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265600.4813757-823-165387353601714/AnsiballZ_slurp.py'
Sep 30 20:53:21 compute-0 sudo[54332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:21 compute-0 python3.9[54334]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Sep 30 20:53:21 compute-0 sudo[54332]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:22 compute-0 sudo[54507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqjxgwalrporpljodzfukuvnbgkklpiy ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265601.546363-850-221711375228174/async_wrapper.py j355751741012 300 /home/zuul/.ansible/tmp/ansible-tmp-1759265601.546363-850-221711375228174/AnsiballZ_edpm_os_net_config.py _'
Sep 30 20:53:22 compute-0 sudo[54507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:22 compute-0 ansible-async_wrapper.py[54509]: Invoked with j355751741012 300 /home/zuul/.ansible/tmp/ansible-tmp-1759265601.546363-850-221711375228174/AnsiballZ_edpm_os_net_config.py _
Sep 30 20:53:22 compute-0 ansible-async_wrapper.py[54512]: Starting module and watcher
Sep 30 20:53:22 compute-0 ansible-async_wrapper.py[54512]: Start watching 54513 (300)
Sep 30 20:53:22 compute-0 ansible-async_wrapper.py[54513]: Start module (54513)
Sep 30 20:53:22 compute-0 ansible-async_wrapper.py[54509]: Return async_wrapper task started.
Sep 30 20:53:22 compute-0 sudo[54507]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:22 compute-0 python3.9[54514]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Sep 30 20:53:23 compute-0 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Sep 30 20:53:23 compute-0 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Sep 30 20:53:23 compute-0 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Sep 30 20:53:23 compute-0 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Sep 30 20:53:23 compute-0 kernel: cfg80211: failed to load regulatory.db
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.6139] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=54515 uid=0 result="success"
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.6170] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=54515 uid=0 result="success"
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7000] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7002] audit: op="connection-add" uuid="9b0d21f7-babc-4262-b1d9-6693666519eb" name="br-ex-br" pid=54515 uid=0 result="success"
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7025] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7027] audit: op="connection-add" uuid="cafc52fc-d9a5-4f74-934b-84b6564dda4f" name="br-ex-port" pid=54515 uid=0 result="success"
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7040] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7044] audit: op="connection-add" uuid="9c074521-cabb-4ffb-bbda-941c5d4d1f08" name="eth1-port" pid=54515 uid=0 result="success"
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7056] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7057] audit: op="connection-add" uuid="e7bf2ccc-0ac4-49c8-b5e0-0a37e72aa1c1" name="vlan20-port" pid=54515 uid=0 result="success"
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7068] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7069] audit: op="connection-add" uuid="c550afee-e443-48dd-89a1-442af5a9f33c" name="vlan21-port" pid=54515 uid=0 result="success"
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7082] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7082] audit: op="connection-add" uuid="796bfd42-358c-42e8-b9ae-ca4ac9527c9c" name="vlan22-port" pid=54515 uid=0 result="success"
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7106] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="802-3-ethernet.mtu,connection.timestamp,connection.autoconnect-priority,ipv4.dhcp-client-id,ipv4.dhcp-timeout,ipv6.addr-gen-mode,ipv6.method,ipv6.dhcp-timeout" pid=54515 uid=0 result="success"
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7125] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7126] audit: op="connection-add" uuid="59c2a2e7-a29d-4423-9df9-98dc55fcd044" name="br-ex-if" pid=54515 uid=0 result="success"
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7192] audit: op="connection-update" uuid="1694c4d7-a3f7-50e2-b10a-6b7d951bb318" name="ci-private-network" args="connection.controller,connection.timestamp,connection.slave-type,connection.port-type,connection.master,ovs-external-ids.data,ipv4.dns,ipv4.never-default,ipv4.method,ipv4.routing-rules,ipv4.routes,ipv4.addresses,ovs-interface.type,ipv6.dns,ipv6.addr-gen-mode,ipv6.method,ipv6.routing-rules,ipv6.routes,ipv6.addresses" pid=54515 uid=0 result="success"
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7214] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7215] audit: op="connection-add" uuid="592e935b-34ca-4611-ab83-aa03ef6c80a0" name="vlan20-if" pid=54515 uid=0 result="success"
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7233] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7235] audit: op="connection-add" uuid="827daf1f-951b-4b39-9d39-4a023d982eb7" name="vlan21-if" pid=54515 uid=0 result="success"
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7252] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7253] audit: op="connection-add" uuid="0d7bb37d-095d-4b15-b19e-cd6255ddf326" name="vlan22-if" pid=54515 uid=0 result="success"
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7267] audit: op="connection-delete" uuid="f3f92eec-e382-3400-9db1-93c4109b9992" name="Wired connection 1" pid=54515 uid=0 result="success"
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7281] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7297] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7302] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (9b0d21f7-babc-4262-b1d9-6693666519eb)
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7304] audit: op="connection-activate" uuid="9b0d21f7-babc-4262-b1d9-6693666519eb" name="br-ex-br" pid=54515 uid=0 result="success"
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7306] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7316] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7321] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (cafc52fc-d9a5-4f74-934b-84b6564dda4f)
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7324] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7333] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7338] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (9c074521-cabb-4ffb-bbda-941c5d4d1f08)
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7341] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7351] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7357] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (e7bf2ccc-0ac4-49c8-b5e0-0a37e72aa1c1)
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7360] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7370] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7376] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (c550afee-e443-48dd-89a1-442af5a9f33c)
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7379] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7389] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7394] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (796bfd42-358c-42e8-b9ae-ca4ac9527c9c)
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7397] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7399] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7402] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7409] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7417] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7422] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (59c2a2e7-a29d-4423-9df9-98dc55fcd044)
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7423] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7428] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7429] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7430] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7431] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7439] device (eth1): disconnecting for new activation request.
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7439] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7441] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7444] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7446] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7448] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7453] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7457] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (592e935b-34ca-4611-ab83-aa03ef6c80a0)
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7458] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7462] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7463] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7465] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7467] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7471] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7477] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (827daf1f-951b-4b39-9d39-4a023d982eb7)
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7478] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7481] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7484] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7485] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7487] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7492] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7498] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (0d7bb37d-095d-4b15-b19e-cd6255ddf326)
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7500] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7504] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7506] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7508] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7509] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7525] audit: op="device-reapply" interface="eth0" ifindex=2 args="802-3-ethernet.mtu,connection.autoconnect-priority,ipv4.dhcp-client-id,ipv4.dhcp-timeout,ipv6.addr-gen-mode,ipv6.method" pid=54515 uid=0 result="success"
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7528] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7532] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7535] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7542] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7547] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 kernel: ovs-system: entered promiscuous mode
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7571] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7575] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7578] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7583] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 kernel: Timeout policy base is empty
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7587] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7590] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7592] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7599] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 systemd-udevd[54521]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7604] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7607] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7608] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7612] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7617] dhcp4 (eth0): canceled DHCP transaction
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7617] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7617] dhcp4 (eth0): state changed no lease
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7619] dhcp4 (eth0): activation: beginning transaction (no timeout)
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7633] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7638] audit: op="device-reapply" interface="eth1" ifindex=3 pid=54515 uid=0 result="fail" reason="Device is not activated"
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7643] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7650] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Sep 30 20:53:24 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7735] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7740] dhcp4 (eth0): state changed new lease, address=38.102.83.69
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7789] device (eth1): disconnecting for new activation request.
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7794] audit: op="connection-activate" uuid="1694c4d7-a3f7-50e2-b10a-6b7d951bb318" name="ci-private-network" pid=54515 uid=0 result="success"
Sep 30 20:53:24 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7820] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7905] device (eth1): Activation: starting connection 'ci-private-network' (1694c4d7-a3f7-50e2-b10a-6b7d951bb318)
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7929] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7938] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7953] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 kernel: br-ex: entered promiscuous mode
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7956] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=54515 uid=0 result="success"
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7957] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7961] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 kernel: vlan22: entered promiscuous mode
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7962] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7963] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7964] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7973] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.7990] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.8001] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.8010] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.8017] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.8022] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.8031] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.8038] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.8049] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.8054] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.8059] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.8063] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.8072] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Sep 30 20:53:24 compute-0 systemd-udevd[54519]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.8085] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.8091] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 kernel: vlan20: entered promiscuous mode
Sep 30 20:53:24 compute-0 systemd-udevd[54520]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.8161] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.8187] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.8198] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.8208] device (eth1): Activation: successful, device activated.
Sep 30 20:53:24 compute-0 kernel: vlan21: entered promiscuous mode
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.8239] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.8258] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.8301] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.8306] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.8312] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.8321] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.8340] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.8345] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.8354] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.8363] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.8388] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.8401] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.8426] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.8434] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.8438] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.8451] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.8462] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.8468] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 20:53:24 compute-0 NetworkManager[51733]: <info>  [1759265604.8479] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Sep 30 20:53:25 compute-0 NetworkManager[51733]: <info>  [1759265605.9757] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=54515 uid=0 result="success"
Sep 30 20:53:26 compute-0 NetworkManager[51733]: <info>  [1759265606.1618] checkpoint[0x559cc5ada950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Sep 30 20:53:26 compute-0 NetworkManager[51733]: <info>  [1759265606.1627] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=54515 uid=0 result="success"
Sep 30 20:53:26 compute-0 sudo[54846]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcjmehxnylgtlcurjnlunjutikgwugmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265605.6308916-850-118651911421054/AnsiballZ_async_status.py'
Sep 30 20:53:26 compute-0 sudo[54846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:26 compute-0 python3.9[54848]: ansible-ansible.legacy.async_status Invoked with jid=j355751741012.54509 mode=status _async_dir=/root/.ansible_async
Sep 30 20:53:26 compute-0 sudo[54846]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:26 compute-0 NetworkManager[51733]: <info>  [1759265606.4364] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=54515 uid=0 result="success"
Sep 30 20:53:26 compute-0 NetworkManager[51733]: <info>  [1759265606.4379] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=54515 uid=0 result="success"
Sep 30 20:53:26 compute-0 NetworkManager[51733]: <info>  [1759265606.6016] audit: op="networking-control" arg="global-dns-configuration" pid=54515 uid=0 result="success"
Sep 30 20:53:26 compute-0 NetworkManager[51733]: <info>  [1759265606.6254] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Sep 30 20:53:26 compute-0 NetworkManager[51733]: <info>  [1759265606.6404] audit: op="networking-control" arg="global-dns-configuration" pid=54515 uid=0 result="success"
Sep 30 20:53:26 compute-0 NetworkManager[51733]: <info>  [1759265606.6676] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=54515 uid=0 result="success"
Sep 30 20:53:26 compute-0 NetworkManager[51733]: <info>  [1759265606.8107] checkpoint[0x559cc5adaa20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Sep 30 20:53:26 compute-0 NetworkManager[51733]: <info>  [1759265606.8111] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=54515 uid=0 result="success"
Sep 30 20:53:26 compute-0 ansible-async_wrapper.py[54513]: Module complete (54513)
Sep 30 20:53:27 compute-0 ansible-async_wrapper.py[54512]: Done in kid B.
Sep 30 20:53:29 compute-0 sudo[54952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfnlufmhnqojllvpsayefhcucnnmrsqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265605.6308916-850-118651911421054/AnsiballZ_async_status.py'
Sep 30 20:53:29 compute-0 sudo[54952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:29 compute-0 python3.9[54954]: ansible-ansible.legacy.async_status Invoked with jid=j355751741012.54509 mode=status _async_dir=/root/.ansible_async
Sep 30 20:53:29 compute-0 sudo[54952]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:30 compute-0 sudo[55052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcxkgsschujmbwwgkgpjzziwnyfichol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265605.6308916-850-118651911421054/AnsiballZ_async_status.py'
Sep 30 20:53:30 compute-0 sudo[55052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:30 compute-0 python3.9[55054]: ansible-ansible.legacy.async_status Invoked with jid=j355751741012.54509 mode=cleanup _async_dir=/root/.ansible_async
Sep 30 20:53:30 compute-0 sudo[55052]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:31 compute-0 sudo[55204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwcilmdezxhponxfxqnlsovlybznwemm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265610.8541996-931-127259594888905/AnsiballZ_stat.py'
Sep 30 20:53:31 compute-0 sudo[55204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:31 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Sep 30 20:53:31 compute-0 python3.9[55206]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:53:31 compute-0 sudo[55204]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:31 compute-0 sudo[55329]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrbxbavltktkwrbzhyxagihgzbqgxbrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265610.8541996-931-127259594888905/AnsiballZ_copy.py'
Sep 30 20:53:31 compute-0 sudo[55329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:32 compute-0 python3.9[55331]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759265610.8541996-931-127259594888905/.source.returncode _original_basename=.uunejn7q follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:53:32 compute-0 sudo[55329]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:32 compute-0 sudo[55481]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otfisdnfxqyxyuzhkkigcxafxebhqhvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265612.4052172-979-30283434666536/AnsiballZ_stat.py'
Sep 30 20:53:32 compute-0 sudo[55481]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:32 compute-0 python3.9[55483]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:53:32 compute-0 sudo[55481]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:33 compute-0 sudo[55605]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxiztesrghsvevrbkwkspnwgnzahxhkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265612.4052172-979-30283434666536/AnsiballZ_copy.py'
Sep 30 20:53:33 compute-0 sudo[55605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:33 compute-0 python3.9[55607]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759265612.4052172-979-30283434666536/.source.cfg _original_basename=.3s63rf28 follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:53:33 compute-0 sudo[55605]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:34 compute-0 sudo[55757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdzowluyjzfnnbmoflspzcpwvffsfxrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265613.9403245-1024-123673320856680/AnsiballZ_systemd.py'
Sep 30 20:53:34 compute-0 sudo[55757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:34 compute-0 python3.9[55759]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 20:53:34 compute-0 systemd[1]: Reloading Network Manager...
Sep 30 20:53:34 compute-0 NetworkManager[51733]: <info>  [1759265614.7368] audit: op="reload" arg="0" pid=55763 uid=0 result="success"
Sep 30 20:53:34 compute-0 NetworkManager[51733]: <info>  [1759265614.7376] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Sep 30 20:53:34 compute-0 systemd[1]: Reloaded Network Manager.
Sep 30 20:53:34 compute-0 sudo[55757]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:36 compute-0 sshd-session[47733]: Connection closed by 192.168.122.30 port 60036
Sep 30 20:53:36 compute-0 sshd-session[47730]: pam_unix(sshd:session): session closed for user zuul
Sep 30 20:53:36 compute-0 systemd[1]: session-11.scope: Deactivated successfully.
Sep 30 20:53:36 compute-0 systemd[1]: session-11.scope: Consumed 51.538s CPU time.
Sep 30 20:53:36 compute-0 systemd-logind[792]: Session 11 logged out. Waiting for processes to exit.
Sep 30 20:53:36 compute-0 systemd-logind[792]: Removed session 11.
Sep 30 20:53:41 compute-0 sshd-session[55794]: Accepted publickey for zuul from 192.168.122.30 port 44394 ssh2: ECDSA SHA256:SmCicXXyU0CyMnob1MNtb+B3Td3Ord5lbeuM/VGGA5o
Sep 30 20:53:41 compute-0 systemd-logind[792]: New session 12 of user zuul.
Sep 30 20:53:41 compute-0 systemd[1]: Started Session 12 of User zuul.
Sep 30 20:53:41 compute-0 sshd-session[55794]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 20:53:42 compute-0 python3.9[55947]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:53:43 compute-0 python3.9[56102]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 20:53:44 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Sep 30 20:53:45 compute-0 python3.9[56292]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:53:45 compute-0 sshd-session[55797]: Connection closed by 192.168.122.30 port 44394
Sep 30 20:53:45 compute-0 sshd-session[55794]: pam_unix(sshd:session): session closed for user zuul
Sep 30 20:53:45 compute-0 systemd[1]: session-12.scope: Deactivated successfully.
Sep 30 20:53:45 compute-0 systemd[1]: session-12.scope: Consumed 2.686s CPU time.
Sep 30 20:53:45 compute-0 systemd-logind[792]: Session 12 logged out. Waiting for processes to exit.
Sep 30 20:53:45 compute-0 systemd-logind[792]: Removed session 12.
Sep 30 20:53:50 compute-0 sshd-session[56321]: Accepted publickey for zuul from 192.168.122.30 port 46344 ssh2: ECDSA SHA256:SmCicXXyU0CyMnob1MNtb+B3Td3Ord5lbeuM/VGGA5o
Sep 30 20:53:50 compute-0 systemd-logind[792]: New session 13 of user zuul.
Sep 30 20:53:50 compute-0 systemd[1]: Started Session 13 of User zuul.
Sep 30 20:53:50 compute-0 sshd-session[56321]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 20:53:52 compute-0 python3.9[56474]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:53:53 compute-0 python3.9[56629]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:53:54 compute-0 sudo[56783]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cengdpgfpuxvxfldtxjvqxcxuvvknlnc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265633.8071792-85-227713471572632/AnsiballZ_setup.py'
Sep 30 20:53:54 compute-0 sudo[56783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:54 compute-0 python3.9[56785]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 20:53:54 compute-0 sudo[56783]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:55 compute-0 sudo[56867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uoknxrywztkgyqnbfwqjmksddsvprbki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265633.8071792-85-227713471572632/AnsiballZ_dnf.py'
Sep 30 20:53:55 compute-0 sudo[56867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:55 compute-0 python3.9[56869]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 20:53:56 compute-0 sudo[56867]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:57 compute-0 sudo[57021]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zusjjrhpyjevbbvjafbzmxptpumwjgii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265636.9198027-121-166688094219532/AnsiballZ_setup.py'
Sep 30 20:53:57 compute-0 sudo[57021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:57 compute-0 python3.9[57023]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 20:53:57 compute-0 sudo[57021]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:58 compute-0 sudo[57212]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-leufysqhrdbinxvjevwvxkhesmtfvnwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265638.3202941-154-80546789274414/AnsiballZ_file.py'
Sep 30 20:53:58 compute-0 sudo[57212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:53:59 compute-0 python3.9[57214]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:53:59 compute-0 sudo[57212]: pam_unix(sudo:session): session closed for user root
Sep 30 20:53:59 compute-0 sudo[57364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjaafuynvddnhjlopcxvnmbqvgemflwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265639.367503-178-223809881198645/AnsiballZ_command.py'
Sep 30 20:53:59 compute-0 sudo[57364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:00 compute-0 python3.9[57366]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:54:00 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:54:00 compute-0 sudo[57364]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:00 compute-0 sudo[57527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llgsuvoitwtsuoraeddifdlhpgfoeftl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265640.3917556-202-116879325889915/AnsiballZ_stat.py'
Sep 30 20:54:00 compute-0 sudo[57527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:01 compute-0 python3.9[57529]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:54:01 compute-0 sudo[57527]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:01 compute-0 sudo[57605]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zyxqiehyjiggeujybpdzllqhwmtlqoll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265640.3917556-202-116879325889915/AnsiballZ_file.py'
Sep 30 20:54:01 compute-0 sudo[57605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:01 compute-0 python3.9[57607]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:54:01 compute-0 sudo[57605]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:02 compute-0 sudo[57757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umxrihewlsaypdpgihnwtyuokqzkfxzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265641.9362922-238-236008553543064/AnsiballZ_stat.py'
Sep 30 20:54:02 compute-0 sudo[57757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:02 compute-0 python3.9[57759]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:54:02 compute-0 sudo[57757]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:02 compute-0 sudo[57835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwwnipkzzkfyolsdykmagtekfnxdxmcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265641.9362922-238-236008553543064/AnsiballZ_file.py'
Sep 30 20:54:02 compute-0 sudo[57835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:02 compute-0 python3.9[57837]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:54:03 compute-0 sudo[57835]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:03 compute-0 sudo[57987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxgnvphpjyxotvgbahcqkomopbqubipt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265643.3768518-277-248144050977759/AnsiballZ_ini_file.py'
Sep 30 20:54:03 compute-0 sudo[57987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:04 compute-0 python3.9[57989]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:54:04 compute-0 sudo[57987]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:04 compute-0 sudo[58139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwdhuibijjsjmmtnerfetikzsjiztgqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265644.2416568-277-235204743936065/AnsiballZ_ini_file.py'
Sep 30 20:54:04 compute-0 sudo[58139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:04 compute-0 python3.9[58141]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:54:04 compute-0 sudo[58139]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:05 compute-0 sudo[58291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdzswkfpafmtwatwsipnaytphylvexqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265644.9608068-277-281411075591870/AnsiballZ_ini_file.py'
Sep 30 20:54:05 compute-0 sudo[58291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:05 compute-0 python3.9[58293]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:54:05 compute-0 sudo[58291]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:06 compute-0 sudo[58443]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umceoqrtcgmjxbymgtpmnqvaaawfdjww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265645.7154217-277-23572480209884/AnsiballZ_ini_file.py'
Sep 30 20:54:06 compute-0 sudo[58443]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:06 compute-0 python3.9[58445]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:54:06 compute-0 sudo[58443]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:07 compute-0 sudo[58595]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iuohwxbehiwokcsvnpfmmqxxtgudrlgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265646.8593063-370-68060195232262/AnsiballZ_dnf.py'
Sep 30 20:54:07 compute-0 sudo[58595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:07 compute-0 python3.9[58597]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 20:54:08 compute-0 sudo[58595]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:09 compute-0 sudo[58748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwfagtoyobipkdfnxmknafwshuptaipx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265649.259932-403-191568064990075/AnsiballZ_setup.py'
Sep 30 20:54:09 compute-0 sudo[58748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:09 compute-0 python3.9[58750]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:54:09 compute-0 sudo[58748]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:10 compute-0 sudo[58902]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgsvuiychkbcytiuieefgeervjklqsff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265650.2037742-427-217561598430652/AnsiballZ_stat.py'
Sep 30 20:54:10 compute-0 sudo[58902]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:10 compute-0 python3.9[58904]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 20:54:10 compute-0 sudo[58902]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:11 compute-0 sudo[59054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pztnqrwvpwenxvjknhnfjtkdluntpbtp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265651.1719582-454-217061702345249/AnsiballZ_stat.py'
Sep 30 20:54:11 compute-0 sudo[59054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:11 compute-0 python3.9[59056]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 20:54:11 compute-0 sudo[59054]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:12 compute-0 sudo[59206]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uplfvuwdyoylsghbthfyshoxyrxwxtyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265652.0741522-484-41550446102857/AnsiballZ_service_facts.py'
Sep 30 20:54:12 compute-0 sudo[59206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:12 compute-0 python3.9[59208]: ansible-service_facts Invoked
Sep 30 20:54:12 compute-0 network[59225]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Sep 30 20:54:12 compute-0 network[59226]: 'network-scripts' will be removed from distribution in near future.
Sep 30 20:54:12 compute-0 network[59227]: It is advised to switch to 'NetworkManager' instead for network management.
Sep 30 20:54:16 compute-0 sudo[59206]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:18 compute-0 sudo[59512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnkawcawadobeqummegyhalymhgrynrh ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1759265658.327922-523-180240609458359/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1759265658.327922-523-180240609458359/args'
Sep 30 20:54:18 compute-0 sudo[59512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:18 compute-0 sudo[59512]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:19 compute-0 sudo[59679]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klfdqlvyikhrmopwgarfrizpfxidhzgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265659.1331985-556-106395075070599/AnsiballZ_dnf.py'
Sep 30 20:54:19 compute-0 sudo[59679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:19 compute-0 python3.9[59681]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 20:54:21 compute-0 sudo[59679]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:22 compute-0 sudo[59832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpbyrsvywcwifioxghloxwqsippnczir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265661.5503511-595-175630546160878/AnsiballZ_package_facts.py'
Sep 30 20:54:22 compute-0 sudo[59832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:22 compute-0 python3.9[59834]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Sep 30 20:54:22 compute-0 sudo[59832]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:23 compute-0 sudo[59984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gyakzuibajknzklwvgkojsjxndjmivan ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265663.5949552-625-279438018702465/AnsiballZ_stat.py'
Sep 30 20:54:23 compute-0 sudo[59984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:24 compute-0 python3.9[59986]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:54:24 compute-0 sudo[59984]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:24 compute-0 sudo[60109]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drxoeancrnlfdkbhgjbnyraqiheaonmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265663.5949552-625-279438018702465/AnsiballZ_copy.py'
Sep 30 20:54:24 compute-0 sudo[60109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:24 compute-0 python3.9[60111]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759265663.5949552-625-279438018702465/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:54:24 compute-0 sudo[60109]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:25 compute-0 sudo[60263]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygutghvapkeefsmyzlxymrqewgruiejf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265665.1845486-670-255788730671539/AnsiballZ_stat.py'
Sep 30 20:54:25 compute-0 sudo[60263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:25 compute-0 python3.9[60265]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:54:25 compute-0 sudo[60263]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:26 compute-0 sudo[60388]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtpeauenbfvgcpfwohwbklnscknkthmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265665.1845486-670-255788730671539/AnsiballZ_copy.py'
Sep 30 20:54:26 compute-0 sudo[60388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:26 compute-0 python3.9[60390]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759265665.1845486-670-255788730671539/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:54:26 compute-0 sudo[60388]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:27 compute-0 sudo[60542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxawpeyymmnkgmobnecfsgesgtutjude ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265667.4486117-733-25900964889408/AnsiballZ_lineinfile.py'
Sep 30 20:54:27 compute-0 sudo[60542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:28 compute-0 python3.9[60544]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:54:28 compute-0 sudo[60542]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:29 compute-0 sudo[60696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgdntjhdtfpencdeirukakhrhrwvaorg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265669.2511508-778-29270688956214/AnsiballZ_setup.py'
Sep 30 20:54:29 compute-0 sudo[60696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:29 compute-0 python3.9[60698]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 20:54:30 compute-0 sudo[60696]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:30 compute-0 sudo[60780]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzylxfwmodxenmunztyeoeihrqgnaafm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265669.2511508-778-29270688956214/AnsiballZ_systemd.py'
Sep 30 20:54:30 compute-0 sudo[60780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:31 compute-0 python3.9[60782]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 20:54:31 compute-0 sudo[60780]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:32 compute-0 sudo[60934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xaeutsqvdgkbcrgbhsqphvwmuejsptaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265672.1533458-826-226924240545083/AnsiballZ_setup.py'
Sep 30 20:54:32 compute-0 sudo[60934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:32 compute-0 python3.9[60936]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 20:54:33 compute-0 sudo[60934]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:33 compute-0 sudo[61018]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzrjnahezofngjwgndophyzlszouikcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265672.1533458-826-226924240545083/AnsiballZ_systemd.py'
Sep 30 20:54:33 compute-0 sudo[61018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:33 compute-0 python3.9[61020]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 20:54:33 compute-0 chronyd[798]: chronyd exiting
Sep 30 20:54:33 compute-0 systemd[1]: Stopping NTP client/server...
Sep 30 20:54:33 compute-0 systemd[1]: chronyd.service: Deactivated successfully.
Sep 30 20:54:33 compute-0 systemd[1]: Stopped NTP client/server.
Sep 30 20:54:33 compute-0 systemd[1]: Starting NTP client/server...
Sep 30 20:54:33 compute-0 chronyd[61029]: chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Sep 30 20:54:33 compute-0 chronyd[61029]: Frequency -24.718 +/- 0.253 ppm read from /var/lib/chrony/drift
Sep 30 20:54:33 compute-0 chronyd[61029]: Loaded seccomp filter (level 2)
Sep 30 20:54:33 compute-0 systemd[1]: Started NTP client/server.
Sep 30 20:54:33 compute-0 sudo[61018]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:34 compute-0 sshd-session[56324]: Connection closed by 192.168.122.30 port 46344
Sep 30 20:54:34 compute-0 sshd-session[56321]: pam_unix(sshd:session): session closed for user zuul
Sep 30 20:54:34 compute-0 systemd[1]: session-13.scope: Deactivated successfully.
Sep 30 20:54:34 compute-0 systemd[1]: session-13.scope: Consumed 27.766s CPU time.
Sep 30 20:54:34 compute-0 systemd-logind[792]: Session 13 logged out. Waiting for processes to exit.
Sep 30 20:54:34 compute-0 systemd-logind[792]: Removed session 13.
Sep 30 20:54:39 compute-0 sshd-session[61055]: Accepted publickey for zuul from 192.168.122.30 port 42498 ssh2: ECDSA SHA256:SmCicXXyU0CyMnob1MNtb+B3Td3Ord5lbeuM/VGGA5o
Sep 30 20:54:39 compute-0 systemd-logind[792]: New session 14 of user zuul.
Sep 30 20:54:39 compute-0 systemd[1]: Started Session 14 of User zuul.
Sep 30 20:54:39 compute-0 sshd-session[61055]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 20:54:40 compute-0 python3.9[61208]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:54:41 compute-0 sudo[61362]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izmvlpnummhezfxvlaabjeldsvkdidai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265680.891125-64-52024130501862/AnsiballZ_file.py'
Sep 30 20:54:41 compute-0 sudo[61362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:41 compute-0 python3.9[61365]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:54:41 compute-0 sudo[61362]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:41 compute-0 sshd-session[61366]: error: kex_exchange_identification: read: Connection reset by peer
Sep 30 20:54:41 compute-0 sshd-session[61366]: Connection reset by 45.140.17.97 port 43063
Sep 30 20:54:42 compute-0 sudo[61540]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igefjyjcszuucdpjlselyzyzrgrluilo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265681.8869328-88-40683967036205/AnsiballZ_stat.py'
Sep 30 20:54:42 compute-0 sudo[61540]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:42 compute-0 python3.9[61542]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:54:42 compute-0 sudo[61540]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:42 compute-0 sudo[61619]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpcojwqvtqohpobpshtmwvcwjbgdryvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265681.8869328-88-40683967036205/AnsiballZ_file.py'
Sep 30 20:54:42 compute-0 sudo[61619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:43 compute-0 python3.9[61621]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.6mnw9jid recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:54:43 compute-0 sudo[61619]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:44 compute-0 sudo[61771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-forkoazqfjqfuseousmiounrsxcimplz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265683.7449923-148-4430952700448/AnsiballZ_stat.py'
Sep 30 20:54:44 compute-0 sudo[61771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:44 compute-0 python3.9[61773]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:54:44 compute-0 sudo[61771]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:45 compute-0 sudo[61894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vstzcgzbpeucxislrexeeplbagpampci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265683.7449923-148-4430952700448/AnsiballZ_copy.py'
Sep 30 20:54:45 compute-0 sudo[61894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:45 compute-0 python3.9[61896]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759265683.7449923-148-4430952700448/.source _original_basename=.ts5itqpx follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:54:45 compute-0 sudo[61894]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:45 compute-0 sudo[62046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnvtnnmpzwpovprokchkopyfzdwoscuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265685.6368096-196-143409259130676/AnsiballZ_file.py'
Sep 30 20:54:45 compute-0 sudo[62046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:46 compute-0 python3.9[62048]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:54:46 compute-0 sudo[62046]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:46 compute-0 sudo[62198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kksrphaehemkfyzioozzxcomewdhlyyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265686.497348-220-210607037008012/AnsiballZ_stat.py'
Sep 30 20:54:46 compute-0 sudo[62198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:47 compute-0 python3.9[62200]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:54:47 compute-0 sudo[62198]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:47 compute-0 sudo[62321]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmxdjjdcrltvogsxhtmitgarsspvrciz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265686.497348-220-210607037008012/AnsiballZ_copy.py'
Sep 30 20:54:47 compute-0 sudo[62321]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:47 compute-0 python3.9[62323]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759265686.497348-220-210607037008012/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:54:47 compute-0 sudo[62321]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:48 compute-0 sudo[62473]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwxervbnqsflmcaicxxspdxghbltdfov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265687.799364-220-82838295960078/AnsiballZ_stat.py'
Sep 30 20:54:48 compute-0 sudo[62473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:48 compute-0 python3.9[62475]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:54:48 compute-0 sudo[62473]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:48 compute-0 sudo[62596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahreeswiqilneghsksmevwskejtxqreq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265687.799364-220-82838295960078/AnsiballZ_copy.py'
Sep 30 20:54:48 compute-0 sudo[62596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:48 compute-0 python3.9[62598]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759265687.799364-220-82838295960078/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:54:49 compute-0 sudo[62596]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:49 compute-0 sudo[62748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iifhzbmvdxggemdegjpizsfelezwfmty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265689.4822922-307-92459084175880/AnsiballZ_file.py'
Sep 30 20:54:49 compute-0 sudo[62748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:49 compute-0 python3.9[62750]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:54:49 compute-0 sudo[62748]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:50 compute-0 sudo[62900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgzvsacjiccgvxbhusvrodxbvucptihw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265690.4381316-331-163792740693216/AnsiballZ_stat.py'
Sep 30 20:54:50 compute-0 sudo[62900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:50 compute-0 python3.9[62902]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:54:51 compute-0 sudo[62900]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:51 compute-0 sudo[63023]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isbjhtmwuadbeizcizazymjwqwdrjpex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265690.4381316-331-163792740693216/AnsiballZ_copy.py'
Sep 30 20:54:51 compute-0 sudo[63023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:51 compute-0 python3.9[63025]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265690.4381316-331-163792740693216/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:54:51 compute-0 sudo[63023]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:52 compute-0 sudo[63175]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slndsxaqyvieimmqyqtdrxdcwlahilxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265691.942206-376-247009800208961/AnsiballZ_stat.py'
Sep 30 20:54:52 compute-0 sudo[63175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:52 compute-0 python3.9[63177]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:54:52 compute-0 sudo[63175]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:52 compute-0 sudo[63298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxabkemcfzjnnsporuxzyxpaehzpsgtt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265691.942206-376-247009800208961/AnsiballZ_copy.py'
Sep 30 20:54:52 compute-0 sudo[63298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:53 compute-0 python3.9[63300]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265691.942206-376-247009800208961/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:54:53 compute-0 sudo[63298]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:54 compute-0 sudo[63450]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgwexvqgrjnseznyqyqhcpfcxuwlxumu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265693.5157795-421-278538224865766/AnsiballZ_systemd.py'
Sep 30 20:54:54 compute-0 sudo[63450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:54 compute-0 python3.9[63452]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 20:54:54 compute-0 systemd[1]: Reloading.
Sep 30 20:54:54 compute-0 systemd-sysv-generator[63481]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 20:54:54 compute-0 systemd-rc-local-generator[63478]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:54:54 compute-0 systemd[1]: Reloading.
Sep 30 20:54:54 compute-0 systemd-rc-local-generator[63521]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:54:54 compute-0 systemd-sysv-generator[63524]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 20:54:55 compute-0 systemd[1]: Starting EDPM Container Shutdown...
Sep 30 20:54:55 compute-0 systemd[1]: Finished EDPM Container Shutdown.
Sep 30 20:54:55 compute-0 sudo[63450]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:55 compute-0 sudo[63678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ripfdulkwyrqpwmuxpznwodeqsdennjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265695.4403856-445-170089565258189/AnsiballZ_stat.py'
Sep 30 20:54:55 compute-0 sudo[63678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:55 compute-0 python3.9[63680]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:54:55 compute-0 sudo[63678]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:56 compute-0 sudo[63801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvzzdtsihxcflqwbkiijcyimrjtwsulw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265695.4403856-445-170089565258189/AnsiballZ_copy.py'
Sep 30 20:54:56 compute-0 sudo[63801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:56 compute-0 python3.9[63803]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265695.4403856-445-170089565258189/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:54:56 compute-0 sudo[63801]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:57 compute-0 sudo[63953]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnatosoxvpprowghxlhuawurzuwmogwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265696.8654733-490-215479464759508/AnsiballZ_stat.py'
Sep 30 20:54:57 compute-0 sudo[63953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:57 compute-0 python3.9[63955]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:54:57 compute-0 sudo[63953]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:57 compute-0 sudo[64076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmvmnhhrzzmcvdbelfhbblkyqnimqmgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265696.8654733-490-215479464759508/AnsiballZ_copy.py'
Sep 30 20:54:57 compute-0 sudo[64076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:57 compute-0 python3.9[64078]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265696.8654733-490-215479464759508/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:54:57 compute-0 sudo[64076]: pam_unix(sudo:session): session closed for user root
Sep 30 20:54:58 compute-0 sudo[64228]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdjibyyvdersvdmgguquaqclrcvndhuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265698.3079438-535-50921821811532/AnsiballZ_systemd.py'
Sep 30 20:54:58 compute-0 sudo[64228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:54:58 compute-0 python3.9[64230]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 20:54:58 compute-0 systemd[1]: Reloading.
Sep 30 20:54:59 compute-0 systemd-rc-local-generator[64254]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:54:59 compute-0 systemd-sysv-generator[64260]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 20:54:59 compute-0 systemd[1]: Reloading.
Sep 30 20:54:59 compute-0 systemd-sysv-generator[64299]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 20:54:59 compute-0 systemd-rc-local-generator[64293]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:54:59 compute-0 systemd[1]: Starting Create netns directory...
Sep 30 20:54:59 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Sep 30 20:54:59 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Sep 30 20:54:59 compute-0 systemd[1]: Finished Create netns directory.
Sep 30 20:54:59 compute-0 sudo[64228]: pam_unix(sudo:session): session closed for user root
Sep 30 20:55:00 compute-0 python3.9[64457]: ansible-ansible.builtin.service_facts Invoked
Sep 30 20:55:00 compute-0 network[64474]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Sep 30 20:55:00 compute-0 network[64475]: 'network-scripts' will be removed from distribution in near future.
Sep 30 20:55:00 compute-0 network[64476]: It is advised to switch to 'NetworkManager' instead for network management.
Sep 30 20:55:05 compute-0 sudo[64738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbxyqsxdgtdywnjovlnoncyzmzkwkcgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265705.675356-583-20591946012078/AnsiballZ_systemd.py'
Sep 30 20:55:05 compute-0 sudo[64738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:55:06 compute-0 python3.9[64740]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 20:55:06 compute-0 systemd[1]: Reloading.
Sep 30 20:55:06 compute-0 systemd-rc-local-generator[64765]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:55:06 compute-0 systemd-sysv-generator[64771]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 20:55:06 compute-0 systemd[1]: Stopping IPv4 firewall with iptables...
Sep 30 20:55:06 compute-0 iptables.init[64780]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Sep 30 20:55:06 compute-0 iptables.init[64780]: iptables: Flushing firewall rules: [  OK  ]
Sep 30 20:55:06 compute-0 systemd[1]: iptables.service: Deactivated successfully.
Sep 30 20:55:06 compute-0 systemd[1]: Stopped IPv4 firewall with iptables.
Sep 30 20:55:06 compute-0 sudo[64738]: pam_unix(sudo:session): session closed for user root
Sep 30 20:55:07 compute-0 sudo[64974]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmzkvoxddocujuceyxtkdysohyecfxcb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265707.0754435-583-25949893660012/AnsiballZ_systemd.py'
Sep 30 20:55:07 compute-0 sudo[64974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:55:07 compute-0 python3.9[64976]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 20:55:07 compute-0 sudo[64974]: pam_unix(sudo:session): session closed for user root
Sep 30 20:55:08 compute-0 sudo[65128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxulzimttwouvsdshwzhwyrtxqhmypkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265708.397513-631-27628167308768/AnsiballZ_systemd.py'
Sep 30 20:55:08 compute-0 sudo[65128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:55:09 compute-0 python3.9[65130]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 20:55:09 compute-0 systemd[1]: Reloading.
Sep 30 20:55:09 compute-0 systemd-rc-local-generator[65160]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:55:09 compute-0 systemd-sysv-generator[65164]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 20:55:09 compute-0 systemd[1]: Starting Netfilter Tables...
Sep 30 20:55:09 compute-0 systemd[1]: Finished Netfilter Tables.
Sep 30 20:55:09 compute-0 sudo[65128]: pam_unix(sudo:session): session closed for user root
Sep 30 20:55:10 compute-0 sudo[65320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fklgcyieoudnooosgilcjkiqibftrbcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265709.7721481-655-53586538096737/AnsiballZ_command.py'
Sep 30 20:55:10 compute-0 sudo[65320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:55:10 compute-0 python3.9[65322]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:55:10 compute-0 sudo[65320]: pam_unix(sudo:session): session closed for user root
Sep 30 20:55:11 compute-0 sudo[65473]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okmtazmgqxtvtjcxnwndoylyoohafwio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265711.137485-697-116003925198091/AnsiballZ_stat.py'
Sep 30 20:55:11 compute-0 sudo[65473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:55:11 compute-0 python3.9[65475]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:55:11 compute-0 sudo[65473]: pam_unix(sudo:session): session closed for user root
Sep 30 20:55:12 compute-0 sudo[65598]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzfcjwggotfeofdwjkcqmawinkngclbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265711.137485-697-116003925198091/AnsiballZ_copy.py'
Sep 30 20:55:12 compute-0 sudo[65598]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:55:12 compute-0 python3.9[65600]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759265711.137485-697-116003925198091/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=4729b6ffc5b555fa142bf0b6e6dc15609cb89a22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:55:12 compute-0 sudo[65598]: pam_unix(sudo:session): session closed for user root
Sep 30 20:55:13 compute-0 python3.9[65751]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 20:55:13 compute-0 polkitd[6535]: Registered Authentication Agent for unix-process:65753:240910 (system bus name :1.550 [/usr/bin/pkttyagent --notify-fd 5 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8)
Sep 30 20:55:38 compute-0 polkitd[6535]: Unregistered Authentication Agent for unix-process:65753:240910 (system bus name :1.550, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8) (disconnected from bus)
Sep 30 20:55:38 compute-0 polkit-agent-helper-1[65765]: pam_unix(polkit-1:auth): conversation failed
Sep 30 20:55:38 compute-0 polkit-agent-helper-1[65765]: pam_unix(polkit-1:auth): auth could not identify password for [root]
Sep 30 20:55:38 compute-0 polkitd[6535]: Operator of unix-process:65753:240910 FAILED to authenticate to gain authorization for action org.freedesktop.systemd1.manage-units for system-bus-name::1.549 [<unknown>] (owned by unix-user:zuul)
Sep 30 20:55:39 compute-0 sshd-session[61058]: Connection closed by 192.168.122.30 port 42498
Sep 30 20:55:39 compute-0 sshd-session[61055]: pam_unix(sshd:session): session closed for user zuul
Sep 30 20:55:39 compute-0 systemd[1]: session-14.scope: Deactivated successfully.
Sep 30 20:55:39 compute-0 systemd[1]: session-14.scope: Consumed 21.125s CPU time.
Sep 30 20:55:39 compute-0 systemd-logind[792]: Session 14 logged out. Waiting for processes to exit.
Sep 30 20:55:39 compute-0 systemd-logind[792]: Removed session 14.
Sep 30 20:55:51 compute-0 sshd-session[65791]: Accepted publickey for zuul from 192.168.122.30 port 59094 ssh2: ECDSA SHA256:SmCicXXyU0CyMnob1MNtb+B3Td3Ord5lbeuM/VGGA5o
Sep 30 20:55:51 compute-0 systemd-logind[792]: New session 15 of user zuul.
Sep 30 20:55:51 compute-0 systemd[1]: Started Session 15 of User zuul.
Sep 30 20:55:51 compute-0 sshd-session[65791]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 20:55:52 compute-0 python3.9[65944]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:55:54 compute-0 sudo[66098]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzkfxzpotawhaejsydoialqehhfyeytf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265753.5695255-64-116760594105321/AnsiballZ_file.py'
Sep 30 20:55:54 compute-0 sudo[66098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:55:54 compute-0 python3.9[66100]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:55:54 compute-0 sudo[66098]: pam_unix(sudo:session): session closed for user root
Sep 30 20:55:55 compute-0 sudo[66273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzmifnffbxduhnhmawjeciavunfslwmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265754.5736876-88-246056688966134/AnsiballZ_stat.py'
Sep 30 20:55:55 compute-0 sudo[66273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:55:55 compute-0 python3.9[66275]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:55:55 compute-0 sudo[66273]: pam_unix(sudo:session): session closed for user root
Sep 30 20:55:55 compute-0 sudo[66351]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbkqnhswbpnhpootqjzobbcdvmnslwwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265754.5736876-88-246056688966134/AnsiballZ_file.py'
Sep 30 20:55:55 compute-0 sudo[66351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:55:55 compute-0 python3.9[66353]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.jox4nzrx recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:55:55 compute-0 sudo[66351]: pam_unix(sudo:session): session closed for user root
Sep 30 20:55:56 compute-0 sudo[66503]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmtlzirwyxkujcewfivyigwotybrwtci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265756.3669307-148-3642920135985/AnsiballZ_stat.py'
Sep 30 20:55:56 compute-0 sudo[66503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:55:56 compute-0 python3.9[66505]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:55:56 compute-0 sudo[66503]: pam_unix(sudo:session): session closed for user root
Sep 30 20:55:57 compute-0 sudo[66581]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnouctveogkysuzvhdqltwfequlllvsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265756.3669307-148-3642920135985/AnsiballZ_file.py'
Sep 30 20:55:57 compute-0 sudo[66581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:55:57 compute-0 python3.9[66583]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=.z9_8z9yu recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:55:57 compute-0 sudo[66581]: pam_unix(sudo:session): session closed for user root
Sep 30 20:55:58 compute-0 sudo[66733]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpxcdpehrfmlfoycynqpbqvirqewyoxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265757.7850847-187-257414615029125/AnsiballZ_file.py'
Sep 30 20:55:58 compute-0 sudo[66733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:55:58 compute-0 python3.9[66735]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:55:58 compute-0 sudo[66733]: pam_unix(sudo:session): session closed for user root
Sep 30 20:55:58 compute-0 sudo[66885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlbgcnkecvuoeqtefhsuvqraiglssyyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265758.613991-211-105382367801210/AnsiballZ_stat.py'
Sep 30 20:55:58 compute-0 sudo[66885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:55:59 compute-0 python3.9[66887]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:55:59 compute-0 sudo[66885]: pam_unix(sudo:session): session closed for user root
Sep 30 20:55:59 compute-0 sudo[66963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntszxqrvuyimzchkyabezotteksrxpis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265758.613991-211-105382367801210/AnsiballZ_file.py'
Sep 30 20:55:59 compute-0 sudo[66963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:55:59 compute-0 python3.9[66965]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:55:59 compute-0 sudo[66963]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:00 compute-0 sudo[67115]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hygpuvovstkrgmdmewkcpvpqvriqxqnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265759.9412785-211-36636284243697/AnsiballZ_stat.py'
Sep 30 20:56:00 compute-0 sudo[67115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:00 compute-0 python3.9[67117]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:56:00 compute-0 sudo[67115]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:00 compute-0 sudo[67193]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ooufhxnvqoocmuwucbvwivrnbytahaia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265759.9412785-211-36636284243697/AnsiballZ_file.py'
Sep 30 20:56:00 compute-0 sudo[67193]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:00 compute-0 python3.9[67195]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:56:00 compute-0 sudo[67193]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:01 compute-0 sudo[67345]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgrzxjehnnzjteygobhvvwittzyufwrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265761.1929083-280-13416016759444/AnsiballZ_file.py'
Sep 30 20:56:01 compute-0 sudo[67345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:01 compute-0 python3.9[67347]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:56:01 compute-0 sudo[67345]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:02 compute-0 sudo[67497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbvpeaziszgufattoasmlqzleydshnxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265762.124969-304-269259508700820/AnsiballZ_stat.py'
Sep 30 20:56:02 compute-0 sudo[67497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:02 compute-0 python3.9[67499]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:56:02 compute-0 sudo[67497]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:02 compute-0 sudo[67575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvqaatwywucoeehhithkyqabbkodoxyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265762.124969-304-269259508700820/AnsiballZ_file.py'
Sep 30 20:56:02 compute-0 sudo[67575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:03 compute-0 python3.9[67577]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:56:03 compute-0 sudo[67575]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:03 compute-0 sudo[67727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfnqzmuuphwupjhlxbuwpgjzkvpqgbbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265763.408709-340-234004124971454/AnsiballZ_stat.py'
Sep 30 20:56:03 compute-0 sudo[67727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:03 compute-0 python3.9[67729]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:56:03 compute-0 sudo[67727]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:04 compute-0 sudo[67805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbclczznvclzkhemgfmxqvfsnhlgtcbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265763.408709-340-234004124971454/AnsiballZ_file.py'
Sep 30 20:56:04 compute-0 sudo[67805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:04 compute-0 python3.9[67807]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:56:04 compute-0 sudo[67805]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:05 compute-0 sudo[67957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eaolzdbqqmryqrcktfprcjoaogpobmkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265764.6608832-376-96736432448911/AnsiballZ_systemd.py'
Sep 30 20:56:05 compute-0 sudo[67957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:05 compute-0 python3.9[67959]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 20:56:05 compute-0 systemd[1]: Reloading.
Sep 30 20:56:05 compute-0 systemd-rc-local-generator[67987]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:56:05 compute-0 systemd-sysv-generator[67991]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 20:56:06 compute-0 sudo[67957]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:06 compute-0 sudo[68146]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sayoarhfecpoeqedbkhpponhfysbihlv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265766.3251662-400-279360833426199/AnsiballZ_stat.py'
Sep 30 20:56:06 compute-0 sudo[68146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:06 compute-0 python3.9[68148]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:56:06 compute-0 sudo[68146]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:07 compute-0 sudo[68224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwwduwbupkhwkytnmegsqtgyqipzljch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265766.3251662-400-279360833426199/AnsiballZ_file.py'
Sep 30 20:56:07 compute-0 sudo[68224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:07 compute-0 python3.9[68226]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:56:07 compute-0 sudo[68224]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:08 compute-0 sudo[68376]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evofuzynwqchwuzqcluznokmdmuxrovp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265767.8668504-436-197297161491870/AnsiballZ_stat.py'
Sep 30 20:56:08 compute-0 sudo[68376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:08 compute-0 python3.9[68378]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:56:08 compute-0 sudo[68376]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:08 compute-0 sudo[68454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avvekxtgixzzjxzcmycykxwvoveosuxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265767.8668504-436-197297161491870/AnsiballZ_file.py'
Sep 30 20:56:08 compute-0 sudo[68454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:08 compute-0 python3.9[68456]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:56:08 compute-0 sudo[68454]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:09 compute-0 sudo[68606]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxjwlhyjnndrqhojyiuiqpjwlesbauyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265769.2541606-472-247957592702462/AnsiballZ_systemd.py'
Sep 30 20:56:09 compute-0 sudo[68606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:09 compute-0 python3.9[68608]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 20:56:09 compute-0 systemd[1]: Reloading.
Sep 30 20:56:10 compute-0 systemd-sysv-generator[68637]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 20:56:10 compute-0 systemd-rc-local-generator[68633]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:56:10 compute-0 systemd[1]: Starting Create netns directory...
Sep 30 20:56:10 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Sep 30 20:56:10 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Sep 30 20:56:10 compute-0 systemd[1]: Finished Create netns directory.
Sep 30 20:56:10 compute-0 sudo[68606]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:11 compute-0 python3.9[68800]: ansible-ansible.builtin.service_facts Invoked
Sep 30 20:56:11 compute-0 network[68817]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Sep 30 20:56:11 compute-0 network[68818]: 'network-scripts' will be removed from distribution in near future.
Sep 30 20:56:11 compute-0 network[68819]: It is advised to switch to 'NetworkManager' instead for network management.
Sep 30 20:56:17 compute-0 sudo[69080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-daubxtzlzyebfkkvdyswsbagijveozpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265776.7682426-550-280019101034562/AnsiballZ_stat.py'
Sep 30 20:56:17 compute-0 sudo[69080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:17 compute-0 python3.9[69082]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:56:17 compute-0 sudo[69080]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:17 compute-0 sudo[69158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmfkzjdmfzagtlrghodefilsmvwfvhzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265776.7682426-550-280019101034562/AnsiballZ_file.py'
Sep 30 20:56:17 compute-0 sudo[69158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:17 compute-0 python3.9[69160]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:56:17 compute-0 sudo[69158]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:18 compute-0 sudo[69310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phsofzmbnjjntozknjcynbhzhxqwwncx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265778.2894032-589-220257490719694/AnsiballZ_file.py'
Sep 30 20:56:18 compute-0 sudo[69310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:18 compute-0 python3.9[69312]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:56:18 compute-0 sudo[69310]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:19 compute-0 sudo[69462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rfcdtkzooqlqnhwgyaipvcpcfdgjgbhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265779.087269-613-97753627510595/AnsiballZ_stat.py'
Sep 30 20:56:19 compute-0 sudo[69462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:19 compute-0 python3.9[69464]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:56:19 compute-0 sudo[69462]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:20 compute-0 sudo[69585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eseanuksbkjzwedxpjrexwgbdbtayobf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265779.087269-613-97753627510595/AnsiballZ_copy.py'
Sep 30 20:56:20 compute-0 sudo[69585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:20 compute-0 python3.9[69587]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265779.087269-613-97753627510595/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:56:20 compute-0 sudo[69585]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:21 compute-0 sudo[69737]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqwousqdrzshnyocwbjqauuedgiuflns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265780.7939856-667-122001310468704/AnsiballZ_timezone.py'
Sep 30 20:56:21 compute-0 sudo[69737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:21 compute-0 python3.9[69739]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Sep 30 20:56:21 compute-0 systemd[1]: Starting Time & Date Service...
Sep 30 20:56:21 compute-0 systemd[1]: Started Time & Date Service.
Sep 30 20:56:21 compute-0 sudo[69737]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:22 compute-0 sudo[69893]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-deaqtwvzsbylcriejlkpiwycrrzqwgwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265782.1448748-694-275225232161175/AnsiballZ_file.py'
Sep 30 20:56:22 compute-0 sudo[69893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:22 compute-0 python3.9[69895]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:56:22 compute-0 sudo[69893]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:23 compute-0 sudo[70045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsgwhcnsnbajxufqqnfkyhbmhoqspjov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265783.0050828-718-190153359598002/AnsiballZ_stat.py'
Sep 30 20:56:23 compute-0 sudo[70045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:23 compute-0 python3.9[70047]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:56:23 compute-0 sudo[70045]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:24 compute-0 sudo[70168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aiggequclfgerzvqmtnvlvwrhkieddjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265783.0050828-718-190153359598002/AnsiballZ_copy.py'
Sep 30 20:56:24 compute-0 sudo[70168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:24 compute-0 python3.9[70170]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759265783.0050828-718-190153359598002/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:56:24 compute-0 sudo[70168]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:25 compute-0 sudo[70320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysxnugldnkhkovufdexocbazpnrskqqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265784.7473662-763-135359981369589/AnsiballZ_stat.py'
Sep 30 20:56:25 compute-0 sudo[70320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:25 compute-0 python3.9[70322]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:56:25 compute-0 sudo[70320]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:25 compute-0 sudo[70443]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtckxajdrwmrrwvbhrevqxrokrwdkpwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265784.7473662-763-135359981369589/AnsiballZ_copy.py'
Sep 30 20:56:25 compute-0 sudo[70443]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:25 compute-0 python3.9[70445]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759265784.7473662-763-135359981369589/.source.yaml _original_basename=.1t5gv10q follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:56:25 compute-0 sudo[70443]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:26 compute-0 sudo[70595]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovnhfijpisfwhpwznxosvlpcmomhdrln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265786.3491893-808-130816965385348/AnsiballZ_stat.py'
Sep 30 20:56:26 compute-0 sudo[70595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:26 compute-0 python3.9[70597]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:56:26 compute-0 sudo[70595]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:27 compute-0 sudo[70718]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnanjewmvvwoiihgnjtiudbknhuhpddm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265786.3491893-808-130816965385348/AnsiballZ_copy.py'
Sep 30 20:56:27 compute-0 sudo[70718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:27 compute-0 python3.9[70720]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265786.3491893-808-130816965385348/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:56:27 compute-0 sudo[70718]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:28 compute-0 sudo[70870]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efhyqmcjklxqgmftxjuemgemiwqdxtio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265787.9970112-853-80582128063276/AnsiballZ_command.py'
Sep 30 20:56:28 compute-0 sudo[70870]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:28 compute-0 python3.9[70872]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:56:28 compute-0 sudo[70870]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:29 compute-0 sudo[71023]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkvoisnfmfvfjvoiewunycmbytkmpoaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265788.934443-877-211706009215902/AnsiballZ_command.py'
Sep 30 20:56:29 compute-0 sudo[71023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:29 compute-0 python3.9[71025]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:56:29 compute-0 sudo[71023]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:30 compute-0 sudo[71176]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofabhbrpyggrabbslycxgyqdqtajyzfr ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759265789.977555-901-259964059878480/AnsiballZ_edpm_nftables_from_files.py'
Sep 30 20:56:30 compute-0 sudo[71176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:30 compute-0 python3[71178]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Sep 30 20:56:30 compute-0 sudo[71176]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:31 compute-0 sudo[71328]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twxjeubjhzkryzytmqvxnkhfluvktdbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265791.1624985-925-238711173727140/AnsiballZ_stat.py'
Sep 30 20:56:31 compute-0 sudo[71328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:31 compute-0 python3.9[71330]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:56:31 compute-0 sudo[71328]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:32 compute-0 sudo[71451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ermolhlojueyiyekwejoaliflccdrliz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265791.1624985-925-238711173727140/AnsiballZ_copy.py'
Sep 30 20:56:32 compute-0 sudo[71451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:32 compute-0 python3.9[71453]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265791.1624985-925-238711173727140/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:56:32 compute-0 sudo[71451]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:33 compute-0 sudo[71603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aunqlcvfrjfatjrlsfluhvrcqsolygoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265792.6915078-970-131001955192824/AnsiballZ_stat.py'
Sep 30 20:56:33 compute-0 sudo[71603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:33 compute-0 python3.9[71605]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:56:33 compute-0 sudo[71603]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:33 compute-0 sudo[71726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdowbmkvoazvdwvaryvzxgnmaoqkfoff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265792.6915078-970-131001955192824/AnsiballZ_copy.py'
Sep 30 20:56:33 compute-0 sudo[71726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:33 compute-0 python3.9[71728]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265792.6915078-970-131001955192824/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:56:33 compute-0 sudo[71726]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:34 compute-0 sudo[71878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxkxqrkwqbwgyzrmwghyakiqdrnkkxfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265794.2379985-1015-245785215210911/AnsiballZ_stat.py'
Sep 30 20:56:34 compute-0 sudo[71878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:34 compute-0 python3.9[71880]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:56:34 compute-0 sudo[71878]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:35 compute-0 sudo[72001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsofrrkfqshphrbjkrwszdzsgychfpup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265794.2379985-1015-245785215210911/AnsiballZ_copy.py'
Sep 30 20:56:35 compute-0 sudo[72001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:35 compute-0 python3.9[72003]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265794.2379985-1015-245785215210911/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:56:35 compute-0 sudo[72001]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:36 compute-0 sudo[72155]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbqoyrzqqfhizgqijknyolcnnwhadoqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265796.0235007-1060-159033001975519/AnsiballZ_stat.py'
Sep 30 20:56:36 compute-0 sudo[72155]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:36 compute-0 sshd-session[72004]: Invalid user  from 47.86.37.20 port 57552
Sep 30 20:56:36 compute-0 python3.9[72157]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:56:36 compute-0 sudo[72155]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:36 compute-0 sudo[72278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkyvigwycsinpayrnhckdxraoukhbady ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265796.0235007-1060-159033001975519/AnsiballZ_copy.py'
Sep 30 20:56:36 compute-0 sudo[72278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:37 compute-0 python3.9[72280]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265796.0235007-1060-159033001975519/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:56:37 compute-0 sudo[72278]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:38 compute-0 sudo[72430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgkvoqaavtjydeiugxdrbuzqnwlajlva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265797.6225042-1105-196477062286753/AnsiballZ_stat.py'
Sep 30 20:56:38 compute-0 sudo[72430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:38 compute-0 python3.9[72432]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:56:38 compute-0 sudo[72430]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:38 compute-0 sudo[72553]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dthgjdwzlguyaokewtwgjkhhwgnkqija ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265797.6225042-1105-196477062286753/AnsiballZ_copy.py'
Sep 30 20:56:38 compute-0 sudo[72553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:38 compute-0 python3.9[72555]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265797.6225042-1105-196477062286753/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:56:38 compute-0 sudo[72553]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:39 compute-0 sudo[72705]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbtryybjnjfqmhcbllkwfdxncdgzlkaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265799.2044826-1150-77712833128448/AnsiballZ_file.py'
Sep 30 20:56:39 compute-0 sudo[72705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:39 compute-0 python3.9[72707]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:56:39 compute-0 sudo[72705]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:40 compute-0 sudo[72857]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czgcibtksimkultpuztyvugqejhwzsxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265800.094973-1174-270167043383890/AnsiballZ_command.py'
Sep 30 20:56:40 compute-0 sudo[72857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:40 compute-0 python3.9[72859]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:56:40 compute-0 sudo[72857]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:41 compute-0 sudo[73016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rfmnsuihphzpjxcwdhhdfvoohexbpbji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265801.0726197-1198-249581218016071/AnsiballZ_blockinfile.py'
Sep 30 20:56:41 compute-0 sudo[73016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:41 compute-0 python3.9[73018]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:56:41 compute-0 sudo[73016]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:42 compute-0 sudo[73169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rshqalhwrbruiqgvquijmryheeyblxup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265802.1987474-1225-143667999260550/AnsiballZ_file.py'
Sep 30 20:56:42 compute-0 sudo[73169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:42 compute-0 python3.9[73171]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:56:42 compute-0 sudo[73169]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:43 compute-0 sudo[73321]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qekburxcgsdsqcuqkooxhedaqqowyxop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265802.9964411-1225-172004109005315/AnsiballZ_file.py'
Sep 30 20:56:43 compute-0 sudo[73321]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:43 compute-0 python3.9[73323]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:56:43 compute-0 sudo[73321]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:43 compute-0 sshd-session[72004]: Connection closed by invalid user  47.86.37.20 port 57552 [preauth]
Sep 30 20:56:43 compute-0 chronyd[61029]: Selected source 162.159.200.1 (pool.ntp.org)
Sep 30 20:56:44 compute-0 sudo[73473]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntyexsfyojbgisligwonhihprvxbuccb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265803.8031976-1270-130050910866467/AnsiballZ_mount.py'
Sep 30 20:56:44 compute-0 sudo[73473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:44 compute-0 python3.9[73475]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Sep 30 20:56:44 compute-0 sudo[73473]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:44 compute-0 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 20:56:44 compute-0 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 20:56:45 compute-0 sudo[73628]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aghwftssiivtguhquhvieevcqycuqzhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265804.7591438-1270-216173941226819/AnsiballZ_mount.py'
Sep 30 20:56:45 compute-0 sudo[73628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:45 compute-0 python3.9[73630]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Sep 30 20:56:45 compute-0 sudo[73628]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:46 compute-0 sshd-session[65794]: Connection closed by 192.168.122.30 port 59094
Sep 30 20:56:46 compute-0 sshd-session[65791]: pam_unix(sshd:session): session closed for user zuul
Sep 30 20:56:46 compute-0 systemd-logind[792]: Session 15 logged out. Waiting for processes to exit.
Sep 30 20:56:46 compute-0 systemd[1]: session-15.scope: Deactivated successfully.
Sep 30 20:56:46 compute-0 systemd[1]: session-15.scope: Consumed 34.952s CPU time.
Sep 30 20:56:46 compute-0 systemd-logind[792]: Removed session 15.
Sep 30 20:56:46 compute-0 sshd[1008]: Timeout before authentication for connection from 49.64.169.153 to 38.102.83.69, pid = 61489
Sep 30 20:56:51 compute-0 sshd-session[73656]: Accepted publickey for zuul from 192.168.122.30 port 56312 ssh2: ECDSA SHA256:SmCicXXyU0CyMnob1MNtb+B3Td3Ord5lbeuM/VGGA5o
Sep 30 20:56:51 compute-0 systemd-logind[792]: New session 16 of user zuul.
Sep 30 20:56:51 compute-0 systemd[1]: Started Session 16 of User zuul.
Sep 30 20:56:51 compute-0 sshd-session[73656]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 20:56:51 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Sep 30 20:56:51 compute-0 sudo[73812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihfiiibqppnztmmlyyukdrbhkrnafbxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265811.3625321-23-151401579677617/AnsiballZ_tempfile.py'
Sep 30 20:56:51 compute-0 sudo[73812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:52 compute-0 python3.9[73814]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Sep 30 20:56:52 compute-0 sudo[73812]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:52 compute-0 sudo[73964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbitohommjgbbvmxmjjipadlbzhuzoix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265812.4130259-59-187618128252159/AnsiballZ_stat.py'
Sep 30 20:56:52 compute-0 sudo[73964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:53 compute-0 python3.9[73966]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 20:56:53 compute-0 sudo[73964]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:54 compute-0 sudo[74116]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-myjdzigioyuboscbwepdmezicrescqzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265813.5127592-89-144657999193920/AnsiballZ_setup.py'
Sep 30 20:56:54 compute-0 sudo[74116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:54 compute-0 python3.9[74118]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:56:54 compute-0 sudo[74116]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:55 compute-0 sudo[74268]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkcviadlbraxmhnrlymyphbitwwbiyyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265815.1650765-114-78084488250270/AnsiballZ_blockinfile.py'
Sep 30 20:56:55 compute-0 sudo[74268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:55 compute-0 python3.9[74270]: ansible-ansible.builtin.blockinfile Invoked with block=compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCwbguriNnKe1J2ZDq7gor8K7Q+lfGJj0jSgjJsKEAo86u38uv5C6xHzMNeAjrvv5lEpFfzr0c3it3JW+dozBbfDykJ0wkEOeR0LHleEuFWTmPiG4RPVZ1m1J78yNSFWhD5VykkBwAqirHFrpywQIhDpw77JTQCi/xxeNvkj+vXx01l2nVO8cvDgHufzI+lR13XAFYs2zFc66/eOej+HaLiLK9IJQxGbnRa6+QZHQ/W2ou3kUAJYqnXBI8i1n0DyPWcfnDiKxmNlhvOFCVrRYKldToQO2oDNq7UFG3hUDzhT6NKABZTCQ/V/AXTwChfyV+kOR3+MOcdbvsymKiU5eIqWfFljY8pnxR//zYoaSfKNoi6fCUnilUdrEMOEzqDDgVhnS7ml2uz9JTRHuHbN8onUffLvBa7V/nLvlcv6WSbHHiQ/+janZ294/vCQuDTupMIAIvWj8cWHV3qv/Y/quONpe+lDn2/vxourJmj+K/mlEnSWCs49ztenPV15oUk51c=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHhbJv8lx2yVUO6nnQVlESK5ivpJT0r/PkYxvcd6qUmW
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBH6lp5638PMrvExSWZKHaxHkYDLYTl4v6jLIL5XRhmXjHAbQo8UkrhUVofrCF0y9aTPCQ4m6QhJ9ntO/Pb9rWgE=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCdFkN8U9xxznTs4l1dWKSgqtKSYoqjAq8N76IdRfa7xWuGXIDRX+Z/XS7MRDpugTVs4N689TRx9Piour7pbLLbVTdHtKAVY1ZFbO52TOv/Ya6j0Royn16s8ReajdozIptjKmHy9G2FpdOX7C4Y20cMciRCgKF+Uk1cb0iX7vYYWIprMI2dgvdoP3rAj0PkVPXji+oCGf4tEApwuWhji+GnWIVl/vFVGzg0S/OILQhkMHPBMMFdTA5/Xg4Z/liXQoQ4zDyYzjvYLXLSky6ySf/RJ58ny5ps0tRkbkvC0rwt37FhZbmlGFIZg33S799Zjt5rvLF5JSwuXGUsXu8EJQBHo5+QJwXxOcVlBJtbEkk/A5ashHxFcql24pTE/TJfJyvpvM3rRhZR8I/8DmYiyhblB27IxHhXGGeoQ17NZFuGnTwfGeyShJ642Bm+i/bAWgoanfixF1edEObUsck8KEpBwun3G/SLIba+hLXnGRGtjbEnsn7rANIXVoHUeM4xMEE=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEj+XpmwzMgMg+VNuVHNqnvVSOmbrJ0iinPB93cL5gcO
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGEM3NBdfKRXE3PLB677JgGmO3w3HhbkbYxeBS7PkJCAqAqklzpLc5E0r4ovcfzPiQaR/ONvG1z+RYgVwf+jfWU=
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCWm45T0rx8eEGSOQf1IMie1aUJ+iwjTlwiWrmoGXyOX+gHYFcAwEzYr7tsCLrK7pQ2HBdBK6ZtTkFBMiRtZrZIzZ9bSytf3lXlayZia2khpG4ghK/3E9JZ4ThQmEAGxzPaFT+MCqKmmeWpsp5RQ8atdu+RkIEt95H8nJyBQYl2J9/PauZdleaGqWV7ah8ftqHvSfMtzljAlJqsazcPIq+1WteG18MMZGKoaGbNluITShBILFneVDfzdgT+BfoMOI3UgEO5EEOsf0+VW6Hd9nL3myviOEWD6FPOGUD0eofeXmCI94vefLVl7jMnyd2iEeNjhIE1lB70kyyQCvqvRgfmWJ7fEyeebIJ0y4YrfVLENILH2N1Q7OFvYZHxGEZFAtyPkWKsaHdPUlFCD7VsEK1NQMkLsmmRzp8umfqerUlpAerD1JkAv4kjRita1IOu1/qy562Ohzwhlfdc4zUprZwVOWtsETDD7Wvu4H5YKJQnxzFlWVeemsl8popivMpCaX0=
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIM8lvg/9vtwYg/yl4PrNQ2IuRQZaYQt0XQPWZcbyMzH9
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBD+C/rnBvt2r65Gl0M5vYcnCxgN/0t0Q/XcUf4UnaG+S2BBadWzDkctg8AqKsRNiacbXLRPVhzNMBUwp9JQsW5s=
                                             create=True mode=0644 path=/tmp/ansible.g2xhv7tj state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:56:55 compute-0 sudo[74268]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:56 compute-0 sudo[74420]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfmqicfgjoafauyppprmjauobcddvpcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265816.3121583-138-268993551419411/AnsiballZ_command.py'
Sep 30 20:56:56 compute-0 sudo[74420]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:57 compute-0 python3.9[74422]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.g2xhv7tj' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:56:57 compute-0 sshd[1008]: drop connection #0 from [49.64.169.153]:36717 on [38.102.83.69]:22 penalty: exceeded LoginGraceTime
Sep 30 20:56:57 compute-0 sudo[74420]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:57 compute-0 sudo[74574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yttquvhxmudbniuhhnrttzrqrpzlhmns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265817.400292-162-124200815285880/AnsiballZ_file.py'
Sep 30 20:56:57 compute-0 sudo[74574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:56:58 compute-0 python3.9[74576]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.g2xhv7tj state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:56:58 compute-0 sudo[74574]: pam_unix(sudo:session): session closed for user root
Sep 30 20:56:58 compute-0 sshd-session[73659]: Connection closed by 192.168.122.30 port 56312
Sep 30 20:56:58 compute-0 sshd-session[73656]: pam_unix(sshd:session): session closed for user zuul
Sep 30 20:56:58 compute-0 systemd[1]: session-16.scope: Deactivated successfully.
Sep 30 20:56:58 compute-0 systemd[1]: session-16.scope: Consumed 3.686s CPU time.
Sep 30 20:56:58 compute-0 systemd-logind[792]: Session 16 logged out. Waiting for processes to exit.
Sep 30 20:56:58 compute-0 systemd-logind[792]: Removed session 16.
Sep 30 20:57:03 compute-0 sshd-session[74601]: Accepted publickey for zuul from 192.168.122.30 port 40622 ssh2: ECDSA SHA256:SmCicXXyU0CyMnob1MNtb+B3Td3Ord5lbeuM/VGGA5o
Sep 30 20:57:03 compute-0 systemd-logind[792]: New session 17 of user zuul.
Sep 30 20:57:03 compute-0 systemd[1]: Started Session 17 of User zuul.
Sep 30 20:57:03 compute-0 sshd-session[74601]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 20:57:04 compute-0 python3.9[74754]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:57:05 compute-0 sudo[74908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivifdljtddedpzitwlyrzrojxdzatswb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265825.299848-61-52672060388486/AnsiballZ_systemd.py'
Sep 30 20:57:05 compute-0 sudo[74908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:06 compute-0 python3.9[74910]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Sep 30 20:57:06 compute-0 sudo[74908]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:06 compute-0 sudo[75062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjdmvtqaopxapwzymiwgkvgjudwicayw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265826.577979-85-19833327900285/AnsiballZ_systemd.py'
Sep 30 20:57:06 compute-0 sudo[75062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:07 compute-0 python3.9[75064]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 20:57:07 compute-0 sudo[75062]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:08 compute-0 sudo[75215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohhwsbbkutwsdrqxhproaedwqclqczuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265827.6356647-112-246910327279154/AnsiballZ_command.py'
Sep 30 20:57:08 compute-0 sudo[75215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:08 compute-0 python3.9[75217]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:57:08 compute-0 sudo[75215]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:09 compute-0 sudo[75368]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfqibhkkohffuemhqyqyazirymuralys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265828.614125-136-125508179970319/AnsiballZ_stat.py'
Sep 30 20:57:09 compute-0 sudo[75368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:09 compute-0 python3.9[75370]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 20:57:09 compute-0 sudo[75368]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:10 compute-0 sudo[75522]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htvpdgbawpxjazmosdibvwglkplmuycc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265829.634365-160-181494118571960/AnsiballZ_command.py'
Sep 30 20:57:10 compute-0 sudo[75522]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:10 compute-0 python3.9[75524]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:57:10 compute-0 sudo[75522]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:10 compute-0 sudo[75677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxjsfixirpfbbgmqmnusunbiuybdwvup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265830.4724338-184-158691967050178/AnsiballZ_file.py'
Sep 30 20:57:10 compute-0 sudo[75677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:11 compute-0 python3.9[75679]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:57:11 compute-0 sudo[75677]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:11 compute-0 sshd-session[74604]: Connection closed by 192.168.122.30 port 40622
Sep 30 20:57:11 compute-0 sshd-session[74601]: pam_unix(sshd:session): session closed for user zuul
Sep 30 20:57:11 compute-0 systemd[1]: session-17.scope: Deactivated successfully.
Sep 30 20:57:11 compute-0 systemd[1]: session-17.scope: Consumed 5.044s CPU time.
Sep 30 20:57:11 compute-0 systemd-logind[792]: Session 17 logged out. Waiting for processes to exit.
Sep 30 20:57:11 compute-0 systemd-logind[792]: Removed session 17.
Sep 30 20:57:16 compute-0 sshd-session[75704]: Accepted publickey for zuul from 192.168.122.30 port 47484 ssh2: ECDSA SHA256:SmCicXXyU0CyMnob1MNtb+B3Td3Ord5lbeuM/VGGA5o
Sep 30 20:57:16 compute-0 systemd-logind[792]: New session 18 of user zuul.
Sep 30 20:57:16 compute-0 systemd[1]: Started Session 18 of User zuul.
Sep 30 20:57:16 compute-0 sshd-session[75704]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 20:57:17 compute-0 python3.9[75857]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:57:18 compute-0 sudo[76011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfssmxmdbzjilnyzejfcvixwfbsswdyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265838.2705293-67-39745503881780/AnsiballZ_setup.py'
Sep 30 20:57:18 compute-0 sudo[76011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:18 compute-0 python3.9[76013]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 20:57:19 compute-0 sudo[76011]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:19 compute-0 sudo[76095]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbhultcevyuedffuikyaxheqnttjfqvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265838.2705293-67-39745503881780/AnsiballZ_dnf.py'
Sep 30 20:57:19 compute-0 sudo[76095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:19 compute-0 python3.9[76097]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Sep 30 20:57:20 compute-0 sudo[76095]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:21 compute-0 python3.9[76248]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:57:23 compute-0 python3.9[76399]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Sep 30 20:57:24 compute-0 python3.9[76549]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 20:57:24 compute-0 python3.9[76699]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 20:57:25 compute-0 sshd-session[75707]: Connection closed by 192.168.122.30 port 47484
Sep 30 20:57:25 compute-0 sshd-session[75704]: pam_unix(sshd:session): session closed for user zuul
Sep 30 20:57:25 compute-0 systemd[1]: session-18.scope: Deactivated successfully.
Sep 30 20:57:25 compute-0 systemd[1]: session-18.scope: Consumed 5.890s CPU time.
Sep 30 20:57:25 compute-0 systemd-logind[792]: Session 18 logged out. Waiting for processes to exit.
Sep 30 20:57:25 compute-0 systemd-logind[792]: Removed session 18.
Sep 30 20:57:30 compute-0 sshd-session[76724]: Accepted publickey for zuul from 192.168.122.30 port 40216 ssh2: ECDSA SHA256:SmCicXXyU0CyMnob1MNtb+B3Td3Ord5lbeuM/VGGA5o
Sep 30 20:57:30 compute-0 systemd-logind[792]: New session 19 of user zuul.
Sep 30 20:57:30 compute-0 systemd[1]: Started Session 19 of User zuul.
Sep 30 20:57:30 compute-0 sshd-session[76724]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 20:57:31 compute-0 python3.9[76877]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:57:33 compute-0 sudo[77031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cutrzvdpbpotgwnxbmpkguavoecpzepd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265852.9672368-116-117078391552035/AnsiballZ_file.py'
Sep 30 20:57:33 compute-0 sudo[77031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:33 compute-0 python3.9[77033]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:57:33 compute-0 sudo[77031]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:34 compute-0 sudo[77183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brlvnowwoenwavgqccwuxqqcoywrvrqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265853.826023-116-255595053036286/AnsiballZ_file.py'
Sep 30 20:57:34 compute-0 sudo[77183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:34 compute-0 python3.9[77185]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:57:34 compute-0 sudo[77183]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:35 compute-0 sudo[77335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afzsgcbohqftvlverrwxvnfiatcfmujp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265854.622516-160-278713678355673/AnsiballZ_stat.py'
Sep 30 20:57:35 compute-0 sudo[77335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:35 compute-0 python3.9[77337]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:57:35 compute-0 sudo[77335]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:35 compute-0 sudo[77458]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzxjmoithqypqdmsaofrgzyetqdlmvkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265854.622516-160-278713678355673/AnsiballZ_copy.py'
Sep 30 20:57:35 compute-0 sudo[77458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:36 compute-0 python3.9[77460]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265854.622516-160-278713678355673/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=10eb40c79e76188470ca2671f9498cd7619f7513 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:57:36 compute-0 sudo[77458]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:36 compute-0 sudo[77610]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmklmdzjignyrfxusewsqbzlitykrjzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265856.2227826-160-115566838235364/AnsiballZ_stat.py'
Sep 30 20:57:36 compute-0 sudo[77610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:36 compute-0 python3.9[77612]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:57:36 compute-0 sudo[77610]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:37 compute-0 sudo[77733]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eycjhsdquifsatwckmwnuxozlowpltgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265856.2227826-160-115566838235364/AnsiballZ_copy.py'
Sep 30 20:57:37 compute-0 sudo[77733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:37 compute-0 python3.9[77735]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265856.2227826-160-115566838235364/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=32428e3eac4ecab15356e47557337caf0347c55c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:57:37 compute-0 sudo[77733]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:37 compute-0 sudo[77885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aooouthuolltjxtcsqjeiihdpbubfrgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265857.6347766-160-5223672827987/AnsiballZ_stat.py'
Sep 30 20:57:37 compute-0 sudo[77885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:38 compute-0 python3.9[77887]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:57:38 compute-0 sudo[77885]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:38 compute-0 sudo[78008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bcxjqzdywmkudhxuxjjjsaheuqeyjmrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265857.6347766-160-5223672827987/AnsiballZ_copy.py'
Sep 30 20:57:38 compute-0 sudo[78008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:38 compute-0 python3.9[78010]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265857.6347766-160-5223672827987/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=3dd5970cf41a89890a9f466d9c4491a59aa5b38c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:57:38 compute-0 sudo[78008]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:39 compute-0 sudo[78160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddbqwbhbpdxjbvbvejxqryhdetqufwjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265859.0163512-296-196091335976352/AnsiballZ_file.py'
Sep 30 20:57:39 compute-0 sudo[78160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:39 compute-0 python3.9[78162]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:57:39 compute-0 sudo[78160]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:40 compute-0 sudo[78312]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcsepsxoldctgzdjndzgbshofpnaklyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265859.7178576-296-95927703882687/AnsiballZ_file.py'
Sep 30 20:57:40 compute-0 sudo[78312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:40 compute-0 python3.9[78314]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:57:40 compute-0 sudo[78312]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:40 compute-0 sudo[78464]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipctrlirzvnlobrfyathxptmgouulheo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265860.5071177-342-6437581998897/AnsiballZ_stat.py'
Sep 30 20:57:40 compute-0 sudo[78464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:41 compute-0 python3.9[78466]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:57:41 compute-0 sudo[78464]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:41 compute-0 sudo[78587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzaihndxvqoifdhhtxqprxdjrprwmadl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265860.5071177-342-6437581998897/AnsiballZ_copy.py'
Sep 30 20:57:41 compute-0 sudo[78587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:41 compute-0 python3.9[78589]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265860.5071177-342-6437581998897/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=1fa5ae6ae379e41563e6246b7ae776ebb9ef7f6e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:57:41 compute-0 sudo[78587]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:42 compute-0 sudo[78739]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lskjzjxyqyxeyslekjslfdcgaaakthmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265861.7950418-342-217760802182999/AnsiballZ_stat.py'
Sep 30 20:57:42 compute-0 sudo[78739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:42 compute-0 python3.9[78741]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:57:42 compute-0 sudo[78739]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:42 compute-0 sudo[78862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fybpbvjpflxkrqqkyqvfenghoqrovyvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265861.7950418-342-217760802182999/AnsiballZ_copy.py'
Sep 30 20:57:42 compute-0 sudo[78862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:42 compute-0 python3.9[78864]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265861.7950418-342-217760802182999/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=c427c888ca1014d618bead98d3cdf4a14d714172 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:57:42 compute-0 sudo[78862]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:43 compute-0 sudo[79014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umwbztocaytcmqsihftjrprqjhtmtjeh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265863.1594472-342-104693166220911/AnsiballZ_stat.py'
Sep 30 20:57:43 compute-0 sudo[79014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:43 compute-0 python3.9[79016]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:57:43 compute-0 sudo[79014]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:44 compute-0 sudo[79137]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tsyzaclfunhxyhuoaftbsdcroovfkqcw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265863.1594472-342-104693166220911/AnsiballZ_copy.py'
Sep 30 20:57:44 compute-0 sudo[79137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:44 compute-0 python3.9[79139]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265863.1594472-342-104693166220911/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=0fbe989aea724bd658703b25d7d5b521fc54044d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:57:44 compute-0 sudo[79137]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:44 compute-0 sudo[79289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hafbiphxqnrkxixanftksilrvaarxyob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265864.4676752-474-56613213222070/AnsiballZ_file.py'
Sep 30 20:57:44 compute-0 sudo[79289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:44 compute-0 python3.9[79291]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:57:45 compute-0 sudo[79289]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:45 compute-0 sudo[79441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqifuleeyuxrnlmanxweibrxqoxpxnll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265865.1847184-474-174573889685530/AnsiballZ_file.py'
Sep 30 20:57:45 compute-0 sudo[79441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:45 compute-0 python3.9[79443]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:57:45 compute-0 sudo[79441]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:46 compute-0 sudo[79593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-borqjleiukikohkioluayeygwzeosnoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265865.9621942-519-904857659105/AnsiballZ_stat.py'
Sep 30 20:57:46 compute-0 sudo[79593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:46 compute-0 python3.9[79595]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:57:46 compute-0 sudo[79593]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:46 compute-0 sudo[79716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmqtzahkxyvsruzmhybowywpcrxmfkwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265865.9621942-519-904857659105/AnsiballZ_copy.py'
Sep 30 20:57:46 compute-0 sudo[79716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:47 compute-0 python3.9[79718]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265865.9621942-519-904857659105/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=5486da64a3846e386fe5eb185e3e41088fc7b67a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:57:47 compute-0 sudo[79716]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:47 compute-0 sudo[79868]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-luqdgbpboylfrdvjsieldpmxhrgfnotd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265867.3402271-519-97247275753036/AnsiballZ_stat.py'
Sep 30 20:57:47 compute-0 sudo[79868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:47 compute-0 python3.9[79870]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:57:47 compute-0 sudo[79868]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:48 compute-0 sudo[79991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nyatnxniyjwuabjbbgjycrxetmgfffum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265867.3402271-519-97247275753036/AnsiballZ_copy.py'
Sep 30 20:57:48 compute-0 sudo[79991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:48 compute-0 python3.9[79993]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265867.3402271-519-97247275753036/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=19543d2d8d9468ce3b30d32ed411afaff23b13eb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:57:48 compute-0 sudo[79991]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:48 compute-0 sudo[80143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-joyixwhmrwggadkhmiicsvdyaiiejxcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265868.6106844-519-252671620971090/AnsiballZ_stat.py'
Sep 30 20:57:48 compute-0 sudo[80143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:49 compute-0 python3.9[80145]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:57:49 compute-0 sudo[80143]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:49 compute-0 sudo[80266]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxmjfecijflmnjgepvmvhufspftnlbwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265868.6106844-519-252671620971090/AnsiballZ_copy.py'
Sep 30 20:57:49 compute-0 sudo[80266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:49 compute-0 python3.9[80268]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265868.6106844-519-252671620971090/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=2834fe6cf4cf301b782aae2e526f4bb070f7013e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:57:49 compute-0 sudo[80266]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:50 compute-0 sudo[80418]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzwbsdnwihuwqcpwghtglqkrxxhdkmvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265870.071506-650-254861238212602/AnsiballZ_file.py'
Sep 30 20:57:50 compute-0 sudo[80418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:50 compute-0 python3.9[80420]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:57:50 compute-0 sudo[80418]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:51 compute-0 sudo[80570]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdzdqbyaiohtcwuzngfjjegmwfnwmzxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265870.9615412-650-266956541670434/AnsiballZ_file.py'
Sep 30 20:57:51 compute-0 sudo[80570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:51 compute-0 python3.9[80572]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:57:51 compute-0 sudo[80570]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:52 compute-0 sudo[80722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrrqirhakapxersrytpruspwvlqljvcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265871.8263292-697-15969030367728/AnsiballZ_stat.py'
Sep 30 20:57:52 compute-0 sudo[80722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:52 compute-0 python3.9[80724]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:57:52 compute-0 sudo[80722]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:52 compute-0 sudo[80845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgnciuqsaodisycpsgwvnyzaupyonqqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265871.8263292-697-15969030367728/AnsiballZ_copy.py'
Sep 30 20:57:52 compute-0 sudo[80845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:52 compute-0 python3.9[80847]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265871.8263292-697-15969030367728/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=fcb0f5be518c8e114e8f1405abe02fe6074228c2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:57:52 compute-0 sudo[80845]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:53 compute-0 sudo[80997]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lubunjcknajrlxrhcxsprrbeahpqrcnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265873.1370785-697-98532745167921/AnsiballZ_stat.py'
Sep 30 20:57:53 compute-0 sudo[80997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:53 compute-0 python3.9[80999]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:57:53 compute-0 sudo[80997]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:54 compute-0 sudo[81120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obetvjlaqzxmgkaxphqipcrftfgmssms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265873.1370785-697-98532745167921/AnsiballZ_copy.py'
Sep 30 20:57:54 compute-0 sudo[81120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:54 compute-0 python3.9[81122]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265873.1370785-697-98532745167921/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=19543d2d8d9468ce3b30d32ed411afaff23b13eb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:57:54 compute-0 sudo[81120]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:54 compute-0 sudo[81272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjsfcxfhkrijrhllcehmdxodcvpcwcbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265874.5489433-697-172246910499328/AnsiballZ_stat.py'
Sep 30 20:57:54 compute-0 sudo[81272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:55 compute-0 python3.9[81274]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:57:55 compute-0 sudo[81272]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:55 compute-0 sudo[81395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfxvfgwtngwcjzbjfybdvfidjipuhyno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265874.5489433-697-172246910499328/AnsiballZ_copy.py'
Sep 30 20:57:55 compute-0 sudo[81395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:55 compute-0 python3.9[81397]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265874.5489433-697-172246910499328/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=1ceba88f488695e49cd45c6b552022d25c531cd1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:57:55 compute-0 sudo[81395]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:56 compute-0 sudo[81547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ouifzjirfrqgkgiisbbtswazfkpnlpua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265876.548679-879-161489308692460/AnsiballZ_file.py'
Sep 30 20:57:56 compute-0 sudo[81547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:57 compute-0 python3.9[81549]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:57:57 compute-0 sudo[81547]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:57 compute-0 sudo[81699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umvddkmnmekshexchuwefynavwlbzpsa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265877.2534688-905-144838452901038/AnsiballZ_stat.py'
Sep 30 20:57:57 compute-0 sudo[81699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:57 compute-0 python3.9[81701]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:57:57 compute-0 sudo[81699]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:58 compute-0 sudo[81822]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eotbzxsgivyxawxooqbhwxlzlhxhhttz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265877.2534688-905-144838452901038/AnsiballZ_copy.py'
Sep 30 20:57:58 compute-0 sudo[81822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:58 compute-0 python3.9[81824]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265877.2534688-905-144838452901038/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=686f1a4d8f59010e2c99342c60da63269ac3f94e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:57:58 compute-0 sudo[81822]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:58 compute-0 sudo[81974]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsatydfsbisflzbdxsnpgaymudfractz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265878.5647957-953-16992503898031/AnsiballZ_file.py'
Sep 30 20:57:58 compute-0 sudo[81974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:59 compute-0 python3.9[81976]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:57:59 compute-0 sudo[81974]: pam_unix(sudo:session): session closed for user root
Sep 30 20:57:59 compute-0 sudo[82126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zaladjwwziidpyjxegbmolqctxowpxhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265879.3319833-977-238461060396684/AnsiballZ_stat.py'
Sep 30 20:57:59 compute-0 sudo[82126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:57:59 compute-0 python3.9[82128]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:57:59 compute-0 sudo[82126]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:00 compute-0 sudo[82249]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwqptbkubivbfbmhkikdaikxvepowlik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265879.3319833-977-238461060396684/AnsiballZ_copy.py'
Sep 30 20:58:00 compute-0 sudo[82249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:00 compute-0 python3.9[82251]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265879.3319833-977-238461060396684/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=686f1a4d8f59010e2c99342c60da63269ac3f94e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:58:00 compute-0 sudo[82249]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:01 compute-0 sudo[82401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcbsjojjogxlukfnmqrudmjoqwoypxlo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265880.6667383-1023-36641610499188/AnsiballZ_file.py'
Sep 30 20:58:01 compute-0 sudo[82401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:01 compute-0 python3.9[82403]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:58:01 compute-0 sudo[82401]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:02 compute-0 sudo[82553]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfjanubllpzenvpmisbkfinzxuxwkqca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265881.517971-1049-226018557811554/AnsiballZ_stat.py'
Sep 30 20:58:02 compute-0 sudo[82553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:02 compute-0 python3.9[82555]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:58:02 compute-0 sudo[82553]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:02 compute-0 sudo[82676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wflusobxeayxqkxygnllgqslfwcxocuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265881.517971-1049-226018557811554/AnsiballZ_copy.py'
Sep 30 20:58:02 compute-0 sudo[82676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:02 compute-0 python3.9[82678]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265881.517971-1049-226018557811554/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=686f1a4d8f59010e2c99342c60da63269ac3f94e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:58:02 compute-0 sudo[82676]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:03 compute-0 sudo[82828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isvfvzfsaaoxtbdtbcsvueeqmrsdfvyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265883.1932397-1097-67586228174182/AnsiballZ_file.py'
Sep 30 20:58:03 compute-0 sudo[82828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:03 compute-0 python3.9[82830]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:58:03 compute-0 sudo[82828]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:04 compute-0 sudo[82980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkgzcewoqzemrgfnfmhhdgwupcocppbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265884.0946453-1124-116787850691452/AnsiballZ_stat.py'
Sep 30 20:58:04 compute-0 sudo[82980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:04 compute-0 python3.9[82982]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:58:04 compute-0 sudo[82980]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:04 compute-0 sshd[1008]: drop connection #0 from [49.64.169.153]:44066 on [38.102.83.69]:22 penalty: exceeded LoginGraceTime
Sep 30 20:58:05 compute-0 sudo[83103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktxdltydvpqhwxrkemlzfgjznwqeabih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265884.0946453-1124-116787850691452/AnsiballZ_copy.py'
Sep 30 20:58:05 compute-0 sudo[83103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:05 compute-0 python3.9[83105]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265884.0946453-1124-116787850691452/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=686f1a4d8f59010e2c99342c60da63269ac3f94e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:58:05 compute-0 sudo[83103]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:05 compute-0 sudo[83255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrxgmgfhkfchbmfcfabkvxnilnuedjry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265885.5662067-1173-35426104380169/AnsiballZ_file.py'
Sep 30 20:58:05 compute-0 sudo[83255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:06 compute-0 python3.9[83257]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:58:06 compute-0 sudo[83255]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:06 compute-0 sudo[83407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umpxwkhbwujqbhxxnfudzsrctzeyljkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265886.3901687-1197-279195376329200/AnsiballZ_stat.py'
Sep 30 20:58:06 compute-0 sudo[83407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:06 compute-0 python3.9[83409]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:58:06 compute-0 sudo[83407]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:07 compute-0 sudo[83530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqqexmmiwlyquwuhyufocefoxeygaoie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265886.3901687-1197-279195376329200/AnsiballZ_copy.py'
Sep 30 20:58:07 compute-0 sudo[83530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:07 compute-0 python3.9[83532]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265886.3901687-1197-279195376329200/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=686f1a4d8f59010e2c99342c60da63269ac3f94e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:58:07 compute-0 sudo[83530]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:08 compute-0 sudo[83682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-naugsopxyxfraltevoryqkdqsdaybpsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265887.8149314-1246-67478041690519/AnsiballZ_file.py'
Sep 30 20:58:08 compute-0 sudo[83682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:08 compute-0 python3.9[83684]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:58:08 compute-0 sudo[83682]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:08 compute-0 sudo[83834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwwyeoweocpyybkaxfkjyooqtxtxqdrx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265888.5076294-1269-260836733777990/AnsiballZ_stat.py'
Sep 30 20:58:08 compute-0 sudo[83834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:09 compute-0 python3.9[83836]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:58:09 compute-0 sudo[83834]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:09 compute-0 sudo[83957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmdrtbfhrrmugvugabqkmeieqsuvptow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265888.5076294-1269-260836733777990/AnsiballZ_copy.py'
Sep 30 20:58:09 compute-0 sudo[83957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:09 compute-0 python3.9[83959]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265888.5076294-1269-260836733777990/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=686f1a4d8f59010e2c99342c60da63269ac3f94e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:58:09 compute-0 sudo[83957]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:10 compute-0 sudo[84109]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrbagjowbhtevnuinzniupbayurgsowh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265889.9313452-1318-82049387560853/AnsiballZ_file.py'
Sep 30 20:58:10 compute-0 sudo[84109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:10 compute-0 python3.9[84111]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:58:10 compute-0 sudo[84109]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:11 compute-0 sudo[84261]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nnbirsgkvogdnjyvywkzxfyoacgqorfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265890.6787014-1342-271415269801576/AnsiballZ_stat.py'
Sep 30 20:58:11 compute-0 sudo[84261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:11 compute-0 python3.9[84263]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:58:11 compute-0 sudo[84261]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:11 compute-0 sudo[84384]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmkfntqjiijacecjalukjpdatwmybicl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265890.6787014-1342-271415269801576/AnsiballZ_copy.py'
Sep 30 20:58:11 compute-0 sudo[84384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:11 compute-0 python3.9[84386]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265890.6787014-1342-271415269801576/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=686f1a4d8f59010e2c99342c60da63269ac3f94e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:58:11 compute-0 sudo[84384]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:12 compute-0 sshd-session[76727]: Connection closed by 192.168.122.30 port 40216
Sep 30 20:58:12 compute-0 sshd-session[76724]: pam_unix(sshd:session): session closed for user zuul
Sep 30 20:58:12 compute-0 systemd[1]: session-19.scope: Deactivated successfully.
Sep 30 20:58:12 compute-0 systemd[1]: session-19.scope: Consumed 32.132s CPU time.
Sep 30 20:58:12 compute-0 systemd-logind[792]: Session 19 logged out. Waiting for processes to exit.
Sep 30 20:58:12 compute-0 systemd-logind[792]: Removed session 19.
Sep 30 20:58:13 compute-0 PackageKit[31761]: daemon quit
Sep 30 20:58:13 compute-0 systemd[1]: packagekit.service: Deactivated successfully.
Sep 30 20:58:17 compute-0 sshd-session[84411]: Accepted publickey for zuul from 192.168.122.30 port 51134 ssh2: ECDSA SHA256:SmCicXXyU0CyMnob1MNtb+B3Td3Ord5lbeuM/VGGA5o
Sep 30 20:58:17 compute-0 systemd-logind[792]: New session 20 of user zuul.
Sep 30 20:58:17 compute-0 systemd[1]: Started Session 20 of User zuul.
Sep 30 20:58:17 compute-0 sshd-session[84411]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 20:58:18 compute-0 python3.9[84564]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:58:19 compute-0 sudo[84718]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdwvbyamhkjmsorgwjnotvsmqdbykqqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265898.9467733-67-248191947642630/AnsiballZ_file.py'
Sep 30 20:58:19 compute-0 sudo[84718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:19 compute-0 python3.9[84720]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:58:19 compute-0 sudo[84718]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:20 compute-0 sudo[84870]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxskuaszedvlrfqybxggpcazwnrnpqvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265899.8940432-67-208844204310293/AnsiballZ_file.py'
Sep 30 20:58:20 compute-0 sudo[84870]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:20 compute-0 python3.9[84872]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:58:20 compute-0 sudo[84870]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:21 compute-0 python3.9[85022]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:58:21 compute-0 sudo[85172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqtixibfclwzjqjapbncjnnqomjcovfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265901.454546-136-227879961047976/AnsiballZ_seboolean.py'
Sep 30 20:58:21 compute-0 sudo[85172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:22 compute-0 python3.9[85174]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Sep 30 20:58:23 compute-0 sudo[85172]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:23 compute-0 sudo[85328]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ryzbyxobscdrjgickrmjqttyjdeyycko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265903.6185994-166-12571851761963/AnsiballZ_setup.py'
Sep 30 20:58:23 compute-0 dbus-broker-launch[778]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Sep 30 20:58:23 compute-0 sudo[85328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:24 compute-0 python3.9[85330]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 20:58:24 compute-0 sudo[85328]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:24 compute-0 sudo[85412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbudewkvspvfwgffrrpiffgayvnhbztt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265903.6185994-166-12571851761963/AnsiballZ_dnf.py'
Sep 30 20:58:24 compute-0 sudo[85412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:25 compute-0 python3.9[85414]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 20:58:26 compute-0 sudo[85412]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:27 compute-0 sudo[85565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocpyoankhaqzerttqvogheglgagdnenv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265906.5597703-202-44401161873069/AnsiballZ_systemd.py'
Sep 30 20:58:27 compute-0 sudo[85565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:27 compute-0 python3.9[85567]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Sep 30 20:58:27 compute-0 sudo[85565]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:28 compute-0 sudo[85720]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdecztglkfsqgjtsinheknifzhzxewur ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759265907.855989-226-209138130133479/AnsiballZ_edpm_nftables_snippet.py'
Sep 30 20:58:28 compute-0 sudo[85720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:28 compute-0 python3[85722]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                            rule:
                                              proto: udp
                                              dport: 4789
                                          - rule_name: 119 neutron geneve networks
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              state: ["UNTRACKED"]
                                          - rule_name: 120 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: OUTPUT
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                          - rule_name: 121 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: PREROUTING
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Sep 30 20:58:28 compute-0 sudo[85720]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:29 compute-0 sudo[85872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jatiojvshwwptoyummawoubjtyoaetcs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265908.8584263-253-246333488908837/AnsiballZ_file.py'
Sep 30 20:58:29 compute-0 sudo[85872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:29 compute-0 python3.9[85874]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:58:29 compute-0 sudo[85872]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:30 compute-0 sudo[86024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cojiexmfyrvscejlfsdmoeikgnzlenqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265909.6455557-277-936545080900/AnsiballZ_stat.py'
Sep 30 20:58:30 compute-0 sudo[86024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:30 compute-0 python3.9[86026]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:58:30 compute-0 sudo[86024]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:30 compute-0 sudo[86102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdjnfopgcwxucbfogpjubungcxhcoxou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265909.6455557-277-936545080900/AnsiballZ_file.py'
Sep 30 20:58:30 compute-0 sudo[86102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:30 compute-0 python3.9[86104]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:58:30 compute-0 sudo[86102]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:31 compute-0 sudo[86254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfztfmsuwjnlgqfgufrziorzdafqnaxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265911.0448675-313-186562136327331/AnsiballZ_stat.py'
Sep 30 20:58:31 compute-0 sudo[86254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:31 compute-0 python3.9[86256]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:58:31 compute-0 sudo[86254]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:31 compute-0 sudo[86332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvncdghqgkxcgyctxssupgfijyuogbza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265911.0448675-313-186562136327331/AnsiballZ_file.py'
Sep 30 20:58:31 compute-0 sudo[86332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:32 compute-0 python3.9[86334]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.4nnckf9i recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:58:32 compute-0 sudo[86332]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:32 compute-0 sudo[86484]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-riairsasgxppsyzsrpnxbobgvsswhvgq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265912.269821-349-19829020649126/AnsiballZ_stat.py'
Sep 30 20:58:32 compute-0 sudo[86484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:32 compute-0 python3.9[86486]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:58:32 compute-0 sudo[86484]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:33 compute-0 sudo[86562]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utvgerbwpiewaiurrywdubpogyylmcqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265912.269821-349-19829020649126/AnsiballZ_file.py'
Sep 30 20:58:33 compute-0 sudo[86562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:33 compute-0 python3.9[86564]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:58:33 compute-0 sudo[86562]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:34 compute-0 sudo[86714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phvfcupkaryjobcganucdvxaoljymdxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265913.579943-388-251502042183812/AnsiballZ_command.py'
Sep 30 20:58:34 compute-0 sudo[86714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:34 compute-0 python3.9[86716]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:58:34 compute-0 sudo[86714]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:35 compute-0 sudo[86867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjnumleykdovkpujmtoisvhuhzzznpyr ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759265914.5883446-412-22130892857543/AnsiballZ_edpm_nftables_from_files.py'
Sep 30 20:58:35 compute-0 sudo[86867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:35 compute-0 python3[86869]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Sep 30 20:58:35 compute-0 sudo[86867]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:35 compute-0 sudo[87019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzdserkvgojnhbfjylbaezazgxcmudsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265915.5923676-436-2240138157760/AnsiballZ_stat.py'
Sep 30 20:58:35 compute-0 sudo[87019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:36 compute-0 python3.9[87021]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:58:36 compute-0 sudo[87019]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:36 compute-0 sudo[87144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdfschqwwaujmdzveqylwdeoavtavwij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265915.5923676-436-2240138157760/AnsiballZ_copy.py'
Sep 30 20:58:36 compute-0 sudo[87144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:36 compute-0 python3.9[87146]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265915.5923676-436-2240138157760/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:58:36 compute-0 sudo[87144]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:37 compute-0 sudo[87296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwbnalfmnrlvztgppfephcxgnbuewggu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265917.2400618-481-216609462364662/AnsiballZ_stat.py'
Sep 30 20:58:37 compute-0 sudo[87296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:37 compute-0 python3.9[87298]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:58:37 compute-0 sudo[87296]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:38 compute-0 sudo[87421]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivifpuaajfelwvvucbpqlbpmbfvhtqur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265917.2400618-481-216609462364662/AnsiballZ_copy.py'
Sep 30 20:58:38 compute-0 sudo[87421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:38 compute-0 python3.9[87423]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265917.2400618-481-216609462364662/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:58:38 compute-0 sudo[87421]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:39 compute-0 sudo[87574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-duxrhtenocthgnqgjvbvorrxikpvirzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265918.9040334-526-23566041241265/AnsiballZ_stat.py'
Sep 30 20:58:39 compute-0 sudo[87574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:39 compute-0 python3.9[87576]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:58:39 compute-0 sudo[87574]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:39 compute-0 sudo[87699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llgrrkgoiqnuhozyirwfipkkihcsmqcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265918.9040334-526-23566041241265/AnsiballZ_copy.py'
Sep 30 20:58:39 compute-0 sudo[87699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:39 compute-0 python3.9[87701]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265918.9040334-526-23566041241265/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:58:39 compute-0 sudo[87699]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:40 compute-0 sudo[87851]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccipxphrcghyhooozqhjlerskaujgoac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265920.4290242-571-120119159015210/AnsiballZ_stat.py'
Sep 30 20:58:40 compute-0 sudo[87851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:40 compute-0 python3.9[87853]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:58:40 compute-0 sudo[87851]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:41 compute-0 sudo[87976]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awsfnqkeabwgrhpstzyusxuiddiloyxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265920.4290242-571-120119159015210/AnsiballZ_copy.py'
Sep 30 20:58:41 compute-0 sudo[87976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:41 compute-0 python3.9[87978]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265920.4290242-571-120119159015210/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:58:41 compute-0 sudo[87976]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:42 compute-0 sudo[88128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iicsiqjikavbvevoduogqghgggrlxtem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265921.9028661-616-274761755657002/AnsiballZ_stat.py'
Sep 30 20:58:42 compute-0 sudo[88128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:42 compute-0 python3.9[88130]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:58:42 compute-0 sudo[88128]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:42 compute-0 sudo[88253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exsyqkrmepneorrphqchbzhadirjiomm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265921.9028661-616-274761755657002/AnsiballZ_copy.py'
Sep 30 20:58:42 compute-0 sudo[88253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:43 compute-0 python3.9[88255]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759265921.9028661-616-274761755657002/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:58:43 compute-0 sudo[88253]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:43 compute-0 sudo[88405]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxouirphrsrgdqgifpajgcsnaslhwfxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265923.4767141-661-39423275408089/AnsiballZ_file.py'
Sep 30 20:58:43 compute-0 sudo[88405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:43 compute-0 python3.9[88407]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:58:43 compute-0 sudo[88405]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:44 compute-0 sudo[88557]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdkpvtafwqtbecvxkqegirdrrkmmsjsa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265924.180452-685-32673722347222/AnsiballZ_command.py'
Sep 30 20:58:44 compute-0 sudo[88557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:44 compute-0 python3.9[88559]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:58:44 compute-0 sudo[88557]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:45 compute-0 sudo[88712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxebingtkqlmtfqwagdxxtwlfjuwhped ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265925.0632432-709-97323177205690/AnsiballZ_blockinfile.py'
Sep 30 20:58:45 compute-0 sudo[88712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:45 compute-0 python3.9[88714]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:58:45 compute-0 sudo[88712]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:46 compute-0 sudo[88864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-woluzcjoetwhxhozzdxrxpxulikrblvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265926.236076-736-195342591650781/AnsiballZ_command.py'
Sep 30 20:58:46 compute-0 sudo[88864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:46 compute-0 python3.9[88866]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:58:46 compute-0 sudo[88864]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:47 compute-0 sudo[89017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxpgvhehshnimnblnacjlhxcwautpxxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265927.038063-760-151449759655129/AnsiballZ_stat.py'
Sep 30 20:58:47 compute-0 sudo[89017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:47 compute-0 python3.9[89019]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 20:58:47 compute-0 sudo[89017]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:48 compute-0 sudo[89171]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arpiiunlvyqnleltfqynreipqkgpxacr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265927.7821755-784-264134541734055/AnsiballZ_command.py'
Sep 30 20:58:48 compute-0 sudo[89171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:48 compute-0 python3.9[89173]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:58:48 compute-0 sudo[89171]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:49 compute-0 sudo[89327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iridtjfoiacyjicnlnchccumbxcessii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265928.9820623-808-260490844422217/AnsiballZ_file.py'
Sep 30 20:58:49 compute-0 sudo[89327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:49 compute-0 python3.9[89329]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:58:49 compute-0 sudo[89327]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:50 compute-0 python3.9[89479]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:58:51 compute-0 sudo[89630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtlpnvsinhtvuhuuvcelrltxqkwimjfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265931.6345184-928-272885662940861/AnsiballZ_command.py'
Sep 30 20:58:51 compute-0 sudo[89630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:52 compute-0 python3.9[89632]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:2e:0a:74:f6:ca:ec" external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:58:52 compute-0 ovs-vsctl[89633]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:2e:0a:74:f6:ca:ec external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Sep 30 20:58:52 compute-0 sudo[89630]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:52 compute-0 sudo[89783]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrfzptghhkvxnorsvinirfamaqfvsdze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265932.5238438-955-260323727096416/AnsiballZ_command.py'
Sep 30 20:58:52 compute-0 sudo[89783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:53 compute-0 python3.9[89785]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                            ovs-vsctl show | grep -q "Manager"
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:58:53 compute-0 sudo[89783]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:53 compute-0 sudo[89938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epbdqtfaflpiirpkhzkyqcnluixcpqbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265933.4197235-979-56803349473039/AnsiballZ_command.py'
Sep 30 20:58:53 compute-0 sudo[89938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:53 compute-0 python3.9[89940]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:58:53 compute-0 ovs-vsctl[89941]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Sep 30 20:58:53 compute-0 sudo[89938]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:54 compute-0 python3.9[90091]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 20:58:55 compute-0 sudo[90243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxupobiptlvtaenewzoqxibygpsdpsii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265935.0761821-1030-109730002133644/AnsiballZ_file.py'
Sep 30 20:58:55 compute-0 sudo[90243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:55 compute-0 python3.9[90245]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:58:55 compute-0 sudo[90243]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:56 compute-0 sudo[90395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcrnnbskuveyvtalxwbsujycpnfnpnbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265935.9385645-1054-259777526023818/AnsiballZ_stat.py'
Sep 30 20:58:56 compute-0 sudo[90395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:56 compute-0 python3.9[90397]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:58:56 compute-0 sudo[90395]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:56 compute-0 sudo[90473]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xflruvuwirmkjidojrdnxgypqzheoldj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265935.9385645-1054-259777526023818/AnsiballZ_file.py'
Sep 30 20:58:56 compute-0 sudo[90473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:56 compute-0 python3.9[90475]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:58:57 compute-0 sudo[90473]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:57 compute-0 sudo[90625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfpicvlffxuzbqqnhksuigmmuhchatzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265937.167967-1054-77741269374437/AnsiballZ_stat.py'
Sep 30 20:58:57 compute-0 sudo[90625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:57 compute-0 python3.9[90627]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:58:57 compute-0 sudo[90625]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:57 compute-0 sudo[90703]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxsufnhatzyoxehhvwwrxiicufzpbysy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265937.167967-1054-77741269374437/AnsiballZ_file.py'
Sep 30 20:58:57 compute-0 sudo[90703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:58 compute-0 python3.9[90705]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:58:58 compute-0 sudo[90703]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:59 compute-0 sudo[90855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-narxknfzxhftehqhqxhrevubmnlkpolj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265938.6275563-1123-22694388216291/AnsiballZ_file.py'
Sep 30 20:58:59 compute-0 sudo[90855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:58:59 compute-0 python3.9[90857]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:58:59 compute-0 sudo[90855]: pam_unix(sudo:session): session closed for user root
Sep 30 20:58:59 compute-0 sudo[91007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtohfdphiyppcpqkdzksmdqsybsjvclw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265939.5312035-1147-64933257750369/AnsiballZ_stat.py'
Sep 30 20:58:59 compute-0 sudo[91007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:00 compute-0 python3.9[91009]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:59:00 compute-0 sudo[91007]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:00 compute-0 sudo[91085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azfrkagkijglezocjfxxsnvozdaapywu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265939.5312035-1147-64933257750369/AnsiballZ_file.py'
Sep 30 20:59:00 compute-0 sudo[91085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:00 compute-0 python3.9[91087]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:59:00 compute-0 sudo[91085]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:01 compute-0 sudo[91237]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzjyevduijeentpcjyiblwzxpipqfwfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265941.3129897-1183-46550735898372/AnsiballZ_stat.py'
Sep 30 20:59:01 compute-0 sudo[91237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:01 compute-0 python3.9[91239]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:59:01 compute-0 sudo[91237]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:02 compute-0 sudo[91315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aacwfancoeehnggufhzobkgditvhnqkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265941.3129897-1183-46550735898372/AnsiballZ_file.py'
Sep 30 20:59:02 compute-0 sudo[91315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:02 compute-0 python3.9[91317]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:59:02 compute-0 sudo[91315]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:02 compute-0 sudo[91467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzgvfyzmkdwnuwodgifbpgrhtwthvyzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265942.6426463-1219-239317505828126/AnsiballZ_systemd.py'
Sep 30 20:59:02 compute-0 sudo[91467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:03 compute-0 python3.9[91469]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 20:59:03 compute-0 systemd[1]: Reloading.
Sep 30 20:59:03 compute-0 systemd-rc-local-generator[91493]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:59:03 compute-0 systemd-sysv-generator[91500]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 20:59:03 compute-0 sudo[91467]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:04 compute-0 sudo[91657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsgntrngxenknvirbvnkizuvmzoqkqgk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265943.8825505-1243-83581144520521/AnsiballZ_stat.py'
Sep 30 20:59:04 compute-0 sudo[91657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:04 compute-0 python3.9[91659]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:59:04 compute-0 sudo[91657]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:04 compute-0 sudo[91735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntgsjtffomatrdjhhncyfgezulqgthbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265943.8825505-1243-83581144520521/AnsiballZ_file.py'
Sep 30 20:59:04 compute-0 sudo[91735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:05 compute-0 python3.9[91737]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:59:05 compute-0 sudo[91735]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:05 compute-0 sudo[91887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfohlhwzgjatcmzjpyfzmnfdcoyvudhg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265945.3105252-1279-120514297120553/AnsiballZ_stat.py'
Sep 30 20:59:05 compute-0 sudo[91887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:05 compute-0 python3.9[91889]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:59:05 compute-0 sudo[91887]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:06 compute-0 sudo[91965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eofiinqwehcntaqulymwsafmxnbvyypw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265945.3105252-1279-120514297120553/AnsiballZ_file.py'
Sep 30 20:59:06 compute-0 sudo[91965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:06 compute-0 python3.9[91967]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:59:06 compute-0 sudo[91965]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:07 compute-0 sudo[92117]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uedunlvvubmoisyahnjgbvcvjobcjnov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265946.6650596-1315-177329857385096/AnsiballZ_systemd.py'
Sep 30 20:59:07 compute-0 sudo[92117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:07 compute-0 python3.9[92119]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 20:59:07 compute-0 systemd[1]: Reloading.
Sep 30 20:59:07 compute-0 systemd-rc-local-generator[92147]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:59:07 compute-0 systemd-sysv-generator[92153]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 20:59:07 compute-0 systemd[1]: Starting Create netns directory...
Sep 30 20:59:07 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Sep 30 20:59:07 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Sep 30 20:59:07 compute-0 systemd[1]: Finished Create netns directory.
Sep 30 20:59:07 compute-0 sudo[92117]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:09 compute-0 sudo[92312]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjnfbrgogarpwgwljaatxtirlqmfbmjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265948.9990213-1345-142883024388503/AnsiballZ_file.py'
Sep 30 20:59:09 compute-0 sudo[92312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:09 compute-0 python3.9[92314]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:59:09 compute-0 sudo[92312]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:10 compute-0 sudo[92464]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbdjgqthcnoakloojeociofufrrzrdax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265949.8516986-1369-255706882790453/AnsiballZ_stat.py'
Sep 30 20:59:10 compute-0 sudo[92464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:10 compute-0 python3.9[92466]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:59:10 compute-0 sudo[92464]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:10 compute-0 sudo[92587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrsykldxtjurvwbjveoltkwwbizxvlpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265949.8516986-1369-255706882790453/AnsiballZ_copy.py'
Sep 30 20:59:10 compute-0 sudo[92587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:10 compute-0 python3.9[92589]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759265949.8516986-1369-255706882790453/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:59:10 compute-0 sudo[92587]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:11 compute-0 sudo[92739]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfkfgqoqsqurehoiewnifotxlleuhizr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265951.5548708-1420-43868684769615/AnsiballZ_file.py'
Sep 30 20:59:11 compute-0 sudo[92739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:12 compute-0 python3.9[92741]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:59:12 compute-0 sudo[92739]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:12 compute-0 sudo[92891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbjcyxmdclxrucurfxzrilmfppgrwesz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265952.4210598-1444-5554123573861/AnsiballZ_stat.py'
Sep 30 20:59:12 compute-0 sudo[92891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:12 compute-0 python3.9[92893]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:59:12 compute-0 sudo[92891]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:13 compute-0 sudo[93014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cuaslfomnccszqnoepgxfuxyuaqvgmmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265952.4210598-1444-5554123573861/AnsiballZ_copy.py'
Sep 30 20:59:13 compute-0 sudo[93014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:13 compute-0 python3.9[93016]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759265952.4210598-1444-5554123573861/.source.json _original_basename=.1kvja1oy follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:59:13 compute-0 sudo[93014]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:14 compute-0 sudo[93166]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrzkawswmdqzuaotumxtflcnftaqgabh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265954.0339668-1489-226519947549637/AnsiballZ_file.py'
Sep 30 20:59:14 compute-0 sudo[93166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:14 compute-0 python3.9[93168]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:59:14 compute-0 sudo[93166]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:15 compute-0 sudo[93318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qllrkqvdmafpawcbceipvgglpfpglxqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265954.9776824-1513-85173961486933/AnsiballZ_stat.py'
Sep 30 20:59:15 compute-0 sudo[93318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:15 compute-0 sudo[93318]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:15 compute-0 sudo[93441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lcisdodygcetjacxfwxjwaauipafpuhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265954.9776824-1513-85173961486933/AnsiballZ_copy.py'
Sep 30 20:59:15 compute-0 sudo[93441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:16 compute-0 sudo[93441]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:17 compute-0 sudo[93593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txcxuvosacxuoeelmikhzfsddyucqfwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265956.5751295-1564-165839694163215/AnsiballZ_container_config_data.py'
Sep 30 20:59:17 compute-0 sudo[93593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:17 compute-0 python3.9[93595]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Sep 30 20:59:17 compute-0 sudo[93593]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:18 compute-0 sudo[93745]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjfpwmsuldqwzbwdefkrpdwahznempst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265957.5917196-1591-171275205855852/AnsiballZ_container_config_hash.py'
Sep 30 20:59:18 compute-0 sudo[93745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:18 compute-0 python3.9[93747]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Sep 30 20:59:18 compute-0 sudo[93745]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:19 compute-0 sudo[93897]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtbwumuwamjikutikedjtnglfsiweykx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265958.6436448-1618-254812766896582/AnsiballZ_podman_container_info.py'
Sep 30 20:59:19 compute-0 sudo[93897]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:19 compute-0 python3.9[93899]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Sep 30 20:59:19 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:59:19 compute-0 sudo[93897]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:20 compute-0 sudo[94060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gikjghicoiilxykutahumbnbtowqwabu ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759265960.0369642-1657-169088170134318/AnsiballZ_edpm_container_manage.py'
Sep 30 20:59:20 compute-0 sudo[94060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:20 compute-0 python3[94062]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Sep 30 20:59:20 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:59:20 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:59:21 compute-0 podman[94097]: 2025-09-30 20:59:21.102294515 +0000 UTC m=+0.049630527 container create aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, config_id=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.schema-version=1.0)
Sep 30 20:59:21 compute-0 podman[94097]: 2025-09-30 20:59:21.074844417 +0000 UTC m=+0.022180409 image pull 7ffac6b06b247caf26cf673b775a5f070f2fa1a6008cf0b0964af7e905ba86a5 quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Sep 30 20:59:21 compute-0 python3[94062]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Sep 30 20:59:21 compute-0 sudo[94060]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:21 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 20:59:22 compute-0 sudo[94285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubwaaeuqrslmbicgshmgjbzpbllozhlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265961.9880788-1681-159627499466618/AnsiballZ_stat.py'
Sep 30 20:59:22 compute-0 sudo[94285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:22 compute-0 python3.9[94287]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 20:59:22 compute-0 sudo[94285]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:23 compute-0 sudo[94439]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djrveswuigkxevtvbojgitazkawogsqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265962.8113399-1708-145949563596385/AnsiballZ_file.py'
Sep 30 20:59:23 compute-0 sudo[94439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:23 compute-0 python3.9[94441]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:59:23 compute-0 sudo[94439]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:23 compute-0 sudo[94515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afvxcsmjtgwzrudnkuvychgipuulmzcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265962.8113399-1708-145949563596385/AnsiballZ_stat.py'
Sep 30 20:59:23 compute-0 sudo[94515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:23 compute-0 python3.9[94517]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 20:59:23 compute-0 sudo[94515]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:24 compute-0 sudo[94666]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzhjxjroyprfapdmdruqmtqszbxqqmka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265963.8219016-1708-18682696413481/AnsiballZ_copy.py'
Sep 30 20:59:24 compute-0 sudo[94666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:24 compute-0 python3.9[94668]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759265963.8219016-1708-18682696413481/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 20:59:24 compute-0 sudo[94666]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:24 compute-0 sudo[94742]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-euvzylcsakideiemmmctapanchwfzrsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265963.8219016-1708-18682696413481/AnsiballZ_systemd.py'
Sep 30 20:59:24 compute-0 sudo[94742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:25 compute-0 python3.9[94744]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 20:59:25 compute-0 systemd[1]: Reloading.
Sep 30 20:59:25 compute-0 systemd-rc-local-generator[94771]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:59:25 compute-0 systemd-sysv-generator[94775]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 20:59:25 compute-0 sudo[94742]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:25 compute-0 sudo[94852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqnhhbdbeeagoohrozpvmapvlgmtgvkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265963.8219016-1708-18682696413481/AnsiballZ_systemd.py'
Sep 30 20:59:25 compute-0 sudo[94852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:26 compute-0 python3.9[94854]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 20:59:26 compute-0 systemd[1]: Reloading.
Sep 30 20:59:26 compute-0 systemd-rc-local-generator[94882]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:59:26 compute-0 systemd-sysv-generator[94888]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 20:59:26 compute-0 systemd[1]: Starting ovn_controller container...
Sep 30 20:59:26 compute-0 systemd[1]: Created slice Virtual Machine and Container Slice.
Sep 30 20:59:26 compute-0 systemd[1]: Started libcrun container.
Sep 30 20:59:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8f7ae83ee188fedf7d4d195dfa8fb56e6bf5d0e778737ad5ede3f862fdd3b6a2/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Sep 30 20:59:26 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660.
Sep 30 20:59:26 compute-0 podman[94896]: 2025-09-30 20:59:26.621774596 +0000 UTC m=+0.122912821 container init aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 20:59:26 compute-0 ovn_controller[94912]: + sudo -E kolla_set_configs
Sep 30 20:59:26 compute-0 podman[94896]: 2025-09-30 20:59:26.654733121 +0000 UTC m=+0.155871286 container start aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923)
Sep 30 20:59:26 compute-0 edpm-start-podman-container[94896]: ovn_controller
Sep 30 20:59:26 compute-0 systemd[1]: Created slice User Slice of UID 0.
Sep 30 20:59:26 compute-0 systemd[1]: Starting User Runtime Directory /run/user/0...
Sep 30 20:59:26 compute-0 edpm-start-podman-container[94895]: Creating additional drop-in dependency for "ovn_controller" (aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660)
Sep 30 20:59:26 compute-0 systemd[1]: Finished User Runtime Directory /run/user/0.
Sep 30 20:59:26 compute-0 systemd[1]: Starting User Manager for UID 0...
Sep 30 20:59:26 compute-0 podman[94919]: 2025-09-30 20:59:26.733588052 +0000 UTC m=+0.065420566 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Sep 30 20:59:26 compute-0 systemd[94956]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Sep 30 20:59:26 compute-0 systemd[1]: aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660-2367ca8696d49583.service: Main process exited, code=exited, status=1/FAILURE
Sep 30 20:59:26 compute-0 systemd[1]: aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660-2367ca8696d49583.service: Failed with result 'exit-code'.
Sep 30 20:59:26 compute-0 systemd[1]: Reloading.
Sep 30 20:59:26 compute-0 systemd-rc-local-generator[94993]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 20:59:26 compute-0 systemd-sysv-generator[94998]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 20:59:26 compute-0 systemd[94956]: Queued start job for default target Main User Target.
Sep 30 20:59:26 compute-0 systemd[94956]: Created slice User Application Slice.
Sep 30 20:59:26 compute-0 systemd[94956]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Sep 30 20:59:26 compute-0 systemd[94956]: Started Daily Cleanup of User's Temporary Directories.
Sep 30 20:59:26 compute-0 systemd[94956]: Reached target Paths.
Sep 30 20:59:26 compute-0 systemd[94956]: Reached target Timers.
Sep 30 20:59:26 compute-0 systemd[94956]: Starting D-Bus User Message Bus Socket...
Sep 30 20:59:26 compute-0 systemd[94956]: Starting Create User's Volatile Files and Directories...
Sep 30 20:59:26 compute-0 systemd[94956]: Listening on D-Bus User Message Bus Socket.
Sep 30 20:59:26 compute-0 systemd[94956]: Reached target Sockets.
Sep 30 20:59:26 compute-0 systemd[94956]: Finished Create User's Volatile Files and Directories.
Sep 30 20:59:26 compute-0 systemd[94956]: Reached target Basic System.
Sep 30 20:59:26 compute-0 systemd[94956]: Reached target Main User Target.
Sep 30 20:59:26 compute-0 systemd[94956]: Startup finished in 125ms.
Sep 30 20:59:26 compute-0 systemd[1]: Started User Manager for UID 0.
Sep 30 20:59:26 compute-0 systemd[1]: Started ovn_controller container.
Sep 30 20:59:26 compute-0 systemd[1]: Started Session c1 of User root.
Sep 30 20:59:26 compute-0 sudo[94852]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:27 compute-0 ovn_controller[94912]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Sep 30 20:59:27 compute-0 ovn_controller[94912]: INFO:__main__:Validating config file
Sep 30 20:59:27 compute-0 ovn_controller[94912]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Sep 30 20:59:27 compute-0 ovn_controller[94912]: INFO:__main__:Writing out command to execute
Sep 30 20:59:27 compute-0 systemd[1]: session-c1.scope: Deactivated successfully.
Sep 30 20:59:27 compute-0 ovn_controller[94912]: ++ cat /run_command
Sep 30 20:59:27 compute-0 ovn_controller[94912]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Sep 30 20:59:27 compute-0 ovn_controller[94912]: + ARGS=
Sep 30 20:59:27 compute-0 ovn_controller[94912]: + sudo kolla_copy_cacerts
Sep 30 20:59:27 compute-0 systemd[1]: Started Session c2 of User root.
Sep 30 20:59:27 compute-0 systemd[1]: session-c2.scope: Deactivated successfully.
Sep 30 20:59:27 compute-0 ovn_controller[94912]: + [[ ! -n '' ]]
Sep 30 20:59:27 compute-0 ovn_controller[94912]: + . kolla_extend_start
Sep 30 20:59:27 compute-0 ovn_controller[94912]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Sep 30 20:59:27 compute-0 ovn_controller[94912]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Sep 30 20:59:27 compute-0 ovn_controller[94912]: + umask 0022
Sep 30 20:59:27 compute-0 ovn_controller[94912]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Sep 30 20:59:27 compute-0 ovn_controller[94912]: 2025-09-30T20:59:27Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Sep 30 20:59:27 compute-0 ovn_controller[94912]: 2025-09-30T20:59:27Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Sep 30 20:59:27 compute-0 ovn_controller[94912]: 2025-09-30T20:59:27Z|00003|main|INFO|OVN internal version is : [24.03.7-20.33.0-76.8]
Sep 30 20:59:27 compute-0 ovn_controller[94912]: 2025-09-30T20:59:27Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Sep 30 20:59:27 compute-0 ovn_controller[94912]: 2025-09-30T20:59:27Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Sep 30 20:59:27 compute-0 ovn_controller[94912]: 2025-09-30T20:59:27Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Sep 30 20:59:27 compute-0 NetworkManager[51733]: <info>  [1759265967.1741] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Sep 30 20:59:27 compute-0 NetworkManager[51733]: <info>  [1759265967.1751] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 20:59:27 compute-0 kernel: br-int: entered promiscuous mode
Sep 30 20:59:27 compute-0 NetworkManager[51733]: <info>  [1759265967.1778] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Sep 30 20:59:27 compute-0 NetworkManager[51733]: <info>  [1759265967.1794] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Sep 30 20:59:27 compute-0 NetworkManager[51733]: <info>  [1759265967.1806] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Sep 30 20:59:27 compute-0 ovn_controller[94912]: 2025-09-30T20:59:27Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Sep 30 20:59:27 compute-0 ovn_controller[94912]: 2025-09-30T20:59:27Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Sep 30 20:59:27 compute-0 ovn_controller[94912]: 2025-09-30T20:59:27Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Sep 30 20:59:27 compute-0 ovn_controller[94912]: 2025-09-30T20:59:27Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Sep 30 20:59:27 compute-0 ovn_controller[94912]: 2025-09-30T20:59:27Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Sep 30 20:59:27 compute-0 ovn_controller[94912]: 2025-09-30T20:59:27Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Sep 30 20:59:27 compute-0 ovn_controller[94912]: 2025-09-30T20:59:27Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Sep 30 20:59:27 compute-0 ovn_controller[94912]: 2025-09-30T20:59:27Z|00014|main|INFO|OVS feature set changed, force recompute.
Sep 30 20:59:27 compute-0 ovn_controller[94912]: 2025-09-30T20:59:27Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Sep 30 20:59:27 compute-0 ovn_controller[94912]: 2025-09-30T20:59:27Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Sep 30 20:59:27 compute-0 ovn_controller[94912]: 2025-09-30T20:59:27Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Sep 30 20:59:27 compute-0 ovn_controller[94912]: 2025-09-30T20:59:27Z|00018|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Sep 30 20:59:27 compute-0 ovn_controller[94912]: 2025-09-30T20:59:27Z|00019|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Sep 30 20:59:27 compute-0 ovn_controller[94912]: 2025-09-30T20:59:27Z|00020|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Sep 30 20:59:27 compute-0 ovn_controller[94912]: 2025-09-30T20:59:27Z|00021|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Sep 30 20:59:27 compute-0 ovn_controller[94912]: 2025-09-30T20:59:27Z|00022|main|INFO|OVS feature set changed, force recompute.
Sep 30 20:59:27 compute-0 ovn_controller[94912]: 2025-09-30T20:59:27Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Sep 30 20:59:27 compute-0 ovn_controller[94912]: 2025-09-30T20:59:27Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Sep 30 20:59:27 compute-0 systemd-udevd[95044]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 20:59:27 compute-0 ovn_controller[94912]: 2025-09-30T20:59:27Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Sep 30 20:59:27 compute-0 ovn_controller[94912]: 2025-09-30T20:59:27Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Sep 30 20:59:27 compute-0 ovn_controller[94912]: 2025-09-30T20:59:27Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Sep 30 20:59:27 compute-0 ovn_controller[94912]: 2025-09-30T20:59:27Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Sep 30 20:59:27 compute-0 ovn_controller[94912]: 2025-09-30T20:59:27Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Sep 30 20:59:27 compute-0 ovn_controller[94912]: 2025-09-30T20:59:27Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Sep 30 20:59:27 compute-0 NetworkManager[51733]: <info>  [1759265967.2563] manager: (ovn-7db753-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Sep 30 20:59:27 compute-0 NetworkManager[51733]: <info>  [1759265967.2600] manager: (ovn-78438f-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/18)
Sep 30 20:59:27 compute-0 kernel: genev_sys_6081: entered promiscuous mode
Sep 30 20:59:27 compute-0 NetworkManager[51733]: <info>  [1759265967.2752] device (genev_sys_6081): carrier: link connected
Sep 30 20:59:27 compute-0 NetworkManager[51733]: <info>  [1759265967.2754] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/19)
Sep 30 20:59:27 compute-0 NetworkManager[51733]: <info>  [1759265967.9171] manager: (ovn-1c612d-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Sep 30 20:59:28 compute-0 sudo[95174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jboiaoiyjcyskeoasdfqojtcgmbjahpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265967.9096236-1792-170931203098385/AnsiballZ_command.py'
Sep 30 20:59:28 compute-0 sudo[95174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:28 compute-0 python3.9[95176]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:59:28 compute-0 ovs-vsctl[95177]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Sep 30 20:59:28 compute-0 sudo[95174]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:28 compute-0 sudo[95327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgzjskgaulieqoolhbvblcvwdiqzhftc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265968.67841-1816-15116872085615/AnsiballZ_command.py'
Sep 30 20:59:28 compute-0 sudo[95327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:29 compute-0 python3.9[95329]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:59:29 compute-0 ovs-vsctl[95331]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Sep 30 20:59:29 compute-0 sudo[95327]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:31 compute-0 sudo[95482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xksubrqbymmdregjiicyxsssuatiltff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265970.8392477-1858-3943481703409/AnsiballZ_command.py'
Sep 30 20:59:31 compute-0 sudo[95482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:31 compute-0 python3.9[95484]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 20:59:31 compute-0 ovs-vsctl[95485]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Sep 30 20:59:31 compute-0 sudo[95482]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:31 compute-0 sshd-session[84414]: Connection closed by 192.168.122.30 port 51134
Sep 30 20:59:31 compute-0 sshd-session[84411]: pam_unix(sshd:session): session closed for user zuul
Sep 30 20:59:31 compute-0 systemd[1]: session-20.scope: Deactivated successfully.
Sep 30 20:59:31 compute-0 systemd[1]: session-20.scope: Consumed 48.597s CPU time.
Sep 30 20:59:31 compute-0 systemd-logind[792]: Session 20 logged out. Waiting for processes to exit.
Sep 30 20:59:31 compute-0 systemd-logind[792]: Removed session 20.
Sep 30 20:59:35 compute-0 sshd-session[95510]: Connection closed by authenticating user root 80.94.95.115 port 56068 [preauth]
Sep 30 20:59:36 compute-0 sshd-session[95512]: Accepted publickey for zuul from 192.168.122.30 port 52612 ssh2: ECDSA SHA256:SmCicXXyU0CyMnob1MNtb+B3Td3Ord5lbeuM/VGGA5o
Sep 30 20:59:36 compute-0 systemd-logind[792]: New session 22 of user zuul.
Sep 30 20:59:36 compute-0 systemd[1]: Started Session 22 of User zuul.
Sep 30 20:59:36 compute-0 sshd-session[95512]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 20:59:37 compute-0 systemd[1]: Stopping User Manager for UID 0...
Sep 30 20:59:37 compute-0 systemd[94956]: Activating special unit Exit the Session...
Sep 30 20:59:37 compute-0 systemd[94956]: Stopped target Main User Target.
Sep 30 20:59:37 compute-0 systemd[94956]: Stopped target Basic System.
Sep 30 20:59:37 compute-0 systemd[94956]: Stopped target Paths.
Sep 30 20:59:37 compute-0 systemd[94956]: Stopped target Sockets.
Sep 30 20:59:37 compute-0 systemd[94956]: Stopped target Timers.
Sep 30 20:59:37 compute-0 systemd[94956]: Stopped Daily Cleanup of User's Temporary Directories.
Sep 30 20:59:37 compute-0 systemd[94956]: Closed D-Bus User Message Bus Socket.
Sep 30 20:59:37 compute-0 systemd[94956]: Stopped Create User's Volatile Files and Directories.
Sep 30 20:59:37 compute-0 systemd[94956]: Removed slice User Application Slice.
Sep 30 20:59:37 compute-0 systemd[94956]: Reached target Shutdown.
Sep 30 20:59:37 compute-0 systemd[94956]: Finished Exit the Session.
Sep 30 20:59:37 compute-0 systemd[94956]: Reached target Exit the Session.
Sep 30 20:59:37 compute-0 systemd[1]: user@0.service: Deactivated successfully.
Sep 30 20:59:37 compute-0 systemd[1]: Stopped User Manager for UID 0.
Sep 30 20:59:37 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/0...
Sep 30 20:59:37 compute-0 systemd[1]: run-user-0.mount: Deactivated successfully.
Sep 30 20:59:37 compute-0 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Sep 30 20:59:37 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/0.
Sep 30 20:59:37 compute-0 systemd[1]: Removed slice User Slice of UID 0.
Sep 30 20:59:38 compute-0 python3.9[95670]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:59:39 compute-0 sudo[95824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qoxvsxgzdvohodpddasqnuylrsnguzyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265978.5135577-67-154714992044/AnsiballZ_file.py'
Sep 30 20:59:39 compute-0 sudo[95824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:39 compute-0 python3.9[95826]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:59:39 compute-0 sudo[95824]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:39 compute-0 sudo[95976]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yowyqqrxtwzdzarkeykbniaxczcfjisf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265979.4240654-67-247484395671366/AnsiballZ_file.py'
Sep 30 20:59:39 compute-0 sudo[95976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:39 compute-0 python3.9[95978]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:59:39 compute-0 sudo[95976]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:40 compute-0 sudo[96128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eblxnjvgqzfvcwgrnbttkgfdqlbbkcdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265980.0923357-67-67245808094885/AnsiballZ_file.py'
Sep 30 20:59:40 compute-0 sudo[96128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:40 compute-0 python3.9[96130]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:59:40 compute-0 sudo[96128]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:41 compute-0 sudo[96280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbkcujeaieojzxfujhfucfpvvybsjypm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265980.8296142-67-94322766262820/AnsiballZ_file.py'
Sep 30 20:59:41 compute-0 sudo[96280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:41 compute-0 python3.9[96282]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:59:41 compute-0 sudo[96280]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:41 compute-0 sudo[96432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-badwdfulqomyjcxggniovzyrmzxuqepo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265981.5889962-67-158949675363951/AnsiballZ_file.py'
Sep 30 20:59:41 compute-0 sudo[96432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:42 compute-0 python3.9[96434]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:59:42 compute-0 sudo[96432]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:43 compute-0 python3.9[96584]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 20:59:44 compute-0 sudo[96734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izvuwtkgjhdeefspynxdabvcmweaztod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265983.5214546-199-200826210735053/AnsiballZ_seboolean.py'
Sep 30 20:59:44 compute-0 sudo[96734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:44 compute-0 python3.9[96736]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Sep 30 20:59:44 compute-0 sudo[96734]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:45 compute-0 python3.9[96887]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:59:46 compute-0 python3.9[97008]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759265985.1275175-223-43102992143358/.source follow=False _original_basename=haproxy.j2 checksum=95c62e64c8f82dd9393a560d1b052dc98d38f810 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:59:47 compute-0 python3.9[97158]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:59:47 compute-0 python3.9[97279]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759265986.739074-268-63652641166000/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:59:48 compute-0 sudo[97429]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hycnjscdcmkaqxrvorgpcyllucwnbgzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265988.2013094-319-114339986568466/AnsiballZ_setup.py'
Sep 30 20:59:48 compute-0 sudo[97429]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:48 compute-0 python3.9[97431]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 20:59:49 compute-0 sudo[97429]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:49 compute-0 sudo[97513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jletbsrlwmducwitqtzxesorawjvuqug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265988.2013094-319-114339986568466/AnsiballZ_dnf.py'
Sep 30 20:59:49 compute-0 sudo[97513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:49 compute-0 python3.9[97515]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 20:59:50 compute-0 sudo[97513]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:51 compute-0 sudo[97666]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plsdmjjgkymzpaiaxmgwseokvytbbnpe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265991.259912-355-78458235255765/AnsiballZ_systemd.py'
Sep 30 20:59:51 compute-0 sudo[97666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 20:59:52 compute-0 python3.9[97668]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Sep 30 20:59:52 compute-0 sudo[97666]: pam_unix(sudo:session): session closed for user root
Sep 30 20:59:53 compute-0 python3.9[97821]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:59:53 compute-0 python3.9[97942]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759265992.5474346-379-83133866151772/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:59:54 compute-0 python3.9[98092]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:59:55 compute-0 python3.9[98213]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759265993.9625502-379-156326849406887/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:59:56 compute-0 python3.9[98363]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:59:57 compute-0 ovn_controller[94912]: 2025-09-30T20:59:57Z|00025|memory|INFO|16256 kB peak resident set size after 29.9 seconds
Sep 30 20:59:57 compute-0 ovn_controller[94912]: 2025-09-30T20:59:57Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:585 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:2
Sep 30 20:59:57 compute-0 podman[98458]: 2025-09-30 20:59:57.067050761 +0000 UTC m=+0.103777136 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 20:59:57 compute-0 python3.9[98495]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759265996.106922-511-217447618890473/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:59:57 compute-0 python3.9[98661]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 20:59:58 compute-0 python3.9[98782]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759265997.3415983-511-38242144326768/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=3fd0bbe67f8d6b170421a2b4395a288aa69eaea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 20:59:59 compute-0 python3.9[98932]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:00:00 compute-0 sudo[99084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgtvncmgvlyxeqxxhisijbmefkjngtpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759265999.6741838-625-102614066336038/AnsiballZ_file.py'
Sep 30 21:00:00 compute-0 sudo[99084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:00 compute-0 python3.9[99086]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:00:00 compute-0 sudo[99084]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:00 compute-0 sudo[99236]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cetettxeeryaepwadoqrfllkyisumhyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266000.5003674-649-85770886390390/AnsiballZ_stat.py'
Sep 30 21:00:00 compute-0 sudo[99236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:01 compute-0 python3.9[99238]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:00:01 compute-0 sudo[99236]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:01 compute-0 sudo[99314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjlbyyboinpijeovqsxzhisjnzeqvmhz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266000.5003674-649-85770886390390/AnsiballZ_file.py'
Sep 30 21:00:01 compute-0 sudo[99314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:01 compute-0 python3.9[99316]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:00:01 compute-0 sudo[99314]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:02 compute-0 sudo[99466]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxxlsowtyieuuwiextubrrlkmcvwoxkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266001.747036-649-47135327168897/AnsiballZ_stat.py'
Sep 30 21:00:02 compute-0 sudo[99466]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:02 compute-0 python3.9[99468]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:00:02 compute-0 sudo[99466]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:02 compute-0 sudo[99544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vyfztlklbjueaaqwkygagsuekirmulcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266001.747036-649-47135327168897/AnsiballZ_file.py'
Sep 30 21:00:02 compute-0 sudo[99544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:02 compute-0 python3.9[99546]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:00:03 compute-0 sudo[99544]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:03 compute-0 sudo[99696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvlbjoqcdxroazwmposvbwyvzwdlyyyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266003.1789315-718-69669393118510/AnsiballZ_file.py'
Sep 30 21:00:03 compute-0 sudo[99696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:03 compute-0 python3.9[99698]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:00:03 compute-0 sudo[99696]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:04 compute-0 sudo[99848]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agteodgnukymtpcyswtsqmzdahonrunn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266003.857637-742-271093557718004/AnsiballZ_stat.py'
Sep 30 21:00:04 compute-0 sudo[99848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:04 compute-0 python3.9[99850]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:00:04 compute-0 sudo[99848]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:04 compute-0 sudo[99926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxqkxkugjvqacubgofnulevgjpgrddrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266003.857637-742-271093557718004/AnsiballZ_file.py'
Sep 30 21:00:04 compute-0 sudo[99926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:04 compute-0 python3.9[99928]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:00:04 compute-0 sudo[99926]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:05 compute-0 sudo[100078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubdartbzpenmtunsdvmluqeiofjuhnsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266005.1240196-778-114830084291892/AnsiballZ_stat.py'
Sep 30 21:00:05 compute-0 sudo[100078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:05 compute-0 python3.9[100080]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:00:05 compute-0 sudo[100078]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:05 compute-0 sudo[100156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iredqctnamiwrtgxmhsvotuwgjpsncbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266005.1240196-778-114830084291892/AnsiballZ_file.py'
Sep 30 21:00:05 compute-0 sudo[100156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:06 compute-0 python3.9[100158]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:00:06 compute-0 sudo[100156]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:06 compute-0 sudo[100308]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elaytipmlkdldprrahgkibiejttihxjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266006.2817643-814-223817181014208/AnsiballZ_systemd.py'
Sep 30 21:00:06 compute-0 sudo[100308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:06 compute-0 python3.9[100310]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:00:06 compute-0 systemd[1]: Reloading.
Sep 30 21:00:07 compute-0 systemd-rc-local-generator[100334]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:00:07 compute-0 systemd-sysv-generator[100339]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:00:07 compute-0 sudo[100308]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:07 compute-0 sudo[100497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdyisxklogxeskfrqbugnhumzuwjeojf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266007.7153232-838-208296118033263/AnsiballZ_stat.py'
Sep 30 21:00:07 compute-0 sudo[100497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:08 compute-0 python3.9[100499]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:00:08 compute-0 sudo[100497]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:08 compute-0 sudo[100575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytfzicgaekvlaysiaprreyyanszrdfby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266007.7153232-838-208296118033263/AnsiballZ_file.py'
Sep 30 21:00:08 compute-0 sudo[100575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:08 compute-0 python3.9[100577]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:00:08 compute-0 sudo[100575]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:09 compute-0 sudo[100727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpvvktssmphhkiltjbunplboidrlrnli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266009.0447624-874-135197849535566/AnsiballZ_stat.py'
Sep 30 21:00:09 compute-0 sudo[100727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:09 compute-0 python3.9[100729]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:00:09 compute-0 sudo[100727]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:09 compute-0 sudo[100805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgckwhfieznywupgxhwmeehewdezrnqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266009.0447624-874-135197849535566/AnsiballZ_file.py'
Sep 30 21:00:09 compute-0 sudo[100805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:10 compute-0 python3.9[100807]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:00:10 compute-0 sudo[100805]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:10 compute-0 sudo[100957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hsxglycqtuohkdysavpbyxwahhtspvpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266010.369421-910-68891849575515/AnsiballZ_systemd.py'
Sep 30 21:00:10 compute-0 sudo[100957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:10 compute-0 python3.9[100959]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:00:10 compute-0 systemd[1]: Reloading.
Sep 30 21:00:11 compute-0 systemd-sysv-generator[100985]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:00:11 compute-0 systemd-rc-local-generator[100979]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:00:11 compute-0 systemd[1]: Starting Create netns directory...
Sep 30 21:00:11 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Sep 30 21:00:11 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Sep 30 21:00:11 compute-0 systemd[1]: Finished Create netns directory.
Sep 30 21:00:11 compute-0 sudo[100957]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:13 compute-0 sudo[101150]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpbjsexaqhqceavjwufoynrxfxpkojpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266012.8007333-940-257150295392414/AnsiballZ_file.py'
Sep 30 21:00:13 compute-0 sudo[101150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:13 compute-0 python3.9[101152]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:00:13 compute-0 sudo[101150]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:13 compute-0 sudo[101302]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkotbrommcsfvbqpkvsvdzunbaczlpze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266013.6356266-964-143679774002256/AnsiballZ_stat.py'
Sep 30 21:00:13 compute-0 sudo[101302]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:14 compute-0 python3.9[101304]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:00:14 compute-0 sudo[101302]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:14 compute-0 sudo[101425]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbegluljzuvgksfakihxthefoktswdkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266013.6356266-964-143679774002256/AnsiballZ_copy.py'
Sep 30 21:00:14 compute-0 sudo[101425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:14 compute-0 python3.9[101427]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759266013.6356266-964-143679774002256/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:00:14 compute-0 sudo[101425]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:15 compute-0 sudo[101577]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twqqwuqqxlkvesdynjcjehczjmwnbrgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266015.2030973-1015-249317643919432/AnsiballZ_file.py'
Sep 30 21:00:15 compute-0 sudo[101577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:15 compute-0 python3.9[101579]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:00:15 compute-0 sudo[101577]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:16 compute-0 sudo[101729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqrkscorzkuwfvwbvesmudymoiefyiaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266016.015018-1039-190246190850017/AnsiballZ_stat.py'
Sep 30 21:00:16 compute-0 sudo[101729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:16 compute-0 python3.9[101731]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:00:16 compute-0 sudo[101729]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:17 compute-0 sudo[101853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkufutgsemcqhpingnapnxvvgrbrmcgg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266016.015018-1039-190246190850017/AnsiballZ_copy.py'
Sep 30 21:00:17 compute-0 sudo[101853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:17 compute-0 python3.9[101855]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759266016.015018-1039-190246190850017/.source.json _original_basename=.ycj3aynn follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:00:17 compute-0 sudo[101853]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:17 compute-0 sudo[102006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wghxnxorslxvaswnqahnvfjgalgxeiwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266017.6525106-1084-131548862708683/AnsiballZ_file.py'
Sep 30 21:00:17 compute-0 sudo[102006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:18 compute-0 python3.9[102008]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:00:18 compute-0 sudo[102006]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:18 compute-0 sudo[102158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgfusauraexqqagoltxxtrmjvtpeyutd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266018.4294078-1108-69750462735836/AnsiballZ_stat.py'
Sep 30 21:00:18 compute-0 sudo[102158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:18 compute-0 sudo[102158]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:19 compute-0 sudo[102281]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajxdyglujkpdfajlpaftrpchyhllzenw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266018.4294078-1108-69750462735836/AnsiballZ_copy.py'
Sep 30 21:00:19 compute-0 sudo[102281]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:19 compute-0 sudo[102281]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:20 compute-0 sudo[102433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urpqauynndzjpdhphhuevntnksyhxvxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266020.181218-1159-109760390484487/AnsiballZ_container_config_data.py'
Sep 30 21:00:20 compute-0 sudo[102433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:20 compute-0 python3.9[102435]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Sep 30 21:00:20 compute-0 sudo[102433]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:21 compute-0 sudo[102585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xogzkspspxfarcrqnwxjfybhcfrfnxwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266021.1707635-1186-87265388825657/AnsiballZ_container_config_hash.py'
Sep 30 21:00:21 compute-0 sudo[102585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:21 compute-0 python3.9[102587]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Sep 30 21:00:21 compute-0 sudo[102585]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:22 compute-0 sudo[102737]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjeorrqkobaqqlarrnprsqgyatndmifs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266022.1447997-1213-8412012202409/AnsiballZ_podman_container_info.py'
Sep 30 21:00:22 compute-0 sudo[102737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:22 compute-0 python3.9[102739]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Sep 30 21:00:23 compute-0 sudo[102737]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:24 compute-0 sudo[102912]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykphjjmlursnvmvruvaeeafldhupxjol ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759266023.7982128-1252-110287319992062/AnsiballZ_edpm_container_manage.py'
Sep 30 21:00:24 compute-0 sudo[102912]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:24 compute-0 python3[102914]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Sep 30 21:00:29 compute-0 podman[102969]: 2025-09-30 21:00:29.943422654 +0000 UTC m=+2.667424116 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923)
Sep 30 21:00:31 compute-0 podman[102926]: 2025-09-30 21:00:31.055367103 +0000 UTC m=+6.280942437 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:00:31 compute-0 podman[103048]: 2025-09-30 21:00:31.199227486 +0000 UTC m=+0.048176447 container create b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible)
Sep 30 21:00:31 compute-0 podman[103048]: 2025-09-30 21:00:31.176140661 +0000 UTC m=+0.025089662 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:00:31 compute-0 python3[102914]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:00:31 compute-0 sudo[102912]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:32 compute-0 sudo[103236]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exoqkdlrmsvsdlfuzlepozifipetfxgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266032.52975-1276-27502918217645/AnsiballZ_stat.py'
Sep 30 21:00:32 compute-0 sudo[103236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:33 compute-0 python3.9[103238]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:00:33 compute-0 sudo[103236]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:33 compute-0 sudo[103390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cktqdlwmdzwhfsodyhgsqcxinuxxdvel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266033.345077-1303-111601875711113/AnsiballZ_file.py'
Sep 30 21:00:33 compute-0 sudo[103390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:33 compute-0 python3.9[103392]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:00:33 compute-0 sudo[103390]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:34 compute-0 sudo[103466]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qyshksfstmxkqxhmugjdbsxchapguexw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266033.345077-1303-111601875711113/AnsiballZ_stat.py'
Sep 30 21:00:34 compute-0 sudo[103466]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:34 compute-0 python3.9[103468]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:00:34 compute-0 sudo[103466]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:34 compute-0 sudo[103617]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abiypnqlmpohaxmuygvlmpgszovxqdxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266034.3454492-1303-128479154582208/AnsiballZ_copy.py'
Sep 30 21:00:34 compute-0 sudo[103617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:34 compute-0 python3.9[103619]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759266034.3454492-1303-128479154582208/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:00:34 compute-0 sudo[103617]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:35 compute-0 sudo[103693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnqwasvufwewekrzvdtygnjlnacvhbeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266034.3454492-1303-128479154582208/AnsiballZ_systemd.py'
Sep 30 21:00:35 compute-0 sudo[103693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:35 compute-0 python3.9[103695]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 21:00:35 compute-0 systemd[1]: Reloading.
Sep 30 21:00:35 compute-0 systemd-rc-local-generator[103723]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:00:35 compute-0 systemd-sysv-generator[103726]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:00:35 compute-0 sudo[103693]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:36 compute-0 sudo[103804]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bajhcjfxkeudvfdarensevjbimagwcoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266034.3454492-1303-128479154582208/AnsiballZ_systemd.py'
Sep 30 21:00:36 compute-0 sudo[103804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:36 compute-0 python3.9[103806]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:00:36 compute-0 systemd[1]: Reloading.
Sep 30 21:00:36 compute-0 systemd-sysv-generator[103838]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:00:36 compute-0 systemd-rc-local-generator[103834]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:00:36 compute-0 systemd[1]: Starting ovn_metadata_agent container...
Sep 30 21:00:36 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:00:36 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/799b463dc77d249a9eb10a3f8b220f4edfd4653ee4ba05b61dc4e9146bc770f9/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Sep 30 21:00:36 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/799b463dc77d249a9eb10a3f8b220f4edfd4653ee4ba05b61dc4e9146bc770f9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:00:36 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84.
Sep 30 21:00:36 compute-0 podman[103846]: 2025-09-30 21:00:36.811000381 +0000 UTC m=+0.114240227 container init b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:00:36 compute-0 ovn_metadata_agent[103862]: + sudo -E kolla_set_configs
Sep 30 21:00:36 compute-0 podman[103846]: 2025-09-30 21:00:36.833155044 +0000 UTC m=+0.136394890 container start b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 21:00:36 compute-0 edpm-start-podman-container[103846]: ovn_metadata_agent
Sep 30 21:00:36 compute-0 ovn_metadata_agent[103862]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Sep 30 21:00:36 compute-0 ovn_metadata_agent[103862]: INFO:__main__:Validating config file
Sep 30 21:00:36 compute-0 ovn_metadata_agent[103862]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Sep 30 21:00:36 compute-0 ovn_metadata_agent[103862]: INFO:__main__:Copying service configuration files
Sep 30 21:00:36 compute-0 ovn_metadata_agent[103862]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Sep 30 21:00:36 compute-0 ovn_metadata_agent[103862]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Sep 30 21:00:36 compute-0 ovn_metadata_agent[103862]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Sep 30 21:00:36 compute-0 ovn_metadata_agent[103862]: INFO:__main__:Writing out command to execute
Sep 30 21:00:36 compute-0 ovn_metadata_agent[103862]: INFO:__main__:Setting permission for /var/lib/neutron
Sep 30 21:00:36 compute-0 ovn_metadata_agent[103862]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Sep 30 21:00:36 compute-0 ovn_metadata_agent[103862]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Sep 30 21:00:36 compute-0 ovn_metadata_agent[103862]: INFO:__main__:Setting permission for /var/lib/neutron/external
Sep 30 21:00:36 compute-0 ovn_metadata_agent[103862]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Sep 30 21:00:36 compute-0 ovn_metadata_agent[103862]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Sep 30 21:00:36 compute-0 ovn_metadata_agent[103862]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Sep 30 21:00:36 compute-0 podman[103869]: 2025-09-30 21:00:36.896317058 +0000 UTC m=+0.053472840 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 21:00:36 compute-0 ovn_metadata_agent[103862]: ++ cat /run_command
Sep 30 21:00:36 compute-0 ovn_metadata_agent[103862]: + CMD=neutron-ovn-metadata-agent
Sep 30 21:00:36 compute-0 ovn_metadata_agent[103862]: + ARGS=
Sep 30 21:00:36 compute-0 ovn_metadata_agent[103862]: + sudo kolla_copy_cacerts
Sep 30 21:00:36 compute-0 edpm-start-podman-container[103845]: Creating additional drop-in dependency for "ovn_metadata_agent" (b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84)
Sep 30 21:00:36 compute-0 ovn_metadata_agent[103862]: + [[ ! -n '' ]]
Sep 30 21:00:36 compute-0 ovn_metadata_agent[103862]: + . kolla_extend_start
Sep 30 21:00:36 compute-0 ovn_metadata_agent[103862]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Sep 30 21:00:36 compute-0 ovn_metadata_agent[103862]: Running command: 'neutron-ovn-metadata-agent'
Sep 30 21:00:36 compute-0 ovn_metadata_agent[103862]: + umask 0022
Sep 30 21:00:36 compute-0 ovn_metadata_agent[103862]: + exec neutron-ovn-metadata-agent
Sep 30 21:00:36 compute-0 systemd[1]: Reloading.
Sep 30 21:00:37 compute-0 systemd-sysv-generator[103938]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:00:37 compute-0 systemd-rc-local-generator[103935]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:00:37 compute-0 systemd[1]: Started ovn_metadata_agent container.
Sep 30 21:00:37 compute-0 sudo[103804]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:38 compute-0 sshd-session[95515]: Connection closed by 192.168.122.30 port 52612
Sep 30 21:00:38 compute-0 sshd-session[95512]: pam_unix(sshd:session): session closed for user zuul
Sep 30 21:00:38 compute-0 systemd[1]: session-22.scope: Deactivated successfully.
Sep 30 21:00:38 compute-0 systemd[1]: session-22.scope: Consumed 51.294s CPU time.
Sep 30 21:00:38 compute-0 systemd-logind[792]: Session 22 logged out. Waiting for processes to exit.
Sep 30 21:00:38 compute-0 systemd-logind[792]: Removed session 22.
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.651 103867 INFO neutron.common.config [-] Logging enabled!
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.652 103867 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.652 103867 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.652 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.653 103867 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.653 103867 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.653 103867 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.653 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.653 103867 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.653 103867 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.654 103867 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.654 103867 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.654 103867 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.654 103867 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.654 103867 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.654 103867 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.654 103867 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.655 103867 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.655 103867 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.655 103867 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.655 103867 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.655 103867 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.655 103867 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.655 103867 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.656 103867 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.656 103867 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.656 103867 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.656 103867 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.656 103867 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.656 103867 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.656 103867 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.656 103867 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.657 103867 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.657 103867 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.657 103867 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.657 103867 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.657 103867 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.657 103867 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.658 103867 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.658 103867 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.658 103867 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.658 103867 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.658 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.658 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.658 103867 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.658 103867 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.659 103867 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.659 103867 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.659 103867 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.659 103867 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.659 103867 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.659 103867 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.659 103867 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.659 103867 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.660 103867 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.660 103867 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.660 103867 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.660 103867 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.660 103867 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.660 103867 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.661 103867 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.661 103867 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.661 103867 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.661 103867 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.661 103867 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.661 103867 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.661 103867 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.662 103867 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.662 103867 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.662 103867 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.662 103867 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.662 103867 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.662 103867 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.662 103867 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-cell1-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.663 103867 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.663 103867 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.663 103867 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.663 103867 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.663 103867 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.663 103867 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.663 103867 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.664 103867 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.664 103867 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.664 103867 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.664 103867 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.664 103867 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.664 103867 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.664 103867 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.665 103867 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.665 103867 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.665 103867 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.665 103867 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.665 103867 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.665 103867 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.665 103867 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.666 103867 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.666 103867 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.666 103867 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.666 103867 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.666 103867 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.666 103867 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.666 103867 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.666 103867 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.666 103867 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.667 103867 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.667 103867 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.667 103867 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.667 103867 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.667 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.667 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.667 103867 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.668 103867 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.668 103867 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.668 103867 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.668 103867 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.668 103867 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.668 103867 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.668 103867 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.669 103867 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.669 103867 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.669 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.669 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.669 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.669 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.670 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.670 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.670 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.670 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.670 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.670 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.670 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.671 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.671 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.671 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.671 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.671 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.671 103867 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.671 103867 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.671 103867 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.672 103867 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.672 103867 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.672 103867 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.672 103867 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.672 103867 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.672 103867 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.672 103867 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.673 103867 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.673 103867 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.673 103867 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.673 103867 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.673 103867 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.673 103867 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.673 103867 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.674 103867 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.674 103867 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.674 103867 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.674 103867 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.674 103867 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.674 103867 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.674 103867 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.675 103867 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.675 103867 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.675 103867 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.675 103867 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.675 103867 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.675 103867 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.675 103867 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.675 103867 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.676 103867 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.676 103867 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.676 103867 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.676 103867 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.676 103867 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.676 103867 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.676 103867 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.677 103867 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.677 103867 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.677 103867 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.677 103867 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.677 103867 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.677 103867 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.677 103867 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.678 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.678 103867 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.678 103867 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.678 103867 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.678 103867 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.678 103867 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.678 103867 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.679 103867 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.679 103867 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.679 103867 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.679 103867 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.679 103867 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.679 103867 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.679 103867 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.680 103867 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.680 103867 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.680 103867 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.680 103867 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.680 103867 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.680 103867 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.680 103867 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.681 103867 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.681 103867 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.681 103867 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.681 103867 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.681 103867 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.681 103867 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.681 103867 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.682 103867 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.682 103867 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.682 103867 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.682 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.682 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.682 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.682 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.683 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.683 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.683 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.683 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.683 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.683 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.684 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.684 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.684 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.684 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.684 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.684 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.684 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.685 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.685 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.685 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.685 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.685 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.685 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.685 103867 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.686 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.686 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.686 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.686 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.686 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.686 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.687 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.687 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.687 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.687 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.687 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.688 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.688 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.688 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.688 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.688 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.688 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.688 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.689 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.689 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.689 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.689 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.689 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.689 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.689 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.690 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.690 103867 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.690 103867 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.690 103867 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.690 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.690 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.691 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.691 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.691 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.691 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.691 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.691 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.691 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.692 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.692 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.692 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.692 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.692 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.692 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.692 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.693 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.693 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.693 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.693 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.693 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.693 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.693 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.694 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.694 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.694 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.694 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.694 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.694 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.694 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.695 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.695 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.695 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.695 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.695 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.695 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.695 103867 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.695 103867 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.741 103867 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.741 103867 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.741 103867 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.741 103867 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.742 103867 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.758 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 3b817c7f-1137-4e8f-8263-8c5e6eddafa4 (UUID: 3b817c7f-1137-4e8f-8263-8c5e6eddafa4) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.786 103867 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.787 103867 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.787 103867 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.787 103867 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.790 103867 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.801 103867 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.809 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '3b817c7f-1137-4e8f-8263-8c5e6eddafa4'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], external_ids={}, name=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, nb_cfg_timestamp=1759265975200, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.811 103867 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7fe0bd719bb0>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.812 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.812 103867 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.812 103867 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.812 103867 INFO oslo_service.service [-] Starting 1 workers
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.818 103867 DEBUG oslo_service.service [-] Started child 103975 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.822 103867 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpwuq6krd5/privsep.sock']
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.824 103975 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-958125'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.856 103975 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.856 103975 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.856 103975 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.860 103975 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.868 103975 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Sep 30 21:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:38.876 103975 INFO eventlet.wsgi.server [-] (103975) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Sep 30 21:00:39 compute-0 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Sep 30 21:00:39 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:39.561 103867 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Sep 30 21:00:39 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:39.561 103867 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpwuq6krd5/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Sep 30 21:00:39 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:39.429 103980 INFO oslo.privsep.daemon [-] privsep daemon starting
Sep 30 21:00:39 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:39.434 103980 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Sep 30 21:00:39 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:39.435 103980 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Sep 30 21:00:39 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:39.436 103980 INFO oslo.privsep.daemon [-] privsep daemon running as pid 103980
Sep 30 21:00:39 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:39.564 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[87d694ff-c2cb-4d7b-a56b-1c55de4e931c]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.013 103980 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.013 103980 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.013 103980 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.546 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[1b1d0631-7685-4668-b49d-20427dc7f731]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.548 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, column=external_ids, values=({'neutron:ovn-metadata-id': '25070b9f-3f72-5ccc-b24c-bc64fea6afa5'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.557 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.564 103867 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.564 103867 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.564 103867 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.564 103867 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.565 103867 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.565 103867 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.565 103867 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.565 103867 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.565 103867 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.565 103867 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.565 103867 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.566 103867 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.566 103867 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.566 103867 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.566 103867 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.566 103867 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.566 103867 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.566 103867 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.566 103867 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.566 103867 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.567 103867 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.567 103867 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.567 103867 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.567 103867 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.567 103867 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.567 103867 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.567 103867 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.567 103867 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.568 103867 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.568 103867 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.568 103867 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.568 103867 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.568 103867 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.568 103867 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.568 103867 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.568 103867 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.569 103867 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.569 103867 DEBUG oslo_service.service [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.569 103867 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.569 103867 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.569 103867 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.569 103867 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.569 103867 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.569 103867 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.570 103867 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.570 103867 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.570 103867 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.570 103867 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.570 103867 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.570 103867 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.570 103867 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.570 103867 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.570 103867 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.570 103867 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.571 103867 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.571 103867 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.571 103867 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.571 103867 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.571 103867 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.571 103867 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.571 103867 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.571 103867 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.571 103867 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.571 103867 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.572 103867 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.572 103867 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.572 103867 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.572 103867 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.572 103867 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.572 103867 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.572 103867 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.572 103867 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.572 103867 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.573 103867 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-cell1-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.573 103867 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.573 103867 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.573 103867 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.573 103867 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.573 103867 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.573 103867 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.573 103867 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.573 103867 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.574 103867 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.574 103867 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.574 103867 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.574 103867 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.574 103867 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.574 103867 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.574 103867 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.574 103867 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.574 103867 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.575 103867 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.575 103867 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.575 103867 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.575 103867 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.575 103867 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.575 103867 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.575 103867 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.575 103867 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.576 103867 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.576 103867 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.576 103867 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.576 103867 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.576 103867 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.576 103867 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.576 103867 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.576 103867 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.577 103867 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.577 103867 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.577 103867 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.577 103867 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.577 103867 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.577 103867 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.577 103867 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.577 103867 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.577 103867 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.578 103867 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.578 103867 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.578 103867 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.578 103867 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.578 103867 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.578 103867 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.578 103867 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.578 103867 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.578 103867 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.579 103867 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.579 103867 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.579 103867 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.579 103867 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.579 103867 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.579 103867 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.579 103867 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.579 103867 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.580 103867 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.580 103867 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.580 103867 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.580 103867 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.580 103867 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.580 103867 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.580 103867 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.580 103867 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.580 103867 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.580 103867 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.581 103867 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.581 103867 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.581 103867 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.581 103867 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.581 103867 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.581 103867 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.581 103867 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.581 103867 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.581 103867 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.581 103867 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.582 103867 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.582 103867 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.582 103867 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.582 103867 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.582 103867 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.582 103867 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.582 103867 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.582 103867 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.582 103867 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.582 103867 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.582 103867 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.583 103867 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.583 103867 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.583 103867 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.583 103867 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.583 103867 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.583 103867 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.583 103867 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.583 103867 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.583 103867 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.584 103867 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.584 103867 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.584 103867 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.584 103867 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.584 103867 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.584 103867 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.584 103867 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.584 103867 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.585 103867 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.585 103867 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.585 103867 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.585 103867 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.585 103867 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.585 103867 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.585 103867 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.585 103867 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.586 103867 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.586 103867 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.586 103867 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.586 103867 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.586 103867 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.586 103867 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.586 103867 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.586 103867 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.587 103867 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.587 103867 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.587 103867 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.587 103867 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.587 103867 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.587 103867 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.587 103867 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.587 103867 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.587 103867 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.588 103867 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.588 103867 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.588 103867 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.588 103867 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.588 103867 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.588 103867 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.588 103867 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.588 103867 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.588 103867 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.588 103867 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.589 103867 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.589 103867 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.589 103867 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.589 103867 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.589 103867 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.589 103867 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.589 103867 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.589 103867 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.589 103867 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.589 103867 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.590 103867 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.590 103867 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.590 103867 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.590 103867 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.590 103867 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.590 103867 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.590 103867 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.590 103867 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.590 103867 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.590 103867 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.590 103867 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.591 103867 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.591 103867 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.591 103867 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.591 103867 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.591 103867 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.591 103867 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.591 103867 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.591 103867 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.591 103867 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.592 103867 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.592 103867 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.592 103867 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.592 103867 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.592 103867 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.592 103867 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.592 103867 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.592 103867 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.592 103867 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.592 103867 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.593 103867 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.593 103867 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.593 103867 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.593 103867 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.593 103867 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.593 103867 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.593 103867 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.593 103867 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.593 103867 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.594 103867 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.594 103867 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.594 103867 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.594 103867 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.594 103867 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.594 103867 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.594 103867 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.594 103867 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.594 103867 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.595 103867 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.595 103867 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.595 103867 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.595 103867 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.595 103867 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.595 103867 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.595 103867 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.595 103867 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.596 103867 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.596 103867 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.596 103867 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.596 103867 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.596 103867 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.596 103867 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.596 103867 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.596 103867 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.596 103867 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.597 103867 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.597 103867 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.597 103867 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.597 103867 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.597 103867 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.597 103867 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.597 103867 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.597 103867 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.597 103867 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.597 103867 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.598 103867 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.598 103867 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:00:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:00:40.598 103867 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Sep 30 21:00:43 compute-0 sshd-session[103985]: Accepted publickey for zuul from 192.168.122.30 port 35884 ssh2: ECDSA SHA256:SmCicXXyU0CyMnob1MNtb+B3Td3Ord5lbeuM/VGGA5o
Sep 30 21:00:43 compute-0 systemd-logind[792]: New session 23 of user zuul.
Sep 30 21:00:43 compute-0 systemd[1]: Started Session 23 of User zuul.
Sep 30 21:00:43 compute-0 sshd-session[103985]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 21:00:44 compute-0 python3.9[104138]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 21:00:45 compute-0 sudo[104292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywihzsupvrckilfwwesjczpdnhngzrrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266045.395344-67-173965283645010/AnsiballZ_command.py'
Sep 30 21:00:45 compute-0 sudo[104292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:46 compute-0 python3.9[104294]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:00:46 compute-0 sudo[104292]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:47 compute-0 sudo[104457]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfhvfinanijdpxknvuvbqvmoirxyvyss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266046.710975-100-190744723373381/AnsiballZ_systemd_service.py'
Sep 30 21:00:47 compute-0 sudo[104457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:47 compute-0 python3.9[104459]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 21:00:47 compute-0 systemd[1]: Reloading.
Sep 30 21:00:47 compute-0 systemd-sysv-generator[104488]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:00:47 compute-0 systemd-rc-local-generator[104485]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:00:47 compute-0 sudo[104457]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:48 compute-0 python3.9[104645]: ansible-ansible.builtin.service_facts Invoked
Sep 30 21:00:49 compute-0 network[104662]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Sep 30 21:00:49 compute-0 network[104663]: 'network-scripts' will be removed from distribution in near future.
Sep 30 21:00:49 compute-0 network[104664]: It is advised to switch to 'NetworkManager' instead for network management.
Sep 30 21:00:57 compute-0 sudo[104926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zshumvylairyyxqpxchebllpwujtueqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266057.511682-157-1614422742313/AnsiballZ_systemd_service.py'
Sep 30 21:00:57 compute-0 sudo[104926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:58 compute-0 python3.9[104928]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:00:58 compute-0 sudo[104926]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:58 compute-0 sudo[105079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfyeznnugisrcfhzmpqvazstpmjhvyio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266058.444061-157-248775768318795/AnsiballZ_systemd_service.py'
Sep 30 21:00:58 compute-0 sudo[105079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:59 compute-0 python3.9[105081]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:00:59 compute-0 sudo[105079]: pam_unix(sudo:session): session closed for user root
Sep 30 21:00:59 compute-0 sudo[105232]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbfgpujiqreoqhrxvqvzcjqcqctobnza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266059.2995992-157-260447919265332/AnsiballZ_systemd_service.py'
Sep 30 21:00:59 compute-0 sudo[105232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:00:59 compute-0 python3.9[105234]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:00:59 compute-0 sudo[105232]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:00 compute-0 sudo[105385]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdioeqceerynrzmktwgwpwivrsppjoxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266060.072311-157-8804573242503/AnsiballZ_systemd_service.py'
Sep 30 21:01:00 compute-0 sudo[105385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:00 compute-0 python3.9[105387]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:01:00 compute-0 sudo[105385]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:01 compute-0 sudo[105547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fozryitjovxtpcqqqmyaqdzbcgiykyuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266060.931383-157-67280970152908/AnsiballZ_systemd_service.py'
Sep 30 21:01:01 compute-0 sudo[105547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:01 compute-0 podman[105512]: 2025-09-30 21:01:01.350360222 +0000 UTC m=+0.132132202 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:01:01 compute-0 python3.9[105554]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:01:01 compute-0 sudo[105547]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:01 compute-0 CROND[105647]: (root) CMD (run-parts /etc/cron.hourly)
Sep 30 21:01:01 compute-0 run-parts[105656]: (/etc/cron.hourly) starting 0anacron
Sep 30 21:01:02 compute-0 anacron[105679]: Anacron started on 2025-09-30
Sep 30 21:01:02 compute-0 anacron[105679]: Will run job `cron.daily' in 48 min.
Sep 30 21:01:02 compute-0 anacron[105679]: Will run job `cron.weekly' in 68 min.
Sep 30 21:01:02 compute-0 anacron[105679]: Will run job `cron.monthly' in 88 min.
Sep 30 21:01:02 compute-0 anacron[105679]: Jobs will be executed sequentially
Sep 30 21:01:02 compute-0 run-parts[105683]: (/etc/cron.hourly) finished 0anacron
Sep 30 21:01:02 compute-0 CROND[105646]: (root) CMDEND (run-parts /etc/cron.hourly)
Sep 30 21:01:02 compute-0 sudo[105734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eanwblhdsykepayfeyzjvkmliuctncrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266061.8374388-157-31121876711478/AnsiballZ_systemd_service.py'
Sep 30 21:01:02 compute-0 sudo[105734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:02 compute-0 python3.9[105736]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:01:03 compute-0 sudo[105734]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:04 compute-0 sudo[105887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nanbedjqtbnrceniuygjamfrwyqdtbhw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266063.828481-157-142154352775247/AnsiballZ_systemd_service.py'
Sep 30 21:01:04 compute-0 sudo[105887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:04 compute-0 python3.9[105889]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:01:04 compute-0 sudo[105887]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:05 compute-0 sudo[106040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ookxmmmxoxlrwrynykhzholxtbfoqusx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266065.036631-313-192840765821271/AnsiballZ_file.py'
Sep 30 21:01:05 compute-0 sudo[106040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:05 compute-0 python3.9[106042]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:01:05 compute-0 sudo[106040]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:06 compute-0 sudo[106192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixtyjcgaoqfdfpuccgwtekttoqcrrkxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266065.8675141-313-81355370446812/AnsiballZ_file.py'
Sep 30 21:01:06 compute-0 sudo[106192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:06 compute-0 python3.9[106194]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:01:06 compute-0 sudo[106192]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:07 compute-0 sudo[106354]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmnpififwdzqwjeqqrzswflcvgkylgjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266066.734778-313-151881595539577/AnsiballZ_file.py'
Sep 30 21:01:07 compute-0 sudo[106354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:07 compute-0 podman[106318]: 2025-09-30 21:01:07.079224149 +0000 UTC m=+0.055034207 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 21:01:07 compute-0 python3.9[106363]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:01:07 compute-0 sudo[106354]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:07 compute-0 sudo[106515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmerhlxpbxchtiogzsygnvuuienxglsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266067.4185762-313-197589271831805/AnsiballZ_file.py'
Sep 30 21:01:07 compute-0 sudo[106515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:07 compute-0 python3.9[106517]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:01:07 compute-0 sudo[106515]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:08 compute-0 sudo[106667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-taxnusixmyvvjfezbrrunyuvzhwmtcem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266068.0604918-313-7219139889041/AnsiballZ_file.py'
Sep 30 21:01:08 compute-0 sudo[106667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:08 compute-0 python3.9[106669]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:01:08 compute-0 sudo[106667]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:09 compute-0 sudo[106819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udcczszeirtlrcelvopxyyonkeeyfase ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266068.7089062-313-105564948317851/AnsiballZ_file.py'
Sep 30 21:01:09 compute-0 sudo[106819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:09 compute-0 python3.9[106821]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:01:09 compute-0 sudo[106819]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:09 compute-0 sudo[106971]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lznjdzdseylxqimuvuhksekqsfidzgnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266069.4239426-313-76474059342283/AnsiballZ_file.py'
Sep 30 21:01:09 compute-0 sudo[106971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:09 compute-0 python3.9[106973]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:01:09 compute-0 sudo[106971]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:10 compute-0 sudo[107123]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eyhndahhevsuegovfelhpftddgytgfjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266070.2299178-463-27659919628478/AnsiballZ_file.py'
Sep 30 21:01:10 compute-0 sudo[107123]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:10 compute-0 python3.9[107125]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:01:10 compute-0 sudo[107123]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:11 compute-0 sudo[107275]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ayuiquounoidxzeogfznefunkutsatxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266070.933296-463-125136367139577/AnsiballZ_file.py'
Sep 30 21:01:11 compute-0 sudo[107275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:11 compute-0 python3.9[107277]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:01:11 compute-0 sudo[107275]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:11 compute-0 sudo[107427]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihrwodkpflbdfiykjimyfikrdoiqrevx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266071.590669-463-109957271774553/AnsiballZ_file.py'
Sep 30 21:01:11 compute-0 sudo[107427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:12 compute-0 python3.9[107429]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:01:12 compute-0 sudo[107427]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:12 compute-0 sudo[107579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwrgfxsmyrjgghfdbpjpybrovvhzxmzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266072.270599-463-86385486162705/AnsiballZ_file.py'
Sep 30 21:01:12 compute-0 sudo[107579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:12 compute-0 python3.9[107581]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:01:12 compute-0 sudo[107579]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:13 compute-0 sudo[107731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kebbudmgcnzwgnptozdxlvstuhiytpbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266072.990444-463-241181110514456/AnsiballZ_file.py'
Sep 30 21:01:13 compute-0 sudo[107731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:13 compute-0 python3.9[107733]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:01:13 compute-0 sudo[107731]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:13 compute-0 sudo[107883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whbtklsbqlqevnfkrlxtqiotgmnnpoux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266073.6580224-463-223465388955372/AnsiballZ_file.py'
Sep 30 21:01:13 compute-0 sudo[107883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:14 compute-0 python3.9[107885]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:01:14 compute-0 sudo[107883]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:14 compute-0 sudo[108035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-roxiahuvnamxfwvpbmwaofmevarxakus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266074.3517568-463-230785905397850/AnsiballZ_file.py'
Sep 30 21:01:14 compute-0 sudo[108035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:14 compute-0 python3.9[108037]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:01:14 compute-0 sudo[108035]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:15 compute-0 sudo[108187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwdzljcofwjhhecnyfhojmjbxxgsegqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266075.4709506-616-234274192178498/AnsiballZ_command.py'
Sep 30 21:01:15 compute-0 sudo[108187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:15 compute-0 python3.9[108189]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:01:15 compute-0 sudo[108187]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:16 compute-0 python3.9[108341]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Sep 30 21:01:17 compute-0 sudo[108491]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxlcepuxqxsrmtfmmiojtwpykpilgafc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266077.255434-670-159028314466423/AnsiballZ_systemd_service.py'
Sep 30 21:01:17 compute-0 sudo[108491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:17 compute-0 python3.9[108493]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 21:01:17 compute-0 systemd[1]: Reloading.
Sep 30 21:01:17 compute-0 systemd-rc-local-generator[108519]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:01:17 compute-0 systemd-sysv-generator[108524]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:01:18 compute-0 sudo[108491]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:18 compute-0 sudo[108677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvoumjlecnafnyqmltbdnbfplsnyoxdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266078.3312716-694-47460763054011/AnsiballZ_command.py'
Sep 30 21:01:18 compute-0 sudo[108677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:18 compute-0 python3.9[108679]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:01:18 compute-0 sudo[108677]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:19 compute-0 sudo[108830]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfvdgignhjbmybxmqnzlsefwgiuhrdcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266079.0570762-694-198098218210224/AnsiballZ_command.py'
Sep 30 21:01:19 compute-0 sudo[108830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:19 compute-0 python3.9[108832]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:01:19 compute-0 sudo[108830]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:20 compute-0 sudo[108983]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvwagsfaemcqxfwfobrlsxthtaraxlff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266079.7546668-694-258938086979360/AnsiballZ_command.py'
Sep 30 21:01:20 compute-0 sudo[108983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:20 compute-0 python3.9[108985]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:01:20 compute-0 sudo[108983]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:20 compute-0 sudo[109136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtrbhgfnpccdvmpasghtvxeeapbtyqjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266080.417491-694-263944685638391/AnsiballZ_command.py'
Sep 30 21:01:20 compute-0 sudo[109136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:20 compute-0 python3.9[109138]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:01:20 compute-0 sudo[109136]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:21 compute-0 sudo[109289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-liwumguukrirupqykdndxwhdipqrrplv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266081.1412103-694-88641698507198/AnsiballZ_command.py'
Sep 30 21:01:21 compute-0 sudo[109289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:21 compute-0 python3.9[109291]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:01:21 compute-0 sudo[109289]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:22 compute-0 sudo[109442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anmdqmogxytrrrssxwzjlxvzoecgopkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266081.8597682-694-257433269274387/AnsiballZ_command.py'
Sep 30 21:01:22 compute-0 sudo[109442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:22 compute-0 python3.9[109444]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:01:23 compute-0 sudo[109442]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:23 compute-0 sudo[109595]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfgeylgfpngwhojduqxcosksxfbqujvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266083.5869453-694-98139003124425/AnsiballZ_command.py'
Sep 30 21:01:23 compute-0 sudo[109595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:24 compute-0 python3.9[109597]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:01:24 compute-0 sudo[109595]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:25 compute-0 sudo[109749]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpxvyucovrffuxdeynlohofdjoblrzjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266084.699847-856-73881345080380/AnsiballZ_getent.py'
Sep 30 21:01:25 compute-0 sudo[109749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:25 compute-0 python3.9[109751]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Sep 30 21:01:25 compute-0 sudo[109749]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:26 compute-0 sudo[109903]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nscjzvoqdkkxwiukdqiqhzfkvdztxqls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266085.6390386-880-12053680661054/AnsiballZ_group.py'
Sep 30 21:01:26 compute-0 sudo[109903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:26 compute-0 python3.9[109905]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Sep 30 21:01:26 compute-0 groupadd[109906]: group added to /etc/group: name=libvirt, GID=42473
Sep 30 21:01:26 compute-0 groupadd[109906]: group added to /etc/gshadow: name=libvirt
Sep 30 21:01:26 compute-0 groupadd[109906]: new group: name=libvirt, GID=42473
Sep 30 21:01:26 compute-0 sudo[109903]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:27 compute-0 sudo[110061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vczjswzkzyyypzmooucjbgqgyvbacsfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266086.784164-904-174624571519242/AnsiballZ_user.py'
Sep 30 21:01:27 compute-0 sudo[110061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:27 compute-0 python3.9[110063]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Sep 30 21:01:27 compute-0 useradd[110065]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Sep 30 21:01:27 compute-0 sudo[110061]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:28 compute-0 sudo[110221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnpximlefjccsjiyztipemknvpzshkqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266088.1012046-937-249157729226782/AnsiballZ_setup.py'
Sep 30 21:01:28 compute-0 sudo[110221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:28 compute-0 python3.9[110223]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 21:01:28 compute-0 sudo[110221]: pam_unix(sudo:session): session closed for user root
Sep 30 21:01:29 compute-0 sudo[110305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yuphivucfvsiwkrsbqplkioqupmpdrhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266088.1012046-937-249157729226782/AnsiballZ_dnf.py'
Sep 30 21:01:29 compute-0 sudo[110305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:01:29 compute-0 python3.9[110307]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 21:01:32 compute-0 podman[110311]: 2025-09-30 21:01:32.400771969 +0000 UTC m=+0.124887548 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true)
Sep 30 21:01:37 compute-0 podman[110339]: 2025-09-30 21:01:37.346687968 +0000 UTC m=+0.078319447 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:01:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:01:38.697 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:01:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:01:38.698 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:01:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:01:38.698 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:02:03 compute-0 podman[110546]: 2025-09-30 21:02:03.399357702 +0000 UTC m=+0.122135203 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2)
Sep 30 21:02:04 compute-0 kernel: SELinux:  Converting 2752 SID table entries...
Sep 30 21:02:04 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 21:02:04 compute-0 kernel: SELinux:  policy capability open_perms=1
Sep 30 21:02:04 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 21:02:04 compute-0 kernel: SELinux:  policy capability always_check_network=0
Sep 30 21:02:04 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 21:02:04 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 21:02:04 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 21:02:08 compute-0 dbus-broker-launch[778]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Sep 30 21:02:08 compute-0 podman[110580]: 2025-09-30 21:02:08.359323262 +0000 UTC m=+0.081421093 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Sep 30 21:02:13 compute-0 kernel: SELinux:  Converting 2752 SID table entries...
Sep 30 21:02:13 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 21:02:13 compute-0 kernel: SELinux:  policy capability open_perms=1
Sep 30 21:02:13 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 21:02:13 compute-0 kernel: SELinux:  policy capability always_check_network=0
Sep 30 21:02:13 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 21:02:13 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 21:02:13 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 21:02:16 compute-0 sshd[1008]: Timeout before authentication for connection from 49.64.169.153 to 38.102.83.69, pid = 101802
Sep 30 21:02:34 compute-0 dbus-broker-launch[778]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Sep 30 21:02:34 compute-0 podman[115822]: 2025-09-30 21:02:34.418624116 +0000 UTC m=+0.148160279 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923)
Sep 30 21:02:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:02:38.699 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:02:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:02:38.700 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:02:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:02:38.700 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:02:39 compute-0 podman[118761]: 2025-09-30 21:02:39.361734867 +0000 UTC m=+0.089937233 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_metadata_agent, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:03:05 compute-0 podman[127399]: 2025-09-30 21:03:05.382665062 +0000 UTC m=+0.113953775 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 21:03:08 compute-0 kernel: SELinux:  Converting 2753 SID table entries...
Sep 30 21:03:08 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 21:03:08 compute-0 kernel: SELinux:  policy capability open_perms=1
Sep 30 21:03:08 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 21:03:08 compute-0 kernel: SELinux:  policy capability always_check_network=0
Sep 30 21:03:08 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 21:03:08 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 21:03:08 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 21:03:09 compute-0 dbus-broker-launch[778]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Sep 30 21:03:09 compute-0 groupadd[127439]: group added to /etc/group: name=dnsmasq, GID=992
Sep 30 21:03:09 compute-0 groupadd[127439]: group added to /etc/gshadow: name=dnsmasq
Sep 30 21:03:09 compute-0 groupadd[127439]: new group: name=dnsmasq, GID=992
Sep 30 21:03:09 compute-0 useradd[127461]: new user: name=dnsmasq, UID=992, GID=992, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Sep 30 21:03:09 compute-0 podman[127438]: 2025-09-30 21:03:09.464324062 +0000 UTC m=+0.051407411 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_metadata_agent, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Sep 30 21:03:09 compute-0 dbus-broker-launch[760]: Noticed file-system modification, trigger reload.
Sep 30 21:03:09 compute-0 dbus-broker-launch[760]: Noticed file-system modification, trigger reload.
Sep 30 21:03:10 compute-0 groupadd[127476]: group added to /etc/group: name=clevis, GID=991
Sep 30 21:03:10 compute-0 groupadd[127476]: group added to /etc/gshadow: name=clevis
Sep 30 21:03:10 compute-0 groupadd[127476]: new group: name=clevis, GID=991
Sep 30 21:03:10 compute-0 useradd[127483]: new user: name=clevis, UID=991, GID=991, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Sep 30 21:03:10 compute-0 usermod[127493]: add 'clevis' to group 'tss'
Sep 30 21:03:10 compute-0 usermod[127493]: add 'clevis' to shadow group 'tss'
Sep 30 21:03:12 compute-0 polkitd[6535]: Reloading rules
Sep 30 21:03:12 compute-0 polkitd[6535]: Collecting garbage unconditionally...
Sep 30 21:03:12 compute-0 polkitd[6535]: Loading rules from directory /etc/polkit-1/rules.d
Sep 30 21:03:12 compute-0 polkitd[6535]: Loading rules from directory /usr/share/polkit-1/rules.d
Sep 30 21:03:12 compute-0 polkitd[6535]: Finished loading, compiling and executing 4 rules
Sep 30 21:03:12 compute-0 polkitd[6535]: Reloading rules
Sep 30 21:03:12 compute-0 polkitd[6535]: Collecting garbage unconditionally...
Sep 30 21:03:12 compute-0 polkitd[6535]: Loading rules from directory /etc/polkit-1/rules.d
Sep 30 21:03:12 compute-0 polkitd[6535]: Loading rules from directory /usr/share/polkit-1/rules.d
Sep 30 21:03:12 compute-0 polkitd[6535]: Finished loading, compiling and executing 4 rules
Sep 30 21:03:14 compute-0 groupadd[127680]: group added to /etc/group: name=ceph, GID=167
Sep 30 21:03:14 compute-0 groupadd[127680]: group added to /etc/gshadow: name=ceph
Sep 30 21:03:14 compute-0 groupadd[127680]: new group: name=ceph, GID=167
Sep 30 21:03:14 compute-0 useradd[127686]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Sep 30 21:03:17 compute-0 systemd[1]: Stopping OpenSSH server daemon...
Sep 30 21:03:17 compute-0 sshd[1008]: Received signal 15; terminating.
Sep 30 21:03:17 compute-0 systemd[1]: sshd.service: Deactivated successfully.
Sep 30 21:03:17 compute-0 systemd[1]: sshd.service: Unit process 109745 (sshd-session) remains running after unit stopped.
Sep 30 21:03:17 compute-0 systemd[1]: sshd.service: Unit process 109829 (sshd-session) remains running after unit stopped.
Sep 30 21:03:17 compute-0 systemd[1]: Stopped OpenSSH server daemon.
Sep 30 21:03:17 compute-0 systemd[1]: sshd.service: Consumed 9.859s CPU time, 17.0M memory peak, read 0B from disk, written 120.0K to disk.
Sep 30 21:03:17 compute-0 systemd[1]: Stopped target sshd-keygen.target.
Sep 30 21:03:17 compute-0 systemd[1]: Stopping sshd-keygen.target...
Sep 30 21:03:17 compute-0 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Sep 30 21:03:17 compute-0 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Sep 30 21:03:17 compute-0 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Sep 30 21:03:17 compute-0 systemd[1]: Reached target sshd-keygen.target.
Sep 30 21:03:17 compute-0 systemd[1]: Starting OpenSSH server daemon...
Sep 30 21:03:17 compute-0 sshd[128205]: Server listening on 0.0.0.0 port 22.
Sep 30 21:03:17 compute-0 sshd[128205]: Server listening on :: port 22.
Sep 30 21:03:17 compute-0 systemd[1]: Started OpenSSH server daemon.
Sep 30 21:03:19 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Sep 30 21:03:19 compute-0 systemd[1]: Starting man-db-cache-update.service...
Sep 30 21:03:19 compute-0 systemd[1]: Reloading.
Sep 30 21:03:19 compute-0 systemd-rc-local-generator[128464]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:03:19 compute-0 systemd-sysv-generator[128469]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:03:20 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Sep 30 21:03:21 compute-0 systemd[1]: Starting PackageKit Daemon...
Sep 30 21:03:21 compute-0 PackageKit[130622]: daemon start
Sep 30 21:03:21 compute-0 systemd[1]: Started PackageKit Daemon.
Sep 30 21:03:22 compute-0 sudo[110305]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:23 compute-0 sudo[132265]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsvmuwfabppvepqnguyxyjtixolbojmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266202.5399354-973-108473709272337/AnsiballZ_systemd.py'
Sep 30 21:03:23 compute-0 sudo[132265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:23 compute-0 python3.9[132294]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Sep 30 21:03:23 compute-0 systemd[1]: Reloading.
Sep 30 21:03:23 compute-0 systemd-rc-local-generator[132727]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:03:23 compute-0 systemd-sysv-generator[132733]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:03:23 compute-0 sudo[132265]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:24 compute-0 sudo[133559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhdmosfakgwpiyryrxrctlyplxbwmmxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266204.0305488-973-26438060218784/AnsiballZ_systemd.py'
Sep 30 21:03:24 compute-0 sudo[133559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:24 compute-0 python3.9[133588]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Sep 30 21:03:24 compute-0 systemd[1]: Reloading.
Sep 30 21:03:24 compute-0 systemd-sysv-generator[134096]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:03:24 compute-0 systemd-rc-local-generator[134092]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:03:24 compute-0 sudo[133559]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:25 compute-0 sudo[134828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqeofpmnqydnpdcpuhhxrctetzwcvxch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266205.151889-973-114875249121062/AnsiballZ_systemd.py'
Sep 30 21:03:25 compute-0 sudo[134828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:25 compute-0 python3.9[134851]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Sep 30 21:03:25 compute-0 systemd[1]: Reloading.
Sep 30 21:03:25 compute-0 systemd-rc-local-generator[135225]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:03:25 compute-0 systemd-sysv-generator[135232]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:03:26 compute-0 sudo[134828]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:26 compute-0 sudo[135983]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fibkrluyhciuvjrfmonvqqkbmapjtgqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266206.26101-973-201678440972553/AnsiballZ_systemd.py'
Sep 30 21:03:26 compute-0 sudo[135983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:26 compute-0 python3.9[136000]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Sep 30 21:03:26 compute-0 systemd[1]: Reloading.
Sep 30 21:03:26 compute-0 systemd-rc-local-generator[136425]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:03:26 compute-0 systemd-sysv-generator[136430]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:03:27 compute-0 sudo[135983]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:28 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Sep 30 21:03:28 compute-0 systemd[1]: Finished man-db-cache-update.service.
Sep 30 21:03:28 compute-0 systemd[1]: man-db-cache-update.service: Consumed 11.148s CPU time.
Sep 30 21:03:28 compute-0 systemd[1]: run-r89a18af477ee46eaae8762c258ba10ba.service: Deactivated successfully.
Sep 30 21:03:30 compute-0 sudo[137623]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yslqgzlhqtktisodqmlbnpnzdokfqbud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266209.7270358-1060-68184886048912/AnsiballZ_systemd.py'
Sep 30 21:03:30 compute-0 sudo[137623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:30 compute-0 python3.9[137625]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 21:03:30 compute-0 systemd[1]: Reloading.
Sep 30 21:03:30 compute-0 systemd-rc-local-generator[137655]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:03:30 compute-0 systemd-sysv-generator[137658]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:03:30 compute-0 sudo[137623]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:31 compute-0 sudo[137812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sveczvqlplbhkdrnuvjgmiinaadudmnl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266210.9209116-1060-238058311816072/AnsiballZ_systemd.py'
Sep 30 21:03:31 compute-0 sudo[137812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:31 compute-0 python3.9[137814]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 21:03:31 compute-0 systemd[1]: Reloading.
Sep 30 21:03:31 compute-0 systemd-sysv-generator[137847]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:03:31 compute-0 systemd-rc-local-generator[137843]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:03:31 compute-0 sudo[137812]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:32 compute-0 sudo[138002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttilatwmydcvtkyefsvnxcyyqvfnltff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266212.0261545-1060-21956951152738/AnsiballZ_systemd.py'
Sep 30 21:03:32 compute-0 sudo[138002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:32 compute-0 python3.9[138004]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 21:03:32 compute-0 systemd[1]: Reloading.
Sep 30 21:03:32 compute-0 systemd-sysv-generator[138035]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:03:32 compute-0 systemd-rc-local-generator[138031]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:03:33 compute-0 sudo[138002]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:33 compute-0 sudo[138192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rswdxcywkivvjvcezfslspsdjzppstfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266213.2941535-1060-92715631938581/AnsiballZ_systemd.py'
Sep 30 21:03:33 compute-0 sudo[138192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:33 compute-0 python3.9[138194]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 21:03:34 compute-0 sudo[138192]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:34 compute-0 sudo[138347]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whpdadvqnyavojgsfyorkupwvessruxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266214.2342484-1060-200849398449359/AnsiballZ_systemd.py'
Sep 30 21:03:34 compute-0 sudo[138347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:34 compute-0 python3.9[138349]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 21:03:35 compute-0 systemd[1]: Reloading.
Sep 30 21:03:35 compute-0 systemd-rc-local-generator[138381]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:03:35 compute-0 systemd-sysv-generator[138386]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:03:35 compute-0 sudo[138347]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:36 compute-0 podman[138419]: 2025-09-30 21:03:36.384338195 +0000 UTC m=+0.108478250 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2)
Sep 30 21:03:36 compute-0 sudo[138563]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sylqksjxujvfgyphyhpirxrukgmyguev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266216.270672-1168-39916563625449/AnsiballZ_systemd.py'
Sep 30 21:03:36 compute-0 sudo[138563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:36 compute-0 python3.9[138565]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Sep 30 21:03:37 compute-0 systemd[1]: Reloading.
Sep 30 21:03:37 compute-0 systemd-rc-local-generator[138595]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:03:37 compute-0 systemd-sysv-generator[138599]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:03:37 compute-0 systemd[1]: Listening on libvirt proxy daemon socket.
Sep 30 21:03:37 compute-0 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Sep 30 21:03:37 compute-0 sudo[138563]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:37 compute-0 sudo[138756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zflcbktrqstwglmmozxdafvrfneptxar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266217.6009316-1192-46084809616690/AnsiballZ_systemd.py'
Sep 30 21:03:37 compute-0 sudo[138756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:38 compute-0 python3.9[138758]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 21:03:38 compute-0 sudo[138756]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:03:38.702 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:03:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:03:38.704 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:03:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:03:38.704 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:03:38 compute-0 sudo[138911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gyjxuxmcnjvoovhewwcychwgssazyfeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266218.5389822-1192-203930408618301/AnsiballZ_systemd.py'
Sep 30 21:03:38 compute-0 sudo[138911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:39 compute-0 python3.9[138913]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 21:03:39 compute-0 sudo[138911]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:39 compute-0 sudo[139077]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzbhmjibhujchxcdxdsxpyaumrjcpcty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266219.4435093-1192-2163616092196/AnsiballZ_systemd.py'
Sep 30 21:03:39 compute-0 sudo[139077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:39 compute-0 podman[139040]: 2025-09-30 21:03:39.845539802 +0000 UTC m=+0.081659278 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 21:03:40 compute-0 python3.9[139085]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 21:03:40 compute-0 sudo[139077]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:40 compute-0 sudo[139240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmippdkylbzmnjzxuuudtxdknwetfakj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266220.3906393-1192-63593928209859/AnsiballZ_systemd.py'
Sep 30 21:03:40 compute-0 sudo[139240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:41 compute-0 python3.9[139242]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 21:03:41 compute-0 sudo[139240]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:41 compute-0 sudo[139395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-byesdoghtwzjjhramlguuruhgimehviq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266221.3671663-1192-139749462204432/AnsiballZ_systemd.py'
Sep 30 21:03:41 compute-0 sudo[139395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:42 compute-0 python3.9[139397]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 21:03:42 compute-0 sudo[139395]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:42 compute-0 sudo[139550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kilkfonnzfffvznifuewyynbovbxhwur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266222.3129234-1192-84341145028390/AnsiballZ_systemd.py'
Sep 30 21:03:42 compute-0 sudo[139550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:42 compute-0 python3.9[139552]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 21:03:43 compute-0 sudo[139550]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:43 compute-0 sudo[139705]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dybiiuxnwbjiewheaqepvldwbclooigq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266223.170366-1192-209995533282387/AnsiballZ_systemd.py'
Sep 30 21:03:43 compute-0 sudo[139705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:43 compute-0 python3.9[139707]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 21:03:43 compute-0 sudo[139705]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:44 compute-0 sudo[139860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txqifybcwslutnyxzeoaxiqbkzvhvded ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266224.00531-1192-31013831289494/AnsiballZ_systemd.py'
Sep 30 21:03:44 compute-0 sudo[139860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:44 compute-0 python3.9[139862]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 21:03:44 compute-0 sudo[139860]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:45 compute-0 sudo[140015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqdqjbmgweducguideemaqglystuhywj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266224.8973365-1192-202650402142760/AnsiballZ_systemd.py'
Sep 30 21:03:45 compute-0 sudo[140015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:45 compute-0 python3.9[140017]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 21:03:45 compute-0 sudo[140015]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:46 compute-0 sudo[140170]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okiychzfxkdokeebgozifhfxdngmbttr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266225.8807807-1192-107565705671599/AnsiballZ_systemd.py'
Sep 30 21:03:46 compute-0 sudo[140170]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:46 compute-0 python3.9[140172]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 21:03:46 compute-0 sudo[140170]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:47 compute-0 sudo[140325]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvvixiidainrqnwctkdknvmxxonydqnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266226.778002-1192-43710847235752/AnsiballZ_systemd.py'
Sep 30 21:03:47 compute-0 sudo[140325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:47 compute-0 python3.9[140327]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 21:03:47 compute-0 sudo[140325]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:48 compute-0 sudo[140480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdhftxsvroqftfslmjmutprtsqqpxohk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266227.646172-1192-15911095558360/AnsiballZ_systemd.py'
Sep 30 21:03:48 compute-0 sudo[140480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:48 compute-0 python3.9[140482]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 21:03:48 compute-0 sudo[140480]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:49 compute-0 sudo[140635]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsnsaxkojjcacqmcljenmcrumiqhpehv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266228.6596575-1192-177207426179339/AnsiballZ_systemd.py'
Sep 30 21:03:49 compute-0 sudo[140635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:49 compute-0 python3.9[140637]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 21:03:49 compute-0 sudo[140635]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:50 compute-0 sudo[140790]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xobqvpelhedcicrrriuamewgcthugkbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266229.6652963-1192-23762761937800/AnsiballZ_systemd.py'
Sep 30 21:03:50 compute-0 sudo[140790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:50 compute-0 python3.9[140792]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 21:03:50 compute-0 sudo[140790]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:51 compute-0 sudo[140945]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfyaynxspayhkryybvwmurixxidmhohf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266230.997344-1498-46984028576402/AnsiballZ_file.py'
Sep 30 21:03:51 compute-0 sudo[140945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:51 compute-0 python3.9[140947]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:03:51 compute-0 sudo[140945]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:52 compute-0 sudo[141097]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whbhsdibqcxtuzmghzpdqgbmbayhssjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266231.7836862-1498-239823420929965/AnsiballZ_file.py'
Sep 30 21:03:52 compute-0 sudo[141097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:52 compute-0 python3.9[141099]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:03:52 compute-0 sudo[141097]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:52 compute-0 sudo[141249]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkgfdhpdejqgvsinbygnketbxgofwhvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266232.5934806-1498-113243646100377/AnsiballZ_file.py'
Sep 30 21:03:52 compute-0 sudo[141249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:53 compute-0 python3.9[141251]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:03:53 compute-0 sudo[141249]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:53 compute-0 sudo[141402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqvzvjbejosamhxqhpdovpyfcyjokcbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266233.3661296-1498-165284750848136/AnsiballZ_file.py'
Sep 30 21:03:53 compute-0 sudo[141402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:53 compute-0 python3.9[141404]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:03:53 compute-0 sudo[141402]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:54 compute-0 sudo[141555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmwerfcnnehzbdpgtfekzkomqgoafsjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266234.2739627-1498-85909384725562/AnsiballZ_file.py'
Sep 30 21:03:54 compute-0 sudo[141555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:54 compute-0 python3.9[141557]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:03:54 compute-0 sudo[141555]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:54 compute-0 unix_chkpwd[141558]: password check failed for user (root)
Sep 30 21:03:54 compute-0 sshd-session[141252]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.95.116  user=root
Sep 30 21:03:55 compute-0 sudo[141708]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skrsqgchdbxegowfrakvwimnsqwwdtaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266235.0934274-1498-249087931863472/AnsiballZ_file.py'
Sep 30 21:03:55 compute-0 sudo[141708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:55 compute-0 python3.9[141710]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:03:55 compute-0 sudo[141708]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:56 compute-0 sudo[141860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwkdppwogfbiankeqiwngjtxqwtjxrbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266235.895518-1627-71448551727530/AnsiballZ_stat.py'
Sep 30 21:03:56 compute-0 sudo[141860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:56 compute-0 python3.9[141862]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:03:56 compute-0 sudo[141860]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:56 compute-0 sshd-session[141252]: Failed password for root from 80.94.95.116 port 62974 ssh2
Sep 30 21:03:57 compute-0 sudo[141985]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypwixoevlaxtdloivcylupgnlycljlpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266235.895518-1627-71448551727530/AnsiballZ_copy.py'
Sep 30 21:03:57 compute-0 sudo[141985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:57 compute-0 python3.9[141987]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759266235.895518-1627-71448551727530/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:03:57 compute-0 sudo[141985]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:58 compute-0 sudo[142137]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qiwbmrvvsdnwochouxiwzxhlupylqyos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266237.6731238-1627-127492423513027/AnsiballZ_stat.py'
Sep 30 21:03:58 compute-0 sudo[142137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:58 compute-0 python3.9[142139]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:03:58 compute-0 sudo[142137]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:58 compute-0 sshd-session[141252]: Connection closed by authenticating user root 80.94.95.116 port 62974 [preauth]
Sep 30 21:03:58 compute-0 sudo[142262]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-woigjdmhzkrnzhbokyksqpxqoozultrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266237.6731238-1627-127492423513027/AnsiballZ_copy.py'
Sep 30 21:03:58 compute-0 sudo[142262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:58 compute-0 python3.9[142264]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759266237.6731238-1627-127492423513027/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:03:59 compute-0 sudo[142262]: pam_unix(sudo:session): session closed for user root
Sep 30 21:03:59 compute-0 sudo[142414]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvagttihvgnwgxxwzjwwinrsanrhotvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266239.1717787-1627-79569664242023/AnsiballZ_stat.py'
Sep 30 21:03:59 compute-0 sudo[142414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:03:59 compute-0 python3.9[142416]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:03:59 compute-0 sudo[142414]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:00 compute-0 sudo[142539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqwtnoxhnwmymwwdclrukcbbsttpemcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266239.1717787-1627-79569664242023/AnsiballZ_copy.py'
Sep 30 21:04:00 compute-0 sudo[142539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:00 compute-0 python3.9[142541]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759266239.1717787-1627-79569664242023/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:00 compute-0 sudo[142539]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:00 compute-0 sudo[142691]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csmpsnaztgphekprflpwfximnlrfiulq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266240.6249094-1627-96803351729254/AnsiballZ_stat.py'
Sep 30 21:04:00 compute-0 sudo[142691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:01 compute-0 python3.9[142693]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:04:01 compute-0 sudo[142691]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:01 compute-0 sudo[142816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knzegdstgpwzirnkjwzqmoriaseyxrue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266240.6249094-1627-96803351729254/AnsiballZ_copy.py'
Sep 30 21:04:01 compute-0 sudo[142816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:01 compute-0 python3.9[142818]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759266240.6249094-1627-96803351729254/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:01 compute-0 sudo[142816]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:02 compute-0 sudo[142968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vaclofcdcptndhzkqmyhztlpkppjpbuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266241.9357564-1627-181211280412786/AnsiballZ_stat.py'
Sep 30 21:04:02 compute-0 sudo[142968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:02 compute-0 python3.9[142970]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:04:02 compute-0 sudo[142968]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:02 compute-0 sudo[143093]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwewwzzblmhghazqtwbjxupvmbtuydzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266241.9357564-1627-181211280412786/AnsiballZ_copy.py'
Sep 30 21:04:02 compute-0 sudo[143093]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:03 compute-0 python3.9[143095]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759266241.9357564-1627-181211280412786/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:03 compute-0 sudo[143093]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:03 compute-0 sudo[143245]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zknxaybmlkaukcteliigvvyajpjdnvlv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266243.3114693-1627-120284868821834/AnsiballZ_stat.py'
Sep 30 21:04:03 compute-0 sudo[143245]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:03 compute-0 python3.9[143247]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:04:03 compute-0 sudo[143245]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:04 compute-0 sudo[143370]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmeclydolcacdrckweuvzxtchnmiqiok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266243.3114693-1627-120284868821834/AnsiballZ_copy.py'
Sep 30 21:04:04 compute-0 sudo[143370]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:04 compute-0 python3.9[143372]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759266243.3114693-1627-120284868821834/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:04 compute-0 sudo[143370]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:05 compute-0 sudo[143522]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtyrkzaqixgczabauzoyxcmwnombbmdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266244.7930558-1627-47385349021858/AnsiballZ_stat.py'
Sep 30 21:04:05 compute-0 sudo[143522]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:05 compute-0 python3.9[143524]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:04:05 compute-0 sudo[143522]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:05 compute-0 sudo[143645]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrglcnvxxjzodnojkiuwjyasoxjrhuyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266244.7930558-1627-47385349021858/AnsiballZ_copy.py'
Sep 30 21:04:05 compute-0 sudo[143645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:06 compute-0 python3.9[143647]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759266244.7930558-1627-47385349021858/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:06 compute-0 sudo[143645]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:06 compute-0 sudo[143807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtchnzttikzyderlklpftbrkiegxbpgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266246.245115-1627-94178600353482/AnsiballZ_stat.py'
Sep 30 21:04:06 compute-0 sudo[143807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:06 compute-0 podman[143771]: 2025-09-30 21:04:06.683173831 +0000 UTC m=+0.095836188 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 21:04:06 compute-0 python3.9[143819]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:04:06 compute-0 sudo[143807]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:07 compute-0 sudo[143949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpyynspgcuqcuzcpylpgrfklqgahsrqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266246.245115-1627-94178600353482/AnsiballZ_copy.py'
Sep 30 21:04:07 compute-0 sudo[143949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:07 compute-0 python3.9[143951]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759266246.245115-1627-94178600353482/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:07 compute-0 sudo[143949]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:08 compute-0 sudo[144101]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcgmdbpxrfvggjrjqkahbbcrxxmtiueq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266248.0020888-1966-31717013484446/AnsiballZ_command.py'
Sep 30 21:04:08 compute-0 sudo[144101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:08 compute-0 python3.9[144103]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Sep 30 21:04:08 compute-0 sudo[144101]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:09 compute-0 sudo[144254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zucnnabiszrvucvomiyxxbgvfosjztzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266248.8541324-1993-25161092651955/AnsiballZ_file.py'
Sep 30 21:04:09 compute-0 sudo[144254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:09 compute-0 python3.9[144256]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:09 compute-0 sudo[144254]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:09 compute-0 sudo[144406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unspdrddrertacqakyjyflwqldqbvuuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266249.590727-1993-245285923870693/AnsiballZ_file.py'
Sep 30 21:04:09 compute-0 sudo[144406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:10 compute-0 podman[144408]: 2025-09-30 21:04:10.010989023 +0000 UTC m=+0.071042856 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250923, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 21:04:10 compute-0 python3.9[144409]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:10 compute-0 sudo[144406]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:10 compute-0 sudo[144578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evxyjgetqfjpqkoabvqbhdjsvokwrmjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266250.321883-1993-96606368125596/AnsiballZ_file.py'
Sep 30 21:04:10 compute-0 sudo[144578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:10 compute-0 python3.9[144580]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:10 compute-0 sudo[144578]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:11 compute-0 sudo[144730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aglcckxgnzfmhfkjykotmokipxipahwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266251.0206873-1993-240235570953025/AnsiballZ_file.py'
Sep 30 21:04:11 compute-0 sudo[144730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:11 compute-0 python3.9[144732]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:11 compute-0 sudo[144730]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:12 compute-0 sudo[144882]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfghauzuspwzejoyweejezvmepbshzla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266251.7371736-1993-31778671271010/AnsiballZ_file.py'
Sep 30 21:04:12 compute-0 sudo[144882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:12 compute-0 python3.9[144884]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:12 compute-0 sudo[144882]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:12 compute-0 sudo[145034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uctqtqndnspbrdhcaqitpgpeghcpzbyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266252.4311235-1993-54116691518662/AnsiballZ_file.py'
Sep 30 21:04:12 compute-0 sudo[145034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:12 compute-0 python3.9[145036]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:13 compute-0 sudo[145034]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:13 compute-0 sudo[145186]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgtbbypqxvrgfabglpqjhegqvtwxdbcs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266253.1757736-1993-65212469774547/AnsiballZ_file.py'
Sep 30 21:04:13 compute-0 sudo[145186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:13 compute-0 python3.9[145188]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:13 compute-0 sudo[145186]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:14 compute-0 sudo[145338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhceecquikmnomwkrkuxhudzjgagngom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266253.8872726-1993-237338674702508/AnsiballZ_file.py'
Sep 30 21:04:14 compute-0 sudo[145338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:14 compute-0 python3.9[145340]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:14 compute-0 sudo[145338]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:14 compute-0 sudo[145490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hldivumxkllkorbbwamdkftddutqerrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266254.6026545-1993-241653934891697/AnsiballZ_file.py'
Sep 30 21:04:14 compute-0 sudo[145490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:15 compute-0 python3.9[145492]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:15 compute-0 sudo[145490]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:15 compute-0 sudo[145642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvtpzqweezipqkfteduyqxaqyssdqaju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266255.258885-1993-56749109350164/AnsiballZ_file.py'
Sep 30 21:04:15 compute-0 sudo[145642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:15 compute-0 python3.9[145644]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:15 compute-0 sudo[145642]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:16 compute-0 sudo[145794]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfqkxcjuklvoyzzihczzrnvvveuhxetb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266255.9587162-1993-98654463209865/AnsiballZ_file.py'
Sep 30 21:04:16 compute-0 sudo[145794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:16 compute-0 python3.9[145796]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:16 compute-0 sudo[145794]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:17 compute-0 sudo[145946]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjgbvwbkkwdrirvpurqttoalxgpijnbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266256.954816-1993-231276932765239/AnsiballZ_file.py'
Sep 30 21:04:17 compute-0 sudo[145946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:17 compute-0 python3.9[145948]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:17 compute-0 sudo[145946]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:17 compute-0 sudo[146098]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntzgxbdrmvbrqtyivnpdohrohytithex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266257.6322434-1993-211753228535823/AnsiballZ_file.py'
Sep 30 21:04:17 compute-0 sudo[146098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:18 compute-0 python3.9[146100]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:18 compute-0 sudo[146098]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:18 compute-0 sudo[146250]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrehmrpelurmlvkiwfhjxlfmzkjtsvuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266258.2989948-1993-143573204230426/AnsiballZ_file.py'
Sep 30 21:04:18 compute-0 sudo[146250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:18 compute-0 python3.9[146252]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:18 compute-0 sudo[146250]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:19 compute-0 sudo[146402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xeysiwqemvmfmanenntrhtwuksbifpqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266259.4464571-2290-24448054805227/AnsiballZ_stat.py'
Sep 30 21:04:19 compute-0 sudo[146402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:19 compute-0 python3.9[146404]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:04:19 compute-0 sudo[146402]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:20 compute-0 sudo[146525]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-watqsxvkwweadthfzhajggzdmwrcxjom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266259.4464571-2290-24448054805227/AnsiballZ_copy.py'
Sep 30 21:04:20 compute-0 sudo[146525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:20 compute-0 python3.9[146527]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759266259.4464571-2290-24448054805227/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:20 compute-0 sudo[146525]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:21 compute-0 sudo[146677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmmduotdntfwrrosvfwnuyxfukjaoknt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266260.7813065-2290-160592381824862/AnsiballZ_stat.py'
Sep 30 21:04:21 compute-0 sudo[146677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:21 compute-0 python3.9[146679]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:04:21 compute-0 sudo[146677]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:21 compute-0 sudo[146800]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqotzdfuvbxvtstnucpvojlhinvleblr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266260.7813065-2290-160592381824862/AnsiballZ_copy.py'
Sep 30 21:04:21 compute-0 sudo[146800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:22 compute-0 python3.9[146802]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759266260.7813065-2290-160592381824862/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:22 compute-0 sudo[146800]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:22 compute-0 sudo[146952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bovnfxrxybygboatecqwbbdzrhjjkayg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266262.2298777-2290-39883689099251/AnsiballZ_stat.py'
Sep 30 21:04:22 compute-0 sudo[146952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:22 compute-0 python3.9[146954]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:04:22 compute-0 sudo[146952]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:23 compute-0 sudo[147075]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcfnuvxeoxfkckxgevvsvrjeupgmoemw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266262.2298777-2290-39883689099251/AnsiballZ_copy.py'
Sep 30 21:04:23 compute-0 sudo[147075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:23 compute-0 python3.9[147077]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759266262.2298777-2290-39883689099251/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:23 compute-0 sudo[147075]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:24 compute-0 sudo[147227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtyscrkubhauexeelhkkbbziwfsmffzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266263.5921495-2290-98607642131809/AnsiballZ_stat.py'
Sep 30 21:04:24 compute-0 sudo[147227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:24 compute-0 python3.9[147229]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:04:24 compute-0 sudo[147227]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:24 compute-0 sudo[147350]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxzqfzmoekvfkdxjviemzhpizvaivxtp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266263.5921495-2290-98607642131809/AnsiballZ_copy.py'
Sep 30 21:04:24 compute-0 sudo[147350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:24 compute-0 python3.9[147352]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759266263.5921495-2290-98607642131809/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:25 compute-0 sudo[147350]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:25 compute-0 sudo[147502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tadpnxxchrtkbarddhrjomeeakarhzsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266265.1663969-2290-50552906217287/AnsiballZ_stat.py'
Sep 30 21:04:25 compute-0 sudo[147502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:25 compute-0 python3.9[147504]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:04:25 compute-0 sudo[147502]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:26 compute-0 sudo[147625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-toatjlvdwvjstfbpmelislphztnfxmdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266265.1663969-2290-50552906217287/AnsiballZ_copy.py'
Sep 30 21:04:26 compute-0 sudo[147625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:26 compute-0 python3.9[147627]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759266265.1663969-2290-50552906217287/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:26 compute-0 sudo[147625]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:26 compute-0 sudo[147777]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vitanqlrmejhjuhjlrjkaskmdvkoevrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266266.4988072-2290-265108897039020/AnsiballZ_stat.py'
Sep 30 21:04:26 compute-0 sudo[147777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:27 compute-0 python3.9[147779]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:04:27 compute-0 sudo[147777]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:27 compute-0 sudo[147900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hheaanmvabehasthuzpbwwlnfmhoaong ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266266.4988072-2290-265108897039020/AnsiballZ_copy.py'
Sep 30 21:04:27 compute-0 sudo[147900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:27 compute-0 python3.9[147902]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759266266.4988072-2290-265108897039020/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:27 compute-0 sudo[147900]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:28 compute-0 sudo[148052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsavogfmpmxjlhposgolwvgelbadimyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266267.822247-2290-278571650846918/AnsiballZ_stat.py'
Sep 30 21:04:28 compute-0 sudo[148052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:28 compute-0 python3.9[148054]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:04:28 compute-0 sudo[148052]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:28 compute-0 sudo[148175]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgkifqorimzjhemwcimsqaeszobwcjeh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266267.822247-2290-278571650846918/AnsiballZ_copy.py'
Sep 30 21:04:28 compute-0 sudo[148175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:28 compute-0 python3.9[148177]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759266267.822247-2290-278571650846918/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:29 compute-0 sudo[148175]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:29 compute-0 sudo[148327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfbwxifyqlhphpuxhmtomikikxchflwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266269.1505427-2290-111201997581593/AnsiballZ_stat.py'
Sep 30 21:04:29 compute-0 sudo[148327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:29 compute-0 python3.9[148329]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:04:29 compute-0 sudo[148327]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:30 compute-0 sudo[148450]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqfzfkxouzhrprnpuofytywnmeykrlaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266269.1505427-2290-111201997581593/AnsiballZ_copy.py'
Sep 30 21:04:30 compute-0 sudo[148450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:30 compute-0 python3.9[148452]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759266269.1505427-2290-111201997581593/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:30 compute-0 sudo[148450]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:30 compute-0 sudo[148602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybsmwzgkgxdfhaykgfapgbcxwovwkial ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266270.4729984-2290-68559835784830/AnsiballZ_stat.py'
Sep 30 21:04:30 compute-0 sudo[148602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:31 compute-0 python3.9[148604]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:04:31 compute-0 sudo[148602]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:31 compute-0 sudo[148725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lysjjgolwyqingxxnvhvfhyktqbdbtdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266270.4729984-2290-68559835784830/AnsiballZ_copy.py'
Sep 30 21:04:31 compute-0 sudo[148725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:31 compute-0 python3.9[148727]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759266270.4729984-2290-68559835784830/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:31 compute-0 sudo[148725]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:32 compute-0 sudo[148877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gssiqqkbpfmvefozrgjoqvqiehpnjmzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266271.9759657-2290-22141944214258/AnsiballZ_stat.py'
Sep 30 21:04:32 compute-0 sudo[148877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:32 compute-0 python3.9[148879]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:04:32 compute-0 sudo[148877]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:32 compute-0 sudo[149000]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojjtnnkviilbpwjtcqdbzqbaofjkbqjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266271.9759657-2290-22141944214258/AnsiballZ_copy.py'
Sep 30 21:04:32 compute-0 sudo[149000]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:33 compute-0 python3.9[149002]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759266271.9759657-2290-22141944214258/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:33 compute-0 sudo[149000]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:33 compute-0 sudo[149152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ieewlsapnuxsljojrcvnpdxkfqxahups ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266273.2978501-2290-133038684002808/AnsiballZ_stat.py'
Sep 30 21:04:33 compute-0 sudo[149152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:33 compute-0 python3.9[149154]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:04:33 compute-0 sudo[149152]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:34 compute-0 sudo[149275]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyhaktungvypqhplodgulhxjlxppxuif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266273.2978501-2290-133038684002808/AnsiballZ_copy.py'
Sep 30 21:04:34 compute-0 sudo[149275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:34 compute-0 python3.9[149277]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759266273.2978501-2290-133038684002808/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:34 compute-0 sudo[149275]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:34 compute-0 sudo[149427]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnwsripmjrwvpkrmwaldulwppdgdrwxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266274.5505729-2290-234164607128239/AnsiballZ_stat.py'
Sep 30 21:04:34 compute-0 sudo[149427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:35 compute-0 python3.9[149429]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:04:35 compute-0 sudo[149427]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:35 compute-0 sudo[149550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbvjgexlcczpozsfbaryrzbemgaeccck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266274.5505729-2290-234164607128239/AnsiballZ_copy.py'
Sep 30 21:04:35 compute-0 sudo[149550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:35 compute-0 python3.9[149552]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759266274.5505729-2290-234164607128239/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:35 compute-0 sudo[149550]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:36 compute-0 sudo[149702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bywsfaimblylpekskdsxzstcxztzzvod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266276.0174756-2290-125901098997877/AnsiballZ_stat.py'
Sep 30 21:04:36 compute-0 sudo[149702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:36 compute-0 python3.9[149704]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:04:36 compute-0 sudo[149702]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:37 compute-0 sudo[149842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnycgriuwjiuicrrrjflgxcbiajvtpfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266276.0174756-2290-125901098997877/AnsiballZ_copy.py'
Sep 30 21:04:37 compute-0 sudo[149842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:37 compute-0 podman[149799]: 2025-09-30 21:04:37.233374082 +0000 UTC m=+0.112742913 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_controller)
Sep 30 21:04:37 compute-0 python3.9[149846]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759266276.0174756-2290-125901098997877/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:37 compute-0 sudo[149842]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:37 compute-0 sudo[150001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-codawtmtelurntkmafprmzatxzeyhxru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266277.5980082-2290-108392787833130/AnsiballZ_stat.py'
Sep 30 21:04:37 compute-0 sudo[150001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:38 compute-0 python3.9[150003]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:04:38 compute-0 sudo[150001]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:38 compute-0 sudo[150124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pezsjhbnvxmaxkpgdgblxnugxqqrsrlw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266277.5980082-2290-108392787833130/AnsiballZ_copy.py'
Sep 30 21:04:38 compute-0 sudo[150124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:04:38.704 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:04:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:04:38.704 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:04:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:04:38.704 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:04:38 compute-0 python3.9[150126]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759266277.5980082-2290-108392787833130/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:38 compute-0 sudo[150124]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:40 compute-0 python3.9[150276]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:04:40 compute-0 podman[150280]: 2025-09-30 21:04:40.348648984 +0000 UTC m=+0.080508541 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, 
tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Sep 30 21:04:41 compute-0 sudo[150448]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khxhcleznerhpwmvxchsyhwklwanodgr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266280.5146542-2908-192914405821508/AnsiballZ_seboolean.py'
Sep 30 21:04:41 compute-0 sudo[150448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:41 compute-0 python3.9[150450]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Sep 30 21:04:42 compute-0 sudo[150448]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:43 compute-0 sudo[150604]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqajddpruhkcfusvjzthlszibdisxcxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266282.7542117-2932-43683319002739/AnsiballZ_copy.py'
Sep 30 21:04:43 compute-0 dbus-broker-launch[778]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Sep 30 21:04:43 compute-0 sudo[150604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:43 compute-0 python3.9[150606]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:43 compute-0 sudo[150604]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:43 compute-0 sudo[150756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xaiunaqlvrrmxzeenjrswrujydwkkndq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266283.46107-2932-106827629676747/AnsiballZ_copy.py'
Sep 30 21:04:43 compute-0 sudo[150756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:43 compute-0 python3.9[150758]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:43 compute-0 sudo[150756]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:44 compute-0 sudo[150908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yskzasxbhneiwrddbhkqorxkjxpqkfmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266284.1537707-2932-220859220806083/AnsiballZ_copy.py'
Sep 30 21:04:44 compute-0 sudo[150908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:44 compute-0 python3.9[150910]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:44 compute-0 sudo[150908]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:45 compute-0 sudo[151060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfwrkqbyllheqmiybjgftzocouyxexcp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266284.8523366-2932-145279053055774/AnsiballZ_copy.py'
Sep 30 21:04:45 compute-0 sudo[151060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:45 compute-0 python3.9[151062]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:45 compute-0 sudo[151060]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:45 compute-0 sudo[151212]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgskivmnqpcaiyhowgpwiedbxhqfshkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266285.6415715-2932-94259441784905/AnsiballZ_copy.py'
Sep 30 21:04:45 compute-0 sudo[151212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:46 compute-0 python3.9[151214]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:46 compute-0 sudo[151212]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:46 compute-0 sudo[151364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrtxqwbusbnvlkjnfymmlkotkgehatbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266286.5600767-3040-12397559703419/AnsiballZ_copy.py'
Sep 30 21:04:46 compute-0 sudo[151364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:47 compute-0 python3.9[151366]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:47 compute-0 sudo[151364]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:47 compute-0 sudo[151516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-higzjjomwchwopuezvljlcdumktpukzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266287.360249-3040-196417393595411/AnsiballZ_copy.py'
Sep 30 21:04:47 compute-0 sudo[151516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:47 compute-0 python3.9[151518]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:47 compute-0 sudo[151516]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:48 compute-0 sudo[151668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdxllxuuiaqohhuacdqcwbugmbcxjcuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266288.1058552-3040-90337307702185/AnsiballZ_copy.py'
Sep 30 21:04:48 compute-0 sudo[151668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:48 compute-0 python3.9[151670]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:48 compute-0 sudo[151668]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:49 compute-0 sudo[151820]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzbrpvusbhyeuznxcigwhakyitrpkfjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266288.8928323-3040-205395430876363/AnsiballZ_copy.py'
Sep 30 21:04:49 compute-0 sudo[151820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:49 compute-0 python3.9[151822]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:49 compute-0 sudo[151820]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:49 compute-0 sudo[151972]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hajkxvzbwfuivilfqfnuyrgvnbvajfsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266289.5396833-3040-6538314097166/AnsiballZ_copy.py'
Sep 30 21:04:49 compute-0 sudo[151972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:50 compute-0 python3.9[151974]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:50 compute-0 sudo[151972]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:50 compute-0 sudo[152124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbvpruqfunxrdslnvvydfotdgpmzlcen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266290.3837976-3148-224517944095986/AnsiballZ_systemd.py'
Sep 30 21:04:50 compute-0 sudo[152124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:51 compute-0 python3.9[152126]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 21:04:51 compute-0 systemd[1]: Reloading.
Sep 30 21:04:51 compute-0 systemd-sysv-generator[152157]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:04:51 compute-0 systemd-rc-local-generator[152154]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:04:51 compute-0 systemd[1]: Starting libvirt logging daemon socket...
Sep 30 21:04:51 compute-0 systemd[1]: Listening on libvirt logging daemon socket.
Sep 30 21:04:51 compute-0 systemd[1]: Starting libvirt logging daemon admin socket...
Sep 30 21:04:51 compute-0 systemd[1]: Listening on libvirt logging daemon admin socket.
Sep 30 21:04:51 compute-0 systemd[1]: Starting libvirt logging daemon...
Sep 30 21:04:51 compute-0 systemd[1]: Started libvirt logging daemon.
Sep 30 21:04:51 compute-0 sudo[152124]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:52 compute-0 sudo[152317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzuklklpclefsabcykhnhziyictejdta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266291.775873-3148-127461056099673/AnsiballZ_systemd.py'
Sep 30 21:04:52 compute-0 sudo[152317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:52 compute-0 python3.9[152319]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 21:04:52 compute-0 systemd[1]: Reloading.
Sep 30 21:04:52 compute-0 systemd-rc-local-generator[152342]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:04:52 compute-0 systemd-sysv-generator[152348]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:04:52 compute-0 systemd[1]: Starting libvirt nodedev daemon socket...
Sep 30 21:04:52 compute-0 systemd[1]: Listening on libvirt nodedev daemon socket.
Sep 30 21:04:52 compute-0 systemd[1]: Starting libvirt nodedev daemon admin socket...
Sep 30 21:04:52 compute-0 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Sep 30 21:04:52 compute-0 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Sep 30 21:04:52 compute-0 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Sep 30 21:04:52 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Sep 30 21:04:52 compute-0 systemd[1]: Started libvirt nodedev daemon.
Sep 30 21:04:52 compute-0 sudo[152317]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:53 compute-0 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Sep 30 21:04:53 compute-0 sudo[152532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eipzmlbixvosznmerfainfmypucxxpun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266292.990763-3148-138915088557480/AnsiballZ_systemd.py'
Sep 30 21:04:53 compute-0 sudo[152532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:53 compute-0 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Sep 30 21:04:53 compute-0 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Sep 30 21:04:53 compute-0 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Sep 30 21:04:53 compute-0 python3.9[152534]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 21:04:53 compute-0 systemd[1]: Reloading.
Sep 30 21:04:53 compute-0 systemd-rc-local-generator[152566]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:04:53 compute-0 systemd-sysv-generator[152571]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:04:53 compute-0 systemd[1]: Starting libvirt proxy daemon admin socket...
Sep 30 21:04:53 compute-0 systemd[1]: Starting libvirt proxy daemon read-only socket...
Sep 30 21:04:53 compute-0 systemd[1]: Listening on libvirt proxy daemon admin socket.
Sep 30 21:04:53 compute-0 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Sep 30 21:04:53 compute-0 systemd[1]: Starting libvirt proxy daemon...
Sep 30 21:04:54 compute-0 systemd[1]: Started libvirt proxy daemon.
Sep 30 21:04:54 compute-0 sudo[152532]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:54 compute-0 setroubleshoot[152432]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l f58d3e6f-4406-4665-a496-d971d041805e
Sep 30 21:04:54 compute-0 setroubleshoot[152432]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Sep 30 21:04:54 compute-0 setroubleshoot[152432]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l f58d3e6f-4406-4665-a496-d971d041805e
Sep 30 21:04:54 compute-0 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 21:04:54 compute-0 sudo[152751]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmhpejpfvcdugtgjshmqkttxoosdaaky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266294.2275753-3148-262839991020436/AnsiballZ_systemd.py'
Sep 30 21:04:54 compute-0 sudo[152751]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:54 compute-0 setroubleshoot[152432]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Sep 30 21:04:54 compute-0 python3.9[152753]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 21:04:54 compute-0 systemd[1]: Reloading.
Sep 30 21:04:55 compute-0 systemd-sysv-generator[152783]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:04:55 compute-0 systemd-rc-local-generator[152780]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:04:55 compute-0 systemd[1]: Listening on libvirt locking daemon socket.
Sep 30 21:04:55 compute-0 systemd[1]: Starting libvirt QEMU daemon socket...
Sep 30 21:04:55 compute-0 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Sep 30 21:04:55 compute-0 systemd[1]: Starting Virtual Machine and Container Registration Service...
Sep 30 21:04:55 compute-0 systemd[1]: Listening on libvirt QEMU daemon socket.
Sep 30 21:04:55 compute-0 systemd[1]: Starting libvirt QEMU daemon admin socket...
Sep 30 21:04:55 compute-0 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Sep 30 21:04:55 compute-0 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Sep 30 21:04:55 compute-0 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Sep 30 21:04:55 compute-0 systemd[1]: Started Virtual Machine and Container Registration Service.
Sep 30 21:04:55 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Sep 30 21:04:55 compute-0 systemd[1]: Started libvirt QEMU daemon.
Sep 30 21:04:55 compute-0 sudo[152751]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:55 compute-0 sudo[152965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdxjgihxtheugxjibtoggcqtendkedob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266295.5588531-3148-47687168118296/AnsiballZ_systemd.py'
Sep 30 21:04:55 compute-0 sudo[152965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:56 compute-0 python3.9[152967]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 21:04:56 compute-0 systemd[1]: Reloading.
Sep 30 21:04:56 compute-0 systemd-rc-local-generator[152992]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:04:56 compute-0 systemd-sysv-generator[152998]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:04:56 compute-0 systemd[1]: Starting libvirt secret daemon socket...
Sep 30 21:04:56 compute-0 systemd[1]: Listening on libvirt secret daemon socket.
Sep 30 21:04:56 compute-0 systemd[1]: Starting libvirt secret daemon admin socket...
Sep 30 21:04:56 compute-0 systemd[1]: Starting libvirt secret daemon read-only socket...
Sep 30 21:04:56 compute-0 systemd[1]: Listening on libvirt secret daemon admin socket.
Sep 30 21:04:56 compute-0 systemd[1]: Listening on libvirt secret daemon read-only socket.
Sep 30 21:04:56 compute-0 systemd[1]: Starting libvirt secret daemon...
Sep 30 21:04:56 compute-0 systemd[1]: Started libvirt secret daemon.
Sep 30 21:04:56 compute-0 sudo[152965]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:58 compute-0 sudo[153175]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhzecwrfnhgsxxfrkprgtyignkxpzgyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266297.951235-3259-147168677113204/AnsiballZ_file.py'
Sep 30 21:04:58 compute-0 sudo[153175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:58 compute-0 python3.9[153177]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:04:58 compute-0 sudo[153175]: pam_unix(sudo:session): session closed for user root
Sep 30 21:04:59 compute-0 sudo[153327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugjvpdthjodaprohkndbyajiywvtolro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266298.8334568-3283-188077126281498/AnsiballZ_find.py'
Sep 30 21:04:59 compute-0 sudo[153327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:04:59 compute-0 python3.9[153329]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Sep 30 21:04:59 compute-0 sudo[153327]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:00 compute-0 sudo[153479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pddqzhdycrdijvaowyvvirhlpckjxxzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266300.005376-3325-188339984441323/AnsiballZ_stat.py'
Sep 30 21:05:00 compute-0 sudo[153479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:00 compute-0 python3.9[153481]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:05:00 compute-0 sudo[153479]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:00 compute-0 sudo[153602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uixpjykgbcrtfbzkwwirrtofkdpiyvgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266300.005376-3325-188339984441323/AnsiballZ_copy.py'
Sep 30 21:05:00 compute-0 sudo[153602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:01 compute-0 python3.9[153604]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1759266300.005376-3325-188339984441323/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:05:01 compute-0 sudo[153602]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:01 compute-0 sudo[153754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucqwurcowjzfhkjvlermrcvmmblqroeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266301.6342142-3373-149932498506450/AnsiballZ_file.py'
Sep 30 21:05:01 compute-0 sudo[153754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:02 compute-0 python3.9[153756]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:05:02 compute-0 sudo[153754]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:02 compute-0 sudo[153906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gghcrhsvqknywqfpzfgmtxozxzxafsdq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266302.641956-3397-214300633060820/AnsiballZ_stat.py'
Sep 30 21:05:02 compute-0 sudo[153906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:03 compute-0 python3.9[153908]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:05:03 compute-0 sudo[153906]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:03 compute-0 sudo[153984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcinffouzmvobggvsbipooqwxavkwopd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266302.641956-3397-214300633060820/AnsiballZ_file.py'
Sep 30 21:05:03 compute-0 sudo[153984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:03 compute-0 python3.9[153986]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:05:03 compute-0 sudo[153984]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:04 compute-0 sudo[154136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyznsjhjlddoogstmcoxshvyhwrlrrtz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266304.068282-3433-14809851414775/AnsiballZ_stat.py'
Sep 30 21:05:04 compute-0 sudo[154136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:04 compute-0 python3.9[154138]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:05:04 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Sep 30 21:05:04 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Consumed 1.042s CPU time.
Sep 30 21:05:04 compute-0 sudo[154136]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:04 compute-0 systemd[1]: setroubleshootd.service: Deactivated successfully.
Sep 30 21:05:04 compute-0 sudo[154214]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwpvamknwlcxyzhivgbwxhulyfruuisr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266304.068282-3433-14809851414775/AnsiballZ_file.py'
Sep 30 21:05:04 compute-0 sudo[154214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:05 compute-0 python3.9[154216]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.d1xw2d20 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:05:05 compute-0 sudo[154214]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:05 compute-0 sudo[154366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msdfaedonlrkbytwswwgztqndvadtaiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266305.448174-3469-171648159529752/AnsiballZ_stat.py'
Sep 30 21:05:05 compute-0 sudo[154366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:05 compute-0 python3.9[154368]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:05:05 compute-0 sudo[154366]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:06 compute-0 sudo[154444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmfezbqihrfbajwchqcjlzuoubootbpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266305.448174-3469-171648159529752/AnsiballZ_file.py'
Sep 30 21:05:06 compute-0 sudo[154444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:06 compute-0 python3.9[154446]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:05:06 compute-0 sudo[154444]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:07 compute-0 sudo[154611]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-koymkrvmsnxmpsnhjdtcnyyhqvthqwdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266306.951064-3508-115546499878350/AnsiballZ_command.py'
Sep 30 21:05:07 compute-0 sudo[154611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:07 compute-0 podman[154570]: 2025-09-30 21:05:07.505684303 +0000 UTC m=+0.118624594 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 21:05:07 compute-0 python3.9[154617]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:05:07 compute-0 sudo[154611]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:08 compute-0 sudo[154775]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmujqaprsnbxbfevnsuvtesbiwccagcb ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759266307.8877017-3532-145024786013431/AnsiballZ_edpm_nftables_from_files.py'
Sep 30 21:05:08 compute-0 sudo[154775]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:08 compute-0 python3[154777]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Sep 30 21:05:08 compute-0 sudo[154775]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:09 compute-0 sudo[154927]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxjlqqeomuaffsvyygljollxhpbhyxal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266308.8809156-3556-177908849392223/AnsiballZ_stat.py'
Sep 30 21:05:09 compute-0 sudo[154927]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:09 compute-0 python3.9[154929]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:05:09 compute-0 sudo[154927]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:09 compute-0 sudo[155005]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bahqlmbamdftjeviajvmyomunkqrgxks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266308.8809156-3556-177908849392223/AnsiballZ_file.py'
Sep 30 21:05:09 compute-0 sudo[155005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:10 compute-0 python3.9[155007]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:05:10 compute-0 sudo[155005]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:10 compute-0 sudo[155172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyipgwjunxpcnygefdxnauuyioctzblk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266310.3255162-3592-130587189866786/AnsiballZ_stat.py'
Sep 30 21:05:10 compute-0 sudo[155172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:10 compute-0 podman[155131]: 2025-09-30 21:05:10.678465414 +0000 UTC m=+0.057017895 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Sep 30 21:05:10 compute-0 python3.9[155178]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:05:10 compute-0 sudo[155172]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:11 compute-0 sudo[155254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptzyuftbelkfhfahgbgknmthfjgrqkjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266310.3255162-3592-130587189866786/AnsiballZ_file.py'
Sep 30 21:05:11 compute-0 sudo[155254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:11 compute-0 python3.9[155256]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:05:11 compute-0 sudo[155254]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:12 compute-0 sudo[155406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iyryvusctxxppxlsqbxuddbzzwrayeaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266311.6776383-3628-104681581246514/AnsiballZ_stat.py'
Sep 30 21:05:12 compute-0 sudo[155406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:12 compute-0 python3.9[155408]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:05:12 compute-0 sudo[155406]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:12 compute-0 sudo[155484]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrtpchcfufkphmkxmiilqijavdsavxgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266311.6776383-3628-104681581246514/AnsiballZ_file.py'
Sep 30 21:05:12 compute-0 sudo[155484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:12 compute-0 python3.9[155486]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:05:12 compute-0 sudo[155484]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:13 compute-0 sudo[155636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhplrwrmqwwsfjhfrfbtmwcjzjkxmylu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266313.1034613-3664-274752311239776/AnsiballZ_stat.py'
Sep 30 21:05:13 compute-0 sudo[155636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:13 compute-0 python3.9[155638]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:05:13 compute-0 sudo[155636]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:13 compute-0 sudo[155714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rslykhojbreerjosifhmmofsvahqdckp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266313.1034613-3664-274752311239776/AnsiballZ_file.py'
Sep 30 21:05:13 compute-0 sudo[155714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:14 compute-0 python3.9[155716]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:05:14 compute-0 sudo[155714]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:14 compute-0 sudo[155867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqfqdekqptgbgmafvqodlfvqwehtzwbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266314.3567953-3700-151350891784919/AnsiballZ_stat.py'
Sep 30 21:05:14 compute-0 sudo[155867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:15 compute-0 python3.9[155869]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:05:15 compute-0 sudo[155867]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:15 compute-0 sudo[155993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcjlvsnbfhvkrmuvtwfnilhfiahjaodv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266314.3567953-3700-151350891784919/AnsiballZ_copy.py'
Sep 30 21:05:15 compute-0 sudo[155993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:15 compute-0 python3.9[155995]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759266314.3567953-3700-151350891784919/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:05:15 compute-0 sudo[155993]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:16 compute-0 sudo[156145]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgwixqqwfkgfbxtxzksewvzqmlcatchp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266315.9143007-3745-219142325072522/AnsiballZ_file.py'
Sep 30 21:05:16 compute-0 sudo[156145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:16 compute-0 python3.9[156147]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:05:16 compute-0 sudo[156145]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:17 compute-0 sudo[156297]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osuidtpeuqnsgwwjfkqsesjndttpgazo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266316.7920501-3769-222622240545215/AnsiballZ_command.py'
Sep 30 21:05:17 compute-0 sudo[156297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:17 compute-0 python3.9[156299]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:05:17 compute-0 sudo[156297]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:18 compute-0 sudo[156452]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwurfqgksztmymfcalxtqzndlbmxjjjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266317.659689-3793-22056459743323/AnsiballZ_blockinfile.py'
Sep 30 21:05:18 compute-0 sudo[156452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:18 compute-0 python3.9[156454]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:05:18 compute-0 sudo[156452]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:19 compute-0 sudo[156604]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbjfrvgtlnbzyofbqffudubiddjrhoxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266318.8439794-3820-104336509539211/AnsiballZ_command.py'
Sep 30 21:05:19 compute-0 sudo[156604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:19 compute-0 python3.9[156606]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:05:19 compute-0 sudo[156604]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:20 compute-0 sudo[156757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckezdlkjwwvqvdzrpfbxxlgaqartiegc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266319.7272327-3844-279580829552999/AnsiballZ_stat.py'
Sep 30 21:05:20 compute-0 sudo[156757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:20 compute-0 python3.9[156759]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:05:20 compute-0 sudo[156757]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:20 compute-0 sudo[156911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atrfnrubzxploxrfifjsmdxlbqedynrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266320.5283983-3868-4492364815004/AnsiballZ_command.py'
Sep 30 21:05:20 compute-0 sudo[156911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:21 compute-0 python3.9[156913]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:05:21 compute-0 sudo[156911]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:21 compute-0 sudo[157066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dttmlhurohpxjiwhotumjmsudjicryax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266321.39339-3892-200979172486330/AnsiballZ_file.py'
Sep 30 21:05:21 compute-0 sudo[157066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:21 compute-0 python3.9[157068]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:05:21 compute-0 sudo[157066]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:22 compute-0 sudo[157218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nuawfxhflwypfbzgtnfinhkbxvynxcds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266322.231818-3916-32881812822385/AnsiballZ_stat.py'
Sep 30 21:05:22 compute-0 sudo[157218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:22 compute-0 python3.9[157220]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:05:22 compute-0 sudo[157218]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:23 compute-0 sudo[157341]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etdvtxwlifowxujpmgjcinswsxtpwvkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266322.231818-3916-32881812822385/AnsiballZ_copy.py'
Sep 30 21:05:23 compute-0 sudo[157341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:23 compute-0 python3.9[157343]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759266322.231818-3916-32881812822385/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:05:23 compute-0 sudo[157341]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:24 compute-0 sudo[157493]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdgkigkzveyveyixzwzbgqyfjutaohgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266323.7827964-3961-164327058725755/AnsiballZ_stat.py'
Sep 30 21:05:24 compute-0 sudo[157493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:24 compute-0 python3.9[157495]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:05:24 compute-0 sudo[157493]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:24 compute-0 sudo[157616]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbtthpglxfegziitbweocnginqrkwepz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266323.7827964-3961-164327058725755/AnsiballZ_copy.py'
Sep 30 21:05:24 compute-0 sudo[157616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:25 compute-0 python3.9[157618]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759266323.7827964-3961-164327058725755/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:05:25 compute-0 sudo[157616]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:25 compute-0 sudo[157768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfhvphbtngjpjkciqmsrluzaxzskackc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266325.3276713-4006-116853168482706/AnsiballZ_stat.py'
Sep 30 21:05:25 compute-0 sudo[157768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:25 compute-0 python3.9[157770]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:05:25 compute-0 sudo[157768]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:26 compute-0 sudo[157891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ietcyslblxdxdjacthqzqncthxevkdet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266325.3276713-4006-116853168482706/AnsiballZ_copy.py'
Sep 30 21:05:26 compute-0 sudo[157891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:26 compute-0 python3.9[157893]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759266325.3276713-4006-116853168482706/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:05:26 compute-0 sudo[157891]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:27 compute-0 sudo[158043]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smwepglyaebulufpkqxmtarqvqjgepec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266326.8619514-4051-3535323754516/AnsiballZ_systemd.py'
Sep 30 21:05:27 compute-0 sudo[158043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:27 compute-0 python3.9[158045]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:05:27 compute-0 systemd[1]: Reloading.
Sep 30 21:05:27 compute-0 systemd-rc-local-generator[158073]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:05:27 compute-0 systemd-sysv-generator[158076]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:05:27 compute-0 systemd[1]: Reached target edpm_libvirt.target.
Sep 30 21:05:27 compute-0 sudo[158043]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:28 compute-0 sudo[158234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjogtjbwoyciwovbcnzxaeaslolnpkau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266328.1956508-4075-151051927545564/AnsiballZ_systemd.py'
Sep 30 21:05:28 compute-0 sudo[158234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:28 compute-0 python3.9[158236]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Sep 30 21:05:28 compute-0 systemd[1]: Reloading.
Sep 30 21:05:28 compute-0 systemd-rc-local-generator[158265]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:05:28 compute-0 systemd-sysv-generator[158269]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:05:29 compute-0 systemd[1]: Reloading.
Sep 30 21:05:29 compute-0 systemd-sysv-generator[158304]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:05:29 compute-0 systemd-rc-local-generator[158299]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:05:29 compute-0 sudo[158234]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:29 compute-0 sshd-session[103988]: Connection closed by 192.168.122.30 port 35884
Sep 30 21:05:29 compute-0 sshd-session[103985]: pam_unix(sshd:session): session closed for user zuul
Sep 30 21:05:29 compute-0 systemd[1]: session-23.scope: Deactivated successfully.
Sep 30 21:05:29 compute-0 systemd[1]: session-23.scope: Consumed 3min 38.566s CPU time.
Sep 30 21:05:29 compute-0 systemd-logind[792]: Session 23 logged out. Waiting for processes to exit.
Sep 30 21:05:29 compute-0 systemd-logind[792]: Removed session 23.
Sep 30 21:05:35 compute-0 sshd-session[158334]: Accepted publickey for zuul from 192.168.122.30 port 49238 ssh2: ECDSA SHA256:SmCicXXyU0CyMnob1MNtb+B3Td3Ord5lbeuM/VGGA5o
Sep 30 21:05:35 compute-0 systemd-logind[792]: New session 24 of user zuul.
Sep 30 21:05:35 compute-0 systemd[1]: Started Session 24 of User zuul.
Sep 30 21:05:35 compute-0 sshd-session[158334]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 21:05:36 compute-0 python3.9[158487]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 21:05:37 compute-0 sudo[158658]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewdojvvjpdmrtghxhrgyspmdjdjogxpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266337.457977-67-48028249997939/AnsiballZ_file.py'
Sep 30 21:05:37 compute-0 sudo[158658]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:37 compute-0 podman[158615]: 2025-09-30 21:05:37.990138552 +0000 UTC m=+0.116997229 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20250923, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:05:38 compute-0 python3.9[158669]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:05:38 compute-0 sudo[158658]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:38 compute-0 sudo[158819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpqkcgripvwlcmxiwxybabbdrmxbofwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266338.3115323-67-153526850987523/AnsiballZ_file.py'
Sep 30 21:05:38 compute-0 sudo[158819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:05:38.705 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:05:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:05:38.706 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:05:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:05:38.706 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:05:38 compute-0 python3.9[158821]: ansible-ansible.builtin.file Invoked with path=/etc/target setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:05:38 compute-0 sudo[158819]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:39 compute-0 sudo[158971]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhvkgnoswanxqplueljelxyxayzsyruw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266339.0098464-67-229081149041389/AnsiballZ_file.py'
Sep 30 21:05:39 compute-0 sudo[158971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:39 compute-0 python3.9[158973]: ansible-ansible.builtin.file Invoked with path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:05:39 compute-0 sudo[158971]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:39 compute-0 sudo[159123]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwtkjfizwtqjpzrbuwfnuivaxvugoncv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266339.6361518-67-150491014342769/AnsiballZ_file.py'
Sep 30 21:05:39 compute-0 sudo[159123]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:40 compute-0 python3.9[159125]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/config-data selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Sep 30 21:05:40 compute-0 sudo[159123]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:40 compute-0 sudo[159275]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eawmksgporenkpalwzsbhwyoofrwxbtz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266340.370809-67-108810082656529/AnsiballZ_file.py'
Sep 30 21:05:40 compute-0 sudo[159275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:40 compute-0 python3.9[159277]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/config-data/ansible-generated/iscsid setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:05:40 compute-0 sudo[159275]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:41 compute-0 podman[159302]: 2025-09-30 21:05:41.341800959 +0000 UTC m=+0.071962240 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent)
Sep 30 21:05:42 compute-0 sudo[159446]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eplvxicusrcllummaaqzqctzzdndafjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266341.5651355-175-24735069518161/AnsiballZ_stat.py'
Sep 30 21:05:42 compute-0 sudo[159446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:42 compute-0 python3.9[159448]: ansible-ansible.builtin.stat Invoked with path=/lib/systemd/system/iscsid.socket follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:05:42 compute-0 sudo[159446]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:43 compute-0 sudo[159600]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zywzdfueceqimaytymgtrhctpmhbohxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266342.5871656-199-12713008822108/AnsiballZ_systemd.py'
Sep 30 21:05:43 compute-0 sudo[159600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:43 compute-0 python3.9[159602]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iscsid.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:05:43 compute-0 systemd[1]: Reloading.
Sep 30 21:05:43 compute-0 systemd-sysv-generator[159635]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:05:43 compute-0 systemd-rc-local-generator[159632]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:05:43 compute-0 sudo[159600]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:44 compute-0 sudo[159788]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vldhqfwqswndsbcmhbikcbkgugkysmtz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266344.1379268-223-192535913031040/AnsiballZ_service_facts.py'
Sep 30 21:05:44 compute-0 sudo[159788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:44 compute-0 python3.9[159790]: ansible-ansible.builtin.service_facts Invoked
Sep 30 21:05:44 compute-0 network[159807]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Sep 30 21:05:44 compute-0 network[159808]: 'network-scripts' will be removed from distribution in near future.
Sep 30 21:05:44 compute-0 network[159809]: It is advised to switch to 'NetworkManager' instead for network management.
Sep 30 21:05:48 compute-0 sudo[159788]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:50 compute-0 sudo[160080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjqapqbgwelhireeljboixkcpkpffzse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266350.1180341-247-38705061521200/AnsiballZ_systemd.py'
Sep 30 21:05:50 compute-0 sudo[160080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:50 compute-0 python3.9[160082]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iscsi-starter.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:05:50 compute-0 systemd[1]: Reloading.
Sep 30 21:05:50 compute-0 systemd-sysv-generator[160114]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:05:50 compute-0 systemd-rc-local-generator[160109]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:05:51 compute-0 sudo[160080]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:51 compute-0 python3.9[160270]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:05:52 compute-0 sudo[160420]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvhyysegxqdnybwjcnwvvpdqxabjwkeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266352.205968-298-178844957507623/AnsiballZ_podman_container.py'
Sep 30 21:05:52 compute-0 sudo[160420]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:53 compute-0 python3.9[160422]: ansible-containers.podman.podman_container Invoked with command=/usr/sbin/iscsi-iname detach=False image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified name=iscsid_config rm=True tty=True executable=podman state=started debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Sep 30 21:05:53 compute-0 podman[160459]: 2025-09-30 21:05:53.307346741 +0000 UTC m=+0.067474681 container create 613a66c900e909356e5c0b257e9314d795ca98144e7a5f49971663863ccf61ff (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid_config, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Sep 30 21:05:53 compute-0 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 21:05:53 compute-0 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 21:05:53 compute-0 NetworkManager[51733]: <info>  [1759266353.3420] manager: (podman0): new Bridge device (/org/freedesktop/NetworkManager/Devices/21)
Sep 30 21:05:53 compute-0 kernel: podman0: port 1(veth0) entered blocking state
Sep 30 21:05:53 compute-0 kernel: podman0: port 1(veth0) entered disabled state
Sep 30 21:05:53 compute-0 kernel: veth0: entered allmulticast mode
Sep 30 21:05:53 compute-0 kernel: veth0: entered promiscuous mode
Sep 30 21:05:53 compute-0 kernel: podman0: port 1(veth0) entered blocking state
Sep 30 21:05:53 compute-0 kernel: podman0: port 1(veth0) entered forwarding state
Sep 30 21:05:53 compute-0 NetworkManager[51733]: <info>  [1759266353.3576] manager: (veth0): new Veth device (/org/freedesktop/NetworkManager/Devices/22)
Sep 30 21:05:53 compute-0 NetworkManager[51733]: <info>  [1759266353.3601] device (veth0): carrier: link connected
Sep 30 21:05:53 compute-0 NetworkManager[51733]: <info>  [1759266353.3607] device (podman0): carrier: link connected
Sep 30 21:05:53 compute-0 podman[160459]: 2025-09-30 21:05:53.276821944 +0000 UTC m=+0.036949924 image pull 4c2cf735485aec82560a51e8042a9e65bbe194a07c6812512d6a5e2ed955852b quay.io/podified-antelope-centos9/openstack-iscsid:current-podified
Sep 30 21:05:53 compute-0 systemd-udevd[160487]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:05:53 compute-0 systemd-udevd[160484]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:05:53 compute-0 NetworkManager[51733]: <info>  [1759266353.3879] device (podman0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:05:53 compute-0 NetworkManager[51733]: <info>  [1759266353.3895] device (podman0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:05:53 compute-0 NetworkManager[51733]: <info>  [1759266353.3912] device (podman0): Activation: starting connection 'podman0' (5c494a0e-ef74-496a-9917-96fdf22ac7df)
Sep 30 21:05:53 compute-0 NetworkManager[51733]: <info>  [1759266353.3916] device (podman0): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Sep 30 21:05:53 compute-0 NetworkManager[51733]: <info>  [1759266353.3925] device (podman0): state change: prepare -> config (reason 'none', managed-type: 'external')
Sep 30 21:05:53 compute-0 NetworkManager[51733]: <info>  [1759266353.3930] device (podman0): state change: config -> ip-config (reason 'none', managed-type: 'external')
Sep 30 21:05:53 compute-0 NetworkManager[51733]: <info>  [1759266353.3936] device (podman0): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Sep 30 21:05:53 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Sep 30 21:05:53 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Sep 30 21:05:53 compute-0 NetworkManager[51733]: <info>  [1759266353.4312] device (podman0): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Sep 30 21:05:53 compute-0 NetworkManager[51733]: <info>  [1759266353.4314] device (podman0): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Sep 30 21:05:53 compute-0 NetworkManager[51733]: <info>  [1759266353.4321] device (podman0): Activation: successful, device activated.
Sep 30 21:05:53 compute-0 systemd[1]: iscsi.service: Unit cannot be reloaded because it is inactive.
Sep 30 21:05:53 compute-0 systemd[1]: Started libpod-conmon-613a66c900e909356e5c0b257e9314d795ca98144e7a5f49971663863ccf61ff.scope.
Sep 30 21:05:53 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:05:53 compute-0 podman[160459]: 2025-09-30 21:05:53.713654841 +0000 UTC m=+0.473782781 container init 613a66c900e909356e5c0b257e9314d795ca98144e7a5f49971663863ccf61ff (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid_config, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:05:53 compute-0 podman[160459]: 2025-09-30 21:05:53.728482622 +0000 UTC m=+0.488610542 container start 613a66c900e909356e5c0b257e9314d795ca98144e7a5f49971663863ccf61ff (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid_config, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:05:53 compute-0 iscsid_config[160617]: iqn.1994-05.com.redhat:8e6eb61820dd
Sep 30 21:05:53 compute-0 systemd[1]: libpod-613a66c900e909356e5c0b257e9314d795ca98144e7a5f49971663863ccf61ff.scope: Deactivated successfully.
Sep 30 21:05:53 compute-0 podman[160459]: 2025-09-30 21:05:53.735293679 +0000 UTC m=+0.495421709 container attach 613a66c900e909356e5c0b257e9314d795ca98144e7a5f49971663863ccf61ff (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid_config, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, tcib_managed=true)
Sep 30 21:05:53 compute-0 podman[160459]: 2025-09-30 21:05:53.73670336 +0000 UTC m=+0.496831260 container died 613a66c900e909356e5c0b257e9314d795ca98144e7a5f49971663863ccf61ff (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid_config, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:05:53 compute-0 kernel: podman0: port 1(veth0) entered disabled state
Sep 30 21:05:53 compute-0 kernel: veth0 (unregistering): left allmulticast mode
Sep 30 21:05:53 compute-0 kernel: veth0 (unregistering): left promiscuous mode
Sep 30 21:05:53 compute-0 kernel: podman0: port 1(veth0) entered disabled state
Sep 30 21:05:53 compute-0 NetworkManager[51733]: <info>  [1759266353.7997] device (podman0): state change: activated -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:05:54 compute-0 systemd[1]: run-netns-netns\x2dd01aa6c7\x2dfde1\x2df4a7\x2dbd6a\x2d4ea6b2bd49f6.mount: Deactivated successfully.
Sep 30 21:05:54 compute-0 systemd[1]: var-lib-containers-storage-overlay-5d6ad907fd6942a6e245aa4a60ceed85901e8a3d535eb7b68f4ca878b7746a8d-merged.mount: Deactivated successfully.
Sep 30 21:05:54 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-613a66c900e909356e5c0b257e9314d795ca98144e7a5f49971663863ccf61ff-userdata-shm.mount: Deactivated successfully.
Sep 30 21:05:54 compute-0 podman[160459]: 2025-09-30 21:05:54.146634496 +0000 UTC m=+0.906762416 container remove 613a66c900e909356e5c0b257e9314d795ca98144e7a5f49971663863ccf61ff (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid_config, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:05:54 compute-0 python3.9[160422]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman run --name iscsid_config --detach=False --rm --tty=True quay.io/podified-antelope-centos9/openstack-iscsid:current-podified /usr/sbin/iscsi-iname
Sep 30 21:05:54 compute-0 systemd[1]: libpod-conmon-613a66c900e909356e5c0b257e9314d795ca98144e7a5f49971663863ccf61ff.scope: Deactivated successfully.
Sep 30 21:05:54 compute-0 python3.9[160422]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: Error generating systemd: 
                                             DEPRECATED command:
                                             It is recommended to use Quadlets for running containers and pods under systemd.
                                             
                                             Please refer to podman-systemd.unit(5) for details.
                                             Error: iscsid_config does not refer to a container or pod: no pod with name or ID iscsid_config found: no such pod: no container with name or ID "iscsid_config" found: no such container
Sep 30 21:05:54 compute-0 sudo[160420]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:54 compute-0 sudo[160863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alpqorqjgziywdbswjojunrajwhtxvmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266354.5887818-322-189449342866357/AnsiballZ_stat.py'
Sep 30 21:05:54 compute-0 sudo[160863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:55 compute-0 python3.9[160865]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:05:55 compute-0 sudo[160863]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:55 compute-0 sudo[160986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkmtgruhjmiftwzbshupggxyyysozvve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266354.5887818-322-189449342866357/AnsiballZ_copy.py'
Sep 30 21:05:55 compute-0 sudo[160986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:55 compute-0 python3.9[160988]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759266354.5887818-322-189449342866357/.source.iscsi _original_basename=.itg3_0l_ follow=False checksum=feed3ddc3b6d3cb0ec4ae1bbcec3a3a4307be68a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:05:56 compute-0 sudo[160986]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:56 compute-0 sudo[161138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjpgqvpkvkarwuiztmxoyfkefysbfoks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266356.175412-367-153891075631893/AnsiballZ_file.py'
Sep 30 21:05:56 compute-0 sudo[161138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:56 compute-0 python3.9[161140]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:05:56 compute-0 sudo[161138]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:57 compute-0 python3.9[161290]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/iscsid.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:05:58 compute-0 sudo[161442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylmzsdinecolsyzcdsvgfufsuudzgmtd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266357.7665656-418-237114387327907/AnsiballZ_lineinfile.py'
Sep 30 21:05:58 compute-0 sudo[161442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:58 compute-0 python3.9[161444]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:05:58 compute-0 sudo[161442]: pam_unix(sudo:session): session closed for user root
Sep 30 21:05:59 compute-0 sudo[161594]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irinrlxtznrrobmkrywltlkodummhqjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266358.8933978-445-164302885540393/AnsiballZ_file.py'
Sep 30 21:05:59 compute-0 sudo[161594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:05:59 compute-0 python3.9[161596]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:05:59 compute-0 sudo[161594]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:00 compute-0 sudo[161746]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bafuobrhwibcvtidpbzbuaikeyjfdixl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266359.696481-469-61309560854989/AnsiballZ_stat.py'
Sep 30 21:06:00 compute-0 sudo[161746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:00 compute-0 python3.9[161748]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:06:00 compute-0 sudo[161746]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:00 compute-0 sudo[161824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxpyqnhzwannwcondojygsfklpzngfru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266359.696481-469-61309560854989/AnsiballZ_file.py'
Sep 30 21:06:00 compute-0 sudo[161824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:00 compute-0 python3.9[161826]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:06:00 compute-0 sudo[161824]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:01 compute-0 sudo[161976]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbftbxzfeopaimllzeopcjgmywwdkqhl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266360.886347-469-32210324479443/AnsiballZ_stat.py'
Sep 30 21:06:01 compute-0 sudo[161976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:01 compute-0 python3.9[161978]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:06:01 compute-0 sudo[161976]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:01 compute-0 sudo[162054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldxnnvxwbmymiuihzzgdpearmbnnisad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266360.886347-469-32210324479443/AnsiballZ_file.py'
Sep 30 21:06:01 compute-0 sudo[162054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:01 compute-0 python3.9[162056]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:06:01 compute-0 sudo[162054]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:02 compute-0 sudo[162206]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjbpezyxxuzbjpniadojppthbgzaursh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266362.2620065-538-264121449511359/AnsiballZ_file.py'
Sep 30 21:06:02 compute-0 sudo[162206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:02 compute-0 python3.9[162208]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:06:02 compute-0 sudo[162206]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:03 compute-0 sudo[162358]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yokakefgavwkfupoxdaqfjgyfewigwzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266363.0541515-562-92822504695443/AnsiballZ_stat.py'
Sep 30 21:06:03 compute-0 sudo[162358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:03 compute-0 python3.9[162360]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:06:03 compute-0 sudo[162358]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:03 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Sep 30 21:06:03 compute-0 sudo[162436]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gglqzktxyhaqzqknsfqitwgyjqondttn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266363.0541515-562-92822504695443/AnsiballZ_file.py'
Sep 30 21:06:03 compute-0 sudo[162436]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:04 compute-0 python3.9[162438]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:06:04 compute-0 sudo[162436]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:04 compute-0 sudo[162588]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znkwohvqxugmrenswtmzddswvfybgohe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266364.4622242-598-88157875735897/AnsiballZ_stat.py'
Sep 30 21:06:04 compute-0 sudo[162588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:05 compute-0 python3.9[162590]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:06:05 compute-0 sudo[162588]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:05 compute-0 sudo[162666]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imrktdwcdrkqhbrlxsmnhdoyiatnjaly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266364.4622242-598-88157875735897/AnsiballZ_file.py'
Sep 30 21:06:05 compute-0 sudo[162666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:05 compute-0 python3.9[162668]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:06:05 compute-0 sudo[162666]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:06 compute-0 sudo[162818]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntrpahlsdaajckvyyhkozsbonqddiphj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266365.8254638-634-117736974719318/AnsiballZ_systemd.py'
Sep 30 21:06:06 compute-0 sudo[162818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:06 compute-0 python3.9[162820]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:06:06 compute-0 systemd[1]: Reloading.
Sep 30 21:06:06 compute-0 systemd-sysv-generator[162848]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:06:06 compute-0 systemd-rc-local-generator[162841]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:06:06 compute-0 sudo[162818]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:07 compute-0 sudo[163006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrydmlsdpszxjbrhpofsjbaqtnrjpwmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266367.1894655-658-92437086713901/AnsiballZ_stat.py'
Sep 30 21:06:07 compute-0 sudo[163006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:07 compute-0 python3.9[163008]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:06:07 compute-0 sudo[163006]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:08 compute-0 sudo[163084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kawbmxnojgssfixkzgekuringxoejdli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266367.1894655-658-92437086713901/AnsiballZ_file.py'
Sep 30 21:06:08 compute-0 sudo[163084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:08 compute-0 podman[163086]: 2025-09-30 21:06:08.197066896 +0000 UTC m=+0.132361155 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=ovn_controller)
Sep 30 21:06:08 compute-0 python3.9[163087]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:06:08 compute-0 sudo[163084]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:08 compute-0 sudo[163263]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itkxwcgflhyjdeeszxizfszrrzhoahlt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266368.5151234-694-43176011161538/AnsiballZ_stat.py'
Sep 30 21:06:08 compute-0 sudo[163263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:09 compute-0 python3.9[163265]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:06:09 compute-0 sudo[163263]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:09 compute-0 sudo[163341]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awibosawpcrwzilugnthynjnkowsbrqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266368.5151234-694-43176011161538/AnsiballZ_file.py'
Sep 30 21:06:09 compute-0 sudo[163341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:09 compute-0 python3.9[163343]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:06:09 compute-0 sudo[163341]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:10 compute-0 sudo[163493]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubxqarmwgtwhohtjghrdnvqrkytvmsqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266369.9086714-730-195572689299162/AnsiballZ_systemd.py'
Sep 30 21:06:10 compute-0 sudo[163493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:10 compute-0 python3.9[163495]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:06:10 compute-0 systemd[1]: Reloading.
Sep 30 21:06:10 compute-0 systemd-rc-local-generator[163519]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:06:10 compute-0 systemd-sysv-generator[163524]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:06:10 compute-0 systemd[1]: Starting Create netns directory...
Sep 30 21:06:10 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Sep 30 21:06:10 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Sep 30 21:06:10 compute-0 systemd[1]: Finished Create netns directory.
Sep 30 21:06:10 compute-0 sudo[163493]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:11 compute-0 sudo[163699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apptcbzvquwajnwiwxdgdjqrnrtakgdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266371.44613-760-91548983962298/AnsiballZ_file.py'
Sep 30 21:06:11 compute-0 sudo[163699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:11 compute-0 podman[163660]: 2025-09-30 21:06:11.862190639 +0000 UTC m=+0.083346862 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Sep 30 21:06:12 compute-0 python3.9[163707]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:06:12 compute-0 sudo[163699]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:12 compute-0 sudo[163857]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efnqxwwhubwqfsihhewmsdbagmaaivfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266372.329146-784-271863703632374/AnsiballZ_stat.py'
Sep 30 21:06:12 compute-0 sudo[163857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:12 compute-0 python3.9[163859]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/iscsid/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:06:12 compute-0 sudo[163857]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:13 compute-0 sudo[163980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxlcdfrbkfnbkkafvjvglopfairisghc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266372.329146-784-271863703632374/AnsiballZ_copy.py'
Sep 30 21:06:13 compute-0 sudo[163980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:13 compute-0 python3.9[163982]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/iscsid/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759266372.329146-784-271863703632374/.source _original_basename=healthcheck follow=False checksum=2e1237e7fe015c809b173c52e24cfb87132f4344 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:06:13 compute-0 sudo[163980]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:14 compute-0 sudo[164132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxbvybyalstymbvuhezpmlygrhjpogdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266374.0601065-835-208090049278717/AnsiballZ_file.py'
Sep 30 21:06:14 compute-0 sudo[164132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:14 compute-0 python3.9[164134]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:06:14 compute-0 sudo[164132]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:15 compute-0 sudo[164284]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgecuaxnvysnujwfmywxbiyrqxnqhzpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266374.8600605-859-112916072617045/AnsiballZ_stat.py'
Sep 30 21:06:15 compute-0 sudo[164284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:15 compute-0 python3.9[164286]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/iscsid.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:06:15 compute-0 sudo[164284]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:15 compute-0 sudo[164407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfhlsagcwsdfrcgodbkvpvcoqfrwsxcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266374.8600605-859-112916072617045/AnsiballZ_copy.py'
Sep 30 21:06:15 compute-0 sudo[164407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:15 compute-0 python3.9[164409]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/iscsid.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759266374.8600605-859-112916072617045/.source.json _original_basename=.jx28j6v0 follow=False checksum=80e4f97460718c7e5c66b21ef8b846eba0e0dbc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:06:15 compute-0 sudo[164407]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:16 compute-0 sudo[164559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zshqncxvatubcvgyolenptmvldsqfhdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266376.3487167-904-166020841810659/AnsiballZ_file.py'
Sep 30 21:06:16 compute-0 sudo[164559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:16 compute-0 python3.9[164561]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/iscsid state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:06:16 compute-0 sudo[164559]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:17 compute-0 sudo[164711]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgolcqobfhwavtiqvnxuekfadqnkrqwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266377.1825385-928-204812257667940/AnsiballZ_stat.py'
Sep 30 21:06:17 compute-0 sudo[164711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:17 compute-0 sudo[164711]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:18 compute-0 sudo[164834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txqnvrnnkpkwycwtfdbnahkybzjoewle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266377.1825385-928-204812257667940/AnsiballZ_copy.py'
Sep 30 21:06:18 compute-0 sudo[164834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:18 compute-0 sudo[164834]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:19 compute-0 sudo[164986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgtoluumsjzdlnsvvuklqbajcrugrxrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266378.9142776-979-259881949909565/AnsiballZ_container_config_data.py'
Sep 30 21:06:19 compute-0 sudo[164986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:19 compute-0 python3.9[164988]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/iscsid config_pattern=*.json debug=False
Sep 30 21:06:19 compute-0 sudo[164986]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:20 compute-0 sudo[165138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xeeexcphdpthcoguwewwwjuulnutzubs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266379.9094653-1006-118571545026445/AnsiballZ_container_config_hash.py'
Sep 30 21:06:20 compute-0 sudo[165138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:20 compute-0 python3.9[165140]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Sep 30 21:06:20 compute-0 sudo[165138]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:21 compute-0 sudo[165290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovjdowwvjokhjvowycnqmkhyhzqykuxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266381.1140633-1033-201688372307540/AnsiballZ_podman_container_info.py'
Sep 30 21:06:21 compute-0 sudo[165290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:21 compute-0 python3.9[165292]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Sep 30 21:06:21 compute-0 sudo[165290]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:23 compute-0 sudo[165468]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxmnfajphtlkofhkktpeuxkrvurrytga ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759266382.9065676-1072-61926382678971/AnsiballZ_edpm_container_manage.py'
Sep 30 21:06:23 compute-0 sudo[165468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:23 compute-0 python3[165470]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/iscsid config_id=iscsid config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Sep 30 21:06:24 compute-0 podman[165507]: 2025-09-30 21:06:24.054206446 +0000 UTC m=+0.058842970 container create bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, config_id=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:06:24 compute-0 podman[165507]: 2025-09-30 21:06:24.027170771 +0000 UTC m=+0.031807385 image pull 4c2cf735485aec82560a51e8042a9e65bbe194a07c6812512d6a5e2ed955852b quay.io/podified-antelope-centos9/openstack-iscsid:current-podified
Sep 30 21:06:24 compute-0 python3[165470]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name iscsid --conmon-pidfile /run/iscsid.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=iscsid --label container_name=iscsid --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run:/run --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:z --volume /etc/target:/etc/target:z --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /var/lib/openstack/healthchecks/iscsid:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-iscsid:current-podified
Sep 30 21:06:24 compute-0 sudo[165468]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:24 compute-0 sudo[165697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klozoqpyznypaezwfuzqwgiylwhtzqnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266384.4134836-1096-183717888624232/AnsiballZ_stat.py'
Sep 30 21:06:24 compute-0 sudo[165697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:24 compute-0 python3.9[165699]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:06:24 compute-0 sudo[165697]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:25 compute-0 sudo[165851]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vuffawxniwsynhthfnilqhwvsldsoxer ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266385.4590862-1123-269723316286425/AnsiballZ_file.py'
Sep 30 21:06:25 compute-0 sudo[165851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:26 compute-0 python3.9[165853]: ansible-file Invoked with path=/etc/systemd/system/edpm_iscsid.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:06:26 compute-0 sudo[165851]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:26 compute-0 sudo[165927]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwticjqsaqffhqdmpbavhqwvgxykdwbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266385.4590862-1123-269723316286425/AnsiballZ_stat.py'
Sep 30 21:06:26 compute-0 sudo[165927]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:26 compute-0 python3.9[165929]: ansible-stat Invoked with path=/etc/systemd/system/edpm_iscsid_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:06:26 compute-0 sudo[165927]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:27 compute-0 sudo[166078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxqqfzacwffvayubcxwugpsphjyqnybq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266386.6864207-1123-108322631877907/AnsiballZ_copy.py'
Sep 30 21:06:27 compute-0 sudo[166078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:27 compute-0 python3.9[166080]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759266386.6864207-1123-108322631877907/source dest=/etc/systemd/system/edpm_iscsid.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:06:27 compute-0 sudo[166078]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:27 compute-0 sudo[166154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acfwshmvbwbdlmjwokpgjpuwspiyxcwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266386.6864207-1123-108322631877907/AnsiballZ_systemd.py'
Sep 30 21:06:27 compute-0 sudo[166154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:28 compute-0 python3.9[166156]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 21:06:28 compute-0 systemd[1]: Reloading.
Sep 30 21:06:28 compute-0 systemd-sysv-generator[166187]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:06:28 compute-0 systemd-rc-local-generator[166184]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:06:28 compute-0 sudo[166154]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:28 compute-0 sudo[166265]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbsngptbkvesirgznpzwdzujnsefgwcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266386.6864207-1123-108322631877907/AnsiballZ_systemd.py'
Sep 30 21:06:28 compute-0 sudo[166265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:28 compute-0 python3.9[166267]: ansible-systemd Invoked with state=restarted name=edpm_iscsid.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:06:29 compute-0 systemd[1]: Reloading.
Sep 30 21:06:29 compute-0 systemd-rc-local-generator[166297]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:06:29 compute-0 systemd-sysv-generator[166300]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:06:29 compute-0 systemd[1]: Starting iscsid container...
Sep 30 21:06:29 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:06:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5c1452f6ad735c452b432392131e938b7ccd0efca641b58ad7691315f8e4e8ba/merged/etc/target supports timestamps until 2038 (0x7fffffff)
Sep 30 21:06:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5c1452f6ad735c452b432392131e938b7ccd0efca641b58ad7691315f8e4e8ba/merged/etc/iscsi supports timestamps until 2038 (0x7fffffff)
Sep 30 21:06:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5c1452f6ad735c452b432392131e938b7ccd0efca641b58ad7691315f8e4e8ba/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Sep 30 21:06:29 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672.
Sep 30 21:06:29 compute-0 podman[166306]: 2025-09-30 21:06:29.479877241 +0000 UTC m=+0.138213135 container init bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:06:29 compute-0 iscsid[166321]: + sudo -E kolla_set_configs
Sep 30 21:06:29 compute-0 podman[166306]: 2025-09-30 21:06:29.509025163 +0000 UTC m=+0.167361017 container start bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.license=GPLv2)
Sep 30 21:06:29 compute-0 sudo[166328]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Sep 30 21:06:29 compute-0 podman[166306]: iscsid
Sep 30 21:06:29 compute-0 systemd[1]: Started iscsid container.
Sep 30 21:06:29 compute-0 systemd[1]: Created slice User Slice of UID 0.
Sep 30 21:06:29 compute-0 systemd[1]: Starting User Runtime Directory /run/user/0...
Sep 30 21:06:29 compute-0 sudo[166265]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:29 compute-0 systemd[1]: Finished User Runtime Directory /run/user/0.
Sep 30 21:06:29 compute-0 systemd[1]: Starting User Manager for UID 0...
Sep 30 21:06:29 compute-0 systemd[166348]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Sep 30 21:06:29 compute-0 podman[166327]: 2025-09-30 21:06:29.578070634 +0000 UTC m=+0.060026088 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=starting, health_failing_streak=1, health_log=, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid)
Sep 30 21:06:29 compute-0 systemd[1]: bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672-11747b7ad949f977.service: Main process exited, code=exited, status=1/FAILURE
Sep 30 21:06:29 compute-0 systemd[1]: bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672-11747b7ad949f977.service: Failed with result 'exit-code'.
Sep 30 21:06:29 compute-0 systemd[166348]: Queued start job for default target Main User Target.
Sep 30 21:06:29 compute-0 systemd[166348]: Created slice User Application Slice.
Sep 30 21:06:29 compute-0 systemd[166348]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Sep 30 21:06:29 compute-0 systemd[166348]: Started Daily Cleanup of User's Temporary Directories.
Sep 30 21:06:29 compute-0 systemd[166348]: Reached target Paths.
Sep 30 21:06:29 compute-0 systemd[166348]: Reached target Timers.
Sep 30 21:06:29 compute-0 systemd[166348]: Starting D-Bus User Message Bus Socket...
Sep 30 21:06:29 compute-0 systemd[166348]: Starting Create User's Volatile Files and Directories...
Sep 30 21:06:29 compute-0 systemd[166348]: Listening on D-Bus User Message Bus Socket.
Sep 30 21:06:29 compute-0 systemd[166348]: Reached target Sockets.
Sep 30 21:06:29 compute-0 systemd[166348]: Finished Create User's Volatile Files and Directories.
Sep 30 21:06:29 compute-0 systemd[166348]: Reached target Basic System.
Sep 30 21:06:29 compute-0 systemd[166348]: Reached target Main User Target.
Sep 30 21:06:29 compute-0 systemd[166348]: Startup finished in 135ms.
Sep 30 21:06:29 compute-0 systemd[1]: Started User Manager for UID 0.
Sep 30 21:06:29 compute-0 systemd[1]: Started Session c3 of User root.
Sep 30 21:06:29 compute-0 sudo[166328]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Sep 30 21:06:29 compute-0 iscsid[166321]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Sep 30 21:06:29 compute-0 iscsid[166321]: INFO:__main__:Validating config file
Sep 30 21:06:29 compute-0 iscsid[166321]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Sep 30 21:06:29 compute-0 iscsid[166321]: INFO:__main__:Writing out command to execute
Sep 30 21:06:29 compute-0 sudo[166328]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:29 compute-0 systemd[1]: session-c3.scope: Deactivated successfully.
Sep 30 21:06:29 compute-0 iscsid[166321]: ++ cat /run_command
Sep 30 21:06:29 compute-0 iscsid[166321]: + CMD='/usr/sbin/iscsid -f'
Sep 30 21:06:29 compute-0 iscsid[166321]: + ARGS=
Sep 30 21:06:29 compute-0 iscsid[166321]: + sudo kolla_copy_cacerts
Sep 30 21:06:29 compute-0 sudo[166390]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Sep 30 21:06:29 compute-0 systemd[1]: Started Session c4 of User root.
Sep 30 21:06:29 compute-0 sudo[166390]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Sep 30 21:06:29 compute-0 sudo[166390]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:29 compute-0 systemd[1]: session-c4.scope: Deactivated successfully.
Sep 30 21:06:29 compute-0 iscsid[166321]: + [[ ! -n '' ]]
Sep 30 21:06:29 compute-0 iscsid[166321]: + . kolla_extend_start
Sep 30 21:06:29 compute-0 iscsid[166321]: ++ [[ ! -f /etc/iscsi/initiatorname.iscsi ]]
Sep 30 21:06:29 compute-0 iscsid[166321]: + echo 'Running command: '\''/usr/sbin/iscsid -f'\'''
Sep 30 21:06:29 compute-0 iscsid[166321]: Running command: '/usr/sbin/iscsid -f'
Sep 30 21:06:29 compute-0 iscsid[166321]: + umask 0022
Sep 30 21:06:29 compute-0 iscsid[166321]: + exec /usr/sbin/iscsid -f
Sep 30 21:06:29 compute-0 kernel: Loading iSCSI transport class v2.0-870.
Sep 30 21:06:30 compute-0 python3.9[166526]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.iscsid_restart_required follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:06:31 compute-0 sudo[166676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uefpebjtyphanolzcqobdqxfdltwtmak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266391.0754302-1234-195850431523144/AnsiballZ_file.py'
Sep 30 21:06:31 compute-0 sudo[166676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:31 compute-0 python3.9[166678]: ansible-ansible.builtin.file Invoked with path=/etc/iscsi/.iscsid_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:06:31 compute-0 sudo[166676]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:32 compute-0 sudo[166828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akswpyonfygvmrtzerpddrgheuubfxqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266392.0809686-1267-79039638802018/AnsiballZ_service_facts.py'
Sep 30 21:06:32 compute-0 sudo[166828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:32 compute-0 python3.9[166830]: ansible-ansible.builtin.service_facts Invoked
Sep 30 21:06:32 compute-0 network[166847]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Sep 30 21:06:32 compute-0 network[166848]: 'network-scripts' will be removed from distribution in near future.
Sep 30 21:06:32 compute-0 network[166849]: It is advised to switch to 'NetworkManager' instead for network management.
Sep 30 21:06:34 compute-0 sshd-session[165615]: error: kex_exchange_identification: read: Connection timed out
Sep 30 21:06:34 compute-0 sshd-session[165615]: banner exchange: Connection from 14.103.118.213 port 45848: Connection timed out
Sep 30 21:06:37 compute-0 sudo[166828]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:38 compute-0 podman[166996]: 2025-09-30 21:06:38.416113605 +0000 UTC m=+0.142332677 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20250923)
Sep 30 21:06:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:06:38.707 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:06:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:06:38.708 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:06:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:06:38.708 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:06:38 compute-0 sudo[167148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rfrzttmyrqdrqhxejhhmycjkuqfmduzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266398.6358411-1297-105070660977279/AnsiballZ_file.py'
Sep 30 21:06:38 compute-0 sudo[167148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:39 compute-0 python3.9[167150]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Sep 30 21:06:39 compute-0 sudo[167148]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:39 compute-0 systemd[1]: Stopping User Manager for UID 0...
Sep 30 21:06:39 compute-0 systemd[166348]: Activating special unit Exit the Session...
Sep 30 21:06:39 compute-0 systemd[166348]: Stopped target Main User Target.
Sep 30 21:06:39 compute-0 systemd[166348]: Stopped target Basic System.
Sep 30 21:06:39 compute-0 systemd[166348]: Stopped target Paths.
Sep 30 21:06:39 compute-0 systemd[166348]: Stopped target Sockets.
Sep 30 21:06:39 compute-0 systemd[166348]: Stopped target Timers.
Sep 30 21:06:39 compute-0 systemd[166348]: Stopped Daily Cleanup of User's Temporary Directories.
Sep 30 21:06:39 compute-0 systemd[166348]: Closed D-Bus User Message Bus Socket.
Sep 30 21:06:39 compute-0 systemd[166348]: Stopped Create User's Volatile Files and Directories.
Sep 30 21:06:39 compute-0 systemd[166348]: Removed slice User Application Slice.
Sep 30 21:06:39 compute-0 systemd[166348]: Reached target Shutdown.
Sep 30 21:06:39 compute-0 systemd[166348]: Finished Exit the Session.
Sep 30 21:06:39 compute-0 systemd[166348]: Reached target Exit the Session.
Sep 30 21:06:40 compute-0 systemd[1]: user@0.service: Deactivated successfully.
Sep 30 21:06:40 compute-0 systemd[1]: Stopped User Manager for UID 0.
Sep 30 21:06:40 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/0...
Sep 30 21:06:40 compute-0 systemd[1]: run-user-0.mount: Deactivated successfully.
Sep 30 21:06:40 compute-0 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Sep 30 21:06:40 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/0.
Sep 30 21:06:40 compute-0 systemd[1]: Removed slice User Slice of UID 0.
Sep 30 21:06:40 compute-0 sudo[167303]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyyuerfjfhbcvudhjhedkfaxgmgcoryu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266399.4789038-1321-82738622791565/AnsiballZ_modprobe.py'
Sep 30 21:06:40 compute-0 sudo[167303]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:40 compute-0 python3.9[167305]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Sep 30 21:06:40 compute-0 sudo[167303]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:40 compute-0 sudo[167459]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nloklzgvtallblbhwzcufqkqoocxdoaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266400.6202002-1345-3392232311115/AnsiballZ_stat.py'
Sep 30 21:06:40 compute-0 sudo[167459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:41 compute-0 python3.9[167461]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:06:41 compute-0 sudo[167459]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:41 compute-0 sudo[167582]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzcobjzbvpapqzkpsccwzpztmtphfjex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266400.6202002-1345-3392232311115/AnsiballZ_copy.py'
Sep 30 21:06:41 compute-0 sudo[167582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:41 compute-0 python3.9[167584]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759266400.6202002-1345-3392232311115/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:06:41 compute-0 sudo[167582]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:42 compute-0 podman[167609]: 2025-09-30 21:06:42.356433241 +0000 UTC m=+0.079395689 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Sep 30 21:06:42 compute-0 sudo[167752]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfritpthioyiciejennekesslrauselk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266402.2934933-1393-76285935555625/AnsiballZ_lineinfile.py'
Sep 30 21:06:42 compute-0 sudo[167752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:42 compute-0 python3.9[167754]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:06:42 compute-0 sudo[167752]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:43 compute-0 sudo[167904]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvdbzdkbqdgmzmpcmgwluyuditlshhfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266403.1199822-1417-260108466825177/AnsiballZ_systemd.py'
Sep 30 21:06:43 compute-0 sudo[167904]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:43 compute-0 python3.9[167906]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 21:06:43 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Sep 30 21:06:43 compute-0 systemd[1]: Stopped Load Kernel Modules.
Sep 30 21:06:43 compute-0 systemd[1]: Stopping Load Kernel Modules...
Sep 30 21:06:43 compute-0 systemd[1]: Starting Load Kernel Modules...
Sep 30 21:06:43 compute-0 systemd[1]: Finished Load Kernel Modules.
Sep 30 21:06:43 compute-0 sudo[167904]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:44 compute-0 sudo[168060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehflxcjvtigioctwltqoecinljztcqqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266404.218271-1441-166355047258188/AnsiballZ_file.py'
Sep 30 21:06:44 compute-0 sudo[168060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:44 compute-0 python3.9[168062]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:06:44 compute-0 sudo[168060]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:45 compute-0 sudo[168212]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-noxekrxiepsdsfkjxdsidjysoeotdokx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266405.221381-1468-251037256506511/AnsiballZ_stat.py'
Sep 30 21:06:45 compute-0 sudo[168212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:45 compute-0 python3.9[168214]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:06:45 compute-0 sudo[168212]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:46 compute-0 sudo[168364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnpmnavqdtfeefunpprimgiutezpnimf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266406.1430542-1495-20805316652377/AnsiballZ_stat.py'
Sep 30 21:06:46 compute-0 sudo[168364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:46 compute-0 python3.9[168366]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:06:46 compute-0 sudo[168364]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:47 compute-0 sudo[168516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uezydfggluvgvkuoqyojgdxahdcwlvvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266407.0629003-1519-185456152013356/AnsiballZ_stat.py'
Sep 30 21:06:47 compute-0 sudo[168516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:47 compute-0 python3.9[168518]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:06:47 compute-0 sudo[168516]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:48 compute-0 sudo[168639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvtxnygqxwflpxlwzzrzhgmbydvscnls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266407.0629003-1519-185456152013356/AnsiballZ_copy.py'
Sep 30 21:06:48 compute-0 sudo[168639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:48 compute-0 python3.9[168641]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759266407.0629003-1519-185456152013356/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:06:48 compute-0 sudo[168639]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:49 compute-0 sudo[168791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhmqgmalufihhurpxzqexbcnplpovqdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266408.5307653-1564-159064177772205/AnsiballZ_command.py'
Sep 30 21:06:49 compute-0 sudo[168791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:49 compute-0 python3.9[168793]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:06:49 compute-0 sudo[168791]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:49 compute-0 sudo[168944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plwbgfguyqpwnywluiznvajhkvrjfxwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266409.5407376-1588-278797709556680/AnsiballZ_lineinfile.py'
Sep 30 21:06:49 compute-0 sudo[168944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:50 compute-0 python3.9[168946]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:06:50 compute-0 sudo[168944]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:50 compute-0 sudo[169096]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eozcjecqnucnliufaicmrwqcrdkerqtj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266410.3930194-1612-18406493777593/AnsiballZ_replace.py'
Sep 30 21:06:50 compute-0 sudo[169096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:51 compute-0 python3.9[169098]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:06:51 compute-0 sudo[169096]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:51 compute-0 sudo[169248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kibrfgmffbgsgtcujaclrzgnoskwdprw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266411.422161-1636-118963433859564/AnsiballZ_replace.py'
Sep 30 21:06:51 compute-0 sudo[169248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:51 compute-0 python3.9[169250]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:06:51 compute-0 sudo[169248]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:52 compute-0 sudo[169400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgvoiwmfkcoemplpdgfglukewdkzaaym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266412.3135164-1663-135203761545223/AnsiballZ_lineinfile.py'
Sep 30 21:06:52 compute-0 sudo[169400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:52 compute-0 python3.9[169402]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:06:52 compute-0 sudo[169400]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:52 compute-0 systemd[1]: virtnodedevd.service: Deactivated successfully.
Sep 30 21:06:53 compute-0 sudo[169553]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkzjklvezqvadovbupbyipvboagyhdne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266412.9210396-1663-252449024951562/AnsiballZ_lineinfile.py'
Sep 30 21:06:53 compute-0 sudo[169553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:53 compute-0 python3.9[169555]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:06:53 compute-0 sudo[169553]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:53 compute-0 sudo[169705]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cebbubzweylvjozbbxdyvasdarrzyobg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266413.6247966-1663-198384019129967/AnsiballZ_lineinfile.py'
Sep 30 21:06:53 compute-0 sudo[169705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:54 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Sep 30 21:06:54 compute-0 python3.9[169707]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:06:54 compute-0 sudo[169705]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:54 compute-0 sudo[169858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sumtstdnmnzlccafadutnhzmyrkgkjcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266414.364993-1663-18296433034383/AnsiballZ_lineinfile.py'
Sep 30 21:06:54 compute-0 sudo[169858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:54 compute-0 python3.9[169860]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:06:54 compute-0 sudo[169858]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:55 compute-0 systemd[1]: virtqemud.service: Deactivated successfully.
Sep 30 21:06:55 compute-0 sudo[170011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfnrhxhktcnswrzqolnabnmciviyczqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266415.3855035-1750-108975872303126/AnsiballZ_stat.py'
Sep 30 21:06:55 compute-0 sudo[170011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:55 compute-0 python3.9[170013]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:06:55 compute-0 sudo[170011]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:56 compute-0 sudo[170165]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvjzbvkeutjfajxebsjmhpepdliaafno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266416.2255208-1774-132732027079293/AnsiballZ_file.py'
Sep 30 21:06:56 compute-0 sudo[170165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:56 compute-0 systemd[1]: virtsecretd.service: Deactivated successfully.
Sep 30 21:06:56 compute-0 python3.9[170167]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:06:56 compute-0 sudo[170165]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:57 compute-0 sudo[170318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oslreuiplukcazxrqpyofwufhkhwluvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266417.207911-1801-246591104001495/AnsiballZ_file.py'
Sep 30 21:06:57 compute-0 sudo[170318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:57 compute-0 python3.9[170320]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:06:57 compute-0 sudo[170318]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:58 compute-0 sudo[170470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udvhcadbvptxqmqdmujoloqflxkfpluj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266418.038259-1825-110071177631491/AnsiballZ_stat.py'
Sep 30 21:06:58 compute-0 sudo[170470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:58 compute-0 python3.9[170472]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:06:58 compute-0 sudo[170470]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:58 compute-0 sudo[170548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-metiwlidusulkuzqzjxsuzvmecptcloe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266418.038259-1825-110071177631491/AnsiballZ_file.py'
Sep 30 21:06:58 compute-0 sudo[170548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:59 compute-0 python3.9[170550]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:06:59 compute-0 sudo[170548]: pam_unix(sudo:session): session closed for user root
Sep 30 21:06:59 compute-0 sudo[170700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsbxixvvfepuhxgqsjrhliattuxejgpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266419.280028-1825-224756879607675/AnsiballZ_stat.py'
Sep 30 21:06:59 compute-0 sudo[170700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:06:59 compute-0 podman[170702]: 2025-09-30 21:06:59.702847895 +0000 UTC m=+0.063375371 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Sep 30 21:06:59 compute-0 python3.9[170703]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:06:59 compute-0 sudo[170700]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:00 compute-0 sudo[170798]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngbgfiajjzlrfzhcsicalgmtclwcukom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266419.280028-1825-224756879607675/AnsiballZ_file.py'
Sep 30 21:07:00 compute-0 sudo[170798]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:00 compute-0 python3.9[170800]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:07:00 compute-0 sudo[170798]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:00 compute-0 sudo[170950]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-baidbpkywygppebsssmllcqheadwoywx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266420.6294382-1894-259737845944807/AnsiballZ_file.py'
Sep 30 21:07:00 compute-0 sudo[170950]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:01 compute-0 python3.9[170952]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:07:01 compute-0 sudo[170950]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:01 compute-0 sudo[171102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjhouoeflrkvhfpgepyjpshsramptxuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266421.547369-1918-144705378689242/AnsiballZ_stat.py'
Sep 30 21:07:01 compute-0 sudo[171102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:02 compute-0 python3.9[171104]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:07:02 compute-0 sudo[171102]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:02 compute-0 sudo[171180]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kndewnsiptktmmwksryfvrrqnmhhylzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266421.547369-1918-144705378689242/AnsiballZ_file.py'
Sep 30 21:07:02 compute-0 sudo[171180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:02 compute-0 python3.9[171182]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:07:02 compute-0 sudo[171180]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:03 compute-0 sudo[171332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgjgypruwhgdygxbdthujbabueekswsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266422.858895-1954-95496956015272/AnsiballZ_stat.py'
Sep 30 21:07:03 compute-0 sudo[171332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:03 compute-0 python3.9[171334]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:07:03 compute-0 sudo[171332]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:03 compute-0 sudo[171410]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpvnmyxdajthfipesmfqmuplyncjsvdf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266422.858895-1954-95496956015272/AnsiballZ_file.py'
Sep 30 21:07:03 compute-0 sudo[171410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:03 compute-0 python3.9[171412]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:07:03 compute-0 sudo[171410]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:04 compute-0 sudo[171562]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zphdvrhqfveiftyqqfqtppcvjuuoubfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266424.2743425-1990-88225484088532/AnsiballZ_systemd.py'
Sep 30 21:07:04 compute-0 sudo[171562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:04 compute-0 python3.9[171564]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:07:04 compute-0 systemd[1]: Reloading.
Sep 30 21:07:05 compute-0 systemd-rc-local-generator[171593]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:07:05 compute-0 systemd-sysv-generator[171598]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:07:05 compute-0 sudo[171562]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:05 compute-0 sudo[171752]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpurywhamkdqckruftlbeuodaslnzvpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266425.5862238-2014-233095610587889/AnsiballZ_stat.py'
Sep 30 21:07:05 compute-0 sudo[171752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:06 compute-0 python3.9[171754]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:07:06 compute-0 sudo[171752]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:06 compute-0 sudo[171830]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oscbxoenhjcncdqitkwofpjrzxodjudp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266425.5862238-2014-233095610587889/AnsiballZ_file.py'
Sep 30 21:07:06 compute-0 sudo[171830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:06 compute-0 python3.9[171832]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:07:06 compute-0 sudo[171830]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:07 compute-0 sudo[171982]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-caacwrfedalmefdejxixrtkzfjkcoayz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266426.871823-2050-192019916520864/AnsiballZ_stat.py'
Sep 30 21:07:07 compute-0 sudo[171982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:07 compute-0 python3.9[171984]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:07:07 compute-0 sudo[171982]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:07 compute-0 sudo[172060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wohrufdgkuebqmwntojbjmphzuxrpkks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266426.871823-2050-192019916520864/AnsiballZ_file.py'
Sep 30 21:07:07 compute-0 sudo[172060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:07 compute-0 python3.9[172062]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:07:07 compute-0 sudo[172060]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:08 compute-0 sudo[172225]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbcbwhuyuqkuxkxdxqerzgrtternrkkv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266428.2835395-2086-204879060924102/AnsiballZ_systemd.py'
Sep 30 21:07:08 compute-0 sudo[172225]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:08 compute-0 podman[172186]: 2025-09-30 21:07:08.635319646 +0000 UTC m=+0.104828489 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller)
Sep 30 21:07:08 compute-0 python3.9[172233]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:07:08 compute-0 systemd[1]: Reloading.
Sep 30 21:07:08 compute-0 systemd-rc-local-generator[172268]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:07:08 compute-0 systemd-sysv-generator[172271]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:07:09 compute-0 systemd[1]: Starting Create netns directory...
Sep 30 21:07:09 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Sep 30 21:07:09 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Sep 30 21:07:09 compute-0 systemd[1]: Finished Create netns directory.
Sep 30 21:07:09 compute-0 sudo[172225]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:10 compute-0 sudo[172431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzkymdqeoenmianegirprzeahfrpojdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266429.7345283-2116-181595783516159/AnsiballZ_file.py'
Sep 30 21:07:10 compute-0 sudo[172431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:10 compute-0 python3.9[172433]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:07:10 compute-0 sudo[172431]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:11 compute-0 sudo[172583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sexqvuurlkmwjujcrgsexrszvstqxpcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266430.6566534-2140-32187146828012/AnsiballZ_stat.py'
Sep 30 21:07:11 compute-0 sudo[172583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:11 compute-0 python3.9[172585]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:07:11 compute-0 sudo[172583]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:11 compute-0 sudo[172706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjawuiojhtqrlkjhsrfzuamcinjunttx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266430.6566534-2140-32187146828012/AnsiballZ_copy.py'
Sep 30 21:07:11 compute-0 sudo[172706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:11 compute-0 python3.9[172708]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759266430.6566534-2140-32187146828012/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:07:11 compute-0 sudo[172706]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:13 compute-0 podman[172809]: 2025-09-30 21:07:13.377713933 +0000 UTC m=+0.096954773 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Sep 30 21:07:13 compute-0 sudo[172875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqaqtbbhanbphxcftfyxxszfrmbnpbvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266433.0321863-2191-9060302436576/AnsiballZ_file.py'
Sep 30 21:07:13 compute-0 sudo[172875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:13 compute-0 python3.9[172877]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:07:13 compute-0 sudo[172875]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:14 compute-0 sudo[173027]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhiixnmdwoyvzpvvmofjcokcxpleztcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266433.8364093-2215-68626212054817/AnsiballZ_stat.py'
Sep 30 21:07:14 compute-0 sudo[173027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:14 compute-0 python3.9[173029]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:07:14 compute-0 sudo[173027]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:15 compute-0 sudo[173150]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gukhhzlzxhpkernjxxnmogysmicyzmtf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266433.8364093-2215-68626212054817/AnsiballZ_copy.py'
Sep 30 21:07:15 compute-0 sudo[173150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:15 compute-0 python3.9[173152]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759266433.8364093-2215-68626212054817/.source.json _original_basename=.kqdl26j7 follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:07:15 compute-0 sudo[173150]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:16 compute-0 sudo[173302]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eebsijahompsocrduifrbmpfpftcovcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266436.0501223-2260-5400806212609/AnsiballZ_file.py'
Sep 30 21:07:16 compute-0 sudo[173302]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:16 compute-0 python3.9[173304]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:07:16 compute-0 sudo[173302]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:17 compute-0 sudo[173454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apedyigvubmghclxcikrqzqtodnsdkdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266436.9840307-2284-79498130097838/AnsiballZ_stat.py'
Sep 30 21:07:17 compute-0 sudo[173454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:17 compute-0 sudo[173454]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:18 compute-0 sshd[128205]: Timeout before authentication for connection from 49.64.169.153 to 38.102.83.69, pid = 155759
Sep 30 21:07:18 compute-0 sudo[173577]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqvprugivajxfvojfdqkxxcrfajhgatl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266436.9840307-2284-79498130097838/AnsiballZ_copy.py'
Sep 30 21:07:18 compute-0 sudo[173577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:18 compute-0 sudo[173577]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:19 compute-0 sudo[173729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eukujnhspdycgvyyxfjqcwsobshgzume ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266438.944387-2335-191938036605116/AnsiballZ_container_config_data.py'
Sep 30 21:07:19 compute-0 sudo[173729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:19 compute-0 python3.9[173731]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Sep 30 21:07:19 compute-0 sudo[173729]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:20 compute-0 sudo[173881]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnvymmxuksbwwwppqscftxiiqixpiniu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266439.8544207-2362-65289786680049/AnsiballZ_container_config_hash.py'
Sep 30 21:07:20 compute-0 sudo[173881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:20 compute-0 python3.9[173883]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Sep 30 21:07:20 compute-0 sudo[173881]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:21 compute-0 sudo[174033]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odywopwginhhjejbduynfxaljzcxniwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266440.7458692-2389-87913854466567/AnsiballZ_podman_container_info.py'
Sep 30 21:07:21 compute-0 sudo[174033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:21 compute-0 python3.9[174035]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Sep 30 21:07:21 compute-0 sudo[174033]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:22 compute-0 sudo[174211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grjyjgbmhvxjgxxnlcpclutuuboqaonv ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759266442.4709594-2428-258680601000519/AnsiballZ_edpm_container_manage.py'
Sep 30 21:07:22 compute-0 sudo[174211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:23 compute-0 python3[174213]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Sep 30 21:07:23 compute-0 podman[174251]: 2025-09-30 21:07:23.425941568 +0000 UTC m=+0.072318533 container create 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 21:07:23 compute-0 podman[174251]: 2025-09-30 21:07:23.391792172 +0000 UTC m=+0.038169227 image pull 80aeb93432d60c5f52c5325081f51dbf5658fe1615083ed284852e8f6df43250 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Sep 30 21:07:23 compute-0 python3[174213]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Sep 30 21:07:23 compute-0 sudo[174211]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:24 compute-0 sudo[174439]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cypbrmlupprrqvojjpciueroaiatmeuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266444.1568854-2452-216249056636287/AnsiballZ_stat.py'
Sep 30 21:07:24 compute-0 sudo[174439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:24 compute-0 python3.9[174441]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:07:24 compute-0 sudo[174439]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:25 compute-0 sudo[174593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpibvijzxyqdlwjbiwsssolgoiyuucdb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266445.2044692-2479-106380203749606/AnsiballZ_file.py'
Sep 30 21:07:25 compute-0 sudo[174593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:25 compute-0 python3.9[174595]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:07:25 compute-0 sudo[174593]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:25 compute-0 sudo[174669]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydkemjeyrzfzqxwimezmsdpillmjjfca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266445.2044692-2479-106380203749606/AnsiballZ_stat.py'
Sep 30 21:07:25 compute-0 sudo[174669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:26 compute-0 python3.9[174671]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:07:26 compute-0 sudo[174669]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:26 compute-0 sudo[174820]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xypexbvbtwfxbuoxlkmvaemmlcpymstg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266446.2775223-2479-246209933734300/AnsiballZ_copy.py'
Sep 30 21:07:26 compute-0 sudo[174820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:27 compute-0 python3.9[174822]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759266446.2775223-2479-246209933734300/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:07:27 compute-0 sudo[174820]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:27 compute-0 sudo[174896]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-votseponyeoatmykzkfyvceacemzchbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266446.2775223-2479-246209933734300/AnsiballZ_systemd.py'
Sep 30 21:07:27 compute-0 sudo[174896]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:27 compute-0 python3.9[174898]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 21:07:27 compute-0 systemd[1]: Reloading.
Sep 30 21:07:27 compute-0 systemd-sysv-generator[174932]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:07:27 compute-0 systemd-rc-local-generator[174928]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:07:27 compute-0 sudo[174896]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:28 compute-0 sudo[175008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rflcsmjduoyqmiuibwlzejybboeneoyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266446.2775223-2479-246209933734300/AnsiballZ_systemd.py'
Sep 30 21:07:28 compute-0 sudo[175008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:28 compute-0 python3.9[175010]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:07:28 compute-0 systemd[1]: Reloading.
Sep 30 21:07:28 compute-0 systemd-rc-local-generator[175036]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:07:28 compute-0 systemd-sysv-generator[175044]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:07:28 compute-0 systemd[1]: Starting multipathd container...
Sep 30 21:07:29 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:07:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d78f6c69fef804b4551f72adb64b0cef600d65bba3b6de0efb59854011ed59ed/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Sep 30 21:07:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d78f6c69fef804b4551f72adb64b0cef600d65bba3b6de0efb59854011ed59ed/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Sep 30 21:07:29 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581.
Sep 30 21:07:29 compute-0 podman[175050]: 2025-09-30 21:07:29.1076787 +0000 UTC m=+0.119966714 container init 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:07:29 compute-0 multipathd[175067]: + sudo -E kolla_set_configs
Sep 30 21:07:29 compute-0 podman[175050]: 2025-09-30 21:07:29.132825746 +0000 UTC m=+0.145113700 container start 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250923)
Sep 30 21:07:29 compute-0 podman[175050]: multipathd
Sep 30 21:07:29 compute-0 sudo[175073]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Sep 30 21:07:29 compute-0 sudo[175073]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Sep 30 21:07:29 compute-0 sudo[175073]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Sep 30 21:07:29 compute-0 systemd[1]: Started multipathd container.
Sep 30 21:07:29 compute-0 sudo[175008]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:29 compute-0 multipathd[175067]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Sep 30 21:07:29 compute-0 multipathd[175067]: INFO:__main__:Validating config file
Sep 30 21:07:29 compute-0 multipathd[175067]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Sep 30 21:07:29 compute-0 multipathd[175067]: INFO:__main__:Writing out command to execute
Sep 30 21:07:29 compute-0 sudo[175073]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:29 compute-0 multipathd[175067]: ++ cat /run_command
Sep 30 21:07:29 compute-0 multipathd[175067]: + CMD='/usr/sbin/multipathd -d'
Sep 30 21:07:29 compute-0 multipathd[175067]: + ARGS=
Sep 30 21:07:29 compute-0 multipathd[175067]: + sudo kolla_copy_cacerts
Sep 30 21:07:29 compute-0 sudo[175095]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Sep 30 21:07:29 compute-0 sudo[175095]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Sep 30 21:07:29 compute-0 sudo[175095]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Sep 30 21:07:29 compute-0 sudo[175095]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:29 compute-0 multipathd[175067]: + [[ ! -n '' ]]
Sep 30 21:07:29 compute-0 multipathd[175067]: + . kolla_extend_start
Sep 30 21:07:29 compute-0 multipathd[175067]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Sep 30 21:07:29 compute-0 multipathd[175067]: Running command: '/usr/sbin/multipathd -d'
Sep 30 21:07:29 compute-0 multipathd[175067]: + umask 0022
Sep 30 21:07:29 compute-0 multipathd[175067]: + exec /usr/sbin/multipathd -d
Sep 30 21:07:29 compute-0 multipathd[175067]: 3144.917244 | --------start up--------
Sep 30 21:07:29 compute-0 multipathd[175067]: 3144.917266 | read /etc/multipath.conf
Sep 30 21:07:29 compute-0 multipathd[175067]: 3144.925050 | path checkers start up
Sep 30 21:07:29 compute-0 podman[175074]: 2025-09-30 21:07:29.238754955 +0000 UTC m=+0.088618375 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:07:29 compute-0 systemd[1]: 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581-22a4400abfc20107.service: Main process exited, code=exited, status=1/FAILURE
Sep 30 21:07:29 compute-0 systemd[1]: 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581-22a4400abfc20107.service: Failed with result 'exit-code'.
Sep 30 21:07:30 compute-0 podman[175130]: 2025-09-30 21:07:30.330031472 +0000 UTC m=+0.062446522 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.build-date=20250923, config_id=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:07:31 compute-0 python3.9[175276]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:07:31 compute-0 sudo[175428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnhjtcqtkwizvgzkrkvhwjhtophyvpkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266451.3907282-2587-261547342885853/AnsiballZ_command.py'
Sep 30 21:07:31 compute-0 sudo[175428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:31 compute-0 python3.9[175430]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:07:32 compute-0 sudo[175428]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:32 compute-0 sudo[175593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpfzdqlflklmcwgzybuxvhxlrwtgllic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266452.2991972-2611-254598457239853/AnsiballZ_systemd.py'
Sep 30 21:07:32 compute-0 sudo[175593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:32 compute-0 python3.9[175595]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 21:07:32 compute-0 systemd[1]: Stopping multipathd container...
Sep 30 21:07:33 compute-0 multipathd[175067]: 3148.772831 | exit (signal)
Sep 30 21:07:33 compute-0 multipathd[175067]: 3148.773722 | --------shut down-------
Sep 30 21:07:33 compute-0 systemd[1]: libpod-5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581.scope: Deactivated successfully.
Sep 30 21:07:33 compute-0 podman[175599]: 2025-09-30 21:07:33.117542763 +0000 UTC m=+0.104500426 container stop 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 21:07:33 compute-0 podman[175599]: 2025-09-30 21:07:33.143753127 +0000 UTC m=+0.130710850 container died 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3)
Sep 30 21:07:33 compute-0 systemd[1]: 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581-22a4400abfc20107.timer: Deactivated successfully.
Sep 30 21:07:33 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581.
Sep 30 21:07:33 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581-userdata-shm.mount: Deactivated successfully.
Sep 30 21:07:33 compute-0 systemd[1]: var-lib-containers-storage-overlay-d78f6c69fef804b4551f72adb64b0cef600d65bba3b6de0efb59854011ed59ed-merged.mount: Deactivated successfully.
Sep 30 21:07:33 compute-0 podman[175599]: 2025-09-30 21:07:33.200549096 +0000 UTC m=+0.187506739 container cleanup 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=multipathd, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Sep 30 21:07:33 compute-0 podman[175599]: multipathd
Sep 30 21:07:33 compute-0 podman[175628]: multipathd
Sep 30 21:07:33 compute-0 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Sep 30 21:07:33 compute-0 systemd[1]: Stopped multipathd container.
Sep 30 21:07:33 compute-0 systemd[1]: Starting multipathd container...
Sep 30 21:07:33 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:07:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d78f6c69fef804b4551f72adb64b0cef600d65bba3b6de0efb59854011ed59ed/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Sep 30 21:07:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d78f6c69fef804b4551f72adb64b0cef600d65bba3b6de0efb59854011ed59ed/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Sep 30 21:07:33 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581.
Sep 30 21:07:33 compute-0 podman[175641]: 2025-09-30 21:07:33.425789087 +0000 UTC m=+0.129136523 container init 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 21:07:33 compute-0 multipathd[175658]: + sudo -E kolla_set_configs
Sep 30 21:07:33 compute-0 sudo[175664]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Sep 30 21:07:33 compute-0 sudo[175664]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Sep 30 21:07:33 compute-0 sudo[175664]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Sep 30 21:07:33 compute-0 podman[175641]: 2025-09-30 21:07:33.457658683 +0000 UTC m=+0.161006099 container start 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:07:33 compute-0 podman[175641]: multipathd
Sep 30 21:07:33 compute-0 systemd[1]: Started multipathd container.
Sep 30 21:07:33 compute-0 sudo[175593]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:33 compute-0 multipathd[175658]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Sep 30 21:07:33 compute-0 multipathd[175658]: INFO:__main__:Validating config file
Sep 30 21:07:33 compute-0 multipathd[175658]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Sep 30 21:07:33 compute-0 multipathd[175658]: INFO:__main__:Writing out command to execute
Sep 30 21:07:33 compute-0 sudo[175664]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:33 compute-0 sshd[128205]: drop connection #0 from [49.64.169.153]:46443 on [38.102.83.69]:22 penalty: exceeded LoginGraceTime
Sep 30 21:07:33 compute-0 multipathd[175658]: ++ cat /run_command
Sep 30 21:07:33 compute-0 multipathd[175658]: + CMD='/usr/sbin/multipathd -d'
Sep 30 21:07:33 compute-0 multipathd[175658]: + ARGS=
Sep 30 21:07:33 compute-0 multipathd[175658]: + sudo kolla_copy_cacerts
Sep 30 21:07:33 compute-0 podman[175665]: 2025-09-30 21:07:33.531607503 +0000 UTC m=+0.062308719 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 21:07:33 compute-0 systemd[1]: 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581-5fba4bf3ba787e7c.service: Main process exited, code=exited, status=1/FAILURE
Sep 30 21:07:33 compute-0 systemd[1]: 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581-5fba4bf3ba787e7c.service: Failed with result 'exit-code'.
Sep 30 21:07:33 compute-0 sudo[175687]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Sep 30 21:07:33 compute-0 sudo[175687]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Sep 30 21:07:33 compute-0 sudo[175687]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Sep 30 21:07:33 compute-0 sudo[175687]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:33 compute-0 multipathd[175658]: + [[ ! -n '' ]]
Sep 30 21:07:33 compute-0 multipathd[175658]: + . kolla_extend_start
Sep 30 21:07:33 compute-0 multipathd[175658]: Running command: '/usr/sbin/multipathd -d'
Sep 30 21:07:33 compute-0 multipathd[175658]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Sep 30 21:07:33 compute-0 multipathd[175658]: + umask 0022
Sep 30 21:07:33 compute-0 multipathd[175658]: + exec /usr/sbin/multipathd -d
Sep 30 21:07:33 compute-0 multipathd[175658]: 3149.259105 | --------start up--------
Sep 30 21:07:33 compute-0 multipathd[175658]: 3149.259539 | read /etc/multipath.conf
Sep 30 21:07:33 compute-0 multipathd[175658]: 3149.267156 | path checkers start up
Sep 30 21:07:35 compute-0 sudo[175846]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-caigpqhqyjjkjcdxvukmchfoqzqxpfwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266455.526577-2635-208015890202228/AnsiballZ_file.py'
Sep 30 21:07:35 compute-0 sudo[175846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:36 compute-0 python3.9[175848]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:07:36 compute-0 sudo[175846]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:37 compute-0 sudo[175998]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxkrhdjkbiqpnudilmdcqozrqyxgusbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266456.7453-2671-94420811736017/AnsiballZ_file.py'
Sep 30 21:07:37 compute-0 sudo[175998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:37 compute-0 python3.9[176000]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Sep 30 21:07:37 compute-0 sudo[175998]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:37 compute-0 sudo[176150]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlcmhdmwovxfuisidumlnxsgvdweltxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266457.502355-2695-236361756201189/AnsiballZ_modprobe.py'
Sep 30 21:07:37 compute-0 sudo[176150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:38 compute-0 python3.9[176152]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Sep 30 21:07:38 compute-0 kernel: Key type psk registered
Sep 30 21:07:38 compute-0 sudo[176150]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:07:38.708 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:07:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:07:38.709 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:07:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:07:38.709 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:07:38 compute-0 sudo[176328]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zshdyfhdbcmojimanngxjqmnlzkmxell ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266458.348479-2719-214369422911020/AnsiballZ_stat.py'
Sep 30 21:07:38 compute-0 sudo[176328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:38 compute-0 podman[176287]: 2025-09-30 21:07:38.80420485 +0000 UTC m=+0.114523051 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:07:38 compute-0 python3.9[176335]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:07:38 compute-0 sudo[176328]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:39 compute-0 sudo[176462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxdskbcpqcmyjkkglvovczrmugjgavde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266458.348479-2719-214369422911020/AnsiballZ_copy.py'
Sep 30 21:07:39 compute-0 sudo[176462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:39 compute-0 python3.9[176464]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759266458.348479-2719-214369422911020/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:07:39 compute-0 sudo[176462]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:40 compute-0 sudo[176614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwpeytjcziovnlqgjacckkeuvcvbutnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266459.971195-2767-220769418565910/AnsiballZ_lineinfile.py'
Sep 30 21:07:40 compute-0 sudo[176614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:40 compute-0 python3.9[176616]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:07:40 compute-0 sudo[176614]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:41 compute-0 sudo[176766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpkpeznwbmvkwpjjuqjgdfynmysocdpt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266460.8500285-2791-246140384153021/AnsiballZ_systemd.py'
Sep 30 21:07:41 compute-0 sudo[176766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:41 compute-0 python3.9[176768]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 21:07:41 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Sep 30 21:07:41 compute-0 systemd[1]: Stopped Load Kernel Modules.
Sep 30 21:07:41 compute-0 systemd[1]: Stopping Load Kernel Modules...
Sep 30 21:07:41 compute-0 systemd[1]: Starting Load Kernel Modules...
Sep 30 21:07:41 compute-0 systemd[1]: Finished Load Kernel Modules.
Sep 30 21:07:41 compute-0 sudo[176766]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:42 compute-0 sudo[176922]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjeyemaznadysoobstxtelbbhepgfrrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266462.0338154-2815-105114215029115/AnsiballZ_setup.py'
Sep 30 21:07:42 compute-0 sudo[176922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:42 compute-0 python3.9[176924]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 21:07:43 compute-0 sudo[176922]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:43 compute-0 sudo[177018]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhcdyyxgygmquawdosalkzsyzwsfiyzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266462.0338154-2815-105114215029115/AnsiballZ_dnf.py'
Sep 30 21:07:43 compute-0 sudo[177018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:43 compute-0 podman[176980]: 2025-09-30 21:07:43.524268466 +0000 UTC m=+0.070758027 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Sep 30 21:07:43 compute-0 python3.9[177026]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 21:07:49 compute-0 systemd[1]: Reloading.
Sep 30 21:07:49 compute-0 systemd-rc-local-generator[177060]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:07:49 compute-0 systemd-sysv-generator[177064]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:07:50 compute-0 systemd[1]: Reloading.
Sep 30 21:07:50 compute-0 systemd-rc-local-generator[177091]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:07:50 compute-0 systemd-sysv-generator[177097]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:07:50 compute-0 systemd-logind[792]: Watching system buttons on /dev/input/event0 (Power Button)
Sep 30 21:07:50 compute-0 systemd-logind[792]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Sep 30 21:07:50 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Sep 30 21:07:50 compute-0 systemd[1]: Starting man-db-cache-update.service...
Sep 30 21:07:50 compute-0 systemd[1]: Reloading.
Sep 30 21:07:50 compute-0 systemd-rc-local-generator[177184]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:07:50 compute-0 systemd-sysv-generator[177190]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:07:51 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Sep 30 21:07:51 compute-0 sudo[177018]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:52 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Sep 30 21:07:52 compute-0 systemd[1]: Finished man-db-cache-update.service.
Sep 30 21:07:52 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.718s CPU time.
Sep 30 21:07:52 compute-0 systemd[1]: run-rdef381cc577f4ce5bfea6a1bfd7e5787.service: Deactivated successfully.
Sep 30 21:07:53 compute-0 sudo[178476]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edcvuiefguhkytficxuteaojiitnksmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266472.9888117-2851-100400549589646/AnsiballZ_file.py'
Sep 30 21:07:53 compute-0 sudo[178476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:53 compute-0 python3.9[178478]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.iscsid_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:07:53 compute-0 sudo[178476]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:54 compute-0 python3.9[178628]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 21:07:55 compute-0 sudo[178782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zyuhaeyuepbuwajmomduhxzuebjzfhgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266475.238908-2903-105699230625617/AnsiballZ_file.py'
Sep 30 21:07:55 compute-0 sudo[178782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:55 compute-0 python3.9[178784]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:07:55 compute-0 sudo[178782]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:57 compute-0 sudo[178934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jswfuhjivhmomdcnliqlutrxczrqqqzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266476.4167762-2936-275254974689338/AnsiballZ_systemd_service.py'
Sep 30 21:07:57 compute-0 sudo[178934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:07:57 compute-0 python3.9[178936]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 21:07:57 compute-0 systemd[1]: Reloading.
Sep 30 21:07:57 compute-0 systemd-rc-local-generator[178961]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:07:57 compute-0 systemd-sysv-generator[178967]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:07:57 compute-0 sudo[178934]: pam_unix(sudo:session): session closed for user root
Sep 30 21:07:58 compute-0 python3.9[179121]: ansible-ansible.builtin.service_facts Invoked
Sep 30 21:07:58 compute-0 network[179138]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Sep 30 21:07:58 compute-0 network[179139]: 'network-scripts' will be removed from distribution in near future.
Sep 30 21:07:58 compute-0 network[179140]: It is advised to switch to 'NetworkManager' instead for network management.
Sep 30 21:08:00 compute-0 podman[179181]: 2025-09-30 21:08:00.512231037 +0000 UTC m=+0.103415431 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 21:08:03 compute-0 sudo[179435]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftdnqdleyxdlvvbqkssfqralhrosxbml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266482.9927702-2993-131357169325026/AnsiballZ_systemd_service.py'
Sep 30 21:08:03 compute-0 sudo[179435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:03 compute-0 python3.9[179437]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:08:03 compute-0 sudo[179435]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:03 compute-0 podman[179439]: 2025-09-30 21:08:03.788351373 +0000 UTC m=+0.077185788 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:08:04 compute-0 sudo[179608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fauvjdwehxaxjkdeesauzhiqlcqqfdtj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266483.9719625-2993-34345705099336/AnsiballZ_systemd_service.py'
Sep 30 21:08:04 compute-0 sudo[179608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:04 compute-0 python3.9[179610]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:08:04 compute-0 sudo[179608]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:05 compute-0 sudo[179761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sobiwrrvttmokmhfcrrpfhpiegkctljl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266484.8940575-2993-147810009344965/AnsiballZ_systemd_service.py'
Sep 30 21:08:05 compute-0 sudo[179761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:05 compute-0 python3.9[179763]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:08:05 compute-0 sudo[179761]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:06 compute-0 sudo[179914]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zuyxjmqdywhgifmcromyfyzmhhkcyinx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266485.7981658-2993-128133002318976/AnsiballZ_systemd_service.py'
Sep 30 21:08:06 compute-0 sudo[179914]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:06 compute-0 python3.9[179916]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:08:06 compute-0 sudo[179914]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:07 compute-0 sudo[180067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwssfanttoofnlavwhuthgadeskpmcjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266486.701414-2993-52118161997938/AnsiballZ_systemd_service.py'
Sep 30 21:08:07 compute-0 sudo[180067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:07 compute-0 python3.9[180069]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:08:07 compute-0 sudo[180067]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:07 compute-0 sudo[180220]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpwayqsvvwcmdsvufwcoljnerxqdthrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266487.5499947-2993-155779198623980/AnsiballZ_systemd_service.py'
Sep 30 21:08:07 compute-0 sudo[180220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:08 compute-0 python3.9[180222]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:08:08 compute-0 sudo[180220]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:08 compute-0 sudo[180373]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqercfxelzyztmwtlehgghfidjelmqgr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266488.3865097-2993-205655519075576/AnsiballZ_systemd_service.py'
Sep 30 21:08:08 compute-0 sudo[180373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:08 compute-0 podman[180375]: 2025-09-30 21:08:08.968465753 +0000 UTC m=+0.097239716 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250923)
Sep 30 21:08:09 compute-0 python3.9[180376]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:08:09 compute-0 sudo[180373]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:09 compute-0 sudo[180552]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pukimtiszwbspjqgofissenhscvnbwtr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266489.369784-2993-266879486746410/AnsiballZ_systemd_service.py'
Sep 30 21:08:09 compute-0 sudo[180552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:10 compute-0 python3.9[180554]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:08:10 compute-0 sudo[180552]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:11 compute-0 sudo[180705]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcmjszhxrsutrtygypuyigxyrkzxvwoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266490.7243605-3170-208721342244428/AnsiballZ_file.py'
Sep 30 21:08:11 compute-0 sudo[180705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:11 compute-0 python3.9[180707]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:08:11 compute-0 sudo[180705]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:11 compute-0 sudo[180857]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owavyifbsprpxlwmwuqffbaendmskcln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266491.4200206-3170-117986401254073/AnsiballZ_file.py'
Sep 30 21:08:11 compute-0 sudo[180857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:11 compute-0 python3.9[180859]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:08:11 compute-0 sudo[180857]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:12 compute-0 sudo[181009]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odngwbaawtieuzrdscocffhrwoujwmvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266492.1453779-3170-89324252322433/AnsiballZ_file.py'
Sep 30 21:08:12 compute-0 sudo[181009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:12 compute-0 python3.9[181011]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:08:12 compute-0 sudo[181009]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:13 compute-0 sudo[181161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qngflwpmzyjljzojzalruyrsyvpftylr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266492.9024775-3170-195683053224706/AnsiballZ_file.py'
Sep 30 21:08:13 compute-0 sudo[181161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:13 compute-0 python3.9[181163]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:08:13 compute-0 sudo[181161]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:14 compute-0 sudo[181324]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avlyzepuhcnmrvoebfbgkkxxbqlibzji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266493.6614113-3170-251307836841145/AnsiballZ_file.py'
Sep 30 21:08:14 compute-0 sudo[181324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:14 compute-0 podman[181287]: 2025-09-30 21:08:14.063652289 +0000 UTC m=+0.085135773 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Sep 30 21:08:14 compute-0 python3.9[181332]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:08:14 compute-0 sudo[181324]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:14 compute-0 sudo[181482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xlttqmcdbkkqajfvzrmhmarljassyrqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266494.4316392-3170-170541738450103/AnsiballZ_file.py'
Sep 30 21:08:14 compute-0 sudo[181482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:14 compute-0 python3.9[181484]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:08:15 compute-0 sudo[181482]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:15 compute-0 sudo[181634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbmlufwiudawkqurszrnsofmwhdwqsul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266495.1610096-3170-152330229339256/AnsiballZ_file.py'
Sep 30 21:08:15 compute-0 sudo[181634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:15 compute-0 python3.9[181636]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:08:15 compute-0 sudo[181634]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:16 compute-0 sudo[181786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-renwczxbtauljajuphxnlsccfmffrbfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266495.8685164-3170-112996807950804/AnsiballZ_file.py'
Sep 30 21:08:16 compute-0 sudo[181786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:16 compute-0 python3.9[181788]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:08:16 compute-0 sudo[181786]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:17 compute-0 sudo[181938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-insnufupbmbpkorlbvgekmtlqpwbqrwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266496.7291536-3341-115655877783068/AnsiballZ_file.py'
Sep 30 21:08:17 compute-0 sudo[181938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:17 compute-0 python3.9[181940]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:08:17 compute-0 sudo[181938]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:17 compute-0 sudo[182092]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aehavsfnohjdaylxllmumthekuzmktfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266497.4526844-3341-30235041916825/AnsiballZ_file.py'
Sep 30 21:08:17 compute-0 sudo[182092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:18 compute-0 python3.9[182094]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:08:18 compute-0 sudo[182092]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:18 compute-0 sudo[182244]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxleepoonxieapsdjmitsfwrzlthklmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266498.2545345-3341-31243278254095/AnsiballZ_file.py'
Sep 30 21:08:18 compute-0 sudo[182244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:18 compute-0 python3.9[182246]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:08:18 compute-0 sudo[182244]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:18 compute-0 sshd-session[182064]: Invalid user palserver from 113.240.110.90 port 33472
Sep 30 21:08:18 compute-0 sshd-session[182064]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:08:18 compute-0 sshd-session[182064]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=113.240.110.90
Sep 30 21:08:19 compute-0 auditd[705]: Audit daemon rotating log files
Sep 30 21:08:19 compute-0 sudo[182396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjtdcrbdltqgogqjcajgudqneashxstc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266499.0325623-3341-253923872546144/AnsiballZ_file.py'
Sep 30 21:08:19 compute-0 sudo[182396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:19 compute-0 python3.9[182398]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:08:19 compute-0 sudo[182396]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:20 compute-0 sudo[182548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avxpdihevknpwrokaeozgfbtoycdxogp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266499.726575-3341-48499724188543/AnsiballZ_file.py'
Sep 30 21:08:20 compute-0 sudo[182548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:20 compute-0 python3.9[182550]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:08:20 compute-0 sudo[182548]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:20 compute-0 sudo[182700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrqjszaagxixrfcycholqvrvevrhtuci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266500.4708102-3341-236054612895526/AnsiballZ_file.py'
Sep 30 21:08:20 compute-0 sudo[182700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:21 compute-0 python3.9[182702]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:08:21 compute-0 sudo[182700]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:21 compute-0 sudo[182852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvwhntekzlggckrwdauomeojxohwntmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266501.2560952-3341-151293458781364/AnsiballZ_file.py'
Sep 30 21:08:21 compute-0 sudo[182852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:21 compute-0 sshd-session[182064]: Failed password for invalid user palserver from 113.240.110.90 port 33472 ssh2
Sep 30 21:08:21 compute-0 python3.9[182854]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:08:21 compute-0 sudo[182852]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:22 compute-0 sudo[183004]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-geidpayggsxgdycxbnmytiisywgreoyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266501.9486985-3341-246369574776897/AnsiballZ_file.py'
Sep 30 21:08:22 compute-0 sudo[183004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:22 compute-0 python3.9[183006]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:08:22 compute-0 sudo[183004]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:23 compute-0 sudo[183156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slitpywrwqtgtfldqisyjmoyfgskqhbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266503.0830214-3515-269377436854864/AnsiballZ_command.py'
Sep 30 21:08:23 compute-0 sudo[183156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:23 compute-0 python3.9[183158]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:08:23 compute-0 sudo[183156]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:24 compute-0 python3.9[183310]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Sep 30 21:08:25 compute-0 sudo[183460]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfjvafcbscfqyezmohynbrkqkskgvqmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266505.0177352-3569-171471567226689/AnsiballZ_systemd_service.py'
Sep 30 21:08:25 compute-0 sudo[183460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:25 compute-0 python3.9[183462]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 21:08:25 compute-0 systemd[1]: Reloading.
Sep 30 21:08:25 compute-0 systemd-sysv-generator[183492]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:08:25 compute-0 systemd-rc-local-generator[183488]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:08:25 compute-0 sudo[183460]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:26 compute-0 sudo[183646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arhvyqhakanmjrfljctzhpssjtvshpuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266506.4183908-3593-197084433548150/AnsiballZ_command.py'
Sep 30 21:08:26 compute-0 sudo[183646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:26 compute-0 python3.9[183648]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:08:26 compute-0 sudo[183646]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:27 compute-0 sudo[183799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfzohkgludpludwcozwdgirycfnahjzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266507.1652145-3593-85570260006784/AnsiballZ_command.py'
Sep 30 21:08:27 compute-0 sudo[183799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:27 compute-0 python3.9[183801]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:08:27 compute-0 sudo[183799]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:28 compute-0 sudo[183952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsnpfenwsxvtxiodkaslegvsyozyxqfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266507.8784611-3593-224752608693605/AnsiballZ_command.py'
Sep 30 21:08:28 compute-0 sudo[183952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:28 compute-0 python3.9[183954]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:08:28 compute-0 sudo[183952]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:28 compute-0 sudo[184105]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrnglpeivoytaidquoaqztmbygodykty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266508.6731417-3593-149232723007206/AnsiballZ_command.py'
Sep 30 21:08:28 compute-0 sudo[184105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:29 compute-0 python3.9[184107]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:08:29 compute-0 sudo[184105]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:29 compute-0 sudo[184258]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybmshcfaanakkkfdwgggmdhjdgvtfyye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266509.3890069-3593-76756098815902/AnsiballZ_command.py'
Sep 30 21:08:29 compute-0 sudo[184258]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:29 compute-0 python3.9[184260]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:08:29 compute-0 sudo[184258]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:30 compute-0 sudo[184411]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smdtyopfgmrtobxtjkjrjfmobgugeeja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266510.0914586-3593-133337052786608/AnsiballZ_command.py'
Sep 30 21:08:30 compute-0 sudo[184411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:30 compute-0 python3.9[184413]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:08:30 compute-0 sudo[184411]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:30 compute-0 podman[184415]: 2025-09-30 21:08:30.724457993 +0000 UTC m=+0.053489453 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible)
Sep 30 21:08:31 compute-0 sudo[184585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyzdjylodqsyklxxcfwqtjfqdpzuyjei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266510.8392851-3593-277547845393923/AnsiballZ_command.py'
Sep 30 21:08:31 compute-0 sudo[184585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:31 compute-0 python3.9[184587]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:08:31 compute-0 sudo[184585]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:31 compute-0 sudo[184738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-meewugywntyrtprvpfnjpbdgwnmsukkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266511.509165-3593-236091368366632/AnsiballZ_command.py'
Sep 30 21:08:31 compute-0 sudo[184738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:31 compute-0 python3.9[184740]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:08:32 compute-0 sudo[184738]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:34 compute-0 sudo[184907]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogiwldlnktmgpusmmiguhzrxcfhvwaox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266513.9269655-3800-90255131164011/AnsiballZ_file.py'
Sep 30 21:08:34 compute-0 sudo[184907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:34 compute-0 podman[184865]: 2025-09-30 21:08:34.311100549 +0000 UTC m=+0.082632069 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250923, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 21:08:34 compute-0 python3.9[184912]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:08:34 compute-0 sudo[184907]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:34 compute-0 sudo[185062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdydghoykfadivbdmigsfqugwxwidyog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266514.6529474-3800-8090935467178/AnsiballZ_file.py'
Sep 30 21:08:34 compute-0 sudo[185062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:35 compute-0 python3.9[185064]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:08:35 compute-0 sudo[185062]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:35 compute-0 sudo[185214]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofcwlvyqrawoxbazwecqtnmfjxjouuyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266515.3437188-3800-112357911126185/AnsiballZ_file.py'
Sep 30 21:08:35 compute-0 sudo[185214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:35 compute-0 python3.9[185216]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:08:35 compute-0 sudo[185214]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:36 compute-0 sudo[185366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwynwfsaipwagwksvcvfoqksrkvtyidv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266516.0869446-3866-210858811834767/AnsiballZ_file.py'
Sep 30 21:08:36 compute-0 sudo[185366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:36 compute-0 python3.9[185368]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:08:36 compute-0 sudo[185366]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:37 compute-0 sudo[185518]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyfnfaguzvrhespbtonrvwysshytwhqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266516.8269587-3866-138130305361511/AnsiballZ_file.py'
Sep 30 21:08:37 compute-0 sudo[185518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:37 compute-0 python3.9[185520]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:08:37 compute-0 sudo[185518]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:37 compute-0 sudo[185670]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptvztdfznkffweirffcxcqwfypmsahxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266517.5178893-3866-129103053225849/AnsiballZ_file.py'
Sep 30 21:08:37 compute-0 sudo[185670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:38 compute-0 python3.9[185672]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:08:38 compute-0 sudo[185670]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:38 compute-0 sudo[185822]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbvzzzttvtjfkhhrhzldzvgaelwodjth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266518.2904043-3866-15443924380970/AnsiballZ_file.py'
Sep 30 21:08:38 compute-0 sudo[185822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:08:38.710 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:08:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:08:38.711 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:08:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:08:38.711 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:08:38 compute-0 python3.9[185824]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:08:38 compute-0 sudo[185822]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:39 compute-0 sudo[185996]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbswyodozxidsrbmmpnvkmoycsgllsyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266519.0199327-3866-15613602310192/AnsiballZ_file.py'
Sep 30 21:08:39 compute-0 sudo[185996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:39 compute-0 podman[185937]: 2025-09-30 21:08:39.36604691 +0000 UTC m=+0.105330298 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20250923, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:08:39 compute-0 python3.9[186002]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:08:39 compute-0 sudo[185996]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:40 compute-0 sudo[186152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmjjyjmlmyebmlbywzhnqsgudxildmdb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266519.737974-3866-276386132508007/AnsiballZ_file.py'
Sep 30 21:08:40 compute-0 sudo[186152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:40 compute-0 python3.9[186154]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:08:40 compute-0 sudo[186152]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:40 compute-0 sudo[186304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bozjbucgbxackhdqewxcxzwmlwrmddyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266520.469409-3866-185036935050836/AnsiballZ_file.py'
Sep 30 21:08:40 compute-0 sudo[186304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:40 compute-0 python3.9[186306]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:08:40 compute-0 sudo[186304]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:41 compute-0 sudo[186456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igtjdqvcxbrlfaqvvktkcvlfojcberzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266521.1194825-3866-259065967213967/AnsiballZ_file.py'
Sep 30 21:08:41 compute-0 sudo[186456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:41 compute-0 python3.9[186458]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:08:41 compute-0 sudo[186456]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:42 compute-0 sudo[186608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omtgwnxfyuayamxssdssqpvhdvkmzgii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266521.8870106-3866-161704250526833/AnsiballZ_file.py'
Sep 30 21:08:42 compute-0 sudo[186608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:42 compute-0 python3.9[186610]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:08:42 compute-0 sudo[186608]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:43 compute-0 unix_chkpwd[186637]: password check failed for user (root)
Sep 30 21:08:43 compute-0 sshd-session[186635]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=45.81.23.80  user=root
Sep 30 21:08:44 compute-0 sshd[128205]: drop connection #2 from [49.64.169.153]:53794 on [38.102.83.69]:22 penalty: exceeded LoginGraceTime
Sep 30 21:08:44 compute-0 podman[186638]: 2025-09-30 21:08:44.364899054 +0000 UTC m=+0.091070293 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent)
Sep 30 21:08:45 compute-0 sshd-session[186635]: Failed password for root from 45.81.23.80 port 36766 ssh2
Sep 30 21:08:45 compute-0 sshd-session[186635]: Received disconnect from 45.81.23.80 port 36766:11: Bye Bye [preauth]
Sep 30 21:08:45 compute-0 sshd-session[186635]: Disconnected from authenticating user root 45.81.23.80 port 36766 [preauth]
Sep 30 21:08:47 compute-0 sudo[186783]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-luoktbmzlublpnuzaueehvicmorcbuec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266527.1895573-4213-36555076952700/AnsiballZ_getent.py'
Sep 30 21:08:47 compute-0 sudo[186783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:47 compute-0 python3.9[186785]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Sep 30 21:08:47 compute-0 sudo[186783]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:48 compute-0 sudo[186936]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sytxpndkiscdbutbzewbgkhshlqoglcp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266528.182517-4237-223765281341395/AnsiballZ_group.py'
Sep 30 21:08:48 compute-0 sudo[186936]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:48 compute-0 python3.9[186938]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Sep 30 21:08:48 compute-0 groupadd[186939]: group added to /etc/group: name=nova, GID=42436
Sep 30 21:08:48 compute-0 groupadd[186939]: group added to /etc/gshadow: name=nova
Sep 30 21:08:48 compute-0 groupadd[186939]: new group: name=nova, GID=42436
Sep 30 21:08:49 compute-0 sudo[186936]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:49 compute-0 sudo[187094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqpcgwszkcuakyglglhruhtbpqiutqra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266529.361375-4261-162231407199300/AnsiballZ_user.py'
Sep 30 21:08:49 compute-0 sudo[187094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:50 compute-0 python3.9[187096]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Sep 30 21:08:50 compute-0 useradd[187098]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Sep 30 21:08:50 compute-0 useradd[187098]: add 'nova' to group 'libvirt'
Sep 30 21:08:50 compute-0 useradd[187098]: add 'nova' to shadow group 'libvirt'
Sep 30 21:08:50 compute-0 sudo[187094]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:51 compute-0 sshd-session[187129]: Accepted publickey for zuul from 192.168.122.30 port 32876 ssh2: ECDSA SHA256:SmCicXXyU0CyMnob1MNtb+B3Td3Ord5lbeuM/VGGA5o
Sep 30 21:08:51 compute-0 systemd-logind[792]: New session 26 of user zuul.
Sep 30 21:08:51 compute-0 systemd[1]: Started Session 26 of User zuul.
Sep 30 21:08:51 compute-0 sshd-session[187129]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 21:08:51 compute-0 sshd-session[187132]: Received disconnect from 192.168.122.30 port 32876:11: disconnected by user
Sep 30 21:08:51 compute-0 sshd-session[187132]: Disconnected from user zuul 192.168.122.30 port 32876
Sep 30 21:08:51 compute-0 sshd-session[187129]: pam_unix(sshd:session): session closed for user zuul
Sep 30 21:08:51 compute-0 systemd[1]: session-26.scope: Deactivated successfully.
Sep 30 21:08:51 compute-0 systemd-logind[792]: Session 26 logged out. Waiting for processes to exit.
Sep 30 21:08:51 compute-0 systemd-logind[792]: Removed session 26.
Sep 30 21:08:52 compute-0 python3.9[187282]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:08:52 compute-0 python3.9[187403]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759266531.616073-4336-26223425840199/.source.json follow=False _original_basename=config.json.j2 checksum=2c2474b5f24ef7c9ed37f49680082593e0d1100b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:08:53 compute-0 python3.9[187553]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:08:53 compute-0 python3.9[187629]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:08:54 compute-0 python3.9[187779]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:08:55 compute-0 python3.9[187900]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759266533.9924817-4336-57309419164300/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:08:55 compute-0 python3.9[188050]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:08:56 compute-0 python3.9[188171]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759266535.2611082-4336-3797145071844/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=1feba546d0beacad9258164ab79b8a747685ccc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:08:56 compute-0 python3.9[188321]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:08:57 compute-0 python3.9[188442]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759266536.4411-4336-38652466001011/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:08:58 compute-0 sudo[188592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mttuikkvupkpskannfklgofsfbgygath ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266538.3681707-4543-177774033621787/AnsiballZ_file.py'
Sep 30 21:08:58 compute-0 sudo[188592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:58 compute-0 python3.9[188594]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:08:58 compute-0 sudo[188592]: pam_unix(sudo:session): session closed for user root
Sep 30 21:08:59 compute-0 sudo[188744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szoozbeqsondfpitrlyhfldmgovhbnav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266539.1832094-4567-252854854212052/AnsiballZ_copy.py'
Sep 30 21:08:59 compute-0 sudo[188744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:08:59 compute-0 python3.9[188746]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:08:59 compute-0 sudo[188744]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:00 compute-0 sudo[188896]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fptqdrauppdqlagroskvwsojjhvbotto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266539.9933207-4591-20492041572330/AnsiballZ_stat.py'
Sep 30 21:09:00 compute-0 sudo[188896]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:00 compute-0 python3.9[188898]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:09:00 compute-0 sudo[188896]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:01 compute-0 sudo[189065]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btdghnzkkmorndrfoefhzxhyuxlrvqgy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266540.808637-4615-159763285283551/AnsiballZ_stat.py'
Sep 30 21:09:01 compute-0 sudo[189065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:01 compute-0 podman[189022]: 2025-09-30 21:09:01.174206171 +0000 UTC m=+0.068445546 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 21:09:01 compute-0 python3.9[189070]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:09:01 compute-0 sudo[189065]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:01 compute-0 sudo[189191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvtscxznivhfrxbzvkdtbhkikeuadana ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266540.808637-4615-159763285283551/AnsiballZ_copy.py'
Sep 30 21:09:01 compute-0 sudo[189191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:02 compute-0 python3.9[189193]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1759266540.808637-4615-159763285283551/.source _original_basename=.an_2d5cv follow=False checksum=1ff2bf3af1349ca342e8f5aebfc9c982dd7af3f1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Sep 30 21:09:02 compute-0 sudo[189191]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:02 compute-0 python3.9[189345]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:09:03 compute-0 python3.9[189497]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:09:04 compute-0 python3.9[189618]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759266543.3445451-4693-259070533801931/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=f022386746472553146d29f689b545df70fa8a60 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:09:05 compute-0 podman[189742]: 2025-09-30 21:09:05.057524199 +0000 UTC m=+0.082336262 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:09:05 compute-0 python3.9[189787]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:09:05 compute-0 python3.9[189909]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759266544.7046776-4738-271939953085503/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:09:06 compute-0 sudo[190059]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjviygsbbvjxiztgzjxxhiaotijtbzzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266546.4576287-4789-144590272552941/AnsiballZ_container_config_data.py'
Sep 30 21:09:06 compute-0 sudo[190059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:07 compute-0 python3.9[190061]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Sep 30 21:09:07 compute-0 sudo[190059]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:07 compute-0 sudo[190211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqqxkzwjdxxhhvhnzyqqfnanpaxuummz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266547.4180343-4816-179966479165859/AnsiballZ_container_config_hash.py'
Sep 30 21:09:07 compute-0 sudo[190211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:07 compute-0 python3.9[190213]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Sep 30 21:09:07 compute-0 sudo[190211]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:08 compute-0 sudo[190363]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypdywndzpeozaiqnifonesxcpibuwqmt ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759266548.4033883-4846-184973459250247/AnsiballZ_edpm_container_manage.py'
Sep 30 21:09:08 compute-0 sudo[190363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:09 compute-0 python3[190365]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Sep 30 21:09:09 compute-0 podman[190403]: 2025-09-30 21:09:09.293488255 +0000 UTC m=+0.070024455 container create 5c60e54aa935cb59abd40139e38e8f8638da47f4b8017ca0abadc00f15ad51bf (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, container_name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Sep 30 21:09:09 compute-0 podman[190403]: 2025-09-30 21:09:09.255589318 +0000 UTC m=+0.032125608 image pull 613e2b735827096139e990f475c5ac5de0e55d8048941a4521c0c17a4351e975 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Sep 30 21:09:09 compute-0 python3[190365]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Sep 30 21:09:09 compute-0 sudo[190363]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:09 compute-0 sudo[190602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pigkqruyefrcjogpchzjowmuiximajwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266549.6649485-4870-142553675669542/AnsiballZ_stat.py'
Sep 30 21:09:09 compute-0 sudo[190602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:10 compute-0 podman[190565]: 2025-09-30 21:09:10.007662675 +0000 UTC m=+0.094178399 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:09:10 compute-0 python3.9[190614]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:09:10 compute-0 sudo[190602]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:11 compute-0 sudo[190771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrpndmfsdiajwhpoxvkmnobwdjwtyxxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266550.8672688-4906-95527373056407/AnsiballZ_container_config_data.py'
Sep 30 21:09:11 compute-0 sudo[190771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:11 compute-0 python3.9[190773]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Sep 30 21:09:11 compute-0 sudo[190771]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:12 compute-0 sudo[190923]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrcfmfphodxbmphczklvaljzgjemrofo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266551.8548043-4933-395121805997/AnsiballZ_container_config_hash.py'
Sep 30 21:09:12 compute-0 sudo[190923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:12 compute-0 python3.9[190925]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Sep 30 21:09:12 compute-0 sudo[190923]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:13 compute-0 sudo[191075]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcucrzwifytgclvmoyeplreazsovawvv ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759266552.8816695-4963-257795188117249/AnsiballZ_edpm_container_manage.py'
Sep 30 21:09:13 compute-0 sudo[191075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:13 compute-0 python3[191077]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Sep 30 21:09:13 compute-0 podman[191115]: 2025-09-30 21:09:13.65891002 +0000 UTC m=+0.068468127 container create 765bec2345aca3eba39d61c91d950df21dd8aef7456f8a6ac2a573d89806a810 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=nova_compute, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Sep 30 21:09:13 compute-0 podman[191115]: 2025-09-30 21:09:13.620695126 +0000 UTC m=+0.030253313 image pull 613e2b735827096139e990f475c5ac5de0e55d8048941a4521c0c17a4351e975 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Sep 30 21:09:13 compute-0 python3[191077]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Sep 30 21:09:13 compute-0 sudo[191075]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:14 compute-0 sudo[191319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqfhkereefajuqrgrpdsflgasgqgievc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266554.4090197-4987-91396171632402/AnsiballZ_stat.py'
Sep 30 21:09:14 compute-0 sudo[191319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:14 compute-0 podman[191278]: 2025-09-30 21:09:14.795716221 +0000 UTC m=+0.075881196 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:09:14 compute-0 python3.9[191325]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:09:15 compute-0 sudo[191319]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:15 compute-0 sudo[191477]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eympyhsxgkpbyiaqgmbrotabouemzexy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266555.3734167-5014-141751813483562/AnsiballZ_file.py'
Sep 30 21:09:15 compute-0 sudo[191477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:15 compute-0 python3.9[191479]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:09:15 compute-0 sudo[191477]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:16 compute-0 sudo[191628]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tejuwrnihxcpdakoczdksqrqnceuwfxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266556.0159202-5014-91413147904376/AnsiballZ_copy.py'
Sep 30 21:09:16 compute-0 sudo[191628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:16 compute-0 python3.9[191630]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759266556.0159202-5014-91413147904376/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:09:16 compute-0 sudo[191628]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:17 compute-0 sudo[191704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snzgsrkncqdualsowrxtlyaoedkgymvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266556.0159202-5014-91413147904376/AnsiballZ_systemd.py'
Sep 30 21:09:17 compute-0 sudo[191704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:17 compute-0 python3.9[191706]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 21:09:17 compute-0 systemd[1]: Reloading.
Sep 30 21:09:17 compute-0 systemd-rc-local-generator[191733]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:09:17 compute-0 systemd-sysv-generator[191736]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:09:17 compute-0 sudo[191704]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:17 compute-0 sudo[191814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nizgxfuocpqsxxqykjetfseordqejjim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266556.0159202-5014-91413147904376/AnsiballZ_systemd.py'
Sep 30 21:09:17 compute-0 sudo[191814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:18 compute-0 python3.9[191816]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:09:18 compute-0 systemd[1]: Reloading.
Sep 30 21:09:18 compute-0 systemd-rc-local-generator[191847]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:09:18 compute-0 systemd-sysv-generator[191851]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:09:18 compute-0 systemd[1]: Starting nova_compute container...
Sep 30 21:09:18 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:09:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9ef75ad3dba48eea112675faeafa1a4fb9dd56617560b47a7d927c59aa6af760/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Sep 30 21:09:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9ef75ad3dba48eea112675faeafa1a4fb9dd56617560b47a7d927c59aa6af760/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Sep 30 21:09:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9ef75ad3dba48eea112675faeafa1a4fb9dd56617560b47a7d927c59aa6af760/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Sep 30 21:09:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9ef75ad3dba48eea112675faeafa1a4fb9dd56617560b47a7d927c59aa6af760/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Sep 30 21:09:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9ef75ad3dba48eea112675faeafa1a4fb9dd56617560b47a7d927c59aa6af760/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Sep 30 21:09:18 compute-0 podman[191856]: 2025-09-30 21:09:18.82010397 +0000 UTC m=+0.127003002 container init 765bec2345aca3eba39d61c91d950df21dd8aef7456f8a6ac2a573d89806a810 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.license=GPLv2, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Sep 30 21:09:18 compute-0 podman[191856]: 2025-09-30 21:09:18.832406308 +0000 UTC m=+0.139305310 container start 765bec2345aca3eba39d61c91d950df21dd8aef7456f8a6ac2a573d89806a810 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=nova_compute, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3)
Sep 30 21:09:18 compute-0 podman[191856]: nova_compute
Sep 30 21:09:18 compute-0 nova_compute[191871]: + sudo -E kolla_set_configs
Sep 30 21:09:18 compute-0 systemd[1]: Started nova_compute container.
Sep 30 21:09:18 compute-0 sudo[191814]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:18 compute-0 nova_compute[191871]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Sep 30 21:09:18 compute-0 nova_compute[191871]: INFO:__main__:Validating config file
Sep 30 21:09:18 compute-0 nova_compute[191871]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Sep 30 21:09:18 compute-0 nova_compute[191871]: INFO:__main__:Copying service configuration files
Sep 30 21:09:18 compute-0 nova_compute[191871]: INFO:__main__:Deleting /etc/nova/nova.conf
Sep 30 21:09:18 compute-0 nova_compute[191871]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Sep 30 21:09:18 compute-0 nova_compute[191871]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Sep 30 21:09:18 compute-0 nova_compute[191871]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Sep 30 21:09:18 compute-0 nova_compute[191871]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Sep 30 21:09:18 compute-0 nova_compute[191871]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Sep 30 21:09:18 compute-0 nova_compute[191871]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Sep 30 21:09:18 compute-0 nova_compute[191871]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Sep 30 21:09:18 compute-0 nova_compute[191871]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Sep 30 21:09:18 compute-0 nova_compute[191871]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Sep 30 21:09:18 compute-0 nova_compute[191871]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Sep 30 21:09:18 compute-0 nova_compute[191871]: INFO:__main__:Deleting /etc/ceph
Sep 30 21:09:18 compute-0 nova_compute[191871]: INFO:__main__:Creating directory /etc/ceph
Sep 30 21:09:18 compute-0 nova_compute[191871]: INFO:__main__:Setting permission for /etc/ceph
Sep 30 21:09:18 compute-0 nova_compute[191871]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Sep 30 21:09:18 compute-0 nova_compute[191871]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Sep 30 21:09:18 compute-0 nova_compute[191871]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Sep 30 21:09:18 compute-0 nova_compute[191871]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Sep 30 21:09:18 compute-0 nova_compute[191871]: INFO:__main__:Writing out command to execute
Sep 30 21:09:18 compute-0 nova_compute[191871]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Sep 30 21:09:18 compute-0 nova_compute[191871]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Sep 30 21:09:18 compute-0 nova_compute[191871]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Sep 30 21:09:18 compute-0 nova_compute[191871]: ++ cat /run_command
Sep 30 21:09:18 compute-0 nova_compute[191871]: + CMD=nova-compute
Sep 30 21:09:18 compute-0 nova_compute[191871]: + ARGS=
Sep 30 21:09:18 compute-0 nova_compute[191871]: + sudo kolla_copy_cacerts
Sep 30 21:09:18 compute-0 nova_compute[191871]: + [[ ! -n '' ]]
Sep 30 21:09:18 compute-0 nova_compute[191871]: + . kolla_extend_start
Sep 30 21:09:18 compute-0 nova_compute[191871]: Running command: 'nova-compute'
Sep 30 21:09:18 compute-0 nova_compute[191871]: + echo 'Running command: '\''nova-compute'\'''
Sep 30 21:09:18 compute-0 nova_compute[191871]: + umask 0022
Sep 30 21:09:18 compute-0 nova_compute[191871]: + exec nova-compute
Sep 30 21:09:20 compute-0 python3.9[192033]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:09:20 compute-0 nova_compute[191871]: 2025-09-30 21:09:20.860 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Sep 30 21:09:20 compute-0 nova_compute[191871]: 2025-09-30 21:09:20.860 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Sep 30 21:09:20 compute-0 nova_compute[191871]: 2025-09-30 21:09:20.861 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Sep 30 21:09:20 compute-0 nova_compute[191871]: 2025-09-30 21:09:20.861 2 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Sep 30 21:09:20 compute-0 nova_compute[191871]: 2025-09-30 21:09:20.982 2 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.006 2 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.535 2 INFO nova.virt.driver [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Sep 30 21:09:21 compute-0 python3.9[192187]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.650 2 INFO nova.compute.provider_config [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.666 2 DEBUG oslo_concurrency.lockutils [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.667 2 DEBUG oslo_concurrency.lockutils [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.667 2 DEBUG oslo_concurrency.lockutils [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.667 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.667 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.667 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.668 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.668 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.668 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.668 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.668 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.668 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.668 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.669 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.669 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.669 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.669 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.669 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.669 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.669 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.670 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.670 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.670 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.670 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.670 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.670 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.670 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.671 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.671 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.671 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.671 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.671 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.671 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.672 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.672 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.672 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.672 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.672 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.672 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.673 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.673 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.673 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.673 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.673 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.673 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.674 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.674 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.674 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.674 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.674 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.674 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.675 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.675 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.675 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.675 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.675 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.675 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.675 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.676 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.676 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.676 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.676 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.676 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.676 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.676 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.677 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.677 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.677 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.677 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.677 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.677 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.677 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.677 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.678 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.678 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.678 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.678 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.678 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.679 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.679 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.679 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.679 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.679 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.679 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.679 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.680 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.680 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.680 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.680 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.680 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.680 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.681 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.681 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.681 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.681 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.681 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.681 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.682 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.682 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.682 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.682 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.682 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.682 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.682 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.683 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.683 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.683 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.683 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.683 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.683 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.683 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.684 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.684 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.684 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.684 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.684 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.684 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.684 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.685 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.685 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.685 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.685 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.685 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.685 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.685 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.686 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.686 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.686 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.686 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.686 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.686 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.687 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.687 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.687 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.687 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.687 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.687 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.687 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.688 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.688 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.688 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.688 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.688 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.688 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.688 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.689 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.689 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.689 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.689 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.689 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.689 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.689 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.690 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.690 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.690 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.690 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.690 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.690 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.690 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.691 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.691 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.691 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.691 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.691 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.691 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.691 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.692 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.692 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.692 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.692 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.692 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.692 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.692 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.693 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.693 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.693 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.693 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.693 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.693 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.693 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.694 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.694 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.694 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.694 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.694 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.694 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.695 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.695 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.695 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.695 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.695 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.695 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.695 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.696 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.696 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.696 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.696 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.696 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.696 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.696 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.697 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.697 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.697 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.697 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.697 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.697 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.697 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.697 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.698 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.698 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.698 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.698 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.698 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.698 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.698 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.699 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.699 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.699 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.699 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.699 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.699 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.699 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.700 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.700 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.700 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.700 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.700 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.700 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.700 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.700 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.701 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.701 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.701 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.701 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.701 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.701 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.701 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.702 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.702 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.702 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.702 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.702 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.702 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.702 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.703 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.703 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.703 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.703 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.703 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.703 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.703 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.703 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.704 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.704 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.704 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.704 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.704 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.704 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.704 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.705 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.705 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.705 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.705 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.705 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.705 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.705 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.706 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.706 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.706 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.706 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.706 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.706 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.706 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.707 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.707 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.707 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.707 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.707 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.707 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.707 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.708 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.708 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.708 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.708 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.708 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.708 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.708 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.709 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.709 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.709 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.709 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.709 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.709 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.709 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.710 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.710 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.710 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.710 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.710 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.710 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.710 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.711 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.711 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.711 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.711 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.711 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.711 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.711 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.712 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.712 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.712 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.712 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.712 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.712 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.713 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.713 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.713 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.713 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.713 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.713 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.713 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.714 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.714 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.714 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.714 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.714 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.714 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.715 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.715 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.715 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.715 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.715 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.715 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.715 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.716 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.716 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.716 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.716 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.716 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.716 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.717 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.717 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.717 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.717 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.717 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.717 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.718 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.718 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.718 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.718 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.718 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.718 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.718 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.719 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.719 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.719 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.719 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.719 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.720 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.720 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.720 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.720 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.720 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.720 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.721 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.721 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.721 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.721 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.721 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.721 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.722 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.722 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.722 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.722 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.722 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.722 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.722 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.723 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.723 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.723 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.723 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.723 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.723 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.723 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.723 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.724 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.724 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.724 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.724 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.724 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.724 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.724 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.725 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.725 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.725 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.725 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.725 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.725 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.725 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.725 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.726 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.726 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.726 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.726 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.726 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.726 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.727 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.727 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.727 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.727 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.727 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.727 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.727 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.728 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.728 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.728 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.728 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.728 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.728 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.728 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.728 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.729 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.729 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.729 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.729 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.729 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.729 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.730 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.730 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.730 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.730 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.730 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.730 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.730 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.731 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.731 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.731 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.731 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.731 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.731 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.731 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.732 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.732 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.732 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.732 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.732 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.732 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.732 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.733 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.733 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.733 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.733 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.733 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.733 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.733 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.734 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.734 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.734 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.734 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.734 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.734 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.735 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.735 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.735 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.735 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.735 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.735 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.736 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.736 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.736 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.736 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.736 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.736 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.737 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.737 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.737 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.737 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.737 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.737 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.738 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.738 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.738 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.738 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.738 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.739 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.739 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.739 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.739 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.739 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.740 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.740 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.740 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.740 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.740 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.741 2 WARNING oslo_config.cfg [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Sep 30 21:09:21 compute-0 nova_compute[191871]: live_migration_uri is deprecated for removal in favor of two other options that
Sep 30 21:09:21 compute-0 nova_compute[191871]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Sep 30 21:09:21 compute-0 nova_compute[191871]: and ``live_migration_inbound_addr`` respectively.
Sep 30 21:09:21 compute-0 nova_compute[191871]: ).  Its value may be silently ignored in the future.
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.741 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.741 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.741 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.741 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.742 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.742 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.742 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.742 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.742 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.742 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.743 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.743 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.743 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.743 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.743 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.743 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.743 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.744 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.744 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.744 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.744 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.744 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.744 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.744 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.745 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.745 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.745 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.745 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.745 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.745 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.745 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.746 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.746 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.746 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.746 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.746 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.746 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.747 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.747 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.747 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.747 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.747 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.747 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.747 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.748 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.748 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.748 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.748 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.748 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.748 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.748 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.749 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.749 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.749 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.749 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.749 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.749 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.749 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.750 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.750 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.750 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.750 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.750 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.750 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.750 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.751 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.751 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.751 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.751 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.751 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.751 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.751 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.751 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.752 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.752 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.752 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.752 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.752 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.752 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.752 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.753 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.753 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.753 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.753 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.753 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.753 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.754 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.754 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.754 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.754 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.754 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.754 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.754 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.755 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.755 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.755 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.755 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.755 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.755 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.756 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.756 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.756 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.756 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.756 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.756 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.757 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.757 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.757 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.757 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.757 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.757 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.757 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.757 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.758 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.758 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.758 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.758 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.758 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.758 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.758 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.759 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.759 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.759 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.759 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.759 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.759 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.759 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.760 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.760 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.760 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.760 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.760 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.760 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.760 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.761 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.761 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.761 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.761 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.761 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.761 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.762 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.762 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.762 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.762 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.762 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.762 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.762 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.763 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.763 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.763 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.763 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.763 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.763 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.763 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.764 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.764 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.764 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.764 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.764 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.764 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.765 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.765 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.765 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.765 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.765 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.765 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.765 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.766 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.766 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.766 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.766 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.766 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.766 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.766 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.767 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.767 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.767 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.767 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.767 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.767 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.768 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.768 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.768 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.768 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.768 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.768 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.768 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.769 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.769 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.769 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.769 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.769 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.769 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.769 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.769 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.770 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.770 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.770 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.770 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.770 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.770 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.771 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.771 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.771 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.771 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.771 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.771 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.771 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.772 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.772 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.772 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.772 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.772 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.772 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.772 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.773 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.773 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.773 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.773 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.773 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.773 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.773 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.774 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.774 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.774 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.774 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.774 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.774 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.774 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.775 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.775 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.775 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.775 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.775 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.775 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.775 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.776 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.776 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.776 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.776 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.776 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.776 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.776 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.777 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.777 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.777 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.777 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.777 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.777 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.778 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.778 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.778 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.778 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.778 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.779 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.779 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.779 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.779 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.779 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.779 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.779 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.780 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.780 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.780 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.780 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.780 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.780 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.780 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.781 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.781 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.781 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.781 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.781 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.781 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.781 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.781 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.782 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.782 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.782 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.782 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.782 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.783 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.783 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.783 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.783 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.783 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.783 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.783 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.784 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.784 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.784 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.784 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.784 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.784 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.784 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.785 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.785 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.785 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.785 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.785 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.785 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.785 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.786 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.786 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.786 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.786 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.786 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.786 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.787 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.787 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.787 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.787 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.787 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.787 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.787 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.788 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.788 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.788 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.788 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.788 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.788 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.788 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.789 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.789 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.789 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.789 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.789 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.789 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.790 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.790 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.790 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.790 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.790 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.790 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.790 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.791 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.791 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.791 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.791 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.791 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.791 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.791 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.792 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.792 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.792 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.792 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.792 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.792 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.792 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.793 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.793 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.793 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.793 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.793 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.793 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.793 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.794 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.794 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.794 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.794 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.794 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.794 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.794 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.795 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.795 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.795 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.795 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.795 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.795 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.795 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.795 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.796 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.796 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.796 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.796 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.796 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.796 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.796 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.797 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.797 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.797 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.797 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.797 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.797 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.797 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.798 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.798 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.798 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.798 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.798 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.798 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.798 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.799 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.799 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.799 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.799 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.799 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.799 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.799 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.800 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.800 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.800 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.800 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.800 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.800 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.800 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.801 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.801 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.801 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.801 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.801 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.801 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.801 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.802 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.802 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.802 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.802 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.802 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.802 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.802 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.803 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.803 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.803 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.803 2 DEBUG oslo_service.service [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.804 2 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.822 2 DEBUG nova.virt.libvirt.host [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.823 2 DEBUG nova.virt.libvirt.host [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.823 2 DEBUG nova.virt.libvirt.host [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.823 2 DEBUG nova.virt.libvirt.host [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Sep 30 21:09:21 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Sep 30 21:09:21 compute-0 systemd[1]: Started libvirt QEMU daemon.
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.893 2 DEBUG nova.virt.libvirt.host [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f3b88578fd0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.897 2 DEBUG nova.virt.libvirt.host [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f3b88578fd0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.898 2 INFO nova.virt.libvirt.driver [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] Connection event '1' reason 'None'
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.931 2 WARNING nova.virt.libvirt.driver [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Sep 30 21:09:21 compute-0 nova_compute[191871]: 2025-09-30 21:09:21.932 2 DEBUG nova.virt.libvirt.volume.mount [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Sep 30 21:09:22 compute-0 python3.9[192389]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:09:22 compute-0 nova_compute[191871]: 2025-09-30 21:09:22.774 2 INFO nova.virt.libvirt.host [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] Libvirt host capabilities <capabilities>
Sep 30 21:09:22 compute-0 nova_compute[191871]: 
Sep 30 21:09:22 compute-0 nova_compute[191871]:   <host>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <uuid>5322d6ed-1aa0-443a-8ad6-efe9323295c5</uuid>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <cpu>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <arch>x86_64</arch>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model>EPYC-Rome-v4</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <vendor>AMD</vendor>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <microcode version='16777317'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <signature family='23' model='49' stepping='0'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <maxphysaddr mode='emulate' bits='40'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature name='x2apic'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature name='tsc-deadline'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature name='osxsave'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature name='hypervisor'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature name='tsc_adjust'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature name='spec-ctrl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature name='stibp'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature name='arch-capabilities'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature name='ssbd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature name='cmp_legacy'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature name='topoext'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature name='virt-ssbd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature name='lbrv'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature name='tsc-scale'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature name='vmcb-clean'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature name='pause-filter'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature name='pfthreshold'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature name='svme-addr-chk'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature name='rdctl-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature name='skip-l1dfl-vmentry'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature name='mds-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature name='pschange-mc-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <pages unit='KiB' size='4'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <pages unit='KiB' size='2048'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <pages unit='KiB' size='1048576'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </cpu>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <power_management>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <suspend_mem/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <suspend_disk/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <suspend_hybrid/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </power_management>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <iommu support='no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <migration_features>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <live/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <uri_transports>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <uri_transport>tcp</uri_transport>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <uri_transport>rdma</uri_transport>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </uri_transports>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </migration_features>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <topology>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <cells num='1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <cell id='0'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:           <memory unit='KiB'>7864108</memory>
Sep 30 21:09:22 compute-0 nova_compute[191871]:           <pages unit='KiB' size='4'>1966027</pages>
Sep 30 21:09:22 compute-0 nova_compute[191871]:           <pages unit='KiB' size='2048'>0</pages>
Sep 30 21:09:22 compute-0 nova_compute[191871]:           <pages unit='KiB' size='1048576'>0</pages>
Sep 30 21:09:22 compute-0 nova_compute[191871]:           <distances>
Sep 30 21:09:22 compute-0 nova_compute[191871]:             <sibling id='0' value='10'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:           </distances>
Sep 30 21:09:22 compute-0 nova_compute[191871]:           <cpus num='8'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:           </cpus>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         </cell>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </cells>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </topology>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <cache>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </cache>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <secmodel>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model>selinux</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <doi>0</doi>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </secmodel>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <secmodel>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model>dac</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <doi>0</doi>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <baselabel type='kvm'>+107:+107</baselabel>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <baselabel type='qemu'>+107:+107</baselabel>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </secmodel>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   </host>
Sep 30 21:09:22 compute-0 nova_compute[191871]: 
Sep 30 21:09:22 compute-0 nova_compute[191871]:   <guest>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <os_type>hvm</os_type>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <arch name='i686'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <wordsize>32</wordsize>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <domain type='qemu'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <domain type='kvm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </arch>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <features>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <pae/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <nonpae/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <acpi default='on' toggle='yes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <apic default='on' toggle='no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <cpuselection/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <deviceboot/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <disksnapshot default='on' toggle='no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <externalSnapshot/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </features>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   </guest>
Sep 30 21:09:22 compute-0 nova_compute[191871]: 
Sep 30 21:09:22 compute-0 nova_compute[191871]:   <guest>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <os_type>hvm</os_type>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <arch name='x86_64'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <wordsize>64</wordsize>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <domain type='qemu'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <domain type='kvm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </arch>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <features>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <acpi default='on' toggle='yes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <apic default='on' toggle='no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <cpuselection/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <deviceboot/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <disksnapshot default='on' toggle='no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <externalSnapshot/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </features>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   </guest>
Sep 30 21:09:22 compute-0 nova_compute[191871]: 
Sep 30 21:09:22 compute-0 nova_compute[191871]: </capabilities>
Sep 30 21:09:22 compute-0 nova_compute[191871]: 
Sep 30 21:09:22 compute-0 nova_compute[191871]: 2025-09-30 21:09:22.780 2 DEBUG nova.virt.libvirt.host [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Sep 30 21:09:22 compute-0 nova_compute[191871]: 2025-09-30 21:09:22.798 2 DEBUG nova.virt.libvirt.host [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Sep 30 21:09:22 compute-0 nova_compute[191871]: <domainCapabilities>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   <path>/usr/libexec/qemu-kvm</path>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   <domain>kvm</domain>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   <machine>pc-q35-rhel9.6.0</machine>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   <arch>i686</arch>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   <vcpu max='4096'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   <iothreads supported='yes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   <os supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <enum name='firmware'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <loader supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='type'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>rom</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>pflash</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='readonly'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>yes</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>no</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='secure'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>no</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </loader>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   </os>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   <cpu>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <mode name='host-passthrough' supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='hostPassthroughMigratable'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>on</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>off</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </mode>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <mode name='maximum' supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='maximumMigratable'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>on</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>off</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </mode>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <mode name='host-model' supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model fallback='forbid'>EPYC-Rome</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <vendor>AMD</vendor>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <maxphysaddr mode='passthrough' limit='40'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='x2apic'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='tsc-deadline'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='hypervisor'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='tsc_adjust'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='spec-ctrl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='stibp'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='arch-capabilities'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='ssbd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='cmp_legacy'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='overflow-recov'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='succor'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='ibrs'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='amd-ssbd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='virt-ssbd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='lbrv'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='tsc-scale'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='vmcb-clean'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='flushbyasid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='pause-filter'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='pfthreshold'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='svme-addr-chk'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='lfence-always-serializing'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='rdctl-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='mds-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='pschange-mc-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='gds-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='rfds-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='disable' name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </mode>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <mode name='custom' supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Broadwell'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Broadwell-IBRS'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Broadwell-noTSX'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Broadwell-noTSX-IBRS'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Broadwell-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Broadwell-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Broadwell-v3'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Broadwell-v4'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Cascadelake-Server'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Cascadelake-Server-noTSX'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Cascadelake-Server-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Cascadelake-Server-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Cascadelake-Server-v3'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Cascadelake-Server-v4'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Cascadelake-Server-v5'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Cooperlake'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Cooperlake-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Cooperlake-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Denverton'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='mpx'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Denverton-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='mpx'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Denverton-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Denverton-v3'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Dhyana-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='EPYC-Genoa'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amd-psfd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='auto-ibrs'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='no-nested-data-bp'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='null-sel-clr-base'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='stibp-always-on'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='EPYC-Genoa-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amd-psfd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='auto-ibrs'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='no-nested-data-bp'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='null-sel-clr-base'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='stibp-always-on'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='EPYC-Milan'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='EPYC-Milan-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='EPYC-Milan-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amd-psfd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='no-nested-data-bp'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='null-sel-clr-base'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='stibp-always-on'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='EPYC-Rome'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='EPYC-Rome-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='EPYC-Rome-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='EPYC-Rome-v3'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='EPYC-v3'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='EPYC-v4'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='GraniteRapids'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-fp16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='mcdt-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pbrsb-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='prefetchiti'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='GraniteRapids-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-fp16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='mcdt-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pbrsb-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='prefetchiti'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='GraniteRapids-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-fp16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx10'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx10-128'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx10-256'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx10-512'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='mcdt-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='movdir64b'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='movdiri'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pbrsb-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='prefetchiti'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ss'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Haswell'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Haswell-IBRS'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Haswell-noTSX'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Haswell-noTSX-IBRS'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Haswell-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Haswell-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Haswell-v3'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Haswell-v4'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Icelake-Server'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Icelake-Server-noTSX'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Icelake-Server-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Icelake-Server-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Icelake-Server-v3'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Icelake-Server-v4'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Icelake-Server-v5'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Icelake-Server-v6'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Icelake-Server-v7'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='IvyBridge'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='IvyBridge-IBRS'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='IvyBridge-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='IvyBridge-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='KnightsMill'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-4fmaps'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-4vnniw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512er'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512pf'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ss'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='KnightsMill-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-4fmaps'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-4vnniw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512er'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512pf'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ss'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Opteron_G4'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fma4'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xop'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Opteron_G4-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fma4'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xop'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Opteron_G5'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fma4'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='tbm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xop'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Opteron_G5-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fma4'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='tbm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xop'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='SapphireRapids'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='SapphireRapids-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='SapphireRapids-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='SapphireRapids-v3'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='movdir64b'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='movdiri'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ss'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='SierraForest'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-ne-convert'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-vnni-int8'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='cmpccxadd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='mcdt-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pbrsb-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='SierraForest-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-ne-convert'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-vnni-int8'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='cmpccxadd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='mcdt-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pbrsb-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Skylake-Client'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Skylake-Client-IBRS'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Skylake-Client-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Skylake-Client-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Skylake-Client-v3'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Skylake-Client-v4'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Skylake-Server'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Skylake-Server-IBRS'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Skylake-Server-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Skylake-Server-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Skylake-Server-v3'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Skylake-Server-v4'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Skylake-Server-v5'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Snowridge'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='core-capability'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='movdir64b'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='movdiri'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='mpx'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='split-lock-detect'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Snowridge-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='core-capability'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='movdir64b'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='movdiri'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='mpx'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='split-lock-detect'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Snowridge-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='core-capability'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='movdir64b'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='movdiri'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='split-lock-detect'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Snowridge-v3'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='core-capability'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='movdir64b'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='movdiri'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='split-lock-detect'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Snowridge-v4'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='movdir64b'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='movdiri'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='athlon'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='3dnow'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='3dnowext'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='athlon-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='3dnow'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='3dnowext'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='core2duo'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ss'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='core2duo-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ss'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='coreduo'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ss'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='coreduo-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ss'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='n270'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ss'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='n270-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ss'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='phenom'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='3dnow'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='3dnowext'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='phenom-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='3dnow'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='3dnowext'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </mode>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   </cpu>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   <memoryBacking supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <enum name='sourceType'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <value>file</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <value>anonymous</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <value>memfd</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   </memoryBacking>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   <devices>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <disk supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='diskDevice'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>disk</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>cdrom</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>floppy</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>lun</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='bus'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>fdc</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>scsi</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>virtio</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>usb</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>sata</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='model'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>virtio</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>virtio-transitional</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>virtio-non-transitional</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </disk>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <graphics supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='type'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>vnc</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>egl-headless</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>dbus</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </graphics>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <video supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='modelType'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>vga</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>cirrus</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>virtio</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>none</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>bochs</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>ramfb</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </video>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <hostdev supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='mode'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>subsystem</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='startupPolicy'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>default</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>mandatory</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>requisite</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>optional</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='subsysType'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>usb</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>pci</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>scsi</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='capsType'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='pciBackend'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </hostdev>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <rng supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='model'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>virtio</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>virtio-transitional</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>virtio-non-transitional</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='backendModel'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>random</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>egd</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>builtin</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </rng>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <filesystem supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='driverType'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>path</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>handle</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>virtiofs</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </filesystem>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <tpm supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='model'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>tpm-tis</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>tpm-crb</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='backendModel'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>emulator</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>external</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='backendVersion'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>2.0</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </tpm>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <redirdev supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='bus'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>usb</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </redirdev>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <channel supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='type'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>pty</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>unix</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </channel>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <crypto supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='model'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='type'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>qemu</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='backendModel'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>builtin</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </crypto>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <interface supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='backendType'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>default</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>passt</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </interface>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <panic supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='model'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>isa</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>hyperv</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </panic>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   </devices>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   <features>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <gic supported='no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <vmcoreinfo supported='yes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <genid supported='yes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <backingStoreInput supported='yes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <backup supported='yes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <async-teardown supported='yes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <ps2 supported='yes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <sev supported='no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <sgx supported='no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <hyperv supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='features'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>relaxed</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>vapic</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>spinlocks</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>vpindex</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>runtime</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>synic</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>stimer</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>reset</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>vendor_id</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>frequencies</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>reenlightenment</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>tlbflush</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>ipi</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>avic</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>emsr_bitmap</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>xmm_input</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </hyperv>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <launchSecurity supported='no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   </features>
Sep 30 21:09:22 compute-0 nova_compute[191871]: </domainCapabilities>
Sep 30 21:09:22 compute-0 nova_compute[191871]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Sep 30 21:09:22 compute-0 nova_compute[191871]: 2025-09-30 21:09:22.804 2 DEBUG nova.virt.libvirt.host [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Sep 30 21:09:22 compute-0 nova_compute[191871]: <domainCapabilities>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   <path>/usr/libexec/qemu-kvm</path>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   <domain>kvm</domain>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   <machine>pc-i440fx-rhel7.6.0</machine>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   <arch>i686</arch>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   <vcpu max='240'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   <iothreads supported='yes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   <os supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <enum name='firmware'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <loader supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='type'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>rom</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>pflash</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='readonly'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>yes</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>no</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='secure'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>no</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </loader>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   </os>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   <cpu>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <mode name='host-passthrough' supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='hostPassthroughMigratable'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>on</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>off</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </mode>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <mode name='maximum' supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='maximumMigratable'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>on</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>off</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </mode>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <mode name='host-model' supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model fallback='forbid'>EPYC-Rome</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <vendor>AMD</vendor>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <maxphysaddr mode='passthrough' limit='40'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='x2apic'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='tsc-deadline'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='hypervisor'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='tsc_adjust'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='spec-ctrl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='stibp'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='arch-capabilities'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='ssbd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='cmp_legacy'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='overflow-recov'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='succor'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='ibrs'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='amd-ssbd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='virt-ssbd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='lbrv'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='tsc-scale'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='vmcb-clean'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='flushbyasid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='pause-filter'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='pfthreshold'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='svme-addr-chk'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='lfence-always-serializing'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='rdctl-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='mds-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='pschange-mc-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='gds-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='rfds-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='disable' name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </mode>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <mode name='custom' supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Broadwell'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Broadwell-IBRS'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Broadwell-noTSX'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Broadwell-noTSX-IBRS'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Broadwell-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Broadwell-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Broadwell-v3'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Broadwell-v4'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Cascadelake-Server'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Cascadelake-Server-noTSX'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Cascadelake-Server-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Cascadelake-Server-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Cascadelake-Server-v3'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Cascadelake-Server-v4'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Cascadelake-Server-v5'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Cooperlake'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Cooperlake-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Cooperlake-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Denverton'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='mpx'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Denverton-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='mpx'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Denverton-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Denverton-v3'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Dhyana-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='EPYC-Genoa'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amd-psfd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='auto-ibrs'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='no-nested-data-bp'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='null-sel-clr-base'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='stibp-always-on'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='EPYC-Genoa-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amd-psfd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='auto-ibrs'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='no-nested-data-bp'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='null-sel-clr-base'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='stibp-always-on'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='EPYC-Milan'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='EPYC-Milan-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='EPYC-Milan-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amd-psfd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='no-nested-data-bp'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='null-sel-clr-base'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='stibp-always-on'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='EPYC-Rome'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='EPYC-Rome-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='EPYC-Rome-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='EPYC-Rome-v3'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='EPYC-v3'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='EPYC-v4'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='GraniteRapids'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-fp16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='mcdt-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pbrsb-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='prefetchiti'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='GraniteRapids-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-fp16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='mcdt-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pbrsb-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='prefetchiti'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='GraniteRapids-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-fp16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx10'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx10-128'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx10-256'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx10-512'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='mcdt-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='movdir64b'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='movdiri'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pbrsb-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='prefetchiti'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ss'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Haswell'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Haswell-IBRS'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Haswell-noTSX'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Haswell-noTSX-IBRS'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Haswell-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Haswell-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Haswell-v3'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Haswell-v4'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Icelake-Server'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Icelake-Server-noTSX'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Icelake-Server-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Icelake-Server-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Icelake-Server-v3'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Icelake-Server-v4'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Icelake-Server-v5'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Icelake-Server-v6'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Icelake-Server-v7'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='IvyBridge'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='IvyBridge-IBRS'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='IvyBridge-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='IvyBridge-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='KnightsMill'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-4fmaps'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-4vnniw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512er'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512pf'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ss'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='KnightsMill-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-4fmaps'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-4vnniw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512er'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512pf'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ss'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Opteron_G4'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fma4'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xop'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Opteron_G4-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fma4'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xop'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Opteron_G5'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fma4'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='tbm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xop'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Opteron_G5-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fma4'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='tbm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xop'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='SapphireRapids'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='SapphireRapids-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='SapphireRapids-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='SapphireRapids-v3'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='movdir64b'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='movdiri'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ss'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='SierraForest'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-ne-convert'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-vnni-int8'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='cmpccxadd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='mcdt-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pbrsb-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='SierraForest-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-ne-convert'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-vnni-int8'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='cmpccxadd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='mcdt-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pbrsb-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Skylake-Client'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Skylake-Client-IBRS'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Skylake-Client-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Skylake-Client-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Skylake-Client-v3'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Skylake-Client-v4'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Skylake-Server'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Skylake-Server-IBRS'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Skylake-Server-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Skylake-Server-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Skylake-Server-v3'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Skylake-Server-v4'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Skylake-Server-v5'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Snowridge'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='core-capability'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='movdir64b'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='movdiri'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='mpx'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='split-lock-detect'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Snowridge-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='core-capability'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='movdir64b'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='movdiri'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='mpx'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='split-lock-detect'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Snowridge-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='core-capability'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='movdir64b'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='movdiri'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='split-lock-detect'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Snowridge-v3'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='core-capability'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='movdir64b'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='movdiri'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='split-lock-detect'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Snowridge-v4'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='movdir64b'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='movdiri'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='athlon'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='3dnow'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='3dnowext'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='athlon-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='3dnow'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='3dnowext'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='core2duo'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ss'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='core2duo-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ss'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='coreduo'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ss'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='coreduo-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ss'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='n270'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ss'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='n270-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ss'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='phenom'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='3dnow'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='3dnowext'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='phenom-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='3dnow'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='3dnowext'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </mode>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   </cpu>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   <memoryBacking supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <enum name='sourceType'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <value>file</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <value>anonymous</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <value>memfd</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   </memoryBacking>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   <devices>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <disk supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='diskDevice'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>disk</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>cdrom</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>floppy</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>lun</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='bus'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>ide</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>fdc</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>scsi</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>virtio</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>usb</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>sata</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='model'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>virtio</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>virtio-transitional</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>virtio-non-transitional</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </disk>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <graphics supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='type'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>vnc</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>egl-headless</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>dbus</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </graphics>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <video supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='modelType'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>vga</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>cirrus</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>virtio</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>none</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>bochs</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>ramfb</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </video>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <hostdev supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='mode'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>subsystem</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='startupPolicy'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>default</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>mandatory</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>requisite</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>optional</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='subsysType'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>usb</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>pci</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>scsi</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='capsType'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='pciBackend'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </hostdev>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <rng supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='model'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>virtio</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>virtio-transitional</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>virtio-non-transitional</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='backendModel'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>random</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>egd</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>builtin</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </rng>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <filesystem supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='driverType'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>path</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>handle</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>virtiofs</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </filesystem>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <tpm supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='model'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>tpm-tis</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>tpm-crb</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='backendModel'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>emulator</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>external</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='backendVersion'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>2.0</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </tpm>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <redirdev supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='bus'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>usb</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </redirdev>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <channel supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='type'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>pty</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>unix</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </channel>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <crypto supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='model'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='type'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>qemu</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='backendModel'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>builtin</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </crypto>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <interface supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='backendType'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>default</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>passt</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </interface>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <panic supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='model'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>isa</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>hyperv</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </panic>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   </devices>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   <features>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <gic supported='no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <vmcoreinfo supported='yes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <genid supported='yes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <backingStoreInput supported='yes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <backup supported='yes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <async-teardown supported='yes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <ps2 supported='yes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <sev supported='no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <sgx supported='no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <hyperv supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='features'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>relaxed</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>vapic</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>spinlocks</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>vpindex</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>runtime</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>synic</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>stimer</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>reset</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>vendor_id</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>frequencies</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>reenlightenment</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>tlbflush</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>ipi</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>avic</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>emsr_bitmap</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>xmm_input</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </hyperv>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <launchSecurity supported='no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   </features>
Sep 30 21:09:22 compute-0 nova_compute[191871]: </domainCapabilities>
Sep 30 21:09:22 compute-0 nova_compute[191871]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Sep 30 21:09:22 compute-0 nova_compute[191871]: 2025-09-30 21:09:22.830 2 DEBUG nova.virt.libvirt.host [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Sep 30 21:09:22 compute-0 nova_compute[191871]: 2025-09-30 21:09:22.834 2 DEBUG nova.virt.libvirt.host [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Sep 30 21:09:22 compute-0 nova_compute[191871]: <domainCapabilities>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   <path>/usr/libexec/qemu-kvm</path>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   <domain>kvm</domain>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   <machine>pc-q35-rhel9.6.0</machine>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   <arch>x86_64</arch>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   <vcpu max='4096'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   <iothreads supported='yes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   <os supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <enum name='firmware'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <value>efi</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <loader supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='type'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>rom</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>pflash</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='readonly'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>yes</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>no</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='secure'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>yes</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>no</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </loader>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   </os>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   <cpu>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <mode name='host-passthrough' supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='hostPassthroughMigratable'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>on</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>off</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </mode>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <mode name='maximum' supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='maximumMigratable'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>on</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>off</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </mode>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <mode name='host-model' supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model fallback='forbid'>EPYC-Rome</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <vendor>AMD</vendor>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <maxphysaddr mode='passthrough' limit='40'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='x2apic'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='tsc-deadline'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='hypervisor'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='tsc_adjust'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='spec-ctrl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='stibp'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='arch-capabilities'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='ssbd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='cmp_legacy'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='overflow-recov'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='succor'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='ibrs'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='amd-ssbd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='virt-ssbd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='lbrv'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='tsc-scale'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='vmcb-clean'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='flushbyasid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='pause-filter'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='pfthreshold'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='svme-addr-chk'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='lfence-always-serializing'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='rdctl-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='mds-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='pschange-mc-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='gds-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='rfds-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='disable' name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </mode>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <mode name='custom' supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Broadwell'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Broadwell-IBRS'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Broadwell-noTSX'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Broadwell-noTSX-IBRS'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Broadwell-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Broadwell-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Broadwell-v3'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Broadwell-v4'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Cascadelake-Server'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Cascadelake-Server-noTSX'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Cascadelake-Server-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Cascadelake-Server-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Cascadelake-Server-v3'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Cascadelake-Server-v4'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Cascadelake-Server-v5'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Cooperlake'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Cooperlake-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Cooperlake-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Denverton'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='mpx'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Denverton-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='mpx'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Denverton-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Denverton-v3'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Dhyana-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='EPYC-Genoa'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amd-psfd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='auto-ibrs'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='no-nested-data-bp'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='null-sel-clr-base'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='stibp-always-on'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='EPYC-Genoa-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amd-psfd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='auto-ibrs'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='no-nested-data-bp'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='null-sel-clr-base'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='stibp-always-on'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='EPYC-Milan'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='EPYC-Milan-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='EPYC-Milan-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amd-psfd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='no-nested-data-bp'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='null-sel-clr-base'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='stibp-always-on'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='EPYC-Rome'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='EPYC-Rome-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='EPYC-Rome-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='EPYC-Rome-v3'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='EPYC-v3'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='EPYC-v4'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='GraniteRapids'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-fp16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='mcdt-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pbrsb-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='prefetchiti'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='GraniteRapids-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-fp16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='mcdt-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pbrsb-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='prefetchiti'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='GraniteRapids-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-fp16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx10'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx10-128'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx10-256'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx10-512'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='mcdt-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='movdir64b'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='movdiri'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pbrsb-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='prefetchiti'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ss'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Haswell'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Haswell-IBRS'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Haswell-noTSX'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Haswell-noTSX-IBRS'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Haswell-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Haswell-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Haswell-v3'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Haswell-v4'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Icelake-Server'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Icelake-Server-noTSX'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Icelake-Server-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Icelake-Server-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Icelake-Server-v3'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Icelake-Server-v4'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Icelake-Server-v5'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Icelake-Server-v6'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Icelake-Server-v7'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='IvyBridge'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='IvyBridge-IBRS'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='IvyBridge-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='IvyBridge-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='KnightsMill'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-4fmaps'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-4vnniw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512er'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512pf'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ss'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='KnightsMill-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-4fmaps'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-4vnniw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512er'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512pf'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ss'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Opteron_G4'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fma4'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xop'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Opteron_G4-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fma4'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xop'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Opteron_G5'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fma4'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='tbm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xop'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Opteron_G5-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fma4'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='tbm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xop'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='SapphireRapids'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='SapphireRapids-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='SapphireRapids-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='SapphireRapids-v3'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='movdir64b'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='movdiri'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ss'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='SierraForest'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-ne-convert'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-vnni-int8'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='cmpccxadd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='mcdt-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pbrsb-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='SierraForest-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-ne-convert'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-vnni-int8'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='cmpccxadd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='mcdt-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pbrsb-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Skylake-Client'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Skylake-Client-IBRS'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Skylake-Client-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Skylake-Client-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Skylake-Client-v3'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Skylake-Client-v4'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Skylake-Server'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Skylake-Server-IBRS'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Skylake-Server-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Skylake-Server-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Skylake-Server-v3'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Skylake-Server-v4'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Skylake-Server-v5'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Snowridge'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='core-capability'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='movdir64b'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='movdiri'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='mpx'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='split-lock-detect'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Snowridge-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='core-capability'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='movdir64b'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='movdiri'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='mpx'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='split-lock-detect'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Snowridge-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='core-capability'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='movdir64b'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='movdiri'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='split-lock-detect'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Snowridge-v3'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='core-capability'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='movdir64b'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='movdiri'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='split-lock-detect'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Snowridge-v4'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='movdir64b'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='movdiri'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='athlon'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='3dnow'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='3dnowext'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='athlon-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='3dnow'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='3dnowext'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='core2duo'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ss'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='core2duo-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ss'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='coreduo'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ss'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='coreduo-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ss'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='n270'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ss'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='n270-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ss'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='phenom'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='3dnow'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='3dnowext'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='phenom-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='3dnow'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='3dnowext'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </mode>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   </cpu>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   <memoryBacking supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <enum name='sourceType'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <value>file</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <value>anonymous</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <value>memfd</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   </memoryBacking>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   <devices>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <disk supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='diskDevice'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>disk</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>cdrom</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>floppy</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>lun</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='bus'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>fdc</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>scsi</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>virtio</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>usb</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>sata</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='model'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>virtio</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>virtio-transitional</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>virtio-non-transitional</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </disk>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <graphics supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='type'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>vnc</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>egl-headless</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>dbus</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </graphics>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <video supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='modelType'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>vga</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>cirrus</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>virtio</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>none</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>bochs</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>ramfb</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </video>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <hostdev supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='mode'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>subsystem</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='startupPolicy'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>default</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>mandatory</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>requisite</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>optional</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='subsysType'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>usb</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>pci</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>scsi</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='capsType'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='pciBackend'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </hostdev>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <rng supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='model'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>virtio</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>virtio-transitional</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>virtio-non-transitional</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='backendModel'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>random</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>egd</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>builtin</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </rng>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <filesystem supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='driverType'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>path</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>handle</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>virtiofs</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </filesystem>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <tpm supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='model'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>tpm-tis</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>tpm-crb</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='backendModel'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>emulator</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>external</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='backendVersion'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>2.0</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </tpm>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <redirdev supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='bus'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>usb</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </redirdev>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <channel supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='type'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>pty</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>unix</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </channel>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <crypto supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='model'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='type'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>qemu</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='backendModel'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>builtin</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </crypto>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <interface supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='backendType'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>default</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>passt</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </interface>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <panic supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='model'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>isa</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>hyperv</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </panic>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   </devices>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   <features>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <gic supported='no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <vmcoreinfo supported='yes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <genid supported='yes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <backingStoreInput supported='yes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <backup supported='yes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <async-teardown supported='yes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <ps2 supported='yes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <sev supported='no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <sgx supported='no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <hyperv supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='features'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>relaxed</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>vapic</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>spinlocks</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>vpindex</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>runtime</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>synic</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>stimer</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>reset</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>vendor_id</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>frequencies</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>reenlightenment</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>tlbflush</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>ipi</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>avic</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>emsr_bitmap</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>xmm_input</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </hyperv>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <launchSecurity supported='no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   </features>
Sep 30 21:09:22 compute-0 nova_compute[191871]: </domainCapabilities>
Sep 30 21:09:22 compute-0 nova_compute[191871]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Sep 30 21:09:22 compute-0 nova_compute[191871]: 2025-09-30 21:09:22.891 2 DEBUG nova.virt.libvirt.host [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Sep 30 21:09:22 compute-0 nova_compute[191871]: <domainCapabilities>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   <path>/usr/libexec/qemu-kvm</path>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   <domain>kvm</domain>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   <machine>pc-i440fx-rhel7.6.0</machine>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   <arch>x86_64</arch>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   <vcpu max='240'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   <iothreads supported='yes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   <os supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <enum name='firmware'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <loader supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='type'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>rom</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>pflash</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='readonly'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>yes</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>no</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='secure'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>no</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </loader>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   </os>
Sep 30 21:09:22 compute-0 nova_compute[191871]:   <cpu>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <mode name='host-passthrough' supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='hostPassthroughMigratable'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>on</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>off</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </mode>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <mode name='maximum' supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <enum name='maximumMigratable'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>on</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <value>off</value>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </mode>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <mode name='host-model' supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model fallback='forbid'>EPYC-Rome</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <vendor>AMD</vendor>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <maxphysaddr mode='passthrough' limit='40'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='x2apic'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='tsc-deadline'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='hypervisor'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='tsc_adjust'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='spec-ctrl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='stibp'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='arch-capabilities'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='ssbd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='cmp_legacy'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='overflow-recov'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='succor'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='ibrs'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='amd-ssbd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='virt-ssbd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='lbrv'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='tsc-scale'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='vmcb-clean'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='flushbyasid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='pause-filter'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='pfthreshold'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='svme-addr-chk'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='lfence-always-serializing'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='rdctl-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='mds-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='pschange-mc-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='gds-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='require' name='rfds-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <feature policy='disable' name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     </mode>
Sep 30 21:09:22 compute-0 nova_compute[191871]:     <mode name='custom' supported='yes'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Broadwell'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Broadwell-IBRS'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Broadwell-noTSX'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Broadwell-noTSX-IBRS'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Broadwell-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Broadwell-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Broadwell-v3'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Broadwell-v4'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Cascadelake-Server'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Cascadelake-Server-noTSX'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Cascadelake-Server-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Cascadelake-Server-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Cascadelake-Server-v3'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Cascadelake-Server-v4'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Cascadelake-Server-v5'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Cooperlake'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Cooperlake-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Cooperlake-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Denverton'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='mpx'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Denverton-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='mpx'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Denverton-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Denverton-v3'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Dhyana-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='EPYC-Genoa'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amd-psfd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='auto-ibrs'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='no-nested-data-bp'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='null-sel-clr-base'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='stibp-always-on'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='EPYC-Genoa-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amd-psfd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='auto-ibrs'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='no-nested-data-bp'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='null-sel-clr-base'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='stibp-always-on'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='EPYC-Milan'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='EPYC-Milan-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='EPYC-Milan-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amd-psfd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='no-nested-data-bp'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='null-sel-clr-base'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='stibp-always-on'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='EPYC-Rome'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='EPYC-Rome-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='EPYC-Rome-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='EPYC-Rome-v3'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='EPYC-v3'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='EPYC-v4'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='GraniteRapids'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-fp16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='mcdt-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pbrsb-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='prefetchiti'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='GraniteRapids-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-fp16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='mcdt-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pbrsb-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='prefetchiti'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='GraniteRapids-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-fp16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx10'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx10-128'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx10-256'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx10-512'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='mcdt-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='movdir64b'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='movdiri'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pbrsb-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='prefetchiti'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ss'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Haswell'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Haswell-IBRS'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Haswell-noTSX'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Haswell-noTSX-IBRS'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Haswell-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Haswell-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Haswell-v3'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Haswell-v4'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Icelake-Server'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Icelake-Server-noTSX'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Icelake-Server-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Icelake-Server-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Icelake-Server-v3'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Icelake-Server-v4'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Icelake-Server-v5'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Icelake-Server-v6'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Icelake-Server-v7'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='IvyBridge'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='IvyBridge-IBRS'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='IvyBridge-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='IvyBridge-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='KnightsMill'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-4fmaps'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-4vnniw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512er'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512pf'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ss'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='KnightsMill-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-4fmaps'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-4vnniw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512er'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512pf'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ss'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Opteron_G4'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fma4'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xop'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Opteron_G4-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fma4'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xop'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Opteron_G5'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fma4'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='tbm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xop'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='Opteron_G5-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fma4'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='tbm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xop'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='SapphireRapids'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='SapphireRapids-v1'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='SapphireRapids-v2'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrc'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrs'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fzrm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='psdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='serialize'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xfd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Sep 30 21:09:22 compute-0 nova_compute[191871]:       <blockers model='SapphireRapids-v3'>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-int8'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='amx-tile'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx-vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-bf16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-fp16'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bitalg'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512ifma'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='avx512vnni'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='cldemote'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fbsdp-no'/>
Sep 30 21:09:22 compute-0 nova_compute[191871]:         <feature name='fsrc'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='fsrs'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='fzrm'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='la57'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='movdir64b'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='movdiri'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='psdp-no'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='serialize'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='ss'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='taa-no'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='xfd'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <blockers model='SierraForest'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='avx-ifma'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='avx-ne-convert'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='avx-vnni'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='avx-vnni-int8'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='cmpccxadd'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='fbsdp-no'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='fsrs'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='mcdt-no'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='pbrsb-no'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='psdp-no'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='serialize'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <blockers model='SierraForest-v1'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='avx-ifma'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='avx-ne-convert'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='avx-vnni'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='avx-vnni-int8'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='cmpccxadd'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='fbsdp-no'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='fsrm'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='fsrs'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='ibrs-all'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='mcdt-no'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='pbrsb-no'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='psdp-no'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='serialize'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='vaes'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <blockers model='Skylake-Client'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <blockers model='Skylake-Client-IBRS'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <blockers model='Skylake-Client-v1'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <blockers model='Skylake-Client-v2'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <blockers model='Skylake-Client-v3'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <blockers model='Skylake-Client-v4'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <blockers model='Skylake-Server'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <blockers model='Skylake-Server-IBRS'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <blockers model='Skylake-Server-v1'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <blockers model='Skylake-Server-v2'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='hle'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='rtm'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <blockers model='Skylake-Server-v3'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <blockers model='Skylake-Server-v4'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <blockers model='Skylake-Server-v5'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='avx512bw'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='avx512cd'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='avx512dq'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='avx512f'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='avx512vl'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='invpcid'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='pcid'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='pku'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <blockers model='Snowridge'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='cldemote'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='core-capability'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='movdir64b'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='movdiri'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='mpx'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='split-lock-detect'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <blockers model='Snowridge-v1'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='cldemote'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='core-capability'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='movdir64b'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='movdiri'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='mpx'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='split-lock-detect'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <blockers model='Snowridge-v2'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='cldemote'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='core-capability'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='movdir64b'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='movdiri'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='split-lock-detect'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <blockers model='Snowridge-v3'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='cldemote'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='core-capability'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='movdir64b'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='movdiri'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='split-lock-detect'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <blockers model='Snowridge-v4'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='cldemote'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='erms'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='gfni'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='movdir64b'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='movdiri'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='xsaves'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <blockers model='athlon'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='3dnow'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='3dnowext'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <blockers model='athlon-v1'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='3dnow'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='3dnowext'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <blockers model='core2duo'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='ss'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <blockers model='core2duo-v1'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='ss'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <blockers model='coreduo'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='ss'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <blockers model='coreduo-v1'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='ss'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <blockers model='n270'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='ss'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <blockers model='n270-v1'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='ss'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <blockers model='phenom'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='3dnow'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='3dnowext'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <blockers model='phenom-v1'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='3dnow'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <feature name='3dnowext'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </blockers>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]:     </mode>
Sep 30 21:09:23 compute-0 nova_compute[191871]:   </cpu>
Sep 30 21:09:23 compute-0 nova_compute[191871]:   <memoryBacking supported='yes'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:     <enum name='sourceType'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <value>file</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <value>anonymous</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <value>memfd</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:     </enum>
Sep 30 21:09:23 compute-0 nova_compute[191871]:   </memoryBacking>
Sep 30 21:09:23 compute-0 nova_compute[191871]:   <devices>
Sep 30 21:09:23 compute-0 nova_compute[191871]:     <disk supported='yes'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <enum name='diskDevice'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>disk</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>cdrom</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>floppy</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>lun</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <enum name='bus'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>ide</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>fdc</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>scsi</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>virtio</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>usb</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>sata</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <enum name='model'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>virtio</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>virtio-transitional</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>virtio-non-transitional</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:23 compute-0 nova_compute[191871]:     </disk>
Sep 30 21:09:23 compute-0 nova_compute[191871]:     <graphics supported='yes'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <enum name='type'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>vnc</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>egl-headless</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>dbus</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:23 compute-0 nova_compute[191871]:     </graphics>
Sep 30 21:09:23 compute-0 nova_compute[191871]:     <video supported='yes'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <enum name='modelType'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>vga</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>cirrus</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>virtio</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>none</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>bochs</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>ramfb</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:23 compute-0 nova_compute[191871]:     </video>
Sep 30 21:09:23 compute-0 nova_compute[191871]:     <hostdev supported='yes'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <enum name='mode'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>subsystem</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <enum name='startupPolicy'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>default</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>mandatory</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>requisite</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>optional</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <enum name='subsysType'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>usb</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>pci</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>scsi</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <enum name='capsType'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <enum name='pciBackend'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:     </hostdev>
Sep 30 21:09:23 compute-0 nova_compute[191871]:     <rng supported='yes'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <enum name='model'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>virtio</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>virtio-transitional</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>virtio-non-transitional</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <enum name='backendModel'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>random</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>egd</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>builtin</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:23 compute-0 nova_compute[191871]:     </rng>
Sep 30 21:09:23 compute-0 nova_compute[191871]:     <filesystem supported='yes'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <enum name='driverType'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>path</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>handle</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>virtiofs</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:23 compute-0 nova_compute[191871]:     </filesystem>
Sep 30 21:09:23 compute-0 nova_compute[191871]:     <tpm supported='yes'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <enum name='model'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>tpm-tis</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>tpm-crb</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <enum name='backendModel'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>emulator</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>external</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <enum name='backendVersion'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>2.0</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:23 compute-0 nova_compute[191871]:     </tpm>
Sep 30 21:09:23 compute-0 nova_compute[191871]:     <redirdev supported='yes'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <enum name='bus'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>usb</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:23 compute-0 nova_compute[191871]:     </redirdev>
Sep 30 21:09:23 compute-0 nova_compute[191871]:     <channel supported='yes'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <enum name='type'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>pty</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>unix</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:23 compute-0 nova_compute[191871]:     </channel>
Sep 30 21:09:23 compute-0 nova_compute[191871]:     <crypto supported='yes'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <enum name='model'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <enum name='type'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>qemu</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <enum name='backendModel'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>builtin</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:23 compute-0 nova_compute[191871]:     </crypto>
Sep 30 21:09:23 compute-0 nova_compute[191871]:     <interface supported='yes'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <enum name='backendType'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>default</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>passt</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:23 compute-0 nova_compute[191871]:     </interface>
Sep 30 21:09:23 compute-0 nova_compute[191871]:     <panic supported='yes'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <enum name='model'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>isa</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>hyperv</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:23 compute-0 nova_compute[191871]:     </panic>
Sep 30 21:09:23 compute-0 nova_compute[191871]:   </devices>
Sep 30 21:09:23 compute-0 nova_compute[191871]:   <features>
Sep 30 21:09:23 compute-0 nova_compute[191871]:     <gic supported='no'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:     <vmcoreinfo supported='yes'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:     <genid supported='yes'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:     <backingStoreInput supported='yes'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:     <backup supported='yes'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:     <async-teardown supported='yes'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:     <ps2 supported='yes'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:     <sev supported='no'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:     <sgx supported='no'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:     <hyperv supported='yes'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       <enum name='features'>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>relaxed</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>vapic</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>spinlocks</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>vpindex</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>runtime</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>synic</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>stimer</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>reset</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>vendor_id</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>frequencies</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>reenlightenment</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>tlbflush</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>ipi</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>avic</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>emsr_bitmap</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:         <value>xmm_input</value>
Sep 30 21:09:23 compute-0 nova_compute[191871]:       </enum>
Sep 30 21:09:23 compute-0 nova_compute[191871]:     </hyperv>
Sep 30 21:09:23 compute-0 nova_compute[191871]:     <launchSecurity supported='no'/>
Sep 30 21:09:23 compute-0 nova_compute[191871]:   </features>
Sep 30 21:09:23 compute-0 nova_compute[191871]: </domainCapabilities>
Sep 30 21:09:23 compute-0 nova_compute[191871]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Sep 30 21:09:23 compute-0 nova_compute[191871]: 2025-09-30 21:09:22.947 2 DEBUG nova.virt.libvirt.host [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Sep 30 21:09:23 compute-0 nova_compute[191871]: 2025-09-30 21:09:22.948 2 INFO nova.virt.libvirt.host [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] Secure Boot support detected
Sep 30 21:09:23 compute-0 nova_compute[191871]: 2025-09-30 21:09:22.950 2 INFO nova.virt.libvirt.driver [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Sep 30 21:09:23 compute-0 nova_compute[191871]: 2025-09-30 21:09:22.963 2 DEBUG nova.virt.libvirt.driver [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] cpu compare xml: <cpu match="exact">
Sep 30 21:09:23 compute-0 nova_compute[191871]:   <model>Nehalem</model>
Sep 30 21:09:23 compute-0 nova_compute[191871]: </cpu>
Sep 30 21:09:23 compute-0 nova_compute[191871]:  _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019
Sep 30 21:09:23 compute-0 nova_compute[191871]: 2025-09-30 21:09:22.968 2 DEBUG nova.virt.libvirt.driver [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Sep 30 21:09:23 compute-0 nova_compute[191871]: 2025-09-30 21:09:23.039 2 INFO nova.virt.node [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] Determined node identity fe423b93-de5a-41f7-97d1-9622ea46af54 from /var/lib/nova/compute_id
Sep 30 21:09:23 compute-0 nova_compute[191871]: 2025-09-30 21:09:23.054 2 WARNING nova.compute.manager [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] Compute nodes ['fe423b93-de5a-41f7-97d1-9622ea46af54'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Sep 30 21:09:23 compute-0 nova_compute[191871]: 2025-09-30 21:09:23.086 2 INFO nova.compute.manager [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Sep 30 21:09:23 compute-0 nova_compute[191871]: 2025-09-30 21:09:23.161 2 WARNING nova.compute.manager [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Sep 30 21:09:23 compute-0 nova_compute[191871]: 2025-09-30 21:09:23.162 2 DEBUG oslo_concurrency.lockutils [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:09:23 compute-0 nova_compute[191871]: 2025-09-30 21:09:23.162 2 DEBUG oslo_concurrency.lockutils [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:09:23 compute-0 nova_compute[191871]: 2025-09-30 21:09:23.163 2 DEBUG oslo_concurrency.lockutils [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:09:23 compute-0 nova_compute[191871]: 2025-09-30 21:09:23.163 2 DEBUG nova.compute.resource_tracker [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:09:23 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Sep 30 21:09:23 compute-0 sudo[192551]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-deaoczxvrsrxbsqlyrcgmtghthoahzcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266562.8896403-5194-161265950260467/AnsiballZ_podman_container.py'
Sep 30 21:09:23 compute-0 systemd[1]: Started libvirt nodedev daemon.
Sep 30 21:09:23 compute-0 sudo[192551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:23 compute-0 python3.9[192571]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Sep 30 21:09:23 compute-0 nova_compute[191871]: 2025-09-30 21:09:23.508 2 WARNING nova.virt.libvirt.driver [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:09:23 compute-0 nova_compute[191871]: 2025-09-30 21:09:23.509 2 DEBUG nova.compute.resource_tracker [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6191MB free_disk=73.67498397827148GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:09:23 compute-0 nova_compute[191871]: 2025-09-30 21:09:23.509 2 DEBUG oslo_concurrency.lockutils [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:09:23 compute-0 nova_compute[191871]: 2025-09-30 21:09:23.510 2 DEBUG oslo_concurrency.lockutils [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:09:23 compute-0 nova_compute[191871]: 2025-09-30 21:09:23.577 2 WARNING nova.compute.resource_tracker [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] No compute node record for compute-0.ctlplane.example.com:fe423b93-de5a-41f7-97d1-9622ea46af54: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host fe423b93-de5a-41f7-97d1-9622ea46af54 could not be found.
Sep 30 21:09:23 compute-0 sudo[192551]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:23 compute-0 nova_compute[191871]: 2025-09-30 21:09:23.609 2 INFO nova.compute.resource_tracker [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] Compute node record created for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com with uuid: fe423b93-de5a-41f7-97d1-9622ea46af54
Sep 30 21:09:23 compute-0 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 21:09:23 compute-0 nova_compute[191871]: 2025-09-30 21:09:23.661 2 DEBUG nova.compute.resource_tracker [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:09:23 compute-0 nova_compute[191871]: 2025-09-30 21:09:23.662 2 DEBUG nova.compute.resource_tracker [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:09:24 compute-0 nova_compute[191871]: 2025-09-30 21:09:24.229 2 INFO nova.scheduler.client.report [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] [req-2a734c8e-fc09-4fdc-b2bc-adeaa5ce1bbd] Created resource provider record via placement API for resource provider with UUID fe423b93-de5a-41f7-97d1-9622ea46af54 and name compute-0.ctlplane.example.com.
Sep 30 21:09:24 compute-0 nova_compute[191871]: 2025-09-30 21:09:24.274 2 DEBUG nova.virt.libvirt.host [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Sep 30 21:09:24 compute-0 nova_compute[191871]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Sep 30 21:09:24 compute-0 nova_compute[191871]: 2025-09-30 21:09:24.274 2 INFO nova.virt.libvirt.host [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] kernel doesn't support AMD SEV
Sep 30 21:09:24 compute-0 nova_compute[191871]: 2025-09-30 21:09:24.275 2 DEBUG nova.compute.provider_tree [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] Updating inventory in ProviderTree for provider fe423b93-de5a-41f7-97d1-9622ea46af54 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Sep 30 21:09:24 compute-0 nova_compute[191871]: 2025-09-30 21:09:24.276 2 DEBUG nova.virt.libvirt.driver [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:09:24 compute-0 nova_compute[191871]: 2025-09-30 21:09:24.280 2 DEBUG nova.virt.libvirt.driver [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] Libvirt baseline CPU <cpu>
Sep 30 21:09:24 compute-0 nova_compute[191871]:   <arch>x86_64</arch>
Sep 30 21:09:24 compute-0 nova_compute[191871]:   <model>Nehalem</model>
Sep 30 21:09:24 compute-0 nova_compute[191871]:   <vendor>AMD</vendor>
Sep 30 21:09:24 compute-0 nova_compute[191871]:   <topology sockets="8" cores="1" threads="1"/>
Sep 30 21:09:24 compute-0 nova_compute[191871]: </cpu>
Sep 30 21:09:24 compute-0 nova_compute[191871]:  _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537
Sep 30 21:09:24 compute-0 sudo[192746]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgykshpoztkeadxhwxoxndukblaocykh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266563.8737175-5218-264430134711864/AnsiballZ_systemd.py'
Sep 30 21:09:24 compute-0 sudo[192746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:24 compute-0 nova_compute[191871]: 2025-09-30 21:09:24.359 2 DEBUG nova.scheduler.client.report [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] Updated inventory for provider fe423b93-de5a-41f7-97d1-9622ea46af54 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Sep 30 21:09:24 compute-0 nova_compute[191871]: 2025-09-30 21:09:24.360 2 DEBUG nova.compute.provider_tree [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] Updating resource provider fe423b93-de5a-41f7-97d1-9622ea46af54 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Sep 30 21:09:24 compute-0 nova_compute[191871]: 2025-09-30 21:09:24.360 2 DEBUG nova.compute.provider_tree [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] Updating inventory in ProviderTree for provider fe423b93-de5a-41f7-97d1-9622ea46af54 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Sep 30 21:09:24 compute-0 nova_compute[191871]: 2025-09-30 21:09:24.475 2 DEBUG nova.compute.provider_tree [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] Updating resource provider fe423b93-de5a-41f7-97d1-9622ea46af54 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Sep 30 21:09:24 compute-0 python3.9[192748]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 21:09:24 compute-0 systemd[1]: Stopping nova_compute container...
Sep 30 21:09:24 compute-0 nova_compute[191871]: 2025-09-30 21:09:24.683 2 DEBUG nova.compute.resource_tracker [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:09:24 compute-0 nova_compute[191871]: 2025-09-30 21:09:24.683 2 DEBUG oslo_concurrency.lockutils [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:09:24 compute-0 nova_compute[191871]: 2025-09-30 21:09:24.684 2 DEBUG nova.service [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Sep 30 21:09:24 compute-0 nova_compute[191871]: 2025-09-30 21:09:24.786 2 DEBUG nova.service [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Sep 30 21:09:24 compute-0 nova_compute[191871]: 2025-09-30 21:09:24.786 2 DEBUG nova.servicegroup.drivers.db [None req-da75228b-265d-41a0-b34e-a8506c1ff18a - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Sep 30 21:09:24 compute-0 nova_compute[191871]: 2025-09-30 21:09:24.795 2 WARNING amqp [-] Received method (60, 30) during closing channel 1. This method will be ignored
Sep 30 21:09:24 compute-0 nova_compute[191871]: 2025-09-30 21:09:24.797 2 DEBUG oslo_concurrency.lockutils [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:09:24 compute-0 nova_compute[191871]: 2025-09-30 21:09:24.797 2 DEBUG oslo_concurrency.lockutils [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:09:24 compute-0 nova_compute[191871]: 2025-09-30 21:09:24.798 2 DEBUG oslo_concurrency.lockutils [None req-c863a7ec-605e-4144-b95b-fde02b5a8690 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:09:25 compute-0 virtqemud[192233]: libvirt version: 10.10.0, package: 15.el9 (builder@centos.org, 2025-08-18-13:22:20, )
Sep 30 21:09:25 compute-0 virtqemud[192233]: hostname: compute-0
Sep 30 21:09:25 compute-0 virtqemud[192233]: End of file while reading data: Input/output error
Sep 30 21:09:25 compute-0 systemd[1]: libpod-765bec2345aca3eba39d61c91d950df21dd8aef7456f8a6ac2a573d89806a810.scope: Deactivated successfully.
Sep 30 21:09:25 compute-0 systemd[1]: libpod-765bec2345aca3eba39d61c91d950df21dd8aef7456f8a6ac2a573d89806a810.scope: Consumed 3.234s CPU time.
Sep 30 21:09:25 compute-0 podman[192752]: 2025-09-30 21:09:25.199331934 +0000 UTC m=+0.497445640 container died 765bec2345aca3eba39d61c91d950df21dd8aef7456f8a6ac2a573d89806a810 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=nova_compute, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:09:25 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-765bec2345aca3eba39d61c91d950df21dd8aef7456f8a6ac2a573d89806a810-userdata-shm.mount: Deactivated successfully.
Sep 30 21:09:25 compute-0 systemd[1]: var-lib-containers-storage-overlay-9ef75ad3dba48eea112675faeafa1a4fb9dd56617560b47a7d927c59aa6af760-merged.mount: Deactivated successfully.
Sep 30 21:09:25 compute-0 podman[192752]: 2025-09-30 21:09:25.264740976 +0000 UTC m=+0.562854682 container cleanup 765bec2345aca3eba39d61c91d950df21dd8aef7456f8a6ac2a573d89806a810 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 21:09:25 compute-0 podman[192752]: nova_compute
Sep 30 21:09:25 compute-0 podman[192781]: nova_compute
Sep 30 21:09:25 compute-0 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Sep 30 21:09:25 compute-0 systemd[1]: Stopped nova_compute container.
Sep 30 21:09:25 compute-0 systemd[1]: Starting nova_compute container...
Sep 30 21:09:25 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:09:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9ef75ad3dba48eea112675faeafa1a4fb9dd56617560b47a7d927c59aa6af760/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Sep 30 21:09:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9ef75ad3dba48eea112675faeafa1a4fb9dd56617560b47a7d927c59aa6af760/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Sep 30 21:09:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9ef75ad3dba48eea112675faeafa1a4fb9dd56617560b47a7d927c59aa6af760/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Sep 30 21:09:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9ef75ad3dba48eea112675faeafa1a4fb9dd56617560b47a7d927c59aa6af760/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Sep 30 21:09:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9ef75ad3dba48eea112675faeafa1a4fb9dd56617560b47a7d927c59aa6af760/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Sep 30 21:09:25 compute-0 podman[192794]: 2025-09-30 21:09:25.469930548 +0000 UTC m=+0.098499053 container init 765bec2345aca3eba39d61c91d950df21dd8aef7456f8a6ac2a573d89806a810 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=edpm, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Sep 30 21:09:25 compute-0 podman[192794]: 2025-09-30 21:09:25.478487025 +0000 UTC m=+0.107055430 container start 765bec2345aca3eba39d61c91d950df21dd8aef7456f8a6ac2a573d89806a810 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=nova_compute, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 21:09:25 compute-0 nova_compute[192810]: + sudo -E kolla_set_configs
Sep 30 21:09:25 compute-0 podman[192794]: nova_compute
Sep 30 21:09:25 compute-0 systemd[1]: Started nova_compute container.
Sep 30 21:09:25 compute-0 sudo[192746]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:25 compute-0 nova_compute[192810]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Sep 30 21:09:25 compute-0 nova_compute[192810]: INFO:__main__:Validating config file
Sep 30 21:09:25 compute-0 nova_compute[192810]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Sep 30 21:09:25 compute-0 nova_compute[192810]: INFO:__main__:Copying service configuration files
Sep 30 21:09:25 compute-0 nova_compute[192810]: INFO:__main__:Deleting /etc/nova/nova.conf
Sep 30 21:09:25 compute-0 nova_compute[192810]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Sep 30 21:09:25 compute-0 nova_compute[192810]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Sep 30 21:09:25 compute-0 nova_compute[192810]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Sep 30 21:09:25 compute-0 nova_compute[192810]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Sep 30 21:09:25 compute-0 nova_compute[192810]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Sep 30 21:09:25 compute-0 nova_compute[192810]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Sep 30 21:09:25 compute-0 nova_compute[192810]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Sep 30 21:09:25 compute-0 nova_compute[192810]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Sep 30 21:09:25 compute-0 nova_compute[192810]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Sep 30 21:09:25 compute-0 nova_compute[192810]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Sep 30 21:09:25 compute-0 nova_compute[192810]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Sep 30 21:09:25 compute-0 nova_compute[192810]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Sep 30 21:09:25 compute-0 nova_compute[192810]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Sep 30 21:09:25 compute-0 nova_compute[192810]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Sep 30 21:09:25 compute-0 nova_compute[192810]: INFO:__main__:Deleting /etc/ceph
Sep 30 21:09:25 compute-0 nova_compute[192810]: INFO:__main__:Creating directory /etc/ceph
Sep 30 21:09:25 compute-0 nova_compute[192810]: INFO:__main__:Setting permission for /etc/ceph
Sep 30 21:09:25 compute-0 nova_compute[192810]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Sep 30 21:09:25 compute-0 nova_compute[192810]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Sep 30 21:09:25 compute-0 nova_compute[192810]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Sep 30 21:09:25 compute-0 nova_compute[192810]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Sep 30 21:09:25 compute-0 nova_compute[192810]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Sep 30 21:09:25 compute-0 nova_compute[192810]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Sep 30 21:09:25 compute-0 nova_compute[192810]: INFO:__main__:Writing out command to execute
Sep 30 21:09:25 compute-0 nova_compute[192810]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Sep 30 21:09:25 compute-0 nova_compute[192810]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Sep 30 21:09:25 compute-0 nova_compute[192810]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Sep 30 21:09:25 compute-0 nova_compute[192810]: ++ cat /run_command
Sep 30 21:09:25 compute-0 nova_compute[192810]: + CMD=nova-compute
Sep 30 21:09:25 compute-0 nova_compute[192810]: + ARGS=
Sep 30 21:09:25 compute-0 nova_compute[192810]: + sudo kolla_copy_cacerts
Sep 30 21:09:25 compute-0 nova_compute[192810]: + [[ ! -n '' ]]
Sep 30 21:09:25 compute-0 nova_compute[192810]: + . kolla_extend_start
Sep 30 21:09:25 compute-0 nova_compute[192810]: + echo 'Running command: '\''nova-compute'\'''
Sep 30 21:09:25 compute-0 nova_compute[192810]: Running command: 'nova-compute'
Sep 30 21:09:25 compute-0 nova_compute[192810]: + umask 0022
Sep 30 21:09:25 compute-0 nova_compute[192810]: + exec nova-compute
Sep 30 21:09:26 compute-0 sudo[192971]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzxwxoedvfbmljujjjjqnwqelxmobelu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266565.820277-5245-90812091336129/AnsiballZ_podman_container.py'
Sep 30 21:09:26 compute-0 sudo[192971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:26 compute-0 python3.9[192973]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Sep 30 21:09:26 compute-0 systemd[1]: Started libpod-conmon-5c60e54aa935cb59abd40139e38e8f8638da47f4b8017ca0abadc00f15ad51bf.scope.
Sep 30 21:09:26 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:09:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/769544d04f90bb1b9de59a59f7201376fe05ed420b51870499f73af73e714857/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Sep 30 21:09:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/769544d04f90bb1b9de59a59f7201376fe05ed420b51870499f73af73e714857/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Sep 30 21:09:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/769544d04f90bb1b9de59a59f7201376fe05ed420b51870499f73af73e714857/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Sep 30 21:09:26 compute-0 podman[192999]: 2025-09-30 21:09:26.607716273 +0000 UTC m=+0.103689969 container init 5c60e54aa935cb59abd40139e38e8f8638da47f4b8017ca0abadc00f15ad51bf (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:09:26 compute-0 podman[192999]: 2025-09-30 21:09:26.619640111 +0000 UTC m=+0.115613787 container start 5c60e54aa935cb59abd40139e38e8f8638da47f4b8017ca0abadc00f15ad51bf (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.schema-version=1.0, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, container_name=nova_compute_init, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:09:26 compute-0 python3.9[192973]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Sep 30 21:09:26 compute-0 nova_compute_init[193021]: INFO:nova_statedir:Applying nova statedir ownership
Sep 30 21:09:26 compute-0 nova_compute_init[193021]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Sep 30 21:09:26 compute-0 nova_compute_init[193021]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Sep 30 21:09:26 compute-0 nova_compute_init[193021]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Sep 30 21:09:26 compute-0 nova_compute_init[193021]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Sep 30 21:09:26 compute-0 nova_compute_init[193021]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Sep 30 21:09:26 compute-0 nova_compute_init[193021]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Sep 30 21:09:26 compute-0 nova_compute_init[193021]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Sep 30 21:09:26 compute-0 nova_compute_init[193021]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Sep 30 21:09:26 compute-0 nova_compute_init[193021]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Sep 30 21:09:26 compute-0 nova_compute_init[193021]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Sep 30 21:09:26 compute-0 nova_compute_init[193021]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Sep 30 21:09:26 compute-0 nova_compute_init[193021]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Sep 30 21:09:26 compute-0 nova_compute_init[193021]: INFO:nova_statedir:Nova statedir ownership complete
Sep 30 21:09:26 compute-0 systemd[1]: libpod-5c60e54aa935cb59abd40139e38e8f8638da47f4b8017ca0abadc00f15ad51bf.scope: Deactivated successfully.
Sep 30 21:09:26 compute-0 podman[193022]: 2025-09-30 21:09:26.703478609 +0000 UTC m=+0.047377567 container died 5c60e54aa935cb59abd40139e38e8f8638da47f4b8017ca0abadc00f15ad51bf (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=nova_compute_init, org.label-schema.license=GPLv2)
Sep 30 21:09:26 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5c60e54aa935cb59abd40139e38e8f8638da47f4b8017ca0abadc00f15ad51bf-userdata-shm.mount: Deactivated successfully.
Sep 30 21:09:26 compute-0 systemd[1]: var-lib-containers-storage-overlay-769544d04f90bb1b9de59a59f7201376fe05ed420b51870499f73af73e714857-merged.mount: Deactivated successfully.
Sep 30 21:09:26 compute-0 podman[193031]: 2025-09-30 21:09:26.748966759 +0000 UTC m=+0.057927202 container cleanup 5c60e54aa935cb59abd40139e38e8f8638da47f4b8017ca0abadc00f15ad51bf (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=edpm, container_name=nova_compute_init, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Sep 30 21:09:26 compute-0 systemd[1]: libpod-conmon-5c60e54aa935cb59abd40139e38e8f8638da47f4b8017ca0abadc00f15ad51bf.scope: Deactivated successfully.
Sep 30 21:09:26 compute-0 sudo[192971]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:27 compute-0 sshd-session[158337]: Connection closed by 192.168.122.30 port 49238
Sep 30 21:09:27 compute-0 sshd-session[158334]: pam_unix(sshd:session): session closed for user zuul
Sep 30 21:09:27 compute-0 systemd[1]: session-24.scope: Deactivated successfully.
Sep 30 21:09:27 compute-0 systemd[1]: session-24.scope: Consumed 2min 30.600s CPU time.
Sep 30 21:09:27 compute-0 systemd-logind[792]: Session 24 logged out. Waiting for processes to exit.
Sep 30 21:09:27 compute-0 systemd-logind[792]: Removed session 24.
Sep 30 21:09:27 compute-0 nova_compute[192810]: 2025-09-30 21:09:27.611 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Sep 30 21:09:27 compute-0 nova_compute[192810]: 2025-09-30 21:09:27.612 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Sep 30 21:09:27 compute-0 nova_compute[192810]: 2025-09-30 21:09:27.612 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Sep 30 21:09:27 compute-0 nova_compute[192810]: 2025-09-30 21:09:27.612 2 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Sep 30 21:09:27 compute-0 nova_compute[192810]: 2025-09-30 21:09:27.752 2 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:09:27 compute-0 nova_compute[192810]: 2025-09-30 21:09:27.778 2 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.025s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.249 2 INFO nova.virt.driver [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.353 2 INFO nova.compute.provider_config [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.361 2 DEBUG oslo_concurrency.lockutils [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.361 2 DEBUG oslo_concurrency.lockutils [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.361 2 DEBUG oslo_concurrency.lockutils [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.362 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.362 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.362 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.362 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.362 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.362 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.363 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.363 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.363 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.363 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.363 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.363 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.363 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.364 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.364 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.364 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.364 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.364 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.364 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.364 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.365 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.365 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.365 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.365 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.365 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.365 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.365 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.366 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.366 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.366 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.366 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.366 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.367 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.367 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.367 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.367 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.367 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.367 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.368 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.368 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.368 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.368 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.368 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.368 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.368 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.369 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.369 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.369 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.369 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.369 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.369 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.369 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.370 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.370 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.370 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.370 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.370 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.370 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.370 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.371 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.371 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.371 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.371 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.371 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.371 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.371 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.372 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.372 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.372 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.372 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.372 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.372 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.372 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.373 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.373 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.373 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.373 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.373 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.373 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.373 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.374 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.374 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.374 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.374 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.374 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.374 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.374 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.375 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.375 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.375 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.375 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.375 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.375 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.375 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.376 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.376 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.376 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.376 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.376 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.376 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.376 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.376 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.377 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.377 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.377 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.377 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.377 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.377 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.377 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.378 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.378 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.378 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.378 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.378 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.378 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.378 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.379 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.379 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.379 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.379 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.379 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.379 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.379 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.380 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.380 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.380 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.380 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.380 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.380 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.380 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.380 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.381 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.381 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.381 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.381 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.381 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.381 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.381 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.382 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.382 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.382 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.382 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.382 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.382 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.382 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.383 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.383 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.383 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.383 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.383 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.383 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.383 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.384 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.384 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.384 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.384 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.384 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.384 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.385 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.385 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.385 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.385 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.385 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.385 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.385 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.386 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.386 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.386 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.386 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.386 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.386 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.386 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.387 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.387 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.387 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.387 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.387 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.387 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.388 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.388 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.388 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.388 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.388 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.388 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.388 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.389 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.389 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.389 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.389 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.389 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.389 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.389 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.389 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.390 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.390 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.390 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.390 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.390 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.390 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.390 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.391 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.391 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.391 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.391 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.391 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.391 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.391 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.392 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.392 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.392 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.392 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.392 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.392 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.393 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.393 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.393 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.393 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.393 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.393 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.393 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.393 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.394 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.394 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.394 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.394 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.394 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.394 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.394 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.395 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.395 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.395 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.395 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.395 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.395 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.395 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.396 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.396 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.396 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.396 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.396 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.396 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.396 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.397 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.397 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.397 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.397 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.397 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.397 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.397 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.398 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.398 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.398 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.398 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.398 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.398 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.398 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.399 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.399 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.399 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.399 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.399 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.399 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.399 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.400 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.400 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.400 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.400 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.400 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.400 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.400 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.401 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.401 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.401 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.401 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.401 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.401 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.401 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.402 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.402 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.402 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.402 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.402 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.402 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.402 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.403 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.403 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.403 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.403 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.403 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.403 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.403 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.404 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.404 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.404 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.404 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.404 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.404 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.404 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.405 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.405 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.405 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.405 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.405 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.405 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.405 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.406 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.406 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.406 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.406 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.406 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.406 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.406 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.407 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.407 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.407 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.407 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.407 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.407 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.407 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.408 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.408 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.408 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.408 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.408 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.408 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.408 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.408 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.409 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.409 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.409 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.409 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.409 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.409 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.409 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.410 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.410 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.410 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.410 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.410 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.410 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.410 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.411 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.411 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.411 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.411 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.411 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.411 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.411 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.412 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.412 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.412 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.412 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.412 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.412 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.412 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.413 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.413 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.413 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.413 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.413 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.413 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.414 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.414 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.414 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.414 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.414 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.414 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.414 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.415 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.415 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.415 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.415 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.415 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.415 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.415 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.416 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.416 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.416 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.416 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.416 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.416 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.416 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.416 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.417 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.417 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.417 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.417 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.417 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.417 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.417 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.418 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.418 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.418 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.418 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.418 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.418 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.418 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.419 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.419 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.419 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.419 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.419 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.419 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.420 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.420 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.420 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.420 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.420 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.420 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.420 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.420 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.421 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.421 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.421 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.421 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.421 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.421 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.422 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.422 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.422 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.422 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.422 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.422 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.422 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.423 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.423 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.423 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.423 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.423 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.423 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.424 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.424 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.424 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.424 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.424 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.425 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.425 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.425 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.425 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.425 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.425 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.425 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.425 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.426 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.426 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.426 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.426 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.426 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.426 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.426 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.427 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.427 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.427 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.427 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.427 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.427 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.428 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.428 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.428 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.428 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.428 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.428 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.428 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.429 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.429 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.429 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.429 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.429 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.429 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.429 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.430 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.430 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.430 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.430 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.430 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.431 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.431 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.431 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.431 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.431 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.431 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.432 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.432 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.432 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.432 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.432 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.432 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.433 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.433 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.433 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.433 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.433 2 WARNING oslo_config.cfg [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Sep 30 21:09:28 compute-0 nova_compute[192810]: live_migration_uri is deprecated for removal in favor of two other options that
Sep 30 21:09:28 compute-0 nova_compute[192810]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Sep 30 21:09:28 compute-0 nova_compute[192810]: and ``live_migration_inbound_addr`` respectively.
Sep 30 21:09:28 compute-0 nova_compute[192810]: ).  Its value may be silently ignored in the future.
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.433 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.434 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.434 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.434 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.434 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.434 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.434 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.435 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.435 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.435 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.435 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.435 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.435 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.435 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.436 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.436 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.436 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.436 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.436 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.436 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.436 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.437 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.437 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.437 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.437 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.437 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.437 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.437 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.438 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.438 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.438 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.438 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.438 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.438 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.439 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.439 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.439 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.439 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.439 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.439 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.439 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.440 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.440 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.440 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.440 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.440 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.440 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.441 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.441 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.441 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.441 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.441 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.442 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.442 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.442 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.442 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.442 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.442 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.442 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.443 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.443 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.443 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.443 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.443 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.443 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.443 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.444 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.444 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.444 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.444 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.444 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.444 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.444 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.445 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.445 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.445 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.445 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.445 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.445 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.445 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.446 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.446 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.446 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.446 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.446 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.446 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.446 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.447 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.447 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.447 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.447 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.447 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.447 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.447 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.448 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.448 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.448 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.448 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.448 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.448 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.449 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.449 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.449 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.449 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.449 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.449 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.449 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.450 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.450 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.450 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.450 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.450 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.450 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.450 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.451 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.451 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.451 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.451 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.451 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.452 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.452 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.452 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.452 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.452 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.452 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.453 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.453 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.453 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.453 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.453 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.453 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.453 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.454 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.454 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.454 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.454 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.454 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.454 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.454 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.455 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.455 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.455 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.455 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.455 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.455 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.456 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.456 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.456 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.456 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.456 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.456 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.457 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.457 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.457 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.457 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.457 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.457 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.458 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.458 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.458 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.458 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.458 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.458 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.458 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.459 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.459 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.459 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.459 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.459 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.459 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.460 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.460 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.460 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.460 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.460 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.460 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.461 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.461 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.461 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.461 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.461 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.461 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.462 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.462 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.462 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.462 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.462 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.462 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.462 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.463 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.463 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.463 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.463 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.463 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.463 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.463 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.464 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.464 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.464 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.464 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.464 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.464 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.465 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.465 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.465 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.465 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.465 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.465 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.465 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.466 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.466 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.466 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.466 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.466 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.466 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.467 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.467 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.467 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.467 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.467 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.467 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.467 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.468 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.468 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.468 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.468 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.468 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.468 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.469 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.469 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.469 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.469 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.469 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.469 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.470 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.470 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.470 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.470 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.470 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.470 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.470 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.471 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.471 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.471 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.471 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.471 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.471 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.472 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.472 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.472 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.472 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.472 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.473 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.473 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.473 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.473 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.473 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.473 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.473 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.474 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.474 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.474 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.474 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.474 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.474 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.474 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.475 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.475 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.475 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.475 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.475 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.475 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.475 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.476 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.476 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.476 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.476 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.476 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.476 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.476 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.477 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.477 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.477 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.477 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.477 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.477 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.478 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.478 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.478 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.478 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.478 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.478 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.479 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.479 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.479 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.479 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.479 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.479 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.480 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.480 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.480 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.480 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.480 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.480 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.480 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.481 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.481 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.481 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.481 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.481 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.481 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.481 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.482 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.482 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.482 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.482 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.482 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.482 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.482 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.483 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.483 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.483 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.483 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.483 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.483 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.483 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.484 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.484 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.484 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.484 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.484 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.484 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.484 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.485 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.485 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.485 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.485 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.485 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.485 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.486 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.486 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.486 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.486 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.486 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.486 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.486 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.487 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.487 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.487 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.487 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.487 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.487 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.487 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.487 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.488 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.488 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.488 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.488 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.488 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.488 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.488 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.489 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.489 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.489 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.489 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.489 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.489 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.489 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.490 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.490 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.490 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.490 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.490 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.490 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.490 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.490 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.491 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.491 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.491 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.491 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.491 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.491 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.491 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.492 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.492 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.492 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.492 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.492 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.492 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.492 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.493 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.493 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.493 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.493 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.493 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.493 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.493 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.494 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.494 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.494 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.494 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.494 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.494 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.494 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.495 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.495 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.495 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.495 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.495 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.495 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.495 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.496 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.496 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.496 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.496 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.496 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.496 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.496 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.497 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.497 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.497 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.497 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.497 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.497 2 DEBUG oslo_service.service [None req-4a5fe724-f3b1-4e52-906f-749467c8a8e5 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.498 2 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.514 2 INFO nova.virt.node [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Determined node identity fe423b93-de5a-41f7-97d1-9622ea46af54 from /var/lib/nova/compute_id
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.515 2 DEBUG nova.virt.libvirt.host [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.515 2 DEBUG nova.virt.libvirt.host [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.516 2 DEBUG nova.virt.libvirt.host [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.516 2 DEBUG nova.virt.libvirt.host [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.532 2 DEBUG nova.virt.libvirt.host [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f25af41cb20> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.535 2 DEBUG nova.virt.libvirt.host [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f25af41cb20> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.535 2 INFO nova.virt.libvirt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Connection event '1' reason 'None'
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.542 2 INFO nova.virt.libvirt.host [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Libvirt host capabilities <capabilities>
Sep 30 21:09:28 compute-0 nova_compute[192810]: 
Sep 30 21:09:28 compute-0 nova_compute[192810]:   <host>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <uuid>5322d6ed-1aa0-443a-8ad6-efe9323295c5</uuid>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <cpu>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <arch>x86_64</arch>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model>EPYC-Rome-v4</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <vendor>AMD</vendor>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <microcode version='16777317'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <signature family='23' model='49' stepping='0'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <maxphysaddr mode='emulate' bits='40'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature name='x2apic'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature name='tsc-deadline'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature name='osxsave'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature name='hypervisor'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature name='tsc_adjust'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature name='spec-ctrl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature name='stibp'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature name='arch-capabilities'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature name='ssbd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature name='cmp_legacy'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature name='topoext'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature name='virt-ssbd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature name='lbrv'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature name='tsc-scale'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature name='vmcb-clean'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature name='pause-filter'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature name='pfthreshold'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature name='svme-addr-chk'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature name='rdctl-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature name='skip-l1dfl-vmentry'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature name='mds-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature name='pschange-mc-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <pages unit='KiB' size='4'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <pages unit='KiB' size='2048'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <pages unit='KiB' size='1048576'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </cpu>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <power_management>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <suspend_mem/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <suspend_disk/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <suspend_hybrid/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </power_management>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <iommu support='no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <migration_features>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <live/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <uri_transports>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <uri_transport>tcp</uri_transport>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <uri_transport>rdma</uri_transport>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </uri_transports>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </migration_features>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <topology>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <cells num='1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <cell id='0'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:           <memory unit='KiB'>7864108</memory>
Sep 30 21:09:28 compute-0 nova_compute[192810]:           <pages unit='KiB' size='4'>1966027</pages>
Sep 30 21:09:28 compute-0 nova_compute[192810]:           <pages unit='KiB' size='2048'>0</pages>
Sep 30 21:09:28 compute-0 nova_compute[192810]:           <pages unit='KiB' size='1048576'>0</pages>
Sep 30 21:09:28 compute-0 nova_compute[192810]:           <distances>
Sep 30 21:09:28 compute-0 nova_compute[192810]:             <sibling id='0' value='10'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:           </distances>
Sep 30 21:09:28 compute-0 nova_compute[192810]:           <cpus num='8'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:           </cpus>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         </cell>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </cells>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </topology>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <cache>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </cache>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <secmodel>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model>selinux</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <doi>0</doi>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </secmodel>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <secmodel>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model>dac</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <doi>0</doi>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <baselabel type='kvm'>+107:+107</baselabel>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <baselabel type='qemu'>+107:+107</baselabel>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </secmodel>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   </host>
Sep 30 21:09:28 compute-0 nova_compute[192810]: 
Sep 30 21:09:28 compute-0 nova_compute[192810]:   <guest>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <os_type>hvm</os_type>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <arch name='i686'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <wordsize>32</wordsize>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <domain type='qemu'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <domain type='kvm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </arch>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <features>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <pae/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <nonpae/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <acpi default='on' toggle='yes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <apic default='on' toggle='no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <cpuselection/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <deviceboot/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <disksnapshot default='on' toggle='no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <externalSnapshot/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </features>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   </guest>
Sep 30 21:09:28 compute-0 nova_compute[192810]: 
Sep 30 21:09:28 compute-0 nova_compute[192810]:   <guest>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <os_type>hvm</os_type>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <arch name='x86_64'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <wordsize>64</wordsize>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <domain type='qemu'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <domain type='kvm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </arch>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <features>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <acpi default='on' toggle='yes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <apic default='on' toggle='no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <cpuselection/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <deviceboot/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <disksnapshot default='on' toggle='no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <externalSnapshot/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </features>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   </guest>
Sep 30 21:09:28 compute-0 nova_compute[192810]: 
Sep 30 21:09:28 compute-0 nova_compute[192810]: </capabilities>
Sep 30 21:09:28 compute-0 nova_compute[192810]: 
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.547 2 DEBUG nova.virt.libvirt.volume.mount [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.551 2 DEBUG nova.virt.libvirt.host [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.557 2 DEBUG nova.virt.libvirt.host [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Sep 30 21:09:28 compute-0 nova_compute[192810]: <domainCapabilities>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   <path>/usr/libexec/qemu-kvm</path>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   <domain>kvm</domain>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   <machine>pc-i440fx-rhel7.6.0</machine>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   <arch>i686</arch>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   <vcpu max='240'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   <iothreads supported='yes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   <os supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <enum name='firmware'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <loader supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='type'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>rom</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>pflash</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='readonly'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>yes</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>no</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='secure'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>no</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </loader>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   </os>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   <cpu>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <mode name='host-passthrough' supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='hostPassthroughMigratable'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>on</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>off</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </mode>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <mode name='maximum' supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='maximumMigratable'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>on</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>off</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </mode>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <mode name='host-model' supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model fallback='forbid'>EPYC-Rome</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <vendor>AMD</vendor>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <maxphysaddr mode='passthrough' limit='40'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='x2apic'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='tsc-deadline'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='hypervisor'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='tsc_adjust'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='spec-ctrl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='stibp'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='arch-capabilities'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='ssbd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='cmp_legacy'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='overflow-recov'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='succor'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='ibrs'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='amd-ssbd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='virt-ssbd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='lbrv'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='tsc-scale'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='vmcb-clean'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='flushbyasid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='pause-filter'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='pfthreshold'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='svme-addr-chk'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='lfence-always-serializing'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='rdctl-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='mds-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='pschange-mc-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='gds-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='rfds-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='disable' name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </mode>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <mode name='custom' supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Broadwell'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Broadwell-IBRS'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Broadwell-noTSX'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Broadwell-noTSX-IBRS'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Broadwell-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Broadwell-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Broadwell-v3'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Broadwell-v4'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Cascadelake-Server'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Cascadelake-Server-noTSX'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Cascadelake-Server-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Cascadelake-Server-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Cascadelake-Server-v3'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Cascadelake-Server-v4'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Cascadelake-Server-v5'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Cooperlake'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Cooperlake-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Cooperlake-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Denverton'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='mpx'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Denverton-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='mpx'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Denverton-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Denverton-v3'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Dhyana-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='EPYC-Genoa'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amd-psfd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='auto-ibrs'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='no-nested-data-bp'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='null-sel-clr-base'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='stibp-always-on'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='EPYC-Genoa-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amd-psfd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='auto-ibrs'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='no-nested-data-bp'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='null-sel-clr-base'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='stibp-always-on'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='EPYC-Milan'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='EPYC-Milan-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='EPYC-Milan-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amd-psfd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='no-nested-data-bp'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='null-sel-clr-base'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='stibp-always-on'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='EPYC-Rome'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='EPYC-Rome-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='EPYC-Rome-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='EPYC-Rome-v3'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='EPYC-v3'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='EPYC-v4'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='GraniteRapids'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-fp16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='mcdt-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pbrsb-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='prefetchiti'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='GraniteRapids-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-fp16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='mcdt-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pbrsb-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='prefetchiti'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='GraniteRapids-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-fp16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx10'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx10-128'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx10-256'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx10-512'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='mcdt-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pbrsb-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='prefetchiti'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ss'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Haswell'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Haswell-IBRS'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Haswell-noTSX'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Haswell-noTSX-IBRS'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Haswell-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Haswell-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Haswell-v3'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Haswell-v4'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Icelake-Server'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Icelake-Server-noTSX'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Icelake-Server-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Icelake-Server-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Icelake-Server-v3'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Icelake-Server-v4'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Icelake-Server-v5'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Icelake-Server-v6'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Icelake-Server-v7'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='IvyBridge'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='IvyBridge-IBRS'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='IvyBridge-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='IvyBridge-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='KnightsMill'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-4fmaps'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-4vnniw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512er'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512pf'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ss'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='KnightsMill-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-4fmaps'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-4vnniw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512er'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512pf'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ss'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Opteron_G4'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fma4'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xop'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Opteron_G4-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fma4'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xop'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Opteron_G5'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fma4'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='tbm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xop'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Opteron_G5-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fma4'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='tbm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xop'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='SapphireRapids'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='SapphireRapids-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='SapphireRapids-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='SapphireRapids-v3'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ss'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='SierraForest'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-ne-convert'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-vnni-int8'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='cmpccxadd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='mcdt-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pbrsb-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='SierraForest-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-ne-convert'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-vnni-int8'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='cmpccxadd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='mcdt-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pbrsb-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Client'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Client-IBRS'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Client-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Client-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Client-v3'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Client-v4'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Server'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Server-IBRS'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Server-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Server-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Server-v3'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Server-v4'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Server-v5'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Snowridge'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='core-capability'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='mpx'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='split-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Snowridge-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='core-capability'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='mpx'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='split-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Snowridge-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='core-capability'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='split-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Snowridge-v3'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='core-capability'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='split-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Snowridge-v4'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='athlon'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='3dnow'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='3dnowext'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='athlon-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='3dnow'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='3dnowext'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='core2duo'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ss'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='core2duo-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ss'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='coreduo'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ss'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='coreduo-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ss'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='n270'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ss'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='n270-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ss'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='phenom'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='3dnow'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='3dnowext'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='phenom-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='3dnow'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='3dnowext'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </mode>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   <memoryBacking supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <enum name='sourceType'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <value>file</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <value>anonymous</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <value>memfd</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   </memoryBacking>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <disk supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='diskDevice'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>disk</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>cdrom</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>floppy</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>lun</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='bus'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>ide</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>fdc</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>scsi</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>virtio</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>usb</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>sata</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='model'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>virtio</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>virtio-transitional</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>virtio-non-transitional</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <graphics supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='type'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>vnc</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>egl-headless</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>dbus</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </graphics>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <video supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='modelType'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>vga</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>cirrus</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>virtio</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>none</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>bochs</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>ramfb</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </video>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <hostdev supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='mode'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>subsystem</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='startupPolicy'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>default</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>mandatory</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>requisite</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>optional</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='subsysType'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>usb</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>pci</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>scsi</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='capsType'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='pciBackend'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </hostdev>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <rng supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='model'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>virtio</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>virtio-transitional</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>virtio-non-transitional</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='backendModel'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>random</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>egd</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>builtin</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <filesystem supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='driverType'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>path</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>handle</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>virtiofs</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </filesystem>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <tpm supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='model'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>tpm-tis</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>tpm-crb</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='backendModel'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>emulator</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>external</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='backendVersion'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>2.0</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </tpm>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <redirdev supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='bus'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>usb</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </redirdev>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <channel supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='type'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>pty</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>unix</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </channel>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <crypto supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='model'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='type'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>qemu</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='backendModel'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>builtin</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </crypto>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <interface supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='backendType'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>default</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>passt</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <panic supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='model'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>isa</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>hyperv</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </panic>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   <features>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <gic supported='no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <vmcoreinfo supported='yes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <genid supported='yes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <backingStoreInput supported='yes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <backup supported='yes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <async-teardown supported='yes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <ps2 supported='yes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <sev supported='no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <sgx supported='no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <hyperv supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='features'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>relaxed</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>vapic</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>spinlocks</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>vpindex</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>runtime</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>synic</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>stimer</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>reset</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>vendor_id</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>frequencies</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>reenlightenment</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>tlbflush</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>ipi</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>avic</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>emsr_bitmap</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>xmm_input</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </hyperv>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <launchSecurity supported='no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   </features>
Sep 30 21:09:28 compute-0 nova_compute[192810]: </domainCapabilities>
Sep 30 21:09:28 compute-0 nova_compute[192810]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.564 2 DEBUG nova.virt.libvirt.host [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Sep 30 21:09:28 compute-0 nova_compute[192810]: <domainCapabilities>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   <path>/usr/libexec/qemu-kvm</path>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   <domain>kvm</domain>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   <machine>pc-q35-rhel9.6.0</machine>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   <arch>i686</arch>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   <vcpu max='4096'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   <iothreads supported='yes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   <os supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <enum name='firmware'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <loader supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='type'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>rom</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>pflash</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='readonly'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>yes</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>no</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='secure'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>no</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </loader>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   </os>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   <cpu>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <mode name='host-passthrough' supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='hostPassthroughMigratable'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>on</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>off</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </mode>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <mode name='maximum' supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='maximumMigratable'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>on</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>off</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </mode>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <mode name='host-model' supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model fallback='forbid'>EPYC-Rome</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <vendor>AMD</vendor>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <maxphysaddr mode='passthrough' limit='40'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='x2apic'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='tsc-deadline'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='hypervisor'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='tsc_adjust'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='spec-ctrl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='stibp'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='arch-capabilities'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='ssbd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='cmp_legacy'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='overflow-recov'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='succor'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='ibrs'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='amd-ssbd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='virt-ssbd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='lbrv'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='tsc-scale'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='vmcb-clean'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='flushbyasid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='pause-filter'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='pfthreshold'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='svme-addr-chk'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='lfence-always-serializing'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='rdctl-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='mds-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='pschange-mc-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='gds-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='rfds-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='disable' name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </mode>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <mode name='custom' supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Broadwell'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Broadwell-IBRS'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Broadwell-noTSX'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Broadwell-noTSX-IBRS'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Broadwell-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Broadwell-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Broadwell-v3'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Broadwell-v4'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Cascadelake-Server'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Cascadelake-Server-noTSX'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Cascadelake-Server-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Cascadelake-Server-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Cascadelake-Server-v3'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Cascadelake-Server-v4'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Cascadelake-Server-v5'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Cooperlake'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Cooperlake-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Cooperlake-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Denverton'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='mpx'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Denverton-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='mpx'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Denverton-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Denverton-v3'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Dhyana-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='EPYC-Genoa'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amd-psfd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='auto-ibrs'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='no-nested-data-bp'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='null-sel-clr-base'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='stibp-always-on'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='EPYC-Genoa-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amd-psfd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='auto-ibrs'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='no-nested-data-bp'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='null-sel-clr-base'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='stibp-always-on'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='EPYC-Milan'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='EPYC-Milan-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='EPYC-Milan-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amd-psfd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='no-nested-data-bp'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='null-sel-clr-base'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='stibp-always-on'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='EPYC-Rome'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='EPYC-Rome-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='EPYC-Rome-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='EPYC-Rome-v3'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='EPYC-v3'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='EPYC-v4'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='GraniteRapids'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-fp16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='mcdt-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pbrsb-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='prefetchiti'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='GraniteRapids-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-fp16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='mcdt-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pbrsb-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='prefetchiti'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='GraniteRapids-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-fp16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx10'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx10-128'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx10-256'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx10-512'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='mcdt-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pbrsb-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='prefetchiti'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ss'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Haswell'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Haswell-IBRS'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Haswell-noTSX'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Haswell-noTSX-IBRS'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Haswell-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Haswell-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Haswell-v3'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Haswell-v4'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Icelake-Server'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Icelake-Server-noTSX'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Icelake-Server-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Icelake-Server-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Icelake-Server-v3'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Icelake-Server-v4'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Icelake-Server-v5'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Icelake-Server-v6'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Icelake-Server-v7'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='IvyBridge'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='IvyBridge-IBRS'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='IvyBridge-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='IvyBridge-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='KnightsMill'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-4fmaps'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-4vnniw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512er'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512pf'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ss'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='KnightsMill-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-4fmaps'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-4vnniw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512er'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512pf'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ss'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Opteron_G4'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fma4'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xop'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Opteron_G4-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fma4'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xop'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Opteron_G5'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fma4'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='tbm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xop'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Opteron_G5-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fma4'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='tbm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xop'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='SapphireRapids'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='SapphireRapids-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='SapphireRapids-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='SapphireRapids-v3'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ss'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='SierraForest'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-ne-convert'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-vnni-int8'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='cmpccxadd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='mcdt-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pbrsb-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='SierraForest-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-ne-convert'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-vnni-int8'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='cmpccxadd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='mcdt-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pbrsb-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Client'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Client-IBRS'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Client-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Client-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Client-v3'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Client-v4'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Server'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Server-IBRS'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Server-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Server-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Server-v3'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Server-v4'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Server-v5'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Snowridge'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='core-capability'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='mpx'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='split-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Snowridge-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='core-capability'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='mpx'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='split-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Snowridge-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='core-capability'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='split-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Snowridge-v3'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='core-capability'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='split-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Snowridge-v4'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='athlon'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='3dnow'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='3dnowext'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='athlon-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='3dnow'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='3dnowext'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='core2duo'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ss'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='core2duo-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ss'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='coreduo'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ss'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='coreduo-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ss'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='n270'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ss'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='n270-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ss'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='phenom'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='3dnow'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='3dnowext'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='phenom-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='3dnow'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='3dnowext'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </mode>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   <memoryBacking supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <enum name='sourceType'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <value>file</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <value>anonymous</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <value>memfd</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   </memoryBacking>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <disk supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='diskDevice'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>disk</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>cdrom</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>floppy</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>lun</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='bus'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>fdc</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>scsi</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>virtio</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>usb</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>sata</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='model'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>virtio</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>virtio-transitional</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>virtio-non-transitional</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <graphics supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='type'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>vnc</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>egl-headless</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>dbus</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </graphics>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <video supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='modelType'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>vga</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>cirrus</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>virtio</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>none</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>bochs</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>ramfb</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </video>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <hostdev supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='mode'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>subsystem</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='startupPolicy'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>default</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>mandatory</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>requisite</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>optional</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='subsysType'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>usb</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>pci</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>scsi</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='capsType'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='pciBackend'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </hostdev>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <rng supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='model'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>virtio</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>virtio-transitional</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>virtio-non-transitional</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='backendModel'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>random</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>egd</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>builtin</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <filesystem supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='driverType'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>path</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>handle</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>virtiofs</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </filesystem>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <tpm supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='model'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>tpm-tis</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>tpm-crb</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='backendModel'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>emulator</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>external</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='backendVersion'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>2.0</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </tpm>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <redirdev supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='bus'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>usb</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </redirdev>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <channel supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='type'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>pty</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>unix</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </channel>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <crypto supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='model'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='type'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>qemu</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='backendModel'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>builtin</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </crypto>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <interface supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='backendType'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>default</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>passt</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <panic supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='model'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>isa</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>hyperv</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </panic>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   <features>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <gic supported='no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <vmcoreinfo supported='yes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <genid supported='yes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <backingStoreInput supported='yes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <backup supported='yes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <async-teardown supported='yes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <ps2 supported='yes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <sev supported='no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <sgx supported='no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <hyperv supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='features'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>relaxed</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>vapic</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>spinlocks</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>vpindex</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>runtime</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>synic</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>stimer</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>reset</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>vendor_id</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>frequencies</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>reenlightenment</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>tlbflush</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>ipi</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>avic</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>emsr_bitmap</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>xmm_input</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </hyperv>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <launchSecurity supported='no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   </features>
Sep 30 21:09:28 compute-0 nova_compute[192810]: </domainCapabilities>
Sep 30 21:09:28 compute-0 nova_compute[192810]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.603 2 DEBUG nova.virt.libvirt.host [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.609 2 DEBUG nova.virt.libvirt.host [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Sep 30 21:09:28 compute-0 nova_compute[192810]: <domainCapabilities>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   <path>/usr/libexec/qemu-kvm</path>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   <domain>kvm</domain>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   <machine>pc-i440fx-rhel7.6.0</machine>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   <arch>x86_64</arch>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   <vcpu max='240'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   <iothreads supported='yes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   <os supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <enum name='firmware'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <loader supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='type'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>rom</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>pflash</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='readonly'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>yes</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>no</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='secure'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>no</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </loader>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   </os>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   <cpu>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <mode name='host-passthrough' supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='hostPassthroughMigratable'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>on</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>off</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </mode>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <mode name='maximum' supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='maximumMigratable'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>on</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>off</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </mode>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <mode name='host-model' supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model fallback='forbid'>EPYC-Rome</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <vendor>AMD</vendor>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <maxphysaddr mode='passthrough' limit='40'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='x2apic'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='tsc-deadline'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='hypervisor'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='tsc_adjust'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='spec-ctrl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='stibp'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='arch-capabilities'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='ssbd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='cmp_legacy'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='overflow-recov'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='succor'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='ibrs'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='amd-ssbd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='virt-ssbd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='lbrv'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='tsc-scale'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='vmcb-clean'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='flushbyasid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='pause-filter'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='pfthreshold'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='svme-addr-chk'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='lfence-always-serializing'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='rdctl-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='mds-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='pschange-mc-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='gds-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='rfds-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='disable' name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </mode>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <mode name='custom' supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Broadwell'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Broadwell-IBRS'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Broadwell-noTSX'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Broadwell-noTSX-IBRS'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Broadwell-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Broadwell-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Broadwell-v3'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Broadwell-v4'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Cascadelake-Server'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Cascadelake-Server-noTSX'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Cascadelake-Server-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Cascadelake-Server-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Cascadelake-Server-v3'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Cascadelake-Server-v4'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Cascadelake-Server-v5'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Cooperlake'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Cooperlake-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Cooperlake-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Denverton'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='mpx'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Denverton-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='mpx'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Denverton-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Denverton-v3'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Dhyana-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='EPYC-Genoa'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amd-psfd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='auto-ibrs'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='no-nested-data-bp'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='null-sel-clr-base'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='stibp-always-on'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='EPYC-Genoa-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amd-psfd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='auto-ibrs'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='no-nested-data-bp'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='null-sel-clr-base'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='stibp-always-on'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='EPYC-Milan'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='EPYC-Milan-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='EPYC-Milan-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amd-psfd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='no-nested-data-bp'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='null-sel-clr-base'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='stibp-always-on'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='EPYC-Rome'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='EPYC-Rome-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='EPYC-Rome-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='EPYC-Rome-v3'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='EPYC-v3'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='EPYC-v4'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='GraniteRapids'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-fp16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='mcdt-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pbrsb-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='prefetchiti'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='GraniteRapids-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-fp16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='mcdt-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pbrsb-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='prefetchiti'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='GraniteRapids-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-fp16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx10'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx10-128'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx10-256'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx10-512'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='mcdt-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pbrsb-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='prefetchiti'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ss'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Haswell'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Haswell-IBRS'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Haswell-noTSX'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Haswell-noTSX-IBRS'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Haswell-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Haswell-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Haswell-v3'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Haswell-v4'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Icelake-Server'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Icelake-Server-noTSX'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Icelake-Server-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Icelake-Server-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Icelake-Server-v3'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Icelake-Server-v4'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Icelake-Server-v5'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Icelake-Server-v6'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Icelake-Server-v7'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='IvyBridge'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='IvyBridge-IBRS'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='IvyBridge-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='IvyBridge-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='KnightsMill'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-4fmaps'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-4vnniw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512er'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512pf'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ss'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='KnightsMill-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-4fmaps'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-4vnniw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512er'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512pf'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ss'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Opteron_G4'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fma4'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xop'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Opteron_G4-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fma4'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xop'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Opteron_G5'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fma4'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='tbm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xop'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Opteron_G5-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fma4'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='tbm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xop'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='SapphireRapids'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='SapphireRapids-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='SapphireRapids-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='SapphireRapids-v3'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ss'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='SierraForest'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-ne-convert'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-vnni-int8'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='cmpccxadd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='mcdt-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pbrsb-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='SierraForest-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-ne-convert'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-vnni-int8'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='cmpccxadd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='mcdt-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pbrsb-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Client'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Client-IBRS'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Client-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Client-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Client-v3'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Client-v4'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Server'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Server-IBRS'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Server-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Server-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Server-v3'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Server-v4'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Server-v5'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Snowridge'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='core-capability'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='mpx'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='split-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Snowridge-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='core-capability'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='mpx'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='split-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Snowridge-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='core-capability'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='split-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Snowridge-v3'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='core-capability'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='split-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Snowridge-v4'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='athlon'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='3dnow'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='3dnowext'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='athlon-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='3dnow'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='3dnowext'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='core2duo'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ss'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='core2duo-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ss'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='coreduo'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ss'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='coreduo-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ss'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='n270'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ss'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='n270-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ss'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='phenom'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='3dnow'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='3dnowext'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='phenom-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='3dnow'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='3dnowext'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </mode>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   <memoryBacking supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <enum name='sourceType'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <value>file</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <value>anonymous</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <value>memfd</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   </memoryBacking>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <disk supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='diskDevice'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>disk</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>cdrom</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>floppy</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>lun</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='bus'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>ide</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>fdc</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>scsi</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>virtio</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>usb</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>sata</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='model'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>virtio</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>virtio-transitional</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>virtio-non-transitional</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <graphics supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='type'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>vnc</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>egl-headless</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>dbus</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </graphics>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <video supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='modelType'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>vga</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>cirrus</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>virtio</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>none</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>bochs</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>ramfb</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </video>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <hostdev supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='mode'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>subsystem</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='startupPolicy'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>default</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>mandatory</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>requisite</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>optional</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='subsysType'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>usb</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>pci</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>scsi</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='capsType'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='pciBackend'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </hostdev>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <rng supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='model'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>virtio</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>virtio-transitional</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>virtio-non-transitional</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='backendModel'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>random</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>egd</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>builtin</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <filesystem supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='driverType'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>path</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>handle</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>virtiofs</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </filesystem>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <tpm supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='model'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>tpm-tis</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>tpm-crb</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='backendModel'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>emulator</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>external</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='backendVersion'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>2.0</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </tpm>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <redirdev supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='bus'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>usb</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </redirdev>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <channel supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='type'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>pty</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>unix</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </channel>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <crypto supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='model'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='type'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>qemu</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='backendModel'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>builtin</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </crypto>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <interface supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='backendType'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>default</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>passt</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <panic supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='model'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>isa</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>hyperv</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </panic>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   <features>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <gic supported='no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <vmcoreinfo supported='yes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <genid supported='yes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <backingStoreInput supported='yes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <backup supported='yes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <async-teardown supported='yes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <ps2 supported='yes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <sev supported='no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <sgx supported='no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <hyperv supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='features'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>relaxed</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>vapic</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>spinlocks</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>vpindex</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>runtime</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>synic</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>stimer</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>reset</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>vendor_id</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>frequencies</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>reenlightenment</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>tlbflush</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>ipi</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>avic</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>emsr_bitmap</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>xmm_input</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </hyperv>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <launchSecurity supported='no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   </features>
Sep 30 21:09:28 compute-0 nova_compute[192810]: </domainCapabilities>
Sep 30 21:09:28 compute-0 nova_compute[192810]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.669 2 DEBUG nova.virt.libvirt.host [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Sep 30 21:09:28 compute-0 nova_compute[192810]: <domainCapabilities>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   <path>/usr/libexec/qemu-kvm</path>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   <domain>kvm</domain>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   <machine>pc-q35-rhel9.6.0</machine>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   <arch>x86_64</arch>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   <vcpu max='4096'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   <iothreads supported='yes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   <os supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <enum name='firmware'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <value>efi</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <loader supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='type'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>rom</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>pflash</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='readonly'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>yes</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>no</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='secure'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>yes</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>no</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </loader>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   </os>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   <cpu>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <mode name='host-passthrough' supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='hostPassthroughMigratable'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>on</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>off</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </mode>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <mode name='maximum' supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='maximumMigratable'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>on</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>off</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </mode>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <mode name='host-model' supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model fallback='forbid'>EPYC-Rome</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <vendor>AMD</vendor>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <maxphysaddr mode='passthrough' limit='40'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='x2apic'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='tsc-deadline'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='hypervisor'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='tsc_adjust'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='spec-ctrl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='stibp'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='arch-capabilities'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='ssbd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='cmp_legacy'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='overflow-recov'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='succor'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='ibrs'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='amd-ssbd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='virt-ssbd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='lbrv'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='tsc-scale'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='vmcb-clean'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='flushbyasid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='pause-filter'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='pfthreshold'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='svme-addr-chk'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='lfence-always-serializing'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='rdctl-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='mds-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='pschange-mc-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='gds-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='require' name='rfds-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <feature policy='disable' name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </mode>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <mode name='custom' supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Broadwell'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Broadwell-IBRS'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Broadwell-noTSX'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Broadwell-noTSX-IBRS'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Broadwell-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Broadwell-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Broadwell-v3'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Broadwell-v4'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Cascadelake-Server'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Cascadelake-Server-noTSX'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Cascadelake-Server-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Cascadelake-Server-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Cascadelake-Server-v3'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Cascadelake-Server-v4'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Cascadelake-Server-v5'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Cooperlake'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Cooperlake-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Cooperlake-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Denverton'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='mpx'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Denverton-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='mpx'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Denverton-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Denverton-v3'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Dhyana-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='EPYC-Genoa'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amd-psfd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='auto-ibrs'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='no-nested-data-bp'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='null-sel-clr-base'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='stibp-always-on'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='EPYC-Genoa-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amd-psfd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='auto-ibrs'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='no-nested-data-bp'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='null-sel-clr-base'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='stibp-always-on'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='EPYC-Milan'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='EPYC-Milan-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='EPYC-Milan-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amd-psfd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='no-nested-data-bp'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='null-sel-clr-base'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='stibp-always-on'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='EPYC-Rome'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='EPYC-Rome-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='EPYC-Rome-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='EPYC-Rome-v3'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='EPYC-v3'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='EPYC-v4'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='GraniteRapids'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-fp16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='mcdt-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pbrsb-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='prefetchiti'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='GraniteRapids-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-fp16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='mcdt-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pbrsb-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='prefetchiti'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='GraniteRapids-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-fp16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx10'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx10-128'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx10-256'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx10-512'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='mcdt-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pbrsb-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='prefetchiti'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ss'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Haswell'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Haswell-IBRS'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Haswell-noTSX'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Haswell-noTSX-IBRS'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Haswell-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Haswell-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Haswell-v3'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Haswell-v4'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Icelake-Server'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Icelake-Server-noTSX'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Icelake-Server-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Icelake-Server-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Icelake-Server-v3'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Icelake-Server-v4'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Icelake-Server-v5'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Icelake-Server-v6'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Icelake-Server-v7'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='IvyBridge'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='IvyBridge-IBRS'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='IvyBridge-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='IvyBridge-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='KnightsMill'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-4fmaps'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-4vnniw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512er'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512pf'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ss'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='KnightsMill-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-4fmaps'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-4vnniw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512er'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512pf'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ss'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Opteron_G4'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fma4'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xop'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Opteron_G4-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fma4'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xop'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Opteron_G5'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fma4'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='tbm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xop'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Opteron_G5-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fma4'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='tbm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xop'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='SapphireRapids'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='SapphireRapids-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='SapphireRapids-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='SapphireRapids-v3'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-int8'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='amx-tile'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-bf16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-fp16'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512-vpopcntdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bitalg'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vbmi2'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrc'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fzrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='la57'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ss'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='taa-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='tsx-ldtrk'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xfd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='SierraForest'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-ne-convert'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-vnni-int8'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='cmpccxadd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='mcdt-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pbrsb-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='SierraForest-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-ifma'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-ne-convert'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-vnni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx-vnni-int8'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='bus-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='cmpccxadd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fbsdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='fsrs'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ibrs-all'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='mcdt-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pbrsb-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='psdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='sbdr-ssdp-no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='serialize'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vaes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='vpclmulqdq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Client'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Client-IBRS'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Client-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Client-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Client-v3'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Client-v4'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Server'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Server-IBRS'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Server-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Server-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='hle'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='rtm'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Server-v3'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Server-v4'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Skylake-Server-v5'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512bw'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512cd'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512dq'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512f'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='avx512vl'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='invpcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pcid'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='pku'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Snowridge'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='core-capability'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='mpx'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='split-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Snowridge-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='core-capability'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='mpx'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='split-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Snowridge-v2'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='core-capability'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='split-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Snowridge-v3'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='core-capability'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='split-lock-detect'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='Snowridge-v4'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='cldemote'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='erms'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='gfni'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdir64b'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='movdiri'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='xsaves'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='athlon'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='3dnow'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='3dnowext'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='athlon-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='3dnow'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='3dnowext'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='core2duo'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ss'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='core2duo-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ss'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='coreduo'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ss'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='coreduo-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ss'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='n270'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ss'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='n270-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='ss'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='phenom'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='3dnow'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='3dnowext'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <blockers model='phenom-v1'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='3dnow'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <feature name='3dnowext'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </blockers>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </mode>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   <memoryBacking supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <enum name='sourceType'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <value>file</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <value>anonymous</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <value>memfd</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   </memoryBacking>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <disk supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='diskDevice'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>disk</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>cdrom</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>floppy</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>lun</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='bus'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>fdc</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>scsi</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>virtio</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>usb</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>sata</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='model'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>virtio</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>virtio-transitional</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>virtio-non-transitional</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <graphics supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='type'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>vnc</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>egl-headless</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>dbus</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </graphics>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <video supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='modelType'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>vga</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>cirrus</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>virtio</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>none</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>bochs</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>ramfb</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </video>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <hostdev supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='mode'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>subsystem</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='startupPolicy'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>default</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>mandatory</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>requisite</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>optional</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='subsysType'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>usb</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>pci</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>scsi</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='capsType'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='pciBackend'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </hostdev>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <rng supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='model'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>virtio</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>virtio-transitional</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>virtio-non-transitional</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='backendModel'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>random</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>egd</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>builtin</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <filesystem supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='driverType'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>path</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>handle</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>virtiofs</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </filesystem>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <tpm supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='model'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>tpm-tis</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>tpm-crb</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='backendModel'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>emulator</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>external</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='backendVersion'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>2.0</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </tpm>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <redirdev supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='bus'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>usb</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </redirdev>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <channel supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='type'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>pty</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>unix</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </channel>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <crypto supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='model'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='type'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>qemu</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='backendModel'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>builtin</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </crypto>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <interface supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='backendType'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>default</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>passt</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <panic supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='model'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>isa</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>hyperv</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </panic>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   <features>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <gic supported='no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <vmcoreinfo supported='yes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <genid supported='yes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <backingStoreInput supported='yes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <backup supported='yes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <async-teardown supported='yes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <ps2 supported='yes'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <sev supported='no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <sgx supported='no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <hyperv supported='yes'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       <enum name='features'>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>relaxed</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>vapic</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>spinlocks</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>vpindex</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>runtime</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>synic</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>stimer</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>reset</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>vendor_id</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>frequencies</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>reenlightenment</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>tlbflush</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>ipi</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>avic</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>emsr_bitmap</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:         <value>xmm_input</value>
Sep 30 21:09:28 compute-0 nova_compute[192810]:       </enum>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     </hyperv>
Sep 30 21:09:28 compute-0 nova_compute[192810]:     <launchSecurity supported='no'/>
Sep 30 21:09:28 compute-0 nova_compute[192810]:   </features>
Sep 30 21:09:28 compute-0 nova_compute[192810]: </domainCapabilities>
Sep 30 21:09:28 compute-0 nova_compute[192810]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.726 2 DEBUG nova.virt.libvirt.host [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.727 2 DEBUG nova.virt.libvirt.host [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.727 2 DEBUG nova.virt.libvirt.host [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.727 2 INFO nova.virt.libvirt.host [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Secure Boot support detected
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.729 2 INFO nova.virt.libvirt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.729 2 INFO nova.virt.libvirt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.740 2 DEBUG nova.virt.libvirt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] cpu compare xml: <cpu match="exact">
Sep 30 21:09:28 compute-0 nova_compute[192810]:   <model>Nehalem</model>
Sep 30 21:09:28 compute-0 nova_compute[192810]: </cpu>
Sep 30 21:09:28 compute-0 nova_compute[192810]:  _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.742 2 DEBUG nova.virt.libvirt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.780 2 INFO nova.virt.node [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Determined node identity fe423b93-de5a-41f7-97d1-9622ea46af54 from /var/lib/nova/compute_id
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.816 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Verified node fe423b93-de5a-41f7-97d1-9622ea46af54 matches my host compute-0.ctlplane.example.com _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.842 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.904 2 DEBUG oslo_concurrency.lockutils [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.905 2 DEBUG oslo_concurrency.lockutils [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.905 2 DEBUG oslo_concurrency.lockutils [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:09:28 compute-0 nova_compute[192810]: 2025-09-30 21:09:28.905 2 DEBUG nova.compute.resource_tracker [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:09:29 compute-0 nova_compute[192810]: 2025-09-30 21:09:29.074 2 WARNING nova.virt.libvirt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:09:29 compute-0 nova_compute[192810]: 2025-09-30 21:09:29.075 2 DEBUG nova.compute.resource_tracker [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6206MB free_disk=73.67367172241211GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:09:29 compute-0 nova_compute[192810]: 2025-09-30 21:09:29.075 2 DEBUG oslo_concurrency.lockutils [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:09:29 compute-0 nova_compute[192810]: 2025-09-30 21:09:29.075 2 DEBUG oslo_concurrency.lockutils [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:09:29 compute-0 nova_compute[192810]: 2025-09-30 21:09:29.240 2 DEBUG nova.compute.resource_tracker [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:09:29 compute-0 nova_compute[192810]: 2025-09-30 21:09:29.240 2 DEBUG nova.compute.resource_tracker [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:09:29 compute-0 nova_compute[192810]: 2025-09-30 21:09:29.253 2 DEBUG nova.scheduler.client.report [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Refreshing inventories for resource provider fe423b93-de5a-41f7-97d1-9622ea46af54 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Sep 30 21:09:29 compute-0 nova_compute[192810]: 2025-09-30 21:09:29.268 2 DEBUG nova.scheduler.client.report [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Updating ProviderTree inventory for provider fe423b93-de5a-41f7-97d1-9622ea46af54 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Sep 30 21:09:29 compute-0 nova_compute[192810]: 2025-09-30 21:09:29.268 2 DEBUG nova.compute.provider_tree [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Updating inventory in ProviderTree for provider fe423b93-de5a-41f7-97d1-9622ea46af54 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Sep 30 21:09:29 compute-0 nova_compute[192810]: 2025-09-30 21:09:29.280 2 DEBUG nova.scheduler.client.report [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Refreshing aggregate associations for resource provider fe423b93-de5a-41f7-97d1-9622ea46af54, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Sep 30 21:09:29 compute-0 nova_compute[192810]: 2025-09-30 21:09:29.301 2 DEBUG nova.scheduler.client.report [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Refreshing trait associations for resource provider fe423b93-de5a-41f7-97d1-9622ea46af54, traits: COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_RESCUE_BFV,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_1_2,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE42,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Sep 30 21:09:29 compute-0 nova_compute[192810]: 2025-09-30 21:09:29.320 2 DEBUG nova.virt.libvirt.host [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Sep 30 21:09:29 compute-0 nova_compute[192810]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Sep 30 21:09:29 compute-0 nova_compute[192810]: 2025-09-30 21:09:29.321 2 INFO nova.virt.libvirt.host [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] kernel doesn't support AMD SEV
Sep 30 21:09:29 compute-0 nova_compute[192810]: 2025-09-30 21:09:29.322 2 DEBUG nova.compute.provider_tree [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:09:29 compute-0 nova_compute[192810]: 2025-09-30 21:09:29.322 2 DEBUG nova.virt.libvirt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:09:29 compute-0 nova_compute[192810]: 2025-09-30 21:09:29.325 2 DEBUG nova.virt.libvirt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Libvirt baseline CPU <cpu>
Sep 30 21:09:29 compute-0 nova_compute[192810]:   <arch>x86_64</arch>
Sep 30 21:09:29 compute-0 nova_compute[192810]:   <model>Nehalem</model>
Sep 30 21:09:29 compute-0 nova_compute[192810]:   <vendor>AMD</vendor>
Sep 30 21:09:29 compute-0 nova_compute[192810]:   <topology sockets="8" cores="1" threads="1"/>
Sep 30 21:09:29 compute-0 nova_compute[192810]: </cpu>
Sep 30 21:09:29 compute-0 nova_compute[192810]:  _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537
Sep 30 21:09:29 compute-0 nova_compute[192810]: 2025-09-30 21:09:29.346 2 DEBUG nova.scheduler.client.report [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:09:29 compute-0 nova_compute[192810]: 2025-09-30 21:09:29.378 2 DEBUG nova.compute.resource_tracker [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:09:29 compute-0 nova_compute[192810]: 2025-09-30 21:09:29.378 2 DEBUG oslo_concurrency.lockutils [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.303s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:09:29 compute-0 nova_compute[192810]: 2025-09-30 21:09:29.379 2 DEBUG nova.service [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Sep 30 21:09:29 compute-0 nova_compute[192810]: 2025-09-30 21:09:29.414 2 DEBUG nova.service [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Sep 30 21:09:29 compute-0 nova_compute[192810]: 2025-09-30 21:09:29.415 2 DEBUG nova.servicegroup.drivers.db [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Sep 30 21:09:29 compute-0 nova_compute[192810]: 2025-09-30 21:09:29.416 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:09:29 compute-0 nova_compute[192810]: 2025-09-30 21:09:29.435 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:09:31 compute-0 podman[193112]: 2025-09-30 21:09:31.369955395 +0000 UTC m=+0.088542363 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid)
Sep 30 21:09:33 compute-0 sshd-session[193132]: Accepted publickey for zuul from 192.168.122.30 port 54078 ssh2: ECDSA SHA256:SmCicXXyU0CyMnob1MNtb+B3Td3Ord5lbeuM/VGGA5o
Sep 30 21:09:33 compute-0 systemd-logind[792]: New session 27 of user zuul.
Sep 30 21:09:33 compute-0 systemd[1]: Started Session 27 of User zuul.
Sep 30 21:09:33 compute-0 sshd-session[193132]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 21:09:34 compute-0 python3.9[193285]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 21:09:35 compute-0 podman[193290]: 2025-09-30 21:09:35.345300738 +0000 UTC m=+0.079515764 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Sep 30 21:09:36 compute-0 sudo[193459]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mowfflxnuehavwgrlusgpbyixrubakja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266575.7561533-73-144138778598802/AnsiballZ_systemd_service.py'
Sep 30 21:09:36 compute-0 sudo[193459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:36 compute-0 python3.9[193461]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 21:09:36 compute-0 systemd[1]: Reloading.
Sep 30 21:09:36 compute-0 systemd-rc-local-generator[193490]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:09:36 compute-0 systemd-sysv-generator[193493]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:09:37 compute-0 sudo[193459]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:38 compute-0 python3.9[193647]: ansible-ansible.builtin.service_facts Invoked
Sep 30 21:09:38 compute-0 network[193664]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Sep 30 21:09:38 compute-0 network[193665]: 'network-scripts' will be removed from distribution in near future.
Sep 30 21:09:38 compute-0 network[193666]: It is advised to switch to 'NetworkManager' instead for network management.
Sep 30 21:09:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:09:38.711 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:09:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:09:38.712 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:09:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:09:38.713 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:09:40 compute-0 podman[193720]: 2025-09-30 21:09:40.149635461 +0000 UTC m=+0.086246659 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 21:09:44 compute-0 sudo[193966]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jauhptdnjsckpwqmijiidacrfyqcnprd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266583.6375701-130-269528409412535/AnsiballZ_systemd_service.py'
Sep 30 21:09:44 compute-0 sudo[193966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:44 compute-0 python3.9[193968]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:09:44 compute-0 sudo[193966]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:45 compute-0 podman[194070]: 2025-09-30 21:09:45.336821727 +0000 UTC m=+0.063500505 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Sep 30 21:09:45 compute-0 sudo[194140]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-huqugfjpmobpjdiundyhirjkbdqkamtj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266584.830725-160-228273724153974/AnsiballZ_file.py'
Sep 30 21:09:45 compute-0 sudo[194140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:45 compute-0 python3.9[194142]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:09:45 compute-0 sudo[194140]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:45 compute-0 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 21:09:45 compute-0 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 21:09:45 compute-0 unix_chkpwd[194196]: password check failed for user (root)
Sep 30 21:09:45 compute-0 sshd-session[194121]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=45.81.23.80  user=root
Sep 30 21:09:46 compute-0 sudo[194294]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sedcxffwrjbzwdanwugdgmynlalxbjva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266585.813572-184-154232841020942/AnsiballZ_file.py'
Sep 30 21:09:46 compute-0 sudo[194294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:46 compute-0 python3.9[194296]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:09:46 compute-0 sudo[194294]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:47 compute-0 sudo[194446]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jueylsakqevydelhwfpswfzjyuzvqemq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266586.722479-211-236750347333108/AnsiballZ_command.py'
Sep 30 21:09:47 compute-0 sudo[194446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:47 compute-0 sshd-session[194121]: Failed password for root from 45.81.23.80 port 60710 ssh2
Sep 30 21:09:47 compute-0 python3.9[194448]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:09:47 compute-0 sudo[194446]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:47 compute-0 sshd-session[194121]: Received disconnect from 45.81.23.80 port 60710:11: Bye Bye [preauth]
Sep 30 21:09:47 compute-0 sshd-session[194121]: Disconnected from authenticating user root 45.81.23.80 port 60710 [preauth]
Sep 30 21:09:48 compute-0 python3.9[194600]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Sep 30 21:09:49 compute-0 sudo[194750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aymgbeweflliyxserzmaiyjyqqgkgxxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266588.6917317-265-176993005702720/AnsiballZ_systemd_service.py'
Sep 30 21:09:49 compute-0 sudo[194750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:49 compute-0 python3.9[194752]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 21:09:49 compute-0 systemd[1]: Reloading.
Sep 30 21:09:49 compute-0 systemd-rc-local-generator[194780]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:09:49 compute-0 systemd-sysv-generator[194783]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:09:49 compute-0 sudo[194750]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:50 compute-0 sudo[194937]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpyqipaqejpnownebjcarpnmeryjifem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266589.9920557-289-62283253879549/AnsiballZ_command.py'
Sep 30 21:09:50 compute-0 sudo[194937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:50 compute-0 python3.9[194939]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:09:50 compute-0 sudo[194937]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:51 compute-0 sudo[195090]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqdourfpyoqostysjrzmlsjbwmfwalbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266590.9479034-316-57901004471359/AnsiballZ_file.py'
Sep 30 21:09:51 compute-0 sudo[195090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:51 compute-0 python3.9[195092]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:09:51 compute-0 sudo[195090]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:52 compute-0 python3.9[195242]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:09:53 compute-0 python3.9[195394]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:09:53 compute-0 python3.9[195515]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759266592.6523912-364-126828596424876/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=e86e0e43000ce9ccfe5aefbf8e8f2e3d15d05584 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:09:54 compute-0 sudo[195665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dljkqcfzbxvbyaydtwcezjvpaldafjdg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266594.1946797-409-246452631308992/AnsiballZ_group.py'
Sep 30 21:09:54 compute-0 sudo[195665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:54 compute-0 python3.9[195667]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Sep 30 21:09:54 compute-0 sudo[195665]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:55 compute-0 sudo[195817]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsnaracrkdszhkanpjbcwffknreasaou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266595.3867953-442-77539625205796/AnsiballZ_getent.py'
Sep 30 21:09:55 compute-0 sudo[195817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:56 compute-0 python3.9[195819]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Sep 30 21:09:56 compute-0 sudo[195817]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:56 compute-0 sudo[195970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgqgejftgtzaxniyeucndlsbxlssaayu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266596.3097546-466-76240903539662/AnsiballZ_group.py'
Sep 30 21:09:56 compute-0 sudo[195970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:56 compute-0 python3.9[195972]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Sep 30 21:09:56 compute-0 groupadd[195973]: group added to /etc/group: name=ceilometer, GID=42405
Sep 30 21:09:56 compute-0 groupadd[195973]: group added to /etc/gshadow: name=ceilometer
Sep 30 21:09:56 compute-0 groupadd[195973]: new group: name=ceilometer, GID=42405
Sep 30 21:09:56 compute-0 sudo[195970]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:57 compute-0 sudo[196128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjapnvcmlxikpwpasskvhbdvoywjinmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266597.2014444-490-67355561470256/AnsiballZ_user.py'
Sep 30 21:09:57 compute-0 sudo[196128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:09:57 compute-0 python3.9[196130]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Sep 30 21:09:58 compute-0 useradd[196132]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/0
Sep 30 21:09:58 compute-0 useradd[196132]: add 'ceilometer' to group 'libvirt'
Sep 30 21:09:58 compute-0 useradd[196132]: add 'ceilometer' to shadow group 'libvirt'
Sep 30 21:09:58 compute-0 sudo[196128]: pam_unix(sudo:session): session closed for user root
Sep 30 21:09:59 compute-0 python3.9[196288]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:10:00 compute-0 python3.9[196409]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1759266599.0270023-568-12453336577580/.source.conf _original_basename=ceilometer.conf follow=False checksum=f74f01c63e6cdeca5458ef9aff2a1db5d6a4e4b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:10:00 compute-0 python3.9[196559]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:10:01 compute-0 python3.9[196680]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1759266600.4517903-568-35338348263277/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:10:01 compute-0 podman[196681]: 2025-09-30 21:10:01.649838548 +0000 UTC m=+0.081616104 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid)
Sep 30 21:10:02 compute-0 python3.9[196850]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:10:02 compute-0 python3.9[196971]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1759266601.7630382-568-195663495661521/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:10:03 compute-0 python3.9[197121]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:10:04 compute-0 python3.9[197273]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:10:05 compute-0 python3.9[197425]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:10:05 compute-0 podman[197520]: 2025-09-30 21:10:05.75114604 +0000 UTC m=+0.077001319 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=multipathd)
Sep 30 21:10:05 compute-0 python3.9[197558]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759266604.7963483-745-155788279259727/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:10:06 compute-0 python3.9[197716]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:10:07 compute-0 python3.9[197792]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:10:07 compute-0 python3.9[197942]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:10:08 compute-0 python3.9[198063]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759266607.3787265-745-250877568987509/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=17453a32c9d181134878b3e453cb84c3cd9bd67d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:10:09 compute-0 python3.9[198213]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:10:09 compute-0 python3.9[198334]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759266608.7991061-745-77083744711336/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:10:10 compute-0 podman[198458]: 2025-09-30 21:10:10.386681583 +0000 UTC m=+0.108650524 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Sep 30 21:10:10 compute-0 python3.9[198497]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:10:11 compute-0 python3.9[198631]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759266610.0158556-745-44209469905706/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:10:11 compute-0 python3.9[198781]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:10:12 compute-0 python3.9[198902]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759266611.2618115-745-227957678431167/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=6e4982940d2bfae88404914dfaf72552f6356d81 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:10:13 compute-0 python3.9[199052]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:10:13 compute-0 python3.9[199173]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759266612.4996836-745-121614260242546/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:10:14 compute-0 python3.9[199323]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:10:15 compute-0 python3.9[199444]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759266613.790333-745-83982654770953/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=d474f1e4c3dbd24762592c51cbe5311f0a037273 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:10:15 compute-0 podman[199568]: 2025-09-30 21:10:15.720628586 +0000 UTC m=+0.072798156 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Sep 30 21:10:15 compute-0 python3.9[199610]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:10:16 compute-0 python3.9[199734]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759266615.3404064-745-49439252700543/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=2b6bd0891e609bf38a73282f42888052b750bed6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:10:17 compute-0 python3.9[199884]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:10:17 compute-0 python3.9[200005]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759266616.778258-745-84619881461885/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=e342121a88f67e2bae7ebc05d1e6d350470198a5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:10:18 compute-0 python3.9[200155]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:10:19 compute-0 python3.9[200276]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759266618.0846996-745-179537499890151/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:10:19 compute-0 sshd[128205]: Timeout before authentication for connection from 113.240.110.90 to 38.102.83.69, pid = 182064
Sep 30 21:10:20 compute-0 python3.9[200426]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:10:21 compute-0 python3.9[200502]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/node_exporter.yaml _original_basename=node_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/node_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:10:22 compute-0 python3.9[200652]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:10:22 compute-0 python3.9[200728]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml _original_basename=podman_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/podman_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:10:23 compute-0 python3.9[200878]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:10:24 compute-0 python3.9[200954]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml _original_basename=ceilometer_prom_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:10:24 compute-0 sudo[201104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwlcfystbcqzxymoefzfnjuoviikraca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266624.365179-1312-83512344055637/AnsiballZ_file.py'
Sep 30 21:10:24 compute-0 sudo[201104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:24 compute-0 python3.9[201106]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:10:24 compute-0 sudo[201104]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:25 compute-0 sudo[201256]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jinxlcywnndvidbsgtbawipjzziofjxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266625.165705-1336-97544345450348/AnsiballZ_file.py'
Sep 30 21:10:25 compute-0 sudo[201256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:25 compute-0 python3.9[201258]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:10:25 compute-0 sudo[201256]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:26 compute-0 sudo[201408]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfdsbkfbrtrqimsfgaseoyjxgqezxcfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266625.9813898-1360-29463408873884/AnsiballZ_file.py'
Sep 30 21:10:26 compute-0 sudo[201408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:26 compute-0 python3.9[201410]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:10:26 compute-0 sudo[201408]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:27 compute-0 sudo[201560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtbdxpowncuehdfhnoqpalrimsdgaucs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266626.8143528-1384-130641166819717/AnsiballZ_systemd_service.py'
Sep 30 21:10:27 compute-0 sudo[201560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:27 compute-0 python3.9[201562]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:10:27 compute-0 systemd[1]: Reloading.
Sep 30 21:10:27 compute-0 systemd-rc-local-generator[201591]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:10:27 compute-0 systemd-sysv-generator[201595]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:10:27 compute-0 nova_compute[192810]: 2025-09-30 21:10:27.790 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:10:27 compute-0 nova_compute[192810]: 2025-09-30 21:10:27.791 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:10:27 compute-0 nova_compute[192810]: 2025-09-30 21:10:27.791 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:10:27 compute-0 nova_compute[192810]: 2025-09-30 21:10:27.792 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:10:27 compute-0 nova_compute[192810]: 2025-09-30 21:10:27.812 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 21:10:27 compute-0 nova_compute[192810]: 2025-09-30 21:10:27.813 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:10:27 compute-0 nova_compute[192810]: 2025-09-30 21:10:27.813 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:10:27 compute-0 nova_compute[192810]: 2025-09-30 21:10:27.814 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:10:27 compute-0 nova_compute[192810]: 2025-09-30 21:10:27.814 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:10:27 compute-0 nova_compute[192810]: 2025-09-30 21:10:27.814 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:10:27 compute-0 nova_compute[192810]: 2025-09-30 21:10:27.815 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:10:27 compute-0 nova_compute[192810]: 2025-09-30 21:10:27.815 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:10:27 compute-0 nova_compute[192810]: 2025-09-30 21:10:27.815 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:10:27 compute-0 nova_compute[192810]: 2025-09-30 21:10:27.847 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:10:27 compute-0 nova_compute[192810]: 2025-09-30 21:10:27.847 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:10:27 compute-0 nova_compute[192810]: 2025-09-30 21:10:27.848 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:10:27 compute-0 nova_compute[192810]: 2025-09-30 21:10:27.848 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:10:27 compute-0 systemd[1]: Listening on Podman API Socket.
Sep 30 21:10:27 compute-0 sudo[201560]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:28 compute-0 nova_compute[192810]: 2025-09-30 21:10:28.020 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:10:28 compute-0 nova_compute[192810]: 2025-09-30 21:10:28.021 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6166MB free_disk=73.67305755615234GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:10:28 compute-0 nova_compute[192810]: 2025-09-30 21:10:28.022 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:10:28 compute-0 nova_compute[192810]: 2025-09-30 21:10:28.022 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:10:28 compute-0 nova_compute[192810]: 2025-09-30 21:10:28.120 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:10:28 compute-0 nova_compute[192810]: 2025-09-30 21:10:28.121 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:10:28 compute-0 nova_compute[192810]: 2025-09-30 21:10:28.156 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:10:28 compute-0 nova_compute[192810]: 2025-09-30 21:10:28.177 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:10:28 compute-0 nova_compute[192810]: 2025-09-30 21:10:28.178 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:10:28 compute-0 nova_compute[192810]: 2025-09-30 21:10:28.179 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:10:29 compute-0 sudo[201751]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxetnwbutfhbvuiwhktxwxygvvnvtngq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266629.43982-1411-94237129487599/AnsiballZ_stat.py'
Sep 30 21:10:29 compute-0 sudo[201751]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:29 compute-0 python3.9[201753]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:10:29 compute-0 sudo[201751]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:30 compute-0 sudo[201874]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmguvhwplipnniphoxlpqlaokjaujwqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266629.43982-1411-94237129487599/AnsiballZ_copy.py'
Sep 30 21:10:30 compute-0 sudo[201874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:30 compute-0 python3.9[201876]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759266629.43982-1411-94237129487599/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:10:30 compute-0 sudo[201874]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:30 compute-0 sudo[201950]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkvlhhvnrgkvlgdmzijvzaaocnhidhvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266629.43982-1411-94237129487599/AnsiballZ_stat.py'
Sep 30 21:10:30 compute-0 sudo[201950]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:31 compute-0 python3.9[201952]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:10:31 compute-0 sudo[201950]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:31 compute-0 sudo[202073]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjpauvdozcahnqwnjbcqtlujdbbsbzny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266629.43982-1411-94237129487599/AnsiballZ_copy.py'
Sep 30 21:10:31 compute-0 sudo[202073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:31 compute-0 python3.9[202075]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759266629.43982-1411-94237129487599/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:10:31 compute-0 sudo[202073]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:31 compute-0 podman[202076]: 2025-09-30 21:10:31.869700754 +0000 UTC m=+0.062979342 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=iscsid)
Sep 30 21:10:32 compute-0 sudo[202245]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdpzpbszopkiatppgvsprolpglilsown ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266632.3604078-1495-196522194778000/AnsiballZ_container_config_data.py'
Sep 30 21:10:32 compute-0 sudo[202245]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:33 compute-0 python3.9[202247]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=ceilometer_agent_compute.json debug=False
Sep 30 21:10:33 compute-0 sudo[202245]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:33 compute-0 sudo[202397]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwxwtkufbdvtvjzmvjqztngczkmycybf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266633.423428-1522-272316033011977/AnsiballZ_container_config_hash.py'
Sep 30 21:10:33 compute-0 sudo[202397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:34 compute-0 python3.9[202399]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Sep 30 21:10:34 compute-0 sudo[202397]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:35 compute-0 sudo[202549]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhpvawbhyjxuwphgdsyjssmjkcdhlpvw ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759266634.5193598-1552-122141872128674/AnsiballZ_edpm_container_manage.py'
Sep 30 21:10:35 compute-0 sudo[202549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:35 compute-0 python3[202551]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=ceilometer_agent_compute.json log_base_path=/var/log/containers/stdouts debug=False
Sep 30 21:10:35 compute-0 podman[202589]: 2025-09-30 21:10:35.527022251 +0000 UTC m=+0.045500138 container create f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Sep 30 21:10:35 compute-0 podman[202589]: 2025-09-30 21:10:35.502666078 +0000 UTC m=+0.021143985 image pull c1fbb3a9fe801a81492a24a592ec5927cb36487bb102738c2047084bd3d79886 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Sep 30 21:10:35 compute-0 python3[202551]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck compute --label config_id=edpm --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z --volume /var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified kolla_start
Sep 30 21:10:35 compute-0 sudo[202549]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:36 compute-0 sudo[202790]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iiuackdbsepsflzrmfwgzbjpaadsrwea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266635.99684-1576-198796712386170/AnsiballZ_stat.py'
Sep 30 21:10:36 compute-0 podman[202748]: 2025-09-30 21:10:36.370993849 +0000 UTC m=+0.086840284 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 21:10:36 compute-0 sudo[202790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:36 compute-0 python3.9[202800]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:10:36 compute-0 sudo[202790]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:37 compute-0 sudo[202952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wddztiopxspryqowmmqajilfbwnzcyxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266637.0026293-1603-14793237516962/AnsiballZ_file.py'
Sep 30 21:10:37 compute-0 sudo[202952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:37 compute-0 python3.9[202954]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:10:37 compute-0 sudo[202952]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:38 compute-0 sudo[203103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tceozggiennhogewvwjxmiifzuhqjfdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266637.6733668-1603-124526263664990/AnsiballZ_copy.py'
Sep 30 21:10:38 compute-0 sudo[203103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:38 compute-0 python3.9[203105]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759266637.6733668-1603-124526263664990/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:10:38 compute-0 sudo[203103]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:10:38.712 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:10:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:10:38.713 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:10:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:10:38.713 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:10:39 compute-0 sudo[203179]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxzshlnxjdtwgksuizcbqsqkmdhoosul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266637.6733668-1603-124526263664990/AnsiballZ_systemd.py'
Sep 30 21:10:39 compute-0 sudo[203179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:39 compute-0 python3.9[203181]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 21:10:39 compute-0 systemd[1]: Reloading.
Sep 30 21:10:39 compute-0 systemd-rc-local-generator[203206]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:10:39 compute-0 systemd-sysv-generator[203212]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:10:39 compute-0 sudo[203179]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:40 compute-0 sudo[203291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvpdaoohzqfzlniqbgkzkwpiaigsvhqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266637.6733668-1603-124526263664990/AnsiballZ_systemd.py'
Sep 30 21:10:40 compute-0 sudo[203291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:40 compute-0 python3.9[203293]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:10:40 compute-0 systemd[1]: Reloading.
Sep 30 21:10:40 compute-0 systemd-rc-local-generator[203347]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:10:40 compute-0 podman[203295]: 2025-09-30 21:10:40.641808132 +0000 UTC m=+0.128904266 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Sep 30 21:10:40 compute-0 systemd-sysv-generator[203350]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:10:40 compute-0 systemd[1]: Starting ceilometer_agent_compute container...
Sep 30 21:10:40 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:10:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78df4dd46dbc703a9447a9aa008e13dd919441cf17e2c720c39fbad895d39f50/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Sep 30 21:10:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78df4dd46dbc703a9447a9aa008e13dd919441cf17e2c720c39fbad895d39f50/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Sep 30 21:10:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78df4dd46dbc703a9447a9aa008e13dd919441cf17e2c720c39fbad895d39f50/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Sep 30 21:10:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78df4dd46dbc703a9447a9aa008e13dd919441cf17e2c720c39fbad895d39f50/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Sep 30 21:10:41 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110.
Sep 30 21:10:41 compute-0 podman[203357]: 2025-09-30 21:10:41.03148702 +0000 UTC m=+0.155723500 container init f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 21:10:41 compute-0 ceilometer_agent_compute[203372]: + sudo -E kolla_set_configs
Sep 30 21:10:41 compute-0 podman[203357]: 2025-09-30 21:10:41.064657213 +0000 UTC m=+0.188893663 container start f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute)
Sep 30 21:10:41 compute-0 podman[203357]: ceilometer_agent_compute
Sep 30 21:10:41 compute-0 ceilometer_agent_compute[203372]: sudo: unable to send audit message: Operation not permitted
Sep 30 21:10:41 compute-0 sudo[203378]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Sep 30 21:10:41 compute-0 sudo[203378]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Sep 30 21:10:41 compute-0 sudo[203378]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Sep 30 21:10:41 compute-0 systemd[1]: Started ceilometer_agent_compute container.
Sep 30 21:10:41 compute-0 sudo[203291]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:41 compute-0 ceilometer_agent_compute[203372]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Sep 30 21:10:41 compute-0 ceilometer_agent_compute[203372]: INFO:__main__:Validating config file
Sep 30 21:10:41 compute-0 ceilometer_agent_compute[203372]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Sep 30 21:10:41 compute-0 ceilometer_agent_compute[203372]: INFO:__main__:Copying service configuration files
Sep 30 21:10:41 compute-0 ceilometer_agent_compute[203372]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Sep 30 21:10:41 compute-0 ceilometer_agent_compute[203372]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Sep 30 21:10:41 compute-0 ceilometer_agent_compute[203372]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Sep 30 21:10:41 compute-0 ceilometer_agent_compute[203372]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Sep 30 21:10:41 compute-0 ceilometer_agent_compute[203372]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Sep 30 21:10:41 compute-0 ceilometer_agent_compute[203372]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Sep 30 21:10:41 compute-0 ceilometer_agent_compute[203372]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Sep 30 21:10:41 compute-0 ceilometer_agent_compute[203372]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Sep 30 21:10:41 compute-0 ceilometer_agent_compute[203372]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Sep 30 21:10:41 compute-0 ceilometer_agent_compute[203372]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Sep 30 21:10:41 compute-0 ceilometer_agent_compute[203372]: INFO:__main__:Writing out command to execute
Sep 30 21:10:41 compute-0 sudo[203378]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:41 compute-0 ceilometer_agent_compute[203372]: ++ cat /run_command
Sep 30 21:10:41 compute-0 ceilometer_agent_compute[203372]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Sep 30 21:10:41 compute-0 ceilometer_agent_compute[203372]: + ARGS=
Sep 30 21:10:41 compute-0 ceilometer_agent_compute[203372]: + sudo kolla_copy_cacerts
Sep 30 21:10:41 compute-0 podman[203379]: 2025-09-30 21:10:41.17348501 +0000 UTC m=+0.089242603 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Sep 30 21:10:41 compute-0 ceilometer_agent_compute[203372]: sudo: unable to send audit message: Operation not permitted
Sep 30 21:10:41 compute-0 sudo[203403]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Sep 30 21:10:41 compute-0 sudo[203403]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Sep 30 21:10:41 compute-0 sudo[203403]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Sep 30 21:10:41 compute-0 systemd[1]: f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110-79acdf685856f64.service: Main process exited, code=exited, status=1/FAILURE
Sep 30 21:10:41 compute-0 systemd[1]: f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110-79acdf685856f64.service: Failed with result 'exit-code'.
Sep 30 21:10:41 compute-0 sudo[203403]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:41 compute-0 ceilometer_agent_compute[203372]: + [[ ! -n '' ]]
Sep 30 21:10:41 compute-0 ceilometer_agent_compute[203372]: + . kolla_extend_start
Sep 30 21:10:41 compute-0 ceilometer_agent_compute[203372]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Sep 30 21:10:41 compute-0 ceilometer_agent_compute[203372]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Sep 30 21:10:41 compute-0 ceilometer_agent_compute[203372]: + umask 0022
Sep 30 21:10:41 compute-0 ceilometer_agent_compute[203372]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Sep 30 21:10:41 compute-0 sudo[203554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khssaccobqfmrsuvvagjrgwgvyzysmfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266641.342967-1675-196679088490328/AnsiballZ_systemd.py'
Sep 30 21:10:41 compute-0 sudo[203554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:41 compute-0 sshd-session[203479]: Invalid user support from 45.81.23.80 port 55560
Sep 30 21:10:41 compute-0 sshd-session[203479]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:10:41 compute-0 sshd-session[203479]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=45.81.23.80
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.003 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.003 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.003 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.003 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.003 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.004 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.004 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.004 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.004 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.004 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.004 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.004 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.004 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.004 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.004 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.005 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.005 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.005 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.005 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.005 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.005 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.005 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.005 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.005 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.005 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.005 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.006 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.006 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.006 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.006 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.006 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.006 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.006 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.006 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.006 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.006 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.006 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.006 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.006 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.007 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.007 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.007 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.007 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.007 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.007 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.007 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.007 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.007 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.007 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.007 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.008 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.008 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.008 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.008 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.008 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.008 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.008 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.008 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.008 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.008 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.009 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.009 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.009 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.009 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.009 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.009 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.009 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.009 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.009 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.010 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.010 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.010 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.010 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.010 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.010 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.010 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.010 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.010 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.011 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.011 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.011 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.011 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.011 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.011 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.011 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.011 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.011 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.012 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.012 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.012 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.012 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.012 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.012 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.012 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.012 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.012 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.012 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.012 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.013 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.013 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.013 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.013 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.013 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.013 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.013 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.013 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.013 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.013 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.013 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.014 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.014 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.014 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.014 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.014 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.014 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.014 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.014 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.014 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.014 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.014 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.015 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.015 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.015 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.015 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.015 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.015 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.015 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.015 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.015 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.015 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.016 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.016 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.016 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.016 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.016 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.016 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.016 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.016 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.016 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.016 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.016 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.017 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.017 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.017 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.017 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.017 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.017 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.017 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.017 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 python3.9[203556]: ansible-ansible.builtin.systemd Invoked with name=edpm_ceilometer_agent_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.017 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.017 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.018 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.018 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.018 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.018 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.018 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.018 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.018 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.018 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.018 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.019 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.019 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.019 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.019 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.019 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.041 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.043 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.045 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Sep 30 21:10:42 compute-0 systemd[1]: Stopping ceilometer_agent_compute container...
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.155 2 INFO cotyledon._service_manager [-] Caught SIGTERM signal, graceful exiting of master process
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.161 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.234 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.234 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.234 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.234 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.234 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.234 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.234 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.234 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.235 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.235 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.235 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.235 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.235 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.235 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.235 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.235 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.235 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.236 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.236 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.236 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.236 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.236 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.236 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.236 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.236 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.236 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.236 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.236 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.236 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.237 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.237 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.237 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.237 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.237 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.237 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.237 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.237 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.237 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.237 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.237 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.237 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.238 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.238 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.238 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.238 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.238 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.238 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.238 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.238 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.238 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.238 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.239 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.239 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.239 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.239 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.239 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.239 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.239 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.239 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.239 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.240 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.240 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.240 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.240 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.240 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.240 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.240 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.240 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.240 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.241 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.241 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.241 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.241 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.241 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.241 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.241 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.241 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.241 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.241 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.242 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.242 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.242 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.242 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.242 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.242 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.242 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.242 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.242 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.242 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.242 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.243 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.243 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.243 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.243 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.243 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.243 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.243 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.243 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.243 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.243 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.243 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.243 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.244 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.244 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.244 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.244 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.244 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.244 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.244 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.244 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.244 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.244 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.245 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.245 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.245 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.245 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.245 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.245 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.245 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.245 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.245 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.245 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.245 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.246 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.246 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.246 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.246 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.246 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.246 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.246 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.246 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.246 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.246 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.246 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.246 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.247 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.247 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.247 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.247 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.247 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.247 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.247 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.247 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.247 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.247 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.247 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.247 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.248 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.248 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.248 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.248 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.248 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.248 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.248 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.248 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.248 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.248 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.248 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.248 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.248 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.249 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.249 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.249 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.249 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.249 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.249 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.249 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.249 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.249 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.249 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.250 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.250 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.250 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.250 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.250 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.250 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.250 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.250 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.250 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.250 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.250 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.251 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.251 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.251 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.251 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.251 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.251 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.251 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.251 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.251 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.251 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.251 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.251 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.252 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.252 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.252 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.252 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.252 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.252 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.252 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.252 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.252 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.252 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.252 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.252 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.253 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.253 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.253 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.253 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.253 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.253 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.253 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.253 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.253 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.253 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.253 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.253 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.254 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.254 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.256 2 DEBUG cotyledon._service_manager [-] Killing services with signal SIGTERM _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:304
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.257 2 DEBUG cotyledon._service_manager [-] Waiting services to terminate _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:308
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.257 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.261 12 INFO cotyledon._service [-] Caught SIGTERM signal, graceful exiting of service AgentManager(0) [12]
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203372]: 2025-09-30 21:10:42.278 2 DEBUG cotyledon._service_manager [-] Shutdown finish _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:320
Sep 30 21:10:42 compute-0 virtqemud[192233]: End of file while reading data: Input/output error
Sep 30 21:10:42 compute-0 systemd[1]: libpod-f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110.scope: Deactivated successfully.
Sep 30 21:10:42 compute-0 systemd[1]: libpod-f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110.scope: Consumed 1.393s CPU time.
Sep 30 21:10:42 compute-0 podman[203563]: 2025-09-30 21:10:42.444078717 +0000 UTC m=+0.339809888 container died f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_id=edpm)
Sep 30 21:10:42 compute-0 systemd[1]: f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110-79acdf685856f64.timer: Deactivated successfully.
Sep 30 21:10:42 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110.
Sep 30 21:10:42 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110-userdata-shm.mount: Deactivated successfully.
Sep 30 21:10:42 compute-0 systemd[1]: var-lib-containers-storage-overlay-78df4dd46dbc703a9447a9aa008e13dd919441cf17e2c720c39fbad895d39f50-merged.mount: Deactivated successfully.
Sep 30 21:10:42 compute-0 podman[203563]: 2025-09-30 21:10:42.527752609 +0000 UTC m=+0.423483780 container cleanup f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:10:42 compute-0 podman[203563]: ceilometer_agent_compute
Sep 30 21:10:42 compute-0 podman[203591]: ceilometer_agent_compute
Sep 30 21:10:42 compute-0 systemd[1]: edpm_ceilometer_agent_compute.service: Deactivated successfully.
Sep 30 21:10:42 compute-0 systemd[1]: Stopped ceilometer_agent_compute container.
Sep 30 21:10:42 compute-0 systemd[1]: Starting ceilometer_agent_compute container...
Sep 30 21:10:42 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:10:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78df4dd46dbc703a9447a9aa008e13dd919441cf17e2c720c39fbad895d39f50/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Sep 30 21:10:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78df4dd46dbc703a9447a9aa008e13dd919441cf17e2c720c39fbad895d39f50/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Sep 30 21:10:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78df4dd46dbc703a9447a9aa008e13dd919441cf17e2c720c39fbad895d39f50/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Sep 30 21:10:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78df4dd46dbc703a9447a9aa008e13dd919441cf17e2c720c39fbad895d39f50/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Sep 30 21:10:42 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110.
Sep 30 21:10:42 compute-0 podman[203604]: 2025-09-30 21:10:42.769452061 +0000 UTC m=+0.134501960 container init f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ceilometer_agent_compute, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203619]: + sudo -E kolla_set_configs
Sep 30 21:10:42 compute-0 podman[203604]: 2025-09-30 21:10:42.810045736 +0000 UTC m=+0.175095585 container start f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible)
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203619]: sudo: unable to send audit message: Operation not permitted
Sep 30 21:10:42 compute-0 sudo[203625]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Sep 30 21:10:42 compute-0 podman[203604]: ceilometer_agent_compute
Sep 30 21:10:42 compute-0 sudo[203625]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Sep 30 21:10:42 compute-0 sudo[203625]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Sep 30 21:10:42 compute-0 systemd[1]: Started ceilometer_agent_compute container.
Sep 30 21:10:42 compute-0 sudo[203554]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203619]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203619]: INFO:__main__:Validating config file
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203619]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203619]: INFO:__main__:Copying service configuration files
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203619]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203619]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203619]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203619]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203619]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203619]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203619]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203619]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203619]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203619]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203619]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203619]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203619]: INFO:__main__:Writing out command to execute
Sep 30 21:10:42 compute-0 sudo[203625]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203619]: ++ cat /run_command
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203619]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203619]: + ARGS=
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203619]: + sudo kolla_copy_cacerts
Sep 30 21:10:42 compute-0 podman[203626]: 2025-09-30 21:10:42.911274742 +0000 UTC m=+0.081055797 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ceilometer_agent_compute)
Sep 30 21:10:42 compute-0 systemd[1]: f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110-1258ad40e00476e1.service: Main process exited, code=exited, status=1/FAILURE
Sep 30 21:10:42 compute-0 systemd[1]: f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110-1258ad40e00476e1.service: Failed with result 'exit-code'.
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203619]: sudo: unable to send audit message: Operation not permitted
Sep 30 21:10:42 compute-0 sudo[203647]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Sep 30 21:10:42 compute-0 sudo[203647]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Sep 30 21:10:42 compute-0 sudo[203647]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Sep 30 21:10:42 compute-0 sudo[203647]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203619]: + [[ ! -n '' ]]
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203619]: + . kolla_extend_start
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203619]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203619]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203619]: + umask 0022
Sep 30 21:10:42 compute-0 ceilometer_agent_compute[203619]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Sep 30 21:10:43 compute-0 sudo[203797]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxtoodzdjaiwpixykiamlsrakayccvlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266643.10918-1699-105888126624040/AnsiballZ_stat.py'
Sep 30 21:10:43 compute-0 sudo[203797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:43 compute-0 python3.9[203799]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:10:43 compute-0 sudo[203797]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.680 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.680 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.680 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.680 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.681 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.681 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.681 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.681 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.681 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.681 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.681 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.681 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.681 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.681 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.681 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.681 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.682 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.682 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.682 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.682 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.682 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.682 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.682 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.682 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.682 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.682 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.682 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.682 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.682 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.683 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.683 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.683 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.683 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.683 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.683 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.683 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.683 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.683 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.683 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.683 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.683 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.683 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.684 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.684 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.684 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.684 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.684 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.684 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.684 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.684 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.684 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.684 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.684 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.684 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.684 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.685 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.685 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.685 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.685 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.685 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.685 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.685 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.685 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.685 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.685 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.685 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.685 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.685 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.686 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.686 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.686 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.686 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.686 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.686 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.686 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.686 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.686 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.686 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.686 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.686 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.686 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.687 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.687 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.687 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.687 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.687 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.687 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.687 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.687 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.687 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.687 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.687 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.687 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.688 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.688 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.688 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.688 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.688 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.688 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.688 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.688 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.688 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.688 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.688 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.688 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.688 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.689 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.689 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.689 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.689 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.689 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.689 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.689 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.689 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.689 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.689 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.689 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.690 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.690 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.690 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.690 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.690 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.690 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.690 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.690 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.690 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.690 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.690 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.690 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.690 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.691 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.691 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.691 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.691 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.691 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.691 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.691 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.691 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.691 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.691 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.691 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.691 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.691 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.692 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.692 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.692 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.692 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.692 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.692 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.692 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.692 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.692 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.692 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.692 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.692 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.692 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.693 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.693 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.693 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.693 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.693 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.693 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.693 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.693 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.693 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.712 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.713 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.715 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.727 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Sep 30 21:10:43 compute-0 sshd-session[203479]: Failed password for invalid user support from 45.81.23.80 port 55560 ssh2
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.871 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.871 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.871 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.871 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.872 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.872 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.872 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.872 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.872 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.872 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.872 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.872 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.872 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.873 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.873 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.873 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.873 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.873 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.873 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.873 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.873 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.873 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.874 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.874 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.874 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.874 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.874 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.874 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.874 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.874 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.874 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.874 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.874 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.874 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.875 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.875 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.875 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.875 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.875 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.875 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.875 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.875 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.875 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.875 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.875 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.876 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.876 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.876 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.876 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.876 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.876 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.876 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.876 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.876 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.876 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.876 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.877 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.877 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.877 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.877 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.877 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.877 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.877 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.877 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.877 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.877 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.877 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.877 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.878 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.878 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.878 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.878 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.878 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.878 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.878 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.878 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.878 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.878 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.878 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.879 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.879 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.879 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.879 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.879 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.879 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.879 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.879 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.879 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.879 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.879 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.880 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.880 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.880 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.880 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.880 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.880 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.880 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.880 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.880 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.880 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.881 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.881 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.881 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.881 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.881 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.881 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.881 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.881 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.881 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.881 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.882 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.882 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.882 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.882 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.882 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.882 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.882 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.882 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.882 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.882 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.882 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.883 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.883 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.883 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.883 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.883 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.883 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.883 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.883 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.883 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.883 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.884 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.884 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.884 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.884 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.884 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.884 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.884 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.884 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.884 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.884 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.884 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.885 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.885 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.885 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.885 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.885 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.885 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.885 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.885 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.885 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.885 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.885 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.886 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.886 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.886 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.886 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.886 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.886 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.886 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.886 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.886 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.886 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.886 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.887 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.887 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.887 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.887 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.887 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.887 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.887 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.887 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.887 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.887 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.887 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.887 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.888 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.888 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.888 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.888 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.888 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.888 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.888 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.888 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.888 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.888 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.888 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.888 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.889 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.889 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.889 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.889 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.889 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.889 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.889 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.889 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.889 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.889 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.889 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.889 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.890 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.890 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.890 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.890 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.890 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.890 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.890 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.890 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.890 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.890 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.890 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.890 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.891 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.891 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.891 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.891 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.891 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.891 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.891 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.894 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.899 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.903 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.903 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.903 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.903 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.903 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.904 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.904 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.904 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.904 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.904 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.904 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.904 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.904 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.904 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.904 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.904 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.905 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.905 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.905 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.905 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.905 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.905 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.905 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.905 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:10:43.905 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:10:44 compute-0 sudo[203926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilqczrflwijfnsqrvmimbjuzpwxleaje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266643.10918-1699-105888126624040/AnsiballZ_copy.py'
Sep 30 21:10:44 compute-0 sudo[203926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:44 compute-0 python3.9[203928]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759266643.10918-1699-105888126624040/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:10:44 compute-0 sudo[203926]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:45 compute-0 sudo[204078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ouxvupdoqsbbdyfhpyzoglsjlmygvrgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266644.820243-1750-199351204054626/AnsiballZ_container_config_data.py'
Sep 30 21:10:45 compute-0 sudo[204078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:45 compute-0 sshd-session[203479]: Received disconnect from 45.81.23.80 port 55560:11: Bye Bye [preauth]
Sep 30 21:10:45 compute-0 sshd-session[203479]: Disconnected from invalid user support 45.81.23.80 port 55560 [preauth]
Sep 30 21:10:45 compute-0 python3.9[204080]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=node_exporter.json debug=False
Sep 30 21:10:45 compute-0 sudo[204078]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:46 compute-0 podman[204204]: 2025-09-30 21:10:46.166733015 +0000 UTC m=+0.062567200 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Sep 30 21:10:46 compute-0 sudo[204247]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-taxqbkbopmmqdfxsgfjordktdylbonup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266645.8269937-1777-39784232017214/AnsiballZ_container_config_hash.py'
Sep 30 21:10:46 compute-0 sudo[204247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:46 compute-0 python3.9[204251]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Sep 30 21:10:46 compute-0 sudo[204247]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:47 compute-0 sudo[204401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kyllouijjpyoejfehetjdgrevxnhyunl ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759266646.9376333-1807-84134373610297/AnsiballZ_edpm_container_manage.py'
Sep 30 21:10:47 compute-0 sudo[204401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:47 compute-0 python3[204403]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=node_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Sep 30 21:10:47 compute-0 podman[204441]: 2025-09-30 21:10:47.813654425 +0000 UTC m=+0.053260179 container create c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, config_id=edpm, container_name=node_exporter)
Sep 30 21:10:47 compute-0 podman[204441]: 2025-09-30 21:10:47.782220167 +0000 UTC m=+0.021825941 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Sep 30 21:10:47 compute-0 python3[204403]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck node_exporter --label config_id=edpm --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter:v1.5.0 --web.config.file=/etc/node_exporter/node_exporter.yaml --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl
Sep 30 21:10:47 compute-0 sudo[204401]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:48 compute-0 sudo[204629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzlzmtweqwymmqqqvownwgrmefvkhvgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266648.195919-1831-63258770012349/AnsiballZ_stat.py'
Sep 30 21:10:48 compute-0 sudo[204629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:48 compute-0 python3.9[204631]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:10:48 compute-0 sudo[204629]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:49 compute-0 sudo[204783]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yadbbvfwsdturpzqgczsieksfmgntnnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266649.2087145-1858-156955571898941/AnsiballZ_file.py'
Sep 30 21:10:49 compute-0 sudo[204783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:49 compute-0 python3.9[204785]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:10:49 compute-0 sudo[204783]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:50 compute-0 sudo[204934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtbkcmvakewqdzdrkzqzdejkulhgjpbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266649.8002532-1858-115167132771518/AnsiballZ_copy.py'
Sep 30 21:10:50 compute-0 sudo[204934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:50 compute-0 python3.9[204936]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759266649.8002532-1858-115167132771518/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:10:50 compute-0 sudo[204934]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:50 compute-0 sudo[205010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gehzrhscjusyyqxqkmvwsoytxtmkzrif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266649.8002532-1858-115167132771518/AnsiballZ_systemd.py'
Sep 30 21:10:50 compute-0 sudo[205010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:51 compute-0 python3.9[205012]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 21:10:51 compute-0 systemd[1]: Reloading.
Sep 30 21:10:51 compute-0 systemd-rc-local-generator[205037]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:10:51 compute-0 systemd-sysv-generator[205043]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:10:51 compute-0 sudo[205010]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:51 compute-0 sudo[205121]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nltxlahgouzlitcjoomehaykuowktkkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266649.8002532-1858-115167132771518/AnsiballZ_systemd.py'
Sep 30 21:10:51 compute-0 sudo[205121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:52 compute-0 python3.9[205123]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:10:52 compute-0 systemd[1]: Reloading.
Sep 30 21:10:52 compute-0 systemd-rc-local-generator[205150]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:10:52 compute-0 systemd-sysv-generator[205153]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:10:52 compute-0 systemd[1]: Starting node_exporter container...
Sep 30 21:10:52 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:10:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7d776320abbce92674e6009c9ab33af21fc3fa6eb58eaf1e7d3f9ddbccd4d296/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Sep 30 21:10:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7d776320abbce92674e6009c9ab33af21fc3fa6eb58eaf1e7d3f9ddbccd4d296/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Sep 30 21:10:52 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4.
Sep 30 21:10:52 compute-0 podman[205163]: 2025-09-30 21:10:52.868727659 +0000 UTC m=+0.171848486 container init c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:10:52 compute-0 node_exporter[205178]: ts=2025-09-30T21:10:52.888Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Sep 30 21:10:52 compute-0 node_exporter[205178]: ts=2025-09-30T21:10:52.888Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Sep 30 21:10:52 compute-0 node_exporter[205178]: ts=2025-09-30T21:10:52.888Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Sep 30 21:10:52 compute-0 node_exporter[205178]: ts=2025-09-30T21:10:52.889Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Sep 30 21:10:52 compute-0 node_exporter[205178]: ts=2025-09-30T21:10:52.889Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Sep 30 21:10:52 compute-0 node_exporter[205178]: ts=2025-09-30T21:10:52.890Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Sep 30 21:10:52 compute-0 node_exporter[205178]: ts=2025-09-30T21:10:52.890Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Sep 30 21:10:52 compute-0 node_exporter[205178]: ts=2025-09-30T21:10:52.890Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Sep 30 21:10:52 compute-0 node_exporter[205178]: ts=2025-09-30T21:10:52.890Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Sep 30 21:10:52 compute-0 node_exporter[205178]: ts=2025-09-30T21:10:52.890Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Sep 30 21:10:52 compute-0 node_exporter[205178]: ts=2025-09-30T21:10:52.890Z caller=node_exporter.go:117 level=info collector=arp
Sep 30 21:10:52 compute-0 node_exporter[205178]: ts=2025-09-30T21:10:52.890Z caller=node_exporter.go:117 level=info collector=bcache
Sep 30 21:10:52 compute-0 node_exporter[205178]: ts=2025-09-30T21:10:52.890Z caller=node_exporter.go:117 level=info collector=bonding
Sep 30 21:10:52 compute-0 node_exporter[205178]: ts=2025-09-30T21:10:52.890Z caller=node_exporter.go:117 level=info collector=btrfs
Sep 30 21:10:52 compute-0 node_exporter[205178]: ts=2025-09-30T21:10:52.890Z caller=node_exporter.go:117 level=info collector=conntrack
Sep 30 21:10:52 compute-0 node_exporter[205178]: ts=2025-09-30T21:10:52.890Z caller=node_exporter.go:117 level=info collector=cpu
Sep 30 21:10:52 compute-0 node_exporter[205178]: ts=2025-09-30T21:10:52.890Z caller=node_exporter.go:117 level=info collector=cpufreq
Sep 30 21:10:52 compute-0 node_exporter[205178]: ts=2025-09-30T21:10:52.890Z caller=node_exporter.go:117 level=info collector=diskstats
Sep 30 21:10:52 compute-0 node_exporter[205178]: ts=2025-09-30T21:10:52.890Z caller=node_exporter.go:117 level=info collector=edac
Sep 30 21:10:52 compute-0 node_exporter[205178]: ts=2025-09-30T21:10:52.890Z caller=node_exporter.go:117 level=info collector=fibrechannel
Sep 30 21:10:52 compute-0 node_exporter[205178]: ts=2025-09-30T21:10:52.890Z caller=node_exporter.go:117 level=info collector=filefd
Sep 30 21:10:52 compute-0 node_exporter[205178]: ts=2025-09-30T21:10:52.890Z caller=node_exporter.go:117 level=info collector=filesystem
Sep 30 21:10:52 compute-0 node_exporter[205178]: ts=2025-09-30T21:10:52.890Z caller=node_exporter.go:117 level=info collector=infiniband
Sep 30 21:10:52 compute-0 node_exporter[205178]: ts=2025-09-30T21:10:52.890Z caller=node_exporter.go:117 level=info collector=ipvs
Sep 30 21:10:52 compute-0 node_exporter[205178]: ts=2025-09-30T21:10:52.890Z caller=node_exporter.go:117 level=info collector=loadavg
Sep 30 21:10:52 compute-0 node_exporter[205178]: ts=2025-09-30T21:10:52.890Z caller=node_exporter.go:117 level=info collector=mdadm
Sep 30 21:10:52 compute-0 node_exporter[205178]: ts=2025-09-30T21:10:52.890Z caller=node_exporter.go:117 level=info collector=meminfo
Sep 30 21:10:52 compute-0 node_exporter[205178]: ts=2025-09-30T21:10:52.890Z caller=node_exporter.go:117 level=info collector=netclass
Sep 30 21:10:52 compute-0 node_exporter[205178]: ts=2025-09-30T21:10:52.890Z caller=node_exporter.go:117 level=info collector=netdev
Sep 30 21:10:52 compute-0 node_exporter[205178]: ts=2025-09-30T21:10:52.890Z caller=node_exporter.go:117 level=info collector=netstat
Sep 30 21:10:52 compute-0 node_exporter[205178]: ts=2025-09-30T21:10:52.890Z caller=node_exporter.go:117 level=info collector=nfs
Sep 30 21:10:52 compute-0 node_exporter[205178]: ts=2025-09-30T21:10:52.890Z caller=node_exporter.go:117 level=info collector=nfsd
Sep 30 21:10:52 compute-0 node_exporter[205178]: ts=2025-09-30T21:10:52.890Z caller=node_exporter.go:117 level=info collector=nvme
Sep 30 21:10:52 compute-0 node_exporter[205178]: ts=2025-09-30T21:10:52.890Z caller=node_exporter.go:117 level=info collector=schedstat
Sep 30 21:10:52 compute-0 node_exporter[205178]: ts=2025-09-30T21:10:52.890Z caller=node_exporter.go:117 level=info collector=sockstat
Sep 30 21:10:52 compute-0 node_exporter[205178]: ts=2025-09-30T21:10:52.890Z caller=node_exporter.go:117 level=info collector=softnet
Sep 30 21:10:52 compute-0 node_exporter[205178]: ts=2025-09-30T21:10:52.890Z caller=node_exporter.go:117 level=info collector=systemd
Sep 30 21:10:52 compute-0 node_exporter[205178]: ts=2025-09-30T21:10:52.890Z caller=node_exporter.go:117 level=info collector=tapestats
Sep 30 21:10:52 compute-0 node_exporter[205178]: ts=2025-09-30T21:10:52.890Z caller=node_exporter.go:117 level=info collector=udp_queues
Sep 30 21:10:52 compute-0 node_exporter[205178]: ts=2025-09-30T21:10:52.890Z caller=node_exporter.go:117 level=info collector=vmstat
Sep 30 21:10:52 compute-0 node_exporter[205178]: ts=2025-09-30T21:10:52.890Z caller=node_exporter.go:117 level=info collector=xfs
Sep 30 21:10:52 compute-0 node_exporter[205178]: ts=2025-09-30T21:10:52.890Z caller=node_exporter.go:117 level=info collector=zfs
Sep 30 21:10:52 compute-0 node_exporter[205178]: ts=2025-09-30T21:10:52.891Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Sep 30 21:10:52 compute-0 node_exporter[205178]: ts=2025-09-30T21:10:52.892Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Sep 30 21:10:52 compute-0 podman[205163]: 2025-09-30 21:10:52.904949965 +0000 UTC m=+0.208070752 container start c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Sep 30 21:10:52 compute-0 podman[205163]: node_exporter
Sep 30 21:10:52 compute-0 systemd[1]: Started node_exporter container.
Sep 30 21:10:52 compute-0 sudo[205121]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:52 compute-0 podman[205187]: 2025-09-30 21:10:52.993291332 +0000 UTC m=+0.073249544 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:10:53 compute-0 sudo[205362]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prjtmszvifilfyjlvtshltwmhashrwpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266653.1844938-1930-43035032154053/AnsiballZ_systemd.py'
Sep 30 21:10:53 compute-0 sudo[205362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:53 compute-0 python3.9[205364]: ansible-ansible.builtin.systemd Invoked with name=edpm_node_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 21:10:53 compute-0 systemd[1]: Stopping node_exporter container...
Sep 30 21:10:53 compute-0 systemd[1]: libpod-c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4.scope: Deactivated successfully.
Sep 30 21:10:53 compute-0 podman[205368]: 2025-09-30 21:10:53.951853312 +0000 UTC m=+0.051583018 container died c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:10:53 compute-0 systemd[1]: c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4-50fb2a9d1bdff934.timer: Deactivated successfully.
Sep 30 21:10:53 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4.
Sep 30 21:10:53 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4-userdata-shm.mount: Deactivated successfully.
Sep 30 21:10:54 compute-0 systemd[1]: var-lib-containers-storage-overlay-7d776320abbce92674e6009c9ab33af21fc3fa6eb58eaf1e7d3f9ddbccd4d296-merged.mount: Deactivated successfully.
Sep 30 21:10:54 compute-0 podman[205368]: 2025-09-30 21:10:54.010459653 +0000 UTC m=+0.110189339 container cleanup c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Sep 30 21:10:54 compute-0 podman[205368]: node_exporter
Sep 30 21:10:54 compute-0 systemd[1]: edpm_node_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Sep 30 21:10:54 compute-0 podman[205396]: node_exporter
Sep 30 21:10:54 compute-0 systemd[1]: edpm_node_exporter.service: Failed with result 'exit-code'.
Sep 30 21:10:54 compute-0 systemd[1]: Stopped node_exporter container.
Sep 30 21:10:54 compute-0 systemd[1]: Starting node_exporter container...
Sep 30 21:10:54 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:10:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7d776320abbce92674e6009c9ab33af21fc3fa6eb58eaf1e7d3f9ddbccd4d296/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Sep 30 21:10:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7d776320abbce92674e6009c9ab33af21fc3fa6eb58eaf1e7d3f9ddbccd4d296/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Sep 30 21:10:54 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4.
Sep 30 21:10:54 compute-0 podman[205410]: 2025-09-30 21:10:54.254364441 +0000 UTC m=+0.131133218 container init c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Sep 30 21:10:54 compute-0 node_exporter[205425]: ts=2025-09-30T21:10:54.266Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Sep 30 21:10:54 compute-0 node_exporter[205425]: ts=2025-09-30T21:10:54.267Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Sep 30 21:10:54 compute-0 node_exporter[205425]: ts=2025-09-30T21:10:54.267Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Sep 30 21:10:54 compute-0 node_exporter[205425]: ts=2025-09-30T21:10:54.267Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Sep 30 21:10:54 compute-0 node_exporter[205425]: ts=2025-09-30T21:10:54.267Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Sep 30 21:10:54 compute-0 node_exporter[205425]: ts=2025-09-30T21:10:54.268Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Sep 30 21:10:54 compute-0 node_exporter[205425]: ts=2025-09-30T21:10:54.268Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Sep 30 21:10:54 compute-0 node_exporter[205425]: ts=2025-09-30T21:10:54.268Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Sep 30 21:10:54 compute-0 node_exporter[205425]: ts=2025-09-30T21:10:54.268Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Sep 30 21:10:54 compute-0 node_exporter[205425]: ts=2025-09-30T21:10:54.268Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Sep 30 21:10:54 compute-0 node_exporter[205425]: ts=2025-09-30T21:10:54.268Z caller=node_exporter.go:117 level=info collector=arp
Sep 30 21:10:54 compute-0 node_exporter[205425]: ts=2025-09-30T21:10:54.268Z caller=node_exporter.go:117 level=info collector=bcache
Sep 30 21:10:54 compute-0 node_exporter[205425]: ts=2025-09-30T21:10:54.268Z caller=node_exporter.go:117 level=info collector=bonding
Sep 30 21:10:54 compute-0 node_exporter[205425]: ts=2025-09-30T21:10:54.268Z caller=node_exporter.go:117 level=info collector=btrfs
Sep 30 21:10:54 compute-0 node_exporter[205425]: ts=2025-09-30T21:10:54.268Z caller=node_exporter.go:117 level=info collector=conntrack
Sep 30 21:10:54 compute-0 node_exporter[205425]: ts=2025-09-30T21:10:54.268Z caller=node_exporter.go:117 level=info collector=cpu
Sep 30 21:10:54 compute-0 node_exporter[205425]: ts=2025-09-30T21:10:54.268Z caller=node_exporter.go:117 level=info collector=cpufreq
Sep 30 21:10:54 compute-0 node_exporter[205425]: ts=2025-09-30T21:10:54.268Z caller=node_exporter.go:117 level=info collector=diskstats
Sep 30 21:10:54 compute-0 node_exporter[205425]: ts=2025-09-30T21:10:54.268Z caller=node_exporter.go:117 level=info collector=edac
Sep 30 21:10:54 compute-0 node_exporter[205425]: ts=2025-09-30T21:10:54.268Z caller=node_exporter.go:117 level=info collector=fibrechannel
Sep 30 21:10:54 compute-0 node_exporter[205425]: ts=2025-09-30T21:10:54.268Z caller=node_exporter.go:117 level=info collector=filefd
Sep 30 21:10:54 compute-0 node_exporter[205425]: ts=2025-09-30T21:10:54.268Z caller=node_exporter.go:117 level=info collector=filesystem
Sep 30 21:10:54 compute-0 node_exporter[205425]: ts=2025-09-30T21:10:54.268Z caller=node_exporter.go:117 level=info collector=infiniband
Sep 30 21:10:54 compute-0 node_exporter[205425]: ts=2025-09-30T21:10:54.268Z caller=node_exporter.go:117 level=info collector=ipvs
Sep 30 21:10:54 compute-0 node_exporter[205425]: ts=2025-09-30T21:10:54.268Z caller=node_exporter.go:117 level=info collector=loadavg
Sep 30 21:10:54 compute-0 node_exporter[205425]: ts=2025-09-30T21:10:54.268Z caller=node_exporter.go:117 level=info collector=mdadm
Sep 30 21:10:54 compute-0 node_exporter[205425]: ts=2025-09-30T21:10:54.268Z caller=node_exporter.go:117 level=info collector=meminfo
Sep 30 21:10:54 compute-0 node_exporter[205425]: ts=2025-09-30T21:10:54.268Z caller=node_exporter.go:117 level=info collector=netclass
Sep 30 21:10:54 compute-0 node_exporter[205425]: ts=2025-09-30T21:10:54.268Z caller=node_exporter.go:117 level=info collector=netdev
Sep 30 21:10:54 compute-0 node_exporter[205425]: ts=2025-09-30T21:10:54.268Z caller=node_exporter.go:117 level=info collector=netstat
Sep 30 21:10:54 compute-0 node_exporter[205425]: ts=2025-09-30T21:10:54.268Z caller=node_exporter.go:117 level=info collector=nfs
Sep 30 21:10:54 compute-0 node_exporter[205425]: ts=2025-09-30T21:10:54.268Z caller=node_exporter.go:117 level=info collector=nfsd
Sep 30 21:10:54 compute-0 node_exporter[205425]: ts=2025-09-30T21:10:54.268Z caller=node_exporter.go:117 level=info collector=nvme
Sep 30 21:10:54 compute-0 node_exporter[205425]: ts=2025-09-30T21:10:54.268Z caller=node_exporter.go:117 level=info collector=schedstat
Sep 30 21:10:54 compute-0 node_exporter[205425]: ts=2025-09-30T21:10:54.268Z caller=node_exporter.go:117 level=info collector=sockstat
Sep 30 21:10:54 compute-0 node_exporter[205425]: ts=2025-09-30T21:10:54.268Z caller=node_exporter.go:117 level=info collector=softnet
Sep 30 21:10:54 compute-0 node_exporter[205425]: ts=2025-09-30T21:10:54.268Z caller=node_exporter.go:117 level=info collector=systemd
Sep 30 21:10:54 compute-0 node_exporter[205425]: ts=2025-09-30T21:10:54.268Z caller=node_exporter.go:117 level=info collector=tapestats
Sep 30 21:10:54 compute-0 node_exporter[205425]: ts=2025-09-30T21:10:54.268Z caller=node_exporter.go:117 level=info collector=udp_queues
Sep 30 21:10:54 compute-0 node_exporter[205425]: ts=2025-09-30T21:10:54.268Z caller=node_exporter.go:117 level=info collector=vmstat
Sep 30 21:10:54 compute-0 node_exporter[205425]: ts=2025-09-30T21:10:54.269Z caller=node_exporter.go:117 level=info collector=xfs
Sep 30 21:10:54 compute-0 node_exporter[205425]: ts=2025-09-30T21:10:54.269Z caller=node_exporter.go:117 level=info collector=zfs
Sep 30 21:10:54 compute-0 node_exporter[205425]: ts=2025-09-30T21:10:54.269Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Sep 30 21:10:54 compute-0 node_exporter[205425]: ts=2025-09-30T21:10:54.270Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Sep 30 21:10:54 compute-0 podman[205410]: 2025-09-30 21:10:54.278903348 +0000 UTC m=+0.155672065 container start c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Sep 30 21:10:54 compute-0 podman[205410]: node_exporter
Sep 30 21:10:54 compute-0 systemd[1]: Started node_exporter container.
Sep 30 21:10:54 compute-0 sudo[205362]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:54 compute-0 podman[205434]: 2025-09-30 21:10:54.36863851 +0000 UTC m=+0.071469301 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:10:55 compute-0 sudo[205608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffhrzphqrnkfmopyfjuqlpqtoxthppwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266654.8363214-1954-256020391851301/AnsiballZ_stat.py'
Sep 30 21:10:55 compute-0 sudo[205608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:55 compute-0 python3.9[205610]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:10:55 compute-0 sudo[205608]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:55 compute-0 sudo[205731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qywulixuqhfpsfwmksdqisopbkcebrxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266654.8363214-1954-256020391851301/AnsiballZ_copy.py'
Sep 30 21:10:55 compute-0 sudo[205731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:56 compute-0 python3.9[205733]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759266654.8363214-1954-256020391851301/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:10:56 compute-0 sudo[205731]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:56 compute-0 sudo[205883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gykjpytsgxrakncqrlzkeaphvmahqfgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266656.5406704-2005-181007425017801/AnsiballZ_container_config_data.py'
Sep 30 21:10:56 compute-0 sudo[205883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:57 compute-0 python3.9[205885]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False
Sep 30 21:10:57 compute-0 sudo[205883]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:57 compute-0 sudo[206035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jyflqwdpjzcvbawgcxarkaitsrveeslq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266657.4602993-2032-595082253535/AnsiballZ_container_config_hash.py'
Sep 30 21:10:57 compute-0 sudo[206035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:58 compute-0 python3.9[206037]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Sep 30 21:10:58 compute-0 sudo[206035]: pam_unix(sudo:session): session closed for user root
Sep 30 21:10:58 compute-0 sudo[206187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onlbnrpqdryemoouduhpihzytrbiaiuo ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759266658.3823907-2062-252468057379527/AnsiballZ_edpm_container_manage.py'
Sep 30 21:10:58 compute-0 sudo[206187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:10:58 compute-0 sshd[128205]: drop connection #0 from [113.240.110.90]:60788 on [38.102.83.69]:22 penalty: exceeded LoginGraceTime
Sep 30 21:10:59 compute-0 python3[206189]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Sep 30 21:11:00 compute-0 podman[206203]: 2025-09-30 21:11:00.482290957 +0000 UTC m=+1.396706107 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Sep 30 21:11:00 compute-0 podman[206298]: 2025-09-30 21:11:00.674603587 +0000 UTC m=+0.071504971 container create caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 21:11:00 compute-0 podman[206298]: 2025-09-30 21:11:00.640322508 +0000 UTC m=+0.037223952 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Sep 30 21:11:00 compute-0 python3[206189]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Sep 30 21:11:00 compute-0 sudo[206187]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:01 compute-0 sudo[206486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywrddkopnbuuhpxwimsqidhopcmeccoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266661.5067532-2086-79460962423999/AnsiballZ_stat.py'
Sep 30 21:11:01 compute-0 sudo[206486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:01 compute-0 podman[206488]: 2025-09-30 21:11:01.973021341 +0000 UTC m=+0.061005472 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Sep 30 21:11:02 compute-0 python3.9[206489]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:11:02 compute-0 sudo[206486]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:02 compute-0 sudo[206661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umtjkbyxuysdgnphsyzzxhvundhamvvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266662.5610387-2113-113670820210144/AnsiballZ_file.py'
Sep 30 21:11:02 compute-0 sudo[206661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:03 compute-0 python3.9[206663]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:11:03 compute-0 sudo[206661]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:03 compute-0 sudo[206812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mpwdhrciibyfrqnkrjxhooieitwiqikm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266663.2550077-2113-107221729706078/AnsiballZ_copy.py'
Sep 30 21:11:03 compute-0 sudo[206812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:04 compute-0 python3.9[206814]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759266663.2550077-2113-107221729706078/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:11:04 compute-0 sudo[206812]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:04 compute-0 sudo[206888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlxhqujgldfvmqznwuzuwkrlatrzrgod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266663.2550077-2113-107221729706078/AnsiballZ_systemd.py'
Sep 30 21:11:04 compute-0 sudo[206888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:04 compute-0 python3.9[206890]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 21:11:04 compute-0 systemd[1]: Reloading.
Sep 30 21:11:04 compute-0 systemd-sysv-generator[206917]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:11:04 compute-0 systemd-rc-local-generator[206912]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:11:05 compute-0 sudo[206888]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:05 compute-0 sudo[206999]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yevofptpdwmvsrswkxspuuvwbntbcvsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266663.2550077-2113-107221729706078/AnsiballZ_systemd.py'
Sep 30 21:11:05 compute-0 sudo[206999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:05 compute-0 python3.9[207001]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:11:05 compute-0 systemd[1]: Reloading.
Sep 30 21:11:05 compute-0 systemd-rc-local-generator[207026]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:11:05 compute-0 systemd-sysv-generator[207035]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:11:06 compute-0 systemd[1]: Starting podman_exporter container...
Sep 30 21:11:06 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:11:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a5e49084033a0bfd110a05f7b81476041d33cd4afb5775b3abced56305822ab/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Sep 30 21:11:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a5e49084033a0bfd110a05f7b81476041d33cd4afb5775b3abced56305822ab/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Sep 30 21:11:06 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4.
Sep 30 21:11:06 compute-0 podman[207042]: 2025-09-30 21:11:06.33921183 +0000 UTC m=+0.162252678 container init caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 21:11:06 compute-0 podman_exporter[207057]: ts=2025-09-30T21:11:06.368Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Sep 30 21:11:06 compute-0 podman_exporter[207057]: ts=2025-09-30T21:11:06.368Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Sep 30 21:11:06 compute-0 podman_exporter[207057]: ts=2025-09-30T21:11:06.369Z caller=handler.go:94 level=info msg="enabled collectors"
Sep 30 21:11:06 compute-0 podman_exporter[207057]: ts=2025-09-30T21:11:06.369Z caller=handler.go:105 level=info collector=container
Sep 30 21:11:06 compute-0 podman[207042]: 2025-09-30 21:11:06.37719973 +0000 UTC m=+0.200240578 container start caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 21:11:06 compute-0 podman[207042]: podman_exporter
Sep 30 21:11:06 compute-0 systemd[1]: Starting Podman API Service...
Sep 30 21:11:06 compute-0 systemd[1]: Started podman_exporter container.
Sep 30 21:11:06 compute-0 systemd[1]: Started Podman API Service.
Sep 30 21:11:06 compute-0 podman[207069]: time="2025-09-30T21:11:06Z" level=info msg="/usr/bin/podman filtering at log level info"
Sep 30 21:11:06 compute-0 podman[207069]: time="2025-09-30T21:11:06Z" level=info msg="Setting parallel job count to 25"
Sep 30 21:11:06 compute-0 podman[207069]: time="2025-09-30T21:11:06Z" level=info msg="Using sqlite as database backend"
Sep 30 21:11:06 compute-0 podman[207069]: time="2025-09-30T21:11:06Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Sep 30 21:11:06 compute-0 podman[207069]: time="2025-09-30T21:11:06Z" level=info msg="Using systemd socket activation to determine API endpoint"
Sep 30 21:11:06 compute-0 podman[207069]: time="2025-09-30T21:11:06Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Sep 30 21:11:06 compute-0 sudo[206999]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:06 compute-0 podman[207069]: @ - - [30/Sep/2025:21:11:06 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Sep 30 21:11:06 compute-0 podman[207069]: time="2025-09-30T21:11:06Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 21:11:06 compute-0 podman[207066]: 2025-09-30 21:11:06.48542267 +0000 UTC m=+0.087369864 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 21:11:06 compute-0 podman[207068]: 2025-09-30 21:11:06.485933022 +0000 UTC m=+0.073327476 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Sep 30 21:11:06 compute-0 podman[207069]: @ - - [30/Sep/2025:21:11:06 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 22060 "" "Go-http-client/1.1"
Sep 30 21:11:06 compute-0 podman_exporter[207057]: ts=2025-09-30T21:11:06.491Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Sep 30 21:11:06 compute-0 podman_exporter[207057]: ts=2025-09-30T21:11:06.492Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Sep 30 21:11:06 compute-0 podman_exporter[207057]: ts=2025-09-30T21:11:06.492Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Sep 30 21:11:06 compute-0 systemd[1]: caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4-5421988407aa1cf2.service: Main process exited, code=exited, status=1/FAILURE
Sep 30 21:11:06 compute-0 systemd[1]: caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4-5421988407aa1cf2.service: Failed with result 'exit-code'.
Sep 30 21:11:07 compute-0 sudo[207268]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhlxdtysvtrakmuitokojuvayrnughsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266666.636772-2185-5747115005315/AnsiballZ_systemd.py'
Sep 30 21:11:07 compute-0 sudo[207268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:07 compute-0 python3.9[207270]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 21:11:07 compute-0 systemd[1]: Stopping podman_exporter container...
Sep 30 21:11:07 compute-0 podman[207069]: @ - - [30/Sep/2025:21:11:06 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 3549 "" "Go-http-client/1.1"
Sep 30 21:11:07 compute-0 systemd[1]: libpod-caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4.scope: Deactivated successfully.
Sep 30 21:11:07 compute-0 podman[207274]: 2025-09-30 21:11:07.559354786 +0000 UTC m=+0.050325937 container died caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:11:07 compute-0 systemd[1]: caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4-5421988407aa1cf2.timer: Deactivated successfully.
Sep 30 21:11:07 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4.
Sep 30 21:11:07 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4-userdata-shm.mount: Deactivated successfully.
Sep 30 21:11:07 compute-0 systemd[1]: var-lib-containers-storage-overlay-3a5e49084033a0bfd110a05f7b81476041d33cd4afb5775b3abced56305822ab-merged.mount: Deactivated successfully.
Sep 30 21:11:07 compute-0 podman[207274]: 2025-09-30 21:11:07.792301812 +0000 UTC m=+0.283272993 container cleanup caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:11:07 compute-0 podman[207274]: podman_exporter
Sep 30 21:11:07 compute-0 systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Sep 30 21:11:07 compute-0 podman[207304]: podman_exporter
Sep 30 21:11:07 compute-0 systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'.
Sep 30 21:11:07 compute-0 systemd[1]: Stopped podman_exporter container.
Sep 30 21:11:07 compute-0 systemd[1]: Starting podman_exporter container...
Sep 30 21:11:07 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:11:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a5e49084033a0bfd110a05f7b81476041d33cd4afb5775b3abced56305822ab/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Sep 30 21:11:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a5e49084033a0bfd110a05f7b81476041d33cd4afb5775b3abced56305822ab/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Sep 30 21:11:08 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4.
Sep 30 21:11:08 compute-0 podman[207317]: 2025-09-30 21:11:08.029092934 +0000 UTC m=+0.123377415 container init caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 21:11:08 compute-0 podman_exporter[207332]: ts=2025-09-30T21:11:08.041Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Sep 30 21:11:08 compute-0 podman_exporter[207332]: ts=2025-09-30T21:11:08.041Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Sep 30 21:11:08 compute-0 podman_exporter[207332]: ts=2025-09-30T21:11:08.041Z caller=handler.go:94 level=info msg="enabled collectors"
Sep 30 21:11:08 compute-0 podman_exporter[207332]: ts=2025-09-30T21:11:08.041Z caller=handler.go:105 level=info collector=container
Sep 30 21:11:08 compute-0 podman[207069]: @ - - [30/Sep/2025:21:11:08 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Sep 30 21:11:08 compute-0 podman[207069]: time="2025-09-30T21:11:08Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 21:11:08 compute-0 podman[207317]: 2025-09-30 21:11:08.060733617 +0000 UTC m=+0.155018078 container start caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 21:11:08 compute-0 podman[207317]: podman_exporter
Sep 30 21:11:08 compute-0 podman[207069]: @ - - [30/Sep/2025:21:11:08 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 22062 "" "Go-http-client/1.1"
Sep 30 21:11:08 compute-0 podman_exporter[207332]: ts=2025-09-30T21:11:08.069Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Sep 30 21:11:08 compute-0 podman_exporter[207332]: ts=2025-09-30T21:11:08.070Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Sep 30 21:11:08 compute-0 podman_exporter[207332]: ts=2025-09-30T21:11:08.070Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Sep 30 21:11:08 compute-0 systemd[1]: Started podman_exporter container.
Sep 30 21:11:08 compute-0 sudo[207268]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:08 compute-0 podman[207342]: 2025-09-30 21:11:08.116032306 +0000 UTC m=+0.044624716 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:11:08 compute-0 sudo[207515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yixrmljzyprciacccgdwhmrlzaldcsiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266668.3483653-2209-84580770464184/AnsiballZ_stat.py'
Sep 30 21:11:08 compute-0 sudo[207515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:08 compute-0 python3.9[207517]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:11:08 compute-0 sudo[207515]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:09 compute-0 sudo[207638]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvleuojeiqpzcsmrrtnresylygpjrzar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266668.3483653-2209-84580770464184/AnsiballZ_copy.py'
Sep 30 21:11:09 compute-0 sudo[207638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:09 compute-0 python3.9[207640]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759266668.3483653-2209-84580770464184/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 21:11:09 compute-0 sudo[207638]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:10 compute-0 sudo[207790]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofjvxhdrbfexevwobtldhvvqeiksidfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266670.286815-2260-19232171304956/AnsiballZ_container_config_data.py'
Sep 30 21:11:10 compute-0 sudo[207790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:10 compute-0 python3.9[207792]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False
Sep 30 21:11:10 compute-0 sudo[207790]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:11 compute-0 podman[207830]: 2025-09-30 21:11:11.385189157 +0000 UTC m=+0.115382167 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Sep 30 21:11:11 compute-0 sudo[207968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xeglsitcfxdpevoimjescokfdprojzwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266671.278272-2287-13030945798243/AnsiballZ_container_config_hash.py'
Sep 30 21:11:11 compute-0 sudo[207968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:11 compute-0 python3.9[207970]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Sep 30 21:11:11 compute-0 sudo[207968]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:12 compute-0 sudo[208120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lotkazdtncmhxtesoqxlutwkueaovfbi ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759266672.2406306-2317-276278335133258/AnsiballZ_edpm_container_manage.py'
Sep 30 21:11:12 compute-0 sudo[208120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:12 compute-0 python3[208122]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Sep 30 21:11:13 compute-0 podman[208149]: 2025-09-30 21:11:13.319246287 +0000 UTC m=+0.057138246 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=2, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Sep 30 21:11:13 compute-0 systemd[1]: f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110-1258ad40e00476e1.service: Main process exited, code=exited, status=1/FAILURE
Sep 30 21:11:13 compute-0 systemd[1]: f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110-1258ad40e00476e1.service: Failed with result 'exit-code'.
Sep 30 21:11:15 compute-0 podman[208135]: 2025-09-30 21:11:15.266909132 +0000 UTC m=+2.293722344 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Sep 30 21:11:15 compute-0 podman[208249]: 2025-09-30 21:11:15.415544152 +0000 UTC m=+0.055329811 container create c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, maintainer=Red Hat, Inc., io.openshift.expose-services=, vendor=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers)
Sep 30 21:11:15 compute-0 podman[208249]: 2025-09-30 21:11:15.387420656 +0000 UTC m=+0.027206315 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Sep 30 21:11:15 compute-0 python3[208122]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Sep 30 21:11:15 compute-0 sudo[208120]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:16 compute-0 podman[208358]: 2025-09-30 21:11:16.334232945 +0000 UTC m=+0.059489113 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:11:16 compute-0 sudo[208454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tczdmymrmgbdakeedgzupksycwbxwnqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266676.1785665-2341-227342864307296/AnsiballZ_stat.py'
Sep 30 21:11:16 compute-0 sudo[208454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:16 compute-0 python3.9[208456]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:11:16 compute-0 sudo[208454]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:17 compute-0 sudo[208608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tuwojupbhtkazimygkgwhhegdkdmjddg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266677.0493157-2368-115929630419438/AnsiballZ_file.py'
Sep 30 21:11:17 compute-0 sudo[208608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:17 compute-0 python3.9[208610]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:11:17 compute-0 sudo[208608]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:18 compute-0 sudo[208759]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uoqfeprmfbpyiznwuslwksavqwffopkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266677.7278085-2368-160924515334886/AnsiballZ_copy.py'
Sep 30 21:11:18 compute-0 sudo[208759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:18 compute-0 python3.9[208761]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759266677.7278085-2368-160924515334886/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:11:18 compute-0 sudo[208759]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:18 compute-0 sudo[208835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elgrvsoolnqevtydditpbqvsgismrwkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266677.7278085-2368-160924515334886/AnsiballZ_systemd.py'
Sep 30 21:11:18 compute-0 sudo[208835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:19 compute-0 python3.9[208837]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 21:11:19 compute-0 systemd[1]: Reloading.
Sep 30 21:11:19 compute-0 systemd-rc-local-generator[208864]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:11:19 compute-0 systemd-sysv-generator[208868]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:11:19 compute-0 sudo[208835]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:19 compute-0 sudo[208946]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnquxavrwryobhogtwffonrnbnofbryi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266677.7278085-2368-160924515334886/AnsiballZ_systemd.py'
Sep 30 21:11:19 compute-0 sudo[208946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:20 compute-0 python3.9[208948]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 21:11:20 compute-0 systemd[1]: Reloading.
Sep 30 21:11:20 compute-0 systemd-sysv-generator[208980]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 21:11:20 compute-0 systemd-rc-local-generator[208975]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 21:11:20 compute-0 systemd[1]: Starting openstack_network_exporter container...
Sep 30 21:11:20 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:11:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9ece910286067149e6fab01a3e27a77c9d848e17828478b9417447e6928a8e43/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Sep 30 21:11:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9ece910286067149e6fab01a3e27a77c9d848e17828478b9417447e6928a8e43/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Sep 30 21:11:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9ece910286067149e6fab01a3e27a77c9d848e17828478b9417447e6928a8e43/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Sep 30 21:11:20 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c.
Sep 30 21:11:20 compute-0 podman[208988]: 2025-09-30 21:11:20.79559957 +0000 UTC m=+0.151450230 container init c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, architecture=x86_64, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, release=1755695350)
Sep 30 21:11:20 compute-0 openstack_network_exporter[209004]: INFO    21:11:20 main.go:48: registering *bridge.Collector
Sep 30 21:11:20 compute-0 openstack_network_exporter[209004]: INFO    21:11:20 main.go:48: registering *coverage.Collector
Sep 30 21:11:20 compute-0 openstack_network_exporter[209004]: INFO    21:11:20 main.go:48: registering *datapath.Collector
Sep 30 21:11:20 compute-0 openstack_network_exporter[209004]: INFO    21:11:20 main.go:48: registering *iface.Collector
Sep 30 21:11:20 compute-0 openstack_network_exporter[209004]: INFO    21:11:20 main.go:48: registering *memory.Collector
Sep 30 21:11:20 compute-0 openstack_network_exporter[209004]: INFO    21:11:20 main.go:48: registering *ovnnorthd.Collector
Sep 30 21:11:20 compute-0 openstack_network_exporter[209004]: INFO    21:11:20 main.go:48: registering *ovn.Collector
Sep 30 21:11:20 compute-0 openstack_network_exporter[209004]: INFO    21:11:20 main.go:48: registering *ovsdbserver.Collector
Sep 30 21:11:20 compute-0 openstack_network_exporter[209004]: INFO    21:11:20 main.go:48: registering *pmd_perf.Collector
Sep 30 21:11:20 compute-0 openstack_network_exporter[209004]: INFO    21:11:20 main.go:48: registering *pmd_rxq.Collector
Sep 30 21:11:20 compute-0 openstack_network_exporter[209004]: INFO    21:11:20 main.go:48: registering *vswitch.Collector
Sep 30 21:11:20 compute-0 openstack_network_exporter[209004]: NOTICE  21:11:20 main.go:76: listening on https://:9105/metrics
Sep 30 21:11:20 compute-0 podman[208988]: 2025-09-30 21:11:20.83111277 +0000 UTC m=+0.186963500 container start c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, vendor=Red Hat, Inc., vcs-type=git, config_id=edpm, io.buildah.version=1.33.7, release=1755695350, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, name=ubi9-minimal)
Sep 30 21:11:20 compute-0 podman[208988]: openstack_network_exporter
Sep 30 21:11:20 compute-0 systemd[1]: Started openstack_network_exporter container.
Sep 30 21:11:20 compute-0 sudo[208946]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:20 compute-0 podman[209014]: 2025-09-30 21:11:20.955262783 +0000 UTC m=+0.106946369 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_id=edpm, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7)
Sep 30 21:11:21 compute-0 sudo[209186]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjpxsftgpxzddwqtxhdwxmnxofzzfccs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266681.134254-2440-246527595631668/AnsiballZ_systemd.py'
Sep 30 21:11:21 compute-0 sudo[209186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:21 compute-0 python3.9[209188]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 21:11:21 compute-0 systemd[1]: Stopping openstack_network_exporter container...
Sep 30 21:11:21 compute-0 systemd[1]: libpod-c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c.scope: Deactivated successfully.
Sep 30 21:11:22 compute-0 podman[209192]: 2025-09-30 21:11:22.000968611 +0000 UTC m=+0.068677631 container died c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_id=edpm, container_name=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, version=9.6, architecture=x86_64, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.openshift.tags=minimal rhel9, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 21:11:22 compute-0 systemd[1]: c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c-780dd58027fa5b09.timer: Deactivated successfully.
Sep 30 21:11:22 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c.
Sep 30 21:11:22 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c-userdata-shm.mount: Deactivated successfully.
Sep 30 21:11:22 compute-0 systemd[1]: var-lib-containers-storage-overlay-9ece910286067149e6fab01a3e27a77c9d848e17828478b9417447e6928a8e43-merged.mount: Deactivated successfully.
Sep 30 21:11:22 compute-0 podman[209192]: 2025-09-30 21:11:22.644399109 +0000 UTC m=+0.712108129 container cleanup c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vcs-type=git, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.openshift.tags=minimal rhel9)
Sep 30 21:11:22 compute-0 podman[209192]: openstack_network_exporter
Sep 30 21:11:22 compute-0 systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Sep 30 21:11:22 compute-0 podman[209221]: openstack_network_exporter
Sep 30 21:11:22 compute-0 systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'.
Sep 30 21:11:22 compute-0 systemd[1]: Stopped openstack_network_exporter container.
Sep 30 21:11:22 compute-0 systemd[1]: Starting openstack_network_exporter container...
Sep 30 21:11:22 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:11:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9ece910286067149e6fab01a3e27a77c9d848e17828478b9417447e6928a8e43/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Sep 30 21:11:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9ece910286067149e6fab01a3e27a77c9d848e17828478b9417447e6928a8e43/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Sep 30 21:11:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9ece910286067149e6fab01a3e27a77c9d848e17828478b9417447e6928a8e43/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Sep 30 21:11:22 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c.
Sep 30 21:11:22 compute-0 podman[209234]: 2025-09-30 21:11:22.910903196 +0000 UTC m=+0.159490039 container init c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_id=edpm, managed_by=edpm_ansible)
Sep 30 21:11:22 compute-0 openstack_network_exporter[209250]: INFO    21:11:22 main.go:48: registering *bridge.Collector
Sep 30 21:11:22 compute-0 openstack_network_exporter[209250]: INFO    21:11:22 main.go:48: registering *coverage.Collector
Sep 30 21:11:22 compute-0 openstack_network_exporter[209250]: INFO    21:11:22 main.go:48: registering *datapath.Collector
Sep 30 21:11:22 compute-0 openstack_network_exporter[209250]: INFO    21:11:22 main.go:48: registering *iface.Collector
Sep 30 21:11:22 compute-0 openstack_network_exporter[209250]: INFO    21:11:22 main.go:48: registering *memory.Collector
Sep 30 21:11:22 compute-0 openstack_network_exporter[209250]: INFO    21:11:22 main.go:48: registering *ovnnorthd.Collector
Sep 30 21:11:22 compute-0 openstack_network_exporter[209250]: INFO    21:11:22 main.go:48: registering *ovn.Collector
Sep 30 21:11:22 compute-0 openstack_network_exporter[209250]: INFO    21:11:22 main.go:48: registering *ovsdbserver.Collector
Sep 30 21:11:22 compute-0 openstack_network_exporter[209250]: INFO    21:11:22 main.go:48: registering *pmd_perf.Collector
Sep 30 21:11:22 compute-0 openstack_network_exporter[209250]: INFO    21:11:22 main.go:48: registering *pmd_rxq.Collector
Sep 30 21:11:22 compute-0 openstack_network_exporter[209250]: INFO    21:11:22 main.go:48: registering *vswitch.Collector
Sep 30 21:11:22 compute-0 openstack_network_exporter[209250]: NOTICE  21:11:22 main.go:76: listening on https://:9105/metrics
Sep 30 21:11:22 compute-0 podman[209234]: 2025-09-30 21:11:22.947385949 +0000 UTC m=+0.195972782 container start c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., version=9.6, architecture=x86_64, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41)
Sep 30 21:11:22 compute-0 podman[209234]: openstack_network_exporter
Sep 30 21:11:22 compute-0 systemd[1]: Started openstack_network_exporter container.
Sep 30 21:11:22 compute-0 sudo[209186]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:23 compute-0 podman[209260]: 2025-09-30 21:11:23.054697836 +0000 UTC m=+0.089084986 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., version=9.6, architecture=x86_64, name=ubi9-minimal, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.openshift.expose-services=)
Sep 30 21:11:23 compute-0 sudo[209432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dokddgeohofhibpdfuplfkgwbwdkwdbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266683.307965-2464-249412941436330/AnsiballZ_find.py'
Sep 30 21:11:23 compute-0 sudo[209432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:23 compute-0 python3.9[209434]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Sep 30 21:11:23 compute-0 sudo[209432]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:25 compute-0 podman[209459]: 2025-09-30 21:11:25.344657447 +0000 UTC m=+0.078479654 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:11:28 compute-0 nova_compute[192810]: 2025-09-30 21:11:28.170 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:11:28 compute-0 nova_compute[192810]: 2025-09-30 21:11:28.171 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:11:28 compute-0 nova_compute[192810]: 2025-09-30 21:11:28.198 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:11:28 compute-0 nova_compute[192810]: 2025-09-30 21:11:28.199 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:11:28 compute-0 nova_compute[192810]: 2025-09-30 21:11:28.199 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:11:28 compute-0 nova_compute[192810]: 2025-09-30 21:11:28.200 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:11:28 compute-0 nova_compute[192810]: 2025-09-30 21:11:28.234 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:11:28 compute-0 nova_compute[192810]: 2025-09-30 21:11:28.234 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:11:28 compute-0 nova_compute[192810]: 2025-09-30 21:11:28.234 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:11:28 compute-0 nova_compute[192810]: 2025-09-30 21:11:28.235 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:11:28 compute-0 nova_compute[192810]: 2025-09-30 21:11:28.430 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:11:28 compute-0 nova_compute[192810]: 2025-09-30 21:11:28.431 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5931MB free_disk=73.50337600708008GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:11:28 compute-0 nova_compute[192810]: 2025-09-30 21:11:28.431 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:11:28 compute-0 nova_compute[192810]: 2025-09-30 21:11:28.432 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:11:28 compute-0 nova_compute[192810]: 2025-09-30 21:11:28.509 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:11:28 compute-0 nova_compute[192810]: 2025-09-30 21:11:28.509 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:11:28 compute-0 nova_compute[192810]: 2025-09-30 21:11:28.633 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:11:28 compute-0 nova_compute[192810]: 2025-09-30 21:11:28.650 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:11:28 compute-0 nova_compute[192810]: 2025-09-30 21:11:28.653 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:11:28 compute-0 nova_compute[192810]: 2025-09-30 21:11:28.653 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.221s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:11:29 compute-0 nova_compute[192810]: 2025-09-30 21:11:29.243 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:11:29 compute-0 nova_compute[192810]: 2025-09-30 21:11:29.243 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:11:29 compute-0 nova_compute[192810]: 2025-09-30 21:11:29.244 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:11:29 compute-0 nova_compute[192810]: 2025-09-30 21:11:29.258 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 21:11:29 compute-0 nova_compute[192810]: 2025-09-30 21:11:29.258 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:11:29 compute-0 nova_compute[192810]: 2025-09-30 21:11:29.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:11:29 compute-0 nova_compute[192810]: 2025-09-30 21:11:29.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:11:29 compute-0 nova_compute[192810]: 2025-09-30 21:11:29.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:11:32 compute-0 podman[209484]: 2025-09-30 21:11:32.355272842 +0000 UTC m=+0.078443093 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:11:37 compute-0 podman[209504]: 2025-09-30 21:11:37.319401393 +0000 UTC m=+0.054940341 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd)
Sep 30 21:11:37 compute-0 sshd-session[209524]: Invalid user alex from 45.81.23.80 port 50410
Sep 30 21:11:37 compute-0 sshd-session[209524]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:11:37 compute-0 sshd-session[209524]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=45.81.23.80
Sep 30 21:11:38 compute-0 podman[209526]: 2025-09-30 21:11:38.381287602 +0000 UTC m=+0.109950453 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 21:11:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:11:38.713 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:11:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:11:38.714 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:11:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:11:38.714 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:11:39 compute-0 sshd-session[209524]: Failed password for invalid user alex from 45.81.23.80 port 50410 ssh2
Sep 30 21:11:41 compute-0 sshd-session[209524]: Received disconnect from 45.81.23.80 port 50410:11: Bye Bye [preauth]
Sep 30 21:11:41 compute-0 sshd-session[209524]: Disconnected from invalid user alex 45.81.23.80 port 50410 [preauth]
Sep 30 21:11:42 compute-0 podman[209550]: 2025-09-30 21:11:42.394198936 +0000 UTC m=+0.119515920 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3)
Sep 30 21:11:44 compute-0 podman[209576]: 2025-09-30 21:11:44.348081435 +0000 UTC m=+0.075472439 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, health_failing_streak=3, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3)
Sep 30 21:11:44 compute-0 systemd[1]: f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110-1258ad40e00476e1.service: Main process exited, code=exited, status=1/FAILURE
Sep 30 21:11:44 compute-0 systemd[1]: f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110-1258ad40e00476e1.service: Failed with result 'exit-code'.
Sep 30 21:11:47 compute-0 podman[209595]: 2025-09-30 21:11:47.321513364 +0000 UTC m=+0.062584627 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_metadata_agent, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_managed=true)
Sep 30 21:11:52 compute-0 sudo[209740]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmvcujhrdkykrpedosjfwzzoolzjxzlb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266712.367364-2756-186003996976957/AnsiballZ_podman_container_info.py'
Sep 30 21:11:52 compute-0 sudo[209740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:52 compute-0 python3.9[209742]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Sep 30 21:11:52 compute-0 sudo[209740]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:53 compute-0 podman[209828]: 2025-09-30 21:11:53.324362176 +0000 UTC m=+0.060525216 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, name=ubi9-minimal, distribution-scope=public, build-date=2025-08-20T13:12:41, architecture=x86_64, config_id=edpm, maintainer=Red Hat, Inc., io.openshift.expose-services=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350)
Sep 30 21:11:53 compute-0 sudo[209926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwolzgjdmswvrflgscvqujcgqjcpnedc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266713.1850324-2764-5591282544221/AnsiballZ_podman_container_exec.py'
Sep 30 21:11:53 compute-0 sudo[209926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:53 compute-0 python3.9[209928]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 21:11:53 compute-0 systemd[1]: Started libpod-conmon-aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660.scope.
Sep 30 21:11:53 compute-0 podman[209929]: 2025-09-30 21:11:53.816462614 +0000 UTC m=+0.084683047 container exec aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923)
Sep 30 21:11:53 compute-0 podman[209929]: 2025-09-30 21:11:53.855961906 +0000 UTC m=+0.124182289 container exec_died aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:11:53 compute-0 systemd[1]: libpod-conmon-aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660.scope: Deactivated successfully.
Sep 30 21:11:53 compute-0 sudo[209926]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:54 compute-0 sudo[210111]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pertmlxxqewyezzaeypsuvpxtfwsnfrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266714.0812001-2772-199279508567657/AnsiballZ_podman_container_exec.py'
Sep 30 21:11:54 compute-0 sudo[210111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:54 compute-0 python3.9[210113]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 21:11:54 compute-0 systemd[1]: Started libpod-conmon-aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660.scope.
Sep 30 21:11:54 compute-0 podman[210114]: 2025-09-30 21:11:54.836374135 +0000 UTC m=+0.104230683 container exec aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 21:11:54 compute-0 podman[210114]: 2025-09-30 21:11:54.871934079 +0000 UTC m=+0.139790607 container exec_died aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, managed_by=edpm_ansible)
Sep 30 21:11:54 compute-0 systemd[1]: libpod-conmon-aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660.scope: Deactivated successfully.
Sep 30 21:11:54 compute-0 sudo[210111]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:55 compute-0 sudo[210312]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-asjcoetgmasrtemglhskrigwibrnmiyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266715.133152-2780-77917524332212/AnsiballZ_file.py'
Sep 30 21:11:55 compute-0 sudo[210312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:55 compute-0 podman[210269]: 2025-09-30 21:11:55.505807291 +0000 UTC m=+0.083274171 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:11:55 compute-0 python3.9[210322]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:11:55 compute-0 sudo[210312]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:56 compute-0 sudo[210472]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzxdlfhtblaqyeoifnpgpmxemgddxqhz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266715.9537547-2789-98607777594826/AnsiballZ_podman_container_info.py'
Sep 30 21:11:56 compute-0 sudo[210472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:56 compute-0 python3.9[210474]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Sep 30 21:11:56 compute-0 sudo[210472]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:57 compute-0 sudo[210637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tewthfqnwzhofjvftqspwziwsokaqtum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266716.8425548-2797-119192744829555/AnsiballZ_podman_container_exec.py'
Sep 30 21:11:57 compute-0 sudo[210637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:57 compute-0 python3.9[210639]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 21:11:57 compute-0 systemd[1]: Started libpod-conmon-b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84.scope.
Sep 30 21:11:57 compute-0 podman[210640]: 2025-09-30 21:11:57.517759323 +0000 UTC m=+0.075840047 container exec b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:11:57 compute-0 podman[210640]: 2025-09-30 21:11:57.553967983 +0000 UTC m=+0.112048697 container exec_died b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 21:11:57 compute-0 systemd[1]: libpod-conmon-b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84.scope: Deactivated successfully.
Sep 30 21:11:57 compute-0 sudo[210637]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:58 compute-0 sudo[210819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwvsygfsgjcxgyihkbsyuikvvuqnbnjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266717.7941122-2805-145073417565637/AnsiballZ_podman_container_exec.py'
Sep 30 21:11:58 compute-0 sudo[210819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:58 compute-0 python3.9[210821]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 21:11:58 compute-0 systemd[1]: Started libpod-conmon-b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84.scope.
Sep 30 21:11:58 compute-0 podman[210822]: 2025-09-30 21:11:58.655778521 +0000 UTC m=+0.165643030 container exec b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Sep 30 21:11:58 compute-0 podman[210822]: 2025-09-30 21:11:58.690946026 +0000 UTC m=+0.200810365 container exec_died b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923)
Sep 30 21:11:58 compute-0 sudo[210819]: pam_unix(sudo:session): session closed for user root
Sep 30 21:11:58 compute-0 systemd[1]: libpod-conmon-b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84.scope: Deactivated successfully.
Sep 30 21:11:59 compute-0 sudo[211004]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ksripenkctkzqjyebisakynlfrrvxdyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266718.9255033-2813-263225865647216/AnsiballZ_file.py'
Sep 30 21:11:59 compute-0 sudo[211004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:11:59 compute-0 python3.9[211006]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:11:59 compute-0 sudo[211004]: pam_unix(sudo:session): session closed for user root
Sep 30 21:12:00 compute-0 sudo[211156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbdgylfmqxolddppxyfvspxqgikljvib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266719.7373922-2822-201596737518914/AnsiballZ_podman_container_info.py'
Sep 30 21:12:00 compute-0 sudo[211156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:12:00 compute-0 python3.9[211158]: ansible-containers.podman.podman_container_info Invoked with name=['iscsid'] executable=podman
Sep 30 21:12:00 compute-0 sudo[211156]: pam_unix(sudo:session): session closed for user root
Sep 30 21:12:00 compute-0 sudo[211321]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfgraxbmfjxkeyogncrtsqeufcdzozvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266720.5987499-2830-278119522002019/AnsiballZ_podman_container_exec.py'
Sep 30 21:12:00 compute-0 sudo[211321]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:12:01 compute-0 python3.9[211323]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=iscsid detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 21:12:01 compute-0 systemd[1]: Started libpod-conmon-bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672.scope.
Sep 30 21:12:01 compute-0 podman[211324]: 2025-09-30 21:12:01.236769023 +0000 UTC m=+0.080429631 container exec bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Sep 30 21:12:01 compute-0 podman[211324]: 2025-09-30 21:12:01.275007624 +0000 UTC m=+0.118668212 container exec_died bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 21:12:01 compute-0 systemd[1]: libpod-conmon-bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672.scope: Deactivated successfully.
Sep 30 21:12:01 compute-0 sudo[211321]: pam_unix(sudo:session): session closed for user root
Sep 30 21:12:01 compute-0 sudo[211503]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-geqnqtqncbrznvqsgextklovzvntwesu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266721.5269413-2838-258078027490981/AnsiballZ_podman_container_exec.py'
Sep 30 21:12:01 compute-0 sudo[211503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:12:02 compute-0 python3.9[211505]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=iscsid detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 21:12:02 compute-0 systemd[1]: Started libpod-conmon-bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672.scope.
Sep 30 21:12:02 compute-0 podman[211506]: 2025-09-30 21:12:02.246114511 +0000 UTC m=+0.074400621 container exec bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:12:02 compute-0 podman[211506]: 2025-09-30 21:12:02.281022989 +0000 UTC m=+0.109309099 container exec_died bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.build-date=20250923)
Sep 30 21:12:02 compute-0 systemd[1]: libpod-conmon-bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672.scope: Deactivated successfully.
Sep 30 21:12:02 compute-0 sudo[211503]: pam_unix(sudo:session): session closed for user root
Sep 30 21:12:02 compute-0 sudo[211701]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wajknmouqtkponyldlhgqighbsakjykx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266722.536927-2846-114652884996821/AnsiballZ_file.py'
Sep 30 21:12:02 compute-0 sudo[211701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:12:02 compute-0 podman[211662]: 2025-09-30 21:12:02.9554357 +0000 UTC m=+0.082650936 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 21:12:03 compute-0 python3.9[211707]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/iscsid recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:12:03 compute-0 sudo[211701]: pam_unix(sudo:session): session closed for user root
Sep 30 21:12:03 compute-0 sudo[211860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqxkdzgntkqoedohahbwfxzphvwbnhwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266723.403175-2855-28803419873315/AnsiballZ_podman_container_info.py'
Sep 30 21:12:03 compute-0 sudo[211860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:12:03 compute-0 python3.9[211862]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Sep 30 21:12:03 compute-0 sudo[211860]: pam_unix(sudo:session): session closed for user root
Sep 30 21:12:04 compute-0 sudo[212025]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqhtsqgfpuzsvhvocjmjuacvftlxfgqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266724.243741-2863-127972286012782/AnsiballZ_podman_container_exec.py'
Sep 30 21:12:04 compute-0 sudo[212025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:12:04 compute-0 python3.9[212027]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 21:12:04 compute-0 systemd[1]: Started libpod-conmon-5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581.scope.
Sep 30 21:12:04 compute-0 podman[212028]: 2025-09-30 21:12:04.965337581 +0000 UTC m=+0.110341485 container exec 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=multipathd, org.label-schema.schema-version=1.0)
Sep 30 21:12:05 compute-0 podman[212028]: 2025-09-30 21:12:05.00395847 +0000 UTC m=+0.148962324 container exec_died 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250923)
Sep 30 21:12:05 compute-0 systemd[1]: libpod-conmon-5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581.scope: Deactivated successfully.
Sep 30 21:12:05 compute-0 sudo[212025]: pam_unix(sudo:session): session closed for user root
Sep 30 21:12:05 compute-0 sudo[212210]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgnsmgwbqmxiklxiuxeabsrxqrxvcjum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266725.276891-2871-141665298178907/AnsiballZ_podman_container_exec.py'
Sep 30 21:12:05 compute-0 sudo[212210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:12:05 compute-0 python3.9[212212]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 21:12:05 compute-0 systemd[1]: Started libpod-conmon-5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581.scope.
Sep 30 21:12:05 compute-0 podman[212213]: 2025-09-30 21:12:05.985123089 +0000 UTC m=+0.099170478 container exec 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 21:12:06 compute-0 podman[212213]: 2025-09-30 21:12:06.01816176 +0000 UTC m=+0.132209059 container exec_died 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Sep 30 21:12:06 compute-0 systemd[1]: libpod-conmon-5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581.scope: Deactivated successfully.
Sep 30 21:12:06 compute-0 sudo[212210]: pam_unix(sudo:session): session closed for user root
Sep 30 21:12:06 compute-0 sudo[212395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxtaswhcvxofdgcbxhcstcvxnjqvnxws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266726.291902-2879-262247612746757/AnsiballZ_file.py'
Sep 30 21:12:06 compute-0 sudo[212395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:12:06 compute-0 python3.9[212397]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:12:06 compute-0 sudo[212395]: pam_unix(sudo:session): session closed for user root
Sep 30 21:12:07 compute-0 unix_chkpwd[212429]: password check failed for user (root)
Sep 30 21:12:07 compute-0 sshd-session[212044]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=185.156.73.233  user=root
Sep 30 21:12:07 compute-0 sudo[212563]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfukrjmtykymzrozmhzsglsjmdekqflr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266727.1019192-2888-53668123308298/AnsiballZ_podman_container_info.py'
Sep 30 21:12:07 compute-0 sudo[212563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:12:07 compute-0 podman[212524]: 2025-09-30 21:12:07.518484209 +0000 UTC m=+0.095548387 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20250923)
Sep 30 21:12:07 compute-0 python3.9[212572]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Sep 30 21:12:07 compute-0 sudo[212563]: pam_unix(sudo:session): session closed for user root
Sep 30 21:12:08 compute-0 sudo[212736]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wyuduxltgdbdkjcmvjindmljujzooxhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266728.00386-2896-130612947270179/AnsiballZ_podman_container_exec.py'
Sep 30 21:12:08 compute-0 sudo[212736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:12:08 compute-0 python3.9[212738]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 21:12:08 compute-0 systemd[1]: Started libpod-conmon-f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110.scope.
Sep 30 21:12:08 compute-0 podman[212739]: 2025-09-30 21:12:08.665772117 +0000 UTC m=+0.108134009 container exec f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Sep 30 21:12:08 compute-0 podman[212739]: 2025-09-30 21:12:08.705170527 +0000 UTC m=+0.147532469 container exec_died f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 21:12:08 compute-0 systemd[1]: libpod-conmon-f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110.scope: Deactivated successfully.
Sep 30 21:12:08 compute-0 sudo[212736]: pam_unix(sudo:session): session closed for user root
Sep 30 21:12:08 compute-0 podman[212756]: 2025-09-30 21:12:08.75758197 +0000 UTC m=+0.093815253 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:12:09 compute-0 sshd-session[212044]: Failed password for root from 185.156.73.233 port 24948 ssh2
Sep 30 21:12:09 compute-0 sudo[212942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otfuxjugevzaeiqovdhotidljjluooha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266728.9492075-2904-139013016845260/AnsiballZ_podman_container_exec.py'
Sep 30 21:12:09 compute-0 sudo[212942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:12:09 compute-0 python3.9[212944]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 21:12:09 compute-0 systemd[1]: Started libpod-conmon-f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110.scope.
Sep 30 21:12:09 compute-0 podman[212945]: 2025-09-30 21:12:09.643994933 +0000 UTC m=+0.097909606 container exec f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Sep 30 21:12:09 compute-0 podman[212945]: 2025-09-30 21:12:09.675698951 +0000 UTC m=+0.129613624 container exec_died f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20250923, tcib_managed=true)
Sep 30 21:12:09 compute-0 systemd[1]: libpod-conmon-f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110.scope: Deactivated successfully.
Sep 30 21:12:09 compute-0 sudo[212942]: pam_unix(sudo:session): session closed for user root
Sep 30 21:12:10 compute-0 sudo[213125]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frzxzliqugxrtvqoomngvkxtxlvyriep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266729.8991003-2912-232001165716133/AnsiballZ_file.py'
Sep 30 21:12:10 compute-0 sudo[213125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:12:10 compute-0 python3.9[213127]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:12:10 compute-0 sudo[213125]: pam_unix(sudo:session): session closed for user root
Sep 30 21:12:10 compute-0 sshd-session[212044]: Connection closed by authenticating user root 185.156.73.233 port 24948 [preauth]
Sep 30 21:12:11 compute-0 sudo[213277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utheqzqibscgwuymxesqvnulfuxdzcex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266730.8041067-2921-137956642012079/AnsiballZ_podman_container_info.py'
Sep 30 21:12:11 compute-0 sudo[213277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:12:11 compute-0 python3.9[213279]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Sep 30 21:12:11 compute-0 sudo[213277]: pam_unix(sudo:session): session closed for user root
Sep 30 21:12:12 compute-0 sudo[213442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-voqilzujbzylzcphxkvpnuutzhpormer ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266731.7227685-2929-103106003242862/AnsiballZ_podman_container_exec.py'
Sep 30 21:12:12 compute-0 sudo[213442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:12:12 compute-0 python3.9[213444]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 21:12:12 compute-0 systemd[1]: Started libpod-conmon-c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4.scope.
Sep 30 21:12:12 compute-0 podman[213445]: 2025-09-30 21:12:12.409202295 +0000 UTC m=+0.097435994 container exec c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:12:12 compute-0 podman[213445]: 2025-09-30 21:12:12.443661152 +0000 UTC m=+0.131894841 container exec_died c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:12:12 compute-0 systemd[1]: libpod-conmon-c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4.scope: Deactivated successfully.
Sep 30 21:12:12 compute-0 sudo[213442]: pam_unix(sudo:session): session closed for user root
Sep 30 21:12:12 compute-0 podman[213478]: 2025-09-30 21:12:12.64667539 +0000 UTC m=+0.148225307 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Sep 30 21:12:13 compute-0 sudo[213656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxbzeravzgvoqwbtqpdwzpclyjfbeues ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266732.8070624-2937-159997949103127/AnsiballZ_podman_container_exec.py'
Sep 30 21:12:13 compute-0 sudo[213656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:12:13 compute-0 python3.9[213658]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 21:12:13 compute-0 systemd[1]: Started libpod-conmon-c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4.scope.
Sep 30 21:12:13 compute-0 podman[213659]: 2025-09-30 21:12:13.420187535 +0000 UTC m=+0.076787481 container exec c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:12:13 compute-0 podman[213659]: 2025-09-30 21:12:13.456038206 +0000 UTC m=+0.112638082 container exec_died c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:12:13 compute-0 systemd[1]: libpod-conmon-c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4.scope: Deactivated successfully.
Sep 30 21:12:13 compute-0 sudo[213656]: pam_unix(sudo:session): session closed for user root
Sep 30 21:12:14 compute-0 sudo[213838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydungbzophkweprgcdfxvrusgogjzwpj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266733.6909127-2945-164599588500932/AnsiballZ_file.py'
Sep 30 21:12:14 compute-0 sudo[213838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:12:14 compute-0 python3.9[213840]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:12:14 compute-0 sudo[213838]: pam_unix(sudo:session): session closed for user root
Sep 30 21:12:14 compute-0 sudo[214002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynqguammsujwwtkyqzuaeodillowvrsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266734.4894223-2954-280254382751479/AnsiballZ_podman_container_info.py'
Sep 30 21:12:14 compute-0 sudo[214002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:12:14 compute-0 podman[213964]: 2025-09-30 21:12:14.822610759 +0000 UTC m=+0.077125329 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:12:15 compute-0 python3.9[214010]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Sep 30 21:12:15 compute-0 sudo[214002]: pam_unix(sudo:session): session closed for user root
Sep 30 21:12:15 compute-0 sudo[214174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnxgnmrmttrhpumzwxctsiqnhebdyaas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266735.326769-2962-187012933056911/AnsiballZ_podman_container_exec.py'
Sep 30 21:12:15 compute-0 sudo[214174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:12:15 compute-0 python3.9[214176]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 21:12:15 compute-0 systemd[1]: Started libpod-conmon-caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4.scope.
Sep 30 21:12:16 compute-0 podman[214177]: 2025-09-30 21:12:16.016752783 +0000 UTC m=+0.107384311 container exec caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:12:16 compute-0 podman[214177]: 2025-09-30 21:12:16.053035365 +0000 UTC m=+0.143666803 container exec_died caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 21:12:16 compute-0 systemd[1]: libpod-conmon-caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4.scope: Deactivated successfully.
Sep 30 21:12:16 compute-0 sudo[214174]: pam_unix(sudo:session): session closed for user root
Sep 30 21:12:16 compute-0 sudo[214357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhuvteoavrnncvaudapkdbyltmgyxxip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266736.288255-2970-1931496553016/AnsiballZ_podman_container_exec.py'
Sep 30 21:12:16 compute-0 sudo[214357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:12:16 compute-0 python3.9[214359]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 21:12:16 compute-0 systemd[1]: Started libpod-conmon-caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4.scope.
Sep 30 21:12:16 compute-0 podman[214360]: 2025-09-30 21:12:16.92987977 +0000 UTC m=+0.088578844 container exec caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:12:16 compute-0 podman[214360]: 2025-09-30 21:12:16.96608589 +0000 UTC m=+0.124784944 container exec_died caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:12:16 compute-0 systemd[1]: libpod-conmon-caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4.scope: Deactivated successfully.
Sep 30 21:12:17 compute-0 sudo[214357]: pam_unix(sudo:session): session closed for user root
Sep 30 21:12:17 compute-0 sudo[214556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lawpivhivgdgwgnhceiwbpprizvnujkj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266737.1880836-2978-225947697705804/AnsiballZ_file.py'
Sep 30 21:12:17 compute-0 sudo[214556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:12:17 compute-0 podman[214516]: 2025-09-30 21:12:17.536404922 +0000 UTC m=+0.078366079 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Sep 30 21:12:17 compute-0 python3.9[214562]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:12:17 compute-0 sudo[214556]: pam_unix(sudo:session): session closed for user root
Sep 30 21:12:18 compute-0 sudo[214714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atbifynueeyfwmljowjtdbuvcksomnqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266737.9507399-2987-8412428070639/AnsiballZ_podman_container_info.py'
Sep 30 21:12:18 compute-0 sudo[214714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:12:18 compute-0 python3.9[214716]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Sep 30 21:12:18 compute-0 sudo[214714]: pam_unix(sudo:session): session closed for user root
Sep 30 21:12:19 compute-0 sudo[214879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sotvpfikoxnmelraymxpaqelvpiojiox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266738.820986-2995-60334753746178/AnsiballZ_podman_container_exec.py'
Sep 30 21:12:19 compute-0 sudo[214879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:12:19 compute-0 python3.9[214881]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 21:12:19 compute-0 systemd[1]: Started libpod-conmon-c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c.scope.
Sep 30 21:12:19 compute-0 podman[214882]: 2025-09-30 21:12:19.397127362 +0000 UTC m=+0.082943533 container exec c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, container_name=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, name=ubi9-minimal, vcs-type=git, distribution-scope=public, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vendor=Red Hat, Inc.)
Sep 30 21:12:19 compute-0 podman[214882]: 2025-09-30 21:12:19.436090861 +0000 UTC m=+0.121906952 container exec_died c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, vcs-type=git, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, version=9.6, maintainer=Red Hat, Inc., config_id=edpm, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 21:12:19 compute-0 systemd[1]: libpod-conmon-c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c.scope: Deactivated successfully.
Sep 30 21:12:19 compute-0 sudo[214879]: pam_unix(sudo:session): session closed for user root
Sep 30 21:12:20 compute-0 sudo[215064]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vaztpzftyixguezxmzxcgbcpfxerhlir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266739.6766508-3003-140313155060360/AnsiballZ_podman_container_exec.py'
Sep 30 21:12:20 compute-0 sudo[215064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:12:20 compute-0 python3.9[215066]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 21:12:20 compute-0 systemd[1]: Started libpod-conmon-c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c.scope.
Sep 30 21:12:20 compute-0 podman[215067]: 2025-09-30 21:12:20.309280964 +0000 UTC m=+0.077963629 container exec c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, release=1755695350, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter)
Sep 30 21:12:20 compute-0 podman[215067]: 2025-09-30 21:12:20.34045772 +0000 UTC m=+0.109140425 container exec_died c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, release=1755695350, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, architecture=x86_64, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., config_id=edpm, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, name=ubi9-minimal, distribution-scope=public, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.openshift.expose-services=)
Sep 30 21:12:20 compute-0 systemd[1]: libpod-conmon-c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c.scope: Deactivated successfully.
Sep 30 21:12:20 compute-0 sudo[215064]: pam_unix(sudo:session): session closed for user root
Sep 30 21:12:20 compute-0 sudo[215248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odjilsmjlnmvjnmaiebpuywlbhocjztt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266740.5866177-3011-257983085281317/AnsiballZ_file.py'
Sep 30 21:12:20 compute-0 sudo[215248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:12:21 compute-0 python3.9[215250]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:12:21 compute-0 sudo[215248]: pam_unix(sudo:session): session closed for user root
Sep 30 21:12:24 compute-0 podman[215275]: 2025-09-30 21:12:24.356319371 +0000 UTC m=+0.085421785 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.buildah.version=1.33.7, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., version=9.6)
Sep 30 21:12:26 compute-0 podman[215299]: 2025-09-30 21:12:26.379108153 +0000 UTC m=+0.110470948 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Sep 30 21:12:27 compute-0 nova_compute[192810]: 2025-09-30 21:12:27.783 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:12:28 compute-0 nova_compute[192810]: 2025-09-30 21:12:28.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:12:28 compute-0 nova_compute[192810]: 2025-09-30 21:12:28.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:12:28 compute-0 nova_compute[192810]: 2025-09-30 21:12:28.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:12:28 compute-0 nova_compute[192810]: 2025-09-30 21:12:28.838 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:12:28 compute-0 nova_compute[192810]: 2025-09-30 21:12:28.839 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:12:28 compute-0 nova_compute[192810]: 2025-09-30 21:12:28.839 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:12:28 compute-0 nova_compute[192810]: 2025-09-30 21:12:28.839 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:12:28 compute-0 nova_compute[192810]: 2025-09-30 21:12:28.985 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:12:28 compute-0 nova_compute[192810]: 2025-09-30 21:12:28.986 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5960MB free_disk=73.50325393676758GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:12:28 compute-0 nova_compute[192810]: 2025-09-30 21:12:28.987 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:12:28 compute-0 nova_compute[192810]: 2025-09-30 21:12:28.987 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:12:29 compute-0 nova_compute[192810]: 2025-09-30 21:12:29.104 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:12:29 compute-0 nova_compute[192810]: 2025-09-30 21:12:29.104 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:12:29 compute-0 nova_compute[192810]: 2025-09-30 21:12:29.131 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:12:29 compute-0 nova_compute[192810]: 2025-09-30 21:12:29.157 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:12:29 compute-0 nova_compute[192810]: 2025-09-30 21:12:29.159 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:12:29 compute-0 nova_compute[192810]: 2025-09-30 21:12:29.159 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:12:30 compute-0 nova_compute[192810]: 2025-09-30 21:12:30.159 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:12:30 compute-0 nova_compute[192810]: 2025-09-30 21:12:30.159 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:12:30 compute-0 nova_compute[192810]: 2025-09-30 21:12:30.160 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:12:30 compute-0 nova_compute[192810]: 2025-09-30 21:12:30.179 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 21:12:30 compute-0 nova_compute[192810]: 2025-09-30 21:12:30.180 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:12:30 compute-0 nova_compute[192810]: 2025-09-30 21:12:30.180 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:12:30 compute-0 nova_compute[192810]: 2025-09-30 21:12:30.786 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:12:30 compute-0 nova_compute[192810]: 2025-09-30 21:12:30.787 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:12:31 compute-0 nova_compute[192810]: 2025-09-30 21:12:31.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:12:33 compute-0 podman[215323]: 2025-09-30 21:12:33.319390565 +0000 UTC m=+0.059302855 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Sep 30 21:12:34 compute-0 sshd-session[215343]: Invalid user dell from 45.81.23.80 port 45256
Sep 30 21:12:34 compute-0 sshd-session[215343]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:12:34 compute-0 sshd-session[215343]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=45.81.23.80
Sep 30 21:12:35 compute-0 sshd-session[215343]: Failed password for invalid user dell from 45.81.23.80 port 45256 ssh2
Sep 30 21:12:36 compute-0 sshd-session[215343]: Received disconnect from 45.81.23.80 port 45256:11: Bye Bye [preauth]
Sep 30 21:12:36 compute-0 sshd-session[215343]: Disconnected from invalid user dell 45.81.23.80 port 45256 [preauth]
Sep 30 21:12:38 compute-0 podman[215345]: 2025-09-30 21:12:38.336072534 +0000 UTC m=+0.078650427 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2)
Sep 30 21:12:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:12:38.715 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:12:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:12:38.715 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:12:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:12:38.715 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:12:39 compute-0 podman[215365]: 2025-09-30 21:12:39.353667399 +0000 UTC m=+0.077730824 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 21:12:43 compute-0 podman[215391]: 2025-09-30 21:12:43.349033471 +0000 UTC m=+0.080239186 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:12:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:12:43.901 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:12:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:12:43.902 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:12:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:12:43.903 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:12:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:12:43.903 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:12:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:12:43.903 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:12:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:12:43.903 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:12:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:12:43.903 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:12:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:12:43.904 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:12:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:12:43.904 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:12:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:12:43.904 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:12:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:12:43.904 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:12:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:12:43.904 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:12:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:12:43.905 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:12:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:12:43.905 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:12:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:12:43.905 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:12:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:12:43.905 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:12:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:12:43.905 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:12:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:12:43.906 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:12:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:12:43.906 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:12:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:12:43.906 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:12:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:12:43.906 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:12:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:12:43.906 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:12:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:12:43.906 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:12:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:12:43.907 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:12:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:12:43.907 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:12:45 compute-0 podman[215418]: 2025-09-30 21:12:45.349870735 +0000 UTC m=+0.089112577 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute)
Sep 30 21:12:48 compute-0 podman[215440]: 2025-09-30 21:12:48.332928024 +0000 UTC m=+0.070118184 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:12:48 compute-0 sshd-session[215438]: Invalid user FAKESSH from 113.240.110.90 port 50860
Sep 30 21:12:48 compute-0 sshd-session[215438]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:12:48 compute-0 sshd-session[215438]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=113.240.110.90
Sep 30 21:12:51 compute-0 sshd-session[215438]: Failed password for invalid user FAKESSH from 113.240.110.90 port 50860 ssh2
Sep 30 21:12:52 compute-0 sudo[215584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfkvytzfmhbnaiwdtbjaesqbiszysjqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266772.485998-3286-271924984448740/AnsiballZ_file.py'
Sep 30 21:12:52 compute-0 sudo[215584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:12:53 compute-0 python3.9[215586]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:12:53 compute-0 sudo[215584]: pam_unix(sudo:session): session closed for user root
Sep 30 21:12:53 compute-0 sudo[215736]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvhqltznqpedccwlhqzcxolmruhqtdbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266773.2629874-3310-107190579401690/AnsiballZ_stat.py'
Sep 30 21:12:53 compute-0 sudo[215736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:12:53 compute-0 python3.9[215738]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:12:53 compute-0 sudo[215736]: pam_unix(sudo:session): session closed for user root
Sep 30 21:12:54 compute-0 sudo[215859]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymbhftjkscvyftlpqskyyeqcneqjmayz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266773.2629874-3310-107190579401690/AnsiballZ_copy.py'
Sep 30 21:12:54 compute-0 sudo[215859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:12:54 compute-0 python3.9[215861]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1759266773.2629874-3310-107190579401690/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:12:54 compute-0 sudo[215859]: pam_unix(sudo:session): session closed for user root
Sep 30 21:12:55 compute-0 sudo[216021]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evjiaqdjniwqykjgejftfyhrswrjgnmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266774.8826299-3358-158960023141948/AnsiballZ_file.py'
Sep 30 21:12:55 compute-0 sudo[216021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:12:55 compute-0 podman[215985]: 2025-09-30 21:12:55.302808564 +0000 UTC m=+0.103620799 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, architecture=x86_64, managed_by=edpm_ansible, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_id=edpm, distribution-scope=public, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., name=ubi9-minimal, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Sep 30 21:12:55 compute-0 python3.9[216023]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:12:55 compute-0 sudo[216021]: pam_unix(sudo:session): session closed for user root
Sep 30 21:12:56 compute-0 sudo[216181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chybwparzbhwuzvvppoanssrofkiaslj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266775.7140079-3382-209418771350663/AnsiballZ_stat.py'
Sep 30 21:12:56 compute-0 sudo[216181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:12:56 compute-0 python3.9[216183]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:12:56 compute-0 sudo[216181]: pam_unix(sudo:session): session closed for user root
Sep 30 21:12:56 compute-0 sudo[216269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgxkgaiykhuhkmqgdmvokgryktchmvtc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266775.7140079-3382-209418771350663/AnsiballZ_file.py'
Sep 30 21:12:56 compute-0 sudo[216269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:12:56 compute-0 podman[216233]: 2025-09-30 21:12:56.71151856 +0000 UTC m=+0.092782774 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Sep 30 21:12:56 compute-0 python3.9[216273]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:12:56 compute-0 sudo[216269]: pam_unix(sudo:session): session closed for user root
Sep 30 21:12:57 compute-0 PackageKit[130622]: daemon quit
Sep 30 21:12:57 compute-0 systemd[1]: packagekit.service: Deactivated successfully.
Sep 30 21:12:57 compute-0 sudo[216433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmltewhwatgortyzhddxylrgsoysviah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266777.1676834-3418-204125451271407/AnsiballZ_stat.py'
Sep 30 21:12:57 compute-0 sudo[216433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:12:57 compute-0 python3.9[216435]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:12:57 compute-0 sudo[216433]: pam_unix(sudo:session): session closed for user root
Sep 30 21:12:57 compute-0 sudo[216511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhwmkvbimqkvfhzazxtbaudcnvucxkwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266777.1676834-3418-204125451271407/AnsiballZ_file.py'
Sep 30 21:12:57 compute-0 sudo[216511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:12:58 compute-0 python3.9[216513]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.x2un3912 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:12:58 compute-0 sudo[216511]: pam_unix(sudo:session): session closed for user root
Sep 30 21:12:58 compute-0 sudo[216663]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffyjfrecjcmtuzlzkqsrxsgrjcaroxpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266778.402216-3454-261814705130712/AnsiballZ_stat.py'
Sep 30 21:12:58 compute-0 sudo[216663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:12:58 compute-0 python3.9[216665]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:12:59 compute-0 sudo[216663]: pam_unix(sudo:session): session closed for user root
Sep 30 21:12:59 compute-0 sudo[216741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlzwixxsqhwnoalxnbusekcsqmhcqsif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266778.402216-3454-261814705130712/AnsiballZ_file.py'
Sep 30 21:12:59 compute-0 sudo[216741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:12:59 compute-0 python3.9[216743]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:12:59 compute-0 sudo[216741]: pam_unix(sudo:session): session closed for user root
Sep 30 21:13:00 compute-0 sudo[216893]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpgkshjoimxwddiccxrmbchjwselrexo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266779.8767986-3493-43111115784543/AnsiballZ_command.py'
Sep 30 21:13:00 compute-0 sudo[216893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:13:00 compute-0 python3.9[216895]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:13:00 compute-0 sudo[216893]: pam_unix(sudo:session): session closed for user root
Sep 30 21:13:01 compute-0 sudo[217046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvyphgbkjsygtnovskjvngshaxxoetmq ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759266780.7257504-3517-111237527088548/AnsiballZ_edpm_nftables_from_files.py'
Sep 30 21:13:01 compute-0 sudo[217046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:13:01 compute-0 python3[217048]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Sep 30 21:13:01 compute-0 sudo[217046]: pam_unix(sudo:session): session closed for user root
Sep 30 21:13:02 compute-0 sudo[217198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnqomegdnpzovitmdlkahokrcdkjvxeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266781.7848008-3541-157049511482197/AnsiballZ_stat.py'
Sep 30 21:13:02 compute-0 sudo[217198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:13:02 compute-0 python3.9[217200]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:13:02 compute-0 sudo[217198]: pam_unix(sudo:session): session closed for user root
Sep 30 21:13:02 compute-0 sudo[217276]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpjcazxrybbhyjoczdlgyohrxevaxcov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266781.7848008-3541-157049511482197/AnsiballZ_file.py'
Sep 30 21:13:02 compute-0 sudo[217276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:13:02 compute-0 python3.9[217278]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:13:02 compute-0 sudo[217276]: pam_unix(sudo:session): session closed for user root
Sep 30 21:13:03 compute-0 sudo[217445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rthxsyyikmsgtuaajxnvkmbvulqrrvgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266783.1801121-3577-182527056197938/AnsiballZ_stat.py'
Sep 30 21:13:03 compute-0 sudo[217445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:13:03 compute-0 podman[217402]: 2025-09-30 21:13:03.63607177 +0000 UTC m=+0.079167048 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Sep 30 21:13:03 compute-0 python3.9[217450]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:13:03 compute-0 sudo[217445]: pam_unix(sudo:session): session closed for user root
Sep 30 21:13:04 compute-0 sudo[217526]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpqcoyyyzsaylaqxexnpxrkytkheuxtw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266783.1801121-3577-182527056197938/AnsiballZ_file.py'
Sep 30 21:13:04 compute-0 sudo[217526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:13:04 compute-0 python3.9[217528]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:13:04 compute-0 sudo[217526]: pam_unix(sudo:session): session closed for user root
Sep 30 21:13:04 compute-0 sudo[217678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abvhazqhhmhcbeiiyhhjojbykqxvjsnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266784.6047-3613-135842017121985/AnsiballZ_stat.py'
Sep 30 21:13:04 compute-0 sudo[217678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:13:05 compute-0 python3.9[217680]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:13:05 compute-0 sudo[217678]: pam_unix(sudo:session): session closed for user root
Sep 30 21:13:05 compute-0 sudo[217756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrlxlbukmnfcejismbpfnxjybzhdhvpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266784.6047-3613-135842017121985/AnsiballZ_file.py'
Sep 30 21:13:05 compute-0 sudo[217756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:13:05 compute-0 python3.9[217758]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:13:05 compute-0 sudo[217756]: pam_unix(sudo:session): session closed for user root
Sep 30 21:13:06 compute-0 sudo[217908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-repulxhxuvhpttjoyzhzbywgcohyhvkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266786.0398698-3649-86972260478095/AnsiballZ_stat.py'
Sep 30 21:13:06 compute-0 sudo[217908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:13:06 compute-0 python3.9[217910]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:13:06 compute-0 sudo[217908]: pam_unix(sudo:session): session closed for user root
Sep 30 21:13:06 compute-0 sudo[217986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdcvbwyrcgxbzeshjiynjcdbqtnfafxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266786.0398698-3649-86972260478095/AnsiballZ_file.py'
Sep 30 21:13:06 compute-0 sudo[217986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:13:07 compute-0 python3.9[217988]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:13:07 compute-0 sudo[217986]: pam_unix(sudo:session): session closed for user root
Sep 30 21:13:07 compute-0 sudo[218138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czmqgjeiurxdsukjnyocoecuxcnpjeku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266787.3741555-3685-56304152145820/AnsiballZ_stat.py'
Sep 30 21:13:07 compute-0 sudo[218138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:13:08 compute-0 python3.9[218140]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 21:13:08 compute-0 sudo[218138]: pam_unix(sudo:session): session closed for user root
Sep 30 21:13:08 compute-0 sudo[218277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcxwmkimwuyyzuudiponvblccgxlrvlt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266787.3741555-3685-56304152145820/AnsiballZ_copy.py'
Sep 30 21:13:08 compute-0 sudo[218277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:13:08 compute-0 podman[218237]: 2025-09-30 21:13:08.693275066 +0000 UTC m=+0.061991453 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Sep 30 21:13:08 compute-0 python3.9[218283]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759266787.3741555-3685-56304152145820/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:13:08 compute-0 sudo[218277]: pam_unix(sudo:session): session closed for user root
Sep 30 21:13:09 compute-0 podman[218407]: 2025-09-30 21:13:09.505316909 +0000 UTC m=+0.058805462 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 21:13:09 compute-0 sudo[218448]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-porkbsblyrglxpcuflgpehudylcrqnde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266789.146043-3730-67742023411260/AnsiballZ_file.py'
Sep 30 21:13:09 compute-0 sudo[218448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:13:09 compute-0 python3.9[218459]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:13:09 compute-0 sudo[218448]: pam_unix(sudo:session): session closed for user root
Sep 30 21:13:10 compute-0 sudo[218609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krkforsdffgufplcuaxinlpxgdfdzrja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266789.9179258-3754-256463703321822/AnsiballZ_command.py'
Sep 30 21:13:10 compute-0 sudo[218609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:13:10 compute-0 python3.9[218611]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:13:10 compute-0 sudo[218609]: pam_unix(sudo:session): session closed for user root
Sep 30 21:13:11 compute-0 sudo[218764]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hotrtvhjpydqcqfsflzsgymaplwihwnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266790.7775407-3778-85475893534838/AnsiballZ_blockinfile.py'
Sep 30 21:13:11 compute-0 sudo[218764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:13:11 compute-0 python3.9[218766]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:13:11 compute-0 sudo[218764]: pam_unix(sudo:session): session closed for user root
Sep 30 21:13:12 compute-0 sudo[218916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzoycqtmmtonnxjwbdzodfdvfyexbepu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266791.8997571-3805-40106714694313/AnsiballZ_command.py'
Sep 30 21:13:12 compute-0 sudo[218916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:13:12 compute-0 python3.9[218918]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:13:12 compute-0 sudo[218916]: pam_unix(sudo:session): session closed for user root
Sep 30 21:13:13 compute-0 sudo[219069]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-euiakptrjhjpbkgdqzuhpfbdlbyhrucw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266792.7423558-3829-81970478672676/AnsiballZ_stat.py'
Sep 30 21:13:13 compute-0 sudo[219069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:13:13 compute-0 python3.9[219071]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 21:13:13 compute-0 sudo[219069]: pam_unix(sudo:session): session closed for user root
Sep 30 21:13:14 compute-0 sudo[219231]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlocfbymcgiwgsuvrpkvokxljeymnien ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266793.678717-3853-193352856201225/AnsiballZ_command.py'
Sep 30 21:13:14 compute-0 sudo[219231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:13:14 compute-0 podman[219197]: 2025-09-30 21:13:14.091211442 +0000 UTC m=+0.107352434 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Sep 30 21:13:14 compute-0 python3.9[219242]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 21:13:14 compute-0 sudo[219231]: pam_unix(sudo:session): session closed for user root
Sep 30 21:13:14 compute-0 sudo[219404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eiitfejoelgtcngrospuqbezrsqchetg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759266794.5970757-3877-134767980972022/AnsiballZ_file.py'
Sep 30 21:13:15 compute-0 sudo[219404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:13:15 compute-0 python3.9[219406]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 21:13:15 compute-0 sudo[219404]: pam_unix(sudo:session): session closed for user root
Sep 30 21:13:15 compute-0 sshd-session[193135]: Connection closed by 192.168.122.30 port 54078
Sep 30 21:13:15 compute-0 sshd-session[193132]: pam_unix(sshd:session): session closed for user zuul
Sep 30 21:13:15 compute-0 systemd-logind[792]: Session 27 logged out. Waiting for processes to exit.
Sep 30 21:13:15 compute-0 systemd[1]: session-27.scope: Deactivated successfully.
Sep 30 21:13:15 compute-0 systemd[1]: session-27.scope: Consumed 1min 55.636s CPU time.
Sep 30 21:13:15 compute-0 systemd-logind[792]: Removed session 27.
Sep 30 21:13:15 compute-0 podman[219431]: 2025-09-30 21:13:15.843193804 +0000 UTC m=+0.076257975 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.build-date=20250923, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:13:19 compute-0 podman[219451]: 2025-09-30 21:13:19.360060275 +0000 UTC m=+0.083154940 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Sep 30 21:13:26 compute-0 podman[219470]: 2025-09-30 21:13:26.335515598 +0000 UTC m=+0.068298853 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.openshift.tags=minimal rhel9, config_id=edpm, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal)
Sep 30 21:13:27 compute-0 podman[219491]: 2025-09-30 21:13:27.315805808 +0000 UTC m=+0.051257080 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 21:13:28 compute-0 nova_compute[192810]: 2025-09-30 21:13:28.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:13:28 compute-0 nova_compute[192810]: 2025-09-30 21:13:28.789 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:13:29 compute-0 nova_compute[192810]: 2025-09-30 21:13:29.784 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:13:29 compute-0 nova_compute[192810]: 2025-09-30 21:13:29.784 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:13:29 compute-0 nova_compute[192810]: 2025-09-30 21:13:29.807 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:13:29 compute-0 nova_compute[192810]: 2025-09-30 21:13:29.830 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:13:29 compute-0 nova_compute[192810]: 2025-09-30 21:13:29.831 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:13:29 compute-0 nova_compute[192810]: 2025-09-30 21:13:29.831 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:13:29 compute-0 nova_compute[192810]: 2025-09-30 21:13:29.831 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:13:30 compute-0 nova_compute[192810]: 2025-09-30 21:13:30.016 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:13:30 compute-0 nova_compute[192810]: 2025-09-30 21:13:30.017 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5994MB free_disk=73.50276184082031GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:13:30 compute-0 nova_compute[192810]: 2025-09-30 21:13:30.017 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:13:30 compute-0 nova_compute[192810]: 2025-09-30 21:13:30.018 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:13:30 compute-0 nova_compute[192810]: 2025-09-30 21:13:30.075 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:13:30 compute-0 nova_compute[192810]: 2025-09-30 21:13:30.076 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:13:30 compute-0 nova_compute[192810]: 2025-09-30 21:13:30.097 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:13:30 compute-0 nova_compute[192810]: 2025-09-30 21:13:30.110 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:13:30 compute-0 nova_compute[192810]: 2025-09-30 21:13:30.112 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:13:30 compute-0 nova_compute[192810]: 2025-09-30 21:13:30.112 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.095s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:13:30 compute-0 unix_chkpwd[219518]: password check failed for user (root)
Sep 30 21:13:30 compute-0 sshd-session[219516]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=45.81.23.80  user=root
Sep 30 21:13:31 compute-0 nova_compute[192810]: 2025-09-30 21:13:31.093 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:13:31 compute-0 nova_compute[192810]: 2025-09-30 21:13:31.094 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:13:31 compute-0 nova_compute[192810]: 2025-09-30 21:13:31.094 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:13:32 compute-0 nova_compute[192810]: 2025-09-30 21:13:32.968 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 21:13:32 compute-0 nova_compute[192810]: 2025-09-30 21:13:32.968 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:13:32 compute-0 nova_compute[192810]: 2025-09-30 21:13:32.968 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:13:32 compute-0 nova_compute[192810]: 2025-09-30 21:13:32.969 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:13:32 compute-0 nova_compute[192810]: 2025-09-30 21:13:32.969 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:13:32 compute-0 nova_compute[192810]: 2025-09-30 21:13:32.969 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:13:33 compute-0 sshd-session[219516]: Failed password for root from 45.81.23.80 port 40106 ssh2
Sep 30 21:13:34 compute-0 podman[219519]: 2025-09-30 21:13:34.345509739 +0000 UTC m=+0.073732581 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Sep 30 21:13:34 compute-0 sshd-session[219516]: Received disconnect from 45.81.23.80 port 40106:11: Bye Bye [preauth]
Sep 30 21:13:34 compute-0 sshd-session[219516]: Disconnected from authenticating user root 45.81.23.80 port 40106 [preauth]
Sep 30 21:13:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:13:38.716 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:13:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:13:38.716 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:13:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:13:38.716 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:13:39 compute-0 podman[219539]: 2025-09-30 21:13:39.345471361 +0000 UTC m=+0.079991489 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_managed=true, managed_by=edpm_ansible)
Sep 30 21:13:40 compute-0 podman[219558]: 2025-09-30 21:13:40.352537932 +0000 UTC m=+0.075100846 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 21:13:44 compute-0 podman[219582]: 2025-09-30 21:13:44.388278912 +0000 UTC m=+0.124211041 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20250923, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Sep 30 21:13:46 compute-0 podman[219608]: 2025-09-30 21:13:46.326341342 +0000 UTC m=+0.066856797 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=edpm)
Sep 30 21:13:50 compute-0 podman[219628]: 2025-09-30 21:13:50.327579397 +0000 UTC m=+0.063428090 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20250923, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:13:57 compute-0 podman[219648]: 2025-09-30 21:13:57.34084706 +0000 UTC m=+0.069538535 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, architecture=x86_64, version=9.6, build-date=2025-08-20T13:12:41, distribution-scope=public, io.openshift.tags=minimal rhel9, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Sep 30 21:13:57 compute-0 podman[219669]: 2025-09-30 21:13:57.422218734 +0000 UTC m=+0.049735493 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:13:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:13:58.236 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:13:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:13:58.239 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:13:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:13:58.240 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:14:05 compute-0 podman[219694]: 2025-09-30 21:14:05.349158228 +0000 UTC m=+0.070637122 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=iscsid, io.buildah.version=1.41.3)
Sep 30 21:14:06 compute-0 sshd[128205]: Timeout before authentication for connection from 49.64.169.153 to 38.102.83.69, pid = 212398
Sep 30 21:14:10 compute-0 podman[219714]: 2025-09-30 21:14:10.328523449 +0000 UTC m=+0.066575220 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 21:14:11 compute-0 podman[219733]: 2025-09-30 21:14:11.334944883 +0000 UTC m=+0.071954726 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:14:15 compute-0 podman[219757]: 2025-09-30 21:14:15.419833199 +0000 UTC m=+0.156040619 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Sep 30 21:14:17 compute-0 podman[219785]: 2025-09-30 21:14:17.334841055 +0000 UTC m=+0.069707458 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible)
Sep 30 21:14:21 compute-0 podman[219805]: 2025-09-30 21:14:21.345696043 +0000 UTC m=+0.070657273 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250923, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Sep 30 21:14:24 compute-0 sshd-session[219825]: Invalid user git from 45.81.23.80 port 34952
Sep 30 21:14:24 compute-0 sshd-session[219825]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:14:24 compute-0 sshd-session[219825]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=45.81.23.80
Sep 30 21:14:26 compute-0 sshd-session[219825]: Failed password for invalid user git from 45.81.23.80 port 34952 ssh2
Sep 30 21:14:27 compute-0 sshd-session[219825]: Received disconnect from 45.81.23.80 port 34952:11: Bye Bye [preauth]
Sep 30 21:14:27 compute-0 sshd-session[219825]: Disconnected from invalid user git 45.81.23.80 port 34952 [preauth]
Sep 30 21:14:27 compute-0 nova_compute[192810]: 2025-09-30 21:14:27.790 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:14:27 compute-0 nova_compute[192810]: 2025-09-30 21:14:27.790 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Sep 30 21:14:28 compute-0 nova_compute[192810]: 2025-09-30 21:14:28.062 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Sep 30 21:14:28 compute-0 nova_compute[192810]: 2025-09-30 21:14:28.063 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:14:28 compute-0 nova_compute[192810]: 2025-09-30 21:14:28.063 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Sep 30 21:14:28 compute-0 nova_compute[192810]: 2025-09-30 21:14:28.078 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:14:28 compute-0 podman[219827]: 2025-09-30 21:14:28.33117992 +0000 UTC m=+0.062367083 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Sep 30 21:14:28 compute-0 podman[219828]: 2025-09-30 21:14:28.338693971 +0000 UTC m=+0.069429262 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, name=ubi9-minimal, build-date=2025-08-20T13:12:41, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, version=9.6, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git)
Sep 30 21:14:29 compute-0 nova_compute[192810]: 2025-09-30 21:14:29.090 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:14:29 compute-0 nova_compute[192810]: 2025-09-30 21:14:29.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:14:29 compute-0 nova_compute[192810]: 2025-09-30 21:14:29.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:14:29 compute-0 nova_compute[192810]: 2025-09-30 21:14:29.828 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:14:29 compute-0 nova_compute[192810]: 2025-09-30 21:14:29.830 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:14:29 compute-0 nova_compute[192810]: 2025-09-30 21:14:29.830 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:14:29 compute-0 nova_compute[192810]: 2025-09-30 21:14:29.830 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:14:30 compute-0 nova_compute[192810]: 2025-09-30 21:14:30.023 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:14:30 compute-0 nova_compute[192810]: 2025-09-30 21:14:30.025 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6064MB free_disk=73.50280380249023GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:14:30 compute-0 nova_compute[192810]: 2025-09-30 21:14:30.026 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:14:30 compute-0 nova_compute[192810]: 2025-09-30 21:14:30.026 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:14:30 compute-0 nova_compute[192810]: 2025-09-30 21:14:30.178 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:14:30 compute-0 nova_compute[192810]: 2025-09-30 21:14:30.178 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:14:30 compute-0 nova_compute[192810]: 2025-09-30 21:14:30.199 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Refreshing inventories for resource provider fe423b93-de5a-41f7-97d1-9622ea46af54 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Sep 30 21:14:30 compute-0 nova_compute[192810]: 2025-09-30 21:14:30.298 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Updating ProviderTree inventory for provider fe423b93-de5a-41f7-97d1-9622ea46af54 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Sep 30 21:14:30 compute-0 nova_compute[192810]: 2025-09-30 21:14:30.299 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Updating inventory in ProviderTree for provider fe423b93-de5a-41f7-97d1-9622ea46af54 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Sep 30 21:14:30 compute-0 nova_compute[192810]: 2025-09-30 21:14:30.312 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Refreshing aggregate associations for resource provider fe423b93-de5a-41f7-97d1-9622ea46af54, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Sep 30 21:14:30 compute-0 nova_compute[192810]: 2025-09-30 21:14:30.330 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Refreshing trait associations for resource provider fe423b93-de5a-41f7-97d1-9622ea46af54, traits: COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_RESCUE_BFV,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_1_2,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE42,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Sep 30 21:14:30 compute-0 nova_compute[192810]: 2025-09-30 21:14:30.347 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:14:30 compute-0 nova_compute[192810]: 2025-09-30 21:14:30.363 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:14:30 compute-0 nova_compute[192810]: 2025-09-30 21:14:30.366 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:14:30 compute-0 nova_compute[192810]: 2025-09-30 21:14:30.367 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.341s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:14:31 compute-0 nova_compute[192810]: 2025-09-30 21:14:31.362 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:14:31 compute-0 nova_compute[192810]: 2025-09-30 21:14:31.363 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:14:31 compute-0 nova_compute[192810]: 2025-09-30 21:14:31.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:14:31 compute-0 nova_compute[192810]: 2025-09-30 21:14:31.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:14:32 compute-0 nova_compute[192810]: 2025-09-30 21:14:32.789 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:14:32 compute-0 nova_compute[192810]: 2025-09-30 21:14:32.790 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:14:32 compute-0 nova_compute[192810]: 2025-09-30 21:14:32.790 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:14:32 compute-0 nova_compute[192810]: 2025-09-30 21:14:32.805 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 21:14:33 compute-0 nova_compute[192810]: 2025-09-30 21:14:33.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:14:33 compute-0 nova_compute[192810]: 2025-09-30 21:14:33.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:14:36 compute-0 podman[219869]: 2025-09-30 21:14:36.359547047 +0000 UTC m=+0.094440706 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid)
Sep 30 21:14:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:14:38.717 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:14:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:14:38.718 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:14:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:14:38.718 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:14:41 compute-0 podman[219890]: 2025-09-30 21:14:41.324157404 +0000 UTC m=+0.060146176 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 21:14:42 compute-0 podman[219910]: 2025-09-30 21:14:42.323748825 +0000 UTC m=+0.064431235 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:14:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:14:43.901 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:14:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:14:43.902 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:14:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:14:43.902 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:14:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:14:43.902 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:14:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:14:43.902 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:14:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:14:43.903 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:14:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:14:43.903 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:14:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:14:43.903 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:14:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:14:43.903 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:14:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:14:43.903 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:14:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:14:43.903 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:14:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:14:43.903 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:14:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:14:43.903 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:14:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:14:43.904 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:14:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:14:43.904 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:14:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:14:43.904 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:14:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:14:43.904 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:14:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:14:43.904 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:14:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:14:43.904 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:14:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:14:43.904 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:14:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:14:43.904 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:14:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:14:43.905 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:14:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:14:43.905 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:14:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:14:43.905 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:14:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:14:43.905 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:14:46 compute-0 podman[219934]: 2025-09-30 21:14:46.426112923 +0000 UTC m=+0.157455644 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3)
Sep 30 21:14:48 compute-0 podman[219963]: 2025-09-30 21:14:48.360159762 +0000 UTC m=+0.084148045 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:14:48 compute-0 sshd[128205]: Timeout before authentication for connection from 113.240.110.90 to 38.102.83.69, pid = 215438
Sep 30 21:14:48 compute-0 sshd[128205]: drop connection #0 from [113.240.110.90]:37608 on [38.102.83.69]:22 penalty: exceeded LoginGraceTime
Sep 30 21:14:52 compute-0 podman[219985]: 2025-09-30 21:14:52.35713557 +0000 UTC m=+0.093135563 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Sep 30 21:14:59 compute-0 podman[220006]: 2025-09-30 21:14:59.330773907 +0000 UTC m=+0.058831653 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 21:14:59 compute-0 podman[220007]: 2025-09-30 21:14:59.348535107 +0000 UTC m=+0.074320326 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.tags=minimal rhel9, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, io.buildah.version=1.33.7, maintainer=Red Hat, Inc.)
Sep 30 21:15:07 compute-0 podman[220051]: 2025-09-30 21:15:07.351544609 +0000 UTC m=+0.078802799 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 21:15:12 compute-0 podman[220071]: 2025-09-30 21:15:12.341508788 +0000 UTC m=+0.074592992 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=multipathd)
Sep 30 21:15:12 compute-0 podman[220093]: 2025-09-30 21:15:12.43028669 +0000 UTC m=+0.059974772 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:15:16 compute-0 sshd-session[220117]: Invalid user superadmin from 45.81.23.80 port 58034
Sep 30 21:15:16 compute-0 sshd-session[220117]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:15:16 compute-0 sshd-session[220117]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=45.81.23.80
Sep 30 21:15:17 compute-0 podman[220119]: 2025-09-30 21:15:17.357631902 +0000 UTC m=+0.090617600 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Sep 30 21:15:18 compute-0 sshd-session[220117]: Failed password for invalid user superadmin from 45.81.23.80 port 58034 ssh2
Sep 30 21:15:18 compute-0 sshd-session[220117]: Received disconnect from 45.81.23.80 port 58034:11: Bye Bye [preauth]
Sep 30 21:15:18 compute-0 sshd-session[220117]: Disconnected from invalid user superadmin 45.81.23.80 port 58034 [preauth]
Sep 30 21:15:19 compute-0 podman[220145]: 2025-09-30 21:15:19.332535793 +0000 UTC m=+0.063420598 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Sep 30 21:15:23 compute-0 podman[220165]: 2025-09-30 21:15:23.323014686 +0000 UTC m=+0.058435454 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:15:30 compute-0 podman[220185]: 2025-09-30 21:15:30.317069803 +0000 UTC m=+0.050228430 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., release=1755695350, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, distribution-scope=public, vcs-type=git, version=9.6, architecture=x86_64, io.openshift.expose-services=, managed_by=edpm_ansible)
Sep 30 21:15:30 compute-0 podman[220184]: 2025-09-30 21:15:30.345756996 +0000 UTC m=+0.082370049 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:15:30 compute-0 nova_compute[192810]: 2025-09-30 21:15:30.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:15:31 compute-0 nova_compute[192810]: 2025-09-30 21:15:31.783 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:15:31 compute-0 nova_compute[192810]: 2025-09-30 21:15:31.786 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:15:31 compute-0 nova_compute[192810]: 2025-09-30 21:15:31.786 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:15:31 compute-0 nova_compute[192810]: 2025-09-30 21:15:31.843 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:15:31 compute-0 nova_compute[192810]: 2025-09-30 21:15:31.843 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:15:31 compute-0 nova_compute[192810]: 2025-09-30 21:15:31.843 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:15:31 compute-0 nova_compute[192810]: 2025-09-30 21:15:31.843 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:15:32 compute-0 nova_compute[192810]: 2025-09-30 21:15:32.046 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:15:32 compute-0 nova_compute[192810]: 2025-09-30 21:15:32.047 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6093MB free_disk=73.50280380249023GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:15:32 compute-0 nova_compute[192810]: 2025-09-30 21:15:32.047 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:15:32 compute-0 nova_compute[192810]: 2025-09-30 21:15:32.047 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:15:32 compute-0 nova_compute[192810]: 2025-09-30 21:15:32.143 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:15:32 compute-0 nova_compute[192810]: 2025-09-30 21:15:32.143 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:15:32 compute-0 nova_compute[192810]: 2025-09-30 21:15:32.176 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:15:32 compute-0 nova_compute[192810]: 2025-09-30 21:15:32.192 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:15:32 compute-0 nova_compute[192810]: 2025-09-30 21:15:32.194 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:15:32 compute-0 nova_compute[192810]: 2025-09-30 21:15:32.194 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:15:33 compute-0 nova_compute[192810]: 2025-09-30 21:15:33.190 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:15:33 compute-0 nova_compute[192810]: 2025-09-30 21:15:33.205 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:15:33 compute-0 nova_compute[192810]: 2025-09-30 21:15:33.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:15:33 compute-0 nova_compute[192810]: 2025-09-30 21:15:33.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:15:33 compute-0 nova_compute[192810]: 2025-09-30 21:15:33.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:15:34 compute-0 sshd[128205]: drop connection #0 from [49.64.169.153]:41417 on [38.102.83.69]:22 penalty: exceeded LoginGraceTime
Sep 30 21:15:34 compute-0 nova_compute[192810]: 2025-09-30 21:15:34.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:15:34 compute-0 nova_compute[192810]: 2025-09-30 21:15:34.789 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:15:34 compute-0 nova_compute[192810]: 2025-09-30 21:15:34.789 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:15:34 compute-0 nova_compute[192810]: 2025-09-30 21:15:34.802 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 21:15:34 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:15:34.903 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:15:34 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:15:34.904 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:15:35 compute-0 nova_compute[192810]: 2025-09-30 21:15:35.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:15:38 compute-0 podman[220230]: 2025-09-30 21:15:38.34162551 +0000 UTC m=+0.072986915 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Sep 30 21:15:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:15:38.718 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:15:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:15:38.719 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:15:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:15:38.719 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:15:41 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:15:41.906 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:15:43 compute-0 podman[220250]: 2025-09-30 21:15:43.357282109 +0000 UTC m=+0.076393570 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20250923, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 21:15:43 compute-0 podman[220251]: 2025-09-30 21:15:43.383190933 +0000 UTC m=+0.096700884 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 21:15:47 compute-0 nova_compute[192810]: 2025-09-30 21:15:47.544 2 DEBUG oslo_concurrency.lockutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Acquiring lock "7a2c317e-5f8f-4337-9052-2231a2173f23" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:15:47 compute-0 nova_compute[192810]: 2025-09-30 21:15:47.545 2 DEBUG oslo_concurrency.lockutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Lock "7a2c317e-5f8f-4337-9052-2231a2173f23" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:15:47 compute-0 nova_compute[192810]: 2025-09-30 21:15:47.570 2 DEBUG nova.compute.manager [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:15:47 compute-0 nova_compute[192810]: 2025-09-30 21:15:47.718 2 DEBUG oslo_concurrency.lockutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:15:47 compute-0 nova_compute[192810]: 2025-09-30 21:15:47.718 2 DEBUG oslo_concurrency.lockutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:15:47 compute-0 nova_compute[192810]: 2025-09-30 21:15:47.724 2 DEBUG nova.virt.hardware [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:15:47 compute-0 nova_compute[192810]: 2025-09-30 21:15:47.724 2 INFO nova.compute.claims [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:15:47 compute-0 nova_compute[192810]: 2025-09-30 21:15:47.864 2 DEBUG nova.compute.provider_tree [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:15:47 compute-0 nova_compute[192810]: 2025-09-30 21:15:47.880 2 DEBUG nova.scheduler.client.report [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:15:47 compute-0 nova_compute[192810]: 2025-09-30 21:15:47.914 2 DEBUG oslo_concurrency.lockutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.196s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:15:47 compute-0 nova_compute[192810]: 2025-09-30 21:15:47.915 2 DEBUG nova.compute.manager [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:15:48 compute-0 nova_compute[192810]: 2025-09-30 21:15:48.107 2 DEBUG nova.compute.manager [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:15:48 compute-0 nova_compute[192810]: 2025-09-30 21:15:48.107 2 DEBUG nova.network.neutron [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:15:48 compute-0 nova_compute[192810]: 2025-09-30 21:15:48.133 2 INFO nova.virt.libvirt.driver [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:15:48 compute-0 nova_compute[192810]: 2025-09-30 21:15:48.155 2 DEBUG nova.compute.manager [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:15:48 compute-0 nova_compute[192810]: 2025-09-30 21:15:48.320 2 DEBUG nova.compute.manager [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:15:48 compute-0 nova_compute[192810]: 2025-09-30 21:15:48.321 2 DEBUG nova.virt.libvirt.driver [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:15:48 compute-0 nova_compute[192810]: 2025-09-30 21:15:48.322 2 INFO nova.virt.libvirt.driver [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Creating image(s)
Sep 30 21:15:48 compute-0 nova_compute[192810]: 2025-09-30 21:15:48.322 2 DEBUG oslo_concurrency.lockutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Acquiring lock "/var/lib/nova/instances/7a2c317e-5f8f-4337-9052-2231a2173f23/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:15:48 compute-0 nova_compute[192810]: 2025-09-30 21:15:48.323 2 DEBUG oslo_concurrency.lockutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Lock "/var/lib/nova/instances/7a2c317e-5f8f-4337-9052-2231a2173f23/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:15:48 compute-0 nova_compute[192810]: 2025-09-30 21:15:48.323 2 DEBUG oslo_concurrency.lockutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Lock "/var/lib/nova/instances/7a2c317e-5f8f-4337-9052-2231a2173f23/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:15:48 compute-0 nova_compute[192810]: 2025-09-30 21:15:48.324 2 DEBUG oslo_concurrency.lockutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:15:48 compute-0 nova_compute[192810]: 2025-09-30 21:15:48.324 2 DEBUG oslo_concurrency.lockutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:15:48 compute-0 podman[220290]: 2025-09-30 21:15:48.348506252 +0000 UTC m=+0.081933468 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Sep 30 21:15:48 compute-0 nova_compute[192810]: 2025-09-30 21:15:48.966 2 DEBUG nova.network.neutron [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Automatically allocating a network for project 9ee15beb428a4c5e9726b65c80920da9. _auto_allocate_network /usr/lib/python3.9/site-packages/nova/network/neutron.py:2460
Sep 30 21:15:50 compute-0 nova_compute[192810]: 2025-09-30 21:15:50.226 2 DEBUG oslo_concurrency.processutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:15:50 compute-0 nova_compute[192810]: 2025-09-30 21:15:50.296 2 DEBUG oslo_concurrency.processutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a.part --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:15:50 compute-0 nova_compute[192810]: 2025-09-30 21:15:50.298 2 DEBUG nova.virt.images [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] 86b6907c-d747-4e98-8897-42105915831d was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Sep 30 21:15:50 compute-0 nova_compute[192810]: 2025-09-30 21:15:50.299 2 DEBUG nova.privsep.utils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Sep 30 21:15:50 compute-0 nova_compute[192810]: 2025-09-30 21:15:50.299 2 DEBUG oslo_concurrency.processutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a.part /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:15:50 compute-0 podman[220318]: 2025-09-30 21:15:50.319997788 +0000 UTC m=+0.057537961 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, container_name=ceilometer_agent_compute)
Sep 30 21:15:50 compute-0 nova_compute[192810]: 2025-09-30 21:15:50.443 2 DEBUG oslo_concurrency.processutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a.part /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a.converted" returned: 0 in 0.144s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:15:50 compute-0 nova_compute[192810]: 2025-09-30 21:15:50.449 2 DEBUG oslo_concurrency.processutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:15:50 compute-0 nova_compute[192810]: 2025-09-30 21:15:50.527 2 DEBUG oslo_concurrency.processutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a.converted --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:15:50 compute-0 nova_compute[192810]: 2025-09-30 21:15:50.528 2 DEBUG oslo_concurrency.lockutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.204s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:15:50 compute-0 nova_compute[192810]: 2025-09-30 21:15:50.545 2 INFO oslo.privsep.daemon [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpehi2qalm/privsep.sock']
Sep 30 21:15:51 compute-0 nova_compute[192810]: 2025-09-30 21:15:51.251 2 INFO oslo.privsep.daemon [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Spawned new privsep daemon via rootwrap
Sep 30 21:15:51 compute-0 nova_compute[192810]: 2025-09-30 21:15:51.078 54 INFO oslo.privsep.daemon [-] privsep daemon starting
Sep 30 21:15:51 compute-0 nova_compute[192810]: 2025-09-30 21:15:51.082 54 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Sep 30 21:15:51 compute-0 nova_compute[192810]: 2025-09-30 21:15:51.084 54 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Sep 30 21:15:51 compute-0 nova_compute[192810]: 2025-09-30 21:15:51.084 54 INFO oslo.privsep.daemon [-] privsep daemon running as pid 54
Sep 30 21:15:51 compute-0 nova_compute[192810]: 2025-09-30 21:15:51.342 2 DEBUG oslo_concurrency.processutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:15:51 compute-0 nova_compute[192810]: 2025-09-30 21:15:51.404 2 DEBUG oslo_concurrency.processutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:15:51 compute-0 nova_compute[192810]: 2025-09-30 21:15:51.405 2 DEBUG oslo_concurrency.lockutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:15:51 compute-0 nova_compute[192810]: 2025-09-30 21:15:51.406 2 DEBUG oslo_concurrency.lockutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:15:51 compute-0 nova_compute[192810]: 2025-09-30 21:15:51.421 2 DEBUG oslo_concurrency.processutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:15:51 compute-0 nova_compute[192810]: 2025-09-30 21:15:51.483 2 DEBUG oslo_concurrency.processutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:15:51 compute-0 nova_compute[192810]: 2025-09-30 21:15:51.485 2 DEBUG oslo_concurrency.processutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/7a2c317e-5f8f-4337-9052-2231a2173f23/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:15:51 compute-0 nova_compute[192810]: 2025-09-30 21:15:51.521 2 DEBUG oslo_concurrency.processutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/7a2c317e-5f8f-4337-9052-2231a2173f23/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:15:51 compute-0 nova_compute[192810]: 2025-09-30 21:15:51.523 2 DEBUG oslo_concurrency.lockutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:15:51 compute-0 nova_compute[192810]: 2025-09-30 21:15:51.523 2 DEBUG oslo_concurrency.processutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:15:51 compute-0 nova_compute[192810]: 2025-09-30 21:15:51.605 2 DEBUG oslo_concurrency.processutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:15:51 compute-0 nova_compute[192810]: 2025-09-30 21:15:51.607 2 DEBUG nova.virt.disk.api [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Checking if we can resize image /var/lib/nova/instances/7a2c317e-5f8f-4337-9052-2231a2173f23/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:15:51 compute-0 nova_compute[192810]: 2025-09-30 21:15:51.607 2 DEBUG oslo_concurrency.processutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a2c317e-5f8f-4337-9052-2231a2173f23/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:15:51 compute-0 nova_compute[192810]: 2025-09-30 21:15:51.670 2 DEBUG oslo_concurrency.processutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a2c317e-5f8f-4337-9052-2231a2173f23/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:15:51 compute-0 nova_compute[192810]: 2025-09-30 21:15:51.671 2 DEBUG nova.virt.disk.api [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Cannot resize image /var/lib/nova/instances/7a2c317e-5f8f-4337-9052-2231a2173f23/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:15:51 compute-0 nova_compute[192810]: 2025-09-30 21:15:51.672 2 DEBUG nova.objects.instance [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Lazy-loading 'migration_context' on Instance uuid 7a2c317e-5f8f-4337-9052-2231a2173f23 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:15:51 compute-0 nova_compute[192810]: 2025-09-30 21:15:51.700 2 DEBUG nova.virt.libvirt.driver [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:15:51 compute-0 nova_compute[192810]: 2025-09-30 21:15:51.701 2 DEBUG nova.virt.libvirt.driver [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Ensure instance console log exists: /var/lib/nova/instances/7a2c317e-5f8f-4337-9052-2231a2173f23/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:15:51 compute-0 nova_compute[192810]: 2025-09-30 21:15:51.702 2 DEBUG oslo_concurrency.lockutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:15:51 compute-0 nova_compute[192810]: 2025-09-30 21:15:51.703 2 DEBUG oslo_concurrency.lockutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:15:51 compute-0 nova_compute[192810]: 2025-09-30 21:15:51.703 2 DEBUG oslo_concurrency.lockutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:15:54 compute-0 podman[220371]: 2025-09-30 21:15:54.327878193 +0000 UTC m=+0.062942046 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Sep 30 21:16:01 compute-0 podman[220390]: 2025-09-30 21:16:01.315579122 +0000 UTC m=+0.051633325 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:16:01 compute-0 podman[220391]: 2025-09-30 21:16:01.362222872 +0000 UTC m=+0.084439071 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal)
Sep 30 21:16:06 compute-0 sshd-session[220436]: Invalid user cuckoo from 45.81.23.80 port 52880
Sep 30 21:16:06 compute-0 sshd-session[220436]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:16:06 compute-0 sshd-session[220436]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=45.81.23.80
Sep 30 21:16:08 compute-0 sshd-session[220436]: Failed password for invalid user cuckoo from 45.81.23.80 port 52880 ssh2
Sep 30 21:16:09 compute-0 podman[220438]: 2025-09-30 21:16:09.328944563 +0000 UTC m=+0.065926947 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, container_name=iscsid, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:16:09 compute-0 sshd-session[220436]: Received disconnect from 45.81.23.80 port 52880:11: Bye Bye [preauth]
Sep 30 21:16:09 compute-0 sshd-session[220436]: Disconnected from invalid user cuckoo 45.81.23.80 port 52880 [preauth]
Sep 30 21:16:14 compute-0 podman[220459]: 2025-09-30 21:16:14.341463945 +0000 UTC m=+0.067599179 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:16:14 compute-0 podman[220458]: 2025-09-30 21:16:14.370037462 +0000 UTC m=+0.096306410 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:16:16 compute-0 nova_compute[192810]: 2025-09-30 21:16:16.665 2 DEBUG nova.network.neutron [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Automatically allocated network: {'id': '776029b5-6115-4a10-b114-56157e8d42e8', 'name': 'auto_allocated_network', 'tenant_id': '9ee15beb428a4c5e9726b65c80920da9', 'admin_state_up': True, 'mtu': 1442, 'status': 'ACTIVE', 'subnets': ['20d80525-eb42-4d6a-9761-1ab47a242279', 'af065b13-5619-478f-8481-1ded808231da'], 'shared': False, 'availability_zone_hints': [], 'availability_zones': [], 'ipv4_address_scope': None, 'ipv6_address_scope': None, 'router:external': False, 'description': '', 'qos_policy_id': None, 'port_security_enabled': True, 'dns_domain': '', 'l2_adjacency': True, 'tags': [], 'created_at': '2025-09-30T21:15:49Z', 'updated_at': '2025-09-30T21:16:07Z', 'revision_number': 4, 'project_id': '9ee15beb428a4c5e9726b65c80920da9'} _auto_allocate_network /usr/lib/python3.9/site-packages/nova/network/neutron.py:2478
Sep 30 21:16:16 compute-0 nova_compute[192810]: 2025-09-30 21:16:16.681 2 WARNING oslo_policy.policy [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Sep 30 21:16:16 compute-0 nova_compute[192810]: 2025-09-30 21:16:16.682 2 WARNING oslo_policy.policy [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Sep 30 21:16:16 compute-0 nova_compute[192810]: 2025-09-30 21:16:16.687 2 DEBUG nova.policy [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b0c6f6dda88549aabf6a4b039406af8e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9ee15beb428a4c5e9726b65c80920da9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:16:18 compute-0 nova_compute[192810]: 2025-09-30 21:16:18.861 2 DEBUG nova.network.neutron [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Successfully created port: 15bad28b-1806-4dd5-8233-16a23f243c4b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:16:19 compute-0 podman[220503]: 2025-09-30 21:16:19.389224481 +0000 UTC m=+0.117981715 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0)
Sep 30 21:16:21 compute-0 podman[220529]: 2025-09-30 21:16:21.352139095 +0000 UTC m=+0.086856923 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Sep 30 21:16:21 compute-0 nova_compute[192810]: 2025-09-30 21:16:21.397 2 DEBUG nova.network.neutron [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Successfully updated port: 15bad28b-1806-4dd5-8233-16a23f243c4b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:16:21 compute-0 nova_compute[192810]: 2025-09-30 21:16:21.417 2 DEBUG oslo_concurrency.lockutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Acquiring lock "refresh_cache-7a2c317e-5f8f-4337-9052-2231a2173f23" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:16:21 compute-0 nova_compute[192810]: 2025-09-30 21:16:21.418 2 DEBUG oslo_concurrency.lockutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Acquired lock "refresh_cache-7a2c317e-5f8f-4337-9052-2231a2173f23" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:16:21 compute-0 nova_compute[192810]: 2025-09-30 21:16:21.418 2 DEBUG nova.network.neutron [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:16:21 compute-0 nova_compute[192810]: 2025-09-30 21:16:21.583 2 DEBUG nova.network.neutron [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:16:22 compute-0 nova_compute[192810]: 2025-09-30 21:16:22.675 2 DEBUG nova.compute.manager [req-181e812b-4522-47cc-94a7-f46e072a2761 req-fcc773b7-f1c2-465d-b55d-13fbef3922d9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Received event network-changed-15bad28b-1806-4dd5-8233-16a23f243c4b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:16:22 compute-0 nova_compute[192810]: 2025-09-30 21:16:22.676 2 DEBUG nova.compute.manager [req-181e812b-4522-47cc-94a7-f46e072a2761 req-fcc773b7-f1c2-465d-b55d-13fbef3922d9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Refreshing instance network info cache due to event network-changed-15bad28b-1806-4dd5-8233-16a23f243c4b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:16:22 compute-0 nova_compute[192810]: 2025-09-30 21:16:22.676 2 DEBUG oslo_concurrency.lockutils [req-181e812b-4522-47cc-94a7-f46e072a2761 req-fcc773b7-f1c2-465d-b55d-13fbef3922d9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-7a2c317e-5f8f-4337-9052-2231a2173f23" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:16:23 compute-0 nova_compute[192810]: 2025-09-30 21:16:23.378 2 DEBUG nova.network.neutron [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Updating instance_info_cache with network_info: [{"id": "15bad28b-1806-4dd5-8233-16a23f243c4b", "address": "fa:16:3e:e7:d1:20", "network": {"id": "776029b5-6115-4a10-b114-56157e8d42e8", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::3e6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ee15beb428a4c5e9726b65c80920da9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15bad28b-18", "ovs_interfaceid": "15bad28b-1806-4dd5-8233-16a23f243c4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:16:23 compute-0 nova_compute[192810]: 2025-09-30 21:16:23.409 2 DEBUG oslo_concurrency.lockutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Releasing lock "refresh_cache-7a2c317e-5f8f-4337-9052-2231a2173f23" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:16:23 compute-0 nova_compute[192810]: 2025-09-30 21:16:23.410 2 DEBUG nova.compute.manager [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Instance network_info: |[{"id": "15bad28b-1806-4dd5-8233-16a23f243c4b", "address": "fa:16:3e:e7:d1:20", "network": {"id": "776029b5-6115-4a10-b114-56157e8d42e8", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::3e6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ee15beb428a4c5e9726b65c80920da9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15bad28b-18", "ovs_interfaceid": "15bad28b-1806-4dd5-8233-16a23f243c4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:16:23 compute-0 nova_compute[192810]: 2025-09-30 21:16:23.411 2 DEBUG oslo_concurrency.lockutils [req-181e812b-4522-47cc-94a7-f46e072a2761 req-fcc773b7-f1c2-465d-b55d-13fbef3922d9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-7a2c317e-5f8f-4337-9052-2231a2173f23" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:16:23 compute-0 nova_compute[192810]: 2025-09-30 21:16:23.412 2 DEBUG nova.network.neutron [req-181e812b-4522-47cc-94a7-f46e072a2761 req-fcc773b7-f1c2-465d-b55d-13fbef3922d9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Refreshing network info cache for port 15bad28b-1806-4dd5-8233-16a23f243c4b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:16:23 compute-0 nova_compute[192810]: 2025-09-30 21:16:23.417 2 DEBUG nova.virt.libvirt.driver [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Start _get_guest_xml network_info=[{"id": "15bad28b-1806-4dd5-8233-16a23f243c4b", "address": "fa:16:3e:e7:d1:20", "network": {"id": "776029b5-6115-4a10-b114-56157e8d42e8", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::3e6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ee15beb428a4c5e9726b65c80920da9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15bad28b-18", "ovs_interfaceid": "15bad28b-1806-4dd5-8233-16a23f243c4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:16:23 compute-0 nova_compute[192810]: 2025-09-30 21:16:23.424 2 WARNING nova.virt.libvirt.driver [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:16:23 compute-0 nova_compute[192810]: 2025-09-30 21:16:23.435 2 DEBUG nova.virt.libvirt.host [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:16:23 compute-0 nova_compute[192810]: 2025-09-30 21:16:23.436 2 DEBUG nova.virt.libvirt.host [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:16:23 compute-0 nova_compute[192810]: 2025-09-30 21:16:23.460 2 DEBUG nova.virt.libvirt.host [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:16:23 compute-0 nova_compute[192810]: 2025-09-30 21:16:23.462 2 DEBUG nova.virt.libvirt.host [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:16:23 compute-0 nova_compute[192810]: 2025-09-30 21:16:23.464 2 DEBUG nova.virt.libvirt.driver [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:16:23 compute-0 nova_compute[192810]: 2025-09-30 21:16:23.464 2 DEBUG nova.virt.hardware [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:16:23 compute-0 nova_compute[192810]: 2025-09-30 21:16:23.465 2 DEBUG nova.virt.hardware [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:16:23 compute-0 nova_compute[192810]: 2025-09-30 21:16:23.465 2 DEBUG nova.virt.hardware [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:16:23 compute-0 nova_compute[192810]: 2025-09-30 21:16:23.465 2 DEBUG nova.virt.hardware [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:16:23 compute-0 nova_compute[192810]: 2025-09-30 21:16:23.465 2 DEBUG nova.virt.hardware [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:16:23 compute-0 nova_compute[192810]: 2025-09-30 21:16:23.466 2 DEBUG nova.virt.hardware [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:16:23 compute-0 nova_compute[192810]: 2025-09-30 21:16:23.466 2 DEBUG nova.virt.hardware [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:16:23 compute-0 nova_compute[192810]: 2025-09-30 21:16:23.466 2 DEBUG nova.virt.hardware [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:16:23 compute-0 nova_compute[192810]: 2025-09-30 21:16:23.467 2 DEBUG nova.virt.hardware [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:16:23 compute-0 nova_compute[192810]: 2025-09-30 21:16:23.467 2 DEBUG nova.virt.hardware [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:16:23 compute-0 nova_compute[192810]: 2025-09-30 21:16:23.467 2 DEBUG nova.virt.hardware [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:16:23 compute-0 nova_compute[192810]: 2025-09-30 21:16:23.472 2 DEBUG nova.privsep.utils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Sep 30 21:16:23 compute-0 nova_compute[192810]: 2025-09-30 21:16:23.474 2 DEBUG nova.virt.libvirt.vif [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:15:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-423040831-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-423040831-2',id=3,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9ee15beb428a4c5e9726b65c80920da9',ramdisk_id='',reservation_id='r-vbxtb26e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AutoAllocateNetworkTest-649897189',owner_user_name='tempest-AutoAllocateNetworkTest-649897189-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:15:48Z,user_data=None,user_id='b0c6f6dda88549aabf6a4b039406af8e',uuid=7a2c317e-5f8f-4337-9052-2231a2173f23,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "15bad28b-1806-4dd5-8233-16a23f243c4b", "address": "fa:16:3e:e7:d1:20", "network": {"id": "776029b5-6115-4a10-b114-56157e8d42e8", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::3e6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ee15beb428a4c5e9726b65c80920da9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15bad28b-18", "ovs_interfaceid": "15bad28b-1806-4dd5-8233-16a23f243c4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:16:23 compute-0 nova_compute[192810]: 2025-09-30 21:16:23.475 2 DEBUG nova.network.os_vif_util [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Converting VIF {"id": "15bad28b-1806-4dd5-8233-16a23f243c4b", "address": "fa:16:3e:e7:d1:20", "network": {"id": "776029b5-6115-4a10-b114-56157e8d42e8", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::3e6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ee15beb428a4c5e9726b65c80920da9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15bad28b-18", "ovs_interfaceid": "15bad28b-1806-4dd5-8233-16a23f243c4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:16:23 compute-0 nova_compute[192810]: 2025-09-30 21:16:23.478 2 DEBUG nova.network.os_vif_util [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e7:d1:20,bridge_name='br-int',has_traffic_filtering=True,id=15bad28b-1806-4dd5-8233-16a23f243c4b,network=Network(776029b5-6115-4a10-b114-56157e8d42e8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15bad28b-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:16:23 compute-0 nova_compute[192810]: 2025-09-30 21:16:23.480 2 DEBUG nova.objects.instance [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7a2c317e-5f8f-4337-9052-2231a2173f23 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:16:23 compute-0 nova_compute[192810]: 2025-09-30 21:16:23.495 2 DEBUG nova.virt.libvirt.driver [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:16:23 compute-0 nova_compute[192810]:   <uuid>7a2c317e-5f8f-4337-9052-2231a2173f23</uuid>
Sep 30 21:16:23 compute-0 nova_compute[192810]:   <name>instance-00000003</name>
Sep 30 21:16:23 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:16:23 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:16:23 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:16:23 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:       <nova:name>tempest-tempest.common.compute-instance-423040831-2</nova:name>
Sep 30 21:16:23 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:16:23</nova:creationTime>
Sep 30 21:16:23 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:16:23 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:16:23 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:16:23 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:16:23 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:16:23 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:16:23 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:16:23 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:16:23 compute-0 nova_compute[192810]:         <nova:user uuid="b0c6f6dda88549aabf6a4b039406af8e">tempest-AutoAllocateNetworkTest-649897189-project-member</nova:user>
Sep 30 21:16:23 compute-0 nova_compute[192810]:         <nova:project uuid="9ee15beb428a4c5e9726b65c80920da9">tempest-AutoAllocateNetworkTest-649897189</nova:project>
Sep 30 21:16:23 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:16:23 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:16:23 compute-0 nova_compute[192810]:         <nova:port uuid="15bad28b-1806-4dd5-8233-16a23f243c4b">
Sep 30 21:16:23 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="fdfe:381f:8400::3e6" ipVersion="6"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.1.0.10" ipVersion="4"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:16:23 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:16:23 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:16:23 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:16:23 compute-0 nova_compute[192810]:     <system>
Sep 30 21:16:23 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:16:23 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:16:23 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:16:23 compute-0 nova_compute[192810]:       <entry name="serial">7a2c317e-5f8f-4337-9052-2231a2173f23</entry>
Sep 30 21:16:23 compute-0 nova_compute[192810]:       <entry name="uuid">7a2c317e-5f8f-4337-9052-2231a2173f23</entry>
Sep 30 21:16:23 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     </system>
Sep 30 21:16:23 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:16:23 compute-0 nova_compute[192810]:   <os>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:   </os>
Sep 30 21:16:23 compute-0 nova_compute[192810]:   <features>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:   </features>
Sep 30 21:16:23 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:16:23 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:16:23 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:16:23 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:16:23 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:16:23 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/7a2c317e-5f8f-4337-9052-2231a2173f23/disk"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:16:23 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/7a2c317e-5f8f-4337-9052-2231a2173f23/disk.config"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:16:23 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:e7:d1:20"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:       <target dev="tap15bad28b-18"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:16:23 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/7a2c317e-5f8f-4337-9052-2231a2173f23/console.log" append="off"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     <video>
Sep 30 21:16:23 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     </video>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:16:23 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:16:23 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:16:23 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:16:23 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:16:23 compute-0 nova_compute[192810]: </domain>
Sep 30 21:16:23 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:16:23 compute-0 nova_compute[192810]: 2025-09-30 21:16:23.496 2 DEBUG nova.compute.manager [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Preparing to wait for external event network-vif-plugged-15bad28b-1806-4dd5-8233-16a23f243c4b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:16:23 compute-0 nova_compute[192810]: 2025-09-30 21:16:23.497 2 DEBUG oslo_concurrency.lockutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Acquiring lock "7a2c317e-5f8f-4337-9052-2231a2173f23-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:23 compute-0 nova_compute[192810]: 2025-09-30 21:16:23.497 2 DEBUG oslo_concurrency.lockutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Lock "7a2c317e-5f8f-4337-9052-2231a2173f23-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:23 compute-0 nova_compute[192810]: 2025-09-30 21:16:23.497 2 DEBUG oslo_concurrency.lockutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Lock "7a2c317e-5f8f-4337-9052-2231a2173f23-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:23 compute-0 nova_compute[192810]: 2025-09-30 21:16:23.498 2 DEBUG nova.virt.libvirt.vif [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:15:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-423040831-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-423040831-2',id=3,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9ee15beb428a4c5e9726b65c80920da9',ramdisk_id='',reservation_id='r-vbxtb26e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AutoAllocateNetworkTest-649897189',owner_user_name='tempest-AutoAllocateNetworkTest-649897189-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:15:48Z,user_data=None,user_id='b0c6f6dda88549aabf6a4b039406af8e',uuid=7a2c317e-5f8f-4337-9052-2231a2173f23,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "15bad28b-1806-4dd5-8233-16a23f243c4b", "address": "fa:16:3e:e7:d1:20", "network": {"id": "776029b5-6115-4a10-b114-56157e8d42e8", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::3e6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ee15beb428a4c5e9726b65c80920da9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15bad28b-18", "ovs_interfaceid": "15bad28b-1806-4dd5-8233-16a23f243c4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:16:23 compute-0 nova_compute[192810]: 2025-09-30 21:16:23.498 2 DEBUG nova.network.os_vif_util [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Converting VIF {"id": "15bad28b-1806-4dd5-8233-16a23f243c4b", "address": "fa:16:3e:e7:d1:20", "network": {"id": "776029b5-6115-4a10-b114-56157e8d42e8", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::3e6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ee15beb428a4c5e9726b65c80920da9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15bad28b-18", "ovs_interfaceid": "15bad28b-1806-4dd5-8233-16a23f243c4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:16:23 compute-0 nova_compute[192810]: 2025-09-30 21:16:23.499 2 DEBUG nova.network.os_vif_util [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e7:d1:20,bridge_name='br-int',has_traffic_filtering=True,id=15bad28b-1806-4dd5-8233-16a23f243c4b,network=Network(776029b5-6115-4a10-b114-56157e8d42e8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15bad28b-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:16:23 compute-0 nova_compute[192810]: 2025-09-30 21:16:23.499 2 DEBUG os_vif [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e7:d1:20,bridge_name='br-int',has_traffic_filtering=True,id=15bad28b-1806-4dd5-8233-16a23f243c4b,network=Network(776029b5-6115-4a10-b114-56157e8d42e8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15bad28b-18') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:16:23 compute-0 nova_compute[192810]: 2025-09-30 21:16:23.543 2 DEBUG ovsdbapp.backend.ovs_idl [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Sep 30 21:16:23 compute-0 nova_compute[192810]: 2025-09-30 21:16:23.543 2 DEBUG ovsdbapp.backend.ovs_idl [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Sep 30 21:16:23 compute-0 nova_compute[192810]: 2025-09-30 21:16:23.543 2 DEBUG ovsdbapp.backend.ovs_idl [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Sep 30 21:16:23 compute-0 nova_compute[192810]: 2025-09-30 21:16:23.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Sep 30 21:16:23 compute-0 nova_compute[192810]: 2025-09-30 21:16:23.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [POLLOUT] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:23 compute-0 nova_compute[192810]: 2025-09-30 21:16:23.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Sep 30 21:16:23 compute-0 nova_compute[192810]: 2025-09-30 21:16:23.547 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:23 compute-0 nova_compute[192810]: 2025-09-30 21:16:23.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:23 compute-0 nova_compute[192810]: 2025-09-30 21:16:23.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:23 compute-0 nova_compute[192810]: 2025-09-30 21:16:23.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:23 compute-0 nova_compute[192810]: 2025-09-30 21:16:23.562 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:16:23 compute-0 nova_compute[192810]: 2025-09-30 21:16:23.562 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:16:23 compute-0 nova_compute[192810]: 2025-09-30 21:16:23.563 2 INFO oslo.privsep.daemon [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpzm_dp9wg/privsep.sock']
Sep 30 21:16:24 compute-0 nova_compute[192810]: 2025-09-30 21:16:24.254 2 INFO oslo.privsep.daemon [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Spawned new privsep daemon via rootwrap
Sep 30 21:16:24 compute-0 nova_compute[192810]: 2025-09-30 21:16:24.125 75 INFO oslo.privsep.daemon [-] privsep daemon starting
Sep 30 21:16:24 compute-0 nova_compute[192810]: 2025-09-30 21:16:24.130 75 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Sep 30 21:16:24 compute-0 nova_compute[192810]: 2025-09-30 21:16:24.132 75 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Sep 30 21:16:24 compute-0 nova_compute[192810]: 2025-09-30 21:16:24.132 75 INFO oslo.privsep.daemon [-] privsep daemon running as pid 75
Sep 30 21:16:24 compute-0 nova_compute[192810]: 2025-09-30 21:16:24.579 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:24 compute-0 nova_compute[192810]: 2025-09-30 21:16:24.580 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap15bad28b-18, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:16:24 compute-0 nova_compute[192810]: 2025-09-30 21:16:24.581 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap15bad28b-18, col_values=(('external_ids', {'iface-id': '15bad28b-1806-4dd5-8233-16a23f243c4b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e7:d1:20', 'vm-uuid': '7a2c317e-5f8f-4337-9052-2231a2173f23'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:16:24 compute-0 nova_compute[192810]: 2025-09-30 21:16:24.582 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:24 compute-0 NetworkManager[51733]: <info>  [1759266984.5840] manager: (tap15bad28b-18): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/23)
Sep 30 21:16:24 compute-0 nova_compute[192810]: 2025-09-30 21:16:24.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:16:24 compute-0 nova_compute[192810]: 2025-09-30 21:16:24.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:24 compute-0 nova_compute[192810]: 2025-09-30 21:16:24.592 2 INFO os_vif [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e7:d1:20,bridge_name='br-int',has_traffic_filtering=True,id=15bad28b-1806-4dd5-8233-16a23f243c4b,network=Network(776029b5-6115-4a10-b114-56157e8d42e8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15bad28b-18')
Sep 30 21:16:24 compute-0 nova_compute[192810]: 2025-09-30 21:16:24.651 2 DEBUG nova.virt.libvirt.driver [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:16:24 compute-0 nova_compute[192810]: 2025-09-30 21:16:24.652 2 DEBUG nova.virt.libvirt.driver [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:16:24 compute-0 nova_compute[192810]: 2025-09-30 21:16:24.652 2 DEBUG nova.virt.libvirt.driver [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] No VIF found with MAC fa:16:3e:e7:d1:20, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:16:24 compute-0 nova_compute[192810]: 2025-09-30 21:16:24.653 2 INFO nova.virt.libvirt.driver [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Using config drive
Sep 30 21:16:25 compute-0 nova_compute[192810]: 2025-09-30 21:16:25.072 2 INFO nova.virt.libvirt.driver [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Creating config drive at /var/lib/nova/instances/7a2c317e-5f8f-4337-9052-2231a2173f23/disk.config
Sep 30 21:16:25 compute-0 nova_compute[192810]: 2025-09-30 21:16:25.081 2 DEBUG oslo_concurrency.processutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7a2c317e-5f8f-4337-9052-2231a2173f23/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbo4kh5iu execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:16:25 compute-0 nova_compute[192810]: 2025-09-30 21:16:25.125 2 DEBUG nova.virt.libvirt.driver [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Creating tmpfile /var/lib/nova/instances/tmp0tc_e70n to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Sep 30 21:16:25 compute-0 nova_compute[192810]: 2025-09-30 21:16:25.209 2 DEBUG oslo_concurrency.processutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7a2c317e-5f8f-4337-9052-2231a2173f23/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbo4kh5iu" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:16:25 compute-0 kernel: tun: Universal TUN/TAP device driver, 1.6
Sep 30 21:16:25 compute-0 NetworkManager[51733]: <info>  [1759266985.3157] manager: (tap15bad28b-18): new Tun device (/org/freedesktop/NetworkManager/Devices/24)
Sep 30 21:16:25 compute-0 kernel: tap15bad28b-18: entered promiscuous mode
Sep 30 21:16:25 compute-0 ovn_controller[94912]: 2025-09-30T21:16:25Z|00027|binding|INFO|Claiming lport 15bad28b-1806-4dd5-8233-16a23f243c4b for this chassis.
Sep 30 21:16:25 compute-0 nova_compute[192810]: 2025-09-30 21:16:25.320 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:25 compute-0 ovn_controller[94912]: 2025-09-30T21:16:25Z|00028|binding|INFO|15bad28b-1806-4dd5-8233-16a23f243c4b: Claiming fa:16:3e:e7:d1:20 10.1.0.10 fdfe:381f:8400::3e6
Sep 30 21:16:25 compute-0 nova_compute[192810]: 2025-09-30 21:16:25.331 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:25 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:25.348 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e7:d1:20 10.1.0.10 fdfe:381f:8400::3e6'], port_security=['fa:16:3e:e7:d1:20 10.1.0.10 fdfe:381f:8400::3e6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.10/26 fdfe:381f:8400::3e6/64', 'neutron:device_id': '7a2c317e-5f8f-4337-9052-2231a2173f23', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-776029b5-6115-4a10-b114-56157e8d42e8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9ee15beb428a4c5e9726b65c80920da9', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b51fed5f-83ee-4757-b171-916e34c6fdb5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3ac973fb-5d64-4a91-8b3a-7dff1eb497e1, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=15bad28b-1806-4dd5-8233-16a23f243c4b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:16:25 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:25.349 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 15bad28b-1806-4dd5-8233-16a23f243c4b in datapath 776029b5-6115-4a10-b114-56157e8d42e8 bound to our chassis
Sep 30 21:16:25 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:25.351 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 776029b5-6115-4a10-b114-56157e8d42e8
Sep 30 21:16:25 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:25.352 103867 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpnzyj0ked/privsep.sock']
Sep 30 21:16:25 compute-0 podman[220565]: 2025-09-30 21:16:25.359476098 +0000 UTC m=+0.095012017 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:16:25 compute-0 systemd-udevd[220600]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:16:25 compute-0 systemd-machined[152794]: New machine qemu-1-instance-00000003.
Sep 30 21:16:25 compute-0 NetworkManager[51733]: <info>  [1759266985.3904] device (tap15bad28b-18): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:16:25 compute-0 NetworkManager[51733]: <info>  [1759266985.3919] device (tap15bad28b-18): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:16:25 compute-0 systemd[1]: Started Virtual Machine qemu-1-instance-00000003.
Sep 30 21:16:25 compute-0 nova_compute[192810]: 2025-09-30 21:16:25.428 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:25 compute-0 ovn_controller[94912]: 2025-09-30T21:16:25Z|00029|binding|INFO|Setting lport 15bad28b-1806-4dd5-8233-16a23f243c4b ovn-installed in OVS
Sep 30 21:16:25 compute-0 ovn_controller[94912]: 2025-09-30T21:16:25Z|00030|binding|INFO|Setting lport 15bad28b-1806-4dd5-8233-16a23f243c4b up in Southbound
Sep 30 21:16:25 compute-0 nova_compute[192810]: 2025-09-30 21:16:25.437 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:25 compute-0 nova_compute[192810]: 2025-09-30 21:16:25.474 2 DEBUG nova.compute.manager [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp0tc_e70n',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Sep 30 21:16:25 compute-0 nova_compute[192810]: 2025-09-30 21:16:25.513 2 DEBUG oslo_concurrency.lockutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:16:25 compute-0 nova_compute[192810]: 2025-09-30 21:16:25.513 2 DEBUG oslo_concurrency.lockutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:16:25 compute-0 nova_compute[192810]: 2025-09-30 21:16:25.527 2 INFO nova.compute.rpcapi [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66
Sep 30 21:16:25 compute-0 nova_compute[192810]: 2025-09-30 21:16:25.527 2 DEBUG oslo_concurrency.lockutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:16:25 compute-0 nova_compute[192810]: 2025-09-30 21:16:25.873 2 DEBUG nova.compute.manager [req-dd98fb9e-4b36-476a-b7fa-8a1c048c11b5 req-3a6842d9-f55e-4694-8e17-0e3d9c22b0c1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Received event network-vif-plugged-15bad28b-1806-4dd5-8233-16a23f243c4b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:16:25 compute-0 nova_compute[192810]: 2025-09-30 21:16:25.874 2 DEBUG oslo_concurrency.lockutils [req-dd98fb9e-4b36-476a-b7fa-8a1c048c11b5 req-3a6842d9-f55e-4694-8e17-0e3d9c22b0c1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7a2c317e-5f8f-4337-9052-2231a2173f23-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:25 compute-0 nova_compute[192810]: 2025-09-30 21:16:25.874 2 DEBUG oslo_concurrency.lockutils [req-dd98fb9e-4b36-476a-b7fa-8a1c048c11b5 req-3a6842d9-f55e-4694-8e17-0e3d9c22b0c1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7a2c317e-5f8f-4337-9052-2231a2173f23-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:25 compute-0 nova_compute[192810]: 2025-09-30 21:16:25.875 2 DEBUG oslo_concurrency.lockutils [req-dd98fb9e-4b36-476a-b7fa-8a1c048c11b5 req-3a6842d9-f55e-4694-8e17-0e3d9c22b0c1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7a2c317e-5f8f-4337-9052-2231a2173f23-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:25 compute-0 nova_compute[192810]: 2025-09-30 21:16:25.875 2 DEBUG nova.compute.manager [req-dd98fb9e-4b36-476a-b7fa-8a1c048c11b5 req-3a6842d9-f55e-4694-8e17-0e3d9c22b0c1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Processing event network-vif-plugged-15bad28b-1806-4dd5-8233-16a23f243c4b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:16:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:26.075 103867 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Sep 30 21:16:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:26.075 103867 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpnzyj0ked/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Sep 30 21:16:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:25.918 220624 INFO oslo.privsep.daemon [-] privsep daemon starting
Sep 30 21:16:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:25.927 220624 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Sep 30 21:16:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:25.931 220624 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Sep 30 21:16:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:25.931 220624 INFO oslo.privsep.daemon [-] privsep daemon running as pid 220624
Sep 30 21:16:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:26.078 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[d8bde9a6-2734-4949-9872-61b5ca35eab8]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:26 compute-0 nova_compute[192810]: 2025-09-30 21:16:26.240 2 DEBUG nova.compute.manager [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:16:26 compute-0 nova_compute[192810]: 2025-09-30 21:16:26.241 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759266986.2403915, 7a2c317e-5f8f-4337-9052-2231a2173f23 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:16:26 compute-0 nova_compute[192810]: 2025-09-30 21:16:26.241 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] VM Started (Lifecycle Event)
Sep 30 21:16:26 compute-0 nova_compute[192810]: 2025-09-30 21:16:26.263 2 DEBUG nova.virt.libvirt.driver [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:16:26 compute-0 nova_compute[192810]: 2025-09-30 21:16:26.266 2 INFO nova.virt.libvirt.driver [-] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Instance spawned successfully.
Sep 30 21:16:26 compute-0 nova_compute[192810]: 2025-09-30 21:16:26.266 2 DEBUG nova.virt.libvirt.driver [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:16:26 compute-0 nova_compute[192810]: 2025-09-30 21:16:26.269 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:16:26 compute-0 nova_compute[192810]: 2025-09-30 21:16:26.272 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:16:26 compute-0 nova_compute[192810]: 2025-09-30 21:16:26.289 2 DEBUG nova.virt.libvirt.driver [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:16:26 compute-0 nova_compute[192810]: 2025-09-30 21:16:26.289 2 DEBUG nova.virt.libvirt.driver [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:16:26 compute-0 nova_compute[192810]: 2025-09-30 21:16:26.290 2 DEBUG nova.virt.libvirt.driver [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:16:26 compute-0 nova_compute[192810]: 2025-09-30 21:16:26.290 2 DEBUG nova.virt.libvirt.driver [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:16:26 compute-0 nova_compute[192810]: 2025-09-30 21:16:26.290 2 DEBUG nova.virt.libvirt.driver [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:16:26 compute-0 nova_compute[192810]: 2025-09-30 21:16:26.291 2 DEBUG nova.virt.libvirt.driver [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:16:26 compute-0 nova_compute[192810]: 2025-09-30 21:16:26.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:26 compute-0 nova_compute[192810]: 2025-09-30 21:16:26.294 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:16:26 compute-0 nova_compute[192810]: 2025-09-30 21:16:26.294 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759266986.2433193, 7a2c317e-5f8f-4337-9052-2231a2173f23 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:16:26 compute-0 nova_compute[192810]: 2025-09-30 21:16:26.295 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] VM Paused (Lifecycle Event)
Sep 30 21:16:26 compute-0 nova_compute[192810]: 2025-09-30 21:16:26.348 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:16:26 compute-0 nova_compute[192810]: 2025-09-30 21:16:26.352 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759266986.245473, 7a2c317e-5f8f-4337-9052-2231a2173f23 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:16:26 compute-0 nova_compute[192810]: 2025-09-30 21:16:26.353 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] VM Resumed (Lifecycle Event)
Sep 30 21:16:26 compute-0 nova_compute[192810]: 2025-09-30 21:16:26.376 2 INFO nova.compute.manager [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Took 38.06 seconds to spawn the instance on the hypervisor.
Sep 30 21:16:26 compute-0 nova_compute[192810]: 2025-09-30 21:16:26.378 2 DEBUG nova.compute.manager [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:16:26 compute-0 nova_compute[192810]: 2025-09-30 21:16:26.381 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:16:26 compute-0 nova_compute[192810]: 2025-09-30 21:16:26.391 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:16:26 compute-0 nova_compute[192810]: 2025-09-30 21:16:26.443 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:16:26 compute-0 nova_compute[192810]: 2025-09-30 21:16:26.496 2 DEBUG nova.network.neutron [req-181e812b-4522-47cc-94a7-f46e072a2761 req-fcc773b7-f1c2-465d-b55d-13fbef3922d9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Updated VIF entry in instance network info cache for port 15bad28b-1806-4dd5-8233-16a23f243c4b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:16:26 compute-0 nova_compute[192810]: 2025-09-30 21:16:26.497 2 DEBUG nova.network.neutron [req-181e812b-4522-47cc-94a7-f46e072a2761 req-fcc773b7-f1c2-465d-b55d-13fbef3922d9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Updating instance_info_cache with network_info: [{"id": "15bad28b-1806-4dd5-8233-16a23f243c4b", "address": "fa:16:3e:e7:d1:20", "network": {"id": "776029b5-6115-4a10-b114-56157e8d42e8", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::3e6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ee15beb428a4c5e9726b65c80920da9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15bad28b-18", "ovs_interfaceid": "15bad28b-1806-4dd5-8233-16a23f243c4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:16:26 compute-0 nova_compute[192810]: 2025-09-30 21:16:26.559 2 DEBUG oslo_concurrency.lockutils [req-181e812b-4522-47cc-94a7-f46e072a2761 req-fcc773b7-f1c2-465d-b55d-13fbef3922d9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-7a2c317e-5f8f-4337-9052-2231a2173f23" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:16:26 compute-0 nova_compute[192810]: 2025-09-30 21:16:26.561 2 INFO nova.compute.manager [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Took 38.93 seconds to build instance.
Sep 30 21:16:26 compute-0 nova_compute[192810]: 2025-09-30 21:16:26.584 2 DEBUG oslo_concurrency.lockutils [None req-65a06e51-4f14-4038-b87f-160c570ae63e b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Lock "7a2c317e-5f8f-4337-9052-2231a2173f23" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 39.039s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:26.664 220624 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:26.665 220624 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:26.665 220624 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:27 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:27.278 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[265606a4-3c55-422c-ac24-99c41c1e62bd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:27 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:27.279 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap776029b5-61 in ovnmeta-776029b5-6115-4a10-b114-56157e8d42e8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:16:27 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:27.281 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap776029b5-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:16:27 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:27.281 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[fe11aaac-3a85-40e6-820d-4d9561f313a8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:27 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:27.284 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[c5e4b8cb-a294-40ec-af38-021cdbfb8abb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:27 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:27.316 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[d6e75d11-ef75-4038-8b47-a1d61531e15c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:27 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:27.334 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[7964c65d-afc7-415d-b5c1-b1c49af238a7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:27 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:27.336 103867 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpakdmp_9q/privsep.sock']
Sep 30 21:16:27 compute-0 nova_compute[192810]: 2025-09-30 21:16:27.918 2 DEBUG nova.compute.manager [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp0tc_e70n',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='252d5457-8837-4aa6-b309-c3139e8db7ed',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Sep 30 21:16:27 compute-0 nova_compute[192810]: 2025-09-30 21:16:27.952 2 DEBUG oslo_concurrency.lockutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Acquiring lock "refresh_cache-252d5457-8837-4aa6-b309-c3139e8db7ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:16:27 compute-0 nova_compute[192810]: 2025-09-30 21:16:27.953 2 DEBUG oslo_concurrency.lockutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Acquired lock "refresh_cache-252d5457-8837-4aa6-b309-c3139e8db7ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:16:27 compute-0 nova_compute[192810]: 2025-09-30 21:16:27.954 2 DEBUG nova.network.neutron [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:16:28 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:28.035 103867 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Sep 30 21:16:28 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:28.037 103867 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpakdmp_9q/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Sep 30 21:16:28 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:27.912 220638 INFO oslo.privsep.daemon [-] privsep daemon starting
Sep 30 21:16:28 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:27.920 220638 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Sep 30 21:16:28 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:27.925 220638 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Sep 30 21:16:28 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:27.925 220638 INFO oslo.privsep.daemon [-] privsep daemon running as pid 220638
Sep 30 21:16:28 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:28.042 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[8f31f29b-c291-4fb5-98ca-5d7917edeffb]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:28 compute-0 nova_compute[192810]: 2025-09-30 21:16:28.120 2 DEBUG nova.compute.manager [req-58e611f3-ab9a-497d-b31d-82db9fd0e189 req-999650c9-91b4-40c8-9b9e-fb8ae2c7ff7b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Received event network-vif-plugged-15bad28b-1806-4dd5-8233-16a23f243c4b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:16:28 compute-0 nova_compute[192810]: 2025-09-30 21:16:28.121 2 DEBUG oslo_concurrency.lockutils [req-58e611f3-ab9a-497d-b31d-82db9fd0e189 req-999650c9-91b4-40c8-9b9e-fb8ae2c7ff7b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7a2c317e-5f8f-4337-9052-2231a2173f23-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:28 compute-0 nova_compute[192810]: 2025-09-30 21:16:28.122 2 DEBUG oslo_concurrency.lockutils [req-58e611f3-ab9a-497d-b31d-82db9fd0e189 req-999650c9-91b4-40c8-9b9e-fb8ae2c7ff7b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7a2c317e-5f8f-4337-9052-2231a2173f23-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:28 compute-0 nova_compute[192810]: 2025-09-30 21:16:28.122 2 DEBUG oslo_concurrency.lockutils [req-58e611f3-ab9a-497d-b31d-82db9fd0e189 req-999650c9-91b4-40c8-9b9e-fb8ae2c7ff7b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7a2c317e-5f8f-4337-9052-2231a2173f23-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:28 compute-0 nova_compute[192810]: 2025-09-30 21:16:28.122 2 DEBUG nova.compute.manager [req-58e611f3-ab9a-497d-b31d-82db9fd0e189 req-999650c9-91b4-40c8-9b9e-fb8ae2c7ff7b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] No waiting events found dispatching network-vif-plugged-15bad28b-1806-4dd5-8233-16a23f243c4b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:16:28 compute-0 nova_compute[192810]: 2025-09-30 21:16:28.123 2 WARNING nova.compute.manager [req-58e611f3-ab9a-497d-b31d-82db9fd0e189 req-999650c9-91b4-40c8-9b9e-fb8ae2c7ff7b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Received unexpected event network-vif-plugged-15bad28b-1806-4dd5-8233-16a23f243c4b for instance with vm_state active and task_state None.
Sep 30 21:16:28 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:28.491 220638 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:28 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:28.491 220638 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:28 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:28.491 220638 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:29.045 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[3e6fa03a-27e0-41cf-9c4b-1d8c91ab86b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:29.051 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[f05c09a8-e839-42b6-92f0-69c30bf1c695]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:29 compute-0 NetworkManager[51733]: <info>  [1759266989.0549] manager: (tap776029b5-60): new Veth device (/org/freedesktop/NetworkManager/Devices/25)
Sep 30 21:16:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:29.083 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[66b0ea07-1915-4bf3-a28d-11ad8cad8362]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:29.085 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[bd412058-53a8-4e5b-aa52-680638107a6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:29 compute-0 systemd-udevd[220648]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:16:29 compute-0 NetworkManager[51733]: <info>  [1759266989.1153] device (tap776029b5-60): carrier: link connected
Sep 30 21:16:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:29.122 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[03da7d53-9ff6-4955-8cc8-d7e2eb95de65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:29.155 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[d965cda5-6689-4283-85e9-2553ba49c49f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap776029b5-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e3:b6:bc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 368473, 'reachable_time': 44966, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220667, 'error': None, 'target': 'ovnmeta-776029b5-6115-4a10-b114-56157e8d42e8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:29.182 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a56972f2-70e8-4e2f-9eab-bd18029826fd]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee3:b6bc'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 368473, 'tstamp': 368473}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220668, 'error': None, 'target': 'ovnmeta-776029b5-6115-4a10-b114-56157e8d42e8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:29.204 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[8bba416a-cd68-465c-9ddd-942cb550b817]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap776029b5-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e3:b6:bc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 368473, 'reachable_time': 44966, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220669, 'error': None, 'target': 'ovnmeta-776029b5-6115-4a10-b114-56157e8d42e8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:29.243 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[086a52a2-1dd9-4b92-ad0d-f97b9d53a798]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:29.308 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[dc1ec530-0aef-4949-90f9-9e43f6b1365a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:29.310 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap776029b5-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:16:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:29.311 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:16:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:29.311 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap776029b5-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:16:29 compute-0 kernel: tap776029b5-60: entered promiscuous mode
Sep 30 21:16:29 compute-0 NetworkManager[51733]: <info>  [1759266989.3142] manager: (tap776029b5-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/26)
Sep 30 21:16:29 compute-0 nova_compute[192810]: 2025-09-30 21:16:29.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:29 compute-0 nova_compute[192810]: 2025-09-30 21:16:29.318 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:29.319 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap776029b5-60, col_values=(('external_ids', {'iface-id': '8241e4b2-4ddf-4f99-a3a9-12824f39f1bf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:16:29 compute-0 nova_compute[192810]: 2025-09-30 21:16:29.320 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:29 compute-0 ovn_controller[94912]: 2025-09-30T21:16:29Z|00031|binding|INFO|Releasing lport 8241e4b2-4ddf-4f99-a3a9-12824f39f1bf from this chassis (sb_readonly=0)
Sep 30 21:16:29 compute-0 nova_compute[192810]: 2025-09-30 21:16:29.339 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:29.341 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/776029b5-6115-4a10-b114-56157e8d42e8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/776029b5-6115-4a10-b114-56157e8d42e8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:16:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:29.342 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[996915b3-e116-45d9-9baf-14dcd04d19a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:29.345 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:16:29 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:16:29 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:16:29 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-776029b5-6115-4a10-b114-56157e8d42e8
Sep 30 21:16:29 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:16:29 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:16:29 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:16:29 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/776029b5-6115-4a10-b114-56157e8d42e8.pid.haproxy
Sep 30 21:16:29 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:16:29 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:16:29 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:16:29 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:16:29 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:16:29 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:16:29 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:16:29 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:16:29 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:16:29 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:16:29 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:16:29 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:16:29 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:16:29 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:16:29 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:16:29 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:16:29 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:16:29 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:16:29 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:16:29 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:16:29 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID 776029b5-6115-4a10-b114-56157e8d42e8
Sep 30 21:16:29 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:16:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:29.347 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-776029b5-6115-4a10-b114-56157e8d42e8', 'env', 'PROCESS_TAG=haproxy-776029b5-6115-4a10-b114-56157e8d42e8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/776029b5-6115-4a10-b114-56157e8d42e8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:16:29 compute-0 nova_compute[192810]: 2025-09-30 21:16:29.630 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:29 compute-0 nova_compute[192810]: 2025-09-30 21:16:29.721 2 DEBUG nova.network.neutron [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Updating instance_info_cache with network_info: [{"id": "70b5da71-314a-4c92-9db2-fb08b57a6736", "address": "fa:16:3e:a9:31:8d", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70b5da71-31", "ovs_interfaceid": "70b5da71-314a-4c92-9db2-fb08b57a6736", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:16:29 compute-0 nova_compute[192810]: 2025-09-30 21:16:29.744 2 DEBUG oslo_concurrency.lockutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Releasing lock "refresh_cache-252d5457-8837-4aa6-b309-c3139e8db7ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:16:29 compute-0 podman[220701]: 2025-09-30 21:16:29.7524934 +0000 UTC m=+0.063104546 container create 5a137a628f18b06e88f491a2dea9786cf587663558dbbd0c687b55de1a6d866d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-776029b5-6115-4a10-b114-56157e8d42e8, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Sep 30 21:16:29 compute-0 nova_compute[192810]: 2025-09-30 21:16:29.760 2 DEBUG nova.virt.libvirt.driver [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp0tc_e70n',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='252d5457-8837-4aa6-b309-c3139e8db7ed',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Sep 30 21:16:29 compute-0 nova_compute[192810]: 2025-09-30 21:16:29.761 2 DEBUG nova.virt.libvirt.driver [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Creating instance directory: /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Sep 30 21:16:29 compute-0 nova_compute[192810]: 2025-09-30 21:16:29.761 2 DEBUG nova.virt.libvirt.driver [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Creating disk.info with the contents: {'/var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk': 'qcow2', '/var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Sep 30 21:16:29 compute-0 nova_compute[192810]: 2025-09-30 21:16:29.762 2 DEBUG nova.virt.libvirt.driver [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Sep 30 21:16:29 compute-0 nova_compute[192810]: 2025-09-30 21:16:29.762 2 DEBUG nova.objects.instance [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Lazy-loading 'trusted_certs' on Instance uuid 252d5457-8837-4aa6-b309-c3139e8db7ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:16:29 compute-0 nova_compute[192810]: 2025-09-30 21:16:29.796 2 DEBUG oslo_concurrency.processutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:16:29 compute-0 systemd[1]: Started libpod-conmon-5a137a628f18b06e88f491a2dea9786cf587663558dbbd0c687b55de1a6d866d.scope.
Sep 30 21:16:29 compute-0 podman[220701]: 2025-09-30 21:16:29.711629483 +0000 UTC m=+0.022240619 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:16:29 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:16:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a16715c916f2555ac80671ac8a0ad1f5df414838c61f3039fe04317094315acd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:16:29 compute-0 podman[220701]: 2025-09-30 21:16:29.853939177 +0000 UTC m=+0.164550403 container init 5a137a628f18b06e88f491a2dea9786cf587663558dbbd0c687b55de1a6d866d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-776029b5-6115-4a10-b114-56157e8d42e8, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:16:29 compute-0 podman[220701]: 2025-09-30 21:16:29.861005914 +0000 UTC m=+0.171617090 container start 5a137a628f18b06e88f491a2dea9786cf587663558dbbd0c687b55de1a6d866d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-776029b5-6115-4a10-b114-56157e8d42e8, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923)
Sep 30 21:16:29 compute-0 nova_compute[192810]: 2025-09-30 21:16:29.863 2 DEBUG oslo_concurrency.processutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:16:29 compute-0 nova_compute[192810]: 2025-09-30 21:16:29.865 2 DEBUG oslo_concurrency.lockutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:29 compute-0 nova_compute[192810]: 2025-09-30 21:16:29.866 2 DEBUG oslo_concurrency.lockutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:29 compute-0 nova_compute[192810]: 2025-09-30 21:16:29.882 2 DEBUG oslo_concurrency.processutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:16:29 compute-0 neutron-haproxy-ovnmeta-776029b5-6115-4a10-b114-56157e8d42e8[220715]: [NOTICE]   (220722) : New worker (220724) forked
Sep 30 21:16:29 compute-0 neutron-haproxy-ovnmeta-776029b5-6115-4a10-b114-56157e8d42e8[220715]: [NOTICE]   (220722) : Loading success.
Sep 30 21:16:29 compute-0 nova_compute[192810]: 2025-09-30 21:16:29.954 2 DEBUG oslo_concurrency.processutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:16:29 compute-0 nova_compute[192810]: 2025-09-30 21:16:29.956 2 DEBUG oslo_concurrency.processutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:16:29 compute-0 nova_compute[192810]: 2025-09-30 21:16:29.994 2 DEBUG oslo_concurrency.processutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:16:29 compute-0 nova_compute[192810]: 2025-09-30 21:16:29.995 2 DEBUG oslo_concurrency.lockutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:29 compute-0 nova_compute[192810]: 2025-09-30 21:16:29.996 2 DEBUG oslo_concurrency.processutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:16:30 compute-0 nova_compute[192810]: 2025-09-30 21:16:30.084 2 DEBUG oslo_concurrency.processutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:16:30 compute-0 nova_compute[192810]: 2025-09-30 21:16:30.085 2 DEBUG nova.virt.disk.api [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Checking if we can resize image /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:16:30 compute-0 nova_compute[192810]: 2025-09-30 21:16:30.085 2 DEBUG oslo_concurrency.processutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:16:30 compute-0 nova_compute[192810]: 2025-09-30 21:16:30.156 2 DEBUG oslo_concurrency.processutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:16:30 compute-0 nova_compute[192810]: 2025-09-30 21:16:30.158 2 DEBUG nova.virt.disk.api [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Cannot resize image /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:16:30 compute-0 nova_compute[192810]: 2025-09-30 21:16:30.158 2 DEBUG nova.objects.instance [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Lazy-loading 'migration_context' on Instance uuid 252d5457-8837-4aa6-b309-c3139e8db7ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:16:30 compute-0 nova_compute[192810]: 2025-09-30 21:16:30.179 2 DEBUG oslo_concurrency.processutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:16:30 compute-0 nova_compute[192810]: 2025-09-30 21:16:30.208 2 DEBUG oslo_concurrency.processutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk.config 485376" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:16:30 compute-0 nova_compute[192810]: 2025-09-30 21:16:30.210 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk.config to /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Sep 30 21:16:30 compute-0 nova_compute[192810]: 2025-09-30 21:16:30.210 2 DEBUG oslo_concurrency.processutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk.config /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:16:30 compute-0 nova_compute[192810]: 2025-09-30 21:16:30.711 2 DEBUG oslo_concurrency.processutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk.config /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed" returned: 0 in 0.500s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:16:30 compute-0 nova_compute[192810]: 2025-09-30 21:16:30.712 2 DEBUG nova.virt.libvirt.driver [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Sep 30 21:16:30 compute-0 nova_compute[192810]: 2025-09-30 21:16:30.713 2 DEBUG nova.virt.libvirt.vif [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:16:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1363935032',display_name='tempest-LiveMigrationTest-server-1363935032',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1363935032',id=5,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:16:20Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='96460712956e4f038121397afa979163',ramdisk_id='',reservation_id='r-0x92j8qn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-2029274765',owner_user_name='tempest-LiveMigrationTest-2029274765-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:16:20Z,user_data=None,user_id='4b263d7c3e3141f999e8eabf49e8190c',uuid=252d5457-8837-4aa6-b309-c3139e8db7ed,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "70b5da71-314a-4c92-9db2-fb08b57a6736", "address": "fa:16:3e:a9:31:8d", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap70b5da71-31", "ovs_interfaceid": "70b5da71-314a-4c92-9db2-fb08b57a6736", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:16:30 compute-0 nova_compute[192810]: 2025-09-30 21:16:30.713 2 DEBUG nova.network.os_vif_util [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Converting VIF {"id": "70b5da71-314a-4c92-9db2-fb08b57a6736", "address": "fa:16:3e:a9:31:8d", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap70b5da71-31", "ovs_interfaceid": "70b5da71-314a-4c92-9db2-fb08b57a6736", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:16:30 compute-0 nova_compute[192810]: 2025-09-30 21:16:30.714 2 DEBUG nova.network.os_vif_util [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a9:31:8d,bridge_name='br-int',has_traffic_filtering=True,id=70b5da71-314a-4c92-9db2-fb08b57a6736,network=Network(16d40025-1087-460f-a42f-c007f6eff406),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70b5da71-31') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:16:30 compute-0 nova_compute[192810]: 2025-09-30 21:16:30.715 2 DEBUG os_vif [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a9:31:8d,bridge_name='br-int',has_traffic_filtering=True,id=70b5da71-314a-4c92-9db2-fb08b57a6736,network=Network(16d40025-1087-460f-a42f-c007f6eff406),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70b5da71-31') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:16:30 compute-0 nova_compute[192810]: 2025-09-30 21:16:30.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:30 compute-0 nova_compute[192810]: 2025-09-30 21:16:30.716 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:16:30 compute-0 nova_compute[192810]: 2025-09-30 21:16:30.716 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:16:30 compute-0 nova_compute[192810]: 2025-09-30 21:16:30.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:30 compute-0 nova_compute[192810]: 2025-09-30 21:16:30.720 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap70b5da71-31, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:16:30 compute-0 nova_compute[192810]: 2025-09-30 21:16:30.721 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap70b5da71-31, col_values=(('external_ids', {'iface-id': '70b5da71-314a-4c92-9db2-fb08b57a6736', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a9:31:8d', 'vm-uuid': '252d5457-8837-4aa6-b309-c3139e8db7ed'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:16:30 compute-0 nova_compute[192810]: 2025-09-30 21:16:30.734 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:30 compute-0 NetworkManager[51733]: <info>  [1759266990.7355] manager: (tap70b5da71-31): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/27)
Sep 30 21:16:30 compute-0 nova_compute[192810]: 2025-09-30 21:16:30.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:16:30 compute-0 nova_compute[192810]: 2025-09-30 21:16:30.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:30 compute-0 nova_compute[192810]: 2025-09-30 21:16:30.745 2 INFO os_vif [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a9:31:8d,bridge_name='br-int',has_traffic_filtering=True,id=70b5da71-314a-4c92-9db2-fb08b57a6736,network=Network(16d40025-1087-460f-a42f-c007f6eff406),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70b5da71-31')
Sep 30 21:16:30 compute-0 nova_compute[192810]: 2025-09-30 21:16:30.746 2 DEBUG nova.virt.libvirt.driver [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Sep 30 21:16:30 compute-0 nova_compute[192810]: 2025-09-30 21:16:30.746 2 DEBUG nova.compute.manager [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp0tc_e70n',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='252d5457-8837-4aa6-b309-c3139e8db7ed',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Sep 30 21:16:31 compute-0 nova_compute[192810]: 2025-09-30 21:16:31.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:32 compute-0 podman[220752]: 2025-09-30 21:16:32.353537091 +0000 UTC m=+0.076224485 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:16:32 compute-0 podman[220753]: 2025-09-30 21:16:32.359955542 +0000 UTC m=+0.083524989 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., vcs-type=git, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, release=1755695350, io.openshift.expose-services=, config_id=edpm, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 21:16:32 compute-0 nova_compute[192810]: 2025-09-30 21:16:32.716 2 DEBUG nova.network.neutron [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Port 70b5da71-314a-4c92-9db2-fb08b57a6736 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Sep 30 21:16:32 compute-0 nova_compute[192810]: 2025-09-30 21:16:32.728 2 DEBUG nova.compute.manager [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp0tc_e70n',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='252d5457-8837-4aa6-b309-c3139e8db7ed',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Sep 30 21:16:32 compute-0 nova_compute[192810]: 2025-09-30 21:16:32.786 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:16:32 compute-0 nova_compute[192810]: 2025-09-30 21:16:32.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:16:32 compute-0 systemd[1]: Starting libvirt proxy daemon...
Sep 30 21:16:32 compute-0 systemd[1]: Started libvirt proxy daemon.
Sep 30 21:16:33 compute-0 kernel: tap70b5da71-31: entered promiscuous mode
Sep 30 21:16:33 compute-0 NetworkManager[51733]: <info>  [1759266993.1383] manager: (tap70b5da71-31): new Tun device (/org/freedesktop/NetworkManager/Devices/28)
Sep 30 21:16:33 compute-0 ovn_controller[94912]: 2025-09-30T21:16:33Z|00032|binding|INFO|Claiming lport 70b5da71-314a-4c92-9db2-fb08b57a6736 for this additional chassis.
Sep 30 21:16:33 compute-0 ovn_controller[94912]: 2025-09-30T21:16:33Z|00033|binding|INFO|70b5da71-314a-4c92-9db2-fb08b57a6736: Claiming fa:16:3e:a9:31:8d 10.100.0.7
Sep 30 21:16:33 compute-0 nova_compute[192810]: 2025-09-30 21:16:33.141 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:33 compute-0 systemd-udevd[220826]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:16:33 compute-0 NetworkManager[51733]: <info>  [1759266993.1870] device (tap70b5da71-31): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:16:33 compute-0 NetworkManager[51733]: <info>  [1759266993.1880] device (tap70b5da71-31): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:16:33 compute-0 systemd-machined[152794]: New machine qemu-2-instance-00000005.
Sep 30 21:16:33 compute-0 ovn_controller[94912]: 2025-09-30T21:16:33Z|00034|binding|INFO|Releasing lport 8241e4b2-4ddf-4f99-a3a9-12824f39f1bf from this chassis (sb_readonly=0)
Sep 30 21:16:33 compute-0 nova_compute[192810]: 2025-09-30 21:16:33.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:33 compute-0 systemd[1]: Started Virtual Machine qemu-2-instance-00000005.
Sep 30 21:16:33 compute-0 ovn_controller[94912]: 2025-09-30T21:16:33Z|00035|binding|INFO|Setting lport 70b5da71-314a-4c92-9db2-fb08b57a6736 ovn-installed in OVS
Sep 30 21:16:33 compute-0 nova_compute[192810]: 2025-09-30 21:16:33.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:33 compute-0 nova_compute[192810]: 2025-09-30 21:16:33.783 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:16:33 compute-0 nova_compute[192810]: 2025-09-30 21:16:33.786 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:16:33 compute-0 nova_compute[192810]: 2025-09-30 21:16:33.786 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:16:33 compute-0 nova_compute[192810]: 2025-09-30 21:16:33.786 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:16:33 compute-0 nova_compute[192810]: 2025-09-30 21:16:33.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:16:33 compute-0 nova_compute[192810]: 2025-09-30 21:16:33.809 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:33 compute-0 nova_compute[192810]: 2025-09-30 21:16:33.809 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:33 compute-0 nova_compute[192810]: 2025-09-30 21:16:33.809 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:33 compute-0 nova_compute[192810]: 2025-09-30 21:16:33.810 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:16:33 compute-0 nova_compute[192810]: 2025-09-30 21:16:33.902 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:16:33 compute-0 nova_compute[192810]: 2025-09-30 21:16:33.966 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:16:33 compute-0 nova_compute[192810]: 2025-09-30 21:16:33.967 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:16:34 compute-0 nova_compute[192810]: 2025-09-30 21:16:34.031 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:16:34 compute-0 nova_compute[192810]: 2025-09-30 21:16:34.041 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a2c317e-5f8f-4337-9052-2231a2173f23/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:16:34 compute-0 nova_compute[192810]: 2025-09-30 21:16:34.102 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a2c317e-5f8f-4337-9052-2231a2173f23/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:16:34 compute-0 nova_compute[192810]: 2025-09-30 21:16:34.103 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a2c317e-5f8f-4337-9052-2231a2173f23/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:16:34 compute-0 nova_compute[192810]: 2025-09-30 21:16:34.174 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a2c317e-5f8f-4337-9052-2231a2173f23/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:16:34 compute-0 nova_compute[192810]: 2025-09-30 21:16:34.337 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:16:34 compute-0 nova_compute[192810]: 2025-09-30 21:16:34.339 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5609MB free_disk=73.46698379516602GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:16:34 compute-0 nova_compute[192810]: 2025-09-30 21:16:34.339 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:34 compute-0 nova_compute[192810]: 2025-09-30 21:16:34.339 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:34 compute-0 nova_compute[192810]: 2025-09-30 21:16:34.424 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Migration for instance 252d5457-8837-4aa6-b309-c3139e8db7ed refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Sep 30 21:16:34 compute-0 nova_compute[192810]: 2025-09-30 21:16:34.461 2 DEBUG oslo_concurrency.lockutils [None req-ca105e64-757a-4520-8ea3-ed7e9952f8f9 b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Acquiring lock "7a2c317e-5f8f-4337-9052-2231a2173f23" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:34 compute-0 nova_compute[192810]: 2025-09-30 21:16:34.461 2 DEBUG oslo_concurrency.lockutils [None req-ca105e64-757a-4520-8ea3-ed7e9952f8f9 b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Lock "7a2c317e-5f8f-4337-9052-2231a2173f23" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:34 compute-0 nova_compute[192810]: 2025-09-30 21:16:34.461 2 DEBUG oslo_concurrency.lockutils [None req-ca105e64-757a-4520-8ea3-ed7e9952f8f9 b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Acquiring lock "7a2c317e-5f8f-4337-9052-2231a2173f23-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:34 compute-0 nova_compute[192810]: 2025-09-30 21:16:34.461 2 DEBUG oslo_concurrency.lockutils [None req-ca105e64-757a-4520-8ea3-ed7e9952f8f9 b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Lock "7a2c317e-5f8f-4337-9052-2231a2173f23-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:34 compute-0 nova_compute[192810]: 2025-09-30 21:16:34.462 2 DEBUG oslo_concurrency.lockutils [None req-ca105e64-757a-4520-8ea3-ed7e9952f8f9 b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Lock "7a2c317e-5f8f-4337-9052-2231a2173f23-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:34 compute-0 nova_compute[192810]: 2025-09-30 21:16:34.464 2 INFO nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Updating resource usage from migration f4a68d88-31d7-4bd6-8c59-8f3d78cae759
Sep 30 21:16:34 compute-0 nova_compute[192810]: 2025-09-30 21:16:34.464 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Starting to track incoming migration f4a68d88-31d7-4bd6-8c59-8f3d78cae759 with flavor afe5c12d-500a-499b-9438-9e9c37698acc _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Sep 30 21:16:34 compute-0 nova_compute[192810]: 2025-09-30 21:16:34.473 2 INFO nova.compute.manager [None req-ca105e64-757a-4520-8ea3-ed7e9952f8f9 b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Terminating instance
Sep 30 21:16:34 compute-0 nova_compute[192810]: 2025-09-30 21:16:34.483 2 DEBUG nova.compute.manager [None req-ca105e64-757a-4520-8ea3-ed7e9952f8f9 b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:16:34 compute-0 kernel: tap15bad28b-18 (unregistering): left promiscuous mode
Sep 30 21:16:34 compute-0 NetworkManager[51733]: <info>  [1759266994.5127] device (tap15bad28b-18): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:16:34 compute-0 nova_compute[192810]: 2025-09-30 21:16:34.529 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Instance 7a2c317e-5f8f-4337-9052-2231a2173f23 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:16:34 compute-0 nova_compute[192810]: 2025-09-30 21:16:34.555 2 WARNING nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Instance 252d5457-8837-4aa6-b309-c3139e8db7ed has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}.
Sep 30 21:16:34 compute-0 nova_compute[192810]: 2025-09-30 21:16:34.555 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:16:34 compute-0 nova_compute[192810]: 2025-09-30 21:16:34.556 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:16:34 compute-0 nova_compute[192810]: 2025-09-30 21:16:34.590 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:34 compute-0 ovn_controller[94912]: 2025-09-30T21:16:34Z|00036|binding|INFO|Releasing lport 15bad28b-1806-4dd5-8233-16a23f243c4b from this chassis (sb_readonly=0)
Sep 30 21:16:34 compute-0 ovn_controller[94912]: 2025-09-30T21:16:34Z|00037|binding|INFO|Setting lport 15bad28b-1806-4dd5-8233-16a23f243c4b down in Southbound
Sep 30 21:16:34 compute-0 ovn_controller[94912]: 2025-09-30T21:16:34Z|00038|binding|INFO|Removing iface tap15bad28b-18 ovn-installed in OVS
Sep 30 21:16:34 compute-0 nova_compute[192810]: 2025-09-30 21:16:34.603 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:34 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:34.607 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e7:d1:20 10.1.0.10 fdfe:381f:8400::3e6'], port_security=['fa:16:3e:e7:d1:20 10.1.0.10 fdfe:381f:8400::3e6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.10/26 fdfe:381f:8400::3e6/64', 'neutron:device_id': '7a2c317e-5f8f-4337-9052-2231a2173f23', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-776029b5-6115-4a10-b114-56157e8d42e8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9ee15beb428a4c5e9726b65c80920da9', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b51fed5f-83ee-4757-b171-916e34c6fdb5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3ac973fb-5d64-4a91-8b3a-7dff1eb497e1, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=15bad28b-1806-4dd5-8233-16a23f243c4b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:16:34 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:34.608 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 15bad28b-1806-4dd5-8233-16a23f243c4b in datapath 776029b5-6115-4a10-b114-56157e8d42e8 unbound from our chassis
Sep 30 21:16:34 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:34.609 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 776029b5-6115-4a10-b114-56157e8d42e8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:16:34 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:34.610 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[c6402551-d5ce-4209-91df-b742c9e25617]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:34 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:34.611 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-776029b5-6115-4a10-b114-56157e8d42e8 namespace which is not needed anymore
Sep 30 21:16:34 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000003.scope: Deactivated successfully.
Sep 30 21:16:34 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000003.scope: Consumed 8.980s CPU time.
Sep 30 21:16:34 compute-0 systemd-machined[152794]: Machine qemu-1-instance-00000003 terminated.
Sep 30 21:16:34 compute-0 NetworkManager[51733]: <info>  [1759266994.7054] manager: (tap15bad28b-18): new Tun device (/org/freedesktop/NetworkManager/Devices/29)
Sep 30 21:16:34 compute-0 nova_compute[192810]: 2025-09-30 21:16:34.761 2 INFO nova.virt.libvirt.driver [-] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Instance destroyed successfully.
Sep 30 21:16:34 compute-0 nova_compute[192810]: 2025-09-30 21:16:34.762 2 DEBUG nova.objects.instance [None req-ca105e64-757a-4520-8ea3-ed7e9952f8f9 b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Lazy-loading 'resources' on Instance uuid 7a2c317e-5f8f-4337-9052-2231a2173f23 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:16:34 compute-0 neutron-haproxy-ovnmeta-776029b5-6115-4a10-b114-56157e8d42e8[220715]: [NOTICE]   (220722) : haproxy version is 2.8.14-c23fe91
Sep 30 21:16:34 compute-0 neutron-haproxy-ovnmeta-776029b5-6115-4a10-b114-56157e8d42e8[220715]: [NOTICE]   (220722) : path to executable is /usr/sbin/haproxy
Sep 30 21:16:34 compute-0 neutron-haproxy-ovnmeta-776029b5-6115-4a10-b114-56157e8d42e8[220715]: [WARNING]  (220722) : Exiting Master process...
Sep 30 21:16:34 compute-0 neutron-haproxy-ovnmeta-776029b5-6115-4a10-b114-56157e8d42e8[220715]: [ALERT]    (220722) : Current worker (220724) exited with code 143 (Terminated)
Sep 30 21:16:34 compute-0 neutron-haproxy-ovnmeta-776029b5-6115-4a10-b114-56157e8d42e8[220715]: [WARNING]  (220722) : All workers exited. Exiting... (0)
Sep 30 21:16:34 compute-0 systemd[1]: libpod-5a137a628f18b06e88f491a2dea9786cf587663558dbbd0c687b55de1a6d866d.scope: Deactivated successfully.
Sep 30 21:16:34 compute-0 nova_compute[192810]: 2025-09-30 21:16:34.788 2 DEBUG nova.virt.libvirt.vif [None req-ca105e64-757a-4520-8ea3-ed7e9952f8f9 b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:15:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-423040831-2',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-423040831-2',id=3,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=1,launched_at=2025-09-30T21:16:26Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9ee15beb428a4c5e9726b65c80920da9',ramdisk_id='',reservation_id='r-vbxtb26e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_n
ame='tempest-AutoAllocateNetworkTest-649897189',owner_user_name='tempest-AutoAllocateNetworkTest-649897189-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:16:26Z,user_data=None,user_id='b0c6f6dda88549aabf6a4b039406af8e',uuid=7a2c317e-5f8f-4337-9052-2231a2173f23,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "15bad28b-1806-4dd5-8233-16a23f243c4b", "address": "fa:16:3e:e7:d1:20", "network": {"id": "776029b5-6115-4a10-b114-56157e8d42e8", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::3e6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ee15beb428a4c5e9726b65c80920da9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15bad28b-18", "ovs_interfaceid": "15bad28b-1806-4dd5-8233-16a23f243c4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:16:34 compute-0 nova_compute[192810]: 2025-09-30 21:16:34.789 2 DEBUG nova.network.os_vif_util [None req-ca105e64-757a-4520-8ea3-ed7e9952f8f9 b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Converting VIF {"id": "15bad28b-1806-4dd5-8233-16a23f243c4b", "address": "fa:16:3e:e7:d1:20", "network": {"id": "776029b5-6115-4a10-b114-56157e8d42e8", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400::3e6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.0/26", "dns": [], "gateway": {"address": "10.1.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9ee15beb428a4c5e9726b65c80920da9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15bad28b-18", "ovs_interfaceid": "15bad28b-1806-4dd5-8233-16a23f243c4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:16:34 compute-0 nova_compute[192810]: 2025-09-30 21:16:34.790 2 DEBUG nova.network.os_vif_util [None req-ca105e64-757a-4520-8ea3-ed7e9952f8f9 b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e7:d1:20,bridge_name='br-int',has_traffic_filtering=True,id=15bad28b-1806-4dd5-8233-16a23f243c4b,network=Network(776029b5-6115-4a10-b114-56157e8d42e8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15bad28b-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:16:34 compute-0 nova_compute[192810]: 2025-09-30 21:16:34.790 2 DEBUG os_vif [None req-ca105e64-757a-4520-8ea3-ed7e9952f8f9 b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e7:d1:20,bridge_name='br-int',has_traffic_filtering=True,id=15bad28b-1806-4dd5-8233-16a23f243c4b,network=Network(776029b5-6115-4a10-b114-56157e8d42e8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15bad28b-18') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:16:34 compute-0 nova_compute[192810]: 2025-09-30 21:16:34.792 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:34 compute-0 nova_compute[192810]: 2025-09-30 21:16:34.793 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap15bad28b-18, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:16:34 compute-0 podman[220883]: 2025-09-30 21:16:34.794941953 +0000 UTC m=+0.058920201 container died 5a137a628f18b06e88f491a2dea9786cf587663558dbbd0c687b55de1a6d866d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-776029b5-6115-4a10-b114-56157e8d42e8, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 21:16:34 compute-0 nova_compute[192810]: 2025-09-30 21:16:34.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:34 compute-0 nova_compute[192810]: 2025-09-30 21:16:34.796 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:34 compute-0 nova_compute[192810]: 2025-09-30 21:16:34.799 2 INFO os_vif [None req-ca105e64-757a-4520-8ea3-ed7e9952f8f9 b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e7:d1:20,bridge_name='br-int',has_traffic_filtering=True,id=15bad28b-1806-4dd5-8233-16a23f243c4b,network=Network(776029b5-6115-4a10-b114-56157e8d42e8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15bad28b-18')
Sep 30 21:16:34 compute-0 nova_compute[192810]: 2025-09-30 21:16:34.800 2 INFO nova.virt.libvirt.driver [None req-ca105e64-757a-4520-8ea3-ed7e9952f8f9 b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Deleting instance files /var/lib/nova/instances/7a2c317e-5f8f-4337-9052-2231a2173f23_del
Sep 30 21:16:34 compute-0 nova_compute[192810]: 2025-09-30 21:16:34.801 2 INFO nova.virt.libvirt.driver [None req-ca105e64-757a-4520-8ea3-ed7e9952f8f9 b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Deletion of /var/lib/nova/instances/7a2c317e-5f8f-4337-9052-2231a2173f23_del complete
Sep 30 21:16:34 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5a137a628f18b06e88f491a2dea9786cf587663558dbbd0c687b55de1a6d866d-userdata-shm.mount: Deactivated successfully.
Sep 30 21:16:34 compute-0 systemd[1]: var-lib-containers-storage-overlay-a16715c916f2555ac80671ac8a0ad1f5df414838c61f3039fe04317094315acd-merged.mount: Deactivated successfully.
Sep 30 21:16:34 compute-0 podman[220883]: 2025-09-30 21:16:34.83822606 +0000 UTC m=+0.102204298 container cleanup 5a137a628f18b06e88f491a2dea9786cf587663558dbbd0c687b55de1a6d866d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-776029b5-6115-4a10-b114-56157e8d42e8, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20250923)
Sep 30 21:16:34 compute-0 systemd[1]: libpod-conmon-5a137a628f18b06e88f491a2dea9786cf587663558dbbd0c687b55de1a6d866d.scope: Deactivated successfully.
Sep 30 21:16:34 compute-0 nova_compute[192810]: 2025-09-30 21:16:34.874 2 DEBUG nova.virt.libvirt.host [None req-ca105e64-757a-4520-8ea3-ed7e9952f8f9 b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Sep 30 21:16:34 compute-0 nova_compute[192810]: 2025-09-30 21:16:34.874 2 INFO nova.virt.libvirt.host [None req-ca105e64-757a-4520-8ea3-ed7e9952f8f9 b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] UEFI support detected
Sep 30 21:16:34 compute-0 nova_compute[192810]: 2025-09-30 21:16:34.877 2 INFO nova.compute.manager [None req-ca105e64-757a-4520-8ea3-ed7e9952f8f9 b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Took 0.39 seconds to destroy the instance on the hypervisor.
Sep 30 21:16:34 compute-0 nova_compute[192810]: 2025-09-30 21:16:34.877 2 DEBUG oslo.service.loopingcall [None req-ca105e64-757a-4520-8ea3-ed7e9952f8f9 b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:16:34 compute-0 nova_compute[192810]: 2025-09-30 21:16:34.877 2 DEBUG nova.compute.manager [-] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:16:34 compute-0 nova_compute[192810]: 2025-09-30 21:16:34.878 2 DEBUG nova.network.neutron [-] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:16:34 compute-0 podman[220922]: 2025-09-30 21:16:34.915831559 +0000 UTC m=+0.047486414 container remove 5a137a628f18b06e88f491a2dea9786cf587663558dbbd0c687b55de1a6d866d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-776029b5-6115-4a10-b114-56157e8d42e8, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 21:16:34 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:34.924 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[01608092-613a-4e16-be69-47c2ba94eb39]: (4, ('Tue Sep 30 09:16:34 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-776029b5-6115-4a10-b114-56157e8d42e8 (5a137a628f18b06e88f491a2dea9786cf587663558dbbd0c687b55de1a6d866d)\n5a137a628f18b06e88f491a2dea9786cf587663558dbbd0c687b55de1a6d866d\nTue Sep 30 09:16:34 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-776029b5-6115-4a10-b114-56157e8d42e8 (5a137a628f18b06e88f491a2dea9786cf587663558dbbd0c687b55de1a6d866d)\n5a137a628f18b06e88f491a2dea9786cf587663558dbbd0c687b55de1a6d866d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:34 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:34.926 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[c65fb6d4-19f1-40f9-8755-bf5bd96ad8d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:34 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:34.927 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap776029b5-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:16:34 compute-0 kernel: tap776029b5-60: left promiscuous mode
Sep 30 21:16:34 compute-0 nova_compute[192810]: 2025-09-30 21:16:34.929 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:34 compute-0 nova_compute[192810]: 2025-09-30 21:16:34.933 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:34 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:34.936 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[ed792070-199b-4767-88ad-9c5f8371323d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:34 compute-0 nova_compute[192810]: 2025-09-30 21:16:34.963 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Updating inventory in ProviderTree for provider fe423b93-de5a-41f7-97d1-9622ea46af54 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Sep 30 21:16:34 compute-0 nova_compute[192810]: 2025-09-30 21:16:34.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:34 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:34.997 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[2573345e-8e2e-4794-af3b-116d850b730c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:34 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:34.998 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[ad054067-4501-4eb4-86ed-069e1c6da477]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:35.021 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[f92a42fc-cadc-4517-aa8f-3ce4d0e66a03]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 368466, 'reachable_time': 39919, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220937, 'error': None, 'target': 'ovnmeta-776029b5-6115-4a10-b114-56157e8d42e8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:35 compute-0 nova_compute[192810]: 2025-09-30 21:16:35.030 2 ERROR nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [req-bc415e40-80b2-40c8-8372-a92a26102bb8] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID fe423b93-de5a-41f7-97d1-9622ea46af54.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-bc415e40-80b2-40c8-8372-a92a26102bb8"}]}
Sep 30 21:16:35 compute-0 systemd[1]: run-netns-ovnmeta\x2d776029b5\x2d6115\x2d4a10\x2db114\x2d56157e8d42e8.mount: Deactivated successfully.
Sep 30 21:16:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:35.035 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-776029b5-6115-4a10-b114-56157e8d42e8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:16:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:35.036 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[67237bde-fdf8-48b1-a2a8-84b0a4581a4a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:35 compute-0 nova_compute[192810]: 2025-09-30 21:16:35.057 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Refreshing inventories for resource provider fe423b93-de5a-41f7-97d1-9622ea46af54 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Sep 30 21:16:35 compute-0 nova_compute[192810]: 2025-09-30 21:16:35.080 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Updating ProviderTree inventory for provider fe423b93-de5a-41f7-97d1-9622ea46af54 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Sep 30 21:16:35 compute-0 nova_compute[192810]: 2025-09-30 21:16:35.081 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Updating inventory in ProviderTree for provider fe423b93-de5a-41f7-97d1-9622ea46af54 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Sep 30 21:16:35 compute-0 nova_compute[192810]: 2025-09-30 21:16:35.100 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Refreshing aggregate associations for resource provider fe423b93-de5a-41f7-97d1-9622ea46af54, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Sep 30 21:16:35 compute-0 nova_compute[192810]: 2025-09-30 21:16:35.112 2 DEBUG nova.compute.manager [req-cae57444-66ce-4433-9ce9-dfc1e94aa28d req-26a5a6d2-06cb-4f8b-a6e7-0e6f88302f64 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Received event network-vif-unplugged-15bad28b-1806-4dd5-8233-16a23f243c4b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:16:35 compute-0 nova_compute[192810]: 2025-09-30 21:16:35.112 2 DEBUG oslo_concurrency.lockutils [req-cae57444-66ce-4433-9ce9-dfc1e94aa28d req-26a5a6d2-06cb-4f8b-a6e7-0e6f88302f64 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7a2c317e-5f8f-4337-9052-2231a2173f23-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:35 compute-0 nova_compute[192810]: 2025-09-30 21:16:35.113 2 DEBUG oslo_concurrency.lockutils [req-cae57444-66ce-4433-9ce9-dfc1e94aa28d req-26a5a6d2-06cb-4f8b-a6e7-0e6f88302f64 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7a2c317e-5f8f-4337-9052-2231a2173f23-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:35 compute-0 nova_compute[192810]: 2025-09-30 21:16:35.113 2 DEBUG oslo_concurrency.lockutils [req-cae57444-66ce-4433-9ce9-dfc1e94aa28d req-26a5a6d2-06cb-4f8b-a6e7-0e6f88302f64 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7a2c317e-5f8f-4337-9052-2231a2173f23-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:35 compute-0 nova_compute[192810]: 2025-09-30 21:16:35.113 2 DEBUG nova.compute.manager [req-cae57444-66ce-4433-9ce9-dfc1e94aa28d req-26a5a6d2-06cb-4f8b-a6e7-0e6f88302f64 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] No waiting events found dispatching network-vif-unplugged-15bad28b-1806-4dd5-8233-16a23f243c4b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:16:35 compute-0 nova_compute[192810]: 2025-09-30 21:16:35.113 2 DEBUG nova.compute.manager [req-cae57444-66ce-4433-9ce9-dfc1e94aa28d req-26a5a6d2-06cb-4f8b-a6e7-0e6f88302f64 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Received event network-vif-unplugged-15bad28b-1806-4dd5-8233-16a23f243c4b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:16:35 compute-0 nova_compute[192810]: 2025-09-30 21:16:35.136 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Refreshing trait associations for resource provider fe423b93-de5a-41f7-97d1-9622ea46af54, traits: COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_RESCUE_BFV,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_1_2,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE42,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Sep 30 21:16:35 compute-0 nova_compute[192810]: 2025-09-30 21:16:35.198 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Updating inventory in ProviderTree for provider fe423b93-de5a-41f7-97d1-9622ea46af54 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Sep 30 21:16:35 compute-0 nova_compute[192810]: 2025-09-30 21:16:35.249 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Updated inventory for provider fe423b93-de5a-41f7-97d1-9622ea46af54 with generation 4 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Sep 30 21:16:35 compute-0 nova_compute[192810]: 2025-09-30 21:16:35.250 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Updating resource provider fe423b93-de5a-41f7-97d1-9622ea46af54 generation from 4 to 5 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Sep 30 21:16:35 compute-0 nova_compute[192810]: 2025-09-30 21:16:35.250 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Updating inventory in ProviderTree for provider fe423b93-de5a-41f7-97d1-9622ea46af54 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Sep 30 21:16:35 compute-0 nova_compute[192810]: 2025-09-30 21:16:35.281 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:16:35 compute-0 nova_compute[192810]: 2025-09-30 21:16:35.282 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.943s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:35 compute-0 nova_compute[192810]: 2025-09-30 21:16:35.514 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759266995.5140345, 252d5457-8837-4aa6-b309-c3139e8db7ed => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:16:35 compute-0 nova_compute[192810]: 2025-09-30 21:16:35.515 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] VM Started (Lifecycle Event)
Sep 30 21:16:35 compute-0 nova_compute[192810]: 2025-09-30 21:16:35.534 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:16:35 compute-0 nova_compute[192810]: 2025-09-30 21:16:35.761 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:35.763 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:16:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:35.764 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:16:35 compute-0 nova_compute[192810]: 2025-09-30 21:16:35.778 2 DEBUG nova.network.neutron [-] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:16:35 compute-0 nova_compute[192810]: 2025-09-30 21:16:35.796 2 INFO nova.compute.manager [-] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Took 0.92 seconds to deallocate network for instance.
Sep 30 21:16:35 compute-0 nova_compute[192810]: 2025-09-30 21:16:35.862 2 DEBUG oslo_concurrency.lockutils [None req-ca105e64-757a-4520-8ea3-ed7e9952f8f9 b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:35 compute-0 nova_compute[192810]: 2025-09-30 21:16:35.863 2 DEBUG oslo_concurrency.lockutils [None req-ca105e64-757a-4520-8ea3-ed7e9952f8f9 b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:35 compute-0 nova_compute[192810]: 2025-09-30 21:16:35.939 2 DEBUG nova.compute.provider_tree [None req-ca105e64-757a-4520-8ea3-ed7e9952f8f9 b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:16:35 compute-0 nova_compute[192810]: 2025-09-30 21:16:35.967 2 DEBUG nova.scheduler.client.report [None req-ca105e64-757a-4520-8ea3-ed7e9952f8f9 b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:16:35 compute-0 nova_compute[192810]: 2025-09-30 21:16:35.994 2 DEBUG oslo_concurrency.lockutils [None req-ca105e64-757a-4520-8ea3-ed7e9952f8f9 b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:36 compute-0 nova_compute[192810]: 2025-09-30 21:16:36.019 2 INFO nova.scheduler.client.report [None req-ca105e64-757a-4520-8ea3-ed7e9952f8f9 b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Deleted allocations for instance 7a2c317e-5f8f-4337-9052-2231a2173f23
Sep 30 21:16:36 compute-0 nova_compute[192810]: 2025-09-30 21:16:36.117 2 DEBUG oslo_concurrency.lockutils [None req-ca105e64-757a-4520-8ea3-ed7e9952f8f9 b0c6f6dda88549aabf6a4b039406af8e 9ee15beb428a4c5e9726b65c80920da9 - - default default] Lock "7a2c317e-5f8f-4337-9052-2231a2173f23" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.656s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:36 compute-0 nova_compute[192810]: 2025-09-30 21:16:36.283 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759266996.2827063, 252d5457-8837-4aa6-b309-c3139e8db7ed => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:16:36 compute-0 nova_compute[192810]: 2025-09-30 21:16:36.284 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] VM Resumed (Lifecycle Event)
Sep 30 21:16:36 compute-0 nova_compute[192810]: 2025-09-30 21:16:36.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:36 compute-0 nova_compute[192810]: 2025-09-30 21:16:36.366 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:16:36 compute-0 nova_compute[192810]: 2025-09-30 21:16:36.372 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:16:36 compute-0 nova_compute[192810]: 2025-09-30 21:16:36.425 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Sep 30 21:16:36 compute-0 nova_compute[192810]: 2025-09-30 21:16:36.445 2 DEBUG nova.compute.manager [req-bd218d5a-1d09-41f2-9cc4-8606579d6257 req-b1274816-e8e0-4034-bd36-3eff3beb7621 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Received event network-vif-deleted-15bad28b-1806-4dd5-8233-16a23f243c4b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:16:37 compute-0 nova_compute[192810]: 2025-09-30 21:16:37.283 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:16:37 compute-0 nova_compute[192810]: 2025-09-30 21:16:37.284 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:16:37 compute-0 nova_compute[192810]: 2025-09-30 21:16:37.284 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:16:37 compute-0 nova_compute[192810]: 2025-09-30 21:16:37.301 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 21:16:37 compute-0 nova_compute[192810]: 2025-09-30 21:16:37.302 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:16:37 compute-0 nova_compute[192810]: 2025-09-30 21:16:37.302 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:16:37 compute-0 nova_compute[192810]: 2025-09-30 21:16:37.442 2 DEBUG nova.compute.manager [req-e47b04c5-6012-460f-ad0a-d06edad6b7b2 req-e39e04f2-8674-44f5-b587-843ad0be9e68 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Received event network-vif-plugged-15bad28b-1806-4dd5-8233-16a23f243c4b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:16:37 compute-0 nova_compute[192810]: 2025-09-30 21:16:37.443 2 DEBUG oslo_concurrency.lockutils [req-e47b04c5-6012-460f-ad0a-d06edad6b7b2 req-e39e04f2-8674-44f5-b587-843ad0be9e68 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7a2c317e-5f8f-4337-9052-2231a2173f23-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:37 compute-0 nova_compute[192810]: 2025-09-30 21:16:37.443 2 DEBUG oslo_concurrency.lockutils [req-e47b04c5-6012-460f-ad0a-d06edad6b7b2 req-e39e04f2-8674-44f5-b587-843ad0be9e68 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7a2c317e-5f8f-4337-9052-2231a2173f23-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:37 compute-0 nova_compute[192810]: 2025-09-30 21:16:37.443 2 DEBUG oslo_concurrency.lockutils [req-e47b04c5-6012-460f-ad0a-d06edad6b7b2 req-e39e04f2-8674-44f5-b587-843ad0be9e68 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7a2c317e-5f8f-4337-9052-2231a2173f23-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:37 compute-0 nova_compute[192810]: 2025-09-30 21:16:37.444 2 DEBUG nova.compute.manager [req-e47b04c5-6012-460f-ad0a-d06edad6b7b2 req-e39e04f2-8674-44f5-b587-843ad0be9e68 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] No waiting events found dispatching network-vif-plugged-15bad28b-1806-4dd5-8233-16a23f243c4b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:16:37 compute-0 nova_compute[192810]: 2025-09-30 21:16:37.444 2 WARNING nova.compute.manager [req-e47b04c5-6012-460f-ad0a-d06edad6b7b2 req-e39e04f2-8674-44f5-b587-843ad0be9e68 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Received unexpected event network-vif-plugged-15bad28b-1806-4dd5-8233-16a23f243c4b for instance with vm_state deleted and task_state None.
Sep 30 21:16:37 compute-0 ovn_controller[94912]: 2025-09-30T21:16:37Z|00039|binding|INFO|Claiming lport 70b5da71-314a-4c92-9db2-fb08b57a6736 for this chassis.
Sep 30 21:16:37 compute-0 ovn_controller[94912]: 2025-09-30T21:16:37Z|00040|binding|INFO|70b5da71-314a-4c92-9db2-fb08b57a6736: Claiming fa:16:3e:a9:31:8d 10.100.0.7
Sep 30 21:16:37 compute-0 ovn_controller[94912]: 2025-09-30T21:16:37Z|00041|binding|INFO|Setting lport 70b5da71-314a-4c92-9db2-fb08b57a6736 up in Southbound
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:37.589 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a9:31:8d 10.100.0.7'], port_security=['fa:16:3e:a9:31:8d 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '252d5457-8837-4aa6-b309-c3139e8db7ed', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-16d40025-1087-460f-a42f-c007f6eff406', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '96460712956e4f038121397afa979163', 'neutron:revision_number': '11', 'neutron:security_group_ids': '811ddc34-8450-4370-a409-1146bdb7efe9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7712a78f-5ca7-49dc-980c-dc4049ba5089, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=70b5da71-314a-4c92-9db2-fb08b57a6736) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:37.591 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 70b5da71-314a-4c92-9db2-fb08b57a6736 in datapath 16d40025-1087-460f-a42f-c007f6eff406 bound to our chassis
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:37.592 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 16d40025-1087-460f-a42f-c007f6eff406
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:37.608 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e1a90671-e90a-4729-84f2-e4ac1d4f6624]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:37.609 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap16d40025-11 in ovnmeta-16d40025-1087-460f-a42f-c007f6eff406 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:37.611 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap16d40025-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:37.611 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[993a67d5-7f5f-478f-9478-4d7f97300167]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:37.612 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[986555e2-b45d-47e3-8fdb-2052ea5e0549]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:37.626 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[4910eea6-7afb-44e3-b1c0-bad756688851]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:37.656 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[2d8a6f2d-128f-4d5c-9f07-7f2721b35f1d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:37.693 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[726cbf59-62c2-4e58-8385-187ebd1489ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:37.699 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[8f310e3b-25e3-486b-936d-86e682418e37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:37 compute-0 NetworkManager[51733]: <info>  [1759266997.7010] manager: (tap16d40025-10): new Veth device (/org/freedesktop/NetworkManager/Devices/30)
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:37.741 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[89d0ef0b-4ff7-426d-b917-817c3e231657]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:37.744 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[bf406712-ee0f-4557-a276-107ed4f8c2f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:37 compute-0 systemd-udevd[220958]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:16:37 compute-0 NetworkManager[51733]: <info>  [1759266997.7826] device (tap16d40025-10): carrier: link connected
Sep 30 21:16:37 compute-0 nova_compute[192810]: 2025-09-30 21:16:37.780 2 INFO nova.compute.manager [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Post operation of migration started
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:37.788 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[61cff56b-e07b-4077-91dc-a1da902f6f1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:37.810 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[279c4767-8c68-4a6a-9d0e-9c4c298191f9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap16d40025-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:64:c7:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 369340, 'reachable_time': 21125, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220977, 'error': None, 'target': 'ovnmeta-16d40025-1087-460f-a42f-c007f6eff406', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:37.831 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[181d1aee-721b-4eb7-b551-2ad2e05819b1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe64:c752'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 369340, 'tstamp': 369340}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220978, 'error': None, 'target': 'ovnmeta-16d40025-1087-460f-a42f-c007f6eff406', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:37.850 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[31f40493-2242-4ab1-863c-38e0fb80688a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap16d40025-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:64:c7:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 369340, 'reachable_time': 21125, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220979, 'error': None, 'target': 'ovnmeta-16d40025-1087-460f-a42f-c007f6eff406', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:37.888 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[6350989d-9fa6-45f2-aeb5-7ea1dff623b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:37.969 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[6abd0d21-f520-47da-9714-93c7bb188213]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:37.971 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap16d40025-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:37.971 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:37.972 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap16d40025-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:16:37 compute-0 nova_compute[192810]: 2025-09-30 21:16:37.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:37 compute-0 kernel: tap16d40025-10: entered promiscuous mode
Sep 30 21:16:37 compute-0 NetworkManager[51733]: <info>  [1759266997.9753] manager: (tap16d40025-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/31)
Sep 30 21:16:37 compute-0 nova_compute[192810]: 2025-09-30 21:16:37.977 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:37.978 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap16d40025-10, col_values=(('external_ids', {'iface-id': '0c66892e-7baf-4f9a-a329-dd0545dbf700'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:16:37 compute-0 nova_compute[192810]: 2025-09-30 21:16:37.979 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:37 compute-0 ovn_controller[94912]: 2025-09-30T21:16:37Z|00042|binding|INFO|Releasing lport 0c66892e-7baf-4f9a-a329-dd0545dbf700 from this chassis (sb_readonly=0)
Sep 30 21:16:37 compute-0 nova_compute[192810]: 2025-09-30 21:16:37.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:37.982 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/16d40025-1087-460f-a42f-c007f6eff406.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/16d40025-1087-460f-a42f-c007f6eff406.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:37.982 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a815b81b-69ab-44e5-829d-7ee7dd29b933]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:37.983 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-16d40025-1087-460f-a42f-c007f6eff406
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/16d40025-1087-460f-a42f-c007f6eff406.pid.haproxy
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID 16d40025-1087-460f-a42f-c007f6eff406
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:16:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:37.984 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-16d40025-1087-460f-a42f-c007f6eff406', 'env', 'PROCESS_TAG=haproxy-16d40025-1087-460f-a42f-c007f6eff406', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/16d40025-1087-460f-a42f-c007f6eff406.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:16:37 compute-0 nova_compute[192810]: 2025-09-30 21:16:37.994 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:38 compute-0 nova_compute[192810]: 2025-09-30 21:16:38.356 2 DEBUG oslo_concurrency.lockutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Acquiring lock "refresh_cache-252d5457-8837-4aa6-b309-c3139e8db7ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:16:38 compute-0 nova_compute[192810]: 2025-09-30 21:16:38.356 2 DEBUG oslo_concurrency.lockutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Acquired lock "refresh_cache-252d5457-8837-4aa6-b309-c3139e8db7ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:16:38 compute-0 nova_compute[192810]: 2025-09-30 21:16:38.357 2 DEBUG nova.network.neutron [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:16:38 compute-0 podman[221012]: 2025-09-30 21:16:38.403927871 +0000 UTC m=+0.050174051 container create 0c29e8c90c09790ee5a5922950e1c545d41b62569f16cad1f42c4416aa8ac5a1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 21:16:38 compute-0 systemd[1]: Started libpod-conmon-0c29e8c90c09790ee5a5922950e1c545d41b62569f16cad1f42c4416aa8ac5a1.scope.
Sep 30 21:16:38 compute-0 podman[221012]: 2025-09-30 21:16:38.377088447 +0000 UTC m=+0.023334677 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:16:38 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:16:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/857085b286bbb49661aa40b4c048c9cff2a47263f6ac7f92de8a3e4a109a23bb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:16:38 compute-0 podman[221012]: 2025-09-30 21:16:38.500284081 +0000 UTC m=+0.146530291 container init 0c29e8c90c09790ee5a5922950e1c545d41b62569f16cad1f42c4416aa8ac5a1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Sep 30 21:16:38 compute-0 podman[221012]: 2025-09-30 21:16:38.521011342 +0000 UTC m=+0.167257562 container start 0c29e8c90c09790ee5a5922950e1c545d41b62569f16cad1f42c4416aa8ac5a1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Sep 30 21:16:38 compute-0 neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406[221028]: [NOTICE]   (221032) : New worker (221034) forked
Sep 30 21:16:38 compute-0 neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406[221028]: [NOTICE]   (221032) : Loading success.
Sep 30 21:16:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:38.720 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:38.723 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:38.725 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:39 compute-0 nova_compute[192810]: 2025-09-30 21:16:39.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:40 compute-0 podman[221044]: 2025-09-30 21:16:40.354143775 +0000 UTC m=+0.075205190 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:16:40 compute-0 nova_compute[192810]: 2025-09-30 21:16:40.403 2 DEBUG nova.network.neutron [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Updating instance_info_cache with network_info: [{"id": "70b5da71-314a-4c92-9db2-fb08b57a6736", "address": "fa:16:3e:a9:31:8d", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70b5da71-31", "ovs_interfaceid": "70b5da71-314a-4c92-9db2-fb08b57a6736", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:16:40 compute-0 nova_compute[192810]: 2025-09-30 21:16:40.429 2 DEBUG oslo_concurrency.lockutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Releasing lock "refresh_cache-252d5457-8837-4aa6-b309-c3139e8db7ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:16:40 compute-0 nova_compute[192810]: 2025-09-30 21:16:40.456 2 DEBUG oslo_concurrency.lockutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:40 compute-0 nova_compute[192810]: 2025-09-30 21:16:40.457 2 DEBUG oslo_concurrency.lockutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:40 compute-0 nova_compute[192810]: 2025-09-30 21:16:40.457 2 DEBUG oslo_concurrency.lockutils [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:40 compute-0 nova_compute[192810]: 2025-09-30 21:16:40.461 2 INFO nova.virt.libvirt.driver [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Sep 30 21:16:40 compute-0 virtqemud[192233]: Domain id=2 name='instance-00000005' uuid=252d5457-8837-4aa6-b309-c3139e8db7ed is tainted: custom-monitor
Sep 30 21:16:41 compute-0 nova_compute[192810]: 2025-09-30 21:16:41.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:41 compute-0 nova_compute[192810]: 2025-09-30 21:16:41.469 2 INFO nova.virt.libvirt.driver [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Sep 30 21:16:42 compute-0 nova_compute[192810]: 2025-09-30 21:16:42.476 2 INFO nova.virt.libvirt.driver [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Sep 30 21:16:42 compute-0 nova_compute[192810]: 2025-09-30 21:16:42.483 2 DEBUG nova.compute.manager [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:16:42 compute-0 nova_compute[192810]: 2025-09-30 21:16:42.505 2 DEBUG nova.objects.instance [None req-4b0d1c91-edae-4139-941a-cc287f092544 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Sep 30 21:16:43 compute-0 ovn_controller[94912]: 2025-09-30T21:16:43Z|00043|binding|INFO|Releasing lport 0c66892e-7baf-4f9a-a329-dd0545dbf700 from this chassis (sb_readonly=0)
Sep 30 21:16:43 compute-0 nova_compute[192810]: 2025-09-30 21:16:43.713 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.289 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}e6309f24172f54505a0b6dbe6ed21c4bbd020297a97a785f3f045d2e190213b5" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.560 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 644 Content-Type: application/json Date: Tue, 30 Sep 2025 21:16:44 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-70ed3338-eec2-45e1-851f-2529a5d5448b x-openstack-request-id: req-70ed3338-eec2-45e1-851f-2529a5d5448b _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.560 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "afe5c12d-500a-499b-9438-9e9c37698acc", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/afe5c12d-500a-499b-9438-9e9c37698acc"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/afe5c12d-500a-499b-9438-9e9c37698acc"}]}, {"id": "c9779bca-1eb6-4567-a36c-b452abeafc70", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/c9779bca-1eb6-4567-a36c-b452abeafc70"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/c9779bca-1eb6-4567-a36c-b452abeafc70"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.561 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-70ed3338-eec2-45e1-851f-2529a5d5448b request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.562 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/afe5c12d-500a-499b-9438-9e9c37698acc -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}e6309f24172f54505a0b6dbe6ed21c4bbd020297a97a785f3f045d2e190213b5" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.671 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 495 Content-Type: application/json Date: Tue, 30 Sep 2025 21:16:44 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-07b28b1f-fdd3-4743-a492-2572610636e0 x-openstack-request-id: req-07b28b1f-fdd3-4743-a492-2572610636e0 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.671 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "afe5c12d-500a-499b-9438-9e9c37698acc", "name": "m1.nano", "ram": 128, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/afe5c12d-500a-499b-9438-9e9c37698acc"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/afe5c12d-500a-499b-9438-9e9c37698acc"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.671 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/afe5c12d-500a-499b-9438-9e9c37698acc used request id req-07b28b1f-fdd3-4743-a492-2572610636e0 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.672 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '252d5457-8837-4aa6-b309-c3139e8db7ed', 'name': 'tempest-LiveMigrationTest-server-1363935032', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000005', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '96460712956e4f038121397afa979163', 'user_id': '4b263d7c3e3141f999e8eabf49e8190c', 'hostId': '4f87e8a0d73e5fda76fbc0cb6c3062e9fd11863d1faa71bc9f9eb8f0', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.673 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.677 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 252d5457-8837-4aa6-b309-c3139e8db7ed / tap70b5da71-31 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.677 12 DEBUG ceilometer.compute.pollsters [-] 252d5457-8837-4aa6-b309-c3139e8db7ed/network.incoming.packets volume: 9 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dea0de90-91ea-458a-b390-a38159e019a9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 9, 'user_id': '4b263d7c3e3141f999e8eabf49e8190c', 'user_name': None, 'project_id': '96460712956e4f038121397afa979163', 'project_name': None, 'resource_id': 'instance-00000005-252d5457-8837-4aa6-b309-c3139e8db7ed-tap70b5da71-31', 'timestamp': '2025-09-30T21:16:44.673444', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1363935032', 'name': 'tap70b5da71-31', 'instance_id': '252d5457-8837-4aa6-b309-c3139e8db7ed', 'instance_type': 'm1.nano', 'host': '4f87e8a0d73e5fda76fbc0cb6c3062e9fd11863d1faa71bc9f9eb8f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a9:31:8d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap70b5da71-31'}, 'message_id': 'c44b5464-9e42-11f0-a153-fa163e09b122', 'monotonic_time': 3700.361088729, 'message_signature': '3427caa6055969d6ec2f2b8f6952f874d1067ae93d50174ce5daf2f1c848e780'}]}, 'timestamp': '2025-09-30 21:16:44.678956', '_unique_id': '2b8cbf3476e5404a892728ce8e481ef6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.684 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.687 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.687 12 DEBUG ceilometer.compute.pollsters [-] 252d5457-8837-4aa6-b309-c3139e8db7ed/network.outgoing.bytes volume: 5574 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '56fe8798-74c0-4ce4-b985-a10002757b64', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 5574, 'user_id': '4b263d7c3e3141f999e8eabf49e8190c', 'user_name': None, 'project_id': '96460712956e4f038121397afa979163', 'project_name': None, 'resource_id': 'instance-00000005-252d5457-8837-4aa6-b309-c3139e8db7ed-tap70b5da71-31', 'timestamp': '2025-09-30T21:16:44.687480', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1363935032', 'name': 'tap70b5da71-31', 'instance_id': '252d5457-8837-4aa6-b309-c3139e8db7ed', 'instance_type': 'm1.nano', 'host': '4f87e8a0d73e5fda76fbc0cb6c3062e9fd11863d1faa71bc9f9eb8f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a9:31:8d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap70b5da71-31'}, 'message_id': 'c44cb5a2-9e42-11f0-a153-fa163e09b122', 'monotonic_time': 3700.361088729, 'message_signature': '0b09eea07674cc71d3aa125389779131d4fa5b9762f6ec6e359c2830da87fb3a'}]}, 'timestamp': '2025-09-30 21:16:44.687922', '_unique_id': '409c1af2e2734a4286dbf3ef83197fa8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.688 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.689 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.689 12 DEBUG ceilometer.compute.pollsters [-] 252d5457-8837-4aa6-b309-c3139e8db7ed/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cc98f24b-f615-4261-aba6-a23841c15f9c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4b263d7c3e3141f999e8eabf49e8190c', 'user_name': None, 'project_id': '96460712956e4f038121397afa979163', 'project_name': None, 'resource_id': 'instance-00000005-252d5457-8837-4aa6-b309-c3139e8db7ed-tap70b5da71-31', 'timestamp': '2025-09-30T21:16:44.689271', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1363935032', 'name': 'tap70b5da71-31', 'instance_id': '252d5457-8837-4aa6-b309-c3139e8db7ed', 'instance_type': 'm1.nano', 'host': '4f87e8a0d73e5fda76fbc0cb6c3062e9fd11863d1faa71bc9f9eb8f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a9:31:8d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap70b5da71-31'}, 'message_id': 'c44cf8d2-9e42-11f0-a153-fa163e09b122', 'monotonic_time': 3700.361088729, 'message_signature': '0f90c8d9670b8da2b82ac53f559abb32a1b6517f028a1734c4aaecfbbdbed5e9'}]}, 'timestamp': '2025-09-30 21:16:44.689610', '_unique_id': 'a323496fab7d4df9bb1aa9163c37611c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.690 12 DEBUG ceilometer.compute.pollsters [-] 252d5457-8837-4aa6-b309-c3139e8db7ed/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e354611e-a84f-4db1-8e1d-86f87efe3507', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4b263d7c3e3141f999e8eabf49e8190c', 'user_name': None, 'project_id': '96460712956e4f038121397afa979163', 'project_name': None, 'resource_id': 'instance-00000005-252d5457-8837-4aa6-b309-c3139e8db7ed-tap70b5da71-31', 'timestamp': '2025-09-30T21:16:44.690734', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1363935032', 'name': 'tap70b5da71-31', 'instance_id': '252d5457-8837-4aa6-b309-c3139e8db7ed', 'instance_type': 'm1.nano', 'host': '4f87e8a0d73e5fda76fbc0cb6c3062e9fd11863d1faa71bc9f9eb8f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a9:31:8d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap70b5da71-31'}, 'message_id': 'c44d3194-9e42-11f0-a153-fa163e09b122', 'monotonic_time': 3700.361088729, 'message_signature': 'ff5f952f6ea5007dd2e65a616fdfbbb33d2bb6ddce517ed82801c4c10edab50b'}]}, 'timestamp': '2025-09-30 21:16:44.691036', '_unique_id': 'c73146a70b6e4fd7a29f63d855fcd83b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.691 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.692 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.710 12 DEBUG ceilometer.compute.pollsters [-] 252d5457-8837-4aa6-b309-c3139e8db7ed/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.711 12 DEBUG ceilometer.compute.pollsters [-] 252d5457-8837-4aa6-b309-c3139e8db7ed/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd5e4d467-5233-45c5-8baa-59b544ea470e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '4b263d7c3e3141f999e8eabf49e8190c', 'user_name': None, 'project_id': '96460712956e4f038121397afa979163', 'project_name': None, 'resource_id': '252d5457-8837-4aa6-b309-c3139e8db7ed-vda', 'timestamp': '2025-09-30T21:16:44.692138', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1363935032', 'name': 'instance-00000005', 'instance_id': '252d5457-8837-4aa6-b309-c3139e8db7ed', 'instance_type': 'm1.nano', 'host': '4f87e8a0d73e5fda76fbc0cb6c3062e9fd11863d1faa71bc9f9eb8f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c45046fe-9e42-11f0-a153-fa163e09b122', 'monotonic_time': 3700.379781859, 'message_signature': '34ab5f01530a9027bbfe30e86f5455c1c731f07b201d91310514a6dba384ebb1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '4b263d7c3e3141f999e8eabf49e8190c', 'user_name': None, 'project_id': '96460712956e4f038121397afa979163', 'project_name': None, 'resource_id': '252d5457-8837-4aa6-b309-c3139e8db7ed-sda', 'timestamp': '2025-09-30T21:16:44.692138', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1363935032', 'name': 'instance-00000005', 'instance_id': '252d5457-8837-4aa6-b309-c3139e8db7ed', 'instance_type': 'm1.nano', 'host': '4f87e8a0d73e5fda76fbc0cb6c3062e9fd11863d1faa71bc9f9eb8f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c450543c-9e42-11f0-a153-fa163e09b122', 'monotonic_time': 3700.379781859, 'message_signature': '23fa153f68d626f293fe6e3209145f475b9699e9a9a733b022b30ff08d4db593'}]}, 'timestamp': '2025-09-30 21:16:44.711676', '_unique_id': 'b89dc1e110524f5caabe42bb1c3e764e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.712 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.713 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.713 12 DEBUG ceilometer.compute.pollsters [-] 252d5457-8837-4aa6-b309-c3139e8db7ed/disk.device.write.requests volume: 6 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.713 12 DEBUG ceilometer.compute.pollsters [-] 252d5457-8837-4aa6-b309-c3139e8db7ed/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '687e9ebb-397c-4a3c-956b-61eca82e62e3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 6, 'user_id': '4b263d7c3e3141f999e8eabf49e8190c', 'user_name': None, 'project_id': '96460712956e4f038121397afa979163', 'project_name': None, 'resource_id': '252d5457-8837-4aa6-b309-c3139e8db7ed-vda', 'timestamp': '2025-09-30T21:16:44.713675', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1363935032', 'name': 'instance-00000005', 'instance_id': '252d5457-8837-4aa6-b309-c3139e8db7ed', 'instance_type': 'm1.nano', 'host': '4f87e8a0d73e5fda76fbc0cb6c3062e9fd11863d1faa71bc9f9eb8f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c450b238-9e42-11f0-a153-fa163e09b122', 'monotonic_time': 3700.379781859, 'message_signature': '9c6ddafe343a9a9da01a82d0aa0ea621cc9609ff480f7660d578c2e003c1362f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '4b263d7c3e3141f999e8eabf49e8190c', 'user_name': None, 'project_id': '96460712956e4f038121397afa979163', 'project_name': None, 'resource_id': '252d5457-8837-4aa6-b309-c3139e8db7ed-sda', 'timestamp': '2025-09-30T21:16:44.713675', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1363935032', 'name': 'instance-00000005', 'instance_id': '252d5457-8837-4aa6-b309-c3139e8db7ed', 'instance_type': 'm1.nano', 'host': '4f87e8a0d73e5fda76fbc0cb6c3062e9fd11863d1faa71bc9f9eb8f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c450bdc8-9e42-11f0-a153-fa163e09b122', 'monotonic_time': 3700.379781859, 'message_signature': '5349c7c61074f2fabbc4d2c00696d69845ce8326c49e912699c5fd9d576ca4ea'}]}, 'timestamp': '2025-09-30 21:16:44.714511', '_unique_id': '3e878f254d7c4f39aedee9daa78a38d3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.715 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.716 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.735 12 DEBUG ceilometer.compute.pollsters [-] 252d5457-8837-4aa6-b309-c3139e8db7ed/cpu volume: 90000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3d224369-d92e-4839-86fb-2d96c8a5e6a1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 90000000, 'user_id': '4b263d7c3e3141f999e8eabf49e8190c', 'user_name': None, 'project_id': '96460712956e4f038121397afa979163', 'project_name': None, 'resource_id': '252d5457-8837-4aa6-b309-c3139e8db7ed', 'timestamp': '2025-09-30T21:16:44.716215', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1363935032', 'name': 'instance-00000005', 'instance_id': '252d5457-8837-4aa6-b309-c3139e8db7ed', 'instance_type': 'm1.nano', 'host': '4f87e8a0d73e5fda76fbc0cb6c3062e9fd11863d1faa71bc9f9eb8f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'c4541d6a-9e42-11f0-a153-fa163e09b122', 'monotonic_time': 3700.422883341, 'message_signature': '144e9acdd4a7b62d31f40aea99224146d9d574df6597220cd075c3ba13f316c2'}]}, 'timestamp': '2025-09-30 21:16:44.736494', '_unique_id': '1d10dc31b0434eedb8a537b4d09963ae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.737 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.738 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.751 12 DEBUG ceilometer.compute.pollsters [-] 252d5457-8837-4aa6-b309-c3139e8db7ed/disk.device.usage volume: 29884416 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.752 12 DEBUG ceilometer.compute.pollsters [-] 252d5457-8837-4aa6-b309-c3139e8db7ed/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c6db39ac-e444-4bd9-a725-5013d35c7ff8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29884416, 'user_id': '4b263d7c3e3141f999e8eabf49e8190c', 'user_name': None, 'project_id': '96460712956e4f038121397afa979163', 'project_name': None, 'resource_id': '252d5457-8837-4aa6-b309-c3139e8db7ed-vda', 'timestamp': '2025-09-30T21:16:44.738848', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1363935032', 'name': 'instance-00000005', 'instance_id': '252d5457-8837-4aa6-b309-c3139e8db7ed', 'instance_type': 'm1.nano', 'host': '4f87e8a0d73e5fda76fbc0cb6c3062e9fd11863d1faa71bc9f9eb8f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c4569522-9e42-11f0-a153-fa163e09b122', 'monotonic_time': 3700.426455981, 'message_signature': 'f9b3ba6bfac4b1cb2061828a46db1113900fccf859adb0eb6ae29cb922cdd88d'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '4b263d7c3e3141f999e8eabf49e8190c', 'user_name': None, 'project_id': '96460712956e4f038121397afa979163', 'project_name': None, 'resource_id': 
'252d5457-8837-4aa6-b309-c3139e8db7ed-sda', 'timestamp': '2025-09-30T21:16:44.738848', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1363935032', 'name': 'instance-00000005', 'instance_id': '252d5457-8837-4aa6-b309-c3139e8db7ed', 'instance_type': 'm1.nano', 'host': '4f87e8a0d73e5fda76fbc0cb6c3062e9fd11863d1faa71bc9f9eb8f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c456a83c-9e42-11f0-a153-fa163e09b122', 'monotonic_time': 3700.426455981, 'message_signature': '7ae422625dc0069b3d553c2118e9e0ca738618eccdb84e6046763def7cc750c3'}]}, 'timestamp': '2025-09-30 21:16:44.753073', '_unique_id': '1445d2e904674ef0b022febaef4af29d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.754 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.755 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.755 12 DEBUG ceilometer.compute.pollsters [-] 252d5457-8837-4aa6-b309-c3139e8db7ed/disk.device.write.latency volume: 4096922 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.756 12 DEBUG ceilometer.compute.pollsters [-] 252d5457-8837-4aa6-b309-c3139e8db7ed/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7c35199e-fcc4-4b52-a223-5add3bf240d8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4096922, 'user_id': '4b263d7c3e3141f999e8eabf49e8190c', 'user_name': None, 'project_id': '96460712956e4f038121397afa979163', 'project_name': None, 'resource_id': '252d5457-8837-4aa6-b309-c3139e8db7ed-vda', 'timestamp': '2025-09-30T21:16:44.755844', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1363935032', 'name': 'instance-00000005', 'instance_id': '252d5457-8837-4aa6-b309-c3139e8db7ed', 'instance_type': 'm1.nano', 'host': '4f87e8a0d73e5fda76fbc0cb6c3062e9fd11863d1faa71bc9f9eb8f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c45724b0-9e42-11f0-a153-fa163e09b122', 'monotonic_time': 3700.379781859, 'message_signature': '1251702252823625bc7be115ea6290513ab4154f0a489ccd99d8c7cfd5b05412'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '4b263d7c3e3141f999e8eabf49e8190c', 'user_name': None, 'project_id': '96460712956e4f038121397afa979163', 'project_name': None, 
'resource_id': '252d5457-8837-4aa6-b309-c3139e8db7ed-sda', 'timestamp': '2025-09-30T21:16:44.755844', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1363935032', 'name': 'instance-00000005', 'instance_id': '252d5457-8837-4aa6-b309-c3139e8db7ed', 'instance_type': 'm1.nano', 'host': '4f87e8a0d73e5fda76fbc0cb6c3062e9fd11863d1faa71bc9f9eb8f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c457359a-9e42-11f0-a153-fa163e09b122', 'monotonic_time': 3700.379781859, 'message_signature': '4b4e7cd1fce1769238fee9612b19d6e27a0c8fb8dd5c43abe8b0b5fd14212036'}]}, 'timestamp': '2025-09-30 21:16:44.756759', '_unique_id': 'f1e48fbf35554749b2ba4a0f9a90f98c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.757 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.758 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.759 12 DEBUG ceilometer.compute.pollsters [-] 252d5457-8837-4aa6-b309-c3139e8db7ed/memory.usage volume: 40.41796875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7d53d9a9-a5d6-4034-a584-3dd2ca3bd6d2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.41796875, 'user_id': '4b263d7c3e3141f999e8eabf49e8190c', 'user_name': None, 'project_id': '96460712956e4f038121397afa979163', 'project_name': None, 'resource_id': '252d5457-8837-4aa6-b309-c3139e8db7ed', 'timestamp': '2025-09-30T21:16:44.759081', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1363935032', 'name': 'instance-00000005', 'instance_id': '252d5457-8837-4aa6-b309-c3139e8db7ed', 'instance_type': 'm1.nano', 'host': '4f87e8a0d73e5fda76fbc0cb6c3062e9fd11863d1faa71bc9f9eb8f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'c457a30e-9e42-11f0-a153-fa163e09b122', 'monotonic_time': 3700.422883341, 'message_signature': '3142905d8b79995ef190d159395db832d4fb360c10811175a9d14ce85c2b9fba'}]}, 'timestamp': '2025-09-30 21:16:44.759539', '_unique_id': '277550eb1e39476c850cb55acb6221d9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.760 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.761 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.761 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.761 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-LiveMigrationTest-server-1363935032>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveMigrationTest-server-1363935032>]
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.762 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.762 12 DEBUG ceilometer.compute.pollsters [-] 252d5457-8837-4aa6-b309-c3139e8db7ed/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '826f0abe-c37c-4d22-8b85-32dde505cbf1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4b263d7c3e3141f999e8eabf49e8190c', 'user_name': None, 'project_id': '96460712956e4f038121397afa979163', 'project_name': None, 'resource_id': 'instance-00000005-252d5457-8837-4aa6-b309-c3139e8db7ed-tap70b5da71-31', 'timestamp': '2025-09-30T21:16:44.762787', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1363935032', 'name': 'tap70b5da71-31', 'instance_id': '252d5457-8837-4aa6-b309-c3139e8db7ed', 'instance_type': 'm1.nano', 'host': '4f87e8a0d73e5fda76fbc0cb6c3062e9fd11863d1faa71bc9f9eb8f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a9:31:8d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap70b5da71-31'}, 'message_id': 'c4583404-9e42-11f0-a153-fa163e09b122', 'monotonic_time': 3700.361088729, 'message_signature': '2b52d6a01eba361deb3a37ee52d156931139c11a03a8b435cf06d0a19a802495'}]}, 'timestamp': '2025-09-30 21:16:44.763265', '_unique_id': 'd7c92276824d443ea01b654b8481f066'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.764 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.765 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.765 12 DEBUG ceilometer.compute.pollsters [-] 252d5457-8837-4aa6-b309-c3139e8db7ed/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.766 12 DEBUG ceilometer.compute.pollsters [-] 252d5457-8837-4aa6-b309-c3139e8db7ed/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:16:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:44.766 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0da96d8f-5906-4135-971d-0b27ccfc6104', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4b263d7c3e3141f999e8eabf49e8190c', 'user_name': None, 'project_id': '96460712956e4f038121397afa979163', 'project_name': None, 'resource_id': '252d5457-8837-4aa6-b309-c3139e8db7ed-vda', 'timestamp': '2025-09-30T21:16:44.765579', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1363935032', 'name': 'instance-00000005', 'instance_id': '252d5457-8837-4aa6-b309-c3139e8db7ed', 'instance_type': 'm1.nano', 'host': '4f87e8a0d73e5fda76fbc0cb6c3062e9fd11863d1faa71bc9f9eb8f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c458c518-9e42-11f0-a153-fa163e09b122', 'monotonic_time': 3700.379781859, 'message_signature': 'aad7ca907813e8089795f338ef99e0c934719479332be7b6c860aeed84418cd3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4b263d7c3e3141f999e8eabf49e8190c', 'user_name': None, 'project_id': '96460712956e4f038121397afa979163', 'project_name': None, 'resource_id': '252d5457-8837-4aa6-b309-c3139e8db7ed-sda', 'timestamp': '2025-09-30T21:16:44.765579', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1363935032', 'name': 'instance-00000005', 'instance_id': '252d5457-8837-4aa6-b309-c3139e8db7ed', 'instance_type': 'm1.nano', 'host': '4f87e8a0d73e5fda76fbc0cb6c3062e9fd11863d1faa71bc9f9eb8f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c458d224-9e42-11f0-a153-fa163e09b122', 'monotonic_time': 3700.379781859, 'message_signature': '2d2e3314fb175281944323efb86e9892bd069a4fccf559404bbaa2b90254c419'}]}, 'timestamp': '2025-09-30 21:16:44.767222', '_unique_id': 'e25a649dca4c499e978eed44a7f633aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.767 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.768 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.768 12 DEBUG ceilometer.compute.pollsters [-] 252d5457-8837-4aa6-b309-c3139e8db7ed/network.incoming.bytes volume: 922 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '11714e4b-7fd4-40b8-b713-31d75f152dbc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 922, 'user_id': '4b263d7c3e3141f999e8eabf49e8190c', 'user_name': None, 'project_id': '96460712956e4f038121397afa979163', 'project_name': None, 'resource_id': 'instance-00000005-252d5457-8837-4aa6-b309-c3139e8db7ed-tap70b5da71-31', 'timestamp': '2025-09-30T21:16:44.768668', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1363935032', 'name': 'tap70b5da71-31', 'instance_id': '252d5457-8837-4aa6-b309-c3139e8db7ed', 'instance_type': 'm1.nano', 'host': '4f87e8a0d73e5fda76fbc0cb6c3062e9fd11863d1faa71bc9f9eb8f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a9:31:8d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap70b5da71-31'}, 'message_id': 'c45916bc-9e42-11f0-a153-fa163e09b122', 'monotonic_time': 3700.361088729, 'message_signature': 'fcaa827ddfa8711b8db9e1c128c9dd5e86ad2ac2ec830cb9a92eb5873b41b7fd'}]}, 'timestamp': '2025-09-30 21:16:44.768988', '_unique_id': 'fffc1d809505466c88eae46caf910648'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.769 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.770 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.770 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.770 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-LiveMigrationTest-server-1363935032>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveMigrationTest-server-1363935032>]
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.770 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.770 12 DEBUG ceilometer.compute.pollsters [-] 252d5457-8837-4aa6-b309-c3139e8db7ed/disk.device.allocation volume: 30412800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.770 12 DEBUG ceilometer.compute.pollsters [-] 252d5457-8837-4aa6-b309-c3139e8db7ed/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4a955e14-e498-4947-8fce-90c8e05e2930', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30412800, 'user_id': '4b263d7c3e3141f999e8eabf49e8190c', 'user_name': None, 'project_id': '96460712956e4f038121397afa979163', 'project_name': None, 'resource_id': '252d5457-8837-4aa6-b309-c3139e8db7ed-vda', 'timestamp': '2025-09-30T21:16:44.770483', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1363935032', 'name': 'instance-00000005', 'instance_id': '252d5457-8837-4aa6-b309-c3139e8db7ed', 'instance_type': 'm1.nano', 'host': '4f87e8a0d73e5fda76fbc0cb6c3062e9fd11863d1faa71bc9f9eb8f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c4595d70-9e42-11f0-a153-fa163e09b122', 'monotonic_time': 3700.426455981, 'message_signature': '52e57f49f68e7a28ea428349f3cca663b7c62e0e89f318839d82b5d8ae4be47f'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '4b263d7c3e3141f999e8eabf49e8190c', 'user_name': None, 'project_id': '96460712956e4f038121397afa979163', 'project_name': None, 'resource_id': 
'252d5457-8837-4aa6-b309-c3139e8db7ed-sda', 'timestamp': '2025-09-30T21:16:44.770483', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1363935032', 'name': 'instance-00000005', 'instance_id': '252d5457-8837-4aa6-b309-c3139e8db7ed', 'instance_type': 'm1.nano', 'host': '4f87e8a0d73e5fda76fbc0cb6c3062e9fd11863d1faa71bc9f9eb8f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c459663a-9e42-11f0-a153-fa163e09b122', 'monotonic_time': 3700.426455981, 'message_signature': '288a12595345b464262eab98267638eb884b65354c1bfa30428c204da2d93322'}]}, 'timestamp': '2025-09-30 21:16:44.770983', '_unique_id': '6a1cda7a6b574157b231b9c2adcd4146'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.771 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.772 12 DEBUG ceilometer.compute.pollsters [-] 252d5457-8837-4aa6-b309-c3139e8db7ed/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.772 12 DEBUG ceilometer.compute.pollsters [-] 252d5457-8837-4aa6-b309-c3139e8db7ed/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '27dc837f-a96d-4cb3-a992-d507f7cb7eab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '4b263d7c3e3141f999e8eabf49e8190c', 'user_name': None, 'project_id': '96460712956e4f038121397afa979163', 'project_name': None, 'resource_id': '252d5457-8837-4aa6-b309-c3139e8db7ed-vda', 'timestamp': '2025-09-30T21:16:44.772070', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1363935032', 'name': 'instance-00000005', 'instance_id': '252d5457-8837-4aa6-b309-c3139e8db7ed', 'instance_type': 'm1.nano', 'host': '4f87e8a0d73e5fda76fbc0cb6c3062e9fd11863d1faa71bc9f9eb8f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c4599a7e-9e42-11f0-a153-fa163e09b122', 'monotonic_time': 3700.379781859, 'message_signature': '2dec59722e1fe1f7a81e0fe9be6b2cd73c32a9574ec3499367a5847d361a2d79'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '4b263d7c3e3141f999e8eabf49e8190c', 'user_name': None, 'project_id': '96460712956e4f038121397afa979163', 'project_name': None, 'resource_id': '252d5457-8837-4aa6-b309-c3139e8db7ed-sda', 'timestamp': '2025-09-30T21:16:44.772070', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1363935032', 'name': 'instance-00000005', 'instance_id': '252d5457-8837-4aa6-b309-c3139e8db7ed', 'instance_type': 'm1.nano', 'host': '4f87e8a0d73e5fda76fbc0cb6c3062e9fd11863d1faa71bc9f9eb8f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c459a4ce-9e42-11f0-a153-fa163e09b122', 'monotonic_time': 3700.379781859, 'message_signature': '78cfe7d0f5e807c59a62688e9070857816660aa8c4a42f4b1889136cbc21197c'}]}, 'timestamp': '2025-09-30 21:16:44.772593', '_unique_id': 'f60de3e5a45c480fb63951854892110d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 DEBUG ceilometer.compute.pollsters [-] 252d5457-8837-4aa6-b309-c3139e8db7ed/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.773 12 DEBUG ceilometer.compute.pollsters [-] 252d5457-8837-4aa6-b309-c3139e8db7ed/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fa958f0b-c8a7-4843-9e5a-f8d6d00e5300', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '4b263d7c3e3141f999e8eabf49e8190c', 'user_name': None, 'project_id': '96460712956e4f038121397afa979163', 'project_name': None, 'resource_id': '252d5457-8837-4aa6-b309-c3139e8db7ed-vda', 'timestamp': '2025-09-30T21:16:44.773712', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1363935032', 'name': 'instance-00000005', 'instance_id': '252d5457-8837-4aa6-b309-c3139e8db7ed', 'instance_type': 'm1.nano', 'host': '4f87e8a0d73e5fda76fbc0cb6c3062e9fd11863d1faa71bc9f9eb8f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c459daca-9e42-11f0-a153-fa163e09b122', 'monotonic_time': 3700.426455981, 'message_signature': '3f3f43c6e2998c90428b0d4b7f841f6d6f8fe6bb702954707506aee2ce98e126'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '4b263d7c3e3141f999e8eabf49e8190c', 'user_name': None, 'project_id': '96460712956e4f038121397afa979163', 'project_name': None, 'resource_id': '252d5457-8837-4aa6-b309-c3139e8db7ed-sda', 'timestamp': '2025-09-30T21:16:44.773712', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1363935032', 'name': 'instance-00000005', 'instance_id': '252d5457-8837-4aa6-b309-c3139e8db7ed', 'instance_type': 'm1.nano', 'host': '4f87e8a0d73e5fda76fbc0cb6c3062e9fd11863d1faa71bc9f9eb8f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c459e4fc-9e42-11f0-a153-fa163e09b122', 'monotonic_time': 3700.426455981, 'message_signature': 'e45bbf70cc46e6c65c7d4f1a4ce25f8ed6e5eec0adaeed1731f52d8383460da0'}]}, 'timestamp': '2025-09-30 21:16:44.774221', '_unique_id': 'f58ed37a2fae4e18921c3f5d8d996b4b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.774 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.775 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.775 12 DEBUG ceilometer.compute.pollsters [-] 252d5457-8837-4aa6-b309-c3139e8db7ed/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4a552d5b-bcac-4bac-ab4c-dc3bcc37b239', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4b263d7c3e3141f999e8eabf49e8190c', 'user_name': None, 'project_id': '96460712956e4f038121397afa979163', 'project_name': None, 'resource_id': 'instance-00000005-252d5457-8837-4aa6-b309-c3139e8db7ed-tap70b5da71-31', 'timestamp': '2025-09-30T21:16:44.775309', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1363935032', 'name': 'tap70b5da71-31', 'instance_id': '252d5457-8837-4aa6-b309-c3139e8db7ed', 'instance_type': 'm1.nano', 'host': '4f87e8a0d73e5fda76fbc0cb6c3062e9fd11863d1faa71bc9f9eb8f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a9:31:8d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap70b5da71-31'}, 'message_id': 'c45a194a-9e42-11f0-a153-fa163e09b122', 'monotonic_time': 3700.361088729, 'message_signature': '505305de0cac8f3f54fdbc76e8bbf93537376651d1d5251f9d101d2058e58128'}]}, 'timestamp': '2025-09-30 21:16:44.775620', '_unique_id': '52428ce5562b4eea93fdc7d61908646e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.776 12 DEBUG ceilometer.compute.pollsters [-] 252d5457-8837-4aa6-b309-c3139e8db7ed/disk.device.write.bytes volume: 151552 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 DEBUG ceilometer.compute.pollsters [-] 252d5457-8837-4aa6-b309-c3139e8db7ed/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd26a98ec-a4a2-44a7-842e-e6061bf147a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 151552, 'user_id': '4b263d7c3e3141f999e8eabf49e8190c', 'user_name': None, 'project_id': '96460712956e4f038121397afa979163', 'project_name': None, 'resource_id': '252d5457-8837-4aa6-b309-c3139e8db7ed-vda', 'timestamp': '2025-09-30T21:16:44.776752', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1363935032', 'name': 'instance-00000005', 'instance_id': '252d5457-8837-4aa6-b309-c3139e8db7ed', 'instance_type': 'm1.nano', 'host': '4f87e8a0d73e5fda76fbc0cb6c3062e9fd11863d1faa71bc9f9eb8f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'c45a5158-9e42-11f0-a153-fa163e09b122', 'monotonic_time': 3700.379781859, 'message_signature': '2f1caf6a20877688049ce9e3656d0a0518a312cdfba324f30e4b7e6d4334ba1b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4b263d7c3e3141f999e8eabf49e8190c', 'user_name': None, 'project_id': '96460712956e4f038121397afa979163', 'project_name': None, 'resource_id': '252d5457-8837-4aa6-b309-c3139e8db7ed-sda', 'timestamp': '2025-09-30T21:16:44.776752', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1363935032', 'name': 'instance-00000005', 'instance_id': '252d5457-8837-4aa6-b309-c3139e8db7ed', 'instance_type': 'm1.nano', 'host': '4f87e8a0d73e5fda76fbc0cb6c3062e9fd11863d1faa71bc9f9eb8f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'c45a5b8a-9e42-11f0-a153-fa163e09b122', 'monotonic_time': 3700.379781859, 'message_signature': 'b466acaa781182e588573b23a1a6989ed21b43f236910d71582587c0e7a6466a'}]}, 'timestamp': '2025-09-30 21:16:44.777277', '_unique_id': 'f539cf11a93a4b02aa66aab90a20bf7b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.777 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.778 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.778 12 DEBUG ceilometer.compute.pollsters [-] 252d5457-8837-4aa6-b309-c3139e8db7ed/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9cb2d9c0-307e-4f76-8367-cd08a6ff4124', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '4b263d7c3e3141f999e8eabf49e8190c', 'user_name': None, 'project_id': '96460712956e4f038121397afa979163', 'project_name': None, 'resource_id': 'instance-00000005-252d5457-8837-4aa6-b309-c3139e8db7ed-tap70b5da71-31', 'timestamp': '2025-09-30T21:16:44.778410', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1363935032', 'name': 'tap70b5da71-31', 'instance_id': '252d5457-8837-4aa6-b309-c3139e8db7ed', 'instance_type': 'm1.nano', 'host': '4f87e8a0d73e5fda76fbc0cb6c3062e9fd11863d1faa71bc9f9eb8f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a9:31:8d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap70b5da71-31'}, 'message_id': 'c45a935c-9e42-11f0-a153-fa163e09b122', 'monotonic_time': 3700.361088729, 'message_signature': '2696e7767e3d65658baea4bdc5489482f9732ea7be7befc6f275f5c252ed0b92'}]}, 'timestamp': '2025-09-30 21:16:44.778738', '_unique_id': 'c16cbd4cbe5046989a43a171fb69392f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.779 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.780 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-LiveMigrationTest-server-1363935032>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveMigrationTest-server-1363935032>]
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.780 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.780 12 DEBUG ceilometer.compute.pollsters [-] 252d5457-8837-4aa6-b309-c3139e8db7ed/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '59322907-54b3-4937-8668-26df51293ae1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '4b263d7c3e3141f999e8eabf49e8190c', 'user_name': None, 'project_id': '96460712956e4f038121397afa979163', 'project_name': None, 'resource_id': 'instance-00000005-252d5457-8837-4aa6-b309-c3139e8db7ed-tap70b5da71-31', 'timestamp': '2025-09-30T21:16:44.780338', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1363935032', 'name': 'tap70b5da71-31', 'instance_id': '252d5457-8837-4aa6-b309-c3139e8db7ed', 'instance_type': 'm1.nano', 'host': '4f87e8a0d73e5fda76fbc0cb6c3062e9fd11863d1faa71bc9f9eb8f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a9:31:8d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap70b5da71-31'}, 'message_id': 'c45adde4-9e42-11f0-a153-fa163e09b122', 'monotonic_time': 3700.361088729, 'message_signature': 'caff23f1bb533e1e78969958856d1306a61cde0bc2d7847578e0a96fc9503b9a'}]}, 'timestamp': '2025-09-30 21:16:44.780644', '_unique_id': '6ae146d6f847450295691d41c1099548'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.781 12 DEBUG ceilometer.compute.pollsters [-] 252d5457-8837-4aa6-b309-c3139e8db7ed/network.outgoing.packets volume: 79 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1d556cba-2bc2-4392-b953-dd1cdf5f6ef0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 79, 'user_id': '4b263d7c3e3141f999e8eabf49e8190c', 'user_name': None, 'project_id': '96460712956e4f038121397afa979163', 'project_name': None, 'resource_id': 'instance-00000005-252d5457-8837-4aa6-b309-c3139e8db7ed-tap70b5da71-31', 'timestamp': '2025-09-30T21:16:44.781881', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1363935032', 'name': 'tap70b5da71-31', 'instance_id': '252d5457-8837-4aa6-b309-c3139e8db7ed', 'instance_type': 'm1.nano', 'host': '4f87e8a0d73e5fda76fbc0cb6c3062e9fd11863d1faa71bc9f9eb8f0', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a9:31:8d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap70b5da71-31'}, 'message_id': 'c45b19e4-9e42-11f0-a153-fa163e09b122', 'monotonic_time': 3700.361088729, 'message_signature': '10b018ee791ef45dcac2f9f0b853febd123feb6dba5f92bb926f5e9b8e2d4327'}]}, 'timestamp': '2025-09-30 21:16:44.782212', '_unique_id': '6f1a6746381e40b7b553b0161309aff4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.782 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.783 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.783 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:16:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:16:44.783 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-LiveMigrationTest-server-1363935032>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveMigrationTest-server-1363935032>]
Sep 30 21:16:44 compute-0 nova_compute[192810]: 2025-09-30 21:16:44.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:45 compute-0 podman[221065]: 2025-09-30 21:16:45.330512138 +0000 UTC m=+0.059823873 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 21:16:45 compute-0 podman[221064]: 2025-09-30 21:16:45.331666597 +0000 UTC m=+0.063520676 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:16:46 compute-0 nova_compute[192810]: 2025-09-30 21:16:46.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:46 compute-0 nova_compute[192810]: 2025-09-30 21:16:46.469 2 DEBUG nova.virt.libvirt.driver [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Check if temp file /var/lib/nova/instances/tmpfa4_vdvr exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Sep 30 21:16:46 compute-0 nova_compute[192810]: 2025-09-30 21:16:46.476 2 DEBUG oslo_concurrency.processutils [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:16:46 compute-0 nova_compute[192810]: 2025-09-30 21:16:46.574 2 DEBUG oslo_concurrency.processutils [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:16:46 compute-0 nova_compute[192810]: 2025-09-30 21:16:46.575 2 DEBUG oslo_concurrency.processutils [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:16:46 compute-0 nova_compute[192810]: 2025-09-30 21:16:46.657 2 DEBUG oslo_concurrency.processutils [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:16:46 compute-0 nova_compute[192810]: 2025-09-30 21:16:46.659 2 DEBUG nova.compute.manager [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpfa4_vdvr',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='252d5457-8837-4aa6-b309-c3139e8db7ed',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Sep 30 21:16:47 compute-0 nova_compute[192810]: 2025-09-30 21:16:47.986 2 DEBUG oslo_concurrency.processutils [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:16:48 compute-0 nova_compute[192810]: 2025-09-30 21:16:48.090 2 DEBUG oslo_concurrency.processutils [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk --force-share --output=json" returned: 0 in 0.104s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:16:48 compute-0 nova_compute[192810]: 2025-09-30 21:16:48.092 2 DEBUG oslo_concurrency.processutils [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:16:48 compute-0 nova_compute[192810]: 2025-09-30 21:16:48.156 2 DEBUG oslo_concurrency.processutils [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:16:49 compute-0 nova_compute[192810]: 2025-09-30 21:16:49.758 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759266994.7560122, 7a2c317e-5f8f-4337-9052-2231a2173f23 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:16:49 compute-0 nova_compute[192810]: 2025-09-30 21:16:49.759 2 INFO nova.compute.manager [-] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] VM Stopped (Lifecycle Event)
Sep 30 21:16:49 compute-0 nova_compute[192810]: 2025-09-30 21:16:49.784 2 DEBUG nova.compute.manager [None req-4637850b-b5eb-4911-b5aa-0d44765ef219 - - - - - -] [instance: 7a2c317e-5f8f-4337-9052-2231a2173f23] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:16:49 compute-0 nova_compute[192810]: 2025-09-30 21:16:49.803 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:50 compute-0 podman[221120]: 2025-09-30 21:16:50.368666704 +0000 UTC m=+0.109159203 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:16:51 compute-0 nova_compute[192810]: 2025-09-30 21:16:51.303 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:51 compute-0 sshd-session[221148]: Accepted publickey for nova from 192.168.122.101 port 39772 ssh2: ECDSA SHA256:MZb8WjUIxCo1ZPhM/oSWWpmJKsqmELiNET2dwGEt9P4
Sep 30 21:16:51 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Sep 30 21:16:51 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Sep 30 21:16:51 compute-0 systemd-logind[792]: New session 28 of user nova.
Sep 30 21:16:51 compute-0 podman[221150]: 2025-09-30 21:16:51.614758293 +0000 UTC m=+0.068076951 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250923, config_id=edpm, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:16:51 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Sep 30 21:16:51 compute-0 systemd[1]: Starting User Manager for UID 42436...
Sep 30 21:16:51 compute-0 systemd[221172]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:16:51 compute-0 systemd[221172]: Queued start job for default target Main User Target.
Sep 30 21:16:51 compute-0 systemd[221172]: Created slice User Application Slice.
Sep 30 21:16:51 compute-0 systemd[221172]: Started Mark boot as successful after the user session has run 2 minutes.
Sep 30 21:16:51 compute-0 systemd[221172]: Started Daily Cleanup of User's Temporary Directories.
Sep 30 21:16:51 compute-0 systemd[221172]: Reached target Paths.
Sep 30 21:16:51 compute-0 systemd[221172]: Reached target Timers.
Sep 30 21:16:51 compute-0 systemd[221172]: Starting D-Bus User Message Bus Socket...
Sep 30 21:16:51 compute-0 systemd[221172]: Starting Create User's Volatile Files and Directories...
Sep 30 21:16:51 compute-0 systemd[221172]: Finished Create User's Volatile Files and Directories.
Sep 30 21:16:51 compute-0 systemd[221172]: Listening on D-Bus User Message Bus Socket.
Sep 30 21:16:51 compute-0 systemd[221172]: Reached target Sockets.
Sep 30 21:16:51 compute-0 systemd[221172]: Reached target Basic System.
Sep 30 21:16:51 compute-0 systemd[221172]: Reached target Main User Target.
Sep 30 21:16:51 compute-0 systemd[221172]: Startup finished in 167ms.
Sep 30 21:16:51 compute-0 systemd[1]: Started User Manager for UID 42436.
Sep 30 21:16:51 compute-0 systemd[1]: Started Session 28 of User nova.
Sep 30 21:16:51 compute-0 sshd-session[221148]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:16:51 compute-0 sshd-session[221188]: Received disconnect from 192.168.122.101 port 39772:11: disconnected by user
Sep 30 21:16:51 compute-0 sshd-session[221188]: Disconnected from user nova 192.168.122.101 port 39772
Sep 30 21:16:51 compute-0 sshd-session[221148]: pam_unix(sshd:session): session closed for user nova
Sep 30 21:16:51 compute-0 systemd[1]: session-28.scope: Deactivated successfully.
Sep 30 21:16:51 compute-0 systemd-logind[792]: Session 28 logged out. Waiting for processes to exit.
Sep 30 21:16:51 compute-0 systemd-logind[792]: Removed session 28.
Sep 30 21:16:53 compute-0 nova_compute[192810]: 2025-09-30 21:16:53.664 2 DEBUG nova.compute.manager [req-069906ee-6319-4dd1-957d-62dde73c2281 req-bee412b7-88e0-4440-af2b-a213baf3ac39 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Received event network-vif-unplugged-70b5da71-314a-4c92-9db2-fb08b57a6736 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:16:53 compute-0 nova_compute[192810]: 2025-09-30 21:16:53.665 2 DEBUG oslo_concurrency.lockutils [req-069906ee-6319-4dd1-957d-62dde73c2281 req-bee412b7-88e0-4440-af2b-a213baf3ac39 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:53 compute-0 nova_compute[192810]: 2025-09-30 21:16:53.666 2 DEBUG oslo_concurrency.lockutils [req-069906ee-6319-4dd1-957d-62dde73c2281 req-bee412b7-88e0-4440-af2b-a213baf3ac39 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:53 compute-0 nova_compute[192810]: 2025-09-30 21:16:53.666 2 DEBUG oslo_concurrency.lockutils [req-069906ee-6319-4dd1-957d-62dde73c2281 req-bee412b7-88e0-4440-af2b-a213baf3ac39 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:53 compute-0 nova_compute[192810]: 2025-09-30 21:16:53.666 2 DEBUG nova.compute.manager [req-069906ee-6319-4dd1-957d-62dde73c2281 req-bee412b7-88e0-4440-af2b-a213baf3ac39 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] No waiting events found dispatching network-vif-unplugged-70b5da71-314a-4c92-9db2-fb08b57a6736 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:16:53 compute-0 nova_compute[192810]: 2025-09-30 21:16:53.666 2 DEBUG nova.compute.manager [req-069906ee-6319-4dd1-957d-62dde73c2281 req-bee412b7-88e0-4440-af2b-a213baf3ac39 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Received event network-vif-unplugged-70b5da71-314a-4c92-9db2-fb08b57a6736 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:16:54 compute-0 nova_compute[192810]: 2025-09-30 21:16:54.045 2 INFO nova.compute.manager [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Took 5.89 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Sep 30 21:16:54 compute-0 nova_compute[192810]: 2025-09-30 21:16:54.046 2 DEBUG nova.compute.manager [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:16:54 compute-0 nova_compute[192810]: 2025-09-30 21:16:54.063 2 DEBUG nova.compute.manager [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpfa4_vdvr',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='252d5457-8837-4aa6-b309-c3139e8db7ed',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(d9519eac-2da2-481a-8fef-10150b102ff6),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Sep 30 21:16:54 compute-0 nova_compute[192810]: 2025-09-30 21:16:54.087 2 DEBUG nova.objects.instance [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Lazy-loading 'migration_context' on Instance uuid 252d5457-8837-4aa6-b309-c3139e8db7ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:16:54 compute-0 nova_compute[192810]: 2025-09-30 21:16:54.089 2 DEBUG nova.virt.libvirt.driver [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Sep 30 21:16:54 compute-0 nova_compute[192810]: 2025-09-30 21:16:54.092 2 DEBUG nova.virt.libvirt.driver [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Sep 30 21:16:54 compute-0 nova_compute[192810]: 2025-09-30 21:16:54.093 2 DEBUG nova.virt.libvirt.driver [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Sep 30 21:16:54 compute-0 nova_compute[192810]: 2025-09-30 21:16:54.111 2 DEBUG nova.virt.libvirt.vif [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-09-30T21:16:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1363935032',display_name='tempest-LiveMigrationTest-server-1363935032',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1363935032',id=5,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:16:20Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='96460712956e4f038121397afa979163',ramdisk_id='',reservation_id='r-0x92j8qn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-2029274765',owner_user_name='tempest-LiveMigrationTest-2029274765-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:16:42Z,user_data=None,user_id='4b263d7c3e3141f999e8eabf49e8190c',uuid=252d5457-8837-4aa6-b309-c3139e8db7ed,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "70b5da71-314a-4c92-9db2-fb08b57a6736", "address": "fa:16:3e:a9:31:8d", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap70b5da71-31", "ovs_interfaceid": "70b5da71-314a-4c92-9db2-fb08b57a6736", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:16:54 compute-0 nova_compute[192810]: 2025-09-30 21:16:54.112 2 DEBUG nova.network.os_vif_util [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Converting VIF {"id": "70b5da71-314a-4c92-9db2-fb08b57a6736", "address": "fa:16:3e:a9:31:8d", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap70b5da71-31", "ovs_interfaceid": "70b5da71-314a-4c92-9db2-fb08b57a6736", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:16:54 compute-0 nova_compute[192810]: 2025-09-30 21:16:54.114 2 DEBUG nova.network.os_vif_util [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a9:31:8d,bridge_name='br-int',has_traffic_filtering=True,id=70b5da71-314a-4c92-9db2-fb08b57a6736,network=Network(16d40025-1087-460f-a42f-c007f6eff406),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70b5da71-31') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:16:54 compute-0 nova_compute[192810]: 2025-09-30 21:16:54.116 2 DEBUG nova.virt.libvirt.migration [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Updating guest XML with vif config: <interface type="ethernet">
Sep 30 21:16:54 compute-0 nova_compute[192810]:   <mac address="fa:16:3e:a9:31:8d"/>
Sep 30 21:16:54 compute-0 nova_compute[192810]:   <model type="virtio"/>
Sep 30 21:16:54 compute-0 nova_compute[192810]:   <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:16:54 compute-0 nova_compute[192810]:   <mtu size="1442"/>
Sep 30 21:16:54 compute-0 nova_compute[192810]:   <target dev="tap70b5da71-31"/>
Sep 30 21:16:54 compute-0 nova_compute[192810]: </interface>
Sep 30 21:16:54 compute-0 nova_compute[192810]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Sep 30 21:16:54 compute-0 nova_compute[192810]: 2025-09-30 21:16:54.117 2 DEBUG nova.virt.libvirt.driver [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Sep 30 21:16:54 compute-0 nova_compute[192810]: 2025-09-30 21:16:54.596 2 DEBUG nova.virt.libvirt.migration [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Sep 30 21:16:54 compute-0 nova_compute[192810]: 2025-09-30 21:16:54.597 2 INFO nova.virt.libvirt.migration [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Increasing downtime to 50 ms after 0 sec elapsed time
Sep 30 21:16:54 compute-0 nova_compute[192810]: 2025-09-30 21:16:54.698 2 INFO nova.virt.libvirt.driver [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Sep 30 21:16:54 compute-0 nova_compute[192810]: 2025-09-30 21:16:54.807 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:55 compute-0 nova_compute[192810]: 2025-09-30 21:16:55.202 2 DEBUG nova.virt.libvirt.migration [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Sep 30 21:16:55 compute-0 nova_compute[192810]: 2025-09-30 21:16:55.203 2 DEBUG nova.virt.libvirt.migration [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Sep 30 21:16:55 compute-0 nova_compute[192810]: 2025-09-30 21:16:55.707 2 DEBUG nova.virt.libvirt.migration [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Sep 30 21:16:55 compute-0 nova_compute[192810]: 2025-09-30 21:16:55.708 2 DEBUG nova.virt.libvirt.migration [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Sep 30 21:16:55 compute-0 nova_compute[192810]: 2025-09-30 21:16:55.799 2 DEBUG nova.compute.manager [req-3bacafdc-ea26-4815-b914-1403af4fc2f6 req-edf8d8a2-89ae-4d08-9e2c-9b345db7a4f2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Received event network-vif-plugged-70b5da71-314a-4c92-9db2-fb08b57a6736 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:16:55 compute-0 nova_compute[192810]: 2025-09-30 21:16:55.799 2 DEBUG oslo_concurrency.lockutils [req-3bacafdc-ea26-4815-b914-1403af4fc2f6 req-edf8d8a2-89ae-4d08-9e2c-9b345db7a4f2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:55 compute-0 nova_compute[192810]: 2025-09-30 21:16:55.800 2 DEBUG oslo_concurrency.lockutils [req-3bacafdc-ea26-4815-b914-1403af4fc2f6 req-edf8d8a2-89ae-4d08-9e2c-9b345db7a4f2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:55 compute-0 nova_compute[192810]: 2025-09-30 21:16:55.800 2 DEBUG oslo_concurrency.lockutils [req-3bacafdc-ea26-4815-b914-1403af4fc2f6 req-edf8d8a2-89ae-4d08-9e2c-9b345db7a4f2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:55 compute-0 nova_compute[192810]: 2025-09-30 21:16:55.800 2 DEBUG nova.compute.manager [req-3bacafdc-ea26-4815-b914-1403af4fc2f6 req-edf8d8a2-89ae-4d08-9e2c-9b345db7a4f2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] No waiting events found dispatching network-vif-plugged-70b5da71-314a-4c92-9db2-fb08b57a6736 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:16:55 compute-0 nova_compute[192810]: 2025-09-30 21:16:55.801 2 WARNING nova.compute.manager [req-3bacafdc-ea26-4815-b914-1403af4fc2f6 req-edf8d8a2-89ae-4d08-9e2c-9b345db7a4f2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Received unexpected event network-vif-plugged-70b5da71-314a-4c92-9db2-fb08b57a6736 for instance with vm_state active and task_state migrating.
Sep 30 21:16:55 compute-0 nova_compute[192810]: 2025-09-30 21:16:55.801 2 DEBUG nova.compute.manager [req-3bacafdc-ea26-4815-b914-1403af4fc2f6 req-edf8d8a2-89ae-4d08-9e2c-9b345db7a4f2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Received event network-changed-70b5da71-314a-4c92-9db2-fb08b57a6736 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:16:55 compute-0 nova_compute[192810]: 2025-09-30 21:16:55.801 2 DEBUG nova.compute.manager [req-3bacafdc-ea26-4815-b914-1403af4fc2f6 req-edf8d8a2-89ae-4d08-9e2c-9b345db7a4f2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Refreshing instance network info cache due to event network-changed-70b5da71-314a-4c92-9db2-fb08b57a6736. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:16:55 compute-0 nova_compute[192810]: 2025-09-30 21:16:55.802 2 DEBUG oslo_concurrency.lockutils [req-3bacafdc-ea26-4815-b914-1403af4fc2f6 req-edf8d8a2-89ae-4d08-9e2c-9b345db7a4f2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-252d5457-8837-4aa6-b309-c3139e8db7ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:16:55 compute-0 nova_compute[192810]: 2025-09-30 21:16:55.802 2 DEBUG oslo_concurrency.lockutils [req-3bacafdc-ea26-4815-b914-1403af4fc2f6 req-edf8d8a2-89ae-4d08-9e2c-9b345db7a4f2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-252d5457-8837-4aa6-b309-c3139e8db7ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:16:55 compute-0 nova_compute[192810]: 2025-09-30 21:16:55.802 2 DEBUG nova.network.neutron [req-3bacafdc-ea26-4815-b914-1403af4fc2f6 req-edf8d8a2-89ae-4d08-9e2c-9b345db7a4f2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Refreshing network info cache for port 70b5da71-314a-4c92-9db2-fb08b57a6736 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:16:56 compute-0 nova_compute[192810]: 2025-09-30 21:16:56.211 2 DEBUG nova.virt.libvirt.migration [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Sep 30 21:16:56 compute-0 nova_compute[192810]: 2025-09-30 21:16:56.212 2 DEBUG nova.virt.libvirt.migration [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Sep 30 21:16:56 compute-0 nova_compute[192810]: 2025-09-30 21:16:56.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:56 compute-0 podman[221200]: 2025-09-30 21:16:56.36869027 +0000 UTC m=+0.088533835 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 21:16:56 compute-0 nova_compute[192810]: 2025-09-30 21:16:56.716 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267016.7164652, 252d5457-8837-4aa6-b309-c3139e8db7ed => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:16:56 compute-0 nova_compute[192810]: 2025-09-30 21:16:56.717 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] VM Paused (Lifecycle Event)
Sep 30 21:16:56 compute-0 nova_compute[192810]: 2025-09-30 21:16:56.721 2 DEBUG nova.virt.libvirt.migration [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Sep 30 21:16:56 compute-0 nova_compute[192810]: 2025-09-30 21:16:56.721 2 DEBUG nova.virt.libvirt.migration [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Sep 30 21:16:56 compute-0 nova_compute[192810]: 2025-09-30 21:16:56.740 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:16:56 compute-0 nova_compute[192810]: 2025-09-30 21:16:56.745 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:16:56 compute-0 nova_compute[192810]: 2025-09-30 21:16:56.761 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] During sync_power_state the instance has a pending task (migrating). Skip.
Sep 30 21:16:56 compute-0 kernel: tap70b5da71-31 (unregistering): left promiscuous mode
Sep 30 21:16:56 compute-0 NetworkManager[51733]: <info>  [1759267016.8578] device (tap70b5da71-31): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:16:56 compute-0 ovn_controller[94912]: 2025-09-30T21:16:56Z|00044|binding|INFO|Releasing lport 70b5da71-314a-4c92-9db2-fb08b57a6736 from this chassis (sb_readonly=0)
Sep 30 21:16:56 compute-0 nova_compute[192810]: 2025-09-30 21:16:56.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:56 compute-0 ovn_controller[94912]: 2025-09-30T21:16:56Z|00045|binding|INFO|Setting lport 70b5da71-314a-4c92-9db2-fb08b57a6736 down in Southbound
Sep 30 21:16:56 compute-0 ovn_controller[94912]: 2025-09-30T21:16:56Z|00046|binding|INFO|Removing iface tap70b5da71-31 ovn-installed in OVS
Sep 30 21:16:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:56.885 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a9:31:8d 10.100.0.7'], port_security=['fa:16:3e:a9:31:8d 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '78438f8f-1ac2-4393-90b7-0b62e0665947'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '252d5457-8837-4aa6-b309-c3139e8db7ed', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-16d40025-1087-460f-a42f-c007f6eff406', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '96460712956e4f038121397afa979163', 'neutron:revision_number': '18', 'neutron:security_group_ids': '811ddc34-8450-4370-a409-1146bdb7efe9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7712a78f-5ca7-49dc-980c-dc4049ba5089, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=70b5da71-314a-4c92-9db2-fb08b57a6736) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:16:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:56.887 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 70b5da71-314a-4c92-9db2-fb08b57a6736 in datapath 16d40025-1087-460f-a42f-c007f6eff406 unbound from our chassis
Sep 30 21:16:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:56.889 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 16d40025-1087-460f-a42f-c007f6eff406, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:16:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:56.891 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[bb0708d7-beb1-44ae-9c0e-174a2c977106]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:56.892 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-16d40025-1087-460f-a42f-c007f6eff406 namespace which is not needed anymore
Sep 30 21:16:56 compute-0 nova_compute[192810]: 2025-09-30 21:16:56.899 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:56 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000005.scope: Deactivated successfully.
Sep 30 21:16:56 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000005.scope: Consumed 4.061s CPU time.
Sep 30 21:16:56 compute-0 systemd-machined[152794]: Machine qemu-2-instance-00000005 terminated.
Sep 30 21:16:57 compute-0 neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406[221028]: [NOTICE]   (221032) : haproxy version is 2.8.14-c23fe91
Sep 30 21:16:57 compute-0 neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406[221028]: [NOTICE]   (221032) : path to executable is /usr/sbin/haproxy
Sep 30 21:16:57 compute-0 neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406[221028]: [WARNING]  (221032) : Exiting Master process...
Sep 30 21:16:57 compute-0 neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406[221028]: [WARNING]  (221032) : Exiting Master process...
Sep 30 21:16:57 compute-0 neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406[221028]: [ALERT]    (221032) : Current worker (221034) exited with code 143 (Terminated)
Sep 30 21:16:57 compute-0 neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406[221028]: [WARNING]  (221032) : All workers exited. Exiting... (0)
Sep 30 21:16:57 compute-0 nova_compute[192810]: 2025-09-30 21:16:57.110 2 DEBUG nova.virt.libvirt.driver [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Sep 30 21:16:57 compute-0 nova_compute[192810]: 2025-09-30 21:16:57.110 2 DEBUG nova.virt.libvirt.driver [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Sep 30 21:16:57 compute-0 nova_compute[192810]: 2025-09-30 21:16:57.111 2 DEBUG nova.virt.libvirt.driver [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Sep 30 21:16:57 compute-0 systemd[1]: libpod-0c29e8c90c09790ee5a5922950e1c545d41b62569f16cad1f42c4416aa8ac5a1.scope: Deactivated successfully.
Sep 30 21:16:57 compute-0 podman[221246]: 2025-09-30 21:16:57.118675077 +0000 UTC m=+0.067810855 container died 0c29e8c90c09790ee5a5922950e1c545d41b62569f16cad1f42c4416aa8ac5a1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Sep 30 21:16:57 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0c29e8c90c09790ee5a5922950e1c545d41b62569f16cad1f42c4416aa8ac5a1-userdata-shm.mount: Deactivated successfully.
Sep 30 21:16:57 compute-0 systemd[1]: var-lib-containers-storage-overlay-857085b286bbb49661aa40b4c048c9cff2a47263f6ac7f92de8a3e4a109a23bb-merged.mount: Deactivated successfully.
Sep 30 21:16:57 compute-0 podman[221246]: 2025-09-30 21:16:57.162487197 +0000 UTC m=+0.111622905 container cleanup 0c29e8c90c09790ee5a5922950e1c545d41b62569f16cad1f42c4416aa8ac5a1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 21:16:57 compute-0 systemd[1]: libpod-conmon-0c29e8c90c09790ee5a5922950e1c545d41b62569f16cad1f42c4416aa8ac5a1.scope: Deactivated successfully.
Sep 30 21:16:57 compute-0 nova_compute[192810]: 2025-09-30 21:16:57.225 2 DEBUG nova.virt.libvirt.guest [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '252d5457-8837-4aa6-b309-c3139e8db7ed' (instance-00000005) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Sep 30 21:16:57 compute-0 nova_compute[192810]: 2025-09-30 21:16:57.226 2 INFO nova.virt.libvirt.driver [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Migration operation has completed
Sep 30 21:16:57 compute-0 nova_compute[192810]: 2025-09-30 21:16:57.226 2 INFO nova.compute.manager [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] _post_live_migration() is started..
Sep 30 21:16:57 compute-0 podman[221293]: 2025-09-30 21:16:57.249734129 +0000 UTC m=+0.054461249 container remove 0c29e8c90c09790ee5a5922950e1c545d41b62569f16cad1f42c4416aa8ac5a1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:16:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:57.256 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[dbbee6d3-5173-4a3e-a8b0-58b28b1877f3]: (4, ('Tue Sep 30 09:16:57 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406 (0c29e8c90c09790ee5a5922950e1c545d41b62569f16cad1f42c4416aa8ac5a1)\n0c29e8c90c09790ee5a5922950e1c545d41b62569f16cad1f42c4416aa8ac5a1\nTue Sep 30 09:16:57 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406 (0c29e8c90c09790ee5a5922950e1c545d41b62569f16cad1f42c4416aa8ac5a1)\n0c29e8c90c09790ee5a5922950e1c545d41b62569f16cad1f42c4416aa8ac5a1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:57.258 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[06fdfd4e-6065-4007-af4a-bc1e6716fddb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:57.260 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap16d40025-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:16:57 compute-0 kernel: tap16d40025-10: left promiscuous mode
Sep 30 21:16:57 compute-0 nova_compute[192810]: 2025-09-30 21:16:57.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:57 compute-0 nova_compute[192810]: 2025-09-30 21:16:57.278 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:57.282 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[6508218f-58fe-41fc-866b-a31e9de2dbd1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:57.320 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[89932a5e-b296-4528-8025-c57e7c22c86d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:57.321 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e78ab26b-e114-408b-a917-6095b9986dd6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:57.340 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[8fff9014-d697-4ab8-8ff0-fa1458167d04]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 369330, 'reachable_time': 30174, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221312, 'error': None, 'target': 'ovnmeta-16d40025-1087-460f-a42f-c007f6eff406', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:57 compute-0 systemd[1]: run-netns-ovnmeta\x2d16d40025\x2d1087\x2d460f\x2da42f\x2dc007f6eff406.mount: Deactivated successfully.
Sep 30 21:16:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:57.346 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-16d40025-1087-460f-a42f-c007f6eff406 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:16:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:16:57.346 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[85b39fd4-a3c0-4674-8ae0-e9e4f910a91c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:16:57 compute-0 nova_compute[192810]: 2025-09-30 21:16:57.725 2 DEBUG nova.network.neutron [req-3bacafdc-ea26-4815-b914-1403af4fc2f6 req-edf8d8a2-89ae-4d08-9e2c-9b345db7a4f2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Updated VIF entry in instance network info cache for port 70b5da71-314a-4c92-9db2-fb08b57a6736. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:16:57 compute-0 nova_compute[192810]: 2025-09-30 21:16:57.726 2 DEBUG nova.network.neutron [req-3bacafdc-ea26-4815-b914-1403af4fc2f6 req-edf8d8a2-89ae-4d08-9e2c-9b345db7a4f2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Updating instance_info_cache with network_info: [{"id": "70b5da71-314a-4c92-9db2-fb08b57a6736", "address": "fa:16:3e:a9:31:8d", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70b5da71-31", "ovs_interfaceid": "70b5da71-314a-4c92-9db2-fb08b57a6736", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true, "migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:16:57 compute-0 nova_compute[192810]: 2025-09-30 21:16:57.758 2 DEBUG oslo_concurrency.lockutils [req-3bacafdc-ea26-4815-b914-1403af4fc2f6 req-edf8d8a2-89ae-4d08-9e2c-9b345db7a4f2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-252d5457-8837-4aa6-b309-c3139e8db7ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:16:57 compute-0 nova_compute[192810]: 2025-09-30 21:16:57.999 2 DEBUG nova.compute.manager [req-24ff9b23-7c56-414c-a877-9e307adc5290 req-f35e9437-2887-466e-b713-0b1da813a40e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Received event network-vif-unplugged-70b5da71-314a-4c92-9db2-fb08b57a6736 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:16:58 compute-0 nova_compute[192810]: 2025-09-30 21:16:57.999 2 DEBUG oslo_concurrency.lockutils [req-24ff9b23-7c56-414c-a877-9e307adc5290 req-f35e9437-2887-466e-b713-0b1da813a40e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:58 compute-0 nova_compute[192810]: 2025-09-30 21:16:58.000 2 DEBUG oslo_concurrency.lockutils [req-24ff9b23-7c56-414c-a877-9e307adc5290 req-f35e9437-2887-466e-b713-0b1da813a40e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:58 compute-0 nova_compute[192810]: 2025-09-30 21:16:58.000 2 DEBUG oslo_concurrency.lockutils [req-24ff9b23-7c56-414c-a877-9e307adc5290 req-f35e9437-2887-466e-b713-0b1da813a40e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:58 compute-0 nova_compute[192810]: 2025-09-30 21:16:58.000 2 DEBUG nova.compute.manager [req-24ff9b23-7c56-414c-a877-9e307adc5290 req-f35e9437-2887-466e-b713-0b1da813a40e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] No waiting events found dispatching network-vif-unplugged-70b5da71-314a-4c92-9db2-fb08b57a6736 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:16:58 compute-0 nova_compute[192810]: 2025-09-30 21:16:58.001 2 DEBUG nova.compute.manager [req-24ff9b23-7c56-414c-a877-9e307adc5290 req-f35e9437-2887-466e-b713-0b1da813a40e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Received event network-vif-unplugged-70b5da71-314a-4c92-9db2-fb08b57a6736 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:16:58 compute-0 nova_compute[192810]: 2025-09-30 21:16:58.001 2 DEBUG nova.compute.manager [req-24ff9b23-7c56-414c-a877-9e307adc5290 req-f35e9437-2887-466e-b713-0b1da813a40e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Received event network-vif-plugged-70b5da71-314a-4c92-9db2-fb08b57a6736 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:16:58 compute-0 nova_compute[192810]: 2025-09-30 21:16:58.001 2 DEBUG oslo_concurrency.lockutils [req-24ff9b23-7c56-414c-a877-9e307adc5290 req-f35e9437-2887-466e-b713-0b1da813a40e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:58 compute-0 nova_compute[192810]: 2025-09-30 21:16:58.002 2 DEBUG oslo_concurrency.lockutils [req-24ff9b23-7c56-414c-a877-9e307adc5290 req-f35e9437-2887-466e-b713-0b1da813a40e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:58 compute-0 nova_compute[192810]: 2025-09-30 21:16:58.002 2 DEBUG oslo_concurrency.lockutils [req-24ff9b23-7c56-414c-a877-9e307adc5290 req-f35e9437-2887-466e-b713-0b1da813a40e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:58 compute-0 nova_compute[192810]: 2025-09-30 21:16:58.003 2 DEBUG nova.compute.manager [req-24ff9b23-7c56-414c-a877-9e307adc5290 req-f35e9437-2887-466e-b713-0b1da813a40e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] No waiting events found dispatching network-vif-plugged-70b5da71-314a-4c92-9db2-fb08b57a6736 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:16:58 compute-0 nova_compute[192810]: 2025-09-30 21:16:58.003 2 WARNING nova.compute.manager [req-24ff9b23-7c56-414c-a877-9e307adc5290 req-f35e9437-2887-466e-b713-0b1da813a40e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Received unexpected event network-vif-plugged-70b5da71-314a-4c92-9db2-fb08b57a6736 for instance with vm_state active and task_state migrating.
Sep 30 21:16:58 compute-0 nova_compute[192810]: 2025-09-30 21:16:58.087 2 DEBUG nova.network.neutron [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Activated binding for port 70b5da71-314a-4c92-9db2-fb08b57a6736 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Sep 30 21:16:58 compute-0 nova_compute[192810]: 2025-09-30 21:16:58.088 2 DEBUG nova.compute.manager [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "70b5da71-314a-4c92-9db2-fb08b57a6736", "address": "fa:16:3e:a9:31:8d", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70b5da71-31", "ovs_interfaceid": "70b5da71-314a-4c92-9db2-fb08b57a6736", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Sep 30 21:16:58 compute-0 nova_compute[192810]: 2025-09-30 21:16:58.089 2 DEBUG nova.virt.libvirt.vif [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-09-30T21:16:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1363935032',display_name='tempest-LiveMigrationTest-server-1363935032',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-livemigrationtest-server-1363935032',id=5,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:16:20Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='96460712956e4f038121397afa979163',ramdisk_id='',reservation_id='r-0x92j8qn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='
virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-2029274765',owner_user_name='tempest-LiveMigrationTest-2029274765-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:16:45Z,user_data=None,user_id='4b263d7c3e3141f999e8eabf49e8190c',uuid=252d5457-8837-4aa6-b309-c3139e8db7ed,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "70b5da71-314a-4c92-9db2-fb08b57a6736", "address": "fa:16:3e:a9:31:8d", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70b5da71-31", "ovs_interfaceid": "70b5da71-314a-4c92-9db2-fb08b57a6736", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:16:58 compute-0 nova_compute[192810]: 2025-09-30 21:16:58.089 2 DEBUG nova.network.os_vif_util [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Converting VIF {"id": "70b5da71-314a-4c92-9db2-fb08b57a6736", "address": "fa:16:3e:a9:31:8d", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70b5da71-31", "ovs_interfaceid": "70b5da71-314a-4c92-9db2-fb08b57a6736", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:16:58 compute-0 nova_compute[192810]: 2025-09-30 21:16:58.090 2 DEBUG nova.network.os_vif_util [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a9:31:8d,bridge_name='br-int',has_traffic_filtering=True,id=70b5da71-314a-4c92-9db2-fb08b57a6736,network=Network(16d40025-1087-460f-a42f-c007f6eff406),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70b5da71-31') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:16:58 compute-0 nova_compute[192810]: 2025-09-30 21:16:58.091 2 DEBUG os_vif [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a9:31:8d,bridge_name='br-int',has_traffic_filtering=True,id=70b5da71-314a-4c92-9db2-fb08b57a6736,network=Network(16d40025-1087-460f-a42f-c007f6eff406),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70b5da71-31') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:16:58 compute-0 nova_compute[192810]: 2025-09-30 21:16:58.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:16:58 compute-0 nova_compute[192810]: 2025-09-30 21:16:58.094 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap70b5da71-31, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:16:58 compute-0 nova_compute[192810]: 2025-09-30 21:16:58.098 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:16:58 compute-0 nova_compute[192810]: 2025-09-30 21:16:58.103 2 INFO os_vif [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a9:31:8d,bridge_name='br-int',has_traffic_filtering=True,id=70b5da71-314a-4c92-9db2-fb08b57a6736,network=Network(16d40025-1087-460f-a42f-c007f6eff406),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70b5da71-31')
Sep 30 21:16:58 compute-0 nova_compute[192810]: 2025-09-30 21:16:58.104 2 DEBUG oslo_concurrency.lockutils [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:16:58 compute-0 nova_compute[192810]: 2025-09-30 21:16:58.104 2 DEBUG oslo_concurrency.lockutils [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:16:58 compute-0 nova_compute[192810]: 2025-09-30 21:16:58.105 2 DEBUG oslo_concurrency.lockutils [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:16:58 compute-0 nova_compute[192810]: 2025-09-30 21:16:58.105 2 DEBUG nova.compute.manager [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Sep 30 21:16:58 compute-0 nova_compute[192810]: 2025-09-30 21:16:58.106 2 INFO nova.virt.libvirt.driver [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Deleting instance files /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed_del
Sep 30 21:16:58 compute-0 nova_compute[192810]: 2025-09-30 21:16:58.107 2 INFO nova.virt.libvirt.driver [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Deletion of /var/lib/nova/instances/252d5457-8837-4aa6-b309-c3139e8db7ed_del complete
Sep 30 21:16:59 compute-0 unix_chkpwd[221315]: password check failed for user (root)
Sep 30 21:16:59 compute-0 sshd-session[221313]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=45.81.23.80  user=root
Sep 30 21:17:00 compute-0 nova_compute[192810]: 2025-09-30 21:17:00.144 2 DEBUG nova.compute.manager [req-a746aa85-fc6b-4745-9cd4-1f16af758ede req-79015ed5-2885-4a6c-ba99-fce6dfe04507 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Received event network-vif-plugged-70b5da71-314a-4c92-9db2-fb08b57a6736 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:17:00 compute-0 nova_compute[192810]: 2025-09-30 21:17:00.144 2 DEBUG oslo_concurrency.lockutils [req-a746aa85-fc6b-4745-9cd4-1f16af758ede req-79015ed5-2885-4a6c-ba99-fce6dfe04507 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:00 compute-0 nova_compute[192810]: 2025-09-30 21:17:00.144 2 DEBUG oslo_concurrency.lockutils [req-a746aa85-fc6b-4745-9cd4-1f16af758ede req-79015ed5-2885-4a6c-ba99-fce6dfe04507 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:00 compute-0 nova_compute[192810]: 2025-09-30 21:17:00.145 2 DEBUG oslo_concurrency.lockutils [req-a746aa85-fc6b-4745-9cd4-1f16af758ede req-79015ed5-2885-4a6c-ba99-fce6dfe04507 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:00 compute-0 nova_compute[192810]: 2025-09-30 21:17:00.145 2 DEBUG nova.compute.manager [req-a746aa85-fc6b-4745-9cd4-1f16af758ede req-79015ed5-2885-4a6c-ba99-fce6dfe04507 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] No waiting events found dispatching network-vif-plugged-70b5da71-314a-4c92-9db2-fb08b57a6736 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:17:00 compute-0 nova_compute[192810]: 2025-09-30 21:17:00.145 2 WARNING nova.compute.manager [req-a746aa85-fc6b-4745-9cd4-1f16af758ede req-79015ed5-2885-4a6c-ba99-fce6dfe04507 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Received unexpected event network-vif-plugged-70b5da71-314a-4c92-9db2-fb08b57a6736 for instance with vm_state active and task_state migrating.
Sep 30 21:17:00 compute-0 nova_compute[192810]: 2025-09-30 21:17:00.146 2 DEBUG nova.compute.manager [req-a746aa85-fc6b-4745-9cd4-1f16af758ede req-79015ed5-2885-4a6c-ba99-fce6dfe04507 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Received event network-vif-unplugged-70b5da71-314a-4c92-9db2-fb08b57a6736 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:17:00 compute-0 nova_compute[192810]: 2025-09-30 21:17:00.146 2 DEBUG oslo_concurrency.lockutils [req-a746aa85-fc6b-4745-9cd4-1f16af758ede req-79015ed5-2885-4a6c-ba99-fce6dfe04507 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:00 compute-0 nova_compute[192810]: 2025-09-30 21:17:00.146 2 DEBUG oslo_concurrency.lockutils [req-a746aa85-fc6b-4745-9cd4-1f16af758ede req-79015ed5-2885-4a6c-ba99-fce6dfe04507 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:00 compute-0 nova_compute[192810]: 2025-09-30 21:17:00.146 2 DEBUG oslo_concurrency.lockutils [req-a746aa85-fc6b-4745-9cd4-1f16af758ede req-79015ed5-2885-4a6c-ba99-fce6dfe04507 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:00 compute-0 nova_compute[192810]: 2025-09-30 21:17:00.146 2 DEBUG nova.compute.manager [req-a746aa85-fc6b-4745-9cd4-1f16af758ede req-79015ed5-2885-4a6c-ba99-fce6dfe04507 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] No waiting events found dispatching network-vif-unplugged-70b5da71-314a-4c92-9db2-fb08b57a6736 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:17:00 compute-0 nova_compute[192810]: 2025-09-30 21:17:00.146 2 DEBUG nova.compute.manager [req-a746aa85-fc6b-4745-9cd4-1f16af758ede req-79015ed5-2885-4a6c-ba99-fce6dfe04507 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Received event network-vif-unplugged-70b5da71-314a-4c92-9db2-fb08b57a6736 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:17:00 compute-0 nova_compute[192810]: 2025-09-30 21:17:00.147 2 DEBUG nova.compute.manager [req-a746aa85-fc6b-4745-9cd4-1f16af758ede req-79015ed5-2885-4a6c-ba99-fce6dfe04507 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Received event network-vif-plugged-70b5da71-314a-4c92-9db2-fb08b57a6736 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:17:00 compute-0 nova_compute[192810]: 2025-09-30 21:17:00.147 2 DEBUG oslo_concurrency.lockutils [req-a746aa85-fc6b-4745-9cd4-1f16af758ede req-79015ed5-2885-4a6c-ba99-fce6dfe04507 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:00 compute-0 nova_compute[192810]: 2025-09-30 21:17:00.147 2 DEBUG oslo_concurrency.lockutils [req-a746aa85-fc6b-4745-9cd4-1f16af758ede req-79015ed5-2885-4a6c-ba99-fce6dfe04507 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:00 compute-0 nova_compute[192810]: 2025-09-30 21:17:00.147 2 DEBUG oslo_concurrency.lockutils [req-a746aa85-fc6b-4745-9cd4-1f16af758ede req-79015ed5-2885-4a6c-ba99-fce6dfe04507 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:00 compute-0 nova_compute[192810]: 2025-09-30 21:17:00.147 2 DEBUG nova.compute.manager [req-a746aa85-fc6b-4745-9cd4-1f16af758ede req-79015ed5-2885-4a6c-ba99-fce6dfe04507 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] No waiting events found dispatching network-vif-plugged-70b5da71-314a-4c92-9db2-fb08b57a6736 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:17:00 compute-0 nova_compute[192810]: 2025-09-30 21:17:00.147 2 WARNING nova.compute.manager [req-a746aa85-fc6b-4745-9cd4-1f16af758ede req-79015ed5-2885-4a6c-ba99-fce6dfe04507 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Received unexpected event network-vif-plugged-70b5da71-314a-4c92-9db2-fb08b57a6736 for instance with vm_state active and task_state migrating.
Sep 30 21:17:00 compute-0 nova_compute[192810]: 2025-09-30 21:17:00.148 2 DEBUG nova.compute.manager [req-a746aa85-fc6b-4745-9cd4-1f16af758ede req-79015ed5-2885-4a6c-ba99-fce6dfe04507 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Received event network-vif-plugged-70b5da71-314a-4c92-9db2-fb08b57a6736 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:17:00 compute-0 nova_compute[192810]: 2025-09-30 21:17:00.148 2 DEBUG oslo_concurrency.lockutils [req-a746aa85-fc6b-4745-9cd4-1f16af758ede req-79015ed5-2885-4a6c-ba99-fce6dfe04507 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:00 compute-0 nova_compute[192810]: 2025-09-30 21:17:00.148 2 DEBUG oslo_concurrency.lockutils [req-a746aa85-fc6b-4745-9cd4-1f16af758ede req-79015ed5-2885-4a6c-ba99-fce6dfe04507 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:00 compute-0 nova_compute[192810]: 2025-09-30 21:17:00.148 2 DEBUG oslo_concurrency.lockutils [req-a746aa85-fc6b-4745-9cd4-1f16af758ede req-79015ed5-2885-4a6c-ba99-fce6dfe04507 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:00 compute-0 nova_compute[192810]: 2025-09-30 21:17:00.148 2 DEBUG nova.compute.manager [req-a746aa85-fc6b-4745-9cd4-1f16af758ede req-79015ed5-2885-4a6c-ba99-fce6dfe04507 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] No waiting events found dispatching network-vif-plugged-70b5da71-314a-4c92-9db2-fb08b57a6736 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:17:00 compute-0 nova_compute[192810]: 2025-09-30 21:17:00.148 2 WARNING nova.compute.manager [req-a746aa85-fc6b-4745-9cd4-1f16af758ede req-79015ed5-2885-4a6c-ba99-fce6dfe04507 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Received unexpected event network-vif-plugged-70b5da71-314a-4c92-9db2-fb08b57a6736 for instance with vm_state active and task_state migrating.
Sep 30 21:17:00 compute-0 sshd-session[221313]: Failed password for root from 45.81.23.80 port 47726 ssh2
Sep 30 21:17:01 compute-0 sshd-session[221313]: Received disconnect from 45.81.23.80 port 47726:11: Bye Bye [preauth]
Sep 30 21:17:01 compute-0 sshd-session[221313]: Disconnected from authenticating user root 45.81.23.80 port 47726 [preauth]
Sep 30 21:17:01 compute-0 nova_compute[192810]: 2025-09-30 21:17:01.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:01 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Sep 30 21:17:01 compute-0 systemd[221172]: Activating special unit Exit the Session...
Sep 30 21:17:01 compute-0 systemd[221172]: Stopped target Main User Target.
Sep 30 21:17:01 compute-0 systemd[221172]: Stopped target Basic System.
Sep 30 21:17:01 compute-0 systemd[221172]: Stopped target Paths.
Sep 30 21:17:01 compute-0 systemd[221172]: Stopped target Sockets.
Sep 30 21:17:01 compute-0 systemd[221172]: Stopped target Timers.
Sep 30 21:17:01 compute-0 systemd[221172]: Stopped Mark boot as successful after the user session has run 2 minutes.
Sep 30 21:17:01 compute-0 systemd[221172]: Stopped Daily Cleanup of User's Temporary Directories.
Sep 30 21:17:01 compute-0 systemd[221172]: Closed D-Bus User Message Bus Socket.
Sep 30 21:17:01 compute-0 systemd[221172]: Stopped Create User's Volatile Files and Directories.
Sep 30 21:17:01 compute-0 systemd[221172]: Removed slice User Application Slice.
Sep 30 21:17:01 compute-0 systemd[221172]: Reached target Shutdown.
Sep 30 21:17:01 compute-0 systemd[221172]: Finished Exit the Session.
Sep 30 21:17:01 compute-0 systemd[221172]: Reached target Exit the Session.
Sep 30 21:17:02 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Sep 30 21:17:02 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Sep 30 21:17:02 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Sep 30 21:17:02 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Sep 30 21:17:02 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Sep 30 21:17:02 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Sep 30 21:17:02 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Sep 30 21:17:03 compute-0 nova_compute[192810]: 2025-09-30 21:17:03.018 2 DEBUG oslo_concurrency.lockutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Acquiring lock "a99700e2-8d2c-4da5-a64d-faee03f5c83d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:03 compute-0 nova_compute[192810]: 2025-09-30 21:17:03.020 2 DEBUG oslo_concurrency.lockutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "a99700e2-8d2c-4da5-a64d-faee03f5c83d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:03 compute-0 nova_compute[192810]: 2025-09-30 21:17:03.040 2 DEBUG nova.compute.manager [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:17:03 compute-0 nova_compute[192810]: 2025-09-30 21:17:03.096 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:03 compute-0 nova_compute[192810]: 2025-09-30 21:17:03.170 2 DEBUG oslo_concurrency.lockutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:03 compute-0 nova_compute[192810]: 2025-09-30 21:17:03.170 2 DEBUG oslo_concurrency.lockutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:03 compute-0 nova_compute[192810]: 2025-09-30 21:17:03.180 2 DEBUG nova.virt.hardware [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:17:03 compute-0 nova_compute[192810]: 2025-09-30 21:17:03.181 2 INFO nova.compute.claims [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:17:03 compute-0 podman[221317]: 2025-09-30 21:17:03.347513779 +0000 UTC m=+0.079196690 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:17:03 compute-0 nova_compute[192810]: 2025-09-30 21:17:03.345 2 DEBUG nova.compute.provider_tree [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:17:03 compute-0 podman[221318]: 2025-09-30 21:17:03.3539407 +0000 UTC m=+0.083279272 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_id=edpm, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, vendor=Red Hat, Inc., name=ubi9-minimal, managed_by=edpm_ansible, vcs-type=git)
Sep 30 21:17:03 compute-0 nova_compute[192810]: 2025-09-30 21:17:03.361 2 DEBUG nova.scheduler.client.report [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:17:03 compute-0 nova_compute[192810]: 2025-09-30 21:17:03.390 2 DEBUG oslo_concurrency.lockutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.220s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:03 compute-0 nova_compute[192810]: 2025-09-30 21:17:03.403 2 DEBUG oslo_concurrency.lockutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Acquiring lock "eb3c2680-7766-4b45-bf3b-d565d1ab9164" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:03 compute-0 nova_compute[192810]: 2025-09-30 21:17:03.403 2 DEBUG oslo_concurrency.lockutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "eb3c2680-7766-4b45-bf3b-d565d1ab9164" acquired by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:03 compute-0 nova_compute[192810]: 2025-09-30 21:17:03.428 2 DEBUG nova.compute.manager [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] No node specified, defaulting to compute-0.ctlplane.example.com _get_nodename /usr/lib/python3.9/site-packages/nova/compute/manager.py:10505
Sep 30 21:17:03 compute-0 nova_compute[192810]: 2025-09-30 21:17:03.468 2 DEBUG oslo_concurrency.lockutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "eb3c2680-7766-4b45-bf3b-d565d1ab9164" "released" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: held 0.066s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:03 compute-0 nova_compute[192810]: 2025-09-30 21:17:03.469 2 DEBUG nova.compute.manager [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:17:03 compute-0 nova_compute[192810]: 2025-09-30 21:17:03.547 2 DEBUG nova.compute.manager [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:17:03 compute-0 nova_compute[192810]: 2025-09-30 21:17:03.547 2 DEBUG nova.network.neutron [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:17:03 compute-0 nova_compute[192810]: 2025-09-30 21:17:03.668 2 INFO nova.virt.libvirt.driver [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:17:03 compute-0 nova_compute[192810]: 2025-09-30 21:17:03.678 2 DEBUG oslo_concurrency.lockutils [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Acquiring lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:03 compute-0 nova_compute[192810]: 2025-09-30 21:17:03.679 2 DEBUG oslo_concurrency.lockutils [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:03 compute-0 nova_compute[192810]: 2025-09-30 21:17:03.680 2 DEBUG oslo_concurrency.lockutils [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Lock "252d5457-8837-4aa6-b309-c3139e8db7ed-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:03 compute-0 nova_compute[192810]: 2025-09-30 21:17:03.710 2 DEBUG nova.compute.manager [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:17:03 compute-0 nova_compute[192810]: 2025-09-30 21:17:03.717 2 DEBUG oslo_concurrency.lockutils [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:03 compute-0 nova_compute[192810]: 2025-09-30 21:17:03.718 2 DEBUG oslo_concurrency.lockutils [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:03 compute-0 nova_compute[192810]: 2025-09-30 21:17:03.718 2 DEBUG oslo_concurrency.lockutils [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:03 compute-0 nova_compute[192810]: 2025-09-30 21:17:03.719 2 DEBUG nova.compute.resource_tracker [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:17:03 compute-0 nova_compute[192810]: 2025-09-30 21:17:03.827 2 DEBUG nova.compute.manager [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:17:03 compute-0 nova_compute[192810]: 2025-09-30 21:17:03.829 2 DEBUG nova.virt.libvirt.driver [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:17:03 compute-0 nova_compute[192810]: 2025-09-30 21:17:03.830 2 INFO nova.virt.libvirt.driver [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] Creating image(s)
Sep 30 21:17:03 compute-0 nova_compute[192810]: 2025-09-30 21:17:03.831 2 DEBUG oslo_concurrency.lockutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Acquiring lock "/var/lib/nova/instances/a99700e2-8d2c-4da5-a64d-faee03f5c83d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:03 compute-0 nova_compute[192810]: 2025-09-30 21:17:03.832 2 DEBUG oslo_concurrency.lockutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "/var/lib/nova/instances/a99700e2-8d2c-4da5-a64d-faee03f5c83d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:03 compute-0 nova_compute[192810]: 2025-09-30 21:17:03.833 2 DEBUG oslo_concurrency.lockutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "/var/lib/nova/instances/a99700e2-8d2c-4da5-a64d-faee03f5c83d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:03 compute-0 nova_compute[192810]: 2025-09-30 21:17:03.852 2 DEBUG oslo_concurrency.processutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:03 compute-0 nova_compute[192810]: 2025-09-30 21:17:03.958 2 DEBUG oslo_concurrency.processutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.107s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:03 compute-0 nova_compute[192810]: 2025-09-30 21:17:03.959 2 DEBUG oslo_concurrency.lockutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:03 compute-0 nova_compute[192810]: 2025-09-30 21:17:03.960 2 DEBUG oslo_concurrency.lockutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:03 compute-0 nova_compute[192810]: 2025-09-30 21:17:03.970 2 DEBUG oslo_concurrency.processutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.036 2 DEBUG oslo_concurrency.processutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.037 2 DEBUG oslo_concurrency.processutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/a99700e2-8d2c-4da5-a64d-faee03f5c83d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.060 2 WARNING nova.virt.libvirt.driver [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.062 2 DEBUG nova.compute.resource_tracker [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5789MB free_disk=73.46663665771484GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", 
"product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.062 2 DEBUG oslo_concurrency.lockutils [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.063 2 DEBUG oslo_concurrency.lockutils [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.074 2 DEBUG oslo_concurrency.processutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/a99700e2-8d2c-4da5-a64d-faee03f5c83d/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.074 2 DEBUG oslo_concurrency.lockutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.075 2 DEBUG oslo_concurrency.processutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.131 2 DEBUG oslo_concurrency.processutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.133 2 DEBUG nova.virt.disk.api [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Checking if we can resize image /var/lib/nova/instances/a99700e2-8d2c-4da5-a64d-faee03f5c83d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.133 2 DEBUG oslo_concurrency.processutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a99700e2-8d2c-4da5-a64d-faee03f5c83d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.162 2 DEBUG nova.compute.resource_tracker [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Migration for instance 252d5457-8837-4aa6-b309-c3139e8db7ed refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.194 2 DEBUG oslo_concurrency.processutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a99700e2-8d2c-4da5-a64d-faee03f5c83d/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.195 2 DEBUG nova.virt.disk.api [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Cannot resize image /var/lib/nova/instances/a99700e2-8d2c-4da5-a64d-faee03f5c83d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.196 2 DEBUG nova.objects.instance [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lazy-loading 'migration_context' on Instance uuid a99700e2-8d2c-4da5-a64d-faee03f5c83d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.203 2 DEBUG nova.compute.resource_tracker [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.216 2 DEBUG nova.virt.libvirt.driver [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.216 2 DEBUG nova.virt.libvirt.driver [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] Ensure instance console log exists: /var/lib/nova/instances/a99700e2-8d2c-4da5-a64d-faee03f5c83d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.217 2 DEBUG oslo_concurrency.lockutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.218 2 DEBUG oslo_concurrency.lockutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.218 2 DEBUG oslo_concurrency.lockutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.234 2 DEBUG nova.compute.resource_tracker [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Migration d9519eac-2da2-481a-8fef-10150b102ff6 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.235 2 DEBUG nova.compute.resource_tracker [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Instance a99700e2-8d2c-4da5-a64d-faee03f5c83d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.235 2 DEBUG nova.compute.resource_tracker [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.236 2 DEBUG nova.compute.resource_tracker [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.296 2 DEBUG nova.compute.provider_tree [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.299 2 DEBUG nova.network.neutron [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.299 2 DEBUG nova.compute.manager [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.302 2 DEBUG nova.virt.libvirt.driver [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.306 2 WARNING nova.virt.libvirt.driver [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.310 2 DEBUG nova.virt.libvirt.host [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.311 2 DEBUG nova.virt.libvirt.host [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.313 2 DEBUG nova.scheduler.client.report [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.322 2 DEBUG nova.virt.libvirt.host [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.323 2 DEBUG nova.virt.libvirt.host [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.325 2 DEBUG nova.virt.libvirt.driver [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.325 2 DEBUG nova.virt.hardware [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.326 2 DEBUG nova.virt.hardware [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.326 2 DEBUG nova.virt.hardware [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.326 2 DEBUG nova.virt.hardware [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.326 2 DEBUG nova.virt.hardware [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.327 2 DEBUG nova.virt.hardware [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.327 2 DEBUG nova.virt.hardware [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.327 2 DEBUG nova.virt.hardware [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.328 2 DEBUG nova.virt.hardware [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.328 2 DEBUG nova.virt.hardware [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.328 2 DEBUG nova.virt.hardware [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.333 2 DEBUG nova.objects.instance [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lazy-loading 'pci_devices' on Instance uuid a99700e2-8d2c-4da5-a64d-faee03f5c83d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.343 2 DEBUG nova.compute.resource_tracker [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.343 2 DEBUG oslo_concurrency.lockutils [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.280s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.349 2 DEBUG nova.virt.libvirt.driver [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:17:04 compute-0 nova_compute[192810]:   <uuid>a99700e2-8d2c-4da5-a64d-faee03f5c83d</uuid>
Sep 30 21:17:04 compute-0 nova_compute[192810]:   <name>instance-00000009</name>
Sep 30 21:17:04 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:17:04 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:17:04 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:17:04 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:17:04 compute-0 nova_compute[192810]:       <nova:name>tempest-ServersOnMultiNodesTest-server-1032185460-1</nova:name>
Sep 30 21:17:04 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:17:04</nova:creationTime>
Sep 30 21:17:04 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:17:04 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:17:04 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:17:04 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:17:04 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:17:04 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:17:04 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:17:04 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:17:04 compute-0 nova_compute[192810]:         <nova:user uuid="a283310a99174bb794a56e8355b40a03">tempest-ServersOnMultiNodesTest-102021708-project-member</nova:user>
Sep 30 21:17:04 compute-0 nova_compute[192810]:         <nova:project uuid="0d752ea56b394666bd18bda096b07530">tempest-ServersOnMultiNodesTest-102021708</nova:project>
Sep 30 21:17:04 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:17:04 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:17:04 compute-0 nova_compute[192810]:       <nova:ports/>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:17:04 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:17:04 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:17:04 compute-0 nova_compute[192810]:     <system>
Sep 30 21:17:04 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:17:04 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:17:04 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:17:04 compute-0 nova_compute[192810]:       <entry name="serial">a99700e2-8d2c-4da5-a64d-faee03f5c83d</entry>
Sep 30 21:17:04 compute-0 nova_compute[192810]:       <entry name="uuid">a99700e2-8d2c-4da5-a64d-faee03f5c83d</entry>
Sep 30 21:17:04 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     </system>
Sep 30 21:17:04 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:17:04 compute-0 nova_compute[192810]:   <os>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:17:04 compute-0 nova_compute[192810]:   </os>
Sep 30 21:17:04 compute-0 nova_compute[192810]:   <features>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:17:04 compute-0 nova_compute[192810]:   </features>
Sep 30 21:17:04 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:17:04 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:17:04 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:17:04 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:17:04 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:17:04 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:17:04 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:17:04 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:17:04 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/a99700e2-8d2c-4da5-a64d-faee03f5c83d/disk"/>
Sep 30 21:17:04 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:17:04 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:17:04 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/a99700e2-8d2c-4da5-a64d-faee03f5c83d/disk.config"/>
Sep 30 21:17:04 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:17:04 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/a99700e2-8d2c-4da5-a64d-faee03f5c83d/console.log" append="off"/>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     <video>
Sep 30 21:17:04 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     </video>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:17:04 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:17:04 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:17:04 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:17:04 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:17:04 compute-0 nova_compute[192810]: </domain>
Sep 30 21:17:04 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.362 2 INFO nova.compute.manager [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.407 2 DEBUG nova.virt.libvirt.driver [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.408 2 DEBUG nova.virt.libvirt.driver [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.409 2 INFO nova.virt.libvirt.driver [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] Using config drive
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.469 2 INFO nova.scheduler.client.report [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Deleted allocation for migration d9519eac-2da2-481a-8fef-10150b102ff6
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.470 2 DEBUG nova.virt.libvirt.driver [None req-1fc91297-e7ca-4739-90f8-67d0467be5bf bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.540 2 INFO nova.virt.libvirt.driver [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] Creating config drive at /var/lib/nova/instances/a99700e2-8d2c-4da5-a64d-faee03f5c83d/disk.config
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.549 2 DEBUG oslo_concurrency.processutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a99700e2-8d2c-4da5-a64d-faee03f5c83d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwsnp5dgd execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:04 compute-0 nova_compute[192810]: 2025-09-30 21:17:04.681 2 DEBUG oslo_concurrency.processutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a99700e2-8d2c-4da5-a64d-faee03f5c83d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwsnp5dgd" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:04 compute-0 systemd-machined[152794]: New machine qemu-3-instance-00000009.
Sep 30 21:17:04 compute-0 systemd[1]: Started Virtual Machine qemu-3-instance-00000009.
Sep 30 21:17:05 compute-0 nova_compute[192810]: 2025-09-30 21:17:05.918 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267025.9173272, a99700e2-8d2c-4da5-a64d-faee03f5c83d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:17:05 compute-0 nova_compute[192810]: 2025-09-30 21:17:05.919 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] VM Resumed (Lifecycle Event)
Sep 30 21:17:05 compute-0 nova_compute[192810]: 2025-09-30 21:17:05.923 2 DEBUG nova.compute.manager [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:17:05 compute-0 nova_compute[192810]: 2025-09-30 21:17:05.924 2 DEBUG nova.virt.libvirt.driver [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:17:05 compute-0 nova_compute[192810]: 2025-09-30 21:17:05.928 2 INFO nova.virt.libvirt.driver [-] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] Instance spawned successfully.
Sep 30 21:17:05 compute-0 nova_compute[192810]: 2025-09-30 21:17:05.929 2 DEBUG nova.virt.libvirt.driver [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:17:05 compute-0 nova_compute[192810]: 2025-09-30 21:17:05.949 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:17:05 compute-0 nova_compute[192810]: 2025-09-30 21:17:05.956 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:17:05 compute-0 nova_compute[192810]: 2025-09-30 21:17:05.959 2 DEBUG nova.virt.libvirt.driver [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:17:05 compute-0 nova_compute[192810]: 2025-09-30 21:17:05.960 2 DEBUG nova.virt.libvirt.driver [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:17:05 compute-0 nova_compute[192810]: 2025-09-30 21:17:05.960 2 DEBUG nova.virt.libvirt.driver [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:17:05 compute-0 nova_compute[192810]: 2025-09-30 21:17:05.961 2 DEBUG nova.virt.libvirt.driver [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:17:05 compute-0 nova_compute[192810]: 2025-09-30 21:17:05.961 2 DEBUG nova.virt.libvirt.driver [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:17:05 compute-0 nova_compute[192810]: 2025-09-30 21:17:05.962 2 DEBUG nova.virt.libvirt.driver [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:17:05 compute-0 nova_compute[192810]: 2025-09-30 21:17:05.982 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:17:05 compute-0 nova_compute[192810]: 2025-09-30 21:17:05.983 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267025.918934, a99700e2-8d2c-4da5-a64d-faee03f5c83d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:17:05 compute-0 nova_compute[192810]: 2025-09-30 21:17:05.983 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] VM Started (Lifecycle Event)
Sep 30 21:17:06 compute-0 nova_compute[192810]: 2025-09-30 21:17:06.036 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:17:06 compute-0 nova_compute[192810]: 2025-09-30 21:17:06.040 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:17:06 compute-0 nova_compute[192810]: 2025-09-30 21:17:06.068 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:17:06 compute-0 nova_compute[192810]: 2025-09-30 21:17:06.124 2 INFO nova.compute.manager [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] Took 2.30 seconds to spawn the instance on the hypervisor.
Sep 30 21:17:06 compute-0 nova_compute[192810]: 2025-09-30 21:17:06.125 2 DEBUG nova.compute.manager [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:17:06 compute-0 nova_compute[192810]: 2025-09-30 21:17:06.270 2 INFO nova.compute.manager [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] Took 3.15 seconds to build instance.
Sep 30 21:17:06 compute-0 nova_compute[192810]: 2025-09-30 21:17:06.302 2 DEBUG oslo_concurrency.lockutils [None req-58786458-65ac-40e9-8790-5a666d663ce7 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "a99700e2-8d2c-4da5-a64d-faee03f5c83d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.283s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:06 compute-0 nova_compute[192810]: 2025-09-30 21:17:06.311 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:08 compute-0 nova_compute[192810]: 2025-09-30 21:17:08.098 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:08 compute-0 nova_compute[192810]: 2025-09-30 21:17:08.156 2 DEBUG oslo_concurrency.lockutils [None req-6ea3ce96-2335-4f51-8a51-877ff1a7f9d9 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Acquiring lock "a99700e2-8d2c-4da5-a64d-faee03f5c83d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:08 compute-0 nova_compute[192810]: 2025-09-30 21:17:08.157 2 DEBUG oslo_concurrency.lockutils [None req-6ea3ce96-2335-4f51-8a51-877ff1a7f9d9 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "a99700e2-8d2c-4da5-a64d-faee03f5c83d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:08 compute-0 nova_compute[192810]: 2025-09-30 21:17:08.157 2 DEBUG oslo_concurrency.lockutils [None req-6ea3ce96-2335-4f51-8a51-877ff1a7f9d9 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Acquiring lock "a99700e2-8d2c-4da5-a64d-faee03f5c83d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:08 compute-0 nova_compute[192810]: 2025-09-30 21:17:08.158 2 DEBUG oslo_concurrency.lockutils [None req-6ea3ce96-2335-4f51-8a51-877ff1a7f9d9 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "a99700e2-8d2c-4da5-a64d-faee03f5c83d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:08 compute-0 nova_compute[192810]: 2025-09-30 21:17:08.158 2 DEBUG oslo_concurrency.lockutils [None req-6ea3ce96-2335-4f51-8a51-877ff1a7f9d9 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "a99700e2-8d2c-4da5-a64d-faee03f5c83d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:08 compute-0 nova_compute[192810]: 2025-09-30 21:17:08.176 2 INFO nova.compute.manager [None req-6ea3ce96-2335-4f51-8a51-877ff1a7f9d9 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] Terminating instance
Sep 30 21:17:08 compute-0 nova_compute[192810]: 2025-09-30 21:17:08.190 2 DEBUG oslo_concurrency.lockutils [None req-6ea3ce96-2335-4f51-8a51-877ff1a7f9d9 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Acquiring lock "refresh_cache-a99700e2-8d2c-4da5-a64d-faee03f5c83d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:17:08 compute-0 nova_compute[192810]: 2025-09-30 21:17:08.191 2 DEBUG oslo_concurrency.lockutils [None req-6ea3ce96-2335-4f51-8a51-877ff1a7f9d9 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Acquired lock "refresh_cache-a99700e2-8d2c-4da5-a64d-faee03f5c83d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:17:08 compute-0 nova_compute[192810]: 2025-09-30 21:17:08.191 2 DEBUG nova.network.neutron [None req-6ea3ce96-2335-4f51-8a51-877ff1a7f9d9 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:17:08 compute-0 nova_compute[192810]: 2025-09-30 21:17:08.380 2 DEBUG nova.network.neutron [None req-6ea3ce96-2335-4f51-8a51-877ff1a7f9d9 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:17:08 compute-0 nova_compute[192810]: 2025-09-30 21:17:08.831 2 DEBUG nova.network.neutron [None req-6ea3ce96-2335-4f51-8a51-877ff1a7f9d9 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:17:08 compute-0 nova_compute[192810]: 2025-09-30 21:17:08.850 2 DEBUG oslo_concurrency.lockutils [None req-6ea3ce96-2335-4f51-8a51-877ff1a7f9d9 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Releasing lock "refresh_cache-a99700e2-8d2c-4da5-a64d-faee03f5c83d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:17:08 compute-0 nova_compute[192810]: 2025-09-30 21:17:08.851 2 DEBUG nova.compute.manager [None req-6ea3ce96-2335-4f51-8a51-877ff1a7f9d9 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:17:08 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000009.scope: Deactivated successfully.
Sep 30 21:17:08 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000009.scope: Consumed 4.074s CPU time.
Sep 30 21:17:08 compute-0 systemd-machined[152794]: Machine qemu-3-instance-00000009 terminated.
Sep 30 21:17:09 compute-0 nova_compute[192810]: 2025-09-30 21:17:09.113 2 INFO nova.virt.libvirt.driver [-] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] Instance destroyed successfully.
Sep 30 21:17:09 compute-0 nova_compute[192810]: 2025-09-30 21:17:09.115 2 DEBUG nova.objects.instance [None req-6ea3ce96-2335-4f51-8a51-877ff1a7f9d9 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lazy-loading 'resources' on Instance uuid a99700e2-8d2c-4da5-a64d-faee03f5c83d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:17:09 compute-0 nova_compute[192810]: 2025-09-30 21:17:09.130 2 INFO nova.virt.libvirt.driver [None req-6ea3ce96-2335-4f51-8a51-877ff1a7f9d9 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] Deleting instance files /var/lib/nova/instances/a99700e2-8d2c-4da5-a64d-faee03f5c83d_del
Sep 30 21:17:09 compute-0 nova_compute[192810]: 2025-09-30 21:17:09.131 2 INFO nova.virt.libvirt.driver [None req-6ea3ce96-2335-4f51-8a51-877ff1a7f9d9 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] Deletion of /var/lib/nova/instances/a99700e2-8d2c-4da5-a64d-faee03f5c83d_del complete
Sep 30 21:17:09 compute-0 nova_compute[192810]: 2025-09-30 21:17:09.236 2 INFO nova.compute.manager [None req-6ea3ce96-2335-4f51-8a51-877ff1a7f9d9 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] Took 0.39 seconds to destroy the instance on the hypervisor.
Sep 30 21:17:09 compute-0 nova_compute[192810]: 2025-09-30 21:17:09.237 2 DEBUG oslo.service.loopingcall [None req-6ea3ce96-2335-4f51-8a51-877ff1a7f9d9 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:17:09 compute-0 nova_compute[192810]: 2025-09-30 21:17:09.238 2 DEBUG nova.compute.manager [-] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:17:09 compute-0 nova_compute[192810]: 2025-09-30 21:17:09.238 2 DEBUG nova.network.neutron [-] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:17:09 compute-0 nova_compute[192810]: 2025-09-30 21:17:09.508 2 DEBUG nova.network.neutron [-] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:17:09 compute-0 nova_compute[192810]: 2025-09-30 21:17:09.519 2 DEBUG nova.network.neutron [-] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:17:09 compute-0 nova_compute[192810]: 2025-09-30 21:17:09.526 2 DEBUG oslo_concurrency.lockutils [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Acquiring lock "7f0e9e16-1467-41e5-b5b0-965591aa014c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:09 compute-0 nova_compute[192810]: 2025-09-30 21:17:09.527 2 DEBUG oslo_concurrency.lockutils [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:09 compute-0 nova_compute[192810]: 2025-09-30 21:17:09.545 2 INFO nova.compute.manager [-] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] Took 0.31 seconds to deallocate network for instance.
Sep 30 21:17:09 compute-0 nova_compute[192810]: 2025-09-30 21:17:09.547 2 DEBUG nova.compute.manager [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:17:09 compute-0 nova_compute[192810]: 2025-09-30 21:17:09.653 2 DEBUG oslo_concurrency.lockutils [None req-6ea3ce96-2335-4f51-8a51-877ff1a7f9d9 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:09 compute-0 nova_compute[192810]: 2025-09-30 21:17:09.654 2 DEBUG oslo_concurrency.lockutils [None req-6ea3ce96-2335-4f51-8a51-877ff1a7f9d9 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:09 compute-0 nova_compute[192810]: 2025-09-30 21:17:09.669 2 DEBUG oslo_concurrency.lockutils [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:09 compute-0 nova_compute[192810]: 2025-09-30 21:17:09.748 2 DEBUG nova.compute.provider_tree [None req-6ea3ce96-2335-4f51-8a51-877ff1a7f9d9 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:17:09 compute-0 nova_compute[192810]: 2025-09-30 21:17:09.764 2 DEBUG nova.scheduler.client.report [None req-6ea3ce96-2335-4f51-8a51-877ff1a7f9d9 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:17:09 compute-0 nova_compute[192810]: 2025-09-30 21:17:09.792 2 DEBUG oslo_concurrency.lockutils [None req-6ea3ce96-2335-4f51-8a51-877ff1a7f9d9 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.139s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:09 compute-0 nova_compute[192810]: 2025-09-30 21:17:09.796 2 DEBUG oslo_concurrency.lockutils [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:09 compute-0 nova_compute[192810]: 2025-09-30 21:17:09.808 2 DEBUG nova.virt.hardware [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:17:09 compute-0 nova_compute[192810]: 2025-09-30 21:17:09.809 2 INFO nova.compute.claims [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:17:09 compute-0 nova_compute[192810]: 2025-09-30 21:17:09.821 2 INFO nova.scheduler.client.report [None req-6ea3ce96-2335-4f51-8a51-877ff1a7f9d9 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Deleted allocations for instance a99700e2-8d2c-4da5-a64d-faee03f5c83d
Sep 30 21:17:09 compute-0 nova_compute[192810]: 2025-09-30 21:17:09.943 2 DEBUG oslo_concurrency.lockutils [None req-6ea3ce96-2335-4f51-8a51-877ff1a7f9d9 a283310a99174bb794a56e8355b40a03 0d752ea56b394666bd18bda096b07530 - - default default] Lock "a99700e2-8d2c-4da5-a64d-faee03f5c83d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.786s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:09 compute-0 nova_compute[192810]: 2025-09-30 21:17:09.970 2 DEBUG nova.compute.provider_tree [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:17:09 compute-0 nova_compute[192810]: 2025-09-30 21:17:09.984 2 DEBUG nova.scheduler.client.report [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:17:10 compute-0 nova_compute[192810]: 2025-09-30 21:17:10.016 2 DEBUG oslo_concurrency.lockutils [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.219s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:10 compute-0 nova_compute[192810]: 2025-09-30 21:17:10.017 2 DEBUG nova.compute.manager [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:17:10 compute-0 nova_compute[192810]: 2025-09-30 21:17:10.082 2 DEBUG nova.compute.manager [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:17:10 compute-0 nova_compute[192810]: 2025-09-30 21:17:10.082 2 DEBUG nova.network.neutron [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:17:10 compute-0 nova_compute[192810]: 2025-09-30 21:17:10.103 2 INFO nova.virt.libvirt.driver [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:17:10 compute-0 nova_compute[192810]: 2025-09-30 21:17:10.125 2 DEBUG nova.compute.manager [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:17:10 compute-0 nova_compute[192810]: 2025-09-30 21:17:10.302 2 DEBUG nova.compute.manager [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:17:10 compute-0 nova_compute[192810]: 2025-09-30 21:17:10.303 2 DEBUG nova.virt.libvirt.driver [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:17:10 compute-0 nova_compute[192810]: 2025-09-30 21:17:10.304 2 INFO nova.virt.libvirt.driver [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Creating image(s)
Sep 30 21:17:10 compute-0 nova_compute[192810]: 2025-09-30 21:17:10.305 2 DEBUG oslo_concurrency.lockutils [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Acquiring lock "/var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:10 compute-0 nova_compute[192810]: 2025-09-30 21:17:10.305 2 DEBUG oslo_concurrency.lockutils [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Lock "/var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:10 compute-0 nova_compute[192810]: 2025-09-30 21:17:10.306 2 DEBUG oslo_concurrency.lockutils [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Lock "/var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:10 compute-0 nova_compute[192810]: 2025-09-30 21:17:10.317 2 DEBUG oslo_concurrency.processutils [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:10 compute-0 nova_compute[192810]: 2025-09-30 21:17:10.406 2 DEBUG oslo_concurrency.processutils [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:10 compute-0 nova_compute[192810]: 2025-09-30 21:17:10.407 2 DEBUG oslo_concurrency.lockutils [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:10 compute-0 nova_compute[192810]: 2025-09-30 21:17:10.408 2 DEBUG oslo_concurrency.lockutils [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:10 compute-0 nova_compute[192810]: 2025-09-30 21:17:10.419 2 DEBUG oslo_concurrency.processutils [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:10 compute-0 nova_compute[192810]: 2025-09-30 21:17:10.483 2 DEBUG oslo_concurrency.processutils [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:10 compute-0 nova_compute[192810]: 2025-09-30 21:17:10.484 2 DEBUG oslo_concurrency.processutils [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:10 compute-0 nova_compute[192810]: 2025-09-30 21:17:10.520 2 DEBUG oslo_concurrency.processutils [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:10 compute-0 nova_compute[192810]: 2025-09-30 21:17:10.522 2 DEBUG oslo_concurrency.lockutils [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:10 compute-0 nova_compute[192810]: 2025-09-30 21:17:10.522 2 DEBUG oslo_concurrency.processutils [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:10 compute-0 nova_compute[192810]: 2025-09-30 21:17:10.550 2 DEBUG nova.policy [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:17:10 compute-0 nova_compute[192810]: 2025-09-30 21:17:10.580 2 DEBUG oslo_concurrency.processutils [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:10 compute-0 nova_compute[192810]: 2025-09-30 21:17:10.581 2 DEBUG nova.virt.disk.api [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Checking if we can resize image /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:17:10 compute-0 nova_compute[192810]: 2025-09-30 21:17:10.581 2 DEBUG oslo_concurrency.processutils [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:10 compute-0 nova_compute[192810]: 2025-09-30 21:17:10.659 2 DEBUG oslo_concurrency.processutils [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:10 compute-0 nova_compute[192810]: 2025-09-30 21:17:10.661 2 DEBUG nova.virt.disk.api [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Cannot resize image /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:17:10 compute-0 nova_compute[192810]: 2025-09-30 21:17:10.662 2 DEBUG nova.objects.instance [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Lazy-loading 'migration_context' on Instance uuid 7f0e9e16-1467-41e5-b5b0-965591aa014c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:17:10 compute-0 nova_compute[192810]: 2025-09-30 21:17:10.679 2 DEBUG nova.virt.libvirt.driver [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:17:10 compute-0 nova_compute[192810]: 2025-09-30 21:17:10.680 2 DEBUG nova.virt.libvirt.driver [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Ensure instance console log exists: /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:17:10 compute-0 nova_compute[192810]: 2025-09-30 21:17:10.680 2 DEBUG oslo_concurrency.lockutils [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:10 compute-0 nova_compute[192810]: 2025-09-30 21:17:10.681 2 DEBUG oslo_concurrency.lockutils [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:10 compute-0 nova_compute[192810]: 2025-09-30 21:17:10.682 2 DEBUG oslo_concurrency.lockutils [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:11 compute-0 nova_compute[192810]: 2025-09-30 21:17:11.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:11 compute-0 podman[221431]: 2025-09-30 21:17:11.367686306 +0000 UTC m=+0.090584100 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=iscsid, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Sep 30 21:17:11 compute-0 nova_compute[192810]: 2025-09-30 21:17:11.566 2 DEBUG nova.network.neutron [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Successfully created port: bb8ecc8e-9cf4-4901-9788-83c49356f983 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:17:12 compute-0 nova_compute[192810]: 2025-09-30 21:17:12.105 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267017.1039667, 252d5457-8837-4aa6-b309-c3139e8db7ed => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:17:12 compute-0 nova_compute[192810]: 2025-09-30 21:17:12.107 2 INFO nova.compute.manager [-] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] VM Stopped (Lifecycle Event)
Sep 30 21:17:12 compute-0 nova_compute[192810]: 2025-09-30 21:17:12.135 2 DEBUG nova.compute.manager [None req-7f29609f-6786-4f60-8bb0-b34bcab276bc - - - - - -] [instance: 252d5457-8837-4aa6-b309-c3139e8db7ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:17:12 compute-0 nova_compute[192810]: 2025-09-30 21:17:12.599 2 DEBUG nova.network.neutron [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Successfully updated port: bb8ecc8e-9cf4-4901-9788-83c49356f983 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:17:12 compute-0 nova_compute[192810]: 2025-09-30 21:17:12.627 2 DEBUG oslo_concurrency.lockutils [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Acquiring lock "refresh_cache-7f0e9e16-1467-41e5-b5b0-965591aa014c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:17:12 compute-0 nova_compute[192810]: 2025-09-30 21:17:12.628 2 DEBUG oslo_concurrency.lockutils [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Acquired lock "refresh_cache-7f0e9e16-1467-41e5-b5b0-965591aa014c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:17:12 compute-0 nova_compute[192810]: 2025-09-30 21:17:12.628 2 DEBUG nova.network.neutron [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:17:12 compute-0 nova_compute[192810]: 2025-09-30 21:17:12.733 2 DEBUG nova.compute.manager [req-ccbde71d-18a3-4060-af98-363ca364ea4d req-3cfa7aa3-2256-4180-8e5e-b86b69df676f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Received event network-changed-bb8ecc8e-9cf4-4901-9788-83c49356f983 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:17:12 compute-0 nova_compute[192810]: 2025-09-30 21:17:12.733 2 DEBUG nova.compute.manager [req-ccbde71d-18a3-4060-af98-363ca364ea4d req-3cfa7aa3-2256-4180-8e5e-b86b69df676f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Refreshing instance network info cache due to event network-changed-bb8ecc8e-9cf4-4901-9788-83c49356f983. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:17:12 compute-0 nova_compute[192810]: 2025-09-30 21:17:12.734 2 DEBUG oslo_concurrency.lockutils [req-ccbde71d-18a3-4060-af98-363ca364ea4d req-3cfa7aa3-2256-4180-8e5e-b86b69df676f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-7f0e9e16-1467-41e5-b5b0-965591aa014c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:17:12 compute-0 nova_compute[192810]: 2025-09-30 21:17:12.806 2 DEBUG nova.network.neutron [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.102 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.471 2 DEBUG oslo_concurrency.lockutils [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Acquiring lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.472 2 DEBUG oslo_concurrency.lockutils [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.502 2 DEBUG nova.compute.manager [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.635 2 DEBUG oslo_concurrency.lockutils [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.635 2 DEBUG oslo_concurrency.lockutils [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.644 2 DEBUG nova.virt.hardware [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.645 2 INFO nova.compute.claims [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.709 2 DEBUG nova.network.neutron [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Updating instance_info_cache with network_info: [{"id": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "address": "fa:16:3e:45:9d:62", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb8ecc8e-9c", "ovs_interfaceid": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.735 2 DEBUG oslo_concurrency.lockutils [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Releasing lock "refresh_cache-7f0e9e16-1467-41e5-b5b0-965591aa014c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.736 2 DEBUG nova.compute.manager [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Instance network_info: |[{"id": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "address": "fa:16:3e:45:9d:62", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb8ecc8e-9c", "ovs_interfaceid": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.737 2 DEBUG oslo_concurrency.lockutils [req-ccbde71d-18a3-4060-af98-363ca364ea4d req-3cfa7aa3-2256-4180-8e5e-b86b69df676f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-7f0e9e16-1467-41e5-b5b0-965591aa014c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.738 2 DEBUG nova.network.neutron [req-ccbde71d-18a3-4060-af98-363ca364ea4d req-3cfa7aa3-2256-4180-8e5e-b86b69df676f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Refreshing network info cache for port bb8ecc8e-9cf4-4901-9788-83c49356f983 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.742 2 DEBUG nova.virt.libvirt.driver [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Start _get_guest_xml network_info=[{"id": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "address": "fa:16:3e:45:9d:62", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb8ecc8e-9c", "ovs_interfaceid": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.747 2 WARNING nova.virt.libvirt.driver [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.751 2 DEBUG nova.virt.libvirt.host [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.752 2 DEBUG nova.virt.libvirt.host [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.762 2 DEBUG nova.virt.libvirt.host [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.763 2 DEBUG nova.virt.libvirt.host [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.765 2 DEBUG nova.virt.libvirt.driver [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.765 2 DEBUG nova.virt.hardware [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.766 2 DEBUG nova.virt.hardware [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.766 2 DEBUG nova.virt.hardware [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.766 2 DEBUG nova.virt.hardware [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.766 2 DEBUG nova.virt.hardware [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.767 2 DEBUG nova.virt.hardware [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.767 2 DEBUG nova.virt.hardware [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.767 2 DEBUG nova.virt.hardware [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.768 2 DEBUG nova.virt.hardware [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.768 2 DEBUG nova.virt.hardware [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.768 2 DEBUG nova.virt.hardware [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.772 2 DEBUG nova.virt.libvirt.vif [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:17:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-772926731',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-772926731',id=11,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='544a33c53701466d8bf7e8ed34f38dcb',ramdisk_id='',reservation_id='r-kiix6ynt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-860972404',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-860972404-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:17:10Z,user_data=None,user_id='981e96ea2bc2419d9a1e57d6aed70304',uuid=7f0e9e16-1467-41e5-b5b0-965591aa014c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "address": "fa:16:3e:45:9d:62", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb8ecc8e-9c", "ovs_interfaceid": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.773 2 DEBUG nova.network.os_vif_util [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Converting VIF {"id": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "address": "fa:16:3e:45:9d:62", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb8ecc8e-9c", "ovs_interfaceid": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.774 2 DEBUG nova.network.os_vif_util [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:9d:62,bridge_name='br-int',has_traffic_filtering=True,id=bb8ecc8e-9cf4-4901-9788-83c49356f983,network=Network(934fff90-5446-41f1-a5ad-d2568cb337b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb8ecc8e-9c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.775 2 DEBUG nova.objects.instance [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Lazy-loading 'pci_devices' on Instance uuid 7f0e9e16-1467-41e5-b5b0-965591aa014c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.790 2 DEBUG nova.virt.libvirt.driver [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:17:13 compute-0 nova_compute[192810]:   <uuid>7f0e9e16-1467-41e5-b5b0-965591aa014c</uuid>
Sep 30 21:17:13 compute-0 nova_compute[192810]:   <name>instance-0000000b</name>
Sep 30 21:17:13 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:17:13 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:17:13 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:17:13 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:       <nova:name>tempest-LiveAutoBlockMigrationV225Test-server-772926731</nova:name>
Sep 30 21:17:13 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:17:13</nova:creationTime>
Sep 30 21:17:13 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:17:13 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:17:13 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:17:13 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:17:13 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:17:13 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:17:13 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:17:13 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:17:13 compute-0 nova_compute[192810]:         <nova:user uuid="981e96ea2bc2419d9a1e57d6aed70304">tempest-LiveAutoBlockMigrationV225Test-860972404-project-member</nova:user>
Sep 30 21:17:13 compute-0 nova_compute[192810]:         <nova:project uuid="544a33c53701466d8bf7e8ed34f38dcb">tempest-LiveAutoBlockMigrationV225Test-860972404</nova:project>
Sep 30 21:17:13 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:17:13 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:17:13 compute-0 nova_compute[192810]:         <nova:port uuid="bb8ecc8e-9cf4-4901-9788-83c49356f983">
Sep 30 21:17:13 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:17:13 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:17:13 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:17:13 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:17:13 compute-0 nova_compute[192810]:     <system>
Sep 30 21:17:13 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:17:13 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:17:13 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:17:13 compute-0 nova_compute[192810]:       <entry name="serial">7f0e9e16-1467-41e5-b5b0-965591aa014c</entry>
Sep 30 21:17:13 compute-0 nova_compute[192810]:       <entry name="uuid">7f0e9e16-1467-41e5-b5b0-965591aa014c</entry>
Sep 30 21:17:13 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     </system>
Sep 30 21:17:13 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:17:13 compute-0 nova_compute[192810]:   <os>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:   </os>
Sep 30 21:17:13 compute-0 nova_compute[192810]:   <features>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:   </features>
Sep 30 21:17:13 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:17:13 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:17:13 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:17:13 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:17:13 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:17:13 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:17:13 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk.config"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:17:13 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:45:9d:62"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:       <target dev="tapbb8ecc8e-9c"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:17:13 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/console.log" append="off"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     <video>
Sep 30 21:17:13 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     </video>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:17:13 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:17:13 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:17:13 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:17:13 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:17:13 compute-0 nova_compute[192810]: </domain>
Sep 30 21:17:13 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.791 2 DEBUG nova.compute.manager [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Preparing to wait for external event network-vif-plugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.792 2 DEBUG oslo_concurrency.lockutils [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Acquiring lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.792 2 DEBUG oslo_concurrency.lockutils [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.792 2 DEBUG oslo_concurrency.lockutils [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.793 2 DEBUG nova.virt.libvirt.vif [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:17:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-772926731',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-772926731',id=11,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='544a33c53701466d8bf7e8ed34f38dcb',ramdisk_id='',reservation_id='r-kiix6ynt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-860972404',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-860972404-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:17:10Z,user_data=None,user_id='981e96ea2bc2419d9a1e57d6aed70304',uuid=7f0e9e16-1467-41e5-b5b0-965591aa014c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "address": "fa:16:3e:45:9d:62", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb8ecc8e-9c", "ovs_interfaceid": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.793 2 DEBUG nova.network.os_vif_util [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Converting VIF {"id": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "address": "fa:16:3e:45:9d:62", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb8ecc8e-9c", "ovs_interfaceid": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.794 2 DEBUG nova.network.os_vif_util [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:9d:62,bridge_name='br-int',has_traffic_filtering=True,id=bb8ecc8e-9cf4-4901-9788-83c49356f983,network=Network(934fff90-5446-41f1-a5ad-d2568cb337b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb8ecc8e-9c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.794 2 DEBUG os_vif [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:9d:62,bridge_name='br-int',has_traffic_filtering=True,id=bb8ecc8e-9cf4-4901-9788-83c49356f983,network=Network(934fff90-5446-41f1-a5ad-d2568cb337b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb8ecc8e-9c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.795 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.796 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.800 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbb8ecc8e-9c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.800 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbb8ecc8e-9c, col_values=(('external_ids', {'iface-id': 'bb8ecc8e-9cf4-4901-9788-83c49356f983', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:45:9d:62', 'vm-uuid': '7f0e9e16-1467-41e5-b5b0-965591aa014c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.827 2 DEBUG nova.compute.provider_tree [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:13 compute-0 NetworkManager[51733]: <info>  [1759267033.8372] manager: (tapbb8ecc8e-9c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/32)
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.838 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.842 2 DEBUG nova.scheduler.client.report [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.847 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.849 2 INFO os_vif [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:9d:62,bridge_name='br-int',has_traffic_filtering=True,id=bb8ecc8e-9cf4-4901-9788-83c49356f983,network=Network(934fff90-5446-41f1-a5ad-d2568cb337b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb8ecc8e-9c')
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.872 2 DEBUG oslo_concurrency.lockutils [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.237s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.873 2 DEBUG nova.compute.manager [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.941 2 DEBUG nova.virt.libvirt.driver [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.942 2 DEBUG nova.virt.libvirt.driver [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.942 2 DEBUG nova.virt.libvirt.driver [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] No VIF found with MAC fa:16:3e:45:9d:62, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.943 2 INFO nova.virt.libvirt.driver [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Using config drive
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.963 2 DEBUG nova.compute.manager [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.964 2 DEBUG nova.network.neutron [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:17:13 compute-0 nova_compute[192810]: 2025-09-30 21:17:13.981 2 INFO nova.virt.libvirt.driver [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:17:14 compute-0 nova_compute[192810]: 2025-09-30 21:17:14.011 2 DEBUG nova.compute.manager [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:17:14 compute-0 nova_compute[192810]: 2025-09-30 21:17:14.143 2 DEBUG nova.compute.manager [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:17:14 compute-0 nova_compute[192810]: 2025-09-30 21:17:14.144 2 DEBUG nova.virt.libvirt.driver [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:17:14 compute-0 nova_compute[192810]: 2025-09-30 21:17:14.145 2 INFO nova.virt.libvirt.driver [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Creating image(s)
Sep 30 21:17:14 compute-0 nova_compute[192810]: 2025-09-30 21:17:14.145 2 DEBUG oslo_concurrency.lockutils [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Acquiring lock "/var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:14 compute-0 nova_compute[192810]: 2025-09-30 21:17:14.146 2 DEBUG oslo_concurrency.lockutils [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Lock "/var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:14 compute-0 nova_compute[192810]: 2025-09-30 21:17:14.147 2 DEBUG oslo_concurrency.lockutils [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Lock "/var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:14 compute-0 nova_compute[192810]: 2025-09-30 21:17:14.164 2 DEBUG oslo_concurrency.processutils [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:14 compute-0 nova_compute[192810]: 2025-09-30 21:17:14.246 2 DEBUG oslo_concurrency.processutils [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:14 compute-0 nova_compute[192810]: 2025-09-30 21:17:14.248 2 DEBUG oslo_concurrency.lockutils [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:14 compute-0 nova_compute[192810]: 2025-09-30 21:17:14.249 2 DEBUG oslo_concurrency.lockutils [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:14 compute-0 nova_compute[192810]: 2025-09-30 21:17:14.261 2 DEBUG oslo_concurrency.processutils [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:14 compute-0 nova_compute[192810]: 2025-09-30 21:17:14.322 2 DEBUG oslo_concurrency.processutils [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:14 compute-0 nova_compute[192810]: 2025-09-30 21:17:14.323 2 DEBUG oslo_concurrency.processutils [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:14 compute-0 nova_compute[192810]: 2025-09-30 21:17:14.376 2 DEBUG oslo_concurrency.processutils [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65/disk 1073741824" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:14 compute-0 nova_compute[192810]: 2025-09-30 21:17:14.378 2 DEBUG oslo_concurrency.lockutils [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:14 compute-0 nova_compute[192810]: 2025-09-30 21:17:14.378 2 DEBUG oslo_concurrency.processutils [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:14 compute-0 nova_compute[192810]: 2025-09-30 21:17:14.448 2 DEBUG nova.policy [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4b263d7c3e3141f999e8eabf49e8190c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '96460712956e4f038121397afa979163', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:17:14 compute-0 nova_compute[192810]: 2025-09-30 21:17:14.462 2 DEBUG oslo_concurrency.processutils [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:14 compute-0 nova_compute[192810]: 2025-09-30 21:17:14.464 2 DEBUG nova.virt.disk.api [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Checking if we can resize image /var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:17:14 compute-0 nova_compute[192810]: 2025-09-30 21:17:14.464 2 DEBUG oslo_concurrency.processutils [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:14 compute-0 nova_compute[192810]: 2025-09-30 21:17:14.538 2 DEBUG oslo_concurrency.processutils [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:14 compute-0 nova_compute[192810]: 2025-09-30 21:17:14.539 2 DEBUG nova.virt.disk.api [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Cannot resize image /var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:17:14 compute-0 nova_compute[192810]: 2025-09-30 21:17:14.540 2 DEBUG nova.objects.instance [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Lazy-loading 'migration_context' on Instance uuid b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:17:14 compute-0 nova_compute[192810]: 2025-09-30 21:17:14.554 2 DEBUG nova.virt.libvirt.driver [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:17:14 compute-0 nova_compute[192810]: 2025-09-30 21:17:14.555 2 DEBUG nova.virt.libvirt.driver [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Ensure instance console log exists: /var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:17:14 compute-0 nova_compute[192810]: 2025-09-30 21:17:14.556 2 DEBUG oslo_concurrency.lockutils [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:14 compute-0 nova_compute[192810]: 2025-09-30 21:17:14.556 2 DEBUG oslo_concurrency.lockutils [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:14 compute-0 nova_compute[192810]: 2025-09-30 21:17:14.557 2 DEBUG oslo_concurrency.lockutils [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:14 compute-0 nova_compute[192810]: 2025-09-30 21:17:14.645 2 INFO nova.virt.libvirt.driver [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Creating config drive at /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk.config
Sep 30 21:17:14 compute-0 nova_compute[192810]: 2025-09-30 21:17:14.653 2 DEBUG oslo_concurrency.processutils [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_x2yp6gu execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:14 compute-0 nova_compute[192810]: 2025-09-30 21:17:14.784 2 DEBUG oslo_concurrency.processutils [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_x2yp6gu" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:14 compute-0 kernel: tapbb8ecc8e-9c: entered promiscuous mode
Sep 30 21:17:14 compute-0 ovn_controller[94912]: 2025-09-30T21:17:14Z|00047|binding|INFO|Claiming lport bb8ecc8e-9cf4-4901-9788-83c49356f983 for this chassis.
Sep 30 21:17:14 compute-0 ovn_controller[94912]: 2025-09-30T21:17:14Z|00048|binding|INFO|bb8ecc8e-9cf4-4901-9788-83c49356f983: Claiming fa:16:3e:45:9d:62 10.100.0.12
Sep 30 21:17:14 compute-0 nova_compute[192810]: 2025-09-30 21:17:14.850 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:14 compute-0 NetworkManager[51733]: <info>  [1759267034.8537] manager: (tapbb8ecc8e-9c): new Tun device (/org/freedesktop/NetworkManager/Devices/33)
Sep 30 21:17:14 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:14.868 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:9d:62 10.100.0.12'], port_security=['fa:16:3e:45:9d:62 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-934fff90-5446-41f1-a5ad-d2568cb337b1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ae5806dc-3fbd-4366-84ab-b061f2375093', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c5644fe7-3662-476d-bcfe-5bc86ceef791, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=bb8ecc8e-9cf4-4901-9788-83c49356f983) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:17:14 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:14.870 103867 INFO neutron.agent.ovn.metadata.agent [-] Port bb8ecc8e-9cf4-4901-9788-83c49356f983 in datapath 934fff90-5446-41f1-a5ad-d2568cb337b1 bound to our chassis
Sep 30 21:17:14 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:14.873 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 934fff90-5446-41f1-a5ad-d2568cb337b1
Sep 30 21:17:14 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:14.894 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[2a8d425a-160e-4549-9d71-969ab94fe1fb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:14 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:14.895 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap934fff90-51 in ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:17:14 compute-0 systemd-udevd[221489]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:17:14 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:14.897 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap934fff90-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:17:14 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:14.897 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[22223f67-a798-4d94-a139-aaf2248fddb4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:14 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:14.898 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[5763b49a-f2f8-4919-b62f-ff05573d4c86]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:14 compute-0 systemd-machined[152794]: New machine qemu-4-instance-0000000b.
Sep 30 21:17:14 compute-0 NetworkManager[51733]: <info>  [1759267034.9153] device (tapbb8ecc8e-9c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:17:14 compute-0 NetworkManager[51733]: <info>  [1759267034.9166] device (tapbb8ecc8e-9c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:17:14 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:14.925 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[3dc89ae9-3c0b-46d9-af4c-c2893b923593]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:14 compute-0 systemd[1]: Started Virtual Machine qemu-4-instance-0000000b.
Sep 30 21:17:14 compute-0 nova_compute[192810]: 2025-09-30 21:17:14.965 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:14 compute-0 nova_compute[192810]: 2025-09-30 21:17:14.973 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:14 compute-0 ovn_controller[94912]: 2025-09-30T21:17:14Z|00049|binding|INFO|Setting lport bb8ecc8e-9cf4-4901-9788-83c49356f983 ovn-installed in OVS
Sep 30 21:17:14 compute-0 ovn_controller[94912]: 2025-09-30T21:17:14Z|00050|binding|INFO|Setting lport bb8ecc8e-9cf4-4901-9788-83c49356f983 up in Southbound
Sep 30 21:17:14 compute-0 nova_compute[192810]: 2025-09-30 21:17:14.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:14 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:14.990 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[9ab828f9-fdec-4aac-8364-a66f7118d674]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:15.021 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[ac21d9cc-d2aa-4966-a20d-50096465f5da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:15 compute-0 NetworkManager[51733]: <info>  [1759267035.0314] manager: (tap934fff90-50): new Veth device (/org/freedesktop/NetworkManager/Devices/34)
Sep 30 21:17:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:15.029 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[7e74ab02-9396-4713-b85c-33a1e9044cef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:15 compute-0 systemd-udevd[221492]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:17:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:15.069 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[b9ced15a-8b93-4408-b21e-10f1266ac678]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:15.072 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[82eb4c30-bde4-4e40-9e48-832d8ed1576a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:15 compute-0 NetworkManager[51733]: <info>  [1759267035.0991] device (tap934fff90-50): carrier: link connected
Sep 30 21:17:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:15.105 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[08a005fc-4b4c-49d7-a2e3-c9aa60dfa3c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:15.128 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[4fa881b2-708d-4788-b0d6-5086afbbb9ce]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap934fff90-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:18:37:65'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 373072, 'reachable_time': 18253, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 192, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 192, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221524, 'error': None, 'target': 'ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:15.151 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[b88fd669-fe2b-4beb-976d-5076e656119c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe18:3765'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 373072, 'tstamp': 373072}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221525, 'error': None, 'target': 'ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:15.175 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[0c351cb5-46c0-4e73-b816-120c6fc7d7b0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap934fff90-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:18:37:65'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 373072, 'reachable_time': 18253, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 192, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 192, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221526, 'error': None, 'target': 'ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:15.216 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[755c2e0c-5eeb-4f7c-a1a5-fa41e6a4b9bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:15.291 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[75b6d0d2-eb51-4278-ad37-17a23be3d809]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:15.293 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap934fff90-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:15.293 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:17:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:15.294 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap934fff90-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:15 compute-0 kernel: tap934fff90-50: entered promiscuous mode
Sep 30 21:17:15 compute-0 NetworkManager[51733]: <info>  [1759267035.2978] manager: (tap934fff90-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/35)
Sep 30 21:17:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:15.302 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap934fff90-50, col_values=(('external_ids', {'iface-id': 'b21a7164-770c-4265-ad15-a3e058ec1a56'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:15 compute-0 nova_compute[192810]: 2025-09-30 21:17:15.303 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:15.305 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/934fff90-5446-41f1-a5ad-d2568cb337b1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/934fff90-5446-41f1-a5ad-d2568cb337b1.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:17:15 compute-0 ovn_controller[94912]: 2025-09-30T21:17:15Z|00051|binding|INFO|Releasing lport b21a7164-770c-4265-ad15-a3e058ec1a56 from this chassis (sb_readonly=0)
Sep 30 21:17:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:15.310 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[c0a36827-fff5-4798-a17b-cc463b454f2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:15.310 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:17:15 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:17:15 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:17:15 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-934fff90-5446-41f1-a5ad-d2568cb337b1
Sep 30 21:17:15 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:17:15 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:17:15 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:17:15 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/934fff90-5446-41f1-a5ad-d2568cb337b1.pid.haproxy
Sep 30 21:17:15 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:17:15 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:17:15 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:17:15 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:17:15 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:17:15 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:17:15 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:17:15 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:17:15 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:17:15 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:17:15 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:17:15 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:17:15 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:17:15 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:17:15 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:17:15 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:17:15 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:17:15 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:17:15 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:17:15 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:17:15 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID 934fff90-5446-41f1-a5ad-d2568cb337b1
Sep 30 21:17:15 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:17:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:15.311 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1', 'env', 'PROCESS_TAG=haproxy-934fff90-5446-41f1-a5ad-d2568cb337b1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/934fff90-5446-41f1-a5ad-d2568cb337b1.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:17:15 compute-0 nova_compute[192810]: 2025-09-30 21:17:15.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:15 compute-0 nova_compute[192810]: 2025-09-30 21:17:15.429 2 DEBUG nova.network.neutron [req-ccbde71d-18a3-4060-af98-363ca364ea4d req-3cfa7aa3-2256-4180-8e5e-b86b69df676f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Updated VIF entry in instance network info cache for port bb8ecc8e-9cf4-4901-9788-83c49356f983. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:17:15 compute-0 nova_compute[192810]: 2025-09-30 21:17:15.429 2 DEBUG nova.network.neutron [req-ccbde71d-18a3-4060-af98-363ca364ea4d req-3cfa7aa3-2256-4180-8e5e-b86b69df676f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Updating instance_info_cache with network_info: [{"id": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "address": "fa:16:3e:45:9d:62", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb8ecc8e-9c", "ovs_interfaceid": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:17:15 compute-0 nova_compute[192810]: 2025-09-30 21:17:15.457 2 DEBUG oslo_concurrency.lockutils [req-ccbde71d-18a3-4060-af98-363ca364ea4d req-3cfa7aa3-2256-4180-8e5e-b86b69df676f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-7f0e9e16-1467-41e5-b5b0-965591aa014c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:17:15 compute-0 podman[221565]: 2025-09-30 21:17:15.705481988 +0000 UTC m=+0.053287414 container create f3996c7367e9f1965f37877c5373b825bbe3211a703a6a31dd449ddc49a70589 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:17:15 compute-0 systemd[1]: Started libpod-conmon-f3996c7367e9f1965f37877c5373b825bbe3211a703a6a31dd449ddc49a70589.scope.
Sep 30 21:17:15 compute-0 podman[221565]: 2025-09-30 21:17:15.676400939 +0000 UTC m=+0.024206395 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:17:15 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:17:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d20bdeae83fbdf1adf736b7f2a909b72d952201ea1083e353f805aac1a25c508/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:17:15 compute-0 podman[221565]: 2025-09-30 21:17:15.804268076 +0000 UTC m=+0.152073512 container init f3996c7367e9f1965f37877c5373b825bbe3211a703a6a31dd449ddc49a70589 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:17:15 compute-0 podman[221565]: 2025-09-30 21:17:15.811714475 +0000 UTC m=+0.159519901 container start f3996c7367e9f1965f37877c5373b825bbe3211a703a6a31dd449ddc49a70589 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 21:17:15 compute-0 podman[221581]: 2025-09-30 21:17:15.81626799 +0000 UTC m=+0.066265973 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 21:17:15 compute-0 podman[221578]: 2025-09-30 21:17:15.835485898 +0000 UTC m=+0.088148219 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2)
Sep 30 21:17:15 compute-0 neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1[221587]: [NOTICE]   (221621) : New worker (221624) forked
Sep 30 21:17:15 compute-0 neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1[221587]: [NOTICE]   (221621) : Loading success.
Sep 30 21:17:15 compute-0 nova_compute[192810]: 2025-09-30 21:17:15.881 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267035.880982, 7f0e9e16-1467-41e5-b5b0-965591aa014c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:17:15 compute-0 nova_compute[192810]: 2025-09-30 21:17:15.882 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] VM Started (Lifecycle Event)
Sep 30 21:17:15 compute-0 nova_compute[192810]: 2025-09-30 21:17:15.905 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:17:15 compute-0 nova_compute[192810]: 2025-09-30 21:17:15.910 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267035.8826265, 7f0e9e16-1467-41e5-b5b0-965591aa014c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:17:15 compute-0 nova_compute[192810]: 2025-09-30 21:17:15.910 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] VM Paused (Lifecycle Event)
Sep 30 21:17:15 compute-0 nova_compute[192810]: 2025-09-30 21:17:15.931 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:17:15 compute-0 nova_compute[192810]: 2025-09-30 21:17:15.935 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:17:15 compute-0 nova_compute[192810]: 2025-09-30 21:17:15.954 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:17:16 compute-0 nova_compute[192810]: 2025-09-30 21:17:16.057 2 DEBUG nova.network.neutron [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Successfully updated port: 730a9e74-900e-49b2-a5c3-043d6da1a52b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:17:16 compute-0 nova_compute[192810]: 2025-09-30 21:17:16.085 2 DEBUG oslo_concurrency.lockutils [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Acquiring lock "refresh_cache-b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:17:16 compute-0 nova_compute[192810]: 2025-09-30 21:17:16.086 2 DEBUG oslo_concurrency.lockutils [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Acquired lock "refresh_cache-b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:17:16 compute-0 nova_compute[192810]: 2025-09-30 21:17:16.086 2 DEBUG nova.network.neutron [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:17:16 compute-0 nova_compute[192810]: 2025-09-30 21:17:16.167 2 DEBUG nova.compute.manager [req-5554c26c-b8da-42ca-a42c-7610e8d8f206 req-b16984cc-6ca2-4cfd-be1f-111d43562149 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Received event network-changed-730a9e74-900e-49b2-a5c3-043d6da1a52b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:17:16 compute-0 nova_compute[192810]: 2025-09-30 21:17:16.167 2 DEBUG nova.compute.manager [req-5554c26c-b8da-42ca-a42c-7610e8d8f206 req-b16984cc-6ca2-4cfd-be1f-111d43562149 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Refreshing instance network info cache due to event network-changed-730a9e74-900e-49b2-a5c3-043d6da1a52b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:17:16 compute-0 nova_compute[192810]: 2025-09-30 21:17:16.168 2 DEBUG oslo_concurrency.lockutils [req-5554c26c-b8da-42ca-a42c-7610e8d8f206 req-b16984cc-6ca2-4cfd-be1f-111d43562149 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:17:16 compute-0 nova_compute[192810]: 2025-09-30 21:17:16.262 2 DEBUG nova.network.neutron [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:17:16 compute-0 nova_compute[192810]: 2025-09-30 21:17:16.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:17 compute-0 nova_compute[192810]: 2025-09-30 21:17:17.326 2 DEBUG nova.network.neutron [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Updating instance_info_cache with network_info: [{"id": "730a9e74-900e-49b2-a5c3-043d6da1a52b", "address": "fa:16:3e:72:47:71", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap730a9e74-90", "ovs_interfaceid": "730a9e74-900e-49b2-a5c3-043d6da1a52b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:17:17 compute-0 nova_compute[192810]: 2025-09-30 21:17:17.343 2 DEBUG oslo_concurrency.lockutils [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Releasing lock "refresh_cache-b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:17:17 compute-0 nova_compute[192810]: 2025-09-30 21:17:17.344 2 DEBUG nova.compute.manager [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Instance network_info: |[{"id": "730a9e74-900e-49b2-a5c3-043d6da1a52b", "address": "fa:16:3e:72:47:71", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap730a9e74-90", "ovs_interfaceid": "730a9e74-900e-49b2-a5c3-043d6da1a52b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:17:17 compute-0 nova_compute[192810]: 2025-09-30 21:17:17.345 2 DEBUG oslo_concurrency.lockutils [req-5554c26c-b8da-42ca-a42c-7610e8d8f206 req-b16984cc-6ca2-4cfd-be1f-111d43562149 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:17:17 compute-0 nova_compute[192810]: 2025-09-30 21:17:17.346 2 DEBUG nova.network.neutron [req-5554c26c-b8da-42ca-a42c-7610e8d8f206 req-b16984cc-6ca2-4cfd-be1f-111d43562149 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Refreshing network info cache for port 730a9e74-900e-49b2-a5c3-043d6da1a52b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:17:17 compute-0 nova_compute[192810]: 2025-09-30 21:17:17.352 2 DEBUG nova.virt.libvirt.driver [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Start _get_guest_xml network_info=[{"id": "730a9e74-900e-49b2-a5c3-043d6da1a52b", "address": "fa:16:3e:72:47:71", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap730a9e74-90", "ovs_interfaceid": "730a9e74-900e-49b2-a5c3-043d6da1a52b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:17:17 compute-0 nova_compute[192810]: 2025-09-30 21:17:17.359 2 WARNING nova.virt.libvirt.driver [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:17:17 compute-0 nova_compute[192810]: 2025-09-30 21:17:17.371 2 DEBUG nova.virt.libvirt.host [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:17:17 compute-0 nova_compute[192810]: 2025-09-30 21:17:17.372 2 DEBUG nova.virt.libvirt.host [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:17:17 compute-0 nova_compute[192810]: 2025-09-30 21:17:17.376 2 DEBUG nova.virt.libvirt.host [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:17:17 compute-0 nova_compute[192810]: 2025-09-30 21:17:17.377 2 DEBUG nova.virt.libvirt.host [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:17:17 compute-0 nova_compute[192810]: 2025-09-30 21:17:17.380 2 DEBUG nova.virt.libvirt.driver [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:17:17 compute-0 nova_compute[192810]: 2025-09-30 21:17:17.381 2 DEBUG nova.virt.hardware [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:17:17 compute-0 nova_compute[192810]: 2025-09-30 21:17:17.382 2 DEBUG nova.virt.hardware [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:17:17 compute-0 nova_compute[192810]: 2025-09-30 21:17:17.383 2 DEBUG nova.virt.hardware [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:17:17 compute-0 nova_compute[192810]: 2025-09-30 21:17:17.383 2 DEBUG nova.virt.hardware [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:17:17 compute-0 nova_compute[192810]: 2025-09-30 21:17:17.384 2 DEBUG nova.virt.hardware [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:17:17 compute-0 nova_compute[192810]: 2025-09-30 21:17:17.384 2 DEBUG nova.virt.hardware [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:17:17 compute-0 nova_compute[192810]: 2025-09-30 21:17:17.385 2 DEBUG nova.virt.hardware [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:17:17 compute-0 nova_compute[192810]: 2025-09-30 21:17:17.386 2 DEBUG nova.virt.hardware [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:17:17 compute-0 nova_compute[192810]: 2025-09-30 21:17:17.386 2 DEBUG nova.virt.hardware [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:17:17 compute-0 nova_compute[192810]: 2025-09-30 21:17:17.387 2 DEBUG nova.virt.hardware [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:17:17 compute-0 nova_compute[192810]: 2025-09-30 21:17:17.388 2 DEBUG nova.virt.hardware [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:17:17 compute-0 nova_compute[192810]: 2025-09-30 21:17:17.396 2 DEBUG nova.virt.libvirt.vif [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:17:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-2078302607',display_name='tempest-LiveMigrationTest-server-2078302607',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-livemigrationtest-server-2078302607',id=12,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='96460712956e4f038121397afa979163',ramdisk_id='',reservation_id='r-1871um1z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveMigrationTest-2029274765',owner_user_name='tempest-LiveMigrationTest-2029274765-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:17:14Z,user_data=None,user_id='4b263d7c3e3141f999e8eabf49e8190c',uuid=b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "730a9e74-900e-49b2-a5c3-043d6da1a52b", "address": "fa:16:3e:72:47:71", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap730a9e74-90", "ovs_interfaceid": "730a9e74-900e-49b2-a5c3-043d6da1a52b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:17:17 compute-0 nova_compute[192810]: 2025-09-30 21:17:17.397 2 DEBUG nova.network.os_vif_util [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Converting VIF {"id": "730a9e74-900e-49b2-a5c3-043d6da1a52b", "address": "fa:16:3e:72:47:71", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap730a9e74-90", "ovs_interfaceid": "730a9e74-900e-49b2-a5c3-043d6da1a52b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:17:17 compute-0 nova_compute[192810]: 2025-09-30 21:17:17.398 2 DEBUG nova.network.os_vif_util [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:72:47:71,bridge_name='br-int',has_traffic_filtering=True,id=730a9e74-900e-49b2-a5c3-043d6da1a52b,network=Network(16d40025-1087-460f-a42f-c007f6eff406),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap730a9e74-90') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:17:17 compute-0 nova_compute[192810]: 2025-09-30 21:17:17.400 2 DEBUG nova.objects.instance [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Lazy-loading 'pci_devices' on Instance uuid b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:17:17 compute-0 nova_compute[192810]: 2025-09-30 21:17:17.414 2 DEBUG nova.virt.libvirt.driver [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:17:17 compute-0 nova_compute[192810]:   <uuid>b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65</uuid>
Sep 30 21:17:17 compute-0 nova_compute[192810]:   <name>instance-0000000c</name>
Sep 30 21:17:17 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:17:17 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:17:17 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:17:17 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:       <nova:name>tempest-LiveMigrationTest-server-2078302607</nova:name>
Sep 30 21:17:17 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:17:17</nova:creationTime>
Sep 30 21:17:17 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:17:17 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:17:17 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:17:17 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:17:17 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:17:17 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:17:17 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:17:17 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:17:17 compute-0 nova_compute[192810]:         <nova:user uuid="4b263d7c3e3141f999e8eabf49e8190c">tempest-LiveMigrationTest-2029274765-project-member</nova:user>
Sep 30 21:17:17 compute-0 nova_compute[192810]:         <nova:project uuid="96460712956e4f038121397afa979163">tempest-LiveMigrationTest-2029274765</nova:project>
Sep 30 21:17:17 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:17:17 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:17:17 compute-0 nova_compute[192810]:         <nova:port uuid="730a9e74-900e-49b2-a5c3-043d6da1a52b">
Sep 30 21:17:17 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:17:17 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:17:17 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:17:17 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:17:17 compute-0 nova_compute[192810]:     <system>
Sep 30 21:17:17 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:17:17 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:17:17 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:17:17 compute-0 nova_compute[192810]:       <entry name="serial">b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65</entry>
Sep 30 21:17:17 compute-0 nova_compute[192810]:       <entry name="uuid">b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65</entry>
Sep 30 21:17:17 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     </system>
Sep 30 21:17:17 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:17:17 compute-0 nova_compute[192810]:   <os>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:   </os>
Sep 30 21:17:17 compute-0 nova_compute[192810]:   <features>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:   </features>
Sep 30 21:17:17 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:17:17 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:17:17 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:17:17 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:17:17 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:17:17 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65/disk"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:17:17 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65/disk.config"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:17:17 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:72:47:71"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:       <target dev="tap730a9e74-90"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:17:17 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65/console.log" append="off"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     <video>
Sep 30 21:17:17 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     </video>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:17:17 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:17:17 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:17:17 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:17:17 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:17:17 compute-0 nova_compute[192810]: </domain>
Sep 30 21:17:17 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:17:17 compute-0 nova_compute[192810]: 2025-09-30 21:17:17.417 2 DEBUG nova.compute.manager [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Preparing to wait for external event network-vif-plugged-730a9e74-900e-49b2-a5c3-043d6da1a52b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:17:17 compute-0 nova_compute[192810]: 2025-09-30 21:17:17.417 2 DEBUG oslo_concurrency.lockutils [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Acquiring lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:17 compute-0 nova_compute[192810]: 2025-09-30 21:17:17.417 2 DEBUG oslo_concurrency.lockutils [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:17 compute-0 nova_compute[192810]: 2025-09-30 21:17:17.418 2 DEBUG oslo_concurrency.lockutils [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:17 compute-0 nova_compute[192810]: 2025-09-30 21:17:17.418 2 DEBUG nova.virt.libvirt.vif [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:17:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-2078302607',display_name='tempest-LiveMigrationTest-server-2078302607',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-livemigrationtest-server-2078302607',id=12,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='96460712956e4f038121397afa979163',ramdisk_id='',reservation_id='r-1871um1z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveMigrationTest-2029274765',owner_user_name='tempest-LiveMigrationTest-2029274765-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:17:14Z,user_data=None,user_id='4b263d7c3e3141f999e8eabf49e8190c',uuid=b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "730a9e74-900e-49b2-a5c3-043d6da1a52b", "address": "fa:16:3e:72:47:71", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap730a9e74-90", "ovs_interfaceid": "730a9e74-900e-49b2-a5c3-043d6da1a52b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:17:17 compute-0 nova_compute[192810]: 2025-09-30 21:17:17.419 2 DEBUG nova.network.os_vif_util [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Converting VIF {"id": "730a9e74-900e-49b2-a5c3-043d6da1a52b", "address": "fa:16:3e:72:47:71", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap730a9e74-90", "ovs_interfaceid": "730a9e74-900e-49b2-a5c3-043d6da1a52b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:17:17 compute-0 nova_compute[192810]: 2025-09-30 21:17:17.419 2 DEBUG nova.network.os_vif_util [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:72:47:71,bridge_name='br-int',has_traffic_filtering=True,id=730a9e74-900e-49b2-a5c3-043d6da1a52b,network=Network(16d40025-1087-460f-a42f-c007f6eff406),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap730a9e74-90') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:17:17 compute-0 nova_compute[192810]: 2025-09-30 21:17:17.420 2 DEBUG os_vif [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:72:47:71,bridge_name='br-int',has_traffic_filtering=True,id=730a9e74-900e-49b2-a5c3-043d6da1a52b,network=Network(16d40025-1087-460f-a42f-c007f6eff406),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap730a9e74-90') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:17:17 compute-0 nova_compute[192810]: 2025-09-30 21:17:17.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:17 compute-0 nova_compute[192810]: 2025-09-30 21:17:17.421 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:17 compute-0 nova_compute[192810]: 2025-09-30 21:17:17.421 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:17:17 compute-0 nova_compute[192810]: 2025-09-30 21:17:17.425 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:17 compute-0 nova_compute[192810]: 2025-09-30 21:17:17.425 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap730a9e74-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:17 compute-0 nova_compute[192810]: 2025-09-30 21:17:17.425 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap730a9e74-90, col_values=(('external_ids', {'iface-id': '730a9e74-900e-49b2-a5c3-043d6da1a52b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:72:47:71', 'vm-uuid': 'b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:17 compute-0 nova_compute[192810]: 2025-09-30 21:17:17.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:17 compute-0 NetworkManager[51733]: <info>  [1759267037.4286] manager: (tap730a9e74-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Sep 30 21:17:17 compute-0 nova_compute[192810]: 2025-09-30 21:17:17.430 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:17:17 compute-0 nova_compute[192810]: 2025-09-30 21:17:17.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:17 compute-0 nova_compute[192810]: 2025-09-30 21:17:17.437 2 INFO os_vif [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:72:47:71,bridge_name='br-int',has_traffic_filtering=True,id=730a9e74-900e-49b2-a5c3-043d6da1a52b,network=Network(16d40025-1087-460f-a42f-c007f6eff406),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap730a9e74-90')
Sep 30 21:17:17 compute-0 nova_compute[192810]: 2025-09-30 21:17:17.494 2 DEBUG nova.virt.libvirt.driver [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:17:17 compute-0 nova_compute[192810]: 2025-09-30 21:17:17.495 2 DEBUG nova.virt.libvirt.driver [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:17:17 compute-0 nova_compute[192810]: 2025-09-30 21:17:17.495 2 DEBUG nova.virt.libvirt.driver [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] No VIF found with MAC fa:16:3e:72:47:71, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:17:17 compute-0 nova_compute[192810]: 2025-09-30 21:17:17.496 2 INFO nova.virt.libvirt.driver [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Using config drive
Sep 30 21:17:18 compute-0 nova_compute[192810]: 2025-09-30 21:17:18.210 2 INFO nova.virt.libvirt.driver [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Creating config drive at /var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65/disk.config
Sep 30 21:17:18 compute-0 nova_compute[192810]: 2025-09-30 21:17:18.220 2 DEBUG oslo_concurrency.processutils [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcrx4ikh9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:18 compute-0 nova_compute[192810]: 2025-09-30 21:17:18.372 2 DEBUG oslo_concurrency.processutils [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcrx4ikh9" returned: 0 in 0.152s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:18 compute-0 kernel: tap730a9e74-90: entered promiscuous mode
Sep 30 21:17:18 compute-0 NetworkManager[51733]: <info>  [1759267038.4554] manager: (tap730a9e74-90): new Tun device (/org/freedesktop/NetworkManager/Devices/37)
Sep 30 21:17:18 compute-0 ovn_controller[94912]: 2025-09-30T21:17:18Z|00052|binding|INFO|Claiming lport 730a9e74-900e-49b2-a5c3-043d6da1a52b for this chassis.
Sep 30 21:17:18 compute-0 nova_compute[192810]: 2025-09-30 21:17:18.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:18 compute-0 ovn_controller[94912]: 2025-09-30T21:17:18Z|00053|binding|INFO|730a9e74-900e-49b2-a5c3-043d6da1a52b: Claiming fa:16:3e:72:47:71 10.100.0.11
Sep 30 21:17:18 compute-0 ovn_controller[94912]: 2025-09-30T21:17:18Z|00054|binding|INFO|Claiming lport 9ece0208-0151-4f0a-bda7-acd45fe4f2a0 for this chassis.
Sep 30 21:17:18 compute-0 ovn_controller[94912]: 2025-09-30T21:17:18Z|00055|binding|INFO|9ece0208-0151-4f0a-bda7-acd45fe4f2a0: Claiming fa:16:3e:28:e2:ac 19.80.0.112
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:18.508 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:28:e2:ac 19.80.0.112'], port_security=['fa:16:3e:28:e2:ac 19.80.0.112'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['730a9e74-900e-49b2-a5c3-043d6da1a52b'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1118581425', 'neutron:cidrs': '19.80.0.112/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4a7886e9-2920-46a8-89e7-811c01f2e7c6', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1118581425', 'neutron:project_id': '96460712956e4f038121397afa979163', 'neutron:revision_number': '2', 'neutron:security_group_ids': '811ddc34-8450-4370-a409-1146bdb7efe9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=10e37694-4797-470f-adb8-72e2aa69e8d9, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=9ece0208-0151-4f0a-bda7-acd45fe4f2a0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:18.511 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:72:47:71 10.100.0.11'], port_security=['fa:16:3e:72:47:71 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-415058274', 'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-16d40025-1087-460f-a42f-c007f6eff406', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-415058274', 'neutron:project_id': '96460712956e4f038121397afa979163', 'neutron:revision_number': '2', 'neutron:security_group_ids': '811ddc34-8450-4370-a409-1146bdb7efe9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7712a78f-5ca7-49dc-980c-dc4049ba5089, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=730a9e74-900e-49b2-a5c3-043d6da1a52b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:18.514 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 9ece0208-0151-4f0a-bda7-acd45fe4f2a0 in datapath 4a7886e9-2920-46a8-89e7-811c01f2e7c6 bound to our chassis
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:18.517 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4a7886e9-2920-46a8-89e7-811c01f2e7c6
Sep 30 21:17:18 compute-0 systemd-udevd[221653]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:17:18 compute-0 systemd-machined[152794]: New machine qemu-5-instance-0000000c.
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:18.539 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[792059de-5f8e-44b9-ae5b-d9c05f281642]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:18.540 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4a7886e9-21 in ovnmeta-4a7886e9-2920-46a8-89e7-811c01f2e7c6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:18.542 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4a7886e9-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:18.543 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[1984cb49-432d-472e-9215-6414be3da454]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:18.544 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[0d363382-4205-4b92-bd3f-37a36fdd9747]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:18 compute-0 NetworkManager[51733]: <info>  [1759267038.5525] device (tap730a9e74-90): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:17:18 compute-0 NetworkManager[51733]: <info>  [1759267038.5544] device (tap730a9e74-90): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:17:18 compute-0 systemd[1]: Started Virtual Machine qemu-5-instance-0000000c.
Sep 30 21:17:18 compute-0 ovn_controller[94912]: 2025-09-30T21:17:18Z|00056|binding|INFO|Setting lport 730a9e74-900e-49b2-a5c3-043d6da1a52b ovn-installed in OVS
Sep 30 21:17:18 compute-0 ovn_controller[94912]: 2025-09-30T21:17:18Z|00057|binding|INFO|Setting lport 730a9e74-900e-49b2-a5c3-043d6da1a52b up in Southbound
Sep 30 21:17:18 compute-0 ovn_controller[94912]: 2025-09-30T21:17:18Z|00058|binding|INFO|Setting lport 9ece0208-0151-4f0a-bda7-acd45fe4f2a0 up in Southbound
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:18.568 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[e4ce9af5-fda3-4b0f-9a9f-86237a659f18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:18 compute-0 nova_compute[192810]: 2025-09-30 21:17:18.569 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:18 compute-0 nova_compute[192810]: 2025-09-30 21:17:18.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:18.607 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[296e8dec-0dbc-4832-b89b-6aecb38154a9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:18.645 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[db736787-1df4-4a42-8bcf-f27b11d9a4b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:18 compute-0 NetworkManager[51733]: <info>  [1759267038.6558] manager: (tap4a7886e9-20): new Veth device (/org/freedesktop/NetworkManager/Devices/38)
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:18.655 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[90072f4e-9eef-4b64-8684-eb90ce56654d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:18 compute-0 systemd-udevd[221658]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:18.702 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[6e79aaf8-70a0-4803-9fa9-f8d36d3db083]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:18.706 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[e39a015d-e50d-4231-b8c0-edf62b4c7c4c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:18 compute-0 NetworkManager[51733]: <info>  [1759267038.7350] device (tap4a7886e9-20): carrier: link connected
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:18.744 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[b4b261d2-1886-4bb9-89ef-53935997ea07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:18.767 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[51687880-2352-4317-af30-47479ed4e502]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4a7886e9-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b5:12:cc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 22], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 373435, 'reachable_time': 30525, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221688, 'error': None, 'target': 'ovnmeta-4a7886e9-2920-46a8-89e7-811c01f2e7c6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:18.791 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[5e0d4451-f6e9-4386-b677-676a1b2c76f9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb5:12cc'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 373435, 'tstamp': 373435}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221689, 'error': None, 'target': 'ovnmeta-4a7886e9-2920-46a8-89e7-811c01f2e7c6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:18.815 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[4357077a-6078-44c6-be2f-f90e17e33a06]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4a7886e9-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b5:12:cc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 22], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 373435, 'reachable_time': 30525, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221690, 'error': None, 'target': 'ovnmeta-4a7886e9-2920-46a8-89e7-811c01f2e7c6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:18.856 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[f1dce824-464a-4c71-873c-fad59172d605]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:18 compute-0 nova_compute[192810]: 2025-09-30 21:17:18.891 2 DEBUG nova.compute.manager [req-d747529e-dbfb-4ae4-9737-f5ef99a1eb43 req-dc01dc15-77fc-49d1-832c-72c74cc9af92 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Received event network-vif-plugged-730a9e74-900e-49b2-a5c3-043d6da1a52b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:17:18 compute-0 nova_compute[192810]: 2025-09-30 21:17:18.892 2 DEBUG oslo_concurrency.lockutils [req-d747529e-dbfb-4ae4-9737-f5ef99a1eb43 req-dc01dc15-77fc-49d1-832c-72c74cc9af92 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:18 compute-0 nova_compute[192810]: 2025-09-30 21:17:18.892 2 DEBUG oslo_concurrency.lockutils [req-d747529e-dbfb-4ae4-9737-f5ef99a1eb43 req-dc01dc15-77fc-49d1-832c-72c74cc9af92 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:18 compute-0 nova_compute[192810]: 2025-09-30 21:17:18.893 2 DEBUG oslo_concurrency.lockutils [req-d747529e-dbfb-4ae4-9737-f5ef99a1eb43 req-dc01dc15-77fc-49d1-832c-72c74cc9af92 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:18 compute-0 nova_compute[192810]: 2025-09-30 21:17:18.893 2 DEBUG nova.compute.manager [req-d747529e-dbfb-4ae4-9737-f5ef99a1eb43 req-dc01dc15-77fc-49d1-832c-72c74cc9af92 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Processing event network-vif-plugged-730a9e74-900e-49b2-a5c3-043d6da1a52b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:18.931 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[29b4cabd-be47-42dd-a786-e3521aea9753]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:18.933 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4a7886e9-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:18.934 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:18.934 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4a7886e9-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:18 compute-0 NetworkManager[51733]: <info>  [1759267038.9380] manager: (tap4a7886e9-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/39)
Sep 30 21:17:18 compute-0 kernel: tap4a7886e9-20: entered promiscuous mode
Sep 30 21:17:18 compute-0 nova_compute[192810]: 2025-09-30 21:17:18.937 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:18.944 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4a7886e9-20, col_values=(('external_ids', {'iface-id': '976cb173-259a-473b-830a-8c627acdbeaf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:18 compute-0 nova_compute[192810]: 2025-09-30 21:17:18.945 2 DEBUG nova.network.neutron [req-5554c26c-b8da-42ca-a42c-7610e8d8f206 req-b16984cc-6ca2-4cfd-be1f-111d43562149 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Updated VIF entry in instance network info cache for port 730a9e74-900e-49b2-a5c3-043d6da1a52b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:17:18 compute-0 ovn_controller[94912]: 2025-09-30T21:17:18Z|00059|binding|INFO|Releasing lport 976cb173-259a-473b-830a-8c627acdbeaf from this chassis (sb_readonly=0)
Sep 30 21:17:18 compute-0 nova_compute[192810]: 2025-09-30 21:17:18.945 2 DEBUG nova.network.neutron [req-5554c26c-b8da-42ca-a42c-7610e8d8f206 req-b16984cc-6ca2-4cfd-be1f-111d43562149 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Updating instance_info_cache with network_info: [{"id": "730a9e74-900e-49b2-a5c3-043d6da1a52b", "address": "fa:16:3e:72:47:71", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap730a9e74-90", "ovs_interfaceid": "730a9e74-900e-49b2-a5c3-043d6da1a52b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:17:18 compute-0 nova_compute[192810]: 2025-09-30 21:17:18.947 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:18 compute-0 nova_compute[192810]: 2025-09-30 21:17:18.964 2 DEBUG oslo_concurrency.lockutils [req-5554c26c-b8da-42ca-a42c-7610e8d8f206 req-b16984cc-6ca2-4cfd-be1f-111d43562149 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:17:18 compute-0 nova_compute[192810]: 2025-09-30 21:17:18.971 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:18.973 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4a7886e9-2920-46a8-89e7-811c01f2e7c6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4a7886e9-2920-46a8-89e7-811c01f2e7c6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:18.974 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[4d6fae62-1460-49e1-9429-f724c65f65b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:18.975 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-4a7886e9-2920-46a8-89e7-811c01f2e7c6
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/4a7886e9-2920-46a8-89e7-811c01f2e7c6.pid.haproxy
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID 4a7886e9-2920-46a8-89e7-811c01f2e7c6
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:17:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:18.978 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4a7886e9-2920-46a8-89e7-811c01f2e7c6', 'env', 'PROCESS_TAG=haproxy-4a7886e9-2920-46a8-89e7-811c01f2e7c6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4a7886e9-2920-46a8-89e7-811c01f2e7c6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:17:19 compute-0 podman[221726]: 2025-09-30 21:17:19.436499935 +0000 UTC m=+0.071685081 container create 63ce1f36c0ff69954bcb17a5a4b82a65e80ee42bee86fc6a22f1699660d0bf92 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4a7886e9-2920-46a8-89e7-811c01f2e7c6, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Sep 30 21:17:19 compute-0 systemd[1]: Started libpod-conmon-63ce1f36c0ff69954bcb17a5a4b82a65e80ee42bee86fc6a22f1699660d0bf92.scope.
Sep 30 21:17:19 compute-0 podman[221726]: 2025-09-30 21:17:19.400663405 +0000 UTC m=+0.035848601 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:17:19 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:17:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2cbf00f05d659d809e9ba58014e60a790d4d4aa7eb5553158afe4d75e910c6fc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:17:19 compute-0 podman[221726]: 2025-09-30 21:17:19.547656017 +0000 UTC m=+0.182841183 container init 63ce1f36c0ff69954bcb17a5a4b82a65e80ee42bee86fc6a22f1699660d0bf92 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4a7886e9-2920-46a8-89e7-811c01f2e7c6, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Sep 30 21:17:19 compute-0 podman[221726]: 2025-09-30 21:17:19.554414348 +0000 UTC m=+0.189599494 container start 63ce1f36c0ff69954bcb17a5a4b82a65e80ee42bee86fc6a22f1699660d0bf92 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4a7886e9-2920-46a8-89e7-811c01f2e7c6, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:17:19 compute-0 neutron-haproxy-ovnmeta-4a7886e9-2920-46a8-89e7-811c01f2e7c6[221741]: [NOTICE]   (221745) : New worker (221747) forked
Sep 30 21:17:19 compute-0 neutron-haproxy-ovnmeta-4a7886e9-2920-46a8-89e7-811c01f2e7c6[221741]: [NOTICE]   (221745) : Loading success.
Sep 30 21:17:19 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:19.616 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 730a9e74-900e-49b2-a5c3-043d6da1a52b in datapath 16d40025-1087-460f-a42f-c007f6eff406 unbound from our chassis
Sep 30 21:17:19 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:19.620 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 16d40025-1087-460f-a42f-c007f6eff406
Sep 30 21:17:19 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:19.631 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[ffa3920e-6029-4208-97be-b4ef4a5ea262]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:19 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:19.632 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap16d40025-11 in ovnmeta-16d40025-1087-460f-a42f-c007f6eff406 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:17:19 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:19.634 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap16d40025-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:17:19 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:19.634 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[17bd1450-7a21-4adb-8e98-e0f62665ed09]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:19 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:19.635 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[2cf208b1-5b9e-4f89-8625-67086d10c137]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:19 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:19.649 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[b434ace2-0385-4c83-a71e-39abfae8e4b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:19 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:19.671 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[cb37efa6-fa1f-4f1f-822b-af88a713749d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:19 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:19.707 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[47d0bc8e-6d70-4205-9aa5-5386bc7d61b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:19 compute-0 NetworkManager[51733]: <info>  [1759267039.7150] manager: (tap16d40025-10): new Veth device (/org/freedesktop/NetworkManager/Devices/40)
Sep 30 21:17:19 compute-0 systemd-udevd[221670]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:17:19 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:19.713 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e893e4e8-0b31-4fdd-913b-5941b8ac9047]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:19 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:19.766 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[04060c25-1feb-4812-8645-b3469ba7129a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:19 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:19.771 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[e142452c-04ac-4816-ae42-40766bc2f3fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:19 compute-0 NetworkManager[51733]: <info>  [1759267039.8110] device (tap16d40025-10): carrier: link connected
Sep 30 21:17:19 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:19.821 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[68330249-d858-4880-a486-12ed4ff296e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:19 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:19.847 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[2a960973-5647-4107-97a5-35ca70f2e692]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap16d40025-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:64:c7:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 373543, 'reachable_time': 38891, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221766, 'error': None, 'target': 'ovnmeta-16d40025-1087-460f-a42f-c007f6eff406', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:19 compute-0 nova_compute[192810]: 2025-09-30 21:17:19.863 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267039.8626645, b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:17:19 compute-0 nova_compute[192810]: 2025-09-30 21:17:19.864 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] VM Started (Lifecycle Event)
Sep 30 21:17:19 compute-0 nova_compute[192810]: 2025-09-30 21:17:19.867 2 DEBUG nova.compute.manager [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:17:19 compute-0 nova_compute[192810]: 2025-09-30 21:17:19.873 2 DEBUG nova.virt.libvirt.driver [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:17:19 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:19.877 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e3d85f00-db25-43a4-a71e-646f0a71dbc6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe64:c752'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 373543, 'tstamp': 373543}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221767, 'error': None, 'target': 'ovnmeta-16d40025-1087-460f-a42f-c007f6eff406', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:19 compute-0 nova_compute[192810]: 2025-09-30 21:17:19.880 2 INFO nova.virt.libvirt.driver [-] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Instance spawned successfully.
Sep 30 21:17:19 compute-0 nova_compute[192810]: 2025-09-30 21:17:19.881 2 DEBUG nova.virt.libvirt.driver [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:17:19 compute-0 nova_compute[192810]: 2025-09-30 21:17:19.888 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:17:19 compute-0 nova_compute[192810]: 2025-09-30 21:17:19.892 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:17:19 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:19.896 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[5b621bbd-2e81-4d58-bc1d-47e45347e31e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap16d40025-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:64:c7:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 373543, 'reachable_time': 38891, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221768, 'error': None, 'target': 'ovnmeta-16d40025-1087-460f-a42f-c007f6eff406', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:19 compute-0 nova_compute[192810]: 2025-09-30 21:17:19.912 2 DEBUG nova.virt.libvirt.driver [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:17:19 compute-0 nova_compute[192810]: 2025-09-30 21:17:19.913 2 DEBUG nova.virt.libvirt.driver [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:17:19 compute-0 nova_compute[192810]: 2025-09-30 21:17:19.913 2 DEBUG nova.virt.libvirt.driver [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:17:19 compute-0 nova_compute[192810]: 2025-09-30 21:17:19.914 2 DEBUG nova.virt.libvirt.driver [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:17:19 compute-0 nova_compute[192810]: 2025-09-30 21:17:19.914 2 DEBUG nova.virt.libvirt.driver [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:17:19 compute-0 nova_compute[192810]: 2025-09-30 21:17:19.915 2 DEBUG nova.virt.libvirt.driver [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:17:19 compute-0 nova_compute[192810]: 2025-09-30 21:17:19.919 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:17:19 compute-0 nova_compute[192810]: 2025-09-30 21:17:19.920 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267039.8664238, b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:17:19 compute-0 nova_compute[192810]: 2025-09-30 21:17:19.920 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] VM Paused (Lifecycle Event)
Sep 30 21:17:19 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:19.940 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[6dab936a-1982-4a0c-9386-faa0074648d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:19 compute-0 nova_compute[192810]: 2025-09-30 21:17:19.950 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:17:19 compute-0 nova_compute[192810]: 2025-09-30 21:17:19.953 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267039.871882, b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:17:19 compute-0 nova_compute[192810]: 2025-09-30 21:17:19.954 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] VM Resumed (Lifecycle Event)
Sep 30 21:17:19 compute-0 nova_compute[192810]: 2025-09-30 21:17:19.980 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:17:19 compute-0 nova_compute[192810]: 2025-09-30 21:17:19.988 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:17:20 compute-0 nova_compute[192810]: 2025-09-30 21:17:20.003 2 INFO nova.compute.manager [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Took 5.86 seconds to spawn the instance on the hypervisor.
Sep 30 21:17:20 compute-0 nova_compute[192810]: 2025-09-30 21:17:20.004 2 DEBUG nova.compute.manager [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:17:20 compute-0 nova_compute[192810]: 2025-09-30 21:17:20.007 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:17:20 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:20.047 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[9cb570cf-fe70-4c7f-a154-d5b3f2d3f683]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:20 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:20.049 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap16d40025-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:20 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:20.049 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:17:20 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:20.050 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap16d40025-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:20 compute-0 nova_compute[192810]: 2025-09-30 21:17:20.053 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:20 compute-0 NetworkManager[51733]: <info>  [1759267040.0542] manager: (tap16d40025-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/41)
Sep 30 21:17:20 compute-0 kernel: tap16d40025-10: entered promiscuous mode
Sep 30 21:17:20 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:20.060 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap16d40025-10, col_values=(('external_ids', {'iface-id': '0c66892e-7baf-4f9a-a329-dd0545dbf700'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:20 compute-0 ovn_controller[94912]: 2025-09-30T21:17:20Z|00060|binding|INFO|Releasing lport 0c66892e-7baf-4f9a-a329-dd0545dbf700 from this chassis (sb_readonly=0)
Sep 30 21:17:20 compute-0 nova_compute[192810]: 2025-09-30 21:17:20.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:20 compute-0 nova_compute[192810]: 2025-09-30 21:17:20.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:20 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:20.066 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/16d40025-1087-460f-a42f-c007f6eff406.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/16d40025-1087-460f-a42f-c007f6eff406.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:17:20 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:20.067 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a58fa1d5-92b6-491e-8284-d01713206555]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:20 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:20.068 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:17:20 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:17:20 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:17:20 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-16d40025-1087-460f-a42f-c007f6eff406
Sep 30 21:17:20 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:17:20 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:17:20 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:17:20 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/16d40025-1087-460f-a42f-c007f6eff406.pid.haproxy
Sep 30 21:17:20 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:17:20 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:17:20 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:17:20 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:17:20 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:17:20 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:17:20 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:17:20 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:17:20 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:17:20 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:17:20 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:17:20 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:17:20 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:17:20 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:17:20 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:17:20 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:17:20 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:17:20 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:17:20 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:17:20 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:17:20 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID 16d40025-1087-460f-a42f-c007f6eff406
Sep 30 21:17:20 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:17:20 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:20.069 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-16d40025-1087-460f-a42f-c007f6eff406', 'env', 'PROCESS_TAG=haproxy-16d40025-1087-460f-a42f-c007f6eff406', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/16d40025-1087-460f-a42f-c007f6eff406.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:17:20 compute-0 nova_compute[192810]: 2025-09-30 21:17:20.082 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:20 compute-0 nova_compute[192810]: 2025-09-30 21:17:20.241 2 INFO nova.compute.manager [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Took 6.64 seconds to build instance.
Sep 30 21:17:20 compute-0 nova_compute[192810]: 2025-09-30 21:17:20.263 2 DEBUG oslo_concurrency.lockutils [None req-1f21cef3-ff82-49f7-aa75-0197b3949ce8 4b263d7c3e3141f999e8eabf49e8190c 96460712956e4f038121397afa979163 - - default default] Lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.791s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:20 compute-0 podman[221801]: 2025-09-30 21:17:20.498839823 +0000 UTC m=+0.077681363 container create fb35cb2fd8a68a330cf84231765cb8fbf7fd5763b5bb2fcfb01eeb22b34720fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923)
Sep 30 21:17:20 compute-0 systemd[1]: Started libpod-conmon-fb35cb2fd8a68a330cf84231765cb8fbf7fd5763b5bb2fcfb01eeb22b34720fc.scope.
Sep 30 21:17:20 compute-0 podman[221801]: 2025-09-30 21:17:20.45301999 +0000 UTC m=+0.031861580 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:17:20 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:17:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/67493d0b0ffdda01faef13c054f5e134213eb07cfbbef7fe0c824bdf1e0d45fb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:17:20 compute-0 podman[221801]: 2025-09-30 21:17:20.593823414 +0000 UTC m=+0.172664944 container init fb35cb2fd8a68a330cf84231765cb8fbf7fd5763b5bb2fcfb01eeb22b34720fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_managed=true)
Sep 30 21:17:20 compute-0 podman[221801]: 2025-09-30 21:17:20.602877244 +0000 UTC m=+0.181718754 container start fb35cb2fd8a68a330cf84231765cb8fbf7fd5763b5bb2fcfb01eeb22b34720fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:17:20 compute-0 neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406[221817]: [NOTICE]   (221837) : New worker (221844) forked
Sep 30 21:17:20 compute-0 neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406[221817]: [NOTICE]   (221837) : Loading success.
Sep 30 21:17:20 compute-0 podman[221814]: 2025-09-30 21:17:20.662058387 +0000 UTC m=+0.112610450 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 21:17:20 compute-0 nova_compute[192810]: 2025-09-30 21:17:20.957 2 DEBUG nova.compute.manager [req-4bfcb31f-9bbc-48f0-89d5-80e0d4a1b3e3 req-dd8d4c20-662b-4c21-8638-7bcaf7d2c757 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Received event network-vif-plugged-730a9e74-900e-49b2-a5c3-043d6da1a52b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:17:20 compute-0 nova_compute[192810]: 2025-09-30 21:17:20.957 2 DEBUG oslo_concurrency.lockutils [req-4bfcb31f-9bbc-48f0-89d5-80e0d4a1b3e3 req-dd8d4c20-662b-4c21-8638-7bcaf7d2c757 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:20 compute-0 nova_compute[192810]: 2025-09-30 21:17:20.958 2 DEBUG oslo_concurrency.lockutils [req-4bfcb31f-9bbc-48f0-89d5-80e0d4a1b3e3 req-dd8d4c20-662b-4c21-8638-7bcaf7d2c757 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:20 compute-0 nova_compute[192810]: 2025-09-30 21:17:20.958 2 DEBUG oslo_concurrency.lockutils [req-4bfcb31f-9bbc-48f0-89d5-80e0d4a1b3e3 req-dd8d4c20-662b-4c21-8638-7bcaf7d2c757 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:20 compute-0 nova_compute[192810]: 2025-09-30 21:17:20.959 2 DEBUG nova.compute.manager [req-4bfcb31f-9bbc-48f0-89d5-80e0d4a1b3e3 req-dd8d4c20-662b-4c21-8638-7bcaf7d2c757 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] No waiting events found dispatching network-vif-plugged-730a9e74-900e-49b2-a5c3-043d6da1a52b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:17:20 compute-0 nova_compute[192810]: 2025-09-30 21:17:20.960 2 WARNING nova.compute.manager [req-4bfcb31f-9bbc-48f0-89d5-80e0d4a1b3e3 req-dd8d4c20-662b-4c21-8638-7bcaf7d2c757 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Received unexpected event network-vif-plugged-730a9e74-900e-49b2-a5c3-043d6da1a52b for instance with vm_state active and task_state None.
Sep 30 21:17:21 compute-0 nova_compute[192810]: 2025-09-30 21:17:21.108 2 DEBUG nova.compute.manager [req-f7a184ec-b5bd-46ab-aa3f-c4a5b2feda77 req-a9daf1f0-1bbe-49e1-97e9-b4f0c573c8ff dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Received event network-vif-plugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:17:21 compute-0 nova_compute[192810]: 2025-09-30 21:17:21.109 2 DEBUG oslo_concurrency.lockutils [req-f7a184ec-b5bd-46ab-aa3f-c4a5b2feda77 req-a9daf1f0-1bbe-49e1-97e9-b4f0c573c8ff dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:21 compute-0 nova_compute[192810]: 2025-09-30 21:17:21.110 2 DEBUG oslo_concurrency.lockutils [req-f7a184ec-b5bd-46ab-aa3f-c4a5b2feda77 req-a9daf1f0-1bbe-49e1-97e9-b4f0c573c8ff dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:21 compute-0 nova_compute[192810]: 2025-09-30 21:17:21.110 2 DEBUG oslo_concurrency.lockutils [req-f7a184ec-b5bd-46ab-aa3f-c4a5b2feda77 req-a9daf1f0-1bbe-49e1-97e9-b4f0c573c8ff dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:21 compute-0 nova_compute[192810]: 2025-09-30 21:17:21.111 2 DEBUG nova.compute.manager [req-f7a184ec-b5bd-46ab-aa3f-c4a5b2feda77 req-a9daf1f0-1bbe-49e1-97e9-b4f0c573c8ff dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Processing event network-vif-plugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:17:21 compute-0 nova_compute[192810]: 2025-09-30 21:17:21.112 2 DEBUG nova.compute.manager [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Instance event wait completed in 5 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:17:21 compute-0 nova_compute[192810]: 2025-09-30 21:17:21.129 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267041.1180713, 7f0e9e16-1467-41e5-b5b0-965591aa014c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:17:21 compute-0 nova_compute[192810]: 2025-09-30 21:17:21.130 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] VM Resumed (Lifecycle Event)
Sep 30 21:17:21 compute-0 nova_compute[192810]: 2025-09-30 21:17:21.134 2 DEBUG nova.virt.libvirt.driver [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:17:21 compute-0 nova_compute[192810]: 2025-09-30 21:17:21.141 2 INFO nova.virt.libvirt.driver [-] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Instance spawned successfully.
Sep 30 21:17:21 compute-0 nova_compute[192810]: 2025-09-30 21:17:21.141 2 DEBUG nova.virt.libvirt.driver [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:17:21 compute-0 nova_compute[192810]: 2025-09-30 21:17:21.159 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:17:21 compute-0 nova_compute[192810]: 2025-09-30 21:17:21.170 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:17:21 compute-0 nova_compute[192810]: 2025-09-30 21:17:21.175 2 DEBUG nova.virt.libvirt.driver [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:17:21 compute-0 nova_compute[192810]: 2025-09-30 21:17:21.175 2 DEBUG nova.virt.libvirt.driver [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:17:21 compute-0 nova_compute[192810]: 2025-09-30 21:17:21.176 2 DEBUG nova.virt.libvirt.driver [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:17:21 compute-0 nova_compute[192810]: 2025-09-30 21:17:21.177 2 DEBUG nova.virt.libvirt.driver [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:17:21 compute-0 nova_compute[192810]: 2025-09-30 21:17:21.177 2 DEBUG nova.virt.libvirt.driver [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:17:21 compute-0 nova_compute[192810]: 2025-09-30 21:17:21.178 2 DEBUG nova.virt.libvirt.driver [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:17:21 compute-0 nova_compute[192810]: 2025-09-30 21:17:21.187 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:17:21 compute-0 nova_compute[192810]: 2025-09-30 21:17:21.266 2 INFO nova.compute.manager [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Took 10.96 seconds to spawn the instance on the hypervisor.
Sep 30 21:17:21 compute-0 nova_compute[192810]: 2025-09-30 21:17:21.266 2 DEBUG nova.compute.manager [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:17:21 compute-0 nova_compute[192810]: 2025-09-30 21:17:21.315 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:21 compute-0 nova_compute[192810]: 2025-09-30 21:17:21.357 2 INFO nova.compute.manager [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Took 11.74 seconds to build instance.
Sep 30 21:17:21 compute-0 nova_compute[192810]: 2025-09-30 21:17:21.373 2 DEBUG oslo_concurrency.lockutils [None req-58063b40-3100-4800-b891-bd6b52b36956 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.846s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:22 compute-0 podman[221857]: 2025-09-30 21:17:22.364008714 +0000 UTC m=+0.085820510 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Sep 30 21:17:22 compute-0 nova_compute[192810]: 2025-09-30 21:17:22.428 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:23 compute-0 nova_compute[192810]: 2025-09-30 21:17:23.226 2 DEBUG nova.compute.manager [req-50aba14f-82a3-4559-8582-a253b1b53d7f req-90db5102-ed41-4372-83cd-468b94819af4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Received event network-vif-plugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:17:23 compute-0 nova_compute[192810]: 2025-09-30 21:17:23.227 2 DEBUG oslo_concurrency.lockutils [req-50aba14f-82a3-4559-8582-a253b1b53d7f req-90db5102-ed41-4372-83cd-468b94819af4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:23 compute-0 nova_compute[192810]: 2025-09-30 21:17:23.227 2 DEBUG oslo_concurrency.lockutils [req-50aba14f-82a3-4559-8582-a253b1b53d7f req-90db5102-ed41-4372-83cd-468b94819af4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:23 compute-0 nova_compute[192810]: 2025-09-30 21:17:23.227 2 DEBUG oslo_concurrency.lockutils [req-50aba14f-82a3-4559-8582-a253b1b53d7f req-90db5102-ed41-4372-83cd-468b94819af4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:23 compute-0 nova_compute[192810]: 2025-09-30 21:17:23.228 2 DEBUG nova.compute.manager [req-50aba14f-82a3-4559-8582-a253b1b53d7f req-90db5102-ed41-4372-83cd-468b94819af4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] No waiting events found dispatching network-vif-plugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:17:23 compute-0 nova_compute[192810]: 2025-09-30 21:17:23.228 2 WARNING nova.compute.manager [req-50aba14f-82a3-4559-8582-a253b1b53d7f req-90db5102-ed41-4372-83cd-468b94819af4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Received unexpected event network-vif-plugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 for instance with vm_state active and task_state None.
Sep 30 21:17:24 compute-0 nova_compute[192810]: 2025-09-30 21:17:24.112 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267029.1114976, a99700e2-8d2c-4da5-a64d-faee03f5c83d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:17:24 compute-0 nova_compute[192810]: 2025-09-30 21:17:24.114 2 INFO nova.compute.manager [-] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] VM Stopped (Lifecycle Event)
Sep 30 21:17:24 compute-0 nova_compute[192810]: 2025-09-30 21:17:24.142 2 DEBUG nova.compute.manager [None req-b70b6192-dfcd-4fc8-80e2-d2554940e962 - - - - - -] [instance: a99700e2-8d2c-4da5-a64d-faee03f5c83d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:17:26 compute-0 nova_compute[192810]: 2025-09-30 21:17:26.321 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:26 compute-0 nova_compute[192810]: 2025-09-30 21:17:26.584 2 DEBUG nova.virt.libvirt.driver [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Check if temp file /var/lib/nova/instances/tmpghl6923x exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Sep 30 21:17:26 compute-0 nova_compute[192810]: 2025-09-30 21:17:26.585 2 DEBUG nova.compute.manager [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpghl6923x',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7f0e9e16-1467-41e5-b5b0-965591aa014c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Sep 30 21:17:26 compute-0 nova_compute[192810]: 2025-09-30 21:17:26.758 2 DEBUG nova.virt.libvirt.driver [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Check if temp file /var/lib/nova/instances/tmps3z4e6mm exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Sep 30 21:17:26 compute-0 nova_compute[192810]: 2025-09-30 21:17:26.764 2 DEBUG oslo_concurrency.processutils [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:26 compute-0 nova_compute[192810]: 2025-09-30 21:17:26.830 2 DEBUG oslo_concurrency.processutils [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:26 compute-0 nova_compute[192810]: 2025-09-30 21:17:26.832 2 DEBUG oslo_concurrency.processutils [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:26 compute-0 nova_compute[192810]: 2025-09-30 21:17:26.892 2 DEBUG oslo_concurrency.processutils [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:26 compute-0 nova_compute[192810]: 2025-09-30 21:17:26.894 2 DEBUG nova.compute.manager [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmps3z4e6mm',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Sep 30 21:17:27 compute-0 podman[221884]: 2025-09-30 21:17:27.360023624 +0000 UTC m=+0.086736733 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 21:17:27 compute-0 nova_compute[192810]: 2025-09-30 21:17:27.431 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:27 compute-0 nova_compute[192810]: 2025-09-30 21:17:27.857 2 DEBUG oslo_concurrency.processutils [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:27 compute-0 nova_compute[192810]: 2025-09-30 21:17:27.972 2 DEBUG oslo_concurrency.processutils [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk --force-share --output=json" returned: 0 in 0.115s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:27 compute-0 nova_compute[192810]: 2025-09-30 21:17:27.975 2 DEBUG oslo_concurrency.processutils [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:28 compute-0 nova_compute[192810]: 2025-09-30 21:17:28.062 2 DEBUG oslo_concurrency.processutils [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:31 compute-0 sshd-session[221921]: Accepted publickey for nova from 192.168.122.101 port 40790 ssh2: ECDSA SHA256:MZb8WjUIxCo1ZPhM/oSWWpmJKsqmELiNET2dwGEt9P4
Sep 30 21:17:31 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Sep 30 21:17:31 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Sep 30 21:17:31 compute-0 systemd-logind[792]: New session 30 of user nova.
Sep 30 21:17:31 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Sep 30 21:17:31 compute-0 systemd[1]: Starting User Manager for UID 42436...
Sep 30 21:17:31 compute-0 systemd[221926]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:17:31 compute-0 nova_compute[192810]: 2025-09-30 21:17:31.329 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:31 compute-0 systemd[221926]: Queued start job for default target Main User Target.
Sep 30 21:17:31 compute-0 systemd[221926]: Created slice User Application Slice.
Sep 30 21:17:31 compute-0 systemd[221926]: Started Mark boot as successful after the user session has run 2 minutes.
Sep 30 21:17:31 compute-0 systemd[221926]: Started Daily Cleanup of User's Temporary Directories.
Sep 30 21:17:31 compute-0 systemd[221926]: Reached target Paths.
Sep 30 21:17:31 compute-0 systemd[221926]: Reached target Timers.
Sep 30 21:17:31 compute-0 systemd[221926]: Starting D-Bus User Message Bus Socket...
Sep 30 21:17:31 compute-0 systemd[221926]: Starting Create User's Volatile Files and Directories...
Sep 30 21:17:31 compute-0 systemd[221926]: Listening on D-Bus User Message Bus Socket.
Sep 30 21:17:31 compute-0 systemd[221926]: Reached target Sockets.
Sep 30 21:17:31 compute-0 systemd[221926]: Finished Create User's Volatile Files and Directories.
Sep 30 21:17:31 compute-0 systemd[221926]: Reached target Basic System.
Sep 30 21:17:31 compute-0 systemd[221926]: Reached target Main User Target.
Sep 30 21:17:31 compute-0 systemd[221926]: Startup finished in 211ms.
Sep 30 21:17:31 compute-0 systemd[1]: Started User Manager for UID 42436.
Sep 30 21:17:31 compute-0 systemd[1]: Started Session 30 of User nova.
Sep 30 21:17:31 compute-0 sshd-session[221921]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:17:31 compute-0 sshd-session[221941]: Received disconnect from 192.168.122.101 port 40790:11: disconnected by user
Sep 30 21:17:31 compute-0 sshd-session[221941]: Disconnected from user nova 192.168.122.101 port 40790
Sep 30 21:17:31 compute-0 sshd-session[221921]: pam_unix(sshd:session): session closed for user nova
Sep 30 21:17:31 compute-0 systemd[1]: session-30.scope: Deactivated successfully.
Sep 30 21:17:31 compute-0 systemd-logind[792]: Session 30 logged out. Waiting for processes to exit.
Sep 30 21:17:31 compute-0 systemd-logind[792]: Removed session 30.
Sep 30 21:17:31 compute-0 ovn_controller[94912]: 2025-09-30T21:17:31Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:72:47:71 10.100.0.11
Sep 30 21:17:31 compute-0 ovn_controller[94912]: 2025-09-30T21:17:31Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:72:47:71 10.100.0.11
Sep 30 21:17:31 compute-0 sshd-session[221877]: Invalid user treyon from 80.94.95.112 port 63662
Sep 30 21:17:31 compute-0 sshd-session[221877]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:17:31 compute-0 sshd-session[221877]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.95.112
Sep 30 21:17:32 compute-0 nova_compute[192810]: 2025-09-30 21:17:32.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:32 compute-0 nova_compute[192810]: 2025-09-30 21:17:32.782 2 DEBUG nova.compute.manager [req-8d35918f-5419-4be9-9cf4-3464f0a6bfd1 req-7c46ea0e-215c-4167-9927-8373a1470d5a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Received event network-vif-unplugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:17:32 compute-0 nova_compute[192810]: 2025-09-30 21:17:32.783 2 DEBUG oslo_concurrency.lockutils [req-8d35918f-5419-4be9-9cf4-3464f0a6bfd1 req-7c46ea0e-215c-4167-9927-8373a1470d5a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:32 compute-0 nova_compute[192810]: 2025-09-30 21:17:32.783 2 DEBUG oslo_concurrency.lockutils [req-8d35918f-5419-4be9-9cf4-3464f0a6bfd1 req-7c46ea0e-215c-4167-9927-8373a1470d5a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:32 compute-0 nova_compute[192810]: 2025-09-30 21:17:32.783 2 DEBUG oslo_concurrency.lockutils [req-8d35918f-5419-4be9-9cf4-3464f0a6bfd1 req-7c46ea0e-215c-4167-9927-8373a1470d5a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:32 compute-0 nova_compute[192810]: 2025-09-30 21:17:32.784 2 DEBUG nova.compute.manager [req-8d35918f-5419-4be9-9cf4-3464f0a6bfd1 req-7c46ea0e-215c-4167-9927-8373a1470d5a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] No waiting events found dispatching network-vif-unplugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:17:32 compute-0 nova_compute[192810]: 2025-09-30 21:17:32.784 2 DEBUG nova.compute.manager [req-8d35918f-5419-4be9-9cf4-3464f0a6bfd1 req-7c46ea0e-215c-4167-9927-8373a1470d5a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Received event network-vif-unplugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:17:32 compute-0 nova_compute[192810]: 2025-09-30 21:17:32.803 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:17:33 compute-0 nova_compute[192810]: 2025-09-30 21:17:33.147 2 INFO nova.compute.manager [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Took 5.08 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Sep 30 21:17:33 compute-0 nova_compute[192810]: 2025-09-30 21:17:33.148 2 DEBUG nova.compute.manager [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:17:33 compute-0 nova_compute[192810]: 2025-09-30 21:17:33.172 2 DEBUG nova.compute.manager [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpghl6923x',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7f0e9e16-1467-41e5-b5b0-965591aa014c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(0a2a09d7-020a-40a6-b7d6-da99acd63254),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Sep 30 21:17:33 compute-0 nova_compute[192810]: 2025-09-30 21:17:33.199 2 DEBUG nova.objects.instance [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Lazy-loading 'migration_context' on Instance uuid 7f0e9e16-1467-41e5-b5b0-965591aa014c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:17:33 compute-0 nova_compute[192810]: 2025-09-30 21:17:33.201 2 DEBUG nova.virt.libvirt.driver [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Sep 30 21:17:33 compute-0 nova_compute[192810]: 2025-09-30 21:17:33.203 2 DEBUG nova.virt.libvirt.driver [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Sep 30 21:17:33 compute-0 nova_compute[192810]: 2025-09-30 21:17:33.203 2 DEBUG nova.virt.libvirt.driver [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Sep 30 21:17:33 compute-0 nova_compute[192810]: 2025-09-30 21:17:33.218 2 DEBUG nova.virt.libvirt.vif [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:17:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-772926731',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-772926731',id=11,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:17:21Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='544a33c53701466d8bf7e8ed34f38dcb',ramdisk_id='',reservation_id='r-kiix6ynt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-860972404',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-860972404-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:17:21Z,user_data=None,user_id='981e96ea2bc2419d9a1e57d6aed70304',uuid=7f0e9e16-1467-41e5-b5b0-965591aa014c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "address": "fa:16:3e:45:9d:62", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapbb8ecc8e-9c", "ovs_interfaceid": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:17:33 compute-0 nova_compute[192810]: 2025-09-30 21:17:33.219 2 DEBUG nova.network.os_vif_util [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Converting VIF {"id": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "address": "fa:16:3e:45:9d:62", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapbb8ecc8e-9c", "ovs_interfaceid": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:17:33 compute-0 nova_compute[192810]: 2025-09-30 21:17:33.219 2 DEBUG nova.network.os_vif_util [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:9d:62,bridge_name='br-int',has_traffic_filtering=True,id=bb8ecc8e-9cf4-4901-9788-83c49356f983,network=Network(934fff90-5446-41f1-a5ad-d2568cb337b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb8ecc8e-9c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:17:33 compute-0 nova_compute[192810]: 2025-09-30 21:17:33.220 2 DEBUG nova.virt.libvirt.migration [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Updating guest XML with vif config: <interface type="ethernet">
Sep 30 21:17:33 compute-0 nova_compute[192810]:   <mac address="fa:16:3e:45:9d:62"/>
Sep 30 21:17:33 compute-0 nova_compute[192810]:   <model type="virtio"/>
Sep 30 21:17:33 compute-0 nova_compute[192810]:   <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:17:33 compute-0 nova_compute[192810]:   <mtu size="1442"/>
Sep 30 21:17:33 compute-0 nova_compute[192810]:   <target dev="tapbb8ecc8e-9c"/>
Sep 30 21:17:33 compute-0 nova_compute[192810]: </interface>
Sep 30 21:17:33 compute-0 nova_compute[192810]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Sep 30 21:17:33 compute-0 nova_compute[192810]: 2025-09-30 21:17:33.221 2 DEBUG nova.virt.libvirt.driver [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Sep 30 21:17:33 compute-0 nova_compute[192810]: 2025-09-30 21:17:33.707 2 DEBUG nova.virt.libvirt.migration [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Sep 30 21:17:33 compute-0 nova_compute[192810]: 2025-09-30 21:17:33.707 2 INFO nova.virt.libvirt.migration [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Increasing downtime to 50 ms after 0 sec elapsed time
Sep 30 21:17:33 compute-0 nova_compute[192810]: 2025-09-30 21:17:33.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:17:33 compute-0 nova_compute[192810]: 2025-09-30 21:17:33.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:17:33 compute-0 nova_compute[192810]: 2025-09-30 21:17:33.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:17:33 compute-0 nova_compute[192810]: 2025-09-30 21:17:33.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:17:33 compute-0 nova_compute[192810]: 2025-09-30 21:17:33.801 2 INFO nova.virt.libvirt.driver [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Sep 30 21:17:33 compute-0 ovn_controller[94912]: 2025-09-30T21:17:33Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:45:9d:62 10.100.0.12
Sep 30 21:17:33 compute-0 ovn_controller[94912]: 2025-09-30T21:17:33Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:45:9d:62 10.100.0.12
Sep 30 21:17:34 compute-0 nova_compute[192810]: 2025-09-30 21:17:34.304 2 DEBUG nova.virt.libvirt.migration [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Sep 30 21:17:34 compute-0 nova_compute[192810]: 2025-09-30 21:17:34.305 2 DEBUG nova.virt.libvirt.migration [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Sep 30 21:17:34 compute-0 podman[221955]: 2025-09-30 21:17:34.360464121 +0000 UTC m=+0.089474242 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:17:34 compute-0 podman[221956]: 2025-09-30 21:17:34.373743828 +0000 UTC m=+0.097288681 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, version=9.6, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_id=edpm, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, distribution-scope=public, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41)
Sep 30 21:17:34 compute-0 sshd-session[221877]: Failed password for invalid user treyon from 80.94.95.112 port 63662 ssh2
Sep 30 21:17:34 compute-0 nova_compute[192810]: 2025-09-30 21:17:34.784 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:17:34 compute-0 nova_compute[192810]: 2025-09-30 21:17:34.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:17:34 compute-0 nova_compute[192810]: 2025-09-30 21:17:34.810 2 DEBUG nova.virt.libvirt.migration [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Sep 30 21:17:34 compute-0 nova_compute[192810]: 2025-09-30 21:17:34.811 2 DEBUG nova.virt.libvirt.migration [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Sep 30 21:17:34 compute-0 nova_compute[192810]: 2025-09-30 21:17:34.895 2 DEBUG nova.compute.manager [req-db11594d-f205-4893-9697-daed16fbbe05 req-ca978f25-1fa9-4021-9e22-b1c637784ca0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Received event network-vif-plugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:17:34 compute-0 nova_compute[192810]: 2025-09-30 21:17:34.895 2 DEBUG oslo_concurrency.lockutils [req-db11594d-f205-4893-9697-daed16fbbe05 req-ca978f25-1fa9-4021-9e22-b1c637784ca0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:34 compute-0 nova_compute[192810]: 2025-09-30 21:17:34.896 2 DEBUG oslo_concurrency.lockutils [req-db11594d-f205-4893-9697-daed16fbbe05 req-ca978f25-1fa9-4021-9e22-b1c637784ca0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:34 compute-0 nova_compute[192810]: 2025-09-30 21:17:34.896 2 DEBUG oslo_concurrency.lockutils [req-db11594d-f205-4893-9697-daed16fbbe05 req-ca978f25-1fa9-4021-9e22-b1c637784ca0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:34 compute-0 nova_compute[192810]: 2025-09-30 21:17:34.897 2 DEBUG nova.compute.manager [req-db11594d-f205-4893-9697-daed16fbbe05 req-ca978f25-1fa9-4021-9e22-b1c637784ca0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] No waiting events found dispatching network-vif-plugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:17:34 compute-0 nova_compute[192810]: 2025-09-30 21:17:34.897 2 WARNING nova.compute.manager [req-db11594d-f205-4893-9697-daed16fbbe05 req-ca978f25-1fa9-4021-9e22-b1c637784ca0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Received unexpected event network-vif-plugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 for instance with vm_state active and task_state migrating.
Sep 30 21:17:34 compute-0 nova_compute[192810]: 2025-09-30 21:17:34.898 2 DEBUG nova.compute.manager [req-db11594d-f205-4893-9697-daed16fbbe05 req-ca978f25-1fa9-4021-9e22-b1c637784ca0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Received event network-changed-bb8ecc8e-9cf4-4901-9788-83c49356f983 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:17:34 compute-0 nova_compute[192810]: 2025-09-30 21:17:34.898 2 DEBUG nova.compute.manager [req-db11594d-f205-4893-9697-daed16fbbe05 req-ca978f25-1fa9-4021-9e22-b1c637784ca0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Refreshing instance network info cache due to event network-changed-bb8ecc8e-9cf4-4901-9788-83c49356f983. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:17:34 compute-0 nova_compute[192810]: 2025-09-30 21:17:34.899 2 DEBUG oslo_concurrency.lockutils [req-db11594d-f205-4893-9697-daed16fbbe05 req-ca978f25-1fa9-4021-9e22-b1c637784ca0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-7f0e9e16-1467-41e5-b5b0-965591aa014c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:17:34 compute-0 nova_compute[192810]: 2025-09-30 21:17:34.899 2 DEBUG oslo_concurrency.lockutils [req-db11594d-f205-4893-9697-daed16fbbe05 req-ca978f25-1fa9-4021-9e22-b1c637784ca0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-7f0e9e16-1467-41e5-b5b0-965591aa014c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:17:34 compute-0 nova_compute[192810]: 2025-09-30 21:17:34.900 2 DEBUG nova.network.neutron [req-db11594d-f205-4893-9697-daed16fbbe05 req-ca978f25-1fa9-4021-9e22-b1c637784ca0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Refreshing network info cache for port bb8ecc8e-9cf4-4901-9788-83c49356f983 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:17:35 compute-0 nova_compute[192810]: 2025-09-30 21:17:35.314 2 DEBUG nova.virt.libvirt.migration [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Sep 30 21:17:35 compute-0 nova_compute[192810]: 2025-09-30 21:17:35.315 2 DEBUG nova.virt.libvirt.migration [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Sep 30 21:17:35 compute-0 nova_compute[192810]: 2025-09-30 21:17:35.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:17:35 compute-0 nova_compute[192810]: 2025-09-30 21:17:35.807 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:35 compute-0 nova_compute[192810]: 2025-09-30 21:17:35.808 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:35 compute-0 nova_compute[192810]: 2025-09-30 21:17:35.808 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:35 compute-0 nova_compute[192810]: 2025-09-30 21:17:35.808 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:17:35 compute-0 nova_compute[192810]: 2025-09-30 21:17:35.818 2 DEBUG nova.virt.libvirt.migration [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Sep 30 21:17:35 compute-0 nova_compute[192810]: 2025-09-30 21:17:35.818 2 DEBUG nova.virt.libvirt.migration [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Sep 30 21:17:35 compute-0 nova_compute[192810]: 2025-09-30 21:17:35.887 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:35 compute-0 nova_compute[192810]: 2025-09-30 21:17:35.952 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:35 compute-0 nova_compute[192810]: 2025-09-30 21:17:35.953 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:36 compute-0 nova_compute[192810]: 2025-09-30 21:17:36.018 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:36 compute-0 nova_compute[192810]: 2025-09-30 21:17:36.026 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:36 compute-0 nova_compute[192810]: 2025-09-30 21:17:36.091 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:36 compute-0 nova_compute[192810]: 2025-09-30 21:17:36.093 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:36 compute-0 nova_compute[192810]: 2025-09-30 21:17:36.178 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:36 compute-0 nova_compute[192810]: 2025-09-30 21:17:36.321 2 DEBUG nova.virt.libvirt.migration [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Sep 30 21:17:36 compute-0 nova_compute[192810]: 2025-09-30 21:17:36.322 2 DEBUG nova.virt.libvirt.migration [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Sep 30 21:17:36 compute-0 nova_compute[192810]: 2025-09-30 21:17:36.327 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:36 compute-0 nova_compute[192810]: 2025-09-30 21:17:36.422 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:17:36 compute-0 nova_compute[192810]: 2025-09-30 21:17:36.423 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5365MB free_disk=73.40433120727539GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:17:36 compute-0 nova_compute[192810]: 2025-09-30 21:17:36.423 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:36 compute-0 nova_compute[192810]: 2025-09-30 21:17:36.424 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:36 compute-0 nova_compute[192810]: 2025-09-30 21:17:36.468 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267056.4684353, 7f0e9e16-1467-41e5-b5b0-965591aa014c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:17:36 compute-0 nova_compute[192810]: 2025-09-30 21:17:36.469 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] VM Paused (Lifecycle Event)
Sep 30 21:17:36 compute-0 nova_compute[192810]: 2025-09-30 21:17:36.506 2 INFO nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Updating resource usage from migration 0a2a09d7-020a-40a6-b7d6-da99acd63254
Sep 30 21:17:36 compute-0 nova_compute[192810]: 2025-09-30 21:17:36.506 2 INFO nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Updating resource usage from migration 75c88f2a-0d52-4e9d-8432-8454d77751e5
Sep 30 21:17:36 compute-0 nova_compute[192810]: 2025-09-30 21:17:36.511 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:17:36 compute-0 nova_compute[192810]: 2025-09-30 21:17:36.514 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:17:36 compute-0 nova_compute[192810]: 2025-09-30 21:17:36.535 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] During sync_power_state the instance has a pending task (migrating). Skip.
Sep 30 21:17:36 compute-0 nova_compute[192810]: 2025-09-30 21:17:36.545 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Migration 0a2a09d7-020a-40a6-b7d6-da99acd63254 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Sep 30 21:17:36 compute-0 nova_compute[192810]: 2025-09-30 21:17:36.545 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Migration 75c88f2a-0d52-4e9d-8432-8454d77751e5 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Sep 30 21:17:36 compute-0 nova_compute[192810]: 2025-09-30 21:17:36.545 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:17:36 compute-0 nova_compute[192810]: 2025-09-30 21:17:36.545 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:17:36 compute-0 sshd-session[221877]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:17:36 compute-0 kernel: tapbb8ecc8e-9c (unregistering): left promiscuous mode
Sep 30 21:17:36 compute-0 NetworkManager[51733]: <info>  [1759267056.8106] device (tapbb8ecc8e-9c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:17:36 compute-0 nova_compute[192810]: 2025-09-30 21:17:36.822 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:17:36 compute-0 ovn_controller[94912]: 2025-09-30T21:17:36Z|00061|binding|INFO|Releasing lport bb8ecc8e-9cf4-4901-9788-83c49356f983 from this chassis (sb_readonly=0)
Sep 30 21:17:36 compute-0 ovn_controller[94912]: 2025-09-30T21:17:36Z|00062|binding|INFO|Setting lport bb8ecc8e-9cf4-4901-9788-83c49356f983 down in Southbound
Sep 30 21:17:36 compute-0 ovn_controller[94912]: 2025-09-30T21:17:36Z|00063|binding|INFO|Removing iface tapbb8ecc8e-9c ovn-installed in OVS
Sep 30 21:17:36 compute-0 nova_compute[192810]: 2025-09-30 21:17:36.829 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:36 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:36.842 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:9d:62 10.100.0.12'], port_security=['fa:16:3e:45:9d:62 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '78438f8f-1ac2-4393-90b7-0b62e0665947'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-934fff90-5446-41f1-a5ad-d2568cb337b1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'ae5806dc-3fbd-4366-84ab-b061f2375093', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c5644fe7-3662-476d-bcfe-5bc86ceef791, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=bb8ecc8e-9cf4-4901-9788-83c49356f983) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:17:36 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:36.844 103867 INFO neutron.agent.ovn.metadata.agent [-] Port bb8ecc8e-9cf4-4901-9788-83c49356f983 in datapath 934fff90-5446-41f1-a5ad-d2568cb337b1 unbound from our chassis
Sep 30 21:17:36 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:36.846 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 934fff90-5446-41f1-a5ad-d2568cb337b1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:17:36 compute-0 nova_compute[192810]: 2025-09-30 21:17:36.853 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:36 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:36.853 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[026f149c-38ee-4f44-be2b-a0453b39033b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:36 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:36.854 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1 namespace which is not needed anymore
Sep 30 21:17:36 compute-0 nova_compute[192810]: 2025-09-30 21:17:36.858 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:17:36 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Sep 30 21:17:36 compute-0 nova_compute[192810]: 2025-09-30 21:17:36.887 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:17:36 compute-0 nova_compute[192810]: 2025-09-30 21:17:36.888 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.464s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:36 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000b.scope: Consumed 13.159s CPU time.
Sep 30 21:17:36 compute-0 systemd-machined[152794]: Machine qemu-4-instance-0000000b terminated.
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.006 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.014 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:37 compute-0 neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1[221587]: [NOTICE]   (221621) : haproxy version is 2.8.14-c23fe91
Sep 30 21:17:37 compute-0 neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1[221587]: [NOTICE]   (221621) : path to executable is /usr/sbin/haproxy
Sep 30 21:17:37 compute-0 neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1[221587]: [WARNING]  (221621) : Exiting Master process...
Sep 30 21:17:37 compute-0 neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1[221587]: [WARNING]  (221621) : Exiting Master process...
Sep 30 21:17:37 compute-0 neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1[221587]: [ALERT]    (221621) : Current worker (221624) exited with code 143 (Terminated)
Sep 30 21:17:37 compute-0 neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1[221587]: [WARNING]  (221621) : All workers exited. Exiting... (0)
Sep 30 21:17:37 compute-0 systemd[1]: libpod-f3996c7367e9f1965f37877c5373b825bbe3211a703a6a31dd449ddc49a70589.scope: Deactivated successfully.
Sep 30 21:17:37 compute-0 podman[222043]: 2025-09-30 21:17:37.056004191 +0000 UTC m=+0.069875325 container died f3996c7367e9f1965f37877c5373b825bbe3211a703a6a31dd449ddc49a70589 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.062 2 DEBUG nova.virt.libvirt.driver [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.064 2 DEBUG nova.virt.libvirt.driver [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.064 2 DEBUG nova.virt.libvirt.driver [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.064 2 DEBUG nova.virt.libvirt.guest [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.064 2 INFO nova.virt.libvirt.driver [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Migration operation has completed
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.065 2 INFO nova.compute.manager [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] _post_live_migration() is started..
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.076 2 DEBUG nova.network.neutron [req-db11594d-f205-4893-9697-daed16fbbe05 req-ca978f25-1fa9-4021-9e22-b1c637784ca0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Updated VIF entry in instance network info cache for port bb8ecc8e-9cf4-4901-9788-83c49356f983. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.077 2 DEBUG nova.network.neutron [req-db11594d-f205-4893-9697-daed16fbbe05 req-ca978f25-1fa9-4021-9e22-b1c637784ca0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Updating instance_info_cache with network_info: [{"id": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "address": "fa:16:3e:45:9d:62", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb8ecc8e-9c", "ovs_interfaceid": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:17:37 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f3996c7367e9f1965f37877c5373b825bbe3211a703a6a31dd449ddc49a70589-userdata-shm.mount: Deactivated successfully.
Sep 30 21:17:37 compute-0 systemd[1]: var-lib-containers-storage-overlay-d20bdeae83fbdf1adf736b7f2a909b72d952201ea1083e353f805aac1a25c508-merged.mount: Deactivated successfully.
Sep 30 21:17:37 compute-0 podman[222043]: 2025-09-30 21:17:37.101462995 +0000 UTC m=+0.115334129 container cleanup f3996c7367e9f1965f37877c5373b825bbe3211a703a6a31dd449ddc49a70589 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923)
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.111 2 DEBUG oslo_concurrency.lockutils [req-db11594d-f205-4893-9697-daed16fbbe05 req-ca978f25-1fa9-4021-9e22-b1c637784ca0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-7f0e9e16-1467-41e5-b5b0-965591aa014c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:17:37 compute-0 systemd[1]: libpod-conmon-f3996c7367e9f1965f37877c5373b825bbe3211a703a6a31dd449ddc49a70589.scope: Deactivated successfully.
Sep 30 21:17:37 compute-0 podman[222088]: 2025-09-30 21:17:37.192771533 +0000 UTC m=+0.056753721 container remove f3996c7367e9f1965f37877c5373b825bbe3211a703a6a31dd449ddc49a70589 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:17:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:37.202 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[303182e2-6419-4d70-a5ce-1272d934abe9]: (4, ('Tue Sep 30 09:17:36 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1 (f3996c7367e9f1965f37877c5373b825bbe3211a703a6a31dd449ddc49a70589)\nf3996c7367e9f1965f37877c5373b825bbe3211a703a6a31dd449ddc49a70589\nTue Sep 30 09:17:37 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1 (f3996c7367e9f1965f37877c5373b825bbe3211a703a6a31dd449ddc49a70589)\nf3996c7367e9f1965f37877c5373b825bbe3211a703a6a31dd449ddc49a70589\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:37.204 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[36902bc8-f432-4ff4-b7b9-4e4427dd3da4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:37.206 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap934fff90-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:37 compute-0 kernel: tap934fff90-50: left promiscuous mode
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.210 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:37.234 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[58678991-85fa-4db7-b765-ea2adaca0b01]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.270 2 DEBUG nova.compute.manager [req-14ce38e6-8a7d-434e-8f6e-7607ae2ff51c req-40066baa-ba62-48b3-8aff-6e94a2c07a22 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Received event network-vif-unplugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.270 2 DEBUG oslo_concurrency.lockutils [req-14ce38e6-8a7d-434e-8f6e-7607ae2ff51c req-40066baa-ba62-48b3-8aff-6e94a2c07a22 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.270 2 DEBUG oslo_concurrency.lockutils [req-14ce38e6-8a7d-434e-8f6e-7607ae2ff51c req-40066baa-ba62-48b3-8aff-6e94a2c07a22 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.271 2 DEBUG oslo_concurrency.lockutils [req-14ce38e6-8a7d-434e-8f6e-7607ae2ff51c req-40066baa-ba62-48b3-8aff-6e94a2c07a22 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.271 2 DEBUG nova.compute.manager [req-14ce38e6-8a7d-434e-8f6e-7607ae2ff51c req-40066baa-ba62-48b3-8aff-6e94a2c07a22 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] No waiting events found dispatching network-vif-unplugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.271 2 DEBUG nova.compute.manager [req-14ce38e6-8a7d-434e-8f6e-7607ae2ff51c req-40066baa-ba62-48b3-8aff-6e94a2c07a22 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Received event network-vif-unplugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:17:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:37.288 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[167de966-d0c6-4119-ad0d-44b0ac5361cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:37.290 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[cb549e17-7277-48d2-a354-fb9ade10bf5c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:37.311 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[9cc627cb-c33d-4645-b362-cd640f175b8e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 373064, 'reachable_time': 18079, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222108, 'error': None, 'target': 'ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:37.316 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:17:37 compute-0 systemd[1]: run-netns-ovnmeta\x2d934fff90\x2d5446\x2d41f1\x2da5ad\x2dd2568cb337b1.mount: Deactivated successfully.
Sep 30 21:17:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:37.316 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[1d301645-ab90-4358-a7c4-09016c1c1cb3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.686 2 DEBUG nova.compute.manager [req-dcdd702f-296c-4c6e-8d5a-292688297c2f req-380d225b-f13f-4a6e-8411-03bb7b7fc489 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Received event network-vif-unplugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.686 2 DEBUG oslo_concurrency.lockutils [req-dcdd702f-296c-4c6e-8d5a-292688297c2f req-380d225b-f13f-4a6e-8411-03bb7b7fc489 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.687 2 DEBUG oslo_concurrency.lockutils [req-dcdd702f-296c-4c6e-8d5a-292688297c2f req-380d225b-f13f-4a6e-8411-03bb7b7fc489 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.687 2 DEBUG oslo_concurrency.lockutils [req-dcdd702f-296c-4c6e-8d5a-292688297c2f req-380d225b-f13f-4a6e-8411-03bb7b7fc489 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.687 2 DEBUG nova.compute.manager [req-dcdd702f-296c-4c6e-8d5a-292688297c2f req-380d225b-f13f-4a6e-8411-03bb7b7fc489 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] No waiting events found dispatching network-vif-unplugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.687 2 DEBUG nova.compute.manager [req-dcdd702f-296c-4c6e-8d5a-292688297c2f req-380d225b-f13f-4a6e-8411-03bb7b7fc489 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Received event network-vif-unplugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.722 2 DEBUG nova.network.neutron [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Activated binding for port bb8ecc8e-9cf4-4901-9788-83c49356f983 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.723 2 DEBUG nova.compute.manager [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "address": "fa:16:3e:45:9d:62", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb8ecc8e-9c", "ovs_interfaceid": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.723 2 DEBUG nova.virt.libvirt.vif [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:17:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-772926731',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-772926731',id=11,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:17:21Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='544a33c53701466d8bf7e8ed34f38dcb',ramdisk_id='',reservation_id='r-kiix6ynt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-860972404',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-860972404-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:17:26Z,user_data=None,user_id='981e96ea2bc2419d9a1e57d6aed70304',uuid=7f0e9e16-1467-41e5-b5b0-965591aa014c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "address": "fa:16:3e:45:9d:62", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb8ecc8e-9c", "ovs_interfaceid": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.724 2 DEBUG nova.network.os_vif_util [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Converting VIF {"id": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "address": "fa:16:3e:45:9d:62", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb8ecc8e-9c", "ovs_interfaceid": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.724 2 DEBUG nova.network.os_vif_util [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:9d:62,bridge_name='br-int',has_traffic_filtering=True,id=bb8ecc8e-9cf4-4901-9788-83c49356f983,network=Network(934fff90-5446-41f1-a5ad-d2568cb337b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb8ecc8e-9c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.725 2 DEBUG os_vif [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:9d:62,bridge_name='br-int',has_traffic_filtering=True,id=bb8ecc8e-9cf4-4901-9788-83c49356f983,network=Network(934fff90-5446-41f1-a5ad-d2568cb337b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb8ecc8e-9c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.726 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbb8ecc8e-9c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.732 2 INFO os_vif [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:9d:62,bridge_name='br-int',has_traffic_filtering=True,id=bb8ecc8e-9cf4-4901-9788-83c49356f983,network=Network(934fff90-5446-41f1-a5ad-d2568cb337b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb8ecc8e-9c')
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.733 2 DEBUG oslo_concurrency.lockutils [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.733 2 DEBUG oslo_concurrency.lockutils [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.733 2 DEBUG oslo_concurrency.lockutils [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.733 2 DEBUG nova.compute.manager [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.734 2 INFO nova.virt.libvirt.driver [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Deleting instance files /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c_del
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.735 2 INFO nova.virt.libvirt.driver [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Deletion of /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c_del complete
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.888 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.889 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.890 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.930 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "refresh_cache-b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.931 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquired lock "refresh_cache-b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.932 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Sep 30 21:17:37 compute-0 nova_compute[192810]: 2025-09-30 21:17:37.932 2 DEBUG nova.objects.instance [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lazy-loading 'info_cache' on Instance uuid b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:17:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:38.721 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:38.722 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:38.723 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:38 compute-0 sshd-session[221877]: Failed password for invalid user treyon from 80.94.95.112 port 63662 ssh2
Sep 30 21:17:39 compute-0 sshd-session[221877]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:17:39 compute-0 nova_compute[192810]: 2025-09-30 21:17:39.448 2 DEBUG nova.compute.manager [req-a07db742-0292-41b4-b75a-5a70cabdc123 req-c88bba3e-5029-480e-8187-427c4df0da29 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Received event network-vif-plugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:17:39 compute-0 nova_compute[192810]: 2025-09-30 21:17:39.449 2 DEBUG oslo_concurrency.lockutils [req-a07db742-0292-41b4-b75a-5a70cabdc123 req-c88bba3e-5029-480e-8187-427c4df0da29 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:39 compute-0 nova_compute[192810]: 2025-09-30 21:17:39.449 2 DEBUG oslo_concurrency.lockutils [req-a07db742-0292-41b4-b75a-5a70cabdc123 req-c88bba3e-5029-480e-8187-427c4df0da29 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:39 compute-0 nova_compute[192810]: 2025-09-30 21:17:39.450 2 DEBUG oslo_concurrency.lockutils [req-a07db742-0292-41b4-b75a-5a70cabdc123 req-c88bba3e-5029-480e-8187-427c4df0da29 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:39 compute-0 nova_compute[192810]: 2025-09-30 21:17:39.450 2 DEBUG nova.compute.manager [req-a07db742-0292-41b4-b75a-5a70cabdc123 req-c88bba3e-5029-480e-8187-427c4df0da29 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] No waiting events found dispatching network-vif-plugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:17:39 compute-0 nova_compute[192810]: 2025-09-30 21:17:39.450 2 WARNING nova.compute.manager [req-a07db742-0292-41b4-b75a-5a70cabdc123 req-c88bba3e-5029-480e-8187-427c4df0da29 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Received unexpected event network-vif-plugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 for instance with vm_state active and task_state migrating.
Sep 30 21:17:39 compute-0 nova_compute[192810]: 2025-09-30 21:17:39.451 2 DEBUG nova.compute.manager [req-a07db742-0292-41b4-b75a-5a70cabdc123 req-c88bba3e-5029-480e-8187-427c4df0da29 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Received event network-vif-plugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:17:39 compute-0 nova_compute[192810]: 2025-09-30 21:17:39.451 2 DEBUG oslo_concurrency.lockutils [req-a07db742-0292-41b4-b75a-5a70cabdc123 req-c88bba3e-5029-480e-8187-427c4df0da29 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:39 compute-0 nova_compute[192810]: 2025-09-30 21:17:39.451 2 DEBUG oslo_concurrency.lockutils [req-a07db742-0292-41b4-b75a-5a70cabdc123 req-c88bba3e-5029-480e-8187-427c4df0da29 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:39 compute-0 nova_compute[192810]: 2025-09-30 21:17:39.452 2 DEBUG oslo_concurrency.lockutils [req-a07db742-0292-41b4-b75a-5a70cabdc123 req-c88bba3e-5029-480e-8187-427c4df0da29 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:39 compute-0 nova_compute[192810]: 2025-09-30 21:17:39.452 2 DEBUG nova.compute.manager [req-a07db742-0292-41b4-b75a-5a70cabdc123 req-c88bba3e-5029-480e-8187-427c4df0da29 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] No waiting events found dispatching network-vif-plugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:17:39 compute-0 nova_compute[192810]: 2025-09-30 21:17:39.452 2 WARNING nova.compute.manager [req-a07db742-0292-41b4-b75a-5a70cabdc123 req-c88bba3e-5029-480e-8187-427c4df0da29 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Received unexpected event network-vif-plugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 for instance with vm_state active and task_state migrating.
Sep 30 21:17:39 compute-0 nova_compute[192810]: 2025-09-30 21:17:39.453 2 DEBUG nova.compute.manager [req-a07db742-0292-41b4-b75a-5a70cabdc123 req-c88bba3e-5029-480e-8187-427c4df0da29 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Received event network-vif-plugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:17:39 compute-0 nova_compute[192810]: 2025-09-30 21:17:39.453 2 DEBUG oslo_concurrency.lockutils [req-a07db742-0292-41b4-b75a-5a70cabdc123 req-c88bba3e-5029-480e-8187-427c4df0da29 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:39 compute-0 nova_compute[192810]: 2025-09-30 21:17:39.453 2 DEBUG oslo_concurrency.lockutils [req-a07db742-0292-41b4-b75a-5a70cabdc123 req-c88bba3e-5029-480e-8187-427c4df0da29 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:39 compute-0 nova_compute[192810]: 2025-09-30 21:17:39.454 2 DEBUG oslo_concurrency.lockutils [req-a07db742-0292-41b4-b75a-5a70cabdc123 req-c88bba3e-5029-480e-8187-427c4df0da29 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:39 compute-0 nova_compute[192810]: 2025-09-30 21:17:39.454 2 DEBUG nova.compute.manager [req-a07db742-0292-41b4-b75a-5a70cabdc123 req-c88bba3e-5029-480e-8187-427c4df0da29 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] No waiting events found dispatching network-vif-plugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:17:39 compute-0 nova_compute[192810]: 2025-09-30 21:17:39.454 2 WARNING nova.compute.manager [req-a07db742-0292-41b4-b75a-5a70cabdc123 req-c88bba3e-5029-480e-8187-427c4df0da29 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Received unexpected event network-vif-plugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 for instance with vm_state active and task_state migrating.
Sep 30 21:17:39 compute-0 nova_compute[192810]: 2025-09-30 21:17:39.455 2 DEBUG nova.compute.manager [req-a07db742-0292-41b4-b75a-5a70cabdc123 req-c88bba3e-5029-480e-8187-427c4df0da29 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Received event network-vif-plugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:17:39 compute-0 nova_compute[192810]: 2025-09-30 21:17:39.455 2 DEBUG oslo_concurrency.lockutils [req-a07db742-0292-41b4-b75a-5a70cabdc123 req-c88bba3e-5029-480e-8187-427c4df0da29 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:39 compute-0 nova_compute[192810]: 2025-09-30 21:17:39.456 2 DEBUG oslo_concurrency.lockutils [req-a07db742-0292-41b4-b75a-5a70cabdc123 req-c88bba3e-5029-480e-8187-427c4df0da29 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:39 compute-0 nova_compute[192810]: 2025-09-30 21:17:39.456 2 DEBUG oslo_concurrency.lockutils [req-a07db742-0292-41b4-b75a-5a70cabdc123 req-c88bba3e-5029-480e-8187-427c4df0da29 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:39 compute-0 nova_compute[192810]: 2025-09-30 21:17:39.456 2 DEBUG nova.compute.manager [req-a07db742-0292-41b4-b75a-5a70cabdc123 req-c88bba3e-5029-480e-8187-427c4df0da29 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] No waiting events found dispatching network-vif-plugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:17:39 compute-0 nova_compute[192810]: 2025-09-30 21:17:39.457 2 WARNING nova.compute.manager [req-a07db742-0292-41b4-b75a-5a70cabdc123 req-c88bba3e-5029-480e-8187-427c4df0da29 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Received unexpected event network-vif-plugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 for instance with vm_state active and task_state migrating.
Sep 30 21:17:39 compute-0 nova_compute[192810]: 2025-09-30 21:17:39.554 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Updating instance_info_cache with network_info: [{"id": "730a9e74-900e-49b2-a5c3-043d6da1a52b", "address": "fa:16:3e:72:47:71", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap730a9e74-90", "ovs_interfaceid": "730a9e74-900e-49b2-a5c3-043d6da1a52b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:17:39 compute-0 nova_compute[192810]: 2025-09-30 21:17:39.587 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Releasing lock "refresh_cache-b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:17:39 compute-0 nova_compute[192810]: 2025-09-30 21:17:39.588 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Sep 30 21:17:39 compute-0 nova_compute[192810]: 2025-09-30 21:17:39.589 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:17:39 compute-0 nova_compute[192810]: 2025-09-30 21:17:39.589 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:17:40 compute-0 sshd-session[221877]: Failed password for invalid user treyon from 80.94.95.112 port 63662 ssh2
Sep 30 21:17:41 compute-0 nova_compute[192810]: 2025-09-30 21:17:41.330 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:41 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Sep 30 21:17:41 compute-0 systemd[221926]: Activating special unit Exit the Session...
Sep 30 21:17:41 compute-0 systemd[221926]: Stopped target Main User Target.
Sep 30 21:17:41 compute-0 systemd[221926]: Stopped target Basic System.
Sep 30 21:17:41 compute-0 systemd[221926]: Stopped target Paths.
Sep 30 21:17:41 compute-0 systemd[221926]: Stopped target Sockets.
Sep 30 21:17:41 compute-0 systemd[221926]: Stopped target Timers.
Sep 30 21:17:41 compute-0 systemd[221926]: Stopped Mark boot as successful after the user session has run 2 minutes.
Sep 30 21:17:41 compute-0 systemd[221926]: Stopped Daily Cleanup of User's Temporary Directories.
Sep 30 21:17:41 compute-0 systemd[221926]: Closed D-Bus User Message Bus Socket.
Sep 30 21:17:41 compute-0 systemd[221926]: Stopped Create User's Volatile Files and Directories.
Sep 30 21:17:41 compute-0 systemd[221926]: Removed slice User Application Slice.
Sep 30 21:17:41 compute-0 systemd[221926]: Reached target Shutdown.
Sep 30 21:17:41 compute-0 systemd[221926]: Finished Exit the Session.
Sep 30 21:17:41 compute-0 systemd[221926]: Reached target Exit the Session.
Sep 30 21:17:41 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Sep 30 21:17:41 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Sep 30 21:17:41 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Sep 30 21:17:41 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Sep 30 21:17:41 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Sep 30 21:17:41 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Sep 30 21:17:41 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Sep 30 21:17:41 compute-0 podman[222109]: 2025-09-30 21:17:41.891689191 +0000 UTC m=+0.105363115 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Sep 30 21:17:42 compute-0 sshd-session[221877]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:17:42 compute-0 nova_compute[192810]: 2025-09-30 21:17:42.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:43 compute-0 nova_compute[192810]: 2025-09-30 21:17:43.346 2 DEBUG oslo_concurrency.lockutils [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Acquiring lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:43 compute-0 nova_compute[192810]: 2025-09-30 21:17:43.346 2 DEBUG oslo_concurrency.lockutils [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:43 compute-0 nova_compute[192810]: 2025-09-30 21:17:43.347 2 DEBUG oslo_concurrency.lockutils [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:43 compute-0 nova_compute[192810]: 2025-09-30 21:17:43.372 2 DEBUG oslo_concurrency.lockutils [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:43 compute-0 nova_compute[192810]: 2025-09-30 21:17:43.373 2 DEBUG oslo_concurrency.lockutils [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:43 compute-0 nova_compute[192810]: 2025-09-30 21:17:43.373 2 DEBUG oslo_concurrency.lockutils [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:43 compute-0 nova_compute[192810]: 2025-09-30 21:17:43.373 2 DEBUG nova.compute.resource_tracker [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:17:43 compute-0 nova_compute[192810]: 2025-09-30 21:17:43.465 2 DEBUG oslo_concurrency.processutils [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:43 compute-0 nova_compute[192810]: 2025-09-30 21:17:43.565 2 DEBUG oslo_concurrency.processutils [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65/disk --force-share --output=json" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:43 compute-0 nova_compute[192810]: 2025-09-30 21:17:43.567 2 DEBUG oslo_concurrency.processutils [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:43 compute-0 nova_compute[192810]: 2025-09-30 21:17:43.648 2 DEBUG oslo_concurrency.processutils [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:43 compute-0 nova_compute[192810]: 2025-09-30 21:17:43.875 2 WARNING nova.virt.libvirt.driver [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:17:43 compute-0 nova_compute[192810]: 2025-09-30 21:17:43.876 2 DEBUG nova.compute.resource_tracker [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5574MB free_disk=73.4336166381836GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:17:43 compute-0 nova_compute[192810]: 2025-09-30 21:17:43.877 2 DEBUG oslo_concurrency.lockutils [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:43 compute-0 nova_compute[192810]: 2025-09-30 21:17:43.877 2 DEBUG oslo_concurrency.lockutils [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:43.928 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:17:43 compute-0 nova_compute[192810]: 2025-09-30 21:17:43.929 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:43.930 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:17:43 compute-0 nova_compute[192810]: 2025-09-30 21:17:43.965 2 DEBUG nova.compute.resource_tracker [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Migration for instance 7f0e9e16-1467-41e5-b5b0-965591aa014c refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Sep 30 21:17:44 compute-0 nova_compute[192810]: 2025-09-30 21:17:44.011 2 DEBUG nova.compute.resource_tracker [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Sep 30 21:17:44 compute-0 nova_compute[192810]: 2025-09-30 21:17:44.012 2 INFO nova.compute.resource_tracker [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Updating resource usage from migration 75c88f2a-0d52-4e9d-8432-8454d77751e5
Sep 30 21:17:44 compute-0 nova_compute[192810]: 2025-09-30 21:17:44.048 2 DEBUG nova.compute.resource_tracker [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Migration 0a2a09d7-020a-40a6-b7d6-da99acd63254 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Sep 30 21:17:44 compute-0 nova_compute[192810]: 2025-09-30 21:17:44.049 2 DEBUG nova.compute.resource_tracker [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Migration 75c88f2a-0d52-4e9d-8432-8454d77751e5 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Sep 30 21:17:44 compute-0 nova_compute[192810]: 2025-09-30 21:17:44.050 2 DEBUG nova.compute.resource_tracker [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:17:44 compute-0 nova_compute[192810]: 2025-09-30 21:17:44.050 2 DEBUG nova.compute.resource_tracker [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:17:44 compute-0 nova_compute[192810]: 2025-09-30 21:17:44.112 2 DEBUG nova.compute.provider_tree [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:17:44 compute-0 nova_compute[192810]: 2025-09-30 21:17:44.135 2 DEBUG nova.scheduler.client.report [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:17:44 compute-0 nova_compute[192810]: 2025-09-30 21:17:44.186 2 DEBUG nova.compute.resource_tracker [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:17:44 compute-0 nova_compute[192810]: 2025-09-30 21:17:44.186 2 DEBUG oslo_concurrency.lockutils [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.309s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:44 compute-0 nova_compute[192810]: 2025-09-30 21:17:44.203 2 INFO nova.compute.manager [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Sep 30 21:17:44 compute-0 nova_compute[192810]: 2025-09-30 21:17:44.319 2 INFO nova.scheduler.client.report [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Deleted allocation for migration 0a2a09d7-020a-40a6-b7d6-da99acd63254
Sep 30 21:17:44 compute-0 nova_compute[192810]: 2025-09-30 21:17:44.320 2 DEBUG nova.virt.libvirt.driver [None req-247c8cb5-ee8d-4bdc-8950-68e1cdec017f 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Sep 30 21:17:44 compute-0 nova_compute[192810]: 2025-09-30 21:17:44.375 2 DEBUG oslo_concurrency.processutils [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:44 compute-0 nova_compute[192810]: 2025-09-30 21:17:44.478 2 DEBUG oslo_concurrency.processutils [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65/disk --force-share --output=json" returned: 0 in 0.103s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:44 compute-0 nova_compute[192810]: 2025-09-30 21:17:44.480 2 DEBUG oslo_concurrency.processutils [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:44 compute-0 nova_compute[192810]: 2025-09-30 21:17:44.580 2 DEBUG oslo_concurrency.processutils [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65/disk --force-share --output=json" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:44 compute-0 sshd-session[221877]: Failed password for invalid user treyon from 80.94.95.112 port 63662 ssh2
Sep 30 21:17:46 compute-0 nova_compute[192810]: 2025-09-30 21:17:46.334 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:46 compute-0 podman[222144]: 2025-09-30 21:17:46.350675309 +0000 UTC m=+0.080371591 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:17:46 compute-0 podman[222143]: 2025-09-30 21:17:46.373699144 +0000 UTC m=+0.094978542 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 21:17:47 compute-0 nova_compute[192810]: 2025-09-30 21:17:47.146 2 DEBUG nova.virt.libvirt.driver [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Creating tmpfile /var/lib/nova/instances/tmpxbhoey6v to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Sep 30 21:17:47 compute-0 nova_compute[192810]: 2025-09-30 21:17:47.148 2 DEBUG nova.compute.manager [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpxbhoey6v',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Sep 30 21:17:47 compute-0 sshd-session[222186]: Accepted publickey for nova from 192.168.122.101 port 50744 ssh2: ECDSA SHA256:MZb8WjUIxCo1ZPhM/oSWWpmJKsqmELiNET2dwGEt9P4
Sep 30 21:17:47 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Sep 30 21:17:47 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Sep 30 21:17:47 compute-0 systemd-logind[792]: New session 32 of user nova.
Sep 30 21:17:47 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Sep 30 21:17:47 compute-0 systemd[1]: Starting User Manager for UID 42436...
Sep 30 21:17:47 compute-0 sshd-session[221877]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:17:47 compute-0 systemd[222190]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:17:47 compute-0 systemd[222190]: Queued start job for default target Main User Target.
Sep 30 21:17:47 compute-0 systemd[222190]: Created slice User Application Slice.
Sep 30 21:17:47 compute-0 systemd[222190]: Started Mark boot as successful after the user session has run 2 minutes.
Sep 30 21:17:47 compute-0 systemd[222190]: Started Daily Cleanup of User's Temporary Directories.
Sep 30 21:17:47 compute-0 systemd[222190]: Reached target Paths.
Sep 30 21:17:47 compute-0 systemd[222190]: Reached target Timers.
Sep 30 21:17:47 compute-0 systemd[222190]: Starting D-Bus User Message Bus Socket...
Sep 30 21:17:47 compute-0 systemd[222190]: Starting Create User's Volatile Files and Directories...
Sep 30 21:17:47 compute-0 systemd[222190]: Finished Create User's Volatile Files and Directories.
Sep 30 21:17:47 compute-0 systemd[222190]: Listening on D-Bus User Message Bus Socket.
Sep 30 21:17:47 compute-0 systemd[222190]: Reached target Sockets.
Sep 30 21:17:47 compute-0 systemd[222190]: Reached target Basic System.
Sep 30 21:17:47 compute-0 systemd[222190]: Reached target Main User Target.
Sep 30 21:17:47 compute-0 systemd[222190]: Startup finished in 190ms.
Sep 30 21:17:47 compute-0 systemd[1]: Started User Manager for UID 42436.
Sep 30 21:17:47 compute-0 systemd[1]: Started Session 32 of User nova.
Sep 30 21:17:47 compute-0 sshd-session[222186]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:17:47 compute-0 sshd-session[222205]: Received disconnect from 192.168.122.101 port 50744:11: disconnected by user
Sep 30 21:17:47 compute-0 sshd-session[222205]: Disconnected from user nova 192.168.122.101 port 50744
Sep 30 21:17:47 compute-0 sshd-session[222186]: pam_unix(sshd:session): session closed for user nova
Sep 30 21:17:47 compute-0 systemd[1]: session-32.scope: Deactivated successfully.
Sep 30 21:17:47 compute-0 systemd-logind[792]: Session 32 logged out. Waiting for processes to exit.
Sep 30 21:17:47 compute-0 systemd-logind[792]: Removed session 32.
Sep 30 21:17:47 compute-0 nova_compute[192810]: 2025-09-30 21:17:47.768 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:48 compute-0 nova_compute[192810]: 2025-09-30 21:17:48.326 2 DEBUG nova.compute.manager [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpxbhoey6v',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7f0e9e16-1467-41e5-b5b0-965591aa014c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Sep 30 21:17:48 compute-0 nova_compute[192810]: 2025-09-30 21:17:48.390 2 DEBUG oslo_concurrency.lockutils [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Acquiring lock "refresh_cache-7f0e9e16-1467-41e5-b5b0-965591aa014c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:17:48 compute-0 nova_compute[192810]: 2025-09-30 21:17:48.391 2 DEBUG oslo_concurrency.lockutils [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Acquired lock "refresh_cache-7f0e9e16-1467-41e5-b5b0-965591aa014c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:17:48 compute-0 nova_compute[192810]: 2025-09-30 21:17:48.392 2 DEBUG nova.network.neutron [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:17:49 compute-0 sshd-session[221877]: Failed password for invalid user treyon from 80.94.95.112 port 63662 ssh2
Sep 30 21:17:50 compute-0 nova_compute[192810]: 2025-09-30 21:17:50.745 2 DEBUG nova.compute.manager [req-f9a43b7b-c53a-498e-b0a4-0f3580a30be7 req-8bfa217c-323f-4d72-91bc-cb88934af980 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Received event network-vif-unplugged-730a9e74-900e-49b2-a5c3-043d6da1a52b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:17:50 compute-0 nova_compute[192810]: 2025-09-30 21:17:50.745 2 DEBUG oslo_concurrency.lockutils [req-f9a43b7b-c53a-498e-b0a4-0f3580a30be7 req-8bfa217c-323f-4d72-91bc-cb88934af980 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:50 compute-0 nova_compute[192810]: 2025-09-30 21:17:50.746 2 DEBUG oslo_concurrency.lockutils [req-f9a43b7b-c53a-498e-b0a4-0f3580a30be7 req-8bfa217c-323f-4d72-91bc-cb88934af980 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:50 compute-0 nova_compute[192810]: 2025-09-30 21:17:50.746 2 DEBUG oslo_concurrency.lockutils [req-f9a43b7b-c53a-498e-b0a4-0f3580a30be7 req-8bfa217c-323f-4d72-91bc-cb88934af980 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:50 compute-0 nova_compute[192810]: 2025-09-30 21:17:50.747 2 DEBUG nova.compute.manager [req-f9a43b7b-c53a-498e-b0a4-0f3580a30be7 req-8bfa217c-323f-4d72-91bc-cb88934af980 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] No waiting events found dispatching network-vif-unplugged-730a9e74-900e-49b2-a5c3-043d6da1a52b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:17:50 compute-0 nova_compute[192810]: 2025-09-30 21:17:50.747 2 DEBUG nova.compute.manager [req-f9a43b7b-c53a-498e-b0a4-0f3580a30be7 req-8bfa217c-323f-4d72-91bc-cb88934af980 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Received event network-vif-unplugged-730a9e74-900e-49b2-a5c3-043d6da1a52b for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:17:51 compute-0 nova_compute[192810]: 2025-09-30 21:17:51.320 2 DEBUG nova.network.neutron [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Updating instance_info_cache with network_info: [{"id": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "address": "fa:16:3e:45:9d:62", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb8ecc8e-9c", "ovs_interfaceid": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:17:51 compute-0 nova_compute[192810]: 2025-09-30 21:17:51.335 2 DEBUG oslo_concurrency.lockutils [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Releasing lock "refresh_cache-7f0e9e16-1467-41e5-b5b0-965591aa014c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:17:51 compute-0 nova_compute[192810]: 2025-09-30 21:17:51.338 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:51 compute-0 nova_compute[192810]: 2025-09-30 21:17:51.354 2 DEBUG nova.virt.libvirt.driver [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpxbhoey6v',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7f0e9e16-1467-41e5-b5b0-965591aa014c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Sep 30 21:17:51 compute-0 nova_compute[192810]: 2025-09-30 21:17:51.355 2 DEBUG nova.virt.libvirt.driver [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Creating instance directory: /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Sep 30 21:17:51 compute-0 nova_compute[192810]: 2025-09-30 21:17:51.355 2 DEBUG nova.virt.libvirt.driver [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Creating disk.info with the contents: {'/var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk': 'qcow2', '/var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Sep 30 21:17:51 compute-0 nova_compute[192810]: 2025-09-30 21:17:51.356 2 DEBUG nova.virt.libvirt.driver [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Sep 30 21:17:51 compute-0 nova_compute[192810]: 2025-09-30 21:17:51.356 2 DEBUG nova.objects.instance [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 7f0e9e16-1467-41e5-b5b0-965591aa014c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:17:51 compute-0 nova_compute[192810]: 2025-09-30 21:17:51.398 2 DEBUG oslo_concurrency.processutils [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:51 compute-0 podman[222207]: 2025-09-30 21:17:51.417887779 +0000 UTC m=+0.133456189 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:17:51 compute-0 nova_compute[192810]: 2025-09-30 21:17:51.473 2 DEBUG oslo_concurrency.processutils [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:51 compute-0 nova_compute[192810]: 2025-09-30 21:17:51.475 2 DEBUG oslo_concurrency.lockutils [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:51 compute-0 nova_compute[192810]: 2025-09-30 21:17:51.475 2 DEBUG oslo_concurrency.lockutils [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:51 compute-0 nova_compute[192810]: 2025-09-30 21:17:51.490 2 DEBUG oslo_concurrency.processutils [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:51 compute-0 nova_compute[192810]: 2025-09-30 21:17:51.566 2 DEBUG oslo_concurrency.processutils [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:51 compute-0 nova_compute[192810]: 2025-09-30 21:17:51.567 2 DEBUG oslo_concurrency.processutils [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:51 compute-0 nova_compute[192810]: 2025-09-30 21:17:51.603 2 DEBUG oslo_concurrency.processutils [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:51 compute-0 nova_compute[192810]: 2025-09-30 21:17:51.605 2 DEBUG oslo_concurrency.lockutils [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:51 compute-0 nova_compute[192810]: 2025-09-30 21:17:51.606 2 DEBUG oslo_concurrency.processutils [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:51 compute-0 nova_compute[192810]: 2025-09-30 21:17:51.663 2 DEBUG oslo_concurrency.processutils [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:51 compute-0 nova_compute[192810]: 2025-09-30 21:17:51.665 2 DEBUG nova.virt.disk.api [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Checking if we can resize image /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:17:51 compute-0 nova_compute[192810]: 2025-09-30 21:17:51.666 2 DEBUG oslo_concurrency.processutils [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:51 compute-0 nova_compute[192810]: 2025-09-30 21:17:51.723 2 DEBUG oslo_concurrency.processutils [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:51 compute-0 nova_compute[192810]: 2025-09-30 21:17:51.724 2 DEBUG nova.virt.disk.api [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Cannot resize image /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:17:51 compute-0 nova_compute[192810]: 2025-09-30 21:17:51.725 2 DEBUG nova.objects.instance [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Lazy-loading 'migration_context' on Instance uuid 7f0e9e16-1467-41e5-b5b0-965591aa014c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:17:51 compute-0 nova_compute[192810]: 2025-09-30 21:17:51.747 2 DEBUG oslo_concurrency.processutils [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:51 compute-0 nova_compute[192810]: 2025-09-30 21:17:51.775 2 DEBUG oslo_concurrency.processutils [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk.config 485376" returned: 0 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:51 compute-0 nova_compute[192810]: 2025-09-30 21:17:51.778 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk.config to /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Sep 30 21:17:51 compute-0 nova_compute[192810]: 2025-09-30 21:17:51.778 2 DEBUG oslo_concurrency.processutils [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk.config /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:17:51 compute-0 sshd-session[221877]: Received disconnect from 80.94.95.112 port 63662:11: Bye [preauth]
Sep 30 21:17:51 compute-0 sshd-session[221877]: Disconnected from invalid user treyon 80.94.95.112 port 63662 [preauth]
Sep 30 21:17:51 compute-0 sshd-session[221877]: PAM 4 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.95.112
Sep 30 21:17:51 compute-0 sshd-session[221877]: PAM service(sshd) ignoring max retries; 5 > 3
Sep 30 21:17:52 compute-0 nova_compute[192810]: 2025-09-30 21:17:52.068 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267057.0622323, 7f0e9e16-1467-41e5-b5b0-965591aa014c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:17:52 compute-0 nova_compute[192810]: 2025-09-30 21:17:52.070 2 INFO nova.compute.manager [-] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] VM Stopped (Lifecycle Event)
Sep 30 21:17:52 compute-0 nova_compute[192810]: 2025-09-30 21:17:52.092 2 DEBUG nova.compute.manager [None req-cfbb7561-df42-425a-bd53-e102cc63bb9f - - - - - -] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:17:52 compute-0 nova_compute[192810]: 2025-09-30 21:17:52.249 2 DEBUG oslo_concurrency.processutils [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk.config /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:17:52 compute-0 nova_compute[192810]: 2025-09-30 21:17:52.250 2 DEBUG nova.virt.libvirt.driver [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Sep 30 21:17:52 compute-0 nova_compute[192810]: 2025-09-30 21:17:52.252 2 DEBUG nova.virt.libvirt.vif [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-09-30T21:17:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-772926731',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-772926731',id=11,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:17:21Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='544a33c53701466d8bf7e8ed34f38dcb',ramdisk_id='',reservation_id='r-kiix6ynt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-860972404',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-860972404-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:17:42Z,user_data=None,user_id='981e96ea2bc2419d9a1e57d6aed70304',uuid=7f0e9e16-1467-41e5-b5b0-965591aa014c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "address": "fa:16:3e:45:9d:62", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapbb8ecc8e-9c", "ovs_interfaceid": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:17:52 compute-0 nova_compute[192810]: 2025-09-30 21:17:52.253 2 DEBUG nova.network.os_vif_util [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Converting VIF {"id": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "address": "fa:16:3e:45:9d:62", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapbb8ecc8e-9c", "ovs_interfaceid": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:17:52 compute-0 nova_compute[192810]: 2025-09-30 21:17:52.254 2 DEBUG nova.network.os_vif_util [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:45:9d:62,bridge_name='br-int',has_traffic_filtering=True,id=bb8ecc8e-9cf4-4901-9788-83c49356f983,network=Network(934fff90-5446-41f1-a5ad-d2568cb337b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb8ecc8e-9c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:17:52 compute-0 nova_compute[192810]: 2025-09-30 21:17:52.255 2 DEBUG os_vif [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:45:9d:62,bridge_name='br-int',has_traffic_filtering=True,id=bb8ecc8e-9cf4-4901-9788-83c49356f983,network=Network(934fff90-5446-41f1-a5ad-d2568cb337b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb8ecc8e-9c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:17:52 compute-0 nova_compute[192810]: 2025-09-30 21:17:52.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:52 compute-0 nova_compute[192810]: 2025-09-30 21:17:52.257 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:52 compute-0 nova_compute[192810]: 2025-09-30 21:17:52.258 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:17:52 compute-0 nova_compute[192810]: 2025-09-30 21:17:52.261 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:52 compute-0 nova_compute[192810]: 2025-09-30 21:17:52.262 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbb8ecc8e-9c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:52 compute-0 nova_compute[192810]: 2025-09-30 21:17:52.262 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbb8ecc8e-9c, col_values=(('external_ids', {'iface-id': 'bb8ecc8e-9cf4-4901-9788-83c49356f983', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:45:9d:62', 'vm-uuid': '7f0e9e16-1467-41e5-b5b0-965591aa014c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:52 compute-0 NetworkManager[51733]: <info>  [1759267072.2662] manager: (tapbb8ecc8e-9c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/42)
Sep 30 21:17:52 compute-0 nova_compute[192810]: 2025-09-30 21:17:52.265 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:52 compute-0 nova_compute[192810]: 2025-09-30 21:17:52.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:17:52 compute-0 nova_compute[192810]: 2025-09-30 21:17:52.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:52 compute-0 nova_compute[192810]: 2025-09-30 21:17:52.273 2 INFO os_vif [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:45:9d:62,bridge_name='br-int',has_traffic_filtering=True,id=bb8ecc8e-9cf4-4901-9788-83c49356f983,network=Network(934fff90-5446-41f1-a5ad-d2568cb337b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb8ecc8e-9c')
Sep 30 21:17:52 compute-0 nova_compute[192810]: 2025-09-30 21:17:52.273 2 DEBUG nova.virt.libvirt.driver [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Sep 30 21:17:52 compute-0 nova_compute[192810]: 2025-09-30 21:17:52.274 2 DEBUG nova.compute.manager [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpxbhoey6v',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7f0e9e16-1467-41e5-b5b0-965591aa014c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Sep 30 21:17:52 compute-0 nova_compute[192810]: 2025-09-30 21:17:52.487 2 INFO nova.compute.manager [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Took 7.91 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Sep 30 21:17:52 compute-0 nova_compute[192810]: 2025-09-30 21:17:52.488 2 DEBUG nova.compute.manager [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:17:52 compute-0 nova_compute[192810]: 2025-09-30 21:17:52.521 2 DEBUG nova.compute.manager [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmps3z4e6mm',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(75c88f2a-0d52-4e9d-8432-8454d77751e5),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Sep 30 21:17:52 compute-0 nova_compute[192810]: 2025-09-30 21:17:52.547 2 DEBUG nova.objects.instance [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Lazy-loading 'migration_context' on Instance uuid b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:17:52 compute-0 nova_compute[192810]: 2025-09-30 21:17:52.548 2 DEBUG nova.virt.libvirt.driver [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Sep 30 21:17:52 compute-0 nova_compute[192810]: 2025-09-30 21:17:52.550 2 DEBUG nova.virt.libvirt.driver [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Sep 30 21:17:52 compute-0 nova_compute[192810]: 2025-09-30 21:17:52.551 2 DEBUG nova.virt.libvirt.driver [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Sep 30 21:17:52 compute-0 nova_compute[192810]: 2025-09-30 21:17:52.570 2 DEBUG nova.virt.libvirt.vif [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:17:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-2078302607',display_name='tempest-LiveMigrationTest-server-2078302607',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-livemigrationtest-server-2078302607',id=12,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:17:20Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='96460712956e4f038121397afa979163',ramdisk_id='',reservation_id='r-1871um1z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-2029274765',owner_user_name='tempest-LiveMigrationTest-2029274765-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:17:20Z,user_data=None,user_id='4b263d7c3e3141f999e8eabf49e8190c',uuid=b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "730a9e74-900e-49b2-a5c3-043d6da1a52b", "address": "fa:16:3e:72:47:71", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap730a9e74-90", "ovs_interfaceid": "730a9e74-900e-49b2-a5c3-043d6da1a52b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:17:52 compute-0 nova_compute[192810]: 2025-09-30 21:17:52.570 2 DEBUG nova.network.os_vif_util [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Converting VIF {"id": "730a9e74-900e-49b2-a5c3-043d6da1a52b", "address": "fa:16:3e:72:47:71", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap730a9e74-90", "ovs_interfaceid": "730a9e74-900e-49b2-a5c3-043d6da1a52b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:17:52 compute-0 nova_compute[192810]: 2025-09-30 21:17:52.571 2 DEBUG nova.network.os_vif_util [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:72:47:71,bridge_name='br-int',has_traffic_filtering=True,id=730a9e74-900e-49b2-a5c3-043d6da1a52b,network=Network(16d40025-1087-460f-a42f-c007f6eff406),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap730a9e74-90') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:17:52 compute-0 nova_compute[192810]: 2025-09-30 21:17:52.572 2 DEBUG nova.virt.libvirt.migration [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Updating guest XML with vif config: <interface type="ethernet">
Sep 30 21:17:52 compute-0 nova_compute[192810]:   <mac address="fa:16:3e:72:47:71"/>
Sep 30 21:17:52 compute-0 nova_compute[192810]:   <model type="virtio"/>
Sep 30 21:17:52 compute-0 nova_compute[192810]:   <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:17:52 compute-0 nova_compute[192810]:   <mtu size="1442"/>
Sep 30 21:17:52 compute-0 nova_compute[192810]:   <target dev="tap730a9e74-90"/>
Sep 30 21:17:52 compute-0 nova_compute[192810]: </interface>
Sep 30 21:17:52 compute-0 nova_compute[192810]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Sep 30 21:17:52 compute-0 nova_compute[192810]: 2025-09-30 21:17:52.573 2 DEBUG nova.virt.libvirt.driver [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Sep 30 21:17:53 compute-0 nova_compute[192810]: 2025-09-30 21:17:53.053 2 DEBUG nova.virt.libvirt.migration [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Sep 30 21:17:53 compute-0 nova_compute[192810]: 2025-09-30 21:17:53.055 2 INFO nova.virt.libvirt.migration [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Increasing downtime to 50 ms after 0 sec elapsed time
Sep 30 21:17:53 compute-0 nova_compute[192810]: 2025-09-30 21:17:53.184 2 INFO nova.virt.libvirt.driver [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Sep 30 21:17:53 compute-0 nova_compute[192810]: 2025-09-30 21:17:53.269 2 DEBUG nova.compute.manager [req-6addfe0d-3549-4177-a387-c2d4b541535b req-7890015d-2784-4276-9bb4-2934937ade8a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Received event network-vif-plugged-730a9e74-900e-49b2-a5c3-043d6da1a52b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:17:53 compute-0 nova_compute[192810]: 2025-09-30 21:17:53.270 2 DEBUG oslo_concurrency.lockutils [req-6addfe0d-3549-4177-a387-c2d4b541535b req-7890015d-2784-4276-9bb4-2934937ade8a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:53 compute-0 nova_compute[192810]: 2025-09-30 21:17:53.271 2 DEBUG oslo_concurrency.lockutils [req-6addfe0d-3549-4177-a387-c2d4b541535b req-7890015d-2784-4276-9bb4-2934937ade8a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:53 compute-0 nova_compute[192810]: 2025-09-30 21:17:53.271 2 DEBUG oslo_concurrency.lockutils [req-6addfe0d-3549-4177-a387-c2d4b541535b req-7890015d-2784-4276-9bb4-2934937ade8a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:53 compute-0 nova_compute[192810]: 2025-09-30 21:17:53.271 2 DEBUG nova.compute.manager [req-6addfe0d-3549-4177-a387-c2d4b541535b req-7890015d-2784-4276-9bb4-2934937ade8a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] No waiting events found dispatching network-vif-plugged-730a9e74-900e-49b2-a5c3-043d6da1a52b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:17:53 compute-0 nova_compute[192810]: 2025-09-30 21:17:53.271 2 WARNING nova.compute.manager [req-6addfe0d-3549-4177-a387-c2d4b541535b req-7890015d-2784-4276-9bb4-2934937ade8a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Received unexpected event network-vif-plugged-730a9e74-900e-49b2-a5c3-043d6da1a52b for instance with vm_state active and task_state migrating.
Sep 30 21:17:53 compute-0 nova_compute[192810]: 2025-09-30 21:17:53.271 2 DEBUG nova.compute.manager [req-6addfe0d-3549-4177-a387-c2d4b541535b req-7890015d-2784-4276-9bb4-2934937ade8a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Received event network-changed-730a9e74-900e-49b2-a5c3-043d6da1a52b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:17:53 compute-0 nova_compute[192810]: 2025-09-30 21:17:53.272 2 DEBUG nova.compute.manager [req-6addfe0d-3549-4177-a387-c2d4b541535b req-7890015d-2784-4276-9bb4-2934937ade8a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Refreshing instance network info cache due to event network-changed-730a9e74-900e-49b2-a5c3-043d6da1a52b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:17:53 compute-0 nova_compute[192810]: 2025-09-30 21:17:53.272 2 DEBUG oslo_concurrency.lockutils [req-6addfe0d-3549-4177-a387-c2d4b541535b req-7890015d-2784-4276-9bb4-2934937ade8a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:17:53 compute-0 nova_compute[192810]: 2025-09-30 21:17:53.272 2 DEBUG oslo_concurrency.lockutils [req-6addfe0d-3549-4177-a387-c2d4b541535b req-7890015d-2784-4276-9bb4-2934937ade8a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:17:53 compute-0 nova_compute[192810]: 2025-09-30 21:17:53.272 2 DEBUG nova.network.neutron [req-6addfe0d-3549-4177-a387-c2d4b541535b req-7890015d-2784-4276-9bb4-2934937ade8a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Refreshing network info cache for port 730a9e74-900e-49b2-a5c3-043d6da1a52b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:17:53 compute-0 podman[222256]: 2025-09-30 21:17:53.371085033 +0000 UTC m=+0.098895512 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Sep 30 21:17:53 compute-0 nova_compute[192810]: 2025-09-30 21:17:53.687 2 DEBUG nova.virt.libvirt.migration [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Sep 30 21:17:53 compute-0 nova_compute[192810]: 2025-09-30 21:17:53.688 2 DEBUG nova.virt.libvirt.migration [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Sep 30 21:17:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:53.932 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:53 compute-0 nova_compute[192810]: 2025-09-30 21:17:53.978 2 DEBUG nova.network.neutron [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Port bb8ecc8e-9cf4-4901-9788-83c49356f983 updated with migration profile {'os_vif_delegation': True, 'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Sep 30 21:17:53 compute-0 nova_compute[192810]: 2025-09-30 21:17:53.998 2 DEBUG nova.compute.manager [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpxbhoey6v',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7f0e9e16-1467-41e5-b5b0-965591aa014c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Sep 30 21:17:54 compute-0 nova_compute[192810]: 2025-09-30 21:17:54.190 2 DEBUG nova.virt.libvirt.migration [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Sep 30 21:17:54 compute-0 nova_compute[192810]: 2025-09-30 21:17:54.191 2 DEBUG nova.virt.libvirt.migration [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Sep 30 21:17:54 compute-0 kernel: tapbb8ecc8e-9c: entered promiscuous mode
Sep 30 21:17:54 compute-0 NetworkManager[51733]: <info>  [1759267074.2850] manager: (tapbb8ecc8e-9c): new Tun device (/org/freedesktop/NetworkManager/Devices/43)
Sep 30 21:17:54 compute-0 ovn_controller[94912]: 2025-09-30T21:17:54Z|00064|binding|INFO|Claiming lport bb8ecc8e-9cf4-4901-9788-83c49356f983 for this additional chassis.
Sep 30 21:17:54 compute-0 ovn_controller[94912]: 2025-09-30T21:17:54Z|00065|binding|INFO|bb8ecc8e-9cf4-4901-9788-83c49356f983: Claiming fa:16:3e:45:9d:62 10.100.0.12
Sep 30 21:17:54 compute-0 nova_compute[192810]: 2025-09-30 21:17:54.288 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:54 compute-0 ovn_controller[94912]: 2025-09-30T21:17:54Z|00066|binding|INFO|Setting lport bb8ecc8e-9cf4-4901-9788-83c49356f983 ovn-installed in OVS
Sep 30 21:17:54 compute-0 nova_compute[192810]: 2025-09-30 21:17:54.304 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:54 compute-0 systemd-machined[152794]: New machine qemu-6-instance-0000000b.
Sep 30 21:17:54 compute-0 systemd-udevd[222290]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:17:54 compute-0 systemd[1]: Started Virtual Machine qemu-6-instance-0000000b.
Sep 30 21:17:54 compute-0 NetworkManager[51733]: <info>  [1759267074.3525] device (tapbb8ecc8e-9c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:17:54 compute-0 NetworkManager[51733]: <info>  [1759267074.3535] device (tapbb8ecc8e-9c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:17:54 compute-0 nova_compute[192810]: 2025-09-30 21:17:54.695 2 DEBUG nova.virt.libvirt.migration [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Sep 30 21:17:54 compute-0 nova_compute[192810]: 2025-09-30 21:17:54.697 2 DEBUG nova.virt.libvirt.migration [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Sep 30 21:17:55 compute-0 nova_compute[192810]: 2025-09-30 21:17:55.201 2 DEBUG nova.virt.libvirt.migration [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Sep 30 21:17:55 compute-0 nova_compute[192810]: 2025-09-30 21:17:55.203 2 DEBUG nova.virt.libvirt.migration [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Sep 30 21:17:55 compute-0 nova_compute[192810]: 2025-09-30 21:17:55.629 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267075.629324, b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:17:55 compute-0 nova_compute[192810]: 2025-09-30 21:17:55.631 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] VM Paused (Lifecycle Event)
Sep 30 21:17:55 compute-0 nova_compute[192810]: 2025-09-30 21:17:55.656 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:17:55 compute-0 nova_compute[192810]: 2025-09-30 21:17:55.661 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:17:55 compute-0 nova_compute[192810]: 2025-09-30 21:17:55.694 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] During sync_power_state the instance has a pending task (migrating). Skip.
Sep 30 21:17:55 compute-0 nova_compute[192810]: 2025-09-30 21:17:55.706 2 DEBUG nova.virt.libvirt.migration [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Current 50 elapsed 3 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Sep 30 21:17:55 compute-0 nova_compute[192810]: 2025-09-30 21:17:55.707 2 DEBUG nova.virt.libvirt.migration [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Sep 30 21:17:55 compute-0 nova_compute[192810]: 2025-09-30 21:17:55.718 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267075.7171023, 7f0e9e16-1467-41e5-b5b0-965591aa014c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:17:55 compute-0 nova_compute[192810]: 2025-09-30 21:17:55.718 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] VM Started (Lifecycle Event)
Sep 30 21:17:55 compute-0 nova_compute[192810]: 2025-09-30 21:17:55.752 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:17:55 compute-0 kernel: tap730a9e74-90 (unregistering): left promiscuous mode
Sep 30 21:17:55 compute-0 NetworkManager[51733]: <info>  [1759267075.7658] device (tap730a9e74-90): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:17:55 compute-0 ovn_controller[94912]: 2025-09-30T21:17:55Z|00067|binding|INFO|Releasing lport 730a9e74-900e-49b2-a5c3-043d6da1a52b from this chassis (sb_readonly=0)
Sep 30 21:17:55 compute-0 ovn_controller[94912]: 2025-09-30T21:17:55Z|00068|binding|INFO|Setting lport 730a9e74-900e-49b2-a5c3-043d6da1a52b down in Southbound
Sep 30 21:17:55 compute-0 ovn_controller[94912]: 2025-09-30T21:17:55Z|00069|binding|INFO|Releasing lport 9ece0208-0151-4f0a-bda7-acd45fe4f2a0 from this chassis (sb_readonly=0)
Sep 30 21:17:55 compute-0 ovn_controller[94912]: 2025-09-30T21:17:55Z|00070|binding|INFO|Setting lport 9ece0208-0151-4f0a-bda7-acd45fe4f2a0 down in Southbound
Sep 30 21:17:55 compute-0 nova_compute[192810]: 2025-09-30 21:17:55.775 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:55 compute-0 ovn_controller[94912]: 2025-09-30T21:17:55Z|00071|binding|INFO|Removing iface tap730a9e74-90 ovn-installed in OVS
Sep 30 21:17:55 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:55.796 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:28:e2:ac 19.80.0.112'], port_security=['fa:16:3e:28:e2:ac 19.80.0.112'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['730a9e74-900e-49b2-a5c3-043d6da1a52b'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1118581425', 'neutron:cidrs': '19.80.0.112/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4a7886e9-2920-46a8-89e7-811c01f2e7c6', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1118581425', 'neutron:project_id': '96460712956e4f038121397afa979163', 'neutron:revision_number': '3', 'neutron:security_group_ids': '811ddc34-8450-4370-a409-1146bdb7efe9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=10e37694-4797-470f-adb8-72e2aa69e8d9, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=9ece0208-0151-4f0a-bda7-acd45fe4f2a0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:17:55 compute-0 ovn_controller[94912]: 2025-09-30T21:17:55Z|00072|binding|INFO|Releasing lport 0c66892e-7baf-4f9a-a329-dd0545dbf700 from this chassis (sb_readonly=0)
Sep 30 21:17:55 compute-0 ovn_controller[94912]: 2025-09-30T21:17:55Z|00073|binding|INFO|Releasing lport 976cb173-259a-473b-830a-8c627acdbeaf from this chassis (sb_readonly=0)
Sep 30 21:17:55 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:55.799 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:72:47:71 10.100.0.11'], port_security=['fa:16:3e:72:47:71 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '78438f8f-1ac2-4393-90b7-0b62e0665947'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-415058274', 'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-16d40025-1087-460f-a42f-c007f6eff406', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-415058274', 'neutron:project_id': '96460712956e4f038121397afa979163', 'neutron:revision_number': '8', 'neutron:security_group_ids': '811ddc34-8450-4370-a409-1146bdb7efe9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7712a78f-5ca7-49dc-980c-dc4049ba5089, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=730a9e74-900e-49b2-a5c3-043d6da1a52b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:17:55 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:55.801 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 9ece0208-0151-4f0a-bda7-acd45fe4f2a0 in datapath 4a7886e9-2920-46a8-89e7-811c01f2e7c6 unbound from our chassis
Sep 30 21:17:55 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:55.804 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4a7886e9-2920-46a8-89e7-811c01f2e7c6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:17:55 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:55.807 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[caf593c6-a4f1-4ab7-b3fb-7285ebe14ee2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:55 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:55.808 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4a7886e9-2920-46a8-89e7-811c01f2e7c6 namespace which is not needed anymore
Sep 30 21:17:55 compute-0 nova_compute[192810]: 2025-09-30 21:17:55.812 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:55 compute-0 nova_compute[192810]: 2025-09-30 21:17:55.882 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:55 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Sep 30 21:17:55 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000c.scope: Consumed 14.526s CPU time.
Sep 30 21:17:55 compute-0 systemd-machined[152794]: Machine qemu-5-instance-0000000c terminated.
Sep 30 21:17:55 compute-0 kernel: tap730a9e74-90: entered promiscuous mode
Sep 30 21:17:55 compute-0 NetworkManager[51733]: <info>  [1759267075.9645] manager: (tap730a9e74-90): new Tun device (/org/freedesktop/NetworkManager/Devices/44)
Sep 30 21:17:55 compute-0 ovn_controller[94912]: 2025-09-30T21:17:55Z|00074|binding|INFO|Claiming lport 730a9e74-900e-49b2-a5c3-043d6da1a52b for this chassis.
Sep 30 21:17:55 compute-0 ovn_controller[94912]: 2025-09-30T21:17:55Z|00075|binding|INFO|730a9e74-900e-49b2-a5c3-043d6da1a52b: Claiming fa:16:3e:72:47:71 10.100.0.11
Sep 30 21:17:55 compute-0 ovn_controller[94912]: 2025-09-30T21:17:55Z|00076|binding|INFO|Claiming lport 9ece0208-0151-4f0a-bda7-acd45fe4f2a0 for this chassis.
Sep 30 21:17:55 compute-0 ovn_controller[94912]: 2025-09-30T21:17:55Z|00077|binding|INFO|9ece0208-0151-4f0a-bda7-acd45fe4f2a0: Claiming fa:16:3e:28:e2:ac 19.80.0.112
Sep 30 21:17:55 compute-0 nova_compute[192810]: 2025-09-30 21:17:55.965 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:55 compute-0 kernel: tap730a9e74-90 (unregistering): left promiscuous mode
Sep 30 21:17:55 compute-0 neutron-haproxy-ovnmeta-4a7886e9-2920-46a8-89e7-811c01f2e7c6[221741]: [NOTICE]   (221745) : haproxy version is 2.8.14-c23fe91
Sep 30 21:17:55 compute-0 neutron-haproxy-ovnmeta-4a7886e9-2920-46a8-89e7-811c01f2e7c6[221741]: [NOTICE]   (221745) : path to executable is /usr/sbin/haproxy
Sep 30 21:17:55 compute-0 neutron-haproxy-ovnmeta-4a7886e9-2920-46a8-89e7-811c01f2e7c6[221741]: [WARNING]  (221745) : Exiting Master process...
Sep 30 21:17:55 compute-0 neutron-haproxy-ovnmeta-4a7886e9-2920-46a8-89e7-811c01f2e7c6[221741]: [ALERT]    (221745) : Current worker (221747) exited with code 143 (Terminated)
Sep 30 21:17:55 compute-0 neutron-haproxy-ovnmeta-4a7886e9-2920-46a8-89e7-811c01f2e7c6[221741]: [WARNING]  (221745) : All workers exited. Exiting... (0)
Sep 30 21:17:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:55.999 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:28:e2:ac 19.80.0.112'], port_security=['fa:16:3e:28:e2:ac 19.80.0.112'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['730a9e74-900e-49b2-a5c3-043d6da1a52b'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1118581425', 'neutron:cidrs': '19.80.0.112/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4a7886e9-2920-46a8-89e7-811c01f2e7c6', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1118581425', 'neutron:project_id': '96460712956e4f038121397afa979163', 'neutron:revision_number': '3', 'neutron:security_group_ids': '811ddc34-8450-4370-a409-1146bdb7efe9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=10e37694-4797-470f-adb8-72e2aa69e8d9, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=9ece0208-0151-4f0a-bda7-acd45fe4f2a0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:17:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:56.003 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:72:47:71 10.100.0.11'], port_security=['fa:16:3e:72:47:71 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '78438f8f-1ac2-4393-90b7-0b62e0665947'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-415058274', 'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-16d40025-1087-460f-a42f-c007f6eff406', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-415058274', 'neutron:project_id': '96460712956e4f038121397afa979163', 'neutron:revision_number': '8', 'neutron:security_group_ids': '811ddc34-8450-4370-a409-1146bdb7efe9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7712a78f-5ca7-49dc-980c-dc4049ba5089, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=730a9e74-900e-49b2-a5c3-043d6da1a52b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:17:56 compute-0 systemd[1]: libpod-63ce1f36c0ff69954bcb17a5a4b82a65e80ee42bee86fc6a22f1699660d0bf92.scope: Deactivated successfully.
Sep 30 21:17:56 compute-0 podman[222345]: 2025-09-30 21:17:56.013028772 +0000 UTC m=+0.081139031 container died 63ce1f36c0ff69954bcb17a5a4b82a65e80ee42bee86fc6a22f1699660d0bf92 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4a7886e9-2920-46a8-89e7-811c01f2e7c6, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.build-date=20250923)
Sep 30 21:17:56 compute-0 virtqemud[192233]: Cannot recv data: Input/output error
Sep 30 21:17:56 compute-0 nova_compute[192810]: 2025-09-30 21:17:56.045 2 DEBUG nova.virt.libvirt.driver [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Sep 30 21:17:56 compute-0 nova_compute[192810]: 2025-09-30 21:17:56.045 2 DEBUG nova.virt.libvirt.driver [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Sep 30 21:17:56 compute-0 nova_compute[192810]: 2025-09-30 21:17:56.045 2 DEBUG nova.virt.libvirt.driver [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Sep 30 21:17:56 compute-0 ovn_controller[94912]: 2025-09-30T21:17:56Z|00078|binding|INFO|Releasing lport 730a9e74-900e-49b2-a5c3-043d6da1a52b from this chassis (sb_readonly=0)
Sep 30 21:17:56 compute-0 nova_compute[192810]: 2025-09-30 21:17:56.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:56 compute-0 ovn_controller[94912]: 2025-09-30T21:17:56Z|00079|binding|INFO|Releasing lport 9ece0208-0151-4f0a-bda7-acd45fe4f2a0 from this chassis (sb_readonly=0)
Sep 30 21:17:56 compute-0 nova_compute[192810]: 2025-09-30 21:17:56.071 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:56 compute-0 systemd[1]: var-lib-containers-storage-overlay-2cbf00f05d659d809e9ba58014e60a790d4d4aa7eb5553158afe4d75e910c6fc-merged.mount: Deactivated successfully.
Sep 30 21:17:56 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-63ce1f36c0ff69954bcb17a5a4b82a65e80ee42bee86fc6a22f1699660d0bf92-userdata-shm.mount: Deactivated successfully.
Sep 30 21:17:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:56.085 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:28:e2:ac 19.80.0.112'], port_security=['fa:16:3e:28:e2:ac 19.80.0.112'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['730a9e74-900e-49b2-a5c3-043d6da1a52b'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1118581425', 'neutron:cidrs': '19.80.0.112/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4a7886e9-2920-46a8-89e7-811c01f2e7c6', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1118581425', 'neutron:project_id': '96460712956e4f038121397afa979163', 'neutron:revision_number': '3', 'neutron:security_group_ids': '811ddc34-8450-4370-a409-1146bdb7efe9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=10e37694-4797-470f-adb8-72e2aa69e8d9, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=9ece0208-0151-4f0a-bda7-acd45fe4f2a0) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:17:56 compute-0 podman[222345]: 2025-09-30 21:17:56.089295328 +0000 UTC m=+0.157405587 container cleanup 63ce1f36c0ff69954bcb17a5a4b82a65e80ee42bee86fc6a22f1699660d0bf92 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4a7886e9-2920-46a8-89e7-811c01f2e7c6, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 21:17:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:56.093 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:72:47:71 10.100.0.11'], port_security=['fa:16:3e:72:47:71 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '78438f8f-1ac2-4393-90b7-0b62e0665947'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-415058274', 'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-16d40025-1087-460f-a42f-c007f6eff406', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-415058274', 'neutron:project_id': '96460712956e4f038121397afa979163', 'neutron:revision_number': '8', 'neutron:security_group_ids': '811ddc34-8450-4370-a409-1146bdb7efe9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7712a78f-5ca7-49dc-980c-dc4049ba5089, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=730a9e74-900e-49b2-a5c3-043d6da1a52b) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:17:56 compute-0 systemd[1]: libpod-conmon-63ce1f36c0ff69954bcb17a5a4b82a65e80ee42bee86fc6a22f1699660d0bf92.scope: Deactivated successfully.
Sep 30 21:17:56 compute-0 podman[222385]: 2025-09-30 21:17:56.167278258 +0000 UTC m=+0.047342493 container remove 63ce1f36c0ff69954bcb17a5a4b82a65e80ee42bee86fc6a22f1699660d0bf92 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4a7886e9-2920-46a8-89e7-811c01f2e7c6, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:17:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:56.178 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[58b0372b-7d52-4443-bf7e-3d0239b671ac]: (4, ('Tue Sep 30 09:17:55 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-4a7886e9-2920-46a8-89e7-811c01f2e7c6 (63ce1f36c0ff69954bcb17a5a4b82a65e80ee42bee86fc6a22f1699660d0bf92)\n63ce1f36c0ff69954bcb17a5a4b82a65e80ee42bee86fc6a22f1699660d0bf92\nTue Sep 30 09:17:56 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-4a7886e9-2920-46a8-89e7-811c01f2e7c6 (63ce1f36c0ff69954bcb17a5a4b82a65e80ee42bee86fc6a22f1699660d0bf92)\n63ce1f36c0ff69954bcb17a5a4b82a65e80ee42bee86fc6a22f1699660d0bf92\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:56.180 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[9b90e3dd-a852-45cd-a131-2521c642ad8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:56.181 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4a7886e9-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:56 compute-0 kernel: tap4a7886e9-20: left promiscuous mode
Sep 30 21:17:56 compute-0 nova_compute[192810]: 2025-09-30 21:17:56.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:56 compute-0 nova_compute[192810]: 2025-09-30 21:17:56.203 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:56 compute-0 nova_compute[192810]: 2025-09-30 21:17:56.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:56 compute-0 nova_compute[192810]: 2025-09-30 21:17:56.209 2 DEBUG nova.virt.libvirt.guest [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid 'b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65' (instance-0000000c) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Sep 30 21:17:56 compute-0 nova_compute[192810]: 2025-09-30 21:17:56.209 2 INFO nova.virt.libvirt.driver [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Migration operation has completed
Sep 30 21:17:56 compute-0 nova_compute[192810]: 2025-09-30 21:17:56.209 2 INFO nova.compute.manager [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] _post_live_migration() is started..
Sep 30 21:17:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:56.208 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e2b3d45e-d98f-4005-8d09-5ab4a89efa57]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:56.239 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[9c5c7f68-38f9-48ac-9eb5-0018a6c9c986]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:56.240 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[eb5433dd-0e41-4557-a449-3e1f357767fe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:56.263 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[2b7d0e02-02d5-4786-93ef-40e32e9095e3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 373426, 'reachable_time': 20187, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222402, 'error': None, 'target': 'ovnmeta-4a7886e9-2920-46a8-89e7-811c01f2e7c6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:56.270 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4a7886e9-2920-46a8-89e7-811c01f2e7c6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:17:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:56.271 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[5db502c8-c257-47be-a19e-fa9c3e2ae2b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:56.273 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 730a9e74-900e-49b2-a5c3-043d6da1a52b in datapath 16d40025-1087-460f-a42f-c007f6eff406 unbound from our chassis
Sep 30 21:17:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:56.275 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 16d40025-1087-460f-a42f-c007f6eff406, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:17:56 compute-0 systemd[1]: run-netns-ovnmeta\x2d4a7886e9\x2d2920\x2d46a8\x2d89e7\x2d811c01f2e7c6.mount: Deactivated successfully.
Sep 30 21:17:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:56.276 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e15fc93e-2347-4d85-88a5-889f5196a0fa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:56.277 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-16d40025-1087-460f-a42f-c007f6eff406 namespace which is not needed anymore
Sep 30 21:17:56 compute-0 nova_compute[192810]: 2025-09-30 21:17:56.335 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:56 compute-0 neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406[221817]: [NOTICE]   (221837) : haproxy version is 2.8.14-c23fe91
Sep 30 21:17:56 compute-0 neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406[221817]: [NOTICE]   (221837) : path to executable is /usr/sbin/haproxy
Sep 30 21:17:56 compute-0 neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406[221817]: [WARNING]  (221837) : Exiting Master process...
Sep 30 21:17:56 compute-0 neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406[221817]: [ALERT]    (221837) : Current worker (221844) exited with code 143 (Terminated)
Sep 30 21:17:56 compute-0 neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406[221817]: [WARNING]  (221837) : All workers exited. Exiting... (0)
Sep 30 21:17:56 compute-0 systemd[1]: libpod-fb35cb2fd8a68a330cf84231765cb8fbf7fd5763b5bb2fcfb01eeb22b34720fc.scope: Deactivated successfully.
Sep 30 21:17:56 compute-0 podman[222419]: 2025-09-30 21:17:56.438602306 +0000 UTC m=+0.046543783 container died fb35cb2fd8a68a330cf84231765cb8fbf7fd5763b5bb2fcfb01eeb22b34720fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:17:56 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fb35cb2fd8a68a330cf84231765cb8fbf7fd5763b5bb2fcfb01eeb22b34720fc-userdata-shm.mount: Deactivated successfully.
Sep 30 21:17:56 compute-0 systemd[1]: var-lib-containers-storage-overlay-67493d0b0ffdda01faef13c054f5e134213eb07cfbbef7fe0c824bdf1e0d45fb-merged.mount: Deactivated successfully.
Sep 30 21:17:56 compute-0 podman[222419]: 2025-09-30 21:17:56.469845939 +0000 UTC m=+0.077787406 container cleanup fb35cb2fd8a68a330cf84231765cb8fbf7fd5763b5bb2fcfb01eeb22b34720fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:17:56 compute-0 systemd[1]: libpod-conmon-fb35cb2fd8a68a330cf84231765cb8fbf7fd5763b5bb2fcfb01eeb22b34720fc.scope: Deactivated successfully.
Sep 30 21:17:56 compute-0 podman[222446]: 2025-09-30 21:17:56.538225445 +0000 UTC m=+0.047261641 container remove fb35cb2fd8a68a330cf84231765cb8fbf7fd5763b5bb2fcfb01eeb22b34720fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:17:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:56.544 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[1681b37b-7d0f-4059-8588-24a2758320d3]: (4, ('Tue Sep 30 09:17:56 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406 (fb35cb2fd8a68a330cf84231765cb8fbf7fd5763b5bb2fcfb01eeb22b34720fc)\nfb35cb2fd8a68a330cf84231765cb8fbf7fd5763b5bb2fcfb01eeb22b34720fc\nTue Sep 30 09:17:56 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-16d40025-1087-460f-a42f-c007f6eff406 (fb35cb2fd8a68a330cf84231765cb8fbf7fd5763b5bb2fcfb01eeb22b34720fc)\nfb35cb2fd8a68a330cf84231765cb8fbf7fd5763b5bb2fcfb01eeb22b34720fc\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:56.546 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[cdb27ca2-f8a2-4bf0-93a5-7ce447fde78a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:56.547 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap16d40025-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:56 compute-0 nova_compute[192810]: 2025-09-30 21:17:56.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:56 compute-0 kernel: tap16d40025-10: left promiscuous mode
Sep 30 21:17:56 compute-0 nova_compute[192810]: 2025-09-30 21:17:56.563 2 DEBUG nova.network.neutron [req-6addfe0d-3549-4177-a387-c2d4b541535b req-7890015d-2784-4276-9bb4-2934937ade8a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Updated VIF entry in instance network info cache for port 730a9e74-900e-49b2-a5c3-043d6da1a52b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:17:56 compute-0 nova_compute[192810]: 2025-09-30 21:17:56.563 2 DEBUG nova.network.neutron [req-6addfe0d-3549-4177-a387-c2d4b541535b req-7890015d-2784-4276-9bb4-2934937ade8a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Updating instance_info_cache with network_info: [{"id": "730a9e74-900e-49b2-a5c3-043d6da1a52b", "address": "fa:16:3e:72:47:71", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap730a9e74-90", "ovs_interfaceid": "730a9e74-900e-49b2-a5c3-043d6da1a52b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:17:56 compute-0 nova_compute[192810]: 2025-09-30 21:17:56.565 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:56 compute-0 nova_compute[192810]: 2025-09-30 21:17:56.571 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267076.5708604, 7f0e9e16-1467-41e5-b5b0-965591aa014c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:17:56 compute-0 nova_compute[192810]: 2025-09-30 21:17:56.571 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] VM Resumed (Lifecycle Event)
Sep 30 21:17:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:56.570 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[09e4e505-bed7-4087-98db-d0de19929241]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:56.596 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[94ba3974-7a96-48c0-9ea6-5cc6ca090058]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:56.597 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[ee983388-4d0b-48db-92f0-2824ce8fb48f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:56.615 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[2f9f61db-8545-4fa7-870e-bd91b9ca5897]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 373532, 'reachable_time': 19183, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222469, 'error': None, 'target': 'ovnmeta-16d40025-1087-460f-a42f-c007f6eff406', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:56.618 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-16d40025-1087-460f-a42f-c007f6eff406 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:17:56 compute-0 systemd[1]: run-netns-ovnmeta\x2d16d40025\x2d1087\x2d460f\x2da42f\x2dc007f6eff406.mount: Deactivated successfully.
Sep 30 21:17:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:56.618 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[0bd2cab9-b09b-4a7c-9a0f-f700c87aa465]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:56 compute-0 nova_compute[192810]: 2025-09-30 21:17:56.620 2 DEBUG oslo_concurrency.lockutils [req-6addfe0d-3549-4177-a387-c2d4b541535b req-7890015d-2784-4276-9bb4-2934937ade8a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:17:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:56.620 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 9ece0208-0151-4f0a-bda7-acd45fe4f2a0 in datapath 4a7886e9-2920-46a8-89e7-811c01f2e7c6 unbound from our chassis
Sep 30 21:17:56 compute-0 nova_compute[192810]: 2025-09-30 21:17:56.622 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:17:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:56.622 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4a7886e9-2920-46a8-89e7-811c01f2e7c6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:17:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:56.623 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[3017f38f-3b46-44d6-a3db-0ac86cbecf4d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:56.624 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 730a9e74-900e-49b2-a5c3-043d6da1a52b in datapath 16d40025-1087-460f-a42f-c007f6eff406 unbound from our chassis
Sep 30 21:17:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:56.625 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 16d40025-1087-460f-a42f-c007f6eff406, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:17:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:56.625 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[c255fc72-3537-4691-963b-0311365d9ef6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:56.626 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 9ece0208-0151-4f0a-bda7-acd45fe4f2a0 in datapath 4a7886e9-2920-46a8-89e7-811c01f2e7c6 unbound from our chassis
Sep 30 21:17:56 compute-0 nova_compute[192810]: 2025-09-30 21:17:56.628 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:17:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:56.627 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4a7886e9-2920-46a8-89e7-811c01f2e7c6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:17:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:56.630 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[2eea83a1-a06b-419f-bacd-19d4053d90e5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:56.631 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 730a9e74-900e-49b2-a5c3-043d6da1a52b in datapath 16d40025-1087-460f-a42f-c007f6eff406 unbound from our chassis
Sep 30 21:17:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:56.632 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 16d40025-1087-460f-a42f-c007f6eff406, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:17:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:56.633 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[4385b608-4f84-4c84-967b-636c552f44fa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:56 compute-0 nova_compute[192810]: 2025-09-30 21:17:56.658 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Sep 30 21:17:57 compute-0 nova_compute[192810]: 2025-09-30 21:17:57.265 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:57 compute-0 nova_compute[192810]: 2025-09-30 21:17:57.279 2 DEBUG nova.network.neutron [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Activated binding for port 730a9e74-900e-49b2-a5c3-043d6da1a52b and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Sep 30 21:17:57 compute-0 nova_compute[192810]: 2025-09-30 21:17:57.280 2 DEBUG nova.compute.manager [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "730a9e74-900e-49b2-a5c3-043d6da1a52b", "address": "fa:16:3e:72:47:71", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap730a9e74-90", "ovs_interfaceid": "730a9e74-900e-49b2-a5c3-043d6da1a52b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Sep 30 21:17:57 compute-0 nova_compute[192810]: 2025-09-30 21:17:57.281 2 DEBUG nova.virt.libvirt.vif [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:17:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-2078302607',display_name='tempest-LiveMigrationTest-server-2078302607',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-livemigrationtest-server-2078302607',id=12,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:17:20Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='96460712956e4f038121397afa979163',ramdisk_id='',reservation_id='r-1871um1z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min
_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-2029274765',owner_user_name='tempest-LiveMigrationTest-2029274765-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:17:26Z,user_data=None,user_id='4b263d7c3e3141f999e8eabf49e8190c',uuid=b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "730a9e74-900e-49b2-a5c3-043d6da1a52b", "address": "fa:16:3e:72:47:71", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap730a9e74-90", "ovs_interfaceid": "730a9e74-900e-49b2-a5c3-043d6da1a52b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:17:57 compute-0 nova_compute[192810]: 2025-09-30 21:17:57.282 2 DEBUG nova.network.os_vif_util [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Converting VIF {"id": "730a9e74-900e-49b2-a5c3-043d6da1a52b", "address": "fa:16:3e:72:47:71", "network": {"id": "16d40025-1087-460f-a42f-c007f6eff406", "bridge": "br-int", "label": "tempest-LiveMigrationTest-27990102-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96460712956e4f038121397afa979163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap730a9e74-90", "ovs_interfaceid": "730a9e74-900e-49b2-a5c3-043d6da1a52b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:17:57 compute-0 nova_compute[192810]: 2025-09-30 21:17:57.283 2 DEBUG nova.network.os_vif_util [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:72:47:71,bridge_name='br-int',has_traffic_filtering=True,id=730a9e74-900e-49b2-a5c3-043d6da1a52b,network=Network(16d40025-1087-460f-a42f-c007f6eff406),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap730a9e74-90') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:17:57 compute-0 nova_compute[192810]: 2025-09-30 21:17:57.284 2 DEBUG os_vif [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:72:47:71,bridge_name='br-int',has_traffic_filtering=True,id=730a9e74-900e-49b2-a5c3-043d6da1a52b,network=Network(16d40025-1087-460f-a42f-c007f6eff406),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap730a9e74-90') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:17:57 compute-0 nova_compute[192810]: 2025-09-30 21:17:57.287 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:57 compute-0 nova_compute[192810]: 2025-09-30 21:17:57.287 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap730a9e74-90, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:57 compute-0 nova_compute[192810]: 2025-09-30 21:17:57.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:57 compute-0 nova_compute[192810]: 2025-09-30 21:17:57.292 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:17:57 compute-0 nova_compute[192810]: 2025-09-30 21:17:57.297 2 INFO os_vif [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:72:47:71,bridge_name='br-int',has_traffic_filtering=True,id=730a9e74-900e-49b2-a5c3-043d6da1a52b,network=Network(16d40025-1087-460f-a42f-c007f6eff406),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap730a9e74-90')
Sep 30 21:17:57 compute-0 nova_compute[192810]: 2025-09-30 21:17:57.298 2 DEBUG oslo_concurrency.lockutils [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:57 compute-0 nova_compute[192810]: 2025-09-30 21:17:57.298 2 DEBUG oslo_concurrency.lockutils [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:57 compute-0 nova_compute[192810]: 2025-09-30 21:17:57.298 2 DEBUG oslo_concurrency.lockutils [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:57 compute-0 nova_compute[192810]: 2025-09-30 21:17:57.299 2 DEBUG nova.compute.manager [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Sep 30 21:17:57 compute-0 nova_compute[192810]: 2025-09-30 21:17:57.300 2 INFO nova.virt.libvirt.driver [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Deleting instance files /var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65_del
Sep 30 21:17:57 compute-0 nova_compute[192810]: 2025-09-30 21:17:57.301 2 INFO nova.virt.libvirt.driver [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Deletion of /var/lib/nova/instances/b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65_del complete
Sep 30 21:17:57 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Sep 30 21:17:57 compute-0 systemd[222190]: Activating special unit Exit the Session...
Sep 30 21:17:57 compute-0 systemd[222190]: Stopped target Main User Target.
Sep 30 21:17:57 compute-0 systemd[222190]: Stopped target Basic System.
Sep 30 21:17:57 compute-0 systemd[222190]: Stopped target Paths.
Sep 30 21:17:57 compute-0 systemd[222190]: Stopped target Sockets.
Sep 30 21:17:57 compute-0 systemd[222190]: Stopped target Timers.
Sep 30 21:17:57 compute-0 systemd[222190]: Stopped Mark boot as successful after the user session has run 2 minutes.
Sep 30 21:17:57 compute-0 systemd[222190]: Stopped Daily Cleanup of User's Temporary Directories.
Sep 30 21:17:57 compute-0 systemd[222190]: Closed D-Bus User Message Bus Socket.
Sep 30 21:17:57 compute-0 systemd[222190]: Stopped Create User's Volatile Files and Directories.
Sep 30 21:17:57 compute-0 systemd[222190]: Removed slice User Application Slice.
Sep 30 21:17:57 compute-0 systemd[222190]: Reached target Shutdown.
Sep 30 21:17:57 compute-0 systemd[222190]: Finished Exit the Session.
Sep 30 21:17:57 compute-0 systemd[222190]: Reached target Exit the Session.
Sep 30 21:17:57 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Sep 30 21:17:57 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Sep 30 21:17:57 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Sep 30 21:17:57 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Sep 30 21:17:57 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Sep 30 21:17:57 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Sep 30 21:17:57 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Sep 30 21:17:57 compute-0 podman[222472]: 2025-09-30 21:17:57.827486495 +0000 UTC m=+0.062132628 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:17:58 compute-0 sshd-session[222470]: Invalid user juan from 45.81.23.80 port 42572
Sep 30 21:17:58 compute-0 sshd-session[222470]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:17:58 compute-0 sshd-session[222470]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=45.81.23.80
Sep 30 21:17:58 compute-0 ovn_controller[94912]: 2025-09-30T21:17:58Z|00080|binding|INFO|Claiming lport bb8ecc8e-9cf4-4901-9788-83c49356f983 for this chassis.
Sep 30 21:17:58 compute-0 ovn_controller[94912]: 2025-09-30T21:17:58Z|00081|binding|INFO|bb8ecc8e-9cf4-4901-9788-83c49356f983: Claiming fa:16:3e:45:9d:62 10.100.0.12
Sep 30 21:17:58 compute-0 ovn_controller[94912]: 2025-09-30T21:17:58Z|00082|binding|INFO|Setting lport bb8ecc8e-9cf4-4901-9788-83c49356f983 up in Southbound
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:58.058 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:9d:62 10.100.0.12'], port_security=['fa:16:3e:45:9d:62 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-934fff90-5446-41f1-a5ad-d2568cb337b1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'neutron:revision_number': '20', 'neutron:security_group_ids': 'ae5806dc-3fbd-4366-84ab-b061f2375093', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c5644fe7-3662-476d-bcfe-5bc86ceef791, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=bb8ecc8e-9cf4-4901-9788-83c49356f983) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:58.059 103867 INFO neutron.agent.ovn.metadata.agent [-] Port bb8ecc8e-9cf4-4901-9788-83c49356f983 in datapath 934fff90-5446-41f1-a5ad-d2568cb337b1 bound to our chassis
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:58.062 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 934fff90-5446-41f1-a5ad-d2568cb337b1
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:58.080 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[923512d0-337c-44ee-9fa8-ba23efa7f402]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:58.082 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap934fff90-51 in ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:58.085 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap934fff90-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:58.085 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[98e6e91d-3ec8-4d5a-bf93-d47259927f05]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:58.086 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[fb5658b5-0e10-44f0-a66e-3b9b687e1530]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:58.107 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[1d5c6c99-4a2a-4afa-833b-9636e5ed0688]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:58.134 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[29dc799c-7e77-457e-92c1-e2efe9bb09be]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:58.177 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[cc24f2c2-8e5e-4404-9c5a-a48c008b6ba8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:58 compute-0 NetworkManager[51733]: <info>  [1759267078.1904] manager: (tap934fff90-50): new Veth device (/org/freedesktop/NetworkManager/Devices/45)
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:58.185 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[201e21ee-6179-4d9d-a600-098d56b84fcd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:58.240 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[af2010b5-72ab-4107-baf9-7a5ba82b57c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:58 compute-0 systemd-udevd[222499]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:58.246 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[6c730a87-2c73-4263-a4d0-2fd8eb035039]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:58 compute-0 NetworkManager[51733]: <info>  [1759267078.2775] device (tap934fff90-50): carrier: link connected
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:58.282 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[329354e6-8811-4227-9758-686d81a73d0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:58.304 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[1e50f5d5-e822-408f-81fb-c8e01bc85e6d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap934fff90-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:18:37:65'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 377390, 'reachable_time': 27389, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222518, 'error': None, 'target': 'ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:58.328 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[081c3fdf-e37f-4505-92ec-fa603f94ac65]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe18:3765'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 377390, 'tstamp': 377390}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222519, 'error': None, 'target': 'ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:58.351 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[800fbb61-da5e-4785-adb0-a7d94d2819d6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap934fff90-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:18:37:65'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 377390, 'reachable_time': 27389, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222520, 'error': None, 'target': 'ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:58.398 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[38acff1b-8f75-4aef-a089-bbf9b45aaae5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:58 compute-0 nova_compute[192810]: 2025-09-30 21:17:58.416 2 INFO nova.compute.manager [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Post operation of migration started
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:58.492 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[f7ace127-0fce-4638-bcf3-c9c03f358413]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:58.494 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap934fff90-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:58.494 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:58.494 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap934fff90-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:58 compute-0 kernel: tap934fff90-50: entered promiscuous mode
Sep 30 21:17:58 compute-0 NetworkManager[51733]: <info>  [1759267078.4978] manager: (tap934fff90-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Sep 30 21:17:58 compute-0 nova_compute[192810]: 2025-09-30 21:17:58.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:58 compute-0 nova_compute[192810]: 2025-09-30 21:17:58.500 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:58.503 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap934fff90-50, col_values=(('external_ids', {'iface-id': 'b21a7164-770c-4265-ad15-a3e058ec1a56'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:17:58 compute-0 nova_compute[192810]: 2025-09-30 21:17:58.505 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:58 compute-0 ovn_controller[94912]: 2025-09-30T21:17:58Z|00083|binding|INFO|Releasing lport b21a7164-770c-4265-ad15-a3e058ec1a56 from this chassis (sb_readonly=0)
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:58.507 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/934fff90-5446-41f1-a5ad-d2568cb337b1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/934fff90-5446-41f1-a5ad-d2568cb337b1.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:58.508 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[6ec0f44e-fdb8-4f25-8966-b9d8bfb2b92c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:58.509 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-934fff90-5446-41f1-a5ad-d2568cb337b1
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/934fff90-5446-41f1-a5ad-d2568cb337b1.pid.haproxy
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID 934fff90-5446-41f1-a5ad-d2568cb337b1
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:17:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:17:58.511 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1', 'env', 'PROCESS_TAG=haproxy-934fff90-5446-41f1-a5ad-d2568cb337b1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/934fff90-5446-41f1-a5ad-d2568cb337b1.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:17:58 compute-0 nova_compute[192810]: 2025-09-30 21:17:58.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:17:58 compute-0 nova_compute[192810]: 2025-09-30 21:17:58.829 2 DEBUG oslo_concurrency.lockutils [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Acquiring lock "refresh_cache-7f0e9e16-1467-41e5-b5b0-965591aa014c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:17:58 compute-0 nova_compute[192810]: 2025-09-30 21:17:58.830 2 DEBUG oslo_concurrency.lockutils [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Acquired lock "refresh_cache-7f0e9e16-1467-41e5-b5b0-965591aa014c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:17:58 compute-0 nova_compute[192810]: 2025-09-30 21:17:58.830 2 DEBUG nova.network.neutron [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:17:58 compute-0 podman[222552]: 2025-09-30 21:17:58.907720369 +0000 UTC m=+0.092248993 container create cb2c33ba4a5894e89ac1815456a1c9d7a479275890552e040bbde2bd51d09c03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:17:58 compute-0 podman[222552]: 2025-09-30 21:17:58.847192922 +0000 UTC m=+0.031721606 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:17:58 compute-0 systemd[1]: Started libpod-conmon-cb2c33ba4a5894e89ac1815456a1c9d7a479275890552e040bbde2bd51d09c03.scope.
Sep 30 21:17:59 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:17:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/37e1a45db0bb6f0533b9c059f707e6cc1ccffdf73c36f7883fb8b3d608953dd6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:17:59 compute-0 podman[222552]: 2025-09-30 21:17:59.040283874 +0000 UTC m=+0.224812508 container init cb2c33ba4a5894e89ac1815456a1c9d7a479275890552e040bbde2bd51d09c03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Sep 30 21:17:59 compute-0 podman[222552]: 2025-09-30 21:17:59.046697977 +0000 UTC m=+0.231226591 container start cb2c33ba4a5894e89ac1815456a1c9d7a479275890552e040bbde2bd51d09c03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS)
Sep 30 21:17:59 compute-0 neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1[222567]: [NOTICE]   (222571) : New worker (222573) forked
Sep 30 21:17:59 compute-0 neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1[222567]: [NOTICE]   (222571) : Loading success.
Sep 30 21:17:59 compute-0 nova_compute[192810]: 2025-09-30 21:17:59.358 2 DEBUG nova.compute.manager [req-735b472b-2deb-4927-8d7d-6f9e6717bf8e req-7a1daf54-f094-4ba2-9de3-bb42b354e559 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Received event network-vif-unplugged-730a9e74-900e-49b2-a5c3-043d6da1a52b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:17:59 compute-0 nova_compute[192810]: 2025-09-30 21:17:59.359 2 DEBUG oslo_concurrency.lockutils [req-735b472b-2deb-4927-8d7d-6f9e6717bf8e req-7a1daf54-f094-4ba2-9de3-bb42b354e559 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:59 compute-0 nova_compute[192810]: 2025-09-30 21:17:59.359 2 DEBUG oslo_concurrency.lockutils [req-735b472b-2deb-4927-8d7d-6f9e6717bf8e req-7a1daf54-f094-4ba2-9de3-bb42b354e559 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:59 compute-0 nova_compute[192810]: 2025-09-30 21:17:59.360 2 DEBUG oslo_concurrency.lockutils [req-735b472b-2deb-4927-8d7d-6f9e6717bf8e req-7a1daf54-f094-4ba2-9de3-bb42b354e559 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:59 compute-0 nova_compute[192810]: 2025-09-30 21:17:59.360 2 DEBUG nova.compute.manager [req-735b472b-2deb-4927-8d7d-6f9e6717bf8e req-7a1daf54-f094-4ba2-9de3-bb42b354e559 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] No waiting events found dispatching network-vif-unplugged-730a9e74-900e-49b2-a5c3-043d6da1a52b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:17:59 compute-0 nova_compute[192810]: 2025-09-30 21:17:59.360 2 DEBUG nova.compute.manager [req-735b472b-2deb-4927-8d7d-6f9e6717bf8e req-7a1daf54-f094-4ba2-9de3-bb42b354e559 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Received event network-vif-unplugged-730a9e74-900e-49b2-a5c3-043d6da1a52b for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:17:59 compute-0 nova_compute[192810]: 2025-09-30 21:17:59.361 2 DEBUG nova.compute.manager [req-735b472b-2deb-4927-8d7d-6f9e6717bf8e req-7a1daf54-f094-4ba2-9de3-bb42b354e559 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Received event network-vif-plugged-730a9e74-900e-49b2-a5c3-043d6da1a52b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:17:59 compute-0 nova_compute[192810]: 2025-09-30 21:17:59.361 2 DEBUG oslo_concurrency.lockutils [req-735b472b-2deb-4927-8d7d-6f9e6717bf8e req-7a1daf54-f094-4ba2-9de3-bb42b354e559 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:59 compute-0 nova_compute[192810]: 2025-09-30 21:17:59.361 2 DEBUG oslo_concurrency.lockutils [req-735b472b-2deb-4927-8d7d-6f9e6717bf8e req-7a1daf54-f094-4ba2-9de3-bb42b354e559 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:59 compute-0 nova_compute[192810]: 2025-09-30 21:17:59.361 2 DEBUG oslo_concurrency.lockutils [req-735b472b-2deb-4927-8d7d-6f9e6717bf8e req-7a1daf54-f094-4ba2-9de3-bb42b354e559 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:59 compute-0 nova_compute[192810]: 2025-09-30 21:17:59.362 2 DEBUG nova.compute.manager [req-735b472b-2deb-4927-8d7d-6f9e6717bf8e req-7a1daf54-f094-4ba2-9de3-bb42b354e559 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] No waiting events found dispatching network-vif-plugged-730a9e74-900e-49b2-a5c3-043d6da1a52b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:17:59 compute-0 nova_compute[192810]: 2025-09-30 21:17:59.362 2 WARNING nova.compute.manager [req-735b472b-2deb-4927-8d7d-6f9e6717bf8e req-7a1daf54-f094-4ba2-9de3-bb42b354e559 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Received unexpected event network-vif-plugged-730a9e74-900e-49b2-a5c3-043d6da1a52b for instance with vm_state active and task_state migrating.
Sep 30 21:17:59 compute-0 nova_compute[192810]: 2025-09-30 21:17:59.362 2 DEBUG nova.compute.manager [req-735b472b-2deb-4927-8d7d-6f9e6717bf8e req-7a1daf54-f094-4ba2-9de3-bb42b354e559 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Received event network-vif-plugged-730a9e74-900e-49b2-a5c3-043d6da1a52b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:17:59 compute-0 nova_compute[192810]: 2025-09-30 21:17:59.362 2 DEBUG oslo_concurrency.lockutils [req-735b472b-2deb-4927-8d7d-6f9e6717bf8e req-7a1daf54-f094-4ba2-9de3-bb42b354e559 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:59 compute-0 nova_compute[192810]: 2025-09-30 21:17:59.363 2 DEBUG oslo_concurrency.lockutils [req-735b472b-2deb-4927-8d7d-6f9e6717bf8e req-7a1daf54-f094-4ba2-9de3-bb42b354e559 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:59 compute-0 nova_compute[192810]: 2025-09-30 21:17:59.363 2 DEBUG oslo_concurrency.lockutils [req-735b472b-2deb-4927-8d7d-6f9e6717bf8e req-7a1daf54-f094-4ba2-9de3-bb42b354e559 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:59 compute-0 nova_compute[192810]: 2025-09-30 21:17:59.363 2 DEBUG nova.compute.manager [req-735b472b-2deb-4927-8d7d-6f9e6717bf8e req-7a1daf54-f094-4ba2-9de3-bb42b354e559 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] No waiting events found dispatching network-vif-plugged-730a9e74-900e-49b2-a5c3-043d6da1a52b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:17:59 compute-0 nova_compute[192810]: 2025-09-30 21:17:59.363 2 WARNING nova.compute.manager [req-735b472b-2deb-4927-8d7d-6f9e6717bf8e req-7a1daf54-f094-4ba2-9de3-bb42b354e559 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Received unexpected event network-vif-plugged-730a9e74-900e-49b2-a5c3-043d6da1a52b for instance with vm_state active and task_state migrating.
Sep 30 21:17:59 compute-0 nova_compute[192810]: 2025-09-30 21:17:59.364 2 DEBUG nova.compute.manager [req-735b472b-2deb-4927-8d7d-6f9e6717bf8e req-7a1daf54-f094-4ba2-9de3-bb42b354e559 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Received event network-vif-plugged-730a9e74-900e-49b2-a5c3-043d6da1a52b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:17:59 compute-0 nova_compute[192810]: 2025-09-30 21:17:59.364 2 DEBUG oslo_concurrency.lockutils [req-735b472b-2deb-4927-8d7d-6f9e6717bf8e req-7a1daf54-f094-4ba2-9de3-bb42b354e559 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:59 compute-0 nova_compute[192810]: 2025-09-30 21:17:59.364 2 DEBUG oslo_concurrency.lockutils [req-735b472b-2deb-4927-8d7d-6f9e6717bf8e req-7a1daf54-f094-4ba2-9de3-bb42b354e559 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:59 compute-0 nova_compute[192810]: 2025-09-30 21:17:59.364 2 DEBUG oslo_concurrency.lockutils [req-735b472b-2deb-4927-8d7d-6f9e6717bf8e req-7a1daf54-f094-4ba2-9de3-bb42b354e559 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:59 compute-0 nova_compute[192810]: 2025-09-30 21:17:59.365 2 DEBUG nova.compute.manager [req-735b472b-2deb-4927-8d7d-6f9e6717bf8e req-7a1daf54-f094-4ba2-9de3-bb42b354e559 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] No waiting events found dispatching network-vif-plugged-730a9e74-900e-49b2-a5c3-043d6da1a52b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:17:59 compute-0 nova_compute[192810]: 2025-09-30 21:17:59.365 2 WARNING nova.compute.manager [req-735b472b-2deb-4927-8d7d-6f9e6717bf8e req-7a1daf54-f094-4ba2-9de3-bb42b354e559 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Received unexpected event network-vif-plugged-730a9e74-900e-49b2-a5c3-043d6da1a52b for instance with vm_state active and task_state migrating.
Sep 30 21:17:59 compute-0 nova_compute[192810]: 2025-09-30 21:17:59.365 2 DEBUG nova.compute.manager [req-735b472b-2deb-4927-8d7d-6f9e6717bf8e req-7a1daf54-f094-4ba2-9de3-bb42b354e559 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Received event network-vif-plugged-730a9e74-900e-49b2-a5c3-043d6da1a52b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:17:59 compute-0 nova_compute[192810]: 2025-09-30 21:17:59.365 2 DEBUG oslo_concurrency.lockutils [req-735b472b-2deb-4927-8d7d-6f9e6717bf8e req-7a1daf54-f094-4ba2-9de3-bb42b354e559 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:59 compute-0 nova_compute[192810]: 2025-09-30 21:17:59.366 2 DEBUG oslo_concurrency.lockutils [req-735b472b-2deb-4927-8d7d-6f9e6717bf8e req-7a1daf54-f094-4ba2-9de3-bb42b354e559 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:59 compute-0 nova_compute[192810]: 2025-09-30 21:17:59.366 2 DEBUG oslo_concurrency.lockutils [req-735b472b-2deb-4927-8d7d-6f9e6717bf8e req-7a1daf54-f094-4ba2-9de3-bb42b354e559 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:59 compute-0 nova_compute[192810]: 2025-09-30 21:17:59.366 2 DEBUG nova.compute.manager [req-735b472b-2deb-4927-8d7d-6f9e6717bf8e req-7a1daf54-f094-4ba2-9de3-bb42b354e559 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] No waiting events found dispatching network-vif-plugged-730a9e74-900e-49b2-a5c3-043d6da1a52b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:17:59 compute-0 nova_compute[192810]: 2025-09-30 21:17:59.366 2 WARNING nova.compute.manager [req-735b472b-2deb-4927-8d7d-6f9e6717bf8e req-7a1daf54-f094-4ba2-9de3-bb42b354e559 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Received unexpected event network-vif-plugged-730a9e74-900e-49b2-a5c3-043d6da1a52b for instance with vm_state active and task_state migrating.
Sep 30 21:17:59 compute-0 nova_compute[192810]: 2025-09-30 21:17:59.446 2 DEBUG nova.compute.manager [req-f9b20858-3c36-47c2-af60-4f9d702d8f17 req-f268615f-807b-41bf-913b-c45deaa87a54 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Received event network-vif-unplugged-730a9e74-900e-49b2-a5c3-043d6da1a52b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:17:59 compute-0 nova_compute[192810]: 2025-09-30 21:17:59.448 2 DEBUG oslo_concurrency.lockutils [req-f9b20858-3c36-47c2-af60-4f9d702d8f17 req-f268615f-807b-41bf-913b-c45deaa87a54 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:17:59 compute-0 nova_compute[192810]: 2025-09-30 21:17:59.448 2 DEBUG oslo_concurrency.lockutils [req-f9b20858-3c36-47c2-af60-4f9d702d8f17 req-f268615f-807b-41bf-913b-c45deaa87a54 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:17:59 compute-0 nova_compute[192810]: 2025-09-30 21:17:59.449 2 DEBUG oslo_concurrency.lockutils [req-f9b20858-3c36-47c2-af60-4f9d702d8f17 req-f268615f-807b-41bf-913b-c45deaa87a54 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:17:59 compute-0 nova_compute[192810]: 2025-09-30 21:17:59.449 2 DEBUG nova.compute.manager [req-f9b20858-3c36-47c2-af60-4f9d702d8f17 req-f268615f-807b-41bf-913b-c45deaa87a54 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] No waiting events found dispatching network-vif-unplugged-730a9e74-900e-49b2-a5c3-043d6da1a52b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:17:59 compute-0 nova_compute[192810]: 2025-09-30 21:17:59.450 2 DEBUG nova.compute.manager [req-f9b20858-3c36-47c2-af60-4f9d702d8f17 req-f268615f-807b-41bf-913b-c45deaa87a54 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Received event network-vif-unplugged-730a9e74-900e-49b2-a5c3-043d6da1a52b for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:18:00 compute-0 nova_compute[192810]: 2025-09-30 21:18:00.102 2 DEBUG nova.network.neutron [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Updating instance_info_cache with network_info: [{"id": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "address": "fa:16:3e:45:9d:62", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb8ecc8e-9c", "ovs_interfaceid": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:18:00 compute-0 nova_compute[192810]: 2025-09-30 21:18:00.136 2 DEBUG oslo_concurrency.lockutils [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Releasing lock "refresh_cache-7f0e9e16-1467-41e5-b5b0-965591aa014c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:18:00 compute-0 nova_compute[192810]: 2025-09-30 21:18:00.188 2 DEBUG oslo_concurrency.lockutils [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:00 compute-0 nova_compute[192810]: 2025-09-30 21:18:00.189 2 DEBUG oslo_concurrency.lockutils [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:00 compute-0 nova_compute[192810]: 2025-09-30 21:18:00.190 2 DEBUG oslo_concurrency.lockutils [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:00 compute-0 nova_compute[192810]: 2025-09-30 21:18:00.197 2 INFO nova.virt.libvirt.driver [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Sep 30 21:18:00 compute-0 virtqemud[192233]: Domain id=6 name='instance-0000000b' uuid=7f0e9e16-1467-41e5-b5b0-965591aa014c is tainted: custom-monitor
Sep 30 21:18:00 compute-0 sshd-session[222470]: Failed password for invalid user juan from 45.81.23.80 port 42572 ssh2
Sep 30 21:18:01 compute-0 nova_compute[192810]: 2025-09-30 21:18:01.210 2 INFO nova.virt.libvirt.driver [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Sep 30 21:18:01 compute-0 sshd-session[222470]: Received disconnect from 45.81.23.80 port 42572:11: Bye Bye [preauth]
Sep 30 21:18:01 compute-0 sshd-session[222470]: Disconnected from invalid user juan 45.81.23.80 port 42572 [preauth]
Sep 30 21:18:01 compute-0 nova_compute[192810]: 2025-09-30 21:18:01.338 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:02 compute-0 nova_compute[192810]: 2025-09-30 21:18:02.222 2 INFO nova.virt.libvirt.driver [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Sep 30 21:18:02 compute-0 nova_compute[192810]: 2025-09-30 21:18:02.231 2 DEBUG nova.compute.manager [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:18:02 compute-0 nova_compute[192810]: 2025-09-30 21:18:02.253 2 DEBUG nova.objects.instance [None req-2b65aabb-7f19-44d6-b780-c0f5cb24a1d0 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Sep 30 21:18:02 compute-0 nova_compute[192810]: 2025-09-30 21:18:02.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:03 compute-0 nova_compute[192810]: 2025-09-30 21:18:03.284 2 DEBUG oslo_concurrency.lockutils [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Acquiring lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:03 compute-0 nova_compute[192810]: 2025-09-30 21:18:03.285 2 DEBUG oslo_concurrency.lockutils [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:03 compute-0 nova_compute[192810]: 2025-09-30 21:18:03.285 2 DEBUG oslo_concurrency.lockutils [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Lock "b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:03 compute-0 nova_compute[192810]: 2025-09-30 21:18:03.307 2 DEBUG oslo_concurrency.lockutils [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:03 compute-0 nova_compute[192810]: 2025-09-30 21:18:03.307 2 DEBUG oslo_concurrency.lockutils [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:03 compute-0 nova_compute[192810]: 2025-09-30 21:18:03.308 2 DEBUG oslo_concurrency.lockutils [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:03 compute-0 nova_compute[192810]: 2025-09-30 21:18:03.308 2 DEBUG nova.compute.resource_tracker [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:18:03 compute-0 nova_compute[192810]: 2025-09-30 21:18:03.403 2 DEBUG oslo_concurrency.processutils [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:18:03 compute-0 nova_compute[192810]: 2025-09-30 21:18:03.488 2 DEBUG oslo_concurrency.processutils [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:18:03 compute-0 nova_compute[192810]: 2025-09-30 21:18:03.489 2 DEBUG oslo_concurrency.processutils [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:18:03 compute-0 nova_compute[192810]: 2025-09-30 21:18:03.575 2 DEBUG oslo_concurrency.processutils [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:18:03 compute-0 nova_compute[192810]: 2025-09-30 21:18:03.749 2 WARNING nova.virt.libvirt.driver [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:18:03 compute-0 nova_compute[192810]: 2025-09-30 21:18:03.751 2 DEBUG nova.compute.resource_tracker [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5579MB free_disk=73.43368530273438GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:18:03 compute-0 nova_compute[192810]: 2025-09-30 21:18:03.752 2 DEBUG oslo_concurrency.lockutils [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:03 compute-0 nova_compute[192810]: 2025-09-30 21:18:03.752 2 DEBUG oslo_concurrency.lockutils [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:03 compute-0 nova_compute[192810]: 2025-09-30 21:18:03.797 2 DEBUG nova.compute.resource_tracker [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Applying migration context for instance 7f0e9e16-1467-41e5-b5b0-965591aa014c as it has an incoming, in-progress migration f0ac7ef9-959c-499f-b60b-f101fa97d660. Migration status is running _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:950
Sep 30 21:18:03 compute-0 nova_compute[192810]: 2025-09-30 21:18:03.797 2 DEBUG nova.objects.instance [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Sep 30 21:18:03 compute-0 nova_compute[192810]: 2025-09-30 21:18:03.798 2 DEBUG nova.compute.resource_tracker [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Migration for instance b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Sep 30 21:18:03 compute-0 nova_compute[192810]: 2025-09-30 21:18:03.817 2 DEBUG nova.compute.resource_tracker [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Sep 30 21:18:03 compute-0 nova_compute[192810]: 2025-09-30 21:18:03.818 2 DEBUG nova.compute.resource_tracker [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Sep 30 21:18:03 compute-0 nova_compute[192810]: 2025-09-30 21:18:03.852 2 DEBUG nova.compute.resource_tracker [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Migration 75c88f2a-0d52-4e9d-8432-8454d77751e5 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Sep 30 21:18:03 compute-0 nova_compute[192810]: 2025-09-30 21:18:03.853 2 DEBUG nova.compute.resource_tracker [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Instance 7f0e9e16-1467-41e5-b5b0-965591aa014c actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:18:03 compute-0 nova_compute[192810]: 2025-09-30 21:18:03.854 2 DEBUG nova.compute.resource_tracker [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:18:03 compute-0 nova_compute[192810]: 2025-09-30 21:18:03.854 2 DEBUG nova.compute.resource_tracker [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:18:03 compute-0 nova_compute[192810]: 2025-09-30 21:18:03.945 2 DEBUG nova.compute.provider_tree [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:18:03 compute-0 nova_compute[192810]: 2025-09-30 21:18:03.962 2 DEBUG nova.scheduler.client.report [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:18:04 compute-0 nova_compute[192810]: 2025-09-30 21:18:04.001 2 DEBUG nova.compute.resource_tracker [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:18:04 compute-0 nova_compute[192810]: 2025-09-30 21:18:04.001 2 DEBUG oslo_concurrency.lockutils [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.249s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:04 compute-0 nova_compute[192810]: 2025-09-30 21:18:04.023 2 INFO nova.compute.manager [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Sep 30 21:18:04 compute-0 nova_compute[192810]: 2025-09-30 21:18:04.131 2 INFO nova.scheduler.client.report [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] Deleted allocation for migration 75c88f2a-0d52-4e9d-8432-8454d77751e5
Sep 30 21:18:04 compute-0 nova_compute[192810]: 2025-09-30 21:18:04.132 2 DEBUG nova.virt.libvirt.driver [None req-3c88a246-5c30-4754-876a-473060bcf796 bfaa63291eb14024904ed99f97a1648e 992688df688949a8bf284f21c6bff45d - - default default] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Sep 30 21:18:05 compute-0 podman[222590]: 2025-09-30 21:18:05.376382205 +0000 UTC m=+0.097268260 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vcs-type=git, architecture=x86_64, io.openshift.tags=minimal rhel9, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 21:18:05 compute-0 podman[222589]: 2025-09-30 21:18:05.375446222 +0000 UTC m=+0.095422294 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:18:06 compute-0 nova_compute[192810]: 2025-09-30 21:18:06.341 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:07 compute-0 nova_compute[192810]: 2025-09-30 21:18:07.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:11 compute-0 nova_compute[192810]: 2025-09-30 21:18:11.022 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267076.0215979, b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:18:11 compute-0 nova_compute[192810]: 2025-09-30 21:18:11.023 2 INFO nova.compute.manager [-] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] VM Stopped (Lifecycle Event)
Sep 30 21:18:11 compute-0 nova_compute[192810]: 2025-09-30 21:18:11.055 2 DEBUG nova.compute.manager [None req-ad3cc23e-cfd4-45ce-ad87-acf9fa6c3278 - - - - - -] [instance: b1ea0fc5-8bf0-4fc0-913c-73e22a8f0b65] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:18:11 compute-0 ovn_controller[94912]: 2025-09-30T21:18:11Z|00084|binding|INFO|Releasing lport b21a7164-770c-4265-ad15-a3e058ec1a56 from this chassis (sb_readonly=0)
Sep 30 21:18:11 compute-0 nova_compute[192810]: 2025-09-30 21:18:11.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:11 compute-0 nova_compute[192810]: 2025-09-30 21:18:11.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:12 compute-0 nova_compute[192810]: 2025-09-30 21:18:12.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:12 compute-0 podman[222634]: 2025-09-30 21:18:12.365543165 +0000 UTC m=+0.092312225 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Sep 30 21:18:16 compute-0 nova_compute[192810]: 2025-09-30 21:18:16.376 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:17 compute-0 nova_compute[192810]: 2025-09-30 21:18:17.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:17 compute-0 podman[222656]: 2025-09-30 21:18:17.33931107 +0000 UTC m=+0.075343694 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:18:17 compute-0 podman[222655]: 2025-09-30 21:18:17.354420654 +0000 UTC m=+0.084534937 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Sep 30 21:18:21 compute-0 nova_compute[192810]: 2025-09-30 21:18:21.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:22 compute-0 nova_compute[192810]: 2025-09-30 21:18:22.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:22 compute-0 podman[222700]: 2025-09-30 21:18:22.364783159 +0000 UTC m=+0.097982898 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:18:24 compute-0 podman[222726]: 2025-09-30 21:18:24.354598424 +0000 UTC m=+0.084080405 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 21:18:26 compute-0 nova_compute[192810]: 2025-09-30 21:18:26.382 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:27 compute-0 nova_compute[192810]: 2025-09-30 21:18:27.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:27 compute-0 nova_compute[192810]: 2025-09-30 21:18:27.674 2 DEBUG nova.virt.libvirt.driver [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Creating tmpfile /var/lib/nova/instances/tmpgbk0jjex to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Sep 30 21:18:27 compute-0 nova_compute[192810]: 2025-09-30 21:18:27.675 2 DEBUG nova.compute.manager [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpgbk0jjex',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Sep 30 21:18:28 compute-0 podman[222746]: 2025-09-30 21:18:28.315359443 +0000 UTC m=+0.054211667 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 21:18:29 compute-0 nova_compute[192810]: 2025-09-30 21:18:29.169 2 DEBUG nova.compute.manager [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpgbk0jjex',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='800f4413-c978-4c4e-97b6-1ea1e45f9f17',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Sep 30 21:18:29 compute-0 nova_compute[192810]: 2025-09-30 21:18:29.197 2 DEBUG oslo_concurrency.lockutils [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Acquiring lock "refresh_cache-800f4413-c978-4c4e-97b6-1ea1e45f9f17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:18:29 compute-0 nova_compute[192810]: 2025-09-30 21:18:29.198 2 DEBUG oslo_concurrency.lockutils [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Acquired lock "refresh_cache-800f4413-c978-4c4e-97b6-1ea1e45f9f17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:18:29 compute-0 nova_compute[192810]: 2025-09-30 21:18:29.198 2 DEBUG nova.network.neutron [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:18:31 compute-0 nova_compute[192810]: 2025-09-30 21:18:31.148 2 DEBUG nova.network.neutron [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Updating instance_info_cache with network_info: [{"id": "c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7", "address": "fa:16:3e:82:04:bd", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6fd9c21-8c", "ovs_interfaceid": "c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:18:31 compute-0 nova_compute[192810]: 2025-09-30 21:18:31.165 2 DEBUG oslo_concurrency.lockutils [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Releasing lock "refresh_cache-800f4413-c978-4c4e-97b6-1ea1e45f9f17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:18:31 compute-0 nova_compute[192810]: 2025-09-30 21:18:31.175 2 DEBUG nova.virt.libvirt.driver [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpgbk0jjex',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='800f4413-c978-4c4e-97b6-1ea1e45f9f17',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Sep 30 21:18:31 compute-0 nova_compute[192810]: 2025-09-30 21:18:31.175 2 DEBUG nova.virt.libvirt.driver [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Creating instance directory: /var/lib/nova/instances/800f4413-c978-4c4e-97b6-1ea1e45f9f17 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Sep 30 21:18:31 compute-0 nova_compute[192810]: 2025-09-30 21:18:31.176 2 DEBUG nova.virt.libvirt.driver [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Creating disk.info with the contents: {'/var/lib/nova/instances/800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk': 'qcow2', '/var/lib/nova/instances/800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Sep 30 21:18:31 compute-0 nova_compute[192810]: 2025-09-30 21:18:31.176 2 DEBUG nova.virt.libvirt.driver [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Sep 30 21:18:31 compute-0 nova_compute[192810]: 2025-09-30 21:18:31.177 2 DEBUG nova.objects.instance [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 800f4413-c978-4c4e-97b6-1ea1e45f9f17 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:18:31 compute-0 nova_compute[192810]: 2025-09-30 21:18:31.199 2 DEBUG oslo_concurrency.processutils [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:18:31 compute-0 nova_compute[192810]: 2025-09-30 21:18:31.277 2 DEBUG oslo_concurrency.processutils [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:18:31 compute-0 nova_compute[192810]: 2025-09-30 21:18:31.278 2 DEBUG oslo_concurrency.lockutils [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:31 compute-0 nova_compute[192810]: 2025-09-30 21:18:31.279 2 DEBUG oslo_concurrency.lockutils [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:31 compute-0 nova_compute[192810]: 2025-09-30 21:18:31.290 2 DEBUG oslo_concurrency.processutils [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:18:31 compute-0 nova_compute[192810]: 2025-09-30 21:18:31.364 2 DEBUG oslo_concurrency.processutils [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:18:31 compute-0 nova_compute[192810]: 2025-09-30 21:18:31.365 2 DEBUG oslo_concurrency.processutils [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:18:31 compute-0 nova_compute[192810]: 2025-09-30 21:18:31.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:31 compute-0 nova_compute[192810]: 2025-09-30 21:18:31.406 2 DEBUG oslo_concurrency.processutils [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:18:31 compute-0 nova_compute[192810]: 2025-09-30 21:18:31.407 2 DEBUG oslo_concurrency.lockutils [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:31 compute-0 nova_compute[192810]: 2025-09-30 21:18:31.408 2 DEBUG oslo_concurrency.processutils [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:18:31 compute-0 nova_compute[192810]: 2025-09-30 21:18:31.468 2 DEBUG oslo_concurrency.processutils [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:18:31 compute-0 nova_compute[192810]: 2025-09-30 21:18:31.471 2 DEBUG nova.virt.disk.api [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Checking if we can resize image /var/lib/nova/instances/800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:18:31 compute-0 nova_compute[192810]: 2025-09-30 21:18:31.471 2 DEBUG oslo_concurrency.processutils [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:18:31 compute-0 nova_compute[192810]: 2025-09-30 21:18:31.538 2 DEBUG oslo_concurrency.processutils [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:18:31 compute-0 nova_compute[192810]: 2025-09-30 21:18:31.540 2 DEBUG nova.virt.disk.api [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Cannot resize image /var/lib/nova/instances/800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:18:31 compute-0 nova_compute[192810]: 2025-09-30 21:18:31.541 2 DEBUG nova.objects.instance [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Lazy-loading 'migration_context' on Instance uuid 800f4413-c978-4c4e-97b6-1ea1e45f9f17 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:18:31 compute-0 nova_compute[192810]: 2025-09-30 21:18:31.560 2 DEBUG oslo_concurrency.processutils [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:18:31 compute-0 nova_compute[192810]: 2025-09-30 21:18:31.603 2 DEBUG oslo_concurrency.processutils [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk.config 485376" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:18:31 compute-0 nova_compute[192810]: 2025-09-30 21:18:31.606 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk.config to /var/lib/nova/instances/800f4413-c978-4c4e-97b6-1ea1e45f9f17 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Sep 30 21:18:31 compute-0 nova_compute[192810]: 2025-09-30 21:18:31.606 2 DEBUG oslo_concurrency.processutils [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk.config /var/lib/nova/instances/800f4413-c978-4c4e-97b6-1ea1e45f9f17 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:18:32 compute-0 nova_compute[192810]: 2025-09-30 21:18:32.173 2 DEBUG oslo_concurrency.processutils [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk.config /var/lib/nova/instances/800f4413-c978-4c4e-97b6-1ea1e45f9f17" returned: 0 in 0.566s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:18:32 compute-0 nova_compute[192810]: 2025-09-30 21:18:32.174 2 DEBUG nova.virt.libvirt.driver [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Sep 30 21:18:32 compute-0 nova_compute[192810]: 2025-09-30 21:18:32.177 2 DEBUG nova.virt.libvirt.vif [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:18:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-943144079',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-943144079',id=15,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:18:23Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='544a33c53701466d8bf7e8ed34f38dcb',ramdisk_id='',reservation_id='r-4e04clu2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-860972404',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-860972404-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:18:23Z,user_data=None,user_id='981e96ea2bc2419d9a1e57d6aed70304',uuid=800f4413-c978-4c4e-97b6-1ea1e45f9f17,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7", "address": "fa:16:3e:82:04:bd", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc6fd9c21-8c", "ovs_interfaceid": "c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:18:32 compute-0 nova_compute[192810]: 2025-09-30 21:18:32.178 2 DEBUG nova.network.os_vif_util [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Converting VIF {"id": "c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7", "address": "fa:16:3e:82:04:bd", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc6fd9c21-8c", "ovs_interfaceid": "c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:18:32 compute-0 nova_compute[192810]: 2025-09-30 21:18:32.179 2 DEBUG nova.network.os_vif_util [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:04:bd,bridge_name='br-int',has_traffic_filtering=True,id=c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7,network=Network(934fff90-5446-41f1-a5ad-d2568cb337b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc6fd9c21-8c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:18:32 compute-0 nova_compute[192810]: 2025-09-30 21:18:32.180 2 DEBUG os_vif [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:04:bd,bridge_name='br-int',has_traffic_filtering=True,id=c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7,network=Network(934fff90-5446-41f1-a5ad-d2568cb337b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc6fd9c21-8c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:18:32 compute-0 nova_compute[192810]: 2025-09-30 21:18:32.182 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:32 compute-0 nova_compute[192810]: 2025-09-30 21:18:32.183 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:18:32 compute-0 nova_compute[192810]: 2025-09-30 21:18:32.183 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:18:32 compute-0 nova_compute[192810]: 2025-09-30 21:18:32.187 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:32 compute-0 nova_compute[192810]: 2025-09-30 21:18:32.187 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc6fd9c21-8c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:18:32 compute-0 nova_compute[192810]: 2025-09-30 21:18:32.188 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc6fd9c21-8c, col_values=(('external_ids', {'iface-id': 'c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:82:04:bd', 'vm-uuid': '800f4413-c978-4c4e-97b6-1ea1e45f9f17'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:18:32 compute-0 NetworkManager[51733]: <info>  [1759267112.1921] manager: (tapc6fd9c21-8c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Sep 30 21:18:32 compute-0 nova_compute[192810]: 2025-09-30 21:18:32.191 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:32 compute-0 nova_compute[192810]: 2025-09-30 21:18:32.195 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:18:32 compute-0 nova_compute[192810]: 2025-09-30 21:18:32.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:32 compute-0 nova_compute[192810]: 2025-09-30 21:18:32.203 2 INFO os_vif [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:04:bd,bridge_name='br-int',has_traffic_filtering=True,id=c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7,network=Network(934fff90-5446-41f1-a5ad-d2568cb337b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc6fd9c21-8c')
Sep 30 21:18:32 compute-0 nova_compute[192810]: 2025-09-30 21:18:32.203 2 DEBUG nova.virt.libvirt.driver [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Sep 30 21:18:32 compute-0 nova_compute[192810]: 2025-09-30 21:18:32.204 2 DEBUG nova.compute.manager [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpgbk0jjex',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='800f4413-c978-4c4e-97b6-1ea1e45f9f17',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Sep 30 21:18:33 compute-0 nova_compute[192810]: 2025-09-30 21:18:33.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:18:33 compute-0 nova_compute[192810]: 2025-09-30 21:18:33.998 2 DEBUG nova.network.neutron [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Port c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Sep 30 21:18:34 compute-0 nova_compute[192810]: 2025-09-30 21:18:34.015 2 DEBUG nova.compute.manager [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpgbk0jjex',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='800f4413-c978-4c4e-97b6-1ea1e45f9f17',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Sep 30 21:18:34 compute-0 kernel: tapc6fd9c21-8c: entered promiscuous mode
Sep 30 21:18:34 compute-0 NetworkManager[51733]: <info>  [1759267114.2925] manager: (tapc6fd9c21-8c): new Tun device (/org/freedesktop/NetworkManager/Devices/48)
Sep 30 21:18:34 compute-0 ovn_controller[94912]: 2025-09-30T21:18:34Z|00085|binding|INFO|Claiming lport c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 for this additional chassis.
Sep 30 21:18:34 compute-0 ovn_controller[94912]: 2025-09-30T21:18:34Z|00086|binding|INFO|c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7: Claiming fa:16:3e:82:04:bd 10.100.0.4
Sep 30 21:18:34 compute-0 ovn_controller[94912]: 2025-09-30T21:18:34Z|00087|binding|INFO|Claiming lport 7ce813d5-5e22-4733-a15e-f9d5223072eb for this additional chassis.
Sep 30 21:18:34 compute-0 ovn_controller[94912]: 2025-09-30T21:18:34Z|00088|binding|INFO|7ce813d5-5e22-4733-a15e-f9d5223072eb: Claiming fa:16:3e:dd:ce:7d 19.80.0.205
Sep 30 21:18:34 compute-0 nova_compute[192810]: 2025-09-30 21:18:34.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:34 compute-0 systemd-udevd[222801]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:18:34 compute-0 systemd-machined[152794]: New machine qemu-7-instance-0000000f.
Sep 30 21:18:34 compute-0 NetworkManager[51733]: <info>  [1759267114.4041] device (tapc6fd9c21-8c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:18:34 compute-0 ovn_controller[94912]: 2025-09-30T21:18:34Z|00089|binding|INFO|Setting lport c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 ovn-installed in OVS
Sep 30 21:18:34 compute-0 nova_compute[192810]: 2025-09-30 21:18:34.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:34 compute-0 NetworkManager[51733]: <info>  [1759267114.4048] device (tapc6fd9c21-8c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:18:34 compute-0 nova_compute[192810]: 2025-09-30 21:18:34.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:34 compute-0 systemd[1]: Started Virtual Machine qemu-7-instance-0000000f.
Sep 30 21:18:34 compute-0 nova_compute[192810]: 2025-09-30 21:18:34.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:18:35 compute-0 nova_compute[192810]: 2025-09-30 21:18:35.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:18:35 compute-0 nova_compute[192810]: 2025-09-30 21:18:35.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:18:36 compute-0 nova_compute[192810]: 2025-09-30 21:18:36.094 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267116.0942779, 800f4413-c978-4c4e-97b6-1ea1e45f9f17 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:18:36 compute-0 nova_compute[192810]: 2025-09-30 21:18:36.095 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] VM Started (Lifecycle Event)
Sep 30 21:18:36 compute-0 nova_compute[192810]: 2025-09-30 21:18:36.130 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:18:36 compute-0 podman[222827]: 2025-09-30 21:18:36.334310016 +0000 UTC m=+0.065162935 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:18:36 compute-0 podman[222829]: 2025-09-30 21:18:36.342123694 +0000 UTC m=+0.072354727 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, maintainer=Red Hat, Inc., architecture=x86_64, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, config_id=edpm, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible)
Sep 30 21:18:36 compute-0 nova_compute[192810]: 2025-09-30 21:18:36.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:36 compute-0 nova_compute[192810]: 2025-09-30 21:18:36.784 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:18:36 compute-0 nova_compute[192810]: 2025-09-30 21:18:36.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:18:36 compute-0 nova_compute[192810]: 2025-09-30 21:18:36.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:18:36 compute-0 nova_compute[192810]: 2025-09-30 21:18:36.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:18:36 compute-0 nova_compute[192810]: 2025-09-30 21:18:36.815 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:36 compute-0 nova_compute[192810]: 2025-09-30 21:18:36.815 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:36 compute-0 nova_compute[192810]: 2025-09-30 21:18:36.816 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:36 compute-0 nova_compute[192810]: 2025-09-30 21:18:36.816 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:18:36 compute-0 nova_compute[192810]: 2025-09-30 21:18:36.875 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267116.8751469, 800f4413-c978-4c4e-97b6-1ea1e45f9f17 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:18:36 compute-0 nova_compute[192810]: 2025-09-30 21:18:36.876 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] VM Resumed (Lifecycle Event)
Sep 30 21:18:36 compute-0 nova_compute[192810]: 2025-09-30 21:18:36.903 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:18:36 compute-0 nova_compute[192810]: 2025-09-30 21:18:36.908 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:18:36 compute-0 nova_compute[192810]: 2025-09-30 21:18:36.931 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:18:36 compute-0 nova_compute[192810]: 2025-09-30 21:18:36.951 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Sep 30 21:18:36 compute-0 nova_compute[192810]: 2025-09-30 21:18:36.972 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:18:36 compute-0 nova_compute[192810]: 2025-09-30 21:18:36.973 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:18:37 compute-0 nova_compute[192810]: 2025-09-30 21:18:37.031 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:18:37 compute-0 nova_compute[192810]: 2025-09-30 21:18:37.038 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:18:37 compute-0 nova_compute[192810]: 2025-09-30 21:18:37.096 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:18:37 compute-0 nova_compute[192810]: 2025-09-30 21:18:37.097 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:18:37 compute-0 nova_compute[192810]: 2025-09-30 21:18:37.153 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:18:37 compute-0 nova_compute[192810]: 2025-09-30 21:18:37.191 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:37 compute-0 nova_compute[192810]: 2025-09-30 21:18:37.299 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:18:37 compute-0 nova_compute[192810]: 2025-09-30 21:18:37.301 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5445MB free_disk=73.40480041503906GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:18:37 compute-0 nova_compute[192810]: 2025-09-30 21:18:37.301 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:37 compute-0 nova_compute[192810]: 2025-09-30 21:18:37.301 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:37 compute-0 nova_compute[192810]: 2025-09-30 21:18:37.348 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Migration for instance 800f4413-c978-4c4e-97b6-1ea1e45f9f17 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Sep 30 21:18:37 compute-0 nova_compute[192810]: 2025-09-30 21:18:37.368 2 INFO nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Updating resource usage from migration 2385dcbc-a0c6-42bb-b744-9f34ef959fad
Sep 30 21:18:37 compute-0 nova_compute[192810]: 2025-09-30 21:18:37.369 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Starting to track incoming migration 2385dcbc-a0c6-42bb-b744-9f34ef959fad with flavor afe5c12d-500a-499b-9438-9e9c37698acc _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Sep 30 21:18:37 compute-0 nova_compute[192810]: 2025-09-30 21:18:37.414 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Instance 7f0e9e16-1467-41e5-b5b0-965591aa014c actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:18:37 compute-0 nova_compute[192810]: 2025-09-30 21:18:37.440 2 WARNING nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Instance 800f4413-c978-4c4e-97b6-1ea1e45f9f17 has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Sep 30 21:18:37 compute-0 nova_compute[192810]: 2025-09-30 21:18:37.440 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:18:37 compute-0 nova_compute[192810]: 2025-09-30 21:18:37.440 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:18:37 compute-0 nova_compute[192810]: 2025-09-30 21:18:37.498 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:18:37 compute-0 nova_compute[192810]: 2025-09-30 21:18:37.516 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:18:37 compute-0 nova_compute[192810]: 2025-09-30 21:18:37.545 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:18:37 compute-0 nova_compute[192810]: 2025-09-30 21:18:37.545 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.244s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:38 compute-0 ovn_controller[94912]: 2025-09-30T21:18:38Z|00090|binding|INFO|Claiming lport c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 for this chassis.
Sep 30 21:18:38 compute-0 ovn_controller[94912]: 2025-09-30T21:18:38Z|00091|binding|INFO|c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7: Claiming fa:16:3e:82:04:bd 10.100.0.4
Sep 30 21:18:38 compute-0 ovn_controller[94912]: 2025-09-30T21:18:38Z|00092|binding|INFO|Claiming lport 7ce813d5-5e22-4733-a15e-f9d5223072eb for this chassis.
Sep 30 21:18:38 compute-0 ovn_controller[94912]: 2025-09-30T21:18:38Z|00093|binding|INFO|7ce813d5-5e22-4733-a15e-f9d5223072eb: Claiming fa:16:3e:dd:ce:7d 19.80.0.205
Sep 30 21:18:38 compute-0 ovn_controller[94912]: 2025-09-30T21:18:38Z|00094|binding|INFO|Setting lport c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 up in Southbound
Sep 30 21:18:38 compute-0 ovn_controller[94912]: 2025-09-30T21:18:38Z|00095|binding|INFO|Setting lport 7ce813d5-5e22-4733-a15e-f9d5223072eb up in Southbound
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:38.222 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:ce:7d 19.80.0.205'], port_security=['fa:16:3e:dd:ce:7d 19.80.0.205'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': ''}, parent_port=['c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-521832462', 'neutron:cidrs': '19.80.0.205/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-383de3b7-a202-49cd-b129-b065ac294878', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-521832462', 'neutron:project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ae5806dc-3fbd-4366-84ab-b061f2375093', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=d890b95b-932f-4c74-902c-ed705814144b, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=7ce813d5-5e22-4733-a15e-f9d5223072eb) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:38.224 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:04:bd 10.100.0.4'], port_security=['fa:16:3e:82:04:bd 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-280178662', 'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-934fff90-5446-41f1-a5ad-d2568cb337b1', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-280178662', 'neutron:project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'neutron:revision_number': '11', 'neutron:security_group_ids': 'ae5806dc-3fbd-4366-84ab-b061f2375093', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c5644fe7-3662-476d-bcfe-5bc86ceef791, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:38.226 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 7ce813d5-5e22-4733-a15e-f9d5223072eb in datapath 383de3b7-a202-49cd-b129-b065ac294878 bound to our chassis
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:38.227 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 383de3b7-a202-49cd-b129-b065ac294878
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:38.246 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[d2387bb6-e132-4faa-af84-7d95c6733483]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:38.247 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap383de3b7-a1 in ovnmeta-383de3b7-a202-49cd-b129-b065ac294878 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:38.251 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap383de3b7-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:38.251 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[ef77a8ca-9867-4257-bba7-2ae5a06dc321]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:38.252 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[abd3b73f-fee5-450c-8437-d8b422e7f27d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:38.267 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[3aaab888-2481-44fa-8aea-11c983c88f56]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:38.292 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[6886679e-ab29-4fa3-b88c-b91590f79395]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:38.324 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[f11c2824-bc7e-4f61-823d-7032142bc629]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:38 compute-0 NetworkManager[51733]: <info>  [1759267118.3332] manager: (tap383de3b7-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/49)
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:38.332 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a2402060-25ea-443f-be4c-6f8cefdd6cc2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:38 compute-0 systemd-udevd[222897]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:38.367 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[f8ceeabf-bb56-456d-b1b4-5753f64e86aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:38.370 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[04565f81-f08c-4ecf-a92c-1d8d3df94c88]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:38 compute-0 NetworkManager[51733]: <info>  [1759267118.3996] device (tap383de3b7-a0): carrier: link connected
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:38.405 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[baaaed37-55ae-47fa-bc1b-33f1e6c646e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:38.425 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a771e57c-3ea6-44d6-9a65-12361a4559c8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap383de3b7-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:be:58:ae'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 381402, 'reachable_time': 23043, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222916, 'error': None, 'target': 'ovnmeta-383de3b7-a202-49cd-b129-b065ac294878', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:38.448 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[9b7bcbd7-4c64-427c-a365-0f4841f67708]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febe:58ae'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 381402, 'tstamp': 381402}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222917, 'error': None, 'target': 'ovnmeta-383de3b7-a202-49cd-b129-b065ac294878', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:38.472 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[3fea35a1-4c10-4d70-b018-0c123aee931e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap383de3b7-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:be:58:ae'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 381402, 'reachable_time': 23043, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222918, 'error': None, 'target': 'ovnmeta-383de3b7-a202-49cd-b129-b065ac294878', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:38.506 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[9837c505-6366-41cd-b93a-a66ba914f437]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:38.584 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[6f52dca2-e211-44b1-930e-67c29d728d6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:38.586 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap383de3b7-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:38.587 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:38.587 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap383de3b7-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:18:38 compute-0 kernel: tap383de3b7-a0: entered promiscuous mode
Sep 30 21:18:38 compute-0 NetworkManager[51733]: <info>  [1759267118.5908] manager: (tap383de3b7-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/50)
Sep 30 21:18:38 compute-0 nova_compute[192810]: 2025-09-30 21:18:38.590 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:38.594 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap383de3b7-a0, col_values=(('external_ids', {'iface-id': 'd9318b69-dfe5-4e14-b762-51aa441e80cd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:18:38 compute-0 ovn_controller[94912]: 2025-09-30T21:18:38Z|00096|binding|INFO|Releasing lport d9318b69-dfe5-4e14-b762-51aa441e80cd from this chassis (sb_readonly=0)
Sep 30 21:18:38 compute-0 nova_compute[192810]: 2025-09-30 21:18:38.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:38.598 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/383de3b7-a202-49cd-b129-b065ac294878.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/383de3b7-a202-49cd-b129-b065ac294878.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:38.599 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[4ccda7a8-bce9-4442-adfa-0b343c0824e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:38.601 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-383de3b7-a202-49cd-b129-b065ac294878
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/383de3b7-a202-49cd-b129-b065ac294878.pid.haproxy
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID 383de3b7-a202-49cd-b129-b065ac294878
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:38.602 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-383de3b7-a202-49cd-b129-b065ac294878', 'env', 'PROCESS_TAG=haproxy-383de3b7-a202-49cd-b129-b065ac294878', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/383de3b7-a202-49cd-b129-b065ac294878.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:18:38 compute-0 nova_compute[192810]: 2025-09-30 21:18:38.606 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:38 compute-0 nova_compute[192810]: 2025-09-30 21:18:38.687 2 INFO nova.compute.manager [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Post operation of migration started
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:38.722 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:38.723 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:38.724 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:38 compute-0 podman[222951]: 2025-09-30 21:18:38.992146519 +0000 UTC m=+0.048729218 container create 39a7ab9f04aa478977f0f9b8f45c264365703cad4b43274110359fcc1ffebe11 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-383de3b7-a202-49cd-b129-b065ac294878, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923)
Sep 30 21:18:39 compute-0 systemd[1]: Started libpod-conmon-39a7ab9f04aa478977f0f9b8f45c264365703cad4b43274110359fcc1ffebe11.scope.
Sep 30 21:18:39 compute-0 podman[222951]: 2025-09-30 21:18:38.968751515 +0000 UTC m=+0.025334244 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:18:39 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:18:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/524b9ff0d2ebef7787c8e47a5caea544112fbe1d200a5a48fb1923136ecbda60/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:18:39 compute-0 podman[222951]: 2025-09-30 21:18:39.088100395 +0000 UTC m=+0.144683104 container init 39a7ab9f04aa478977f0f9b8f45c264365703cad4b43274110359fcc1ffebe11 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-383de3b7-a202-49cd-b129-b065ac294878, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Sep 30 21:18:39 compute-0 podman[222951]: 2025-09-30 21:18:39.093167144 +0000 UTC m=+0.149749853 container start 39a7ab9f04aa478977f0f9b8f45c264365703cad4b43274110359fcc1ffebe11 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-383de3b7-a202-49cd-b129-b065ac294878, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 21:18:39 compute-0 neutron-haproxy-ovnmeta-383de3b7-a202-49cd-b129-b065ac294878[222966]: [NOTICE]   (222970) : New worker (222972) forked
Sep 30 21:18:39 compute-0 neutron-haproxy-ovnmeta-383de3b7-a202-49cd-b129-b065ac294878[222966]: [NOTICE]   (222970) : Loading success.
Sep 30 21:18:39 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:39.160 103867 INFO neutron.agent.ovn.metadata.agent [-] Port c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 in datapath 934fff90-5446-41f1-a5ad-d2568cb337b1 unbound from our chassis
Sep 30 21:18:39 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:39.161 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 934fff90-5446-41f1-a5ad-d2568cb337b1
Sep 30 21:18:39 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:39.180 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[28f878f4-e3a3-4620-82cb-3d1715be6053]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:39 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:39.218 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[01401d6d-8e4c-435a-b5b1-c8bcb4b813b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:39 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:39.225 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[9ebba40e-6fb1-48ad-85ef-457ef97ac1f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:39 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:39.268 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[9287c2ec-5636-40cf-b3e6-9c76fa89dea7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:39 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:39.297 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[5acee6fd-58ab-4b60-a2dc-d07c75e66203]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap934fff90-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:18:37:65'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 23, 'tx_packets': 5, 'rx_bytes': 1462, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 23, 'tx_packets': 5, 'rx_bytes': 1462, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 377390, 'reachable_time': 27389, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222986, 'error': None, 'target': 'ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:39 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:39.324 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[70e902b1-e808-4542-b4b9-2b102a748d81]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap934fff90-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 377406, 'tstamp': 377406}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222987, 'error': None, 'target': 'ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap934fff90-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 377410, 'tstamp': 377410}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222987, 'error': None, 'target': 'ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:39 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:39.327 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap934fff90-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:18:39 compute-0 nova_compute[192810]: 2025-09-30 21:18:39.373 2 DEBUG oslo_concurrency.lockutils [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Acquiring lock "refresh_cache-800f4413-c978-4c4e-97b6-1ea1e45f9f17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:18:39 compute-0 nova_compute[192810]: 2025-09-30 21:18:39.373 2 DEBUG oslo_concurrency.lockutils [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Acquired lock "refresh_cache-800f4413-c978-4c4e-97b6-1ea1e45f9f17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:18:39 compute-0 nova_compute[192810]: 2025-09-30 21:18:39.373 2 DEBUG nova.network.neutron [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:18:39 compute-0 nova_compute[192810]: 2025-09-30 21:18:39.375 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:39 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:39.375 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap934fff90-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:18:39 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:39.375 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:18:39 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:39.376 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap934fff90-50, col_values=(('external_ids', {'iface-id': 'b21a7164-770c-4265-ad15-a3e058ec1a56'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:18:39 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:39.376 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:18:39 compute-0 nova_compute[192810]: 2025-09-30 21:18:39.546 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:18:39 compute-0 nova_compute[192810]: 2025-09-30 21:18:39.546 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:18:39 compute-0 nova_compute[192810]: 2025-09-30 21:18:39.731 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "refresh_cache-7f0e9e16-1467-41e5-b5b0-965591aa014c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:18:39 compute-0 nova_compute[192810]: 2025-09-30 21:18:39.731 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquired lock "refresh_cache-7f0e9e16-1467-41e5-b5b0-965591aa014c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:18:39 compute-0 nova_compute[192810]: 2025-09-30 21:18:39.731 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Sep 30 21:18:41 compute-0 nova_compute[192810]: 2025-09-30 21:18:41.199 2 DEBUG nova.network.neutron [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Updating instance_info_cache with network_info: [{"id": "c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7", "address": "fa:16:3e:82:04:bd", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6fd9c21-8c", "ovs_interfaceid": "c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:18:41 compute-0 nova_compute[192810]: 2025-09-30 21:18:41.223 2 DEBUG oslo_concurrency.lockutils [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Releasing lock "refresh_cache-800f4413-c978-4c4e-97b6-1ea1e45f9f17" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:18:41 compute-0 nova_compute[192810]: 2025-09-30 21:18:41.250 2 DEBUG oslo_concurrency.lockutils [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:41 compute-0 nova_compute[192810]: 2025-09-30 21:18:41.251 2 DEBUG oslo_concurrency.lockutils [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:41 compute-0 nova_compute[192810]: 2025-09-30 21:18:41.251 2 DEBUG oslo_concurrency.lockutils [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:41 compute-0 nova_compute[192810]: 2025-09-30 21:18:41.255 2 INFO nova.virt.libvirt.driver [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Sep 30 21:18:41 compute-0 virtqemud[192233]: Domain id=7 name='instance-0000000f' uuid=800f4413-c978-4c4e-97b6-1ea1e45f9f17 is tainted: custom-monitor
Sep 30 21:18:41 compute-0 nova_compute[192810]: 2025-09-30 21:18:41.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:41 compute-0 nova_compute[192810]: 2025-09-30 21:18:41.765 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Updating instance_info_cache with network_info: [{"id": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "address": "fa:16:3e:45:9d:62", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb8ecc8e-9c", "ovs_interfaceid": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:18:41 compute-0 nova_compute[192810]: 2025-09-30 21:18:41.784 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Releasing lock "refresh_cache-7f0e9e16-1467-41e5-b5b0-965591aa014c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:18:41 compute-0 nova_compute[192810]: 2025-09-30 21:18:41.784 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Sep 30 21:18:41 compute-0 nova_compute[192810]: 2025-09-30 21:18:41.785 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:18:42 compute-0 nova_compute[192810]: 2025-09-30 21:18:42.195 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:42 compute-0 nova_compute[192810]: 2025-09-30 21:18:42.265 2 INFO nova.virt.libvirt.driver [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Sep 30 21:18:43 compute-0 nova_compute[192810]: 2025-09-30 21:18:43.272 2 INFO nova.virt.libvirt.driver [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Sep 30 21:18:43 compute-0 nova_compute[192810]: 2025-09-30 21:18:43.278 2 DEBUG nova.compute.manager [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:18:43 compute-0 nova_compute[192810]: 2025-09-30 21:18:43.310 2 DEBUG nova.objects.instance [None req-b985a38b-865d-45f3-bede-da8dc8d5964c 057daf975994458c9355cedaa235e785 8eab5a769eb24f94a002638bfc9f8925 - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Sep 30 21:18:43 compute-0 podman[222988]: 2025-09-30 21:18:43.362911427 +0000 UTC m=+0.097156657 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=iscsid, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.907 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '7f0e9e16-1467-41e5-b5b0-965591aa014c', 'name': 'tempest-LiveAutoBlockMigrationV225Test-server-772926731', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000000b', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '544a33c53701466d8bf7e8ed34f38dcb', 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'hostId': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.911 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17', 'name': 'tempest-LiveAutoBlockMigrationV225Test-server-943144079', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000000f', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '544a33c53701466d8bf7e8ed34f38dcb', 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'hostId': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.911 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.911 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.911 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-772926731>, <NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-943144079>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-772926731>, <NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-943144079>]
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.912 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.932 12 DEBUG ceilometer.compute.pollsters [-] 7f0e9e16-1467-41e5-b5b0-965591aa014c/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.933 12 DEBUG ceilometer.compute.pollsters [-] 7f0e9e16-1467-41e5-b5b0-965591aa014c/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.955 12 DEBUG ceilometer.compute.pollsters [-] 800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk.device.read.requests volume: 2 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.956 12 DEBUG ceilometer.compute.pollsters [-] 800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '814fabcb-5afa-4038-8f22-63a0c69662f7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c-vda', 'timestamp': '2025-09-30T21:18:43.912133', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-772926731', 'name': 'instance-0000000b', 'instance_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0b601c36-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.599629842, 'message_signature': '7ff00d2c5bc1ed90647ebeb6e6d6e764231bec77667cc72bd2b39d91b432adeb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 
'project_name': None, 'resource_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c-sda', 'timestamp': '2025-09-30T21:18:43.912133', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-772926731', 'name': 'instance-0000000b', 'instance_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0b602cb2-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.599629842, 'message_signature': '66a082415d6ed91f340c547da6fe17e7658601571c0987bf1286c30d4852591d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 2, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17-vda', 'timestamp': '2025-09-30T21:18:43.912133', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-943144079', 'name': 'instance-0000000f', 'instance_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0b6398d4-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.621185899, 'message_signature': '906e5f88ab2c54c103a4c1182c64c9e5104ed899d4f153622d087940b0803950'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17-sda', 'timestamp': '2025-09-30T21:18:43.912133', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-943144079', 'name': 'instance-0000000f', 'instance_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0b63a6f8-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.621185899, 'message_signature': '54c24251e1fb0e24b3fd8922e1251e44bafc48752dd57c5a5fabbb26b6c5b1eb'}]}, 'timestamp': '2025-09-30 21:18:43.956458', '_unique_id': 'c98cf57816ca404d911e863cfc3b66b8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.957 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.958 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.962 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 7f0e9e16-1467-41e5-b5b0-965591aa014c / tapbb8ecc8e-9c inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.962 12 DEBUG ceilometer.compute.pollsters [-] 7f0e9e16-1467-41e5-b5b0-965591aa014c/network.incoming.bytes volume: 1678 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.964 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 800f4413-c978-4c4e-97b6-1ea1e45f9f17 / tapc6fd9c21-8c inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.965 12 DEBUG ceilometer.compute.pollsters [-] 800f4413-c978-4c4e-97b6-1ea1e45f9f17/network.incoming.bytes volume: 1010 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd396bb0b-c4ac-4a58-9ce1-a4da65957f75', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1678, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': 'instance-0000000b-7f0e9e16-1467-41e5-b5b0-965591aa014c-tapbb8ecc8e-9c', 'timestamp': '2025-09-30T21:18:43.958932', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-772926731', 'name': 'tapbb8ecc8e-9c', 'instance_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:9d:62', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbb8ecc8e-9c'}, 'message_id': '0b64abe8-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.64645816, 'message_signature': '9104f3c223fbe3849333546df8f2bb8c7be58f193612186bd0f1102fb71c723d'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1010, 'user_id': 
'981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': 'instance-0000000f-800f4413-c978-4c4e-97b6-1ea1e45f9f17-tapc6fd9c21-8c', 'timestamp': '2025-09-30T21:18:43.958932', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-943144079', 'name': 'tapc6fd9c21-8c', 'instance_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:04:bd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc6fd9c21-8c'}, 'message_id': '0b650462-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.650689148, 'message_signature': '1f45bed0a1ef16b51f84b55c9493d1b6a44a3ef6d9bb6dc24af4c1f4c6d805b0'}]}, 'timestamp': '2025-09-30 21:18:43.965381', '_unique_id': '990203e7096048c5a193a19459d07cdf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.966 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.967 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.967 12 DEBUG ceilometer.compute.pollsters [-] 7f0e9e16-1467-41e5-b5b0-965591aa014c/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.967 12 DEBUG ceilometer.compute.pollsters [-] 800f4413-c978-4c4e-97b6-1ea1e45f9f17/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4d964588-cb96-4968-8f3b-7c00db7c6c09', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': 'instance-0000000b-7f0e9e16-1467-41e5-b5b0-965591aa014c-tapbb8ecc8e-9c', 'timestamp': '2025-09-30T21:18:43.967191', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-772926731', 'name': 'tapbb8ecc8e-9c', 'instance_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:9d:62', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbb8ecc8e-9c'}, 'message_id': '0b655548-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.64645816, 'message_signature': '31457efa2a753fccab281d6239917e7a6722f257830ce3dcd686bfb5de217502'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': 'instance-0000000f-800f4413-c978-4c4e-97b6-1ea1e45f9f17-tapc6fd9c21-8c', 'timestamp': '2025-09-30T21:18:43.967191', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-943144079', 'name': 'tapc6fd9c21-8c', 'instance_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:04:bd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc6fd9c21-8c'}, 'message_id': '0b655d90-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.650689148, 'message_signature': '42a4262a3f124ed98ff693a96e1e2034342461dc56f320d8af9453b1f003308e'}]}, 'timestamp': '2025-09-30 21:18:43.967647', '_unique_id': 'a09dd994cd714c06930d67faba38d9c6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 DEBUG ceilometer.compute.pollsters [-] 7f0e9e16-1467-41e5-b5b0-965591aa014c/network.outgoing.packets volume: 80 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.968 12 DEBUG ceilometer.compute.pollsters [-] 800f4413-c978-4c4e-97b6-1ea1e45f9f17/network.outgoing.packets volume: 74 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5280b6f1-a8bb-4d51-973c-3903d73f3a7c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 80, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': 'instance-0000000b-7f0e9e16-1467-41e5-b5b0-965591aa014c-tapbb8ecc8e-9c', 'timestamp': '2025-09-30T21:18:43.968729', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-772926731', 'name': 'tapbb8ecc8e-9c', 'instance_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:9d:62', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbb8ecc8e-9c'}, 'message_id': '0b6590f8-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.64645816, 'message_signature': '0cdc7e83a73371594ea1ad32b4056e74592ccb5144a9c5101d5389317e9bf7eb'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 74, 
'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': 'instance-0000000f-800f4413-c978-4c4e-97b6-1ea1e45f9f17-tapc6fd9c21-8c', 'timestamp': '2025-09-30T21:18:43.968729', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-943144079', 'name': 'tapc6fd9c21-8c', 'instance_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:04:bd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc6fd9c21-8c'}, 'message_id': '0b65992c-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.650689148, 'message_signature': '0675819cc1679e125b38ba779d1a6a13cd175e7e42779f54bafd299d3bfd1f32'}]}, 'timestamp': '2025-09-30 21:18:43.969190', '_unique_id': '5cbf68bf828e4e529cec91cf934598c6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.969 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.970 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.970 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.970 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-772926731>, <NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-943144079>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-772926731>, <NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-943144079>]
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.970 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Sep 30 21:18:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:43.984 12 DEBUG ceilometer.compute.pollsters [-] 7f0e9e16-1467-41e5-b5b0-965591aa014c/cpu volume: 280000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.004 12 DEBUG ceilometer.compute.pollsters [-] 800f4413-c978-4c4e-97b6-1ea1e45f9f17/cpu volume: 330000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '848f3dcb-868d-4d24-9969-97ffd94509fa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 280000000, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c', 'timestamp': '2025-09-30T21:18:43.970599', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-772926731', 'name': 'instance-0000000b', 'instance_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '0b67ea60-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.671405824, 'message_signature': 'e9a2100b765235ad5ffa296a7bc54bdbaf45120d04405be142bed71c3fb6a240'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 330000000, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': 
'800f4413-c978-4c4e-97b6-1ea1e45f9f17', 'timestamp': '2025-09-30T21:18:43.970599', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-943144079', 'name': 'instance-0000000f', 'instance_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '0b6b14ce-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.692174251, 'message_signature': '135b190c372b231b6e803e645a685149bfa558d50f30814130bafb83415ff355'}]}, 'timestamp': '2025-09-30 21:18:44.005136', '_unique_id': '9f79245795d440c1a4281a2b57fc436e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 DEBUG ceilometer.compute.pollsters [-] 7f0e9e16-1467-41e5-b5b0-965591aa014c/network.incoming.packets volume: 27 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.006 12 DEBUG ceilometer.compute.pollsters [-] 800f4413-c978-4c4e-97b6-1ea1e45f9f17/network.incoming.packets volume: 11 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b5873dc0-c5c1-4be7-b06c-8a4468ab024f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 27, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': 'instance-0000000b-7f0e9e16-1467-41e5-b5b0-965591aa014c-tapbb8ecc8e-9c', 'timestamp': '2025-09-30T21:18:44.006745', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-772926731', 'name': 'tapbb8ecc8e-9c', 'instance_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:9d:62', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbb8ecc8e-9c'}, 'message_id': '0b6b5e5c-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.64645816, 'message_signature': '6e276ba759d23ee941af5f60593bea38b1512fd7022a935a11ed2baec96bd966'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 11, 
'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': 'instance-0000000f-800f4413-c978-4c4e-97b6-1ea1e45f9f17-tapc6fd9c21-8c', 'timestamp': '2025-09-30T21:18:44.006745', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-943144079', 'name': 'tapc6fd9c21-8c', 'instance_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:04:bd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc6fd9c21-8c'}, 'message_id': '0b6b667c-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.650689148, 'message_signature': '952c1eaf0bedbb671e949e3f7023e0faa72e849f681dc3b71ed451e42ab5f29c'}]}, 'timestamp': '2025-09-30 21:18:44.007179', '_unique_id': '9dd59ef17e644a9b9ead8e734bc3c9fa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.007 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.008 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.008 12 DEBUG ceilometer.compute.pollsters [-] 7f0e9e16-1467-41e5-b5b0-965591aa014c/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.008 12 DEBUG ceilometer.compute.pollsters [-] 800f4413-c978-4c4e-97b6-1ea1e45f9f17/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '13ff1d5b-ba53-43aa-af7a-1bfbd2b7e64a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': 'instance-0000000b-7f0e9e16-1467-41e5-b5b0-965591aa014c-tapbb8ecc8e-9c', 'timestamp': '2025-09-30T21:18:44.008221', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-772926731', 'name': 'tapbb8ecc8e-9c', 'instance_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:9d:62', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbb8ecc8e-9c'}, 'message_id': '0b6b978c-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.64645816, 'message_signature': '92db5f7e5541c653f5cb9bc837b23b57c45628f6e959f407065da6874acb209b'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': 'instance-0000000f-800f4413-c978-4c4e-97b6-1ea1e45f9f17-tapc6fd9c21-8c', 'timestamp': '2025-09-30T21:18:44.008221', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-943144079', 'name': 'tapc6fd9c21-8c', 'instance_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:04:bd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc6fd9c21-8c'}, 'message_id': '0b6ba0a6-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.650689148, 'message_signature': '6fae8011cfee8c5155478d743b4e32ad9c0377b912fb8661bc123b008a474b2c'}]}, 'timestamp': '2025-09-30 21:18:44.008668', '_unique_id': 'd0f6d31fddb9441bb71c17447687c521'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.009 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 DEBUG ceilometer.compute.pollsters [-] 7f0e9e16-1467-41e5-b5b0-965591aa014c/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 DEBUG ceilometer.compute.pollsters [-] 800f4413-c978-4c4e-97b6-1ea1e45f9f17/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '03a7bb79-be7c-40f0-bf07-f4f81ffe3baf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': 'instance-0000000b-7f0e9e16-1467-41e5-b5b0-965591aa014c-tapbb8ecc8e-9c', 'timestamp': '2025-09-30T21:18:44.010018', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-772926731', 'name': 'tapbb8ecc8e-9c', 'instance_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:9d:62', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbb8ecc8e-9c'}, 'message_id': '0b6bddd2-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.64645816, 'message_signature': 'bb4eca0767201e4303b01193109655071da3ef079d2a329a5eda204be38a3729'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': 'instance-0000000f-800f4413-c978-4c4e-97b6-1ea1e45f9f17-tapc6fd9c21-8c', 'timestamp': '2025-09-30T21:18:44.010018', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-943144079', 'name': 'tapc6fd9c21-8c', 'instance_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:04:bd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc6fd9c21-8c'}, 'message_id': '0b6be5f2-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.650689148, 'message_signature': '6eabc2fbdacdf3d5d4286dda92139917d0fbe6d65304facb972dc3cca1ecb191'}]}, 'timestamp': '2025-09-30 21:18:44.010441', '_unique_id': '0e3edc6c26a7451dafe410bdd0d07800'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.010 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.011 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.011 12 DEBUG ceilometer.compute.pollsters [-] 7f0e9e16-1467-41e5-b5b0-965591aa014c/network.outgoing.bytes volume: 5560 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.011 12 DEBUG ceilometer.compute.pollsters [-] 800f4413-c978-4c4e-97b6-1ea1e45f9f17/network.outgoing.bytes volume: 5174 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0ab37d2a-7c48-4492-ac28-18247bc60f10', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 5560, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': 'instance-0000000b-7f0e9e16-1467-41e5-b5b0-965591aa014c-tapbb8ecc8e-9c', 'timestamp': '2025-09-30T21:18:44.011475', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-772926731', 'name': 'tapbb8ecc8e-9c', 'instance_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:9d:62', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbb8ecc8e-9c'}, 'message_id': '0b6c173e-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.64645816, 'message_signature': '087e1095a94530f75ea3da1ce145e777898b7bcaf5e8b6930e4332cae4411d49'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 5174, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': 'instance-0000000f-800f4413-c978-4c4e-97b6-1ea1e45f9f17-tapc6fd9c21-8c', 'timestamp': '2025-09-30T21:18:44.011475', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-943144079', 'name': 'tapc6fd9c21-8c', 'instance_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:04:bd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc6fd9c21-8c'}, 'message_id': '0b6c1f5e-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.650689148, 'message_signature': '6e748b92b87684c2e6e74e3f0713ff0832aa377f3485179b30d75a025c479489'}]}, 'timestamp': '2025-09-30 21:18:44.011912', '_unique_id': '36e622626a9d40eab8c503b0fe7ae998'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.012 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.021 12 DEBUG ceilometer.compute.pollsters [-] 7f0e9e16-1467-41e5-b5b0-965591aa014c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.022 12 DEBUG ceilometer.compute.pollsters [-] 7f0e9e16-1467-41e5-b5b0-965591aa014c/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.034 12 DEBUG ceilometer.compute.pollsters [-] 800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.034 12 DEBUG ceilometer.compute.pollsters [-] 800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9589ff5c-fd8c-42d5-a865-b6b50702776e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c-vda', 'timestamp': '2025-09-30T21:18:44.013002', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-772926731', 'name': 'instance-0000000b', 'instance_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0b6db116-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.700498572, 'message_signature': '028e13118452800c3af5ca0ff58673b994ea25dc54d2259fe71e45507119ce0f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 
'resource_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c-sda', 'timestamp': '2025-09-30T21:18:44.013002', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-772926731', 'name': 'instance-0000000b', 'instance_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0b6db9a4-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.700498572, 'message_signature': '98491430fdc2070731348b07bc5b9e493ba99d478893350a97390c0950f9f288'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17-vda', 'timestamp': '2025-09-30T21:18:44.013002', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-943144079', 'name': 'instance-0000000f', 'instance_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0b6f8e82-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.709886901, 'message_signature': '6fca0ce05e4fa809caece6548c9db8e532109b542ae0bba0a4495fcc28e5c255'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17-sda', 'timestamp': '2025-09-30T21:18:44.013002', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-943144079', 'name': 'instance-0000000f', 'instance_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0b6f9670-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.709886901, 'message_signature': '2f48efbb4e38ada1de17b288bebcb4260e59c0df4d5e683bdc51a141161a2782'}]}, 'timestamp': '2025-09-30 21:18:44.034645', '_unique_id': '410f2faf3df1422da5c19417fbd22a61'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.035 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.036 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.036 12 DEBUG ceilometer.compute.pollsters [-] 7f0e9e16-1467-41e5-b5b0-965591aa014c/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.036 12 DEBUG ceilometer.compute.pollsters [-] 800f4413-c978-4c4e-97b6-1ea1e45f9f17/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cf8c9d50-256d-40f8-b159-e0a99b2e12f2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': 'instance-0000000b-7f0e9e16-1467-41e5-b5b0-965591aa014c-tapbb8ecc8e-9c', 'timestamp': '2025-09-30T21:18:44.036294', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-772926731', 'name': 'tapbb8ecc8e-9c', 'instance_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:9d:62', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbb8ecc8e-9c'}, 'message_id': '0b6fe094-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.64645816, 'message_signature': '0d333d21bbdc0eccddf04ea693a42a6efc9856587c3ef1e183e77a19419cbf05'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': 'instance-0000000f-800f4413-c978-4c4e-97b6-1ea1e45f9f17-tapc6fd9c21-8c', 'timestamp': '2025-09-30T21:18:44.036294', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-943144079', 'name': 'tapc6fd9c21-8c', 'instance_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:04:bd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc6fd9c21-8c'}, 'message_id': '0b6fea3a-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.650689148, 'message_signature': '60adf63c0106d8ca768dfb57635f4e5b2c2e2fd480b49e1f61ffe4cb2afdee2b'}]}, 'timestamp': '2025-09-30 21:18:44.036766', '_unique_id': '31017bc4010549dd99512a8315a5d5b3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.037 12 DEBUG ceilometer.compute.pollsters [-] 7f0e9e16-1467-41e5-b5b0-965591aa014c/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.038 12 DEBUG ceilometer.compute.pollsters [-] 7f0e9e16-1467-41e5-b5b0-965591aa014c/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.038 12 DEBUG ceilometer.compute.pollsters [-] 800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk.device.read.latency volume: 1498449 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.038 12 DEBUG ceilometer.compute.pollsters [-] 800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '896d9202-461c-4463-ab55-974db6285edf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c-vda', 'timestamp': '2025-09-30T21:18:44.037921', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-772926731', 'name': 'instance-0000000b', 'instance_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0b701faa-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.599629842, 'message_signature': '80db4d0efb0b38817569104f2d96e2b260e22db0ed7d182b0ddf305fcf6758b5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 
'resource_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c-sda', 'timestamp': '2025-09-30T21:18:44.037921', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-772926731', 'name': 'instance-0000000b', 'instance_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0b70293c-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.599629842, 'message_signature': 'd1b74bf9a9f7329d37bf6407b3d597ff1d2a0048ce7ccd20a79665ad5f423f9e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1498449, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17-vda', 'timestamp': '2025-09-30T21:18:44.037921', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-943144079', 'name': 'instance-0000000f', 'instance_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0b70308a-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.621185899, 'message_signature': '2d28be1412507efedf8b3a83c525f55d7ba58ee59033df71a6bec47dcae4d87b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17-sda', 'timestamp': '2025-09-30T21:18:44.037921', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-943144079', 'name': 'instance-0000000f', 'instance_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0b70388c-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.621185899, 'message_signature': '6eb15a2d0054b95794b25decf57feef8e3dc669c617f074a6c478d9dcd0b157c'}]}, 'timestamp': '2025-09-30 21:18:44.038760', '_unique_id': 'b69a3093c32941bb9bc2a14a4b034021'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.039 12 DEBUG ceilometer.compute.pollsters [-] 7f0e9e16-1467-41e5-b5b0-965591aa014c/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.040 12 DEBUG ceilometer.compute.pollsters [-] 7f0e9e16-1467-41e5-b5b0-965591aa014c/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.040 12 DEBUG ceilometer.compute.pollsters [-] 800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk.device.read.bytes volume: 8192 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.040 12 DEBUG ceilometer.compute.pollsters [-] 800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '03c0c976-7ad1-41aa-a88e-71cb5a56b50d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c-vda', 'timestamp': '2025-09-30T21:18:44.039890', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-772926731', 'name': 'instance-0000000b', 'instance_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0b706c80-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.599629842, 'message_signature': 'b3f416e7f85d6a6152c731f2207f67ff5a56a9672c24cf5d0953e2b1ed5e26e3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 
'resource_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c-sda', 'timestamp': '2025-09-30T21:18:44.039890', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-772926731', 'name': 'instance-0000000b', 'instance_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0b707676-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.599629842, 'message_signature': '8527096d579104629dd253784a11c04bed9a8a87c313e62a3e08e0eb49f783b5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8192, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17-vda', 'timestamp': '2025-09-30T21:18:44.039890', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-943144079', 'name': 'instance-0000000f', 'instance_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0b707dc4-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.621185899, 'message_signature': 'e3019cfd91d9a681c82fabc91ef613791759b1b89d905c1ed3dacffe1be64295'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17-sda', 'timestamp': '2025-09-30T21:18:44.039890', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-943144079', 'name': 'instance-0000000f', 'instance_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0b70862a-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.621185899, 'message_signature': '34bfccb067843468cbeb1ac7cf89b8b3681517d35cd84af44cf030f7427be181'}]}, 'timestamp': '2025-09-30 21:18:44.040748', '_unique_id': 'b3620f8f9bc24e0395df35aa7fecd241'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.041 12 DEBUG ceilometer.compute.pollsters [-] 7f0e9e16-1467-41e5-b5b0-965591aa014c/disk.device.write.bytes volume: 126976 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.042 12 DEBUG ceilometer.compute.pollsters [-] 7f0e9e16-1467-41e5-b5b0-965591aa014c/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.042 12 DEBUG ceilometer.compute.pollsters [-] 800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk.device.write.bytes volume: 151552 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.042 12 DEBUG ceilometer.compute.pollsters [-] 800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '82dfcb2a-a6c8-467e-b435-80d23aea3c4b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 126976, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c-vda', 'timestamp': '2025-09-30T21:18:44.041899', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-772926731', 'name': 'instance-0000000b', 'instance_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0b70bafa-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.599629842, 'message_signature': '4a0d8cffb2d494f4a153d1fe9eb9063c93a04ebc621ac801e2fe162b60965865'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 
'resource_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c-sda', 'timestamp': '2025-09-30T21:18:44.041899', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-772926731', 'name': 'instance-0000000b', 'instance_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0b70c450-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.599629842, 'message_signature': '02815993ef6ac2c12dd9670a1b71b7df812ecb1d98c999269aaf0a593a273f9b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 151552, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17-vda', 'timestamp': '2025-09-30T21:18:44.041899', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-943144079', 'name': 'instance-0000000f', 'instance_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0b70cb9e-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.621185899, 'message_signature': 'abd83b90d0ad6ee11592a3218cf8525f0d337386e6980462dbcad7e40dca1b89'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17-sda', 'timestamp': '2025-09-30T21:18:44.041899', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-943144079', 'name': 'instance-0000000f', 'instance_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0b70d36e-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.621185899, 'message_signature': '9ebcaa9edaa36d5453bdd3e9f982c79b095a345b242327583701009792bc51e9'}]}, 'timestamp': '2025-09-30 21:18:44.042725', '_unique_id': '9974f99b27ae472f9ea61e91f24188cc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.043 12 DEBUG ceilometer.compute.pollsters [-] 7f0e9e16-1467-41e5-b5b0-965591aa014c/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 DEBUG ceilometer.compute.pollsters [-] 800f4413-c978-4c4e-97b6-1ea1e45f9f17/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6eb50b2a-a6ec-44bf-8775-8397f73c6f47', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': 'instance-0000000b-7f0e9e16-1467-41e5-b5b0-965591aa014c-tapbb8ecc8e-9c', 'timestamp': '2025-09-30T21:18:44.043877', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-772926731', 'name': 'tapbb8ecc8e-9c', 'instance_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:9d:62', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbb8ecc8e-9c'}, 'message_id': '0b71085c-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.64645816, 'message_signature': 'd015b5c39e71184c7136abeda599c44fc613dc3013c15c3fa60171deac3a55de'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': 'instance-0000000f-800f4413-c978-4c4e-97b6-1ea1e45f9f17-tapc6fd9c21-8c', 'timestamp': '2025-09-30T21:18:44.043877', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-943144079', 'name': 'tapc6fd9c21-8c', 'instance_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:04:bd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc6fd9c21-8c'}, 'message_id': '0b7111b2-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.650689148, 'message_signature': 'f66fe0658d27a438990c8e76c9e8ef870d68c9b667fe87f7038ee353cfa2c545'}]}, 'timestamp': '2025-09-30 21:18:44.044333', '_unique_id': '1a804b7bd97849d7a10efe3648afba37'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.044 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.045 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.045 12 DEBUG ceilometer.compute.pollsters [-] 7f0e9e16-1467-41e5-b5b0-965591aa014c/disk.device.write.requests volume: 19 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.045 12 DEBUG ceilometer.compute.pollsters [-] 7f0e9e16-1467-41e5-b5b0-965591aa014c/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.045 12 DEBUG ceilometer.compute.pollsters [-] 800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk.device.write.requests volume: 6 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 DEBUG ceilometer.compute.pollsters [-] 800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '44640563-d7c5-457c-9052-a6db24f12e66', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 19, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c-vda', 'timestamp': '2025-09-30T21:18:44.045428', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-772926731', 'name': 'instance-0000000b', 'instance_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0b7144b6-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.599629842, 'message_signature': '2b40c95b769d0b0f3833f2e341c28753657327190eeecebe38619694729b7296'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 
'project_name': None, 'resource_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c-sda', 'timestamp': '2025-09-30T21:18:44.045428', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-772926731', 'name': 'instance-0000000b', 'instance_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0b714ce0-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.599629842, 'message_signature': '67395d421f2400c8dc5f3c49576872184e1cfd535c0e7f332da93393baef7ab4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 6, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17-vda', 'timestamp': '2025-09-30T21:18:44.045428', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-943144079', 'name': 'instance-0000000f', 'instance_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0b7153fc-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.621185899, 'message_signature': '56928d6c6ed174343b854d52dfd64bedc372791077e5e659a9a5b75f71cca44d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17-sda', 'timestamp': '2025-09-30T21:18:44.045428', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-943144079', 'name': 'instance-0000000f', 'instance_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0b715adc-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.621185899, 'message_signature': 'aa240cf7ddfb3c121ef3280cc035145558aacf98dc03ccd569844e888bb13c2f'}]}, 'timestamp': '2025-09-30 21:18:44.046189', '_unique_id': 'a65dcb9f8d294a86891c2801d4fec835'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.046 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.047 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.047 12 DEBUG ceilometer.compute.pollsters [-] 7f0e9e16-1467-41e5-b5b0-965591aa014c/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.047 12 DEBUG ceilometer.compute.pollsters [-] 7f0e9e16-1467-41e5-b5b0-965591aa014c/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.047 12 DEBUG ceilometer.compute.pollsters [-] 800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk.device.usage volume: 29884416 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.047 12 DEBUG ceilometer.compute.pollsters [-] 800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '44d15318-fe21-4b0a-b2f8-3ba9b1c8ec3e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c-vda', 'timestamp': '2025-09-30T21:18:44.047354', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-772926731', 'name': 'instance-0000000b', 'instance_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0b718ffc-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.700498572, 'message_signature': '98817eb8e198e17ff8b0a7c08f7cecfb1fbc9f150cc8e2dcd268f58d61e3ff5f'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': 
'7f0e9e16-1467-41e5-b5b0-965591aa014c-sda', 'timestamp': '2025-09-30T21:18:44.047354', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-772926731', 'name': 'instance-0000000b', 'instance_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0b719876-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.700498572, 'message_signature': '592d09d81ef7f0b7aca5e662e343400e1e4adbc597994c1668c3ed6ae36c7d39'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29884416, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17-vda', 'timestamp': '2025-09-30T21:18:44.047354', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-943144079', 'name': 'instance-0000000f', 'instance_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0b719f7e-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.709886901, 'message_signature': 'b443e699cbefdc4bf0f02c3169cb562b86b51c44b93272119805ab1afe68a333'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17-sda', 'timestamp': '2025-09-30T21:18:44.047354', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-943144079', 'name': 'instance-0000000f', 'instance_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0b71a65e-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.709886901, 'message_signature': 'b36e901c078744aa0f0bf166e64f8aff519ec61d4733c3ccf1873f7e11cc7a7f'}]}, 'timestamp': '2025-09-30 21:18:44.048123', '_unique_id': '11bc6684fce342beb9082831cfdfe954'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.048 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.049 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.049 12 DEBUG ceilometer.compute.pollsters [-] 7f0e9e16-1467-41e5-b5b0-965591aa014c/disk.device.allocation volume: 30613504 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.049 12 DEBUG ceilometer.compute.pollsters [-] 7f0e9e16-1467-41e5-b5b0-965591aa014c/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.049 12 DEBUG ceilometer.compute.pollsters [-] 800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk.device.allocation volume: 30412800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 DEBUG ceilometer.compute.pollsters [-] 800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ae124fbb-d01a-4d06-94d7-05e3b4cfa791', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30613504, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c-vda', 'timestamp': '2025-09-30T21:18:44.049418', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-772926731', 'name': 'instance-0000000b', 'instance_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0b71e0a6-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.700498572, 'message_signature': '1d5e3e4c0be7d486665e65de5028982a7ae45004d3a6a3e04e2e55df6a4be129'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 
'resource_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c-sda', 'timestamp': '2025-09-30T21:18:44.049418', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-772926731', 'name': 'instance-0000000b', 'instance_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0b71e8c6-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.700498572, 'message_signature': '6089c45569bd0860c6747e88d563809d04bf02df64db371c5c70c2ab9e03a18c'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30412800, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17-vda', 'timestamp': '2025-09-30T21:18:44.049418', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-943144079', 'name': 'instance-0000000f', 'instance_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0b71efd8-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.709886901, 'message_signature': 'a99972236edc0517126a1678bd1f495d1e777638ffd5e9e93d4a2de487af43cc'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17-sda', 'timestamp': '2025-09-30T21:18:44.049418', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-943144079', 'name': 'instance-0000000f', 'instance_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0b71f6b8-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.709886901, 'message_signature': 'ccbb27fe3d567c7273b99616da1edd4e8b9e7ed83fc4bf3335ee6e0ba060133b'}]}, 'timestamp': '2025-09-30 21:18:44.050179', '_unique_id': 'd6e941e121ee48e992b08e2c83c4bbf7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.050 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.051 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.051 12 DEBUG ceilometer.compute.pollsters [-] 7f0e9e16-1467-41e5-b5b0-965591aa014c/memory.usage volume: 46.31640625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.051 12 DEBUG ceilometer.compute.pollsters [-] 800f4413-c978-4c4e-97b6-1ea1e45f9f17/memory.usage volume: 40.421875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c833627c-3fce-4fa2-a9dd-db0eec467314', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 46.31640625, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c', 'timestamp': '2025-09-30T21:18:44.051377', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-772926731', 'name': 'instance-0000000b', 'instance_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '0b722d9a-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.671405824, 'message_signature': '6a8759fc96ffe77959b916477ba4312f8a1938ffdcabd42366d7938bf33f53f0'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.421875, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': 
'800f4413-c978-4c4e-97b6-1ea1e45f9f17', 'timestamp': '2025-09-30T21:18:44.051377', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-943144079', 'name': 'instance-0000000f', 'instance_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '0b7235e2-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.692174251, 'message_signature': '004029db196cb33f7735e7250db9ae951b3ea24ac84dbc6661999874d880576f'}]}, 'timestamp': '2025-09-30 21:18:44.051798', '_unique_id': '21bdfd9f8a494c1f9b226779b1adcd11'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.052 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.053 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.053 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-772926731>, <NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-943144079>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-772926731>, <NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-943144079>]
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.053 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.053 12 DEBUG ceilometer.compute.pollsters [-] 7f0e9e16-1467-41e5-b5b0-965591aa014c/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.053 12 DEBUG ceilometer.compute.pollsters [-] 800f4413-c978-4c4e-97b6-1ea1e45f9f17/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7695bcd3-9458-46d0-ba01-6a4c92464a76', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': 'instance-0000000b-7f0e9e16-1467-41e5-b5b0-965591aa014c-tapbb8ecc8e-9c', 'timestamp': '2025-09-30T21:18:44.053441', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-772926731', 'name': 'tapbb8ecc8e-9c', 'instance_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:9d:62', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbb8ecc8e-9c'}, 'message_id': '0b727e9e-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.64645816, 'message_signature': 'ea43b6396164779e2af55410d2d07ec83f0c4f7a1a3a76e4b049f5f573d075be'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': 'instance-0000000f-800f4413-c978-4c4e-97b6-1ea1e45f9f17-tapc6fd9c21-8c', 'timestamp': '2025-09-30T21:18:44.053441', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-943144079', 'name': 'tapc6fd9c21-8c', 'instance_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:82:04:bd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc6fd9c21-8c'}, 'message_id': '0b7286e6-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.650689148, 'message_signature': '2cacb9cf2382f2a44af2568f252b953339351453c7fd012f63f6f99952fca64e'}]}, 'timestamp': '2025-09-30 21:18:44.053901', '_unique_id': 'a5629cd970954ad393926df60da1a62f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.054 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.055 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.055 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.055 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-772926731>, <NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-943144079>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-772926731>, <NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-943144079>]
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.055 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.055 12 DEBUG ceilometer.compute.pollsters [-] 7f0e9e16-1467-41e5-b5b0-965591aa014c/disk.device.write.latency volume: 39756070 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.055 12 DEBUG ceilometer.compute.pollsters [-] 7f0e9e16-1467-41e5-b5b0-965591aa014c/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.055 12 DEBUG ceilometer.compute.pollsters [-] 800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk.device.write.latency volume: 4854364 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 DEBUG ceilometer.compute.pollsters [-] 800f4413-c978-4c4e-97b6-1ea1e45f9f17/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6f81b0b3-5943-40cd-9d8c-2886d7fe03a2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 39756070, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c-vda', 'timestamp': '2025-09-30T21:18:44.055407', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-772926731', 'name': 'instance-0000000b', 'instance_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0b72cd54-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.599629842, 'message_signature': '7f842d96eea1ed60ef2af1ed67ee8d9ac6fb8018b87db6903736cda401893daf'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': 
None, 'resource_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c-sda', 'timestamp': '2025-09-30T21:18:44.055407', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-772926731', 'name': 'instance-0000000b', 'instance_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0b72d52e-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.599629842, 'message_signature': '79793731a767ee408470b8bfa17c67ef4e713ad9ce559a1882eaff4cb055e57a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4854364, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17-vda', 'timestamp': '2025-09-30T21:18:44.055407', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-943144079', 'name': 'instance-0000000f', 'instance_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0b72dc7c-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.621185899, 'message_signature': '687d12605b9c5e5b65abbae6b6335c4d6dc769981957505ea3e38458c5bf40e5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '981e96ea2bc2419d9a1e57d6aed70304', 'user_name': None, 'project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'project_name': None, 'resource_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17-sda', 'timestamp': '2025-09-30T21:18:44.055407', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-943144079', 'name': 'instance-0000000f', 'instance_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17', 'instance_type': 'm1.nano', 'host': '652b45a73b242590292b4b3b290d94810d4024484ace0054fe0ab1ff', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0b72e438-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3819.621185899, 'message_signature': '3c18d64eb09e4f03845769b610fe5c49c5fa7907222f4da520e1c3581177add4'}]}, 'timestamp': '2025-09-30 21:18:44.056266', '_unique_id': 'a0c532994e0c4731a081969c43d5fedb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:18:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:18:44.056 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:18:45 compute-0 nova_compute[192810]: 2025-09-30 21:18:45.787 2 DEBUG oslo_concurrency.lockutils [None req-0427f248-ef2e-4721-80df-ef119360d803 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Acquiring lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:45 compute-0 nova_compute[192810]: 2025-09-30 21:18:45.788 2 DEBUG oslo_concurrency.lockutils [None req-0427f248-ef2e-4721-80df-ef119360d803 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:45 compute-0 nova_compute[192810]: 2025-09-30 21:18:45.789 2 DEBUG oslo_concurrency.lockutils [None req-0427f248-ef2e-4721-80df-ef119360d803 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Acquiring lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:45 compute-0 nova_compute[192810]: 2025-09-30 21:18:45.789 2 DEBUG oslo_concurrency.lockutils [None req-0427f248-ef2e-4721-80df-ef119360d803 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:45 compute-0 nova_compute[192810]: 2025-09-30 21:18:45.790 2 DEBUG oslo_concurrency.lockutils [None req-0427f248-ef2e-4721-80df-ef119360d803 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:45 compute-0 nova_compute[192810]: 2025-09-30 21:18:45.806 2 INFO nova.compute.manager [None req-0427f248-ef2e-4721-80df-ef119360d803 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Terminating instance
Sep 30 21:18:45 compute-0 nova_compute[192810]: 2025-09-30 21:18:45.821 2 DEBUG nova.compute.manager [None req-0427f248-ef2e-4721-80df-ef119360d803 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:18:45 compute-0 kernel: tapc6fd9c21-8c (unregistering): left promiscuous mode
Sep 30 21:18:45 compute-0 NetworkManager[51733]: <info>  [1759267125.8595] device (tapc6fd9c21-8c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:18:45 compute-0 ovn_controller[94912]: 2025-09-30T21:18:45Z|00097|binding|INFO|Releasing lport c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 from this chassis (sb_readonly=0)
Sep 30 21:18:45 compute-0 ovn_controller[94912]: 2025-09-30T21:18:45Z|00098|binding|INFO|Setting lport c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 down in Southbound
Sep 30 21:18:45 compute-0 ovn_controller[94912]: 2025-09-30T21:18:45Z|00099|binding|INFO|Releasing lport 7ce813d5-5e22-4733-a15e-f9d5223072eb from this chassis (sb_readonly=0)
Sep 30 21:18:45 compute-0 ovn_controller[94912]: 2025-09-30T21:18:45Z|00100|binding|INFO|Setting lport 7ce813d5-5e22-4733-a15e-f9d5223072eb down in Southbound
Sep 30 21:18:45 compute-0 ovn_controller[94912]: 2025-09-30T21:18:45Z|00101|binding|INFO|Removing iface tapc6fd9c21-8c ovn-installed in OVS
Sep 30 21:18:45 compute-0 nova_compute[192810]: 2025-09-30 21:18:45.875 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:45 compute-0 ovn_controller[94912]: 2025-09-30T21:18:45Z|00102|binding|INFO|Releasing lport d9318b69-dfe5-4e14-b762-51aa441e80cd from this chassis (sb_readonly=0)
Sep 30 21:18:45 compute-0 ovn_controller[94912]: 2025-09-30T21:18:45Z|00103|binding|INFO|Releasing lport b21a7164-770c-4265-ad15-a3e058ec1a56 from this chassis (sb_readonly=0)
Sep 30 21:18:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:45.887 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:ce:7d 19.80.0.205'], port_security=['fa:16:3e:dd:ce:7d 19.80.0.205'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-521832462', 'neutron:cidrs': '19.80.0.205/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-383de3b7-a202-49cd-b129-b065ac294878', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-521832462', 'neutron:project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'ae5806dc-3fbd-4366-84ab-b061f2375093', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=d890b95b-932f-4c74-902c-ed705814144b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=7ce813d5-5e22-4733-a15e-f9d5223072eb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:18:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:45.889 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:04:bd 10.100.0.4'], port_security=['fa:16:3e:82:04:bd 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-280178662', 'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '800f4413-c978-4c4e-97b6-1ea1e45f9f17', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-934fff90-5446-41f1-a5ad-d2568cb337b1', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-280178662', 'neutron:project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'neutron:revision_number': '13', 'neutron:security_group_ids': 'ae5806dc-3fbd-4366-84ab-b061f2375093', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c5644fe7-3662-476d-bcfe-5bc86ceef791, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:18:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:45.890 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 7ce813d5-5e22-4733-a15e-f9d5223072eb in datapath 383de3b7-a202-49cd-b129-b065ac294878 unbound from our chassis
Sep 30 21:18:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:45.891 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 383de3b7-a202-49cd-b129-b065ac294878, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:18:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:45.892 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[6c6c16df-e572-4b27-9d20-800e6a8beca1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:45.893 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-383de3b7-a202-49cd-b129-b065ac294878 namespace which is not needed anymore
Sep 30 21:18:45 compute-0 nova_compute[192810]: 2025-09-30 21:18:45.922 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:45 compute-0 nova_compute[192810]: 2025-09-30 21:18:45.952 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:45 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000f.scope: Deactivated successfully.
Sep 30 21:18:45 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000f.scope: Consumed 2.873s CPU time.
Sep 30 21:18:45 compute-0 systemd-machined[152794]: Machine qemu-7-instance-0000000f terminated.
Sep 30 21:18:46 compute-0 neutron-haproxy-ovnmeta-383de3b7-a202-49cd-b129-b065ac294878[222966]: [NOTICE]   (222970) : haproxy version is 2.8.14-c23fe91
Sep 30 21:18:46 compute-0 neutron-haproxy-ovnmeta-383de3b7-a202-49cd-b129-b065ac294878[222966]: [NOTICE]   (222970) : path to executable is /usr/sbin/haproxy
Sep 30 21:18:46 compute-0 neutron-haproxy-ovnmeta-383de3b7-a202-49cd-b129-b065ac294878[222966]: [WARNING]  (222970) : Exiting Master process...
Sep 30 21:18:46 compute-0 neutron-haproxy-ovnmeta-383de3b7-a202-49cd-b129-b065ac294878[222966]: [ALERT]    (222970) : Current worker (222972) exited with code 143 (Terminated)
Sep 30 21:18:46 compute-0 neutron-haproxy-ovnmeta-383de3b7-a202-49cd-b129-b065ac294878[222966]: [WARNING]  (222970) : All workers exited. Exiting... (0)
Sep 30 21:18:46 compute-0 systemd[1]: libpod-39a7ab9f04aa478977f0f9b8f45c264365703cad4b43274110359fcc1ffebe11.scope: Deactivated successfully.
Sep 30 21:18:46 compute-0 podman[223032]: 2025-09-30 21:18:46.100872064 +0000 UTC m=+0.070986083 container died 39a7ab9f04aa478977f0f9b8f45c264365703cad4b43274110359fcc1ffebe11 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-383de3b7-a202-49cd-b129-b065ac294878, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:18:46 compute-0 nova_compute[192810]: 2025-09-30 21:18:46.109 2 INFO nova.virt.libvirt.driver [-] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Instance destroyed successfully.
Sep 30 21:18:46 compute-0 nova_compute[192810]: 2025-09-30 21:18:46.110 2 DEBUG nova.objects.instance [None req-0427f248-ef2e-4721-80df-ef119360d803 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Lazy-loading 'resources' on Instance uuid 800f4413-c978-4c4e-97b6-1ea1e45f9f17 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:18:46 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-39a7ab9f04aa478977f0f9b8f45c264365703cad4b43274110359fcc1ffebe11-userdata-shm.mount: Deactivated successfully.
Sep 30 21:18:46 compute-0 systemd[1]: var-lib-containers-storage-overlay-524b9ff0d2ebef7787c8e47a5caea544112fbe1d200a5a48fb1923136ecbda60-merged.mount: Deactivated successfully.
Sep 30 21:18:46 compute-0 nova_compute[192810]: 2025-09-30 21:18:46.143 2 DEBUG nova.virt.libvirt.vif [None req-0427f248-ef2e-4721-80df-ef119360d803 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-09-30T21:18:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-943144079',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-943144079',id=15,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:18:23Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='544a33c53701466d8bf7e8ed34f38dcb',ramdisk_id='',reservation_id='r-4e04clu2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_
min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-860972404',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-860972404-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:18:43Z,user_data=None,user_id='981e96ea2bc2419d9a1e57d6aed70304',uuid=800f4413-c978-4c4e-97b6-1ea1e45f9f17,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7", "address": "fa:16:3e:82:04:bd", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6fd9c21-8c", "ovs_interfaceid": "c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:18:46 compute-0 nova_compute[192810]: 2025-09-30 21:18:46.143 2 DEBUG nova.network.os_vif_util [None req-0427f248-ef2e-4721-80df-ef119360d803 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Converting VIF {"id": "c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7", "address": "fa:16:3e:82:04:bd", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6fd9c21-8c", "ovs_interfaceid": "c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:18:46 compute-0 nova_compute[192810]: 2025-09-30 21:18:46.145 2 DEBUG nova.network.os_vif_util [None req-0427f248-ef2e-4721-80df-ef119360d803 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:82:04:bd,bridge_name='br-int',has_traffic_filtering=True,id=c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7,network=Network(934fff90-5446-41f1-a5ad-d2568cb337b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc6fd9c21-8c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:18:46 compute-0 nova_compute[192810]: 2025-09-30 21:18:46.146 2 DEBUG os_vif [None req-0427f248-ef2e-4721-80df-ef119360d803 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:82:04:bd,bridge_name='br-int',has_traffic_filtering=True,id=c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7,network=Network(934fff90-5446-41f1-a5ad-d2568cb337b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc6fd9c21-8c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:18:46 compute-0 nova_compute[192810]: 2025-09-30 21:18:46.149 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:46 compute-0 nova_compute[192810]: 2025-09-30 21:18:46.150 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc6fd9c21-8c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:18:46 compute-0 podman[223032]: 2025-09-30 21:18:46.151420967 +0000 UTC m=+0.121535026 container cleanup 39a7ab9f04aa478977f0f9b8f45c264365703cad4b43274110359fcc1ffebe11 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-383de3b7-a202-49cd-b129-b065ac294878, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 21:18:46 compute-0 nova_compute[192810]: 2025-09-30 21:18:46.152 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:46 compute-0 nova_compute[192810]: 2025-09-30 21:18:46.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:46 compute-0 nova_compute[192810]: 2025-09-30 21:18:46.158 2 INFO os_vif [None req-0427f248-ef2e-4721-80df-ef119360d803 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:82:04:bd,bridge_name='br-int',has_traffic_filtering=True,id=c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7,network=Network(934fff90-5446-41f1-a5ad-d2568cb337b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc6fd9c21-8c')
Sep 30 21:18:46 compute-0 nova_compute[192810]: 2025-09-30 21:18:46.159 2 INFO nova.virt.libvirt.driver [None req-0427f248-ef2e-4721-80df-ef119360d803 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Deleting instance files /var/lib/nova/instances/800f4413-c978-4c4e-97b6-1ea1e45f9f17_del
Sep 30 21:18:46 compute-0 nova_compute[192810]: 2025-09-30 21:18:46.160 2 INFO nova.virt.libvirt.driver [None req-0427f248-ef2e-4721-80df-ef119360d803 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Deletion of /var/lib/nova/instances/800f4413-c978-4c4e-97b6-1ea1e45f9f17_del complete
Sep 30 21:18:46 compute-0 systemd[1]: libpod-conmon-39a7ab9f04aa478977f0f9b8f45c264365703cad4b43274110359fcc1ffebe11.scope: Deactivated successfully.
Sep 30 21:18:46 compute-0 nova_compute[192810]: 2025-09-30 21:18:46.244 2 INFO nova.compute.manager [None req-0427f248-ef2e-4721-80df-ef119360d803 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Took 0.42 seconds to destroy the instance on the hypervisor.
Sep 30 21:18:46 compute-0 nova_compute[192810]: 2025-09-30 21:18:46.245 2 DEBUG oslo.service.loopingcall [None req-0427f248-ef2e-4721-80df-ef119360d803 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:18:46 compute-0 nova_compute[192810]: 2025-09-30 21:18:46.245 2 DEBUG nova.compute.manager [-] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:18:46 compute-0 nova_compute[192810]: 2025-09-30 21:18:46.246 2 DEBUG nova.network.neutron [-] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:18:46 compute-0 podman[223076]: 2025-09-30 21:18:46.247209749 +0000 UTC m=+0.056136666 container remove 39a7ab9f04aa478977f0f9b8f45c264365703cad4b43274110359fcc1ffebe11 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-383de3b7-a202-49cd-b129-b065ac294878, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 21:18:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:46.256 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[3a82e551-592b-49ee-97bb-dd088f611809]: (4, ('Tue Sep 30 09:18:46 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-383de3b7-a202-49cd-b129-b065ac294878 (39a7ab9f04aa478977f0f9b8f45c264365703cad4b43274110359fcc1ffebe11)\n39a7ab9f04aa478977f0f9b8f45c264365703cad4b43274110359fcc1ffebe11\nTue Sep 30 09:18:46 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-383de3b7-a202-49cd-b129-b065ac294878 (39a7ab9f04aa478977f0f9b8f45c264365703cad4b43274110359fcc1ffebe11)\n39a7ab9f04aa478977f0f9b8f45c264365703cad4b43274110359fcc1ffebe11\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:46.259 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[ed4a3940-e8eb-43e9-9456-469bfa06c1ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:46.261 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap383de3b7-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:18:46 compute-0 nova_compute[192810]: 2025-09-30 21:18:46.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:46 compute-0 kernel: tap383de3b7-a0: left promiscuous mode
Sep 30 21:18:46 compute-0 nova_compute[192810]: 2025-09-30 21:18:46.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:46.295 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[eece73af-7e65-4a58-b092-892055bca794]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:46.322 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[abfb67a2-1461-4038-a585-1350c48c3c5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:46 compute-0 nova_compute[192810]: 2025-09-30 21:18:46.323 2 DEBUG nova.compute.manager [req-bdac2a9b-597c-4f28-90ca-edb6bebead18 req-97ab33be-d0ed-4022-90c0-32d6c3c9b24c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Received event network-vif-unplugged-c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:18:46 compute-0 nova_compute[192810]: 2025-09-30 21:18:46.324 2 DEBUG oslo_concurrency.lockutils [req-bdac2a9b-597c-4f28-90ca-edb6bebead18 req-97ab33be-d0ed-4022-90c0-32d6c3c9b24c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:46 compute-0 nova_compute[192810]: 2025-09-30 21:18:46.325 2 DEBUG oslo_concurrency.lockutils [req-bdac2a9b-597c-4f28-90ca-edb6bebead18 req-97ab33be-d0ed-4022-90c0-32d6c3c9b24c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:46.324 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[b45f1ce8-5e54-4e11-a453-e52696c2c215]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:46 compute-0 nova_compute[192810]: 2025-09-30 21:18:46.325 2 DEBUG oslo_concurrency.lockutils [req-bdac2a9b-597c-4f28-90ca-edb6bebead18 req-97ab33be-d0ed-4022-90c0-32d6c3c9b24c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:46 compute-0 nova_compute[192810]: 2025-09-30 21:18:46.326 2 DEBUG nova.compute.manager [req-bdac2a9b-597c-4f28-90ca-edb6bebead18 req-97ab33be-d0ed-4022-90c0-32d6c3c9b24c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] No waiting events found dispatching network-vif-unplugged-c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:18:46 compute-0 nova_compute[192810]: 2025-09-30 21:18:46.326 2 DEBUG nova.compute.manager [req-bdac2a9b-597c-4f28-90ca-edb6bebead18 req-97ab33be-d0ed-4022-90c0-32d6c3c9b24c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Received event network-vif-unplugged-c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:18:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:46.350 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e04b7d33-96ba-4d6a-969c-ded56fb242f1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 381394, 'reachable_time': 34046, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223091, 'error': None, 'target': 'ovnmeta-383de3b7-a202-49cd-b129-b065ac294878', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:46.354 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-383de3b7-a202-49cd-b129-b065ac294878 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:18:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:46.354 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[0410c1b4-64ca-446a-97c9-d42c4779739d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:46.356 103867 INFO neutron.agent.ovn.metadata.agent [-] Port c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 in datapath 934fff90-5446-41f1-a5ad-d2568cb337b1 unbound from our chassis
Sep 30 21:18:46 compute-0 systemd[1]: run-netns-ovnmeta\x2d383de3b7\x2da202\x2d49cd\x2db129\x2db065ac294878.mount: Deactivated successfully.
Sep 30 21:18:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:46.358 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 934fff90-5446-41f1-a5ad-d2568cb337b1
Sep 30 21:18:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:46.379 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[30c23d8d-6d7e-42de-b273-6016936fd92d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:46 compute-0 nova_compute[192810]: 2025-09-30 21:18:46.391 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:46.425 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[999a9cde-05f1-44e6-94eb-8dc4c4cf23f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:46.430 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[70fdd88b-4e58-429a-a36c-2389f65309cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:46.476 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[3d341985-bb96-4935-98be-38dcf5de927c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:46.508 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[00d14b64-40eb-4c24-9c9c-6eacebe24aa5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap934fff90-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:18:37:65'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 40, 'tx_packets': 7, 'rx_bytes': 2176, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 40, 'tx_packets': 7, 'rx_bytes': 2176, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 377390, 'reachable_time': 27389, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223098, 'error': None, 'target': 'ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:46.536 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e6fb4a50-0570-4f60-95c6-1f05b92603e1]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap934fff90-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 377406, 'tstamp': 377406}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223099, 'error': None, 'target': 'ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap934fff90-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 377410, 'tstamp': 377410}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223099, 'error': None, 'target': 'ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:46.539 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap934fff90-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:18:46 compute-0 nova_compute[192810]: 2025-09-30 21:18:46.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:46 compute-0 nova_compute[192810]: 2025-09-30 21:18:46.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:46.544 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap934fff90-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:18:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:46.545 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:18:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:46.545 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap934fff90-50, col_values=(('external_ids', {'iface-id': 'b21a7164-770c-4265-ad15-a3e058ec1a56'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:18:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:46.546 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:18:46 compute-0 nova_compute[192810]: 2025-09-30 21:18:46.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:46.789 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:18:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:46.791 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:18:48 compute-0 nova_compute[192810]: 2025-09-30 21:18:48.242 2 DEBUG nova.network.neutron [-] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:18:48 compute-0 nova_compute[192810]: 2025-09-30 21:18:48.262 2 INFO nova.compute.manager [-] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Took 2.02 seconds to deallocate network for instance.
Sep 30 21:18:48 compute-0 nova_compute[192810]: 2025-09-30 21:18:48.344 2 DEBUG oslo_concurrency.lockutils [None req-0427f248-ef2e-4721-80df-ef119360d803 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:48 compute-0 nova_compute[192810]: 2025-09-30 21:18:48.344 2 DEBUG oslo_concurrency.lockutils [None req-0427f248-ef2e-4721-80df-ef119360d803 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:48 compute-0 nova_compute[192810]: 2025-09-30 21:18:48.352 2 DEBUG oslo_concurrency.lockutils [None req-0427f248-ef2e-4721-80df-ef119360d803 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.008s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:48 compute-0 podman[223100]: 2025-09-30 21:18:48.38458641 +0000 UTC m=+0.106462603 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 21:18:48 compute-0 nova_compute[192810]: 2025-09-30 21:18:48.387 2 INFO nova.scheduler.client.report [None req-0427f248-ef2e-4721-80df-ef119360d803 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Deleted allocations for instance 800f4413-c978-4c4e-97b6-1ea1e45f9f17
Sep 30 21:18:48 compute-0 podman[223101]: 2025-09-30 21:18:48.391601088 +0000 UTC m=+0.110099596 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:18:48 compute-0 nova_compute[192810]: 2025-09-30 21:18:48.466 2 DEBUG nova.compute.manager [req-0dccb319-d2b6-4053-9e1a-31a1e3709fd4 req-d5c0bda4-39df-40e8-b94f-d670c93af99c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Received event network-vif-plugged-c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:18:48 compute-0 nova_compute[192810]: 2025-09-30 21:18:48.467 2 DEBUG oslo_concurrency.lockutils [req-0dccb319-d2b6-4053-9e1a-31a1e3709fd4 req-d5c0bda4-39df-40e8-b94f-d670c93af99c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:48 compute-0 nova_compute[192810]: 2025-09-30 21:18:48.467 2 DEBUG oslo_concurrency.lockutils [req-0dccb319-d2b6-4053-9e1a-31a1e3709fd4 req-d5c0bda4-39df-40e8-b94f-d670c93af99c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:48 compute-0 nova_compute[192810]: 2025-09-30 21:18:48.468 2 DEBUG oslo_concurrency.lockutils [req-0dccb319-d2b6-4053-9e1a-31a1e3709fd4 req-d5c0bda4-39df-40e8-b94f-d670c93af99c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:48 compute-0 nova_compute[192810]: 2025-09-30 21:18:48.468 2 DEBUG nova.compute.manager [req-0dccb319-d2b6-4053-9e1a-31a1e3709fd4 req-d5c0bda4-39df-40e8-b94f-d670c93af99c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] No waiting events found dispatching network-vif-plugged-c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:18:48 compute-0 nova_compute[192810]: 2025-09-30 21:18:48.468 2 WARNING nova.compute.manager [req-0dccb319-d2b6-4053-9e1a-31a1e3709fd4 req-d5c0bda4-39df-40e8-b94f-d670c93af99c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Received unexpected event network-vif-plugged-c6fd9c21-8c59-46d2-bc64-3a01a99ee2a7 for instance with vm_state deleted and task_state None.
Sep 30 21:18:48 compute-0 nova_compute[192810]: 2025-09-30 21:18:48.511 2 DEBUG oslo_concurrency.lockutils [None req-0427f248-ef2e-4721-80df-ef119360d803 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Lock "800f4413-c978-4c4e-97b6-1ea1e45f9f17" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.723s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:50 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:50.793 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:18:51 compute-0 nova_compute[192810]: 2025-09-30 21:18:51.156 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:51 compute-0 nova_compute[192810]: 2025-09-30 21:18:51.393 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:51 compute-0 nova_compute[192810]: 2025-09-30 21:18:51.699 2 DEBUG oslo_concurrency.lockutils [None req-ad107a4c-3018-4071-9cae-bd86f135e504 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Acquiring lock "7f0e9e16-1467-41e5-b5b0-965591aa014c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:51 compute-0 nova_compute[192810]: 2025-09-30 21:18:51.703 2 DEBUG oslo_concurrency.lockutils [None req-ad107a4c-3018-4071-9cae-bd86f135e504 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:51 compute-0 nova_compute[192810]: 2025-09-30 21:18:51.703 2 DEBUG oslo_concurrency.lockutils [None req-ad107a4c-3018-4071-9cae-bd86f135e504 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Acquiring lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:51 compute-0 nova_compute[192810]: 2025-09-30 21:18:51.704 2 DEBUG oslo_concurrency.lockutils [None req-ad107a4c-3018-4071-9cae-bd86f135e504 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:51 compute-0 nova_compute[192810]: 2025-09-30 21:18:51.704 2 DEBUG oslo_concurrency.lockutils [None req-ad107a4c-3018-4071-9cae-bd86f135e504 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:51 compute-0 nova_compute[192810]: 2025-09-30 21:18:51.717 2 INFO nova.compute.manager [None req-ad107a4c-3018-4071-9cae-bd86f135e504 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Terminating instance
Sep 30 21:18:51 compute-0 nova_compute[192810]: 2025-09-30 21:18:51.731 2 DEBUG nova.compute.manager [None req-ad107a4c-3018-4071-9cae-bd86f135e504 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:18:51 compute-0 kernel: tapbb8ecc8e-9c (unregistering): left promiscuous mode
Sep 30 21:18:51 compute-0 NetworkManager[51733]: <info>  [1759267131.7593] device (tapbb8ecc8e-9c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:18:51 compute-0 ovn_controller[94912]: 2025-09-30T21:18:51Z|00104|binding|INFO|Releasing lport bb8ecc8e-9cf4-4901-9788-83c49356f983 from this chassis (sb_readonly=0)
Sep 30 21:18:51 compute-0 ovn_controller[94912]: 2025-09-30T21:18:51Z|00105|binding|INFO|Setting lport bb8ecc8e-9cf4-4901-9788-83c49356f983 down in Southbound
Sep 30 21:18:51 compute-0 nova_compute[192810]: 2025-09-30 21:18:51.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:51 compute-0 ovn_controller[94912]: 2025-09-30T21:18:51Z|00106|binding|INFO|Removing iface tapbb8ecc8e-9c ovn-installed in OVS
Sep 30 21:18:51 compute-0 nova_compute[192810]: 2025-09-30 21:18:51.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:51 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:51.776 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:9d:62 10.100.0.12'], port_security=['fa:16:3e:45:9d:62 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-934fff90-5446-41f1-a5ad-d2568cb337b1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'neutron:revision_number': '23', 'neutron:security_group_ids': 'ae5806dc-3fbd-4366-84ab-b061f2375093', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c5644fe7-3662-476d-bcfe-5bc86ceef791, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=bb8ecc8e-9cf4-4901-9788-83c49356f983) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:18:51 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:51.777 103867 INFO neutron.agent.ovn.metadata.agent [-] Port bb8ecc8e-9cf4-4901-9788-83c49356f983 in datapath 934fff90-5446-41f1-a5ad-d2568cb337b1 unbound from our chassis
Sep 30 21:18:51 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:51.779 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 934fff90-5446-41f1-a5ad-d2568cb337b1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:18:51 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:51.780 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[8a58d32f-a260-4d90-8ff4-92602fea3a6f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:51 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:51.781 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1 namespace which is not needed anymore
Sep 30 21:18:51 compute-0 nova_compute[192810]: 2025-09-30 21:18:51.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:51 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Sep 30 21:18:51 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d0000000b.scope: Consumed 4.317s CPU time.
Sep 30 21:18:51 compute-0 systemd-machined[152794]: Machine qemu-6-instance-0000000b terminated.
Sep 30 21:18:51 compute-0 kernel: tapbb8ecc8e-9c: entered promiscuous mode
Sep 30 21:18:51 compute-0 NetworkManager[51733]: <info>  [1759267131.9634] manager: (tapbb8ecc8e-9c): new Tun device (/org/freedesktop/NetworkManager/Devices/51)
Sep 30 21:18:52 compute-0 kernel: tapbb8ecc8e-9c (unregistering): left promiscuous mode
Sep 30 21:18:52 compute-0 ovn_controller[94912]: 2025-09-30T21:18:52Z|00107|binding|INFO|Claiming lport bb8ecc8e-9cf4-4901-9788-83c49356f983 for this chassis.
Sep 30 21:18:52 compute-0 ovn_controller[94912]: 2025-09-30T21:18:52Z|00108|binding|INFO|bb8ecc8e-9cf4-4901-9788-83c49356f983: Claiming fa:16:3e:45:9d:62 10.100.0.12
Sep 30 21:18:52 compute-0 nova_compute[192810]: 2025-09-30 21:18:52.002 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:52 compute-0 sshd-session[223141]: Invalid user azure from 45.81.23.80 port 37418
Sep 30 21:18:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:52.021 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:9d:62 10.100.0.12'], port_security=['fa:16:3e:45:9d:62 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-934fff90-5446-41f1-a5ad-d2568cb337b1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'neutron:revision_number': '23', 'neutron:security_group_ids': 'ae5806dc-3fbd-4366-84ab-b061f2375093', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c5644fe7-3662-476d-bcfe-5bc86ceef791, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=bb8ecc8e-9cf4-4901-9788-83c49356f983) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:18:52 compute-0 sshd-session[223141]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:18:52 compute-0 sshd-session[223141]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=45.81.23.80
Sep 30 21:18:52 compute-0 neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1[222567]: [NOTICE]   (222571) : haproxy version is 2.8.14-c23fe91
Sep 30 21:18:52 compute-0 neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1[222567]: [NOTICE]   (222571) : path to executable is /usr/sbin/haproxy
Sep 30 21:18:52 compute-0 neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1[222567]: [WARNING]  (222571) : Exiting Master process...
Sep 30 21:18:52 compute-0 neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1[222567]: [ALERT]    (222571) : Current worker (222573) exited with code 143 (Terminated)
Sep 30 21:18:52 compute-0 neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1[222567]: [WARNING]  (222571) : All workers exited. Exiting... (0)
Sep 30 21:18:52 compute-0 systemd[1]: libpod-cb2c33ba4a5894e89ac1815456a1c9d7a479275890552e040bbde2bd51d09c03.scope: Deactivated successfully.
Sep 30 21:18:52 compute-0 podman[223170]: 2025-09-30 21:18:52.03769403 +0000 UTC m=+0.116368716 container died cb2c33ba4a5894e89ac1815456a1c9d7a479275890552e040bbde2bd51d09c03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Sep 30 21:18:52 compute-0 ovn_controller[94912]: 2025-09-30T21:18:52Z|00109|binding|INFO|Setting lport bb8ecc8e-9cf4-4901-9788-83c49356f983 ovn-installed in OVS
Sep 30 21:18:52 compute-0 ovn_controller[94912]: 2025-09-30T21:18:52Z|00110|binding|INFO|Setting lport bb8ecc8e-9cf4-4901-9788-83c49356f983 up in Southbound
Sep 30 21:18:52 compute-0 ovn_controller[94912]: 2025-09-30T21:18:52Z|00111|binding|INFO|Releasing lport bb8ecc8e-9cf4-4901-9788-83c49356f983 from this chassis (sb_readonly=1)
Sep 30 21:18:52 compute-0 ovn_controller[94912]: 2025-09-30T21:18:52Z|00112|binding|INFO|Removing iface tapbb8ecc8e-9c ovn-installed in OVS
Sep 30 21:18:52 compute-0 ovn_controller[94912]: 2025-09-30T21:18:52Z|00113|if_status|INFO|Not setting lport bb8ecc8e-9cf4-4901-9788-83c49356f983 down as sb is readonly
Sep 30 21:18:52 compute-0 nova_compute[192810]: 2025-09-30 21:18:52.043 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:52 compute-0 ovn_controller[94912]: 2025-09-30T21:18:52Z|00114|binding|INFO|Releasing lport bb8ecc8e-9cf4-4901-9788-83c49356f983 from this chassis (sb_readonly=0)
Sep 30 21:18:52 compute-0 ovn_controller[94912]: 2025-09-30T21:18:52Z|00115|binding|INFO|Setting lport bb8ecc8e-9cf4-4901-9788-83c49356f983 down in Southbound
Sep 30 21:18:52 compute-0 nova_compute[192810]: 2025-09-30 21:18:52.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:52 compute-0 nova_compute[192810]: 2025-09-30 21:18:52.065 2 INFO nova.virt.libvirt.driver [-] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Instance destroyed successfully.
Sep 30 21:18:52 compute-0 nova_compute[192810]: 2025-09-30 21:18:52.065 2 DEBUG nova.objects.instance [None req-ad107a4c-3018-4071-9cae-bd86f135e504 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Lazy-loading 'resources' on Instance uuid 7f0e9e16-1467-41e5-b5b0-965591aa014c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:18:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:52.066 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:9d:62 10.100.0.12'], port_security=['fa:16:3e:45:9d:62 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '7f0e9e16-1467-41e5-b5b0-965591aa014c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-934fff90-5446-41f1-a5ad-d2568cb337b1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '544a33c53701466d8bf7e8ed34f38dcb', 'neutron:revision_number': '23', 'neutron:security_group_ids': 'ae5806dc-3fbd-4366-84ab-b061f2375093', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c5644fe7-3662-476d-bcfe-5bc86ceef791, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=bb8ecc8e-9cf4-4901-9788-83c49356f983) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:18:52 compute-0 nova_compute[192810]: 2025-09-30 21:18:52.083 2 DEBUG nova.virt.libvirt.vif [None req-ad107a4c-3018-4071-9cae-bd86f135e504 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-09-30T21:17:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-772926731',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-772926731',id=11,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:17:21Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='544a33c53701466d8bf7e8ed34f38dcb',ramdisk_id='',reservation_id='r-kiix6ynt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='2',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-860972404',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-860972404-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:18:02Z,user_data=None,user_id='981e96ea2bc2419d9a1e57d6aed70304',uuid=7f0e9e16-1467-41e5-b5b0-965591aa014c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "address": "fa:16:3e:45:9d:62", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb8ecc8e-9c", "ovs_interfaceid": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:18:52 compute-0 nova_compute[192810]: 2025-09-30 21:18:52.084 2 DEBUG nova.network.os_vif_util [None req-ad107a4c-3018-4071-9cae-bd86f135e504 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Converting VIF {"id": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "address": "fa:16:3e:45:9d:62", "network": {"id": "934fff90-5446-41f1-a5ad-d2568cb337b1", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-369604927-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "544a33c53701466d8bf7e8ed34f38dcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbb8ecc8e-9c", "ovs_interfaceid": "bb8ecc8e-9cf4-4901-9788-83c49356f983", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:18:52 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cb2c33ba4a5894e89ac1815456a1c9d7a479275890552e040bbde2bd51d09c03-userdata-shm.mount: Deactivated successfully.
Sep 30 21:18:52 compute-0 nova_compute[192810]: 2025-09-30 21:18:52.085 2 DEBUG nova.network.os_vif_util [None req-ad107a4c-3018-4071-9cae-bd86f135e504 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:45:9d:62,bridge_name='br-int',has_traffic_filtering=True,id=bb8ecc8e-9cf4-4901-9788-83c49356f983,network=Network(934fff90-5446-41f1-a5ad-d2568cb337b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb8ecc8e-9c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:18:52 compute-0 nova_compute[192810]: 2025-09-30 21:18:52.085 2 DEBUG os_vif [None req-ad107a4c-3018-4071-9cae-bd86f135e504 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:45:9d:62,bridge_name='br-int',has_traffic_filtering=True,id=bb8ecc8e-9cf4-4901-9788-83c49356f983,network=Network(934fff90-5446-41f1-a5ad-d2568cb337b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb8ecc8e-9c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:18:52 compute-0 nova_compute[192810]: 2025-09-30 21:18:52.087 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:52 compute-0 systemd[1]: var-lib-containers-storage-overlay-37e1a45db0bb6f0533b9c059f707e6cc1ccffdf73c36f7883fb8b3d608953dd6-merged.mount: Deactivated successfully.
Sep 30 21:18:52 compute-0 nova_compute[192810]: 2025-09-30 21:18:52.087 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbb8ecc8e-9c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:18:52 compute-0 nova_compute[192810]: 2025-09-30 21:18:52.089 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:52 compute-0 podman[223170]: 2025-09-30 21:18:52.090226163 +0000 UTC m=+0.168900839 container cleanup cb2c33ba4a5894e89ac1815456a1c9d7a479275890552e040bbde2bd51d09c03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:18:52 compute-0 nova_compute[192810]: 2025-09-30 21:18:52.090 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:52 compute-0 nova_compute[192810]: 2025-09-30 21:18:52.092 2 INFO os_vif [None req-ad107a4c-3018-4071-9cae-bd86f135e504 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:45:9d:62,bridge_name='br-int',has_traffic_filtering=True,id=bb8ecc8e-9cf4-4901-9788-83c49356f983,network=Network(934fff90-5446-41f1-a5ad-d2568cb337b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbb8ecc8e-9c')
Sep 30 21:18:52 compute-0 nova_compute[192810]: 2025-09-30 21:18:52.093 2 INFO nova.virt.libvirt.driver [None req-ad107a4c-3018-4071-9cae-bd86f135e504 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Deleting instance files /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c_del
Sep 30 21:18:52 compute-0 nova_compute[192810]: 2025-09-30 21:18:52.094 2 INFO nova.virt.libvirt.driver [None req-ad107a4c-3018-4071-9cae-bd86f135e504 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Deletion of /var/lib/nova/instances/7f0e9e16-1467-41e5-b5b0-965591aa014c_del complete
Sep 30 21:18:52 compute-0 nova_compute[192810]: 2025-09-30 21:18:52.098 2 DEBUG nova.compute.manager [req-497e7104-fb45-4d57-90bf-fc334c2472ee req-9a752884-ef06-4bad-bcca-dcf2b4cc1598 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Received event network-vif-unplugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:18:52 compute-0 nova_compute[192810]: 2025-09-30 21:18:52.098 2 DEBUG oslo_concurrency.lockutils [req-497e7104-fb45-4d57-90bf-fc334c2472ee req-9a752884-ef06-4bad-bcca-dcf2b4cc1598 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:52 compute-0 nova_compute[192810]: 2025-09-30 21:18:52.098 2 DEBUG oslo_concurrency.lockutils [req-497e7104-fb45-4d57-90bf-fc334c2472ee req-9a752884-ef06-4bad-bcca-dcf2b4cc1598 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:52 compute-0 nova_compute[192810]: 2025-09-30 21:18:52.099 2 DEBUG oslo_concurrency.lockutils [req-497e7104-fb45-4d57-90bf-fc334c2472ee req-9a752884-ef06-4bad-bcca-dcf2b4cc1598 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:52 compute-0 nova_compute[192810]: 2025-09-30 21:18:52.099 2 DEBUG nova.compute.manager [req-497e7104-fb45-4d57-90bf-fc334c2472ee req-9a752884-ef06-4bad-bcca-dcf2b4cc1598 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] No waiting events found dispatching network-vif-unplugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:18:52 compute-0 nova_compute[192810]: 2025-09-30 21:18:52.099 2 DEBUG nova.compute.manager [req-497e7104-fb45-4d57-90bf-fc334c2472ee req-9a752884-ef06-4bad-bcca-dcf2b4cc1598 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Received event network-vif-unplugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:18:52 compute-0 systemd[1]: libpod-conmon-cb2c33ba4a5894e89ac1815456a1c9d7a479275890552e040bbde2bd51d09c03.scope: Deactivated successfully.
Sep 30 21:18:52 compute-0 nova_compute[192810]: 2025-09-30 21:18:52.162 2 INFO nova.compute.manager [None req-ad107a4c-3018-4071-9cae-bd86f135e504 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Took 0.43 seconds to destroy the instance on the hypervisor.
Sep 30 21:18:52 compute-0 nova_compute[192810]: 2025-09-30 21:18:52.163 2 DEBUG oslo.service.loopingcall [None req-ad107a4c-3018-4071-9cae-bd86f135e504 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:18:52 compute-0 nova_compute[192810]: 2025-09-30 21:18:52.164 2 DEBUG nova.compute.manager [-] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:18:52 compute-0 nova_compute[192810]: 2025-09-30 21:18:52.164 2 DEBUG nova.network.neutron [-] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:18:52 compute-0 podman[223206]: 2025-09-30 21:18:52.166420738 +0000 UTC m=+0.045245590 container remove cb2c33ba4a5894e89ac1815456a1c9d7a479275890552e040bbde2bd51d09c03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3)
Sep 30 21:18:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:52.174 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[4d80570e-87ae-4ae3-9b9e-29c3aed6b0f6]: (4, ('Tue Sep 30 09:18:51 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1 (cb2c33ba4a5894e89ac1815456a1c9d7a479275890552e040bbde2bd51d09c03)\ncb2c33ba4a5894e89ac1815456a1c9d7a479275890552e040bbde2bd51d09c03\nTue Sep 30 09:18:52 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1 (cb2c33ba4a5894e89ac1815456a1c9d7a479275890552e040bbde2bd51d09c03)\ncb2c33ba4a5894e89ac1815456a1c9d7a479275890552e040bbde2bd51d09c03\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:52.175 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[9e8493ff-d549-487a-820b-b51fdb685aeb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:52.176 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap934fff90-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:18:52 compute-0 nova_compute[192810]: 2025-09-30 21:18:52.178 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:52 compute-0 kernel: tap934fff90-50: left promiscuous mode
Sep 30 21:18:52 compute-0 nova_compute[192810]: 2025-09-30 21:18:52.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:52.192 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e7f677c1-cfc8-40c0-978c-e6c60a03e3fd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:52.222 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[154807ed-4cd7-4fd8-bf2c-e48bf6d88347]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:52.223 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[bca60737-3d86-4fc3-8650-bc65d5b136c4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:52.246 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[98f1344c-0644-4140-b97d-53888af12c2f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 377379, 'reachable_time': 40128, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223221, 'error': None, 'target': 'ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:52.248 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-934fff90-5446-41f1-a5ad-d2568cb337b1 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:18:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:52.249 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[77b76848-a4f5-45fe-b848-89e427a9ae74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:52.249 103867 INFO neutron.agent.ovn.metadata.agent [-] Port bb8ecc8e-9cf4-4901-9788-83c49356f983 in datapath 934fff90-5446-41f1-a5ad-d2568cb337b1 unbound from our chassis
Sep 30 21:18:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:52.251 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 934fff90-5446-41f1-a5ad-d2568cb337b1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:18:52 compute-0 systemd[1]: run-netns-ovnmeta\x2d934fff90\x2d5446\x2d41f1\x2da5ad\x2dd2568cb337b1.mount: Deactivated successfully.
Sep 30 21:18:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:52.252 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a53229e4-3ac8-42b8-8f59-032172c1b4da]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:52.252 103867 INFO neutron.agent.ovn.metadata.agent [-] Port bb8ecc8e-9cf4-4901-9788-83c49356f983 in datapath 934fff90-5446-41f1-a5ad-d2568cb337b1 unbound from our chassis
Sep 30 21:18:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:52.254 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 934fff90-5446-41f1-a5ad-d2568cb337b1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:18:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:18:52.254 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[fdefa941-ce32-4a89-86a6-5452b7fe110f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:18:52 compute-0 nova_compute[192810]: 2025-09-30 21:18:52.923 2 DEBUG nova.network.neutron [-] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:18:52 compute-0 nova_compute[192810]: 2025-09-30 21:18:52.939 2 INFO nova.compute.manager [-] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Took 0.78 seconds to deallocate network for instance.
Sep 30 21:18:53 compute-0 nova_compute[192810]: 2025-09-30 21:18:53.058 2 DEBUG oslo_concurrency.lockutils [None req-ad107a4c-3018-4071-9cae-bd86f135e504 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:53 compute-0 nova_compute[192810]: 2025-09-30 21:18:53.059 2 DEBUG oslo_concurrency.lockutils [None req-ad107a4c-3018-4071-9cae-bd86f135e504 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:53 compute-0 nova_compute[192810]: 2025-09-30 21:18:53.127 2 DEBUG nova.compute.provider_tree [None req-ad107a4c-3018-4071-9cae-bd86f135e504 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:18:53 compute-0 nova_compute[192810]: 2025-09-30 21:18:53.150 2 DEBUG nova.scheduler.client.report [None req-ad107a4c-3018-4071-9cae-bd86f135e504 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:18:53 compute-0 nova_compute[192810]: 2025-09-30 21:18:53.175 2 DEBUG nova.compute.manager [req-f9d9b2e9-ce92-4f1c-b819-752d95e68937 req-733c540b-abc9-4490-a5ba-5e0c501e3c02 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Received event network-vif-deleted-bb8ecc8e-9cf4-4901-9788-83c49356f983 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:18:53 compute-0 nova_compute[192810]: 2025-09-30 21:18:53.178 2 DEBUG oslo_concurrency.lockutils [None req-ad107a4c-3018-4071-9cae-bd86f135e504 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:53 compute-0 nova_compute[192810]: 2025-09-30 21:18:53.210 2 INFO nova.scheduler.client.report [None req-ad107a4c-3018-4071-9cae-bd86f135e504 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Deleted allocations for instance 7f0e9e16-1467-41e5-b5b0-965591aa014c
Sep 30 21:18:53 compute-0 nova_compute[192810]: 2025-09-30 21:18:53.309 2 DEBUG oslo_concurrency.lockutils [None req-ad107a4c-3018-4071-9cae-bd86f135e504 981e96ea2bc2419d9a1e57d6aed70304 544a33c53701466d8bf7e8ed34f38dcb - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.606s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:53 compute-0 podman[223222]: 2025-09-30 21:18:53.425140562 +0000 UTC m=+0.153826047 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:18:54 compute-0 nova_compute[192810]: 2025-09-30 21:18:54.198 2 DEBUG nova.compute.manager [req-aaa5ef19-7bb0-4ab1-8ddb-cea5c0227f73 req-b925bf8d-c2db-4f02-a3b5-87c65817d6bd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Received event network-vif-plugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:18:54 compute-0 nova_compute[192810]: 2025-09-30 21:18:54.199 2 DEBUG oslo_concurrency.lockutils [req-aaa5ef19-7bb0-4ab1-8ddb-cea5c0227f73 req-b925bf8d-c2db-4f02-a3b5-87c65817d6bd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:54 compute-0 nova_compute[192810]: 2025-09-30 21:18:54.199 2 DEBUG oslo_concurrency.lockutils [req-aaa5ef19-7bb0-4ab1-8ddb-cea5c0227f73 req-b925bf8d-c2db-4f02-a3b5-87c65817d6bd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:54 compute-0 nova_compute[192810]: 2025-09-30 21:18:54.200 2 DEBUG oslo_concurrency.lockutils [req-aaa5ef19-7bb0-4ab1-8ddb-cea5c0227f73 req-b925bf8d-c2db-4f02-a3b5-87c65817d6bd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:54 compute-0 nova_compute[192810]: 2025-09-30 21:18:54.200 2 DEBUG nova.compute.manager [req-aaa5ef19-7bb0-4ab1-8ddb-cea5c0227f73 req-b925bf8d-c2db-4f02-a3b5-87c65817d6bd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] No waiting events found dispatching network-vif-plugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:18:54 compute-0 nova_compute[192810]: 2025-09-30 21:18:54.201 2 WARNING nova.compute.manager [req-aaa5ef19-7bb0-4ab1-8ddb-cea5c0227f73 req-b925bf8d-c2db-4f02-a3b5-87c65817d6bd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Received unexpected event network-vif-plugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 for instance with vm_state deleted and task_state None.
Sep 30 21:18:54 compute-0 nova_compute[192810]: 2025-09-30 21:18:54.201 2 DEBUG nova.compute.manager [req-aaa5ef19-7bb0-4ab1-8ddb-cea5c0227f73 req-b925bf8d-c2db-4f02-a3b5-87c65817d6bd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Received event network-vif-plugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:18:54 compute-0 nova_compute[192810]: 2025-09-30 21:18:54.202 2 DEBUG oslo_concurrency.lockutils [req-aaa5ef19-7bb0-4ab1-8ddb-cea5c0227f73 req-b925bf8d-c2db-4f02-a3b5-87c65817d6bd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:54 compute-0 nova_compute[192810]: 2025-09-30 21:18:54.202 2 DEBUG oslo_concurrency.lockutils [req-aaa5ef19-7bb0-4ab1-8ddb-cea5c0227f73 req-b925bf8d-c2db-4f02-a3b5-87c65817d6bd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:54 compute-0 nova_compute[192810]: 2025-09-30 21:18:54.203 2 DEBUG oslo_concurrency.lockutils [req-aaa5ef19-7bb0-4ab1-8ddb-cea5c0227f73 req-b925bf8d-c2db-4f02-a3b5-87c65817d6bd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:54 compute-0 nova_compute[192810]: 2025-09-30 21:18:54.203 2 DEBUG nova.compute.manager [req-aaa5ef19-7bb0-4ab1-8ddb-cea5c0227f73 req-b925bf8d-c2db-4f02-a3b5-87c65817d6bd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] No waiting events found dispatching network-vif-plugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:18:54 compute-0 nova_compute[192810]: 2025-09-30 21:18:54.204 2 WARNING nova.compute.manager [req-aaa5ef19-7bb0-4ab1-8ddb-cea5c0227f73 req-b925bf8d-c2db-4f02-a3b5-87c65817d6bd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Received unexpected event network-vif-plugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 for instance with vm_state deleted and task_state None.
Sep 30 21:18:54 compute-0 nova_compute[192810]: 2025-09-30 21:18:54.204 2 DEBUG nova.compute.manager [req-aaa5ef19-7bb0-4ab1-8ddb-cea5c0227f73 req-b925bf8d-c2db-4f02-a3b5-87c65817d6bd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Received event network-vif-plugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:18:54 compute-0 nova_compute[192810]: 2025-09-30 21:18:54.205 2 DEBUG oslo_concurrency.lockutils [req-aaa5ef19-7bb0-4ab1-8ddb-cea5c0227f73 req-b925bf8d-c2db-4f02-a3b5-87c65817d6bd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:54 compute-0 nova_compute[192810]: 2025-09-30 21:18:54.205 2 DEBUG oslo_concurrency.lockutils [req-aaa5ef19-7bb0-4ab1-8ddb-cea5c0227f73 req-b925bf8d-c2db-4f02-a3b5-87c65817d6bd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:54 compute-0 nova_compute[192810]: 2025-09-30 21:18:54.205 2 DEBUG oslo_concurrency.lockutils [req-aaa5ef19-7bb0-4ab1-8ddb-cea5c0227f73 req-b925bf8d-c2db-4f02-a3b5-87c65817d6bd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:54 compute-0 nova_compute[192810]: 2025-09-30 21:18:54.206 2 DEBUG nova.compute.manager [req-aaa5ef19-7bb0-4ab1-8ddb-cea5c0227f73 req-b925bf8d-c2db-4f02-a3b5-87c65817d6bd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] No waiting events found dispatching network-vif-plugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:18:54 compute-0 nova_compute[192810]: 2025-09-30 21:18:54.206 2 WARNING nova.compute.manager [req-aaa5ef19-7bb0-4ab1-8ddb-cea5c0227f73 req-b925bf8d-c2db-4f02-a3b5-87c65817d6bd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Received unexpected event network-vif-plugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 for instance with vm_state deleted and task_state None.
Sep 30 21:18:54 compute-0 nova_compute[192810]: 2025-09-30 21:18:54.207 2 DEBUG nova.compute.manager [req-aaa5ef19-7bb0-4ab1-8ddb-cea5c0227f73 req-b925bf8d-c2db-4f02-a3b5-87c65817d6bd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Received event network-vif-plugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:18:54 compute-0 nova_compute[192810]: 2025-09-30 21:18:54.207 2 DEBUG oslo_concurrency.lockutils [req-aaa5ef19-7bb0-4ab1-8ddb-cea5c0227f73 req-b925bf8d-c2db-4f02-a3b5-87c65817d6bd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:18:54 compute-0 nova_compute[192810]: 2025-09-30 21:18:54.208 2 DEBUG oslo_concurrency.lockutils [req-aaa5ef19-7bb0-4ab1-8ddb-cea5c0227f73 req-b925bf8d-c2db-4f02-a3b5-87c65817d6bd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:18:54 compute-0 nova_compute[192810]: 2025-09-30 21:18:54.208 2 DEBUG oslo_concurrency.lockutils [req-aaa5ef19-7bb0-4ab1-8ddb-cea5c0227f73 req-b925bf8d-c2db-4f02-a3b5-87c65817d6bd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7f0e9e16-1467-41e5-b5b0-965591aa014c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:18:54 compute-0 nova_compute[192810]: 2025-09-30 21:18:54.209 2 DEBUG nova.compute.manager [req-aaa5ef19-7bb0-4ab1-8ddb-cea5c0227f73 req-b925bf8d-c2db-4f02-a3b5-87c65817d6bd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] No waiting events found dispatching network-vif-plugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:18:54 compute-0 nova_compute[192810]: 2025-09-30 21:18:54.209 2 WARNING nova.compute.manager [req-aaa5ef19-7bb0-4ab1-8ddb-cea5c0227f73 req-b925bf8d-c2db-4f02-a3b5-87c65817d6bd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Received unexpected event network-vif-plugged-bb8ecc8e-9cf4-4901-9788-83c49356f983 for instance with vm_state deleted and task_state None.
Sep 30 21:18:54 compute-0 sshd-session[223141]: Failed password for invalid user azure from 45.81.23.80 port 37418 ssh2
Sep 30 21:18:55 compute-0 podman[223248]: 2025-09-30 21:18:55.346848387 +0000 UTC m=+0.072784778 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20250923)
Sep 30 21:18:56 compute-0 nova_compute[192810]: 2025-09-30 21:18:56.073 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:56 compute-0 sshd-session[223141]: Received disconnect from 45.81.23.80 port 37418:11: Bye Bye [preauth]
Sep 30 21:18:56 compute-0 sshd-session[223141]: Disconnected from invalid user azure 45.81.23.80 port 37418 [preauth]
Sep 30 21:18:56 compute-0 nova_compute[192810]: 2025-09-30 21:18:56.396 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:57 compute-0 nova_compute[192810]: 2025-09-30 21:18:57.090 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:18:59 compute-0 podman[223269]: 2025-09-30 21:18:59.344166206 +0000 UTC m=+0.080094935 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2)
Sep 30 21:19:01 compute-0 nova_compute[192810]: 2025-09-30 21:19:01.107 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267126.1054516, 800f4413-c978-4c4e-97b6-1ea1e45f9f17 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:19:01 compute-0 nova_compute[192810]: 2025-09-30 21:19:01.108 2 INFO nova.compute.manager [-] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] VM Stopped (Lifecycle Event)
Sep 30 21:19:01 compute-0 nova_compute[192810]: 2025-09-30 21:19:01.128 2 DEBUG nova.compute.manager [None req-b41fb964-577e-43cb-ba24-e10b6c6d4201 - - - - - -] [instance: 800f4413-c978-4c4e-97b6-1ea1e45f9f17] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:19:01 compute-0 nova_compute[192810]: 2025-09-30 21:19:01.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:19:02 compute-0 nova_compute[192810]: 2025-09-30 21:19:02.131 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:19:06 compute-0 nova_compute[192810]: 2025-09-30 21:19:06.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:19:07 compute-0 nova_compute[192810]: 2025-09-30 21:19:07.061 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267132.0592914, 7f0e9e16-1467-41e5-b5b0-965591aa014c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:19:07 compute-0 nova_compute[192810]: 2025-09-30 21:19:07.061 2 INFO nova.compute.manager [-] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] VM Stopped (Lifecycle Event)
Sep 30 21:19:07 compute-0 nova_compute[192810]: 2025-09-30 21:19:07.088 2 DEBUG nova.compute.manager [None req-7b573066-4463-4ad7-8fcf-2895902fa3b7 - - - - - -] [instance: 7f0e9e16-1467-41e5-b5b0-965591aa014c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:19:07 compute-0 nova_compute[192810]: 2025-09-30 21:19:07.134 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:19:07 compute-0 podman[223289]: 2025-09-30 21:19:07.347139912 +0000 UTC m=+0.073875905 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:19:07 compute-0 podman[223290]: 2025-09-30 21:19:07.358977503 +0000 UTC m=+0.075188939 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.buildah.version=1.33.7, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, io.openshift.tags=minimal rhel9, 
com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Sep 30 21:19:10 compute-0 sshd-session[223268]: ssh_dispatch_run_fatal: Connection from 49.64.169.153 port 35251: Connection timed out [preauth]
Sep 30 21:19:11 compute-0 nova_compute[192810]: 2025-09-30 21:19:11.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:19:12 compute-0 nova_compute[192810]: 2025-09-30 21:19:12.137 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:19:14 compute-0 podman[223333]: 2025-09-30 21:19:14.312864496 +0000 UTC m=+0.054624588 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=iscsid, container_name=iscsid, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Sep 30 21:19:16 compute-0 nova_compute[192810]: 2025-09-30 21:19:16.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:19:17 compute-0 nova_compute[192810]: 2025-09-30 21:19:17.140 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:19:19 compute-0 podman[223353]: 2025-09-30 21:19:19.32735147 +0000 UTC m=+0.063474760 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 21:19:19 compute-0 podman[223354]: 2025-09-30 21:19:19.328410716 +0000 UTC m=+0.062707161 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:19:21 compute-0 nova_compute[192810]: 2025-09-30 21:19:21.449 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:19:22 compute-0 nova_compute[192810]: 2025-09-30 21:19:22.142 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:19:24 compute-0 podman[223396]: 2025-09-30 21:19:24.381944885 +0000 UTC m=+0.102780967 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:19:26 compute-0 podman[223422]: 2025-09-30 21:19:26.363646651 +0000 UTC m=+0.093488787 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Sep 30 21:19:26 compute-0 nova_compute[192810]: 2025-09-30 21:19:26.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:19:27 compute-0 nova_compute[192810]: 2025-09-30 21:19:27.144 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:19:27 compute-0 nova_compute[192810]: 2025-09-30 21:19:27.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:19:27 compute-0 nova_compute[192810]: 2025-09-30 21:19:27.787 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Sep 30 21:19:27 compute-0 nova_compute[192810]: 2025-09-30 21:19:27.807 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Sep 30 21:19:29 compute-0 nova_compute[192810]: 2025-09-30 21:19:29.438 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:19:30 compute-0 podman[223441]: 2025-09-30 21:19:30.347203063 +0000 UTC m=+0.085157519 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Sep 30 21:19:31 compute-0 ovn_controller[94912]: 2025-09-30T21:19:31Z|00116|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Sep 30 21:19:31 compute-0 nova_compute[192810]: 2025-09-30 21:19:31.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:19:32 compute-0 nova_compute[192810]: 2025-09-30 21:19:32.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:19:32 compute-0 nova_compute[192810]: 2025-09-30 21:19:32.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:19:32 compute-0 nova_compute[192810]: 2025-09-30 21:19:32.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Sep 30 21:19:33 compute-0 nova_compute[192810]: 2025-09-30 21:19:33.800 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:19:34 compute-0 nova_compute[192810]: 2025-09-30 21:19:34.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:19:34 compute-0 nova_compute[192810]: 2025-09-30 21:19:34.789 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:19:36 compute-0 nova_compute[192810]: 2025-09-30 21:19:36.264 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:19:36 compute-0 nova_compute[192810]: 2025-09-30 21:19:36.265 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:19:36 compute-0 nova_compute[192810]: 2025-09-30 21:19:36.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:19:36 compute-0 nova_compute[192810]: 2025-09-30 21:19:36.784 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:19:36 compute-0 nova_compute[192810]: 2025-09-30 21:19:36.824 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:19:37 compute-0 nova_compute[192810]: 2025-09-30 21:19:37.149 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:19:38 compute-0 podman[223462]: 2025-09-30 21:19:38.341503694 +0000 UTC m=+0.066234969 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, release=1755695350, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Sep 30 21:19:38 compute-0 podman[223461]: 2025-09-30 21:19:38.364007043 +0000 UTC m=+0.087157419 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Sep 30 21:19:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:19:38.722 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:19:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:19:38.724 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:19:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:19:38.724 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:19:38 compute-0 nova_compute[192810]: 2025-09-30 21:19:38.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:19:38 compute-0 nova_compute[192810]: 2025-09-30 21:19:38.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:19:38 compute-0 nova_compute[192810]: 2025-09-30 21:19:38.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:19:38 compute-0 nova_compute[192810]: 2025-09-30 21:19:38.825 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:19:38 compute-0 nova_compute[192810]: 2025-09-30 21:19:38.825 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:19:38 compute-0 nova_compute[192810]: 2025-09-30 21:19:38.825 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:19:38 compute-0 nova_compute[192810]: 2025-09-30 21:19:38.825 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:19:38 compute-0 nova_compute[192810]: 2025-09-30 21:19:38.995 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:19:38 compute-0 nova_compute[192810]: 2025-09-30 21:19:38.996 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5778MB free_disk=73.46318817138672GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:19:38 compute-0 nova_compute[192810]: 2025-09-30 21:19:38.996 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:19:38 compute-0 nova_compute[192810]: 2025-09-30 21:19:38.997 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:19:39 compute-0 nova_compute[192810]: 2025-09-30 21:19:39.070 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:19:39 compute-0 nova_compute[192810]: 2025-09-30 21:19:39.070 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:19:39 compute-0 nova_compute[192810]: 2025-09-30 21:19:39.373 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:19:39 compute-0 nova_compute[192810]: 2025-09-30 21:19:39.395 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:19:39 compute-0 nova_compute[192810]: 2025-09-30 21:19:39.434 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:19:39 compute-0 nova_compute[192810]: 2025-09-30 21:19:39.434 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.438s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:19:40 compute-0 nova_compute[192810]: 2025-09-30 21:19:40.434 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:19:40 compute-0 nova_compute[192810]: 2025-09-30 21:19:40.435 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:19:40 compute-0 nova_compute[192810]: 2025-09-30 21:19:40.435 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:19:40 compute-0 nova_compute[192810]: 2025-09-30 21:19:40.459 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 21:19:40 compute-0 nova_compute[192810]: 2025-09-30 21:19:40.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:19:41 compute-0 nova_compute[192810]: 2025-09-30 21:19:41.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:19:42 compute-0 nova_compute[192810]: 2025-09-30 21:19:42.152 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:19:45 compute-0 podman[223505]: 2025-09-30 21:19:45.351837837 +0000 UTC m=+0.076681888 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=iscsid, container_name=iscsid, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Sep 30 21:19:46 compute-0 nova_compute[192810]: 2025-09-30 21:19:46.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:19:46 compute-0 sshd-session[223526]: Invalid user minecraft from 45.81.23.80 port 60492
Sep 30 21:19:46 compute-0 sshd-session[223526]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:19:46 compute-0 sshd-session[223526]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=45.81.23.80
Sep 30 21:19:47 compute-0 nova_compute[192810]: 2025-09-30 21:19:47.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:19:49 compute-0 sshd-session[223526]: Failed password for invalid user minecraft from 45.81.23.80 port 60492 ssh2
Sep 30 21:19:49 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:19:49.232 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:19:49 compute-0 nova_compute[192810]: 2025-09-30 21:19:49.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:19:49 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:19:49.234 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:19:50 compute-0 sshd-session[223526]: Received disconnect from 45.81.23.80 port 60492:11: Bye Bye [preauth]
Sep 30 21:19:50 compute-0 sshd-session[223526]: Disconnected from invalid user minecraft 45.81.23.80 port 60492 [preauth]
Sep 30 21:19:50 compute-0 podman[223528]: 2025-09-30 21:19:50.338654046 +0000 UTC m=+0.075954229 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd)
Sep 30 21:19:50 compute-0 podman[223529]: 2025-09-30 21:19:50.360497659 +0000 UTC m=+0.084031360 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 21:19:51 compute-0 nova_compute[192810]: 2025-09-30 21:19:51.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:19:52 compute-0 nova_compute[192810]: 2025-09-30 21:19:52.157 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:19:52 compute-0 sshd-session[223504]: error: kex_exchange_identification: read: Connection timed out
Sep 30 21:19:52 compute-0 sshd-session[223504]: banner exchange: Connection from 113.240.110.90 port 55758: Connection timed out
Sep 30 21:19:55 compute-0 podman[223570]: 2025-09-30 21:19:55.420708214 +0000 UTC m=+0.145779608 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923)
Sep 30 21:19:56 compute-0 nova_compute[192810]: 2025-09-30 21:19:56.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:19:57 compute-0 nova_compute[192810]: 2025-09-30 21:19:57.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:19:57 compute-0 podman[223597]: 2025-09-30 21:19:57.329110076 +0000 UTC m=+0.067023318 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2)
Sep 30 21:19:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:19:58.236 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:19:59 compute-0 nova_compute[192810]: 2025-09-30 21:19:59.399 2 DEBUG oslo_concurrency.lockutils [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Acquiring lock "c1a4fc59-617f-432f-9006-e50d45407087" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:19:59 compute-0 nova_compute[192810]: 2025-09-30 21:19:59.399 2 DEBUG oslo_concurrency.lockutils [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Lock "c1a4fc59-617f-432f-9006-e50d45407087" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:19:59 compute-0 nova_compute[192810]: 2025-09-30 21:19:59.429 2 DEBUG nova.compute.manager [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: c1a4fc59-617f-432f-9006-e50d45407087] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:19:59 compute-0 nova_compute[192810]: 2025-09-30 21:19:59.591 2 DEBUG oslo_concurrency.lockutils [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:19:59 compute-0 nova_compute[192810]: 2025-09-30 21:19:59.592 2 DEBUG oslo_concurrency.lockutils [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:19:59 compute-0 nova_compute[192810]: 2025-09-30 21:19:59.598 2 DEBUG nova.virt.hardware [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:19:59 compute-0 nova_compute[192810]: 2025-09-30 21:19:59.598 2 INFO nova.compute.claims [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: c1a4fc59-617f-432f-9006-e50d45407087] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:19:59 compute-0 nova_compute[192810]: 2025-09-30 21:19:59.889 2 DEBUG nova.compute.provider_tree [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:19:59 compute-0 nova_compute[192810]: 2025-09-30 21:19:59.902 2 DEBUG nova.scheduler.client.report [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:19:59 compute-0 nova_compute[192810]: 2025-09-30 21:19:59.947 2 DEBUG oslo_concurrency.lockutils [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.355s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:19:59 compute-0 nova_compute[192810]: 2025-09-30 21:19:59.948 2 DEBUG nova.compute.manager [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: c1a4fc59-617f-432f-9006-e50d45407087] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:20:00 compute-0 nova_compute[192810]: 2025-09-30 21:20:00.023 2 DEBUG nova.compute.manager [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: c1a4fc59-617f-432f-9006-e50d45407087] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:20:00 compute-0 nova_compute[192810]: 2025-09-30 21:20:00.024 2 DEBUG nova.network.neutron [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: c1a4fc59-617f-432f-9006-e50d45407087] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:20:00 compute-0 nova_compute[192810]: 2025-09-30 21:20:00.054 2 INFO nova.virt.libvirt.driver [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: c1a4fc59-617f-432f-9006-e50d45407087] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:20:00 compute-0 nova_compute[192810]: 2025-09-30 21:20:00.074 2 DEBUG nova.compute.manager [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: c1a4fc59-617f-432f-9006-e50d45407087] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:20:00 compute-0 nova_compute[192810]: 2025-09-30 21:20:00.422 2 DEBUG nova.compute.manager [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: c1a4fc59-617f-432f-9006-e50d45407087] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:20:00 compute-0 nova_compute[192810]: 2025-09-30 21:20:00.423 2 DEBUG nova.virt.libvirt.driver [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: c1a4fc59-617f-432f-9006-e50d45407087] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:20:00 compute-0 nova_compute[192810]: 2025-09-30 21:20:00.423 2 INFO nova.virt.libvirt.driver [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: c1a4fc59-617f-432f-9006-e50d45407087] Creating image(s)
Sep 30 21:20:00 compute-0 nova_compute[192810]: 2025-09-30 21:20:00.424 2 DEBUG oslo_concurrency.lockutils [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Acquiring lock "/var/lib/nova/instances/c1a4fc59-617f-432f-9006-e50d45407087/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:20:00 compute-0 nova_compute[192810]: 2025-09-30 21:20:00.424 2 DEBUG oslo_concurrency.lockutils [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Lock "/var/lib/nova/instances/c1a4fc59-617f-432f-9006-e50d45407087/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:20:00 compute-0 nova_compute[192810]: 2025-09-30 21:20:00.424 2 DEBUG oslo_concurrency.lockutils [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Lock "/var/lib/nova/instances/c1a4fc59-617f-432f-9006-e50d45407087/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:00 compute-0 nova_compute[192810]: 2025-09-30 21:20:00.436 2 DEBUG oslo_concurrency.processutils [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:00 compute-0 nova_compute[192810]: 2025-09-30 21:20:00.511 2 DEBUG oslo_concurrency.processutils [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:00 compute-0 nova_compute[192810]: 2025-09-30 21:20:00.512 2 DEBUG oslo_concurrency.lockutils [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:20:00 compute-0 nova_compute[192810]: 2025-09-30 21:20:00.512 2 DEBUG oslo_concurrency.lockutils [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:20:00 compute-0 nova_compute[192810]: 2025-09-30 21:20:00.522 2 DEBUG oslo_concurrency.processutils [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:00 compute-0 nova_compute[192810]: 2025-09-30 21:20:00.614 2 DEBUG oslo_concurrency.processutils [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:00 compute-0 nova_compute[192810]: 2025-09-30 21:20:00.616 2 DEBUG oslo_concurrency.processutils [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/c1a4fc59-617f-432f-9006-e50d45407087/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:00 compute-0 nova_compute[192810]: 2025-09-30 21:20:00.681 2 DEBUG oslo_concurrency.processutils [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/c1a4fc59-617f-432f-9006-e50d45407087/disk 1073741824" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:00 compute-0 nova_compute[192810]: 2025-09-30 21:20:00.683 2 DEBUG oslo_concurrency.lockutils [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:00 compute-0 nova_compute[192810]: 2025-09-30 21:20:00.685 2 DEBUG oslo_concurrency.processutils [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:00 compute-0 nova_compute[192810]: 2025-09-30 21:20:00.771 2 DEBUG oslo_concurrency.processutils [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:00 compute-0 nova_compute[192810]: 2025-09-30 21:20:00.773 2 DEBUG nova.virt.disk.api [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Checking if we can resize image /var/lib/nova/instances/c1a4fc59-617f-432f-9006-e50d45407087/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:20:00 compute-0 nova_compute[192810]: 2025-09-30 21:20:00.774 2 DEBUG oslo_concurrency.processutils [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1a4fc59-617f-432f-9006-e50d45407087/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:00 compute-0 nova_compute[192810]: 2025-09-30 21:20:00.838 2 DEBUG oslo_concurrency.processutils [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1a4fc59-617f-432f-9006-e50d45407087/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:00 compute-0 nova_compute[192810]: 2025-09-30 21:20:00.840 2 DEBUG nova.virt.disk.api [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Cannot resize image /var/lib/nova/instances/c1a4fc59-617f-432f-9006-e50d45407087/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:20:00 compute-0 nova_compute[192810]: 2025-09-30 21:20:00.840 2 DEBUG nova.objects.instance [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Lazy-loading 'migration_context' on Instance uuid c1a4fc59-617f-432f-9006-e50d45407087 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:20:00 compute-0 nova_compute[192810]: 2025-09-30 21:20:00.860 2 DEBUG nova.virt.libvirt.driver [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: c1a4fc59-617f-432f-9006-e50d45407087] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:20:00 compute-0 nova_compute[192810]: 2025-09-30 21:20:00.861 2 DEBUG nova.virt.libvirt.driver [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: c1a4fc59-617f-432f-9006-e50d45407087] Ensure instance console log exists: /var/lib/nova/instances/c1a4fc59-617f-432f-9006-e50d45407087/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:20:00 compute-0 nova_compute[192810]: 2025-09-30 21:20:00.862 2 DEBUG oslo_concurrency.lockutils [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:20:00 compute-0 nova_compute[192810]: 2025-09-30 21:20:00.862 2 DEBUG oslo_concurrency.lockutils [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:20:00 compute-0 nova_compute[192810]: 2025-09-30 21:20:00.863 2 DEBUG oslo_concurrency.lockutils [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:01 compute-0 podman[223632]: 2025-09-30 21:20:01.350268574 +0000 UTC m=+0.079922299 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923)
Sep 30 21:20:01 compute-0 nova_compute[192810]: 2025-09-30 21:20:01.405 2 DEBUG nova.network.neutron [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: c1a4fc59-617f-432f-9006-e50d45407087] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Sep 30 21:20:01 compute-0 nova_compute[192810]: 2025-09-30 21:20:01.406 2 DEBUG nova.compute.manager [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: c1a4fc59-617f-432f-9006-e50d45407087] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:20:01 compute-0 nova_compute[192810]: 2025-09-30 21:20:01.409 2 DEBUG nova.virt.libvirt.driver [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: c1a4fc59-617f-432f-9006-e50d45407087] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:20:01 compute-0 nova_compute[192810]: 2025-09-30 21:20:01.417 2 WARNING nova.virt.libvirt.driver [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:20:01 compute-0 nova_compute[192810]: 2025-09-30 21:20:01.422 2 DEBUG nova.virt.libvirt.host [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:20:01 compute-0 nova_compute[192810]: 2025-09-30 21:20:01.423 2 DEBUG nova.virt.libvirt.host [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:20:01 compute-0 nova_compute[192810]: 2025-09-30 21:20:01.427 2 DEBUG nova.virt.libvirt.host [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:20:01 compute-0 nova_compute[192810]: 2025-09-30 21:20:01.428 2 DEBUG nova.virt.libvirt.host [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:20:01 compute-0 nova_compute[192810]: 2025-09-30 21:20:01.431 2 DEBUG nova.virt.libvirt.driver [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:20:01 compute-0 nova_compute[192810]: 2025-09-30 21:20:01.431 2 DEBUG nova.virt.hardware [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:20:01 compute-0 nova_compute[192810]: 2025-09-30 21:20:01.432 2 DEBUG nova.virt.hardware [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:20:01 compute-0 nova_compute[192810]: 2025-09-30 21:20:01.433 2 DEBUG nova.virt.hardware [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:20:01 compute-0 nova_compute[192810]: 2025-09-30 21:20:01.433 2 DEBUG nova.virt.hardware [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:20:01 compute-0 nova_compute[192810]: 2025-09-30 21:20:01.434 2 DEBUG nova.virt.hardware [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:20:01 compute-0 nova_compute[192810]: 2025-09-30 21:20:01.434 2 DEBUG nova.virt.hardware [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:20:01 compute-0 nova_compute[192810]: 2025-09-30 21:20:01.435 2 DEBUG nova.virt.hardware [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:20:01 compute-0 nova_compute[192810]: 2025-09-30 21:20:01.435 2 DEBUG nova.virt.hardware [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:20:01 compute-0 nova_compute[192810]: 2025-09-30 21:20:01.436 2 DEBUG nova.virt.hardware [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:20:01 compute-0 nova_compute[192810]: 2025-09-30 21:20:01.436 2 DEBUG nova.virt.hardware [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:20:01 compute-0 nova_compute[192810]: 2025-09-30 21:20:01.437 2 DEBUG nova.virt.hardware [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:20:01 compute-0 nova_compute[192810]: 2025-09-30 21:20:01.446 2 DEBUG nova.objects.instance [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Lazy-loading 'pci_devices' on Instance uuid c1a4fc59-617f-432f-9006-e50d45407087 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:20:01 compute-0 nova_compute[192810]: 2025-09-30 21:20:01.474 2 DEBUG nova.virt.libvirt.driver [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: c1a4fc59-617f-432f-9006-e50d45407087] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:20:01 compute-0 nova_compute[192810]:   <uuid>c1a4fc59-617f-432f-9006-e50d45407087</uuid>
Sep 30 21:20:01 compute-0 nova_compute[192810]:   <name>instance-00000015</name>
Sep 30 21:20:01 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:20:01 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:20:01 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:20:01 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:20:01 compute-0 nova_compute[192810]:       <nova:name>tempest-LiveMigrationNegativeTest-server-1477542592</nova:name>
Sep 30 21:20:01 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:20:01</nova:creationTime>
Sep 30 21:20:01 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:20:01 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:20:01 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:20:01 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:20:01 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:20:01 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:20:01 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:20:01 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:20:01 compute-0 nova_compute[192810]:         <nova:user uuid="5343f32daec8477c9c7ddb1131d56512">tempest-LiveMigrationNegativeTest-2017460260-project-member</nova:user>
Sep 30 21:20:01 compute-0 nova_compute[192810]:         <nova:project uuid="0dde308687c844d78abe51fa41f330be">tempest-LiveMigrationNegativeTest-2017460260</nova:project>
Sep 30 21:20:01 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:20:01 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:20:01 compute-0 nova_compute[192810]:       <nova:ports/>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:20:01 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:20:01 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:20:01 compute-0 nova_compute[192810]:     <system>
Sep 30 21:20:01 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:20:01 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:20:01 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:20:01 compute-0 nova_compute[192810]:       <entry name="serial">c1a4fc59-617f-432f-9006-e50d45407087</entry>
Sep 30 21:20:01 compute-0 nova_compute[192810]:       <entry name="uuid">c1a4fc59-617f-432f-9006-e50d45407087</entry>
Sep 30 21:20:01 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     </system>
Sep 30 21:20:01 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:20:01 compute-0 nova_compute[192810]:   <os>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:20:01 compute-0 nova_compute[192810]:   </os>
Sep 30 21:20:01 compute-0 nova_compute[192810]:   <features>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:20:01 compute-0 nova_compute[192810]:   </features>
Sep 30 21:20:01 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:20:01 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:20:01 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:20:01 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:20:01 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:20:01 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:20:01 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:20:01 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:20:01 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/c1a4fc59-617f-432f-9006-e50d45407087/disk"/>
Sep 30 21:20:01 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:20:01 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:20:01 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/c1a4fc59-617f-432f-9006-e50d45407087/disk.config"/>
Sep 30 21:20:01 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:20:01 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/c1a4fc59-617f-432f-9006-e50d45407087/console.log" append="off"/>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     <video>
Sep 30 21:20:01 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     </video>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:20:01 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:20:01 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:20:01 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:20:01 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:20:01 compute-0 nova_compute[192810]: </domain>
Sep 30 21:20:01 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:20:01 compute-0 nova_compute[192810]: 2025-09-30 21:20:01.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:01 compute-0 nova_compute[192810]: 2025-09-30 21:20:01.606 2 DEBUG nova.virt.libvirt.driver [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:20:01 compute-0 nova_compute[192810]: 2025-09-30 21:20:01.607 2 DEBUG nova.virt.libvirt.driver [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:20:01 compute-0 nova_compute[192810]: 2025-09-30 21:20:01.608 2 INFO nova.virt.libvirt.driver [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: c1a4fc59-617f-432f-9006-e50d45407087] Using config drive
Sep 30 21:20:02 compute-0 nova_compute[192810]: 2025-09-30 21:20:02.025 2 INFO nova.virt.libvirt.driver [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: c1a4fc59-617f-432f-9006-e50d45407087] Creating config drive at /var/lib/nova/instances/c1a4fc59-617f-432f-9006-e50d45407087/disk.config
Sep 30 21:20:02 compute-0 nova_compute[192810]: 2025-09-30 21:20:02.035 2 DEBUG oslo_concurrency.processutils [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c1a4fc59-617f-432f-9006-e50d45407087/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpe6zu8z5f execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:02 compute-0 nova_compute[192810]: 2025-09-30 21:20:02.180 2 DEBUG oslo_concurrency.processutils [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c1a4fc59-617f-432f-9006-e50d45407087/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpe6zu8z5f" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:02 compute-0 nova_compute[192810]: 2025-09-30 21:20:02.192 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:02 compute-0 systemd-machined[152794]: New machine qemu-8-instance-00000015.
Sep 30 21:20:02 compute-0 systemd[1]: Started Virtual Machine qemu-8-instance-00000015.
Sep 30 21:20:03 compute-0 nova_compute[192810]: 2025-09-30 21:20:03.274 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267203.2739897, c1a4fc59-617f-432f-9006-e50d45407087 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:20:03 compute-0 nova_compute[192810]: 2025-09-30 21:20:03.276 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c1a4fc59-617f-432f-9006-e50d45407087] VM Resumed (Lifecycle Event)
Sep 30 21:20:03 compute-0 nova_compute[192810]: 2025-09-30 21:20:03.280 2 DEBUG nova.compute.manager [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: c1a4fc59-617f-432f-9006-e50d45407087] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:20:03 compute-0 nova_compute[192810]: 2025-09-30 21:20:03.281 2 DEBUG nova.virt.libvirt.driver [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: c1a4fc59-617f-432f-9006-e50d45407087] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:20:03 compute-0 nova_compute[192810]: 2025-09-30 21:20:03.284 2 INFO nova.virt.libvirt.driver [-] [instance: c1a4fc59-617f-432f-9006-e50d45407087] Instance spawned successfully.
Sep 30 21:20:03 compute-0 nova_compute[192810]: 2025-09-30 21:20:03.285 2 DEBUG nova.virt.libvirt.driver [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: c1a4fc59-617f-432f-9006-e50d45407087] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:20:03 compute-0 nova_compute[192810]: 2025-09-30 21:20:03.302 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c1a4fc59-617f-432f-9006-e50d45407087] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:20:03 compute-0 nova_compute[192810]: 2025-09-30 21:20:03.306 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c1a4fc59-617f-432f-9006-e50d45407087] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:20:03 compute-0 nova_compute[192810]: 2025-09-30 21:20:03.318 2 DEBUG nova.virt.libvirt.driver [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: c1a4fc59-617f-432f-9006-e50d45407087] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:20:03 compute-0 nova_compute[192810]: 2025-09-30 21:20:03.318 2 DEBUG nova.virt.libvirt.driver [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: c1a4fc59-617f-432f-9006-e50d45407087] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:20:03 compute-0 nova_compute[192810]: 2025-09-30 21:20:03.319 2 DEBUG nova.virt.libvirt.driver [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: c1a4fc59-617f-432f-9006-e50d45407087] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:20:03 compute-0 nova_compute[192810]: 2025-09-30 21:20:03.319 2 DEBUG nova.virt.libvirt.driver [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: c1a4fc59-617f-432f-9006-e50d45407087] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:20:03 compute-0 nova_compute[192810]: 2025-09-30 21:20:03.320 2 DEBUG nova.virt.libvirt.driver [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: c1a4fc59-617f-432f-9006-e50d45407087] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:20:03 compute-0 nova_compute[192810]: 2025-09-30 21:20:03.321 2 DEBUG nova.virt.libvirt.driver [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: c1a4fc59-617f-432f-9006-e50d45407087] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:20:03 compute-0 nova_compute[192810]: 2025-09-30 21:20:03.327 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c1a4fc59-617f-432f-9006-e50d45407087] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:20:03 compute-0 nova_compute[192810]: 2025-09-30 21:20:03.327 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267203.275084, c1a4fc59-617f-432f-9006-e50d45407087 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:20:03 compute-0 nova_compute[192810]: 2025-09-30 21:20:03.328 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c1a4fc59-617f-432f-9006-e50d45407087] VM Started (Lifecycle Event)
Sep 30 21:20:03 compute-0 nova_compute[192810]: 2025-09-30 21:20:03.370 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c1a4fc59-617f-432f-9006-e50d45407087] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:20:03 compute-0 nova_compute[192810]: 2025-09-30 21:20:03.374 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c1a4fc59-617f-432f-9006-e50d45407087] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:20:03 compute-0 nova_compute[192810]: 2025-09-30 21:20:03.399 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c1a4fc59-617f-432f-9006-e50d45407087] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:20:03 compute-0 nova_compute[192810]: 2025-09-30 21:20:03.451 2 INFO nova.compute.manager [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: c1a4fc59-617f-432f-9006-e50d45407087] Took 3.03 seconds to spawn the instance on the hypervisor.
Sep 30 21:20:03 compute-0 nova_compute[192810]: 2025-09-30 21:20:03.452 2 DEBUG nova.compute.manager [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: c1a4fc59-617f-432f-9006-e50d45407087] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:20:03 compute-0 nova_compute[192810]: 2025-09-30 21:20:03.573 2 INFO nova.compute.manager [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: c1a4fc59-617f-432f-9006-e50d45407087] Took 4.03 seconds to build instance.
Sep 30 21:20:03 compute-0 nova_compute[192810]: 2025-09-30 21:20:03.609 2 DEBUG oslo_concurrency.lockutils [None req-b349c04c-4af8-4f67-a69c-60ee118a9a66 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Lock "c1a4fc59-617f-432f-9006-e50d45407087" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.209s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:06 compute-0 nova_compute[192810]: 2025-09-30 21:20:06.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:07 compute-0 nova_compute[192810]: 2025-09-30 21:20:07.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:09 compute-0 nova_compute[192810]: 2025-09-30 21:20:09.276 2 DEBUG oslo_concurrency.lockutils [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Acquiring lock "6a3099db-5207-4a33-9efb-18c956930b55" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:20:09 compute-0 nova_compute[192810]: 2025-09-30 21:20:09.277 2 DEBUG oslo_concurrency.lockutils [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Lock "6a3099db-5207-4a33-9efb-18c956930b55" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:20:09 compute-0 nova_compute[192810]: 2025-09-30 21:20:09.296 2 DEBUG nova.compute.manager [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:20:09 compute-0 podman[223682]: 2025-09-30 21:20:09.325965063 +0000 UTC m=+0.064634049 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, managed_by=edpm_ansible, io.buildah.version=1.33.7, name=ubi9-minimal, build-date=2025-08-20T13:12:41, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-type=git, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc.)
Sep 30 21:20:09 compute-0 podman[223681]: 2025-09-30 21:20:09.326431144 +0000 UTC m=+0.065348706 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:20:09 compute-0 nova_compute[192810]: 2025-09-30 21:20:09.507 2 DEBUG oslo_concurrency.lockutils [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:20:09 compute-0 nova_compute[192810]: 2025-09-30 21:20:09.508 2 DEBUG oslo_concurrency.lockutils [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:20:09 compute-0 nova_compute[192810]: 2025-09-30 21:20:09.517 2 DEBUG nova.virt.hardware [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:20:09 compute-0 nova_compute[192810]: 2025-09-30 21:20:09.517 2 INFO nova.compute.claims [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:20:09 compute-0 nova_compute[192810]: 2025-09-30 21:20:09.707 2 DEBUG nova.compute.provider_tree [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:20:09 compute-0 nova_compute[192810]: 2025-09-30 21:20:09.723 2 DEBUG nova.scheduler.client.report [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:20:09 compute-0 nova_compute[192810]: 2025-09-30 21:20:09.755 2 DEBUG oslo_concurrency.lockutils [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.246s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:09 compute-0 nova_compute[192810]: 2025-09-30 21:20:09.755 2 DEBUG nova.compute.manager [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:20:09 compute-0 nova_compute[192810]: 2025-09-30 21:20:09.827 2 DEBUG nova.compute.manager [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:20:09 compute-0 nova_compute[192810]: 2025-09-30 21:20:09.828 2 DEBUG nova.network.neutron [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:20:09 compute-0 nova_compute[192810]: 2025-09-30 21:20:09.844 2 INFO nova.virt.libvirt.driver [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:20:09 compute-0 nova_compute[192810]: 2025-09-30 21:20:09.863 2 DEBUG nova.compute.manager [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:20:09 compute-0 nova_compute[192810]: 2025-09-30 21:20:09.995 2 DEBUG nova.compute.manager [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:20:09 compute-0 nova_compute[192810]: 2025-09-30 21:20:09.996 2 DEBUG nova.virt.libvirt.driver [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:20:09 compute-0 nova_compute[192810]: 2025-09-30 21:20:09.997 2 INFO nova.virt.libvirt.driver [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] Creating image(s)
Sep 30 21:20:09 compute-0 nova_compute[192810]: 2025-09-30 21:20:09.997 2 DEBUG oslo_concurrency.lockutils [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Acquiring lock "/var/lib/nova/instances/6a3099db-5207-4a33-9efb-18c956930b55/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:20:09 compute-0 nova_compute[192810]: 2025-09-30 21:20:09.998 2 DEBUG oslo_concurrency.lockutils [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Lock "/var/lib/nova/instances/6a3099db-5207-4a33-9efb-18c956930b55/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:20:09 compute-0 nova_compute[192810]: 2025-09-30 21:20:09.998 2 DEBUG oslo_concurrency.lockutils [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Lock "/var/lib/nova/instances/6a3099db-5207-4a33-9efb-18c956930b55/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:10 compute-0 nova_compute[192810]: 2025-09-30 21:20:10.011 2 DEBUG oslo_concurrency.processutils [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:10 compute-0 nova_compute[192810]: 2025-09-30 21:20:10.066 2 DEBUG oslo_concurrency.processutils [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:10 compute-0 nova_compute[192810]: 2025-09-30 21:20:10.067 2 DEBUG oslo_concurrency.lockutils [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:20:10 compute-0 nova_compute[192810]: 2025-09-30 21:20:10.068 2 DEBUG oslo_concurrency.lockutils [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:20:10 compute-0 nova_compute[192810]: 2025-09-30 21:20:10.078 2 DEBUG oslo_concurrency.processutils [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:10 compute-0 nova_compute[192810]: 2025-09-30 21:20:10.130 2 DEBUG oslo_concurrency.processutils [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:10 compute-0 nova_compute[192810]: 2025-09-30 21:20:10.131 2 DEBUG oslo_concurrency.processutils [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/6a3099db-5207-4a33-9efb-18c956930b55/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:10 compute-0 nova_compute[192810]: 2025-09-30 21:20:10.163 2 DEBUG nova.network.neutron [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Sep 30 21:20:10 compute-0 nova_compute[192810]: 2025-09-30 21:20:10.164 2 DEBUG nova.compute.manager [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:20:10 compute-0 nova_compute[192810]: 2025-09-30 21:20:10.169 2 DEBUG oslo_concurrency.processutils [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/6a3099db-5207-4a33-9efb-18c956930b55/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:10 compute-0 nova_compute[192810]: 2025-09-30 21:20:10.170 2 DEBUG oslo_concurrency.lockutils [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:10 compute-0 nova_compute[192810]: 2025-09-30 21:20:10.171 2 DEBUG oslo_concurrency.processutils [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:10 compute-0 nova_compute[192810]: 2025-09-30 21:20:10.254 2 DEBUG oslo_concurrency.processutils [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:10 compute-0 nova_compute[192810]: 2025-09-30 21:20:10.256 2 DEBUG nova.virt.disk.api [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Checking if we can resize image /var/lib/nova/instances/6a3099db-5207-4a33-9efb-18c956930b55/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:20:10 compute-0 nova_compute[192810]: 2025-09-30 21:20:10.257 2 DEBUG oslo_concurrency.processutils [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6a3099db-5207-4a33-9efb-18c956930b55/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:10 compute-0 nova_compute[192810]: 2025-09-30 21:20:10.316 2 DEBUG oslo_concurrency.processutils [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6a3099db-5207-4a33-9efb-18c956930b55/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:10 compute-0 nova_compute[192810]: 2025-09-30 21:20:10.317 2 DEBUG nova.virt.disk.api [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Cannot resize image /var/lib/nova/instances/6a3099db-5207-4a33-9efb-18c956930b55/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:20:10 compute-0 nova_compute[192810]: 2025-09-30 21:20:10.317 2 DEBUG nova.objects.instance [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Lazy-loading 'migration_context' on Instance uuid 6a3099db-5207-4a33-9efb-18c956930b55 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:20:10 compute-0 nova_compute[192810]: 2025-09-30 21:20:10.343 2 DEBUG nova.virt.libvirt.driver [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:20:10 compute-0 nova_compute[192810]: 2025-09-30 21:20:10.344 2 DEBUG nova.virt.libvirt.driver [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] Ensure instance console log exists: /var/lib/nova/instances/6a3099db-5207-4a33-9efb-18c956930b55/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:20:10 compute-0 nova_compute[192810]: 2025-09-30 21:20:10.344 2 DEBUG oslo_concurrency.lockutils [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:20:10 compute-0 nova_compute[192810]: 2025-09-30 21:20:10.344 2 DEBUG oslo_concurrency.lockutils [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:20:10 compute-0 nova_compute[192810]: 2025-09-30 21:20:10.345 2 DEBUG oslo_concurrency.lockutils [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:10 compute-0 nova_compute[192810]: 2025-09-30 21:20:10.346 2 DEBUG nova.virt.libvirt.driver [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:20:10 compute-0 nova_compute[192810]: 2025-09-30 21:20:10.350 2 WARNING nova.virt.libvirt.driver [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:20:10 compute-0 nova_compute[192810]: 2025-09-30 21:20:10.354 2 DEBUG nova.virt.libvirt.host [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:20:10 compute-0 nova_compute[192810]: 2025-09-30 21:20:10.354 2 DEBUG nova.virt.libvirt.host [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:20:10 compute-0 nova_compute[192810]: 2025-09-30 21:20:10.358 2 DEBUG nova.virt.libvirt.host [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:20:10 compute-0 nova_compute[192810]: 2025-09-30 21:20:10.358 2 DEBUG nova.virt.libvirt.host [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:20:10 compute-0 nova_compute[192810]: 2025-09-30 21:20:10.359 2 DEBUG nova.virt.libvirt.driver [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:20:10 compute-0 nova_compute[192810]: 2025-09-30 21:20:10.360 2 DEBUG nova.virt.hardware [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:20:10 compute-0 nova_compute[192810]: 2025-09-30 21:20:10.360 2 DEBUG nova.virt.hardware [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:20:10 compute-0 nova_compute[192810]: 2025-09-30 21:20:10.360 2 DEBUG nova.virt.hardware [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:20:10 compute-0 nova_compute[192810]: 2025-09-30 21:20:10.361 2 DEBUG nova.virt.hardware [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:20:10 compute-0 nova_compute[192810]: 2025-09-30 21:20:10.361 2 DEBUG nova.virt.hardware [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:20:10 compute-0 nova_compute[192810]: 2025-09-30 21:20:10.361 2 DEBUG nova.virt.hardware [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:20:10 compute-0 nova_compute[192810]: 2025-09-30 21:20:10.361 2 DEBUG nova.virt.hardware [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:20:10 compute-0 nova_compute[192810]: 2025-09-30 21:20:10.362 2 DEBUG nova.virt.hardware [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:20:10 compute-0 nova_compute[192810]: 2025-09-30 21:20:10.362 2 DEBUG nova.virt.hardware [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:20:10 compute-0 nova_compute[192810]: 2025-09-30 21:20:10.362 2 DEBUG nova.virt.hardware [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:20:10 compute-0 nova_compute[192810]: 2025-09-30 21:20:10.362 2 DEBUG nova.virt.hardware [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:20:10 compute-0 nova_compute[192810]: 2025-09-30 21:20:10.367 2 DEBUG nova.objects.instance [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Lazy-loading 'pci_devices' on Instance uuid 6a3099db-5207-4a33-9efb-18c956930b55 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:20:10 compute-0 nova_compute[192810]: 2025-09-30 21:20:10.380 2 DEBUG nova.virt.libvirt.driver [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:20:10 compute-0 nova_compute[192810]:   <uuid>6a3099db-5207-4a33-9efb-18c956930b55</uuid>
Sep 30 21:20:10 compute-0 nova_compute[192810]:   <name>instance-00000018</name>
Sep 30 21:20:10 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:20:10 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:20:10 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:20:10 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:20:10 compute-0 nova_compute[192810]:       <nova:name>tempest-LiveMigrationNegativeTest-server-324278117</nova:name>
Sep 30 21:20:10 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:20:10</nova:creationTime>
Sep 30 21:20:10 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:20:10 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:20:10 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:20:10 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:20:10 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:20:10 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:20:10 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:20:10 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:20:10 compute-0 nova_compute[192810]:         <nova:user uuid="5343f32daec8477c9c7ddb1131d56512">tempest-LiveMigrationNegativeTest-2017460260-project-member</nova:user>
Sep 30 21:20:10 compute-0 nova_compute[192810]:         <nova:project uuid="0dde308687c844d78abe51fa41f330be">tempest-LiveMigrationNegativeTest-2017460260</nova:project>
Sep 30 21:20:10 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:20:10 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:20:10 compute-0 nova_compute[192810]:       <nova:ports/>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:20:10 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:20:10 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:20:10 compute-0 nova_compute[192810]:     <system>
Sep 30 21:20:10 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:20:10 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:20:10 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:20:10 compute-0 nova_compute[192810]:       <entry name="serial">6a3099db-5207-4a33-9efb-18c956930b55</entry>
Sep 30 21:20:10 compute-0 nova_compute[192810]:       <entry name="uuid">6a3099db-5207-4a33-9efb-18c956930b55</entry>
Sep 30 21:20:10 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     </system>
Sep 30 21:20:10 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:20:10 compute-0 nova_compute[192810]:   <os>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:20:10 compute-0 nova_compute[192810]:   </os>
Sep 30 21:20:10 compute-0 nova_compute[192810]:   <features>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:20:10 compute-0 nova_compute[192810]:   </features>
Sep 30 21:20:10 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:20:10 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:20:10 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:20:10 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:20:10 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:20:10 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:20:10 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:20:10 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:20:10 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/6a3099db-5207-4a33-9efb-18c956930b55/disk"/>
Sep 30 21:20:10 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:20:10 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:20:10 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/6a3099db-5207-4a33-9efb-18c956930b55/disk.config"/>
Sep 30 21:20:10 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:20:10 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/6a3099db-5207-4a33-9efb-18c956930b55/console.log" append="off"/>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     <video>
Sep 30 21:20:10 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     </video>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:20:10 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:20:10 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:20:10 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:20:10 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:20:10 compute-0 nova_compute[192810]: </domain>
Sep 30 21:20:10 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:20:10 compute-0 nova_compute[192810]: 2025-09-30 21:20:10.443 2 DEBUG nova.virt.libvirt.driver [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:20:10 compute-0 nova_compute[192810]: 2025-09-30 21:20:10.444 2 DEBUG nova.virt.libvirt.driver [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:20:10 compute-0 nova_compute[192810]: 2025-09-30 21:20:10.444 2 INFO nova.virt.libvirt.driver [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] Using config drive
Sep 30 21:20:10 compute-0 nova_compute[192810]: 2025-09-30 21:20:10.802 2 INFO nova.virt.libvirt.driver [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] Creating config drive at /var/lib/nova/instances/6a3099db-5207-4a33-9efb-18c956930b55/disk.config
Sep 30 21:20:10 compute-0 nova_compute[192810]: 2025-09-30 21:20:10.807 2 DEBUG oslo_concurrency.processutils [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6a3099db-5207-4a33-9efb-18c956930b55/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpb4n33tf1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:10 compute-0 nova_compute[192810]: 2025-09-30 21:20:10.932 2 DEBUG oslo_concurrency.processutils [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6a3099db-5207-4a33-9efb-18c956930b55/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpb4n33tf1" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:11 compute-0 systemd-machined[152794]: New machine qemu-9-instance-00000018.
Sep 30 21:20:11 compute-0 systemd[1]: Started Virtual Machine qemu-9-instance-00000018.
Sep 30 21:20:11 compute-0 nova_compute[192810]: 2025-09-30 21:20:11.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:11 compute-0 nova_compute[192810]: 2025-09-30 21:20:11.851 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267211.8472657, 6a3099db-5207-4a33-9efb-18c956930b55 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:20:11 compute-0 nova_compute[192810]: 2025-09-30 21:20:11.851 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] VM Resumed (Lifecycle Event)
Sep 30 21:20:11 compute-0 nova_compute[192810]: 2025-09-30 21:20:11.853 2 DEBUG nova.compute.manager [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:20:11 compute-0 nova_compute[192810]: 2025-09-30 21:20:11.853 2 DEBUG nova.virt.libvirt.driver [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:20:11 compute-0 nova_compute[192810]: 2025-09-30 21:20:11.857 2 INFO nova.virt.libvirt.driver [-] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] Instance spawned successfully.
Sep 30 21:20:11 compute-0 nova_compute[192810]: 2025-09-30 21:20:11.858 2 DEBUG nova.virt.libvirt.driver [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:20:11 compute-0 nova_compute[192810]: 2025-09-30 21:20:11.883 2 DEBUG nova.virt.libvirt.driver [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:20:11 compute-0 nova_compute[192810]: 2025-09-30 21:20:11.884 2 DEBUG nova.virt.libvirt.driver [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:20:11 compute-0 nova_compute[192810]: 2025-09-30 21:20:11.884 2 DEBUG nova.virt.libvirt.driver [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:20:11 compute-0 nova_compute[192810]: 2025-09-30 21:20:11.885 2 DEBUG nova.virt.libvirt.driver [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:20:11 compute-0 nova_compute[192810]: 2025-09-30 21:20:11.885 2 DEBUG nova.virt.libvirt.driver [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:20:11 compute-0 nova_compute[192810]: 2025-09-30 21:20:11.885 2 DEBUG nova.virt.libvirt.driver [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:20:11 compute-0 nova_compute[192810]: 2025-09-30 21:20:11.935 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:20:11 compute-0 nova_compute[192810]: 2025-09-30 21:20:11.938 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:20:12 compute-0 nova_compute[192810]: 2025-09-30 21:20:12.004 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:20:12 compute-0 nova_compute[192810]: 2025-09-30 21:20:12.004 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267211.84739, 6a3099db-5207-4a33-9efb-18c956930b55 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:20:12 compute-0 nova_compute[192810]: 2025-09-30 21:20:12.004 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] VM Started (Lifecycle Event)
Sep 30 21:20:12 compute-0 nova_compute[192810]: 2025-09-30 21:20:12.035 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:20:12 compute-0 nova_compute[192810]: 2025-09-30 21:20:12.039 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:20:12 compute-0 nova_compute[192810]: 2025-09-30 21:20:12.074 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:20:12 compute-0 nova_compute[192810]: 2025-09-30 21:20:12.107 2 INFO nova.compute.manager [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] Took 2.11 seconds to spawn the instance on the hypervisor.
Sep 30 21:20:12 compute-0 nova_compute[192810]: 2025-09-30 21:20:12.108 2 DEBUG nova.compute.manager [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:20:12 compute-0 nova_compute[192810]: 2025-09-30 21:20:12.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:12 compute-0 nova_compute[192810]: 2025-09-30 21:20:12.231 2 INFO nova.compute.manager [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] Took 2.82 seconds to build instance.
Sep 30 21:20:12 compute-0 nova_compute[192810]: 2025-09-30 21:20:12.253 2 DEBUG oslo_concurrency.lockutils [None req-ce4691ea-cde4-4cf3-8464-df6144114cdb 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Lock "6a3099db-5207-4a33-9efb-18c956930b55" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.976s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:13 compute-0 nova_compute[192810]: 2025-09-30 21:20:13.858 2 DEBUG nova.objects.instance [None req-bfbb0c4e-bac1-4055-b5d6-30d309086b1e f2a9fc996c424ef1ac977cfd3cf22b0b 1737dbb0bbdb46a386e172f785d2a0db - - default default] Lazy-loading 'pci_devices' on Instance uuid 6a3099db-5207-4a33-9efb-18c956930b55 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:20:13 compute-0 nova_compute[192810]: 2025-09-30 21:20:13.877 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267213.8776689, 6a3099db-5207-4a33-9efb-18c956930b55 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:20:13 compute-0 nova_compute[192810]: 2025-09-30 21:20:13.878 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] VM Paused (Lifecycle Event)
Sep 30 21:20:13 compute-0 nova_compute[192810]: 2025-09-30 21:20:13.898 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:20:13 compute-0 nova_compute[192810]: 2025-09-30 21:20:13.903 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:20:13 compute-0 nova_compute[192810]: 2025-09-30 21:20:13.936 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] During sync_power_state the instance has a pending task (suspending). Skip.
Sep 30 21:20:14 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000018.scope: Deactivated successfully.
Sep 30 21:20:14 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000018.scope: Consumed 2.886s CPU time.
Sep 30 21:20:14 compute-0 systemd-machined[152794]: Machine qemu-9-instance-00000018 terminated.
Sep 30 21:20:14 compute-0 nova_compute[192810]: 2025-09-30 21:20:14.576 2 DEBUG nova.compute.manager [None req-bfbb0c4e-bac1-4055-b5d6-30d309086b1e f2a9fc996c424ef1ac977cfd3cf22b0b 1737dbb0bbdb46a386e172f785d2a0db - - default default] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:20:16 compute-0 podman[223788]: 2025-09-30 21:20:16.354619513 +0000 UTC m=+0.081434337 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250923)
Sep 30 21:20:16 compute-0 nova_compute[192810]: 2025-09-30 21:20:16.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:17 compute-0 nova_compute[192810]: 2025-09-30 21:20:17.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:17 compute-0 nova_compute[192810]: 2025-09-30 21:20:17.687 2 DEBUG oslo_concurrency.lockutils [None req-169e44f0-9e2f-4cf9-ae5c-b977d83af7b1 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Acquiring lock "6a3099db-5207-4a33-9efb-18c956930b55" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:20:17 compute-0 nova_compute[192810]: 2025-09-30 21:20:17.688 2 DEBUG oslo_concurrency.lockutils [None req-169e44f0-9e2f-4cf9-ae5c-b977d83af7b1 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Lock "6a3099db-5207-4a33-9efb-18c956930b55" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:20:17 compute-0 nova_compute[192810]: 2025-09-30 21:20:17.688 2 DEBUG oslo_concurrency.lockutils [None req-169e44f0-9e2f-4cf9-ae5c-b977d83af7b1 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Acquiring lock "6a3099db-5207-4a33-9efb-18c956930b55-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:20:17 compute-0 nova_compute[192810]: 2025-09-30 21:20:17.689 2 DEBUG oslo_concurrency.lockutils [None req-169e44f0-9e2f-4cf9-ae5c-b977d83af7b1 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Lock "6a3099db-5207-4a33-9efb-18c956930b55-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:20:17 compute-0 nova_compute[192810]: 2025-09-30 21:20:17.689 2 DEBUG oslo_concurrency.lockutils [None req-169e44f0-9e2f-4cf9-ae5c-b977d83af7b1 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Lock "6a3099db-5207-4a33-9efb-18c956930b55-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:17 compute-0 nova_compute[192810]: 2025-09-30 21:20:17.706 2 INFO nova.compute.manager [None req-169e44f0-9e2f-4cf9-ae5c-b977d83af7b1 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] Terminating instance
Sep 30 21:20:17 compute-0 nova_compute[192810]: 2025-09-30 21:20:17.724 2 DEBUG oslo_concurrency.lockutils [None req-169e44f0-9e2f-4cf9-ae5c-b977d83af7b1 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Acquiring lock "refresh_cache-6a3099db-5207-4a33-9efb-18c956930b55" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:20:17 compute-0 nova_compute[192810]: 2025-09-30 21:20:17.724 2 DEBUG oslo_concurrency.lockutils [None req-169e44f0-9e2f-4cf9-ae5c-b977d83af7b1 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Acquired lock "refresh_cache-6a3099db-5207-4a33-9efb-18c956930b55" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:20:17 compute-0 nova_compute[192810]: 2025-09-30 21:20:17.725 2 DEBUG nova.network.neutron [None req-169e44f0-9e2f-4cf9-ae5c-b977d83af7b1 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:20:17 compute-0 nova_compute[192810]: 2025-09-30 21:20:17.968 2 DEBUG nova.network.neutron [None req-169e44f0-9e2f-4cf9-ae5c-b977d83af7b1 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:20:19 compute-0 nova_compute[192810]: 2025-09-30 21:20:19.136 2 DEBUG nova.network.neutron [None req-169e44f0-9e2f-4cf9-ae5c-b977d83af7b1 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:20:19 compute-0 nova_compute[192810]: 2025-09-30 21:20:19.171 2 DEBUG oslo_concurrency.lockutils [None req-169e44f0-9e2f-4cf9-ae5c-b977d83af7b1 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Releasing lock "refresh_cache-6a3099db-5207-4a33-9efb-18c956930b55" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:20:19 compute-0 nova_compute[192810]: 2025-09-30 21:20:19.172 2 DEBUG nova.compute.manager [None req-169e44f0-9e2f-4cf9-ae5c-b977d83af7b1 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:20:19 compute-0 nova_compute[192810]: 2025-09-30 21:20:19.179 2 INFO nova.virt.libvirt.driver [-] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] Instance destroyed successfully.
Sep 30 21:20:19 compute-0 nova_compute[192810]: 2025-09-30 21:20:19.179 2 DEBUG nova.objects.instance [None req-169e44f0-9e2f-4cf9-ae5c-b977d83af7b1 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Lazy-loading 'resources' on Instance uuid 6a3099db-5207-4a33-9efb-18c956930b55 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:20:19 compute-0 nova_compute[192810]: 2025-09-30 21:20:19.198 2 INFO nova.virt.libvirt.driver [None req-169e44f0-9e2f-4cf9-ae5c-b977d83af7b1 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] Deleting instance files /var/lib/nova/instances/6a3099db-5207-4a33-9efb-18c956930b55_del
Sep 30 21:20:19 compute-0 nova_compute[192810]: 2025-09-30 21:20:19.198 2 INFO nova.virt.libvirt.driver [None req-169e44f0-9e2f-4cf9-ae5c-b977d83af7b1 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] Deletion of /var/lib/nova/instances/6a3099db-5207-4a33-9efb-18c956930b55_del complete
Sep 30 21:20:19 compute-0 nova_compute[192810]: 2025-09-30 21:20:19.309 2 INFO nova.compute.manager [None req-169e44f0-9e2f-4cf9-ae5c-b977d83af7b1 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] Took 0.14 seconds to destroy the instance on the hypervisor.
Sep 30 21:20:19 compute-0 nova_compute[192810]: 2025-09-30 21:20:19.309 2 DEBUG oslo.service.loopingcall [None req-169e44f0-9e2f-4cf9-ae5c-b977d83af7b1 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:20:19 compute-0 nova_compute[192810]: 2025-09-30 21:20:19.309 2 DEBUG nova.compute.manager [-] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:20:19 compute-0 nova_compute[192810]: 2025-09-30 21:20:19.309 2 DEBUG nova.network.neutron [-] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:20:19 compute-0 nova_compute[192810]: 2025-09-30 21:20:19.642 2 DEBUG nova.network.neutron [-] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:20:19 compute-0 nova_compute[192810]: 2025-09-30 21:20:19.662 2 DEBUG nova.network.neutron [-] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:20:19 compute-0 nova_compute[192810]: 2025-09-30 21:20:19.678 2 INFO nova.compute.manager [-] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] Took 0.37 seconds to deallocate network for instance.
Sep 30 21:20:19 compute-0 nova_compute[192810]: 2025-09-30 21:20:19.778 2 DEBUG oslo_concurrency.lockutils [None req-169e44f0-9e2f-4cf9-ae5c-b977d83af7b1 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:20:19 compute-0 nova_compute[192810]: 2025-09-30 21:20:19.779 2 DEBUG oslo_concurrency.lockutils [None req-169e44f0-9e2f-4cf9-ae5c-b977d83af7b1 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:20:19 compute-0 nova_compute[192810]: 2025-09-30 21:20:19.886 2 DEBUG nova.compute.provider_tree [None req-169e44f0-9e2f-4cf9-ae5c-b977d83af7b1 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:20:19 compute-0 nova_compute[192810]: 2025-09-30 21:20:19.912 2 DEBUG nova.scheduler.client.report [None req-169e44f0-9e2f-4cf9-ae5c-b977d83af7b1 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:20:19 compute-0 nova_compute[192810]: 2025-09-30 21:20:19.956 2 DEBUG oslo_concurrency.lockutils [None req-169e44f0-9e2f-4cf9-ae5c-b977d83af7b1 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:20 compute-0 nova_compute[192810]: 2025-09-30 21:20:20.010 2 INFO nova.scheduler.client.report [None req-169e44f0-9e2f-4cf9-ae5c-b977d83af7b1 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Deleted allocations for instance 6a3099db-5207-4a33-9efb-18c956930b55
Sep 30 21:20:20 compute-0 nova_compute[192810]: 2025-09-30 21:20:20.128 2 DEBUG oslo_concurrency.lockutils [None req-169e44f0-9e2f-4cf9-ae5c-b977d83af7b1 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Lock "6a3099db-5207-4a33-9efb-18c956930b55" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.441s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:21 compute-0 podman[223809]: 2025-09-30 21:20:21.32690832 +0000 UTC m=+0.063335877 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:20:21 compute-0 podman[223808]: 2025-09-30 21:20:21.359506411 +0000 UTC m=+0.091291942 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:20:21 compute-0 nova_compute[192810]: 2025-09-30 21:20:21.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:21 compute-0 nova_compute[192810]: 2025-09-30 21:20:21.665 2 DEBUG oslo_concurrency.lockutils [None req-8a3d686d-9cdc-499b-944f-b3ff49014873 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Acquiring lock "c1a4fc59-617f-432f-9006-e50d45407087" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:20:21 compute-0 nova_compute[192810]: 2025-09-30 21:20:21.665 2 DEBUG oslo_concurrency.lockutils [None req-8a3d686d-9cdc-499b-944f-b3ff49014873 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Lock "c1a4fc59-617f-432f-9006-e50d45407087" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:20:21 compute-0 nova_compute[192810]: 2025-09-30 21:20:21.665 2 DEBUG oslo_concurrency.lockutils [None req-8a3d686d-9cdc-499b-944f-b3ff49014873 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Acquiring lock "c1a4fc59-617f-432f-9006-e50d45407087-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:20:21 compute-0 nova_compute[192810]: 2025-09-30 21:20:21.666 2 DEBUG oslo_concurrency.lockutils [None req-8a3d686d-9cdc-499b-944f-b3ff49014873 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Lock "c1a4fc59-617f-432f-9006-e50d45407087-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:20:21 compute-0 nova_compute[192810]: 2025-09-30 21:20:21.666 2 DEBUG oslo_concurrency.lockutils [None req-8a3d686d-9cdc-499b-944f-b3ff49014873 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Lock "c1a4fc59-617f-432f-9006-e50d45407087-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:21 compute-0 nova_compute[192810]: 2025-09-30 21:20:21.684 2 INFO nova.compute.manager [None req-8a3d686d-9cdc-499b-944f-b3ff49014873 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: c1a4fc59-617f-432f-9006-e50d45407087] Terminating instance
Sep 30 21:20:21 compute-0 nova_compute[192810]: 2025-09-30 21:20:21.702 2 DEBUG oslo_concurrency.lockutils [None req-8a3d686d-9cdc-499b-944f-b3ff49014873 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Acquiring lock "refresh_cache-c1a4fc59-617f-432f-9006-e50d45407087" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:20:21 compute-0 nova_compute[192810]: 2025-09-30 21:20:21.702 2 DEBUG oslo_concurrency.lockutils [None req-8a3d686d-9cdc-499b-944f-b3ff49014873 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Acquired lock "refresh_cache-c1a4fc59-617f-432f-9006-e50d45407087" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:20:21 compute-0 nova_compute[192810]: 2025-09-30 21:20:21.702 2 DEBUG nova.network.neutron [None req-8a3d686d-9cdc-499b-944f-b3ff49014873 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: c1a4fc59-617f-432f-9006-e50d45407087] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:20:22 compute-0 nova_compute[192810]: 2025-09-30 21:20:22.108 2 DEBUG nova.network.neutron [None req-8a3d686d-9cdc-499b-944f-b3ff49014873 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: c1a4fc59-617f-432f-9006-e50d45407087] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:20:22 compute-0 nova_compute[192810]: 2025-09-30 21:20:22.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:22 compute-0 nova_compute[192810]: 2025-09-30 21:20:22.537 2 DEBUG nova.network.neutron [None req-8a3d686d-9cdc-499b-944f-b3ff49014873 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: c1a4fc59-617f-432f-9006-e50d45407087] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:20:22 compute-0 nova_compute[192810]: 2025-09-30 21:20:22.564 2 DEBUG oslo_concurrency.lockutils [None req-8a3d686d-9cdc-499b-944f-b3ff49014873 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Releasing lock "refresh_cache-c1a4fc59-617f-432f-9006-e50d45407087" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:20:22 compute-0 nova_compute[192810]: 2025-09-30 21:20:22.564 2 DEBUG nova.compute.manager [None req-8a3d686d-9cdc-499b-944f-b3ff49014873 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: c1a4fc59-617f-432f-9006-e50d45407087] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:20:22 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000015.scope: Deactivated successfully.
Sep 30 21:20:22 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000015.scope: Consumed 12.650s CPU time.
Sep 30 21:20:22 compute-0 systemd-machined[152794]: Machine qemu-8-instance-00000015 terminated.
Sep 30 21:20:22 compute-0 nova_compute[192810]: 2025-09-30 21:20:22.811 2 INFO nova.virt.libvirt.driver [-] [instance: c1a4fc59-617f-432f-9006-e50d45407087] Instance destroyed successfully.
Sep 30 21:20:22 compute-0 nova_compute[192810]: 2025-09-30 21:20:22.812 2 DEBUG nova.objects.instance [None req-8a3d686d-9cdc-499b-944f-b3ff49014873 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Lazy-loading 'resources' on Instance uuid c1a4fc59-617f-432f-9006-e50d45407087 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:20:22 compute-0 nova_compute[192810]: 2025-09-30 21:20:22.828 2 INFO nova.virt.libvirt.driver [None req-8a3d686d-9cdc-499b-944f-b3ff49014873 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: c1a4fc59-617f-432f-9006-e50d45407087] Deleting instance files /var/lib/nova/instances/c1a4fc59-617f-432f-9006-e50d45407087_del
Sep 30 21:20:22 compute-0 nova_compute[192810]: 2025-09-30 21:20:22.829 2 INFO nova.virt.libvirt.driver [None req-8a3d686d-9cdc-499b-944f-b3ff49014873 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: c1a4fc59-617f-432f-9006-e50d45407087] Deletion of /var/lib/nova/instances/c1a4fc59-617f-432f-9006-e50d45407087_del complete
Sep 30 21:20:22 compute-0 nova_compute[192810]: 2025-09-30 21:20:22.937 2 INFO nova.compute.manager [None req-8a3d686d-9cdc-499b-944f-b3ff49014873 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] [instance: c1a4fc59-617f-432f-9006-e50d45407087] Took 0.37 seconds to destroy the instance on the hypervisor.
Sep 30 21:20:22 compute-0 nova_compute[192810]: 2025-09-30 21:20:22.937 2 DEBUG oslo.service.loopingcall [None req-8a3d686d-9cdc-499b-944f-b3ff49014873 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:20:22 compute-0 nova_compute[192810]: 2025-09-30 21:20:22.938 2 DEBUG nova.compute.manager [-] [instance: c1a4fc59-617f-432f-9006-e50d45407087] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:20:22 compute-0 nova_compute[192810]: 2025-09-30 21:20:22.938 2 DEBUG nova.network.neutron [-] [instance: c1a4fc59-617f-432f-9006-e50d45407087] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:20:23 compute-0 nova_compute[192810]: 2025-09-30 21:20:23.725 2 DEBUG nova.network.neutron [-] [instance: c1a4fc59-617f-432f-9006-e50d45407087] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:20:23 compute-0 nova_compute[192810]: 2025-09-30 21:20:23.748 2 DEBUG nova.network.neutron [-] [instance: c1a4fc59-617f-432f-9006-e50d45407087] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:20:23 compute-0 nova_compute[192810]: 2025-09-30 21:20:23.764 2 INFO nova.compute.manager [-] [instance: c1a4fc59-617f-432f-9006-e50d45407087] Took 0.83 seconds to deallocate network for instance.
Sep 30 21:20:23 compute-0 nova_compute[192810]: 2025-09-30 21:20:23.856 2 DEBUG oslo_concurrency.lockutils [None req-8a3d686d-9cdc-499b-944f-b3ff49014873 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:20:23 compute-0 nova_compute[192810]: 2025-09-30 21:20:23.857 2 DEBUG oslo_concurrency.lockutils [None req-8a3d686d-9cdc-499b-944f-b3ff49014873 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:20:23 compute-0 nova_compute[192810]: 2025-09-30 21:20:23.925 2 DEBUG nova.compute.provider_tree [None req-8a3d686d-9cdc-499b-944f-b3ff49014873 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:20:23 compute-0 nova_compute[192810]: 2025-09-30 21:20:23.942 2 DEBUG nova.scheduler.client.report [None req-8a3d686d-9cdc-499b-944f-b3ff49014873 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:20:23 compute-0 nova_compute[192810]: 2025-09-30 21:20:23.975 2 DEBUG oslo_concurrency.lockutils [None req-8a3d686d-9cdc-499b-944f-b3ff49014873 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:24 compute-0 nova_compute[192810]: 2025-09-30 21:20:24.034 2 INFO nova.scheduler.client.report [None req-8a3d686d-9cdc-499b-944f-b3ff49014873 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Deleted allocations for instance c1a4fc59-617f-432f-9006-e50d45407087
Sep 30 21:20:24 compute-0 nova_compute[192810]: 2025-09-30 21:20:24.130 2 DEBUG oslo_concurrency.lockutils [None req-8a3d686d-9cdc-499b-944f-b3ff49014873 5343f32daec8477c9c7ddb1131d56512 0dde308687c844d78abe51fa41f330be - - default default] Lock "c1a4fc59-617f-432f-9006-e50d45407087" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.464s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:26 compute-0 nova_compute[192810]: 2025-09-30 21:20:26.274 2 DEBUG oslo_concurrency.lockutils [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Acquiring lock "383bdd8f-b596-427e-b6ba-f3251a14c6dd" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:20:26 compute-0 nova_compute[192810]: 2025-09-30 21:20:26.275 2 DEBUG oslo_concurrency.lockutils [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Lock "383bdd8f-b596-427e-b6ba-f3251a14c6dd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:20:26 compute-0 nova_compute[192810]: 2025-09-30 21:20:26.294 2 DEBUG nova.compute.manager [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:20:26 compute-0 podman[223861]: 2025-09-30 21:20:26.365082186 +0000 UTC m=+0.099080936 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 21:20:26 compute-0 nova_compute[192810]: 2025-09-30 21:20:26.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:26 compute-0 nova_compute[192810]: 2025-09-30 21:20:26.590 2 DEBUG oslo_concurrency.lockutils [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:20:26 compute-0 nova_compute[192810]: 2025-09-30 21:20:26.590 2 DEBUG oslo_concurrency.lockutils [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:20:26 compute-0 nova_compute[192810]: 2025-09-30 21:20:26.596 2 DEBUG nova.virt.hardware [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:20:26 compute-0 nova_compute[192810]: 2025-09-30 21:20:26.596 2 INFO nova.compute.claims [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:20:26 compute-0 nova_compute[192810]: 2025-09-30 21:20:26.732 2 DEBUG nova.compute.provider_tree [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:20:26 compute-0 nova_compute[192810]: 2025-09-30 21:20:26.746 2 DEBUG nova.scheduler.client.report [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:20:26 compute-0 nova_compute[192810]: 2025-09-30 21:20:26.763 2 DEBUG oslo_concurrency.lockutils [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:26 compute-0 nova_compute[192810]: 2025-09-30 21:20:26.764 2 DEBUG nova.compute.manager [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:20:26 compute-0 nova_compute[192810]: 2025-09-30 21:20:26.843 2 DEBUG nova.compute.manager [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:20:26 compute-0 nova_compute[192810]: 2025-09-30 21:20:26.844 2 DEBUG nova.network.neutron [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:20:26 compute-0 nova_compute[192810]: 2025-09-30 21:20:26.867 2 INFO nova.virt.libvirt.driver [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:20:26 compute-0 nova_compute[192810]: 2025-09-30 21:20:26.884 2 DEBUG nova.compute.manager [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.036 2 DEBUG nova.compute.manager [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.037 2 DEBUG nova.virt.libvirt.driver [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.038 2 INFO nova.virt.libvirt.driver [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Creating image(s)
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.038 2 DEBUG oslo_concurrency.lockutils [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Acquiring lock "/var/lib/nova/instances/383bdd8f-b596-427e-b6ba-f3251a14c6dd/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.038 2 DEBUG oslo_concurrency.lockutils [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Lock "/var/lib/nova/instances/383bdd8f-b596-427e-b6ba-f3251a14c6dd/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.039 2 DEBUG oslo_concurrency.lockutils [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Lock "/var/lib/nova/instances/383bdd8f-b596-427e-b6ba-f3251a14c6dd/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.050 2 DEBUG oslo_concurrency.processutils [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.130 2 DEBUG oslo_concurrency.processutils [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.131 2 DEBUG oslo_concurrency.lockutils [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.132 2 DEBUG oslo_concurrency.lockutils [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.148 2 DEBUG oslo_concurrency.processutils [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.215 2 DEBUG oslo_concurrency.processutils [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.216 2 DEBUG oslo_concurrency.processutils [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/383bdd8f-b596-427e-b6ba-f3251a14c6dd/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.267 2 DEBUG oslo_concurrency.processutils [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/383bdd8f-b596-427e-b6ba-f3251a14c6dd/disk 1073741824" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.268 2 DEBUG oslo_concurrency.lockutils [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.268 2 DEBUG oslo_concurrency.processutils [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.324 2 DEBUG oslo_concurrency.processutils [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.325 2 DEBUG nova.virt.disk.api [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Checking if we can resize image /var/lib/nova/instances/383bdd8f-b596-427e-b6ba-f3251a14c6dd/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.325 2 DEBUG oslo_concurrency.processutils [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/383bdd8f-b596-427e-b6ba-f3251a14c6dd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.376 2 DEBUG nova.network.neutron [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.377 2 DEBUG nova.compute.manager [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.383 2 DEBUG oslo_concurrency.processutils [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/383bdd8f-b596-427e-b6ba-f3251a14c6dd/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.384 2 DEBUG nova.virt.disk.api [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Cannot resize image /var/lib/nova/instances/383bdd8f-b596-427e-b6ba-f3251a14c6dd/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.384 2 DEBUG nova.objects.instance [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Lazy-loading 'migration_context' on Instance uuid 383bdd8f-b596-427e-b6ba-f3251a14c6dd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.418 2 DEBUG nova.virt.libvirt.driver [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.418 2 DEBUG nova.virt.libvirt.driver [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Ensure instance console log exists: /var/lib/nova/instances/383bdd8f-b596-427e-b6ba-f3251a14c6dd/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.419 2 DEBUG oslo_concurrency.lockutils [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.419 2 DEBUG oslo_concurrency.lockutils [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.420 2 DEBUG oslo_concurrency.lockutils [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.421 2 DEBUG nova.virt.libvirt.driver [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.426 2 WARNING nova.virt.libvirt.driver [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.433 2 DEBUG nova.virt.libvirt.host [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.433 2 DEBUG nova.virt.libvirt.host [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.436 2 DEBUG nova.virt.libvirt.host [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.437 2 DEBUG nova.virt.libvirt.host [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.438 2 DEBUG nova.virt.libvirt.driver [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.438 2 DEBUG nova.virt.hardware [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.439 2 DEBUG nova.virt.hardware [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.439 2 DEBUG nova.virt.hardware [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.439 2 DEBUG nova.virt.hardware [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.439 2 DEBUG nova.virt.hardware [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.440 2 DEBUG nova.virt.hardware [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.440 2 DEBUG nova.virt.hardware [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.440 2 DEBUG nova.virt.hardware [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.440 2 DEBUG nova.virt.hardware [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.441 2 DEBUG nova.virt.hardware [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.441 2 DEBUG nova.virt.hardware [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.445 2 DEBUG nova.objects.instance [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Lazy-loading 'pci_devices' on Instance uuid 383bdd8f-b596-427e-b6ba-f3251a14c6dd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.462 2 DEBUG nova.virt.libvirt.driver [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:20:27 compute-0 nova_compute[192810]:   <uuid>383bdd8f-b596-427e-b6ba-f3251a14c6dd</uuid>
Sep 30 21:20:27 compute-0 nova_compute[192810]:   <name>instance-00000019</name>
Sep 30 21:20:27 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:20:27 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:20:27 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:20:27 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:20:27 compute-0 nova_compute[192810]:       <nova:name>tempest-ServerExternalEventsTest-server-1373497962</nova:name>
Sep 30 21:20:27 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:20:27</nova:creationTime>
Sep 30 21:20:27 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:20:27 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:20:27 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:20:27 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:20:27 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:20:27 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:20:27 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:20:27 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:20:27 compute-0 nova_compute[192810]:         <nova:user uuid="1c7593279fc143e9a5071dea92886ec0">tempest-ServerExternalEventsTest-1772485432-project-member</nova:user>
Sep 30 21:20:27 compute-0 nova_compute[192810]:         <nova:project uuid="b37f4464db2b4515b19f044946e1878b">tempest-ServerExternalEventsTest-1772485432</nova:project>
Sep 30 21:20:27 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:20:27 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:20:27 compute-0 nova_compute[192810]:       <nova:ports/>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:20:27 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:20:27 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:20:27 compute-0 nova_compute[192810]:     <system>
Sep 30 21:20:27 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:20:27 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:20:27 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:20:27 compute-0 nova_compute[192810]:       <entry name="serial">383bdd8f-b596-427e-b6ba-f3251a14c6dd</entry>
Sep 30 21:20:27 compute-0 nova_compute[192810]:       <entry name="uuid">383bdd8f-b596-427e-b6ba-f3251a14c6dd</entry>
Sep 30 21:20:27 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     </system>
Sep 30 21:20:27 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:20:27 compute-0 nova_compute[192810]:   <os>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:20:27 compute-0 nova_compute[192810]:   </os>
Sep 30 21:20:27 compute-0 nova_compute[192810]:   <features>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:20:27 compute-0 nova_compute[192810]:   </features>
Sep 30 21:20:27 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:20:27 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:20:27 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:20:27 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:20:27 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:20:27 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:20:27 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:20:27 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:20:27 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/383bdd8f-b596-427e-b6ba-f3251a14c6dd/disk"/>
Sep 30 21:20:27 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:20:27 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:20:27 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/383bdd8f-b596-427e-b6ba-f3251a14c6dd/disk.config"/>
Sep 30 21:20:27 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:20:27 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/383bdd8f-b596-427e-b6ba-f3251a14c6dd/console.log" append="off"/>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     <video>
Sep 30 21:20:27 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     </video>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:20:27 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:20:27 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:20:27 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:20:27 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:20:27 compute-0 nova_compute[192810]: </domain>
Sep 30 21:20:27 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.526 2 DEBUG nova.virt.libvirt.driver [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.527 2 DEBUG nova.virt.libvirt.driver [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.527 2 INFO nova.virt.libvirt.driver [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Using config drive
Sep 30 21:20:27 compute-0 podman[223903]: 2025-09-30 21:20:27.572300396 +0000 UTC m=+0.070317620 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3)
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.715 2 INFO nova.virt.libvirt.driver [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Creating config drive at /var/lib/nova/instances/383bdd8f-b596-427e-b6ba-f3251a14c6dd/disk.config
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.720 2 DEBUG oslo_concurrency.processutils [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/383bdd8f-b596-427e-b6ba-f3251a14c6dd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplxb802zw execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:27 compute-0 nova_compute[192810]: 2025-09-30 21:20:27.855 2 DEBUG oslo_concurrency.processutils [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/383bdd8f-b596-427e-b6ba-f3251a14c6dd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplxb802zw" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:27 compute-0 systemd-machined[152794]: New machine qemu-10-instance-00000019.
Sep 30 21:20:27 compute-0 systemd[1]: Started Virtual Machine qemu-10-instance-00000019.
Sep 30 21:20:28 compute-0 nova_compute[192810]: 2025-09-30 21:20:28.817 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267228.8172774, 383bdd8f-b596-427e-b6ba-f3251a14c6dd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:20:28 compute-0 nova_compute[192810]: 2025-09-30 21:20:28.818 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] VM Resumed (Lifecycle Event)
Sep 30 21:20:28 compute-0 nova_compute[192810]: 2025-09-30 21:20:28.821 2 DEBUG nova.compute.manager [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:20:28 compute-0 nova_compute[192810]: 2025-09-30 21:20:28.822 2 DEBUG nova.virt.libvirt.driver [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:20:28 compute-0 nova_compute[192810]: 2025-09-30 21:20:28.827 2 INFO nova.virt.libvirt.driver [-] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Instance spawned successfully.
Sep 30 21:20:28 compute-0 nova_compute[192810]: 2025-09-30 21:20:28.828 2 DEBUG nova.virt.libvirt.driver [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:20:28 compute-0 nova_compute[192810]: 2025-09-30 21:20:28.852 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:20:28 compute-0 nova_compute[192810]: 2025-09-30 21:20:28.856 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:20:28 compute-0 nova_compute[192810]: 2025-09-30 21:20:28.868 2 DEBUG nova.virt.libvirt.driver [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:20:28 compute-0 nova_compute[192810]: 2025-09-30 21:20:28.868 2 DEBUG nova.virt.libvirt.driver [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:20:28 compute-0 nova_compute[192810]: 2025-09-30 21:20:28.869 2 DEBUG nova.virt.libvirt.driver [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:20:28 compute-0 nova_compute[192810]: 2025-09-30 21:20:28.870 2 DEBUG nova.virt.libvirt.driver [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:20:28 compute-0 nova_compute[192810]: 2025-09-30 21:20:28.870 2 DEBUG nova.virt.libvirt.driver [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:20:28 compute-0 nova_compute[192810]: 2025-09-30 21:20:28.871 2 DEBUG nova.virt.libvirt.driver [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:20:28 compute-0 nova_compute[192810]: 2025-09-30 21:20:28.880 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:20:28 compute-0 nova_compute[192810]: 2025-09-30 21:20:28.881 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267228.8208323, 383bdd8f-b596-427e-b6ba-f3251a14c6dd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:20:28 compute-0 nova_compute[192810]: 2025-09-30 21:20:28.881 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] VM Started (Lifecycle Event)
Sep 30 21:20:28 compute-0 nova_compute[192810]: 2025-09-30 21:20:28.906 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:20:28 compute-0 nova_compute[192810]: 2025-09-30 21:20:28.911 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:20:28 compute-0 nova_compute[192810]: 2025-09-30 21:20:28.936 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:20:28 compute-0 nova_compute[192810]: 2025-09-30 21:20:28.950 2 INFO nova.compute.manager [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Took 1.91 seconds to spawn the instance on the hypervisor.
Sep 30 21:20:28 compute-0 nova_compute[192810]: 2025-09-30 21:20:28.950 2 DEBUG nova.compute.manager [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:20:29 compute-0 nova_compute[192810]: 2025-09-30 21:20:29.024 2 INFO nova.compute.manager [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Took 2.48 seconds to build instance.
Sep 30 21:20:29 compute-0 nova_compute[192810]: 2025-09-30 21:20:29.038 2 DEBUG oslo_concurrency.lockutils [None req-90efafeb-33ae-4e7f-bead-002a5f94cf5a 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Lock "383bdd8f-b596-427e-b6ba-f3251a14c6dd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.763s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:29 compute-0 nova_compute[192810]: 2025-09-30 21:20:29.577 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267214.5764709, 6a3099db-5207-4a33-9efb-18c956930b55 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:20:29 compute-0 nova_compute[192810]: 2025-09-30 21:20:29.578 2 INFO nova.compute.manager [-] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] VM Stopped (Lifecycle Event)
Sep 30 21:20:29 compute-0 nova_compute[192810]: 2025-09-30 21:20:29.598 2 DEBUG nova.compute.manager [None req-d529c6f5-7eaf-4982-9d85-5ac6a92ffc9a - - - - - -] [instance: 6a3099db-5207-4a33-9efb-18c956930b55] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:20:31 compute-0 nova_compute[192810]: 2025-09-30 21:20:31.112 2 DEBUG nova.compute.manager [None req-d44fae96-db73-48e6-b998-613f438ea3e3 1ae3ab2de4774d5a9d82ca1f045073ea 3575c7c84bc840508c25b5099bbdd018 - - default default] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Received event network-changed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:20:31 compute-0 nova_compute[192810]: 2025-09-30 21:20:31.112 2 DEBUG nova.compute.manager [None req-d44fae96-db73-48e6-b998-613f438ea3e3 1ae3ab2de4774d5a9d82ca1f045073ea 3575c7c84bc840508c25b5099bbdd018 - - default default] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Refreshing instance network info cache due to event network-changed. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:20:31 compute-0 nova_compute[192810]: 2025-09-30 21:20:31.112 2 DEBUG oslo_concurrency.lockutils [None req-d44fae96-db73-48e6-b998-613f438ea3e3 1ae3ab2de4774d5a9d82ca1f045073ea 3575c7c84bc840508c25b5099bbdd018 - - default default] Acquiring lock "refresh_cache-383bdd8f-b596-427e-b6ba-f3251a14c6dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:20:31 compute-0 nova_compute[192810]: 2025-09-30 21:20:31.112 2 DEBUG oslo_concurrency.lockutils [None req-d44fae96-db73-48e6-b998-613f438ea3e3 1ae3ab2de4774d5a9d82ca1f045073ea 3575c7c84bc840508c25b5099bbdd018 - - default default] Acquired lock "refresh_cache-383bdd8f-b596-427e-b6ba-f3251a14c6dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:20:31 compute-0 nova_compute[192810]: 2025-09-30 21:20:31.113 2 DEBUG nova.network.neutron [None req-d44fae96-db73-48e6-b998-613f438ea3e3 1ae3ab2de4774d5a9d82ca1f045073ea 3575c7c84bc840508c25b5099bbdd018 - - default default] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:20:31 compute-0 nova_compute[192810]: 2025-09-30 21:20:31.355 2 DEBUG oslo_concurrency.lockutils [None req-da80e7f1-e30f-4d22-9e89-e303f1ed7a45 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Acquiring lock "383bdd8f-b596-427e-b6ba-f3251a14c6dd" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:20:31 compute-0 nova_compute[192810]: 2025-09-30 21:20:31.355 2 DEBUG oslo_concurrency.lockutils [None req-da80e7f1-e30f-4d22-9e89-e303f1ed7a45 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Lock "383bdd8f-b596-427e-b6ba-f3251a14c6dd" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:20:31 compute-0 nova_compute[192810]: 2025-09-30 21:20:31.355 2 DEBUG oslo_concurrency.lockutils [None req-da80e7f1-e30f-4d22-9e89-e303f1ed7a45 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Acquiring lock "383bdd8f-b596-427e-b6ba-f3251a14c6dd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:20:31 compute-0 nova_compute[192810]: 2025-09-30 21:20:31.356 2 DEBUG oslo_concurrency.lockutils [None req-da80e7f1-e30f-4d22-9e89-e303f1ed7a45 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Lock "383bdd8f-b596-427e-b6ba-f3251a14c6dd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:20:31 compute-0 nova_compute[192810]: 2025-09-30 21:20:31.356 2 DEBUG oslo_concurrency.lockutils [None req-da80e7f1-e30f-4d22-9e89-e303f1ed7a45 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Lock "383bdd8f-b596-427e-b6ba-f3251a14c6dd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:31 compute-0 nova_compute[192810]: 2025-09-30 21:20:31.367 2 INFO nova.compute.manager [None req-da80e7f1-e30f-4d22-9e89-e303f1ed7a45 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Terminating instance
Sep 30 21:20:31 compute-0 nova_compute[192810]: 2025-09-30 21:20:31.379 2 DEBUG nova.network.neutron [None req-d44fae96-db73-48e6-b998-613f438ea3e3 1ae3ab2de4774d5a9d82ca1f045073ea 3575c7c84bc840508c25b5099bbdd018 - - default default] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:20:31 compute-0 nova_compute[192810]: 2025-09-30 21:20:31.383 2 DEBUG oslo_concurrency.lockutils [None req-da80e7f1-e30f-4d22-9e89-e303f1ed7a45 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Acquiring lock "refresh_cache-383bdd8f-b596-427e-b6ba-f3251a14c6dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:20:31 compute-0 nova_compute[192810]: 2025-09-30 21:20:31.518 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:31 compute-0 nova_compute[192810]: 2025-09-30 21:20:31.925 2 DEBUG nova.network.neutron [None req-d44fae96-db73-48e6-b998-613f438ea3e3 1ae3ab2de4774d5a9d82ca1f045073ea 3575c7c84bc840508c25b5099bbdd018 - - default default] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:20:31 compute-0 nova_compute[192810]: 2025-09-30 21:20:31.942 2 DEBUG oslo_concurrency.lockutils [None req-d44fae96-db73-48e6-b998-613f438ea3e3 1ae3ab2de4774d5a9d82ca1f045073ea 3575c7c84bc840508c25b5099bbdd018 - - default default] Releasing lock "refresh_cache-383bdd8f-b596-427e-b6ba-f3251a14c6dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:20:31 compute-0 nova_compute[192810]: 2025-09-30 21:20:31.944 2 DEBUG oslo_concurrency.lockutils [None req-da80e7f1-e30f-4d22-9e89-e303f1ed7a45 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Acquired lock "refresh_cache-383bdd8f-b596-427e-b6ba-f3251a14c6dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:20:31 compute-0 nova_compute[192810]: 2025-09-30 21:20:31.945 2 DEBUG nova.network.neutron [None req-da80e7f1-e30f-4d22-9e89-e303f1ed7a45 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:20:32 compute-0 nova_compute[192810]: 2025-09-30 21:20:32.057 2 DEBUG nova.network.neutron [None req-da80e7f1-e30f-4d22-9e89-e303f1ed7a45 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:20:32 compute-0 nova_compute[192810]: 2025-09-30 21:20:32.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:32 compute-0 podman[223950]: 2025-09-30 21:20:32.373347064 +0000 UTC m=+0.088856341 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20250923, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Sep 30 21:20:33 compute-0 nova_compute[192810]: 2025-09-30 21:20:33.225 2 DEBUG nova.network.neutron [None req-da80e7f1-e30f-4d22-9e89-e303f1ed7a45 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:20:33 compute-0 nova_compute[192810]: 2025-09-30 21:20:33.246 2 DEBUG oslo_concurrency.lockutils [None req-da80e7f1-e30f-4d22-9e89-e303f1ed7a45 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Releasing lock "refresh_cache-383bdd8f-b596-427e-b6ba-f3251a14c6dd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:20:33 compute-0 nova_compute[192810]: 2025-09-30 21:20:33.247 2 DEBUG nova.compute.manager [None req-da80e7f1-e30f-4d22-9e89-e303f1ed7a45 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:20:33 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000019.scope: Deactivated successfully.
Sep 30 21:20:33 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000019.scope: Consumed 5.234s CPU time.
Sep 30 21:20:33 compute-0 systemd-machined[152794]: Machine qemu-10-instance-00000019 terminated.
Sep 30 21:20:33 compute-0 nova_compute[192810]: 2025-09-30 21:20:33.517 2 INFO nova.virt.libvirt.driver [-] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Instance destroyed successfully.
Sep 30 21:20:33 compute-0 nova_compute[192810]: 2025-09-30 21:20:33.518 2 DEBUG nova.objects.instance [None req-da80e7f1-e30f-4d22-9e89-e303f1ed7a45 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Lazy-loading 'resources' on Instance uuid 383bdd8f-b596-427e-b6ba-f3251a14c6dd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:20:33 compute-0 nova_compute[192810]: 2025-09-30 21:20:33.531 2 INFO nova.virt.libvirt.driver [None req-da80e7f1-e30f-4d22-9e89-e303f1ed7a45 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Deleting instance files /var/lib/nova/instances/383bdd8f-b596-427e-b6ba-f3251a14c6dd_del
Sep 30 21:20:33 compute-0 nova_compute[192810]: 2025-09-30 21:20:33.532 2 INFO nova.virt.libvirt.driver [None req-da80e7f1-e30f-4d22-9e89-e303f1ed7a45 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Deletion of /var/lib/nova/instances/383bdd8f-b596-427e-b6ba-f3251a14c6dd_del complete
Sep 30 21:20:33 compute-0 nova_compute[192810]: 2025-09-30 21:20:33.611 2 INFO nova.compute.manager [None req-da80e7f1-e30f-4d22-9e89-e303f1ed7a45 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Took 0.36 seconds to destroy the instance on the hypervisor.
Sep 30 21:20:33 compute-0 nova_compute[192810]: 2025-09-30 21:20:33.612 2 DEBUG oslo.service.loopingcall [None req-da80e7f1-e30f-4d22-9e89-e303f1ed7a45 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:20:33 compute-0 nova_compute[192810]: 2025-09-30 21:20:33.613 2 DEBUG nova.compute.manager [-] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:20:33 compute-0 nova_compute[192810]: 2025-09-30 21:20:33.613 2 DEBUG nova.network.neutron [-] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:20:34 compute-0 nova_compute[192810]: 2025-09-30 21:20:34.233 2 DEBUG nova.network.neutron [-] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:20:34 compute-0 nova_compute[192810]: 2025-09-30 21:20:34.250 2 DEBUG nova.network.neutron [-] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:20:34 compute-0 nova_compute[192810]: 2025-09-30 21:20:34.272 2 INFO nova.compute.manager [-] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Took 0.66 seconds to deallocate network for instance.
Sep 30 21:20:34 compute-0 nova_compute[192810]: 2025-09-30 21:20:34.376 2 DEBUG oslo_concurrency.lockutils [None req-da80e7f1-e30f-4d22-9e89-e303f1ed7a45 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:20:34 compute-0 nova_compute[192810]: 2025-09-30 21:20:34.376 2 DEBUG oslo_concurrency.lockutils [None req-da80e7f1-e30f-4d22-9e89-e303f1ed7a45 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:20:34 compute-0 nova_compute[192810]: 2025-09-30 21:20:34.475 2 DEBUG nova.compute.provider_tree [None req-da80e7f1-e30f-4d22-9e89-e303f1ed7a45 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:20:34 compute-0 nova_compute[192810]: 2025-09-30 21:20:34.496 2 DEBUG nova.scheduler.client.report [None req-da80e7f1-e30f-4d22-9e89-e303f1ed7a45 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:20:34 compute-0 nova_compute[192810]: 2025-09-30 21:20:34.528 2 DEBUG oslo_concurrency.lockutils [None req-da80e7f1-e30f-4d22-9e89-e303f1ed7a45 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.152s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:34 compute-0 nova_compute[192810]: 2025-09-30 21:20:34.573 2 INFO nova.scheduler.client.report [None req-da80e7f1-e30f-4d22-9e89-e303f1ed7a45 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Deleted allocations for instance 383bdd8f-b596-427e-b6ba-f3251a14c6dd
Sep 30 21:20:34 compute-0 nova_compute[192810]: 2025-09-30 21:20:34.660 2 DEBUG oslo_concurrency.lockutils [None req-da80e7f1-e30f-4d22-9e89-e303f1ed7a45 1c7593279fc143e9a5071dea92886ec0 b37f4464db2b4515b19f044946e1878b - - default default] Lock "383bdd8f-b596-427e-b6ba-f3251a14c6dd" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.305s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:34 compute-0 nova_compute[192810]: 2025-09-30 21:20:34.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:20:34 compute-0 nova_compute[192810]: 2025-09-30 21:20:34.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:20:35 compute-0 nova_compute[192810]: 2025-09-30 21:20:35.232 2 DEBUG oslo_concurrency.lockutils [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "56118128-4b8a-4d28-aeb3-585059dc4de8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:20:35 compute-0 nova_compute[192810]: 2025-09-30 21:20:35.233 2 DEBUG oslo_concurrency.lockutils [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "56118128-4b8a-4d28-aeb3-585059dc4de8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:20:35 compute-0 nova_compute[192810]: 2025-09-30 21:20:35.248 2 DEBUG nova.compute.manager [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:20:35 compute-0 nova_compute[192810]: 2025-09-30 21:20:35.336 2 DEBUG oslo_concurrency.lockutils [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:20:35 compute-0 nova_compute[192810]: 2025-09-30 21:20:35.336 2 DEBUG oslo_concurrency.lockutils [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:20:35 compute-0 nova_compute[192810]: 2025-09-30 21:20:35.342 2 DEBUG nova.virt.hardware [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:20:35 compute-0 nova_compute[192810]: 2025-09-30 21:20:35.342 2 INFO nova.compute.claims [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:20:35 compute-0 nova_compute[192810]: 2025-09-30 21:20:35.485 2 DEBUG nova.compute.provider_tree [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:20:35 compute-0 nova_compute[192810]: 2025-09-30 21:20:35.509 2 DEBUG nova.scheduler.client.report [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:20:35 compute-0 nova_compute[192810]: 2025-09-30 21:20:35.537 2 DEBUG oslo_concurrency.lockutils [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.201s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:35 compute-0 nova_compute[192810]: 2025-09-30 21:20:35.538 2 DEBUG nova.compute.manager [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:20:35 compute-0 nova_compute[192810]: 2025-09-30 21:20:35.606 2 DEBUG nova.compute.manager [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:20:35 compute-0 nova_compute[192810]: 2025-09-30 21:20:35.607 2 DEBUG nova.network.neutron [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:20:35 compute-0 nova_compute[192810]: 2025-09-30 21:20:35.627 2 INFO nova.virt.libvirt.driver [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:20:35 compute-0 nova_compute[192810]: 2025-09-30 21:20:35.652 2 DEBUG nova.compute.manager [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:20:35 compute-0 nova_compute[192810]: 2025-09-30 21:20:35.806 2 DEBUG nova.compute.manager [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:20:35 compute-0 nova_compute[192810]: 2025-09-30 21:20:35.808 2 DEBUG nova.virt.libvirt.driver [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:20:35 compute-0 nova_compute[192810]: 2025-09-30 21:20:35.809 2 INFO nova.virt.libvirt.driver [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] Creating image(s)
Sep 30 21:20:35 compute-0 nova_compute[192810]: 2025-09-30 21:20:35.810 2 DEBUG oslo_concurrency.lockutils [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "/var/lib/nova/instances/56118128-4b8a-4d28-aeb3-585059dc4de8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:20:35 compute-0 nova_compute[192810]: 2025-09-30 21:20:35.810 2 DEBUG oslo_concurrency.lockutils [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "/var/lib/nova/instances/56118128-4b8a-4d28-aeb3-585059dc4de8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:20:35 compute-0 nova_compute[192810]: 2025-09-30 21:20:35.811 2 DEBUG oslo_concurrency.lockutils [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "/var/lib/nova/instances/56118128-4b8a-4d28-aeb3-585059dc4de8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:35 compute-0 nova_compute[192810]: 2025-09-30 21:20:35.835 2 DEBUG oslo_concurrency.processutils [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:35 compute-0 nova_compute[192810]: 2025-09-30 21:20:35.937 2 DEBUG oslo_concurrency.processutils [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.102s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:35 compute-0 nova_compute[192810]: 2025-09-30 21:20:35.938 2 DEBUG oslo_concurrency.lockutils [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:20:35 compute-0 nova_compute[192810]: 2025-09-30 21:20:35.939 2 DEBUG oslo_concurrency.lockutils [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:20:35 compute-0 nova_compute[192810]: 2025-09-30 21:20:35.962 2 DEBUG oslo_concurrency.processutils [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:36 compute-0 nova_compute[192810]: 2025-09-30 21:20:36.021 2 DEBUG oslo_concurrency.processutils [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:36 compute-0 nova_compute[192810]: 2025-09-30 21:20:36.023 2 DEBUG oslo_concurrency.processutils [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/56118128-4b8a-4d28-aeb3-585059dc4de8/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:36 compute-0 nova_compute[192810]: 2025-09-30 21:20:36.078 2 DEBUG oslo_concurrency.processutils [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/56118128-4b8a-4d28-aeb3-585059dc4de8/disk 1073741824" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:36 compute-0 nova_compute[192810]: 2025-09-30 21:20:36.080 2 DEBUG oslo_concurrency.lockutils [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:36 compute-0 nova_compute[192810]: 2025-09-30 21:20:36.081 2 DEBUG oslo_concurrency.processutils [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:36 compute-0 nova_compute[192810]: 2025-09-30 21:20:36.158 2 DEBUG oslo_concurrency.processutils [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:36 compute-0 nova_compute[192810]: 2025-09-30 21:20:36.160 2 DEBUG nova.virt.disk.api [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Checking if we can resize image /var/lib/nova/instances/56118128-4b8a-4d28-aeb3-585059dc4de8/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:20:36 compute-0 nova_compute[192810]: 2025-09-30 21:20:36.160 2 DEBUG oslo_concurrency.processutils [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/56118128-4b8a-4d28-aeb3-585059dc4de8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:36 compute-0 nova_compute[192810]: 2025-09-30 21:20:36.185 2 DEBUG nova.network.neutron [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Sep 30 21:20:36 compute-0 nova_compute[192810]: 2025-09-30 21:20:36.186 2 DEBUG nova.compute.manager [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:20:36 compute-0 nova_compute[192810]: 2025-09-30 21:20:36.223 2 DEBUG oslo_concurrency.processutils [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/56118128-4b8a-4d28-aeb3-585059dc4de8/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:36 compute-0 nova_compute[192810]: 2025-09-30 21:20:36.224 2 DEBUG nova.virt.disk.api [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Cannot resize image /var/lib/nova/instances/56118128-4b8a-4d28-aeb3-585059dc4de8/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:20:36 compute-0 nova_compute[192810]: 2025-09-30 21:20:36.225 2 DEBUG nova.objects.instance [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lazy-loading 'migration_context' on Instance uuid 56118128-4b8a-4d28-aeb3-585059dc4de8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:20:36 compute-0 nova_compute[192810]: 2025-09-30 21:20:36.236 2 DEBUG nova.virt.libvirt.driver [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:20:36 compute-0 nova_compute[192810]: 2025-09-30 21:20:36.237 2 DEBUG nova.virt.libvirt.driver [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] Ensure instance console log exists: /var/lib/nova/instances/56118128-4b8a-4d28-aeb3-585059dc4de8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:20:36 compute-0 nova_compute[192810]: 2025-09-30 21:20:36.237 2 DEBUG oslo_concurrency.lockutils [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:20:36 compute-0 nova_compute[192810]: 2025-09-30 21:20:36.237 2 DEBUG oslo_concurrency.lockutils [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:20:36 compute-0 nova_compute[192810]: 2025-09-30 21:20:36.238 2 DEBUG oslo_concurrency.lockutils [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:36 compute-0 nova_compute[192810]: 2025-09-30 21:20:36.239 2 DEBUG nova.virt.libvirt.driver [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:20:36 compute-0 nova_compute[192810]: 2025-09-30 21:20:36.243 2 WARNING nova.virt.libvirt.driver [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:20:36 compute-0 nova_compute[192810]: 2025-09-30 21:20:36.248 2 DEBUG nova.virt.libvirt.host [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:20:36 compute-0 nova_compute[192810]: 2025-09-30 21:20:36.249 2 DEBUG nova.virt.libvirt.host [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:20:36 compute-0 nova_compute[192810]: 2025-09-30 21:20:36.256 2 DEBUG nova.virt.libvirt.host [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:20:36 compute-0 nova_compute[192810]: 2025-09-30 21:20:36.257 2 DEBUG nova.virt.libvirt.host [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:20:36 compute-0 nova_compute[192810]: 2025-09-30 21:20:36.258 2 DEBUG nova.virt.libvirt.driver [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:20:36 compute-0 nova_compute[192810]: 2025-09-30 21:20:36.258 2 DEBUG nova.virt.hardware [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:20:36 compute-0 nova_compute[192810]: 2025-09-30 21:20:36.259 2 DEBUG nova.virt.hardware [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:20:36 compute-0 nova_compute[192810]: 2025-09-30 21:20:36.259 2 DEBUG nova.virt.hardware [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:20:36 compute-0 nova_compute[192810]: 2025-09-30 21:20:36.259 2 DEBUG nova.virt.hardware [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:20:36 compute-0 nova_compute[192810]: 2025-09-30 21:20:36.260 2 DEBUG nova.virt.hardware [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:20:36 compute-0 nova_compute[192810]: 2025-09-30 21:20:36.260 2 DEBUG nova.virt.hardware [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:20:36 compute-0 nova_compute[192810]: 2025-09-30 21:20:36.260 2 DEBUG nova.virt.hardware [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:20:36 compute-0 nova_compute[192810]: 2025-09-30 21:20:36.260 2 DEBUG nova.virt.hardware [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:20:36 compute-0 nova_compute[192810]: 2025-09-30 21:20:36.261 2 DEBUG nova.virt.hardware [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:20:36 compute-0 nova_compute[192810]: 2025-09-30 21:20:36.261 2 DEBUG nova.virt.hardware [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:20:36 compute-0 nova_compute[192810]: 2025-09-30 21:20:36.261 2 DEBUG nova.virt.hardware [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:20:36 compute-0 nova_compute[192810]: 2025-09-30 21:20:36.265 2 DEBUG nova.objects.instance [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lazy-loading 'pci_devices' on Instance uuid 56118128-4b8a-4d28-aeb3-585059dc4de8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:20:36 compute-0 nova_compute[192810]: 2025-09-30 21:20:36.278 2 DEBUG nova.virt.libvirt.driver [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:20:36 compute-0 nova_compute[192810]:   <uuid>56118128-4b8a-4d28-aeb3-585059dc4de8</uuid>
Sep 30 21:20:36 compute-0 nova_compute[192810]:   <name>instance-0000001a</name>
Sep 30 21:20:36 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:20:36 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:20:36 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:20:36 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:20:36 compute-0 nova_compute[192810]:       <nova:name>tempest-MigrationsAdminTest-server-1443112583</nova:name>
Sep 30 21:20:36 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:20:36</nova:creationTime>
Sep 30 21:20:36 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:20:36 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:20:36 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:20:36 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:20:36 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:20:36 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:20:36 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:20:36 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:20:36 compute-0 nova_compute[192810]:         <nova:user uuid="f11be289e05b49a7809cb1a3523abc0c">tempest-MigrationsAdminTest-1333693346-project-member</nova:user>
Sep 30 21:20:36 compute-0 nova_compute[192810]:         <nova:project uuid="7c15359849554c2382315de9f52125af">tempest-MigrationsAdminTest-1333693346</nova:project>
Sep 30 21:20:36 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:20:36 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:20:36 compute-0 nova_compute[192810]:       <nova:ports/>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:20:36 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:20:36 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:20:36 compute-0 nova_compute[192810]:     <system>
Sep 30 21:20:36 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:20:36 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:20:36 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:20:36 compute-0 nova_compute[192810]:       <entry name="serial">56118128-4b8a-4d28-aeb3-585059dc4de8</entry>
Sep 30 21:20:36 compute-0 nova_compute[192810]:       <entry name="uuid">56118128-4b8a-4d28-aeb3-585059dc4de8</entry>
Sep 30 21:20:36 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     </system>
Sep 30 21:20:36 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:20:36 compute-0 nova_compute[192810]:   <os>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:20:36 compute-0 nova_compute[192810]:   </os>
Sep 30 21:20:36 compute-0 nova_compute[192810]:   <features>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:20:36 compute-0 nova_compute[192810]:   </features>
Sep 30 21:20:36 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:20:36 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:20:36 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:20:36 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:20:36 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:20:36 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:20:36 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:20:36 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:20:36 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/56118128-4b8a-4d28-aeb3-585059dc4de8/disk"/>
Sep 30 21:20:36 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:20:36 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:20:36 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/56118128-4b8a-4d28-aeb3-585059dc4de8/disk.config"/>
Sep 30 21:20:36 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:20:36 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/56118128-4b8a-4d28-aeb3-585059dc4de8/console.log" append="off"/>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     <video>
Sep 30 21:20:36 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     </video>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:20:36 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:20:36 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:20:36 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:20:36 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:20:36 compute-0 nova_compute[192810]: </domain>
Sep 30 21:20:36 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:20:36 compute-0 nova_compute[192810]: 2025-09-30 21:20:36.332 2 DEBUG nova.virt.libvirt.driver [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:20:36 compute-0 nova_compute[192810]: 2025-09-30 21:20:36.333 2 DEBUG nova.virt.libvirt.driver [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:20:36 compute-0 nova_compute[192810]: 2025-09-30 21:20:36.333 2 INFO nova.virt.libvirt.driver [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] Using config drive
Sep 30 21:20:36 compute-0 nova_compute[192810]: 2025-09-30 21:20:36.519 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:36 compute-0 nova_compute[192810]: 2025-09-30 21:20:36.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:20:37 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Sep 30 21:20:37 compute-0 nova_compute[192810]: 2025-09-30 21:20:37.237 2 INFO nova.virt.libvirt.driver [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] Creating config drive at /var/lib/nova/instances/56118128-4b8a-4d28-aeb3-585059dc4de8/disk.config
Sep 30 21:20:37 compute-0 nova_compute[192810]: 2025-09-30 21:20:37.247 2 DEBUG oslo_concurrency.processutils [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/56118128-4b8a-4d28-aeb3-585059dc4de8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcn06k20t execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:37 compute-0 nova_compute[192810]: 2025-09-30 21:20:37.278 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:37 compute-0 nova_compute[192810]: 2025-09-30 21:20:37.393 2 DEBUG oslo_concurrency.processutils [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/56118128-4b8a-4d28-aeb3-585059dc4de8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcn06k20t" returned: 0 in 0.146s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:37 compute-0 systemd-machined[152794]: New machine qemu-11-instance-0000001a.
Sep 30 21:20:37 compute-0 systemd[1]: Started Virtual Machine qemu-11-instance-0000001a.
Sep 30 21:20:37 compute-0 nova_compute[192810]: 2025-09-30 21:20:37.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:20:37 compute-0 nova_compute[192810]: 2025-09-30 21:20:37.789 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:20:37 compute-0 nova_compute[192810]: 2025-09-30 21:20:37.810 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267222.8093598, c1a4fc59-617f-432f-9006-e50d45407087 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:20:37 compute-0 nova_compute[192810]: 2025-09-30 21:20:37.811 2 INFO nova.compute.manager [-] [instance: c1a4fc59-617f-432f-9006-e50d45407087] VM Stopped (Lifecycle Event)
Sep 30 21:20:37 compute-0 nova_compute[192810]: 2025-09-30 21:20:37.842 2 DEBUG nova.compute.manager [None req-ecc3f616-9d5c-4be9-aaaf-1793862aac73 - - - - - -] [instance: c1a4fc59-617f-432f-9006-e50d45407087] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:20:38 compute-0 nova_compute[192810]: 2025-09-30 21:20:38.721 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267238.720658, 56118128-4b8a-4d28-aeb3-585059dc4de8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:20:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:20:38.724 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:20:38 compute-0 nova_compute[192810]: 2025-09-30 21:20:38.723 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] VM Resumed (Lifecycle Event)
Sep 30 21:20:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:20:38.726 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:20:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:20:38.726 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:38 compute-0 nova_compute[192810]: 2025-09-30 21:20:38.726 2 DEBUG nova.compute.manager [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:20:38 compute-0 nova_compute[192810]: 2025-09-30 21:20:38.727 2 DEBUG nova.virt.libvirt.driver [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:20:38 compute-0 nova_compute[192810]: 2025-09-30 21:20:38.731 2 INFO nova.virt.libvirt.driver [-] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] Instance spawned successfully.
Sep 30 21:20:38 compute-0 nova_compute[192810]: 2025-09-30 21:20:38.731 2 DEBUG nova.virt.libvirt.driver [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:20:38 compute-0 nova_compute[192810]: 2025-09-30 21:20:38.756 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:20:38 compute-0 nova_compute[192810]: 2025-09-30 21:20:38.760 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:20:38 compute-0 nova_compute[192810]: 2025-09-30 21:20:38.763 2 DEBUG nova.virt.libvirt.driver [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:20:38 compute-0 nova_compute[192810]: 2025-09-30 21:20:38.763 2 DEBUG nova.virt.libvirt.driver [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:20:38 compute-0 nova_compute[192810]: 2025-09-30 21:20:38.764 2 DEBUG nova.virt.libvirt.driver [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:20:38 compute-0 nova_compute[192810]: 2025-09-30 21:20:38.764 2 DEBUG nova.virt.libvirt.driver [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:20:38 compute-0 nova_compute[192810]: 2025-09-30 21:20:38.764 2 DEBUG nova.virt.libvirt.driver [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:20:38 compute-0 nova_compute[192810]: 2025-09-30 21:20:38.765 2 DEBUG nova.virt.libvirt.driver [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:20:38 compute-0 nova_compute[192810]: 2025-09-30 21:20:38.789 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:20:38 compute-0 nova_compute[192810]: 2025-09-30 21:20:38.797 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:20:38 compute-0 nova_compute[192810]: 2025-09-30 21:20:38.797 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267238.7256992, 56118128-4b8a-4d28-aeb3-585059dc4de8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:20:38 compute-0 nova_compute[192810]: 2025-09-30 21:20:38.797 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] VM Started (Lifecycle Event)
Sep 30 21:20:38 compute-0 nova_compute[192810]: 2025-09-30 21:20:38.850 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:20:38 compute-0 nova_compute[192810]: 2025-09-30 21:20:38.855 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:20:38 compute-0 nova_compute[192810]: 2025-09-30 21:20:38.883 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:20:38 compute-0 nova_compute[192810]: 2025-09-30 21:20:38.888 2 INFO nova.compute.manager [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] Took 3.08 seconds to spawn the instance on the hypervisor.
Sep 30 21:20:38 compute-0 nova_compute[192810]: 2025-09-30 21:20:38.889 2 DEBUG nova.compute.manager [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:20:39 compute-0 nova_compute[192810]: 2025-09-30 21:20:39.136 2 INFO nova.compute.manager [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] Took 3.83 seconds to build instance.
Sep 30 21:20:39 compute-0 nova_compute[192810]: 2025-09-30 21:20:39.177 2 DEBUG oslo_concurrency.lockutils [None req-95b86367-9b4a-478f-a6a6-7bdb6ba9c129 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "56118128-4b8a-4d28-aeb3-585059dc4de8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.944s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:40 compute-0 podman[224025]: 2025-09-30 21:20:40.35248049 +0000 UTC m=+0.080975606 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, architecture=x86_64, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, config_id=edpm)
Sep 30 21:20:40 compute-0 podman[224024]: 2025-09-30 21:20:40.37742189 +0000 UTC m=+0.102178683 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:20:40 compute-0 nova_compute[192810]: 2025-09-30 21:20:40.783 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:20:40 compute-0 nova_compute[192810]: 2025-09-30 21:20:40.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:20:40 compute-0 nova_compute[192810]: 2025-09-30 21:20:40.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:20:40 compute-0 nova_compute[192810]: 2025-09-30 21:20:40.819 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:20:40 compute-0 nova_compute[192810]: 2025-09-30 21:20:40.820 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:20:40 compute-0 nova_compute[192810]: 2025-09-30 21:20:40.820 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:40 compute-0 nova_compute[192810]: 2025-09-30 21:20:40.820 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:20:40 compute-0 nova_compute[192810]: 2025-09-30 21:20:40.895 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/56118128-4b8a-4d28-aeb3-585059dc4de8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:40 compute-0 nova_compute[192810]: 2025-09-30 21:20:40.990 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/56118128-4b8a-4d28-aeb3-585059dc4de8/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:40 compute-0 nova_compute[192810]: 2025-09-30 21:20:40.991 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/56118128-4b8a-4d28-aeb3-585059dc4de8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:41 compute-0 nova_compute[192810]: 2025-09-30 21:20:41.062 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/56118128-4b8a-4d28-aeb3-585059dc4de8/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:41 compute-0 nova_compute[192810]: 2025-09-30 21:20:41.188 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:20:41 compute-0 nova_compute[192810]: 2025-09-30 21:20:41.190 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5573MB free_disk=73.46239852905273GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:20:41 compute-0 nova_compute[192810]: 2025-09-30 21:20:41.190 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:20:41 compute-0 nova_compute[192810]: 2025-09-30 21:20:41.190 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:20:41 compute-0 nova_compute[192810]: 2025-09-30 21:20:41.255 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Instance 56118128-4b8a-4d28-aeb3-585059dc4de8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:20:41 compute-0 nova_compute[192810]: 2025-09-30 21:20:41.255 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:20:41 compute-0 nova_compute[192810]: 2025-09-30 21:20:41.255 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:20:41 compute-0 nova_compute[192810]: 2025-09-30 21:20:41.290 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:20:41 compute-0 sshd-session[224067]: Invalid user icp from 45.81.23.80 port 55336
Sep 30 21:20:41 compute-0 sshd-session[224067]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:20:41 compute-0 sshd-session[224067]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=45.81.23.80
Sep 30 21:20:41 compute-0 nova_compute[192810]: 2025-09-30 21:20:41.313 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:20:41 compute-0 nova_compute[192810]: 2025-09-30 21:20:41.334 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:20:41 compute-0 nova_compute[192810]: 2025-09-30 21:20:41.335 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:20:41 compute-0 nova_compute[192810]: 2025-09-30 21:20:41.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:42 compute-0 nova_compute[192810]: 2025-09-30 21:20:42.281 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:42 compute-0 nova_compute[192810]: 2025-09-30 21:20:42.334 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:20:42 compute-0 nova_compute[192810]: 2025-09-30 21:20:42.335 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:20:42 compute-0 nova_compute[192810]: 2025-09-30 21:20:42.336 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:20:42 compute-0 nova_compute[192810]: 2025-09-30 21:20:42.574 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "refresh_cache-56118128-4b8a-4d28-aeb3-585059dc4de8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:20:42 compute-0 nova_compute[192810]: 2025-09-30 21:20:42.575 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquired lock "refresh_cache-56118128-4b8a-4d28-aeb3-585059dc4de8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:20:42 compute-0 nova_compute[192810]: 2025-09-30 21:20:42.575 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Sep 30 21:20:42 compute-0 nova_compute[192810]: 2025-09-30 21:20:42.575 2 DEBUG nova.objects.instance [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lazy-loading 'info_cache' on Instance uuid 56118128-4b8a-4d28-aeb3-585059dc4de8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:20:42 compute-0 nova_compute[192810]: 2025-09-30 21:20:42.891 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:20:43 compute-0 nova_compute[192810]: 2025-09-30 21:20:43.306 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:20:43 compute-0 nova_compute[192810]: 2025-09-30 21:20:43.327 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Releasing lock "refresh_cache-56118128-4b8a-4d28-aeb3-585059dc4de8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:20:43 compute-0 nova_compute[192810]: 2025-09-30 21:20:43.327 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Sep 30 21:20:43 compute-0 sshd-session[224067]: Failed password for invalid user icp from 45.81.23.80 port 55336 ssh2
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.908 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '56118128-4b8a-4d28-aeb3-585059dc4de8', 'name': 'tempest-MigrationsAdminTest-server-1443112583', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000001a', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '7c15359849554c2382315de9f52125af', 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'hostId': '3db9e547ab0bd20efc83f4476016fcc9a3e113db01aa0c42deec3089', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.910 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.910 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.910 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-MigrationsAdminTest-server-1443112583>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-MigrationsAdminTest-server-1443112583>]
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.911 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.915 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.937 12 DEBUG ceilometer.compute.pollsters [-] 56118128-4b8a-4d28-aeb3-585059dc4de8/disk.device.read.latency volume: 397931313 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.938 12 DEBUG ceilometer.compute.pollsters [-] 56118128-4b8a-4d28-aeb3-585059dc4de8/disk.device.read.latency volume: 2540753 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7a5836e8-eeed-4ecf-a044-b59513353b5e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 397931313, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '56118128-4b8a-4d28-aeb3-585059dc4de8-vda', 'timestamp': '2025-09-30T21:20:43.915979', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1443112583', 'name': 'instance-0000001a', 'instance_id': '56118128-4b8a-4d28-aeb3-585059dc4de8', 'instance_type': 'm1.nano', 'host': '3db9e547ab0bd20efc83f4476016fcc9a3e113db01aa0c42deec3089', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '52e785da-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3939.603554636, 'message_signature': '7c97192a3cfadd91cc5eecfa766cf6c8a4daaec340205b79bee5fa2c1a6a70ff'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2540753, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '56118128-4b8a-4d28-aeb3-585059dc4de8-sda', 'timestamp': '2025-09-30T21:20:43.915979', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1443112583', 'name': 'instance-0000001a', 'instance_id': '56118128-4b8a-4d28-aeb3-585059dc4de8', 'instance_type': 'm1.nano', 'host': '3db9e547ab0bd20efc83f4476016fcc9a3e113db01aa0c42deec3089', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '52e79214-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3939.603554636, 'message_signature': 'a98b5e2459eb74a8d21a7a4a6353acd497947143a9a8642f471fb219fb4da1f3'}]}, 'timestamp': '2025-09-30 21:20:43.939189', '_unique_id': '4a918448756e422c94287448c59d426b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.940 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.941 12 DEBUG ceilometer.compute.pollsters [-] 56118128-4b8a-4d28-aeb3-585059dc4de8/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.941 12 DEBUG ceilometer.compute.pollsters [-] 56118128-4b8a-4d28-aeb3-585059dc4de8/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bcdafcd5-00cc-4258-abec-6f06fac17799', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '56118128-4b8a-4d28-aeb3-585059dc4de8-vda', 'timestamp': '2025-09-30T21:20:43.941051', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1443112583', 'name': 'instance-0000001a', 'instance_id': '56118128-4b8a-4d28-aeb3-585059dc4de8', 'instance_type': 'm1.nano', 'host': '3db9e547ab0bd20efc83f4476016fcc9a3e113db01aa0c42deec3089', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '52e7e426-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3939.603554636, 'message_signature': '35a62ae858f30f673551a96a587fc8e0f3bdad98c95c274bb0f7ec7395a2d929'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '56118128-4b8a-4d28-aeb3-585059dc4de8-sda', 'timestamp': '2025-09-30T21:20:43.941051', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1443112583', 'name': 'instance-0000001a', 'instance_id': '56118128-4b8a-4d28-aeb3-585059dc4de8', 'instance_type': 'm1.nano', 'host': '3db9e547ab0bd20efc83f4476016fcc9a3e113db01aa0c42deec3089', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '52e7ebf6-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3939.603554636, 'message_signature': '6563dabd3a604c93d87af9f07ab549affb37392b7a9143a93312437bea81c132'}]}, 'timestamp': '2025-09-30 21:20:43.941472', '_unique_id': 'c8ee256e4eab4e20828f386c3b35419f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.944 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.946 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.946 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-MigrationsAdminTest-server-1443112583>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-MigrationsAdminTest-server-1443112583>]
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.947 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.947 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.947 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.964 12 DEBUG ceilometer.compute.pollsters [-] 56118128-4b8a-4d28-aeb3-585059dc4de8/cpu volume: 5080000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '63784a5f-128c-4cdb-b36f-86e9549357be', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 5080000000, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '56118128-4b8a-4d28-aeb3-585059dc4de8', 'timestamp': '2025-09-30T21:20:43.947580', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1443112583', 'name': 'instance-0000001a', 'instance_id': '56118128-4b8a-4d28-aeb3-585059dc4de8', 'instance_type': 'm1.nano', 'host': '3db9e547ab0bd20efc83f4476016fcc9a3e113db01aa0c42deec3089', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '52eb783e-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3939.651614922, 'message_signature': '38006dfa529e0f5296c46d61fc7649526a27abb18e1b4f9f0a617ea448ab462e'}]}, 'timestamp': '2025-09-30 21:20:43.965028', '_unique_id': 'c9532c29e8f8487cb52d4259a01dba29'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.965 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.966 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.968 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.977 12 DEBUG ceilometer.compute.pollsters [-] 56118128-4b8a-4d28-aeb3-585059dc4de8/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.978 12 DEBUG ceilometer.compute.pollsters [-] 56118128-4b8a-4d28-aeb3-585059dc4de8/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'add94f2e-f37e-4d8b-8da3-fd7aec92cedc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '56118128-4b8a-4d28-aeb3-585059dc4de8-vda', 'timestamp': '2025-09-30T21:20:43.968870', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1443112583', 'name': 'instance-0000001a', 'instance_id': '56118128-4b8a-4d28-aeb3-585059dc4de8', 'instance_type': 'm1.nano', 'host': '3db9e547ab0bd20efc83f4476016fcc9a3e113db01aa0c42deec3089', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '52ed925e-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3939.65637138, 'message_signature': '914bdf3b14f2c3cb65f7fdc59cc65f5cb9b21e30647db12ab9183fe1cc3ad10f'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '56118128-4b8a-4d28-aeb3-585059dc4de8-sda', 'timestamp': '2025-09-30T21:20:43.968870', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1443112583', 'name': 'instance-0000001a', 'instance_id': '56118128-4b8a-4d28-aeb3-585059dc4de8', 'instance_type': 'm1.nano', 'host': '3db9e547ab0bd20efc83f4476016fcc9a3e113db01aa0c42deec3089', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '52ed9ea2-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3939.65637138, 'message_signature': 'a5c7159c7b453d590497c4da4495690885fddefdb49d0717553dc59cb9821e91'}]}, 'timestamp': '2025-09-30 21:20:43.978909', '_unique_id': '18c2bb6d8ed9458f8af13661a91542e6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.979 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.980 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.982 12 DEBUG ceilometer.compute.pollsters [-] 56118128-4b8a-4d28-aeb3-585059dc4de8/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.983 12 DEBUG ceilometer.compute.pollsters [-] 56118128-4b8a-4d28-aeb3-585059dc4de8/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ed5dd117-2d9c-4bb8-a258-fcea73091635', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '56118128-4b8a-4d28-aeb3-585059dc4de8-vda', 'timestamp': '2025-09-30T21:20:43.982671', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1443112583', 'name': 'instance-0000001a', 'instance_id': '56118128-4b8a-4d28-aeb3-585059dc4de8', 'instance_type': 'm1.nano', 'host': '3db9e547ab0bd20efc83f4476016fcc9a3e113db01aa0c42deec3089', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '52ee43ac-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3939.603554636, 'message_signature': '2c8ecddaef9202f76523138830a6bafc5407f72c3fd0df9214391f1fd97acf71'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '56118128-4b8a-4d28-aeb3-585059dc4de8-sda', 'timestamp': '2025-09-30T21:20:43.982671', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1443112583', 'name': 'instance-0000001a', 'instance_id': '56118128-4b8a-4d28-aeb3-585059dc4de8', 'instance_type': 'm1.nano', 'host': '3db9e547ab0bd20efc83f4476016fcc9a3e113db01aa0c42deec3089', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '52ee519e-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3939.603554636, 'message_signature': 'd051eb8737441b4d1caae2745feb35f246bd8cefe368afe9669c71ab451bba34'}]}, 'timestamp': '2025-09-30 21:20:43.983523', '_unique_id': '55474858520b42678eec9402f5b78d57'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.984 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.992 12 DEBUG ceilometer.compute.pollsters [-] 56118128-4b8a-4d28-aeb3-585059dc4de8/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.992 12 DEBUG ceilometer.compute.pollsters [-] 56118128-4b8a-4d28-aeb3-585059dc4de8/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a025d6f1-4e3b-4894-9d1f-5361228cad8e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '56118128-4b8a-4d28-aeb3-585059dc4de8-vda', 'timestamp': '2025-09-30T21:20:43.992251', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1443112583', 'name': 'instance-0000001a', 'instance_id': '56118128-4b8a-4d28-aeb3-585059dc4de8', 'instance_type': 'm1.nano', 'host': '3db9e547ab0bd20efc83f4476016fcc9a3e113db01aa0c42deec3089', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '52efb782-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3939.603554636, 'message_signature': 'b3a5133753ba4a75711a20f572c395fb81f182d8a94fc7bcca834df54cc76b5e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 
'resource_id': '56118128-4b8a-4d28-aeb3-585059dc4de8-sda', 'timestamp': '2025-09-30T21:20:43.992251', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1443112583', 'name': 'instance-0000001a', 'instance_id': '56118128-4b8a-4d28-aeb3-585059dc4de8', 'instance_type': 'm1.nano', 'host': '3db9e547ab0bd20efc83f4476016fcc9a3e113db01aa0c42deec3089', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '52efc1e6-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3939.603554636, 'message_signature': '07f107b1e9629ff5b08a4c15e3d6113ce6435f95b3570f9ca2e095181c7fcbf0'}]}, 'timestamp': '2025-09-30 21:20:43.992855', '_unique_id': '88f6422642564e4e840c53ddb31bce4a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.993 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.994 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.994 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.994 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-MigrationsAdminTest-server-1443112583>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-MigrationsAdminTest-server-1443112583>]
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.994 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.994 12 DEBUG ceilometer.compute.pollsters [-] 56118128-4b8a-4d28-aeb3-585059dc4de8/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.994 12 DEBUG ceilometer.compute.pollsters [-] 56118128-4b8a-4d28-aeb3-585059dc4de8/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '78107054-4ae1-4c99-9d01-d08ae42fd079', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '56118128-4b8a-4d28-aeb3-585059dc4de8-vda', 'timestamp': '2025-09-30T21:20:43.994616', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1443112583', 'name': 'instance-0000001a', 'instance_id': '56118128-4b8a-4d28-aeb3-585059dc4de8', 'instance_type': 'm1.nano', 'host': '3db9e547ab0bd20efc83f4476016fcc9a3e113db01aa0c42deec3089', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '52f01042-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3939.65637138, 'message_signature': '012b4a4e6a66418a9727a204381d79116eca4c8604ebb5cc502fe77fb2858d7d'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': 
'56118128-4b8a-4d28-aeb3-585059dc4de8-sda', 'timestamp': '2025-09-30T21:20:43.994616', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1443112583', 'name': 'instance-0000001a', 'instance_id': '56118128-4b8a-4d28-aeb3-585059dc4de8', 'instance_type': 'm1.nano', 'host': '3db9e547ab0bd20efc83f4476016fcc9a3e113db01aa0c42deec3089', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '52f0181c-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3939.65637138, 'message_signature': 'cc78c95c4a1c819ff9a945aaa59e615382e1add1cbc903e95e8b121f00f84777'}]}, 'timestamp': '2025-09-30 21:20:43.995029', '_unique_id': 'c29556c4da904390bb1411fe229d88cd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.995 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:20:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:43.998 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.001 12 DEBUG ceilometer.compute.pollsters [-] 56118128-4b8a-4d28-aeb3-585059dc4de8/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.001 12 DEBUG ceilometer.compute.pollsters [-] 56118128-4b8a-4d28-aeb3-585059dc4de8/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '338d8406-b4cb-44d1-a5b1-6b30363d47a6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '56118128-4b8a-4d28-aeb3-585059dc4de8-vda', 'timestamp': '2025-09-30T21:20:43.998836', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1443112583', 'name': 'instance-0000001a', 'instance_id': '56118128-4b8a-4d28-aeb3-585059dc4de8', 'instance_type': 'm1.nano', 'host': '3db9e547ab0bd20efc83f4476016fcc9a3e113db01aa0c42deec3089', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '52f10c7c-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3939.65637138, 'message_signature': '22962a81572eed1b7486ae31a89375d7ba302e87648f309b263728fea6e2d4a0'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': 
'56118128-4b8a-4d28-aeb3-585059dc4de8-sda', 'timestamp': '2025-09-30T21:20:43.998836', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1443112583', 'name': 'instance-0000001a', 'instance_id': '56118128-4b8a-4d28-aeb3-585059dc4de8', 'instance_type': 'm1.nano', 'host': '3db9e547ab0bd20efc83f4476016fcc9a3e113db01aa0c42deec3089', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '52f1153c-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3939.65637138, 'message_signature': 'be325226e26086be19cdc1895e1cf8d4e088daa3e04f218461949095d4cb82b3'}]}, 'timestamp': '2025-09-30 21:20:44.001507', '_unique_id': 'f5881b21a252444c8d73d36cc11159c7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-MigrationsAdminTest-server-1443112583>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-MigrationsAdminTest-server-1443112583>]
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.002 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.003 12 DEBUG ceilometer.compute.pollsters [-] 56118128-4b8a-4d28-aeb3-585059dc4de8/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.003 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 56118128-4b8a-4d28-aeb3-585059dc4de8: ceilometer.compute.pollsters.NoVolumeException
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.003 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.003 12 DEBUG ceilometer.compute.pollsters [-] 56118128-4b8a-4d28-aeb3-585059dc4de8/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.003 12 DEBUG ceilometer.compute.pollsters [-] 56118128-4b8a-4d28-aeb3-585059dc4de8/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '20f457c8-45c1-481f-88d5-534b6f48d6f3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '56118128-4b8a-4d28-aeb3-585059dc4de8-vda', 'timestamp': '2025-09-30T21:20:44.003224', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1443112583', 'name': 'instance-0000001a', 'instance_id': '56118128-4b8a-4d28-aeb3-585059dc4de8', 'instance_type': 'm1.nano', 'host': '3db9e547ab0bd20efc83f4476016fcc9a3e113db01aa0c42deec3089', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '52f16014-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3939.603554636, 'message_signature': 'e034e5404d716023e1e6176ca35dfe0119eb16138842cbdd54d653797f4b2815'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 
'resource_id': '56118128-4b8a-4d28-aeb3-585059dc4de8-sda', 'timestamp': '2025-09-30T21:20:44.003224', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1443112583', 'name': 'instance-0000001a', 'instance_id': '56118128-4b8a-4d28-aeb3-585059dc4de8', 'instance_type': 'm1.nano', 'host': '3db9e547ab0bd20efc83f4476016fcc9a3e113db01aa0c42deec3089', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '52f1679e-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3939.603554636, 'message_signature': '9b80be410180932083660e5b811d664f26bb69ac77596d831433ba97d780ce40'}]}, 'timestamp': '2025-09-30 21:20:44.003632', '_unique_id': 'c875fbb645fe492388a67a8a4a42c8b9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.004 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.007 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.009 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.009 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.009 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.010 12 DEBUG ceilometer.compute.pollsters [-] 56118128-4b8a-4d28-aeb3-585059dc4de8/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.010 12 DEBUG ceilometer.compute.pollsters [-] 56118128-4b8a-4d28-aeb3-585059dc4de8/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '817f9b0a-cd4d-479e-aa10-a30cd3b4f108', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 'resource_id': '56118128-4b8a-4d28-aeb3-585059dc4de8-vda', 'timestamp': '2025-09-30T21:20:44.009982', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1443112583', 'name': 'instance-0000001a', 'instance_id': '56118128-4b8a-4d28-aeb3-585059dc4de8', 'instance_type': 'm1.nano', 'host': '3db9e547ab0bd20efc83f4476016fcc9a3e113db01aa0c42deec3089', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '52f268ce-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3939.603554636, 'message_signature': 'b84f185b95e969fef30af6fa5a8de0014842060943743c28a06979a82b32aefa'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'user_name': None, 'project_id': '7c15359849554c2382315de9f52125af', 'project_name': None, 
'resource_id': '56118128-4b8a-4d28-aeb3-585059dc4de8-sda', 'timestamp': '2025-09-30T21:20:44.009982', 'resource_metadata': {'display_name': 'tempest-MigrationsAdminTest-server-1443112583', 'name': 'instance-0000001a', 'instance_id': '56118128-4b8a-4d28-aeb3-585059dc4de8', 'instance_type': 'm1.nano', 'host': '3db9e547ab0bd20efc83f4476016fcc9a3e113db01aa0c42deec3089', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '52f2727e-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 3939.603554636, 'message_signature': 'faa1e6ca96d8b5df20e4fbc5d5d20f7cc156a76a7c14f4067e98818fd7f3a983'}]}, 'timestamp': '2025-09-30 21:20:44.010453', '_unique_id': '38ce1a2cf5334eaab94d3a8d88c24533'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Sep 30 21:20:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:20:44.011 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Sep 30 21:20:44 compute-0 nova_compute[192810]: 2025-09-30 21:20:44.251 2 DEBUG oslo_concurrency.lockutils [None req-22ad4d33-148f-413a-804a-b766424c8a1f 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Acquiring lock "refresh_cache-56118128-4b8a-4d28-aeb3-585059dc4de8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:20:44 compute-0 nova_compute[192810]: 2025-09-30 21:20:44.251 2 DEBUG oslo_concurrency.lockutils [None req-22ad4d33-148f-413a-804a-b766424c8a1f 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Acquired lock "refresh_cache-56118128-4b8a-4d28-aeb3-585059dc4de8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:20:44 compute-0 nova_compute[192810]: 2025-09-30 21:20:44.251 2 DEBUG nova.network.neutron [None req-22ad4d33-148f-413a-804a-b766424c8a1f 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:20:44 compute-0 nova_compute[192810]: 2025-09-30 21:20:44.409 2 DEBUG nova.network.neutron [None req-22ad4d33-148f-413a-804a-b766424c8a1f 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:20:44 compute-0 nova_compute[192810]: 2025-09-30 21:20:44.695 2 DEBUG nova.network.neutron [None req-22ad4d33-148f-413a-804a-b766424c8a1f 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:20:44 compute-0 nova_compute[192810]: 2025-09-30 21:20:44.719 2 DEBUG oslo_concurrency.lockutils [None req-22ad4d33-148f-413a-804a-b766424c8a1f 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Releasing lock "refresh_cache-56118128-4b8a-4d28-aeb3-585059dc4de8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:20:44 compute-0 nova_compute[192810]: 2025-09-30 21:20:44.859 2 DEBUG nova.virt.libvirt.driver [None req-22ad4d33-148f-413a-804a-b766424c8a1f 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Sep 30 21:20:44 compute-0 nova_compute[192810]: 2025-09-30 21:20:44.859 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-22ad4d33-148f-413a-804a-b766424c8a1f 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Creating file /var/lib/nova/instances/56118128-4b8a-4d28-aeb3-585059dc4de8/1ec6be458e284a09b5c2030a7f4e4912.tmp on remote host 192.168.122.102 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Sep 30 21:20:44 compute-0 nova_compute[192810]: 2025-09-30 21:20:44.859 2 DEBUG oslo_concurrency.processutils [None req-22ad4d33-148f-413a-804a-b766424c8a1f 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/56118128-4b8a-4d28-aeb3-585059dc4de8/1ec6be458e284a09b5c2030a7f4e4912.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:45 compute-0 nova_compute[192810]: 2025-09-30 21:20:45.442 2 DEBUG oslo_concurrency.processutils [None req-22ad4d33-148f-413a-804a-b766424c8a1f 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/56118128-4b8a-4d28-aeb3-585059dc4de8/1ec6be458e284a09b5c2030a7f4e4912.tmp" returned: 1 in 0.583s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:45 compute-0 nova_compute[192810]: 2025-09-30 21:20:45.444 2 DEBUG oslo_concurrency.processutils [None req-22ad4d33-148f-413a-804a-b766424c8a1f 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] 'ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/56118128-4b8a-4d28-aeb3-585059dc4de8/1ec6be458e284a09b5c2030a7f4e4912.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Sep 30 21:20:45 compute-0 nova_compute[192810]: 2025-09-30 21:20:45.444 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-22ad4d33-148f-413a-804a-b766424c8a1f 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Creating directory /var/lib/nova/instances/56118128-4b8a-4d28-aeb3-585059dc4de8 on remote host 192.168.122.102 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Sep 30 21:20:45 compute-0 nova_compute[192810]: 2025-09-30 21:20:45.445 2 DEBUG oslo_concurrency.processutils [None req-22ad4d33-148f-413a-804a-b766424c8a1f 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/56118128-4b8a-4d28-aeb3-585059dc4de8 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:45 compute-0 sshd-session[224067]: Received disconnect from 45.81.23.80 port 55336:11: Bye Bye [preauth]
Sep 30 21:20:45 compute-0 sshd-session[224067]: Disconnected from invalid user icp 45.81.23.80 port 55336 [preauth]
Sep 30 21:20:45 compute-0 nova_compute[192810]: 2025-09-30 21:20:45.681 2 DEBUG oslo_concurrency.processutils [None req-22ad4d33-148f-413a-804a-b766424c8a1f 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/56118128-4b8a-4d28-aeb3-585059dc4de8" returned: 0 in 0.236s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:45 compute-0 nova_compute[192810]: 2025-09-30 21:20:45.687 2 DEBUG nova.virt.libvirt.driver [None req-22ad4d33-148f-413a-804a-b766424c8a1f 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Sep 30 21:20:46 compute-0 nova_compute[192810]: 2025-09-30 21:20:46.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:47 compute-0 nova_compute[192810]: 2025-09-30 21:20:47.284 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:47 compute-0 podman[224078]: 2025-09-30 21:20:47.397016915 +0000 UTC m=+0.124100278 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid)
Sep 30 21:20:48 compute-0 nova_compute[192810]: 2025-09-30 21:20:48.516 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267233.5151753, 383bdd8f-b596-427e-b6ba-f3251a14c6dd => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:20:48 compute-0 nova_compute[192810]: 2025-09-30 21:20:48.517 2 INFO nova.compute.manager [-] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] VM Stopped (Lifecycle Event)
Sep 30 21:20:48 compute-0 nova_compute[192810]: 2025-09-30 21:20:48.571 2 DEBUG nova.compute.manager [None req-746d4a3f-3804-4a9d-9acf-af19b5e08212 - - - - - -] [instance: 383bdd8f-b596-427e-b6ba-f3251a14c6dd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:20:50 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:20:50.972 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:20:50 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:20:50.973 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:20:51 compute-0 nova_compute[192810]: 2025-09-30 21:20:51.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:51 compute-0 nova_compute[192810]: 2025-09-30 21:20:51.524 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:52 compute-0 nova_compute[192810]: 2025-09-30 21:20:52.287 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:52 compute-0 podman[224118]: 2025-09-30 21:20:52.333635855 +0000 UTC m=+0.067575182 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 21:20:52 compute-0 podman[224117]: 2025-09-30 21:20:52.375717732 +0000 UTC m=+0.111692969 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:20:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:20:53.974 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:20:55 compute-0 nova_compute[192810]: 2025-09-30 21:20:55.735 2 DEBUG nova.virt.libvirt.driver [None req-22ad4d33-148f-413a-804a-b766424c8a1f 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Sep 30 21:20:56 compute-0 nova_compute[192810]: 2025-09-30 21:20:56.526 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:57 compute-0 nova_compute[192810]: 2025-09-30 21:20:57.289 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:20:57 compute-0 podman[224159]: 2025-09-30 21:20:57.375589839 +0000 UTC m=+0.107918816 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Sep 30 21:20:57 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000001a.scope: Deactivated successfully.
Sep 30 21:20:57 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000001a.scope: Consumed 12.786s CPU time.
Sep 30 21:20:57 compute-0 systemd-machined[152794]: Machine qemu-11-instance-0000001a terminated.
Sep 30 21:20:58 compute-0 podman[224185]: 2025-09-30 21:20:58.090037651 +0000 UTC m=+0.112124440 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20250923)
Sep 30 21:20:58 compute-0 nova_compute[192810]: 2025-09-30 21:20:58.750 2 INFO nova.virt.libvirt.driver [None req-22ad4d33-148f-413a-804a-b766424c8a1f 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] Instance shutdown successfully after 13 seconds.
Sep 30 21:20:58 compute-0 nova_compute[192810]: 2025-09-30 21:20:58.757 2 INFO nova.virt.libvirt.driver [-] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] Instance destroyed successfully.
Sep 30 21:20:58 compute-0 nova_compute[192810]: 2025-09-30 21:20:58.763 2 DEBUG oslo_concurrency.processutils [None req-22ad4d33-148f-413a-804a-b766424c8a1f 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/56118128-4b8a-4d28-aeb3-585059dc4de8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:58 compute-0 nova_compute[192810]: 2025-09-30 21:20:58.824 2 DEBUG oslo_concurrency.processutils [None req-22ad4d33-148f-413a-804a-b766424c8a1f 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/56118128-4b8a-4d28-aeb3-585059dc4de8/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:58 compute-0 nova_compute[192810]: 2025-09-30 21:20:58.826 2 DEBUG oslo_concurrency.processutils [None req-22ad4d33-148f-413a-804a-b766424c8a1f 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/56118128-4b8a-4d28-aeb3-585059dc4de8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:58 compute-0 nova_compute[192810]: 2025-09-30 21:20:58.916 2 DEBUG oslo_concurrency.processutils [None req-22ad4d33-148f-413a-804a-b766424c8a1f 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/56118128-4b8a-4d28-aeb3-585059dc4de8/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:58 compute-0 nova_compute[192810]: 2025-09-30 21:20:58.920 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-22ad4d33-148f-413a-804a-b766424c8a1f 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Copying file /var/lib/nova/instances/56118128-4b8a-4d28-aeb3-585059dc4de8_resize/disk to 192.168.122.102:/var/lib/nova/instances/56118128-4b8a-4d28-aeb3-585059dc4de8/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Sep 30 21:20:58 compute-0 nova_compute[192810]: 2025-09-30 21:20:58.920 2 DEBUG oslo_concurrency.processutils [None req-22ad4d33-148f-413a-804a-b766424c8a1f 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/56118128-4b8a-4d28-aeb3-585059dc4de8_resize/disk 192.168.122.102:/var/lib/nova/instances/56118128-4b8a-4d28-aeb3-585059dc4de8/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:20:59 compute-0 nova_compute[192810]: 2025-09-30 21:20:59.839 2 DEBUG oslo_concurrency.processutils [None req-22ad4d33-148f-413a-804a-b766424c8a1f 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] CMD "scp -r /var/lib/nova/instances/56118128-4b8a-4d28-aeb3-585059dc4de8_resize/disk 192.168.122.102:/var/lib/nova/instances/56118128-4b8a-4d28-aeb3-585059dc4de8/disk" returned: 0 in 0.919s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:20:59 compute-0 nova_compute[192810]: 2025-09-30 21:20:59.841 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-22ad4d33-148f-413a-804a-b766424c8a1f 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Copying file /var/lib/nova/instances/56118128-4b8a-4d28-aeb3-585059dc4de8_resize/disk.config to 192.168.122.102:/var/lib/nova/instances/56118128-4b8a-4d28-aeb3-585059dc4de8/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Sep 30 21:20:59 compute-0 nova_compute[192810]: 2025-09-30 21:20:59.841 2 DEBUG oslo_concurrency.processutils [None req-22ad4d33-148f-413a-804a-b766424c8a1f 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/56118128-4b8a-4d28-aeb3-585059dc4de8_resize/disk.config 192.168.122.102:/var/lib/nova/instances/56118128-4b8a-4d28-aeb3-585059dc4de8/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:00 compute-0 nova_compute[192810]: 2025-09-30 21:21:00.109 2 DEBUG oslo_concurrency.processutils [None req-22ad4d33-148f-413a-804a-b766424c8a1f 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] CMD "scp -C -r /var/lib/nova/instances/56118128-4b8a-4d28-aeb3-585059dc4de8_resize/disk.config 192.168.122.102:/var/lib/nova/instances/56118128-4b8a-4d28-aeb3-585059dc4de8/disk.config" returned: 0 in 0.267s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:00 compute-0 nova_compute[192810]: 2025-09-30 21:21:00.111 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-22ad4d33-148f-413a-804a-b766424c8a1f 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Copying file /var/lib/nova/instances/56118128-4b8a-4d28-aeb3-585059dc4de8_resize/disk.info to 192.168.122.102:/var/lib/nova/instances/56118128-4b8a-4d28-aeb3-585059dc4de8/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Sep 30 21:21:00 compute-0 nova_compute[192810]: 2025-09-30 21:21:00.111 2 DEBUG oslo_concurrency.processutils [None req-22ad4d33-148f-413a-804a-b766424c8a1f 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/56118128-4b8a-4d28-aeb3-585059dc4de8_resize/disk.info 192.168.122.102:/var/lib/nova/instances/56118128-4b8a-4d28-aeb3-585059dc4de8/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:00 compute-0 nova_compute[192810]: 2025-09-30 21:21:00.368 2 DEBUG oslo_concurrency.processutils [None req-22ad4d33-148f-413a-804a-b766424c8a1f 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] CMD "scp -C -r /var/lib/nova/instances/56118128-4b8a-4d28-aeb3-585059dc4de8_resize/disk.info 192.168.122.102:/var/lib/nova/instances/56118128-4b8a-4d28-aeb3-585059dc4de8/disk.info" returned: 0 in 0.257s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:00 compute-0 nova_compute[192810]: 2025-09-30 21:21:00.573 2 DEBUG oslo_concurrency.lockutils [None req-22ad4d33-148f-413a-804a-b766424c8a1f 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Acquiring lock "56118128-4b8a-4d28-aeb3-585059dc4de8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:00 compute-0 nova_compute[192810]: 2025-09-30 21:21:00.573 2 DEBUG oslo_concurrency.lockutils [None req-22ad4d33-148f-413a-804a-b766424c8a1f 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Lock "56118128-4b8a-4d28-aeb3-585059dc4de8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:00 compute-0 nova_compute[192810]: 2025-09-30 21:21:00.573 2 DEBUG oslo_concurrency.lockutils [None req-22ad4d33-148f-413a-804a-b766424c8a1f 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Lock "56118128-4b8a-4d28-aeb3-585059dc4de8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:01 compute-0 nova_compute[192810]: 2025-09-30 21:21:01.527 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:02 compute-0 nova_compute[192810]: 2025-09-30 21:21:02.292 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:02 compute-0 nova_compute[192810]: 2025-09-30 21:21:02.303 2 DEBUG oslo_concurrency.lockutils [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Acquiring lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:02 compute-0 nova_compute[192810]: 2025-09-30 21:21:02.303 2 DEBUG oslo_concurrency.lockutils [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:02 compute-0 nova_compute[192810]: 2025-09-30 21:21:02.329 2 DEBUG nova.compute.manager [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:21:02 compute-0 nova_compute[192810]: 2025-09-30 21:21:02.466 2 DEBUG oslo_concurrency.lockutils [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:02 compute-0 nova_compute[192810]: 2025-09-30 21:21:02.466 2 DEBUG oslo_concurrency.lockutils [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:02 compute-0 nova_compute[192810]: 2025-09-30 21:21:02.475 2 DEBUG nova.virt.hardware [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:21:02 compute-0 nova_compute[192810]: 2025-09-30 21:21:02.475 2 INFO nova.compute.claims [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:21:02 compute-0 nova_compute[192810]: 2025-09-30 21:21:02.644 2 DEBUG nova.compute.provider_tree [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:21:02 compute-0 nova_compute[192810]: 2025-09-30 21:21:02.657 2 DEBUG nova.scheduler.client.report [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:21:02 compute-0 nova_compute[192810]: 2025-09-30 21:21:02.680 2 DEBUG oslo_concurrency.lockutils [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.214s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:02 compute-0 nova_compute[192810]: 2025-09-30 21:21:02.681 2 DEBUG nova.compute.manager [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:21:02 compute-0 nova_compute[192810]: 2025-09-30 21:21:02.746 2 DEBUG nova.compute.manager [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:21:02 compute-0 nova_compute[192810]: 2025-09-30 21:21:02.746 2 DEBUG nova.network.neutron [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:21:02 compute-0 nova_compute[192810]: 2025-09-30 21:21:02.766 2 INFO nova.virt.libvirt.driver [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:21:02 compute-0 nova_compute[192810]: 2025-09-30 21:21:02.797 2 DEBUG nova.compute.manager [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:21:02 compute-0 nova_compute[192810]: 2025-09-30 21:21:02.989 2 DEBUG nova.compute.manager [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:21:02 compute-0 nova_compute[192810]: 2025-09-30 21:21:02.991 2 DEBUG nova.virt.libvirt.driver [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:21:02 compute-0 nova_compute[192810]: 2025-09-30 21:21:02.991 2 INFO nova.virt.libvirt.driver [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Creating image(s)
Sep 30 21:21:02 compute-0 nova_compute[192810]: 2025-09-30 21:21:02.991 2 DEBUG oslo_concurrency.lockutils [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Acquiring lock "/var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:02 compute-0 nova_compute[192810]: 2025-09-30 21:21:02.992 2 DEBUG oslo_concurrency.lockutils [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "/var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:02 compute-0 nova_compute[192810]: 2025-09-30 21:21:02.992 2 DEBUG oslo_concurrency.lockutils [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "/var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:03 compute-0 nova_compute[192810]: 2025-09-30 21:21:03.006 2 DEBUG oslo_concurrency.processutils [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:03 compute-0 nova_compute[192810]: 2025-09-30 21:21:03.075 2 DEBUG oslo_concurrency.processutils [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:03 compute-0 nova_compute[192810]: 2025-09-30 21:21:03.076 2 DEBUG oslo_concurrency.lockutils [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:03 compute-0 nova_compute[192810]: 2025-09-30 21:21:03.076 2 DEBUG oslo_concurrency.lockutils [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:03 compute-0 nova_compute[192810]: 2025-09-30 21:21:03.086 2 DEBUG oslo_concurrency.processutils [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:03 compute-0 nova_compute[192810]: 2025-09-30 21:21:03.149 2 DEBUG oslo_concurrency.processutils [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:03 compute-0 nova_compute[192810]: 2025-09-30 21:21:03.150 2 DEBUG oslo_concurrency.processutils [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:03 compute-0 nova_compute[192810]: 2025-09-30 21:21:03.191 2 DEBUG oslo_concurrency.processutils [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:03 compute-0 nova_compute[192810]: 2025-09-30 21:21:03.192 2 DEBUG oslo_concurrency.lockutils [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:03 compute-0 nova_compute[192810]: 2025-09-30 21:21:03.192 2 DEBUG oslo_concurrency.processutils [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:03 compute-0 nova_compute[192810]: 2025-09-30 21:21:03.249 2 DEBUG oslo_concurrency.processutils [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:03 compute-0 nova_compute[192810]: 2025-09-30 21:21:03.251 2 DEBUG nova.virt.disk.api [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Checking if we can resize image /var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:21:03 compute-0 nova_compute[192810]: 2025-09-30 21:21:03.252 2 DEBUG oslo_concurrency.processutils [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:03 compute-0 nova_compute[192810]: 2025-09-30 21:21:03.314 2 DEBUG oslo_concurrency.processutils [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:03 compute-0 nova_compute[192810]: 2025-09-30 21:21:03.315 2 DEBUG nova.virt.disk.api [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Cannot resize image /var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:21:03 compute-0 nova_compute[192810]: 2025-09-30 21:21:03.316 2 DEBUG nova.objects.instance [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lazy-loading 'migration_context' on Instance uuid c79f1121-fcfb-4c07-94cf-1389e1df9e81 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:21:03 compute-0 nova_compute[192810]: 2025-09-30 21:21:03.323 2 DEBUG nova.policy [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fdcdab028fdf46d7ba6634c631d3d33c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '54fb4b43d65542abbb73044c6d52da8a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:21:03 compute-0 podman[224237]: 2025-09-30 21:21:03.329120125 +0000 UTC m=+0.059144443 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Sep 30 21:21:03 compute-0 nova_compute[192810]: 2025-09-30 21:21:03.332 2 DEBUG nova.virt.libvirt.driver [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:21:03 compute-0 nova_compute[192810]: 2025-09-30 21:21:03.332 2 DEBUG nova.virt.libvirt.driver [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Ensure instance console log exists: /var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:21:03 compute-0 nova_compute[192810]: 2025-09-30 21:21:03.333 2 DEBUG oslo_concurrency.lockutils [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:03 compute-0 nova_compute[192810]: 2025-09-30 21:21:03.333 2 DEBUG oslo_concurrency.lockutils [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:03 compute-0 nova_compute[192810]: 2025-09-30 21:21:03.333 2 DEBUG oslo_concurrency.lockutils [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:04 compute-0 nova_compute[192810]: 2025-09-30 21:21:04.067 2 DEBUG oslo_concurrency.lockutils [None req-27be3c95-3286-4828-804e-1b184a311652 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "56118128-4b8a-4d28-aeb3-585059dc4de8" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:04 compute-0 nova_compute[192810]: 2025-09-30 21:21:04.068 2 DEBUG oslo_concurrency.lockutils [None req-27be3c95-3286-4828-804e-1b184a311652 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "56118128-4b8a-4d28-aeb3-585059dc4de8" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:04 compute-0 nova_compute[192810]: 2025-09-30 21:21:04.068 2 DEBUG nova.compute.manager [None req-27be3c95-3286-4828-804e-1b184a311652 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] Going to confirm migration 7 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679
Sep 30 21:21:04 compute-0 nova_compute[192810]: 2025-09-30 21:21:04.123 2 DEBUG nova.objects.instance [None req-27be3c95-3286-4828-804e-1b184a311652 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lazy-loading 'info_cache' on Instance uuid 56118128-4b8a-4d28-aeb3-585059dc4de8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:21:04 compute-0 nova_compute[192810]: 2025-09-30 21:21:04.356 2 DEBUG oslo_concurrency.lockutils [None req-27be3c95-3286-4828-804e-1b184a311652 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "refresh_cache-56118128-4b8a-4d28-aeb3-585059dc4de8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:21:04 compute-0 nova_compute[192810]: 2025-09-30 21:21:04.356 2 DEBUG oslo_concurrency.lockutils [None req-27be3c95-3286-4828-804e-1b184a311652 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquired lock "refresh_cache-56118128-4b8a-4d28-aeb3-585059dc4de8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:21:04 compute-0 nova_compute[192810]: 2025-09-30 21:21:04.356 2 DEBUG nova.network.neutron [None req-27be3c95-3286-4828-804e-1b184a311652 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:21:04 compute-0 nova_compute[192810]: 2025-09-30 21:21:04.519 2 DEBUG nova.network.neutron [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Successfully created port: 0ad4c684-4514-4550-b708-6339f933766c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:21:05 compute-0 nova_compute[192810]: 2025-09-30 21:21:05.255 2 DEBUG nova.network.neutron [None req-27be3c95-3286-4828-804e-1b184a311652 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:21:05 compute-0 nova_compute[192810]: 2025-09-30 21:21:05.777 2 DEBUG nova.network.neutron [None req-27be3c95-3286-4828-804e-1b184a311652 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:21:05 compute-0 nova_compute[192810]: 2025-09-30 21:21:05.805 2 DEBUG oslo_concurrency.lockutils [None req-27be3c95-3286-4828-804e-1b184a311652 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Releasing lock "refresh_cache-56118128-4b8a-4d28-aeb3-585059dc4de8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:21:05 compute-0 nova_compute[192810]: 2025-09-30 21:21:05.806 2 DEBUG nova.objects.instance [None req-27be3c95-3286-4828-804e-1b184a311652 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lazy-loading 'migration_context' on Instance uuid 56118128-4b8a-4d28-aeb3-585059dc4de8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:21:05 compute-0 nova_compute[192810]: 2025-09-30 21:21:05.845 2 DEBUG oslo_concurrency.lockutils [None req-27be3c95-3286-4828-804e-1b184a311652 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:05 compute-0 nova_compute[192810]: 2025-09-30 21:21:05.846 2 DEBUG oslo_concurrency.lockutils [None req-27be3c95-3286-4828-804e-1b184a311652 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:05 compute-0 nova_compute[192810]: 2025-09-30 21:21:05.952 2 DEBUG nova.compute.provider_tree [None req-27be3c95-3286-4828-804e-1b184a311652 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:21:05 compute-0 nova_compute[192810]: 2025-09-30 21:21:05.971 2 DEBUG nova.scheduler.client.report [None req-27be3c95-3286-4828-804e-1b184a311652 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:21:06 compute-0 nova_compute[192810]: 2025-09-30 21:21:06.019 2 DEBUG oslo_concurrency.lockutils [None req-27be3c95-3286-4828-804e-1b184a311652 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:06 compute-0 nova_compute[192810]: 2025-09-30 21:21:06.241 2 INFO nova.scheduler.client.report [None req-27be3c95-3286-4828-804e-1b184a311652 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Deleted allocation for migration a48c3ed4-1526-4483-8138-b3757d307455
Sep 30 21:21:06 compute-0 nova_compute[192810]: 2025-09-30 21:21:06.333 2 DEBUG oslo_concurrency.lockutils [None req-27be3c95-3286-4828-804e-1b184a311652 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "56118128-4b8a-4d28-aeb3-585059dc4de8" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 2.265s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:06 compute-0 nova_compute[192810]: 2025-09-30 21:21:06.396 2 DEBUG nova.network.neutron [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Successfully updated port: 0ad4c684-4514-4550-b708-6339f933766c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:21:06 compute-0 nova_compute[192810]: 2025-09-30 21:21:06.418 2 DEBUG oslo_concurrency.lockutils [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Acquiring lock "refresh_cache-c79f1121-fcfb-4c07-94cf-1389e1df9e81" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:21:06 compute-0 nova_compute[192810]: 2025-09-30 21:21:06.419 2 DEBUG oslo_concurrency.lockutils [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Acquired lock "refresh_cache-c79f1121-fcfb-4c07-94cf-1389e1df9e81" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:21:06 compute-0 nova_compute[192810]: 2025-09-30 21:21:06.419 2 DEBUG nova.network.neutron [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:21:06 compute-0 nova_compute[192810]: 2025-09-30 21:21:06.500 2 DEBUG nova.compute.manager [req-b9096eaf-507c-4da4-8d89-a622c120a5b9 req-a616323d-333a-45fd-868d-d634ee97983c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Received event network-changed-0ad4c684-4514-4550-b708-6339f933766c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:21:06 compute-0 nova_compute[192810]: 2025-09-30 21:21:06.500 2 DEBUG nova.compute.manager [req-b9096eaf-507c-4da4-8d89-a622c120a5b9 req-a616323d-333a-45fd-868d-d634ee97983c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Refreshing instance network info cache due to event network-changed-0ad4c684-4514-4550-b708-6339f933766c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:21:06 compute-0 nova_compute[192810]: 2025-09-30 21:21:06.501 2 DEBUG oslo_concurrency.lockutils [req-b9096eaf-507c-4da4-8d89-a622c120a5b9 req-a616323d-333a-45fd-868d-d634ee97983c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-c79f1121-fcfb-4c07-94cf-1389e1df9e81" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:21:06 compute-0 nova_compute[192810]: 2025-09-30 21:21:06.529 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:07 compute-0 nova_compute[192810]: 2025-09-30 21:21:07.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:07 compute-0 nova_compute[192810]: 2025-09-30 21:21:07.348 2 DEBUG nova.network.neutron [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:21:08 compute-0 nova_compute[192810]: 2025-09-30 21:21:08.688 2 DEBUG nova.network.neutron [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Updating instance_info_cache with network_info: [{"id": "0ad4c684-4514-4550-b708-6339f933766c", "address": "fa:16:3e:c1:12:42", "network": {"id": "7ae61a95-12e5-48ee-93a0-85e12f8652eb", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-47213222-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "54fb4b43d65542abbb73044c6d52da8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ad4c684-45", "ovs_interfaceid": "0ad4c684-4514-4550-b708-6339f933766c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:21:08 compute-0 nova_compute[192810]: 2025-09-30 21:21:08.728 2 DEBUG oslo_concurrency.lockutils [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Releasing lock "refresh_cache-c79f1121-fcfb-4c07-94cf-1389e1df9e81" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:21:08 compute-0 nova_compute[192810]: 2025-09-30 21:21:08.729 2 DEBUG nova.compute.manager [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Instance network_info: |[{"id": "0ad4c684-4514-4550-b708-6339f933766c", "address": "fa:16:3e:c1:12:42", "network": {"id": "7ae61a95-12e5-48ee-93a0-85e12f8652eb", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-47213222-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "54fb4b43d65542abbb73044c6d52da8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ad4c684-45", "ovs_interfaceid": "0ad4c684-4514-4550-b708-6339f933766c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:21:08 compute-0 nova_compute[192810]: 2025-09-30 21:21:08.730 2 DEBUG oslo_concurrency.lockutils [req-b9096eaf-507c-4da4-8d89-a622c120a5b9 req-a616323d-333a-45fd-868d-d634ee97983c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-c79f1121-fcfb-4c07-94cf-1389e1df9e81" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:21:08 compute-0 nova_compute[192810]: 2025-09-30 21:21:08.730 2 DEBUG nova.network.neutron [req-b9096eaf-507c-4da4-8d89-a622c120a5b9 req-a616323d-333a-45fd-868d-d634ee97983c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Refreshing network info cache for port 0ad4c684-4514-4550-b708-6339f933766c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:21:08 compute-0 nova_compute[192810]: 2025-09-30 21:21:08.736 2 DEBUG nova.virt.libvirt.driver [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Start _get_guest_xml network_info=[{"id": "0ad4c684-4514-4550-b708-6339f933766c", "address": "fa:16:3e:c1:12:42", "network": {"id": "7ae61a95-12e5-48ee-93a0-85e12f8652eb", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-47213222-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "54fb4b43d65542abbb73044c6d52da8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ad4c684-45", "ovs_interfaceid": "0ad4c684-4514-4550-b708-6339f933766c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:21:08 compute-0 nova_compute[192810]: 2025-09-30 21:21:08.742 2 WARNING nova.virt.libvirt.driver [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:21:08 compute-0 nova_compute[192810]: 2025-09-30 21:21:08.758 2 DEBUG nova.virt.libvirt.host [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:21:08 compute-0 nova_compute[192810]: 2025-09-30 21:21:08.759 2 DEBUG nova.virt.libvirt.host [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:21:08 compute-0 nova_compute[192810]: 2025-09-30 21:21:08.763 2 DEBUG nova.virt.libvirt.host [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:21:08 compute-0 nova_compute[192810]: 2025-09-30 21:21:08.764 2 DEBUG nova.virt.libvirt.host [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:21:08 compute-0 nova_compute[192810]: 2025-09-30 21:21:08.766 2 DEBUG nova.virt.libvirt.driver [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:21:08 compute-0 nova_compute[192810]: 2025-09-30 21:21:08.767 2 DEBUG nova.virt.hardware [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:21:08 compute-0 nova_compute[192810]: 2025-09-30 21:21:08.768 2 DEBUG nova.virt.hardware [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:21:08 compute-0 nova_compute[192810]: 2025-09-30 21:21:08.769 2 DEBUG nova.virt.hardware [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:21:08 compute-0 nova_compute[192810]: 2025-09-30 21:21:08.769 2 DEBUG nova.virt.hardware [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:21:08 compute-0 nova_compute[192810]: 2025-09-30 21:21:08.770 2 DEBUG nova.virt.hardware [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:21:08 compute-0 nova_compute[192810]: 2025-09-30 21:21:08.770 2 DEBUG nova.virt.hardware [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:21:08 compute-0 nova_compute[192810]: 2025-09-30 21:21:08.771 2 DEBUG nova.virt.hardware [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:21:08 compute-0 nova_compute[192810]: 2025-09-30 21:21:08.771 2 DEBUG nova.virt.hardware [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:21:08 compute-0 nova_compute[192810]: 2025-09-30 21:21:08.772 2 DEBUG nova.virt.hardware [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:21:08 compute-0 nova_compute[192810]: 2025-09-30 21:21:08.772 2 DEBUG nova.virt.hardware [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:21:08 compute-0 nova_compute[192810]: 2025-09-30 21:21:08.773 2 DEBUG nova.virt.hardware [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:21:08 compute-0 nova_compute[192810]: 2025-09-30 21:21:08.779 2 DEBUG nova.virt.libvirt.vif [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:21:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-988999069',display_name='tempest-ServersAdminTestJSON-server-988999069',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-988999069',id=28,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='54fb4b43d65542abbb73044c6d52da8a',ramdisk_id='',reservation_id='r-1cnrm9nr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-302955212',owner_user_name='tempest-ServersAdminTestJSON-302955212-proj
ect-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:21:02Z,user_data=None,user_id='fdcdab028fdf46d7ba6634c631d3d33c',uuid=c79f1121-fcfb-4c07-94cf-1389e1df9e81,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0ad4c684-4514-4550-b708-6339f933766c", "address": "fa:16:3e:c1:12:42", "network": {"id": "7ae61a95-12e5-48ee-93a0-85e12f8652eb", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-47213222-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "54fb4b43d65542abbb73044c6d52da8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ad4c684-45", "ovs_interfaceid": "0ad4c684-4514-4550-b708-6339f933766c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:21:08 compute-0 nova_compute[192810]: 2025-09-30 21:21:08.780 2 DEBUG nova.network.os_vif_util [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Converting VIF {"id": "0ad4c684-4514-4550-b708-6339f933766c", "address": "fa:16:3e:c1:12:42", "network": {"id": "7ae61a95-12e5-48ee-93a0-85e12f8652eb", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-47213222-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "54fb4b43d65542abbb73044c6d52da8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ad4c684-45", "ovs_interfaceid": "0ad4c684-4514-4550-b708-6339f933766c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:21:08 compute-0 nova_compute[192810]: 2025-09-30 21:21:08.782 2 DEBUG nova.network.os_vif_util [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c1:12:42,bridge_name='br-int',has_traffic_filtering=True,id=0ad4c684-4514-4550-b708-6339f933766c,network=Network(7ae61a95-12e5-48ee-93a0-85e12f8652eb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ad4c684-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:21:08 compute-0 nova_compute[192810]: 2025-09-30 21:21:08.783 2 DEBUG nova.objects.instance [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lazy-loading 'pci_devices' on Instance uuid c79f1121-fcfb-4c07-94cf-1389e1df9e81 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:21:08 compute-0 nova_compute[192810]: 2025-09-30 21:21:08.802 2 DEBUG nova.virt.libvirt.driver [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:21:08 compute-0 nova_compute[192810]:   <uuid>c79f1121-fcfb-4c07-94cf-1389e1df9e81</uuid>
Sep 30 21:21:08 compute-0 nova_compute[192810]:   <name>instance-0000001c</name>
Sep 30 21:21:08 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:21:08 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:21:08 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:21:08 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:       <nova:name>tempest-ServersAdminTestJSON-server-988999069</nova:name>
Sep 30 21:21:08 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:21:08</nova:creationTime>
Sep 30 21:21:08 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:21:08 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:21:08 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:21:08 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:21:08 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:21:08 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:21:08 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:21:08 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:21:08 compute-0 nova_compute[192810]:         <nova:user uuid="fdcdab028fdf46d7ba6634c631d3d33c">tempest-ServersAdminTestJSON-302955212-project-member</nova:user>
Sep 30 21:21:08 compute-0 nova_compute[192810]:         <nova:project uuid="54fb4b43d65542abbb73044c6d52da8a">tempest-ServersAdminTestJSON-302955212</nova:project>
Sep 30 21:21:08 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:21:08 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:21:08 compute-0 nova_compute[192810]:         <nova:port uuid="0ad4c684-4514-4550-b708-6339f933766c">
Sep 30 21:21:08 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:21:08 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:21:08 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:21:08 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:21:08 compute-0 nova_compute[192810]:     <system>
Sep 30 21:21:08 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:21:08 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:21:08 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:21:08 compute-0 nova_compute[192810]:       <entry name="serial">c79f1121-fcfb-4c07-94cf-1389e1df9e81</entry>
Sep 30 21:21:08 compute-0 nova_compute[192810]:       <entry name="uuid">c79f1121-fcfb-4c07-94cf-1389e1df9e81</entry>
Sep 30 21:21:08 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     </system>
Sep 30 21:21:08 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:21:08 compute-0 nova_compute[192810]:   <os>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:   </os>
Sep 30 21:21:08 compute-0 nova_compute[192810]:   <features>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:   </features>
Sep 30 21:21:08 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:21:08 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:21:08 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:21:08 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:21:08 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:21:08 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/disk"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:21:08 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/disk.config"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:21:08 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:c1:12:42"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:       <target dev="tap0ad4c684-45"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:21:08 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/console.log" append="off"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     <video>
Sep 30 21:21:08 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     </video>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:21:08 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:21:08 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:21:08 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:21:08 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:21:08 compute-0 nova_compute[192810]: </domain>
Sep 30 21:21:08 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:21:08 compute-0 nova_compute[192810]: 2025-09-30 21:21:08.803 2 DEBUG nova.compute.manager [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Preparing to wait for external event network-vif-plugged-0ad4c684-4514-4550-b708-6339f933766c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:21:08 compute-0 nova_compute[192810]: 2025-09-30 21:21:08.804 2 DEBUG oslo_concurrency.lockutils [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Acquiring lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:08 compute-0 nova_compute[192810]: 2025-09-30 21:21:08.804 2 DEBUG oslo_concurrency.lockutils [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:08 compute-0 nova_compute[192810]: 2025-09-30 21:21:08.805 2 DEBUG oslo_concurrency.lockutils [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:08 compute-0 nova_compute[192810]: 2025-09-30 21:21:08.806 2 DEBUG nova.virt.libvirt.vif [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:21:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-988999069',display_name='tempest-ServersAdminTestJSON-server-988999069',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-988999069',id=28,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='54fb4b43d65542abbb73044c6d52da8a',ramdisk_id='',reservation_id='r-1cnrm9nr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-302955212',owner_user_name='tempest-ServersAdminTestJSON-3029
55212-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:21:02Z,user_data=None,user_id='fdcdab028fdf46d7ba6634c631d3d33c',uuid=c79f1121-fcfb-4c07-94cf-1389e1df9e81,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0ad4c684-4514-4550-b708-6339f933766c", "address": "fa:16:3e:c1:12:42", "network": {"id": "7ae61a95-12e5-48ee-93a0-85e12f8652eb", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-47213222-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "54fb4b43d65542abbb73044c6d52da8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ad4c684-45", "ovs_interfaceid": "0ad4c684-4514-4550-b708-6339f933766c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:21:08 compute-0 nova_compute[192810]: 2025-09-30 21:21:08.807 2 DEBUG nova.network.os_vif_util [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Converting VIF {"id": "0ad4c684-4514-4550-b708-6339f933766c", "address": "fa:16:3e:c1:12:42", "network": {"id": "7ae61a95-12e5-48ee-93a0-85e12f8652eb", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-47213222-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "54fb4b43d65542abbb73044c6d52da8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ad4c684-45", "ovs_interfaceid": "0ad4c684-4514-4550-b708-6339f933766c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:21:08 compute-0 nova_compute[192810]: 2025-09-30 21:21:08.808 2 DEBUG nova.network.os_vif_util [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c1:12:42,bridge_name='br-int',has_traffic_filtering=True,id=0ad4c684-4514-4550-b708-6339f933766c,network=Network(7ae61a95-12e5-48ee-93a0-85e12f8652eb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ad4c684-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:21:08 compute-0 nova_compute[192810]: 2025-09-30 21:21:08.808 2 DEBUG os_vif [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:12:42,bridge_name='br-int',has_traffic_filtering=True,id=0ad4c684-4514-4550-b708-6339f933766c,network=Network(7ae61a95-12e5-48ee-93a0-85e12f8652eb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ad4c684-45') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:21:08 compute-0 nova_compute[192810]: 2025-09-30 21:21:08.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:08 compute-0 nova_compute[192810]: 2025-09-30 21:21:08.810 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:21:08 compute-0 nova_compute[192810]: 2025-09-30 21:21:08.810 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:21:08 compute-0 nova_compute[192810]: 2025-09-30 21:21:08.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:08 compute-0 nova_compute[192810]: 2025-09-30 21:21:08.815 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0ad4c684-45, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:21:08 compute-0 nova_compute[192810]: 2025-09-30 21:21:08.816 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0ad4c684-45, col_values=(('external_ids', {'iface-id': '0ad4c684-4514-4550-b708-6339f933766c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c1:12:42', 'vm-uuid': 'c79f1121-fcfb-4c07-94cf-1389e1df9e81'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:21:08 compute-0 nova_compute[192810]: 2025-09-30 21:21:08.818 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:08 compute-0 NetworkManager[51733]: <info>  [1759267268.8200] manager: (tap0ad4c684-45): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/52)
Sep 30 21:21:08 compute-0 nova_compute[192810]: 2025-09-30 21:21:08.821 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:21:08 compute-0 nova_compute[192810]: 2025-09-30 21:21:08.828 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:08 compute-0 nova_compute[192810]: 2025-09-30 21:21:08.829 2 INFO os_vif [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:12:42,bridge_name='br-int',has_traffic_filtering=True,id=0ad4c684-4514-4550-b708-6339f933766c,network=Network(7ae61a95-12e5-48ee-93a0-85e12f8652eb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ad4c684-45')
Sep 30 21:21:08 compute-0 nova_compute[192810]: 2025-09-30 21:21:08.908 2 DEBUG nova.virt.libvirt.driver [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:21:08 compute-0 nova_compute[192810]: 2025-09-30 21:21:08.909 2 DEBUG nova.virt.libvirt.driver [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:21:08 compute-0 nova_compute[192810]: 2025-09-30 21:21:08.910 2 DEBUG nova.virt.libvirt.driver [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] No VIF found with MAC fa:16:3e:c1:12:42, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:21:08 compute-0 nova_compute[192810]: 2025-09-30 21:21:08.911 2 INFO nova.virt.libvirt.driver [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Using config drive
Sep 30 21:21:09 compute-0 nova_compute[192810]: 2025-09-30 21:21:09.633 2 INFO nova.virt.libvirt.driver [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Creating config drive at /var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/disk.config
Sep 30 21:21:09 compute-0 nova_compute[192810]: 2025-09-30 21:21:09.646 2 DEBUG oslo_concurrency.processutils [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2v_vhiaj execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:09 compute-0 nova_compute[192810]: 2025-09-30 21:21:09.789 2 DEBUG oslo_concurrency.processutils [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2v_vhiaj" returned: 0 in 0.143s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:09 compute-0 kernel: tap0ad4c684-45: entered promiscuous mode
Sep 30 21:21:09 compute-0 NetworkManager[51733]: <info>  [1759267269.8594] manager: (tap0ad4c684-45): new Tun device (/org/freedesktop/NetworkManager/Devices/53)
Sep 30 21:21:09 compute-0 ovn_controller[94912]: 2025-09-30T21:21:09Z|00117|binding|INFO|Claiming lport 0ad4c684-4514-4550-b708-6339f933766c for this chassis.
Sep 30 21:21:09 compute-0 nova_compute[192810]: 2025-09-30 21:21:09.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:09 compute-0 ovn_controller[94912]: 2025-09-30T21:21:09Z|00118|binding|INFO|0ad4c684-4514-4550-b708-6339f933766c: Claiming fa:16:3e:c1:12:42 10.100.0.14
Sep 30 21:21:09 compute-0 nova_compute[192810]: 2025-09-30 21:21:09.869 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:09 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:09.880 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:12:42 10.100.0.14'], port_security=['fa:16:3e:c1:12:42 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'c79f1121-fcfb-4c07-94cf-1389e1df9e81', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7ae61a95-12e5-48ee-93a0-85e12f8652eb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '54fb4b43d65542abbb73044c6d52da8a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0c0ea76a-15be-4f11-88b8-d75d482afe6f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c761993a-b179-4f1b-8dc7-b7ebb1a3e32b, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=0ad4c684-4514-4550-b708-6339f933766c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:21:09 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:09.882 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 0ad4c684-4514-4550-b708-6339f933766c in datapath 7ae61a95-12e5-48ee-93a0-85e12f8652eb bound to our chassis
Sep 30 21:21:09 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:09.884 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7ae61a95-12e5-48ee-93a0-85e12f8652eb
Sep 30 21:21:09 compute-0 systemd-udevd[224281]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:21:09 compute-0 systemd-machined[152794]: New machine qemu-12-instance-0000001c.
Sep 30 21:21:09 compute-0 NetworkManager[51733]: <info>  [1759267269.9020] device (tap0ad4c684-45): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:21:09 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:09.901 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[5d1b299c-b6f5-4cd3-b9e3-c0f96253a5b8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:09 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:09.902 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7ae61a95-11 in ovnmeta-7ae61a95-12e5-48ee-93a0-85e12f8652eb namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:21:09 compute-0 NetworkManager[51733]: <info>  [1759267269.9035] device (tap0ad4c684-45): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:21:09 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:09.906 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7ae61a95-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:21:09 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:09.906 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[670d7206-f865-4994-b487-fa1e9f2eac36]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:09 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:09.908 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[0d4730d8-f7bf-44e4-b76f-bdc16c66cbf5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:09 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:09.920 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[5950df3e-ad52-4099-b7d3-fd1499454e68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:09 compute-0 systemd[1]: Started Virtual Machine qemu-12-instance-0000001c.
Sep 30 21:21:09 compute-0 ovn_controller[94912]: 2025-09-30T21:21:09Z|00119|binding|INFO|Setting lport 0ad4c684-4514-4550-b708-6339f933766c ovn-installed in OVS
Sep 30 21:21:09 compute-0 ovn_controller[94912]: 2025-09-30T21:21:09Z|00120|binding|INFO|Setting lport 0ad4c684-4514-4550-b708-6339f933766c up in Southbound
Sep 30 21:21:09 compute-0 nova_compute[192810]: 2025-09-30 21:21:09.936 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:09 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:09.946 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[77c6a8b3-a818-418b-b83b-29c5ef24eee1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:09 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:09.972 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[1a46a042-9f1c-4d41-b275-cb560dfade7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:09 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:09.979 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[0d138e39-066a-4334-91bd-21a5d7dc8734]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:09 compute-0 systemd-udevd[224285]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:21:09 compute-0 NetworkManager[51733]: <info>  [1759267269.9815] manager: (tap7ae61a95-10): new Veth device (/org/freedesktop/NetworkManager/Devices/54)
Sep 30 21:21:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:10.010 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[3c20bb24-3426-443c-88a1-385e9f60de03]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:10.013 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[6a543e58-c37b-47e0-8524-e5bb1b8bae8b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:10 compute-0 NetworkManager[51733]: <info>  [1759267270.0380] device (tap7ae61a95-10): carrier: link connected
Sep 30 21:21:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:10.043 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[966c59ec-68a4-4518-b05f-192afaab6493]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:10.063 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[521fbd5d-6368-420c-9deb-d9480a1f790c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7ae61a95-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:55:1a:c2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 396566, 'reachable_time': 17378, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224315, 'error': None, 'target': 'ovnmeta-7ae61a95-12e5-48ee-93a0-85e12f8652eb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:10.087 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[3e7f3eda-aad1-4991-b78f-c34d239b6957]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe55:1ac2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 396566, 'tstamp': 396566}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224316, 'error': None, 'target': 'ovnmeta-7ae61a95-12e5-48ee-93a0-85e12f8652eb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:10.115 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[10b94846-6e02-4862-a6a2-2374f7491d86]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7ae61a95-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:55:1a:c2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 396566, 'reachable_time': 17378, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224319, 'error': None, 'target': 'ovnmeta-7ae61a95-12e5-48ee-93a0-85e12f8652eb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:10.157 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[86bfa4aa-88e1-4766-a5d6-e3cf08cd224a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:10.259 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[5f45a9ff-38c7-4198-a440-ba8972769eae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:10.261 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7ae61a95-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:21:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:10.262 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:21:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:10.263 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7ae61a95-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:21:10 compute-0 NetworkManager[51733]: <info>  [1759267270.2660] manager: (tap7ae61a95-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/55)
Sep 30 21:21:10 compute-0 nova_compute[192810]: 2025-09-30 21:21:10.265 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:10 compute-0 kernel: tap7ae61a95-10: entered promiscuous mode
Sep 30 21:21:10 compute-0 nova_compute[192810]: 2025-09-30 21:21:10.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:10.272 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7ae61a95-10, col_values=(('external_ids', {'iface-id': '49c3a39c-1855-4432-8459-a8f7bc7429b4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:21:10 compute-0 ovn_controller[94912]: 2025-09-30T21:21:10Z|00121|binding|INFO|Releasing lport 49c3a39c-1855-4432-8459-a8f7bc7429b4 from this chassis (sb_readonly=0)
Sep 30 21:21:10 compute-0 nova_compute[192810]: 2025-09-30 21:21:10.273 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:10.279 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7ae61a95-12e5-48ee-93a0-85e12f8652eb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7ae61a95-12e5-48ee-93a0-85e12f8652eb.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:21:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:10.280 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[cfe6bdb9-64fa-4d65-8d56-ca29c6f0824b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:10.280 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:21:10 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:21:10 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:21:10 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-7ae61a95-12e5-48ee-93a0-85e12f8652eb
Sep 30 21:21:10 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:21:10 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:21:10 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:21:10 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/7ae61a95-12e5-48ee-93a0-85e12f8652eb.pid.haproxy
Sep 30 21:21:10 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:21:10 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:21:10 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:21:10 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:21:10 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:21:10 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:21:10 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:21:10 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:21:10 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:21:10 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:21:10 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:21:10 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:21:10 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:21:10 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:21:10 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:21:10 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:21:10 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:21:10 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:21:10 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:21:10 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:21:10 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID 7ae61a95-12e5-48ee-93a0-85e12f8652eb
Sep 30 21:21:10 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:21:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:10.281 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7ae61a95-12e5-48ee-93a0-85e12f8652eb', 'env', 'PROCESS_TAG=haproxy-7ae61a95-12e5-48ee-93a0-85e12f8652eb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7ae61a95-12e5-48ee-93a0-85e12f8652eb.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:21:10 compute-0 nova_compute[192810]: 2025-09-30 21:21:10.285 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:10 compute-0 nova_compute[192810]: 2025-09-30 21:21:10.626 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267270.6260955, c79f1121-fcfb-4c07-94cf-1389e1df9e81 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:21:10 compute-0 nova_compute[192810]: 2025-09-30 21:21:10.627 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] VM Started (Lifecycle Event)
Sep 30 21:21:10 compute-0 nova_compute[192810]: 2025-09-30 21:21:10.658 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:21:10 compute-0 nova_compute[192810]: 2025-09-30 21:21:10.662 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267270.6272683, c79f1121-fcfb-4c07-94cf-1389e1df9e81 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:21:10 compute-0 nova_compute[192810]: 2025-09-30 21:21:10.662 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] VM Paused (Lifecycle Event)
Sep 30 21:21:10 compute-0 nova_compute[192810]: 2025-09-30 21:21:10.691 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:21:10 compute-0 podman[224355]: 2025-09-30 21:21:10.694462681 +0000 UTC m=+0.049554954 container create d65533bba0e2ce22edafd1667882a8dc9da6ef64a34d6a520e7cc9b94b62a744 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7ae61a95-12e5-48ee-93a0-85e12f8652eb, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:21:10 compute-0 nova_compute[192810]: 2025-09-30 21:21:10.697 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:21:10 compute-0 nova_compute[192810]: 2025-09-30 21:21:10.732 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:21:10 compute-0 systemd[1]: Started libpod-conmon-d65533bba0e2ce22edafd1667882a8dc9da6ef64a34d6a520e7cc9b94b62a744.scope.
Sep 30 21:21:10 compute-0 podman[224355]: 2025-09-30 21:21:10.669781727 +0000 UTC m=+0.024873960 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:21:10 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:21:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ccc6a3900813832a587c2b27b5dfc3b3d0b4d0a098f4a9e4232685c8017ad3c0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:21:10 compute-0 podman[224355]: 2025-09-30 21:21:10.795064064 +0000 UTC m=+0.150156387 container init d65533bba0e2ce22edafd1667882a8dc9da6ef64a34d6a520e7cc9b94b62a744 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7ae61a95-12e5-48ee-93a0-85e12f8652eb, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923)
Sep 30 21:21:10 compute-0 podman[224355]: 2025-09-30 21:21:10.806412476 +0000 UTC m=+0.161504739 container start d65533bba0e2ce22edafd1667882a8dc9da6ef64a34d6a520e7cc9b94b62a744 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7ae61a95-12e5-48ee-93a0-85e12f8652eb, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:21:10 compute-0 podman[224371]: 2025-09-30 21:21:10.813699437 +0000 UTC m=+0.072466183 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, config_id=edpm, vcs-type=git, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, name=ubi9-minimal, container_name=openstack_network_exporter, vendor=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=ubi9-minimal-container, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 21:21:10 compute-0 neutron-haproxy-ovnmeta-7ae61a95-12e5-48ee-93a0-85e12f8652eb[224382]: [NOTICE]   (224418) : New worker (224420) forked
Sep 30 21:21:10 compute-0 neutron-haproxy-ovnmeta-7ae61a95-12e5-48ee-93a0-85e12f8652eb[224382]: [NOTICE]   (224418) : Loading success.
Sep 30 21:21:10 compute-0 podman[224368]: 2025-09-30 21:21:10.831777967 +0000 UTC m=+0.091692012 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:21:11 compute-0 nova_compute[192810]: 2025-09-30 21:21:11.311 2 DEBUG nova.compute.manager [req-6f4b440b-c10e-4a36-b107-804344be8331 req-fa54ecbd-68dc-4a1f-a6dc-a5d979fd71d1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Received event network-vif-plugged-0ad4c684-4514-4550-b708-6339f933766c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:21:11 compute-0 nova_compute[192810]: 2025-09-30 21:21:11.312 2 DEBUG oslo_concurrency.lockutils [req-6f4b440b-c10e-4a36-b107-804344be8331 req-fa54ecbd-68dc-4a1f-a6dc-a5d979fd71d1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:11 compute-0 nova_compute[192810]: 2025-09-30 21:21:11.313 2 DEBUG oslo_concurrency.lockutils [req-6f4b440b-c10e-4a36-b107-804344be8331 req-fa54ecbd-68dc-4a1f-a6dc-a5d979fd71d1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:11 compute-0 nova_compute[192810]: 2025-09-30 21:21:11.313 2 DEBUG oslo_concurrency.lockutils [req-6f4b440b-c10e-4a36-b107-804344be8331 req-fa54ecbd-68dc-4a1f-a6dc-a5d979fd71d1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:11 compute-0 nova_compute[192810]: 2025-09-30 21:21:11.314 2 DEBUG nova.compute.manager [req-6f4b440b-c10e-4a36-b107-804344be8331 req-fa54ecbd-68dc-4a1f-a6dc-a5d979fd71d1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Processing event network-vif-plugged-0ad4c684-4514-4550-b708-6339f933766c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:21:11 compute-0 nova_compute[192810]: 2025-09-30 21:21:11.316 2 DEBUG nova.compute.manager [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:21:11 compute-0 nova_compute[192810]: 2025-09-30 21:21:11.320 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267271.3205428, c79f1121-fcfb-4c07-94cf-1389e1df9e81 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:21:11 compute-0 nova_compute[192810]: 2025-09-30 21:21:11.321 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] VM Resumed (Lifecycle Event)
Sep 30 21:21:11 compute-0 nova_compute[192810]: 2025-09-30 21:21:11.324 2 DEBUG nova.virt.libvirt.driver [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:21:11 compute-0 nova_compute[192810]: 2025-09-30 21:21:11.330 2 INFO nova.virt.libvirt.driver [-] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Instance spawned successfully.
Sep 30 21:21:11 compute-0 nova_compute[192810]: 2025-09-30 21:21:11.330 2 DEBUG nova.virt.libvirt.driver [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:21:11 compute-0 nova_compute[192810]: 2025-09-30 21:21:11.432 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:21:11 compute-0 nova_compute[192810]: 2025-09-30 21:21:11.436 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:21:11 compute-0 nova_compute[192810]: 2025-09-30 21:21:11.450 2 DEBUG nova.virt.libvirt.driver [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:21:11 compute-0 nova_compute[192810]: 2025-09-30 21:21:11.451 2 DEBUG nova.virt.libvirt.driver [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:21:11 compute-0 nova_compute[192810]: 2025-09-30 21:21:11.451 2 DEBUG nova.virt.libvirt.driver [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:21:11 compute-0 nova_compute[192810]: 2025-09-30 21:21:11.452 2 DEBUG nova.virt.libvirt.driver [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:21:11 compute-0 nova_compute[192810]: 2025-09-30 21:21:11.453 2 DEBUG nova.virt.libvirt.driver [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:21:11 compute-0 nova_compute[192810]: 2025-09-30 21:21:11.453 2 DEBUG nova.virt.libvirt.driver [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:21:11 compute-0 nova_compute[192810]: 2025-09-30 21:21:11.460 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:21:11 compute-0 nova_compute[192810]: 2025-09-30 21:21:11.587 2 INFO nova.compute.manager [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Took 8.60 seconds to spawn the instance on the hypervisor.
Sep 30 21:21:11 compute-0 nova_compute[192810]: 2025-09-30 21:21:11.588 2 DEBUG nova.compute.manager [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:21:11 compute-0 nova_compute[192810]: 2025-09-30 21:21:11.589 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:11 compute-0 nova_compute[192810]: 2025-09-30 21:21:11.696 2 INFO nova.compute.manager [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Took 9.27 seconds to build instance.
Sep 30 21:21:11 compute-0 nova_compute[192810]: 2025-09-30 21:21:11.743 2 DEBUG nova.network.neutron [req-b9096eaf-507c-4da4-8d89-a622c120a5b9 req-a616323d-333a-45fd-868d-d634ee97983c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Updated VIF entry in instance network info cache for port 0ad4c684-4514-4550-b708-6339f933766c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:21:11 compute-0 nova_compute[192810]: 2025-09-30 21:21:11.744 2 DEBUG nova.network.neutron [req-b9096eaf-507c-4da4-8d89-a622c120a5b9 req-a616323d-333a-45fd-868d-d634ee97983c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Updating instance_info_cache with network_info: [{"id": "0ad4c684-4514-4550-b708-6339f933766c", "address": "fa:16:3e:c1:12:42", "network": {"id": "7ae61a95-12e5-48ee-93a0-85e12f8652eb", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-47213222-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "54fb4b43d65542abbb73044c6d52da8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ad4c684-45", "ovs_interfaceid": "0ad4c684-4514-4550-b708-6339f933766c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:21:11 compute-0 nova_compute[192810]: 2025-09-30 21:21:11.760 2 DEBUG oslo_concurrency.lockutils [None req-163d951d-57c4-4673-9328-17798b36dfdf fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.457s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:11 compute-0 nova_compute[192810]: 2025-09-30 21:21:11.766 2 DEBUG oslo_concurrency.lockutils [req-b9096eaf-507c-4da4-8d89-a622c120a5b9 req-a616323d-333a-45fd-868d-d634ee97983c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-c79f1121-fcfb-4c07-94cf-1389e1df9e81" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:21:13 compute-0 nova_compute[192810]: 2025-09-30 21:21:13.176 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267258.1758032, 56118128-4b8a-4d28-aeb3-585059dc4de8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:21:13 compute-0 nova_compute[192810]: 2025-09-30 21:21:13.178 2 INFO nova.compute.manager [-] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] VM Stopped (Lifecycle Event)
Sep 30 21:21:13 compute-0 nova_compute[192810]: 2025-09-30 21:21:13.211 2 DEBUG nova.compute.manager [None req-d0f4bc55-be07-4306-9616-eeef4b3488ea - - - - - -] [instance: 56118128-4b8a-4d28-aeb3-585059dc4de8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:21:13 compute-0 nova_compute[192810]: 2025-09-30 21:21:13.440 2 DEBUG nova.compute.manager [req-8485fefc-7fae-4747-999d-73dbc4b3353c req-d3b8e161-c451-42d0-afb2-6604402e79f3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Received event network-vif-plugged-0ad4c684-4514-4550-b708-6339f933766c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:21:13 compute-0 nova_compute[192810]: 2025-09-30 21:21:13.442 2 DEBUG oslo_concurrency.lockutils [req-8485fefc-7fae-4747-999d-73dbc4b3353c req-d3b8e161-c451-42d0-afb2-6604402e79f3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:13 compute-0 nova_compute[192810]: 2025-09-30 21:21:13.443 2 DEBUG oslo_concurrency.lockutils [req-8485fefc-7fae-4747-999d-73dbc4b3353c req-d3b8e161-c451-42d0-afb2-6604402e79f3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:13 compute-0 nova_compute[192810]: 2025-09-30 21:21:13.443 2 DEBUG oslo_concurrency.lockutils [req-8485fefc-7fae-4747-999d-73dbc4b3353c req-d3b8e161-c451-42d0-afb2-6604402e79f3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:13 compute-0 nova_compute[192810]: 2025-09-30 21:21:13.443 2 DEBUG nova.compute.manager [req-8485fefc-7fae-4747-999d-73dbc4b3353c req-d3b8e161-c451-42d0-afb2-6604402e79f3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] No waiting events found dispatching network-vif-plugged-0ad4c684-4514-4550-b708-6339f933766c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:21:13 compute-0 nova_compute[192810]: 2025-09-30 21:21:13.444 2 WARNING nova.compute.manager [req-8485fefc-7fae-4747-999d-73dbc4b3353c req-d3b8e161-c451-42d0-afb2-6604402e79f3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Received unexpected event network-vif-plugged-0ad4c684-4514-4550-b708-6339f933766c for instance with vm_state active and task_state None.
Sep 30 21:21:13 compute-0 nova_compute[192810]: 2025-09-30 21:21:13.818 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:16 compute-0 nova_compute[192810]: 2025-09-30 21:21:16.592 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:18 compute-0 nova_compute[192810]: 2025-09-30 21:21:18.157 2 DEBUG oslo_concurrency.lockutils [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Acquiring lock "3d12b579-88e1-415f-b0a2-0e8b17628076" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:18 compute-0 nova_compute[192810]: 2025-09-30 21:21:18.159 2 DEBUG oslo_concurrency.lockutils [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "3d12b579-88e1-415f-b0a2-0e8b17628076" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:18 compute-0 nova_compute[192810]: 2025-09-30 21:21:18.184 2 DEBUG nova.compute.manager [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:21:18 compute-0 nova_compute[192810]: 2025-09-30 21:21:18.315 2 DEBUG oslo_concurrency.lockutils [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:18 compute-0 nova_compute[192810]: 2025-09-30 21:21:18.316 2 DEBUG oslo_concurrency.lockutils [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:18 compute-0 nova_compute[192810]: 2025-09-30 21:21:18.325 2 DEBUG nova.virt.hardware [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:21:18 compute-0 nova_compute[192810]: 2025-09-30 21:21:18.326 2 INFO nova.compute.claims [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:21:18 compute-0 podman[224429]: 2025-09-30 21:21:18.362040135 +0000 UTC m=+0.085812085 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Sep 30 21:21:18 compute-0 nova_compute[192810]: 2025-09-30 21:21:18.483 2 DEBUG nova.compute.provider_tree [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:21:18 compute-0 nova_compute[192810]: 2025-09-30 21:21:18.501 2 DEBUG nova.scheduler.client.report [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:21:18 compute-0 nova_compute[192810]: 2025-09-30 21:21:18.538 2 DEBUG oslo_concurrency.lockutils [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.222s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:18 compute-0 nova_compute[192810]: 2025-09-30 21:21:18.539 2 DEBUG nova.compute.manager [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:21:18 compute-0 nova_compute[192810]: 2025-09-30 21:21:18.609 2 DEBUG nova.compute.manager [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:21:18 compute-0 nova_compute[192810]: 2025-09-30 21:21:18.610 2 DEBUG nova.network.neutron [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:21:18 compute-0 nova_compute[192810]: 2025-09-30 21:21:18.639 2 INFO nova.virt.libvirt.driver [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:21:18 compute-0 nova_compute[192810]: 2025-09-30 21:21:18.663 2 DEBUG nova.compute.manager [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:21:18 compute-0 nova_compute[192810]: 2025-09-30 21:21:18.820 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:18 compute-0 nova_compute[192810]: 2025-09-30 21:21:18.850 2 DEBUG nova.policy [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fdcdab028fdf46d7ba6634c631d3d33c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '54fb4b43d65542abbb73044c6d52da8a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:21:18 compute-0 nova_compute[192810]: 2025-09-30 21:21:18.886 2 DEBUG nova.compute.manager [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:21:18 compute-0 nova_compute[192810]: 2025-09-30 21:21:18.887 2 DEBUG nova.virt.libvirt.driver [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:21:18 compute-0 nova_compute[192810]: 2025-09-30 21:21:18.887 2 INFO nova.virt.libvirt.driver [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Creating image(s)
Sep 30 21:21:18 compute-0 nova_compute[192810]: 2025-09-30 21:21:18.888 2 DEBUG oslo_concurrency.lockutils [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Acquiring lock "/var/lib/nova/instances/3d12b579-88e1-415f-b0a2-0e8b17628076/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:18 compute-0 nova_compute[192810]: 2025-09-30 21:21:18.888 2 DEBUG oslo_concurrency.lockutils [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "/var/lib/nova/instances/3d12b579-88e1-415f-b0a2-0e8b17628076/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:18 compute-0 nova_compute[192810]: 2025-09-30 21:21:18.889 2 DEBUG oslo_concurrency.lockutils [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "/var/lib/nova/instances/3d12b579-88e1-415f-b0a2-0e8b17628076/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:18 compute-0 nova_compute[192810]: 2025-09-30 21:21:18.904 2 DEBUG oslo_concurrency.processutils [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:18 compute-0 nova_compute[192810]: 2025-09-30 21:21:18.971 2 DEBUG oslo_concurrency.processutils [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:18 compute-0 nova_compute[192810]: 2025-09-30 21:21:18.972 2 DEBUG oslo_concurrency.lockutils [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:18 compute-0 nova_compute[192810]: 2025-09-30 21:21:18.972 2 DEBUG oslo_concurrency.lockutils [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:18 compute-0 nova_compute[192810]: 2025-09-30 21:21:18.983 2 DEBUG oslo_concurrency.processutils [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:19 compute-0 nova_compute[192810]: 2025-09-30 21:21:19.041 2 DEBUG oslo_concurrency.processutils [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:19 compute-0 nova_compute[192810]: 2025-09-30 21:21:19.042 2 DEBUG oslo_concurrency.processutils [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/3d12b579-88e1-415f-b0a2-0e8b17628076/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:19 compute-0 nova_compute[192810]: 2025-09-30 21:21:19.095 2 DEBUG oslo_concurrency.processutils [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/3d12b579-88e1-415f-b0a2-0e8b17628076/disk 1073741824" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:19 compute-0 nova_compute[192810]: 2025-09-30 21:21:19.096 2 DEBUG oslo_concurrency.lockutils [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:19 compute-0 nova_compute[192810]: 2025-09-30 21:21:19.097 2 DEBUG oslo_concurrency.processutils [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:19 compute-0 nova_compute[192810]: 2025-09-30 21:21:19.168 2 DEBUG oslo_concurrency.processutils [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:19 compute-0 nova_compute[192810]: 2025-09-30 21:21:19.170 2 DEBUG nova.virt.disk.api [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Checking if we can resize image /var/lib/nova/instances/3d12b579-88e1-415f-b0a2-0e8b17628076/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:21:19 compute-0 nova_compute[192810]: 2025-09-30 21:21:19.170 2 DEBUG oslo_concurrency.processutils [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3d12b579-88e1-415f-b0a2-0e8b17628076/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:19 compute-0 nova_compute[192810]: 2025-09-30 21:21:19.226 2 DEBUG oslo_concurrency.processutils [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3d12b579-88e1-415f-b0a2-0e8b17628076/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:19 compute-0 nova_compute[192810]: 2025-09-30 21:21:19.228 2 DEBUG nova.virt.disk.api [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Cannot resize image /var/lib/nova/instances/3d12b579-88e1-415f-b0a2-0e8b17628076/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:21:19 compute-0 nova_compute[192810]: 2025-09-30 21:21:19.228 2 DEBUG nova.objects.instance [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lazy-loading 'migration_context' on Instance uuid 3d12b579-88e1-415f-b0a2-0e8b17628076 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:21:19 compute-0 nova_compute[192810]: 2025-09-30 21:21:19.240 2 DEBUG nova.virt.libvirt.driver [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:21:19 compute-0 nova_compute[192810]: 2025-09-30 21:21:19.241 2 DEBUG nova.virt.libvirt.driver [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Ensure instance console log exists: /var/lib/nova/instances/3d12b579-88e1-415f-b0a2-0e8b17628076/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:21:19 compute-0 nova_compute[192810]: 2025-09-30 21:21:19.241 2 DEBUG oslo_concurrency.lockutils [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:19 compute-0 nova_compute[192810]: 2025-09-30 21:21:19.242 2 DEBUG oslo_concurrency.lockutils [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:19 compute-0 nova_compute[192810]: 2025-09-30 21:21:19.242 2 DEBUG oslo_concurrency.lockutils [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:19 compute-0 nova_compute[192810]: 2025-09-30 21:21:19.669 2 DEBUG nova.network.neutron [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Successfully created port: 483ff4db-c136-4dea-b809-0029e29cee54 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:21:20 compute-0 nova_compute[192810]: 2025-09-30 21:21:20.952 2 DEBUG nova.network.neutron [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Successfully updated port: 483ff4db-c136-4dea-b809-0029e29cee54 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:21:20 compute-0 nova_compute[192810]: 2025-09-30 21:21:20.967 2 DEBUG oslo_concurrency.lockutils [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Acquiring lock "refresh_cache-3d12b579-88e1-415f-b0a2-0e8b17628076" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:21:20 compute-0 nova_compute[192810]: 2025-09-30 21:21:20.968 2 DEBUG oslo_concurrency.lockutils [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Acquired lock "refresh_cache-3d12b579-88e1-415f-b0a2-0e8b17628076" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:21:20 compute-0 nova_compute[192810]: 2025-09-30 21:21:20.968 2 DEBUG nova.network.neutron [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:21:21 compute-0 nova_compute[192810]: 2025-09-30 21:21:21.155 2 DEBUG nova.compute.manager [req-c2ca30f8-6124-4e49-92d9-e7199ec0d9b1 req-941d8caa-57ce-42c0-bb81-73b6a4fef15b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Received event network-changed-483ff4db-c136-4dea-b809-0029e29cee54 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:21:21 compute-0 nova_compute[192810]: 2025-09-30 21:21:21.155 2 DEBUG nova.compute.manager [req-c2ca30f8-6124-4e49-92d9-e7199ec0d9b1 req-941d8caa-57ce-42c0-bb81-73b6a4fef15b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Refreshing instance network info cache due to event network-changed-483ff4db-c136-4dea-b809-0029e29cee54. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:21:21 compute-0 nova_compute[192810]: 2025-09-30 21:21:21.156 2 DEBUG oslo_concurrency.lockutils [req-c2ca30f8-6124-4e49-92d9-e7199ec0d9b1 req-941d8caa-57ce-42c0-bb81-73b6a4fef15b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-3d12b579-88e1-415f-b0a2-0e8b17628076" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:21:21 compute-0 nova_compute[192810]: 2025-09-30 21:21:21.209 2 DEBUG nova.network.neutron [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:21:21 compute-0 nova_compute[192810]: 2025-09-30 21:21:21.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.208 2 DEBUG nova.network.neutron [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Updating instance_info_cache with network_info: [{"id": "483ff4db-c136-4dea-b809-0029e29cee54", "address": "fa:16:3e:60:29:61", "network": {"id": "7ae61a95-12e5-48ee-93a0-85e12f8652eb", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-47213222-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "54fb4b43d65542abbb73044c6d52da8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap483ff4db-c1", "ovs_interfaceid": "483ff4db-c136-4dea-b809-0029e29cee54", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.241 2 DEBUG oslo_concurrency.lockutils [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Releasing lock "refresh_cache-3d12b579-88e1-415f-b0a2-0e8b17628076" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.242 2 DEBUG nova.compute.manager [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Instance network_info: |[{"id": "483ff4db-c136-4dea-b809-0029e29cee54", "address": "fa:16:3e:60:29:61", "network": {"id": "7ae61a95-12e5-48ee-93a0-85e12f8652eb", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-47213222-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "54fb4b43d65542abbb73044c6d52da8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap483ff4db-c1", "ovs_interfaceid": "483ff4db-c136-4dea-b809-0029e29cee54", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.244 2 DEBUG oslo_concurrency.lockutils [req-c2ca30f8-6124-4e49-92d9-e7199ec0d9b1 req-941d8caa-57ce-42c0-bb81-73b6a4fef15b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-3d12b579-88e1-415f-b0a2-0e8b17628076" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.244 2 DEBUG nova.network.neutron [req-c2ca30f8-6124-4e49-92d9-e7199ec0d9b1 req-941d8caa-57ce-42c0-bb81-73b6a4fef15b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Refreshing network info cache for port 483ff4db-c136-4dea-b809-0029e29cee54 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.249 2 DEBUG nova.virt.libvirt.driver [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Start _get_guest_xml network_info=[{"id": "483ff4db-c136-4dea-b809-0029e29cee54", "address": "fa:16:3e:60:29:61", "network": {"id": "7ae61a95-12e5-48ee-93a0-85e12f8652eb", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-47213222-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "54fb4b43d65542abbb73044c6d52da8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap483ff4db-c1", "ovs_interfaceid": "483ff4db-c136-4dea-b809-0029e29cee54", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.258 2 WARNING nova.virt.libvirt.driver [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.270 2 DEBUG nova.virt.libvirt.host [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.271 2 DEBUG nova.virt.libvirt.host [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.275 2 DEBUG nova.virt.libvirt.host [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.276 2 DEBUG nova.virt.libvirt.host [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.278 2 DEBUG nova.virt.libvirt.driver [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.279 2 DEBUG nova.virt.hardware [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.280 2 DEBUG nova.virt.hardware [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.280 2 DEBUG nova.virt.hardware [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.281 2 DEBUG nova.virt.hardware [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.281 2 DEBUG nova.virt.hardware [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.282 2 DEBUG nova.virt.hardware [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.282 2 DEBUG nova.virt.hardware [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.283 2 DEBUG nova.virt.hardware [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.284 2 DEBUG nova.virt.hardware [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.284 2 DEBUG nova.virt.hardware [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.285 2 DEBUG nova.virt.hardware [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.292 2 DEBUG nova.virt.libvirt.vif [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:21:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-965591303',display_name='tempest-ServersAdminTestJSON-server-965591303',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-965591303',id=31,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='54fb4b43d65542abbb73044c6d52da8a',ramdisk_id='',reservation_id='r-hsly4ojy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-302955212',owner_user_name='tempest-ServersAdminTestJSON-302955212-proj
ect-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:21:18Z,user_data=None,user_id='fdcdab028fdf46d7ba6634c631d3d33c',uuid=3d12b579-88e1-415f-b0a2-0e8b17628076,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "483ff4db-c136-4dea-b809-0029e29cee54", "address": "fa:16:3e:60:29:61", "network": {"id": "7ae61a95-12e5-48ee-93a0-85e12f8652eb", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-47213222-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "54fb4b43d65542abbb73044c6d52da8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap483ff4db-c1", "ovs_interfaceid": "483ff4db-c136-4dea-b809-0029e29cee54", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.293 2 DEBUG nova.network.os_vif_util [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Converting VIF {"id": "483ff4db-c136-4dea-b809-0029e29cee54", "address": "fa:16:3e:60:29:61", "network": {"id": "7ae61a95-12e5-48ee-93a0-85e12f8652eb", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-47213222-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "54fb4b43d65542abbb73044c6d52da8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap483ff4db-c1", "ovs_interfaceid": "483ff4db-c136-4dea-b809-0029e29cee54", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.295 2 DEBUG nova.network.os_vif_util [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:60:29:61,bridge_name='br-int',has_traffic_filtering=True,id=483ff4db-c136-4dea-b809-0029e29cee54,network=Network(7ae61a95-12e5-48ee-93a0-85e12f8652eb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap483ff4db-c1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.296 2 DEBUG nova.objects.instance [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lazy-loading 'pci_devices' on Instance uuid 3d12b579-88e1-415f-b0a2-0e8b17628076 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.321 2 DEBUG nova.virt.libvirt.driver [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:21:22 compute-0 nova_compute[192810]:   <uuid>3d12b579-88e1-415f-b0a2-0e8b17628076</uuid>
Sep 30 21:21:22 compute-0 nova_compute[192810]:   <name>instance-0000001f</name>
Sep 30 21:21:22 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:21:22 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:21:22 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:21:22 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:       <nova:name>tempest-ServersAdminTestJSON-server-965591303</nova:name>
Sep 30 21:21:22 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:21:22</nova:creationTime>
Sep 30 21:21:22 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:21:22 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:21:22 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:21:22 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:21:22 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:21:22 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:21:22 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:21:22 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:21:22 compute-0 nova_compute[192810]:         <nova:user uuid="fdcdab028fdf46d7ba6634c631d3d33c">tempest-ServersAdminTestJSON-302955212-project-member</nova:user>
Sep 30 21:21:22 compute-0 nova_compute[192810]:         <nova:project uuid="54fb4b43d65542abbb73044c6d52da8a">tempest-ServersAdminTestJSON-302955212</nova:project>
Sep 30 21:21:22 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:21:22 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:21:22 compute-0 nova_compute[192810]:         <nova:port uuid="483ff4db-c136-4dea-b809-0029e29cee54">
Sep 30 21:21:22 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:21:22 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:21:22 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:21:22 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:21:22 compute-0 nova_compute[192810]:     <system>
Sep 30 21:21:22 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:21:22 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:21:22 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:21:22 compute-0 nova_compute[192810]:       <entry name="serial">3d12b579-88e1-415f-b0a2-0e8b17628076</entry>
Sep 30 21:21:22 compute-0 nova_compute[192810]:       <entry name="uuid">3d12b579-88e1-415f-b0a2-0e8b17628076</entry>
Sep 30 21:21:22 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     </system>
Sep 30 21:21:22 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:21:22 compute-0 nova_compute[192810]:   <os>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:   </os>
Sep 30 21:21:22 compute-0 nova_compute[192810]:   <features>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:   </features>
Sep 30 21:21:22 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:21:22 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:21:22 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:21:22 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:21:22 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:21:22 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/3d12b579-88e1-415f-b0a2-0e8b17628076/disk"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:21:22 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/3d12b579-88e1-415f-b0a2-0e8b17628076/disk.config"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:21:22 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:60:29:61"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:       <target dev="tap483ff4db-c1"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:21:22 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/3d12b579-88e1-415f-b0a2-0e8b17628076/console.log" append="off"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     <video>
Sep 30 21:21:22 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     </video>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:21:22 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:21:22 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:21:22 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:21:22 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:21:22 compute-0 nova_compute[192810]: </domain>
Sep 30 21:21:22 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.332 2 DEBUG nova.compute.manager [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Preparing to wait for external event network-vif-plugged-483ff4db-c136-4dea-b809-0029e29cee54 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.333 2 DEBUG oslo_concurrency.lockutils [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Acquiring lock "3d12b579-88e1-415f-b0a2-0e8b17628076-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.333 2 DEBUG oslo_concurrency.lockutils [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "3d12b579-88e1-415f-b0a2-0e8b17628076-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.334 2 DEBUG oslo_concurrency.lockutils [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "3d12b579-88e1-415f-b0a2-0e8b17628076-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.335 2 DEBUG nova.virt.libvirt.vif [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:21:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-965591303',display_name='tempest-ServersAdminTestJSON-server-965591303',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-965591303',id=31,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='54fb4b43d65542abbb73044c6d52da8a',ramdisk_id='',reservation_id='r-hsly4ojy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-302955212',owner_user_name='tempest-ServersAdminTestJSON-302955212-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:21:18Z,user_data=None,user_id='fdcdab028fdf46d7ba6634c631d3d33c',uuid=3d12b579-88e1-415f-b0a2-0e8b17628076,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "483ff4db-c136-4dea-b809-0029e29cee54", "address": "fa:16:3e:60:29:61", "network": {"id": "7ae61a95-12e5-48ee-93a0-85e12f8652eb", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-47213222-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "54fb4b43d65542abbb73044c6d52da8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap483ff4db-c1", "ovs_interfaceid": "483ff4db-c136-4dea-b809-0029e29cee54", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.335 2 DEBUG nova.network.os_vif_util [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Converting VIF {"id": "483ff4db-c136-4dea-b809-0029e29cee54", "address": "fa:16:3e:60:29:61", "network": {"id": "7ae61a95-12e5-48ee-93a0-85e12f8652eb", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-47213222-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "54fb4b43d65542abbb73044c6d52da8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap483ff4db-c1", "ovs_interfaceid": "483ff4db-c136-4dea-b809-0029e29cee54", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:21:22 compute-0 ovn_controller[94912]: 2025-09-30T21:21:22Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c1:12:42 10.100.0.14
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.338 2 DEBUG nova.network.os_vif_util [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:60:29:61,bridge_name='br-int',has_traffic_filtering=True,id=483ff4db-c136-4dea-b809-0029e29cee54,network=Network(7ae61a95-12e5-48ee-93a0-85e12f8652eb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap483ff4db-c1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:21:22 compute-0 ovn_controller[94912]: 2025-09-30T21:21:22Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c1:12:42 10.100.0.14
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.339 2 DEBUG os_vif [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:29:61,bridge_name='br-int',has_traffic_filtering=True,id=483ff4db-c136-4dea-b809-0029e29cee54,network=Network(7ae61a95-12e5-48ee-93a0-85e12f8652eb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap483ff4db-c1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.340 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.341 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.346 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.347 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap483ff4db-c1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.348 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap483ff4db-c1, col_values=(('external_ids', {'iface-id': '483ff4db-c136-4dea-b809-0029e29cee54', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:60:29:61', 'vm-uuid': '3d12b579-88e1-415f-b0a2-0e8b17628076'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.366 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:22 compute-0 NetworkManager[51733]: <info>  [1759267282.3670] manager: (tap483ff4db-c1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.374 2 INFO os_vif [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:29:61,bridge_name='br-int',has_traffic_filtering=True,id=483ff4db-c136-4dea-b809-0029e29cee54,network=Network(7ae61a95-12e5-48ee-93a0-85e12f8652eb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap483ff4db-c1')
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.446 2 DEBUG nova.virt.libvirt.driver [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.447 2 DEBUG nova.virt.libvirt.driver [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.447 2 DEBUG nova.virt.libvirt.driver [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] No VIF found with MAC fa:16:3e:60:29:61, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.448 2 INFO nova.virt.libvirt.driver [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Using config drive
Sep 30 21:21:22 compute-0 podman[224479]: 2025-09-30 21:21:22.471143912 +0000 UTC m=+0.065009899 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 21:21:22 compute-0 podman[224480]: 2025-09-30 21:21:22.494103023 +0000 UTC m=+0.074727530 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.857 2 INFO nova.virt.libvirt.driver [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Creating config drive at /var/lib/nova/instances/3d12b579-88e1-415f-b0a2-0e8b17628076/disk.config
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.863 2 DEBUG oslo_concurrency.processutils [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3d12b579-88e1-415f-b0a2-0e8b17628076/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9hpe4g6r execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:22 compute-0 nova_compute[192810]: 2025-09-30 21:21:22.991 2 DEBUG oslo_concurrency.processutils [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3d12b579-88e1-415f-b0a2-0e8b17628076/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9hpe4g6r" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:23 compute-0 kernel: tap483ff4db-c1: entered promiscuous mode
Sep 30 21:21:23 compute-0 NetworkManager[51733]: <info>  [1759267283.0659] manager: (tap483ff4db-c1): new Tun device (/org/freedesktop/NetworkManager/Devices/57)
Sep 30 21:21:23 compute-0 ovn_controller[94912]: 2025-09-30T21:21:23Z|00122|binding|INFO|Claiming lport 483ff4db-c136-4dea-b809-0029e29cee54 for this chassis.
Sep 30 21:21:23 compute-0 ovn_controller[94912]: 2025-09-30T21:21:23Z|00123|binding|INFO|483ff4db-c136-4dea-b809-0029e29cee54: Claiming fa:16:3e:60:29:61 10.100.0.13
Sep 30 21:21:23 compute-0 nova_compute[192810]: 2025-09-30 21:21:23.068 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:23.083 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:60:29:61 10.100.0.13'], port_security=['fa:16:3e:60:29:61 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '3d12b579-88e1-415f-b0a2-0e8b17628076', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7ae61a95-12e5-48ee-93a0-85e12f8652eb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '54fb4b43d65542abbb73044c6d52da8a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0c0ea76a-15be-4f11-88b8-d75d482afe6f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c761993a-b179-4f1b-8dc7-b7ebb1a3e32b, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=483ff4db-c136-4dea-b809-0029e29cee54) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:21:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:23.085 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 483ff4db-c136-4dea-b809-0029e29cee54 in datapath 7ae61a95-12e5-48ee-93a0-85e12f8652eb bound to our chassis
Sep 30 21:21:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:23.088 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7ae61a95-12e5-48ee-93a0-85e12f8652eb
Sep 30 21:21:23 compute-0 ovn_controller[94912]: 2025-09-30T21:21:23Z|00124|binding|INFO|Setting lport 483ff4db-c136-4dea-b809-0029e29cee54 ovn-installed in OVS
Sep 30 21:21:23 compute-0 ovn_controller[94912]: 2025-09-30T21:21:23Z|00125|binding|INFO|Setting lport 483ff4db-c136-4dea-b809-0029e29cee54 up in Southbound
Sep 30 21:21:23 compute-0 nova_compute[192810]: 2025-09-30 21:21:23.098 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:23 compute-0 nova_compute[192810]: 2025-09-30 21:21:23.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:23.118 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[f60d644e-1bfd-4797-9c2c-6dc80e0b14ba]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:23 compute-0 systemd-udevd[224540]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:21:23 compute-0 systemd-machined[152794]: New machine qemu-13-instance-0000001f.
Sep 30 21:21:23 compute-0 systemd[1]: Started Virtual Machine qemu-13-instance-0000001f.
Sep 30 21:21:23 compute-0 NetworkManager[51733]: <info>  [1759267283.1527] device (tap483ff4db-c1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:21:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:23.151 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[51364bb7-3fd6-42d9-ad29-adb1abcac2a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:23 compute-0 NetworkManager[51733]: <info>  [1759267283.1536] device (tap483ff4db-c1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:21:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:23.157 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[ca9d366d-cd02-470b-9df2-1ac87c222d23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:23.198 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[2ef70862-d4d6-4f0f-9f35-e2fe8533d8e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:23.225 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[f4bc5d20-37f1-4df7-9fad-3ef4c9c072fa]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7ae61a95-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:55:1a:c2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 832, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 832, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 396566, 'reachable_time': 17378, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224550, 'error': None, 'target': 'ovnmeta-7ae61a95-12e5-48ee-93a0-85e12f8652eb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:23.254 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[6f54b65c-ad65-463d-bb1e-6af933746c4e]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap7ae61a95-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 396582, 'tstamp': 396582}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224553, 'error': None, 'target': 'ovnmeta-7ae61a95-12e5-48ee-93a0-85e12f8652eb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap7ae61a95-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 396587, 'tstamp': 396587}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224553, 'error': None, 'target': 'ovnmeta-7ae61a95-12e5-48ee-93a0-85e12f8652eb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:23.258 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7ae61a95-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:21:23 compute-0 nova_compute[192810]: 2025-09-30 21:21:23.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:23.264 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7ae61a95-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:21:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:23.264 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:21:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:23.265 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7ae61a95-10, col_values=(('external_ids', {'iface-id': '49c3a39c-1855-4432-8459-a8f7bc7429b4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:21:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:23.266 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:21:23 compute-0 nova_compute[192810]: 2025-09-30 21:21:23.650 2 DEBUG nova.network.neutron [req-c2ca30f8-6124-4e49-92d9-e7199ec0d9b1 req-941d8caa-57ce-42c0-bb81-73b6a4fef15b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Updated VIF entry in instance network info cache for port 483ff4db-c136-4dea-b809-0029e29cee54. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:21:23 compute-0 nova_compute[192810]: 2025-09-30 21:21:23.650 2 DEBUG nova.network.neutron [req-c2ca30f8-6124-4e49-92d9-e7199ec0d9b1 req-941d8caa-57ce-42c0-bb81-73b6a4fef15b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Updating instance_info_cache with network_info: [{"id": "483ff4db-c136-4dea-b809-0029e29cee54", "address": "fa:16:3e:60:29:61", "network": {"id": "7ae61a95-12e5-48ee-93a0-85e12f8652eb", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-47213222-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "54fb4b43d65542abbb73044c6d52da8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap483ff4db-c1", "ovs_interfaceid": "483ff4db-c136-4dea-b809-0029e29cee54", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:21:23 compute-0 nova_compute[192810]: 2025-09-30 21:21:23.690 2 DEBUG oslo_concurrency.lockutils [req-c2ca30f8-6124-4e49-92d9-e7199ec0d9b1 req-941d8caa-57ce-42c0-bb81-73b6a4fef15b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-3d12b579-88e1-415f-b0a2-0e8b17628076" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:21:24 compute-0 nova_compute[192810]: 2025-09-30 21:21:24.009 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267284.00874, 3d12b579-88e1-415f-b0a2-0e8b17628076 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:21:24 compute-0 nova_compute[192810]: 2025-09-30 21:21:24.009 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] VM Started (Lifecycle Event)
Sep 30 21:21:24 compute-0 nova_compute[192810]: 2025-09-30 21:21:24.029 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:21:24 compute-0 nova_compute[192810]: 2025-09-30 21:21:24.033 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267284.011109, 3d12b579-88e1-415f-b0a2-0e8b17628076 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:21:24 compute-0 nova_compute[192810]: 2025-09-30 21:21:24.033 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] VM Paused (Lifecycle Event)
Sep 30 21:21:24 compute-0 nova_compute[192810]: 2025-09-30 21:21:24.067 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:21:24 compute-0 nova_compute[192810]: 2025-09-30 21:21:24.070 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:21:24 compute-0 nova_compute[192810]: 2025-09-30 21:21:24.125 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:21:24 compute-0 nova_compute[192810]: 2025-09-30 21:21:24.364 2 DEBUG nova.compute.manager [req-966ccee4-b9d9-4d61-b5de-42a679a69e1c req-b4a2a8ee-a703-4c19-ae7e-b64f4c458a69 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Received event network-vif-plugged-483ff4db-c136-4dea-b809-0029e29cee54 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:21:24 compute-0 nova_compute[192810]: 2025-09-30 21:21:24.364 2 DEBUG oslo_concurrency.lockutils [req-966ccee4-b9d9-4d61-b5de-42a679a69e1c req-b4a2a8ee-a703-4c19-ae7e-b64f4c458a69 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3d12b579-88e1-415f-b0a2-0e8b17628076-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:24 compute-0 nova_compute[192810]: 2025-09-30 21:21:24.365 2 DEBUG oslo_concurrency.lockutils [req-966ccee4-b9d9-4d61-b5de-42a679a69e1c req-b4a2a8ee-a703-4c19-ae7e-b64f4c458a69 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3d12b579-88e1-415f-b0a2-0e8b17628076-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:24 compute-0 nova_compute[192810]: 2025-09-30 21:21:24.365 2 DEBUG oslo_concurrency.lockutils [req-966ccee4-b9d9-4d61-b5de-42a679a69e1c req-b4a2a8ee-a703-4c19-ae7e-b64f4c458a69 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3d12b579-88e1-415f-b0a2-0e8b17628076-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:24 compute-0 nova_compute[192810]: 2025-09-30 21:21:24.366 2 DEBUG nova.compute.manager [req-966ccee4-b9d9-4d61-b5de-42a679a69e1c req-b4a2a8ee-a703-4c19-ae7e-b64f4c458a69 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Processing event network-vif-plugged-483ff4db-c136-4dea-b809-0029e29cee54 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:21:24 compute-0 nova_compute[192810]: 2025-09-30 21:21:24.366 2 DEBUG nova.compute.manager [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:21:24 compute-0 nova_compute[192810]: 2025-09-30 21:21:24.370 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267284.3703296, 3d12b579-88e1-415f-b0a2-0e8b17628076 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:21:24 compute-0 nova_compute[192810]: 2025-09-30 21:21:24.371 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] VM Resumed (Lifecycle Event)
Sep 30 21:21:24 compute-0 nova_compute[192810]: 2025-09-30 21:21:24.372 2 DEBUG nova.virt.libvirt.driver [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:21:24 compute-0 nova_compute[192810]: 2025-09-30 21:21:24.376 2 INFO nova.virt.libvirt.driver [-] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Instance spawned successfully.
Sep 30 21:21:24 compute-0 nova_compute[192810]: 2025-09-30 21:21:24.376 2 DEBUG nova.virt.libvirt.driver [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:21:24 compute-0 nova_compute[192810]: 2025-09-30 21:21:24.418 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:21:24 compute-0 nova_compute[192810]: 2025-09-30 21:21:24.422 2 DEBUG nova.virt.libvirt.driver [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:21:24 compute-0 nova_compute[192810]: 2025-09-30 21:21:24.422 2 DEBUG nova.virt.libvirt.driver [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:21:24 compute-0 nova_compute[192810]: 2025-09-30 21:21:24.423 2 DEBUG nova.virt.libvirt.driver [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:21:24 compute-0 nova_compute[192810]: 2025-09-30 21:21:24.423 2 DEBUG nova.virt.libvirt.driver [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:21:24 compute-0 nova_compute[192810]: 2025-09-30 21:21:24.423 2 DEBUG nova.virt.libvirt.driver [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:21:24 compute-0 nova_compute[192810]: 2025-09-30 21:21:24.424 2 DEBUG nova.virt.libvirt.driver [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:21:24 compute-0 nova_compute[192810]: 2025-09-30 21:21:24.427 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:21:24 compute-0 nova_compute[192810]: 2025-09-30 21:21:24.611 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:21:24 compute-0 nova_compute[192810]: 2025-09-30 21:21:24.682 2 INFO nova.compute.manager [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Took 5.80 seconds to spawn the instance on the hypervisor.
Sep 30 21:21:24 compute-0 nova_compute[192810]: 2025-09-30 21:21:24.682 2 DEBUG nova.compute.manager [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:21:24 compute-0 nova_compute[192810]: 2025-09-30 21:21:24.809 2 INFO nova.compute.manager [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Took 6.54 seconds to build instance.
Sep 30 21:21:24 compute-0 nova_compute[192810]: 2025-09-30 21:21:24.831 2 DEBUG oslo_concurrency.lockutils [None req-746c50d7-70d7-48fe-b887-bc8b4d1b86ea fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "3d12b579-88e1-415f-b0a2-0e8b17628076" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.672s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:26 compute-0 nova_compute[192810]: 2025-09-30 21:21:26.470 2 DEBUG nova.compute.manager [req-a2850b9c-9bfe-46d0-86b1-30cb6f974e26 req-28bf4a60-e1db-4c0f-bcd2-4f4ba483a98e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Received event network-vif-plugged-483ff4db-c136-4dea-b809-0029e29cee54 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:21:26 compute-0 nova_compute[192810]: 2025-09-30 21:21:26.470 2 DEBUG oslo_concurrency.lockutils [req-a2850b9c-9bfe-46d0-86b1-30cb6f974e26 req-28bf4a60-e1db-4c0f-bcd2-4f4ba483a98e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3d12b579-88e1-415f-b0a2-0e8b17628076-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:26 compute-0 nova_compute[192810]: 2025-09-30 21:21:26.471 2 DEBUG oslo_concurrency.lockutils [req-a2850b9c-9bfe-46d0-86b1-30cb6f974e26 req-28bf4a60-e1db-4c0f-bcd2-4f4ba483a98e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3d12b579-88e1-415f-b0a2-0e8b17628076-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:26 compute-0 nova_compute[192810]: 2025-09-30 21:21:26.471 2 DEBUG oslo_concurrency.lockutils [req-a2850b9c-9bfe-46d0-86b1-30cb6f974e26 req-28bf4a60-e1db-4c0f-bcd2-4f4ba483a98e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3d12b579-88e1-415f-b0a2-0e8b17628076-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:26 compute-0 nova_compute[192810]: 2025-09-30 21:21:26.471 2 DEBUG nova.compute.manager [req-a2850b9c-9bfe-46d0-86b1-30cb6f974e26 req-28bf4a60-e1db-4c0f-bcd2-4f4ba483a98e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] No waiting events found dispatching network-vif-plugged-483ff4db-c136-4dea-b809-0029e29cee54 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:21:26 compute-0 nova_compute[192810]: 2025-09-30 21:21:26.471 2 WARNING nova.compute.manager [req-a2850b9c-9bfe-46d0-86b1-30cb6f974e26 req-28bf4a60-e1db-4c0f-bcd2-4f4ba483a98e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Received unexpected event network-vif-plugged-483ff4db-c136-4dea-b809-0029e29cee54 for instance with vm_state active and task_state None.
Sep 30 21:21:26 compute-0 nova_compute[192810]: 2025-09-30 21:21:26.597 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:27 compute-0 nova_compute[192810]: 2025-09-30 21:21:27.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:28 compute-0 podman[224563]: 2025-09-30 21:21:28.336133716 +0000 UTC m=+0.068028484 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=edpm)
Sep 30 21:21:28 compute-0 podman[224562]: 2025-09-30 21:21:28.394405925 +0000 UTC m=+0.121957705 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Sep 30 21:21:30 compute-0 nova_compute[192810]: 2025-09-30 21:21:30.607 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:30.610 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:21:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:30.611 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:21:31 compute-0 nova_compute[192810]: 2025-09-30 21:21:31.600 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:32 compute-0 nova_compute[192810]: 2025-09-30 21:21:32.157 2 DEBUG oslo_concurrency.lockutils [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Acquiring lock "370a976e-b51f-4187-bf76-bf6cbcae956b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:32 compute-0 nova_compute[192810]: 2025-09-30 21:21:32.158 2 DEBUG oslo_concurrency.lockutils [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lock "370a976e-b51f-4187-bf76-bf6cbcae956b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:32 compute-0 nova_compute[192810]: 2025-09-30 21:21:32.181 2 DEBUG nova.compute.manager [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:21:32 compute-0 nova_compute[192810]: 2025-09-30 21:21:32.284 2 DEBUG oslo_concurrency.lockutils [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:32 compute-0 nova_compute[192810]: 2025-09-30 21:21:32.284 2 DEBUG oslo_concurrency.lockutils [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:32 compute-0 nova_compute[192810]: 2025-09-30 21:21:32.296 2 DEBUG nova.virt.hardware [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:21:32 compute-0 nova_compute[192810]: 2025-09-30 21:21:32.296 2 INFO nova.compute.claims [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:21:32 compute-0 nova_compute[192810]: 2025-09-30 21:21:32.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:32 compute-0 nova_compute[192810]: 2025-09-30 21:21:32.457 2 DEBUG nova.compute.provider_tree [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:21:32 compute-0 nova_compute[192810]: 2025-09-30 21:21:32.476 2 DEBUG nova.scheduler.client.report [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:21:32 compute-0 nova_compute[192810]: 2025-09-30 21:21:32.531 2 DEBUG oslo_concurrency.lockutils [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.246s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:32 compute-0 nova_compute[192810]: 2025-09-30 21:21:32.531 2 DEBUG nova.compute.manager [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:21:32 compute-0 nova_compute[192810]: 2025-09-30 21:21:32.595 2 DEBUG nova.compute.manager [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:21:32 compute-0 nova_compute[192810]: 2025-09-30 21:21:32.596 2 DEBUG nova.network.neutron [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:21:32 compute-0 nova_compute[192810]: 2025-09-30 21:21:32.617 2 INFO nova.virt.libvirt.driver [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:21:32 compute-0 nova_compute[192810]: 2025-09-30 21:21:32.666 2 DEBUG nova.compute.manager [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:21:32 compute-0 nova_compute[192810]: 2025-09-30 21:21:32.770 2 DEBUG nova.compute.manager [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:21:32 compute-0 nova_compute[192810]: 2025-09-30 21:21:32.772 2 DEBUG nova.virt.libvirt.driver [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:21:32 compute-0 nova_compute[192810]: 2025-09-30 21:21:32.772 2 INFO nova.virt.libvirt.driver [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Creating image(s)
Sep 30 21:21:32 compute-0 nova_compute[192810]: 2025-09-30 21:21:32.773 2 DEBUG oslo_concurrency.lockutils [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Acquiring lock "/var/lib/nova/instances/370a976e-b51f-4187-bf76-bf6cbcae956b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:32 compute-0 nova_compute[192810]: 2025-09-30 21:21:32.773 2 DEBUG oslo_concurrency.lockutils [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lock "/var/lib/nova/instances/370a976e-b51f-4187-bf76-bf6cbcae956b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:32 compute-0 nova_compute[192810]: 2025-09-30 21:21:32.774 2 DEBUG oslo_concurrency.lockutils [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lock "/var/lib/nova/instances/370a976e-b51f-4187-bf76-bf6cbcae956b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:32 compute-0 nova_compute[192810]: 2025-09-30 21:21:32.788 2 DEBUG oslo_concurrency.processutils [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:32 compute-0 nova_compute[192810]: 2025-09-30 21:21:32.847 2 DEBUG oslo_concurrency.processutils [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:32 compute-0 nova_compute[192810]: 2025-09-30 21:21:32.848 2 DEBUG oslo_concurrency.lockutils [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:32 compute-0 nova_compute[192810]: 2025-09-30 21:21:32.848 2 DEBUG oslo_concurrency.lockutils [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:32 compute-0 nova_compute[192810]: 2025-09-30 21:21:32.859 2 DEBUG oslo_concurrency.processutils [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:32 compute-0 nova_compute[192810]: 2025-09-30 21:21:32.883 2 DEBUG nova.policy [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c536ea061a32492a8c5e6bf941d1c9f3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '47d2c796445c4dd3affc8594502f04be', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:21:32 compute-0 nova_compute[192810]: 2025-09-30 21:21:32.920 2 DEBUG oslo_concurrency.processutils [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:32 compute-0 nova_compute[192810]: 2025-09-30 21:21:32.920 2 DEBUG oslo_concurrency.processutils [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/370a976e-b51f-4187-bf76-bf6cbcae956b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:32 compute-0 nova_compute[192810]: 2025-09-30 21:21:32.957 2 DEBUG oslo_concurrency.processutils [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/370a976e-b51f-4187-bf76-bf6cbcae956b/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:32 compute-0 nova_compute[192810]: 2025-09-30 21:21:32.959 2 DEBUG oslo_concurrency.lockutils [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:32 compute-0 nova_compute[192810]: 2025-09-30 21:21:32.959 2 DEBUG oslo_concurrency.processutils [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:33 compute-0 nova_compute[192810]: 2025-09-30 21:21:33.017 2 DEBUG oslo_concurrency.processutils [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:33 compute-0 nova_compute[192810]: 2025-09-30 21:21:33.018 2 DEBUG nova.virt.disk.api [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Checking if we can resize image /var/lib/nova/instances/370a976e-b51f-4187-bf76-bf6cbcae956b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:21:33 compute-0 nova_compute[192810]: 2025-09-30 21:21:33.018 2 DEBUG oslo_concurrency.processutils [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/370a976e-b51f-4187-bf76-bf6cbcae956b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:33 compute-0 nova_compute[192810]: 2025-09-30 21:21:33.071 2 DEBUG oslo_concurrency.processutils [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/370a976e-b51f-4187-bf76-bf6cbcae956b/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:33 compute-0 nova_compute[192810]: 2025-09-30 21:21:33.072 2 DEBUG nova.virt.disk.api [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Cannot resize image /var/lib/nova/instances/370a976e-b51f-4187-bf76-bf6cbcae956b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:21:33 compute-0 nova_compute[192810]: 2025-09-30 21:21:33.072 2 DEBUG nova.objects.instance [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lazy-loading 'migration_context' on Instance uuid 370a976e-b51f-4187-bf76-bf6cbcae956b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:21:33 compute-0 nova_compute[192810]: 2025-09-30 21:21:33.095 2 DEBUG nova.virt.libvirt.driver [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:21:33 compute-0 nova_compute[192810]: 2025-09-30 21:21:33.096 2 DEBUG nova.virt.libvirt.driver [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Ensure instance console log exists: /var/lib/nova/instances/370a976e-b51f-4187-bf76-bf6cbcae956b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:21:33 compute-0 nova_compute[192810]: 2025-09-30 21:21:33.096 2 DEBUG oslo_concurrency.lockutils [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:33 compute-0 nova_compute[192810]: 2025-09-30 21:21:33.097 2 DEBUG oslo_concurrency.lockutils [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:33 compute-0 nova_compute[192810]: 2025-09-30 21:21:33.097 2 DEBUG oslo_concurrency.lockutils [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:33 compute-0 nova_compute[192810]: 2025-09-30 21:21:33.757 2 DEBUG nova.network.neutron [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Successfully created port: ce9edf2f-fd59-406b-b9c3-5b314d995fe9 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:21:34 compute-0 podman[224623]: 2025-09-30 21:21:34.33170055 +0000 UTC m=+0.067037430 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_metadata_agent, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Sep 30 21:21:34 compute-0 unix_chkpwd[224642]: password check failed for user (root)
Sep 30 21:21:34 compute-0 sshd-session[224621]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=45.81.23.80  user=root
Sep 30 21:21:34 compute-0 nova_compute[192810]: 2025-09-30 21:21:34.729 2 DEBUG nova.network.neutron [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Successfully updated port: ce9edf2f-fd59-406b-b9c3-5b314d995fe9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:21:34 compute-0 nova_compute[192810]: 2025-09-30 21:21:34.749 2 DEBUG oslo_concurrency.lockutils [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Acquiring lock "refresh_cache-370a976e-b51f-4187-bf76-bf6cbcae956b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:21:34 compute-0 nova_compute[192810]: 2025-09-30 21:21:34.749 2 DEBUG oslo_concurrency.lockutils [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Acquired lock "refresh_cache-370a976e-b51f-4187-bf76-bf6cbcae956b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:21:34 compute-0 nova_compute[192810]: 2025-09-30 21:21:34.750 2 DEBUG nova.network.neutron [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:21:34 compute-0 nova_compute[192810]: 2025-09-30 21:21:34.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:21:34 compute-0 nova_compute[192810]: 2025-09-30 21:21:34.896 2 DEBUG nova.compute.manager [req-68cffd4d-a366-4141-bdf2-e97613376593 req-c18ced8d-244d-42b0-8613-d569330634bd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Received event network-changed-ce9edf2f-fd59-406b-b9c3-5b314d995fe9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:21:34 compute-0 nova_compute[192810]: 2025-09-30 21:21:34.896 2 DEBUG nova.compute.manager [req-68cffd4d-a366-4141-bdf2-e97613376593 req-c18ced8d-244d-42b0-8613-d569330634bd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Refreshing instance network info cache due to event network-changed-ce9edf2f-fd59-406b-b9c3-5b314d995fe9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:21:34 compute-0 nova_compute[192810]: 2025-09-30 21:21:34.896 2 DEBUG oslo_concurrency.lockutils [req-68cffd4d-a366-4141-bdf2-e97613376593 req-c18ced8d-244d-42b0-8613-d569330634bd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-370a976e-b51f-4187-bf76-bf6cbcae956b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:21:34 compute-0 nova_compute[192810]: 2025-09-30 21:21:34.980 2 DEBUG nova.network.neutron [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:21:36 compute-0 nova_compute[192810]: 2025-09-30 21:21:36.602 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:36 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:36.614 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:21:36 compute-0 nova_compute[192810]: 2025-09-30 21:21:36.783 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:21:36 compute-0 nova_compute[192810]: 2025-09-30 21:21:36.804 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:21:36 compute-0 nova_compute[192810]: 2025-09-30 21:21:36.805 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:21:36 compute-0 sshd-session[224621]: Failed password for root from 45.81.23.80 port 50184 ssh2
Sep 30 21:21:37 compute-0 nova_compute[192810]: 2025-09-30 21:21:37.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:37 compute-0 ovn_controller[94912]: 2025-09-30T21:21:37Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:60:29:61 10.100.0.13
Sep 30 21:21:37 compute-0 ovn_controller[94912]: 2025-09-30T21:21:37Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:60:29:61 10.100.0.13
Sep 30 21:21:37 compute-0 nova_compute[192810]: 2025-09-30 21:21:37.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:21:37 compute-0 nova_compute[192810]: 2025-09-30 21:21:37.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:21:37 compute-0 nova_compute[192810]: 2025-09-30 21:21:37.966 2 DEBUG nova.network.neutron [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Updating instance_info_cache with network_info: [{"id": "ce9edf2f-fd59-406b-b9c3-5b314d995fe9", "address": "fa:16:3e:99:d9:88", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce9edf2f-fd", "ovs_interfaceid": "ce9edf2f-fd59-406b-b9c3-5b314d995fe9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:21:37 compute-0 nova_compute[192810]: 2025-09-30 21:21:37.985 2 DEBUG oslo_concurrency.lockutils [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Releasing lock "refresh_cache-370a976e-b51f-4187-bf76-bf6cbcae956b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:21:37 compute-0 nova_compute[192810]: 2025-09-30 21:21:37.985 2 DEBUG nova.compute.manager [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Instance network_info: |[{"id": "ce9edf2f-fd59-406b-b9c3-5b314d995fe9", "address": "fa:16:3e:99:d9:88", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce9edf2f-fd", "ovs_interfaceid": "ce9edf2f-fd59-406b-b9c3-5b314d995fe9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:21:37 compute-0 nova_compute[192810]: 2025-09-30 21:21:37.986 2 DEBUG oslo_concurrency.lockutils [req-68cffd4d-a366-4141-bdf2-e97613376593 req-c18ced8d-244d-42b0-8613-d569330634bd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-370a976e-b51f-4187-bf76-bf6cbcae956b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:21:37 compute-0 nova_compute[192810]: 2025-09-30 21:21:37.987 2 DEBUG nova.network.neutron [req-68cffd4d-a366-4141-bdf2-e97613376593 req-c18ced8d-244d-42b0-8613-d569330634bd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Refreshing network info cache for port ce9edf2f-fd59-406b-b9c3-5b314d995fe9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:21:37 compute-0 nova_compute[192810]: 2025-09-30 21:21:37.991 2 DEBUG nova.virt.libvirt.driver [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Start _get_guest_xml network_info=[{"id": "ce9edf2f-fd59-406b-b9c3-5b314d995fe9", "address": "fa:16:3e:99:d9:88", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce9edf2f-fd", "ovs_interfaceid": "ce9edf2f-fd59-406b-b9c3-5b314d995fe9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:21:37 compute-0 nova_compute[192810]: 2025-09-30 21:21:37.998 2 WARNING nova.virt.libvirt.driver [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:21:38 compute-0 nova_compute[192810]: 2025-09-30 21:21:38.006 2 DEBUG nova.virt.libvirt.host [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:21:38 compute-0 nova_compute[192810]: 2025-09-30 21:21:38.008 2 DEBUG nova.virt.libvirt.host [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:21:38 compute-0 nova_compute[192810]: 2025-09-30 21:21:38.018 2 DEBUG nova.virt.libvirt.host [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:21:38 compute-0 nova_compute[192810]: 2025-09-30 21:21:38.019 2 DEBUG nova.virt.libvirt.host [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:21:38 compute-0 nova_compute[192810]: 2025-09-30 21:21:38.021 2 DEBUG nova.virt.libvirt.driver [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:21:38 compute-0 nova_compute[192810]: 2025-09-30 21:21:38.022 2 DEBUG nova.virt.hardware [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:21:38 compute-0 nova_compute[192810]: 2025-09-30 21:21:38.023 2 DEBUG nova.virt.hardware [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:21:38 compute-0 nova_compute[192810]: 2025-09-30 21:21:38.023 2 DEBUG nova.virt.hardware [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:21:38 compute-0 nova_compute[192810]: 2025-09-30 21:21:38.024 2 DEBUG nova.virt.hardware [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:21:38 compute-0 nova_compute[192810]: 2025-09-30 21:21:38.024 2 DEBUG nova.virt.hardware [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:21:38 compute-0 nova_compute[192810]: 2025-09-30 21:21:38.025 2 DEBUG nova.virt.hardware [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:21:38 compute-0 nova_compute[192810]: 2025-09-30 21:21:38.025 2 DEBUG nova.virt.hardware [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:21:38 compute-0 nova_compute[192810]: 2025-09-30 21:21:38.026 2 DEBUG nova.virt.hardware [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:21:38 compute-0 nova_compute[192810]: 2025-09-30 21:21:38.026 2 DEBUG nova.virt.hardware [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:21:38 compute-0 nova_compute[192810]: 2025-09-30 21:21:38.026 2 DEBUG nova.virt.hardware [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:21:38 compute-0 nova_compute[192810]: 2025-09-30 21:21:38.027 2 DEBUG nova.virt.hardware [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:21:38 compute-0 nova_compute[192810]: 2025-09-30 21:21:38.034 2 DEBUG nova.virt.libvirt.vif [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:21:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1609111197',display_name='tempest-tempest.common.compute-instance-1609111197',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1609111197',id=34,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAiqd8zlUv0daoHrSM4f0FkhoHklAhet2FPUnE56/Ac7ATsAijkYKDWRYPhtHLrbJjDveTvHop3CVY09bPDxSILijQRoZQfSPrdRSYWqRSb8fAb7+uxFNn+ITDg2wp4sFw==',key_name='tempest-keypair-804840075',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='47d2c796445c4dd3affc8594502f04be',ramdisk_id='',reservation_id='r-9ynmfskw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-32534463',owner_user_name='tempest-AttachInterfacesTestJSON-32534463-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:21:32Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c536ea061a32492a8c5e6bf941d1c9f3',uuid=370a976e-b51f-4187-bf76-bf6cbcae956b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ce9edf2f-fd59-406b-b9c3-5b314d995fe9", "address": "fa:16:3e:99:d9:88", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce9edf2f-fd", "ovs_interfaceid": "ce9edf2f-fd59-406b-b9c3-5b314d995fe9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:21:38 compute-0 nova_compute[192810]: 2025-09-30 21:21:38.035 2 DEBUG nova.network.os_vif_util [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Converting VIF {"id": "ce9edf2f-fd59-406b-b9c3-5b314d995fe9", "address": "fa:16:3e:99:d9:88", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce9edf2f-fd", "ovs_interfaceid": "ce9edf2f-fd59-406b-b9c3-5b314d995fe9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:21:38 compute-0 nova_compute[192810]: 2025-09-30 21:21:38.036 2 DEBUG nova.network.os_vif_util [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:99:d9:88,bridge_name='br-int',has_traffic_filtering=True,id=ce9edf2f-fd59-406b-b9c3-5b314d995fe9,network=Network(29d3fdc6-d8e1-4032-8f0c-e91da2912153),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapce9edf2f-fd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:21:38 compute-0 nova_compute[192810]: 2025-09-30 21:21:38.038 2 DEBUG nova.objects.instance [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lazy-loading 'pci_devices' on Instance uuid 370a976e-b51f-4187-bf76-bf6cbcae956b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:21:38 compute-0 nova_compute[192810]: 2025-09-30 21:21:38.055 2 DEBUG nova.virt.libvirt.driver [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:21:38 compute-0 nova_compute[192810]:   <uuid>370a976e-b51f-4187-bf76-bf6cbcae956b</uuid>
Sep 30 21:21:38 compute-0 nova_compute[192810]:   <name>instance-00000022</name>
Sep 30 21:21:38 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:21:38 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:21:38 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:21:38 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:       <nova:name>tempest-tempest.common.compute-instance-1609111197</nova:name>
Sep 30 21:21:38 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:21:37</nova:creationTime>
Sep 30 21:21:38 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:21:38 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:21:38 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:21:38 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:21:38 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:21:38 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:21:38 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:21:38 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:21:38 compute-0 nova_compute[192810]:         <nova:user uuid="c536ea061a32492a8c5e6bf941d1c9f3">tempest-AttachInterfacesTestJSON-32534463-project-member</nova:user>
Sep 30 21:21:38 compute-0 nova_compute[192810]:         <nova:project uuid="47d2c796445c4dd3affc8594502f04be">tempest-AttachInterfacesTestJSON-32534463</nova:project>
Sep 30 21:21:38 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:21:38 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:21:38 compute-0 nova_compute[192810]:         <nova:port uuid="ce9edf2f-fd59-406b-b9c3-5b314d995fe9">
Sep 30 21:21:38 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:21:38 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:21:38 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:21:38 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:21:38 compute-0 nova_compute[192810]:     <system>
Sep 30 21:21:38 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:21:38 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:21:38 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:21:38 compute-0 nova_compute[192810]:       <entry name="serial">370a976e-b51f-4187-bf76-bf6cbcae956b</entry>
Sep 30 21:21:38 compute-0 nova_compute[192810]:       <entry name="uuid">370a976e-b51f-4187-bf76-bf6cbcae956b</entry>
Sep 30 21:21:38 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     </system>
Sep 30 21:21:38 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:21:38 compute-0 nova_compute[192810]:   <os>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:   </os>
Sep 30 21:21:38 compute-0 nova_compute[192810]:   <features>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:   </features>
Sep 30 21:21:38 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:21:38 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:21:38 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:21:38 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:21:38 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:21:38 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/370a976e-b51f-4187-bf76-bf6cbcae956b/disk"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:21:38 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/370a976e-b51f-4187-bf76-bf6cbcae956b/disk.config"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:21:38 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:99:d9:88"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:       <target dev="tapce9edf2f-fd"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:21:38 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/370a976e-b51f-4187-bf76-bf6cbcae956b/console.log" append="off"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     <video>
Sep 30 21:21:38 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     </video>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:21:38 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:21:38 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:21:38 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:21:38 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:21:38 compute-0 nova_compute[192810]: </domain>
Sep 30 21:21:38 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:21:38 compute-0 nova_compute[192810]: 2025-09-30 21:21:38.056 2 DEBUG nova.compute.manager [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Preparing to wait for external event network-vif-plugged-ce9edf2f-fd59-406b-b9c3-5b314d995fe9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:21:38 compute-0 nova_compute[192810]: 2025-09-30 21:21:38.056 2 DEBUG oslo_concurrency.lockutils [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Acquiring lock "370a976e-b51f-4187-bf76-bf6cbcae956b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:38 compute-0 nova_compute[192810]: 2025-09-30 21:21:38.057 2 DEBUG oslo_concurrency.lockutils [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lock "370a976e-b51f-4187-bf76-bf6cbcae956b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:38 compute-0 nova_compute[192810]: 2025-09-30 21:21:38.057 2 DEBUG oslo_concurrency.lockutils [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lock "370a976e-b51f-4187-bf76-bf6cbcae956b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:38 compute-0 nova_compute[192810]: 2025-09-30 21:21:38.057 2 DEBUG nova.virt.libvirt.vif [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:21:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1609111197',display_name='tempest-tempest.common.compute-instance-1609111197',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1609111197',id=34,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAiqd8zlUv0daoHrSM4f0FkhoHklAhet2FPUnE56/Ac7ATsAijkYKDWRYPhtHLrbJjDveTvHop3CVY09bPDxSILijQRoZQfSPrdRSYWqRSb8fAb7+uxFNn+ITDg2wp4sFw==',key_name='tempest-keypair-804840075',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='47d2c796445c4dd3affc8594502f04be',ramdisk_id='',reservation_id='r-9ynmfskw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-32534463',owner_user_name='tempest-AttachInterfacesTestJSON-32534463-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:21:32Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c536ea061a32492a8c5e6bf941d1c9f3',uuid=370a976e-b51f-4187-bf76-bf6cbcae956b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ce9edf2f-fd59-406b-b9c3-5b314d995fe9", "address": "fa:16:3e:99:d9:88", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce9edf2f-fd", "ovs_interfaceid": "ce9edf2f-fd59-406b-b9c3-5b314d995fe9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:21:38 compute-0 nova_compute[192810]: 2025-09-30 21:21:38.058 2 DEBUG nova.network.os_vif_util [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Converting VIF {"id": "ce9edf2f-fd59-406b-b9c3-5b314d995fe9", "address": "fa:16:3e:99:d9:88", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce9edf2f-fd", "ovs_interfaceid": "ce9edf2f-fd59-406b-b9c3-5b314d995fe9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:21:38 compute-0 nova_compute[192810]: 2025-09-30 21:21:38.058 2 DEBUG nova.network.os_vif_util [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:99:d9:88,bridge_name='br-int',has_traffic_filtering=True,id=ce9edf2f-fd59-406b-b9c3-5b314d995fe9,network=Network(29d3fdc6-d8e1-4032-8f0c-e91da2912153),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapce9edf2f-fd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:21:38 compute-0 nova_compute[192810]: 2025-09-30 21:21:38.058 2 DEBUG os_vif [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:99:d9:88,bridge_name='br-int',has_traffic_filtering=True,id=ce9edf2f-fd59-406b-b9c3-5b314d995fe9,network=Network(29d3fdc6-d8e1-4032-8f0c-e91da2912153),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapce9edf2f-fd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:21:38 compute-0 nova_compute[192810]: 2025-09-30 21:21:38.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:38 compute-0 nova_compute[192810]: 2025-09-30 21:21:38.059 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:21:38 compute-0 nova_compute[192810]: 2025-09-30 21:21:38.060 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:21:38 compute-0 nova_compute[192810]: 2025-09-30 21:21:38.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:38 compute-0 nova_compute[192810]: 2025-09-30 21:21:38.063 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapce9edf2f-fd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:21:38 compute-0 nova_compute[192810]: 2025-09-30 21:21:38.064 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapce9edf2f-fd, col_values=(('external_ids', {'iface-id': 'ce9edf2f-fd59-406b-b9c3-5b314d995fe9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:99:d9:88', 'vm-uuid': '370a976e-b51f-4187-bf76-bf6cbcae956b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:21:38 compute-0 nova_compute[192810]: 2025-09-30 21:21:38.114 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:38 compute-0 NetworkManager[51733]: <info>  [1759267298.1154] manager: (tapce9edf2f-fd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/58)
Sep 30 21:21:38 compute-0 nova_compute[192810]: 2025-09-30 21:21:38.117 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:21:38 compute-0 nova_compute[192810]: 2025-09-30 21:21:38.121 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:38 compute-0 nova_compute[192810]: 2025-09-30 21:21:38.122 2 INFO os_vif [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:99:d9:88,bridge_name='br-int',has_traffic_filtering=True,id=ce9edf2f-fd59-406b-b9c3-5b314d995fe9,network=Network(29d3fdc6-d8e1-4032-8f0c-e91da2912153),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapce9edf2f-fd')
Sep 30 21:21:38 compute-0 nova_compute[192810]: 2025-09-30 21:21:38.201 2 DEBUG nova.virt.libvirt.driver [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:21:38 compute-0 nova_compute[192810]: 2025-09-30 21:21:38.201 2 DEBUG nova.virt.libvirt.driver [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:21:38 compute-0 nova_compute[192810]: 2025-09-30 21:21:38.201 2 DEBUG nova.virt.libvirt.driver [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] No VIF found with MAC fa:16:3e:99:d9:88, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:21:38 compute-0 nova_compute[192810]: 2025-09-30 21:21:38.202 2 INFO nova.virt.libvirt.driver [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Using config drive
Sep 30 21:21:38 compute-0 sshd-session[224621]: Received disconnect from 45.81.23.80 port 50184:11: Bye Bye [preauth]
Sep 30 21:21:38 compute-0 sshd-session[224621]: Disconnected from authenticating user root 45.81.23.80 port 50184 [preauth]
Sep 30 21:21:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:38.725 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:38.726 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:38.728 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:40 compute-0 nova_compute[192810]: 2025-09-30 21:21:40.341 2 INFO nova.virt.libvirt.driver [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Creating config drive at /var/lib/nova/instances/370a976e-b51f-4187-bf76-bf6cbcae956b/disk.config
Sep 30 21:21:40 compute-0 nova_compute[192810]: 2025-09-30 21:21:40.347 2 DEBUG oslo_concurrency.processutils [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/370a976e-b51f-4187-bf76-bf6cbcae956b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfzezafhj execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:40 compute-0 nova_compute[192810]: 2025-09-30 21:21:40.477 2 DEBUG oslo_concurrency.processutils [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/370a976e-b51f-4187-bf76-bf6cbcae956b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfzezafhj" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:40 compute-0 NetworkManager[51733]: <info>  [1759267300.5300] manager: (tapce9edf2f-fd): new Tun device (/org/freedesktop/NetworkManager/Devices/59)
Sep 30 21:21:40 compute-0 kernel: tapce9edf2f-fd: entered promiscuous mode
Sep 30 21:21:40 compute-0 nova_compute[192810]: 2025-09-30 21:21:40.532 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:40 compute-0 ovn_controller[94912]: 2025-09-30T21:21:40Z|00126|binding|INFO|Claiming lport ce9edf2f-fd59-406b-b9c3-5b314d995fe9 for this chassis.
Sep 30 21:21:40 compute-0 ovn_controller[94912]: 2025-09-30T21:21:40Z|00127|binding|INFO|ce9edf2f-fd59-406b-b9c3-5b314d995fe9: Claiming fa:16:3e:99:d9:88 10.100.0.4
Sep 30 21:21:40 compute-0 nova_compute[192810]: 2025-09-30 21:21:40.539 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:40 compute-0 nova_compute[192810]: 2025-09-30 21:21:40.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:40 compute-0 nova_compute[192810]: 2025-09-30 21:21:40.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:40 compute-0 NetworkManager[51733]: <info>  [1759267300.5498] manager: (patch-br-int-to-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/60)
Sep 30 21:21:40 compute-0 NetworkManager[51733]: <info>  [1759267300.5503] device (patch-br-int-to-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 21:21:40 compute-0 NetworkManager[51733]: <info>  [1759267300.5519] manager: (patch-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/61)
Sep 30 21:21:40 compute-0 NetworkManager[51733]: <info>  [1759267300.5523] device (patch-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 21:21:40 compute-0 NetworkManager[51733]: <info>  [1759267300.5533] manager: (patch-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/62)
Sep 30 21:21:40 compute-0 NetworkManager[51733]: <info>  [1759267300.5540] manager: (patch-br-int-to-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/63)
Sep 30 21:21:40 compute-0 NetworkManager[51733]: <info>  [1759267300.5546] device (patch-br-int-to-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Sep 30 21:21:40 compute-0 NetworkManager[51733]: <info>  [1759267300.5549] device (patch-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:40.560 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:99:d9:88 10.100.0.4'], port_security=['fa:16:3e:99:d9:88 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '370a976e-b51f-4187-bf76-bf6cbcae956b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-29d3fdc6-d8e1-4032-8f0c-e91da2912153', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '47d2c796445c4dd3affc8594502f04be', 'neutron:revision_number': '2', 'neutron:security_group_ids': '69ef8d59-f66f-4bdf-9235-591ecebd585e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4f6d308b-b549-4733-b51f-a3dd42be0f08, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=ce9edf2f-fd59-406b-b9c3-5b314d995fe9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:40.561 103867 INFO neutron.agent.ovn.metadata.agent [-] Port ce9edf2f-fd59-406b-b9c3-5b314d995fe9 in datapath 29d3fdc6-d8e1-4032-8f0c-e91da2912153 bound to our chassis
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:40.563 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 29d3fdc6-d8e1-4032-8f0c-e91da2912153
Sep 30 21:21:40 compute-0 systemd-udevd[224682]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:21:40 compute-0 NetworkManager[51733]: <info>  [1759267300.5767] device (tapce9edf2f-fd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:21:40 compute-0 NetworkManager[51733]: <info>  [1759267300.5777] device (tapce9edf2f-fd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:40.576 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[77985ae8-e1d1-4bdb-b07d-172512d65cca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:40.578 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap29d3fdc6-d1 in ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:21:40 compute-0 systemd-machined[152794]: New machine qemu-14-instance-00000022.
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:40.580 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap29d3fdc6-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:40.580 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[d0c5c033-0640-4369-9e27-f08577cb02ef]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:40.581 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[12fc973f-d01f-47f5-bc22-f3603cb86f14]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:40.595 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[7d5aa853-0156-4e88-904c-a5dde2c23a82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:40.623 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[53ac77f9-f78e-4fc8-9764-4758daaaadbb]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:40.649 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[a2810aca-60ea-4fa9-a892-c4baf8158cf0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:40.664 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[88ce9694-11bf-48a8-84d5-4e67765b2d72]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:40 compute-0 NetworkManager[51733]: <info>  [1759267300.6657] manager: (tap29d3fdc6-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/64)
Sep 30 21:21:40 compute-0 systemd[1]: Started Virtual Machine qemu-14-instance-00000022.
Sep 30 21:21:40 compute-0 nova_compute[192810]: 2025-09-30 21:21:40.686 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:40 compute-0 ovn_controller[94912]: 2025-09-30T21:21:40Z|00128|binding|INFO|Releasing lport 49c3a39c-1855-4432-8459-a8f7bc7429b4 from this chassis (sb_readonly=0)
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:40.699 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[0776c7c5-a099-4986-ad29-befeb199ec5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:40.702 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[db6f0c6d-c037-43ea-8960-4f14fc1d0371]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:40 compute-0 nova_compute[192810]: 2025-09-30 21:21:40.706 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:40 compute-0 ovn_controller[94912]: 2025-09-30T21:21:40Z|00129|binding|INFO|Setting lport ce9edf2f-fd59-406b-b9c3-5b314d995fe9 ovn-installed in OVS
Sep 30 21:21:40 compute-0 ovn_controller[94912]: 2025-09-30T21:21:40Z|00130|binding|INFO|Setting lport ce9edf2f-fd59-406b-b9c3-5b314d995fe9 up in Southbound
Sep 30 21:21:40 compute-0 nova_compute[192810]: 2025-09-30 21:21:40.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:40 compute-0 NetworkManager[51733]: <info>  [1759267300.7269] device (tap29d3fdc6-d0): carrier: link connected
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:40.731 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[99eb66ff-85d4-4208-b688-38f847156eee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:40.749 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[cf34dd55-f5bf-4d07-81de-562b994d213e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap29d3fdc6-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:10:f7:fe'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 399635, 'reachable_time': 15623, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224716, 'error': None, 'target': 'ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:40.766 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[43788ccb-3386-480c-85bb-3312cf64e368]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe10:f7fe'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 399635, 'tstamp': 399635}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224717, 'error': None, 'target': 'ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:40 compute-0 nova_compute[192810]: 2025-09-30 21:21:40.783 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:40.784 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[2db4b198-345f-43b8-b033-cded72a3fdde]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap29d3fdc6-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:10:f7:fe'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 399635, 'reachable_time': 15623, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224718, 'error': None, 'target': 'ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:40 compute-0 nova_compute[192810]: 2025-09-30 21:21:40.786 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:21:40 compute-0 nova_compute[192810]: 2025-09-30 21:21:40.786 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:40.822 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[4eba7f77-36fa-42b0-8e62-709747215db4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:40 compute-0 nova_compute[192810]: 2025-09-30 21:21:40.826 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:40 compute-0 nova_compute[192810]: 2025-09-30 21:21:40.827 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:40 compute-0 nova_compute[192810]: 2025-09-30 21:21:40.827 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:40 compute-0 nova_compute[192810]: 2025-09-30 21:21:40.827 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:40.899 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[ffb46398-e6c7-4c9b-ada0-59c8a7d1acc7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:40.901 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap29d3fdc6-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:40.901 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:40.902 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap29d3fdc6-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:21:40 compute-0 NetworkManager[51733]: <info>  [1759267300.9038] manager: (tap29d3fdc6-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/65)
Sep 30 21:21:40 compute-0 kernel: tap29d3fdc6-d0: entered promiscuous mode
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:40.907 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap29d3fdc6-d0, col_values=(('external_ids', {'iface-id': 'ad63e4cf-251e-40e7-aea0-9713eaa58a32'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:21:40 compute-0 nova_compute[192810]: 2025-09-30 21:21:40.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:40 compute-0 ovn_controller[94912]: 2025-09-30T21:21:40Z|00131|binding|INFO|Releasing lport ad63e4cf-251e-40e7-aea0-9713eaa58a32 from this chassis (sb_readonly=0)
Sep 30 21:21:40 compute-0 nova_compute[192810]: 2025-09-30 21:21:40.922 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:40.922 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/29d3fdc6-d8e1-4032-8f0c-e91da2912153.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/29d3fdc6-d8e1-4032-8f0c-e91da2912153.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:40.923 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[b809a4f5-9a19-4591-9a0e-64619a8049e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:40.924 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-29d3fdc6-d8e1-4032-8f0c-e91da2912153
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/29d3fdc6-d8e1-4032-8f0c-e91da2912153.pid.haproxy
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID 29d3fdc6-d8e1-4032-8f0c-e91da2912153
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:21:40 compute-0 nova_compute[192810]: 2025-09-30 21:21:40.925 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:40.926 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153', 'env', 'PROCESS_TAG=haproxy-29d3fdc6-d8e1-4032-8f0c-e91da2912153', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/29d3fdc6-d8e1-4032-8f0c-e91da2912153.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:21:40 compute-0 nova_compute[192810]: 2025-09-30 21:21:40.990 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:40 compute-0 nova_compute[192810]: 2025-09-30 21:21:40.991 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.022 2 DEBUG nova.compute.manager [req-0fae433b-b276-4422-a99d-96f36713f99d req-cb3b4f5e-62a6-4e00-baea-6f47cedf6c26 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Received event network-vif-plugged-ce9edf2f-fd59-406b-b9c3-5b314d995fe9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.023 2 DEBUG oslo_concurrency.lockutils [req-0fae433b-b276-4422-a99d-96f36713f99d req-cb3b4f5e-62a6-4e00-baea-6f47cedf6c26 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "370a976e-b51f-4187-bf76-bf6cbcae956b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.024 2 DEBUG oslo_concurrency.lockutils [req-0fae433b-b276-4422-a99d-96f36713f99d req-cb3b4f5e-62a6-4e00-baea-6f47cedf6c26 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "370a976e-b51f-4187-bf76-bf6cbcae956b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.024 2 DEBUG oslo_concurrency.lockutils [req-0fae433b-b276-4422-a99d-96f36713f99d req-cb3b4f5e-62a6-4e00-baea-6f47cedf6c26 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "370a976e-b51f-4187-bf76-bf6cbcae956b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.025 2 DEBUG nova.compute.manager [req-0fae433b-b276-4422-a99d-96f36713f99d req-cb3b4f5e-62a6-4e00-baea-6f47cedf6c26 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Processing event network-vif-plugged-ce9edf2f-fd59-406b-b9c3-5b314d995fe9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.080 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.089 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/370a976e-b51f-4187-bf76-bf6cbcae956b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.159 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/370a976e-b51f-4187-bf76-bf6cbcae956b/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.160 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/370a976e-b51f-4187-bf76-bf6cbcae956b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.218 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/370a976e-b51f-4187-bf76-bf6cbcae956b/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.227 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3d12b579-88e1-415f-b0a2-0e8b17628076/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.289 2 DEBUG nova.network.neutron [req-68cffd4d-a366-4141-bdf2-e97613376593 req-c18ced8d-244d-42b0-8613-d569330634bd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Updated VIF entry in instance network info cache for port ce9edf2f-fd59-406b-b9c3-5b314d995fe9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.290 2 DEBUG nova.network.neutron [req-68cffd4d-a366-4141-bdf2-e97613376593 req-c18ced8d-244d-42b0-8613-d569330634bd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Updating instance_info_cache with network_info: [{"id": "ce9edf2f-fd59-406b-b9c3-5b314d995fe9", "address": "fa:16:3e:99:d9:88", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce9edf2f-fd", "ovs_interfaceid": "ce9edf2f-fd59-406b-b9c3-5b314d995fe9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.307 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3d12b579-88e1-415f-b0a2-0e8b17628076/disk --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.307 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3d12b579-88e1-415f-b0a2-0e8b17628076/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:41 compute-0 podman[224770]: 2025-09-30 21:21:41.310898174 +0000 UTC m=+0.056595840 container create 69cf564ca81ae2577c9150bda73f69d611fcd7ae2485e7be7e34c922f28ac172 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3)
Sep 30 21:21:41 compute-0 podman[224777]: 2025-09-30 21:21:41.332005049 +0000 UTC m=+0.062728172 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.buildah.version=1.33.7, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, architecture=x86_64, name=ubi9-minimal, distribution-scope=public, release=1755695350, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 21:21:41 compute-0 podman[224771]: 2025-09-30 21:21:41.343707991 +0000 UTC m=+0.067637785 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.350 2 DEBUG oslo_concurrency.lockutils [req-68cffd4d-a366-4141-bdf2-e97613376593 req-c18ced8d-244d-42b0-8613-d569330634bd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-370a976e-b51f-4187-bf76-bf6cbcae956b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:21:41 compute-0 systemd[1]: Started libpod-conmon-69cf564ca81ae2577c9150bda73f69d611fcd7ae2485e7be7e34c922f28ac172.scope.
Sep 30 21:21:41 compute-0 podman[224770]: 2025-09-30 21:21:41.275095583 +0000 UTC m=+0.020793279 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:21:41 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.385 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3d12b579-88e1-415f-b0a2-0e8b17628076/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c5b1b59b26a28491a52d35aee35638e3df8b2a50f12eed8a0e32f3b8650d6797/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:21:41 compute-0 podman[224770]: 2025-09-30 21:21:41.403489679 +0000 UTC m=+0.149187355 container init 69cf564ca81ae2577c9150bda73f69d611fcd7ae2485e7be7e34c922f28ac172 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Sep 30 21:21:41 compute-0 podman[224770]: 2025-09-30 21:21:41.40833089 +0000 UTC m=+0.154028556 container start 69cf564ca81ae2577c9150bda73f69d611fcd7ae2485e7be7e34c922f28ac172 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:21:41 compute-0 neutron-haproxy-ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153[224829]: [NOTICE]   (224836) : New worker (224838) forked
Sep 30 21:21:41 compute-0 neutron-haproxy-ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153[224829]: [NOTICE]   (224836) : Loading success.
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.526 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267301.5265098, 370a976e-b51f-4187-bf76-bf6cbcae956b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.527 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] VM Started (Lifecycle Event)
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.530 2 DEBUG nova.compute.manager [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.533 2 DEBUG nova.virt.libvirt.driver [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.536 2 INFO nova.virt.libvirt.driver [-] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Instance spawned successfully.
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.536 2 DEBUG nova.virt.libvirt.driver [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.583 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.584 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5382MB free_disk=73.40446853637695GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.585 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.585 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.604 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.606 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.612 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.616 2 DEBUG nova.virt.libvirt.driver [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.616 2 DEBUG nova.virt.libvirt.driver [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.616 2 DEBUG nova.virt.libvirt.driver [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.617 2 DEBUG nova.virt.libvirt.driver [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.617 2 DEBUG nova.virt.libvirt.driver [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.617 2 DEBUG nova.virt.libvirt.driver [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.686 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.686 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267301.5275414, 370a976e-b51f-4187-bf76-bf6cbcae956b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.686 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] VM Paused (Lifecycle Event)
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.745 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.748 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267301.5342443, 370a976e-b51f-4187-bf76-bf6cbcae956b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.749 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] VM Resumed (Lifecycle Event)
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.776 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.777 2 INFO nova.compute.manager [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Took 9.01 seconds to spawn the instance on the hypervisor.
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.778 2 DEBUG nova.compute.manager [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.781 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.815 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.905 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Instance c79f1121-fcfb-4c07-94cf-1389e1df9e81 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.905 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Instance 3d12b579-88e1-415f-b0a2-0e8b17628076 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.906 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Instance 370a976e-b51f-4187-bf76-bf6cbcae956b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.906 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.906 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.913 2 INFO nova.compute.manager [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Took 9.66 seconds to build instance.
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.929 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Refreshing inventories for resource provider fe423b93-de5a-41f7-97d1-9622ea46af54 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.941 2 DEBUG oslo_concurrency.lockutils [None req-04430d7b-b0d4-4d48-8293-20e05babae73 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lock "370a976e-b51f-4187-bf76-bf6cbcae956b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.783s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.944 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Updating ProviderTree inventory for provider fe423b93-de5a-41f7-97d1-9622ea46af54 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.945 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Updating inventory in ProviderTree for provider fe423b93-de5a-41f7-97d1-9622ea46af54 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.957 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Refreshing aggregate associations for resource provider fe423b93-de5a-41f7-97d1-9622ea46af54, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Sep 30 21:21:41 compute-0 nova_compute[192810]: 2025-09-30 21:21:41.994 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Refreshing trait associations for resource provider fe423b93-de5a-41f7-97d1-9622ea46af54, traits: COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_RESCUE_BFV,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_1_2,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE42,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Sep 30 21:21:42 compute-0 nova_compute[192810]: 2025-09-30 21:21:42.074 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:21:42 compute-0 nova_compute[192810]: 2025-09-30 21:21:42.090 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:21:42 compute-0 nova_compute[192810]: 2025-09-30 21:21:42.131 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:21:42 compute-0 nova_compute[192810]: 2025-09-30 21:21:42.131 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.546s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:42 compute-0 ovn_controller[94912]: 2025-09-30T21:21:42Z|00132|binding|INFO|Releasing lport 49c3a39c-1855-4432-8459-a8f7bc7429b4 from this chassis (sb_readonly=0)
Sep 30 21:21:42 compute-0 ovn_controller[94912]: 2025-09-30T21:21:42Z|00133|binding|INFO|Releasing lport ad63e4cf-251e-40e7-aea0-9713eaa58a32 from this chassis (sb_readonly=0)
Sep 30 21:21:42 compute-0 nova_compute[192810]: 2025-09-30 21:21:42.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:43 compute-0 nova_compute[192810]: 2025-09-30 21:21:43.115 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:43 compute-0 nova_compute[192810]: 2025-09-30 21:21:43.208 2 DEBUG nova.compute.manager [req-3c538ad3-fcb7-40c6-a25f-1094c65e4ff4 req-b232048f-0914-476e-9437-cb56810305d5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Received event network-vif-plugged-ce9edf2f-fd59-406b-b9c3-5b314d995fe9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:21:43 compute-0 nova_compute[192810]: 2025-09-30 21:21:43.208 2 DEBUG oslo_concurrency.lockutils [req-3c538ad3-fcb7-40c6-a25f-1094c65e4ff4 req-b232048f-0914-476e-9437-cb56810305d5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "370a976e-b51f-4187-bf76-bf6cbcae956b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:43 compute-0 nova_compute[192810]: 2025-09-30 21:21:43.208 2 DEBUG oslo_concurrency.lockutils [req-3c538ad3-fcb7-40c6-a25f-1094c65e4ff4 req-b232048f-0914-476e-9437-cb56810305d5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "370a976e-b51f-4187-bf76-bf6cbcae956b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:43 compute-0 nova_compute[192810]: 2025-09-30 21:21:43.209 2 DEBUG oslo_concurrency.lockutils [req-3c538ad3-fcb7-40c6-a25f-1094c65e4ff4 req-b232048f-0914-476e-9437-cb56810305d5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "370a976e-b51f-4187-bf76-bf6cbcae956b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:43 compute-0 nova_compute[192810]: 2025-09-30 21:21:43.209 2 DEBUG nova.compute.manager [req-3c538ad3-fcb7-40c6-a25f-1094c65e4ff4 req-b232048f-0914-476e-9437-cb56810305d5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] No waiting events found dispatching network-vif-plugged-ce9edf2f-fd59-406b-b9c3-5b314d995fe9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:21:43 compute-0 nova_compute[192810]: 2025-09-30 21:21:43.209 2 WARNING nova.compute.manager [req-3c538ad3-fcb7-40c6-a25f-1094c65e4ff4 req-b232048f-0914-476e-9437-cb56810305d5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Received unexpected event network-vif-plugged-ce9edf2f-fd59-406b-b9c3-5b314d995fe9 for instance with vm_state active and task_state None.
Sep 30 21:21:44 compute-0 nova_compute[192810]: 2025-09-30 21:21:44.132 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:21:44 compute-0 nova_compute[192810]: 2025-09-30 21:21:44.133 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:21:44 compute-0 nova_compute[192810]: 2025-09-30 21:21:44.134 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:21:44 compute-0 nova_compute[192810]: 2025-09-30 21:21:44.379 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "refresh_cache-c79f1121-fcfb-4c07-94cf-1389e1df9e81" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:21:44 compute-0 nova_compute[192810]: 2025-09-30 21:21:44.380 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquired lock "refresh_cache-c79f1121-fcfb-4c07-94cf-1389e1df9e81" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:21:44 compute-0 nova_compute[192810]: 2025-09-30 21:21:44.380 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Sep 30 21:21:44 compute-0 nova_compute[192810]: 2025-09-30 21:21:44.380 2 DEBUG nova.objects.instance [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lazy-loading 'info_cache' on Instance uuid c79f1121-fcfb-4c07-94cf-1389e1df9e81 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:21:45 compute-0 nova_compute[192810]: 2025-09-30 21:21:45.468 2 INFO nova.compute.manager [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Rebuilding instance
Sep 30 21:21:45 compute-0 nova_compute[192810]: 2025-09-30 21:21:45.857 2 DEBUG nova.compute.manager [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:21:45 compute-0 nova_compute[192810]: 2025-09-30 21:21:45.952 2 DEBUG nova.objects.instance [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lazy-loading 'pci_requests' on Instance uuid c79f1121-fcfb-4c07-94cf-1389e1df9e81 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:21:45 compute-0 nova_compute[192810]: 2025-09-30 21:21:45.966 2 DEBUG nova.objects.instance [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lazy-loading 'pci_devices' on Instance uuid c79f1121-fcfb-4c07-94cf-1389e1df9e81 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:21:45 compute-0 nova_compute[192810]: 2025-09-30 21:21:45.977 2 DEBUG nova.objects.instance [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lazy-loading 'resources' on Instance uuid c79f1121-fcfb-4c07-94cf-1389e1df9e81 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:21:45 compute-0 nova_compute[192810]: 2025-09-30 21:21:45.990 2 DEBUG nova.objects.instance [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lazy-loading 'migration_context' on Instance uuid c79f1121-fcfb-4c07-94cf-1389e1df9e81 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:21:46 compute-0 nova_compute[192810]: 2025-09-30 21:21:46.001 2 DEBUG nova.objects.instance [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Sep 30 21:21:46 compute-0 nova_compute[192810]: 2025-09-30 21:21:46.005 2 DEBUG nova.virt.libvirt.driver [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Sep 30 21:21:46 compute-0 nova_compute[192810]: 2025-09-30 21:21:46.200 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Updating instance_info_cache with network_info: [{"id": "0ad4c684-4514-4550-b708-6339f933766c", "address": "fa:16:3e:c1:12:42", "network": {"id": "7ae61a95-12e5-48ee-93a0-85e12f8652eb", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-47213222-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "54fb4b43d65542abbb73044c6d52da8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ad4c684-45", "ovs_interfaceid": "0ad4c684-4514-4550-b708-6339f933766c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:21:46 compute-0 nova_compute[192810]: 2025-09-30 21:21:46.220 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Releasing lock "refresh_cache-c79f1121-fcfb-4c07-94cf-1389e1df9e81" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:21:46 compute-0 nova_compute[192810]: 2025-09-30 21:21:46.221 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Sep 30 21:21:46 compute-0 nova_compute[192810]: 2025-09-30 21:21:46.221 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:21:46 compute-0 nova_compute[192810]: 2025-09-30 21:21:46.610 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:47 compute-0 nova_compute[192810]: 2025-09-30 21:21:47.443 2 DEBUG nova.compute.manager [req-3bc14c4f-d9e8-479b-91d8-9e684e93c9f7 req-60439c15-2bd5-43ae-9c35-715ef4b1d06c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Received event network-changed-ce9edf2f-fd59-406b-b9c3-5b314d995fe9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:21:47 compute-0 nova_compute[192810]: 2025-09-30 21:21:47.443 2 DEBUG nova.compute.manager [req-3bc14c4f-d9e8-479b-91d8-9e684e93c9f7 req-60439c15-2bd5-43ae-9c35-715ef4b1d06c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Refreshing instance network info cache due to event network-changed-ce9edf2f-fd59-406b-b9c3-5b314d995fe9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:21:47 compute-0 nova_compute[192810]: 2025-09-30 21:21:47.443 2 DEBUG oslo_concurrency.lockutils [req-3bc14c4f-d9e8-479b-91d8-9e684e93c9f7 req-60439c15-2bd5-43ae-9c35-715ef4b1d06c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-370a976e-b51f-4187-bf76-bf6cbcae956b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:21:47 compute-0 nova_compute[192810]: 2025-09-30 21:21:47.444 2 DEBUG oslo_concurrency.lockutils [req-3bc14c4f-d9e8-479b-91d8-9e684e93c9f7 req-60439c15-2bd5-43ae-9c35-715ef4b1d06c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-370a976e-b51f-4187-bf76-bf6cbcae956b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:21:47 compute-0 nova_compute[192810]: 2025-09-30 21:21:47.444 2 DEBUG nova.network.neutron [req-3bc14c4f-d9e8-479b-91d8-9e684e93c9f7 req-60439c15-2bd5-43ae-9c35-715ef4b1d06c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Refreshing network info cache for port ce9edf2f-fd59-406b-b9c3-5b314d995fe9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:21:48 compute-0 nova_compute[192810]: 2025-09-30 21:21:48.118 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:48 compute-0 kernel: tap0ad4c684-45 (unregistering): left promiscuous mode
Sep 30 21:21:48 compute-0 NetworkManager[51733]: <info>  [1759267308.2021] device (tap0ad4c684-45): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:21:48 compute-0 nova_compute[192810]: 2025-09-30 21:21:48.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:48 compute-0 ovn_controller[94912]: 2025-09-30T21:21:48Z|00134|binding|INFO|Releasing lport 0ad4c684-4514-4550-b708-6339f933766c from this chassis (sb_readonly=0)
Sep 30 21:21:48 compute-0 ovn_controller[94912]: 2025-09-30T21:21:48Z|00135|binding|INFO|Setting lport 0ad4c684-4514-4550-b708-6339f933766c down in Southbound
Sep 30 21:21:48 compute-0 ovn_controller[94912]: 2025-09-30T21:21:48Z|00136|binding|INFO|Removing iface tap0ad4c684-45 ovn-installed in OVS
Sep 30 21:21:48 compute-0 nova_compute[192810]: 2025-09-30 21:21:48.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:48 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:48.241 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:12:42 10.100.0.14'], port_security=['fa:16:3e:c1:12:42 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'c79f1121-fcfb-4c07-94cf-1389e1df9e81', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7ae61a95-12e5-48ee-93a0-85e12f8652eb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '54fb4b43d65542abbb73044c6d52da8a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0c0ea76a-15be-4f11-88b8-d75d482afe6f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c761993a-b179-4f1b-8dc7-b7ebb1a3e32b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=0ad4c684-4514-4550-b708-6339f933766c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:21:48 compute-0 nova_compute[192810]: 2025-09-30 21:21:48.243 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:48 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:48.245 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 0ad4c684-4514-4550-b708-6339f933766c in datapath 7ae61a95-12e5-48ee-93a0-85e12f8652eb unbound from our chassis
Sep 30 21:21:48 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:48.248 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7ae61a95-12e5-48ee-93a0-85e12f8652eb
Sep 30 21:21:48 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:48.277 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[c64af43b-8355-477f-9d00-1b529c7f7226]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:48 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000001c.scope: Deactivated successfully.
Sep 30 21:21:48 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000001c.scope: Consumed 13.593s CPU time.
Sep 30 21:21:48 compute-0 systemd-machined[152794]: Machine qemu-12-instance-0000001c terminated.
Sep 30 21:21:48 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:48.319 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[9fb5ed03-d949-440d-8800-415ada102160]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:48 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:48.324 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[144d068f-042a-4779-b1b9-99cc89e54e0a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:48 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:48.358 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[9c17bcb5-d07c-4920-ae15-64b3f5f7e9ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:48 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:48.388 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[f6c59a15-3b1e-4e60-8eeb-2ec02c4526b3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7ae61a95-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:55:1a:c2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 7, 'rx_bytes': 1000, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 7, 'rx_bytes': 1000, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 396566, 'reachable_time': 17378, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224864, 'error': None, 'target': 'ovnmeta-7ae61a95-12e5-48ee-93a0-85e12f8652eb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:48 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:48.414 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[f3ca6007-e66c-47d8-923d-b62eed676c6e]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap7ae61a95-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 396582, 'tstamp': 396582}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224865, 'error': None, 'target': 'ovnmeta-7ae61a95-12e5-48ee-93a0-85e12f8652eb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap7ae61a95-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 396587, 'tstamp': 396587}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224865, 'error': None, 'target': 'ovnmeta-7ae61a95-12e5-48ee-93a0-85e12f8652eb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:48 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:48.417 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7ae61a95-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:21:48 compute-0 nova_compute[192810]: 2025-09-30 21:21:48.419 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:48 compute-0 nova_compute[192810]: 2025-09-30 21:21:48.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:48 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:48.425 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7ae61a95-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:21:48 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:48.426 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:21:48 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:48.426 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7ae61a95-10, col_values=(('external_ids', {'iface-id': '49c3a39c-1855-4432-8459-a8f7bc7429b4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:21:48 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:48.427 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:21:48 compute-0 nova_compute[192810]: 2025-09-30 21:21:48.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:48 compute-0 nova_compute[192810]: 2025-09-30 21:21:48.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:48 compute-0 nova_compute[192810]: 2025-09-30 21:21:48.497 2 DEBUG nova.compute.manager [req-6715e419-8161-4031-b3e8-0f518e39f9c0 req-05c57348-e4de-4f62-ad4e-a8666cd253b7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Received event network-vif-unplugged-0ad4c684-4514-4550-b708-6339f933766c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:21:48 compute-0 nova_compute[192810]: 2025-09-30 21:21:48.498 2 DEBUG oslo_concurrency.lockutils [req-6715e419-8161-4031-b3e8-0f518e39f9c0 req-05c57348-e4de-4f62-ad4e-a8666cd253b7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:48 compute-0 nova_compute[192810]: 2025-09-30 21:21:48.498 2 DEBUG oslo_concurrency.lockutils [req-6715e419-8161-4031-b3e8-0f518e39f9c0 req-05c57348-e4de-4f62-ad4e-a8666cd253b7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:48 compute-0 nova_compute[192810]: 2025-09-30 21:21:48.498 2 DEBUG oslo_concurrency.lockutils [req-6715e419-8161-4031-b3e8-0f518e39f9c0 req-05c57348-e4de-4f62-ad4e-a8666cd253b7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:48 compute-0 nova_compute[192810]: 2025-09-30 21:21:48.499 2 DEBUG nova.compute.manager [req-6715e419-8161-4031-b3e8-0f518e39f9c0 req-05c57348-e4de-4f62-ad4e-a8666cd253b7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] No waiting events found dispatching network-vif-unplugged-0ad4c684-4514-4550-b708-6339f933766c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:21:48 compute-0 nova_compute[192810]: 2025-09-30 21:21:48.499 2 WARNING nova.compute.manager [req-6715e419-8161-4031-b3e8-0f518e39f9c0 req-05c57348-e4de-4f62-ad4e-a8666cd253b7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Received unexpected event network-vif-unplugged-0ad4c684-4514-4550-b708-6339f933766c for instance with vm_state error and task_state rebuilding.
Sep 30 21:21:48 compute-0 podman[224880]: 2025-09-30 21:21:48.591390872 +0000 UTC m=+0.069141762 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 21:21:48 compute-0 nova_compute[192810]: 2025-09-30 21:21:48.851 2 DEBUG nova.compute.manager [req-bf48e930-29f1-4ae2-bcb6-ab465020cd40 req-8878a04a-2bce-4056-beb6-961554fadd50 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Received event network-changed-ce9edf2f-fd59-406b-b9c3-5b314d995fe9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:21:48 compute-0 nova_compute[192810]: 2025-09-30 21:21:48.852 2 DEBUG nova.compute.manager [req-bf48e930-29f1-4ae2-bcb6-ab465020cd40 req-8878a04a-2bce-4056-beb6-961554fadd50 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Refreshing instance network info cache due to event network-changed-ce9edf2f-fd59-406b-b9c3-5b314d995fe9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:21:48 compute-0 nova_compute[192810]: 2025-09-30 21:21:48.853 2 DEBUG oslo_concurrency.lockutils [req-bf48e930-29f1-4ae2-bcb6-ab465020cd40 req-8878a04a-2bce-4056-beb6-961554fadd50 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-370a976e-b51f-4187-bf76-bf6cbcae956b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:21:49 compute-0 nova_compute[192810]: 2025-09-30 21:21:49.026 2 INFO nova.virt.libvirt.driver [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Instance shutdown successfully after 3 seconds.
Sep 30 21:21:49 compute-0 nova_compute[192810]: 2025-09-30 21:21:49.031 2 INFO nova.virt.libvirt.driver [-] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Instance destroyed successfully.
Sep 30 21:21:49 compute-0 nova_compute[192810]: 2025-09-30 21:21:49.037 2 INFO nova.virt.libvirt.driver [-] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Instance destroyed successfully.
Sep 30 21:21:49 compute-0 nova_compute[192810]: 2025-09-30 21:21:49.039 2 DEBUG nova.virt.libvirt.vif [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:21:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-988999069',display_name='tempest-ServersAdminTestJSON-server-988999069',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-988999069',id=28,image_ref='29834554-3ec3-4459-bfde-932aa778e979',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:21:11Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='54fb4b43d65542abbb73044c6d52da8a',ramdisk_id='',reservation_id='r-1cnrm9nr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='29834554-3ec3-4459-bfde-932aa778e979',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-302955212',owner_user_name='tempest-ServersAdminTestJSON-302955212-project-member'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:21:44Z,user_data=None,user_id='fdcdab028fdf46d7ba6634c631d3d33c',uuid=c79f1121-fcfb-4c07-94cf-1389e1df9e81,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='error') vif={"id": "0ad4c684-4514-4550-b708-6339f933766c", "address": "fa:16:3e:c1:12:42", "network": {"id": "7ae61a95-12e5-48ee-93a0-85e12f8652eb", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-47213222-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "54fb4b43d65542abbb73044c6d52da8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ad4c684-45", "ovs_interfaceid": "0ad4c684-4514-4550-b708-6339f933766c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:21:49 compute-0 nova_compute[192810]: 2025-09-30 21:21:49.040 2 DEBUG nova.network.os_vif_util [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Converting VIF {"id": "0ad4c684-4514-4550-b708-6339f933766c", "address": "fa:16:3e:c1:12:42", "network": {"id": "7ae61a95-12e5-48ee-93a0-85e12f8652eb", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-47213222-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "54fb4b43d65542abbb73044c6d52da8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ad4c684-45", "ovs_interfaceid": "0ad4c684-4514-4550-b708-6339f933766c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:21:49 compute-0 nova_compute[192810]: 2025-09-30 21:21:49.041 2 DEBUG nova.network.os_vif_util [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c1:12:42,bridge_name='br-int',has_traffic_filtering=True,id=0ad4c684-4514-4550-b708-6339f933766c,network=Network(7ae61a95-12e5-48ee-93a0-85e12f8652eb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ad4c684-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:21:49 compute-0 nova_compute[192810]: 2025-09-30 21:21:49.042 2 DEBUG os_vif [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:12:42,bridge_name='br-int',has_traffic_filtering=True,id=0ad4c684-4514-4550-b708-6339f933766c,network=Network(7ae61a95-12e5-48ee-93a0-85e12f8652eb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ad4c684-45') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:21:49 compute-0 nova_compute[192810]: 2025-09-30 21:21:49.045 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:49 compute-0 nova_compute[192810]: 2025-09-30 21:21:49.046 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0ad4c684-45, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:21:49 compute-0 nova_compute[192810]: 2025-09-30 21:21:49.049 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:49 compute-0 nova_compute[192810]: 2025-09-30 21:21:49.053 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:21:49 compute-0 nova_compute[192810]: 2025-09-30 21:21:49.056 2 INFO os_vif [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:12:42,bridge_name='br-int',has_traffic_filtering=True,id=0ad4c684-4514-4550-b708-6339f933766c,network=Network(7ae61a95-12e5-48ee-93a0-85e12f8652eb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ad4c684-45')
Sep 30 21:21:49 compute-0 nova_compute[192810]: 2025-09-30 21:21:49.057 2 INFO nova.virt.libvirt.driver [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Deleting instance files /var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81_del
Sep 30 21:21:49 compute-0 nova_compute[192810]: 2025-09-30 21:21:49.057 2 INFO nova.virt.libvirt.driver [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Deletion of /var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81_del complete
Sep 30 21:21:49 compute-0 nova_compute[192810]: 2025-09-30 21:21:49.310 2 DEBUG nova.virt.libvirt.driver [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:21:49 compute-0 nova_compute[192810]: 2025-09-30 21:21:49.312 2 INFO nova.virt.libvirt.driver [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Creating image(s)
Sep 30 21:21:49 compute-0 nova_compute[192810]: 2025-09-30 21:21:49.313 2 DEBUG oslo_concurrency.lockutils [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Acquiring lock "/var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:49 compute-0 nova_compute[192810]: 2025-09-30 21:21:49.314 2 DEBUG oslo_concurrency.lockutils [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "/var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:49 compute-0 nova_compute[192810]: 2025-09-30 21:21:49.314 2 DEBUG oslo_concurrency.lockutils [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "/var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:49 compute-0 nova_compute[192810]: 2025-09-30 21:21:49.315 2 DEBUG oslo_concurrency.lockutils [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Acquiring lock "d794a27f8e0bfa3eee9759fbfddd316a7671c61e" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:49 compute-0 nova_compute[192810]: 2025-09-30 21:21:49.315 2 DEBUG oslo_concurrency.lockutils [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "d794a27f8e0bfa3eee9759fbfddd316a7671c61e" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:49 compute-0 nova_compute[192810]: 2025-09-30 21:21:49.731 2 DEBUG nova.network.neutron [req-3bc14c4f-d9e8-479b-91d8-9e684e93c9f7 req-60439c15-2bd5-43ae-9c35-715ef4b1d06c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Updated VIF entry in instance network info cache for port ce9edf2f-fd59-406b-b9c3-5b314d995fe9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:21:49 compute-0 nova_compute[192810]: 2025-09-30 21:21:49.732 2 DEBUG nova.network.neutron [req-3bc14c4f-d9e8-479b-91d8-9e684e93c9f7 req-60439c15-2bd5-43ae-9c35-715ef4b1d06c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Updating instance_info_cache with network_info: [{"id": "ce9edf2f-fd59-406b-b9c3-5b314d995fe9", "address": "fa:16:3e:99:d9:88", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce9edf2f-fd", "ovs_interfaceid": "ce9edf2f-fd59-406b-b9c3-5b314d995fe9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:21:49 compute-0 nova_compute[192810]: 2025-09-30 21:21:49.768 2 DEBUG oslo_concurrency.lockutils [req-3bc14c4f-d9e8-479b-91d8-9e684e93c9f7 req-60439c15-2bd5-43ae-9c35-715ef4b1d06c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-370a976e-b51f-4187-bf76-bf6cbcae956b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:21:49 compute-0 nova_compute[192810]: 2025-09-30 21:21:49.769 2 DEBUG oslo_concurrency.lockutils [req-bf48e930-29f1-4ae2-bcb6-ab465020cd40 req-8878a04a-2bce-4056-beb6-961554fadd50 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-370a976e-b51f-4187-bf76-bf6cbcae956b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:21:49 compute-0 nova_compute[192810]: 2025-09-30 21:21:49.770 2 DEBUG nova.network.neutron [req-bf48e930-29f1-4ae2-bcb6-ab465020cd40 req-8878a04a-2bce-4056-beb6-961554fadd50 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Refreshing network info cache for port ce9edf2f-fd59-406b-b9c3-5b314d995fe9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:21:50 compute-0 nova_compute[192810]: 2025-09-30 21:21:50.817 2 DEBUG nova.compute.manager [req-3033c0c5-26ca-47f1-b685-0517ce25dc27 req-5d58a737-cc4f-4ae0-949e-150e678b4bef dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Received event network-vif-plugged-0ad4c684-4514-4550-b708-6339f933766c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:21:50 compute-0 nova_compute[192810]: 2025-09-30 21:21:50.817 2 DEBUG oslo_concurrency.lockutils [req-3033c0c5-26ca-47f1-b685-0517ce25dc27 req-5d58a737-cc4f-4ae0-949e-150e678b4bef dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:50 compute-0 nova_compute[192810]: 2025-09-30 21:21:50.818 2 DEBUG oslo_concurrency.lockutils [req-3033c0c5-26ca-47f1-b685-0517ce25dc27 req-5d58a737-cc4f-4ae0-949e-150e678b4bef dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:50 compute-0 nova_compute[192810]: 2025-09-30 21:21:50.818 2 DEBUG oslo_concurrency.lockutils [req-3033c0c5-26ca-47f1-b685-0517ce25dc27 req-5d58a737-cc4f-4ae0-949e-150e678b4bef dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:50 compute-0 nova_compute[192810]: 2025-09-30 21:21:50.818 2 DEBUG nova.compute.manager [req-3033c0c5-26ca-47f1-b685-0517ce25dc27 req-5d58a737-cc4f-4ae0-949e-150e678b4bef dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] No waiting events found dispatching network-vif-plugged-0ad4c684-4514-4550-b708-6339f933766c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:21:50 compute-0 nova_compute[192810]: 2025-09-30 21:21:50.818 2 WARNING nova.compute.manager [req-3033c0c5-26ca-47f1-b685-0517ce25dc27 req-5d58a737-cc4f-4ae0-949e-150e678b4bef dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Received unexpected event network-vif-plugged-0ad4c684-4514-4550-b708-6339f933766c for instance with vm_state error and task_state rebuild_spawning.
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.028 2 DEBUG oslo_concurrency.processutils [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.099 2 DEBUG oslo_concurrency.processutils [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e.part --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.100 2 DEBUG nova.virt.images [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] 29834554-3ec3-4459-bfde-932aa778e979 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.102 2 DEBUG nova.privsep.utils [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.102 2 DEBUG oslo_concurrency.processutils [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e.part /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.295 2 DEBUG oslo_concurrency.processutils [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e.part /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e.converted" returned: 0 in 0.193s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.301 2 DEBUG oslo_concurrency.processutils [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.323 2 DEBUG nova.network.neutron [req-bf48e930-29f1-4ae2-bcb6-ab465020cd40 req-8878a04a-2bce-4056-beb6-961554fadd50 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Updated VIF entry in instance network info cache for port ce9edf2f-fd59-406b-b9c3-5b314d995fe9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.324 2 DEBUG nova.network.neutron [req-bf48e930-29f1-4ae2-bcb6-ab465020cd40 req-8878a04a-2bce-4056-beb6-961554fadd50 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Updating instance_info_cache with network_info: [{"id": "ce9edf2f-fd59-406b-b9c3-5b314d995fe9", "address": "fa:16:3e:99:d9:88", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce9edf2f-fd", "ovs_interfaceid": "ce9edf2f-fd59-406b-b9c3-5b314d995fe9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.344 2 DEBUG oslo_concurrency.lockutils [req-bf48e930-29f1-4ae2-bcb6-ab465020cd40 req-8878a04a-2bce-4056-beb6-961554fadd50 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-370a976e-b51f-4187-bf76-bf6cbcae956b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.360 2 DEBUG oslo_concurrency.processutils [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e.converted --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.361 2 DEBUG oslo_concurrency.lockutils [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "d794a27f8e0bfa3eee9759fbfddd316a7671c61e" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.046s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.390 2 DEBUG oslo_concurrency.processutils [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.448 2 DEBUG oslo_concurrency.processutils [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.451 2 DEBUG oslo_concurrency.lockutils [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Acquiring lock "d794a27f8e0bfa3eee9759fbfddd316a7671c61e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.452 2 DEBUG oslo_concurrency.lockutils [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "d794a27f8e0bfa3eee9759fbfddd316a7671c61e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.480 2 DEBUG oslo_concurrency.processutils [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.547 2 DEBUG oslo_concurrency.processutils [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.548 2 DEBUG oslo_concurrency.processutils [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e,backing_fmt=raw /var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.581 2 DEBUG oslo_concurrency.processutils [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e,backing_fmt=raw /var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.583 2 DEBUG oslo_concurrency.lockutils [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "d794a27f8e0bfa3eee9759fbfddd316a7671c61e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.584 2 DEBUG oslo_concurrency.processutils [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.612 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.644 2 DEBUG oslo_concurrency.processutils [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.645 2 DEBUG nova.virt.disk.api [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Checking if we can resize image /var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.646 2 DEBUG oslo_concurrency.processutils [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.704 2 DEBUG oslo_concurrency.processutils [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.706 2 DEBUG nova.virt.disk.api [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Cannot resize image /var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.706 2 DEBUG nova.virt.libvirt.driver [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.707 2 DEBUG nova.virt.libvirt.driver [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Ensure instance console log exists: /var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.707 2 DEBUG oslo_concurrency.lockutils [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.708 2 DEBUG oslo_concurrency.lockutils [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.708 2 DEBUG oslo_concurrency.lockutils [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.711 2 DEBUG nova.virt.libvirt.driver [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Start _get_guest_xml network_info=[{"id": "0ad4c684-4514-4550-b708-6339f933766c", "address": "fa:16:3e:c1:12:42", "network": {"id": "7ae61a95-12e5-48ee-93a0-85e12f8652eb", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-47213222-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "54fb4b43d65542abbb73044c6d52da8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ad4c684-45", "ovs_interfaceid": "0ad4c684-4514-4550-b708-6339f933766c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:11Z,direct_url=<?>,disk_format='qcow2',id=29834554-3ec3-4459-bfde-932aa778e979,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:13Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.717 2 WARNING nova.virt.libvirt.driver [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.728 2 DEBUG nova.virt.libvirt.host [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.731 2 DEBUG nova.virt.libvirt.host [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.735 2 DEBUG nova.virt.libvirt.host [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.735 2 DEBUG nova.virt.libvirt.host [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.738 2 DEBUG nova.virt.libvirt.driver [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.738 2 DEBUG nova.virt.hardware [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:11Z,direct_url=<?>,disk_format='qcow2',id=29834554-3ec3-4459-bfde-932aa778e979,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:13Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.738 2 DEBUG nova.virt.hardware [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.738 2 DEBUG nova.virt.hardware [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.739 2 DEBUG nova.virt.hardware [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.739 2 DEBUG nova.virt.hardware [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.739 2 DEBUG nova.virt.hardware [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.739 2 DEBUG nova.virt.hardware [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.739 2 DEBUG nova.virt.hardware [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.740 2 DEBUG nova.virt.hardware [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.740 2 DEBUG nova.virt.hardware [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.740 2 DEBUG nova.virt.hardware [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.740 2 DEBUG nova.objects.instance [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lazy-loading 'vcpu_model' on Instance uuid c79f1121-fcfb-4c07-94cf-1389e1df9e81 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.760 2 DEBUG nova.virt.libvirt.vif [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-09-30T21:21:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-988999069',display_name='tempest-ServersAdminTestJSON-server-988999069',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-988999069',id=28,image_ref='29834554-3ec3-4459-bfde-932aa778e979',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:21:11Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='54fb4b43d65542abbb73044c6d52da8a',ramdisk_id='',reservation_id='r-1cnrm9nr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='29834554-3ec3-4459-bfde-932aa778e979',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-302955212',owner_user_name='tempest-ServersAdminTestJSON-302
955212-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:21:49Z,user_data=None,user_id='fdcdab028fdf46d7ba6634c631d3d33c',uuid=c79f1121-fcfb-4c07-94cf-1389e1df9e81,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='error') vif={"id": "0ad4c684-4514-4550-b708-6339f933766c", "address": "fa:16:3e:c1:12:42", "network": {"id": "7ae61a95-12e5-48ee-93a0-85e12f8652eb", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-47213222-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "54fb4b43d65542abbb73044c6d52da8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ad4c684-45", "ovs_interfaceid": "0ad4c684-4514-4550-b708-6339f933766c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.761 2 DEBUG nova.network.os_vif_util [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Converting VIF {"id": "0ad4c684-4514-4550-b708-6339f933766c", "address": "fa:16:3e:c1:12:42", "network": {"id": "7ae61a95-12e5-48ee-93a0-85e12f8652eb", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-47213222-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "54fb4b43d65542abbb73044c6d52da8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ad4c684-45", "ovs_interfaceid": "0ad4c684-4514-4550-b708-6339f933766c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.762 2 DEBUG nova.network.os_vif_util [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c1:12:42,bridge_name='br-int',has_traffic_filtering=True,id=0ad4c684-4514-4550-b708-6339f933766c,network=Network(7ae61a95-12e5-48ee-93a0-85e12f8652eb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ad4c684-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.763 2 DEBUG nova.virt.libvirt.driver [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:21:51 compute-0 nova_compute[192810]:   <uuid>c79f1121-fcfb-4c07-94cf-1389e1df9e81</uuid>
Sep 30 21:21:51 compute-0 nova_compute[192810]:   <name>instance-0000001c</name>
Sep 30 21:21:51 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:21:51 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:21:51 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:21:51 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:       <nova:name>tempest-ServersAdminTestJSON-server-988999069</nova:name>
Sep 30 21:21:51 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:21:51</nova:creationTime>
Sep 30 21:21:51 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:21:51 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:21:51 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:21:51 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:21:51 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:21:51 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:21:51 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:21:51 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:21:51 compute-0 nova_compute[192810]:         <nova:user uuid="fdcdab028fdf46d7ba6634c631d3d33c">tempest-ServersAdminTestJSON-302955212-project-member</nova:user>
Sep 30 21:21:51 compute-0 nova_compute[192810]:         <nova:project uuid="54fb4b43d65542abbb73044c6d52da8a">tempest-ServersAdminTestJSON-302955212</nova:project>
Sep 30 21:21:51 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:21:51 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="29834554-3ec3-4459-bfde-932aa778e979"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:21:51 compute-0 nova_compute[192810]:         <nova:port uuid="0ad4c684-4514-4550-b708-6339f933766c">
Sep 30 21:21:51 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:21:51 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:21:51 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:21:51 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:21:51 compute-0 nova_compute[192810]:     <system>
Sep 30 21:21:51 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:21:51 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:21:51 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:21:51 compute-0 nova_compute[192810]:       <entry name="serial">c79f1121-fcfb-4c07-94cf-1389e1df9e81</entry>
Sep 30 21:21:51 compute-0 nova_compute[192810]:       <entry name="uuid">c79f1121-fcfb-4c07-94cf-1389e1df9e81</entry>
Sep 30 21:21:51 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     </system>
Sep 30 21:21:51 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:21:51 compute-0 nova_compute[192810]:   <os>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:   </os>
Sep 30 21:21:51 compute-0 nova_compute[192810]:   <features>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:   </features>
Sep 30 21:21:51 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:21:51 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:21:51 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:21:51 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:21:51 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:21:51 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/disk"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:21:51 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/disk.config"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:21:51 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:c1:12:42"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:       <target dev="tap0ad4c684-45"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:21:51 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/console.log" append="off"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     <video>
Sep 30 21:21:51 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     </video>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:21:51 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:21:51 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:21:51 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:21:51 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:21:51 compute-0 nova_compute[192810]: </domain>
Sep 30 21:21:51 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.764 2 DEBUG nova.compute.manager [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Preparing to wait for external event network-vif-plugged-0ad4c684-4514-4550-b708-6339f933766c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.764 2 DEBUG oslo_concurrency.lockutils [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Acquiring lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.764 2 DEBUG oslo_concurrency.lockutils [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.765 2 DEBUG oslo_concurrency.lockutils [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.765 2 DEBUG nova.virt.libvirt.vif [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-09-30T21:21:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-988999069',display_name='tempest-ServersAdminTestJSON-server-988999069',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-988999069',id=28,image_ref='29834554-3ec3-4459-bfde-932aa778e979',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:21:11Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='54fb4b43d65542abbb73044c6d52da8a',ramdisk_id='',reservation_id='r-1cnrm9nr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='29834554-3ec3-4459-bfde-932aa778e979',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-302955212',owner_user_name='tempest-ServersAdminTestJSON-302955212-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:21:49Z,user_data=None,user_id='fdcdab028fdf46d7ba6634c631d3d33c',uuid=c79f1121-fcfb-4c07-94cf-1389e1df9e81,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='error') vif={"id": "0ad4c684-4514-4550-b708-6339f933766c", "address": "fa:16:3e:c1:12:42", "network": {"id": "7ae61a95-12e5-48ee-93a0-85e12f8652eb", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-47213222-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "54fb4b43d65542abbb73044c6d52da8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ad4c684-45", "ovs_interfaceid": "0ad4c684-4514-4550-b708-6339f933766c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.765 2 DEBUG nova.network.os_vif_util [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Converting VIF {"id": "0ad4c684-4514-4550-b708-6339f933766c", "address": "fa:16:3e:c1:12:42", "network": {"id": "7ae61a95-12e5-48ee-93a0-85e12f8652eb", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-47213222-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "54fb4b43d65542abbb73044c6d52da8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ad4c684-45", "ovs_interfaceid": "0ad4c684-4514-4550-b708-6339f933766c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.766 2 DEBUG nova.network.os_vif_util [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c1:12:42,bridge_name='br-int',has_traffic_filtering=True,id=0ad4c684-4514-4550-b708-6339f933766c,network=Network(7ae61a95-12e5-48ee-93a0-85e12f8652eb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ad4c684-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.766 2 DEBUG os_vif [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:12:42,bridge_name='br-int',has_traffic_filtering=True,id=0ad4c684-4514-4550-b708-6339f933766c,network=Network(7ae61a95-12e5-48ee-93a0-85e12f8652eb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ad4c684-45') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.767 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.768 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.771 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.771 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0ad4c684-45, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.771 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0ad4c684-45, col_values=(('external_ids', {'iface-id': '0ad4c684-4514-4550-b708-6339f933766c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c1:12:42', 'vm-uuid': 'c79f1121-fcfb-4c07-94cf-1389e1df9e81'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.774 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:51 compute-0 NetworkManager[51733]: <info>  [1759267311.7749] manager: (tap0ad4c684-45): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/66)
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.775 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.780 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.781 2 INFO os_vif [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:12:42,bridge_name='br-int',has_traffic_filtering=True,id=0ad4c684-4514-4550-b708-6339f933766c,network=Network(7ae61a95-12e5-48ee-93a0-85e12f8652eb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ad4c684-45')
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.836 2 DEBUG nova.virt.libvirt.driver [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.837 2 DEBUG nova.virt.libvirt.driver [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.837 2 DEBUG nova.virt.libvirt.driver [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] No VIF found with MAC fa:16:3e:c1:12:42, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.837 2 INFO nova.virt.libvirt.driver [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Using config drive
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.853 2 DEBUG nova.objects.instance [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lazy-loading 'ec2_ids' on Instance uuid c79f1121-fcfb-4c07-94cf-1389e1df9e81 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:21:51 compute-0 nova_compute[192810]: 2025-09-30 21:21:51.947 2 DEBUG nova.objects.instance [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lazy-loading 'keypairs' on Instance uuid c79f1121-fcfb-4c07-94cf-1389e1df9e81 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:21:52 compute-0 nova_compute[192810]: 2025-09-30 21:21:52.361 2 INFO nova.virt.libvirt.driver [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Creating config drive at /var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/disk.config
Sep 30 21:21:52 compute-0 nova_compute[192810]: 2025-09-30 21:21:52.365 2 DEBUG oslo_concurrency.processutils [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzmq7uf2u execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:21:52 compute-0 nova_compute[192810]: 2025-09-30 21:21:52.511 2 DEBUG oslo_concurrency.processutils [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzmq7uf2u" returned: 0 in 0.146s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:21:52 compute-0 kernel: tap0ad4c684-45: entered promiscuous mode
Sep 30 21:21:52 compute-0 NetworkManager[51733]: <info>  [1759267312.5997] manager: (tap0ad4c684-45): new Tun device (/org/freedesktop/NetworkManager/Devices/67)
Sep 30 21:21:52 compute-0 ovn_controller[94912]: 2025-09-30T21:21:52Z|00137|binding|INFO|Claiming lport 0ad4c684-4514-4550-b708-6339f933766c for this chassis.
Sep 30 21:21:52 compute-0 ovn_controller[94912]: 2025-09-30T21:21:52Z|00138|binding|INFO|0ad4c684-4514-4550-b708-6339f933766c: Claiming fa:16:3e:c1:12:42 10.100.0.14
Sep 30 21:21:52 compute-0 nova_compute[192810]: 2025-09-30 21:21:52.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:52.615 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:12:42 10.100.0.14'], port_security=['fa:16:3e:c1:12:42 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'c79f1121-fcfb-4c07-94cf-1389e1df9e81', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7ae61a95-12e5-48ee-93a0-85e12f8652eb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '54fb4b43d65542abbb73044c6d52da8a', 'neutron:revision_number': '5', 'neutron:security_group_ids': '0c0ea76a-15be-4f11-88b8-d75d482afe6f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c761993a-b179-4f1b-8dc7-b7ebb1a3e32b, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=0ad4c684-4514-4550-b708-6339f933766c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:21:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:52.617 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 0ad4c684-4514-4550-b708-6339f933766c in datapath 7ae61a95-12e5-48ee-93a0-85e12f8652eb bound to our chassis
Sep 30 21:21:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:52.618 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7ae61a95-12e5-48ee-93a0-85e12f8652eb
Sep 30 21:21:52 compute-0 ovn_controller[94912]: 2025-09-30T21:21:52Z|00139|binding|INFO|Setting lport 0ad4c684-4514-4550-b708-6339f933766c ovn-installed in OVS
Sep 30 21:21:52 compute-0 ovn_controller[94912]: 2025-09-30T21:21:52Z|00140|binding|INFO|Setting lport 0ad4c684-4514-4550-b708-6339f933766c up in Southbound
Sep 30 21:21:52 compute-0 nova_compute[192810]: 2025-09-30 21:21:52.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:52 compute-0 systemd-udevd[224987]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:21:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:52.643 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[c0b39cc2-c42f-44dc-b0d0-18027843b3de]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:52 compute-0 systemd-machined[152794]: New machine qemu-15-instance-0000001c.
Sep 30 21:21:52 compute-0 NetworkManager[51733]: <info>  [1759267312.6546] device (tap0ad4c684-45): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:21:52 compute-0 NetworkManager[51733]: <info>  [1759267312.6557] device (tap0ad4c684-45): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:21:52 compute-0 systemd[1]: Started Virtual Machine qemu-15-instance-0000001c.
Sep 30 21:21:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:52.674 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[d2296f4b-2dcc-481f-8e26-08a9d40e232f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:52.678 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[7c9a688e-10b6-46e9-a61d-a60eca552028]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:52 compute-0 podman[224963]: 2025-09-30 21:21:52.704665794 +0000 UTC m=+0.112623985 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd)
Sep 30 21:21:52 compute-0 podman[224964]: 2025-09-30 21:21:52.704988222 +0000 UTC m=+0.111317933 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:21:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:52.715 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[00f095a9-cc43-4633-bf56-e59f355cc210]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:52.735 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[91a0f240-7fb2-4a77-9eff-379b780b5b26]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7ae61a95-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:55:1a:c2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 9, 'rx_bytes': 1000, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 9, 'rx_bytes': 1000, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 396566, 'reachable_time': 17378, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225026, 'error': None, 'target': 'ovnmeta-7ae61a95-12e5-48ee-93a0-85e12f8652eb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:52.751 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[0322ee5c-0b90-4d10-8fb9-aa081d9e5c17]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap7ae61a95-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 396582, 'tstamp': 396582}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225028, 'error': None, 'target': 'ovnmeta-7ae61a95-12e5-48ee-93a0-85e12f8652eb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap7ae61a95-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 396587, 'tstamp': 396587}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225028, 'error': None, 'target': 'ovnmeta-7ae61a95-12e5-48ee-93a0-85e12f8652eb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:21:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:52.754 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7ae61a95-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:21:52 compute-0 nova_compute[192810]: 2025-09-30 21:21:52.755 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:52 compute-0 nova_compute[192810]: 2025-09-30 21:21:52.756 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:52.756 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7ae61a95-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:21:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:52.756 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:21:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:52.757 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7ae61a95-10, col_values=(('external_ids', {'iface-id': '49c3a39c-1855-4432-8459-a8f7bc7429b4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:21:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:21:52.757 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:21:52 compute-0 nova_compute[192810]: 2025-09-30 21:21:52.932 2 DEBUG nova.compute.manager [req-af5bfed2-0d57-4d52-a5d9-f2b0249d4ac6 req-2f6cfd39-d52f-4e2b-97f2-c998a78703fb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Received event network-vif-plugged-0ad4c684-4514-4550-b708-6339f933766c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:21:52 compute-0 nova_compute[192810]: 2025-09-30 21:21:52.934 2 DEBUG oslo_concurrency.lockutils [req-af5bfed2-0d57-4d52-a5d9-f2b0249d4ac6 req-2f6cfd39-d52f-4e2b-97f2-c998a78703fb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:52 compute-0 nova_compute[192810]: 2025-09-30 21:21:52.935 2 DEBUG oslo_concurrency.lockutils [req-af5bfed2-0d57-4d52-a5d9-f2b0249d4ac6 req-2f6cfd39-d52f-4e2b-97f2-c998a78703fb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:52 compute-0 nova_compute[192810]: 2025-09-30 21:21:52.935 2 DEBUG oslo_concurrency.lockutils [req-af5bfed2-0d57-4d52-a5d9-f2b0249d4ac6 req-2f6cfd39-d52f-4e2b-97f2-c998a78703fb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:52 compute-0 nova_compute[192810]: 2025-09-30 21:21:52.935 2 DEBUG nova.compute.manager [req-af5bfed2-0d57-4d52-a5d9-f2b0249d4ac6 req-2f6cfd39-d52f-4e2b-97f2-c998a78703fb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Processing event network-vif-plugged-0ad4c684-4514-4550-b708-6339f933766c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:21:53 compute-0 nova_compute[192810]: 2025-09-30 21:21:53.334 2 DEBUG nova.virt.libvirt.host [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Removed pending event for c79f1121-fcfb-4c07-94cf-1389e1df9e81 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Sep 30 21:21:53 compute-0 nova_compute[192810]: 2025-09-30 21:21:53.335 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267313.3342435, c79f1121-fcfb-4c07-94cf-1389e1df9e81 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:21:53 compute-0 nova_compute[192810]: 2025-09-30 21:21:53.335 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] VM Started (Lifecycle Event)
Sep 30 21:21:53 compute-0 nova_compute[192810]: 2025-09-30 21:21:53.337 2 DEBUG nova.compute.manager [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:21:53 compute-0 nova_compute[192810]: 2025-09-30 21:21:53.340 2 DEBUG nova.virt.libvirt.driver [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:21:53 compute-0 nova_compute[192810]: 2025-09-30 21:21:53.343 2 INFO nova.virt.libvirt.driver [-] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Instance spawned successfully.
Sep 30 21:21:53 compute-0 nova_compute[192810]: 2025-09-30 21:21:53.343 2 DEBUG nova.virt.libvirt.driver [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:21:53 compute-0 nova_compute[192810]: 2025-09-30 21:21:53.371 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:21:53 compute-0 nova_compute[192810]: 2025-09-30 21:21:53.382 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Synchronizing instance power state after lifecycle event "Started"; current vm_state: error, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:21:53 compute-0 nova_compute[192810]: 2025-09-30 21:21:53.385 2 DEBUG nova.virt.libvirt.driver [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:21:53 compute-0 nova_compute[192810]: 2025-09-30 21:21:53.385 2 DEBUG nova.virt.libvirt.driver [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:21:53 compute-0 nova_compute[192810]: 2025-09-30 21:21:53.386 2 DEBUG nova.virt.libvirt.driver [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:21:53 compute-0 nova_compute[192810]: 2025-09-30 21:21:53.386 2 DEBUG nova.virt.libvirt.driver [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:21:53 compute-0 nova_compute[192810]: 2025-09-30 21:21:53.386 2 DEBUG nova.virt.libvirt.driver [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:21:53 compute-0 nova_compute[192810]: 2025-09-30 21:21:53.387 2 DEBUG nova.virt.libvirt.driver [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:21:53 compute-0 nova_compute[192810]: 2025-09-30 21:21:53.420 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Sep 30 21:21:53 compute-0 nova_compute[192810]: 2025-09-30 21:21:53.423 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267313.3354213, c79f1121-fcfb-4c07-94cf-1389e1df9e81 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:21:53 compute-0 nova_compute[192810]: 2025-09-30 21:21:53.423 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] VM Paused (Lifecycle Event)
Sep 30 21:21:53 compute-0 nova_compute[192810]: 2025-09-30 21:21:53.452 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:21:53 compute-0 nova_compute[192810]: 2025-09-30 21:21:53.461 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267313.3401425, c79f1121-fcfb-4c07-94cf-1389e1df9e81 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:21:53 compute-0 nova_compute[192810]: 2025-09-30 21:21:53.461 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] VM Resumed (Lifecycle Event)
Sep 30 21:21:53 compute-0 nova_compute[192810]: 2025-09-30 21:21:53.487 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:21:53 compute-0 nova_compute[192810]: 2025-09-30 21:21:53.492 2 DEBUG nova.compute.manager [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:21:53 compute-0 nova_compute[192810]: 2025-09-30 21:21:53.494 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: error, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:21:53 compute-0 nova_compute[192810]: 2025-09-30 21:21:53.551 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Sep 30 21:21:53 compute-0 nova_compute[192810]: 2025-09-30 21:21:53.616 2 DEBUG oslo_concurrency.lockutils [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:53 compute-0 nova_compute[192810]: 2025-09-30 21:21:53.617 2 DEBUG oslo_concurrency.lockutils [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:53 compute-0 nova_compute[192810]: 2025-09-30 21:21:53.618 2 DEBUG nova.objects.instance [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Sep 30 21:21:53 compute-0 nova_compute[192810]: 2025-09-30 21:21:53.726 2 DEBUG oslo_concurrency.lockutils [None req-0009b3eb-107c-4695-a039-f1f1f3fbf05d fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:54 compute-0 ovn_controller[94912]: 2025-09-30T21:21:54Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:99:d9:88 10.100.0.4
Sep 30 21:21:54 compute-0 ovn_controller[94912]: 2025-09-30T21:21:54Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:99:d9:88 10.100.0.4
Sep 30 21:21:55 compute-0 nova_compute[192810]: 2025-09-30 21:21:55.025 2 DEBUG nova.compute.manager [req-790ff4e0-56e4-44a7-b953-73b94c69e2f0 req-bcd7880d-7ca6-49c2-9b26-7502d6cafcd8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Received event network-vif-plugged-0ad4c684-4514-4550-b708-6339f933766c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:21:55 compute-0 nova_compute[192810]: 2025-09-30 21:21:55.026 2 DEBUG oslo_concurrency.lockutils [req-790ff4e0-56e4-44a7-b953-73b94c69e2f0 req-bcd7880d-7ca6-49c2-9b26-7502d6cafcd8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:21:55 compute-0 nova_compute[192810]: 2025-09-30 21:21:55.026 2 DEBUG oslo_concurrency.lockutils [req-790ff4e0-56e4-44a7-b953-73b94c69e2f0 req-bcd7880d-7ca6-49c2-9b26-7502d6cafcd8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:21:55 compute-0 nova_compute[192810]: 2025-09-30 21:21:55.026 2 DEBUG oslo_concurrency.lockutils [req-790ff4e0-56e4-44a7-b953-73b94c69e2f0 req-bcd7880d-7ca6-49c2-9b26-7502d6cafcd8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:21:55 compute-0 nova_compute[192810]: 2025-09-30 21:21:55.027 2 DEBUG nova.compute.manager [req-790ff4e0-56e4-44a7-b953-73b94c69e2f0 req-bcd7880d-7ca6-49c2-9b26-7502d6cafcd8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] No waiting events found dispatching network-vif-plugged-0ad4c684-4514-4550-b708-6339f933766c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:21:55 compute-0 nova_compute[192810]: 2025-09-30 21:21:55.027 2 WARNING nova.compute.manager [req-790ff4e0-56e4-44a7-b953-73b94c69e2f0 req-bcd7880d-7ca6-49c2-9b26-7502d6cafcd8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Received unexpected event network-vif-plugged-0ad4c684-4514-4550-b708-6339f933766c for instance with vm_state active and task_state None.
Sep 30 21:21:56 compute-0 nova_compute[192810]: 2025-09-30 21:21:56.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:56 compute-0 nova_compute[192810]: 2025-09-30 21:21:56.774 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:58 compute-0 nova_compute[192810]: 2025-09-30 21:21:58.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:21:59 compute-0 podman[225038]: 2025-09-30 21:21:59.342178943 +0000 UTC m=+0.067757278 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, 
container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm)
Sep 30 21:21:59 compute-0 podman[225037]: 2025-09-30 21:21:59.379730668 +0000 UTC m=+0.108743308 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Sep 30 21:21:59 compute-0 nova_compute[192810]: 2025-09-30 21:21:59.411 2 INFO nova.compute.manager [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Rebuilding instance
Sep 30 21:21:59 compute-0 nova_compute[192810]: 2025-09-30 21:21:59.735 2 DEBUG nova.compute.manager [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:21:59 compute-0 nova_compute[192810]: 2025-09-30 21:21:59.812 2 DEBUG nova.objects.instance [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lazy-loading 'pci_requests' on Instance uuid c79f1121-fcfb-4c07-94cf-1389e1df9e81 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:21:59 compute-0 nova_compute[192810]: 2025-09-30 21:21:59.824 2 DEBUG nova.objects.instance [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lazy-loading 'pci_devices' on Instance uuid c79f1121-fcfb-4c07-94cf-1389e1df9e81 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:21:59 compute-0 nova_compute[192810]: 2025-09-30 21:21:59.839 2 DEBUG nova.objects.instance [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lazy-loading 'resources' on Instance uuid c79f1121-fcfb-4c07-94cf-1389e1df9e81 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:21:59 compute-0 nova_compute[192810]: 2025-09-30 21:21:59.875 2 DEBUG nova.objects.instance [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lazy-loading 'migration_context' on Instance uuid c79f1121-fcfb-4c07-94cf-1389e1df9e81 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:21:59 compute-0 nova_compute[192810]: 2025-09-30 21:21:59.892 2 DEBUG nova.objects.instance [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Sep 30 21:21:59 compute-0 nova_compute[192810]: 2025-09-30 21:21:59.895 2 DEBUG nova.virt.libvirt.driver [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Sep 30 21:22:01 compute-0 nova_compute[192810]: 2025-09-30 21:22:01.618 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:01 compute-0 nova_compute[192810]: 2025-09-30 21:22:01.776 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:02 compute-0 sshd-session[225081]: Invalid user a from 185.156.73.233 port 63806
Sep 30 21:22:02 compute-0 sshd-session[225081]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:22:02 compute-0 sshd-session[225081]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=185.156.73.233
Sep 30 21:22:02 compute-0 nova_compute[192810]: 2025-09-30 21:22:02.784 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:03 compute-0 sshd-session[225081]: Failed password for invalid user a from 185.156.73.233 port 63806 ssh2
Sep 30 21:22:04 compute-0 sshd-session[225081]: Connection closed by invalid user a 185.156.73.233 port 63806 [preauth]
Sep 30 21:22:04 compute-0 ovn_controller[94912]: 2025-09-30T21:22:04Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c1:12:42 10.100.0.14
Sep 30 21:22:04 compute-0 ovn_controller[94912]: 2025-09-30T21:22:04Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c1:12:42 10.100.0.14
Sep 30 21:22:05 compute-0 nova_compute[192810]: 2025-09-30 21:22:05.232 2 DEBUG nova.compute.manager [req-84f02c7b-c437-47fb-9f92-7fc1556bc76a req-70c4b2d4-5077-4287-9d60-3f3233cb61c5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Received event network-changed-ce9edf2f-fd59-406b-b9c3-5b314d995fe9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:22:05 compute-0 nova_compute[192810]: 2025-09-30 21:22:05.232 2 DEBUG nova.compute.manager [req-84f02c7b-c437-47fb-9f92-7fc1556bc76a req-70c4b2d4-5077-4287-9d60-3f3233cb61c5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Refreshing instance network info cache due to event network-changed-ce9edf2f-fd59-406b-b9c3-5b314d995fe9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:22:05 compute-0 nova_compute[192810]: 2025-09-30 21:22:05.233 2 DEBUG oslo_concurrency.lockutils [req-84f02c7b-c437-47fb-9f92-7fc1556bc76a req-70c4b2d4-5077-4287-9d60-3f3233cb61c5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-370a976e-b51f-4187-bf76-bf6cbcae956b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:22:05 compute-0 nova_compute[192810]: 2025-09-30 21:22:05.233 2 DEBUG oslo_concurrency.lockutils [req-84f02c7b-c437-47fb-9f92-7fc1556bc76a req-70c4b2d4-5077-4287-9d60-3f3233cb61c5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-370a976e-b51f-4187-bf76-bf6cbcae956b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:22:05 compute-0 nova_compute[192810]: 2025-09-30 21:22:05.234 2 DEBUG nova.network.neutron [req-84f02c7b-c437-47fb-9f92-7fc1556bc76a req-70c4b2d4-5077-4287-9d60-3f3233cb61c5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Refreshing network info cache for port ce9edf2f-fd59-406b-b9c3-5b314d995fe9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:22:05 compute-0 podman[225096]: 2025-09-30 21:22:05.345686548 +0000 UTC m=+0.073761747 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent)
Sep 30 21:22:05 compute-0 nova_compute[192810]: 2025-09-30 21:22:05.452 2 DEBUG oslo_concurrency.lockutils [None req-7f733ef8-4e7c-4368-b0cb-01d434a21d42 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Acquiring lock "interface-370a976e-b51f-4187-bf76-bf6cbcae956b-5008167c-4cca-4768-963e-ab0119180625" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:05 compute-0 nova_compute[192810]: 2025-09-30 21:22:05.452 2 DEBUG oslo_concurrency.lockutils [None req-7f733ef8-4e7c-4368-b0cb-01d434a21d42 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lock "interface-370a976e-b51f-4187-bf76-bf6cbcae956b-5008167c-4cca-4768-963e-ab0119180625" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:05 compute-0 nova_compute[192810]: 2025-09-30 21:22:05.453 2 DEBUG nova.objects.instance [None req-7f733ef8-4e7c-4368-b0cb-01d434a21d42 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lazy-loading 'flavor' on Instance uuid 370a976e-b51f-4187-bf76-bf6cbcae956b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:22:06 compute-0 nova_compute[192810]: 2025-09-30 21:22:06.621 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:06 compute-0 nova_compute[192810]: 2025-09-30 21:22:06.778 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:07 compute-0 nova_compute[192810]: 2025-09-30 21:22:07.807 2 DEBUG nova.objects.instance [None req-7f733ef8-4e7c-4368-b0cb-01d434a21d42 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lazy-loading 'pci_requests' on Instance uuid 370a976e-b51f-4187-bf76-bf6cbcae956b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:22:07 compute-0 nova_compute[192810]: 2025-09-30 21:22:07.845 2 DEBUG nova.network.neutron [None req-7f733ef8-4e7c-4368-b0cb-01d434a21d42 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:22:08 compute-0 nova_compute[192810]: 2025-09-30 21:22:08.203 2 DEBUG nova.policy [None req-7f733ef8-4e7c-4368-b0cb-01d434a21d42 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c536ea061a32492a8c5e6bf941d1c9f3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '47d2c796445c4dd3affc8594502f04be', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:22:08 compute-0 nova_compute[192810]: 2025-09-30 21:22:08.524 2 DEBUG nova.network.neutron [req-84f02c7b-c437-47fb-9f92-7fc1556bc76a req-70c4b2d4-5077-4287-9d60-3f3233cb61c5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Updated VIF entry in instance network info cache for port ce9edf2f-fd59-406b-b9c3-5b314d995fe9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:22:08 compute-0 nova_compute[192810]: 2025-09-30 21:22:08.525 2 DEBUG nova.network.neutron [req-84f02c7b-c437-47fb-9f92-7fc1556bc76a req-70c4b2d4-5077-4287-9d60-3f3233cb61c5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Updating instance_info_cache with network_info: [{"id": "ce9edf2f-fd59-406b-b9c3-5b314d995fe9", "address": "fa:16:3e:99:d9:88", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.223", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce9edf2f-fd", "ovs_interfaceid": "ce9edf2f-fd59-406b-b9c3-5b314d995fe9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:22:08 compute-0 nova_compute[192810]: 2025-09-30 21:22:08.556 2 DEBUG oslo_concurrency.lockutils [req-84f02c7b-c437-47fb-9f92-7fc1556bc76a req-70c4b2d4-5077-4287-9d60-3f3233cb61c5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-370a976e-b51f-4187-bf76-bf6cbcae956b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:22:09 compute-0 nova_compute[192810]: 2025-09-30 21:22:09.709 2 DEBUG nova.network.neutron [None req-7f733ef8-4e7c-4368-b0cb-01d434a21d42 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Successfully updated port: 5008167c-4cca-4768-963e-ab0119180625 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:22:09 compute-0 sshd[128205]: Timeout before authentication for connection from 49.64.169.153 to 38.102.83.69, pid = 223679
Sep 30 21:22:09 compute-0 nova_compute[192810]: 2025-09-30 21:22:09.726 2 DEBUG oslo_concurrency.lockutils [None req-7f733ef8-4e7c-4368-b0cb-01d434a21d42 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Acquiring lock "refresh_cache-370a976e-b51f-4187-bf76-bf6cbcae956b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:22:09 compute-0 nova_compute[192810]: 2025-09-30 21:22:09.727 2 DEBUG oslo_concurrency.lockutils [None req-7f733ef8-4e7c-4368-b0cb-01d434a21d42 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Acquired lock "refresh_cache-370a976e-b51f-4187-bf76-bf6cbcae956b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:22:09 compute-0 nova_compute[192810]: 2025-09-30 21:22:09.727 2 DEBUG nova.network.neutron [None req-7f733ef8-4e7c-4368-b0cb-01d434a21d42 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:22:09 compute-0 nova_compute[192810]: 2025-09-30 21:22:09.940 2 DEBUG nova.virt.libvirt.driver [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Sep 30 21:22:10 compute-0 nova_compute[192810]: 2025-09-30 21:22:10.286 2 WARNING nova.network.neutron [None req-7f733ef8-4e7c-4368-b0cb-01d434a21d42 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] 29d3fdc6-d8e1-4032-8f0c-e91da2912153 already exists in list: networks containing: ['29d3fdc6-d8e1-4032-8f0c-e91da2912153']. ignoring it
Sep 30 21:22:11 compute-0 nova_compute[192810]: 2025-09-30 21:22:11.026 2 DEBUG nova.compute.manager [req-549d3cda-85b6-4c3e-9538-942f32b63b47 req-1661594e-8b03-4895-b8d9-f6ec0ff12ed5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Received event network-changed-5008167c-4cca-4768-963e-ab0119180625 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:22:11 compute-0 nova_compute[192810]: 2025-09-30 21:22:11.027 2 DEBUG nova.compute.manager [req-549d3cda-85b6-4c3e-9538-942f32b63b47 req-1661594e-8b03-4895-b8d9-f6ec0ff12ed5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Refreshing instance network info cache due to event network-changed-5008167c-4cca-4768-963e-ab0119180625. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:22:11 compute-0 nova_compute[192810]: 2025-09-30 21:22:11.028 2 DEBUG oslo_concurrency.lockutils [req-549d3cda-85b6-4c3e-9538-942f32b63b47 req-1661594e-8b03-4895-b8d9-f6ec0ff12ed5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-370a976e-b51f-4187-bf76-bf6cbcae956b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:22:11 compute-0 nova_compute[192810]: 2025-09-30 21:22:11.626 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:11 compute-0 nova_compute[192810]: 2025-09-30 21:22:11.780 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:12 compute-0 kernel: tap0ad4c684-45 (unregistering): left promiscuous mode
Sep 30 21:22:12 compute-0 NetworkManager[51733]: <info>  [1759267332.1668] device (tap0ad4c684-45): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.174 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:12 compute-0 ovn_controller[94912]: 2025-09-30T21:22:12Z|00141|binding|INFO|Releasing lport 0ad4c684-4514-4550-b708-6339f933766c from this chassis (sb_readonly=0)
Sep 30 21:22:12 compute-0 ovn_controller[94912]: 2025-09-30T21:22:12Z|00142|binding|INFO|Setting lport 0ad4c684-4514-4550-b708-6339f933766c down in Southbound
Sep 30 21:22:12 compute-0 ovn_controller[94912]: 2025-09-30T21:22:12Z|00143|binding|INFO|Removing iface tap0ad4c684-45 ovn-installed in OVS
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.178 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:12.185 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:12:42 10.100.0.14'], port_security=['fa:16:3e:c1:12:42 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'c79f1121-fcfb-4c07-94cf-1389e1df9e81', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7ae61a95-12e5-48ee-93a0-85e12f8652eb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '54fb4b43d65542abbb73044c6d52da8a', 'neutron:revision_number': '6', 'neutron:security_group_ids': '0c0ea76a-15be-4f11-88b8-d75d482afe6f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c761993a-b179-4f1b-8dc7-b7ebb1a3e32b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=0ad4c684-4514-4550-b708-6339f933766c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:22:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:12.187 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 0ad4c684-4514-4550-b708-6339f933766c in datapath 7ae61a95-12e5-48ee-93a0-85e12f8652eb unbound from our chassis
Sep 30 21:22:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:12.190 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7ae61a95-12e5-48ee-93a0-85e12f8652eb
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:12 compute-0 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d0000001c.scope: Deactivated successfully.
Sep 30 21:22:12 compute-0 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d0000001c.scope: Consumed 12.153s CPU time.
Sep 30 21:22:12 compute-0 systemd-machined[152794]: Machine qemu-15-instance-0000001c terminated.
Sep 30 21:22:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:12.212 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[fe24d86e-a162-4a63-8307-90505f1b073d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:12.251 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[c35b921a-ad87-4e61-b192-b3e8cfbf9cda]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:12.255 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[c31c371c-8446-4855-a783-1c0b2a1987e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:12 compute-0 podman[225116]: 2025-09-30 21:22:12.271461085 +0000 UTC m=+0.070266490 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Sep 30 21:22:12 compute-0 podman[225119]: 2025-09-30 21:22:12.29092128 +0000 UTC m=+0.089285174 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9-minimal, version=9.6, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., distribution-scope=public, container_name=openstack_network_exporter, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.tags=minimal rhel9)
Sep 30 21:22:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:12.292 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[5d4a6b35-4715-40c4-8bac-51274dd05064]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:12.311 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[148aaf25-a7ef-433f-9637-d23aa3937066]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7ae61a95-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:55:1a:c2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 11, 'rx_bytes': 1000, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 11, 'rx_bytes': 1000, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 396566, 'reachable_time': 17378, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225171, 'error': None, 'target': 'ovnmeta-7ae61a95-12e5-48ee-93a0-85e12f8652eb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.318 2 DEBUG nova.network.neutron [None req-7f733ef8-4e7c-4368-b0cb-01d434a21d42 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Updating instance_info_cache with network_info: [{"id": "ce9edf2f-fd59-406b-b9c3-5b314d995fe9", "address": "fa:16:3e:99:d9:88", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.223", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce9edf2f-fd", "ovs_interfaceid": "ce9edf2f-fd59-406b-b9c3-5b314d995fe9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "5008167c-4cca-4768-963e-ab0119180625", "address": "fa:16:3e:3c:a2:3f", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5008167c-4c", "ovs_interfaceid": "5008167c-4cca-4768-963e-ab0119180625", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:22:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:12.332 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[7bc42c22-8da5-4c86-8c37-33b38e26b8ee]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap7ae61a95-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 396582, 'tstamp': 396582}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225172, 'error': None, 'target': 'ovnmeta-7ae61a95-12e5-48ee-93a0-85e12f8652eb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap7ae61a95-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 396587, 'tstamp': 396587}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225172, 'error': None, 'target': 'ovnmeta-7ae61a95-12e5-48ee-93a0-85e12f8652eb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:12.333 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7ae61a95-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.382 2 DEBUG oslo_concurrency.lockutils [None req-7f733ef8-4e7c-4368-b0cb-01d434a21d42 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Releasing lock "refresh_cache-370a976e-b51f-4187-bf76-bf6cbcae956b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.383 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.384 2 DEBUG oslo_concurrency.lockutils [req-549d3cda-85b6-4c3e-9538-942f32b63b47 req-1661594e-8b03-4895-b8d9-f6ec0ff12ed5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-370a976e-b51f-4187-bf76-bf6cbcae956b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.384 2 DEBUG nova.network.neutron [req-549d3cda-85b6-4c3e-9538-942f32b63b47 req-1661594e-8b03-4895-b8d9-f6ec0ff12ed5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Refreshing network info cache for port 5008167c-4cca-4768-963e-ab0119180625 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:12.387 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7ae61a95-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:22:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:12.388 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:22:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:12.388 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7ae61a95-10, col_values=(('external_ids', {'iface-id': '49c3a39c-1855-4432-8459-a8f7bc7429b4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.389 2 DEBUG nova.virt.libvirt.vif [None req-7f733ef8-4e7c-4368-b0cb-01d434a21d42 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:21:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1609111197',display_name='tempest-tempest.common.compute-instance-1609111197',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1609111197',id=34,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAiqd8zlUv0daoHrSM4f0FkhoHklAhet2FPUnE56/Ac7ATsAijkYKDWRYPhtHLrbJjDveTvHop3CVY09bPDxSILijQRoZQfSPrdRSYWqRSb8fAb7+uxFNn+ITDg2wp4sFw==',key_name='tempest-keypair-804840075',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:21:41Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='47d2c796445c4dd3affc8594502f04be',ramdisk_id='',reservation_id='r-9ynmfskw',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-32534463',owner_user_name='tempest-AttachInterfacesTestJSON-32534463-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:21:41Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c536ea061a32492a8c5e6bf941d1c9f3',uuid=370a976e-b51f-4187-bf76-bf6cbcae956b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5008167c-4cca-4768-963e-ab0119180625", "address": "fa:16:3e:3c:a2:3f", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5008167c-4c", "ovs_interfaceid": "5008167c-4cca-4768-963e-ab0119180625", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:22:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:12.389 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.389 2 DEBUG nova.network.os_vif_util [None req-7f733ef8-4e7c-4368-b0cb-01d434a21d42 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Converting VIF {"id": "5008167c-4cca-4768-963e-ab0119180625", "address": "fa:16:3e:3c:a2:3f", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5008167c-4c", "ovs_interfaceid": "5008167c-4cca-4768-963e-ab0119180625", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.390 2 DEBUG nova.network.os_vif_util [None req-7f733ef8-4e7c-4368-b0cb-01d434a21d42 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3c:a2:3f,bridge_name='br-int',has_traffic_filtering=True,id=5008167c-4cca-4768-963e-ab0119180625,network=Network(29d3fdc6-d8e1-4032-8f0c-e91da2912153),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5008167c-4c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.390 2 DEBUG os_vif [None req-7f733ef8-4e7c-4368-b0cb-01d434a21d42 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3c:a2:3f,bridge_name='br-int',has_traffic_filtering=True,id=5008167c-4cca-4768-963e-ab0119180625,network=Network(29d3fdc6-d8e1-4032-8f0c-e91da2912153),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5008167c-4c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.391 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.391 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.391 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.394 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5008167c-4c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.395 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5008167c-4c, col_values=(('external_ids', {'iface-id': '5008167c-4cca-4768-963e-ab0119180625', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3c:a2:3f', 'vm-uuid': '370a976e-b51f-4187-bf76-bf6cbcae956b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:12 compute-0 NetworkManager[51733]: <info>  [1759267332.3970] manager: (tap5008167c-4c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/68)
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.415 2 INFO os_vif [None req-7f733ef8-4e7c-4368-b0cb-01d434a21d42 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3c:a2:3f,bridge_name='br-int',has_traffic_filtering=True,id=5008167c-4cca-4768-963e-ab0119180625,network=Network(29d3fdc6-d8e1-4032-8f0c-e91da2912153),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5008167c-4c')
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.416 2 DEBUG nova.virt.libvirt.vif [None req-7f733ef8-4e7c-4368-b0cb-01d434a21d42 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:21:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1609111197',display_name='tempest-tempest.common.compute-instance-1609111197',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1609111197',id=34,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAiqd8zlUv0daoHrSM4f0FkhoHklAhet2FPUnE56/Ac7ATsAijkYKDWRYPhtHLrbJjDveTvHop3CVY09bPDxSILijQRoZQfSPrdRSYWqRSb8fAb7+uxFNn+ITDg2wp4sFw==',key_name='tempest-keypair-804840075',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:21:41Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='47d2c796445c4dd3affc8594502f04be',ramdisk_id='',reservation_id='r-9ynmfskw',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-32534463',owner_user_name='tempest-AttachInterfacesTestJSON-32534463-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:21:41Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c536ea061a32492a8c5e6bf941d1c9f3',uuid=370a976e-b51f-4187-bf76-bf6cbcae956b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5008167c-4cca-4768-963e-ab0119180625", "address": "fa:16:3e:3c:a2:3f", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5008167c-4c", "ovs_interfaceid": "5008167c-4cca-4768-963e-ab0119180625", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.416 2 DEBUG nova.network.os_vif_util [None req-7f733ef8-4e7c-4368-b0cb-01d434a21d42 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Converting VIF {"id": "5008167c-4cca-4768-963e-ab0119180625", "address": "fa:16:3e:3c:a2:3f", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5008167c-4c", "ovs_interfaceid": "5008167c-4cca-4768-963e-ab0119180625", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.416 2 DEBUG nova.network.os_vif_util [None req-7f733ef8-4e7c-4368-b0cb-01d434a21d42 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3c:a2:3f,bridge_name='br-int',has_traffic_filtering=True,id=5008167c-4cca-4768-963e-ab0119180625,network=Network(29d3fdc6-d8e1-4032-8f0c-e91da2912153),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5008167c-4c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.421 2 DEBUG nova.virt.libvirt.guest [None req-7f733ef8-4e7c-4368-b0cb-01d434a21d42 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] attach device xml: <interface type="ethernet">
Sep 30 21:22:12 compute-0 nova_compute[192810]:   <mac address="fa:16:3e:3c:a2:3f"/>
Sep 30 21:22:12 compute-0 nova_compute[192810]:   <model type="virtio"/>
Sep 30 21:22:12 compute-0 nova_compute[192810]:   <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:22:12 compute-0 nova_compute[192810]:   <mtu size="1442"/>
Sep 30 21:22:12 compute-0 nova_compute[192810]:   <target dev="tap5008167c-4c"/>
Sep 30 21:22:12 compute-0 nova_compute[192810]: </interface>
Sep 30 21:22:12 compute-0 nova_compute[192810]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Sep 30 21:22:12 compute-0 systemd-udevd[225133]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:22:12 compute-0 NetworkManager[51733]: <info>  [1759267332.4382] manager: (tap5008167c-4c): new Tun device (/org/freedesktop/NetworkManager/Devices/69)
Sep 30 21:22:12 compute-0 kernel: tap5008167c-4c: entered promiscuous mode
Sep 30 21:22:12 compute-0 ovn_controller[94912]: 2025-09-30T21:22:12Z|00144|binding|INFO|Claiming lport 5008167c-4cca-4768-963e-ab0119180625 for this chassis.
Sep 30 21:22:12 compute-0 ovn_controller[94912]: 2025-09-30T21:22:12Z|00145|binding|INFO|5008167c-4cca-4768-963e-ab0119180625: Claiming fa:16:3e:3c:a2:3f 10.100.0.11
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:12 compute-0 NetworkManager[51733]: <info>  [1759267332.4506] device (tap5008167c-4c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:22:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:12.450 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3c:a2:3f 10.100.0.11'], port_security=['fa:16:3e:3c:a2:3f 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-1560063103', 'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '370a976e-b51f-4187-bf76-bf6cbcae956b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-29d3fdc6-d8e1-4032-8f0c-e91da2912153', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-1560063103', 'neutron:project_id': '47d2c796445c4dd3affc8594502f04be', 'neutron:revision_number': '7', 'neutron:security_group_ids': '0583c73e-88bf-4029-98b4-7475adfa8c7d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4f6d308b-b549-4733-b51f-a3dd42be0f08, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=5008167c-4cca-4768-963e-ab0119180625) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:22:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:12.451 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 5008167c-4cca-4768-963e-ab0119180625 in datapath 29d3fdc6-d8e1-4032-8f0c-e91da2912153 bound to our chassis
Sep 30 21:22:12 compute-0 NetworkManager[51733]: <info>  [1759267332.4526] device (tap5008167c-4c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.452 2 DEBUG nova.compute.manager [req-9715f533-9bd6-43c3-83d5-abba1c3c1192 req-590aeeac-3722-4805-8b12-e47b161b28ef dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Received event network-vif-unplugged-0ad4c684-4514-4550-b708-6339f933766c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.452 2 DEBUG oslo_concurrency.lockutils [req-9715f533-9bd6-43c3-83d5-abba1c3c1192 req-590aeeac-3722-4805-8b12-e47b161b28ef dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:12.452 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 29d3fdc6-d8e1-4032-8f0c-e91da2912153
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.452 2 DEBUG oslo_concurrency.lockutils [req-9715f533-9bd6-43c3-83d5-abba1c3c1192 req-590aeeac-3722-4805-8b12-e47b161b28ef dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.453 2 DEBUG oslo_concurrency.lockutils [req-9715f533-9bd6-43c3-83d5-abba1c3c1192 req-590aeeac-3722-4805-8b12-e47b161b28ef dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.453 2 DEBUG nova.compute.manager [req-9715f533-9bd6-43c3-83d5-abba1c3c1192 req-590aeeac-3722-4805-8b12-e47b161b28ef dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] No waiting events found dispatching network-vif-unplugged-0ad4c684-4514-4550-b708-6339f933766c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.453 2 WARNING nova.compute.manager [req-9715f533-9bd6-43c3-83d5-abba1c3c1192 req-590aeeac-3722-4805-8b12-e47b161b28ef dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Received unexpected event network-vif-unplugged-0ad4c684-4514-4550-b708-6339f933766c for instance with vm_state active and task_state rebuilding.
Sep 30 21:22:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:12.471 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[6fe1ea82-339a-4c7a-9538-7661f6f236c1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:12 compute-0 ovn_controller[94912]: 2025-09-30T21:22:12Z|00146|binding|INFO|Setting lport 5008167c-4cca-4768-963e-ab0119180625 ovn-installed in OVS
Sep 30 21:22:12 compute-0 ovn_controller[94912]: 2025-09-30T21:22:12Z|00147|binding|INFO|Setting lport 5008167c-4cca-4768-963e-ab0119180625 up in Southbound
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.474 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.481 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:12.505 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[c4896153-2542-46fe-a9b1-2ec1764e9259]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:12.508 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[e26aced0-32b9-417f-b287-b7b6a32d26e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.524 2 DEBUG nova.virt.libvirt.driver [None req-7f733ef8-4e7c-4368-b0cb-01d434a21d42 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.524 2 DEBUG nova.virt.libvirt.driver [None req-7f733ef8-4e7c-4368-b0cb-01d434a21d42 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.524 2 DEBUG nova.virt.libvirt.driver [None req-7f733ef8-4e7c-4368-b0cb-01d434a21d42 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] No VIF found with MAC fa:16:3e:99:d9:88, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.524 2 DEBUG nova.virt.libvirt.driver [None req-7f733ef8-4e7c-4368-b0cb-01d434a21d42 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] No VIF found with MAC fa:16:3e:3c:a2:3f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:22:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:12.541 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[2baecdbf-3e4c-4850-b81d-2cc1665895ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.550 2 DEBUG nova.virt.libvirt.guest [None req-7f733ef8-4e7c-4368-b0cb-01d434a21d42 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:22:12 compute-0 nova_compute[192810]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:22:12 compute-0 nova_compute[192810]:   <nova:name>tempest-tempest.common.compute-instance-1609111197</nova:name>
Sep 30 21:22:12 compute-0 nova_compute[192810]:   <nova:creationTime>2025-09-30 21:22:12</nova:creationTime>
Sep 30 21:22:12 compute-0 nova_compute[192810]:   <nova:flavor name="m1.nano">
Sep 30 21:22:12 compute-0 nova_compute[192810]:     <nova:memory>128</nova:memory>
Sep 30 21:22:12 compute-0 nova_compute[192810]:     <nova:disk>1</nova:disk>
Sep 30 21:22:12 compute-0 nova_compute[192810]:     <nova:swap>0</nova:swap>
Sep 30 21:22:12 compute-0 nova_compute[192810]:     <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:22:12 compute-0 nova_compute[192810]:     <nova:vcpus>1</nova:vcpus>
Sep 30 21:22:12 compute-0 nova_compute[192810]:   </nova:flavor>
Sep 30 21:22:12 compute-0 nova_compute[192810]:   <nova:owner>
Sep 30 21:22:12 compute-0 nova_compute[192810]:     <nova:user uuid="c536ea061a32492a8c5e6bf941d1c9f3">tempest-AttachInterfacesTestJSON-32534463-project-member</nova:user>
Sep 30 21:22:12 compute-0 nova_compute[192810]:     <nova:project uuid="47d2c796445c4dd3affc8594502f04be">tempest-AttachInterfacesTestJSON-32534463</nova:project>
Sep 30 21:22:12 compute-0 nova_compute[192810]:   </nova:owner>
Sep 30 21:22:12 compute-0 nova_compute[192810]:   <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:22:12 compute-0 nova_compute[192810]:   <nova:ports>
Sep 30 21:22:12 compute-0 nova_compute[192810]:     <nova:port uuid="ce9edf2f-fd59-406b-b9c3-5b314d995fe9">
Sep 30 21:22:12 compute-0 nova_compute[192810]:       <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Sep 30 21:22:12 compute-0 nova_compute[192810]:     </nova:port>
Sep 30 21:22:12 compute-0 nova_compute[192810]:     <nova:port uuid="5008167c-4cca-4768-963e-ab0119180625">
Sep 30 21:22:12 compute-0 nova_compute[192810]:       <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Sep 30 21:22:12 compute-0 nova_compute[192810]:     </nova:port>
Sep 30 21:22:12 compute-0 nova_compute[192810]:   </nova:ports>
Sep 30 21:22:12 compute-0 nova_compute[192810]: </nova:instance>
Sep 30 21:22:12 compute-0 nova_compute[192810]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Sep 30 21:22:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:12.563 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[fd6818bb-3bef-412a-a663-a1d525c10f64]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap29d3fdc6-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:10:f7:fe'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 399635, 'reachable_time': 15623, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225207, 'error': None, 'target': 'ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.579 2 DEBUG oslo_concurrency.lockutils [None req-7f733ef8-4e7c-4368-b0cb-01d434a21d42 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lock "interface-370a976e-b51f-4187-bf76-bf6cbcae956b-5008167c-4cca-4768-963e-ab0119180625" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 7.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:12.579 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[6e032488-b635-41b5-9d8d-6dee6baec83c]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap29d3fdc6-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 399648, 'tstamp': 399648}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225208, 'error': None, 'target': 'ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap29d3fdc6-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 399651, 'tstamp': 399651}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225208, 'error': None, 'target': 'ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:12.580 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap29d3fdc6-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.581 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.586 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:12.586 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap29d3fdc6-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:22:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:12.586 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:22:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:12.587 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap29d3fdc6-d0, col_values=(('external_ids', {'iface-id': 'ad63e4cf-251e-40e7-aea0-9713eaa58a32'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:22:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:12.587 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.957 2 INFO nova.virt.libvirt.driver [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Instance shutdown successfully after 13 seconds.
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.965 2 INFO nova.virt.libvirt.driver [-] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Instance destroyed successfully.
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.972 2 INFO nova.virt.libvirt.driver [-] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Instance destroyed successfully.
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.974 2 DEBUG nova.virt.libvirt.vif [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-09-30T21:21:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-988999069',display_name='tempest-ServersAdminTestJSON-server-988999069',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-988999069',id=28,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:21:53Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='54fb4b43d65542abbb73044c6d52da8a',ramdisk_id='',reservation_id='r-1cnrm9nr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-302955212',owner_user_name='tempest-ServersAdminTestJSON-302955212-project-member'},tags
=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:21:58Z,user_data=None,user_id='fdcdab028fdf46d7ba6634c631d3d33c',uuid=c79f1121-fcfb-4c07-94cf-1389e1df9e81,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0ad4c684-4514-4550-b708-6339f933766c", "address": "fa:16:3e:c1:12:42", "network": {"id": "7ae61a95-12e5-48ee-93a0-85e12f8652eb", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-47213222-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "54fb4b43d65542abbb73044c6d52da8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ad4c684-45", "ovs_interfaceid": "0ad4c684-4514-4550-b708-6339f933766c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.974 2 DEBUG nova.network.os_vif_util [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Converting VIF {"id": "0ad4c684-4514-4550-b708-6339f933766c", "address": "fa:16:3e:c1:12:42", "network": {"id": "7ae61a95-12e5-48ee-93a0-85e12f8652eb", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-47213222-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "54fb4b43d65542abbb73044c6d52da8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ad4c684-45", "ovs_interfaceid": "0ad4c684-4514-4550-b708-6339f933766c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.976 2 DEBUG nova.network.os_vif_util [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c1:12:42,bridge_name='br-int',has_traffic_filtering=True,id=0ad4c684-4514-4550-b708-6339f933766c,network=Network(7ae61a95-12e5-48ee-93a0-85e12f8652eb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ad4c684-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.977 2 DEBUG os_vif [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c1:12:42,bridge_name='br-int',has_traffic_filtering=True,id=0ad4c684-4514-4550-b708-6339f933766c,network=Network(7ae61a95-12e5-48ee-93a0-85e12f8652eb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ad4c684-45') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.982 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0ad4c684-45, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.986 2 DEBUG nova.compute.manager [req-4aba55fe-64da-4120-b822-89ac1ce8e05c req-0363430f-4644-46f7-8f99-f1422ae70ac7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Received event network-vif-plugged-5008167c-4cca-4768-963e-ab0119180625 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.986 2 DEBUG oslo_concurrency.lockutils [req-4aba55fe-64da-4120-b822-89ac1ce8e05c req-0363430f-4644-46f7-8f99-f1422ae70ac7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "370a976e-b51f-4187-bf76-bf6cbcae956b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.987 2 DEBUG oslo_concurrency.lockutils [req-4aba55fe-64da-4120-b822-89ac1ce8e05c req-0363430f-4644-46f7-8f99-f1422ae70ac7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "370a976e-b51f-4187-bf76-bf6cbcae956b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.987 2 DEBUG oslo_concurrency.lockutils [req-4aba55fe-64da-4120-b822-89ac1ce8e05c req-0363430f-4644-46f7-8f99-f1422ae70ac7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "370a976e-b51f-4187-bf76-bf6cbcae956b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.987 2 DEBUG nova.compute.manager [req-4aba55fe-64da-4120-b822-89ac1ce8e05c req-0363430f-4644-46f7-8f99-f1422ae70ac7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] No waiting events found dispatching network-vif-plugged-5008167c-4cca-4768-963e-ab0119180625 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.988 2 WARNING nova.compute.manager [req-4aba55fe-64da-4120-b822-89ac1ce8e05c req-0363430f-4644-46f7-8f99-f1422ae70ac7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Received unexpected event network-vif-plugged-5008167c-4cca-4768-963e-ab0119180625 for instance with vm_state active and task_state None.
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.992 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.994 2 INFO os_vif [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c1:12:42,bridge_name='br-int',has_traffic_filtering=True,id=0ad4c684-4514-4550-b708-6339f933766c,network=Network(7ae61a95-12e5-48ee-93a0-85e12f8652eb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ad4c684-45')
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.995 2 INFO nova.virt.libvirt.driver [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Deleting instance files /var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81_del
Sep 30 21:22:12 compute-0 nova_compute[192810]: 2025-09-30 21:22:12.997 2 INFO nova.virt.libvirt.driver [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Deletion of /var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81_del complete
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.191 2 DEBUG nova.virt.libvirt.driver [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.192 2 INFO nova.virt.libvirt.driver [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Creating image(s)
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.192 2 DEBUG oslo_concurrency.lockutils [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Acquiring lock "/var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.192 2 DEBUG oslo_concurrency.lockutils [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "/var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.193 2 DEBUG oslo_concurrency.lockutils [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "/var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.207 2 DEBUG oslo_concurrency.processutils [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.296 2 DEBUG oslo_concurrency.processutils [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.297 2 DEBUG oslo_concurrency.lockutils [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.298 2 DEBUG oslo_concurrency.lockutils [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.309 2 DEBUG oslo_concurrency.processutils [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.369 2 DEBUG oslo_concurrency.processutils [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.371 2 DEBUG oslo_concurrency.processutils [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.418 2 DEBUG oslo_concurrency.processutils [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/disk 1073741824" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.419 2 DEBUG oslo_concurrency.lockutils [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.419 2 DEBUG oslo_concurrency.processutils [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.471 2 DEBUG oslo_concurrency.processutils [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.472 2 DEBUG nova.virt.disk.api [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Checking if we can resize image /var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.473 2 DEBUG oslo_concurrency.processutils [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.543 2 DEBUG oslo_concurrency.lockutils [None req-1fb2abe7-98dd-40bf-b025-d396c5d4b7f7 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Acquiring lock "interface-370a976e-b51f-4187-bf76-bf6cbcae956b-5008167c-4cca-4768-963e-ab0119180625" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.543 2 DEBUG oslo_concurrency.lockutils [None req-1fb2abe7-98dd-40bf-b025-d396c5d4b7f7 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lock "interface-370a976e-b51f-4187-bf76-bf6cbcae956b-5008167c-4cca-4768-963e-ab0119180625" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.546 2 DEBUG oslo_concurrency.processutils [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.546 2 DEBUG nova.virt.disk.api [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Cannot resize image /var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.547 2 DEBUG nova.virt.libvirt.driver [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.547 2 DEBUG nova.virt.libvirt.driver [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Ensure instance console log exists: /var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.547 2 DEBUG oslo_concurrency.lockutils [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.547 2 DEBUG oslo_concurrency.lockutils [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.548 2 DEBUG oslo_concurrency.lockutils [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.549 2 DEBUG nova.virt.libvirt.driver [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Start _get_guest_xml network_info=[{"id": "0ad4c684-4514-4550-b708-6339f933766c", "address": "fa:16:3e:c1:12:42", "network": {"id": "7ae61a95-12e5-48ee-93a0-85e12f8652eb", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-47213222-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "54fb4b43d65542abbb73044c6d52da8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ad4c684-45", "ovs_interfaceid": "0ad4c684-4514-4550-b708-6339f933766c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.553 2 WARNING nova.virt.libvirt.driver [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.559 2 DEBUG nova.objects.instance [None req-1fb2abe7-98dd-40bf-b025-d396c5d4b7f7 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lazy-loading 'flavor' on Instance uuid 370a976e-b51f-4187-bf76-bf6cbcae956b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.561 2 DEBUG nova.virt.libvirt.host [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.561 2 DEBUG nova.virt.libvirt.host [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.564 2 DEBUG nova.virt.libvirt.host [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.564 2 DEBUG nova.virt.libvirt.host [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.565 2 DEBUG nova.virt.libvirt.driver [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.565 2 DEBUG nova.virt.hardware [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.566 2 DEBUG nova.virt.hardware [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.566 2 DEBUG nova.virt.hardware [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.566 2 DEBUG nova.virt.hardware [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.566 2 DEBUG nova.virt.hardware [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.566 2 DEBUG nova.virt.hardware [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.567 2 DEBUG nova.virt.hardware [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.567 2 DEBUG nova.virt.hardware [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.567 2 DEBUG nova.virt.hardware [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.567 2 DEBUG nova.virt.hardware [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.567 2 DEBUG nova.virt.hardware [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.567 2 DEBUG nova.objects.instance [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lazy-loading 'vcpu_model' on Instance uuid c79f1121-fcfb-4c07-94cf-1389e1df9e81 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.591 2 DEBUG nova.virt.libvirt.vif [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-09-30T21:21:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-988999069',display_name='tempest-ServersAdminTestJSON-server-988999069',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-988999069',id=28,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:21:53Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='54fb4b43d65542abbb73044c6d52da8a',ramdisk_id='',reservation_id='r-1cnrm9nr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='2',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-302955212',owner_user_name='tempest-ServersAdminTestJSON-302
955212-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:22:13Z,user_data=None,user_id='fdcdab028fdf46d7ba6634c631d3d33c',uuid=c79f1121-fcfb-4c07-94cf-1389e1df9e81,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0ad4c684-4514-4550-b708-6339f933766c", "address": "fa:16:3e:c1:12:42", "network": {"id": "7ae61a95-12e5-48ee-93a0-85e12f8652eb", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-47213222-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "54fb4b43d65542abbb73044c6d52da8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ad4c684-45", "ovs_interfaceid": "0ad4c684-4514-4550-b708-6339f933766c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.592 2 DEBUG nova.network.os_vif_util [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Converting VIF {"id": "0ad4c684-4514-4550-b708-6339f933766c", "address": "fa:16:3e:c1:12:42", "network": {"id": "7ae61a95-12e5-48ee-93a0-85e12f8652eb", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-47213222-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "54fb4b43d65542abbb73044c6d52da8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ad4c684-45", "ovs_interfaceid": "0ad4c684-4514-4550-b708-6339f933766c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.592 2 DEBUG nova.network.os_vif_util [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c1:12:42,bridge_name='br-int',has_traffic_filtering=True,id=0ad4c684-4514-4550-b708-6339f933766c,network=Network(7ae61a95-12e5-48ee-93a0-85e12f8652eb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ad4c684-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.593 2 DEBUG nova.virt.libvirt.driver [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <uuid>c79f1121-fcfb-4c07-94cf-1389e1df9e81</uuid>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <name>instance-0000001c</name>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <nova:name>tempest-ServersAdminTestJSON-server-988999069</nova:name>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:22:13</nova:creationTime>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:22:13 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:22:13 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:22:13 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:22:13 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:22:13 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:22:13 compute-0 nova_compute[192810]:         <nova:user uuid="fdcdab028fdf46d7ba6634c631d3d33c">tempest-ServersAdminTestJSON-302955212-project-member</nova:user>
Sep 30 21:22:13 compute-0 nova_compute[192810]:         <nova:project uuid="54fb4b43d65542abbb73044c6d52da8a">tempest-ServersAdminTestJSON-302955212</nova:project>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:22:13 compute-0 nova_compute[192810]:         <nova:port uuid="0ad4c684-4514-4550-b708-6339f933766c">
Sep 30 21:22:13 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <system>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <entry name="serial">c79f1121-fcfb-4c07-94cf-1389e1df9e81</entry>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <entry name="uuid">c79f1121-fcfb-4c07-94cf-1389e1df9e81</entry>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </system>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <os>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   </os>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <features>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   </features>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/disk"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/disk.config"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:c1:12:42"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target dev="tap0ad4c684-45"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/console.log" append="off"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <video>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </video>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:22:13 compute-0 nova_compute[192810]: </domain>
Sep 30 21:22:13 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.594 2 DEBUG nova.virt.libvirt.vif [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-09-30T21:21:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-988999069',display_name='tempest-ServersAdminTestJSON-server-988999069',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-988999069',id=28,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:21:53Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='54fb4b43d65542abbb73044c6d52da8a',ramdisk_id='',reservation_id='r-1cnrm9nr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='2',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-302955212',owner_user_name='tempest-ServersAdminTestJSON-302
955212-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:22:13Z,user_data=None,user_id='fdcdab028fdf46d7ba6634c631d3d33c',uuid=c79f1121-fcfb-4c07-94cf-1389e1df9e81,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0ad4c684-4514-4550-b708-6339f933766c", "address": "fa:16:3e:c1:12:42", "network": {"id": "7ae61a95-12e5-48ee-93a0-85e12f8652eb", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-47213222-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "54fb4b43d65542abbb73044c6d52da8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ad4c684-45", "ovs_interfaceid": "0ad4c684-4514-4550-b708-6339f933766c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.594 2 DEBUG nova.network.os_vif_util [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Converting VIF {"id": "0ad4c684-4514-4550-b708-6339f933766c", "address": "fa:16:3e:c1:12:42", "network": {"id": "7ae61a95-12e5-48ee-93a0-85e12f8652eb", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-47213222-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "54fb4b43d65542abbb73044c6d52da8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ad4c684-45", "ovs_interfaceid": "0ad4c684-4514-4550-b708-6339f933766c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.594 2 DEBUG nova.network.os_vif_util [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c1:12:42,bridge_name='br-int',has_traffic_filtering=True,id=0ad4c684-4514-4550-b708-6339f933766c,network=Network(7ae61a95-12e5-48ee-93a0-85e12f8652eb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ad4c684-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.595 2 DEBUG os_vif [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c1:12:42,bridge_name='br-int',has_traffic_filtering=True,id=0ad4c684-4514-4550-b708-6339f933766c,network=Network(7ae61a95-12e5-48ee-93a0-85e12f8652eb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ad4c684-45') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.596 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.596 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.599 2 DEBUG nova.virt.libvirt.vif [None req-1fb2abe7-98dd-40bf-b025-d396c5d4b7f7 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:21:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1609111197',display_name='tempest-tempest.common.compute-instance-1609111197',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1609111197',id=34,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAiqd8zlUv0daoHrSM4f0FkhoHklAhet2FPUnE56/Ac7ATsAijkYKDWRYPhtHLrbJjDveTvHop3CVY09bPDxSILijQRoZQfSPrdRSYWqRSb8fAb7+uxFNn+ITDg2wp4sFw==',key_name='tempest-keypair-804840075',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:21:41Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='47d2c796445c4dd3affc8594502f04be',ramdisk_id='',reservation_id='r-9ynmfskw',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-32534463',owner_user_name='tempest-AttachInterfacesTestJSON-32534463-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:21:41Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c536ea061a32492a8c5e6bf941d1c9f3',uuid=370a976e-b51f-4187-bf76-bf6cbcae956b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5008167c-4cca-4768-963e-ab0119180625", "address": "fa:16:3e:3c:a2:3f", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5008167c-4c", "ovs_interfaceid": "5008167c-4cca-4768-963e-ab0119180625", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.599 2 DEBUG nova.network.os_vif_util [None req-1fb2abe7-98dd-40bf-b025-d396c5d4b7f7 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Converting VIF {"id": "5008167c-4cca-4768-963e-ab0119180625", "address": "fa:16:3e:3c:a2:3f", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5008167c-4c", "ovs_interfaceid": "5008167c-4cca-4768-963e-ab0119180625", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.599 2 DEBUG nova.network.os_vif_util [None req-1fb2abe7-98dd-40bf-b025-d396c5d4b7f7 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3c:a2:3f,bridge_name='br-int',has_traffic_filtering=True,id=5008167c-4cca-4768-963e-ab0119180625,network=Network(29d3fdc6-d8e1-4032-8f0c-e91da2912153),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5008167c-4c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.601 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.601 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0ad4c684-45, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.602 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0ad4c684-45, col_values=(('external_ids', {'iface-id': '0ad4c684-4514-4550-b708-6339f933766c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c1:12:42', 'vm-uuid': 'c79f1121-fcfb-4c07-94cf-1389e1df9e81'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.603 2 DEBUG nova.virt.libvirt.guest [None req-1fb2abe7-98dd-40bf-b025-d396c5d4b7f7 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:3c:a2:3f"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap5008167c-4c"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Sep 30 21:22:13 compute-0 NetworkManager[51733]: <info>  [1759267333.6367] manager: (tap0ad4c684-45): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/70)
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.635 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.638 2 DEBUG nova.virt.libvirt.guest [None req-1fb2abe7-98dd-40bf-b025-d396c5d4b7f7 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:3c:a2:3f"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap5008167c-4c"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.639 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.640 2 DEBUG nova.virt.libvirt.driver [None req-1fb2abe7-98dd-40bf-b025-d396c5d4b7f7 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Attempting to detach device tap5008167c-4c from instance 370a976e-b51f-4187-bf76-bf6cbcae956b from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.641 2 DEBUG nova.virt.libvirt.guest [None req-1fb2abe7-98dd-40bf-b025-d396c5d4b7f7 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] detach device xml: <interface type="ethernet">
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <mac address="fa:16:3e:3c:a2:3f"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <model type="virtio"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <mtu size="1442"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <target dev="tap5008167c-4c"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]: </interface>
Sep 30 21:22:13 compute-0 nova_compute[192810]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.644 2 INFO os_vif [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c1:12:42,bridge_name='br-int',has_traffic_filtering=True,id=0ad4c684-4514-4550-b708-6339f933766c,network=Network(7ae61a95-12e5-48ee-93a0-85e12f8652eb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ad4c684-45')
Sep 30 21:22:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:13.646 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.646 2 DEBUG nova.virt.libvirt.guest [None req-1fb2abe7-98dd-40bf-b025-d396c5d4b7f7 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:3c:a2:3f"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap5008167c-4c"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:13.647 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.651 2 DEBUG nova.virt.libvirt.guest [None req-1fb2abe7-98dd-40bf-b025-d396c5d4b7f7 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:3c:a2:3f"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap5008167c-4c"/></interface>not found in domain: <domain type='kvm' id='14'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <name>instance-00000022</name>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <uuid>370a976e-b51f-4187-bf76-bf6cbcae956b</uuid>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <nova:name>tempest-tempest.common.compute-instance-1609111197</nova:name>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <nova:creationTime>2025-09-30 21:22:12</nova:creationTime>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <nova:flavor name="m1.nano">
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <nova:memory>128</nova:memory>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <nova:disk>1</nova:disk>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <nova:swap>0</nova:swap>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <nova:vcpus>1</nova:vcpus>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   </nova:flavor>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <nova:owner>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <nova:user uuid="c536ea061a32492a8c5e6bf941d1c9f3">tempest-AttachInterfacesTestJSON-32534463-project-member</nova:user>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <nova:project uuid="47d2c796445c4dd3affc8594502f04be">tempest-AttachInterfacesTestJSON-32534463</nova:project>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   </nova:owner>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <nova:ports>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <nova:port uuid="ce9edf2f-fd59-406b-b9c3-5b314d995fe9">
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </nova:port>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <nova:port uuid="5008167c-4cca-4768-963e-ab0119180625">
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </nova:port>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   </nova:ports>
Sep 30 21:22:13 compute-0 nova_compute[192810]: </nova:instance>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <memory unit='KiB'>131072</memory>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <currentMemory unit='KiB'>131072</currentMemory>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <vcpu placement='static'>1</vcpu>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <resource>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <partition>/machine</partition>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   </resource>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <sysinfo type='smbios'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <system>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <entry name='manufacturer'>RDO</entry>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <entry name='product'>OpenStack Compute</entry>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <entry name='serial'>370a976e-b51f-4187-bf76-bf6cbcae956b</entry>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <entry name='uuid'>370a976e-b51f-4187-bf76-bf6cbcae956b</entry>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <entry name='family'>Virtual Machine</entry>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </system>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <os>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <type arch='x86_64' machine='pc-q35-rhel9.6.0'>hvm</type>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <boot dev='hd'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <smbios mode='sysinfo'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   </os>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <features>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <vmcoreinfo state='on'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   </features>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <cpu mode='custom' match='exact' check='full'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <model fallback='forbid'>Nehalem</model>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <feature policy='require' name='x2apic'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <feature policy='require' name='hypervisor'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <feature policy='require' name='vme'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <clock offset='utc'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <timer name='pit' tickpolicy='delay'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <timer name='rtc' tickpolicy='catchup'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <timer name='hpet' present='no'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <on_poweroff>destroy</on_poweroff>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <on_reboot>restart</on_reboot>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <on_crash>destroy</on_crash>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <disk type='file' device='disk'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <driver name='qemu' type='qcow2' cache='none'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <source file='/var/lib/nova/instances/370a976e-b51f-4187-bf76-bf6cbcae956b/disk' index='2'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <backingStore type='file' index='3'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:         <format type='raw'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:         <source file='/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:         <backingStore/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       </backingStore>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target dev='vda' bus='virtio'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='virtio-disk0'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <disk type='file' device='cdrom'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <driver name='qemu' type='raw' cache='none'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <source file='/var/lib/nova/instances/370a976e-b51f-4187-bf76-bf6cbcae956b/disk.config' index='1'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <backingStore/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target dev='sda' bus='sata'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <readonly/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='sata0-0-0'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='0' model='pcie-root'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pcie.0'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='1' model='pcie-root-port'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-root-port'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target chassis='1' port='0x10'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.1'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='2' model='pcie-root-port'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-root-port'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target chassis='2' port='0x11'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.2'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='3' model='pcie-root-port'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-root-port'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target chassis='3' port='0x12'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.3'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='4' model='pcie-root-port'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-root-port'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target chassis='4' port='0x13'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.4'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='5' model='pcie-root-port'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-root-port'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target chassis='5' port='0x14'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.5'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='6' model='pcie-root-port'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-root-port'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target chassis='6' port='0x15'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.6'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='7' model='pcie-root-port'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-root-port'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target chassis='7' port='0x16'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.7'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='8' model='pcie-root-port'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-root-port'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target chassis='8' port='0x17'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.8'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='9' model='pcie-root-port'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-root-port'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target chassis='9' port='0x18'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.9'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='10' model='pcie-root-port'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-root-port'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target chassis='10' port='0x19'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.10'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='11' model='pcie-root-port'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-root-port'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target chassis='11' port='0x1a'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.11'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='12' model='pcie-root-port'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-root-port'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target chassis='12' port='0x1b'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.12'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='13' model='pcie-root-port'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-root-port'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target chassis='13' port='0x1c'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.13'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='14' model='pcie-root-port'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-root-port'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target chassis='14' port='0x1d'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.14'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='15' model='pcie-root-port'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-root-port'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target chassis='15' port='0x1e'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.15'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='16' model='pcie-root-port'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-root-port'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target chassis='16' port='0x1f'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.16'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='17' model='pcie-root-port'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-root-port'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target chassis='17' port='0x20'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.17'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='18' model='pcie-root-port'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-root-port'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target chassis='18' port='0x21'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.18'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='19' model='pcie-root-port'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-root-port'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target chassis='19' port='0x22'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.19'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='20' model='pcie-root-port'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-root-port'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target chassis='20' port='0x23'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.20'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='21' model='pcie-root-port'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-root-port'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target chassis='21' port='0x24'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.21'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='22' model='pcie-root-port'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-root-port'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target chassis='22' port='0x25'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.22'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='23' model='pcie-root-port'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-root-port'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target chassis='23' port='0x26'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.23'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='24' model='pcie-root-port'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-root-port'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target chassis='24' port='0x27'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.24'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='25' model='pcie-root-port'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-root-port'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target chassis='25' port='0x28'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.25'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-pci-bridge'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.26'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='usb' index='0' model='piix3-uhci'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='usb'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='sata' index='0'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='ide'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <interface type='ethernet'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <mac address='fa:16:3e:99:d9:88'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target dev='tapce9edf2f-fd'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model type='virtio'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <driver name='vhost' rx_queue_size='512'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <mtu size='1442'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='net0'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <interface type='ethernet'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <mac address='fa:16:3e:3c:a2:3f'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target dev='tap5008167c-4c'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model type='virtio'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <driver name='vhost' rx_queue_size='512'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <mtu size='1442'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='net1'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <serial type='pty'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <source path='/dev/pts/2'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <log file='/var/lib/nova/instances/370a976e-b51f-4187-bf76-bf6cbcae956b/console.log' append='off'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target type='isa-serial' port='0'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:         <model name='isa-serial'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       </target>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='serial0'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <console type='pty' tty='/dev/pts/2'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <source path='/dev/pts/2'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <log file='/var/lib/nova/instances/370a976e-b51f-4187-bf76-bf6cbcae956b/console.log' append='off'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target type='serial' port='0'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='serial0'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </console>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <input type='tablet' bus='usb'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='input0'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='usb' bus='0' port='1'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </input>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <input type='mouse' bus='ps2'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='input1'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </input>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <input type='keyboard' bus='ps2'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='input2'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </input>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <graphics type='vnc' port='5902' autoport='yes' listen='::0'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <listen type='address' address='::0'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </graphics>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <audio id='1' type='none'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <video>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model type='virtio' heads='1' primary='yes'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='video0'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </video>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <watchdog model='itco' action='reset'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='watchdog0'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </watchdog>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <memballoon model='virtio'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <stats period='10'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='balloon0'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <rng model='virtio'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <backend model='random'>/dev/urandom</backend>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='rng0'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <label>system_u:system_r:svirt_t:s0:c2,c274</label>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c2,c274</imagelabel>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   </seclabel>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <label>+107:+107</label>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <imagelabel>+107:+107</imagelabel>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   </seclabel>
Sep 30 21:22:13 compute-0 nova_compute[192810]: </domain>
Sep 30 21:22:13 compute-0 nova_compute[192810]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.652 2 INFO nova.virt.libvirt.driver [None req-1fb2abe7-98dd-40bf-b025-d396c5d4b7f7 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Successfully detached device tap5008167c-4c from instance 370a976e-b51f-4187-bf76-bf6cbcae956b from the persistent domain config.
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.652 2 DEBUG nova.virt.libvirt.driver [None req-1fb2abe7-98dd-40bf-b025-d396c5d4b7f7 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] (1/8): Attempting to detach device tap5008167c-4c with device alias net1 from instance 370a976e-b51f-4187-bf76-bf6cbcae956b from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.652 2 DEBUG nova.virt.libvirt.guest [None req-1fb2abe7-98dd-40bf-b025-d396c5d4b7f7 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] detach device xml: <interface type="ethernet">
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <mac address="fa:16:3e:3c:a2:3f"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <model type="virtio"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <mtu size="1442"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <target dev="tap5008167c-4c"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]: </interface>
Sep 30 21:22:13 compute-0 nova_compute[192810]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Sep 30 21:22:13 compute-0 kernel: tap5008167c-4c (unregistering): left promiscuous mode
Sep 30 21:22:13 compute-0 NetworkManager[51733]: <info>  [1759267333.7163] device (tap5008167c-4c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.719 2 DEBUG nova.virt.libvirt.driver [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.719 2 DEBUG nova.virt.libvirt.driver [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.720 2 DEBUG nova.virt.libvirt.driver [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] No VIF found with MAC fa:16:3e:c1:12:42, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.720 2 INFO nova.virt.libvirt.driver [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Using config drive
Sep 30 21:22:13 compute-0 ovn_controller[94912]: 2025-09-30T21:22:13Z|00148|binding|INFO|Releasing lport 5008167c-4cca-4768-963e-ab0119180625 from this chassis (sb_readonly=0)
Sep 30 21:22:13 compute-0 ovn_controller[94912]: 2025-09-30T21:22:13Z|00149|binding|INFO|Setting lport 5008167c-4cca-4768-963e-ab0119180625 down in Southbound
Sep 30 21:22:13 compute-0 ovn_controller[94912]: 2025-09-30T21:22:13Z|00150|binding|INFO|Removing iface tap5008167c-4c ovn-installed in OVS
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.727 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.730 2 DEBUG nova.virt.libvirt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Received event <DeviceRemovedEvent: 1759267333.7299118, 370a976e-b51f-4187-bf76-bf6cbcae956b => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.731 2 DEBUG nova.virt.libvirt.driver [None req-1fb2abe7-98dd-40bf-b025-d396c5d4b7f7 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Start waiting for the detach event from libvirt for device tap5008167c-4c with device alias net1 for instance 370a976e-b51f-4187-bf76-bf6cbcae956b _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.732 2 DEBUG nova.virt.libvirt.guest [None req-1fb2abe7-98dd-40bf-b025-d396c5d4b7f7 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:3c:a2:3f"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap5008167c-4c"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Sep 30 21:22:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:13.732 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3c:a2:3f 10.100.0.11'], port_security=['fa:16:3e:3c:a2:3f 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-1560063103', 'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '370a976e-b51f-4187-bf76-bf6cbcae956b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-29d3fdc6-d8e1-4032-8f0c-e91da2912153', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-1560063103', 'neutron:project_id': '47d2c796445c4dd3affc8594502f04be', 'neutron:revision_number': '9', 'neutron:security_group_ids': '0583c73e-88bf-4029-98b4-7475adfa8c7d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4f6d308b-b549-4733-b51f-a3dd42be0f08, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=5008167c-4cca-4768-963e-ab0119180625) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:22:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:13.733 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 5008167c-4cca-4768-963e-ab0119180625 in datapath 29d3fdc6-d8e1-4032-8f0c-e91da2912153 unbound from our chassis
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.735 2 DEBUG nova.virt.libvirt.guest [None req-1fb2abe7-98dd-40bf-b025-d396c5d4b7f7 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:3c:a2:3f"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap5008167c-4c"/></interface>not found in domain: <domain type='kvm' id='14'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <name>instance-00000022</name>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <uuid>370a976e-b51f-4187-bf76-bf6cbcae956b</uuid>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <nova:name>tempest-tempest.common.compute-instance-1609111197</nova:name>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <nova:creationTime>2025-09-30 21:22:12</nova:creationTime>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <nova:flavor name="m1.nano">
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <nova:memory>128</nova:memory>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <nova:disk>1</nova:disk>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <nova:swap>0</nova:swap>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <nova:vcpus>1</nova:vcpus>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   </nova:flavor>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <nova:owner>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <nova:user uuid="c536ea061a32492a8c5e6bf941d1c9f3">tempest-AttachInterfacesTestJSON-32534463-project-member</nova:user>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <nova:project uuid="47d2c796445c4dd3affc8594502f04be">tempest-AttachInterfacesTestJSON-32534463</nova:project>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   </nova:owner>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <nova:ports>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <nova:port uuid="ce9edf2f-fd59-406b-b9c3-5b314d995fe9">
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </nova:port>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <nova:port uuid="5008167c-4cca-4768-963e-ab0119180625">
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </nova:port>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   </nova:ports>
Sep 30 21:22:13 compute-0 nova_compute[192810]: </nova:instance>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <memory unit='KiB'>131072</memory>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <currentMemory unit='KiB'>131072</currentMemory>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <vcpu placement='static'>1</vcpu>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <resource>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <partition>/machine</partition>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   </resource>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <sysinfo type='smbios'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <system>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <entry name='manufacturer'>RDO</entry>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <entry name='product'>OpenStack Compute</entry>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <entry name='serial'>370a976e-b51f-4187-bf76-bf6cbcae956b</entry>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <entry name='uuid'>370a976e-b51f-4187-bf76-bf6cbcae956b</entry>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <entry name='family'>Virtual Machine</entry>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </system>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <os>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <type arch='x86_64' machine='pc-q35-rhel9.6.0'>hvm</type>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <boot dev='hd'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <smbios mode='sysinfo'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   </os>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <features>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <vmcoreinfo state='on'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   </features>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <cpu mode='custom' match='exact' check='full'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <model fallback='forbid'>Nehalem</model>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <feature policy='require' name='x2apic'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <feature policy='require' name='hypervisor'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <feature policy='require' name='vme'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <clock offset='utc'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <timer name='pit' tickpolicy='delay'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <timer name='rtc' tickpolicy='catchup'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <timer name='hpet' present='no'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <on_poweroff>destroy</on_poweroff>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <on_reboot>restart</on_reboot>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <on_crash>destroy</on_crash>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <disk type='file' device='disk'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <driver name='qemu' type='qcow2' cache='none'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <source file='/var/lib/nova/instances/370a976e-b51f-4187-bf76-bf6cbcae956b/disk' index='2'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <backingStore type='file' index='3'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:         <format type='raw'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:         <source file='/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:         <backingStore/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       </backingStore>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target dev='vda' bus='virtio'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='virtio-disk0'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Sep 30 21:22:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:13.735 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 29d3fdc6-d8e1-4032-8f0c-e91da2912153
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <disk type='file' device='cdrom'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <driver name='qemu' type='raw' cache='none'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <source file='/var/lib/nova/instances/370a976e-b51f-4187-bf76-bf6cbcae956b/disk.config' index='1'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <backingStore/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target dev='sda' bus='sata'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <readonly/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='sata0-0-0'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='0' model='pcie-root'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pcie.0'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='1' model='pcie-root-port'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-root-port'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target chassis='1' port='0x10'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.1'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='2' model='pcie-root-port'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-root-port'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target chassis='2' port='0x11'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.2'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='3' model='pcie-root-port'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-root-port'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target chassis='3' port='0x12'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.3'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='4' model='pcie-root-port'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-root-port'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target chassis='4' port='0x13'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.4'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='5' model='pcie-root-port'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-root-port'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target chassis='5' port='0x14'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.5'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='6' model='pcie-root-port'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-root-port'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target chassis='6' port='0x15'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.6'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='7' model='pcie-root-port'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-root-port'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target chassis='7' port='0x16'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.7'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='8' model='pcie-root-port'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-root-port'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target chassis='8' port='0x17'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.8'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='9' model='pcie-root-port'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-root-port'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target chassis='9' port='0x18'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.9'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='10' model='pcie-root-port'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-root-port'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target chassis='10' port='0x19'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.10'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='11' model='pcie-root-port'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-root-port'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target chassis='11' port='0x1a'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.11'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='12' model='pcie-root-port'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-root-port'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target chassis='12' port='0x1b'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.12'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='13' model='pcie-root-port'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-root-port'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target chassis='13' port='0x1c'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.13'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='14' model='pcie-root-port'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-root-port'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target chassis='14' port='0x1d'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.14'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='15' model='pcie-root-port'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-root-port'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target chassis='15' port='0x1e'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.15'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='16' model='pcie-root-port'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-root-port'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target chassis='16' port='0x1f'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.16'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='17' model='pcie-root-port'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-root-port'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target chassis='17' port='0x20'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.17'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='18' model='pcie-root-port'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-root-port'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target chassis='18' port='0x21'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.18'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='19' model='pcie-root-port'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-root-port'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target chassis='19' port='0x22'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.19'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='20' model='pcie-root-port'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-root-port'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target chassis='20' port='0x23'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.20'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='21' model='pcie-root-port'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-root-port'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target chassis='21' port='0x24'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.21'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='22' model='pcie-root-port'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-root-port'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target chassis='22' port='0x25'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.22'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='23' model='pcie-root-port'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-root-port'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target chassis='23' port='0x26'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.23'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='24' model='pcie-root-port'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-root-port'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target chassis='24' port='0x27'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.24'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='25' model='pcie-root-port'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-root-port'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target chassis='25' port='0x28'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.25'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model name='pcie-pci-bridge'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='pci.26'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='usb' index='0' model='piix3-uhci'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='usb'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <controller type='sata' index='0'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='ide'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </controller>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <interface type='ethernet'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <mac address='fa:16:3e:99:d9:88'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target dev='tapce9edf2f-fd'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model type='virtio'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <driver name='vhost' rx_queue_size='512'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <mtu size='1442'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='net0'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <serial type='pty'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <source path='/dev/pts/2'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <log file='/var/lib/nova/instances/370a976e-b51f-4187-bf76-bf6cbcae956b/console.log' append='off'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target type='isa-serial' port='0'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:         <model name='isa-serial'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       </target>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='serial0'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <console type='pty' tty='/dev/pts/2'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <source path='/dev/pts/2'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <log file='/var/lib/nova/instances/370a976e-b51f-4187-bf76-bf6cbcae956b/console.log' append='off'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <target type='serial' port='0'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='serial0'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </console>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <input type='tablet' bus='usb'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='input0'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='usb' bus='0' port='1'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </input>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <input type='mouse' bus='ps2'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='input1'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </input>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <input type='keyboard' bus='ps2'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='input2'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </input>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <graphics type='vnc' port='5902' autoport='yes' listen='::0'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <listen type='address' address='::0'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </graphics>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <audio id='1' type='none'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <video>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <model type='virtio' heads='1' primary='yes'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='video0'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </video>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <watchdog model='itco' action='reset'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='watchdog0'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </watchdog>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <memballoon model='virtio'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <stats period='10'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='balloon0'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <rng model='virtio'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <backend model='random'>/dev/urandom</backend>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <alias name='rng0'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <label>system_u:system_r:svirt_t:s0:c2,c274</label>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c2,c274</imagelabel>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   </seclabel>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <label>+107:+107</label>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <imagelabel>+107:+107</imagelabel>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   </seclabel>
Sep 30 21:22:13 compute-0 nova_compute[192810]: </domain>
Sep 30 21:22:13 compute-0 nova_compute[192810]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.735 2 INFO nova.virt.libvirt.driver [None req-1fb2abe7-98dd-40bf-b025-d396c5d4b7f7 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Successfully detached device tap5008167c-4c from instance 370a976e-b51f-4187-bf76-bf6cbcae956b from the live domain config.
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.736 2 DEBUG nova.virt.libvirt.vif [None req-1fb2abe7-98dd-40bf-b025-d396c5d4b7f7 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:21:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1609111197',display_name='tempest-tempest.common.compute-instance-1609111197',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1609111197',id=34,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAiqd8zlUv0daoHrSM4f0FkhoHklAhet2FPUnE56/Ac7ATsAijkYKDWRYPhtHLrbJjDveTvHop3CVY09bPDxSILijQRoZQfSPrdRSYWqRSb8fAb7+uxFNn+ITDg2wp4sFw==',key_name='tempest-keypair-804840075',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:21:41Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='47d2c796445c4dd3affc8594502f04be',ramdisk_id='',reservation_id='r-9ynmfskw',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-32534463',owner_user_name='tempest-AttachInterfacesTestJSON-32534463-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:21:41Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c536ea061a32492a8c5e6bf941d1c9f3',uuid=370a976e-b51f-4187-bf76-bf6cbcae956b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5008167c-4cca-4768-963e-ab0119180625", "address": "fa:16:3e:3c:a2:3f", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5008167c-4c", "ovs_interfaceid": "5008167c-4cca-4768-963e-ab0119180625", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.736 2 DEBUG nova.network.os_vif_util [None req-1fb2abe7-98dd-40bf-b025-d396c5d4b7f7 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Converting VIF {"id": "5008167c-4cca-4768-963e-ab0119180625", "address": "fa:16:3e:3c:a2:3f", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5008167c-4c", "ovs_interfaceid": "5008167c-4cca-4768-963e-ab0119180625", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.737 2 DEBUG nova.network.os_vif_util [None req-1fb2abe7-98dd-40bf-b025-d396c5d4b7f7 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3c:a2:3f,bridge_name='br-int',has_traffic_filtering=True,id=5008167c-4cca-4768-963e-ab0119180625,network=Network(29d3fdc6-d8e1-4032-8f0c-e91da2912153),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5008167c-4c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.737 2 DEBUG os_vif [None req-1fb2abe7-98dd-40bf-b025-d396c5d4b7f7 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3c:a2:3f,bridge_name='br-int',has_traffic_filtering=True,id=5008167c-4cca-4768-963e-ab0119180625,network=Network(29d3fdc6-d8e1-4032-8f0c-e91da2912153),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5008167c-4c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.739 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.739 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5008167c-4c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.743 2 DEBUG nova.objects.instance [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lazy-loading 'ec2_ids' on Instance uuid c79f1121-fcfb-4c07-94cf-1389e1df9e81 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.748 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.750 2 INFO os_vif [None req-1fb2abe7-98dd-40bf-b025-d396c5d4b7f7 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3c:a2:3f,bridge_name='br-int',has_traffic_filtering=True,id=5008167c-4cca-4768-963e-ab0119180625,network=Network(29d3fdc6-d8e1-4032-8f0c-e91da2912153),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5008167c-4c')
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.751 2 DEBUG nova.virt.libvirt.guest [None req-1fb2abe7-98dd-40bf-b025-d396c5d4b7f7 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <nova:name>tempest-tempest.common.compute-instance-1609111197</nova:name>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <nova:creationTime>2025-09-30 21:22:13</nova:creationTime>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <nova:flavor name="m1.nano">
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <nova:memory>128</nova:memory>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <nova:disk>1</nova:disk>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <nova:swap>0</nova:swap>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <nova:vcpus>1</nova:vcpus>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   </nova:flavor>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <nova:owner>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <nova:user uuid="c536ea061a32492a8c5e6bf941d1c9f3">tempest-AttachInterfacesTestJSON-32534463-project-member</nova:user>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <nova:project uuid="47d2c796445c4dd3affc8594502f04be">tempest-AttachInterfacesTestJSON-32534463</nova:project>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   </nova:owner>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   <nova:ports>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     <nova:port uuid="ce9edf2f-fd59-406b-b9c3-5b314d995fe9">
Sep 30 21:22:13 compute-0 nova_compute[192810]:       <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Sep 30 21:22:13 compute-0 nova_compute[192810]:     </nova:port>
Sep 30 21:22:13 compute-0 nova_compute[192810]:   </nova:ports>
Sep 30 21:22:13 compute-0 nova_compute[192810]: </nova:instance>
Sep 30 21:22:13 compute-0 nova_compute[192810]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Sep 30 21:22:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:13.751 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[08eab975-9e25-44a2-b724-65e84548c1c8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:13.780 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[d53ecc31-1814-49ea-a2d6-6d0edaae9a42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:13.783 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[6f373ef5-078c-418e-bc4e-021aa334a3dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.790 2 DEBUG nova.objects.instance [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lazy-loading 'keypairs' on Instance uuid c79f1121-fcfb-4c07-94cf-1389e1df9e81 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:22:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:13.806 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[65efb14c-2436-4ae1-9d4c-c9b35dd380b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:13.824 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[b8dcf4bb-dcd1-469f-8184-88229f33351f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap29d3fdc6-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:10:f7:fe'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 916, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 916, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 399635, 'reachable_time': 15623, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225240, 'error': None, 'target': 'ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:13.845 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[b42a6a64-c890-471f-9381-49918f638eee]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap29d3fdc6-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 399648, 'tstamp': 399648}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225241, 'error': None, 'target': 'ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap29d3fdc6-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 399651, 'tstamp': 399651}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225241, 'error': None, 'target': 'ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:13.847 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap29d3fdc6-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.848 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:13 compute-0 nova_compute[192810]: 2025-09-30 21:22:13.853 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:13.853 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap29d3fdc6-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:22:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:13.853 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:22:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:13.854 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap29d3fdc6-d0, col_values=(('external_ids', {'iface-id': 'ad63e4cf-251e-40e7-aea0-9713eaa58a32'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:22:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:13.854 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:22:14 compute-0 nova_compute[192810]: 2025-09-30 21:22:14.324 2 INFO nova.virt.libvirt.driver [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Creating config drive at /var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/disk.config
Sep 30 21:22:14 compute-0 nova_compute[192810]: 2025-09-30 21:22:14.329 2 DEBUG oslo_concurrency.processutils [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpymrj8gyg execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:22:14 compute-0 nova_compute[192810]: 2025-09-30 21:22:14.350 2 DEBUG nova.network.neutron [req-549d3cda-85b6-4c3e-9538-942f32b63b47 req-1661594e-8b03-4895-b8d9-f6ec0ff12ed5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Updated VIF entry in instance network info cache for port 5008167c-4cca-4768-963e-ab0119180625. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:22:14 compute-0 nova_compute[192810]: 2025-09-30 21:22:14.351 2 DEBUG nova.network.neutron [req-549d3cda-85b6-4c3e-9538-942f32b63b47 req-1661594e-8b03-4895-b8d9-f6ec0ff12ed5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Updating instance_info_cache with network_info: [{"id": "ce9edf2f-fd59-406b-b9c3-5b314d995fe9", "address": "fa:16:3e:99:d9:88", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.223", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce9edf2f-fd", "ovs_interfaceid": "ce9edf2f-fd59-406b-b9c3-5b314d995fe9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "5008167c-4cca-4768-963e-ab0119180625", "address": "fa:16:3e:3c:a2:3f", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5008167c-4c", "ovs_interfaceid": "5008167c-4cca-4768-963e-ab0119180625", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:22:14 compute-0 nova_compute[192810]: 2025-09-30 21:22:14.367 2 DEBUG oslo_concurrency.lockutils [req-549d3cda-85b6-4c3e-9538-942f32b63b47 req-1661594e-8b03-4895-b8d9-f6ec0ff12ed5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-370a976e-b51f-4187-bf76-bf6cbcae956b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:22:14 compute-0 nova_compute[192810]: 2025-09-30 21:22:14.457 2 DEBUG oslo_concurrency.processutils [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpymrj8gyg" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:22:14 compute-0 kernel: tap0ad4c684-45: entered promiscuous mode
Sep 30 21:22:14 compute-0 NetworkManager[51733]: <info>  [1759267334.5294] manager: (tap0ad4c684-45): new Tun device (/org/freedesktop/NetworkManager/Devices/71)
Sep 30 21:22:14 compute-0 nova_compute[192810]: 2025-09-30 21:22:14.530 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:14 compute-0 ovn_controller[94912]: 2025-09-30T21:22:14Z|00151|binding|INFO|Claiming lport 0ad4c684-4514-4550-b708-6339f933766c for this chassis.
Sep 30 21:22:14 compute-0 ovn_controller[94912]: 2025-09-30T21:22:14Z|00152|binding|INFO|0ad4c684-4514-4550-b708-6339f933766c: Claiming fa:16:3e:c1:12:42 10.100.0.14
Sep 30 21:22:14 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:14.539 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:12:42 10.100.0.14'], port_security=['fa:16:3e:c1:12:42 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'c79f1121-fcfb-4c07-94cf-1389e1df9e81', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7ae61a95-12e5-48ee-93a0-85e12f8652eb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '54fb4b43d65542abbb73044c6d52da8a', 'neutron:revision_number': '7', 'neutron:security_group_ids': '0c0ea76a-15be-4f11-88b8-d75d482afe6f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c761993a-b179-4f1b-8dc7-b7ebb1a3e32b, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=0ad4c684-4514-4550-b708-6339f933766c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:22:14 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:14.541 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 0ad4c684-4514-4550-b708-6339f933766c in datapath 7ae61a95-12e5-48ee-93a0-85e12f8652eb bound to our chassis
Sep 30 21:22:14 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:14.544 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7ae61a95-12e5-48ee-93a0-85e12f8652eb
Sep 30 21:22:14 compute-0 NetworkManager[51733]: <info>  [1759267334.5490] device (tap0ad4c684-45): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:22:14 compute-0 NetworkManager[51733]: <info>  [1759267334.5507] device (tap0ad4c684-45): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:22:14 compute-0 ovn_controller[94912]: 2025-09-30T21:22:14Z|00153|binding|INFO|Setting lport 0ad4c684-4514-4550-b708-6339f933766c up in Southbound
Sep 30 21:22:14 compute-0 ovn_controller[94912]: 2025-09-30T21:22:14Z|00154|binding|INFO|Setting lport 0ad4c684-4514-4550-b708-6339f933766c ovn-installed in OVS
Sep 30 21:22:14 compute-0 nova_compute[192810]: 2025-09-30 21:22:14.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:14 compute-0 nova_compute[192810]: 2025-09-30 21:22:14.567 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:14 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:14.569 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[fce88413-a812-4519-bfed-00fedbc83bb0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:14 compute-0 systemd-machined[152794]: New machine qemu-16-instance-0000001c.
Sep 30 21:22:14 compute-0 nova_compute[192810]: 2025-09-30 21:22:14.589 2 DEBUG nova.compute.manager [req-f40cb409-bd8c-4b8c-9398-0ac3ad352928 req-3b51475c-1c75-48d7-82c9-8b6e3634c139 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Received event network-vif-plugged-0ad4c684-4514-4550-b708-6339f933766c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:22:14 compute-0 nova_compute[192810]: 2025-09-30 21:22:14.589 2 DEBUG oslo_concurrency.lockutils [req-f40cb409-bd8c-4b8c-9398-0ac3ad352928 req-3b51475c-1c75-48d7-82c9-8b6e3634c139 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:14 compute-0 nova_compute[192810]: 2025-09-30 21:22:14.590 2 DEBUG oslo_concurrency.lockutils [req-f40cb409-bd8c-4b8c-9398-0ac3ad352928 req-3b51475c-1c75-48d7-82c9-8b6e3634c139 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:14 compute-0 nova_compute[192810]: 2025-09-30 21:22:14.590 2 DEBUG oslo_concurrency.lockutils [req-f40cb409-bd8c-4b8c-9398-0ac3ad352928 req-3b51475c-1c75-48d7-82c9-8b6e3634c139 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:14 compute-0 nova_compute[192810]: 2025-09-30 21:22:14.590 2 DEBUG nova.compute.manager [req-f40cb409-bd8c-4b8c-9398-0ac3ad352928 req-3b51475c-1c75-48d7-82c9-8b6e3634c139 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] No waiting events found dispatching network-vif-plugged-0ad4c684-4514-4550-b708-6339f933766c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:22:14 compute-0 nova_compute[192810]: 2025-09-30 21:22:14.590 2 WARNING nova.compute.manager [req-f40cb409-bd8c-4b8c-9398-0ac3ad352928 req-3b51475c-1c75-48d7-82c9-8b6e3634c139 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Received unexpected event network-vif-plugged-0ad4c684-4514-4550-b708-6339f933766c for instance with vm_state active and task_state rebuild_spawning.
Sep 30 21:22:14 compute-0 systemd[1]: Started Virtual Machine qemu-16-instance-0000001c.
Sep 30 21:22:14 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:14.619 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[ed8702a0-1727-40d4-94a5-46e7ea19e77a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:14 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:14.626 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[8bf70518-982a-4756-a30f-477f36fd9b9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:14 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:14.670 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[66c94e67-dc72-4e56-9674-4349afa19372]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:14 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:14.702 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a1c7b56a-0b85-4d24-bea4-337a329e3de7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7ae61a95-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:55:1a:c2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 13, 'rx_bytes': 1000, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 13, 'rx_bytes': 1000, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 396566, 'reachable_time': 17378, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225272, 'error': None, 'target': 'ovnmeta-7ae61a95-12e5-48ee-93a0-85e12f8652eb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:14 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:14.733 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[eeedd3a6-66a9-4281-aa70-f4f8c64088ab]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap7ae61a95-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 396582, 'tstamp': 396582}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225273, 'error': None, 'target': 'ovnmeta-7ae61a95-12e5-48ee-93a0-85e12f8652eb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap7ae61a95-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 396587, 'tstamp': 396587}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225273, 'error': None, 'target': 'ovnmeta-7ae61a95-12e5-48ee-93a0-85e12f8652eb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:14 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:14.735 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7ae61a95-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:22:14 compute-0 nova_compute[192810]: 2025-09-30 21:22:14.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:14 compute-0 nova_compute[192810]: 2025-09-30 21:22:14.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:14 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:14.739 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7ae61a95-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:22:14 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:14.740 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:22:14 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:14.740 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7ae61a95-10, col_values=(('external_ids', {'iface-id': '49c3a39c-1855-4432-8459-a8f7bc7429b4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:22:14 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:14.741 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:22:15 compute-0 nova_compute[192810]: 2025-09-30 21:22:15.128 2 DEBUG nova.compute.manager [req-96205a04-e27c-45de-ab84-c487f4071d62 req-7cb398fa-df9a-4742-ad15-27057f3e312c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Received event network-vif-plugged-5008167c-4cca-4768-963e-ab0119180625 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:22:15 compute-0 nova_compute[192810]: 2025-09-30 21:22:15.128 2 DEBUG oslo_concurrency.lockutils [req-96205a04-e27c-45de-ab84-c487f4071d62 req-7cb398fa-df9a-4742-ad15-27057f3e312c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "370a976e-b51f-4187-bf76-bf6cbcae956b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:15 compute-0 nova_compute[192810]: 2025-09-30 21:22:15.129 2 DEBUG oslo_concurrency.lockutils [req-96205a04-e27c-45de-ab84-c487f4071d62 req-7cb398fa-df9a-4742-ad15-27057f3e312c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "370a976e-b51f-4187-bf76-bf6cbcae956b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:15 compute-0 nova_compute[192810]: 2025-09-30 21:22:15.129 2 DEBUG oslo_concurrency.lockutils [req-96205a04-e27c-45de-ab84-c487f4071d62 req-7cb398fa-df9a-4742-ad15-27057f3e312c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "370a976e-b51f-4187-bf76-bf6cbcae956b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:15 compute-0 nova_compute[192810]: 2025-09-30 21:22:15.129 2 DEBUG nova.compute.manager [req-96205a04-e27c-45de-ab84-c487f4071d62 req-7cb398fa-df9a-4742-ad15-27057f3e312c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] No waiting events found dispatching network-vif-plugged-5008167c-4cca-4768-963e-ab0119180625 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:22:15 compute-0 nova_compute[192810]: 2025-09-30 21:22:15.129 2 WARNING nova.compute.manager [req-96205a04-e27c-45de-ab84-c487f4071d62 req-7cb398fa-df9a-4742-ad15-27057f3e312c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Received unexpected event network-vif-plugged-5008167c-4cca-4768-963e-ab0119180625 for instance with vm_state active and task_state None.
Sep 30 21:22:15 compute-0 nova_compute[192810]: 2025-09-30 21:22:15.129 2 DEBUG nova.compute.manager [req-96205a04-e27c-45de-ab84-c487f4071d62 req-7cb398fa-df9a-4742-ad15-27057f3e312c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Received event network-vif-unplugged-5008167c-4cca-4768-963e-ab0119180625 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:22:15 compute-0 nova_compute[192810]: 2025-09-30 21:22:15.129 2 DEBUG oslo_concurrency.lockutils [req-96205a04-e27c-45de-ab84-c487f4071d62 req-7cb398fa-df9a-4742-ad15-27057f3e312c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "370a976e-b51f-4187-bf76-bf6cbcae956b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:15 compute-0 nova_compute[192810]: 2025-09-30 21:22:15.129 2 DEBUG oslo_concurrency.lockutils [req-96205a04-e27c-45de-ab84-c487f4071d62 req-7cb398fa-df9a-4742-ad15-27057f3e312c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "370a976e-b51f-4187-bf76-bf6cbcae956b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:15 compute-0 nova_compute[192810]: 2025-09-30 21:22:15.130 2 DEBUG oslo_concurrency.lockutils [req-96205a04-e27c-45de-ab84-c487f4071d62 req-7cb398fa-df9a-4742-ad15-27057f3e312c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "370a976e-b51f-4187-bf76-bf6cbcae956b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:15 compute-0 nova_compute[192810]: 2025-09-30 21:22:15.130 2 DEBUG nova.compute.manager [req-96205a04-e27c-45de-ab84-c487f4071d62 req-7cb398fa-df9a-4742-ad15-27057f3e312c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] No waiting events found dispatching network-vif-unplugged-5008167c-4cca-4768-963e-ab0119180625 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:22:15 compute-0 nova_compute[192810]: 2025-09-30 21:22:15.130 2 WARNING nova.compute.manager [req-96205a04-e27c-45de-ab84-c487f4071d62 req-7cb398fa-df9a-4742-ad15-27057f3e312c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Received unexpected event network-vif-unplugged-5008167c-4cca-4768-963e-ab0119180625 for instance with vm_state active and task_state None.
Sep 30 21:22:15 compute-0 nova_compute[192810]: 2025-09-30 21:22:15.130 2 DEBUG nova.compute.manager [req-96205a04-e27c-45de-ab84-c487f4071d62 req-7cb398fa-df9a-4742-ad15-27057f3e312c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Received event network-vif-plugged-5008167c-4cca-4768-963e-ab0119180625 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:22:15 compute-0 nova_compute[192810]: 2025-09-30 21:22:15.130 2 DEBUG oslo_concurrency.lockutils [req-96205a04-e27c-45de-ab84-c487f4071d62 req-7cb398fa-df9a-4742-ad15-27057f3e312c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "370a976e-b51f-4187-bf76-bf6cbcae956b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:15 compute-0 nova_compute[192810]: 2025-09-30 21:22:15.130 2 DEBUG oslo_concurrency.lockutils [req-96205a04-e27c-45de-ab84-c487f4071d62 req-7cb398fa-df9a-4742-ad15-27057f3e312c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "370a976e-b51f-4187-bf76-bf6cbcae956b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:15 compute-0 nova_compute[192810]: 2025-09-30 21:22:15.130 2 DEBUG oslo_concurrency.lockutils [req-96205a04-e27c-45de-ab84-c487f4071d62 req-7cb398fa-df9a-4742-ad15-27057f3e312c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "370a976e-b51f-4187-bf76-bf6cbcae956b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:15 compute-0 nova_compute[192810]: 2025-09-30 21:22:15.131 2 DEBUG nova.compute.manager [req-96205a04-e27c-45de-ab84-c487f4071d62 req-7cb398fa-df9a-4742-ad15-27057f3e312c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] No waiting events found dispatching network-vif-plugged-5008167c-4cca-4768-963e-ab0119180625 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:22:15 compute-0 nova_compute[192810]: 2025-09-30 21:22:15.131 2 WARNING nova.compute.manager [req-96205a04-e27c-45de-ab84-c487f4071d62 req-7cb398fa-df9a-4742-ad15-27057f3e312c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Received unexpected event network-vif-plugged-5008167c-4cca-4768-963e-ab0119180625 for instance with vm_state active and task_state None.
Sep 30 21:22:15 compute-0 nova_compute[192810]: 2025-09-30 21:22:15.328 2 DEBUG oslo_concurrency.lockutils [None req-1fb2abe7-98dd-40bf-b025-d396c5d4b7f7 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Acquiring lock "refresh_cache-370a976e-b51f-4187-bf76-bf6cbcae956b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:22:15 compute-0 nova_compute[192810]: 2025-09-30 21:22:15.328 2 DEBUG oslo_concurrency.lockutils [None req-1fb2abe7-98dd-40bf-b025-d396c5d4b7f7 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Acquired lock "refresh_cache-370a976e-b51f-4187-bf76-bf6cbcae956b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:22:15 compute-0 nova_compute[192810]: 2025-09-30 21:22:15.329 2 DEBUG nova.network.neutron [None req-1fb2abe7-98dd-40bf-b025-d396c5d4b7f7 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:22:15 compute-0 nova_compute[192810]: 2025-09-30 21:22:15.619 2 DEBUG nova.virt.libvirt.host [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Removed pending event for c79f1121-fcfb-4c07-94cf-1389e1df9e81 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Sep 30 21:22:15 compute-0 nova_compute[192810]: 2025-09-30 21:22:15.619 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267335.6184638, c79f1121-fcfb-4c07-94cf-1389e1df9e81 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:22:15 compute-0 nova_compute[192810]: 2025-09-30 21:22:15.619 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] VM Resumed (Lifecycle Event)
Sep 30 21:22:15 compute-0 nova_compute[192810]: 2025-09-30 21:22:15.621 2 DEBUG nova.compute.manager [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:22:15 compute-0 nova_compute[192810]: 2025-09-30 21:22:15.622 2 DEBUG nova.virt.libvirt.driver [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:22:15 compute-0 nova_compute[192810]: 2025-09-30 21:22:15.625 2 INFO nova.virt.libvirt.driver [-] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Instance spawned successfully.
Sep 30 21:22:15 compute-0 nova_compute[192810]: 2025-09-30 21:22:15.625 2 DEBUG nova.virt.libvirt.driver [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:22:15 compute-0 nova_compute[192810]: 2025-09-30 21:22:15.651 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:22:15 compute-0 nova_compute[192810]: 2025-09-30 21:22:15.656 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:22:15 compute-0 nova_compute[192810]: 2025-09-30 21:22:15.660 2 DEBUG nova.virt.libvirt.driver [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:22:15 compute-0 nova_compute[192810]: 2025-09-30 21:22:15.660 2 DEBUG nova.virt.libvirt.driver [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:22:15 compute-0 nova_compute[192810]: 2025-09-30 21:22:15.661 2 DEBUG nova.virt.libvirt.driver [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:22:15 compute-0 nova_compute[192810]: 2025-09-30 21:22:15.661 2 DEBUG nova.virt.libvirt.driver [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:22:15 compute-0 nova_compute[192810]: 2025-09-30 21:22:15.662 2 DEBUG nova.virt.libvirt.driver [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:22:15 compute-0 nova_compute[192810]: 2025-09-30 21:22:15.662 2 DEBUG nova.virt.libvirt.driver [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:22:15 compute-0 nova_compute[192810]: 2025-09-30 21:22:15.693 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Sep 30 21:22:15 compute-0 nova_compute[192810]: 2025-09-30 21:22:15.694 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267335.618619, c79f1121-fcfb-4c07-94cf-1389e1df9e81 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:22:15 compute-0 nova_compute[192810]: 2025-09-30 21:22:15.694 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] VM Started (Lifecycle Event)
Sep 30 21:22:15 compute-0 nova_compute[192810]: 2025-09-30 21:22:15.726 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:22:15 compute-0 nova_compute[192810]: 2025-09-30 21:22:15.729 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:22:15 compute-0 nova_compute[192810]: 2025-09-30 21:22:15.751 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Sep 30 21:22:15 compute-0 nova_compute[192810]: 2025-09-30 21:22:15.754 2 DEBUG nova.compute.manager [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:22:15 compute-0 nova_compute[192810]: 2025-09-30 21:22:15.837 2 DEBUG oslo_concurrency.lockutils [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:15 compute-0 nova_compute[192810]: 2025-09-30 21:22:15.838 2 DEBUG oslo_concurrency.lockutils [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:15 compute-0 nova_compute[192810]: 2025-09-30 21:22:15.838 2 DEBUG nova.objects.instance [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Sep 30 21:22:15 compute-0 nova_compute[192810]: 2025-09-30 21:22:15.948 2 DEBUG oslo_concurrency.lockutils [None req-d5ca9e31-c268-44be-8c88-b5de77e7c9cc fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.210 2 DEBUG oslo_concurrency.lockutils [None req-25cfd4c6-b8bb-4eec-91f5-869ee03d3580 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Acquiring lock "370a976e-b51f-4187-bf76-bf6cbcae956b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.211 2 DEBUG oslo_concurrency.lockutils [None req-25cfd4c6-b8bb-4eec-91f5-869ee03d3580 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lock "370a976e-b51f-4187-bf76-bf6cbcae956b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.211 2 DEBUG oslo_concurrency.lockutils [None req-25cfd4c6-b8bb-4eec-91f5-869ee03d3580 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Acquiring lock "370a976e-b51f-4187-bf76-bf6cbcae956b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.212 2 DEBUG oslo_concurrency.lockutils [None req-25cfd4c6-b8bb-4eec-91f5-869ee03d3580 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lock "370a976e-b51f-4187-bf76-bf6cbcae956b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.212 2 DEBUG oslo_concurrency.lockutils [None req-25cfd4c6-b8bb-4eec-91f5-869ee03d3580 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lock "370a976e-b51f-4187-bf76-bf6cbcae956b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.226 2 INFO nova.compute.manager [None req-25cfd4c6-b8bb-4eec-91f5-869ee03d3580 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Terminating instance
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.245 2 DEBUG nova.compute.manager [None req-25cfd4c6-b8bb-4eec-91f5-869ee03d3580 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:22:16 compute-0 kernel: tapce9edf2f-fd (unregistering): left promiscuous mode
Sep 30 21:22:16 compute-0 NetworkManager[51733]: <info>  [1759267336.2648] device (tapce9edf2f-fd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:22:16 compute-0 ovn_controller[94912]: 2025-09-30T21:22:16Z|00155|binding|INFO|Releasing lport ce9edf2f-fd59-406b-b9c3-5b314d995fe9 from this chassis (sb_readonly=0)
Sep 30 21:22:16 compute-0 ovn_controller[94912]: 2025-09-30T21:22:16Z|00156|binding|INFO|Setting lport ce9edf2f-fd59-406b-b9c3-5b314d995fe9 down in Southbound
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:16 compute-0 ovn_controller[94912]: 2025-09-30T21:22:16Z|00157|binding|INFO|Removing iface tapce9edf2f-fd ovn-installed in OVS
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.275 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:16 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:16.281 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:99:d9:88 10.100.0.4'], port_security=['fa:16:3e:99:d9:88 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '370a976e-b51f-4187-bf76-bf6cbcae956b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-29d3fdc6-d8e1-4032-8f0c-e91da2912153', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '47d2c796445c4dd3affc8594502f04be', 'neutron:revision_number': '4', 'neutron:security_group_ids': '69ef8d59-f66f-4bdf-9235-591ecebd585e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.223'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4f6d308b-b549-4733-b51f-a3dd42be0f08, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=ce9edf2f-fd59-406b-b9c3-5b314d995fe9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:22:16 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:16.283 103867 INFO neutron.agent.ovn.metadata.agent [-] Port ce9edf2f-fd59-406b-b9c3-5b314d995fe9 in datapath 29d3fdc6-d8e1-4032-8f0c-e91da2912153 unbound from our chassis
Sep 30 21:22:16 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:16.286 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 29d3fdc6-d8e1-4032-8f0c-e91da2912153, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:16 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:16.287 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[72547e7d-6c6b-4a56-bfef-4e676ba877bf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:16 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:16.294 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153 namespace which is not needed anymore
Sep 30 21:22:16 compute-0 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000022.scope: Deactivated successfully.
Sep 30 21:22:16 compute-0 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000022.scope: Consumed 13.155s CPU time.
Sep 30 21:22:16 compute-0 systemd-machined[152794]: Machine qemu-14-instance-00000022 terminated.
Sep 30 21:22:16 compute-0 neutron-haproxy-ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153[224829]: [NOTICE]   (224836) : haproxy version is 2.8.14-c23fe91
Sep 30 21:22:16 compute-0 neutron-haproxy-ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153[224829]: [NOTICE]   (224836) : path to executable is /usr/sbin/haproxy
Sep 30 21:22:16 compute-0 neutron-haproxy-ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153[224829]: [WARNING]  (224836) : Exiting Master process...
Sep 30 21:22:16 compute-0 neutron-haproxy-ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153[224829]: [WARNING]  (224836) : Exiting Master process...
Sep 30 21:22:16 compute-0 neutron-haproxy-ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153[224829]: [ALERT]    (224836) : Current worker (224838) exited with code 143 (Terminated)
Sep 30 21:22:16 compute-0 neutron-haproxy-ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153[224829]: [WARNING]  (224836) : All workers exited. Exiting... (0)
Sep 30 21:22:16 compute-0 systemd[1]: libpod-69cf564ca81ae2577c9150bda73f69d611fcd7ae2485e7be7e34c922f28ac172.scope: Deactivated successfully.
Sep 30 21:22:16 compute-0 conmon[224829]: conmon 69cf564ca81ae2577c91 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-69cf564ca81ae2577c9150bda73f69d611fcd7ae2485e7be7e34c922f28ac172.scope/container/memory.events
Sep 30 21:22:16 compute-0 podman[225305]: 2025-09-30 21:22:16.423785368 +0000 UTC m=+0.043778401 container died 69cf564ca81ae2577c9150bda73f69d611fcd7ae2485e7be7e34c922f28ac172 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 21:22:16 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-69cf564ca81ae2577c9150bda73f69d611fcd7ae2485e7be7e34c922f28ac172-userdata-shm.mount: Deactivated successfully.
Sep 30 21:22:16 compute-0 systemd[1]: var-lib-containers-storage-overlay-c5b1b59b26a28491a52d35aee35638e3df8b2a50f12eed8a0e32f3b8650d6797-merged.mount: Deactivated successfully.
Sep 30 21:22:16 compute-0 podman[225305]: 2025-09-30 21:22:16.464729747 +0000 UTC m=+0.084722760 container cleanup 69cf564ca81ae2577c9150bda73f69d611fcd7ae2485e7be7e34c922f28ac172 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Sep 30 21:22:16 compute-0 NetworkManager[51733]: <info>  [1759267336.4664] manager: (tapce9edf2f-fd): new Tun device (/org/freedesktop/NetworkManager/Devices/72)
Sep 30 21:22:16 compute-0 systemd[1]: libpod-conmon-69cf564ca81ae2577c9150bda73f69d611fcd7ae2485e7be7e34c922f28ac172.scope: Deactivated successfully.
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.515 2 INFO nova.virt.libvirt.driver [-] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Instance destroyed successfully.
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.516 2 DEBUG nova.objects.instance [None req-25cfd4c6-b8bb-4eec-91f5-869ee03d3580 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lazy-loading 'resources' on Instance uuid 370a976e-b51f-4187-bf76-bf6cbcae956b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.528 2 DEBUG nova.virt.libvirt.vif [None req-25cfd4c6-b8bb-4eec-91f5-869ee03d3580 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:21:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1609111197',display_name='tempest-tempest.common.compute-instance-1609111197',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1609111197',id=34,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAiqd8zlUv0daoHrSM4f0FkhoHklAhet2FPUnE56/Ac7ATsAijkYKDWRYPhtHLrbJjDveTvHop3CVY09bPDxSILijQRoZQfSPrdRSYWqRSb8fAb7+uxFNn+ITDg2wp4sFw==',key_name='tempest-keypair-804840075',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:21:41Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='47d2c796445c4dd3affc8594502f04be',ramdisk_id='',reservation_id='r-9ynmfskw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-32534463',owner_user_name='tempest-AttachInterfacesTestJSON-32534463-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:21:41Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c536ea061a32492a8c5e6bf941d1c9f3',uuid=370a976e-b51f-4187-bf76-bf6cbcae956b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ce9edf2f-fd59-406b-b9c3-5b314d995fe9", "address": "fa:16:3e:99:d9:88", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.223", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce9edf2f-fd", "ovs_interfaceid": "ce9edf2f-fd59-406b-b9c3-5b314d995fe9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.529 2 DEBUG nova.network.os_vif_util [None req-25cfd4c6-b8bb-4eec-91f5-869ee03d3580 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Converting VIF {"id": "ce9edf2f-fd59-406b-b9c3-5b314d995fe9", "address": "fa:16:3e:99:d9:88", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.223", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce9edf2f-fd", "ovs_interfaceid": "ce9edf2f-fd59-406b-b9c3-5b314d995fe9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.529 2 DEBUG nova.network.os_vif_util [None req-25cfd4c6-b8bb-4eec-91f5-869ee03d3580 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:99:d9:88,bridge_name='br-int',has_traffic_filtering=True,id=ce9edf2f-fd59-406b-b9c3-5b314d995fe9,network=Network(29d3fdc6-d8e1-4032-8f0c-e91da2912153),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapce9edf2f-fd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.530 2 DEBUG os_vif [None req-25cfd4c6-b8bb-4eec-91f5-869ee03d3580 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:99:d9:88,bridge_name='br-int',has_traffic_filtering=True,id=ce9edf2f-fd59-406b-b9c3-5b314d995fe9,network=Network(29d3fdc6-d8e1-4032-8f0c-e91da2912153),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapce9edf2f-fd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.531 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.531 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapce9edf2f-fd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.532 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.533 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.535 2 INFO os_vif [None req-25cfd4c6-b8bb-4eec-91f5-869ee03d3580 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:99:d9:88,bridge_name='br-int',has_traffic_filtering=True,id=ce9edf2f-fd59-406b-b9c3-5b314d995fe9,network=Network(29d3fdc6-d8e1-4032-8f0c-e91da2912153),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapce9edf2f-fd')
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.536 2 DEBUG nova.virt.libvirt.vif [None req-25cfd4c6-b8bb-4eec-91f5-869ee03d3580 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:21:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-1609111197',display_name='tempest-tempest.common.compute-instance-1609111197',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-1609111197',id=34,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAiqd8zlUv0daoHrSM4f0FkhoHklAhet2FPUnE56/Ac7ATsAijkYKDWRYPhtHLrbJjDveTvHop3CVY09bPDxSILijQRoZQfSPrdRSYWqRSb8fAb7+uxFNn+ITDg2wp4sFw==',key_name='tempest-keypair-804840075',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:21:41Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='47d2c796445c4dd3affc8594502f04be',ramdisk_id='',reservation_id='r-9ynmfskw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-32534463',owner_user_name='tempest-AttachInterfacesTestJSON-32534463-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:21:41Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c536ea061a32492a8c5e6bf941d1c9f3',uuid=370a976e-b51f-4187-bf76-bf6cbcae956b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5008167c-4cca-4768-963e-ab0119180625", "address": "fa:16:3e:3c:a2:3f", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5008167c-4c", "ovs_interfaceid": "5008167c-4cca-4768-963e-ab0119180625", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.536 2 DEBUG nova.network.os_vif_util [None req-25cfd4c6-b8bb-4eec-91f5-869ee03d3580 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Converting VIF {"id": "5008167c-4cca-4768-963e-ab0119180625", "address": "fa:16:3e:3c:a2:3f", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5008167c-4c", "ovs_interfaceid": "5008167c-4cca-4768-963e-ab0119180625", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.537 2 DEBUG nova.network.os_vif_util [None req-25cfd4c6-b8bb-4eec-91f5-869ee03d3580 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3c:a2:3f,bridge_name='br-int',has_traffic_filtering=True,id=5008167c-4cca-4768-963e-ab0119180625,network=Network(29d3fdc6-d8e1-4032-8f0c-e91da2912153),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5008167c-4c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.537 2 DEBUG os_vif [None req-25cfd4c6-b8bb-4eec-91f5-869ee03d3580 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3c:a2:3f,bridge_name='br-int',has_traffic_filtering=True,id=5008167c-4cca-4768-963e-ab0119180625,network=Network(29d3fdc6-d8e1-4032-8f0c-e91da2912153),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5008167c-4c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:22:16 compute-0 podman[225338]: 2025-09-30 21:22:16.537995792 +0000 UTC m=+0.050876808 container remove 69cf564ca81ae2577c9150bda73f69d611fcd7ae2485e7be7e34c922f28ac172 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.538 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.539 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5008167c-4c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.539 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.540 2 INFO os_vif [None req-25cfd4c6-b8bb-4eec-91f5-869ee03d3580 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3c:a2:3f,bridge_name='br-int',has_traffic_filtering=True,id=5008167c-4cca-4768-963e-ab0119180625,network=Network(29d3fdc6-d8e1-4032-8f0c-e91da2912153),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap5008167c-4c')
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.541 2 INFO nova.virt.libvirt.driver [None req-25cfd4c6-b8bb-4eec-91f5-869ee03d3580 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Deleting instance files /var/lib/nova/instances/370a976e-b51f-4187-bf76-bf6cbcae956b_del
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.541 2 INFO nova.virt.libvirt.driver [None req-25cfd4c6-b8bb-4eec-91f5-869ee03d3580 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Deletion of /var/lib/nova/instances/370a976e-b51f-4187-bf76-bf6cbcae956b_del complete
Sep 30 21:22:16 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:16.542 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[bbb3788b-40ac-4f27-a294-79880dc65038]: (4, ('Tue Sep 30 09:22:16 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153 (69cf564ca81ae2577c9150bda73f69d611fcd7ae2485e7be7e34c922f28ac172)\n69cf564ca81ae2577c9150bda73f69d611fcd7ae2485e7be7e34c922f28ac172\nTue Sep 30 09:22:16 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153 (69cf564ca81ae2577c9150bda73f69d611fcd7ae2485e7be7e34c922f28ac172)\n69cf564ca81ae2577c9150bda73f69d611fcd7ae2485e7be7e34c922f28ac172\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:16 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:16.544 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[24fbd7a9-a7eb-4f99-800b-99fee905b1c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:16 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:16.545 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap29d3fdc6-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:16 compute-0 kernel: tap29d3fdc6-d0: left promiscuous mode
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:16 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:16.560 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[f74ec2f9-44ec-47a2-93c1-d1f822aadde5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:16 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:16.586 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[51637d43-156c-4de3-9ea9-9e26d0b502c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:16 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:16.587 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[177ae7b2-8042-45ed-af8a-970d71cde4ec]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:16 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:16.602 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a66321ed-0599-4284-a452-63081ac6fb48]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 399626, 'reachable_time': 43034, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225365, 'error': None, 'target': 'ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:16 compute-0 systemd[1]: run-netns-ovnmeta\x2d29d3fdc6\x2dd8e1\x2d4032\x2d8f0c\x2de91da2912153.mount: Deactivated successfully.
Sep 30 21:22:16 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:16.608 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-29d3fdc6-d8e1-4032-8f0c-e91da2912153 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:22:16 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:16.609 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[dacdb57d-095a-4383-ba4c-8187c214f956]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.622 2 INFO nova.compute.manager [None req-25cfd4c6-b8bb-4eec-91f5-869ee03d3580 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Took 0.38 seconds to destroy the instance on the hypervisor.
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.622 2 DEBUG oslo.service.loopingcall [None req-25cfd4c6-b8bb-4eec-91f5-869ee03d3580 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.622 2 DEBUG nova.compute.manager [-] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.623 2 DEBUG nova.network.neutron [-] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.718 2 DEBUG nova.compute.manager [req-a209f4a9-554f-4f08-873e-a6fcfb821b5d req-7b4dd65a-828e-4da5-be23-707e7975b3c7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Received event network-vif-plugged-0ad4c684-4514-4550-b708-6339f933766c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.718 2 DEBUG oslo_concurrency.lockutils [req-a209f4a9-554f-4f08-873e-a6fcfb821b5d req-7b4dd65a-828e-4da5-be23-707e7975b3c7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.719 2 DEBUG oslo_concurrency.lockutils [req-a209f4a9-554f-4f08-873e-a6fcfb821b5d req-7b4dd65a-828e-4da5-be23-707e7975b3c7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.719 2 DEBUG oslo_concurrency.lockutils [req-a209f4a9-554f-4f08-873e-a6fcfb821b5d req-7b4dd65a-828e-4da5-be23-707e7975b3c7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.720 2 DEBUG nova.compute.manager [req-a209f4a9-554f-4f08-873e-a6fcfb821b5d req-7b4dd65a-828e-4da5-be23-707e7975b3c7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] No waiting events found dispatching network-vif-plugged-0ad4c684-4514-4550-b708-6339f933766c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.720 2 WARNING nova.compute.manager [req-a209f4a9-554f-4f08-873e-a6fcfb821b5d req-7b4dd65a-828e-4da5-be23-707e7975b3c7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Received unexpected event network-vif-plugged-0ad4c684-4514-4550-b708-6339f933766c for instance with vm_state error and task_state None.
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.720 2 DEBUG nova.compute.manager [req-a209f4a9-554f-4f08-873e-a6fcfb821b5d req-7b4dd65a-828e-4da5-be23-707e7975b3c7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Received event network-vif-plugged-0ad4c684-4514-4550-b708-6339f933766c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.721 2 DEBUG oslo_concurrency.lockutils [req-a209f4a9-554f-4f08-873e-a6fcfb821b5d req-7b4dd65a-828e-4da5-be23-707e7975b3c7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.721 2 DEBUG oslo_concurrency.lockutils [req-a209f4a9-554f-4f08-873e-a6fcfb821b5d req-7b4dd65a-828e-4da5-be23-707e7975b3c7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.721 2 DEBUG oslo_concurrency.lockutils [req-a209f4a9-554f-4f08-873e-a6fcfb821b5d req-7b4dd65a-828e-4da5-be23-707e7975b3c7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.728 2 DEBUG nova.compute.manager [req-a209f4a9-554f-4f08-873e-a6fcfb821b5d req-7b4dd65a-828e-4da5-be23-707e7975b3c7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] No waiting events found dispatching network-vif-plugged-0ad4c684-4514-4550-b708-6339f933766c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.729 2 WARNING nova.compute.manager [req-a209f4a9-554f-4f08-873e-a6fcfb821b5d req-7b4dd65a-828e-4da5-be23-707e7975b3c7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Received unexpected event network-vif-plugged-0ad4c684-4514-4550-b708-6339f933766c for instance with vm_state error and task_state None.
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.739 2 DEBUG nova.compute.manager [req-4b68dae4-af7b-47cc-b402-93a01f5d48c0 req-6aff5140-ca58-49fa-a98c-a20197623395 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Received event network-vif-unplugged-ce9edf2f-fd59-406b-b9c3-5b314d995fe9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.739 2 DEBUG oslo_concurrency.lockutils [req-4b68dae4-af7b-47cc-b402-93a01f5d48c0 req-6aff5140-ca58-49fa-a98c-a20197623395 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "370a976e-b51f-4187-bf76-bf6cbcae956b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.739 2 DEBUG oslo_concurrency.lockutils [req-4b68dae4-af7b-47cc-b402-93a01f5d48c0 req-6aff5140-ca58-49fa-a98c-a20197623395 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "370a976e-b51f-4187-bf76-bf6cbcae956b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.740 2 DEBUG oslo_concurrency.lockutils [req-4b68dae4-af7b-47cc-b402-93a01f5d48c0 req-6aff5140-ca58-49fa-a98c-a20197623395 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "370a976e-b51f-4187-bf76-bf6cbcae956b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.740 2 DEBUG nova.compute.manager [req-4b68dae4-af7b-47cc-b402-93a01f5d48c0 req-6aff5140-ca58-49fa-a98c-a20197623395 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] No waiting events found dispatching network-vif-unplugged-ce9edf2f-fd59-406b-b9c3-5b314d995fe9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:22:16 compute-0 nova_compute[192810]: 2025-09-30 21:22:16.740 2 DEBUG nova.compute.manager [req-4b68dae4-af7b-47cc-b402-93a01f5d48c0 req-6aff5140-ca58-49fa-a98c-a20197623395 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Received event network-vif-unplugged-ce9edf2f-fd59-406b-b9c3-5b314d995fe9 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:22:17 compute-0 nova_compute[192810]: 2025-09-30 21:22:17.046 2 INFO nova.network.neutron [None req-1fb2abe7-98dd-40bf-b025-d396c5d4b7f7 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Port 5008167c-4cca-4768-963e-ab0119180625 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Sep 30 21:22:17 compute-0 nova_compute[192810]: 2025-09-30 21:22:17.046 2 DEBUG nova.network.neutron [None req-1fb2abe7-98dd-40bf-b025-d396c5d4b7f7 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Updating instance_info_cache with network_info: [{"id": "ce9edf2f-fd59-406b-b9c3-5b314d995fe9", "address": "fa:16:3e:99:d9:88", "network": {"id": "29d3fdc6-d8e1-4032-8f0c-e91da2912153", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1871747400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.223", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "47d2c796445c4dd3affc8594502f04be", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapce9edf2f-fd", "ovs_interfaceid": "ce9edf2f-fd59-406b-b9c3-5b314d995fe9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:22:17 compute-0 nova_compute[192810]: 2025-09-30 21:22:17.075 2 DEBUG oslo_concurrency.lockutils [None req-1fb2abe7-98dd-40bf-b025-d396c5d4b7f7 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Releasing lock "refresh_cache-370a976e-b51f-4187-bf76-bf6cbcae956b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:22:17 compute-0 nova_compute[192810]: 2025-09-30 21:22:17.109 2 DEBUG oslo_concurrency.lockutils [None req-1fb2abe7-98dd-40bf-b025-d396c5d4b7f7 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lock "interface-370a976e-b51f-4187-bf76-bf6cbcae956b-5008167c-4cca-4768-963e-ab0119180625" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 3.566s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:18 compute-0 nova_compute[192810]: 2025-09-30 21:22:18.552 2 DEBUG nova.network.neutron [-] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:22:18 compute-0 nova_compute[192810]: 2025-09-30 21:22:18.594 2 INFO nova.compute.manager [-] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Took 1.97 seconds to deallocate network for instance.
Sep 30 21:22:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:18.649 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:22:18 compute-0 nova_compute[192810]: 2025-09-30 21:22:18.651 2 DEBUG nova.compute.manager [req-323492d9-dc76-41db-adee-ee07fb4bb698 req-56e01443-284c-4442-a71b-f60da8af75c3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Received event network-vif-deleted-ce9edf2f-fd59-406b-b9c3-5b314d995fe9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:22:18 compute-0 nova_compute[192810]: 2025-09-30 21:22:18.746 2 DEBUG oslo_concurrency.lockutils [None req-25cfd4c6-b8bb-4eec-91f5-869ee03d3580 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:18 compute-0 nova_compute[192810]: 2025-09-30 21:22:18.747 2 DEBUG oslo_concurrency.lockutils [None req-25cfd4c6-b8bb-4eec-91f5-869ee03d3580 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:18 compute-0 nova_compute[192810]: 2025-09-30 21:22:18.867 2 DEBUG nova.compute.provider_tree [None req-25cfd4c6-b8bb-4eec-91f5-869ee03d3580 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:22:18 compute-0 nova_compute[192810]: 2025-09-30 21:22:18.885 2 DEBUG nova.scheduler.client.report [None req-25cfd4c6-b8bb-4eec-91f5-869ee03d3580 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:22:18 compute-0 nova_compute[192810]: 2025-09-30 21:22:18.894 2 DEBUG nova.compute.manager [req-ae1047c2-50e3-4dd1-8526-e2b2f1c48cd0 req-b35f8eb1-c16a-4d17-9fb7-26bff1c862c1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Received event network-vif-plugged-ce9edf2f-fd59-406b-b9c3-5b314d995fe9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:22:18 compute-0 nova_compute[192810]: 2025-09-30 21:22:18.894 2 DEBUG oslo_concurrency.lockutils [req-ae1047c2-50e3-4dd1-8526-e2b2f1c48cd0 req-b35f8eb1-c16a-4d17-9fb7-26bff1c862c1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "370a976e-b51f-4187-bf76-bf6cbcae956b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:18 compute-0 nova_compute[192810]: 2025-09-30 21:22:18.894 2 DEBUG oslo_concurrency.lockutils [req-ae1047c2-50e3-4dd1-8526-e2b2f1c48cd0 req-b35f8eb1-c16a-4d17-9fb7-26bff1c862c1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "370a976e-b51f-4187-bf76-bf6cbcae956b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:18 compute-0 nova_compute[192810]: 2025-09-30 21:22:18.894 2 DEBUG oslo_concurrency.lockutils [req-ae1047c2-50e3-4dd1-8526-e2b2f1c48cd0 req-b35f8eb1-c16a-4d17-9fb7-26bff1c862c1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "370a976e-b51f-4187-bf76-bf6cbcae956b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:18 compute-0 nova_compute[192810]: 2025-09-30 21:22:18.894 2 DEBUG nova.compute.manager [req-ae1047c2-50e3-4dd1-8526-e2b2f1c48cd0 req-b35f8eb1-c16a-4d17-9fb7-26bff1c862c1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] No waiting events found dispatching network-vif-plugged-ce9edf2f-fd59-406b-b9c3-5b314d995fe9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:22:18 compute-0 nova_compute[192810]: 2025-09-30 21:22:18.895 2 WARNING nova.compute.manager [req-ae1047c2-50e3-4dd1-8526-e2b2f1c48cd0 req-b35f8eb1-c16a-4d17-9fb7-26bff1c862c1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Received unexpected event network-vif-plugged-ce9edf2f-fd59-406b-b9c3-5b314d995fe9 for instance with vm_state deleted and task_state None.
Sep 30 21:22:18 compute-0 nova_compute[192810]: 2025-09-30 21:22:18.903 2 DEBUG oslo_concurrency.lockutils [None req-25cfd4c6-b8bb-4eec-91f5-869ee03d3580 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:18 compute-0 nova_compute[192810]: 2025-09-30 21:22:18.945 2 INFO nova.scheduler.client.report [None req-25cfd4c6-b8bb-4eec-91f5-869ee03d3580 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Deleted allocations for instance 370a976e-b51f-4187-bf76-bf6cbcae956b
Sep 30 21:22:19 compute-0 nova_compute[192810]: 2025-09-30 21:22:19.045 2 DEBUG oslo_concurrency.lockutils [None req-25cfd4c6-b8bb-4eec-91f5-869ee03d3580 c536ea061a32492a8c5e6bf941d1c9f3 47d2c796445c4dd3affc8594502f04be - - default default] Lock "370a976e-b51f-4187-bf76-bf6cbcae956b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.834s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:19 compute-0 podman[225366]: 2025-09-30 21:22:19.365918211 +0000 UTC m=+0.089177372 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.223 2 DEBUG oslo_concurrency.lockutils [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "5c749a3a-92bd-47ce-a966-33f62c7e3019" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.223 2 DEBUG oslo_concurrency.lockutils [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "5c749a3a-92bd-47ce-a966-33f62c7e3019" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.237 2 DEBUG nova.compute.manager [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.292 2 DEBUG oslo_concurrency.lockutils [None req-5f950a61-203a-44a6-9dcb-aed604f47a6e fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Acquiring lock "3d12b579-88e1-415f-b0a2-0e8b17628076" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.293 2 DEBUG oslo_concurrency.lockutils [None req-5f950a61-203a-44a6-9dcb-aed604f47a6e fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "3d12b579-88e1-415f-b0a2-0e8b17628076" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.293 2 DEBUG oslo_concurrency.lockutils [None req-5f950a61-203a-44a6-9dcb-aed604f47a6e fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Acquiring lock "3d12b579-88e1-415f-b0a2-0e8b17628076-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.293 2 DEBUG oslo_concurrency.lockutils [None req-5f950a61-203a-44a6-9dcb-aed604f47a6e fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "3d12b579-88e1-415f-b0a2-0e8b17628076-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.294 2 DEBUG oslo_concurrency.lockutils [None req-5f950a61-203a-44a6-9dcb-aed604f47a6e fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "3d12b579-88e1-415f-b0a2-0e8b17628076-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.340 2 DEBUG oslo_concurrency.lockutils [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.341 2 DEBUG oslo_concurrency.lockutils [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.349 2 DEBUG nova.virt.hardware [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.350 2 INFO nova.compute.claims [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.377 2 INFO nova.compute.manager [None req-5f950a61-203a-44a6-9dcb-aed604f47a6e fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Terminating instance
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.391 2 DEBUG nova.compute.manager [None req-5f950a61-203a-44a6-9dcb-aed604f47a6e fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:22:21 compute-0 kernel: tap483ff4db-c1 (unregistering): left promiscuous mode
Sep 30 21:22:21 compute-0 NetworkManager[51733]: <info>  [1759267341.4174] device (tap483ff4db-c1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:22:21 compute-0 ovn_controller[94912]: 2025-09-30T21:22:21Z|00158|binding|INFO|Releasing lport 483ff4db-c136-4dea-b809-0029e29cee54 from this chassis (sb_readonly=0)
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:21 compute-0 ovn_controller[94912]: 2025-09-30T21:22:21Z|00159|binding|INFO|Setting lport 483ff4db-c136-4dea-b809-0029e29cee54 down in Southbound
Sep 30 21:22:21 compute-0 ovn_controller[94912]: 2025-09-30T21:22:21Z|00160|binding|INFO|Removing iface tap483ff4db-c1 ovn-installed in OVS
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.430 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:21.439 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:60:29:61 10.100.0.13'], port_security=['fa:16:3e:60:29:61 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '3d12b579-88e1-415f-b0a2-0e8b17628076', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7ae61a95-12e5-48ee-93a0-85e12f8652eb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '54fb4b43d65542abbb73044c6d52da8a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0c0ea76a-15be-4f11-88b8-d75d482afe6f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c761993a-b179-4f1b-8dc7-b7ebb1a3e32b, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=483ff4db-c136-4dea-b809-0029e29cee54) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:22:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:21.443 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 483ff4db-c136-4dea-b809-0029e29cee54 in datapath 7ae61a95-12e5-48ee-93a0-85e12f8652eb unbound from our chassis
Sep 30 21:22:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:21.446 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7ae61a95-12e5-48ee-93a0-85e12f8652eb
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:21.466 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[81cb8539-13a1-41d6-85a4-0d7fb0deac67]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:21 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000001f.scope: Deactivated successfully.
Sep 30 21:22:21 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000001f.scope: Consumed 14.511s CPU time.
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.498 2 DEBUG nova.compute.provider_tree [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:22:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:21.497 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[33f787c2-1712-4a54-ad06-69eb32e71837]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:21 compute-0 systemd-machined[152794]: Machine qemu-13-instance-0000001f terminated.
Sep 30 21:22:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:21.503 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[6844a7fa-a042-4665-8e5a-a1d8a6174f30]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.511 2 DEBUG nova.scheduler.client.report [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.533 2 DEBUG oslo_concurrency.lockutils [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.193s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.534 2 DEBUG nova.compute.manager [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:22:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:21.535 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[c9cf5a07-c712-45fa-8d2a-d5c43c5b6631]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:21.556 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[afef39b2-04e1-4258-9803-cee81c2abddf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7ae61a95-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:55:1a:c2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 15, 'rx_bytes': 1000, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 15, 'rx_bytes': 1000, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 396566, 'reachable_time': 17378, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225400, 'error': None, 'target': 'ovnmeta-7ae61a95-12e5-48ee-93a0-85e12f8652eb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.578 2 DEBUG nova.compute.manager [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:22:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:21.578 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[ff8a06c1-5376-467f-8fa0-f846d082e057]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap7ae61a95-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 396582, 'tstamp': 396582}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225401, 'error': None, 'target': 'ovnmeta-7ae61a95-12e5-48ee-93a0-85e12f8652eb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap7ae61a95-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 396587, 'tstamp': 396587}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225401, 'error': None, 'target': 'ovnmeta-7ae61a95-12e5-48ee-93a0-85e12f8652eb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.579 2 DEBUG nova.network.neutron [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:22:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:21.580 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7ae61a95-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.582 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:21.587 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7ae61a95-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:22:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:21.588 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:22:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:21.588 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7ae61a95-10, col_values=(('external_ids', {'iface-id': '49c3a39c-1855-4432-8459-a8f7bc7429b4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:22:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:21.589 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.594 2 INFO nova.virt.libvirt.driver [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.617 2 DEBUG nova.compute.manager [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.631 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.660 2 DEBUG nova.compute.manager [req-b4926c03-0ca3-4c51-a1e7-b24358a589e5 req-c1141b40-18f5-408f-9087-fc83d0c0d12f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Received event network-vif-unplugged-483ff4db-c136-4dea-b809-0029e29cee54 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.661 2 DEBUG oslo_concurrency.lockutils [req-b4926c03-0ca3-4c51-a1e7-b24358a589e5 req-c1141b40-18f5-408f-9087-fc83d0c0d12f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3d12b579-88e1-415f-b0a2-0e8b17628076-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.661 2 DEBUG oslo_concurrency.lockutils [req-b4926c03-0ca3-4c51-a1e7-b24358a589e5 req-c1141b40-18f5-408f-9087-fc83d0c0d12f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3d12b579-88e1-415f-b0a2-0e8b17628076-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.661 2 DEBUG oslo_concurrency.lockutils [req-b4926c03-0ca3-4c51-a1e7-b24358a589e5 req-c1141b40-18f5-408f-9087-fc83d0c0d12f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3d12b579-88e1-415f-b0a2-0e8b17628076-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.661 2 DEBUG nova.compute.manager [req-b4926c03-0ca3-4c51-a1e7-b24358a589e5 req-c1141b40-18f5-408f-9087-fc83d0c0d12f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] No waiting events found dispatching network-vif-unplugged-483ff4db-c136-4dea-b809-0029e29cee54 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:22:21 compute-0 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.666 2 DEBUG nova.compute.manager [req-b4926c03-0ca3-4c51-a1e7-b24358a589e5 req-c1141b40-18f5-408f-9087-fc83d0c0d12f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Received event network-vif-unplugged-483ff4db-c136-4dea-b809-0029e29cee54 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.666 2 INFO nova.virt.libvirt.driver [-] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Instance destroyed successfully.
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.667 2 DEBUG nova.objects.instance [None req-5f950a61-203a-44a6-9dcb-aed604f47a6e fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lazy-loading 'resources' on Instance uuid 3d12b579-88e1-415f-b0a2-0e8b17628076 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.687 2 DEBUG nova.virt.libvirt.vif [None req-5f950a61-203a-44a6-9dcb-aed604f47a6e fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:21:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-965591303',display_name='tempest-ServersAdminTestJSON-server-965591303',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-965591303',id=31,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:21:24Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='54fb4b43d65542abbb73044c6d52da8a',ramdisk_id='',reservation_id='r-hsly4ojy',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-302955212',owner_user_name='tempest-ServersAdminTestJSON-302955212-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:21:24Z,user_data=None,user_id='fdcdab028fdf46d7ba6634c631d3d33c',uuid=3d12b579-88e1-415f-b0a2-0e8b17628076,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "483ff4db-c136-4dea-b809-0029e29cee54", "address": "fa:16:3e:60:29:61", "network": {"id": "7ae61a95-12e5-48ee-93a0-85e12f8652eb", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-47213222-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "54fb4b43d65542abbb73044c6d52da8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap483ff4db-c1", "ovs_interfaceid": "483ff4db-c136-4dea-b809-0029e29cee54", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.687 2 DEBUG nova.network.os_vif_util [None req-5f950a61-203a-44a6-9dcb-aed604f47a6e fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Converting VIF {"id": "483ff4db-c136-4dea-b809-0029e29cee54", "address": "fa:16:3e:60:29:61", "network": {"id": "7ae61a95-12e5-48ee-93a0-85e12f8652eb", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-47213222-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "54fb4b43d65542abbb73044c6d52da8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap483ff4db-c1", "ovs_interfaceid": "483ff4db-c136-4dea-b809-0029e29cee54", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.688 2 DEBUG nova.network.os_vif_util [None req-5f950a61-203a-44a6-9dcb-aed604f47a6e fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:60:29:61,bridge_name='br-int',has_traffic_filtering=True,id=483ff4db-c136-4dea-b809-0029e29cee54,network=Network(7ae61a95-12e5-48ee-93a0-85e12f8652eb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap483ff4db-c1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.688 2 DEBUG os_vif [None req-5f950a61-203a-44a6-9dcb-aed604f47a6e fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:29:61,bridge_name='br-int',has_traffic_filtering=True,id=483ff4db-c136-4dea-b809-0029e29cee54,network=Network(7ae61a95-12e5-48ee-93a0-85e12f8652eb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap483ff4db-c1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.690 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap483ff4db-c1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.695 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.697 2 INFO os_vif [None req-5f950a61-203a-44a6-9dcb-aed604f47a6e fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:60:29:61,bridge_name='br-int',has_traffic_filtering=True,id=483ff4db-c136-4dea-b809-0029e29cee54,network=Network(7ae61a95-12e5-48ee-93a0-85e12f8652eb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap483ff4db-c1')
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.698 2 INFO nova.virt.libvirt.driver [None req-5f950a61-203a-44a6-9dcb-aed604f47a6e fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Deleting instance files /var/lib/nova/instances/3d12b579-88e1-415f-b0a2-0e8b17628076_del
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.698 2 INFO nova.virt.libvirt.driver [None req-5f950a61-203a-44a6-9dcb-aed604f47a6e fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Deletion of /var/lib/nova/instances/3d12b579-88e1-415f-b0a2-0e8b17628076_del complete
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.766 2 DEBUG nova.compute.manager [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.768 2 DEBUG nova.virt.libvirt.driver [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.769 2 INFO nova.virt.libvirt.driver [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Creating image(s)
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.769 2 DEBUG oslo_concurrency.lockutils [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "/var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.769 2 DEBUG oslo_concurrency.lockutils [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "/var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.770 2 DEBUG oslo_concurrency.lockutils [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "/var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.782 2 DEBUG oslo_concurrency.processutils [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.805 2 INFO nova.compute.manager [None req-5f950a61-203a-44a6-9dcb-aed604f47a6e fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Took 0.41 seconds to destroy the instance on the hypervisor.
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.806 2 DEBUG oslo.service.loopingcall [None req-5f950a61-203a-44a6-9dcb-aed604f47a6e fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.807 2 DEBUG nova.compute.manager [-] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.807 2 DEBUG nova.network.neutron [-] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.839 2 DEBUG oslo_concurrency.processutils [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.839 2 DEBUG oslo_concurrency.lockutils [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.840 2 DEBUG oslo_concurrency.lockutils [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.850 2 DEBUG oslo_concurrency.processutils [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.910 2 DEBUG oslo_concurrency.processutils [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.911 2 DEBUG oslo_concurrency.processutils [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.945 2 DEBUG oslo_concurrency.processutils [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.948 2 DEBUG oslo_concurrency.lockutils [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.948 2 DEBUG oslo_concurrency.processutils [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.967 2 DEBUG nova.network.neutron [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Sep 30 21:22:21 compute-0 nova_compute[192810]: 2025-09-30 21:22:21.968 2 DEBUG nova.compute.manager [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:22:22 compute-0 nova_compute[192810]: 2025-09-30 21:22:22.005 2 DEBUG oslo_concurrency.processutils [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:22:22 compute-0 nova_compute[192810]: 2025-09-30 21:22:22.006 2 DEBUG nova.virt.disk.api [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Checking if we can resize image /var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:22:22 compute-0 nova_compute[192810]: 2025-09-30 21:22:22.007 2 DEBUG oslo_concurrency.processutils [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:22:22 compute-0 nova_compute[192810]: 2025-09-30 21:22:22.063 2 DEBUG oslo_concurrency.processutils [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:22:22 compute-0 nova_compute[192810]: 2025-09-30 21:22:22.065 2 DEBUG nova.virt.disk.api [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Cannot resize image /var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:22:22 compute-0 nova_compute[192810]: 2025-09-30 21:22:22.065 2 DEBUG nova.objects.instance [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lazy-loading 'migration_context' on Instance uuid 5c749a3a-92bd-47ce-a966-33f62c7e3019 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:22:22 compute-0 nova_compute[192810]: 2025-09-30 21:22:22.079 2 DEBUG nova.virt.libvirt.driver [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:22:22 compute-0 nova_compute[192810]: 2025-09-30 21:22:22.080 2 DEBUG nova.virt.libvirt.driver [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Ensure instance console log exists: /var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:22:22 compute-0 nova_compute[192810]: 2025-09-30 21:22:22.081 2 DEBUG oslo_concurrency.lockutils [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:22 compute-0 nova_compute[192810]: 2025-09-30 21:22:22.081 2 DEBUG oslo_concurrency.lockutils [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:22 compute-0 nova_compute[192810]: 2025-09-30 21:22:22.082 2 DEBUG oslo_concurrency.lockutils [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:22 compute-0 nova_compute[192810]: 2025-09-30 21:22:22.084 2 DEBUG nova.virt.libvirt.driver [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:22:22 compute-0 nova_compute[192810]: 2025-09-30 21:22:22.091 2 WARNING nova.virt.libvirt.driver [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:22:22 compute-0 nova_compute[192810]: 2025-09-30 21:22:22.100 2 DEBUG nova.virt.libvirt.host [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:22:22 compute-0 nova_compute[192810]: 2025-09-30 21:22:22.101 2 DEBUG nova.virt.libvirt.host [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:22:22 compute-0 nova_compute[192810]: 2025-09-30 21:22:22.105 2 DEBUG nova.virt.libvirt.host [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:22:22 compute-0 nova_compute[192810]: 2025-09-30 21:22:22.106 2 DEBUG nova.virt.libvirt.host [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:22:22 compute-0 nova_compute[192810]: 2025-09-30 21:22:22.108 2 DEBUG nova.virt.libvirt.driver [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:22:22 compute-0 nova_compute[192810]: 2025-09-30 21:22:22.108 2 DEBUG nova.virt.hardware [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:22:22 compute-0 nova_compute[192810]: 2025-09-30 21:22:22.109 2 DEBUG nova.virt.hardware [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:22:22 compute-0 nova_compute[192810]: 2025-09-30 21:22:22.109 2 DEBUG nova.virt.hardware [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:22:22 compute-0 nova_compute[192810]: 2025-09-30 21:22:22.109 2 DEBUG nova.virt.hardware [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:22:22 compute-0 nova_compute[192810]: 2025-09-30 21:22:22.110 2 DEBUG nova.virt.hardware [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:22:22 compute-0 nova_compute[192810]: 2025-09-30 21:22:22.110 2 DEBUG nova.virt.hardware [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:22:22 compute-0 nova_compute[192810]: 2025-09-30 21:22:22.110 2 DEBUG nova.virt.hardware [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:22:22 compute-0 nova_compute[192810]: 2025-09-30 21:22:22.111 2 DEBUG nova.virt.hardware [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:22:22 compute-0 nova_compute[192810]: 2025-09-30 21:22:22.111 2 DEBUG nova.virt.hardware [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:22:22 compute-0 nova_compute[192810]: 2025-09-30 21:22:22.111 2 DEBUG nova.virt.hardware [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:22:22 compute-0 nova_compute[192810]: 2025-09-30 21:22:22.112 2 DEBUG nova.virt.hardware [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:22:22 compute-0 nova_compute[192810]: 2025-09-30 21:22:22.116 2 DEBUG nova.objects.instance [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lazy-loading 'pci_devices' on Instance uuid 5c749a3a-92bd-47ce-a966-33f62c7e3019 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:22:22 compute-0 nova_compute[192810]: 2025-09-30 21:22:22.130 2 DEBUG nova.virt.libvirt.driver [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:22:22 compute-0 nova_compute[192810]:   <uuid>5c749a3a-92bd-47ce-a966-33f62c7e3019</uuid>
Sep 30 21:22:22 compute-0 nova_compute[192810]:   <name>instance-00000025</name>
Sep 30 21:22:22 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:22:22 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:22:22 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:22:22 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:22:22 compute-0 nova_compute[192810]:       <nova:name>tempest-MigrationsAdminTest-server-565661956</nova:name>
Sep 30 21:22:22 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:22:22</nova:creationTime>
Sep 30 21:22:22 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:22:22 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:22:22 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:22:22 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:22:22 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:22:22 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:22:22 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:22:22 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:22:22 compute-0 nova_compute[192810]:         <nova:user uuid="f11be289e05b49a7809cb1a3523abc0c">tempest-MigrationsAdminTest-1333693346-project-member</nova:user>
Sep 30 21:22:22 compute-0 nova_compute[192810]:         <nova:project uuid="7c15359849554c2382315de9f52125af">tempest-MigrationsAdminTest-1333693346</nova:project>
Sep 30 21:22:22 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:22:22 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:22:22 compute-0 nova_compute[192810]:       <nova:ports/>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:22:22 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:22:22 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:22:22 compute-0 nova_compute[192810]:     <system>
Sep 30 21:22:22 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:22:22 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:22:22 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:22:22 compute-0 nova_compute[192810]:       <entry name="serial">5c749a3a-92bd-47ce-a966-33f62c7e3019</entry>
Sep 30 21:22:22 compute-0 nova_compute[192810]:       <entry name="uuid">5c749a3a-92bd-47ce-a966-33f62c7e3019</entry>
Sep 30 21:22:22 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     </system>
Sep 30 21:22:22 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:22:22 compute-0 nova_compute[192810]:   <os>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:22:22 compute-0 nova_compute[192810]:   </os>
Sep 30 21:22:22 compute-0 nova_compute[192810]:   <features>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:22:22 compute-0 nova_compute[192810]:   </features>
Sep 30 21:22:22 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:22:22 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:22:22 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:22:22 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:22:22 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:22:22 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:22:22 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:22:22 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:22:22 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/disk"/>
Sep 30 21:22:22 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:22:22 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:22:22 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/disk.config"/>
Sep 30 21:22:22 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:22:22 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/console.log" append="off"/>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     <video>
Sep 30 21:22:22 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     </video>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:22:22 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:22:22 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:22:22 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:22:22 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:22:22 compute-0 nova_compute[192810]: </domain>
Sep 30 21:22:22 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:22:22 compute-0 nova_compute[192810]: 2025-09-30 21:22:22.181 2 DEBUG nova.virt.libvirt.driver [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:22:22 compute-0 nova_compute[192810]: 2025-09-30 21:22:22.181 2 DEBUG nova.virt.libvirt.driver [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:22:22 compute-0 nova_compute[192810]: 2025-09-30 21:22:22.182 2 INFO nova.virt.libvirt.driver [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Using config drive
Sep 30 21:22:22 compute-0 nova_compute[192810]: 2025-09-30 21:22:22.386 2 INFO nova.virt.libvirt.driver [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Creating config drive at /var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/disk.config
Sep 30 21:22:22 compute-0 nova_compute[192810]: 2025-09-30 21:22:22.396 2 DEBUG oslo_concurrency.processutils [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0o0zn2t1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:22:22 compute-0 nova_compute[192810]: 2025-09-30 21:22:22.446 2 DEBUG nova.network.neutron [-] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:22:22 compute-0 nova_compute[192810]: 2025-09-30 21:22:22.471 2 INFO nova.compute.manager [-] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Took 0.66 seconds to deallocate network for instance.
Sep 30 21:22:22 compute-0 nova_compute[192810]: 2025-09-30 21:22:22.535 2 DEBUG oslo_concurrency.processutils [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0o0zn2t1" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:22:22 compute-0 nova_compute[192810]: 2025-09-30 21:22:22.581 2 DEBUG oslo_concurrency.lockutils [None req-5f950a61-203a-44a6-9dcb-aed604f47a6e fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:22 compute-0 nova_compute[192810]: 2025-09-30 21:22:22.581 2 DEBUG oslo_concurrency.lockutils [None req-5f950a61-203a-44a6-9dcb-aed604f47a6e fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:22 compute-0 systemd-machined[152794]: New machine qemu-17-instance-00000025.
Sep 30 21:22:22 compute-0 systemd[1]: Started Virtual Machine qemu-17-instance-00000025.
Sep 30 21:22:22 compute-0 nova_compute[192810]: 2025-09-30 21:22:22.675 2 DEBUG nova.compute.provider_tree [None req-5f950a61-203a-44a6-9dcb-aed604f47a6e fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:22:22 compute-0 nova_compute[192810]: 2025-09-30 21:22:22.691 2 DEBUG nova.scheduler.client.report [None req-5f950a61-203a-44a6-9dcb-aed604f47a6e fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:22:22 compute-0 sshd[128205]: drop connection #1 from [49.64.169.153]:57288 on [38.102.83.69]:22 penalty: exceeded LoginGraceTime
Sep 30 21:22:22 compute-0 nova_compute[192810]: 2025-09-30 21:22:22.708 2 DEBUG oslo_concurrency.lockutils [None req-5f950a61-203a-44a6-9dcb-aed604f47a6e fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:22 compute-0 nova_compute[192810]: 2025-09-30 21:22:22.736 2 INFO nova.scheduler.client.report [None req-5f950a61-203a-44a6-9dcb-aed604f47a6e fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Deleted allocations for instance 3d12b579-88e1-415f-b0a2-0e8b17628076
Sep 30 21:22:22 compute-0 nova_compute[192810]: 2025-09-30 21:22:22.876 2 DEBUG oslo_concurrency.lockutils [None req-5f950a61-203a-44a6-9dcb-aed604f47a6e fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "3d12b579-88e1-415f-b0a2-0e8b17628076" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.583s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:23 compute-0 podman[225463]: 2025-09-30 21:22:23.347306638 +0000 UTC m=+0.074228409 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Sep 30 21:22:23 compute-0 podman[225464]: 2025-09-30 21:22:23.349920083 +0000 UTC m=+0.082698360 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:22:23 compute-0 nova_compute[192810]: 2025-09-30 21:22:23.689 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267343.6887214, 5c749a3a-92bd-47ce-a966-33f62c7e3019 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:22:23 compute-0 nova_compute[192810]: 2025-09-30 21:22:23.690 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] VM Resumed (Lifecycle Event)
Sep 30 21:22:23 compute-0 nova_compute[192810]: 2025-09-30 21:22:23.694 2 DEBUG nova.compute.manager [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:22:23 compute-0 nova_compute[192810]: 2025-09-30 21:22:23.694 2 DEBUG nova.virt.libvirt.driver [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:22:23 compute-0 nova_compute[192810]: 2025-09-30 21:22:23.699 2 INFO nova.virt.libvirt.driver [-] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Instance spawned successfully.
Sep 30 21:22:23 compute-0 nova_compute[192810]: 2025-09-30 21:22:23.703 2 DEBUG nova.virt.libvirt.driver [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:22:23 compute-0 nova_compute[192810]: 2025-09-30 21:22:23.711 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:22:23 compute-0 nova_compute[192810]: 2025-09-30 21:22:23.714 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:22:23 compute-0 nova_compute[192810]: 2025-09-30 21:22:23.725 2 DEBUG nova.virt.libvirt.driver [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:22:23 compute-0 nova_compute[192810]: 2025-09-30 21:22:23.726 2 DEBUG nova.virt.libvirt.driver [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:22:23 compute-0 nova_compute[192810]: 2025-09-30 21:22:23.726 2 DEBUG nova.virt.libvirt.driver [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:22:23 compute-0 nova_compute[192810]: 2025-09-30 21:22:23.727 2 DEBUG nova.virt.libvirt.driver [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:22:23 compute-0 nova_compute[192810]: 2025-09-30 21:22:23.728 2 DEBUG nova.virt.libvirt.driver [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:22:23 compute-0 nova_compute[192810]: 2025-09-30 21:22:23.729 2 DEBUG nova.virt.libvirt.driver [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:22:23 compute-0 nova_compute[192810]: 2025-09-30 21:22:23.741 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:22:23 compute-0 nova_compute[192810]: 2025-09-30 21:22:23.742 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267343.6890318, 5c749a3a-92bd-47ce-a966-33f62c7e3019 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:22:23 compute-0 nova_compute[192810]: 2025-09-30 21:22:23.742 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] VM Started (Lifecycle Event)
Sep 30 21:22:23 compute-0 nova_compute[192810]: 2025-09-30 21:22:23.763 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:22:23 compute-0 nova_compute[192810]: 2025-09-30 21:22:23.766 2 DEBUG nova.compute.manager [req-3a8976f0-1a6b-4bcd-834f-652fcaa2c832 req-f4136303-ba45-420c-bbca-67f3e82c1ecd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Received event network-vif-plugged-483ff4db-c136-4dea-b809-0029e29cee54 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:22:23 compute-0 nova_compute[192810]: 2025-09-30 21:22:23.766 2 DEBUG oslo_concurrency.lockutils [req-3a8976f0-1a6b-4bcd-834f-652fcaa2c832 req-f4136303-ba45-420c-bbca-67f3e82c1ecd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3d12b579-88e1-415f-b0a2-0e8b17628076-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:23 compute-0 nova_compute[192810]: 2025-09-30 21:22:23.766 2 DEBUG oslo_concurrency.lockutils [req-3a8976f0-1a6b-4bcd-834f-652fcaa2c832 req-f4136303-ba45-420c-bbca-67f3e82c1ecd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3d12b579-88e1-415f-b0a2-0e8b17628076-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:23 compute-0 nova_compute[192810]: 2025-09-30 21:22:23.766 2 DEBUG oslo_concurrency.lockutils [req-3a8976f0-1a6b-4bcd-834f-652fcaa2c832 req-f4136303-ba45-420c-bbca-67f3e82c1ecd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3d12b579-88e1-415f-b0a2-0e8b17628076-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:23 compute-0 nova_compute[192810]: 2025-09-30 21:22:23.766 2 DEBUG nova.compute.manager [req-3a8976f0-1a6b-4bcd-834f-652fcaa2c832 req-f4136303-ba45-420c-bbca-67f3e82c1ecd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] No waiting events found dispatching network-vif-plugged-483ff4db-c136-4dea-b809-0029e29cee54 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:22:23 compute-0 nova_compute[192810]: 2025-09-30 21:22:23.766 2 WARNING nova.compute.manager [req-3a8976f0-1a6b-4bcd-834f-652fcaa2c832 req-f4136303-ba45-420c-bbca-67f3e82c1ecd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Received unexpected event network-vif-plugged-483ff4db-c136-4dea-b809-0029e29cee54 for instance with vm_state deleted and task_state None.
Sep 30 21:22:23 compute-0 nova_compute[192810]: 2025-09-30 21:22:23.767 2 DEBUG nova.compute.manager [req-3a8976f0-1a6b-4bcd-834f-652fcaa2c832 req-f4136303-ba45-420c-bbca-67f3e82c1ecd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Received event network-vif-deleted-483ff4db-c136-4dea-b809-0029e29cee54 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:22:23 compute-0 nova_compute[192810]: 2025-09-30 21:22:23.770 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:22:23 compute-0 nova_compute[192810]: 2025-09-30 21:22:23.828 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:22:23 compute-0 nova_compute[192810]: 2025-09-30 21:22:23.836 2 INFO nova.compute.manager [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Took 2.07 seconds to spawn the instance on the hypervisor.
Sep 30 21:22:23 compute-0 nova_compute[192810]: 2025-09-30 21:22:23.836 2 DEBUG nova.compute.manager [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:22:23 compute-0 nova_compute[192810]: 2025-09-30 21:22:23.930 2 INFO nova.compute.manager [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Took 2.64 seconds to build instance.
Sep 30 21:22:23 compute-0 nova_compute[192810]: 2025-09-30 21:22:23.944 2 DEBUG oslo_concurrency.lockutils [None req-361dd465-d9e4-4a34-b30f-66ac24721850 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "5c749a3a-92bd-47ce-a966-33f62c7e3019" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.721s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:25 compute-0 nova_compute[192810]: 2025-09-30 21:22:25.986 2 DEBUG oslo_concurrency.lockutils [None req-b0d84102-afcc-443c-a700-f315313a1215 fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Acquiring lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:25 compute-0 nova_compute[192810]: 2025-09-30 21:22:25.986 2 DEBUG oslo_concurrency.lockutils [None req-b0d84102-afcc-443c-a700-f315313a1215 fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:25 compute-0 nova_compute[192810]: 2025-09-30 21:22:25.987 2 DEBUG oslo_concurrency.lockutils [None req-b0d84102-afcc-443c-a700-f315313a1215 fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Acquiring lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:25 compute-0 nova_compute[192810]: 2025-09-30 21:22:25.987 2 DEBUG oslo_concurrency.lockutils [None req-b0d84102-afcc-443c-a700-f315313a1215 fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:25 compute-0 nova_compute[192810]: 2025-09-30 21:22:25.988 2 DEBUG oslo_concurrency.lockutils [None req-b0d84102-afcc-443c-a700-f315313a1215 fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:26 compute-0 nova_compute[192810]: 2025-09-30 21:22:26.002 2 INFO nova.compute.manager [None req-b0d84102-afcc-443c-a700-f315313a1215 fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Terminating instance
Sep 30 21:22:26 compute-0 nova_compute[192810]: 2025-09-30 21:22:26.017 2 DEBUG nova.compute.manager [None req-b0d84102-afcc-443c-a700-f315313a1215 fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:22:26 compute-0 kernel: tap0ad4c684-45 (unregistering): left promiscuous mode
Sep 30 21:22:26 compute-0 NetworkManager[51733]: <info>  [1759267346.0477] device (tap0ad4c684-45): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:22:26 compute-0 ovn_controller[94912]: 2025-09-30T21:22:26Z|00161|binding|INFO|Releasing lport 0ad4c684-4514-4550-b708-6339f933766c from this chassis (sb_readonly=0)
Sep 30 21:22:26 compute-0 ovn_controller[94912]: 2025-09-30T21:22:26Z|00162|binding|INFO|Setting lport 0ad4c684-4514-4550-b708-6339f933766c down in Southbound
Sep 30 21:22:26 compute-0 nova_compute[192810]: 2025-09-30 21:22:26.116 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:26 compute-0 ovn_controller[94912]: 2025-09-30T21:22:26Z|00163|binding|INFO|Removing iface tap0ad4c684-45 ovn-installed in OVS
Sep 30 21:22:26 compute-0 nova_compute[192810]: 2025-09-30 21:22:26.118 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:26.123 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:12:42 10.100.0.14'], port_security=['fa:16:3e:c1:12:42 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'c79f1121-fcfb-4c07-94cf-1389e1df9e81', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7ae61a95-12e5-48ee-93a0-85e12f8652eb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '54fb4b43d65542abbb73044c6d52da8a', 'neutron:revision_number': '8', 'neutron:security_group_ids': '0c0ea76a-15be-4f11-88b8-d75d482afe6f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c761993a-b179-4f1b-8dc7-b7ebb1a3e32b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=0ad4c684-4514-4550-b708-6339f933766c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:22:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:26.124 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 0ad4c684-4514-4550-b708-6339f933766c in datapath 7ae61a95-12e5-48ee-93a0-85e12f8652eb unbound from our chassis
Sep 30 21:22:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:26.125 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7ae61a95-12e5-48ee-93a0-85e12f8652eb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:22:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:26.139 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[57489ce2-128e-4afc-a03a-cd9bb00d30d4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:26.140 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7ae61a95-12e5-48ee-93a0-85e12f8652eb namespace which is not needed anymore
Sep 30 21:22:26 compute-0 nova_compute[192810]: 2025-09-30 21:22:26.142 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:26 compute-0 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d0000001c.scope: Deactivated successfully.
Sep 30 21:22:26 compute-0 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d0000001c.scope: Consumed 11.252s CPU time.
Sep 30 21:22:26 compute-0 systemd-machined[152794]: Machine qemu-16-instance-0000001c terminated.
Sep 30 21:22:26 compute-0 systemd-udevd[225505]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:22:26 compute-0 kernel: tap0ad4c684-45: entered promiscuous mode
Sep 30 21:22:26 compute-0 kernel: tap0ad4c684-45 (unregistering): left promiscuous mode
Sep 30 21:22:26 compute-0 ovn_controller[94912]: 2025-09-30T21:22:26Z|00164|binding|INFO|Claiming lport 0ad4c684-4514-4550-b708-6339f933766c for this chassis.
Sep 30 21:22:26 compute-0 ovn_controller[94912]: 2025-09-30T21:22:26Z|00165|binding|INFO|0ad4c684-4514-4550-b708-6339f933766c: Claiming fa:16:3e:c1:12:42 10.100.0.14
Sep 30 21:22:26 compute-0 nova_compute[192810]: 2025-09-30 21:22:26.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:26.262 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:12:42 10.100.0.14'], port_security=['fa:16:3e:c1:12:42 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'c79f1121-fcfb-4c07-94cf-1389e1df9e81', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7ae61a95-12e5-48ee-93a0-85e12f8652eb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '54fb4b43d65542abbb73044c6d52da8a', 'neutron:revision_number': '8', 'neutron:security_group_ids': '0c0ea76a-15be-4f11-88b8-d75d482afe6f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c761993a-b179-4f1b-8dc7-b7ebb1a3e32b, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=0ad4c684-4514-4550-b708-6339f933766c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:22:26 compute-0 ovn_controller[94912]: 2025-09-30T21:22:26Z|00166|binding|INFO|Setting lport 0ad4c684-4514-4550-b708-6339f933766c ovn-installed in OVS
Sep 30 21:22:26 compute-0 nova_compute[192810]: 2025-09-30 21:22:26.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:26 compute-0 ovn_controller[94912]: 2025-09-30T21:22:26Z|00167|binding|INFO|Setting lport 0ad4c684-4514-4550-b708-6339f933766c up in Southbound
Sep 30 21:22:26 compute-0 ovn_controller[94912]: 2025-09-30T21:22:26Z|00168|binding|INFO|Releasing lport 0ad4c684-4514-4550-b708-6339f933766c from this chassis (sb_readonly=1)
Sep 30 21:22:26 compute-0 ovn_controller[94912]: 2025-09-30T21:22:26Z|00169|binding|INFO|Removing iface tap0ad4c684-45 ovn-installed in OVS
Sep 30 21:22:26 compute-0 ovn_controller[94912]: 2025-09-30T21:22:26Z|00170|if_status|INFO|Dropped 2 log messages in last 214 seconds (most recently, 214 seconds ago) due to excessive rate
Sep 30 21:22:26 compute-0 ovn_controller[94912]: 2025-09-30T21:22:26Z|00171|if_status|INFO|Not setting lport 0ad4c684-4514-4550-b708-6339f933766c down as sb is readonly
Sep 30 21:22:26 compute-0 nova_compute[192810]: 2025-09-30 21:22:26.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:26 compute-0 ovn_controller[94912]: 2025-09-30T21:22:26Z|00172|binding|INFO|Releasing lport 0ad4c684-4514-4550-b708-6339f933766c from this chassis (sb_readonly=0)
Sep 30 21:22:26 compute-0 ovn_controller[94912]: 2025-09-30T21:22:26Z|00173|binding|INFO|Setting lport 0ad4c684-4514-4550-b708-6339f933766c down in Southbound
Sep 30 21:22:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:26.305 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:12:42 10.100.0.14'], port_security=['fa:16:3e:c1:12:42 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'c79f1121-fcfb-4c07-94cf-1389e1df9e81', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7ae61a95-12e5-48ee-93a0-85e12f8652eb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '54fb4b43d65542abbb73044c6d52da8a', 'neutron:revision_number': '8', 'neutron:security_group_ids': '0c0ea76a-15be-4f11-88b8-d75d482afe6f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c761993a-b179-4f1b-8dc7-b7ebb1a3e32b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=0ad4c684-4514-4550-b708-6339f933766c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:22:26 compute-0 neutron-haproxy-ovnmeta-7ae61a95-12e5-48ee-93a0-85e12f8652eb[224382]: [NOTICE]   (224418) : haproxy version is 2.8.14-c23fe91
Sep 30 21:22:26 compute-0 neutron-haproxy-ovnmeta-7ae61a95-12e5-48ee-93a0-85e12f8652eb[224382]: [NOTICE]   (224418) : path to executable is /usr/sbin/haproxy
Sep 30 21:22:26 compute-0 neutron-haproxy-ovnmeta-7ae61a95-12e5-48ee-93a0-85e12f8652eb[224382]: [WARNING]  (224418) : Exiting Master process...
Sep 30 21:22:26 compute-0 nova_compute[192810]: 2025-09-30 21:22:26.309 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:26 compute-0 neutron-haproxy-ovnmeta-7ae61a95-12e5-48ee-93a0-85e12f8652eb[224382]: [ALERT]    (224418) : Current worker (224420) exited with code 143 (Terminated)
Sep 30 21:22:26 compute-0 neutron-haproxy-ovnmeta-7ae61a95-12e5-48ee-93a0-85e12f8652eb[224382]: [WARNING]  (224418) : All workers exited. Exiting... (0)
Sep 30 21:22:26 compute-0 systemd[1]: libpod-d65533bba0e2ce22edafd1667882a8dc9da6ef64a34d6a520e7cc9b94b62a744.scope: Deactivated successfully.
Sep 30 21:22:26 compute-0 podman[225529]: 2025-09-30 21:22:26.319992551 +0000 UTC m=+0.058336393 container died d65533bba0e2ce22edafd1667882a8dc9da6ef64a34d6a520e7cc9b94b62a744 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7ae61a95-12e5-48ee-93a0-85e12f8652eb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923)
Sep 30 21:22:26 compute-0 nova_compute[192810]: 2025-09-30 21:22:26.330 2 INFO nova.virt.libvirt.driver [-] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Instance destroyed successfully.
Sep 30 21:22:26 compute-0 nova_compute[192810]: 2025-09-30 21:22:26.330 2 DEBUG nova.objects.instance [None req-b0d84102-afcc-443c-a700-f315313a1215 fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lazy-loading 'resources' on Instance uuid c79f1121-fcfb-4c07-94cf-1389e1df9e81 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:22:26 compute-0 nova_compute[192810]: 2025-09-30 21:22:26.343 2 DEBUG nova.virt.libvirt.vif [None req-b0d84102-afcc-443c-a700-f315313a1215 fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-09-30T21:21:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-988999069',display_name='tempest-ServersAdminTestJSON-server-988999069',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-988999069',id=28,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:22:15Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='54fb4b43d65542abbb73044c6d52da8a',ramdisk_id='',reservation_id='r-1cnrm9nr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='2',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-302955212',owner_user_name='tempest-ServersAdminTestJSON-302955212-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:22:17Z,user_data=None,user_id='fdcdab028fdf46d7ba6634c631d3d33c',uuid=c79f1121-fcfb-4c07-94cf-1389e1df9e81,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0ad4c684-4514-4550-b708-6339f933766c", "address": "fa:16:3e:c1:12:42", "network": {"id": "7ae61a95-12e5-48ee-93a0-85e12f8652eb", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-47213222-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "54fb4b43d65542abbb73044c6d52da8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ad4c684-45", "ovs_interfaceid": "0ad4c684-4514-4550-b708-6339f933766c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:22:26 compute-0 nova_compute[192810]: 2025-09-30 21:22:26.343 2 DEBUG nova.network.os_vif_util [None req-b0d84102-afcc-443c-a700-f315313a1215 fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Converting VIF {"id": "0ad4c684-4514-4550-b708-6339f933766c", "address": "fa:16:3e:c1:12:42", "network": {"id": "7ae61a95-12e5-48ee-93a0-85e12f8652eb", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-47213222-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "54fb4b43d65542abbb73044c6d52da8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ad4c684-45", "ovs_interfaceid": "0ad4c684-4514-4550-b708-6339f933766c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:22:26 compute-0 nova_compute[192810]: 2025-09-30 21:22:26.344 2 DEBUG nova.network.os_vif_util [None req-b0d84102-afcc-443c-a700-f315313a1215 fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c1:12:42,bridge_name='br-int',has_traffic_filtering=True,id=0ad4c684-4514-4550-b708-6339f933766c,network=Network(7ae61a95-12e5-48ee-93a0-85e12f8652eb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ad4c684-45') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:22:26 compute-0 nova_compute[192810]: 2025-09-30 21:22:26.345 2 DEBUG os_vif [None req-b0d84102-afcc-443c-a700-f315313a1215 fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c1:12:42,bridge_name='br-int',has_traffic_filtering=True,id=0ad4c684-4514-4550-b708-6339f933766c,network=Network(7ae61a95-12e5-48ee-93a0-85e12f8652eb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ad4c684-45') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:22:26 compute-0 nova_compute[192810]: 2025-09-30 21:22:26.346 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:26 compute-0 nova_compute[192810]: 2025-09-30 21:22:26.347 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0ad4c684-45, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:22:26 compute-0 nova_compute[192810]: 2025-09-30 21:22:26.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:26 compute-0 nova_compute[192810]: 2025-09-30 21:22:26.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:26 compute-0 nova_compute[192810]: 2025-09-30 21:22:26.352 2 INFO os_vif [None req-b0d84102-afcc-443c-a700-f315313a1215 fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c1:12:42,bridge_name='br-int',has_traffic_filtering=True,id=0ad4c684-4514-4550-b708-6339f933766c,network=Network(7ae61a95-12e5-48ee-93a0-85e12f8652eb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ad4c684-45')
Sep 30 21:22:26 compute-0 nova_compute[192810]: 2025-09-30 21:22:26.353 2 INFO nova.virt.libvirt.driver [None req-b0d84102-afcc-443c-a700-f315313a1215 fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Deleting instance files /var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81_del
Sep 30 21:22:26 compute-0 nova_compute[192810]: 2025-09-30 21:22:26.354 2 INFO nova.virt.libvirt.driver [None req-b0d84102-afcc-443c-a700-f315313a1215 fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Deletion of /var/lib/nova/instances/c79f1121-fcfb-4c07-94cf-1389e1df9e81_del complete
Sep 30 21:22:26 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d65533bba0e2ce22edafd1667882a8dc9da6ef64a34d6a520e7cc9b94b62a744-userdata-shm.mount: Deactivated successfully.
Sep 30 21:22:26 compute-0 systemd[1]: var-lib-containers-storage-overlay-ccc6a3900813832a587c2b27b5dfc3b3d0b4d0a098f4a9e4232685c8017ad3c0-merged.mount: Deactivated successfully.
Sep 30 21:22:26 compute-0 podman[225529]: 2025-09-30 21:22:26.371887984 +0000 UTC m=+0.110231826 container cleanup d65533bba0e2ce22edafd1667882a8dc9da6ef64a34d6a520e7cc9b94b62a744 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7ae61a95-12e5-48ee-93a0-85e12f8652eb, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Sep 30 21:22:26 compute-0 systemd[1]: libpod-conmon-d65533bba0e2ce22edafd1667882a8dc9da6ef64a34d6a520e7cc9b94b62a744.scope: Deactivated successfully.
Sep 30 21:22:26 compute-0 nova_compute[192810]: 2025-09-30 21:22:26.413 2 INFO nova.compute.manager [None req-b0d84102-afcc-443c-a700-f315313a1215 fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Took 0.40 seconds to destroy the instance on the hypervisor.
Sep 30 21:22:26 compute-0 nova_compute[192810]: 2025-09-30 21:22:26.413 2 DEBUG oslo.service.loopingcall [None req-b0d84102-afcc-443c-a700-f315313a1215 fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:22:26 compute-0 nova_compute[192810]: 2025-09-30 21:22:26.414 2 DEBUG nova.compute.manager [-] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:22:26 compute-0 nova_compute[192810]: 2025-09-30 21:22:26.414 2 DEBUG nova.network.neutron [-] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:22:26 compute-0 podman[225567]: 2025-09-30 21:22:26.453069205 +0000 UTC m=+0.050149840 container remove d65533bba0e2ce22edafd1667882a8dc9da6ef64a34d6a520e7cc9b94b62a744 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7ae61a95-12e5-48ee-93a0-85e12f8652eb, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:22:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:26.460 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[5c54a57a-45f3-43b4-9784-d5bb6563d6ad]: (4, ('Tue Sep 30 09:22:26 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-7ae61a95-12e5-48ee-93a0-85e12f8652eb (d65533bba0e2ce22edafd1667882a8dc9da6ef64a34d6a520e7cc9b94b62a744)\nd65533bba0e2ce22edafd1667882a8dc9da6ef64a34d6a520e7cc9b94b62a744\nTue Sep 30 09:22:26 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7ae61a95-12e5-48ee-93a0-85e12f8652eb (d65533bba0e2ce22edafd1667882a8dc9da6ef64a34d6a520e7cc9b94b62a744)\nd65533bba0e2ce22edafd1667882a8dc9da6ef64a34d6a520e7cc9b94b62a744\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:26.463 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[62ec275b-0fae-41a4-a635-d44dd97d521b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:26.464 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7ae61a95-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:22:26 compute-0 nova_compute[192810]: 2025-09-30 21:22:26.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:26 compute-0 kernel: tap7ae61a95-10: left promiscuous mode
Sep 30 21:22:26 compute-0 nova_compute[192810]: 2025-09-30 21:22:26.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:26.478 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[82760011-0296-4893-a4cb-07a3620cffca]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:26 compute-0 nova_compute[192810]: 2025-09-30 21:22:26.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:26.509 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[2273fa90-d3a8-4ce9-a8d0-880f43e2f3d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:26.510 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[8dfe39fd-0d43-466b-bfae-e38d9292a059]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:26.530 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[be99a00e-bb24-49f5-a3a8-408b8ca2a8b0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 396559, 'reachable_time': 32973, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225582, 'error': None, 'target': 'ovnmeta-7ae61a95-12e5-48ee-93a0-85e12f8652eb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:26 compute-0 systemd[1]: run-netns-ovnmeta\x2d7ae61a95\x2d12e5\x2d48ee\x2d93a0\x2d85e12f8652eb.mount: Deactivated successfully.
Sep 30 21:22:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:26.536 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7ae61a95-12e5-48ee-93a0-85e12f8652eb deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:22:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:26.536 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[3935cca7-a85a-48e4-87e2-e751a3ea9a48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:26.538 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 0ad4c684-4514-4550-b708-6339f933766c in datapath 7ae61a95-12e5-48ee-93a0-85e12f8652eb unbound from our chassis
Sep 30 21:22:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:26.540 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7ae61a95-12e5-48ee-93a0-85e12f8652eb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:22:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:26.541 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[7bde05e9-f7b6-4620-adff-2c6585e7c033]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:26.542 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 0ad4c684-4514-4550-b708-6339f933766c in datapath 7ae61a95-12e5-48ee-93a0-85e12f8652eb unbound from our chassis
Sep 30 21:22:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:26.543 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7ae61a95-12e5-48ee-93a0-85e12f8652eb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:22:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:26.544 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[cc9dd0d5-6b47-41e4-8a90-af06ea8ed663]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:22:26 compute-0 nova_compute[192810]: 2025-09-30 21:22:26.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:26 compute-0 nova_compute[192810]: 2025-09-30 21:22:26.748 2 DEBUG nova.compute.manager [req-ce174868-852d-4cc5-bb75-056c469cc4c5 req-e93aa76f-0b37-4163-971e-e87a369b12ca dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Received event network-vif-unplugged-0ad4c684-4514-4550-b708-6339f933766c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:22:26 compute-0 nova_compute[192810]: 2025-09-30 21:22:26.748 2 DEBUG oslo_concurrency.lockutils [req-ce174868-852d-4cc5-bb75-056c469cc4c5 req-e93aa76f-0b37-4163-971e-e87a369b12ca dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:26 compute-0 nova_compute[192810]: 2025-09-30 21:22:26.748 2 DEBUG oslo_concurrency.lockutils [req-ce174868-852d-4cc5-bb75-056c469cc4c5 req-e93aa76f-0b37-4163-971e-e87a369b12ca dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:26 compute-0 nova_compute[192810]: 2025-09-30 21:22:26.749 2 DEBUG oslo_concurrency.lockutils [req-ce174868-852d-4cc5-bb75-056c469cc4c5 req-e93aa76f-0b37-4163-971e-e87a369b12ca dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:26 compute-0 nova_compute[192810]: 2025-09-30 21:22:26.749 2 DEBUG nova.compute.manager [req-ce174868-852d-4cc5-bb75-056c469cc4c5 req-e93aa76f-0b37-4163-971e-e87a369b12ca dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] No waiting events found dispatching network-vif-unplugged-0ad4c684-4514-4550-b708-6339f933766c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:22:26 compute-0 nova_compute[192810]: 2025-09-30 21:22:26.749 2 DEBUG nova.compute.manager [req-ce174868-852d-4cc5-bb75-056c469cc4c5 req-e93aa76f-0b37-4163-971e-e87a369b12ca dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Received event network-vif-unplugged-0ad4c684-4514-4550-b708-6339f933766c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:22:27 compute-0 nova_compute[192810]: 2025-09-30 21:22:27.990 2 DEBUG nova.network.neutron [-] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:22:28 compute-0 nova_compute[192810]: 2025-09-30 21:22:28.020 2 INFO nova.compute.manager [-] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Took 1.61 seconds to deallocate network for instance.
Sep 30 21:22:28 compute-0 nova_compute[192810]: 2025-09-30 21:22:28.080 2 DEBUG nova.compute.manager [req-2d23ef7b-b885-4e1c-89dc-fb8ddcfa75a9 req-e333e020-c3de-4b62-ae0f-0abdfbd7a526 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Received event network-vif-deleted-0ad4c684-4514-4550-b708-6339f933766c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:22:28 compute-0 nova_compute[192810]: 2025-09-30 21:22:28.110 2 DEBUG oslo_concurrency.lockutils [None req-b0d84102-afcc-443c-a700-f315313a1215 fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:28 compute-0 nova_compute[192810]: 2025-09-30 21:22:28.110 2 DEBUG oslo_concurrency.lockutils [None req-b0d84102-afcc-443c-a700-f315313a1215 fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:28 compute-0 nova_compute[192810]: 2025-09-30 21:22:28.207 2 DEBUG nova.compute.provider_tree [None req-b0d84102-afcc-443c-a700-f315313a1215 fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:22:28 compute-0 nova_compute[192810]: 2025-09-30 21:22:28.234 2 DEBUG nova.scheduler.client.report [None req-b0d84102-afcc-443c-a700-f315313a1215 fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:22:28 compute-0 nova_compute[192810]: 2025-09-30 21:22:28.260 2 DEBUG oslo_concurrency.lockutils [None req-b0d84102-afcc-443c-a700-f315313a1215 fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.150s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:28 compute-0 nova_compute[192810]: 2025-09-30 21:22:28.293 2 INFO nova.scheduler.client.report [None req-b0d84102-afcc-443c-a700-f315313a1215 fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Deleted allocations for instance c79f1121-fcfb-4c07-94cf-1389e1df9e81
Sep 30 21:22:28 compute-0 nova_compute[192810]: 2025-09-30 21:22:28.408 2 DEBUG oslo_concurrency.lockutils [None req-b0d84102-afcc-443c-a700-f315313a1215 fdcdab028fdf46d7ba6634c631d3d33c 54fb4b43d65542abbb73044c6d52da8a - - default default] Lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.422s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:28 compute-0 sshd-session[225583]: Invalid user groot from 45.81.23.80 port 45030
Sep 30 21:22:28 compute-0 sshd-session[225583]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:22:28 compute-0 sshd-session[225583]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=45.81.23.80
Sep 30 21:22:28 compute-0 nova_compute[192810]: 2025-09-30 21:22:28.839 2 DEBUG oslo_concurrency.lockutils [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Acquiring lock "refresh_cache-5c749a3a-92bd-47ce-a966-33f62c7e3019" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:22:28 compute-0 nova_compute[192810]: 2025-09-30 21:22:28.840 2 DEBUG oslo_concurrency.lockutils [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Acquired lock "refresh_cache-5c749a3a-92bd-47ce-a966-33f62c7e3019" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:22:28 compute-0 nova_compute[192810]: 2025-09-30 21:22:28.840 2 DEBUG nova.network.neutron [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:22:28 compute-0 nova_compute[192810]: 2025-09-30 21:22:28.872 2 DEBUG nova.compute.manager [req-418e886a-5bca-4a70-9fea-bd4a76caeca5 req-afbfe14e-cb1b-492e-82f7-0a5ac71bfdd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Received event network-vif-plugged-0ad4c684-4514-4550-b708-6339f933766c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:22:28 compute-0 nova_compute[192810]: 2025-09-30 21:22:28.873 2 DEBUG oslo_concurrency.lockutils [req-418e886a-5bca-4a70-9fea-bd4a76caeca5 req-afbfe14e-cb1b-492e-82f7-0a5ac71bfdd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:28 compute-0 nova_compute[192810]: 2025-09-30 21:22:28.873 2 DEBUG oslo_concurrency.lockutils [req-418e886a-5bca-4a70-9fea-bd4a76caeca5 req-afbfe14e-cb1b-492e-82f7-0a5ac71bfdd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:28 compute-0 nova_compute[192810]: 2025-09-30 21:22:28.874 2 DEBUG oslo_concurrency.lockutils [req-418e886a-5bca-4a70-9fea-bd4a76caeca5 req-afbfe14e-cb1b-492e-82f7-0a5ac71bfdd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:28 compute-0 nova_compute[192810]: 2025-09-30 21:22:28.874 2 DEBUG nova.compute.manager [req-418e886a-5bca-4a70-9fea-bd4a76caeca5 req-afbfe14e-cb1b-492e-82f7-0a5ac71bfdd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] No waiting events found dispatching network-vif-plugged-0ad4c684-4514-4550-b708-6339f933766c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:22:28 compute-0 nova_compute[192810]: 2025-09-30 21:22:28.874 2 WARNING nova.compute.manager [req-418e886a-5bca-4a70-9fea-bd4a76caeca5 req-afbfe14e-cb1b-492e-82f7-0a5ac71bfdd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Received unexpected event network-vif-plugged-0ad4c684-4514-4550-b708-6339f933766c for instance with vm_state deleted and task_state None.
Sep 30 21:22:28 compute-0 nova_compute[192810]: 2025-09-30 21:22:28.874 2 DEBUG nova.compute.manager [req-418e886a-5bca-4a70-9fea-bd4a76caeca5 req-afbfe14e-cb1b-492e-82f7-0a5ac71bfdd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Received event network-vif-plugged-0ad4c684-4514-4550-b708-6339f933766c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:22:28 compute-0 nova_compute[192810]: 2025-09-30 21:22:28.875 2 DEBUG oslo_concurrency.lockutils [req-418e886a-5bca-4a70-9fea-bd4a76caeca5 req-afbfe14e-cb1b-492e-82f7-0a5ac71bfdd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:28 compute-0 nova_compute[192810]: 2025-09-30 21:22:28.875 2 DEBUG oslo_concurrency.lockutils [req-418e886a-5bca-4a70-9fea-bd4a76caeca5 req-afbfe14e-cb1b-492e-82f7-0a5ac71bfdd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:28 compute-0 nova_compute[192810]: 2025-09-30 21:22:28.875 2 DEBUG oslo_concurrency.lockutils [req-418e886a-5bca-4a70-9fea-bd4a76caeca5 req-afbfe14e-cb1b-492e-82f7-0a5ac71bfdd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:28 compute-0 nova_compute[192810]: 2025-09-30 21:22:28.875 2 DEBUG nova.compute.manager [req-418e886a-5bca-4a70-9fea-bd4a76caeca5 req-afbfe14e-cb1b-492e-82f7-0a5ac71bfdd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] No waiting events found dispatching network-vif-plugged-0ad4c684-4514-4550-b708-6339f933766c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:22:28 compute-0 nova_compute[192810]: 2025-09-30 21:22:28.876 2 WARNING nova.compute.manager [req-418e886a-5bca-4a70-9fea-bd4a76caeca5 req-afbfe14e-cb1b-492e-82f7-0a5ac71bfdd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Received unexpected event network-vif-plugged-0ad4c684-4514-4550-b708-6339f933766c for instance with vm_state deleted and task_state None.
Sep 30 21:22:28 compute-0 nova_compute[192810]: 2025-09-30 21:22:28.876 2 DEBUG nova.compute.manager [req-418e886a-5bca-4a70-9fea-bd4a76caeca5 req-afbfe14e-cb1b-492e-82f7-0a5ac71bfdd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Received event network-vif-plugged-0ad4c684-4514-4550-b708-6339f933766c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:22:28 compute-0 nova_compute[192810]: 2025-09-30 21:22:28.876 2 DEBUG oslo_concurrency.lockutils [req-418e886a-5bca-4a70-9fea-bd4a76caeca5 req-afbfe14e-cb1b-492e-82f7-0a5ac71bfdd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:28 compute-0 nova_compute[192810]: 2025-09-30 21:22:28.876 2 DEBUG oslo_concurrency.lockutils [req-418e886a-5bca-4a70-9fea-bd4a76caeca5 req-afbfe14e-cb1b-492e-82f7-0a5ac71bfdd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:28 compute-0 nova_compute[192810]: 2025-09-30 21:22:28.877 2 DEBUG oslo_concurrency.lockutils [req-418e886a-5bca-4a70-9fea-bd4a76caeca5 req-afbfe14e-cb1b-492e-82f7-0a5ac71bfdd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:28 compute-0 nova_compute[192810]: 2025-09-30 21:22:28.877 2 DEBUG nova.compute.manager [req-418e886a-5bca-4a70-9fea-bd4a76caeca5 req-afbfe14e-cb1b-492e-82f7-0a5ac71bfdd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] No waiting events found dispatching network-vif-plugged-0ad4c684-4514-4550-b708-6339f933766c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:22:28 compute-0 nova_compute[192810]: 2025-09-30 21:22:28.877 2 WARNING nova.compute.manager [req-418e886a-5bca-4a70-9fea-bd4a76caeca5 req-afbfe14e-cb1b-492e-82f7-0a5ac71bfdd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Received unexpected event network-vif-plugged-0ad4c684-4514-4550-b708-6339f933766c for instance with vm_state deleted and task_state None.
Sep 30 21:22:28 compute-0 nova_compute[192810]: 2025-09-30 21:22:28.877 2 DEBUG nova.compute.manager [req-418e886a-5bca-4a70-9fea-bd4a76caeca5 req-afbfe14e-cb1b-492e-82f7-0a5ac71bfdd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Received event network-vif-unplugged-0ad4c684-4514-4550-b708-6339f933766c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:22:28 compute-0 nova_compute[192810]: 2025-09-30 21:22:28.878 2 DEBUG oslo_concurrency.lockutils [req-418e886a-5bca-4a70-9fea-bd4a76caeca5 req-afbfe14e-cb1b-492e-82f7-0a5ac71bfdd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:28 compute-0 nova_compute[192810]: 2025-09-30 21:22:28.878 2 DEBUG oslo_concurrency.lockutils [req-418e886a-5bca-4a70-9fea-bd4a76caeca5 req-afbfe14e-cb1b-492e-82f7-0a5ac71bfdd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:28 compute-0 nova_compute[192810]: 2025-09-30 21:22:28.878 2 DEBUG oslo_concurrency.lockutils [req-418e886a-5bca-4a70-9fea-bd4a76caeca5 req-afbfe14e-cb1b-492e-82f7-0a5ac71bfdd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:28 compute-0 nova_compute[192810]: 2025-09-30 21:22:28.878 2 DEBUG nova.compute.manager [req-418e886a-5bca-4a70-9fea-bd4a76caeca5 req-afbfe14e-cb1b-492e-82f7-0a5ac71bfdd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] No waiting events found dispatching network-vif-unplugged-0ad4c684-4514-4550-b708-6339f933766c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:22:28 compute-0 nova_compute[192810]: 2025-09-30 21:22:28.879 2 WARNING nova.compute.manager [req-418e886a-5bca-4a70-9fea-bd4a76caeca5 req-afbfe14e-cb1b-492e-82f7-0a5ac71bfdd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Received unexpected event network-vif-unplugged-0ad4c684-4514-4550-b708-6339f933766c for instance with vm_state deleted and task_state None.
Sep 30 21:22:28 compute-0 nova_compute[192810]: 2025-09-30 21:22:28.879 2 DEBUG nova.compute.manager [req-418e886a-5bca-4a70-9fea-bd4a76caeca5 req-afbfe14e-cb1b-492e-82f7-0a5ac71bfdd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Received event network-vif-plugged-0ad4c684-4514-4550-b708-6339f933766c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:22:28 compute-0 nova_compute[192810]: 2025-09-30 21:22:28.879 2 DEBUG oslo_concurrency.lockutils [req-418e886a-5bca-4a70-9fea-bd4a76caeca5 req-afbfe14e-cb1b-492e-82f7-0a5ac71bfdd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:28 compute-0 nova_compute[192810]: 2025-09-30 21:22:28.879 2 DEBUG oslo_concurrency.lockutils [req-418e886a-5bca-4a70-9fea-bd4a76caeca5 req-afbfe14e-cb1b-492e-82f7-0a5ac71bfdd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:28 compute-0 nova_compute[192810]: 2025-09-30 21:22:28.880 2 DEBUG oslo_concurrency.lockutils [req-418e886a-5bca-4a70-9fea-bd4a76caeca5 req-afbfe14e-cb1b-492e-82f7-0a5ac71bfdd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c79f1121-fcfb-4c07-94cf-1389e1df9e81-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:28 compute-0 nova_compute[192810]: 2025-09-30 21:22:28.880 2 DEBUG nova.compute.manager [req-418e886a-5bca-4a70-9fea-bd4a76caeca5 req-afbfe14e-cb1b-492e-82f7-0a5ac71bfdd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] No waiting events found dispatching network-vif-plugged-0ad4c684-4514-4550-b708-6339f933766c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:22:28 compute-0 nova_compute[192810]: 2025-09-30 21:22:28.880 2 WARNING nova.compute.manager [req-418e886a-5bca-4a70-9fea-bd4a76caeca5 req-afbfe14e-cb1b-492e-82f7-0a5ac71bfdd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Received unexpected event network-vif-plugged-0ad4c684-4514-4550-b708-6339f933766c for instance with vm_state deleted and task_state None.
Sep 30 21:22:29 compute-0 nova_compute[192810]: 2025-09-30 21:22:29.256 2 DEBUG nova.network.neutron [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:22:30 compute-0 nova_compute[192810]: 2025-09-30 21:22:30.275 2 DEBUG nova.network.neutron [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:22:30 compute-0 nova_compute[192810]: 2025-09-30 21:22:30.307 2 DEBUG oslo_concurrency.lockutils [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Releasing lock "refresh_cache-5c749a3a-92bd-47ce-a966-33f62c7e3019" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:22:30 compute-0 sshd-session[225583]: Failed password for invalid user groot from 45.81.23.80 port 45030 ssh2
Sep 30 21:22:30 compute-0 podman[225586]: 2025-09-30 21:22:30.387334579 +0000 UTC m=+0.105956359 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:22:30 compute-0 podman[225585]: 2025-09-30 21:22:30.440831031 +0000 UTC m=+0.158141588 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Sep 30 21:22:30 compute-0 nova_compute[192810]: 2025-09-30 21:22:30.482 2 DEBUG nova.virt.libvirt.driver [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Sep 30 21:22:30 compute-0 nova_compute[192810]: 2025-09-30 21:22:30.483 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Creating file /var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/9c89cfa250bc409fa2dc0fbaefc953d3.tmp on remote host 192.168.122.101 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Sep 30 21:22:30 compute-0 nova_compute[192810]: 2025-09-30 21:22:30.483 2 DEBUG oslo_concurrency.processutils [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/9c89cfa250bc409fa2dc0fbaefc953d3.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:22:30 compute-0 nova_compute[192810]: 2025-09-30 21:22:30.977 2 DEBUG oslo_concurrency.processutils [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/9c89cfa250bc409fa2dc0fbaefc953d3.tmp" returned: 1 in 0.493s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:22:30 compute-0 nova_compute[192810]: 2025-09-30 21:22:30.979 2 DEBUG oslo_concurrency.processutils [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] 'ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/9c89cfa250bc409fa2dc0fbaefc953d3.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Sep 30 21:22:30 compute-0 nova_compute[192810]: 2025-09-30 21:22:30.980 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Creating directory /var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019 on remote host 192.168.122.101 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Sep 30 21:22:30 compute-0 nova_compute[192810]: 2025-09-30 21:22:30.981 2 DEBUG oslo_concurrency.processutils [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:22:31 compute-0 nova_compute[192810]: 2025-09-30 21:22:31.210 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:31 compute-0 nova_compute[192810]: 2025-09-30 21:22:31.231 2 DEBUG oslo_concurrency.processutils [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019" returned: 0 in 0.250s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:22:31 compute-0 nova_compute[192810]: 2025-09-30 21:22:31.237 2 DEBUG nova.virt.libvirt.driver [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Sep 30 21:22:31 compute-0 nova_compute[192810]: 2025-09-30 21:22:31.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:31 compute-0 nova_compute[192810]: 2025-09-30 21:22:31.391 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:31 compute-0 nova_compute[192810]: 2025-09-30 21:22:31.514 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267336.5135152, 370a976e-b51f-4187-bf76-bf6cbcae956b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:22:31 compute-0 nova_compute[192810]: 2025-09-30 21:22:31.515 2 INFO nova.compute.manager [-] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] VM Stopped (Lifecycle Event)
Sep 30 21:22:31 compute-0 nova_compute[192810]: 2025-09-30 21:22:31.550 2 DEBUG nova.compute.manager [None req-19503da8-f3ce-421d-89f3-bf82242f6665 - - - - - -] [instance: 370a976e-b51f-4187-bf76-bf6cbcae956b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:22:31 compute-0 nova_compute[192810]: 2025-09-30 21:22:31.635 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:31 compute-0 sshd-session[225583]: Received disconnect from 45.81.23.80 port 45030:11: Bye Bye [preauth]
Sep 30 21:22:31 compute-0 sshd-session[225583]: Disconnected from invalid user groot 45.81.23.80 port 45030 [preauth]
Sep 30 21:22:34 compute-0 nova_compute[192810]: 2025-09-30 21:22:34.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:22:36 compute-0 podman[225646]: 2025-09-30 21:22:36.335777561 +0000 UTC m=+0.070169498 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_metadata_agent, 
org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 21:22:36 compute-0 nova_compute[192810]: 2025-09-30 21:22:36.353 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:36 compute-0 nova_compute[192810]: 2025-09-30 21:22:36.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:36 compute-0 nova_compute[192810]: 2025-09-30 21:22:36.658 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267341.6578639, 3d12b579-88e1-415f-b0a2-0e8b17628076 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:22:36 compute-0 nova_compute[192810]: 2025-09-30 21:22:36.659 2 INFO nova.compute.manager [-] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] VM Stopped (Lifecycle Event)
Sep 30 21:22:36 compute-0 nova_compute[192810]: 2025-09-30 21:22:36.682 2 DEBUG nova.compute.manager [None req-d86a6fe2-dd6c-4cd9-8cae-be0daa60a82a - - - - - -] [instance: 3d12b579-88e1-415f-b0a2-0e8b17628076] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:22:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:38.727 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:38.727 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:38.727 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:38 compute-0 nova_compute[192810]: 2025-09-30 21:22:38.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:22:38 compute-0 nova_compute[192810]: 2025-09-30 21:22:38.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:22:38 compute-0 nova_compute[192810]: 2025-09-30 21:22:38.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:22:38 compute-0 nova_compute[192810]: 2025-09-30 21:22:38.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:22:41 compute-0 nova_compute[192810]: 2025-09-30 21:22:41.284 2 DEBUG nova.virt.libvirt.driver [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Sep 30 21:22:41 compute-0 nova_compute[192810]: 2025-09-30 21:22:41.327 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267346.3266668, c79f1121-fcfb-4c07-94cf-1389e1df9e81 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:22:41 compute-0 nova_compute[192810]: 2025-09-30 21:22:41.328 2 INFO nova.compute.manager [-] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] VM Stopped (Lifecycle Event)
Sep 30 21:22:41 compute-0 nova_compute[192810]: 2025-09-30 21:22:41.350 2 DEBUG nova.compute.manager [None req-e606a2f3-10f5-4027-b1ea-7315beb8bf62 - - - - - -] [instance: c79f1121-fcfb-4c07-94cf-1389e1df9e81] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:22:41 compute-0 nova_compute[192810]: 2025-09-30 21:22:41.357 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:41 compute-0 nova_compute[192810]: 2025-09-30 21:22:41.638 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:41 compute-0 nova_compute[192810]: 2025-09-30 21:22:41.783 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:22:41 compute-0 nova_compute[192810]: 2025-09-30 21:22:41.786 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:22:41 compute-0 nova_compute[192810]: 2025-09-30 21:22:41.829 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:41 compute-0 nova_compute[192810]: 2025-09-30 21:22:41.830 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:41 compute-0 nova_compute[192810]: 2025-09-30 21:22:41.830 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:41 compute-0 nova_compute[192810]: 2025-09-30 21:22:41.830 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:22:41 compute-0 nova_compute[192810]: 2025-09-30 21:22:41.884 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:22:41 compute-0 nova_compute[192810]: 2025-09-30 21:22:41.938 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:22:41 compute-0 nova_compute[192810]: 2025-09-30 21:22:41.938 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:22:42 compute-0 nova_compute[192810]: 2025-09-30 21:22:42.008 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:22:42 compute-0 nova_compute[192810]: 2025-09-30 21:22:42.158 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:22:42 compute-0 nova_compute[192810]: 2025-09-30 21:22:42.159 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5565MB free_disk=73.3996810913086GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:22:42 compute-0 nova_compute[192810]: 2025-09-30 21:22:42.159 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:42 compute-0 nova_compute[192810]: 2025-09-30 21:22:42.160 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:42 compute-0 nova_compute[192810]: 2025-09-30 21:22:42.223 2 INFO nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Updating resource usage from migration 5013b103-02ec-49ec-a8fd-5104ae6c9ef4
Sep 30 21:22:42 compute-0 nova_compute[192810]: 2025-09-30 21:22:42.252 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Migration 5013b103-02ec-49ec-a8fd-5104ae6c9ef4 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Sep 30 21:22:42 compute-0 nova_compute[192810]: 2025-09-30 21:22:42.252 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:22:42 compute-0 nova_compute[192810]: 2025-09-30 21:22:42.253 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:22:42 compute-0 nova_compute[192810]: 2025-09-30 21:22:42.295 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:22:42 compute-0 nova_compute[192810]: 2025-09-30 21:22:42.313 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:22:42 compute-0 nova_compute[192810]: 2025-09-30 21:22:42.339 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:22:42 compute-0 nova_compute[192810]: 2025-09-30 21:22:42.339 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.180s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:43 compute-0 podman[225673]: 2025-09-30 21:22:43.321308165 +0000 UTC m=+0.060304492 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9-minimal, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_id=edpm, io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Sep 30 21:22:43 compute-0 nova_compute[192810]: 2025-09-30 21:22:43.340 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:22:43 compute-0 nova_compute[192810]: 2025-09-30 21:22:43.340 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:22:43 compute-0 podman[225672]: 2025-09-30 21:22:43.342217416 +0000 UTC m=+0.080341021 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 21:22:43 compute-0 nova_compute[192810]: 2025-09-30 21:22:43.410 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 21:22:43 compute-0 nova_compute[192810]: 2025-09-30 21:22:43.410 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:22:43 compute-0 nova_compute[192810]: 2025-09-30 21:22:43.410 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:22:43 compute-0 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000025.scope: Deactivated successfully.
Sep 30 21:22:43 compute-0 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000025.scope: Consumed 12.673s CPU time.
Sep 30 21:22:43 compute-0 systemd-machined[152794]: Machine qemu-17-instance-00000025 terminated.
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.906 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '5c749a3a-92bd-47ce-a966-33f62c7e3019', 'name': 'tempest-MigrationsAdminTest-server-565661956', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000025', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': '7c15359849554c2382315de9f52125af', 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'hostId': '3db9e547ab0bd20efc83f4476016fcc9a3e113db01aa0c42deec3089', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.907 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.908 12 DEBUG ceilometer.compute.pollsters [-] Instance 5c749a3a-92bd-47ce-a966-33f62c7e3019 was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance <name=instance-00000025, id=5c749a3a-92bd-47ce-a966-33f62c7e3019>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.908 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.908 12 DEBUG ceilometer.compute.pollsters [-] Instance 5c749a3a-92bd-47ce-a966-33f62c7e3019 was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-00000025, id=5c749a3a-92bd-47ce-a966-33f62c7e3019>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.908 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.909 12 DEBUG ceilometer.compute.pollsters [-] Instance 5c749a3a-92bd-47ce-a966-33f62c7e3019 was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance <name=instance-00000025, id=5c749a3a-92bd-47ce-a966-33f62c7e3019>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.909 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.909 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.909 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-MigrationsAdminTest-server-565661956>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-MigrationsAdminTest-server-565661956>]
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.910 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.910 12 DEBUG ceilometer.compute.pollsters [-] Instance 5c749a3a-92bd-47ce-a966-33f62c7e3019 was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-00000025, id=5c749a3a-92bd-47ce-a966-33f62c7e3019>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.910 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.911 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.911 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-MigrationsAdminTest-server-565661956>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-MigrationsAdminTest-server-565661956>]
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.911 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.911 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.911 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-MigrationsAdminTest-server-565661956>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-MigrationsAdminTest-server-565661956>]
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.911 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.912 12 DEBUG ceilometer.compute.pollsters [-] Instance 5c749a3a-92bd-47ce-a966-33f62c7e3019 was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-00000025, id=5c749a3a-92bd-47ce-a966-33f62c7e3019>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.912 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.912 12 DEBUG ceilometer.compute.pollsters [-] Instance 5c749a3a-92bd-47ce-a966-33f62c7e3019 was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-00000025, id=5c749a3a-92bd-47ce-a966-33f62c7e3019>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.912 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.913 12 DEBUG ceilometer.compute.pollsters [-] Instance 5c749a3a-92bd-47ce-a966-33f62c7e3019 was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-00000025, id=5c749a3a-92bd-47ce-a966-33f62c7e3019>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.913 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.914 12 DEBUG ceilometer.compute.pollsters [-] Instance 5c749a3a-92bd-47ce-a966-33f62c7e3019 was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-00000025, id=5c749a3a-92bd-47ce-a966-33f62c7e3019>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.914 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.914 12 DEBUG ceilometer.compute.pollsters [-] Instance 5c749a3a-92bd-47ce-a966-33f62c7e3019 was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-00000025, id=5c749a3a-92bd-47ce-a966-33f62c7e3019>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.914 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.915 12 DEBUG ceilometer.compute.pollsters [-] Instance 5c749a3a-92bd-47ce-a966-33f62c7e3019 was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-00000025, id=5c749a3a-92bd-47ce-a966-33f62c7e3019>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.915 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.915 12 DEBUG ceilometer.compute.pollsters [-] Instance 5c749a3a-92bd-47ce-a966-33f62c7e3019 was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-00000025, id=5c749a3a-92bd-47ce-a966-33f62c7e3019>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.915 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.916 12 DEBUG ceilometer.compute.pollsters [-] Instance 5c749a3a-92bd-47ce-a966-33f62c7e3019 was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-00000025, id=5c749a3a-92bd-47ce-a966-33f62c7e3019>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.916 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.916 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.916 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-MigrationsAdminTest-server-565661956>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-MigrationsAdminTest-server-565661956>]
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.916 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.917 12 DEBUG ceilometer.compute.pollsters [-] Instance 5c749a3a-92bd-47ce-a966-33f62c7e3019 was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance <name=instance-00000025, id=5c749a3a-92bd-47ce-a966-33f62c7e3019>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.917 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.917 12 DEBUG ceilometer.compute.pollsters [-] Instance 5c749a3a-92bd-47ce-a966-33f62c7e3019 was shut off while getting sample of network.incoming.packets.drop: Failed to inspect data of instance <name=instance-00000025, id=5c749a3a-92bd-47ce-a966-33f62c7e3019>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.918 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.918 12 DEBUG ceilometer.compute.pollsters [-] Instance 5c749a3a-92bd-47ce-a966-33f62c7e3019 was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance <name=instance-00000025, id=5c749a3a-92bd-47ce-a966-33f62c7e3019>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.918 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.919 12 DEBUG ceilometer.compute.pollsters [-] Instance 5c749a3a-92bd-47ce-a966-33f62c7e3019 was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-00000025, id=5c749a3a-92bd-47ce-a966-33f62c7e3019>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.919 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.919 12 DEBUG ceilometer.compute.pollsters [-] Instance 5c749a3a-92bd-47ce-a966-33f62c7e3019 was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-00000025, id=5c749a3a-92bd-47ce-a966-33f62c7e3019>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.919 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.920 12 DEBUG ceilometer.compute.pollsters [-] Instance 5c749a3a-92bd-47ce-a966-33f62c7e3019 was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance <name=instance-00000025, id=5c749a3a-92bd-47ce-a966-33f62c7e3019>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.920 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.920 12 DEBUG ceilometer.compute.pollsters [-] Instance 5c749a3a-92bd-47ce-a966-33f62c7e3019 was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-00000025, id=5c749a3a-92bd-47ce-a966-33f62c7e3019>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.921 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.921 12 DEBUG ceilometer.compute.pollsters [-] Instance 5c749a3a-92bd-47ce-a966-33f62c7e3019 was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance <name=instance-00000025, id=5c749a3a-92bd-47ce-a966-33f62c7e3019>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.921 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Sep 30 21:22:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:22:43.922 12 DEBUG ceilometer.compute.pollsters [-] Instance 5c749a3a-92bd-47ce-a966-33f62c7e3019 was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance <name=instance-00000025, id=5c749a3a-92bd-47ce-a966-33f62c7e3019>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:22:44 compute-0 nova_compute[192810]: 2025-09-30 21:22:44.299 2 INFO nova.virt.libvirt.driver [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Instance shutdown successfully after 13 seconds.
Sep 30 21:22:44 compute-0 nova_compute[192810]: 2025-09-30 21:22:44.305 2 INFO nova.virt.libvirt.driver [-] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Instance destroyed successfully.
Sep 30 21:22:44 compute-0 nova_compute[192810]: 2025-09-30 21:22:44.309 2 DEBUG oslo_concurrency.processutils [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:22:44 compute-0 nova_compute[192810]: 2025-09-30 21:22:44.386 2 DEBUG oslo_concurrency.processutils [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:22:44 compute-0 nova_compute[192810]: 2025-09-30 21:22:44.388 2 DEBUG oslo_concurrency.processutils [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:22:44 compute-0 nova_compute[192810]: 2025-09-30 21:22:44.478 2 DEBUG oslo_concurrency.processutils [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:22:44 compute-0 nova_compute[192810]: 2025-09-30 21:22:44.480 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Copying file /var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019_resize/disk to 192.168.122.101:/var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Sep 30 21:22:44 compute-0 nova_compute[192810]: 2025-09-30 21:22:44.481 2 DEBUG oslo_concurrency.processutils [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019_resize/disk 192.168.122.101:/var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:22:45 compute-0 nova_compute[192810]: 2025-09-30 21:22:45.259 2 DEBUG oslo_concurrency.processutils [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] CMD "scp -r /var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019_resize/disk 192.168.122.101:/var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/disk" returned: 0 in 0.779s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:22:45 compute-0 nova_compute[192810]: 2025-09-30 21:22:45.261 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Copying file /var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019_resize/disk.config to 192.168.122.101:/var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Sep 30 21:22:45 compute-0 nova_compute[192810]: 2025-09-30 21:22:45.261 2 DEBUG oslo_concurrency.processutils [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019_resize/disk.config 192.168.122.101:/var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:22:45 compute-0 nova_compute[192810]: 2025-09-30 21:22:45.480 2 DEBUG oslo_concurrency.processutils [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] CMD "scp -C -r /var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019_resize/disk.config 192.168.122.101:/var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/disk.config" returned: 0 in 0.219s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:22:45 compute-0 nova_compute[192810]: 2025-09-30 21:22:45.481 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Copying file /var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019_resize/disk.info to 192.168.122.101:/var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Sep 30 21:22:45 compute-0 nova_compute[192810]: 2025-09-30 21:22:45.481 2 DEBUG oslo_concurrency.processutils [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019_resize/disk.info 192.168.122.101:/var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:22:45 compute-0 nova_compute[192810]: 2025-09-30 21:22:45.685 2 DEBUG oslo_concurrency.processutils [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] CMD "scp -C -r /var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019_resize/disk.info 192.168.122.101:/var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/disk.info" returned: 0 in 0.204s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:22:45 compute-0 nova_compute[192810]: 2025-09-30 21:22:45.820 2 DEBUG oslo_concurrency.lockutils [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Acquiring lock "5c749a3a-92bd-47ce-a966-33f62c7e3019-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:45 compute-0 nova_compute[192810]: 2025-09-30 21:22:45.821 2 DEBUG oslo_concurrency.lockutils [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Lock "5c749a3a-92bd-47ce-a966-33f62c7e3019-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:45 compute-0 nova_compute[192810]: 2025-09-30 21:22:45.821 2 DEBUG oslo_concurrency.lockutils [None req-cbfb2671-692a-499c-8f85-bf49dcd83b39 476479e1b5f54a929b25f8cdbf203fd4 7f91319abc714add97d8e6464a47dd50 - - default default] Lock "5c749a3a-92bd-47ce-a966-33f62c7e3019-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:46 compute-0 nova_compute[192810]: 2025-09-30 21:22:46.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:46 compute-0 nova_compute[192810]: 2025-09-30 21:22:46.641 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:50 compute-0 podman[225737]: 2025-09-30 21:22:50.33652814 +0000 UTC m=+0.067107982 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid)
Sep 30 21:22:51 compute-0 nova_compute[192810]: 2025-09-30 21:22:51.418 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:51 compute-0 nova_compute[192810]: 2025-09-30 21:22:51.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:52 compute-0 nova_compute[192810]: 2025-09-30 21:22:52.192 2 INFO nova.compute.manager [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Swapping old allocation on dict_keys(['fe423b93-de5a-41f7-97d1-9622ea46af54']) held by migration 5013b103-02ec-49ec-a8fd-5104ae6c9ef4 for instance
Sep 30 21:22:52 compute-0 nova_compute[192810]: 2025-09-30 21:22:52.219 2 DEBUG nova.scheduler.client.report [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Overwriting current allocation {'allocations': {'e551d5b4-e9f6-409e-b2a1-508a20c11333': {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}, 'generation': 36}}, 'project_id': '7c15359849554c2382315de9f52125af', 'user_id': 'f11be289e05b49a7809cb1a3523abc0c', 'consumer_generation': 1} on consumer 5c749a3a-92bd-47ce-a966-33f62c7e3019 move_allocations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:2018
Sep 30 21:22:52 compute-0 nova_compute[192810]: 2025-09-30 21:22:52.459 2 DEBUG oslo_concurrency.lockutils [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "refresh_cache-5c749a3a-92bd-47ce-a966-33f62c7e3019" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:22:52 compute-0 nova_compute[192810]: 2025-09-30 21:22:52.459 2 DEBUG oslo_concurrency.lockutils [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquired lock "refresh_cache-5c749a3a-92bd-47ce-a966-33f62c7e3019" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:22:52 compute-0 nova_compute[192810]: 2025-09-30 21:22:52.459 2 DEBUG nova.network.neutron [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:22:52 compute-0 nova_compute[192810]: 2025-09-30 21:22:52.751 2 DEBUG nova.network.neutron [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:22:53 compute-0 nova_compute[192810]: 2025-09-30 21:22:53.331 2 DEBUG nova.network.neutron [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:22:53 compute-0 nova_compute[192810]: 2025-09-30 21:22:53.359 2 DEBUG oslo_concurrency.lockutils [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Releasing lock "refresh_cache-5c749a3a-92bd-47ce-a966-33f62c7e3019" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:22:53 compute-0 nova_compute[192810]: 2025-09-30 21:22:53.361 2 DEBUG nova.virt.libvirt.driver [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Starting finish_revert_migration finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11843
Sep 30 21:22:53 compute-0 nova_compute[192810]: 2025-09-30 21:22:53.372 2 DEBUG nova.virt.libvirt.driver [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:22:53 compute-0 nova_compute[192810]: 2025-09-30 21:22:53.377 2 WARNING nova.virt.libvirt.driver [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:22:53 compute-0 nova_compute[192810]: 2025-09-30 21:22:53.383 2 DEBUG nova.virt.libvirt.host [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:22:53 compute-0 nova_compute[192810]: 2025-09-30 21:22:53.384 2 DEBUG nova.virt.libvirt.host [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:22:53 compute-0 nova_compute[192810]: 2025-09-30 21:22:53.388 2 DEBUG nova.virt.libvirt.host [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:22:53 compute-0 nova_compute[192810]: 2025-09-30 21:22:53.389 2 DEBUG nova.virt.libvirt.host [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:22:53 compute-0 nova_compute[192810]: 2025-09-30 21:22:53.391 2 DEBUG nova.virt.libvirt.driver [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:22:53 compute-0 nova_compute[192810]: 2025-09-30 21:22:53.392 2 DEBUG nova.virt.hardware [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:22:53 compute-0 nova_compute[192810]: 2025-09-30 21:22:53.392 2 DEBUG nova.virt.hardware [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:22:53 compute-0 nova_compute[192810]: 2025-09-30 21:22:53.393 2 DEBUG nova.virt.hardware [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:22:53 compute-0 nova_compute[192810]: 2025-09-30 21:22:53.393 2 DEBUG nova.virt.hardware [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:22:53 compute-0 nova_compute[192810]: 2025-09-30 21:22:53.394 2 DEBUG nova.virt.hardware [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:22:53 compute-0 nova_compute[192810]: 2025-09-30 21:22:53.394 2 DEBUG nova.virt.hardware [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:22:53 compute-0 nova_compute[192810]: 2025-09-30 21:22:53.395 2 DEBUG nova.virt.hardware [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:22:53 compute-0 nova_compute[192810]: 2025-09-30 21:22:53.395 2 DEBUG nova.virt.hardware [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:22:53 compute-0 nova_compute[192810]: 2025-09-30 21:22:53.396 2 DEBUG nova.virt.hardware [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:22:53 compute-0 nova_compute[192810]: 2025-09-30 21:22:53.396 2 DEBUG nova.virt.hardware [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:22:53 compute-0 nova_compute[192810]: 2025-09-30 21:22:53.397 2 DEBUG nova.virt.hardware [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:22:53 compute-0 nova_compute[192810]: 2025-09-30 21:22:53.397 2 DEBUG nova.objects.instance [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lazy-loading 'vcpu_model' on Instance uuid 5c749a3a-92bd-47ce-a966-33f62c7e3019 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:22:53 compute-0 nova_compute[192810]: 2025-09-30 21:22:53.421 2 DEBUG oslo_concurrency.processutils [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:22:53 compute-0 nova_compute[192810]: 2025-09-30 21:22:53.511 2 DEBUG oslo_concurrency.processutils [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/disk.config --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:22:53 compute-0 nova_compute[192810]: 2025-09-30 21:22:53.514 2 DEBUG oslo_concurrency.lockutils [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "/var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:53 compute-0 nova_compute[192810]: 2025-09-30 21:22:53.514 2 DEBUG oslo_concurrency.lockutils [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "/var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:53 compute-0 nova_compute[192810]: 2025-09-30 21:22:53.516 2 DEBUG oslo_concurrency.lockutils [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "/var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:53 compute-0 nova_compute[192810]: 2025-09-30 21:22:53.519 2 DEBUG nova.virt.libvirt.driver [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:22:53 compute-0 nova_compute[192810]:   <uuid>5c749a3a-92bd-47ce-a966-33f62c7e3019</uuid>
Sep 30 21:22:53 compute-0 nova_compute[192810]:   <name>instance-00000025</name>
Sep 30 21:22:53 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:22:53 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:22:53 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:22:53 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:22:53 compute-0 nova_compute[192810]:       <nova:name>tempest-MigrationsAdminTest-server-565661956</nova:name>
Sep 30 21:22:53 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:22:53</nova:creationTime>
Sep 30 21:22:53 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:22:53 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:22:53 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:22:53 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:22:53 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:22:53 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:22:53 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:22:53 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:22:53 compute-0 nova_compute[192810]:         <nova:user uuid="f11be289e05b49a7809cb1a3523abc0c">tempest-MigrationsAdminTest-1333693346-project-member</nova:user>
Sep 30 21:22:53 compute-0 nova_compute[192810]:         <nova:project uuid="7c15359849554c2382315de9f52125af">tempest-MigrationsAdminTest-1333693346</nova:project>
Sep 30 21:22:53 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:22:53 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:22:53 compute-0 nova_compute[192810]:       <nova:ports/>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:22:53 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:22:53 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:22:53 compute-0 nova_compute[192810]:     <system>
Sep 30 21:22:53 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:22:53 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:22:53 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:22:53 compute-0 nova_compute[192810]:       <entry name="serial">5c749a3a-92bd-47ce-a966-33f62c7e3019</entry>
Sep 30 21:22:53 compute-0 nova_compute[192810]:       <entry name="uuid">5c749a3a-92bd-47ce-a966-33f62c7e3019</entry>
Sep 30 21:22:53 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     </system>
Sep 30 21:22:53 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:22:53 compute-0 nova_compute[192810]:   <os>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:22:53 compute-0 nova_compute[192810]:   </os>
Sep 30 21:22:53 compute-0 nova_compute[192810]:   <features>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:22:53 compute-0 nova_compute[192810]:   </features>
Sep 30 21:22:53 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:22:53 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:22:53 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:22:53 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:22:53 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:22:53 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:22:53 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:22:53 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:22:53 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/disk"/>
Sep 30 21:22:53 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:22:53 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:22:53 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/disk.config"/>
Sep 30 21:22:53 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:22:53 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019/console.log" append="off"/>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     <video>
Sep 30 21:22:53 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     </video>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     <input type="keyboard" bus="usb"/>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:22:53 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:22:53 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:22:53 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:22:53 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:22:53 compute-0 nova_compute[192810]: </domain>
Sep 30 21:22:53 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:22:53 compute-0 systemd-machined[152794]: New machine qemu-18-instance-00000025.
Sep 30 21:22:53 compute-0 podman[225761]: 2025-09-30 21:22:53.661218488 +0000 UTC m=+0.086130166 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_managed=true)
Sep 30 21:22:53 compute-0 podman[225762]: 2025-09-30 21:22:53.665534065 +0000 UTC m=+0.091049338 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 21:22:53 compute-0 systemd[1]: Started Virtual Machine qemu-18-instance-00000025.
Sep 30 21:22:54 compute-0 nova_compute[192810]: 2025-09-30 21:22:54.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:54 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:54.065 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:22:54 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:22:54.067 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:22:54 compute-0 nova_compute[192810]: 2025-09-30 21:22:54.343 2 DEBUG nova.virt.libvirt.host [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Removed pending event for 5c749a3a-92bd-47ce-a966-33f62c7e3019 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Sep 30 21:22:54 compute-0 nova_compute[192810]: 2025-09-30 21:22:54.343 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267374.3421888, 5c749a3a-92bd-47ce-a966-33f62c7e3019 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:22:54 compute-0 nova_compute[192810]: 2025-09-30 21:22:54.344 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] VM Resumed (Lifecycle Event)
Sep 30 21:22:54 compute-0 nova_compute[192810]: 2025-09-30 21:22:54.348 2 DEBUG nova.compute.manager [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:22:54 compute-0 nova_compute[192810]: 2025-09-30 21:22:54.352 2 INFO nova.virt.libvirt.driver [-] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Instance running successfully.
Sep 30 21:22:54 compute-0 nova_compute[192810]: 2025-09-30 21:22:54.353 2 DEBUG nova.virt.libvirt.driver [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] finish_revert_migration finished successfully. finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11887
Sep 30 21:22:54 compute-0 nova_compute[192810]: 2025-09-30 21:22:54.374 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:22:54 compute-0 nova_compute[192810]: 2025-09-30 21:22:54.377 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:22:54 compute-0 nova_compute[192810]: 2025-09-30 21:22:54.430 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] During sync_power_state the instance has a pending task (resize_reverting). Skip.
Sep 30 21:22:54 compute-0 nova_compute[192810]: 2025-09-30 21:22:54.431 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267374.3425453, 5c749a3a-92bd-47ce-a966-33f62c7e3019 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:22:54 compute-0 nova_compute[192810]: 2025-09-30 21:22:54.431 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] VM Started (Lifecycle Event)
Sep 30 21:22:54 compute-0 nova_compute[192810]: 2025-09-30 21:22:54.459 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:22:54 compute-0 nova_compute[192810]: 2025-09-30 21:22:54.463 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Synchronizing instance power state after lifecycle event "Started"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:22:54 compute-0 nova_compute[192810]: 2025-09-30 21:22:54.492 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] During sync_power_state the instance has a pending task (resize_reverting). Skip.
Sep 30 21:22:54 compute-0 nova_compute[192810]: 2025-09-30 21:22:54.495 2 INFO nova.compute.manager [None req-0814cb0f-c6cd-4c7d-a435-05e3a9d9ad12 f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Updating instance to original state: 'active'
Sep 30 21:22:56 compute-0 nova_compute[192810]: 2025-09-30 21:22:56.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:56 compute-0 nova_compute[192810]: 2025-09-30 21:22:56.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:22:56 compute-0 nova_compute[192810]: 2025-09-30 21:22:56.969 2 DEBUG oslo_concurrency.lockutils [None req-e534340d-bccd-4f07-914e-7f417f3023da f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "5c749a3a-92bd-47ce-a966-33f62c7e3019" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:56 compute-0 nova_compute[192810]: 2025-09-30 21:22:56.970 2 DEBUG oslo_concurrency.lockutils [None req-e534340d-bccd-4f07-914e-7f417f3023da f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "5c749a3a-92bd-47ce-a966-33f62c7e3019" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:56 compute-0 nova_compute[192810]: 2025-09-30 21:22:56.970 2 DEBUG oslo_concurrency.lockutils [None req-e534340d-bccd-4f07-914e-7f417f3023da f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "5c749a3a-92bd-47ce-a966-33f62c7e3019-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:56 compute-0 nova_compute[192810]: 2025-09-30 21:22:56.970 2 DEBUG oslo_concurrency.lockutils [None req-e534340d-bccd-4f07-914e-7f417f3023da f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "5c749a3a-92bd-47ce-a966-33f62c7e3019-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:56 compute-0 nova_compute[192810]: 2025-09-30 21:22:56.970 2 DEBUG oslo_concurrency.lockutils [None req-e534340d-bccd-4f07-914e-7f417f3023da f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "5c749a3a-92bd-47ce-a966-33f62c7e3019-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:56 compute-0 nova_compute[192810]: 2025-09-30 21:22:56.986 2 INFO nova.compute.manager [None req-e534340d-bccd-4f07-914e-7f417f3023da f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Terminating instance
Sep 30 21:22:57 compute-0 nova_compute[192810]: 2025-09-30 21:22:57.010 2 DEBUG oslo_concurrency.lockutils [None req-e534340d-bccd-4f07-914e-7f417f3023da f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "refresh_cache-5c749a3a-92bd-47ce-a966-33f62c7e3019" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:22:57 compute-0 nova_compute[192810]: 2025-09-30 21:22:57.011 2 DEBUG oslo_concurrency.lockutils [None req-e534340d-bccd-4f07-914e-7f417f3023da f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquired lock "refresh_cache-5c749a3a-92bd-47ce-a966-33f62c7e3019" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:22:57 compute-0 nova_compute[192810]: 2025-09-30 21:22:57.011 2 DEBUG nova.network.neutron [None req-e534340d-bccd-4f07-914e-7f417f3023da f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:22:57 compute-0 nova_compute[192810]: 2025-09-30 21:22:57.213 2 DEBUG nova.network.neutron [None req-e534340d-bccd-4f07-914e-7f417f3023da f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:22:57 compute-0 nova_compute[192810]: 2025-09-30 21:22:57.562 2 DEBUG nova.network.neutron [None req-e534340d-bccd-4f07-914e-7f417f3023da f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:22:57 compute-0 nova_compute[192810]: 2025-09-30 21:22:57.577 2 DEBUG oslo_concurrency.lockutils [None req-e534340d-bccd-4f07-914e-7f417f3023da f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Releasing lock "refresh_cache-5c749a3a-92bd-47ce-a966-33f62c7e3019" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:22:57 compute-0 nova_compute[192810]: 2025-09-30 21:22:57.577 2 DEBUG nova.compute.manager [None req-e534340d-bccd-4f07-914e-7f417f3023da f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:22:57 compute-0 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000025.scope: Deactivated successfully.
Sep 30 21:22:57 compute-0 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000025.scope: Consumed 4.017s CPU time.
Sep 30 21:22:57 compute-0 systemd-machined[152794]: Machine qemu-18-instance-00000025 terminated.
Sep 30 21:22:57 compute-0 nova_compute[192810]: 2025-09-30 21:22:57.828 2 INFO nova.virt.libvirt.driver [-] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Instance destroyed successfully.
Sep 30 21:22:57 compute-0 nova_compute[192810]: 2025-09-30 21:22:57.828 2 DEBUG nova.objects.instance [None req-e534340d-bccd-4f07-914e-7f417f3023da f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lazy-loading 'resources' on Instance uuid 5c749a3a-92bd-47ce-a966-33f62c7e3019 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:22:57 compute-0 nova_compute[192810]: 2025-09-30 21:22:57.846 2 INFO nova.virt.libvirt.driver [None req-e534340d-bccd-4f07-914e-7f417f3023da f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Deleting instance files /var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019_del
Sep 30 21:22:57 compute-0 nova_compute[192810]: 2025-09-30 21:22:57.857 2 INFO nova.virt.libvirt.driver [None req-e534340d-bccd-4f07-914e-7f417f3023da f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Deletion of /var/lib/nova/instances/5c749a3a-92bd-47ce-a966-33f62c7e3019_del complete
Sep 30 21:22:57 compute-0 nova_compute[192810]: 2025-09-30 21:22:57.931 2 INFO nova.compute.manager [None req-e534340d-bccd-4f07-914e-7f417f3023da f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Took 0.35 seconds to destroy the instance on the hypervisor.
Sep 30 21:22:57 compute-0 nova_compute[192810]: 2025-09-30 21:22:57.932 2 DEBUG oslo.service.loopingcall [None req-e534340d-bccd-4f07-914e-7f417f3023da f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:22:57 compute-0 nova_compute[192810]: 2025-09-30 21:22:57.932 2 DEBUG nova.compute.manager [-] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:22:57 compute-0 nova_compute[192810]: 2025-09-30 21:22:57.933 2 DEBUG nova.network.neutron [-] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:22:58 compute-0 nova_compute[192810]: 2025-09-30 21:22:58.120 2 DEBUG nova.network.neutron [-] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:22:58 compute-0 nova_compute[192810]: 2025-09-30 21:22:58.134 2 DEBUG nova.network.neutron [-] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:22:58 compute-0 nova_compute[192810]: 2025-09-30 21:22:58.151 2 INFO nova.compute.manager [-] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Took 0.22 seconds to deallocate network for instance.
Sep 30 21:22:58 compute-0 nova_compute[192810]: 2025-09-30 21:22:58.226 2 DEBUG oslo_concurrency.lockutils [None req-e534340d-bccd-4f07-914e-7f417f3023da f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:22:58 compute-0 nova_compute[192810]: 2025-09-30 21:22:58.226 2 DEBUG oslo_concurrency.lockutils [None req-e534340d-bccd-4f07-914e-7f417f3023da f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:22:58 compute-0 nova_compute[192810]: 2025-09-30 21:22:58.279 2 DEBUG nova.compute.provider_tree [None req-e534340d-bccd-4f07-914e-7f417f3023da f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:22:58 compute-0 nova_compute[192810]: 2025-09-30 21:22:58.307 2 DEBUG nova.scheduler.client.report [None req-e534340d-bccd-4f07-914e-7f417f3023da f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:22:58 compute-0 nova_compute[192810]: 2025-09-30 21:22:58.338 2 DEBUG oslo_concurrency.lockutils [None req-e534340d-bccd-4f07-914e-7f417f3023da f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:22:58 compute-0 nova_compute[192810]: 2025-09-30 21:22:58.392 2 INFO nova.scheduler.client.report [None req-e534340d-bccd-4f07-914e-7f417f3023da f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Deleted allocations for instance 5c749a3a-92bd-47ce-a966-33f62c7e3019
Sep 30 21:22:58 compute-0 nova_compute[192810]: 2025-09-30 21:22:58.475 2 DEBUG oslo_concurrency.lockutils [None req-e534340d-bccd-4f07-914e-7f417f3023da f11be289e05b49a7809cb1a3523abc0c 7c15359849554c2382315de9f52125af - - default default] Lock "5c749a3a-92bd-47ce-a966-33f62c7e3019" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.506s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:00 compute-0 nova_compute[192810]: 2025-09-30 21:23:00.910 2 DEBUG oslo_concurrency.lockutils [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Acquiring lock "5da25813-a9d3-49a0-80ae-59c06fef8440" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:00 compute-0 nova_compute[192810]: 2025-09-30 21:23:00.910 2 DEBUG oslo_concurrency.lockutils [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Lock "5da25813-a9d3-49a0-80ae-59c06fef8440" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:00 compute-0 nova_compute[192810]: 2025-09-30 21:23:00.931 2 DEBUG nova.compute.manager [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:23:01 compute-0 nova_compute[192810]: 2025-09-30 21:23:01.042 2 DEBUG oslo_concurrency.lockutils [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:01 compute-0 nova_compute[192810]: 2025-09-30 21:23:01.043 2 DEBUG oslo_concurrency.lockutils [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:01 compute-0 nova_compute[192810]: 2025-09-30 21:23:01.052 2 DEBUG nova.virt.hardware [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:23:01 compute-0 nova_compute[192810]: 2025-09-30 21:23:01.052 2 INFO nova.compute.claims [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:23:01 compute-0 nova_compute[192810]: 2025-09-30 21:23:01.228 2 DEBUG nova.compute.provider_tree [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:23:01 compute-0 nova_compute[192810]: 2025-09-30 21:23:01.243 2 DEBUG nova.scheduler.client.report [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:23:01 compute-0 nova_compute[192810]: 2025-09-30 21:23:01.264 2 DEBUG oslo_concurrency.lockutils [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.221s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:01 compute-0 nova_compute[192810]: 2025-09-30 21:23:01.265 2 DEBUG nova.compute.manager [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:23:01 compute-0 nova_compute[192810]: 2025-09-30 21:23:01.334 2 DEBUG nova.compute.manager [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:23:01 compute-0 nova_compute[192810]: 2025-09-30 21:23:01.334 2 DEBUG nova.network.neutron [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:23:01 compute-0 nova_compute[192810]: 2025-09-30 21:23:01.358 2 INFO nova.virt.libvirt.driver [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:23:01 compute-0 podman[225839]: 2025-09-30 21:23:01.371206559 +0000 UTC m=+0.090300999 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Sep 30 21:23:01 compute-0 nova_compute[192810]: 2025-09-30 21:23:01.378 2 DEBUG nova.compute.manager [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:23:01 compute-0 podman[225838]: 2025-09-30 21:23:01.397687958 +0000 UTC m=+0.119755892 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:23:01 compute-0 nova_compute[192810]: 2025-09-30 21:23:01.423 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:01 compute-0 nova_compute[192810]: 2025-09-30 21:23:01.492 2 DEBUG nova.compute.manager [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:23:01 compute-0 nova_compute[192810]: 2025-09-30 21:23:01.494 2 DEBUG nova.virt.libvirt.driver [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:23:01 compute-0 nova_compute[192810]: 2025-09-30 21:23:01.494 2 INFO nova.virt.libvirt.driver [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Creating image(s)
Sep 30 21:23:01 compute-0 nova_compute[192810]: 2025-09-30 21:23:01.495 2 DEBUG oslo_concurrency.lockutils [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Acquiring lock "/var/lib/nova/instances/5da25813-a9d3-49a0-80ae-59c06fef8440/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:01 compute-0 nova_compute[192810]: 2025-09-30 21:23:01.496 2 DEBUG oslo_concurrency.lockutils [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Lock "/var/lib/nova/instances/5da25813-a9d3-49a0-80ae-59c06fef8440/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:01 compute-0 nova_compute[192810]: 2025-09-30 21:23:01.496 2 DEBUG oslo_concurrency.lockutils [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Lock "/var/lib/nova/instances/5da25813-a9d3-49a0-80ae-59c06fef8440/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:01 compute-0 nova_compute[192810]: 2025-09-30 21:23:01.523 2 DEBUG oslo_concurrency.processutils [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:23:01 compute-0 nova_compute[192810]: 2025-09-30 21:23:01.596 2 DEBUG oslo_concurrency.processutils [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:23:01 compute-0 nova_compute[192810]: 2025-09-30 21:23:01.598 2 DEBUG oslo_concurrency.lockutils [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:01 compute-0 nova_compute[192810]: 2025-09-30 21:23:01.599 2 DEBUG oslo_concurrency.lockutils [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:01 compute-0 nova_compute[192810]: 2025-09-30 21:23:01.611 2 DEBUG oslo_concurrency.processutils [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:23:01 compute-0 nova_compute[192810]: 2025-09-30 21:23:01.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:01 compute-0 nova_compute[192810]: 2025-09-30 21:23:01.702 2 DEBUG oslo_concurrency.processutils [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:23:01 compute-0 nova_compute[192810]: 2025-09-30 21:23:01.704 2 DEBUG oslo_concurrency.processutils [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/5da25813-a9d3-49a0-80ae-59c06fef8440/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:23:01 compute-0 nova_compute[192810]: 2025-09-30 21:23:01.740 2 DEBUG nova.policy [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '73ad6b75271d46c4b9b117faafb3e95e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a09110a6c4d740edbafe08d6c1782a1d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:23:01 compute-0 nova_compute[192810]: 2025-09-30 21:23:01.744 2 DEBUG oslo_concurrency.processutils [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/5da25813-a9d3-49a0-80ae-59c06fef8440/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:23:01 compute-0 nova_compute[192810]: 2025-09-30 21:23:01.745 2 DEBUG oslo_concurrency.lockutils [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.146s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:01 compute-0 nova_compute[192810]: 2025-09-30 21:23:01.745 2 DEBUG oslo_concurrency.processutils [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:23:01 compute-0 nova_compute[192810]: 2025-09-30 21:23:01.800 2 DEBUG oslo_concurrency.processutils [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:23:01 compute-0 nova_compute[192810]: 2025-09-30 21:23:01.801 2 DEBUG nova.virt.disk.api [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Checking if we can resize image /var/lib/nova/instances/5da25813-a9d3-49a0-80ae-59c06fef8440/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:23:01 compute-0 nova_compute[192810]: 2025-09-30 21:23:01.802 2 DEBUG oslo_concurrency.processutils [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5da25813-a9d3-49a0-80ae-59c06fef8440/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:23:01 compute-0 nova_compute[192810]: 2025-09-30 21:23:01.854 2 DEBUG oslo_concurrency.processutils [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5da25813-a9d3-49a0-80ae-59c06fef8440/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:23:01 compute-0 nova_compute[192810]: 2025-09-30 21:23:01.856 2 DEBUG nova.virt.disk.api [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Cannot resize image /var/lib/nova/instances/5da25813-a9d3-49a0-80ae-59c06fef8440/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:23:01 compute-0 nova_compute[192810]: 2025-09-30 21:23:01.857 2 DEBUG nova.objects.instance [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Lazy-loading 'migration_context' on Instance uuid 5da25813-a9d3-49a0-80ae-59c06fef8440 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:23:01 compute-0 nova_compute[192810]: 2025-09-30 21:23:01.870 2 DEBUG nova.virt.libvirt.driver [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:23:01 compute-0 nova_compute[192810]: 2025-09-30 21:23:01.871 2 DEBUG nova.virt.libvirt.driver [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Ensure instance console log exists: /var/lib/nova/instances/5da25813-a9d3-49a0-80ae-59c06fef8440/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:23:01 compute-0 nova_compute[192810]: 2025-09-30 21:23:01.871 2 DEBUG oslo_concurrency.lockutils [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:01 compute-0 nova_compute[192810]: 2025-09-30 21:23:01.872 2 DEBUG oslo_concurrency.lockutils [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:01 compute-0 nova_compute[192810]: 2025-09-30 21:23:01.872 2 DEBUG oslo_concurrency.lockutils [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:02 compute-0 nova_compute[192810]: 2025-09-30 21:23:02.404 2 DEBUG nova.network.neutron [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Successfully created port: bea64634-2634-4763-abcc-3935baa9761c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:23:03 compute-0 nova_compute[192810]: 2025-09-30 21:23:03.245 2 DEBUG nova.network.neutron [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Successfully updated port: bea64634-2634-4763-abcc-3935baa9761c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:23:03 compute-0 nova_compute[192810]: 2025-09-30 21:23:03.258 2 DEBUG oslo_concurrency.lockutils [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Acquiring lock "refresh_cache-5da25813-a9d3-49a0-80ae-59c06fef8440" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:23:03 compute-0 nova_compute[192810]: 2025-09-30 21:23:03.259 2 DEBUG oslo_concurrency.lockutils [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Acquired lock "refresh_cache-5da25813-a9d3-49a0-80ae-59c06fef8440" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:23:03 compute-0 nova_compute[192810]: 2025-09-30 21:23:03.259 2 DEBUG nova.network.neutron [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:23:03 compute-0 nova_compute[192810]: 2025-09-30 21:23:03.349 2 DEBUG nova.compute.manager [req-a1acc7c1-561d-4f92-8993-a09941a2e4a9 req-8d25d00e-697b-4403-be88-94c9c896626f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Received event network-changed-bea64634-2634-4763-abcc-3935baa9761c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:23:03 compute-0 nova_compute[192810]: 2025-09-30 21:23:03.350 2 DEBUG nova.compute.manager [req-a1acc7c1-561d-4f92-8993-a09941a2e4a9 req-8d25d00e-697b-4403-be88-94c9c896626f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Refreshing instance network info cache due to event network-changed-bea64634-2634-4763-abcc-3935baa9761c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:23:03 compute-0 nova_compute[192810]: 2025-09-30 21:23:03.350 2 DEBUG oslo_concurrency.lockutils [req-a1acc7c1-561d-4f92-8993-a09941a2e4a9 req-8d25d00e-697b-4403-be88-94c9c896626f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-5da25813-a9d3-49a0-80ae-59c06fef8440" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:23:03 compute-0 nova_compute[192810]: 2025-09-30 21:23:03.477 2 DEBUG nova.network.neutron [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:23:04 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:04.070 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:04 compute-0 nova_compute[192810]: 2025-09-30 21:23:04.459 2 DEBUG nova.network.neutron [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Updating instance_info_cache with network_info: [{"id": "bea64634-2634-4763-abcc-3935baa9761c", "address": "fa:16:3e:1c:5c:86", "network": {"id": "100affc1-3472-41d6-950b-1802616b458e", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-75147163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a09110a6c4d740edbafe08d6c1782a1d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbea64634-26", "ovs_interfaceid": "bea64634-2634-4763-abcc-3935baa9761c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:23:04 compute-0 nova_compute[192810]: 2025-09-30 21:23:04.477 2 DEBUG oslo_concurrency.lockutils [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Releasing lock "refresh_cache-5da25813-a9d3-49a0-80ae-59c06fef8440" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:23:04 compute-0 nova_compute[192810]: 2025-09-30 21:23:04.478 2 DEBUG nova.compute.manager [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Instance network_info: |[{"id": "bea64634-2634-4763-abcc-3935baa9761c", "address": "fa:16:3e:1c:5c:86", "network": {"id": "100affc1-3472-41d6-950b-1802616b458e", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-75147163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a09110a6c4d740edbafe08d6c1782a1d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbea64634-26", "ovs_interfaceid": "bea64634-2634-4763-abcc-3935baa9761c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:23:04 compute-0 nova_compute[192810]: 2025-09-30 21:23:04.479 2 DEBUG oslo_concurrency.lockutils [req-a1acc7c1-561d-4f92-8993-a09941a2e4a9 req-8d25d00e-697b-4403-be88-94c9c896626f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-5da25813-a9d3-49a0-80ae-59c06fef8440" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:23:04 compute-0 nova_compute[192810]: 2025-09-30 21:23:04.479 2 DEBUG nova.network.neutron [req-a1acc7c1-561d-4f92-8993-a09941a2e4a9 req-8d25d00e-697b-4403-be88-94c9c896626f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Refreshing network info cache for port bea64634-2634-4763-abcc-3935baa9761c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:23:04 compute-0 nova_compute[192810]: 2025-09-30 21:23:04.484 2 DEBUG nova.virt.libvirt.driver [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Start _get_guest_xml network_info=[{"id": "bea64634-2634-4763-abcc-3935baa9761c", "address": "fa:16:3e:1c:5c:86", "network": {"id": "100affc1-3472-41d6-950b-1802616b458e", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-75147163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a09110a6c4d740edbafe08d6c1782a1d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbea64634-26", "ovs_interfaceid": "bea64634-2634-4763-abcc-3935baa9761c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:23:04 compute-0 nova_compute[192810]: 2025-09-30 21:23:04.489 2 WARNING nova.virt.libvirt.driver [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:23:04 compute-0 nova_compute[192810]: 2025-09-30 21:23:04.496 2 DEBUG nova.virt.libvirt.host [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:23:04 compute-0 nova_compute[192810]: 2025-09-30 21:23:04.496 2 DEBUG nova.virt.libvirt.host [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:23:04 compute-0 nova_compute[192810]: 2025-09-30 21:23:04.502 2 DEBUG nova.virt.libvirt.host [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:23:04 compute-0 nova_compute[192810]: 2025-09-30 21:23:04.502 2 DEBUG nova.virt.libvirt.host [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:23:04 compute-0 nova_compute[192810]: 2025-09-30 21:23:04.504 2 DEBUG nova.virt.libvirt.driver [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:23:04 compute-0 nova_compute[192810]: 2025-09-30 21:23:04.504 2 DEBUG nova.virt.hardware [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:23:04 compute-0 nova_compute[192810]: 2025-09-30 21:23:04.505 2 DEBUG nova.virt.hardware [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:23:04 compute-0 nova_compute[192810]: 2025-09-30 21:23:04.505 2 DEBUG nova.virt.hardware [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:23:04 compute-0 nova_compute[192810]: 2025-09-30 21:23:04.505 2 DEBUG nova.virt.hardware [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:23:04 compute-0 nova_compute[192810]: 2025-09-30 21:23:04.506 2 DEBUG nova.virt.hardware [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:23:04 compute-0 nova_compute[192810]: 2025-09-30 21:23:04.506 2 DEBUG nova.virt.hardware [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:23:04 compute-0 nova_compute[192810]: 2025-09-30 21:23:04.506 2 DEBUG nova.virt.hardware [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:23:04 compute-0 nova_compute[192810]: 2025-09-30 21:23:04.506 2 DEBUG nova.virt.hardware [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:23:04 compute-0 nova_compute[192810]: 2025-09-30 21:23:04.507 2 DEBUG nova.virt.hardware [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:23:04 compute-0 nova_compute[192810]: 2025-09-30 21:23:04.507 2 DEBUG nova.virt.hardware [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:23:04 compute-0 nova_compute[192810]: 2025-09-30 21:23:04.507 2 DEBUG nova.virt.hardware [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:23:04 compute-0 nova_compute[192810]: 2025-09-30 21:23:04.511 2 DEBUG nova.virt.libvirt.vif [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:23:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-1607988543',display_name='tempest-FloatingIPsAssociationTestJSON-server-1607988543',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-1607988543',id=38,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a09110a6c4d740edbafe08d6c1782a1d',ramdisk_id='',reservation_id='r-7how27dd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationTestJSON-122452251',owner_user_name='
tempest-FloatingIPsAssociationTestJSON-122452251-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:23:01Z,user_data=None,user_id='73ad6b75271d46c4b9b117faafb3e95e',uuid=5da25813-a9d3-49a0-80ae-59c06fef8440,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bea64634-2634-4763-abcc-3935baa9761c", "address": "fa:16:3e:1c:5c:86", "network": {"id": "100affc1-3472-41d6-950b-1802616b458e", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-75147163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a09110a6c4d740edbafe08d6c1782a1d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbea64634-26", "ovs_interfaceid": "bea64634-2634-4763-abcc-3935baa9761c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:23:04 compute-0 nova_compute[192810]: 2025-09-30 21:23:04.512 2 DEBUG nova.network.os_vif_util [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Converting VIF {"id": "bea64634-2634-4763-abcc-3935baa9761c", "address": "fa:16:3e:1c:5c:86", "network": {"id": "100affc1-3472-41d6-950b-1802616b458e", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-75147163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a09110a6c4d740edbafe08d6c1782a1d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbea64634-26", "ovs_interfaceid": "bea64634-2634-4763-abcc-3935baa9761c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:23:04 compute-0 nova_compute[192810]: 2025-09-30 21:23:04.513 2 DEBUG nova.network.os_vif_util [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1c:5c:86,bridge_name='br-int',has_traffic_filtering=True,id=bea64634-2634-4763-abcc-3935baa9761c,network=Network(100affc1-3472-41d6-950b-1802616b458e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbea64634-26') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:23:04 compute-0 nova_compute[192810]: 2025-09-30 21:23:04.514 2 DEBUG nova.objects.instance [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Lazy-loading 'pci_devices' on Instance uuid 5da25813-a9d3-49a0-80ae-59c06fef8440 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:23:04 compute-0 nova_compute[192810]: 2025-09-30 21:23:04.536 2 DEBUG nova.virt.libvirt.driver [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:23:04 compute-0 nova_compute[192810]:   <uuid>5da25813-a9d3-49a0-80ae-59c06fef8440</uuid>
Sep 30 21:23:04 compute-0 nova_compute[192810]:   <name>instance-00000026</name>
Sep 30 21:23:04 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:23:04 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:23:04 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:23:04 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:       <nova:name>tempest-FloatingIPsAssociationTestJSON-server-1607988543</nova:name>
Sep 30 21:23:04 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:23:04</nova:creationTime>
Sep 30 21:23:04 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:23:04 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:23:04 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:23:04 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:23:04 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:23:04 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:23:04 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:23:04 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:23:04 compute-0 nova_compute[192810]:         <nova:user uuid="73ad6b75271d46c4b9b117faafb3e95e">tempest-FloatingIPsAssociationTestJSON-122452251-project-member</nova:user>
Sep 30 21:23:04 compute-0 nova_compute[192810]:         <nova:project uuid="a09110a6c4d740edbafe08d6c1782a1d">tempest-FloatingIPsAssociationTestJSON-122452251</nova:project>
Sep 30 21:23:04 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:23:04 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:23:04 compute-0 nova_compute[192810]:         <nova:port uuid="bea64634-2634-4763-abcc-3935baa9761c">
Sep 30 21:23:04 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:23:04 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:23:04 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:23:04 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:23:04 compute-0 nova_compute[192810]:     <system>
Sep 30 21:23:04 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:23:04 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:23:04 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:23:04 compute-0 nova_compute[192810]:       <entry name="serial">5da25813-a9d3-49a0-80ae-59c06fef8440</entry>
Sep 30 21:23:04 compute-0 nova_compute[192810]:       <entry name="uuid">5da25813-a9d3-49a0-80ae-59c06fef8440</entry>
Sep 30 21:23:04 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     </system>
Sep 30 21:23:04 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:23:04 compute-0 nova_compute[192810]:   <os>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:   </os>
Sep 30 21:23:04 compute-0 nova_compute[192810]:   <features>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:   </features>
Sep 30 21:23:04 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:23:04 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:23:04 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:23:04 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:23:04 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:23:04 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/5da25813-a9d3-49a0-80ae-59c06fef8440/disk"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:23:04 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/5da25813-a9d3-49a0-80ae-59c06fef8440/disk.config"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:23:04 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:1c:5c:86"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:       <target dev="tapbea64634-26"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:23:04 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/5da25813-a9d3-49a0-80ae-59c06fef8440/console.log" append="off"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     <video>
Sep 30 21:23:04 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     </video>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:23:04 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:23:04 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:23:04 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:23:04 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:23:04 compute-0 nova_compute[192810]: </domain>
Sep 30 21:23:04 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:23:04 compute-0 nova_compute[192810]: 2025-09-30 21:23:04.537 2 DEBUG nova.compute.manager [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Preparing to wait for external event network-vif-plugged-bea64634-2634-4763-abcc-3935baa9761c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:23:04 compute-0 nova_compute[192810]: 2025-09-30 21:23:04.538 2 DEBUG oslo_concurrency.lockutils [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Acquiring lock "5da25813-a9d3-49a0-80ae-59c06fef8440-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:04 compute-0 nova_compute[192810]: 2025-09-30 21:23:04.538 2 DEBUG oslo_concurrency.lockutils [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Lock "5da25813-a9d3-49a0-80ae-59c06fef8440-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:04 compute-0 nova_compute[192810]: 2025-09-30 21:23:04.539 2 DEBUG oslo_concurrency.lockutils [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Lock "5da25813-a9d3-49a0-80ae-59c06fef8440-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:04 compute-0 nova_compute[192810]: 2025-09-30 21:23:04.539 2 DEBUG nova.virt.libvirt.vif [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:23:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-1607988543',display_name='tempest-FloatingIPsAssociationTestJSON-server-1607988543',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-1607988543',id=38,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a09110a6c4d740edbafe08d6c1782a1d',ramdisk_id='',reservation_id='r-7how27dd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationTestJSON-122452251',owner_user_name='tempest-FloatingIPsAssociationTestJSON-122452251-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:23:01Z,user_data=None,user_id='73ad6b75271d46c4b9b117faafb3e95e',uuid=5da25813-a9d3-49a0-80ae-59c06fef8440,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bea64634-2634-4763-abcc-3935baa9761c", "address": "fa:16:3e:1c:5c:86", "network": {"id": "100affc1-3472-41d6-950b-1802616b458e", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-75147163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a09110a6c4d740edbafe08d6c1782a1d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbea64634-26", "ovs_interfaceid": "bea64634-2634-4763-abcc-3935baa9761c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:23:04 compute-0 nova_compute[192810]: 2025-09-30 21:23:04.540 2 DEBUG nova.network.os_vif_util [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Converting VIF {"id": "bea64634-2634-4763-abcc-3935baa9761c", "address": "fa:16:3e:1c:5c:86", "network": {"id": "100affc1-3472-41d6-950b-1802616b458e", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-75147163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a09110a6c4d740edbafe08d6c1782a1d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbea64634-26", "ovs_interfaceid": "bea64634-2634-4763-abcc-3935baa9761c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:23:04 compute-0 nova_compute[192810]: 2025-09-30 21:23:04.540 2 DEBUG nova.network.os_vif_util [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1c:5c:86,bridge_name='br-int',has_traffic_filtering=True,id=bea64634-2634-4763-abcc-3935baa9761c,network=Network(100affc1-3472-41d6-950b-1802616b458e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbea64634-26') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:23:04 compute-0 nova_compute[192810]: 2025-09-30 21:23:04.541 2 DEBUG os_vif [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1c:5c:86,bridge_name='br-int',has_traffic_filtering=True,id=bea64634-2634-4763-abcc-3935baa9761c,network=Network(100affc1-3472-41d6-950b-1802616b458e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbea64634-26') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:23:04 compute-0 nova_compute[192810]: 2025-09-30 21:23:04.542 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:04 compute-0 nova_compute[192810]: 2025-09-30 21:23:04.542 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:04 compute-0 nova_compute[192810]: 2025-09-30 21:23:04.542 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:23:04 compute-0 nova_compute[192810]: 2025-09-30 21:23:04.547 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:04 compute-0 nova_compute[192810]: 2025-09-30 21:23:04.547 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbea64634-26, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:04 compute-0 nova_compute[192810]: 2025-09-30 21:23:04.548 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbea64634-26, col_values=(('external_ids', {'iface-id': 'bea64634-2634-4763-abcc-3935baa9761c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1c:5c:86', 'vm-uuid': '5da25813-a9d3-49a0-80ae-59c06fef8440'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:04 compute-0 nova_compute[192810]: 2025-09-30 21:23:04.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:04 compute-0 NetworkManager[51733]: <info>  [1759267384.5521] manager: (tapbea64634-26): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/73)
Sep 30 21:23:04 compute-0 nova_compute[192810]: 2025-09-30 21:23:04.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:23:04 compute-0 nova_compute[192810]: 2025-09-30 21:23:04.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:04 compute-0 nova_compute[192810]: 2025-09-30 21:23:04.564 2 INFO os_vif [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1c:5c:86,bridge_name='br-int',has_traffic_filtering=True,id=bea64634-2634-4763-abcc-3935baa9761c,network=Network(100affc1-3472-41d6-950b-1802616b458e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbea64634-26')
Sep 30 21:23:04 compute-0 nova_compute[192810]: 2025-09-30 21:23:04.664 2 DEBUG nova.virt.libvirt.driver [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:23:04 compute-0 nova_compute[192810]: 2025-09-30 21:23:04.666 2 DEBUG nova.virt.libvirt.driver [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:23:04 compute-0 nova_compute[192810]: 2025-09-30 21:23:04.667 2 DEBUG nova.virt.libvirt.driver [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] No VIF found with MAC fa:16:3e:1c:5c:86, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:23:04 compute-0 nova_compute[192810]: 2025-09-30 21:23:04.668 2 INFO nova.virt.libvirt.driver [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Using config drive
Sep 30 21:23:05 compute-0 nova_compute[192810]: 2025-09-30 21:23:05.082 2 INFO nova.virt.libvirt.driver [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Creating config drive at /var/lib/nova/instances/5da25813-a9d3-49a0-80ae-59c06fef8440/disk.config
Sep 30 21:23:05 compute-0 nova_compute[192810]: 2025-09-30 21:23:05.092 2 DEBUG oslo_concurrency.processutils [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5da25813-a9d3-49a0-80ae-59c06fef8440/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_r16lzsm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:23:05 compute-0 nova_compute[192810]: 2025-09-30 21:23:05.239 2 DEBUG oslo_concurrency.processutils [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5da25813-a9d3-49a0-80ae-59c06fef8440/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_r16lzsm" returned: 0 in 0.147s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:23:05 compute-0 kernel: tapbea64634-26: entered promiscuous mode
Sep 30 21:23:05 compute-0 nova_compute[192810]: 2025-09-30 21:23:05.319 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:05 compute-0 NetworkManager[51733]: <info>  [1759267385.3211] manager: (tapbea64634-26): new Tun device (/org/freedesktop/NetworkManager/Devices/74)
Sep 30 21:23:05 compute-0 ovn_controller[94912]: 2025-09-30T21:23:05Z|00174|binding|INFO|Claiming lport bea64634-2634-4763-abcc-3935baa9761c for this chassis.
Sep 30 21:23:05 compute-0 ovn_controller[94912]: 2025-09-30T21:23:05Z|00175|binding|INFO|bea64634-2634-4763-abcc-3935baa9761c: Claiming fa:16:3e:1c:5c:86 10.100.0.12
Sep 30 21:23:05 compute-0 nova_compute[192810]: 2025-09-30 21:23:05.329 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:05.341 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1c:5c:86 10.100.0.12'], port_security=['fa:16:3e:1c:5c:86 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '5da25813-a9d3-49a0-80ae-59c06fef8440', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-100affc1-3472-41d6-950b-1802616b458e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a09110a6c4d740edbafe08d6c1782a1d', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c3501d94-e703-4b91-b4a2-8e0ccdf2600f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=be15a9ee-86e7-4792-9a15-1cba7298ecf1, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=bea64634-2634-4763-abcc-3935baa9761c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:05.343 103867 INFO neutron.agent.ovn.metadata.agent [-] Port bea64634-2634-4763-abcc-3935baa9761c in datapath 100affc1-3472-41d6-950b-1802616b458e bound to our chassis
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:05.346 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 100affc1-3472-41d6-950b-1802616b458e
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:05.358 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[6c7a8017-87de-4300-82b9-b8e3f1d314df]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:05.360 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap100affc1-31 in ovnmeta-100affc1-3472-41d6-950b-1802616b458e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:05.361 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap100affc1-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:05.361 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[5cba7771-29a8-4917-9b79-7756a1ec0039]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:05.362 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[074decf0-73d3-4a6b-b609-837dc225adbd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:05 compute-0 systemd-udevd[225921]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:23:05 compute-0 systemd-machined[152794]: New machine qemu-19-instance-00000026.
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:05.376 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[33a54305-08a2-4d12-a20c-14c8ec1b7d3a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:05 compute-0 NetworkManager[51733]: <info>  [1759267385.3852] device (tapbea64634-26): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:23:05 compute-0 NetworkManager[51733]: <info>  [1759267385.3872] device (tapbea64634-26): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:23:05 compute-0 nova_compute[192810]: 2025-09-30 21:23:05.416 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:05.417 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e1f62bfa-7af1-41ea-b36e-7ec2e0113e8f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:05 compute-0 ovn_controller[94912]: 2025-09-30T21:23:05Z|00176|binding|INFO|Setting lport bea64634-2634-4763-abcc-3935baa9761c ovn-installed in OVS
Sep 30 21:23:05 compute-0 ovn_controller[94912]: 2025-09-30T21:23:05Z|00177|binding|INFO|Setting lport bea64634-2634-4763-abcc-3935baa9761c up in Southbound
Sep 30 21:23:05 compute-0 nova_compute[192810]: 2025-09-30 21:23:05.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:05 compute-0 systemd[1]: Started Virtual Machine qemu-19-instance-00000026.
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:05.469 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[d3ba8b6b-d61e-4102-ad18-46e31d39ed0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:05.476 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[298d2456-7b9d-4a76-944c-246e667d1f1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:05 compute-0 NetworkManager[51733]: <info>  [1759267385.4769] manager: (tap100affc1-30): new Veth device (/org/freedesktop/NetworkManager/Devices/75)
Sep 30 21:23:05 compute-0 systemd-udevd[225926]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:05.510 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[8e4b6246-bb33-43de-87fe-568dc616fee7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:05.513 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[16df024c-d8d1-4d7f-b17b-d219a55dcc1a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:05 compute-0 NetworkManager[51733]: <info>  [1759267385.5428] device (tap100affc1-30): carrier: link connected
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:05.549 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[2a8fed8c-fe8c-448b-bffc-0ead914303d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:05.571 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[ccce6902-428a-440d-b066-5c559b7ecc3b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap100affc1-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:78:7c:1e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 408116, 'reachable_time': 18410, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225956, 'error': None, 'target': 'ovnmeta-100affc1-3472-41d6-950b-1802616b458e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:05.590 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[9f0570f4-5c23-4a4a-813d-b9daf5acd125]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe78:7c1e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 408116, 'tstamp': 408116}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225957, 'error': None, 'target': 'ovnmeta-100affc1-3472-41d6-950b-1802616b458e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:05.613 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[61739672-35da-4c1d-891d-8ec1661ff958]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap100affc1-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:78:7c:1e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 408116, 'reachable_time': 18410, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225958, 'error': None, 'target': 'ovnmeta-100affc1-3472-41d6-950b-1802616b458e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:05.648 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e9fb3011-30cd-4cc0-a7a3-c17acd4194dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:05.704 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[058dab2e-f4d6-4e61-b600-021a5f50e252]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:05.705 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap100affc1-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:05.705 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:05.706 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap100affc1-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:05 compute-0 nova_compute[192810]: 2025-09-30 21:23:05.761 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:05 compute-0 kernel: tap100affc1-30: entered promiscuous mode
Sep 30 21:23:05 compute-0 NetworkManager[51733]: <info>  [1759267385.7640] manager: (tap100affc1-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/76)
Sep 30 21:23:05 compute-0 nova_compute[192810]: 2025-09-30 21:23:05.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:05.768 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap100affc1-30, col_values=(('external_ids', {'iface-id': '3e3eb48b-c12a-4061-b9d9-c9a7ba21a331'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:05 compute-0 nova_compute[192810]: 2025-09-30 21:23:05.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:05 compute-0 ovn_controller[94912]: 2025-09-30T21:23:05Z|00178|binding|INFO|Releasing lport 3e3eb48b-c12a-4061-b9d9-c9a7ba21a331 from this chassis (sb_readonly=0)
Sep 30 21:23:05 compute-0 nova_compute[192810]: 2025-09-30 21:23:05.772 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:05.772 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/100affc1-3472-41d6-950b-1802616b458e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/100affc1-3472-41d6-950b-1802616b458e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:05.773 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[42cf4c23-f1b7-4bec-add9-5b9abf70b5b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:05.774 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-100affc1-3472-41d6-950b-1802616b458e
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/100affc1-3472-41d6-950b-1802616b458e.pid.haproxy
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID 100affc1-3472-41d6-950b-1802616b458e
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:23:05 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:05.775 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-100affc1-3472-41d6-950b-1802616b458e', 'env', 'PROCESS_TAG=haproxy-100affc1-3472-41d6-950b-1802616b458e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/100affc1-3472-41d6-950b-1802616b458e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:23:05 compute-0 nova_compute[192810]: 2025-09-30 21:23:05.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:05 compute-0 nova_compute[192810]: 2025-09-30 21:23:05.889 2 DEBUG nova.compute.manager [req-efa22a54-e2ad-45df-89d7-9a68b1146407 req-5a1787e4-c3fa-4bd1-b7fa-db212a771975 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Received event network-vif-plugged-bea64634-2634-4763-abcc-3935baa9761c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:23:05 compute-0 nova_compute[192810]: 2025-09-30 21:23:05.889 2 DEBUG oslo_concurrency.lockutils [req-efa22a54-e2ad-45df-89d7-9a68b1146407 req-5a1787e4-c3fa-4bd1-b7fa-db212a771975 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "5da25813-a9d3-49a0-80ae-59c06fef8440-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:05 compute-0 nova_compute[192810]: 2025-09-30 21:23:05.890 2 DEBUG oslo_concurrency.lockutils [req-efa22a54-e2ad-45df-89d7-9a68b1146407 req-5a1787e4-c3fa-4bd1-b7fa-db212a771975 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "5da25813-a9d3-49a0-80ae-59c06fef8440-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:05 compute-0 nova_compute[192810]: 2025-09-30 21:23:05.890 2 DEBUG oslo_concurrency.lockutils [req-efa22a54-e2ad-45df-89d7-9a68b1146407 req-5a1787e4-c3fa-4bd1-b7fa-db212a771975 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "5da25813-a9d3-49a0-80ae-59c06fef8440-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:05 compute-0 nova_compute[192810]: 2025-09-30 21:23:05.890 2 DEBUG nova.compute.manager [req-efa22a54-e2ad-45df-89d7-9a68b1146407 req-5a1787e4-c3fa-4bd1-b7fa-db212a771975 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Processing event network-vif-plugged-bea64634-2634-4763-abcc-3935baa9761c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:23:06 compute-0 podman[225990]: 2025-09-30 21:23:06.143069818 +0000 UTC m=+0.055565364 container create c48f383bf11af847b4a303d25253a02a1da06b49a8d80624924b1345c7eaa694 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-100affc1-3472-41d6-950b-1802616b458e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Sep 30 21:23:06 compute-0 systemd[1]: Started libpod-conmon-c48f383bf11af847b4a303d25253a02a1da06b49a8d80624924b1345c7eaa694.scope.
Sep 30 21:23:06 compute-0 podman[225990]: 2025-09-30 21:23:06.112104467 +0000 UTC m=+0.024600033 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:23:06 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:23:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/55f924ecda93e0cae2f7ba8f561ba79e6d9778c1e8ac721ef67b713952bf2f88/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:23:06 compute-0 podman[225990]: 2025-09-30 21:23:06.245516199 +0000 UTC m=+0.158011765 container init c48f383bf11af847b4a303d25253a02a1da06b49a8d80624924b1345c7eaa694 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-100affc1-3472-41d6-950b-1802616b458e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Sep 30 21:23:06 compute-0 podman[225990]: 2025-09-30 21:23:06.252883833 +0000 UTC m=+0.165379379 container start c48f383bf11af847b4a303d25253a02a1da06b49a8d80624924b1345c7eaa694 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-100affc1-3472-41d6-950b-1802616b458e, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Sep 30 21:23:06 compute-0 neutron-haproxy-ovnmeta-100affc1-3472-41d6-950b-1802616b458e[226005]: [NOTICE]   (226009) : New worker (226011) forked
Sep 30 21:23:06 compute-0 neutron-haproxy-ovnmeta-100affc1-3472-41d6-950b-1802616b458e[226005]: [NOTICE]   (226009) : Loading success.
Sep 30 21:23:06 compute-0 nova_compute[192810]: 2025-09-30 21:23:06.562 2 DEBUG nova.network.neutron [req-a1acc7c1-561d-4f92-8993-a09941a2e4a9 req-8d25d00e-697b-4403-be88-94c9c896626f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Updated VIF entry in instance network info cache for port bea64634-2634-4763-abcc-3935baa9761c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:23:06 compute-0 nova_compute[192810]: 2025-09-30 21:23:06.563 2 DEBUG nova.network.neutron [req-a1acc7c1-561d-4f92-8993-a09941a2e4a9 req-8d25d00e-697b-4403-be88-94c9c896626f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Updating instance_info_cache with network_info: [{"id": "bea64634-2634-4763-abcc-3935baa9761c", "address": "fa:16:3e:1c:5c:86", "network": {"id": "100affc1-3472-41d6-950b-1802616b458e", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-75147163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a09110a6c4d740edbafe08d6c1782a1d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbea64634-26", "ovs_interfaceid": "bea64634-2634-4763-abcc-3935baa9761c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:23:06 compute-0 nova_compute[192810]: 2025-09-30 21:23:06.581 2 DEBUG oslo_concurrency.lockutils [req-a1acc7c1-561d-4f92-8993-a09941a2e4a9 req-8d25d00e-697b-4403-be88-94c9c896626f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-5da25813-a9d3-49a0-80ae-59c06fef8440" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:23:06 compute-0 nova_compute[192810]: 2025-09-30 21:23:06.648 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:07 compute-0 podman[226027]: 2025-09-30 21:23:07.314482923 +0000 UTC m=+0.045519374 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=36bccb96575468ec919301205d8daa2c, 
tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 21:23:07 compute-0 nova_compute[192810]: 2025-09-30 21:23:07.400 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267387.4004164, 5da25813-a9d3-49a0-80ae-59c06fef8440 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:23:07 compute-0 nova_compute[192810]: 2025-09-30 21:23:07.401 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] VM Started (Lifecycle Event)
Sep 30 21:23:07 compute-0 nova_compute[192810]: 2025-09-30 21:23:07.402 2 DEBUG nova.compute.manager [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:23:07 compute-0 nova_compute[192810]: 2025-09-30 21:23:07.405 2 DEBUG nova.virt.libvirt.driver [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:23:07 compute-0 nova_compute[192810]: 2025-09-30 21:23:07.408 2 INFO nova.virt.libvirt.driver [-] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Instance spawned successfully.
Sep 30 21:23:07 compute-0 nova_compute[192810]: 2025-09-30 21:23:07.408 2 DEBUG nova.virt.libvirt.driver [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:23:07 compute-0 nova_compute[192810]: 2025-09-30 21:23:07.427 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:23:07 compute-0 nova_compute[192810]: 2025-09-30 21:23:07.432 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:23:07 compute-0 nova_compute[192810]: 2025-09-30 21:23:07.435 2 DEBUG nova.virt.libvirt.driver [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:23:07 compute-0 nova_compute[192810]: 2025-09-30 21:23:07.435 2 DEBUG nova.virt.libvirt.driver [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:23:07 compute-0 nova_compute[192810]: 2025-09-30 21:23:07.435 2 DEBUG nova.virt.libvirt.driver [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:23:07 compute-0 nova_compute[192810]: 2025-09-30 21:23:07.436 2 DEBUG nova.virt.libvirt.driver [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:23:07 compute-0 nova_compute[192810]: 2025-09-30 21:23:07.436 2 DEBUG nova.virt.libvirt.driver [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:23:07 compute-0 nova_compute[192810]: 2025-09-30 21:23:07.436 2 DEBUG nova.virt.libvirt.driver [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:23:07 compute-0 nova_compute[192810]: 2025-09-30 21:23:07.473 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:23:07 compute-0 nova_compute[192810]: 2025-09-30 21:23:07.474 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267387.4023747, 5da25813-a9d3-49a0-80ae-59c06fef8440 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:23:07 compute-0 nova_compute[192810]: 2025-09-30 21:23:07.474 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] VM Paused (Lifecycle Event)
Sep 30 21:23:07 compute-0 nova_compute[192810]: 2025-09-30 21:23:07.519 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:23:07 compute-0 nova_compute[192810]: 2025-09-30 21:23:07.522 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267387.4049242, 5da25813-a9d3-49a0-80ae-59c06fef8440 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:23:07 compute-0 nova_compute[192810]: 2025-09-30 21:23:07.522 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] VM Resumed (Lifecycle Event)
Sep 30 21:23:07 compute-0 nova_compute[192810]: 2025-09-30 21:23:07.561 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:23:07 compute-0 nova_compute[192810]: 2025-09-30 21:23:07.563 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:23:07 compute-0 nova_compute[192810]: 2025-09-30 21:23:07.578 2 INFO nova.compute.manager [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Took 6.09 seconds to spawn the instance on the hypervisor.
Sep 30 21:23:07 compute-0 nova_compute[192810]: 2025-09-30 21:23:07.579 2 DEBUG nova.compute.manager [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:23:07 compute-0 nova_compute[192810]: 2025-09-30 21:23:07.589 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:23:07 compute-0 nova_compute[192810]: 2025-09-30 21:23:07.703 2 INFO nova.compute.manager [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Took 6.70 seconds to build instance.
Sep 30 21:23:07 compute-0 nova_compute[192810]: 2025-09-30 21:23:07.722 2 DEBUG oslo_concurrency.lockutils [None req-7f803598-7b20-4fe9-9f67-46a593fd2fce 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Lock "5da25813-a9d3-49a0-80ae-59c06fef8440" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.811s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:08 compute-0 nova_compute[192810]: 2025-09-30 21:23:08.110 2 DEBUG nova.compute.manager [req-0224d5fb-084d-426c-9aa7-ff08a5129985 req-2770af94-3a82-4047-a879-237407944bda dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Received event network-vif-plugged-bea64634-2634-4763-abcc-3935baa9761c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:23:08 compute-0 nova_compute[192810]: 2025-09-30 21:23:08.110 2 DEBUG oslo_concurrency.lockutils [req-0224d5fb-084d-426c-9aa7-ff08a5129985 req-2770af94-3a82-4047-a879-237407944bda dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "5da25813-a9d3-49a0-80ae-59c06fef8440-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:08 compute-0 nova_compute[192810]: 2025-09-30 21:23:08.111 2 DEBUG oslo_concurrency.lockutils [req-0224d5fb-084d-426c-9aa7-ff08a5129985 req-2770af94-3a82-4047-a879-237407944bda dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "5da25813-a9d3-49a0-80ae-59c06fef8440-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:08 compute-0 nova_compute[192810]: 2025-09-30 21:23:08.111 2 DEBUG oslo_concurrency.lockutils [req-0224d5fb-084d-426c-9aa7-ff08a5129985 req-2770af94-3a82-4047-a879-237407944bda dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "5da25813-a9d3-49a0-80ae-59c06fef8440-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:08 compute-0 nova_compute[192810]: 2025-09-30 21:23:08.111 2 DEBUG nova.compute.manager [req-0224d5fb-084d-426c-9aa7-ff08a5129985 req-2770af94-3a82-4047-a879-237407944bda dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] No waiting events found dispatching network-vif-plugged-bea64634-2634-4763-abcc-3935baa9761c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:23:08 compute-0 nova_compute[192810]: 2025-09-30 21:23:08.111 2 WARNING nova.compute.manager [req-0224d5fb-084d-426c-9aa7-ff08a5129985 req-2770af94-3a82-4047-a879-237407944bda dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Received unexpected event network-vif-plugged-bea64634-2634-4763-abcc-3935baa9761c for instance with vm_state active and task_state None.
Sep 30 21:23:09 compute-0 nova_compute[192810]: 2025-09-30 21:23:09.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:11 compute-0 nova_compute[192810]: 2025-09-30 21:23:11.649 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:12 compute-0 nova_compute[192810]: 2025-09-30 21:23:12.827 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267377.826011, 5c749a3a-92bd-47ce-a966-33f62c7e3019 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:23:12 compute-0 nova_compute[192810]: 2025-09-30 21:23:12.827 2 INFO nova.compute.manager [-] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] VM Stopped (Lifecycle Event)
Sep 30 21:23:12 compute-0 nova_compute[192810]: 2025-09-30 21:23:12.848 2 DEBUG nova.compute.manager [None req-c2e21ca3-7ed4-43b6-8f18-7642e17ad193 - - - - - -] [instance: 5c749a3a-92bd-47ce-a966-33f62c7e3019] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:23:14 compute-0 podman[226047]: 2025-09-30 21:23:14.304858388 +0000 UTC m=+0.047441901 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Sep 30 21:23:14 compute-0 podman[226048]: 2025-09-30 21:23:14.314679923 +0000 UTC m=+0.053921573 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, release=1755695350, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, name=ubi9-minimal, config_id=edpm, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 21:23:14 compute-0 nova_compute[192810]: 2025-09-30 21:23:14.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:15 compute-0 nova_compute[192810]: 2025-09-30 21:23:15.348 2 DEBUG oslo_concurrency.lockutils [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Acquiring lock "3585ec48-d3dc-4467-be42-206c7784d55c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:15 compute-0 nova_compute[192810]: 2025-09-30 21:23:15.348 2 DEBUG oslo_concurrency.lockutils [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Lock "3585ec48-d3dc-4467-be42-206c7784d55c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:15 compute-0 nova_compute[192810]: 2025-09-30 21:23:15.365 2 DEBUG nova.compute.manager [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:23:15 compute-0 nova_compute[192810]: 2025-09-30 21:23:15.488 2 DEBUG oslo_concurrency.lockutils [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:15 compute-0 nova_compute[192810]: 2025-09-30 21:23:15.489 2 DEBUG oslo_concurrency.lockutils [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:15 compute-0 nova_compute[192810]: 2025-09-30 21:23:15.495 2 DEBUG nova.virt.hardware [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:23:15 compute-0 nova_compute[192810]: 2025-09-30 21:23:15.496 2 INFO nova.compute.claims [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:23:15 compute-0 nova_compute[192810]: 2025-09-30 21:23:15.658 2 DEBUG oslo_concurrency.lockutils [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Acquiring lock "246eff24-0065-40dd-87cf-c31afb91539f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:15 compute-0 nova_compute[192810]: 2025-09-30 21:23:15.659 2 DEBUG oslo_concurrency.lockutils [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Lock "246eff24-0065-40dd-87cf-c31afb91539f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:15 compute-0 nova_compute[192810]: 2025-09-30 21:23:15.665 2 DEBUG nova.compute.provider_tree [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:23:15 compute-0 nova_compute[192810]: 2025-09-30 21:23:15.679 2 DEBUG nova.scheduler.client.report [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:23:15 compute-0 nova_compute[192810]: 2025-09-30 21:23:15.682 2 DEBUG nova.compute.manager [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:23:15 compute-0 nova_compute[192810]: 2025-09-30 21:23:15.716 2 DEBUG oslo_concurrency.lockutils [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.227s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:15 compute-0 nova_compute[192810]: 2025-09-30 21:23:15.717 2 DEBUG nova.compute.manager [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:23:15 compute-0 nova_compute[192810]: 2025-09-30 21:23:15.790 2 DEBUG nova.compute.manager [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:23:15 compute-0 nova_compute[192810]: 2025-09-30 21:23:15.791 2 DEBUG nova.network.neutron [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:23:15 compute-0 nova_compute[192810]: 2025-09-30 21:23:15.807 2 INFO nova.virt.libvirt.driver [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:23:15 compute-0 nova_compute[192810]: 2025-09-30 21:23:15.823 2 DEBUG oslo_concurrency.lockutils [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:15 compute-0 nova_compute[192810]: 2025-09-30 21:23:15.823 2 DEBUG oslo_concurrency.lockutils [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:15 compute-0 nova_compute[192810]: 2025-09-30 21:23:15.831 2 DEBUG nova.virt.hardware [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:23:15 compute-0 nova_compute[192810]: 2025-09-30 21:23:15.831 2 INFO nova.compute.claims [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:23:15 compute-0 nova_compute[192810]: 2025-09-30 21:23:15.836 2 DEBUG nova.compute.manager [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:23:15 compute-0 nova_compute[192810]: 2025-09-30 21:23:15.962 2 DEBUG nova.compute.manager [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:23:15 compute-0 nova_compute[192810]: 2025-09-30 21:23:15.964 2 DEBUG nova.virt.libvirt.driver [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:23:15 compute-0 nova_compute[192810]: 2025-09-30 21:23:15.965 2 INFO nova.virt.libvirt.driver [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Creating image(s)
Sep 30 21:23:15 compute-0 nova_compute[192810]: 2025-09-30 21:23:15.966 2 DEBUG oslo_concurrency.lockutils [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Acquiring lock "/var/lib/nova/instances/3585ec48-d3dc-4467-be42-206c7784d55c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:15 compute-0 nova_compute[192810]: 2025-09-30 21:23:15.966 2 DEBUG oslo_concurrency.lockutils [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Lock "/var/lib/nova/instances/3585ec48-d3dc-4467-be42-206c7784d55c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:15 compute-0 nova_compute[192810]: 2025-09-30 21:23:15.968 2 DEBUG oslo_concurrency.lockutils [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Lock "/var/lib/nova/instances/3585ec48-d3dc-4467-be42-206c7784d55c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:15 compute-0 nova_compute[192810]: 2025-09-30 21:23:15.997 2 DEBUG oslo_concurrency.processutils [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.051 2 DEBUG nova.compute.provider_tree [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.070 2 DEBUG nova.scheduler.client.report [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.075 2 DEBUG oslo_concurrency.processutils [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.076 2 DEBUG oslo_concurrency.lockutils [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.077 2 DEBUG oslo_concurrency.lockutils [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.092 2 DEBUG oslo_concurrency.processutils [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.113 2 DEBUG oslo_concurrency.lockutils [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.290s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.114 2 DEBUG nova.compute.manager [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.151 2 DEBUG oslo_concurrency.processutils [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.152 2 DEBUG oslo_concurrency.processutils [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/3585ec48-d3dc-4467-be42-206c7784d55c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.184 2 DEBUG oslo_concurrency.processutils [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/3585ec48-d3dc-4467-be42-206c7784d55c/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.185 2 DEBUG nova.compute.manager [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.186 2 DEBUG oslo_concurrency.lockutils [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.187 2 DEBUG oslo_concurrency.processutils [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.203 2 INFO nova.virt.libvirt.driver [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.219 2 DEBUG nova.compute.manager [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.238 2 DEBUG oslo_concurrency.processutils [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.239 2 DEBUG nova.virt.disk.api [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Checking if we can resize image /var/lib/nova/instances/3585ec48-d3dc-4467-be42-206c7784d55c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.240 2 DEBUG oslo_concurrency.processutils [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3585ec48-d3dc-4467-be42-206c7784d55c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.319 2 DEBUG nova.compute.manager [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.321 2 DEBUG nova.virt.libvirt.driver [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.321 2 INFO nova.virt.libvirt.driver [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Creating image(s)
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.322 2 DEBUG oslo_concurrency.lockutils [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Acquiring lock "/var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.322 2 DEBUG oslo_concurrency.lockutils [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Lock "/var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.323 2 DEBUG oslo_concurrency.lockutils [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Lock "/var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.339 2 DEBUG oslo_concurrency.processutils [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3585ec48-d3dc-4467-be42-206c7784d55c/disk --force-share --output=json" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.340 2 DEBUG oslo_concurrency.processutils [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.358 2 DEBUG nova.virt.disk.api [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Cannot resize image /var/lib/nova/instances/3585ec48-d3dc-4467-be42-206c7784d55c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.359 2 DEBUG nova.objects.instance [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Lazy-loading 'migration_context' on Instance uuid 3585ec48-d3dc-4467-be42-206c7784d55c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.378 2 DEBUG nova.virt.libvirt.driver [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.379 2 DEBUG nova.virt.libvirt.driver [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Ensure instance console log exists: /var/lib/nova/instances/3585ec48-d3dc-4467-be42-206c7784d55c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.380 2 DEBUG oslo_concurrency.lockutils [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.380 2 DEBUG oslo_concurrency.lockutils [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.381 2 DEBUG oslo_concurrency.lockutils [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.391 2 DEBUG nova.policy [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '73ad6b75271d46c4b9b117faafb3e95e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a09110a6c4d740edbafe08d6c1782a1d', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.404 2 DEBUG oslo_concurrency.processutils [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.405 2 DEBUG oslo_concurrency.lockutils [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.406 2 DEBUG oslo_concurrency.lockutils [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.421 2 DEBUG oslo_concurrency.processutils [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.495 2 DEBUG oslo_concurrency.processutils [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.497 2 DEBUG oslo_concurrency.processutils [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.539 2 DEBUG oslo_concurrency.processutils [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.540 2 DEBUG oslo_concurrency.lockutils [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.541 2 DEBUG oslo_concurrency.processutils [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.605 2 DEBUG oslo_concurrency.processutils [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.608 2 DEBUG nova.virt.disk.api [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Checking if we can resize image /var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.609 2 DEBUG oslo_concurrency.processutils [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.651 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.677 2 DEBUG oslo_concurrency.processutils [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.678 2 DEBUG nova.virt.disk.api [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Cannot resize image /var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.679 2 DEBUG nova.objects.instance [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Lazy-loading 'migration_context' on Instance uuid 246eff24-0065-40dd-87cf-c31afb91539f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.692 2 DEBUG nova.virt.libvirt.driver [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.692 2 DEBUG nova.virt.libvirt.driver [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Ensure instance console log exists: /var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.693 2 DEBUG oslo_concurrency.lockutils [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.693 2 DEBUG oslo_concurrency.lockutils [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.694 2 DEBUG oslo_concurrency.lockutils [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.695 2 DEBUG nova.virt.libvirt.driver [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.699 2 WARNING nova.virt.libvirt.driver [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.703 2 DEBUG nova.virt.libvirt.host [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.704 2 DEBUG nova.virt.libvirt.host [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.708 2 DEBUG nova.virt.libvirt.host [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.709 2 DEBUG nova.virt.libvirt.host [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.710 2 DEBUG nova.virt.libvirt.driver [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.710 2 DEBUG nova.virt.hardware [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.710 2 DEBUG nova.virt.hardware [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.711 2 DEBUG nova.virt.hardware [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.711 2 DEBUG nova.virt.hardware [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.711 2 DEBUG nova.virt.hardware [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.712 2 DEBUG nova.virt.hardware [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.712 2 DEBUG nova.virt.hardware [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.712 2 DEBUG nova.virt.hardware [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.712 2 DEBUG nova.virt.hardware [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.713 2 DEBUG nova.virt.hardware [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.713 2 DEBUG nova.virt.hardware [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.716 2 DEBUG nova.objects.instance [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Lazy-loading 'pci_devices' on Instance uuid 246eff24-0065-40dd-87cf-c31afb91539f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.733 2 DEBUG nova.virt.libvirt.driver [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:23:16 compute-0 nova_compute[192810]:   <uuid>246eff24-0065-40dd-87cf-c31afb91539f</uuid>
Sep 30 21:23:16 compute-0 nova_compute[192810]:   <name>instance-0000002a</name>
Sep 30 21:23:16 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:23:16 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:23:16 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:23:16 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:23:16 compute-0 nova_compute[192810]:       <nova:name>tempest-ServersAdmin275Test-server-1639195055</nova:name>
Sep 30 21:23:16 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:23:16</nova:creationTime>
Sep 30 21:23:16 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:23:16 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:23:16 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:23:16 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:23:16 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:23:16 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:23:16 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:23:16 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:23:16 compute-0 nova_compute[192810]:         <nova:user uuid="80508b28c98a49d09d4a47dd6403e6fe">tempest-ServersAdmin275Test-674052928-project-member</nova:user>
Sep 30 21:23:16 compute-0 nova_compute[192810]:         <nova:project uuid="a5f7cbca41c24c7fac661cac94fdf2aa">tempest-ServersAdmin275Test-674052928</nova:project>
Sep 30 21:23:16 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:23:16 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:23:16 compute-0 nova_compute[192810]:       <nova:ports/>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:23:16 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:23:16 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:23:16 compute-0 nova_compute[192810]:     <system>
Sep 30 21:23:16 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:23:16 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:23:16 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:23:16 compute-0 nova_compute[192810]:       <entry name="serial">246eff24-0065-40dd-87cf-c31afb91539f</entry>
Sep 30 21:23:16 compute-0 nova_compute[192810]:       <entry name="uuid">246eff24-0065-40dd-87cf-c31afb91539f</entry>
Sep 30 21:23:16 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     </system>
Sep 30 21:23:16 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:23:16 compute-0 nova_compute[192810]:   <os>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:23:16 compute-0 nova_compute[192810]:   </os>
Sep 30 21:23:16 compute-0 nova_compute[192810]:   <features>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:23:16 compute-0 nova_compute[192810]:   </features>
Sep 30 21:23:16 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:23:16 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:23:16 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:23:16 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:23:16 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:23:16 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:23:16 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:23:16 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:23:16 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/disk"/>
Sep 30 21:23:16 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:23:16 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:23:16 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/disk.config"/>
Sep 30 21:23:16 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:23:16 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/console.log" append="off"/>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     <video>
Sep 30 21:23:16 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     </video>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:23:16 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:23:16 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:23:16 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:23:16 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:23:16 compute-0 nova_compute[192810]: </domain>
Sep 30 21:23:16 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.787 2 DEBUG nova.virt.libvirt.driver [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.788 2 DEBUG nova.virt.libvirt.driver [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:23:16 compute-0 nova_compute[192810]: 2025-09-30 21:23:16.788 2 INFO nova.virt.libvirt.driver [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Using config drive
Sep 30 21:23:17 compute-0 nova_compute[192810]: 2025-09-30 21:23:17.340 2 INFO nova.virt.libvirt.driver [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Creating config drive at /var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/disk.config
Sep 30 21:23:17 compute-0 nova_compute[192810]: 2025-09-30 21:23:17.344 2 DEBUG oslo_concurrency.processutils [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbqxoecat execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:23:17 compute-0 nova_compute[192810]: 2025-09-30 21:23:17.475 2 DEBUG oslo_concurrency.processutils [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbqxoecat" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:23:17 compute-0 systemd-machined[152794]: New machine qemu-20-instance-0000002a.
Sep 30 21:23:17 compute-0 systemd[1]: Started Virtual Machine qemu-20-instance-0000002a.
Sep 30 21:23:17 compute-0 nova_compute[192810]: 2025-09-30 21:23:17.842 2 DEBUG nova.network.neutron [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Successfully created port: b811eba7-d01d-413e-8b04-f60d00b1638e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:23:18 compute-0 nova_compute[192810]: 2025-09-30 21:23:18.644 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267398.6428905, 246eff24-0065-40dd-87cf-c31afb91539f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:23:18 compute-0 nova_compute[192810]: 2025-09-30 21:23:18.644 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] VM Resumed (Lifecycle Event)
Sep 30 21:23:18 compute-0 nova_compute[192810]: 2025-09-30 21:23:18.646 2 DEBUG nova.compute.manager [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:23:18 compute-0 nova_compute[192810]: 2025-09-30 21:23:18.647 2 DEBUG nova.virt.libvirt.driver [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:23:18 compute-0 nova_compute[192810]: 2025-09-30 21:23:18.649 2 INFO nova.virt.libvirt.driver [-] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Instance spawned successfully.
Sep 30 21:23:18 compute-0 nova_compute[192810]: 2025-09-30 21:23:18.650 2 DEBUG nova.virt.libvirt.driver [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:23:18 compute-0 nova_compute[192810]: 2025-09-30 21:23:18.661 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:23:18 compute-0 nova_compute[192810]: 2025-09-30 21:23:18.666 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:23:18 compute-0 nova_compute[192810]: 2025-09-30 21:23:18.669 2 DEBUG nova.virt.libvirt.driver [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:23:18 compute-0 nova_compute[192810]: 2025-09-30 21:23:18.670 2 DEBUG nova.virt.libvirt.driver [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:23:18 compute-0 nova_compute[192810]: 2025-09-30 21:23:18.670 2 DEBUG nova.virt.libvirt.driver [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:23:18 compute-0 nova_compute[192810]: 2025-09-30 21:23:18.670 2 DEBUG nova.virt.libvirt.driver [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:23:18 compute-0 nova_compute[192810]: 2025-09-30 21:23:18.671 2 DEBUG nova.virt.libvirt.driver [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:23:18 compute-0 nova_compute[192810]: 2025-09-30 21:23:18.671 2 DEBUG nova.virt.libvirt.driver [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:23:18 compute-0 nova_compute[192810]: 2025-09-30 21:23:18.694 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:23:18 compute-0 nova_compute[192810]: 2025-09-30 21:23:18.695 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267398.643662, 246eff24-0065-40dd-87cf-c31afb91539f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:23:18 compute-0 nova_compute[192810]: 2025-09-30 21:23:18.695 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] VM Started (Lifecycle Event)
Sep 30 21:23:18 compute-0 nova_compute[192810]: 2025-09-30 21:23:18.746 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:23:18 compute-0 nova_compute[192810]: 2025-09-30 21:23:18.749 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:23:18 compute-0 nova_compute[192810]: 2025-09-30 21:23:18.779 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:23:18 compute-0 nova_compute[192810]: 2025-09-30 21:23:18.825 2 INFO nova.compute.manager [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Took 2.51 seconds to spawn the instance on the hypervisor.
Sep 30 21:23:18 compute-0 nova_compute[192810]: 2025-09-30 21:23:18.825 2 DEBUG nova.compute.manager [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:23:18 compute-0 nova_compute[192810]: 2025-09-30 21:23:18.899 2 INFO nova.compute.manager [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Took 3.12 seconds to build instance.
Sep 30 21:23:18 compute-0 nova_compute[192810]: 2025-09-30 21:23:18.922 2 DEBUG oslo_concurrency.lockutils [None req-6c909421-5111-42ef-9d8e-86ffa96fb4d0 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Lock "246eff24-0065-40dd-87cf-c31afb91539f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.262s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:19 compute-0 ovn_controller[94912]: 2025-09-30T21:23:19Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1c:5c:86 10.100.0.12
Sep 30 21:23:19 compute-0 ovn_controller[94912]: 2025-09-30T21:23:19Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1c:5c:86 10.100.0.12
Sep 30 21:23:19 compute-0 nova_compute[192810]: 2025-09-30 21:23:19.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:19 compute-0 nova_compute[192810]: 2025-09-30 21:23:19.768 2 DEBUG nova.network.neutron [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Successfully updated port: b811eba7-d01d-413e-8b04-f60d00b1638e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:23:19 compute-0 nova_compute[192810]: 2025-09-30 21:23:19.781 2 DEBUG oslo_concurrency.lockutils [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Acquiring lock "refresh_cache-3585ec48-d3dc-4467-be42-206c7784d55c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:23:19 compute-0 nova_compute[192810]: 2025-09-30 21:23:19.782 2 DEBUG oslo_concurrency.lockutils [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Acquired lock "refresh_cache-3585ec48-d3dc-4467-be42-206c7784d55c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:23:19 compute-0 nova_compute[192810]: 2025-09-30 21:23:19.782 2 DEBUG nova.network.neutron [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:23:19 compute-0 nova_compute[192810]: 2025-09-30 21:23:19.970 2 DEBUG nova.network.neutron [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:23:21 compute-0 podman[226163]: 2025-09-30 21:23:21.33765459 +0000 UTC m=+0.068973158 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=iscsid, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.341 2 INFO nova.compute.manager [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Rebuilding instance
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.363 2 DEBUG nova.network.neutron [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Updating instance_info_cache with network_info: [{"id": "b811eba7-d01d-413e-8b04-f60d00b1638e", "address": "fa:16:3e:39:74:45", "network": {"id": "100affc1-3472-41d6-950b-1802616b458e", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-75147163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a09110a6c4d740edbafe08d6c1782a1d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb811eba7-d0", "ovs_interfaceid": "b811eba7-d01d-413e-8b04-f60d00b1638e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.397 2 DEBUG oslo_concurrency.lockutils [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Releasing lock "refresh_cache-3585ec48-d3dc-4467-be42-206c7784d55c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.397 2 DEBUG nova.compute.manager [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Instance network_info: |[{"id": "b811eba7-d01d-413e-8b04-f60d00b1638e", "address": "fa:16:3e:39:74:45", "network": {"id": "100affc1-3472-41d6-950b-1802616b458e", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-75147163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a09110a6c4d740edbafe08d6c1782a1d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb811eba7-d0", "ovs_interfaceid": "b811eba7-d01d-413e-8b04-f60d00b1638e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.399 2 DEBUG nova.virt.libvirt.driver [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Start _get_guest_xml network_info=[{"id": "b811eba7-d01d-413e-8b04-f60d00b1638e", "address": "fa:16:3e:39:74:45", "network": {"id": "100affc1-3472-41d6-950b-1802616b458e", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-75147163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a09110a6c4d740edbafe08d6c1782a1d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb811eba7-d0", "ovs_interfaceid": "b811eba7-d01d-413e-8b04-f60d00b1638e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.403 2 WARNING nova.virt.libvirt.driver [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.407 2 DEBUG nova.virt.libvirt.host [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.409 2 DEBUG nova.virt.libvirt.host [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.415 2 DEBUG nova.virt.libvirt.host [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.416 2 DEBUG nova.virt.libvirt.host [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.418 2 DEBUG nova.virt.libvirt.driver [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.419 2 DEBUG nova.virt.hardware [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.419 2 DEBUG nova.virt.hardware [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.419 2 DEBUG nova.virt.hardware [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.420 2 DEBUG nova.virt.hardware [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.420 2 DEBUG nova.virt.hardware [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.420 2 DEBUG nova.virt.hardware [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.420 2 DEBUG nova.virt.hardware [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.420 2 DEBUG nova.virt.hardware [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.420 2 DEBUG nova.virt.hardware [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.421 2 DEBUG nova.virt.hardware [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.421 2 DEBUG nova.virt.hardware [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.423 2 DEBUG nova.virt.libvirt.vif [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:23:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-35932657',display_name='tempest-FloatingIPsAssociationTestJSON-server-35932657',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-35932657',id=41,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a09110a6c4d740edbafe08d6c1782a1d',ramdisk_id='',reservation_id='r-ir94xn0u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationTestJSON-122452251',owner_user_name='tempest-FloatingIPsAssociationTestJSON-122452251-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:23:15Z,user_data=None,user_id='73ad6b75271d46c4b9b117faafb3e95e',uuid=3585ec48-d3dc-4467-be42-206c7784d55c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b811eba7-d01d-413e-8b04-f60d00b1638e", "address": "fa:16:3e:39:74:45", "network": {"id": "100affc1-3472-41d6-950b-1802616b458e", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-75147163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a09110a6c4d740edbafe08d6c1782a1d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb811eba7-d0", "ovs_interfaceid": "b811eba7-d01d-413e-8b04-f60d00b1638e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.424 2 DEBUG nova.network.os_vif_util [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Converting VIF {"id": "b811eba7-d01d-413e-8b04-f60d00b1638e", "address": "fa:16:3e:39:74:45", "network": {"id": "100affc1-3472-41d6-950b-1802616b458e", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-75147163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a09110a6c4d740edbafe08d6c1782a1d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb811eba7-d0", "ovs_interfaceid": "b811eba7-d01d-413e-8b04-f60d00b1638e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.424 2 DEBUG nova.network.os_vif_util [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:39:74:45,bridge_name='br-int',has_traffic_filtering=True,id=b811eba7-d01d-413e-8b04-f60d00b1638e,network=Network(100affc1-3472-41d6-950b-1802616b458e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb811eba7-d0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.425 2 DEBUG nova.objects.instance [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Lazy-loading 'pci_devices' on Instance uuid 3585ec48-d3dc-4467-be42-206c7784d55c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.449 2 DEBUG nova.virt.libvirt.driver [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:23:21 compute-0 nova_compute[192810]:   <uuid>3585ec48-d3dc-4467-be42-206c7784d55c</uuid>
Sep 30 21:23:21 compute-0 nova_compute[192810]:   <name>instance-00000029</name>
Sep 30 21:23:21 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:23:21 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:23:21 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:23:21 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:       <nova:name>tempest-FloatingIPsAssociationTestJSON-server-35932657</nova:name>
Sep 30 21:23:21 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:23:21</nova:creationTime>
Sep 30 21:23:21 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:23:21 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:23:21 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:23:21 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:23:21 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:23:21 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:23:21 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:23:21 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:23:21 compute-0 nova_compute[192810]:         <nova:user uuid="73ad6b75271d46c4b9b117faafb3e95e">tempest-FloatingIPsAssociationTestJSON-122452251-project-member</nova:user>
Sep 30 21:23:21 compute-0 nova_compute[192810]:         <nova:project uuid="a09110a6c4d740edbafe08d6c1782a1d">tempest-FloatingIPsAssociationTestJSON-122452251</nova:project>
Sep 30 21:23:21 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:23:21 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:23:21 compute-0 nova_compute[192810]:         <nova:port uuid="b811eba7-d01d-413e-8b04-f60d00b1638e">
Sep 30 21:23:21 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:23:21 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:23:21 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:23:21 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:23:21 compute-0 nova_compute[192810]:     <system>
Sep 30 21:23:21 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:23:21 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:23:21 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:23:21 compute-0 nova_compute[192810]:       <entry name="serial">3585ec48-d3dc-4467-be42-206c7784d55c</entry>
Sep 30 21:23:21 compute-0 nova_compute[192810]:       <entry name="uuid">3585ec48-d3dc-4467-be42-206c7784d55c</entry>
Sep 30 21:23:21 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     </system>
Sep 30 21:23:21 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:23:21 compute-0 nova_compute[192810]:   <os>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:   </os>
Sep 30 21:23:21 compute-0 nova_compute[192810]:   <features>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:   </features>
Sep 30 21:23:21 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:23:21 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:23:21 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:23:21 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:23:21 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:23:21 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/3585ec48-d3dc-4467-be42-206c7784d55c/disk"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:23:21 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/3585ec48-d3dc-4467-be42-206c7784d55c/disk.config"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:23:21 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:39:74:45"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:       <target dev="tapb811eba7-d0"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:23:21 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/3585ec48-d3dc-4467-be42-206c7784d55c/console.log" append="off"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     <video>
Sep 30 21:23:21 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     </video>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:23:21 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:23:21 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:23:21 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:23:21 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:23:21 compute-0 nova_compute[192810]: </domain>
Sep 30 21:23:21 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.450 2 DEBUG nova.compute.manager [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Preparing to wait for external event network-vif-plugged-b811eba7-d01d-413e-8b04-f60d00b1638e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.451 2 DEBUG oslo_concurrency.lockutils [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Acquiring lock "3585ec48-d3dc-4467-be42-206c7784d55c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.451 2 DEBUG oslo_concurrency.lockutils [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Lock "3585ec48-d3dc-4467-be42-206c7784d55c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.451 2 DEBUG oslo_concurrency.lockutils [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Lock "3585ec48-d3dc-4467-be42-206c7784d55c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.452 2 DEBUG nova.virt.libvirt.vif [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:23:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-35932657',display_name='tempest-FloatingIPsAssociationTestJSON-server-35932657',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-35932657',id=41,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a09110a6c4d740edbafe08d6c1782a1d',ramdisk_id='',reservation_id='r-ir94xn0u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-FloatingIPsAssociationTestJSON-122452251',owner_user_name='tempest-FloatingIPsAssociationTestJSON-122452251-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:23:15Z,user_data=None,user_id='73ad6b75271d46c4b9b117faafb3e95e',uuid=3585ec48-d3dc-4467-be42-206c7784d55c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b811eba7-d01d-413e-8b04-f60d00b1638e", "address": "fa:16:3e:39:74:45", "network": {"id": "100affc1-3472-41d6-950b-1802616b458e", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-75147163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a09110a6c4d740edbafe08d6c1782a1d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb811eba7-d0", "ovs_interfaceid": "b811eba7-d01d-413e-8b04-f60d00b1638e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.452 2 DEBUG nova.network.os_vif_util [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Converting VIF {"id": "b811eba7-d01d-413e-8b04-f60d00b1638e", "address": "fa:16:3e:39:74:45", "network": {"id": "100affc1-3472-41d6-950b-1802616b458e", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-75147163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a09110a6c4d740edbafe08d6c1782a1d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb811eba7-d0", "ovs_interfaceid": "b811eba7-d01d-413e-8b04-f60d00b1638e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.452 2 DEBUG nova.network.os_vif_util [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:39:74:45,bridge_name='br-int',has_traffic_filtering=True,id=b811eba7-d01d-413e-8b04-f60d00b1638e,network=Network(100affc1-3472-41d6-950b-1802616b458e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb811eba7-d0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.453 2 DEBUG os_vif [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:39:74:45,bridge_name='br-int',has_traffic_filtering=True,id=b811eba7-d01d-413e-8b04-f60d00b1638e,network=Network(100affc1-3472-41d6-950b-1802616b458e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb811eba7-d0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.453 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.454 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.457 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb811eba7-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.458 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb811eba7-d0, col_values=(('external_ids', {'iface-id': 'b811eba7-d01d-413e-8b04-f60d00b1638e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:39:74:45', 'vm-uuid': '3585ec48-d3dc-4467-be42-206c7784d55c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:21 compute-0 NetworkManager[51733]: <info>  [1759267401.4603] manager: (tapb811eba7-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/77)
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.466 2 INFO os_vif [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:39:74:45,bridge_name='br-int',has_traffic_filtering=True,id=b811eba7-d01d-413e-8b04-f60d00b1638e,network=Network(100affc1-3472-41d6-950b-1802616b458e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb811eba7-d0')
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.534 2 DEBUG nova.virt.libvirt.driver [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.534 2 DEBUG nova.virt.libvirt.driver [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.535 2 DEBUG nova.virt.libvirt.driver [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] No VIF found with MAC fa:16:3e:39:74:45, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.535 2 INFO nova.virt.libvirt.driver [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Using config drive
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.713 2 DEBUG nova.compute.manager [req-a1eb3105-584b-40a8-be59-2fc8a27fc9fb req-b2b2ef12-527d-4de8-9f8e-ab4ab4f658b9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Received event network-changed-b811eba7-d01d-413e-8b04-f60d00b1638e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.714 2 DEBUG nova.compute.manager [req-a1eb3105-584b-40a8-be59-2fc8a27fc9fb req-b2b2ef12-527d-4de8-9f8e-ab4ab4f658b9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Refreshing instance network info cache due to event network-changed-b811eba7-d01d-413e-8b04-f60d00b1638e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.714 2 DEBUG oslo_concurrency.lockutils [req-a1eb3105-584b-40a8-be59-2fc8a27fc9fb req-b2b2ef12-527d-4de8-9f8e-ab4ab4f658b9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-3585ec48-d3dc-4467-be42-206c7784d55c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.714 2 DEBUG oslo_concurrency.lockutils [req-a1eb3105-584b-40a8-be59-2fc8a27fc9fb req-b2b2ef12-527d-4de8-9f8e-ab4ab4f658b9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-3585ec48-d3dc-4467-be42-206c7784d55c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.715 2 DEBUG nova.network.neutron [req-a1eb3105-584b-40a8-be59-2fc8a27fc9fb req-b2b2ef12-527d-4de8-9f8e-ab4ab4f658b9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Refreshing network info cache for port b811eba7-d01d-413e-8b04-f60d00b1638e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.800 2 DEBUG nova.compute.manager [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.875 2 DEBUG nova.objects.instance [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Lazy-loading 'pci_requests' on Instance uuid 246eff24-0065-40dd-87cf-c31afb91539f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.891 2 DEBUG nova.objects.instance [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Lazy-loading 'pci_devices' on Instance uuid 246eff24-0065-40dd-87cf-c31afb91539f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.919 2 DEBUG nova.objects.instance [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Lazy-loading 'resources' on Instance uuid 246eff24-0065-40dd-87cf-c31afb91539f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.941 2 DEBUG nova.objects.instance [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Lazy-loading 'migration_context' on Instance uuid 246eff24-0065-40dd-87cf-c31afb91539f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.956 2 DEBUG nova.objects.instance [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.959 2 DEBUG nova.virt.libvirt.driver [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.965 2 INFO nova.virt.libvirt.driver [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Creating config drive at /var/lib/nova/instances/3585ec48-d3dc-4467-be42-206c7784d55c/disk.config
Sep 30 21:23:21 compute-0 nova_compute[192810]: 2025-09-30 21:23:21.971 2 DEBUG oslo_concurrency.processutils [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3585ec48-d3dc-4467-be42-206c7784d55c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpie0zjdg5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:23:22 compute-0 nova_compute[192810]: 2025-09-30 21:23:22.096 2 DEBUG oslo_concurrency.processutils [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3585ec48-d3dc-4467-be42-206c7784d55c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpie0zjdg5" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:23:22 compute-0 kernel: tapb811eba7-d0: entered promiscuous mode
Sep 30 21:23:22 compute-0 ovn_controller[94912]: 2025-09-30T21:23:22Z|00179|binding|INFO|Claiming lport b811eba7-d01d-413e-8b04-f60d00b1638e for this chassis.
Sep 30 21:23:22 compute-0 NetworkManager[51733]: <info>  [1759267402.1726] manager: (tapb811eba7-d0): new Tun device (/org/freedesktop/NetworkManager/Devices/78)
Sep 30 21:23:22 compute-0 ovn_controller[94912]: 2025-09-30T21:23:22Z|00180|binding|INFO|b811eba7-d01d-413e-8b04-f60d00b1638e: Claiming fa:16:3e:39:74:45 10.100.0.7
Sep 30 21:23:22 compute-0 nova_compute[192810]: 2025-09-30 21:23:22.173 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:22.180 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:39:74:45 10.100.0.7'], port_security=['fa:16:3e:39:74:45 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '3585ec48-d3dc-4467-be42-206c7784d55c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-100affc1-3472-41d6-950b-1802616b458e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a09110a6c4d740edbafe08d6c1782a1d', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c3501d94-e703-4b91-b4a2-8e0ccdf2600f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=be15a9ee-86e7-4792-9a15-1cba7298ecf1, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=b811eba7-d01d-413e-8b04-f60d00b1638e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:23:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:22.181 103867 INFO neutron.agent.ovn.metadata.agent [-] Port b811eba7-d01d-413e-8b04-f60d00b1638e in datapath 100affc1-3472-41d6-950b-1802616b458e bound to our chassis
Sep 30 21:23:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:22.182 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 100affc1-3472-41d6-950b-1802616b458e
Sep 30 21:23:22 compute-0 ovn_controller[94912]: 2025-09-30T21:23:22Z|00181|binding|INFO|Setting lport b811eba7-d01d-413e-8b04-f60d00b1638e ovn-installed in OVS
Sep 30 21:23:22 compute-0 ovn_controller[94912]: 2025-09-30T21:23:22Z|00182|binding|INFO|Setting lport b811eba7-d01d-413e-8b04-f60d00b1638e up in Southbound
Sep 30 21:23:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:22.199 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[51150c90-824f-41c0-bcf6-750ed00a44fc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:22 compute-0 nova_compute[192810]: 2025-09-30 21:23:22.201 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:22 compute-0 nova_compute[192810]: 2025-09-30 21:23:22.203 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:22 compute-0 nova_compute[192810]: 2025-09-30 21:23:22.208 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:22 compute-0 systemd-udevd[226205]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:23:22 compute-0 systemd-machined[152794]: New machine qemu-21-instance-00000029.
Sep 30 21:23:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:22.236 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[fc0b08fc-3b49-4a56-9c55-829e5669ad93]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:22 compute-0 systemd[1]: Started Virtual Machine qemu-21-instance-00000029.
Sep 30 21:23:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:22.241 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[71a8de2e-e2d4-48ae-8863-7beaee3a0bb5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:22 compute-0 NetworkManager[51733]: <info>  [1759267402.2509] device (tapb811eba7-d0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:23:22 compute-0 NetworkManager[51733]: <info>  [1759267402.2522] device (tapb811eba7-d0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:23:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:22.282 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[bae92533-6a65-4dc8-8a51-faaa0a8d27dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:22.316 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[c523ff64-3c6c-44b0-8d84-114620c60c5b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap100affc1-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:78:7c:1e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 832, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 832, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 408116, 'reachable_time': 18410, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226216, 'error': None, 'target': 'ovnmeta-100affc1-3472-41d6-950b-1802616b458e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:22.339 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[89b7a8a5-58d6-48ad-82cf-88b689bb87f7]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap100affc1-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 408129, 'tstamp': 408129}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226218, 'error': None, 'target': 'ovnmeta-100affc1-3472-41d6-950b-1802616b458e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap100affc1-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 408132, 'tstamp': 408132}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226218, 'error': None, 'target': 'ovnmeta-100affc1-3472-41d6-950b-1802616b458e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:22.341 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap100affc1-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:22 compute-0 nova_compute[192810]: 2025-09-30 21:23:22.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:22.345 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap100affc1-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:22 compute-0 nova_compute[192810]: 2025-09-30 21:23:22.344 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:22.345 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:23:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:22.345 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap100affc1-30, col_values=(('external_ids', {'iface-id': '3e3eb48b-c12a-4061-b9d9-c9a7ba21a331'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:22.346 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:23:22 compute-0 nova_compute[192810]: 2025-09-30 21:23:22.922 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267402.9210196, 3585ec48-d3dc-4467-be42-206c7784d55c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:23:22 compute-0 nova_compute[192810]: 2025-09-30 21:23:22.922 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] VM Started (Lifecycle Event)
Sep 30 21:23:22 compute-0 nova_compute[192810]: 2025-09-30 21:23:22.966 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:23:22 compute-0 nova_compute[192810]: 2025-09-30 21:23:22.971 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267402.9211788, 3585ec48-d3dc-4467-be42-206c7784d55c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:23:22 compute-0 nova_compute[192810]: 2025-09-30 21:23:22.972 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] VM Paused (Lifecycle Event)
Sep 30 21:23:22 compute-0 nova_compute[192810]: 2025-09-30 21:23:22.988 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:23:22 compute-0 nova_compute[192810]: 2025-09-30 21:23:22.991 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:23:23 compute-0 nova_compute[192810]: 2025-09-30 21:23:23.016 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:23:23 compute-0 nova_compute[192810]: 2025-09-30 21:23:23.565 2 DEBUG nova.network.neutron [req-a1eb3105-584b-40a8-be59-2fc8a27fc9fb req-b2b2ef12-527d-4de8-9f8e-ab4ab4f658b9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Updated VIF entry in instance network info cache for port b811eba7-d01d-413e-8b04-f60d00b1638e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:23:23 compute-0 nova_compute[192810]: 2025-09-30 21:23:23.566 2 DEBUG nova.network.neutron [req-a1eb3105-584b-40a8-be59-2fc8a27fc9fb req-b2b2ef12-527d-4de8-9f8e-ab4ab4f658b9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Updating instance_info_cache with network_info: [{"id": "b811eba7-d01d-413e-8b04-f60d00b1638e", "address": "fa:16:3e:39:74:45", "network": {"id": "100affc1-3472-41d6-950b-1802616b458e", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-75147163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a09110a6c4d740edbafe08d6c1782a1d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb811eba7-d0", "ovs_interfaceid": "b811eba7-d01d-413e-8b04-f60d00b1638e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:23:23 compute-0 nova_compute[192810]: 2025-09-30 21:23:23.585 2 DEBUG oslo_concurrency.lockutils [req-a1eb3105-584b-40a8-be59-2fc8a27fc9fb req-b2b2ef12-527d-4de8-9f8e-ab4ab4f658b9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-3585ec48-d3dc-4467-be42-206c7784d55c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:23:24 compute-0 nova_compute[192810]: 2025-09-30 21:23:24.176 2 DEBUG nova.compute.manager [req-afb2dde6-c27f-4205-aefc-686efceea1f6 req-d2b7a06b-bacb-4208-9b41-126d6a6d1e8e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Received event network-vif-plugged-b811eba7-d01d-413e-8b04-f60d00b1638e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:23:24 compute-0 nova_compute[192810]: 2025-09-30 21:23:24.177 2 DEBUG oslo_concurrency.lockutils [req-afb2dde6-c27f-4205-aefc-686efceea1f6 req-d2b7a06b-bacb-4208-9b41-126d6a6d1e8e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3585ec48-d3dc-4467-be42-206c7784d55c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:24 compute-0 nova_compute[192810]: 2025-09-30 21:23:24.177 2 DEBUG oslo_concurrency.lockutils [req-afb2dde6-c27f-4205-aefc-686efceea1f6 req-d2b7a06b-bacb-4208-9b41-126d6a6d1e8e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3585ec48-d3dc-4467-be42-206c7784d55c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:24 compute-0 nova_compute[192810]: 2025-09-30 21:23:24.177 2 DEBUG oslo_concurrency.lockutils [req-afb2dde6-c27f-4205-aefc-686efceea1f6 req-d2b7a06b-bacb-4208-9b41-126d6a6d1e8e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3585ec48-d3dc-4467-be42-206c7784d55c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:24 compute-0 nova_compute[192810]: 2025-09-30 21:23:24.178 2 DEBUG nova.compute.manager [req-afb2dde6-c27f-4205-aefc-686efceea1f6 req-d2b7a06b-bacb-4208-9b41-126d6a6d1e8e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Processing event network-vif-plugged-b811eba7-d01d-413e-8b04-f60d00b1638e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:23:24 compute-0 nova_compute[192810]: 2025-09-30 21:23:24.178 2 DEBUG nova.compute.manager [req-afb2dde6-c27f-4205-aefc-686efceea1f6 req-d2b7a06b-bacb-4208-9b41-126d6a6d1e8e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Received event network-vif-plugged-b811eba7-d01d-413e-8b04-f60d00b1638e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:23:24 compute-0 nova_compute[192810]: 2025-09-30 21:23:24.178 2 DEBUG oslo_concurrency.lockutils [req-afb2dde6-c27f-4205-aefc-686efceea1f6 req-d2b7a06b-bacb-4208-9b41-126d6a6d1e8e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3585ec48-d3dc-4467-be42-206c7784d55c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:24 compute-0 nova_compute[192810]: 2025-09-30 21:23:24.179 2 DEBUG oslo_concurrency.lockutils [req-afb2dde6-c27f-4205-aefc-686efceea1f6 req-d2b7a06b-bacb-4208-9b41-126d6a6d1e8e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3585ec48-d3dc-4467-be42-206c7784d55c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:24 compute-0 nova_compute[192810]: 2025-09-30 21:23:24.179 2 DEBUG oslo_concurrency.lockutils [req-afb2dde6-c27f-4205-aefc-686efceea1f6 req-d2b7a06b-bacb-4208-9b41-126d6a6d1e8e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3585ec48-d3dc-4467-be42-206c7784d55c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:24 compute-0 nova_compute[192810]: 2025-09-30 21:23:24.179 2 DEBUG nova.compute.manager [req-afb2dde6-c27f-4205-aefc-686efceea1f6 req-d2b7a06b-bacb-4208-9b41-126d6a6d1e8e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] No waiting events found dispatching network-vif-plugged-b811eba7-d01d-413e-8b04-f60d00b1638e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:23:24 compute-0 nova_compute[192810]: 2025-09-30 21:23:24.180 2 WARNING nova.compute.manager [req-afb2dde6-c27f-4205-aefc-686efceea1f6 req-d2b7a06b-bacb-4208-9b41-126d6a6d1e8e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Received unexpected event network-vif-plugged-b811eba7-d01d-413e-8b04-f60d00b1638e for instance with vm_state building and task_state spawning.
Sep 30 21:23:24 compute-0 nova_compute[192810]: 2025-09-30 21:23:24.180 2 DEBUG nova.compute.manager [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:23:24 compute-0 nova_compute[192810]: 2025-09-30 21:23:24.189 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267404.1890547, 3585ec48-d3dc-4467-be42-206c7784d55c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:23:24 compute-0 nova_compute[192810]: 2025-09-30 21:23:24.190 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] VM Resumed (Lifecycle Event)
Sep 30 21:23:24 compute-0 nova_compute[192810]: 2025-09-30 21:23:24.191 2 DEBUG nova.virt.libvirt.driver [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:23:24 compute-0 nova_compute[192810]: 2025-09-30 21:23:24.196 2 INFO nova.virt.libvirt.driver [-] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Instance spawned successfully.
Sep 30 21:23:24 compute-0 nova_compute[192810]: 2025-09-30 21:23:24.196 2 DEBUG nova.virt.libvirt.driver [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:23:24 compute-0 nova_compute[192810]: 2025-09-30 21:23:24.222 2 DEBUG nova.virt.libvirt.driver [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:23:24 compute-0 nova_compute[192810]: 2025-09-30 21:23:24.222 2 DEBUG nova.virt.libvirt.driver [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:23:24 compute-0 nova_compute[192810]: 2025-09-30 21:23:24.224 2 DEBUG nova.virt.libvirt.driver [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:23:24 compute-0 nova_compute[192810]: 2025-09-30 21:23:24.225 2 DEBUG nova.virt.libvirt.driver [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:23:24 compute-0 nova_compute[192810]: 2025-09-30 21:23:24.225 2 DEBUG nova.virt.libvirt.driver [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:23:24 compute-0 nova_compute[192810]: 2025-09-30 21:23:24.226 2 DEBUG nova.virt.libvirt.driver [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:23:24 compute-0 nova_compute[192810]: 2025-09-30 21:23:24.229 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:23:24 compute-0 nova_compute[192810]: 2025-09-30 21:23:24.232 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:23:24 compute-0 nova_compute[192810]: 2025-09-30 21:23:24.257 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:23:24 compute-0 nova_compute[192810]: 2025-09-30 21:23:24.292 2 INFO nova.compute.manager [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Took 8.33 seconds to spawn the instance on the hypervisor.
Sep 30 21:23:24 compute-0 nova_compute[192810]: 2025-09-30 21:23:24.292 2 DEBUG nova.compute.manager [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:23:24 compute-0 podman[226227]: 2025-09-30 21:23:24.32628732 +0000 UTC m=+0.057267306 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 21:23:24 compute-0 podman[226226]: 2025-09-30 21:23:24.334020533 +0000 UTC m=+0.064328703 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS)
Sep 30 21:23:24 compute-0 nova_compute[192810]: 2025-09-30 21:23:24.385 2 INFO nova.compute.manager [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Took 8.94 seconds to build instance.
Sep 30 21:23:24 compute-0 nova_compute[192810]: 2025-09-30 21:23:24.411 2 DEBUG oslo_concurrency.lockutils [None req-f8cbbec6-4fe5-4998-b2f8-9e7628d222fe 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Lock "3585ec48-d3dc-4467-be42-206c7784d55c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.063s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:24 compute-0 sshd-session[226268]: Invalid user ubuntu from 45.81.23.80 port 39876
Sep 30 21:23:24 compute-0 sshd-session[226268]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:23:24 compute-0 sshd-session[226268]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=45.81.23.80
Sep 30 21:23:26 compute-0 nova_compute[192810]: 2025-09-30 21:23:26.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:26 compute-0 sshd-session[226268]: Failed password for invalid user ubuntu from 45.81.23.80 port 39876 ssh2
Sep 30 21:23:26 compute-0 nova_compute[192810]: 2025-09-30 21:23:26.682 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:26 compute-0 NetworkManager[51733]: <info>  [1759267406.6848] manager: (patch-br-int-to-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/79)
Sep 30 21:23:26 compute-0 NetworkManager[51733]: <info>  [1759267406.6854] manager: (patch-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/80)
Sep 30 21:23:26 compute-0 nova_compute[192810]: 2025-09-30 21:23:26.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:26 compute-0 ovn_controller[94912]: 2025-09-30T21:23:26Z|00183|binding|INFO|Releasing lport 3e3eb48b-c12a-4061-b9d9-c9a7ba21a331 from this chassis (sb_readonly=0)
Sep 30 21:23:26 compute-0 nova_compute[192810]: 2025-09-30 21:23:26.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:26 compute-0 nova_compute[192810]: 2025-09-30 21:23:26.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:27 compute-0 sshd-session[226268]: Received disconnect from 45.81.23.80 port 39876:11: Bye Bye [preauth]
Sep 30 21:23:27 compute-0 sshd-session[226268]: Disconnected from invalid user ubuntu 45.81.23.80 port 39876 [preauth]
Sep 30 21:23:28 compute-0 nova_compute[192810]: 2025-09-30 21:23:28.339 2 DEBUG nova.compute.manager [req-2c0ed21a-7d8c-4b57-9cbb-051870065aa9 req-f44f60d1-1d2d-4ed3-8e44-72e4c2a00d6e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Received event network-changed-bea64634-2634-4763-abcc-3935baa9761c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:23:28 compute-0 nova_compute[192810]: 2025-09-30 21:23:28.339 2 DEBUG nova.compute.manager [req-2c0ed21a-7d8c-4b57-9cbb-051870065aa9 req-f44f60d1-1d2d-4ed3-8e44-72e4c2a00d6e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Refreshing instance network info cache due to event network-changed-bea64634-2634-4763-abcc-3935baa9761c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:23:28 compute-0 nova_compute[192810]: 2025-09-30 21:23:28.340 2 DEBUG oslo_concurrency.lockutils [req-2c0ed21a-7d8c-4b57-9cbb-051870065aa9 req-f44f60d1-1d2d-4ed3-8e44-72e4c2a00d6e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-5da25813-a9d3-49a0-80ae-59c06fef8440" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:23:28 compute-0 nova_compute[192810]: 2025-09-30 21:23:28.340 2 DEBUG oslo_concurrency.lockutils [req-2c0ed21a-7d8c-4b57-9cbb-051870065aa9 req-f44f60d1-1d2d-4ed3-8e44-72e4c2a00d6e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-5da25813-a9d3-49a0-80ae-59c06fef8440" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:23:28 compute-0 nova_compute[192810]: 2025-09-30 21:23:28.341 2 DEBUG nova.network.neutron [req-2c0ed21a-7d8c-4b57-9cbb-051870065aa9 req-f44f60d1-1d2d-4ed3-8e44-72e4c2a00d6e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Refreshing network info cache for port bea64634-2634-4763-abcc-3935baa9761c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:23:29 compute-0 nova_compute[192810]: 2025-09-30 21:23:29.847 2 DEBUG nova.network.neutron [req-2c0ed21a-7d8c-4b57-9cbb-051870065aa9 req-f44f60d1-1d2d-4ed3-8e44-72e4c2a00d6e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Updated VIF entry in instance network info cache for port bea64634-2634-4763-abcc-3935baa9761c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:23:29 compute-0 nova_compute[192810]: 2025-09-30 21:23:29.848 2 DEBUG nova.network.neutron [req-2c0ed21a-7d8c-4b57-9cbb-051870065aa9 req-f44f60d1-1d2d-4ed3-8e44-72e4c2a00d6e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Updating instance_info_cache with network_info: [{"id": "bea64634-2634-4763-abcc-3935baa9761c", "address": "fa:16:3e:1c:5c:86", "network": {"id": "100affc1-3472-41d6-950b-1802616b458e", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-75147163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a09110a6c4d740edbafe08d6c1782a1d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbea64634-26", "ovs_interfaceid": "bea64634-2634-4763-abcc-3935baa9761c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:23:29 compute-0 nova_compute[192810]: 2025-09-30 21:23:29.871 2 DEBUG oslo_concurrency.lockutils [req-2c0ed21a-7d8c-4b57-9cbb-051870065aa9 req-f44f60d1-1d2d-4ed3-8e44-72e4c2a00d6e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-5da25813-a9d3-49a0-80ae-59c06fef8440" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:23:30 compute-0 nova_compute[192810]: 2025-09-30 21:23:30.426 2 DEBUG nova.compute.manager [req-56bbd2bb-e7b7-4281-9a62-f76e14a6ceae req-351f948e-1fe4-4466-a28e-5b0e347811cf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Received event network-changed-bea64634-2634-4763-abcc-3935baa9761c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:23:30 compute-0 nova_compute[192810]: 2025-09-30 21:23:30.427 2 DEBUG nova.compute.manager [req-56bbd2bb-e7b7-4281-9a62-f76e14a6ceae req-351f948e-1fe4-4466-a28e-5b0e347811cf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Refreshing instance network info cache due to event network-changed-bea64634-2634-4763-abcc-3935baa9761c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:23:30 compute-0 nova_compute[192810]: 2025-09-30 21:23:30.427 2 DEBUG oslo_concurrency.lockutils [req-56bbd2bb-e7b7-4281-9a62-f76e14a6ceae req-351f948e-1fe4-4466-a28e-5b0e347811cf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-5da25813-a9d3-49a0-80ae-59c06fef8440" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:23:30 compute-0 nova_compute[192810]: 2025-09-30 21:23:30.428 2 DEBUG oslo_concurrency.lockutils [req-56bbd2bb-e7b7-4281-9a62-f76e14a6ceae req-351f948e-1fe4-4466-a28e-5b0e347811cf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-5da25813-a9d3-49a0-80ae-59c06fef8440" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:23:30 compute-0 nova_compute[192810]: 2025-09-30 21:23:30.428 2 DEBUG nova.network.neutron [req-56bbd2bb-e7b7-4281-9a62-f76e14a6ceae req-351f948e-1fe4-4466-a28e-5b0e347811cf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Refreshing network info cache for port bea64634-2634-4763-abcc-3935baa9761c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:23:31 compute-0 nova_compute[192810]: 2025-09-30 21:23:31.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:31 compute-0 nova_compute[192810]: 2025-09-30 21:23:31.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:32 compute-0 nova_compute[192810]: 2025-09-30 21:23:32.057 2 DEBUG nova.virt.libvirt.driver [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Sep 30 21:23:32 compute-0 podman[226282]: 2025-09-30 21:23:32.328411245 +0000 UTC m=+0.062977059 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=edpm, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Sep 30 21:23:32 compute-0 podman[226281]: 2025-09-30 21:23:32.38763289 +0000 UTC m=+0.116187604 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Sep 30 21:23:33 compute-0 nova_compute[192810]: 2025-09-30 21:23:33.090 2 DEBUG nova.network.neutron [req-56bbd2bb-e7b7-4281-9a62-f76e14a6ceae req-351f948e-1fe4-4466-a28e-5b0e347811cf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Updated VIF entry in instance network info cache for port bea64634-2634-4763-abcc-3935baa9761c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:23:33 compute-0 nova_compute[192810]: 2025-09-30 21:23:33.091 2 DEBUG nova.network.neutron [req-56bbd2bb-e7b7-4281-9a62-f76e14a6ceae req-351f948e-1fe4-4466-a28e-5b0e347811cf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Updating instance_info_cache with network_info: [{"id": "bea64634-2634-4763-abcc-3935baa9761c", "address": "fa:16:3e:1c:5c:86", "network": {"id": "100affc1-3472-41d6-950b-1802616b458e", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-75147163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a09110a6c4d740edbafe08d6c1782a1d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbea64634-26", "ovs_interfaceid": "bea64634-2634-4763-abcc-3935baa9761c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:23:33 compute-0 nova_compute[192810]: 2025-09-30 21:23:33.115 2 DEBUG oslo_concurrency.lockutils [req-56bbd2bb-e7b7-4281-9a62-f76e14a6ceae req-351f948e-1fe4-4466-a28e-5b0e347811cf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-5da25813-a9d3-49a0-80ae-59c06fef8440" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:23:33 compute-0 nova_compute[192810]: 2025-09-30 21:23:33.115 2 DEBUG nova.compute.manager [req-56bbd2bb-e7b7-4281-9a62-f76e14a6ceae req-351f948e-1fe4-4466-a28e-5b0e347811cf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Received event network-changed-b811eba7-d01d-413e-8b04-f60d00b1638e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:23:33 compute-0 nova_compute[192810]: 2025-09-30 21:23:33.116 2 DEBUG nova.compute.manager [req-56bbd2bb-e7b7-4281-9a62-f76e14a6ceae req-351f948e-1fe4-4466-a28e-5b0e347811cf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Refreshing instance network info cache due to event network-changed-b811eba7-d01d-413e-8b04-f60d00b1638e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:23:33 compute-0 nova_compute[192810]: 2025-09-30 21:23:33.116 2 DEBUG oslo_concurrency.lockutils [req-56bbd2bb-e7b7-4281-9a62-f76e14a6ceae req-351f948e-1fe4-4466-a28e-5b0e347811cf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-3585ec48-d3dc-4467-be42-206c7784d55c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:23:33 compute-0 nova_compute[192810]: 2025-09-30 21:23:33.116 2 DEBUG oslo_concurrency.lockutils [req-56bbd2bb-e7b7-4281-9a62-f76e14a6ceae req-351f948e-1fe4-4466-a28e-5b0e347811cf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-3585ec48-d3dc-4467-be42-206c7784d55c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:23:33 compute-0 nova_compute[192810]: 2025-09-30 21:23:33.117 2 DEBUG nova.network.neutron [req-56bbd2bb-e7b7-4281-9a62-f76e14a6ceae req-351f948e-1fe4-4466-a28e-5b0e347811cf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Refreshing network info cache for port b811eba7-d01d-413e-8b04-f60d00b1638e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:23:34 compute-0 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d0000002a.scope: Deactivated successfully.
Sep 30 21:23:34 compute-0 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d0000002a.scope: Consumed 12.518s CPU time.
Sep 30 21:23:34 compute-0 systemd-machined[152794]: Machine qemu-20-instance-0000002a terminated.
Sep 30 21:23:34 compute-0 ovn_controller[94912]: 2025-09-30T21:23:34Z|00184|binding|INFO|Releasing lport 3e3eb48b-c12a-4061-b9d9-c9a7ba21a331 from this chassis (sb_readonly=0)
Sep 30 21:23:34 compute-0 nova_compute[192810]: 2025-09-30 21:23:34.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.071 2 INFO nova.virt.libvirt.driver [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Instance shutdown successfully after 13 seconds.
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.081 2 INFO nova.virt.libvirt.driver [-] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Instance destroyed successfully.
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.088 2 INFO nova.virt.libvirt.driver [-] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Instance destroyed successfully.
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.089 2 INFO nova.virt.libvirt.driver [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Deleting instance files /var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f_del
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.090 2 INFO nova.virt.libvirt.driver [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Deletion of /var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f_del complete
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.323 2 DEBUG nova.virt.libvirt.driver [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.324 2 INFO nova.virt.libvirt.driver [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Creating image(s)
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.325 2 DEBUG oslo_concurrency.lockutils [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Acquiring lock "/var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.326 2 DEBUG oslo_concurrency.lockutils [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Lock "/var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.327 2 DEBUG oslo_concurrency.lockutils [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Lock "/var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.360 2 DEBUG oslo_concurrency.processutils [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.440 2 DEBUG oslo_concurrency.processutils [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.441 2 DEBUG oslo_concurrency.lockutils [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Acquiring lock "d794a27f8e0bfa3eee9759fbfddd316a7671c61e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.442 2 DEBUG oslo_concurrency.lockutils [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Lock "d794a27f8e0bfa3eee9759fbfddd316a7671c61e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.458 2 DEBUG oslo_concurrency.processutils [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:23:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:35.468 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:23:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:35.469 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.522 2 DEBUG oslo_concurrency.processutils [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.523 2 DEBUG oslo_concurrency.processutils [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e,backing_fmt=raw /var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.554 2 DEBUG nova.network.neutron [req-56bbd2bb-e7b7-4281-9a62-f76e14a6ceae req-351f948e-1fe4-4466-a28e-5b0e347811cf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Updated VIF entry in instance network info cache for port b811eba7-d01d-413e-8b04-f60d00b1638e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.555 2 DEBUG nova.network.neutron [req-56bbd2bb-e7b7-4281-9a62-f76e14a6ceae req-351f948e-1fe4-4466-a28e-5b0e347811cf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Updating instance_info_cache with network_info: [{"id": "b811eba7-d01d-413e-8b04-f60d00b1638e", "address": "fa:16:3e:39:74:45", "network": {"id": "100affc1-3472-41d6-950b-1802616b458e", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-75147163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a09110a6c4d740edbafe08d6c1782a1d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb811eba7-d0", "ovs_interfaceid": "b811eba7-d01d-413e-8b04-f60d00b1638e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.558 2 DEBUG oslo_concurrency.processutils [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e,backing_fmt=raw /var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.558 2 DEBUG oslo_concurrency.lockutils [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Lock "d794a27f8e0bfa3eee9759fbfddd316a7671c61e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.558 2 DEBUG oslo_concurrency.processutils [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.595 2 DEBUG oslo_concurrency.lockutils [req-56bbd2bb-e7b7-4281-9a62-f76e14a6ceae req-351f948e-1fe4-4466-a28e-5b0e347811cf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-3585ec48-d3dc-4467-be42-206c7784d55c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.613 2 DEBUG oslo_concurrency.processutils [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.614 2 DEBUG nova.virt.disk.api [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Checking if we can resize image /var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.614 2 DEBUG oslo_concurrency.processutils [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.670 2 DEBUG oslo_concurrency.processutils [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.672 2 DEBUG nova.virt.disk.api [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Cannot resize image /var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.672 2 DEBUG nova.virt.libvirt.driver [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.673 2 DEBUG nova.virt.libvirt.driver [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Ensure instance console log exists: /var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.673 2 DEBUG oslo_concurrency.lockutils [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.674 2 DEBUG oslo_concurrency.lockutils [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.674 2 DEBUG oslo_concurrency.lockutils [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.677 2 DEBUG nova.virt.libvirt.driver [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:11Z,direct_url=<?>,disk_format='qcow2',id=29834554-3ec3-4459-bfde-932aa778e979,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:13Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.682 2 WARNING nova.virt.libvirt.driver [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.691 2 DEBUG nova.virt.libvirt.host [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.692 2 DEBUG nova.virt.libvirt.host [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.696 2 DEBUG nova.virt.libvirt.host [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.697 2 DEBUG nova.virt.libvirt.host [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.700 2 DEBUG nova.virt.libvirt.driver [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.700 2 DEBUG nova.virt.hardware [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:11Z,direct_url=<?>,disk_format='qcow2',id=29834554-3ec3-4459-bfde-932aa778e979,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:13Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.701 2 DEBUG nova.virt.hardware [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.702 2 DEBUG nova.virt.hardware [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.703 2 DEBUG nova.virt.hardware [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.703 2 DEBUG nova.virt.hardware [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.703 2 DEBUG nova.virt.hardware [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.704 2 DEBUG nova.virt.hardware [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.705 2 DEBUG nova.virt.hardware [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.705 2 DEBUG nova.virt.hardware [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.706 2 DEBUG nova.virt.hardware [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.706 2 DEBUG nova.virt.hardware [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.707 2 DEBUG nova.objects.instance [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Lazy-loading 'vcpu_model' on Instance uuid 246eff24-0065-40dd-87cf-c31afb91539f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.733 2 DEBUG nova.virt.libvirt.driver [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:23:35 compute-0 nova_compute[192810]:   <uuid>246eff24-0065-40dd-87cf-c31afb91539f</uuid>
Sep 30 21:23:35 compute-0 nova_compute[192810]:   <name>instance-0000002a</name>
Sep 30 21:23:35 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:23:35 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:23:35 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:23:35 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:23:35 compute-0 nova_compute[192810]:       <nova:name>tempest-ServersAdmin275Test-server-1639195055</nova:name>
Sep 30 21:23:35 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:23:35</nova:creationTime>
Sep 30 21:23:35 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:23:35 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:23:35 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:23:35 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:23:35 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:23:35 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:23:35 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:23:35 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:23:35 compute-0 nova_compute[192810]:         <nova:user uuid="80508b28c98a49d09d4a47dd6403e6fe">tempest-ServersAdmin275Test-674052928-project-member</nova:user>
Sep 30 21:23:35 compute-0 nova_compute[192810]:         <nova:project uuid="a5f7cbca41c24c7fac661cac94fdf2aa">tempest-ServersAdmin275Test-674052928</nova:project>
Sep 30 21:23:35 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:23:35 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="29834554-3ec3-4459-bfde-932aa778e979"/>
Sep 30 21:23:35 compute-0 nova_compute[192810]:       <nova:ports/>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:23:35 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:23:35 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:23:35 compute-0 nova_compute[192810]:     <system>
Sep 30 21:23:35 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:23:35 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:23:35 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:23:35 compute-0 nova_compute[192810]:       <entry name="serial">246eff24-0065-40dd-87cf-c31afb91539f</entry>
Sep 30 21:23:35 compute-0 nova_compute[192810]:       <entry name="uuid">246eff24-0065-40dd-87cf-c31afb91539f</entry>
Sep 30 21:23:35 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     </system>
Sep 30 21:23:35 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:23:35 compute-0 nova_compute[192810]:   <os>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:23:35 compute-0 nova_compute[192810]:   </os>
Sep 30 21:23:35 compute-0 nova_compute[192810]:   <features>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:23:35 compute-0 nova_compute[192810]:   </features>
Sep 30 21:23:35 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:23:35 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:23:35 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:23:35 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:23:35 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:23:35 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:23:35 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:23:35 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:23:35 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/disk"/>
Sep 30 21:23:35 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:23:35 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:23:35 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/disk.config"/>
Sep 30 21:23:35 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:23:35 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/console.log" append="off"/>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     <video>
Sep 30 21:23:35 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     </video>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:23:35 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:23:35 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:23:35 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:23:35 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:23:35 compute-0 nova_compute[192810]: </domain>
Sep 30 21:23:35 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.783 2 DEBUG nova.virt.libvirt.driver [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.783 2 DEBUG nova.virt.libvirt.driver [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.784 2 INFO nova.virt.libvirt.driver [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Using config drive
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.799 2 DEBUG nova.objects.instance [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Lazy-loading 'ec2_ids' on Instance uuid 246eff24-0065-40dd-87cf-c31afb91539f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:23:35 compute-0 nova_compute[192810]: 2025-09-30 21:23:35.827 2 DEBUG nova.objects.instance [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Lazy-loading 'keypairs' on Instance uuid 246eff24-0065-40dd-87cf-c31afb91539f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:23:36 compute-0 nova_compute[192810]: 2025-09-30 21:23:36.240 2 INFO nova.virt.libvirt.driver [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Creating config drive at /var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/disk.config
Sep 30 21:23:36 compute-0 nova_compute[192810]: 2025-09-30 21:23:36.244 2 DEBUG oslo_concurrency.processutils [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpyloskhat execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:23:36 compute-0 nova_compute[192810]: 2025-09-30 21:23:36.382 2 DEBUG oslo_concurrency.processutils [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpyloskhat" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:23:36 compute-0 ovn_controller[94912]: 2025-09-30T21:23:36Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:39:74:45 10.100.0.7
Sep 30 21:23:36 compute-0 ovn_controller[94912]: 2025-09-30T21:23:36Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:39:74:45 10.100.0.7
Sep 30 21:23:36 compute-0 nova_compute[192810]: 2025-09-30 21:23:36.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:36 compute-0 systemd-machined[152794]: New machine qemu-22-instance-0000002a.
Sep 30 21:23:36 compute-0 systemd[1]: Started Virtual Machine qemu-22-instance-0000002a.
Sep 30 21:23:36 compute-0 nova_compute[192810]: 2025-09-30 21:23:36.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:36 compute-0 nova_compute[192810]: 2025-09-30 21:23:36.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:23:37 compute-0 nova_compute[192810]: 2025-09-30 21:23:37.334 2 DEBUG nova.virt.libvirt.host [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Removed pending event for 246eff24-0065-40dd-87cf-c31afb91539f due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Sep 30 21:23:37 compute-0 nova_compute[192810]: 2025-09-30 21:23:37.335 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267417.33409, 246eff24-0065-40dd-87cf-c31afb91539f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:23:37 compute-0 nova_compute[192810]: 2025-09-30 21:23:37.336 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] VM Resumed (Lifecycle Event)
Sep 30 21:23:37 compute-0 nova_compute[192810]: 2025-09-30 21:23:37.341 2 DEBUG nova.compute.manager [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:23:37 compute-0 nova_compute[192810]: 2025-09-30 21:23:37.342 2 DEBUG nova.virt.libvirt.driver [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:23:37 compute-0 nova_compute[192810]: 2025-09-30 21:23:37.347 2 INFO nova.virt.libvirt.driver [-] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Instance spawned successfully.
Sep 30 21:23:37 compute-0 nova_compute[192810]: 2025-09-30 21:23:37.347 2 DEBUG nova.virt.libvirt.driver [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:23:37 compute-0 nova_compute[192810]: 2025-09-30 21:23:37.378 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:23:37 compute-0 nova_compute[192810]: 2025-09-30 21:23:37.383 2 DEBUG nova.virt.libvirt.driver [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:23:37 compute-0 nova_compute[192810]: 2025-09-30 21:23:37.384 2 DEBUG nova.virt.libvirt.driver [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:23:37 compute-0 nova_compute[192810]: 2025-09-30 21:23:37.385 2 DEBUG nova.virt.libvirt.driver [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:23:37 compute-0 nova_compute[192810]: 2025-09-30 21:23:37.385 2 DEBUG nova.virt.libvirt.driver [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:23:37 compute-0 nova_compute[192810]: 2025-09-30 21:23:37.386 2 DEBUG nova.virt.libvirt.driver [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:23:37 compute-0 nova_compute[192810]: 2025-09-30 21:23:37.387 2 DEBUG nova.virt.libvirt.driver [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:23:37 compute-0 nova_compute[192810]: 2025-09-30 21:23:37.394 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:23:37 compute-0 nova_compute[192810]: 2025-09-30 21:23:37.437 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Sep 30 21:23:37 compute-0 nova_compute[192810]: 2025-09-30 21:23:37.438 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267417.3354409, 246eff24-0065-40dd-87cf-c31afb91539f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:23:37 compute-0 nova_compute[192810]: 2025-09-30 21:23:37.438 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] VM Started (Lifecycle Event)
Sep 30 21:23:37 compute-0 nova_compute[192810]: 2025-09-30 21:23:37.461 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:23:37 compute-0 nova_compute[192810]: 2025-09-30 21:23:37.465 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:23:37 compute-0 nova_compute[192810]: 2025-09-30 21:23:37.487 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Sep 30 21:23:37 compute-0 nova_compute[192810]: 2025-09-30 21:23:37.496 2 DEBUG nova.compute.manager [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:23:37 compute-0 nova_compute[192810]: 2025-09-30 21:23:37.599 2 DEBUG oslo_concurrency.lockutils [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:37 compute-0 nova_compute[192810]: 2025-09-30 21:23:37.600 2 DEBUG oslo_concurrency.lockutils [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:37 compute-0 nova_compute[192810]: 2025-09-30 21:23:37.600 2 DEBUG nova.objects.instance [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Sep 30 21:23:37 compute-0 nova_compute[192810]: 2025-09-30 21:23:37.683 2 DEBUG oslo_concurrency.lockutils [None req-2a638904-d539-40b6-98fd-e0ad6d11c5fb 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.083s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:38 compute-0 podman[226390]: 2025-09-30 21:23:38.349022825 +0000 UTC m=+0.077889171 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923)
Sep 30 21:23:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:38.727 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:38.728 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:38.729 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:39 compute-0 nova_compute[192810]: 2025-09-30 21:23:39.786 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:23:39 compute-0 nova_compute[192810]: 2025-09-30 21:23:39.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:23:40 compute-0 nova_compute[192810]: 2025-09-30 21:23:40.251 2 INFO nova.compute.manager [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Rebuilding instance
Sep 30 21:23:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:40.470 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:40 compute-0 nova_compute[192810]: 2025-09-30 21:23:40.591 2 DEBUG nova.compute.manager [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:23:40 compute-0 nova_compute[192810]: 2025-09-30 21:23:40.685 2 DEBUG nova.objects.instance [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] Lazy-loading 'pci_requests' on Instance uuid 246eff24-0065-40dd-87cf-c31afb91539f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:23:40 compute-0 nova_compute[192810]: 2025-09-30 21:23:40.696 2 DEBUG nova.objects.instance [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] Lazy-loading 'pci_devices' on Instance uuid 246eff24-0065-40dd-87cf-c31afb91539f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:23:40 compute-0 nova_compute[192810]: 2025-09-30 21:23:40.711 2 DEBUG nova.objects.instance [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] Lazy-loading 'resources' on Instance uuid 246eff24-0065-40dd-87cf-c31afb91539f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:23:40 compute-0 nova_compute[192810]: 2025-09-30 21:23:40.723 2 DEBUG nova.objects.instance [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] Lazy-loading 'migration_context' on Instance uuid 246eff24-0065-40dd-87cf-c31afb91539f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:23:40 compute-0 nova_compute[192810]: 2025-09-30 21:23:40.734 2 DEBUG nova.objects.instance [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Sep 30 21:23:40 compute-0 nova_compute[192810]: 2025-09-30 21:23:40.737 2 DEBUG nova.virt.libvirt.driver [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Sep 30 21:23:40 compute-0 nova_compute[192810]: 2025-09-30 21:23:40.783 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:23:40 compute-0 nova_compute[192810]: 2025-09-30 21:23:40.807 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:23:40 compute-0 nova_compute[192810]: 2025-09-30 21:23:40.808 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:23:41 compute-0 nova_compute[192810]: 2025-09-30 21:23:41.483 2 DEBUG nova.compute.manager [req-71943b38-6fcf-48e3-814b-55cc00fae2f2 req-24ae1db1-3c4e-4429-8d9e-8528e1410273 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Received event network-changed-b811eba7-d01d-413e-8b04-f60d00b1638e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:23:41 compute-0 nova_compute[192810]: 2025-09-30 21:23:41.483 2 DEBUG nova.compute.manager [req-71943b38-6fcf-48e3-814b-55cc00fae2f2 req-24ae1db1-3c4e-4429-8d9e-8528e1410273 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Refreshing instance network info cache due to event network-changed-b811eba7-d01d-413e-8b04-f60d00b1638e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:23:41 compute-0 nova_compute[192810]: 2025-09-30 21:23:41.483 2 DEBUG oslo_concurrency.lockutils [req-71943b38-6fcf-48e3-814b-55cc00fae2f2 req-24ae1db1-3c4e-4429-8d9e-8528e1410273 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-3585ec48-d3dc-4467-be42-206c7784d55c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:23:41 compute-0 nova_compute[192810]: 2025-09-30 21:23:41.483 2 DEBUG oslo_concurrency.lockutils [req-71943b38-6fcf-48e3-814b-55cc00fae2f2 req-24ae1db1-3c4e-4429-8d9e-8528e1410273 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-3585ec48-d3dc-4467-be42-206c7784d55c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:23:41 compute-0 nova_compute[192810]: 2025-09-30 21:23:41.484 2 DEBUG nova.network.neutron [req-71943b38-6fcf-48e3-814b-55cc00fae2f2 req-24ae1db1-3c4e-4429-8d9e-8528e1410273 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Refreshing network info cache for port b811eba7-d01d-413e-8b04-f60d00b1638e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:23:41 compute-0 nova_compute[192810]: 2025-09-30 21:23:41.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:41 compute-0 nova_compute[192810]: 2025-09-30 21:23:41.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:41 compute-0 nova_compute[192810]: 2025-09-30 21:23:41.786 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:23:41 compute-0 nova_compute[192810]: 2025-09-30 21:23:41.828 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:41 compute-0 nova_compute[192810]: 2025-09-30 21:23:41.829 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:41 compute-0 nova_compute[192810]: 2025-09-30 21:23:41.829 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:41 compute-0 nova_compute[192810]: 2025-09-30 21:23:41.829 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:23:41 compute-0 nova_compute[192810]: 2025-09-30 21:23:41.919 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5da25813-a9d3-49a0-80ae-59c06fef8440/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:23:41 compute-0 nova_compute[192810]: 2025-09-30 21:23:41.996 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5da25813-a9d3-49a0-80ae-59c06fef8440/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:23:41 compute-0 nova_compute[192810]: 2025-09-30 21:23:41.998 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5da25813-a9d3-49a0-80ae-59c06fef8440/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:23:42 compute-0 nova_compute[192810]: 2025-09-30 21:23:42.053 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5da25813-a9d3-49a0-80ae-59c06fef8440/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:23:42 compute-0 nova_compute[192810]: 2025-09-30 21:23:42.062 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3585ec48-d3dc-4467-be42-206c7784d55c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:23:42 compute-0 nova_compute[192810]: 2025-09-30 21:23:42.120 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3585ec48-d3dc-4467-be42-206c7784d55c/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:23:42 compute-0 nova_compute[192810]: 2025-09-30 21:23:42.121 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3585ec48-d3dc-4467-be42-206c7784d55c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:23:42 compute-0 nova_compute[192810]: 2025-09-30 21:23:42.178 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3585ec48-d3dc-4467-be42-206c7784d55c/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:23:42 compute-0 nova_compute[192810]: 2025-09-30 21:23:42.185 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:23:42 compute-0 nova_compute[192810]: 2025-09-30 21:23:42.255 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:23:42 compute-0 nova_compute[192810]: 2025-09-30 21:23:42.256 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:23:42 compute-0 nova_compute[192810]: 2025-09-30 21:23:42.313 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:23:42 compute-0 nova_compute[192810]: 2025-09-30 21:23:42.537 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:23:42 compute-0 nova_compute[192810]: 2025-09-30 21:23:42.539 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5255MB free_disk=73.36995697021484GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:23:42 compute-0 nova_compute[192810]: 2025-09-30 21:23:42.540 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:42 compute-0 nova_compute[192810]: 2025-09-30 21:23:42.541 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:42 compute-0 nova_compute[192810]: 2025-09-30 21:23:42.614 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Instance 5da25813-a9d3-49a0-80ae-59c06fef8440 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:23:42 compute-0 nova_compute[192810]: 2025-09-30 21:23:42.614 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Instance 3585ec48-d3dc-4467-be42-206c7784d55c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:23:42 compute-0 nova_compute[192810]: 2025-09-30 21:23:42.615 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Instance 246eff24-0065-40dd-87cf-c31afb91539f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:23:42 compute-0 nova_compute[192810]: 2025-09-30 21:23:42.615 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:23:42 compute-0 nova_compute[192810]: 2025-09-30 21:23:42.616 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:23:42 compute-0 nova_compute[192810]: 2025-09-30 21:23:42.702 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:23:42 compute-0 nova_compute[192810]: 2025-09-30 21:23:42.717 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:23:42 compute-0 nova_compute[192810]: 2025-09-30 21:23:42.742 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:23:42 compute-0 nova_compute[192810]: 2025-09-30 21:23:42.742 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.202s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:43 compute-0 nova_compute[192810]: 2025-09-30 21:23:43.401 2 DEBUG nova.network.neutron [req-71943b38-6fcf-48e3-814b-55cc00fae2f2 req-24ae1db1-3c4e-4429-8d9e-8528e1410273 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Updated VIF entry in instance network info cache for port b811eba7-d01d-413e-8b04-f60d00b1638e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:23:43 compute-0 nova_compute[192810]: 2025-09-30 21:23:43.402 2 DEBUG nova.network.neutron [req-71943b38-6fcf-48e3-814b-55cc00fae2f2 req-24ae1db1-3c4e-4429-8d9e-8528e1410273 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Updating instance_info_cache with network_info: [{"id": "b811eba7-d01d-413e-8b04-f60d00b1638e", "address": "fa:16:3e:39:74:45", "network": {"id": "100affc1-3472-41d6-950b-1802616b458e", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-75147163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a09110a6c4d740edbafe08d6c1782a1d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb811eba7-d0", "ovs_interfaceid": "b811eba7-d01d-413e-8b04-f60d00b1638e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:23:43 compute-0 nova_compute[192810]: 2025-09-30 21:23:43.418 2 DEBUG oslo_concurrency.lockutils [req-71943b38-6fcf-48e3-814b-55cc00fae2f2 req-24ae1db1-3c4e-4429-8d9e-8528e1410273 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-3585ec48-d3dc-4467-be42-206c7784d55c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:23:43 compute-0 nova_compute[192810]: 2025-09-30 21:23:43.597 2 DEBUG oslo_concurrency.lockutils [None req-87216d43-a65d-421a-9a5a-a92c53a01692 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Acquiring lock "3585ec48-d3dc-4467-be42-206c7784d55c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:43 compute-0 nova_compute[192810]: 2025-09-30 21:23:43.598 2 DEBUG oslo_concurrency.lockutils [None req-87216d43-a65d-421a-9a5a-a92c53a01692 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Lock "3585ec48-d3dc-4467-be42-206c7784d55c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:43 compute-0 nova_compute[192810]: 2025-09-30 21:23:43.599 2 DEBUG oslo_concurrency.lockutils [None req-87216d43-a65d-421a-9a5a-a92c53a01692 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Acquiring lock "3585ec48-d3dc-4467-be42-206c7784d55c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:43 compute-0 nova_compute[192810]: 2025-09-30 21:23:43.599 2 DEBUG oslo_concurrency.lockutils [None req-87216d43-a65d-421a-9a5a-a92c53a01692 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Lock "3585ec48-d3dc-4467-be42-206c7784d55c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:43 compute-0 nova_compute[192810]: 2025-09-30 21:23:43.600 2 DEBUG oslo_concurrency.lockutils [None req-87216d43-a65d-421a-9a5a-a92c53a01692 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Lock "3585ec48-d3dc-4467-be42-206c7784d55c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:43 compute-0 nova_compute[192810]: 2025-09-30 21:23:43.623 2 INFO nova.compute.manager [None req-87216d43-a65d-421a-9a5a-a92c53a01692 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Terminating instance
Sep 30 21:23:43 compute-0 nova_compute[192810]: 2025-09-30 21:23:43.645 2 DEBUG nova.compute.manager [None req-87216d43-a65d-421a-9a5a-a92c53a01692 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:23:43 compute-0 kernel: tapb811eba7-d0 (unregistering): left promiscuous mode
Sep 30 21:23:43 compute-0 NetworkManager[51733]: <info>  [1759267423.6759] device (tapb811eba7-d0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:23:43 compute-0 nova_compute[192810]: 2025-09-30 21:23:43.739 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:23:43 compute-0 ovn_controller[94912]: 2025-09-30T21:23:43Z|00185|binding|INFO|Releasing lport b811eba7-d01d-413e-8b04-f60d00b1638e from this chassis (sb_readonly=0)
Sep 30 21:23:43 compute-0 ovn_controller[94912]: 2025-09-30T21:23:43Z|00186|binding|INFO|Setting lport b811eba7-d01d-413e-8b04-f60d00b1638e down in Southbound
Sep 30 21:23:43 compute-0 ovn_controller[94912]: 2025-09-30T21:23:43Z|00187|binding|INFO|Removing iface tapb811eba7-d0 ovn-installed in OVS
Sep 30 21:23:43 compute-0 nova_compute[192810]: 2025-09-30 21:23:43.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:43 compute-0 nova_compute[192810]: 2025-09-30 21:23:43.761 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:43.774 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:39:74:45 10.100.0.7'], port_security=['fa:16:3e:39:74:45 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '3585ec48-d3dc-4467-be42-206c7784d55c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-100affc1-3472-41d6-950b-1802616b458e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a09110a6c4d740edbafe08d6c1782a1d', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c3501d94-e703-4b91-b4a2-8e0ccdf2600f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=be15a9ee-86e7-4792-9a15-1cba7298ecf1, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=b811eba7-d01d-413e-8b04-f60d00b1638e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:23:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:43.775 103867 INFO neutron.agent.ovn.metadata.agent [-] Port b811eba7-d01d-413e-8b04-f60d00b1638e in datapath 100affc1-3472-41d6-950b-1802616b458e unbound from our chassis
Sep 30 21:23:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:43.777 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 100affc1-3472-41d6-950b-1802616b458e
Sep 30 21:23:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:43.793 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[25cc03c3-5676-45ed-8d9a-5e51898d089c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:43 compute-0 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d00000029.scope: Deactivated successfully.
Sep 30 21:23:43 compute-0 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d00000029.scope: Consumed 11.810s CPU time.
Sep 30 21:23:43 compute-0 systemd-machined[152794]: Machine qemu-21-instance-00000029 terminated.
Sep 30 21:23:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:43.823 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[f3db0b3c-1ce0-4b56-ab3b-ad2c4df914f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:43.827 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[a3b981bf-3f00-4663-9adf-8273f2e9b7d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:43.855 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[75c5e485-150a-4763-8903-16f6bce888e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:43.881 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e4a55237-f1ba-4e59-acc4-1c5cd29148ea]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap100affc1-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:78:7c:1e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 7, 'rx_bytes': 1000, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 7, 'rx_bytes': 1000, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 408116, 'reachable_time': 18410, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226441, 'error': None, 'target': 'ovnmeta-100affc1-3472-41d6-950b-1802616b458e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:43 compute-0 nova_compute[192810]: 2025-09-30 21:23:43.910 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:43.905 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[b0f6c93e-f76a-45a1-98f4-38f10db38f65]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap100affc1-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 408129, 'tstamp': 408129}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226453, 'error': None, 'target': 'ovnmeta-100affc1-3472-41d6-950b-1802616b458e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap100affc1-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 408132, 'tstamp': 408132}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226453, 'error': None, 'target': 'ovnmeta-100affc1-3472-41d6-950b-1802616b458e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:43.908 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap100affc1-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:43 compute-0 nova_compute[192810]: 2025-09-30 21:23:43.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:43.917 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap100affc1-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:43.918 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:23:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:43.918 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap100affc1-30, col_values=(('external_ids', {'iface-id': '3e3eb48b-c12a-4061-b9d9-c9a7ba21a331'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:43.918 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:23:43 compute-0 nova_compute[192810]: 2025-09-30 21:23:43.920 2 INFO nova.virt.libvirt.driver [-] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Instance destroyed successfully.
Sep 30 21:23:43 compute-0 nova_compute[192810]: 2025-09-30 21:23:43.921 2 DEBUG nova.objects.instance [None req-87216d43-a65d-421a-9a5a-a92c53a01692 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Lazy-loading 'resources' on Instance uuid 3585ec48-d3dc-4467-be42-206c7784d55c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:23:43 compute-0 nova_compute[192810]: 2025-09-30 21:23:43.937 2 DEBUG nova.virt.libvirt.vif [None req-87216d43-a65d-421a-9a5a-a92c53a01692 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:23:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-35932657',display_name='tempest-FloatingIPsAssociationTestJSON-server-35932657',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-35932657',id=41,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:23:24Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a09110a6c4d740edbafe08d6c1782a1d',ramdisk_id='',reservation_id='r-ir94xn0u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIPsAssociationTestJSON-122452251',owner_user_name='tempest-FloatingIPsAssociationTestJSON-122452251-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:23:24Z,user_data=None,user_id='73ad6b75271d46c4b9b117faafb3e95e',uuid=3585ec48-d3dc-4467-be42-206c7784d55c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b811eba7-d01d-413e-8b04-f60d00b1638e", "address": "fa:16:3e:39:74:45", "network": {"id": "100affc1-3472-41d6-950b-1802616b458e", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-75147163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a09110a6c4d740edbafe08d6c1782a1d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb811eba7-d0", "ovs_interfaceid": "b811eba7-d01d-413e-8b04-f60d00b1638e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:23:43 compute-0 nova_compute[192810]: 2025-09-30 21:23:43.938 2 DEBUG nova.network.os_vif_util [None req-87216d43-a65d-421a-9a5a-a92c53a01692 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Converting VIF {"id": "b811eba7-d01d-413e-8b04-f60d00b1638e", "address": "fa:16:3e:39:74:45", "network": {"id": "100affc1-3472-41d6-950b-1802616b458e", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-75147163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a09110a6c4d740edbafe08d6c1782a1d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb811eba7-d0", "ovs_interfaceid": "b811eba7-d01d-413e-8b04-f60d00b1638e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:23:43 compute-0 nova_compute[192810]: 2025-09-30 21:23:43.938 2 DEBUG nova.network.os_vif_util [None req-87216d43-a65d-421a-9a5a-a92c53a01692 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:39:74:45,bridge_name='br-int',has_traffic_filtering=True,id=b811eba7-d01d-413e-8b04-f60d00b1638e,network=Network(100affc1-3472-41d6-950b-1802616b458e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb811eba7-d0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:23:43 compute-0 nova_compute[192810]: 2025-09-30 21:23:43.939 2 DEBUG os_vif [None req-87216d43-a65d-421a-9a5a-a92c53a01692 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:39:74:45,bridge_name='br-int',has_traffic_filtering=True,id=b811eba7-d01d-413e-8b04-f60d00b1638e,network=Network(100affc1-3472-41d6-950b-1802616b458e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb811eba7-d0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:23:43 compute-0 nova_compute[192810]: 2025-09-30 21:23:43.940 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:43 compute-0 nova_compute[192810]: 2025-09-30 21:23:43.941 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb811eba7-d0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:43 compute-0 nova_compute[192810]: 2025-09-30 21:23:43.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:43 compute-0 nova_compute[192810]: 2025-09-30 21:23:43.944 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:23:43 compute-0 nova_compute[192810]: 2025-09-30 21:23:43.946 2 INFO os_vif [None req-87216d43-a65d-421a-9a5a-a92c53a01692 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:39:74:45,bridge_name='br-int',has_traffic_filtering=True,id=b811eba7-d01d-413e-8b04-f60d00b1638e,network=Network(100affc1-3472-41d6-950b-1802616b458e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb811eba7-d0')
Sep 30 21:23:43 compute-0 nova_compute[192810]: 2025-09-30 21:23:43.947 2 INFO nova.virt.libvirt.driver [None req-87216d43-a65d-421a-9a5a-a92c53a01692 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Deleting instance files /var/lib/nova/instances/3585ec48-d3dc-4467-be42-206c7784d55c_del
Sep 30 21:23:43 compute-0 nova_compute[192810]: 2025-09-30 21:23:43.948 2 INFO nova.virt.libvirt.driver [None req-87216d43-a65d-421a-9a5a-a92c53a01692 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Deletion of /var/lib/nova/instances/3585ec48-d3dc-4467-be42-206c7784d55c_del complete
Sep 30 21:23:44 compute-0 nova_compute[192810]: 2025-09-30 21:23:44.020 2 INFO nova.compute.manager [None req-87216d43-a65d-421a-9a5a-a92c53a01692 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Took 0.37 seconds to destroy the instance on the hypervisor.
Sep 30 21:23:44 compute-0 nova_compute[192810]: 2025-09-30 21:23:44.021 2 DEBUG oslo.service.loopingcall [None req-87216d43-a65d-421a-9a5a-a92c53a01692 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:23:44 compute-0 nova_compute[192810]: 2025-09-30 21:23:44.021 2 DEBUG nova.compute.manager [-] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:23:44 compute-0 nova_compute[192810]: 2025-09-30 21:23:44.021 2 DEBUG nova.network.neutron [-] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:23:44 compute-0 nova_compute[192810]: 2025-09-30 21:23:44.779 2 DEBUG nova.compute.manager [req-38c207b4-74f8-4966-90c6-50370eacfe51 req-d724496f-14c9-4383-a748-f485624cabb0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Received event network-vif-unplugged-b811eba7-d01d-413e-8b04-f60d00b1638e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:23:44 compute-0 nova_compute[192810]: 2025-09-30 21:23:44.780 2 DEBUG oslo_concurrency.lockutils [req-38c207b4-74f8-4966-90c6-50370eacfe51 req-d724496f-14c9-4383-a748-f485624cabb0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3585ec48-d3dc-4467-be42-206c7784d55c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:44 compute-0 nova_compute[192810]: 2025-09-30 21:23:44.781 2 DEBUG oslo_concurrency.lockutils [req-38c207b4-74f8-4966-90c6-50370eacfe51 req-d724496f-14c9-4383-a748-f485624cabb0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3585ec48-d3dc-4467-be42-206c7784d55c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:44 compute-0 nova_compute[192810]: 2025-09-30 21:23:44.781 2 DEBUG oslo_concurrency.lockutils [req-38c207b4-74f8-4966-90c6-50370eacfe51 req-d724496f-14c9-4383-a748-f485624cabb0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3585ec48-d3dc-4467-be42-206c7784d55c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:44 compute-0 nova_compute[192810]: 2025-09-30 21:23:44.781 2 DEBUG nova.compute.manager [req-38c207b4-74f8-4966-90c6-50370eacfe51 req-d724496f-14c9-4383-a748-f485624cabb0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] No waiting events found dispatching network-vif-unplugged-b811eba7-d01d-413e-8b04-f60d00b1638e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:23:44 compute-0 nova_compute[192810]: 2025-09-30 21:23:44.782 2 DEBUG nova.compute.manager [req-38c207b4-74f8-4966-90c6-50370eacfe51 req-d724496f-14c9-4383-a748-f485624cabb0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Received event network-vif-unplugged-b811eba7-d01d-413e-8b04-f60d00b1638e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:23:44 compute-0 nova_compute[192810]: 2025-09-30 21:23:44.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:23:44 compute-0 nova_compute[192810]: 2025-09-30 21:23:44.787 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:23:44 compute-0 nova_compute[192810]: 2025-09-30 21:23:44.787 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:23:44 compute-0 nova_compute[192810]: 2025-09-30 21:23:44.808 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875
Sep 30 21:23:45 compute-0 nova_compute[192810]: 2025-09-30 21:23:45.039 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "refresh_cache-5da25813-a9d3-49a0-80ae-59c06fef8440" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:23:45 compute-0 nova_compute[192810]: 2025-09-30 21:23:45.039 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquired lock "refresh_cache-5da25813-a9d3-49a0-80ae-59c06fef8440" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:23:45 compute-0 nova_compute[192810]: 2025-09-30 21:23:45.040 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Sep 30 21:23:45 compute-0 nova_compute[192810]: 2025-09-30 21:23:45.040 2 DEBUG nova.objects.instance [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lazy-loading 'info_cache' on Instance uuid 5da25813-a9d3-49a0-80ae-59c06fef8440 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:23:45 compute-0 nova_compute[192810]: 2025-09-30 21:23:45.123 2 DEBUG nova.network.neutron [-] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:23:45 compute-0 nova_compute[192810]: 2025-09-30 21:23:45.150 2 INFO nova.compute.manager [-] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Took 1.13 seconds to deallocate network for instance.
Sep 30 21:23:45 compute-0 nova_compute[192810]: 2025-09-30 21:23:45.238 2 DEBUG nova.compute.manager [req-f986ccf4-dfc2-4195-9d77-4799bb590806 req-f313317b-cc5f-4677-b830-30fa4a85aaa5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Received event network-vif-deleted-b811eba7-d01d-413e-8b04-f60d00b1638e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:23:45 compute-0 nova_compute[192810]: 2025-09-30 21:23:45.274 2 DEBUG oslo_concurrency.lockutils [None req-87216d43-a65d-421a-9a5a-a92c53a01692 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:45 compute-0 nova_compute[192810]: 2025-09-30 21:23:45.274 2 DEBUG oslo_concurrency.lockutils [None req-87216d43-a65d-421a-9a5a-a92c53a01692 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:45 compute-0 nova_compute[192810]: 2025-09-30 21:23:45.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:45 compute-0 podman[226460]: 2025-09-30 21:23:45.342575157 +0000 UTC m=+0.071068492 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, config_id=edpm, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, version=9.6, architecture=x86_64, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., io.buildah.version=1.33.7, release=1755695350, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 21:23:45 compute-0 podman[226459]: 2025-09-30 21:23:45.345274964 +0000 UTC m=+0.084787092 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Sep 30 21:23:45 compute-0 nova_compute[192810]: 2025-09-30 21:23:45.431 2 DEBUG nova.compute.provider_tree [None req-87216d43-a65d-421a-9a5a-a92c53a01692 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:23:45 compute-0 nova_compute[192810]: 2025-09-30 21:23:45.450 2 DEBUG nova.scheduler.client.report [None req-87216d43-a65d-421a-9a5a-a92c53a01692 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:23:45 compute-0 nova_compute[192810]: 2025-09-30 21:23:45.479 2 DEBUG oslo_concurrency.lockutils [None req-87216d43-a65d-421a-9a5a-a92c53a01692 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.205s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:45 compute-0 nova_compute[192810]: 2025-09-30 21:23:45.503 2 INFO nova.scheduler.client.report [None req-87216d43-a65d-421a-9a5a-a92c53a01692 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Deleted allocations for instance 3585ec48-d3dc-4467-be42-206c7784d55c
Sep 30 21:23:45 compute-0 nova_compute[192810]: 2025-09-30 21:23:45.608 2 DEBUG oslo_concurrency.lockutils [None req-87216d43-a65d-421a-9a5a-a92c53a01692 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Lock "3585ec48-d3dc-4467-be42-206c7784d55c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.010s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:46 compute-0 nova_compute[192810]: 2025-09-30 21:23:46.736 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:46 compute-0 nova_compute[192810]: 2025-09-30 21:23:46.904 2 DEBUG nova.compute.manager [req-9f1afe25-86b6-4679-9500-479f95c29e53 req-88741ceb-ed5b-483d-8cec-500acbb7e914 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Received event network-vif-plugged-b811eba7-d01d-413e-8b04-f60d00b1638e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:23:46 compute-0 nova_compute[192810]: 2025-09-30 21:23:46.905 2 DEBUG oslo_concurrency.lockutils [req-9f1afe25-86b6-4679-9500-479f95c29e53 req-88741ceb-ed5b-483d-8cec-500acbb7e914 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3585ec48-d3dc-4467-be42-206c7784d55c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:46 compute-0 nova_compute[192810]: 2025-09-30 21:23:46.905 2 DEBUG oslo_concurrency.lockutils [req-9f1afe25-86b6-4679-9500-479f95c29e53 req-88741ceb-ed5b-483d-8cec-500acbb7e914 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3585ec48-d3dc-4467-be42-206c7784d55c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:46 compute-0 nova_compute[192810]: 2025-09-30 21:23:46.905 2 DEBUG oslo_concurrency.lockutils [req-9f1afe25-86b6-4679-9500-479f95c29e53 req-88741ceb-ed5b-483d-8cec-500acbb7e914 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3585ec48-d3dc-4467-be42-206c7784d55c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:46 compute-0 nova_compute[192810]: 2025-09-30 21:23:46.906 2 DEBUG nova.compute.manager [req-9f1afe25-86b6-4679-9500-479f95c29e53 req-88741ceb-ed5b-483d-8cec-500acbb7e914 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] No waiting events found dispatching network-vif-plugged-b811eba7-d01d-413e-8b04-f60d00b1638e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:23:46 compute-0 nova_compute[192810]: 2025-09-30 21:23:46.906 2 WARNING nova.compute.manager [req-9f1afe25-86b6-4679-9500-479f95c29e53 req-88741ceb-ed5b-483d-8cec-500acbb7e914 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Received unexpected event network-vif-plugged-b811eba7-d01d-413e-8b04-f60d00b1638e for instance with vm_state deleted and task_state None.
Sep 30 21:23:46 compute-0 nova_compute[192810]: 2025-09-30 21:23:46.906 2 DEBUG nova.compute.manager [req-9f1afe25-86b6-4679-9500-479f95c29e53 req-88741ceb-ed5b-483d-8cec-500acbb7e914 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Received event network-changed-bea64634-2634-4763-abcc-3935baa9761c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:23:46 compute-0 nova_compute[192810]: 2025-09-30 21:23:46.906 2 DEBUG nova.compute.manager [req-9f1afe25-86b6-4679-9500-479f95c29e53 req-88741ceb-ed5b-483d-8cec-500acbb7e914 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Refreshing instance network info cache due to event network-changed-bea64634-2634-4763-abcc-3935baa9761c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:23:46 compute-0 nova_compute[192810]: 2025-09-30 21:23:46.906 2 DEBUG oslo_concurrency.lockutils [req-9f1afe25-86b6-4679-9500-479f95c29e53 req-88741ceb-ed5b-483d-8cec-500acbb7e914 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-5da25813-a9d3-49a0-80ae-59c06fef8440" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:23:47 compute-0 nova_compute[192810]: 2025-09-30 21:23:47.415 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Updating instance_info_cache with network_info: [{"id": "bea64634-2634-4763-abcc-3935baa9761c", "address": "fa:16:3e:1c:5c:86", "network": {"id": "100affc1-3472-41d6-950b-1802616b458e", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-75147163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a09110a6c4d740edbafe08d6c1782a1d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbea64634-26", "ovs_interfaceid": "bea64634-2634-4763-abcc-3935baa9761c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:23:47 compute-0 nova_compute[192810]: 2025-09-30 21:23:47.444 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Releasing lock "refresh_cache-5da25813-a9d3-49a0-80ae-59c06fef8440" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:23:47 compute-0 nova_compute[192810]: 2025-09-30 21:23:47.444 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Sep 30 21:23:47 compute-0 nova_compute[192810]: 2025-09-30 21:23:47.445 2 DEBUG oslo_concurrency.lockutils [req-9f1afe25-86b6-4679-9500-479f95c29e53 req-88741ceb-ed5b-483d-8cec-500acbb7e914 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-5da25813-a9d3-49a0-80ae-59c06fef8440" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:23:47 compute-0 nova_compute[192810]: 2025-09-30 21:23:47.445 2 DEBUG nova.network.neutron [req-9f1afe25-86b6-4679-9500-479f95c29e53 req-88741ceb-ed5b-483d-8cec-500acbb7e914 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Refreshing network info cache for port bea64634-2634-4763-abcc-3935baa9761c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:23:47 compute-0 nova_compute[192810]: 2025-09-30 21:23:47.448 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:23:47 compute-0 nova_compute[192810]: 2025-09-30 21:23:47.449 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:23:48 compute-0 nova_compute[192810]: 2025-09-30 21:23:48.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:50 compute-0 nova_compute[192810]: 2025-09-30 21:23:50.491 2 DEBUG nova.network.neutron [req-9f1afe25-86b6-4679-9500-479f95c29e53 req-88741ceb-ed5b-483d-8cec-500acbb7e914 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Updated VIF entry in instance network info cache for port bea64634-2634-4763-abcc-3935baa9761c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:23:50 compute-0 nova_compute[192810]: 2025-09-30 21:23:50.492 2 DEBUG nova.network.neutron [req-9f1afe25-86b6-4679-9500-479f95c29e53 req-88741ceb-ed5b-483d-8cec-500acbb7e914 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Updating instance_info_cache with network_info: [{"id": "bea64634-2634-4763-abcc-3935baa9761c", "address": "fa:16:3e:1c:5c:86", "network": {"id": "100affc1-3472-41d6-950b-1802616b458e", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-75147163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a09110a6c4d740edbafe08d6c1782a1d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbea64634-26", "ovs_interfaceid": "bea64634-2634-4763-abcc-3935baa9761c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:23:50 compute-0 nova_compute[192810]: 2025-09-30 21:23:50.528 2 DEBUG oslo_concurrency.lockutils [req-9f1afe25-86b6-4679-9500-479f95c29e53 req-88741ceb-ed5b-483d-8cec-500acbb7e914 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-5da25813-a9d3-49a0-80ae-59c06fef8440" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:23:50 compute-0 nova_compute[192810]: 2025-09-30 21:23:50.781 2 DEBUG nova.virt.libvirt.driver [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Sep 30 21:23:51 compute-0 nova_compute[192810]: 2025-09-30 21:23:51.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:51 compute-0 nova_compute[192810]: 2025-09-30 21:23:51.869 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:52 compute-0 podman[226524]: 2025-09-30 21:23:52.349798227 +0000 UTC m=+0.078544377 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=iscsid, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=iscsid)
Sep 30 21:23:52 compute-0 nova_compute[192810]: 2025-09-30 21:23:52.714 2 DEBUG nova.compute.manager [req-af0433b6-2d38-4e7f-860f-2d9ed513d8e2 req-9d7b8f66-d310-418c-8937-83acba18488a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Received event network-changed-bea64634-2634-4763-abcc-3935baa9761c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:23:52 compute-0 nova_compute[192810]: 2025-09-30 21:23:52.715 2 DEBUG nova.compute.manager [req-af0433b6-2d38-4e7f-860f-2d9ed513d8e2 req-9d7b8f66-d310-418c-8937-83acba18488a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Refreshing instance network info cache due to event network-changed-bea64634-2634-4763-abcc-3935baa9761c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:23:52 compute-0 nova_compute[192810]: 2025-09-30 21:23:52.715 2 DEBUG oslo_concurrency.lockutils [req-af0433b6-2d38-4e7f-860f-2d9ed513d8e2 req-9d7b8f66-d310-418c-8937-83acba18488a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-5da25813-a9d3-49a0-80ae-59c06fef8440" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:23:52 compute-0 nova_compute[192810]: 2025-09-30 21:23:52.715 2 DEBUG oslo_concurrency.lockutils [req-af0433b6-2d38-4e7f-860f-2d9ed513d8e2 req-9d7b8f66-d310-418c-8937-83acba18488a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-5da25813-a9d3-49a0-80ae-59c06fef8440" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:23:52 compute-0 nova_compute[192810]: 2025-09-30 21:23:52.715 2 DEBUG nova.network.neutron [req-af0433b6-2d38-4e7f-860f-2d9ed513d8e2 req-9d7b8f66-d310-418c-8937-83acba18488a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Refreshing network info cache for port bea64634-2634-4763-abcc-3935baa9761c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:23:52 compute-0 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000002a.scope: Deactivated successfully.
Sep 30 21:23:52 compute-0 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000002a.scope: Consumed 12.143s CPU time.
Sep 30 21:23:52 compute-0 systemd-machined[152794]: Machine qemu-22-instance-0000002a terminated.
Sep 30 21:23:53 compute-0 nova_compute[192810]: 2025-09-30 21:23:53.800 2 INFO nova.virt.libvirt.driver [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Instance shutdown successfully after 13 seconds.
Sep 30 21:23:53 compute-0 nova_compute[192810]: 2025-09-30 21:23:53.808 2 INFO nova.virt.libvirt.driver [-] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Instance destroyed successfully.
Sep 30 21:23:53 compute-0 nova_compute[192810]: 2025-09-30 21:23:53.816 2 INFO nova.virt.libvirt.driver [-] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Instance destroyed successfully.
Sep 30 21:23:53 compute-0 nova_compute[192810]: 2025-09-30 21:23:53.816 2 INFO nova.virt.libvirt.driver [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Deleting instance files /var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f_del
Sep 30 21:23:53 compute-0 nova_compute[192810]: 2025-09-30 21:23:53.817 2 INFO nova.virt.libvirt.driver [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Deletion of /var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f_del complete
Sep 30 21:23:53 compute-0 nova_compute[192810]: 2025-09-30 21:23:53.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.043 2 DEBUG nova.virt.libvirt.driver [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.044 2 INFO nova.virt.libvirt.driver [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Creating image(s)
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.045 2 DEBUG oslo_concurrency.lockutils [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] Acquiring lock "/var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.046 2 DEBUG oslo_concurrency.lockutils [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] Lock "/var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.047 2 DEBUG oslo_concurrency.lockutils [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] Lock "/var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.074 2 DEBUG oslo_concurrency.processutils [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.151 2 DEBUG oslo_concurrency.processutils [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.154 2 DEBUG oslo_concurrency.lockutils [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.155 2 DEBUG oslo_concurrency.lockutils [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.171 2 DEBUG oslo_concurrency.processutils [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.240 2 DEBUG oslo_concurrency.processutils [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.241 2 DEBUG oslo_concurrency.processutils [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.277 2 DEBUG oslo_concurrency.processutils [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.278 2 DEBUG oslo_concurrency.lockutils [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.279 2 DEBUG oslo_concurrency.processutils [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.368 2 DEBUG oslo_concurrency.processutils [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.369 2 DEBUG nova.virt.disk.api [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] Checking if we can resize image /var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.369 2 DEBUG oslo_concurrency.processutils [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.437 2 DEBUG oslo_concurrency.processutils [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.438 2 DEBUG nova.virt.disk.api [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] Cannot resize image /var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.438 2 DEBUG nova.virt.libvirt.driver [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.439 2 DEBUG nova.virt.libvirt.driver [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Ensure instance console log exists: /var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.439 2 DEBUG oslo_concurrency.lockutils [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.439 2 DEBUG oslo_concurrency.lockutils [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.440 2 DEBUG oslo_concurrency.lockutils [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.441 2 DEBUG nova.virt.libvirt.driver [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.445 2 WARNING nova.virt.libvirt.driver [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.450 2 DEBUG nova.virt.libvirt.host [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.451 2 DEBUG nova.virt.libvirt.host [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.453 2 DEBUG nova.virt.libvirt.host [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.454 2 DEBUG nova.virt.libvirt.host [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.455 2 DEBUG nova.virt.libvirt.driver [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.455 2 DEBUG nova.virt.hardware [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.455 2 DEBUG nova.virt.hardware [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.456 2 DEBUG nova.virt.hardware [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.456 2 DEBUG nova.virt.hardware [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.456 2 DEBUG nova.virt.hardware [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.456 2 DEBUG nova.virt.hardware [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.456 2 DEBUG nova.virt.hardware [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.457 2 DEBUG nova.virt.hardware [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.457 2 DEBUG nova.virt.hardware [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.457 2 DEBUG nova.virt.hardware [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.457 2 DEBUG nova.virt.hardware [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.458 2 DEBUG nova.objects.instance [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 246eff24-0065-40dd-87cf-c31afb91539f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.485 2 DEBUG nova.virt.libvirt.driver [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:23:54 compute-0 nova_compute[192810]:   <uuid>246eff24-0065-40dd-87cf-c31afb91539f</uuid>
Sep 30 21:23:54 compute-0 nova_compute[192810]:   <name>instance-0000002a</name>
Sep 30 21:23:54 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:23:54 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:23:54 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:23:54 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:23:54 compute-0 nova_compute[192810]:       <nova:name>tempest-ServersAdmin275Test-server-1639195055</nova:name>
Sep 30 21:23:54 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:23:54</nova:creationTime>
Sep 30 21:23:54 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:23:54 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:23:54 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:23:54 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:23:54 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:23:54 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:23:54 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:23:54 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:23:54 compute-0 nova_compute[192810]:         <nova:user uuid="80508b28c98a49d09d4a47dd6403e6fe">tempest-ServersAdmin275Test-674052928-project-member</nova:user>
Sep 30 21:23:54 compute-0 nova_compute[192810]:         <nova:project uuid="a5f7cbca41c24c7fac661cac94fdf2aa">tempest-ServersAdmin275Test-674052928</nova:project>
Sep 30 21:23:54 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:23:54 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:23:54 compute-0 nova_compute[192810]:       <nova:ports/>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:23:54 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:23:54 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:23:54 compute-0 nova_compute[192810]:     <system>
Sep 30 21:23:54 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:23:54 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:23:54 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:23:54 compute-0 nova_compute[192810]:       <entry name="serial">246eff24-0065-40dd-87cf-c31afb91539f</entry>
Sep 30 21:23:54 compute-0 nova_compute[192810]:       <entry name="uuid">246eff24-0065-40dd-87cf-c31afb91539f</entry>
Sep 30 21:23:54 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     </system>
Sep 30 21:23:54 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:23:54 compute-0 nova_compute[192810]:   <os>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:23:54 compute-0 nova_compute[192810]:   </os>
Sep 30 21:23:54 compute-0 nova_compute[192810]:   <features>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:23:54 compute-0 nova_compute[192810]:   </features>
Sep 30 21:23:54 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:23:54 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:23:54 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:23:54 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:23:54 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:23:54 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:23:54 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:23:54 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:23:54 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/disk"/>
Sep 30 21:23:54 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:23:54 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:23:54 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/disk.config"/>
Sep 30 21:23:54 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:23:54 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/console.log" append="off"/>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     <video>
Sep 30 21:23:54 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     </video>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:23:54 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:23:54 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:23:54 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:23:54 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:23:54 compute-0 nova_compute[192810]: </domain>
Sep 30 21:23:54 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.553 2 DEBUG nova.virt.libvirt.driver [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.554 2 DEBUG nova.virt.libvirt.driver [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.555 2 INFO nova.virt.libvirt.driver [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Using config drive
Sep 30 21:23:54 compute-0 podman[226571]: 2025-09-30 21:23:54.58128509 +0000 UTC m=+0.058429919 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.585 2 DEBUG nova.objects.instance [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 246eff24-0065-40dd-87cf-c31afb91539f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:23:54 compute-0 podman[226570]: 2025-09-30 21:23:54.586438308 +0000 UTC m=+0.070589201 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.639 2 DEBUG nova.objects.instance [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] Lazy-loading 'keypairs' on Instance uuid 246eff24-0065-40dd-87cf-c31afb91539f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.890 2 DEBUG nova.network.neutron [req-af0433b6-2d38-4e7f-860f-2d9ed513d8e2 req-9d7b8f66-d310-418c-8937-83acba18488a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Updated VIF entry in instance network info cache for port bea64634-2634-4763-abcc-3935baa9761c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.891 2 DEBUG nova.network.neutron [req-af0433b6-2d38-4e7f-860f-2d9ed513d8e2 req-9d7b8f66-d310-418c-8937-83acba18488a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Updating instance_info_cache with network_info: [{"id": "bea64634-2634-4763-abcc-3935baa9761c", "address": "fa:16:3e:1c:5c:86", "network": {"id": "100affc1-3472-41d6-950b-1802616b458e", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-75147163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a09110a6c4d740edbafe08d6c1782a1d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbea64634-26", "ovs_interfaceid": "bea64634-2634-4763-abcc-3935baa9761c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.908 2 DEBUG oslo_concurrency.lockutils [req-af0433b6-2d38-4e7f-860f-2d9ed513d8e2 req-9d7b8f66-d310-418c-8937-83acba18488a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-5da25813-a9d3-49a0-80ae-59c06fef8440" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.942 2 INFO nova.virt.libvirt.driver [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Creating config drive at /var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/disk.config
Sep 30 21:23:54 compute-0 nova_compute[192810]: 2025-09-30 21:23:54.947 2 DEBUG oslo_concurrency.processutils [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdi2oesfx execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:23:55 compute-0 nova_compute[192810]: 2025-09-30 21:23:55.077 2 DEBUG oslo_concurrency.processutils [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdi2oesfx" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:23:55 compute-0 systemd-machined[152794]: New machine qemu-23-instance-0000002a.
Sep 30 21:23:55 compute-0 systemd[1]: Started Virtual Machine qemu-23-instance-0000002a.
Sep 30 21:23:56 compute-0 nova_compute[192810]: 2025-09-30 21:23:56.182 2 DEBUG nova.virt.libvirt.host [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Removed pending event for 246eff24-0065-40dd-87cf-c31afb91539f due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Sep 30 21:23:56 compute-0 nova_compute[192810]: 2025-09-30 21:23:56.183 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267436.1817608, 246eff24-0065-40dd-87cf-c31afb91539f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:23:56 compute-0 nova_compute[192810]: 2025-09-30 21:23:56.184 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] VM Resumed (Lifecycle Event)
Sep 30 21:23:56 compute-0 nova_compute[192810]: 2025-09-30 21:23:56.189 2 DEBUG nova.compute.manager [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:23:56 compute-0 nova_compute[192810]: 2025-09-30 21:23:56.190 2 DEBUG nova.virt.libvirt.driver [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:23:56 compute-0 nova_compute[192810]: 2025-09-30 21:23:56.196 2 INFO nova.virt.libvirt.driver [-] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Instance spawned successfully.
Sep 30 21:23:56 compute-0 nova_compute[192810]: 2025-09-30 21:23:56.197 2 DEBUG nova.virt.libvirt.driver [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:23:56 compute-0 nova_compute[192810]: 2025-09-30 21:23:56.204 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:23:56 compute-0 nova_compute[192810]: 2025-09-30 21:23:56.210 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:23:56 compute-0 nova_compute[192810]: 2025-09-30 21:23:56.224 2 DEBUG nova.virt.libvirt.driver [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:23:56 compute-0 nova_compute[192810]: 2025-09-30 21:23:56.225 2 DEBUG nova.virt.libvirt.driver [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:23:56 compute-0 nova_compute[192810]: 2025-09-30 21:23:56.226 2 DEBUG nova.virt.libvirt.driver [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:23:56 compute-0 nova_compute[192810]: 2025-09-30 21:23:56.226 2 DEBUG nova.virt.libvirt.driver [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:23:56 compute-0 nova_compute[192810]: 2025-09-30 21:23:56.227 2 DEBUG nova.virt.libvirt.driver [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:23:56 compute-0 nova_compute[192810]: 2025-09-30 21:23:56.228 2 DEBUG nova.virt.libvirt.driver [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:23:56 compute-0 nova_compute[192810]: 2025-09-30 21:23:56.234 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Sep 30 21:23:56 compute-0 nova_compute[192810]: 2025-09-30 21:23:56.234 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267436.1835837, 246eff24-0065-40dd-87cf-c31afb91539f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:23:56 compute-0 nova_compute[192810]: 2025-09-30 21:23:56.235 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] VM Started (Lifecycle Event)
Sep 30 21:23:56 compute-0 nova_compute[192810]: 2025-09-30 21:23:56.259 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:23:56 compute-0 nova_compute[192810]: 2025-09-30 21:23:56.262 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:23:56 compute-0 nova_compute[192810]: 2025-09-30 21:23:56.288 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Sep 30 21:23:56 compute-0 nova_compute[192810]: 2025-09-30 21:23:56.309 2 DEBUG nova.compute.manager [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:23:56 compute-0 nova_compute[192810]: 2025-09-30 21:23:56.425 2 DEBUG oslo_concurrency.lockutils [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:56 compute-0 nova_compute[192810]: 2025-09-30 21:23:56.425 2 DEBUG oslo_concurrency.lockutils [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:56 compute-0 nova_compute[192810]: 2025-09-30 21:23:56.425 2 DEBUG nova.objects.instance [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Sep 30 21:23:56 compute-0 nova_compute[192810]: 2025-09-30 21:23:56.536 2 DEBUG oslo_concurrency.lockutils [None req-8f1bac02-69f4-47c7-a3a2-582f09ffdfb1 39089d7d23be4b84a851410de37169b3 3aa0dd00b917459892fc86f5f08cb571 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:56 compute-0 nova_compute[192810]: 2025-09-30 21:23:56.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:56 compute-0 nova_compute[192810]: 2025-09-30 21:23:56.974 2 DEBUG oslo_concurrency.lockutils [None req-e5738022-0338-49cb-bc48-f30fc959d16b 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Acquiring lock "246eff24-0065-40dd-87cf-c31afb91539f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:56 compute-0 nova_compute[192810]: 2025-09-30 21:23:56.975 2 DEBUG oslo_concurrency.lockutils [None req-e5738022-0338-49cb-bc48-f30fc959d16b 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Lock "246eff24-0065-40dd-87cf-c31afb91539f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:56 compute-0 nova_compute[192810]: 2025-09-30 21:23:56.976 2 DEBUG oslo_concurrency.lockutils [None req-e5738022-0338-49cb-bc48-f30fc959d16b 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Acquiring lock "246eff24-0065-40dd-87cf-c31afb91539f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:56 compute-0 nova_compute[192810]: 2025-09-30 21:23:56.977 2 DEBUG oslo_concurrency.lockutils [None req-e5738022-0338-49cb-bc48-f30fc959d16b 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Lock "246eff24-0065-40dd-87cf-c31afb91539f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:56 compute-0 nova_compute[192810]: 2025-09-30 21:23:56.978 2 DEBUG oslo_concurrency.lockutils [None req-e5738022-0338-49cb-bc48-f30fc959d16b 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Lock "246eff24-0065-40dd-87cf-c31afb91539f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:56 compute-0 nova_compute[192810]: 2025-09-30 21:23:56.998 2 INFO nova.compute.manager [None req-e5738022-0338-49cb-bc48-f30fc959d16b 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Terminating instance
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.017 2 DEBUG oslo_concurrency.lockutils [None req-e5738022-0338-49cb-bc48-f30fc959d16b 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Acquiring lock "refresh_cache-246eff24-0065-40dd-87cf-c31afb91539f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.018 2 DEBUG oslo_concurrency.lockutils [None req-e5738022-0338-49cb-bc48-f30fc959d16b 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Acquired lock "refresh_cache-246eff24-0065-40dd-87cf-c31afb91539f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.018 2 DEBUG nova.network.neutron [None req-e5738022-0338-49cb-bc48-f30fc959d16b 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.085 2 DEBUG oslo_concurrency.lockutils [None req-4f5431ce-1bc7-4d7d-8c4f-10c27079fac3 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Acquiring lock "5da25813-a9d3-49a0-80ae-59c06fef8440" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.086 2 DEBUG oslo_concurrency.lockutils [None req-4f5431ce-1bc7-4d7d-8c4f-10c27079fac3 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Lock "5da25813-a9d3-49a0-80ae-59c06fef8440" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.087 2 DEBUG oslo_concurrency.lockutils [None req-4f5431ce-1bc7-4d7d-8c4f-10c27079fac3 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Acquiring lock "5da25813-a9d3-49a0-80ae-59c06fef8440-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.087 2 DEBUG oslo_concurrency.lockutils [None req-4f5431ce-1bc7-4d7d-8c4f-10c27079fac3 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Lock "5da25813-a9d3-49a0-80ae-59c06fef8440-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.087 2 DEBUG oslo_concurrency.lockutils [None req-4f5431ce-1bc7-4d7d-8c4f-10c27079fac3 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Lock "5da25813-a9d3-49a0-80ae-59c06fef8440-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.102 2 INFO nova.compute.manager [None req-4f5431ce-1bc7-4d7d-8c4f-10c27079fac3 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Terminating instance
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.117 2 DEBUG nova.compute.manager [None req-4f5431ce-1bc7-4d7d-8c4f-10c27079fac3 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:23:57 compute-0 kernel: tapbea64634-26 (unregistering): left promiscuous mode
Sep 30 21:23:57 compute-0 NetworkManager[51733]: <info>  [1759267437.1443] device (tapbea64634-26): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:57 compute-0 ovn_controller[94912]: 2025-09-30T21:23:57Z|00188|binding|INFO|Releasing lport bea64634-2634-4763-abcc-3935baa9761c from this chassis (sb_readonly=0)
Sep 30 21:23:57 compute-0 ovn_controller[94912]: 2025-09-30T21:23:57Z|00189|binding|INFO|Setting lport bea64634-2634-4763-abcc-3935baa9761c down in Southbound
Sep 30 21:23:57 compute-0 ovn_controller[94912]: 2025-09-30T21:23:57Z|00190|binding|INFO|Removing iface tapbea64634-26 ovn-installed in OVS
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.173 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:57.194 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1c:5c:86 10.100.0.12'], port_security=['fa:16:3e:1c:5c:86 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '5da25813-a9d3-49a0-80ae-59c06fef8440', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-100affc1-3472-41d6-950b-1802616b458e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a09110a6c4d740edbafe08d6c1782a1d', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c3501d94-e703-4b91-b4a2-8e0ccdf2600f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=be15a9ee-86e7-4792-9a15-1cba7298ecf1, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=bea64634-2634-4763-abcc-3935baa9761c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:23:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:57.195 103867 INFO neutron.agent.ovn.metadata.agent [-] Port bea64634-2634-4763-abcc-3935baa9761c in datapath 100affc1-3472-41d6-950b-1802616b458e unbound from our chassis
Sep 30 21:23:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:57.196 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 100affc1-3472-41d6-950b-1802616b458e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:23:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:57.198 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[15b5baba-1444-435f-bdc3-a0438a261b7f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:57.198 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-100affc1-3472-41d6-950b-1802616b458e namespace which is not needed anymore
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.233 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:57 compute-0 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000026.scope: Deactivated successfully.
Sep 30 21:23:57 compute-0 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000026.scope: Consumed 15.083s CPU time.
Sep 30 21:23:57 compute-0 systemd-machined[152794]: Machine qemu-19-instance-00000026 terminated.
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.324 2 DEBUG nova.network.neutron [None req-e5738022-0338-49cb-bc48-f30fc959d16b 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:23:57 compute-0 NetworkManager[51733]: <info>  [1759267437.3344] manager: (tapbea64634-26): new Tun device (/org/freedesktop/NetworkManager/Devices/81)
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.335 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:57 compute-0 neutron-haproxy-ovnmeta-100affc1-3472-41d6-950b-1802616b458e[226005]: [NOTICE]   (226009) : haproxy version is 2.8.14-c23fe91
Sep 30 21:23:57 compute-0 neutron-haproxy-ovnmeta-100affc1-3472-41d6-950b-1802616b458e[226005]: [NOTICE]   (226009) : path to executable is /usr/sbin/haproxy
Sep 30 21:23:57 compute-0 neutron-haproxy-ovnmeta-100affc1-3472-41d6-950b-1802616b458e[226005]: [WARNING]  (226009) : Exiting Master process...
Sep 30 21:23:57 compute-0 neutron-haproxy-ovnmeta-100affc1-3472-41d6-950b-1802616b458e[226005]: [WARNING]  (226009) : Exiting Master process...
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.346 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:57 compute-0 neutron-haproxy-ovnmeta-100affc1-3472-41d6-950b-1802616b458e[226005]: [ALERT]    (226009) : Current worker (226011) exited with code 143 (Terminated)
Sep 30 21:23:57 compute-0 neutron-haproxy-ovnmeta-100affc1-3472-41d6-950b-1802616b458e[226005]: [WARNING]  (226009) : All workers exited. Exiting... (0)
Sep 30 21:23:57 compute-0 systemd[1]: libpod-c48f383bf11af847b4a303d25253a02a1da06b49a8d80624924b1345c7eaa694.scope: Deactivated successfully.
Sep 30 21:23:57 compute-0 podman[226662]: 2025-09-30 21:23:57.366453583 +0000 UTC m=+0.069301179 container died c48f383bf11af847b4a303d25253a02a1da06b49a8d80624924b1345c7eaa694 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-100affc1-3472-41d6-950b-1802616b458e, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.384 2 INFO nova.virt.libvirt.driver [-] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Instance destroyed successfully.
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.385 2 DEBUG nova.objects.instance [None req-4f5431ce-1bc7-4d7d-8c4f-10c27079fac3 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Lazy-loading 'resources' on Instance uuid 5da25813-a9d3-49a0-80ae-59c06fef8440 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:23:57 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c48f383bf11af847b4a303d25253a02a1da06b49a8d80624924b1345c7eaa694-userdata-shm.mount: Deactivated successfully.
Sep 30 21:23:57 compute-0 systemd[1]: var-lib-containers-storage-overlay-55f924ecda93e0cae2f7ba8f561ba79e6d9778c1e8ac721ef67b713952bf2f88-merged.mount: Deactivated successfully.
Sep 30 21:23:57 compute-0 podman[226662]: 2025-09-30 21:23:57.421694691 +0000 UTC m=+0.124542287 container cleanup c48f383bf11af847b4a303d25253a02a1da06b49a8d80624924b1345c7eaa694 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-100affc1-3472-41d6-950b-1802616b458e, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.427 2 DEBUG nova.virt.libvirt.vif [None req-4f5431ce-1bc7-4d7d-8c4f-10c27079fac3 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:23:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-FloatingIPsAssociationTestJSON-server-1607988543',display_name='tempest-FloatingIPsAssociationTestJSON-server-1607988543',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-floatingipsassociationtestjson-server-1607988543',id=38,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:23:07Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a09110a6c4d740edbafe08d6c1782a1d',ramdisk_id='',reservation_id='r-7how27dd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-FloatingIPsAssociationTestJSON-122452251',owner_user_name='tempest-FloatingIPsAssociationTestJSON-122452251-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:23:07Z,user_data=None,user_id='73ad6b75271d46c4b9b117faafb3e95e',uuid=5da25813-a9d3-49a0-80ae-59c06fef8440,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bea64634-2634-4763-abcc-3935baa9761c", "address": "fa:16:3e:1c:5c:86", "network": {"id": "100affc1-3472-41d6-950b-1802616b458e", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-75147163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a09110a6c4d740edbafe08d6c1782a1d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbea64634-26", "ovs_interfaceid": "bea64634-2634-4763-abcc-3935baa9761c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.427 2 DEBUG nova.network.os_vif_util [None req-4f5431ce-1bc7-4d7d-8c4f-10c27079fac3 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Converting VIF {"id": "bea64634-2634-4763-abcc-3935baa9761c", "address": "fa:16:3e:1c:5c:86", "network": {"id": "100affc1-3472-41d6-950b-1802616b458e", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-75147163-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a09110a6c4d740edbafe08d6c1782a1d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbea64634-26", "ovs_interfaceid": "bea64634-2634-4763-abcc-3935baa9761c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.429 2 DEBUG nova.network.os_vif_util [None req-4f5431ce-1bc7-4d7d-8c4f-10c27079fac3 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1c:5c:86,bridge_name='br-int',has_traffic_filtering=True,id=bea64634-2634-4763-abcc-3935baa9761c,network=Network(100affc1-3472-41d6-950b-1802616b458e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbea64634-26') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.429 2 DEBUG os_vif [None req-4f5431ce-1bc7-4d7d-8c4f-10c27079fac3 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1c:5c:86,bridge_name='br-int',has_traffic_filtering=True,id=bea64634-2634-4763-abcc-3935baa9761c,network=Network(100affc1-3472-41d6-950b-1802616b458e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbea64634-26') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.432 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.433 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbea64634-26, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.440 2 INFO os_vif [None req-4f5431ce-1bc7-4d7d-8c4f-10c27079fac3 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1c:5c:86,bridge_name='br-int',has_traffic_filtering=True,id=bea64634-2634-4763-abcc-3935baa9761c,network=Network(100affc1-3472-41d6-950b-1802616b458e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbea64634-26')
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.441 2 INFO nova.virt.libvirt.driver [None req-4f5431ce-1bc7-4d7d-8c4f-10c27079fac3 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Deleting instance files /var/lib/nova/instances/5da25813-a9d3-49a0-80ae-59c06fef8440_del
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.442 2 INFO nova.virt.libvirt.driver [None req-4f5431ce-1bc7-4d7d-8c4f-10c27079fac3 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Deletion of /var/lib/nova/instances/5da25813-a9d3-49a0-80ae-59c06fef8440_del complete
Sep 30 21:23:57 compute-0 systemd[1]: libpod-conmon-c48f383bf11af847b4a303d25253a02a1da06b49a8d80624924b1345c7eaa694.scope: Deactivated successfully.
Sep 30 21:23:57 compute-0 podman[226705]: 2025-09-30 21:23:57.497310035 +0000 UTC m=+0.039244053 container remove c48f383bf11af847b4a303d25253a02a1da06b49a8d80624924b1345c7eaa694 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-100affc1-3472-41d6-950b-1802616b458e, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 21:23:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:57.504 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[43c7a3cc-3c13-459e-8ba0-b8bed7032b7e]: (4, ('Tue Sep 30 09:23:57 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-100affc1-3472-41d6-950b-1802616b458e (c48f383bf11af847b4a303d25253a02a1da06b49a8d80624924b1345c7eaa694)\nc48f383bf11af847b4a303d25253a02a1da06b49a8d80624924b1345c7eaa694\nTue Sep 30 09:23:57 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-100affc1-3472-41d6-950b-1802616b458e (c48f383bf11af847b4a303d25253a02a1da06b49a8d80624924b1345c7eaa694)\nc48f383bf11af847b4a303d25253a02a1da06b49a8d80624924b1345c7eaa694\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:57.506 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[bc77cfc3-7cf3-42df-9a0e-a02844b115b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:57.508 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap100affc1-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.509 2 DEBUG nova.compute.manager [req-94f5beca-c0eb-473d-acb6-f2990cb76f17 req-2ec6d23f-c39e-472c-b63b-b25063b46d43 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Received event network-vif-unplugged-bea64634-2634-4763-abcc-3935baa9761c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.510 2 DEBUG oslo_concurrency.lockutils [req-94f5beca-c0eb-473d-acb6-f2990cb76f17 req-2ec6d23f-c39e-472c-b63b-b25063b46d43 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "5da25813-a9d3-49a0-80ae-59c06fef8440-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.510 2 DEBUG oslo_concurrency.lockutils [req-94f5beca-c0eb-473d-acb6-f2990cb76f17 req-2ec6d23f-c39e-472c-b63b-b25063b46d43 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "5da25813-a9d3-49a0-80ae-59c06fef8440-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.511 2 DEBUG oslo_concurrency.lockutils [req-94f5beca-c0eb-473d-acb6-f2990cb76f17 req-2ec6d23f-c39e-472c-b63b-b25063b46d43 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "5da25813-a9d3-49a0-80ae-59c06fef8440-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:57 compute-0 kernel: tap100affc1-30: left promiscuous mode
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.511 2 DEBUG nova.compute.manager [req-94f5beca-c0eb-473d-acb6-f2990cb76f17 req-2ec6d23f-c39e-472c-b63b-b25063b46d43 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] No waiting events found dispatching network-vif-unplugged-bea64634-2634-4763-abcc-3935baa9761c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.512 2 DEBUG nova.compute.manager [req-94f5beca-c0eb-473d-acb6-f2990cb76f17 req-2ec6d23f-c39e-472c-b63b-b25063b46d43 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Received event network-vif-unplugged-bea64634-2634-4763-abcc-3935baa9761c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:57.516 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[36ce9599-d22f-4142-ac29-ebc00a39323d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.530 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.534 2 INFO nova.compute.manager [None req-4f5431ce-1bc7-4d7d-8c4f-10c27079fac3 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Took 0.42 seconds to destroy the instance on the hypervisor.
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.534 2 DEBUG oslo.service.loopingcall [None req-4f5431ce-1bc7-4d7d-8c4f-10c27079fac3 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.535 2 DEBUG nova.compute.manager [-] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.535 2 DEBUG nova.network.neutron [-] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:23:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:57.553 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a31c74d6-b1b8-4d79-a65c-2bb23e7f2e93]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:57.555 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[09846b38-8a27-4d58-95f3-888efe458673]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:57.569 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[1402db67-c2e6-443c-8ba1-e001cf7245e7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 408108, 'reachable_time': 23045, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226722, 'error': None, 'target': 'ovnmeta-100affc1-3472-41d6-950b-1802616b458e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:57 compute-0 systemd[1]: run-netns-ovnmeta\x2d100affc1\x2d3472\x2d41d6\x2d950b\x2d1802616b458e.mount: Deactivated successfully.
Sep 30 21:23:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:57.574 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-100affc1-3472-41d6-950b-1802616b458e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:23:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:23:57.574 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[01df0221-6090-4bc8-a724-467d1c55c414]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.606 2 DEBUG nova.network.neutron [None req-e5738022-0338-49cb-bc48-f30fc959d16b 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.636 2 DEBUG oslo_concurrency.lockutils [None req-e5738022-0338-49cb-bc48-f30fc959d16b 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Releasing lock "refresh_cache-246eff24-0065-40dd-87cf-c31afb91539f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.637 2 DEBUG nova.compute.manager [None req-e5738022-0338-49cb-bc48-f30fc959d16b 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:23:57 compute-0 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000002a.scope: Deactivated successfully.
Sep 30 21:23:57 compute-0 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000002a.scope: Consumed 2.363s CPU time.
Sep 30 21:23:57 compute-0 systemd-machined[152794]: Machine qemu-23-instance-0000002a terminated.
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.886 2 INFO nova.virt.libvirt.driver [-] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Instance destroyed successfully.
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.886 2 DEBUG nova.objects.instance [None req-e5738022-0338-49cb-bc48-f30fc959d16b 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Lazy-loading 'resources' on Instance uuid 246eff24-0065-40dd-87cf-c31afb91539f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.907 2 INFO nova.virt.libvirt.driver [None req-e5738022-0338-49cb-bc48-f30fc959d16b 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Deleting instance files /var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f_del
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.908 2 INFO nova.virt.libvirt.driver [None req-e5738022-0338-49cb-bc48-f30fc959d16b 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Deletion of /var/lib/nova/instances/246eff24-0065-40dd-87cf-c31afb91539f_del complete
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.994 2 INFO nova.compute.manager [None req-e5738022-0338-49cb-bc48-f30fc959d16b 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Took 0.36 seconds to destroy the instance on the hypervisor.
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.995 2 DEBUG oslo.service.loopingcall [None req-e5738022-0338-49cb-bc48-f30fc959d16b 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.995 2 DEBUG nova.compute.manager [-] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:23:57 compute-0 nova_compute[192810]: 2025-09-30 21:23:57.996 2 DEBUG nova.network.neutron [-] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:23:58 compute-0 nova_compute[192810]: 2025-09-30 21:23:58.156 2 DEBUG nova.network.neutron [-] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:23:58 compute-0 nova_compute[192810]: 2025-09-30 21:23:58.171 2 DEBUG nova.network.neutron [-] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:23:58 compute-0 nova_compute[192810]: 2025-09-30 21:23:58.184 2 INFO nova.compute.manager [-] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Took 0.19 seconds to deallocate network for instance.
Sep 30 21:23:58 compute-0 nova_compute[192810]: 2025-09-30 21:23:58.250 2 DEBUG oslo_concurrency.lockutils [None req-e5738022-0338-49cb-bc48-f30fc959d16b 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:58 compute-0 nova_compute[192810]: 2025-09-30 21:23:58.251 2 DEBUG oslo_concurrency.lockutils [None req-e5738022-0338-49cb-bc48-f30fc959d16b 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:58 compute-0 nova_compute[192810]: 2025-09-30 21:23:58.325 2 DEBUG nova.compute.provider_tree [None req-e5738022-0338-49cb-bc48-f30fc959d16b 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:23:58 compute-0 nova_compute[192810]: 2025-09-30 21:23:58.339 2 DEBUG nova.scheduler.client.report [None req-e5738022-0338-49cb-bc48-f30fc959d16b 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:23:58 compute-0 nova_compute[192810]: 2025-09-30 21:23:58.359 2 DEBUG oslo_concurrency.lockutils [None req-e5738022-0338-49cb-bc48-f30fc959d16b 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:58 compute-0 nova_compute[192810]: 2025-09-30 21:23:58.428 2 DEBUG nova.network.neutron [-] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:23:58 compute-0 nova_compute[192810]: 2025-09-30 21:23:58.563 2 INFO nova.compute.manager [-] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Took 1.03 seconds to deallocate network for instance.
Sep 30 21:23:58 compute-0 nova_compute[192810]: 2025-09-30 21:23:58.586 2 INFO nova.scheduler.client.report [None req-e5738022-0338-49cb-bc48-f30fc959d16b 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Deleted allocations for instance 246eff24-0065-40dd-87cf-c31afb91539f
Sep 30 21:23:58 compute-0 nova_compute[192810]: 2025-09-30 21:23:58.627 2 DEBUG nova.compute.manager [req-dd259228-88d3-4bab-8631-87a274998174 req-2882e31d-cdb0-48db-af3a-51f2a55b6e97 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Received event network-vif-deleted-bea64634-2634-4763-abcc-3935baa9761c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:23:58 compute-0 nova_compute[192810]: 2025-09-30 21:23:58.660 2 DEBUG oslo_concurrency.lockutils [None req-4f5431ce-1bc7-4d7d-8c4f-10c27079fac3 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:58 compute-0 nova_compute[192810]: 2025-09-30 21:23:58.661 2 DEBUG oslo_concurrency.lockutils [None req-4f5431ce-1bc7-4d7d-8c4f-10c27079fac3 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:58 compute-0 nova_compute[192810]: 2025-09-30 21:23:58.681 2 DEBUG oslo_concurrency.lockutils [None req-e5738022-0338-49cb-bc48-f30fc959d16b 80508b28c98a49d09d4a47dd6403e6fe a5f7cbca41c24c7fac661cac94fdf2aa - - default default] Lock "246eff24-0065-40dd-87cf-c31afb91539f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.706s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:58 compute-0 nova_compute[192810]: 2025-09-30 21:23:58.705 2 DEBUG nova.compute.provider_tree [None req-4f5431ce-1bc7-4d7d-8c4f-10c27079fac3 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:23:58 compute-0 nova_compute[192810]: 2025-09-30 21:23:58.743 2 DEBUG nova.scheduler.client.report [None req-4f5431ce-1bc7-4d7d-8c4f-10c27079fac3 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:23:58 compute-0 nova_compute[192810]: 2025-09-30 21:23:58.784 2 DEBUG oslo_concurrency.lockutils [None req-4f5431ce-1bc7-4d7d-8c4f-10c27079fac3 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:58 compute-0 nova_compute[192810]: 2025-09-30 21:23:58.837 2 INFO nova.scheduler.client.report [None req-4f5431ce-1bc7-4d7d-8c4f-10c27079fac3 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Deleted allocations for instance 5da25813-a9d3-49a0-80ae-59c06fef8440
Sep 30 21:23:58 compute-0 nova_compute[192810]: 2025-09-30 21:23:58.904 2 DEBUG oslo_concurrency.lockutils [None req-4f5431ce-1bc7-4d7d-8c4f-10c27079fac3 73ad6b75271d46c4b9b117faafb3e95e a09110a6c4d740edbafe08d6c1782a1d - - default default] Lock "5da25813-a9d3-49a0-80ae-59c06fef8440" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.818s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:58 compute-0 nova_compute[192810]: 2025-09-30 21:23:58.914 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267423.913774, 3585ec48-d3dc-4467-be42-206c7784d55c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:23:58 compute-0 nova_compute[192810]: 2025-09-30 21:23:58.915 2 INFO nova.compute.manager [-] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] VM Stopped (Lifecycle Event)
Sep 30 21:23:58 compute-0 nova_compute[192810]: 2025-09-30 21:23:58.933 2 DEBUG nova.compute.manager [None req-7cb8ae3f-b08a-41a1-bf18-8a4b71e6deaf - - - - - -] [instance: 3585ec48-d3dc-4467-be42-206c7784d55c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:23:59 compute-0 sshd[128205]: Timeout before authentication for connection from 113.240.110.90 to 38.102.83.69, pid = 225036
Sep 30 21:23:59 compute-0 nova_compute[192810]: 2025-09-30 21:23:59.627 2 DEBUG nova.compute.manager [req-d356aee1-09c5-4e24-b270-16ebe922e0a6 req-aab4a9fb-da07-46f8-beea-7ef9fe452bcd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Received event network-vif-plugged-bea64634-2634-4763-abcc-3935baa9761c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:23:59 compute-0 nova_compute[192810]: 2025-09-30 21:23:59.627 2 DEBUG oslo_concurrency.lockutils [req-d356aee1-09c5-4e24-b270-16ebe922e0a6 req-aab4a9fb-da07-46f8-beea-7ef9fe452bcd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "5da25813-a9d3-49a0-80ae-59c06fef8440-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:23:59 compute-0 nova_compute[192810]: 2025-09-30 21:23:59.627 2 DEBUG oslo_concurrency.lockutils [req-d356aee1-09c5-4e24-b270-16ebe922e0a6 req-aab4a9fb-da07-46f8-beea-7ef9fe452bcd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "5da25813-a9d3-49a0-80ae-59c06fef8440-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:23:59 compute-0 nova_compute[192810]: 2025-09-30 21:23:59.628 2 DEBUG oslo_concurrency.lockutils [req-d356aee1-09c5-4e24-b270-16ebe922e0a6 req-aab4a9fb-da07-46f8-beea-7ef9fe452bcd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "5da25813-a9d3-49a0-80ae-59c06fef8440-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:23:59 compute-0 nova_compute[192810]: 2025-09-30 21:23:59.628 2 DEBUG nova.compute.manager [req-d356aee1-09c5-4e24-b270-16ebe922e0a6 req-aab4a9fb-da07-46f8-beea-7ef9fe452bcd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] No waiting events found dispatching network-vif-plugged-bea64634-2634-4763-abcc-3935baa9761c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:23:59 compute-0 nova_compute[192810]: 2025-09-30 21:23:59.628 2 WARNING nova.compute.manager [req-d356aee1-09c5-4e24-b270-16ebe922e0a6 req-aab4a9fb-da07-46f8-beea-7ef9fe452bcd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Received unexpected event network-vif-plugged-bea64634-2634-4763-abcc-3935baa9761c for instance with vm_state deleted and task_state None.
Sep 30 21:24:00 compute-0 nova_compute[192810]: 2025-09-30 21:24:00.911 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:01 compute-0 nova_compute[192810]: 2025-09-30 21:24:01.075 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:01 compute-0 nova_compute[192810]: 2025-09-30 21:24:01.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:02 compute-0 nova_compute[192810]: 2025-09-30 21:24:02.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:03 compute-0 podman[226734]: 2025-09-30 21:24:03.341257201 +0000 UTC m=+0.072623700 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:24:03 compute-0 podman[226733]: 2025-09-30 21:24:03.420464024 +0000 UTC m=+0.159442432 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:24:06 compute-0 nova_compute[192810]: 2025-09-30 21:24:06.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:07 compute-0 nova_compute[192810]: 2025-09-30 21:24:07.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:09 compute-0 podman[226779]: 2025-09-30 21:24:09.347829536 +0000 UTC m=+0.080047624 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible, org.label-schema.build-date=20250923)
Sep 30 21:24:11 compute-0 nova_compute[192810]: 2025-09-30 21:24:11.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:12 compute-0 nova_compute[192810]: 2025-09-30 21:24:12.385 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267437.3829067, 5da25813-a9d3-49a0-80ae-59c06fef8440 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:24:12 compute-0 nova_compute[192810]: 2025-09-30 21:24:12.385 2 INFO nova.compute.manager [-] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] VM Stopped (Lifecycle Event)
Sep 30 21:24:12 compute-0 nova_compute[192810]: 2025-09-30 21:24:12.423 2 DEBUG nova.compute.manager [None req-164fcca2-aff0-4d10-a934-09983623f9f3 - - - - - -] [instance: 5da25813-a9d3-49a0-80ae-59c06fef8440] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:24:12 compute-0 nova_compute[192810]: 2025-09-30 21:24:12.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:12 compute-0 nova_compute[192810]: 2025-09-30 21:24:12.883 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267437.882842, 246eff24-0065-40dd-87cf-c31afb91539f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:24:12 compute-0 nova_compute[192810]: 2025-09-30 21:24:12.884 2 INFO nova.compute.manager [-] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] VM Stopped (Lifecycle Event)
Sep 30 21:24:12 compute-0 nova_compute[192810]: 2025-09-30 21:24:12.952 2 DEBUG nova.compute.manager [None req-a1ce1fbe-7510-481f-8707-445dacdfcb80 - - - - - -] [instance: 246eff24-0065-40dd-87cf-c31afb91539f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:24:16 compute-0 podman[226799]: 2025-09-30 21:24:16.318288106 +0000 UTC m=+0.052756999 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:24:16 compute-0 podman[226800]: 2025-09-30 21:24:16.330318524 +0000 UTC m=+0.063375782 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Sep 30 21:24:16 compute-0 nova_compute[192810]: 2025-09-30 21:24:16.749 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:17 compute-0 nova_compute[192810]: 2025-09-30 21:24:17.443 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:19 compute-0 sshd-session[226842]: Invalid user foundry from 45.81.23.80 port 34722
Sep 30 21:24:19 compute-0 sshd-session[226842]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:24:19 compute-0 sshd-session[226842]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=45.81.23.80
Sep 30 21:24:21 compute-0 nova_compute[192810]: 2025-09-30 21:24:21.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:22 compute-0 sshd-session[226842]: Failed password for invalid user foundry from 45.81.23.80 port 34722 ssh2
Sep 30 21:24:22 compute-0 nova_compute[192810]: 2025-09-30 21:24:22.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:23 compute-0 podman[226845]: 2025-09-30 21:24:23.340437164 +0000 UTC m=+0.070659612 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:24:23 compute-0 sshd[128205]: drop connection #1 from [113.240.110.90]:33496 on [38.102.83.69]:22 penalty: exceeded LoginGraceTime
Sep 30 21:24:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:24:23.827 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:24:23 compute-0 nova_compute[192810]: 2025-09-30 21:24:23.828 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:24:23.829 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:24:24 compute-0 sshd-session[226842]: Received disconnect from 45.81.23.80 port 34722:11: Bye Bye [preauth]
Sep 30 21:24:24 compute-0 sshd-session[226842]: Disconnected from invalid user foundry 45.81.23.80 port 34722 [preauth]
Sep 30 21:24:25 compute-0 podman[226867]: 2025-09-30 21:24:25.368612801 +0000 UTC m=+0.090096284 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:24:25 compute-0 podman[226866]: 2025-09-30 21:24:25.380105616 +0000 UTC m=+0.110910900 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Sep 30 21:24:26 compute-0 nova_compute[192810]: 2025-09-30 21:24:26.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:27 compute-0 nova_compute[192810]: 2025-09-30 21:24:27.448 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:28 compute-0 nova_compute[192810]: 2025-09-30 21:24:28.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:24:28 compute-0 nova_compute[192810]: 2025-09-30 21:24:28.787 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Sep 30 21:24:29 compute-0 nova_compute[192810]: 2025-09-30 21:24:29.352 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Sep 30 21:24:31 compute-0 nova_compute[192810]: 2025-09-30 21:24:31.754 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:32 compute-0 nova_compute[192810]: 2025-09-30 21:24:32.450 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:24:33.832 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:24:34 compute-0 podman[226913]: 2025-09-30 21:24:34.328820903 +0000 UTC m=+0.064396316 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Sep 30 21:24:34 compute-0 podman[226912]: 2025-09-30 21:24:34.372819254 +0000 UTC m=+0.109151856 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:24:35 compute-0 nova_compute[192810]: 2025-09-30 21:24:35.025 2 DEBUG oslo_concurrency.lockutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Acquiring lock "9a9711a6-d998-4e3a-9ee0-f003c146de6f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:35 compute-0 nova_compute[192810]: 2025-09-30 21:24:35.025 2 DEBUG oslo_concurrency.lockutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "9a9711a6-d998-4e3a-9ee0-f003c146de6f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:35 compute-0 nova_compute[192810]: 2025-09-30 21:24:35.051 2 DEBUG nova.compute.manager [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:24:35 compute-0 nova_compute[192810]: 2025-09-30 21:24:35.191 2 DEBUG oslo_concurrency.lockutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:35 compute-0 nova_compute[192810]: 2025-09-30 21:24:35.192 2 DEBUG oslo_concurrency.lockutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:35 compute-0 nova_compute[192810]: 2025-09-30 21:24:35.200 2 DEBUG nova.virt.hardware [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:24:35 compute-0 nova_compute[192810]: 2025-09-30 21:24:35.200 2 INFO nova.compute.claims [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:24:35 compute-0 nova_compute[192810]: 2025-09-30 21:24:35.348 2 DEBUG nova.compute.provider_tree [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:24:35 compute-0 nova_compute[192810]: 2025-09-30 21:24:35.371 2 DEBUG nova.scheduler.client.report [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:24:35 compute-0 nova_compute[192810]: 2025-09-30 21:24:35.398 2 DEBUG oslo_concurrency.lockutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.206s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:24:35 compute-0 nova_compute[192810]: 2025-09-30 21:24:35.399 2 DEBUG nova.compute.manager [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:24:35 compute-0 nova_compute[192810]: 2025-09-30 21:24:35.498 2 DEBUG nova.compute.manager [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:24:35 compute-0 nova_compute[192810]: 2025-09-30 21:24:35.499 2 DEBUG nova.network.neutron [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:24:35 compute-0 nova_compute[192810]: 2025-09-30 21:24:35.557 2 INFO nova.virt.libvirt.driver [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:24:35 compute-0 nova_compute[192810]: 2025-09-30 21:24:35.597 2 DEBUG nova.compute.manager [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:24:35 compute-0 nova_compute[192810]: 2025-09-30 21:24:35.722 2 DEBUG nova.compute.manager [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:24:35 compute-0 nova_compute[192810]: 2025-09-30 21:24:35.723 2 DEBUG nova.virt.libvirt.driver [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:24:35 compute-0 nova_compute[192810]: 2025-09-30 21:24:35.724 2 INFO nova.virt.libvirt.driver [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Creating image(s)
Sep 30 21:24:35 compute-0 nova_compute[192810]: 2025-09-30 21:24:35.725 2 DEBUG oslo_concurrency.lockutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Acquiring lock "/var/lib/nova/instances/9a9711a6-d998-4e3a-9ee0-f003c146de6f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:35 compute-0 nova_compute[192810]: 2025-09-30 21:24:35.725 2 DEBUG oslo_concurrency.lockutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "/var/lib/nova/instances/9a9711a6-d998-4e3a-9ee0-f003c146de6f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:35 compute-0 nova_compute[192810]: 2025-09-30 21:24:35.726 2 DEBUG oslo_concurrency.lockutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "/var/lib/nova/instances/9a9711a6-d998-4e3a-9ee0-f003c146de6f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:24:35 compute-0 nova_compute[192810]: 2025-09-30 21:24:35.745 2 DEBUG oslo_concurrency.processutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:24:35 compute-0 nova_compute[192810]: 2025-09-30 21:24:35.838 2 DEBUG oslo_concurrency.processutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:24:35 compute-0 nova_compute[192810]: 2025-09-30 21:24:35.839 2 DEBUG oslo_concurrency.lockutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:35 compute-0 nova_compute[192810]: 2025-09-30 21:24:35.839 2 DEBUG oslo_concurrency.lockutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:35 compute-0 nova_compute[192810]: 2025-09-30 21:24:35.850 2 DEBUG oslo_concurrency.processutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:24:35 compute-0 nova_compute[192810]: 2025-09-30 21:24:35.901 2 DEBUG oslo_concurrency.processutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:24:35 compute-0 nova_compute[192810]: 2025-09-30 21:24:35.902 2 DEBUG oslo_concurrency.processutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/9a9711a6-d998-4e3a-9ee0-f003c146de6f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:24:35 compute-0 nova_compute[192810]: 2025-09-30 21:24:35.938 2 DEBUG oslo_concurrency.processutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/9a9711a6-d998-4e3a-9ee0-f003c146de6f/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:24:35 compute-0 nova_compute[192810]: 2025-09-30 21:24:35.939 2 DEBUG oslo_concurrency.lockutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:24:35 compute-0 nova_compute[192810]: 2025-09-30 21:24:35.940 2 DEBUG oslo_concurrency.processutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:24:35 compute-0 nova_compute[192810]: 2025-09-30 21:24:35.996 2 DEBUG oslo_concurrency.processutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:24:35 compute-0 nova_compute[192810]: 2025-09-30 21:24:35.998 2 DEBUG nova.virt.disk.api [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Checking if we can resize image /var/lib/nova/instances/9a9711a6-d998-4e3a-9ee0-f003c146de6f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:24:35 compute-0 nova_compute[192810]: 2025-09-30 21:24:35.998 2 DEBUG oslo_concurrency.processutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9a9711a6-d998-4e3a-9ee0-f003c146de6f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:24:36 compute-0 nova_compute[192810]: 2025-09-30 21:24:36.060 2 DEBUG oslo_concurrency.processutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9a9711a6-d998-4e3a-9ee0-f003c146de6f/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:24:36 compute-0 nova_compute[192810]: 2025-09-30 21:24:36.062 2 DEBUG nova.virt.disk.api [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Cannot resize image /var/lib/nova/instances/9a9711a6-d998-4e3a-9ee0-f003c146de6f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:24:36 compute-0 nova_compute[192810]: 2025-09-30 21:24:36.063 2 DEBUG nova.objects.instance [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lazy-loading 'migration_context' on Instance uuid 9a9711a6-d998-4e3a-9ee0-f003c146de6f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:24:36 compute-0 nova_compute[192810]: 2025-09-30 21:24:36.113 2 DEBUG nova.virt.libvirt.driver [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:24:36 compute-0 nova_compute[192810]: 2025-09-30 21:24:36.114 2 DEBUG nova.virt.libvirt.driver [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Ensure instance console log exists: /var/lib/nova/instances/9a9711a6-d998-4e3a-9ee0-f003c146de6f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:24:36 compute-0 nova_compute[192810]: 2025-09-30 21:24:36.115 2 DEBUG oslo_concurrency.lockutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:36 compute-0 nova_compute[192810]: 2025-09-30 21:24:36.115 2 DEBUG oslo_concurrency.lockutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:36 compute-0 nova_compute[192810]: 2025-09-30 21:24:36.116 2 DEBUG oslo_concurrency.lockutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:24:36 compute-0 nova_compute[192810]: 2025-09-30 21:24:36.441 2 DEBUG nova.policy [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:24:36 compute-0 nova_compute[192810]: 2025-09-30 21:24:36.757 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:36 compute-0 nova_compute[192810]: 2025-09-30 21:24:36.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:24:36 compute-0 nova_compute[192810]: 2025-09-30 21:24:36.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:24:36 compute-0 nova_compute[192810]: 2025-09-30 21:24:36.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Sep 30 21:24:37 compute-0 nova_compute[192810]: 2025-09-30 21:24:37.501 2 DEBUG nova.network.neutron [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Successfully created port: 16e09506-1c57-4d4e-929f-be8c4514eede _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:24:37 compute-0 nova_compute[192810]: 2025-09-30 21:24:37.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:37 compute-0 nova_compute[192810]: 2025-09-30 21:24:37.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:24:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:24:38.728 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:24:38.729 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:24:38.729 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:24:39 compute-0 nova_compute[192810]: 2025-09-30 21:24:39.258 2 DEBUG nova.network.neutron [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Successfully updated port: 16e09506-1c57-4d4e-929f-be8c4514eede _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:24:39 compute-0 nova_compute[192810]: 2025-09-30 21:24:39.273 2 DEBUG oslo_concurrency.lockutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Acquiring lock "refresh_cache-9a9711a6-d998-4e3a-9ee0-f003c146de6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:24:39 compute-0 nova_compute[192810]: 2025-09-30 21:24:39.274 2 DEBUG oslo_concurrency.lockutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Acquired lock "refresh_cache-9a9711a6-d998-4e3a-9ee0-f003c146de6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:24:39 compute-0 nova_compute[192810]: 2025-09-30 21:24:39.274 2 DEBUG nova.network.neutron [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:24:39 compute-0 nova_compute[192810]: 2025-09-30 21:24:39.815 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:24:39 compute-0 nova_compute[192810]: 2025-09-30 21:24:39.825 2 DEBUG nova.network.neutron [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:24:40 compute-0 podman[226972]: 2025-09-30 21:24:40.332057606 +0000 UTC m=+0.065093914 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 21:24:40 compute-0 nova_compute[192810]: 2025-09-30 21:24:40.809 2 DEBUG nova.compute.manager [req-1926f494-6f3d-4637-a19f-33eff2d03444 req-ed223e5e-7fdf-47e5-b3ef-7382b9d841e6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Received event network-changed-16e09506-1c57-4d4e-929f-be8c4514eede external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:24:40 compute-0 nova_compute[192810]: 2025-09-30 21:24:40.810 2 DEBUG nova.compute.manager [req-1926f494-6f3d-4637-a19f-33eff2d03444 req-ed223e5e-7fdf-47e5-b3ef-7382b9d841e6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Refreshing instance network info cache due to event network-changed-16e09506-1c57-4d4e-929f-be8c4514eede. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:24:40 compute-0 nova_compute[192810]: 2025-09-30 21:24:40.810 2 DEBUG oslo_concurrency.lockutils [req-1926f494-6f3d-4637-a19f-33eff2d03444 req-ed223e5e-7fdf-47e5-b3ef-7382b9d841e6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-9a9711a6-d998-4e3a-9ee0-f003c146de6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.194 2 DEBUG nova.network.neutron [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Updating instance_info_cache with network_info: [{"id": "16e09506-1c57-4d4e-929f-be8c4514eede", "address": "fa:16:3e:0c:0a:78", "network": {"id": "f4180897-fb47-4ee3-b86e-380da38f2ec5", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1027240753-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34c754fa0f364622a4433b9ba5718857", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16e09506-1c", "ovs_interfaceid": "16e09506-1c57-4d4e-929f-be8c4514eede", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.238 2 DEBUG oslo_concurrency.lockutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Releasing lock "refresh_cache-9a9711a6-d998-4e3a-9ee0-f003c146de6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.238 2 DEBUG nova.compute.manager [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Instance network_info: |[{"id": "16e09506-1c57-4d4e-929f-be8c4514eede", "address": "fa:16:3e:0c:0a:78", "network": {"id": "f4180897-fb47-4ee3-b86e-380da38f2ec5", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1027240753-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34c754fa0f364622a4433b9ba5718857", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16e09506-1c", "ovs_interfaceid": "16e09506-1c57-4d4e-929f-be8c4514eede", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.239 2 DEBUG oslo_concurrency.lockutils [req-1926f494-6f3d-4637-a19f-33eff2d03444 req-ed223e5e-7fdf-47e5-b3ef-7382b9d841e6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-9a9711a6-d998-4e3a-9ee0-f003c146de6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.240 2 DEBUG nova.network.neutron [req-1926f494-6f3d-4637-a19f-33eff2d03444 req-ed223e5e-7fdf-47e5-b3ef-7382b9d841e6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Refreshing network info cache for port 16e09506-1c57-4d4e-929f-be8c4514eede _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.244 2 DEBUG nova.virt.libvirt.driver [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Start _get_guest_xml network_info=[{"id": "16e09506-1c57-4d4e-929f-be8c4514eede", "address": "fa:16:3e:0c:0a:78", "network": {"id": "f4180897-fb47-4ee3-b86e-380da38f2ec5", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1027240753-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34c754fa0f364622a4433b9ba5718857", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16e09506-1c", "ovs_interfaceid": "16e09506-1c57-4d4e-929f-be8c4514eede", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.251 2 WARNING nova.virt.libvirt.driver [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.256 2 DEBUG nova.virt.libvirt.host [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.258 2 DEBUG nova.virt.libvirt.host [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.263 2 DEBUG nova.virt.libvirt.host [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.263 2 DEBUG nova.virt.libvirt.host [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.265 2 DEBUG nova.virt.libvirt.driver [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.266 2 DEBUG nova.virt.hardware [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.267 2 DEBUG nova.virt.hardware [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.267 2 DEBUG nova.virt.hardware [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.268 2 DEBUG nova.virt.hardware [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.268 2 DEBUG nova.virt.hardware [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.269 2 DEBUG nova.virt.hardware [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.269 2 DEBUG nova.virt.hardware [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.270 2 DEBUG nova.virt.hardware [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.270 2 DEBUG nova.virt.hardware [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.270 2 DEBUG nova.virt.hardware [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.271 2 DEBUG nova.virt.hardware [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.278 2 DEBUG nova.virt.libvirt.vif [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:24:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-966346934',display_name='tempest-tempest.common.compute-instance-966346934-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-966346934-1',id=48,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='34c754fa0f364622a4433b9ba5718857',ramdisk_id='',reservation_id='r-zs1xvy19',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-413721927',owner_user_name='tempest-MultipleCreateTestJSON-413721927-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:24:35Z,user_data=None,user_id='1a8900f8597741ad930d414e1db02d76',uuid=9a9711a6-d998-4e3a-9ee0-f003c146de6f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "16e09506-1c57-4d4e-929f-be8c4514eede", "address": "fa:16:3e:0c:0a:78", "network": {"id": "f4180897-fb47-4ee3-b86e-380da38f2ec5", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1027240753-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34c754fa0f364622a4433b9ba5718857", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16e09506-1c", "ovs_interfaceid": "16e09506-1c57-4d4e-929f-be8c4514eede", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.278 2 DEBUG nova.network.os_vif_util [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Converting VIF {"id": "16e09506-1c57-4d4e-929f-be8c4514eede", "address": "fa:16:3e:0c:0a:78", "network": {"id": "f4180897-fb47-4ee3-b86e-380da38f2ec5", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1027240753-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34c754fa0f364622a4433b9ba5718857", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16e09506-1c", "ovs_interfaceid": "16e09506-1c57-4d4e-929f-be8c4514eede", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.279 2 DEBUG nova.network.os_vif_util [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0c:0a:78,bridge_name='br-int',has_traffic_filtering=True,id=16e09506-1c57-4d4e-929f-be8c4514eede,network=Network(f4180897-fb47-4ee3-b86e-380da38f2ec5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16e09506-1c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.281 2 DEBUG nova.objects.instance [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9a9711a6-d998-4e3a-9ee0-f003c146de6f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.300 2 DEBUG nova.virt.libvirt.driver [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:24:41 compute-0 nova_compute[192810]:   <uuid>9a9711a6-d998-4e3a-9ee0-f003c146de6f</uuid>
Sep 30 21:24:41 compute-0 nova_compute[192810]:   <name>instance-00000030</name>
Sep 30 21:24:41 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:24:41 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:24:41 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:24:41 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:       <nova:name>tempest-tempest.common.compute-instance-966346934-1</nova:name>
Sep 30 21:24:41 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:24:41</nova:creationTime>
Sep 30 21:24:41 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:24:41 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:24:41 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:24:41 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:24:41 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:24:41 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:24:41 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:24:41 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:24:41 compute-0 nova_compute[192810]:         <nova:user uuid="1a8900f8597741ad930d414e1db02d76">tempest-MultipleCreateTestJSON-413721927-project-member</nova:user>
Sep 30 21:24:41 compute-0 nova_compute[192810]:         <nova:project uuid="34c754fa0f364622a4433b9ba5718857">tempest-MultipleCreateTestJSON-413721927</nova:project>
Sep 30 21:24:41 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:24:41 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:24:41 compute-0 nova_compute[192810]:         <nova:port uuid="16e09506-1c57-4d4e-929f-be8c4514eede">
Sep 30 21:24:41 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:24:41 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:24:41 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:24:41 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:24:41 compute-0 nova_compute[192810]:     <system>
Sep 30 21:24:41 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:24:41 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:24:41 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:24:41 compute-0 nova_compute[192810]:       <entry name="serial">9a9711a6-d998-4e3a-9ee0-f003c146de6f</entry>
Sep 30 21:24:41 compute-0 nova_compute[192810]:       <entry name="uuid">9a9711a6-d998-4e3a-9ee0-f003c146de6f</entry>
Sep 30 21:24:41 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     </system>
Sep 30 21:24:41 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:24:41 compute-0 nova_compute[192810]:   <os>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:   </os>
Sep 30 21:24:41 compute-0 nova_compute[192810]:   <features>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:   </features>
Sep 30 21:24:41 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:24:41 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:24:41 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:24:41 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:24:41 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:24:41 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/9a9711a6-d998-4e3a-9ee0-f003c146de6f/disk"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:24:41 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/9a9711a6-d998-4e3a-9ee0-f003c146de6f/disk.config"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:24:41 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:0c:0a:78"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:       <target dev="tap16e09506-1c"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:24:41 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/9a9711a6-d998-4e3a-9ee0-f003c146de6f/console.log" append="off"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     <video>
Sep 30 21:24:41 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     </video>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:24:41 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:24:41 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:24:41 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:24:41 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:24:41 compute-0 nova_compute[192810]: </domain>
Sep 30 21:24:41 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.303 2 DEBUG nova.compute.manager [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Preparing to wait for external event network-vif-plugged-16e09506-1c57-4d4e-929f-be8c4514eede prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.304 2 DEBUG oslo_concurrency.lockutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Acquiring lock "9a9711a6-d998-4e3a-9ee0-f003c146de6f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.304 2 DEBUG oslo_concurrency.lockutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "9a9711a6-d998-4e3a-9ee0-f003c146de6f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.305 2 DEBUG oslo_concurrency.lockutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "9a9711a6-d998-4e3a-9ee0-f003c146de6f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.306 2 DEBUG nova.virt.libvirt.vif [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:24:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-966346934',display_name='tempest-tempest.common.compute-instance-966346934-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-966346934-1',id=48,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='34c754fa0f364622a4433b9ba5718857',ramdisk_id='',reservation_id='r-zs1xvy19',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-413721927',owner_user_name='tempest-Multipl
eCreateTestJSON-413721927-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:24:35Z,user_data=None,user_id='1a8900f8597741ad930d414e1db02d76',uuid=9a9711a6-d998-4e3a-9ee0-f003c146de6f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "16e09506-1c57-4d4e-929f-be8c4514eede", "address": "fa:16:3e:0c:0a:78", "network": {"id": "f4180897-fb47-4ee3-b86e-380da38f2ec5", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1027240753-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34c754fa0f364622a4433b9ba5718857", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16e09506-1c", "ovs_interfaceid": "16e09506-1c57-4d4e-929f-be8c4514eede", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.307 2 DEBUG nova.network.os_vif_util [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Converting VIF {"id": "16e09506-1c57-4d4e-929f-be8c4514eede", "address": "fa:16:3e:0c:0a:78", "network": {"id": "f4180897-fb47-4ee3-b86e-380da38f2ec5", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1027240753-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34c754fa0f364622a4433b9ba5718857", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16e09506-1c", "ovs_interfaceid": "16e09506-1c57-4d4e-929f-be8c4514eede", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.308 2 DEBUG nova.network.os_vif_util [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0c:0a:78,bridge_name='br-int',has_traffic_filtering=True,id=16e09506-1c57-4d4e-929f-be8c4514eede,network=Network(f4180897-fb47-4ee3-b86e-380da38f2ec5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16e09506-1c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.309 2 DEBUG os_vif [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0c:0a:78,bridge_name='br-int',has_traffic_filtering=True,id=16e09506-1c57-4d4e-929f-be8c4514eede,network=Network(f4180897-fb47-4ee3-b86e-380da38f2ec5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16e09506-1c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.310 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.310 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.311 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.315 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.315 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap16e09506-1c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.316 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap16e09506-1c, col_values=(('external_ids', {'iface-id': '16e09506-1c57-4d4e-929f-be8c4514eede', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0c:0a:78', 'vm-uuid': '9a9711a6-d998-4e3a-9ee0-f003c146de6f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:41 compute-0 NetworkManager[51733]: <info>  [1759267481.3504] manager: (tap16e09506-1c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/82)
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.357 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.358 2 INFO os_vif [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0c:0a:78,bridge_name='br-int',has_traffic_filtering=True,id=16e09506-1c57-4d4e-929f-be8c4514eede,network=Network(f4180897-fb47-4ee3-b86e-380da38f2ec5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16e09506-1c')
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.444 2 DEBUG nova.virt.libvirt.driver [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.445 2 DEBUG nova.virt.libvirt.driver [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.445 2 DEBUG nova.virt.libvirt.driver [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] No VIF found with MAC fa:16:3e:0c:0a:78, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.445 2 INFO nova.virt.libvirt.driver [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Using config drive
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.759 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:41 compute-0 nova_compute[192810]: 2025-09-30 21:24:41.786 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:24:42 compute-0 nova_compute[192810]: 2025-09-30 21:24:42.418 2 INFO nova.virt.libvirt.driver [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Creating config drive at /var/lib/nova/instances/9a9711a6-d998-4e3a-9ee0-f003c146de6f/disk.config
Sep 30 21:24:42 compute-0 nova_compute[192810]: 2025-09-30 21:24:42.424 2 DEBUG oslo_concurrency.processutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9a9711a6-d998-4e3a-9ee0-f003c146de6f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmparqvrzey execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:24:42 compute-0 nova_compute[192810]: 2025-09-30 21:24:42.551 2 DEBUG oslo_concurrency.processutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9a9711a6-d998-4e3a-9ee0-f003c146de6f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmparqvrzey" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:24:42 compute-0 kernel: tap16e09506-1c: entered promiscuous mode
Sep 30 21:24:42 compute-0 NetworkManager[51733]: <info>  [1759267482.6185] manager: (tap16e09506-1c): new Tun device (/org/freedesktop/NetworkManager/Devices/83)
Sep 30 21:24:42 compute-0 nova_compute[192810]: 2025-09-30 21:24:42.620 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:42 compute-0 ovn_controller[94912]: 2025-09-30T21:24:42Z|00191|binding|INFO|Claiming lport 16e09506-1c57-4d4e-929f-be8c4514eede for this chassis.
Sep 30 21:24:42 compute-0 ovn_controller[94912]: 2025-09-30T21:24:42Z|00192|binding|INFO|16e09506-1c57-4d4e-929f-be8c4514eede: Claiming fa:16:3e:0c:0a:78 10.100.0.5
Sep 30 21:24:42 compute-0 nova_compute[192810]: 2025-09-30 21:24:42.626 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:42 compute-0 nova_compute[192810]: 2025-09-30 21:24:42.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:24:42.642 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0c:0a:78 10.100.0.5'], port_security=['fa:16:3e:0c:0a:78 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f4180897-fb47-4ee3-b86e-380da38f2ec5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34c754fa0f364622a4433b9ba5718857', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1c375ca4-4abf-4405-95d5-43748e715058', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f21a560c-012c-4374-b9ef-0dc124a433df, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=16e09506-1c57-4d4e-929f-be8c4514eede) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:24:42.643 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 16e09506-1c57-4d4e-929f-be8c4514eede in datapath f4180897-fb47-4ee3-b86e-380da38f2ec5 bound to our chassis
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:24:42.645 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f4180897-fb47-4ee3-b86e-380da38f2ec5
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:24:42.656 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[b6bbe36b-b573-42ae-b1ad-d30d91383871]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:24:42.657 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf4180897-f1 in ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:24:42.659 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf4180897-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:24:42.659 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[beae54e1-4fda-46ed-ba8f-a60ba171c1c4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:24:42 compute-0 systemd-udevd[227014]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:24:42.661 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[65c9b393-0cd2-459a-93f6-8d602df1ba4e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:24:42 compute-0 systemd-machined[152794]: New machine qemu-24-instance-00000030.
Sep 30 21:24:42 compute-0 NetworkManager[51733]: <info>  [1759267482.6743] device (tap16e09506-1c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:24:42 compute-0 NetworkManager[51733]: <info>  [1759267482.6749] device (tap16e09506-1c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:24:42.678 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[1eaab460-3632-4d64-a284-2bf3eea72210]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:24:42 compute-0 systemd[1]: Started Virtual Machine qemu-24-instance-00000030.
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:24:42.703 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[d37cb5da-43d2-43b7-8ec7-cf5c6a0200ee]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:24:42 compute-0 ovn_controller[94912]: 2025-09-30T21:24:42Z|00193|binding|INFO|Setting lport 16e09506-1c57-4d4e-929f-be8c4514eede ovn-installed in OVS
Sep 30 21:24:42 compute-0 ovn_controller[94912]: 2025-09-30T21:24:42Z|00194|binding|INFO|Setting lport 16e09506-1c57-4d4e-929f-be8c4514eede up in Southbound
Sep 30 21:24:42 compute-0 nova_compute[192810]: 2025-09-30 21:24:42.706 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:24:42.727 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[176c6a0e-cabd-4f2e-819c-91c3b40aec73]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:24:42.731 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[fb4c6a62-b306-488e-915f-c2bf19373795]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:24:42 compute-0 NetworkManager[51733]: <info>  [1759267482.7328] manager: (tapf4180897-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/84)
Sep 30 21:24:42 compute-0 systemd-udevd[227017]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:24:42.770 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[6dacb5ba-0011-4fb9-a957-b2c91df7b30b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:24:42.773 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[28f076dc-7989-403a-aa74-3d8561fa8a23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:24:42 compute-0 nova_compute[192810]: 2025-09-30 21:24:42.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:24:42 compute-0 nova_compute[192810]: 2025-09-30 21:24:42.787 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:24:42 compute-0 NetworkManager[51733]: <info>  [1759267482.7945] device (tapf4180897-f0): carrier: link connected
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:24:42.802 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[7eebd898-8a5b-42ee-8ce0-d6bfb4e27ad7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:24:42.819 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[7fed757b-dc7f-4d9e-a069-dfb697270da5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf4180897-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:90:a9:c4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 417841, 'reachable_time': 27826, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227046, 'error': None, 'target': 'ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:24:42.834 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[7caa7236-5966-49d2-8ea3-fff1860fc16e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe90:a9c4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 417841, 'tstamp': 417841}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227047, 'error': None, 'target': 'ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:24:42.850 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[b4e58562-9cee-4e3b-ab40-c7b31c6326cc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf4180897-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:90:a9:c4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 417841, 'reachable_time': 27826, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227048, 'error': None, 'target': 'ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:24:42.883 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[ffac3df1-1b5f-4bcc-ba54-9a3a8ab77765]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:24:42.942 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[df113348-8d8b-4ca2-a6ba-906c4eae9af3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:24:42.944 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf4180897-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:24:42.944 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:24:42.945 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf4180897-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:24:42 compute-0 kernel: tapf4180897-f0: entered promiscuous mode
Sep 30 21:24:42 compute-0 NetworkManager[51733]: <info>  [1759267482.9479] manager: (tapf4180897-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/85)
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:24:42.950 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf4180897-f0, col_values=(('external_ids', {'iface-id': 'f5229ae3-c5a0-4511-b62d-05f7c88709d9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:24:42 compute-0 ovn_controller[94912]: 2025-09-30T21:24:42Z|00195|binding|INFO|Releasing lport f5229ae3-c5a0-4511-b62d-05f7c88709d9 from this chassis (sb_readonly=0)
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:24:42.953 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f4180897-fb47-4ee3-b86e-380da38f2ec5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f4180897-fb47-4ee3-b86e-380da38f2ec5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:24:42.953 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[ac2f00a4-619b-4656-a1f1-3fafa0e562b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:24:42.954 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-f4180897-fb47-4ee3-b86e-380da38f2ec5
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/f4180897-fb47-4ee3-b86e-380da38f2ec5.pid.haproxy
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID f4180897-fb47-4ee3-b86e-380da38f2ec5
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:24:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:24:42.956 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5', 'env', 'PROCESS_TAG=haproxy-f4180897-fb47-4ee3-b86e-380da38f2ec5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f4180897-fb47-4ee3-b86e-380da38f2ec5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:24:42 compute-0 nova_compute[192810]: 2025-09-30 21:24:42.956 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:42 compute-0 nova_compute[192810]: 2025-09-30 21:24:42.963 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:43 compute-0 podman[227087]: 2025-09-30 21:24:43.335625782 +0000 UTC m=+0.050920083 container create f9f7c9573bccd47a245abfec590a9e9cd5929fb5300f24ef43a897d09a265ed4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:24:43 compute-0 nova_compute[192810]: 2025-09-30 21:24:43.342 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267483.3413866, 9a9711a6-d998-4e3a-9ee0-f003c146de6f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:24:43 compute-0 nova_compute[192810]: 2025-09-30 21:24:43.342 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] VM Started (Lifecycle Event)
Sep 30 21:24:43 compute-0 systemd[1]: Started libpod-conmon-f9f7c9573bccd47a245abfec590a9e9cd5929fb5300f24ef43a897d09a265ed4.scope.
Sep 30 21:24:43 compute-0 nova_compute[192810]: 2025-09-30 21:24:43.376 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:24:43 compute-0 nova_compute[192810]: 2025-09-30 21:24:43.380 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267483.3414874, 9a9711a6-d998-4e3a-9ee0-f003c146de6f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:24:43 compute-0 nova_compute[192810]: 2025-09-30 21:24:43.380 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] VM Paused (Lifecycle Event)
Sep 30 21:24:43 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:24:43 compute-0 podman[227087]: 2025-09-30 21:24:43.310593502 +0000 UTC m=+0.025887823 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:24:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a02a86578bd830f74721e147a5dcb28faad58459c5110790eb87ffe1d7c64049/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:24:43 compute-0 nova_compute[192810]: 2025-09-30 21:24:43.411 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:24:43 compute-0 nova_compute[192810]: 2025-09-30 21:24:43.413 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:24:43 compute-0 podman[227087]: 2025-09-30 21:24:43.426239817 +0000 UTC m=+0.141534138 container init f9f7c9573bccd47a245abfec590a9e9cd5929fb5300f24ef43a897d09a265ed4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0)
Sep 30 21:24:43 compute-0 podman[227087]: 2025-09-30 21:24:43.434193444 +0000 UTC m=+0.149487765 container start f9f7c9573bccd47a245abfec590a9e9cd5929fb5300f24ef43a897d09a265ed4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:24:43 compute-0 nova_compute[192810]: 2025-09-30 21:24:43.450 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:24:43 compute-0 neutron-haproxy-ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5[227102]: [NOTICE]   (227106) : New worker (227108) forked
Sep 30 21:24:43 compute-0 neutron-haproxy-ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5[227102]: [NOTICE]   (227106) : Loading success.
Sep 30 21:24:43 compute-0 nova_compute[192810]: 2025-09-30 21:24:43.489 2 DEBUG nova.compute.manager [req-4fb66495-7408-49a4-95ea-9af3d13ce2a9 req-1fb516a1-df1f-40e9-9cc5-8f6acfb910b7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Received event network-vif-plugged-16e09506-1c57-4d4e-929f-be8c4514eede external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:24:43 compute-0 nova_compute[192810]: 2025-09-30 21:24:43.490 2 DEBUG oslo_concurrency.lockutils [req-4fb66495-7408-49a4-95ea-9af3d13ce2a9 req-1fb516a1-df1f-40e9-9cc5-8f6acfb910b7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "9a9711a6-d998-4e3a-9ee0-f003c146de6f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:43 compute-0 nova_compute[192810]: 2025-09-30 21:24:43.490 2 DEBUG oslo_concurrency.lockutils [req-4fb66495-7408-49a4-95ea-9af3d13ce2a9 req-1fb516a1-df1f-40e9-9cc5-8f6acfb910b7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "9a9711a6-d998-4e3a-9ee0-f003c146de6f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:43 compute-0 nova_compute[192810]: 2025-09-30 21:24:43.490 2 DEBUG oslo_concurrency.lockutils [req-4fb66495-7408-49a4-95ea-9af3d13ce2a9 req-1fb516a1-df1f-40e9-9cc5-8f6acfb910b7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "9a9711a6-d998-4e3a-9ee0-f003c146de6f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:24:43 compute-0 nova_compute[192810]: 2025-09-30 21:24:43.490 2 DEBUG nova.compute.manager [req-4fb66495-7408-49a4-95ea-9af3d13ce2a9 req-1fb516a1-df1f-40e9-9cc5-8f6acfb910b7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Processing event network-vif-plugged-16e09506-1c57-4d4e-929f-be8c4514eede _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:24:43 compute-0 nova_compute[192810]: 2025-09-30 21:24:43.491 2 DEBUG nova.compute.manager [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:24:43 compute-0 nova_compute[192810]: 2025-09-30 21:24:43.495 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267483.494859, 9a9711a6-d998-4e3a-9ee0-f003c146de6f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:24:43 compute-0 nova_compute[192810]: 2025-09-30 21:24:43.496 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] VM Resumed (Lifecycle Event)
Sep 30 21:24:43 compute-0 nova_compute[192810]: 2025-09-30 21:24:43.497 2 DEBUG nova.virt.libvirt.driver [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:24:43 compute-0 nova_compute[192810]: 2025-09-30 21:24:43.502 2 INFO nova.virt.libvirt.driver [-] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Instance spawned successfully.
Sep 30 21:24:43 compute-0 nova_compute[192810]: 2025-09-30 21:24:43.503 2 DEBUG nova.virt.libvirt.driver [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:24:43 compute-0 nova_compute[192810]: 2025-09-30 21:24:43.541 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:24:43 compute-0 nova_compute[192810]: 2025-09-30 21:24:43.551 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:24:43 compute-0 nova_compute[192810]: 2025-09-30 21:24:43.554 2 DEBUG nova.virt.libvirt.driver [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:24:43 compute-0 nova_compute[192810]: 2025-09-30 21:24:43.554 2 DEBUG nova.virt.libvirt.driver [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:24:43 compute-0 nova_compute[192810]: 2025-09-30 21:24:43.555 2 DEBUG nova.virt.libvirt.driver [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:24:43 compute-0 nova_compute[192810]: 2025-09-30 21:24:43.555 2 DEBUG nova.virt.libvirt.driver [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:24:43 compute-0 nova_compute[192810]: 2025-09-30 21:24:43.556 2 DEBUG nova.virt.libvirt.driver [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:24:43 compute-0 nova_compute[192810]: 2025-09-30 21:24:43.556 2 DEBUG nova.virt.libvirt.driver [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:24:43 compute-0 nova_compute[192810]: 2025-09-30 21:24:43.634 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:24:43 compute-0 nova_compute[192810]: 2025-09-30 21:24:43.675 2 INFO nova.compute.manager [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Took 7.95 seconds to spawn the instance on the hypervisor.
Sep 30 21:24:43 compute-0 nova_compute[192810]: 2025-09-30 21:24:43.675 2 DEBUG nova.compute.manager [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:24:43 compute-0 nova_compute[192810]: 2025-09-30 21:24:43.773 2 INFO nova.compute.manager [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Took 8.62 seconds to build instance.
Sep 30 21:24:43 compute-0 nova_compute[192810]: 2025-09-30 21:24:43.782 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:24:43 compute-0 nova_compute[192810]: 2025-09-30 21:24:43.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:24:43 compute-0 nova_compute[192810]: 2025-09-30 21:24:43.805 2 DEBUG oslo_concurrency.lockutils [None req-8c3a7000-c4a0-4dc6-97f0-1a7acb1fa6fe 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "9a9711a6-d998-4e3a-9ee0-f003c146de6f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.779s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:24:43 compute-0 nova_compute[192810]: 2025-09-30 21:24:43.823 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:43 compute-0 nova_compute[192810]: 2025-09-30 21:24:43.823 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:43 compute-0 nova_compute[192810]: 2025-09-30 21:24:43.824 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:24:43 compute-0 nova_compute[192810]: 2025-09-30 21:24:43.824 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:24:43 compute-0 nova_compute[192810]: 2025-09-30 21:24:43.900 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9a9711a6-d998-4e3a-9ee0-f003c146de6f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.908 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f', 'name': 'tempest-tempest.common.compute-instance-966346934-1', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000030', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '34c754fa0f364622a4433b9ba5718857', 'user_id': '1a8900f8597741ad930d414e1db02d76', 'hostId': 'dc81ec0910ffce46e516e475dce4dc4c52ad8f7e6923db1bec385e2a', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.909 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.937 12 DEBUG ceilometer.compute.pollsters [-] 9a9711a6-d998-4e3a-9ee0-f003c146de6f/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.938 12 DEBUG ceilometer.compute.pollsters [-] 9a9711a6-d998-4e3a-9ee0-f003c146de6f/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '05c88c76-ec32-4600-b362-47aaf8e6c324', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f-vda', 'timestamp': '2025-09-30T21:24:43.909969', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-1', 'name': 'instance-00000030', 'instance_id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f', 'instance_type': 'm1.nano', 'host': 'dc81ec0910ffce46e516e475dce4dc4c52ad8f7e6923db1bec385e2a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e1f4956a-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 4179.597483854, 'message_signature': '39a5773296c9e247740f718f5cad0390231b407b253869931141c3b4e8f00dc3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f-sda', 'timestamp': '2025-09-30T21:24:43.909969', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-1', 'name': 'instance-00000030', 'instance_id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f', 'instance_type': 'm1.nano', 'host': 'dc81ec0910ffce46e516e475dce4dc4c52ad8f7e6923db1bec385e2a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e1f4a780-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 4179.597483854, 'message_signature': '2e874c55f436e4f60db8af80d5345416c613e5ff8d6f38189b490733a385ae52'}]}, 'timestamp': '2025-09-30 21:24:43.939198', '_unique_id': '9aef734746694b86965b8bc5c4e607d3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.941 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.942 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.942 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.942 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-tempest.common.compute-instance-966346934-1>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-tempest.common.compute-instance-966346934-1>]
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.943 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.947 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 9a9711a6-d998-4e3a-9ee0-f003c146de6f / tap16e09506-1c inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.948 12 DEBUG ceilometer.compute.pollsters [-] 9a9711a6-d998-4e3a-9ee0-f003c146de6f/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '09955caf-1ade-4bf7-adac-0b98c4790865', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': 'instance-00000030-9a9711a6-d998-4e3a-9ee0-f003c146de6f-tap16e09506-1c', 'timestamp': '2025-09-30T21:24:43.943340', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-1', 'name': 'tap16e09506-1c', 'instance_id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f', 'instance_type': 'm1.nano', 'host': 'dc81ec0910ffce46e516e475dce4dc4c52ad8f7e6923db1bec385e2a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0c:0a:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap16e09506-1c'}, 'message_id': 'e1f61f70-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 4179.630909722, 'message_signature': '044008275f8c867d4f6b980ec02438a3c54e72981da95609001656813c170a85'}]}, 'timestamp': '2025-09-30 21:24:43.948932', '_unique_id': '2acf15ab5be9486c9e6c02c32a8c9065'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.950 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.951 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.951 12 DEBUG ceilometer.compute.pollsters [-] 9a9711a6-d998-4e3a-9ee0-f003c146de6f/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.951 12 DEBUG ceilometer.compute.pollsters [-] 9a9711a6-d998-4e3a-9ee0-f003c146de6f/disk.device.read.latency volume: 3308362 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eda66dbf-70d7-4b4c-9073-486a4cf1f781', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f-vda', 'timestamp': '2025-09-30T21:24:43.951462', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-1', 'name': 'instance-00000030', 'instance_id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f', 'instance_type': 'm1.nano', 'host': 'dc81ec0910ffce46e516e475dce4dc4c52ad8f7e6923db1bec385e2a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e1f696ee-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 4179.597483854, 'message_signature': 'c3418694491db13174c7942bc91da29a562bc73f74370c02a531221d8486b82b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3308362, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f-sda', 'timestamp': '2025-09-30T21:24:43.951462', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-1', 'name': 'instance-00000030', 'instance_id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f', 'instance_type': 'm1.nano', 'host': 'dc81ec0910ffce46e516e475dce4dc4c52ad8f7e6923db1bec385e2a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e1f6a300-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 4179.597483854, 'message_signature': 'b1c7acd12bce2fb793dec4007b0497b4e44c9a2a7898fbc23f0d55d85b6aa410'}]}, 'timestamp': '2025-09-30 21:24:43.952108', '_unique_id': 'fb877beb3a5e409bb90a48968409631d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.953 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.954 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-tempest.common.compute-instance-966346934-1>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-tempest.common.compute-instance-966346934-1>]
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.954 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.954 12 DEBUG ceilometer.compute.pollsters [-] 9a9711a6-d998-4e3a-9ee0-f003c146de6f/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.954 12 DEBUG ceilometer.compute.pollsters [-] 9a9711a6-d998-4e3a-9ee0-f003c146de6f/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '60389063-259a-4e12-9cf9-6865156148b0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f-vda', 'timestamp': '2025-09-30T21:24:43.954280', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-1', 'name': 'instance-00000030', 'instance_id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f', 'instance_type': 'm1.nano', 'host': 'dc81ec0910ffce46e516e475dce4dc4c52ad8f7e6923db1bec385e2a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e1f701d8-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 4179.597483854, 'message_signature': '85c9ba3685486ce7752861e5c6c7df0760747451b6f3f39c8ef82c30e74e0a89'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f-sda', 'timestamp': '2025-09-30T21:24:43.954280', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-1', 'name': 'instance-00000030', 'instance_id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f', 'instance_type': 'm1.nano', 'host': 'dc81ec0910ffce46e516e475dce4dc4c52ad8f7e6923db1bec385e2a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e1f70e8a-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 4179.597483854, 'message_signature': '1618bf565af99c1d92229f7adac4294b2ba75676baff9f92e9719f19ecfda4c2'}]}, 'timestamp': '2025-09-30 21:24:43.954896', '_unique_id': 'b14316cd308242019d0ae4735f46d3bc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.955 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.956 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.956 12 DEBUG ceilometer.compute.pollsters [-] 9a9711a6-d998-4e3a-9ee0-f003c146de6f/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2bd8eee3-5d72-42b7-86fb-602de1e0d2c8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 110, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': 'instance-00000030-9a9711a6-d998-4e3a-9ee0-f003c146de6f-tap16e09506-1c', 'timestamp': '2025-09-30T21:24:43.956642', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-1', 'name': 'tap16e09506-1c', 'instance_id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f', 'instance_type': 'm1.nano', 'host': 'dc81ec0910ffce46e516e475dce4dc4c52ad8f7e6923db1bec385e2a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0c:0a:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap16e09506-1c'}, 'message_id': 'e1f75e4e-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 4179.630909722, 'message_signature': 'ce996a6c403012763e6e7a8a0ea7c48a41ffd6e8fe133c564db3b365370a8308'}]}, 'timestamp': '2025-09-30 21:24:43.956922', '_unique_id': 'd728d5ccf87f4976a8e17259c90d8804'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.957 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.958 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.958 12 DEBUG ceilometer.compute.pollsters [-] 9a9711a6-d998-4e3a-9ee0-f003c146de6f/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.958 12 DEBUG ceilometer.compute.pollsters [-] 9a9711a6-d998-4e3a-9ee0-f003c146de6f/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '023f4708-0cb7-4ac3-86db-1b2d2135ba3d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f-vda', 'timestamp': '2025-09-30T21:24:43.958158', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-1', 'name': 'instance-00000030', 'instance_id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f', 'instance_type': 'm1.nano', 'host': 'dc81ec0910ffce46e516e475dce4dc4c52ad8f7e6923db1bec385e2a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e1f799e0-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 4179.597483854, 'message_signature': 'e3d04141c448219a4186e36b3ae1673eb125ff46ce36762e073d41ac131370a6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f-sda', 'timestamp': '2025-09-30T21:24:43.958158', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-1', 'name': 'instance-00000030', 'instance_id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f', 'instance_type': 'm1.nano', 'host': 'dc81ec0910ffce46e516e475dce4dc4c52ad8f7e6923db1bec385e2a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e1f7a5d4-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 4179.597483854, 'message_signature': '5e0a17035cd3fcca1fedf2accf216da747442466fef8cfb41ee641f7ca19ef0a'}]}, 'timestamp': '2025-09-30 21:24:43.958773', '_unique_id': '0c2e9c0866f744c8ad2000532aeec4b6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.959 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.960 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.978 12 DEBUG ceilometer.compute.pollsters [-] 9a9711a6-d998-4e3a-9ee0-f003c146de6f/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.979 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 9a9711a6-d998-4e3a-9ee0-f003c146de6f: ceilometer.compute.pollsters.NoVolumeException
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.979 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.979 12 DEBUG ceilometer.compute.pollsters [-] 9a9711a6-d998-4e3a-9ee0-f003c146de6f/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '500ee4f1-1c08-41f3-8380-bc8e34eb840d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': 'instance-00000030-9a9711a6-d998-4e3a-9ee0-f003c146de6f-tap16e09506-1c', 'timestamp': '2025-09-30T21:24:43.979248', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-1', 'name': 'tap16e09506-1c', 'instance_id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f', 'instance_type': 'm1.nano', 'host': 'dc81ec0910ffce46e516e475dce4dc4c52ad8f7e6923db1bec385e2a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0c:0a:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap16e09506-1c'}, 'message_id': 'e1fad682-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 4179.630909722, 'message_signature': 'b7397bc51518691001a64e1c1a3d97b46358c670b04dbefb97e0bbe4ed3cfd4e'}]}, 'timestamp': '2025-09-30 21:24:43.979804', '_unique_id': '177ed2d9cdbf4496b38bffaf775c292e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.981 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 DEBUG ceilometer.compute.pollsters [-] 9a9711a6-d998-4e3a-9ee0-f003c146de6f/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bd28c12e-ab7d-4a3d-970b-0acdbdf1c4b4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': 'instance-00000030-9a9711a6-d998-4e3a-9ee0-f003c146de6f-tap16e09506-1c', 'timestamp': '2025-09-30T21:24:43.981990', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-1', 'name': 'tap16e09506-1c', 'instance_id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f', 'instance_type': 'm1.nano', 'host': 'dc81ec0910ffce46e516e475dce4dc4c52ad8f7e6923db1bec385e2a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0c:0a:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap16e09506-1c'}, 'message_id': 'e1fb3bb8-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 4179.630909722, 'message_signature': '9ed0c3fc93e77cf30a37fbed4f62b8831de9e9eed68c901afd216e049898805e'}]}, 'timestamp': '2025-09-30 21:24:43.982246', '_unique_id': 'ac0d298de33c4524a110f52390f08fa9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.982 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.983 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.983 12 DEBUG ceilometer.compute.pollsters [-] 9a9711a6-d998-4e3a-9ee0-f003c146de6f/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.983 12 DEBUG ceilometer.compute.pollsters [-] 9a9711a6-d998-4e3a-9ee0-f003c146de6f/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '36bba1e9-dd54-4a73-9029-2e0becb0a7a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f-vda', 'timestamp': '2025-09-30T21:24:43.983408', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-1', 'name': 'instance-00000030', 'instance_id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f', 'instance_type': 'm1.nano', 'host': 'dc81ec0910ffce46e516e475dce4dc4c52ad8f7e6923db1bec385e2a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e1fb72ae-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 4179.597483854, 'message_signature': 'afce64529875035008f573368e6d16f88e60983bf661ce6d14cc6eefc13cd696'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f-sda', 'timestamp': '2025-09-30T21:24:43.983408', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-1', 'name': 'instance-00000030', 'instance_id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f', 'instance_type': 'm1.nano', 'host': 'dc81ec0910ffce46e516e475dce4dc4c52ad8f7e6923db1bec385e2a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e1fb7d1c-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 4179.597483854, 'message_signature': '84e4fb9c3a4293a5db985479552fde837b0861b7e4c4f65d0c4de1e4f9c8c3d5'}]}, 'timestamp': '2025-09-30 21:24:43.983899', '_unique_id': '09fe7f30dd37491d9ce3140317a69ffa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.984 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.985 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.996 12 DEBUG ceilometer.compute.pollsters [-] 9a9711a6-d998-4e3a-9ee0-f003c146de6f/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.997 12 DEBUG ceilometer.compute.pollsters [-] 9a9711a6-d998-4e3a-9ee0-f003c146de6f/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '07d8bcbb-05cd-4f15-965e-90c7faa43b16', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f-vda', 'timestamp': '2025-09-30T21:24:43.985085', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-1', 'name': 'instance-00000030', 'instance_id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f', 'instance_type': 'm1.nano', 'host': 'dc81ec0910ffce46e516e475dce4dc4c52ad8f7e6923db1bec385e2a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e1fd7af4-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 4179.672612096, 'message_signature': '6efa1e894ab72741ae5ff7dba79ba629a21e17f2984cd27de06da3b264c97fed'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f-sda', 'timestamp': '2025-09-30T21:24:43.985085', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-1', 'name': 'instance-00000030', 'instance_id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f', 'instance_type': 'm1.nano', 'host': 'dc81ec0910ffce46e516e475dce4dc4c52ad8f7e6923db1bec385e2a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e1fd86ac-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 4179.672612096, 'message_signature': 'b48a3b2f91fbbefc8876c99305e2bdfd833dcd5b8fbc339ce1b4d52ef67f6c56'}]}, 'timestamp': '2025-09-30 21:24:43.997266', '_unique_id': '27c3a0d4cbef4499806308b5663cf06b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.998 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.999 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Sep 30 21:24:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:43.999 12 DEBUG ceilometer.compute.pollsters [-] 9a9711a6-d998-4e3a-9ee0-f003c146de6f/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f3d9d531-d721-42d4-97c1-a85c42833a4e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': 'instance-00000030-9a9711a6-d998-4e3a-9ee0-f003c146de6f-tap16e09506-1c', 'timestamp': '2025-09-30T21:24:43.999380', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-1', 'name': 'tap16e09506-1c', 'instance_id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f', 'instance_type': 'm1.nano', 'host': 'dc81ec0910ffce46e516e475dce4dc4c52ad8f7e6923db1bec385e2a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0c:0a:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap16e09506-1c'}, 'message_id': 'e1fde2fa-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 4179.630909722, 'message_signature': '87f02c5478f28ed23ceddcc38bf027da84d0696fdbf9c96117883ee75135195f'}]}, 'timestamp': '2025-09-30 21:24:43.999641', '_unique_id': 'ecb629354c4a42448267e868f0679a91'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.000 12 DEBUG ceilometer.compute.pollsters [-] 9a9711a6-d998-4e3a-9ee0-f003c146de6f/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8b1457b4-a450-4a48-8716-0082d2b72f10', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': 'instance-00000030-9a9711a6-d998-4e3a-9ee0-f003c146de6f-tap16e09506-1c', 'timestamp': '2025-09-30T21:24:44.000754', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-1', 'name': 'tap16e09506-1c', 'instance_id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f', 'instance_type': 'm1.nano', 'host': 'dc81ec0910ffce46e516e475dce4dc4c52ad8f7e6923db1bec385e2a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0c:0a:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap16e09506-1c'}, 'message_id': 'e1fe181a-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 4179.630909722, 'message_signature': 'ac1cd0ca6c1053e5a9cfbb3075bba8cbfb3b0a254c3e5fce11be153fb8e7ad36'}]}, 'timestamp': '2025-09-30 21:24:44.001005', '_unique_id': 'cf7485ff5645499cb9cdc5e69612edff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.001 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.002 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.002 12 DEBUG ceilometer.compute.pollsters [-] 9a9711a6-d998-4e3a-9ee0-f003c146de6f/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cecedb82-70b8-444c-9909-4cae38c84932', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': 'instance-00000030-9a9711a6-d998-4e3a-9ee0-f003c146de6f-tap16e09506-1c', 'timestamp': '2025-09-30T21:24:44.002350', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-1', 'name': 'tap16e09506-1c', 'instance_id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f', 'instance_type': 'm1.nano', 'host': 'dc81ec0910ffce46e516e475dce4dc4c52ad8f7e6923db1bec385e2a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0c:0a:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap16e09506-1c'}, 'message_id': 'e1fe56d6-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 4179.630909722, 'message_signature': 'b4e88cc06ca3ea2c696ccb9a27570ac5ebc8eef75840b34f1ea176aa365c1465'}]}, 'timestamp': '2025-09-30 21:24:44.002651', '_unique_id': 'a8deeac6e2b94617bb580e63d85ac203'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.003 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.004 12 DEBUG ceilometer.compute.pollsters [-] 9a9711a6-d998-4e3a-9ee0-f003c146de6f/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.004 12 DEBUG ceilometer.compute.pollsters [-] 9a9711a6-d998-4e3a-9ee0-f003c146de6f/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '040f66f2-c7ed-49b9-924a-16a8e8fab8e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f-vda', 'timestamp': '2025-09-30T21:24:44.004033', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-1', 'name': 'instance-00000030', 'instance_id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f', 'instance_type': 'm1.nano', 'host': 'dc81ec0910ffce46e516e475dce4dc4c52ad8f7e6923db1bec385e2a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e1fe98da-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 4179.597483854, 'message_signature': 'd335db7d2fa0cab5a64893d5b3de651dd3e86b00dd11cd96cbb6f3a90d2d6bdc'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f-sda', 'timestamp': '2025-09-30T21:24:44.004033', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-1', 'name': 'instance-00000030', 'instance_id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f', 'instance_type': 'm1.nano', 'host': 'dc81ec0910ffce46e516e475dce4dc4c52ad8f7e6923db1bec385e2a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e1fea906-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 4179.597483854, 'message_signature': 'c228c315a27a32420770163da6d75fc2d37501bfd3ebd713502b9d2acb73c978'}]}, 'timestamp': '2025-09-30 21:24:44.004695', '_unique_id': '479f3a1af0434480a50d1f877fb3f60f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.005 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.006 12 DEBUG ceilometer.compute.pollsters [-] 9a9711a6-d998-4e3a-9ee0-f003c146de6f/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.006 12 DEBUG ceilometer.compute.pollsters [-] 9a9711a6-d998-4e3a-9ee0-f003c146de6f/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1e99267a-f6d2-488a-8584-2610253242b8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f-vda', 'timestamp': '2025-09-30T21:24:44.005977', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-1', 'name': 'instance-00000030', 'instance_id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f', 'instance_type': 'm1.nano', 'host': 'dc81ec0910ffce46e516e475dce4dc4c52ad8f7e6923db1bec385e2a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e1fee592-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 4179.672612096, 'message_signature': '3bac5e73127bec7ea7a8963e76f950d07ea37ab635c4158282a7a2b00fddbb66'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f-sda', 'timestamp': '2025-09-30T21:24:44.005977', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-1', 'name': 'instance-00000030', 'instance_id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f', 'instance_type': 'm1.nano', 'host': 'dc81ec0910ffce46e516e475dce4dc4c52ad8f7e6923db1bec385e2a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e1feefe2-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 4179.672612096, 'message_signature': '0d199068e083c2207ae69d34d3d3b487aad6cd50bf5db297f12c6676585554c5'}]}, 'timestamp': '2025-09-30 21:24:44.006532', '_unique_id': '9314eb281f204a67bf28987b97091a77'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.007 12 DEBUG ceilometer.compute.pollsters [-] 9a9711a6-d998-4e3a-9ee0-f003c146de6f/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 DEBUG ceilometer.compute.pollsters [-] 9a9711a6-d998-4e3a-9ee0-f003c146de6f/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2886e297-3e12-4bb7-a966-31cf36dbcea5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f-vda', 'timestamp': '2025-09-30T21:24:44.007780', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-1', 'name': 'instance-00000030', 'instance_id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f', 'instance_type': 'm1.nano', 'host': 'dc81ec0910ffce46e516e475dce4dc4c52ad8f7e6923db1bec385e2a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e1ff2ba6-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 4179.672612096, 'message_signature': 'b25f13f87f9cde3fa819bb7753ebe85fecd9798ed5fcf32b8330ae04885b5cb9'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f-sda', 'timestamp': '2025-09-30T21:24:44.007780', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-1', 'name': 'instance-00000030', 'instance_id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f', 'instance_type': 'm1.nano', 'host': 'dc81ec0910ffce46e516e475dce4dc4c52ad8f7e6923db1bec385e2a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e1ff3538-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 4179.672612096, 'message_signature': 'cb471118020cf1d2a6e4631caf1bb06e0da105627aa0ec22293a7dd15bc11a8d'}]}, 'timestamp': '2025-09-30 21:24:44.008267', '_unique_id': 'f1b8848210ef48d884381ae1b2bce8df'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.008 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.009 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.009 12 DEBUG ceilometer.compute.pollsters [-] 9a9711a6-d998-4e3a-9ee0-f003c146de6f/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '52d425b5-0406-4b90-ac33-2fa2da2ebec6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': 'instance-00000030-9a9711a6-d998-4e3a-9ee0-f003c146de6f-tap16e09506-1c', 'timestamp': '2025-09-30T21:24:44.009414', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-1', 'name': 'tap16e09506-1c', 'instance_id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f', 'instance_type': 'm1.nano', 'host': 'dc81ec0910ffce46e516e475dce4dc4c52ad8f7e6923db1bec385e2a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0c:0a:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap16e09506-1c'}, 'message_id': 'e1ff6c74-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 4179.630909722, 'message_signature': '05dbcbea6fcb35770bd92b0e6eed2b110ab8f67673cf72762558d89975fb9ed1'}]}, 'timestamp': '2025-09-30 21:24:44.009699', '_unique_id': '8f0409fff61142d4a38bf5883eb599e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.010 12 DEBUG ceilometer.compute.pollsters [-] 9a9711a6-d998-4e3a-9ee0-f003c146de6f/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bf2480e5-cb05-451b-aaf0-085e3594ee74', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': 'instance-00000030-9a9711a6-d998-4e3a-9ee0-f003c146de6f-tap16e09506-1c', 'timestamp': '2025-09-30T21:24:44.010795', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-1', 'name': 'tap16e09506-1c', 'instance_id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f', 'instance_type': 'm1.nano', 'host': 'dc81ec0910ffce46e516e475dce4dc4c52ad8f7e6923db1bec385e2a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0c:0a:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap16e09506-1c'}, 'message_id': 'e1ffa02c-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 4179.630909722, 'message_signature': '5d600c192873cbec29874759a41886f5168b83cddaf95b1510dc1b2be2246d83'}]}, 'timestamp': '2025-09-30 21:24:44.011019', '_unique_id': 'd2954092e7de45e7a9f991bad1145a91'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.011 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 DEBUG ceilometer.compute.pollsters [-] 9a9711a6-d998-4e3a-9ee0-f003c146de6f/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '735da040-d909-400d-8308-f3f481a6714b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': 'instance-00000030-9a9711a6-d998-4e3a-9ee0-f003c146de6f-tap16e09506-1c', 'timestamp': '2025-09-30T21:24:44.012049', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-1', 'name': 'tap16e09506-1c', 'instance_id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f', 'instance_type': 'm1.nano', 'host': 'dc81ec0910ffce46e516e475dce4dc4c52ad8f7e6923db1bec385e2a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0c:0a:78', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap16e09506-1c'}, 'message_id': 'e1ffd18c-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 4179.630909722, 'message_signature': 'dfc79d226aaf91fcd3700c288e95a155af91addfd055fa7333947ff6e29a3ce7'}]}, 'timestamp': '2025-09-30 21:24:44.012287', '_unique_id': 'af7c106b411f4c58b74090a7697298f8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.012 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.013 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.013 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.013 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-tempest.common.compute-instance-966346934-1>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-tempest.common.compute-instance-966346934-1>]
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.013 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.013 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.013 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-tempest.common.compute-instance-966346934-1>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-tempest.common.compute-instance-966346934-1>]
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 DEBUG ceilometer.compute.pollsters [-] 9a9711a6-d998-4e3a-9ee0-f003c146de6f/cpu volume: 440000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3cdd77c8-9412-4b77-9bd7-30a1e76899d6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 440000000, 'user_id': '1a8900f8597741ad930d414e1db02d76', 'user_name': None, 'project_id': '34c754fa0f364622a4433b9ba5718857', 'project_name': None, 'resource_id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f', 'timestamp': '2025-09-30T21:24:44.014060', 'resource_metadata': {'display_name': 'tempest-tempest.common.compute-instance-966346934-1', 'name': 'instance-00000030', 'instance_id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f', 'instance_type': 'm1.nano', 'host': 'dc81ec0910ffce46e516e475dce4dc4c52ad8f7e6923db1bec385e2a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'e200204c-9e43-11f0-a153-fa163e09b122', 'monotonic_time': 4179.665972781, 'message_signature': '4b857bab7969dd1d376ee3156051b1ed3b3a413882610ec06654a3b40442c83a'}]}, 'timestamp': '2025-09-30 21:24:44.014325', '_unique_id': 'dcc24259b41443aca8c146d1cd694180'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:24:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:24:44.014 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:24:44 compute-0 nova_compute[192810]: 2025-09-30 21:24:44.017 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9a9711a6-d998-4e3a-9ee0-f003c146de6f/disk --force-share --output=json" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:24:44 compute-0 nova_compute[192810]: 2025-09-30 21:24:44.020 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9a9711a6-d998-4e3a-9ee0-f003c146de6f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:24:44 compute-0 nova_compute[192810]: 2025-09-30 21:24:44.076 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9a9711a6-d998-4e3a-9ee0-f003c146de6f/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:24:44 compute-0 nova_compute[192810]: 2025-09-30 21:24:44.235 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:24:44 compute-0 nova_compute[192810]: 2025-09-30 21:24:44.236 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5650MB free_disk=73.42780685424805GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:24:44 compute-0 nova_compute[192810]: 2025-09-30 21:24:44.236 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:44 compute-0 nova_compute[192810]: 2025-09-30 21:24:44.237 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:44 compute-0 nova_compute[192810]: 2025-09-30 21:24:44.504 2 DEBUG nova.network.neutron [req-1926f494-6f3d-4637-a19f-33eff2d03444 req-ed223e5e-7fdf-47e5-b3ef-7382b9d841e6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Updated VIF entry in instance network info cache for port 16e09506-1c57-4d4e-929f-be8c4514eede. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:24:44 compute-0 nova_compute[192810]: 2025-09-30 21:24:44.504 2 DEBUG nova.network.neutron [req-1926f494-6f3d-4637-a19f-33eff2d03444 req-ed223e5e-7fdf-47e5-b3ef-7382b9d841e6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Updating instance_info_cache with network_info: [{"id": "16e09506-1c57-4d4e-929f-be8c4514eede", "address": "fa:16:3e:0c:0a:78", "network": {"id": "f4180897-fb47-4ee3-b86e-380da38f2ec5", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1027240753-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34c754fa0f364622a4433b9ba5718857", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16e09506-1c", "ovs_interfaceid": "16e09506-1c57-4d4e-929f-be8c4514eede", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:24:44 compute-0 nova_compute[192810]: 2025-09-30 21:24:44.522 2 DEBUG oslo_concurrency.lockutils [req-1926f494-6f3d-4637-a19f-33eff2d03444 req-ed223e5e-7fdf-47e5-b3ef-7382b9d841e6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-9a9711a6-d998-4e3a-9ee0-f003c146de6f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:24:44 compute-0 nova_compute[192810]: 2025-09-30 21:24:44.548 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Instance 9a9711a6-d998-4e3a-9ee0-f003c146de6f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:24:44 compute-0 nova_compute[192810]: 2025-09-30 21:24:44.549 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:24:44 compute-0 nova_compute[192810]: 2025-09-30 21:24:44.549 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:24:44 compute-0 nova_compute[192810]: 2025-09-30 21:24:44.641 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:24:44 compute-0 nova_compute[192810]: 2025-09-30 21:24:44.664 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:24:44 compute-0 nova_compute[192810]: 2025-09-30 21:24:44.701 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:24:44 compute-0 nova_compute[192810]: 2025-09-30 21:24:44.701 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.465s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:24:45 compute-0 nova_compute[192810]: 2025-09-30 21:24:45.700 2 DEBUG nova.compute.manager [req-d4098464-2e7f-4e7a-a568-b86628964658 req-4eafeadb-2bc7-4303-9edd-4a5b8c18e493 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Received event network-vif-plugged-16e09506-1c57-4d4e-929f-be8c4514eede external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:24:45 compute-0 nova_compute[192810]: 2025-09-30 21:24:45.700 2 DEBUG oslo_concurrency.lockutils [req-d4098464-2e7f-4e7a-a568-b86628964658 req-4eafeadb-2bc7-4303-9edd-4a5b8c18e493 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "9a9711a6-d998-4e3a-9ee0-f003c146de6f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:45 compute-0 nova_compute[192810]: 2025-09-30 21:24:45.701 2 DEBUG oslo_concurrency.lockutils [req-d4098464-2e7f-4e7a-a568-b86628964658 req-4eafeadb-2bc7-4303-9edd-4a5b8c18e493 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "9a9711a6-d998-4e3a-9ee0-f003c146de6f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:45 compute-0 nova_compute[192810]: 2025-09-30 21:24:45.701 2 DEBUG oslo_concurrency.lockutils [req-d4098464-2e7f-4e7a-a568-b86628964658 req-4eafeadb-2bc7-4303-9edd-4a5b8c18e493 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "9a9711a6-d998-4e3a-9ee0-f003c146de6f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:24:45 compute-0 nova_compute[192810]: 2025-09-30 21:24:45.701 2 DEBUG nova.compute.manager [req-d4098464-2e7f-4e7a-a568-b86628964658 req-4eafeadb-2bc7-4303-9edd-4a5b8c18e493 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] No waiting events found dispatching network-vif-plugged-16e09506-1c57-4d4e-929f-be8c4514eede pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:24:45 compute-0 nova_compute[192810]: 2025-09-30 21:24:45.701 2 WARNING nova.compute.manager [req-d4098464-2e7f-4e7a-a568-b86628964658 req-4eafeadb-2bc7-4303-9edd-4a5b8c18e493 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Received unexpected event network-vif-plugged-16e09506-1c57-4d4e-929f-be8c4514eede for instance with vm_state active and task_state None.
Sep 30 21:24:45 compute-0 nova_compute[192810]: 2025-09-30 21:24:45.702 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:24:45 compute-0 nova_compute[192810]: 2025-09-30 21:24:45.702 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:24:46 compute-0 nova_compute[192810]: 2025-09-30 21:24:46.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:46 compute-0 nova_compute[192810]: 2025-09-30 21:24:46.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:46 compute-0 nova_compute[192810]: 2025-09-30 21:24:46.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:24:46 compute-0 nova_compute[192810]: 2025-09-30 21:24:46.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:24:46 compute-0 nova_compute[192810]: 2025-09-30 21:24:46.821 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 21:24:47 compute-0 podman[227124]: 2025-09-30 21:24:47.373871403 +0000 UTC m=+0.090981144 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:24:47 compute-0 podman[227125]: 2025-09-30 21:24:47.379158604 +0000 UTC m=+0.097121066 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., distribution-scope=public, name=ubi9-minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, container_name=openstack_network_exporter, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.expose-services=, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Sep 30 21:24:47 compute-0 nova_compute[192810]: 2025-09-30 21:24:47.855 2 DEBUG oslo_concurrency.lockutils [None req-54afa011-cbfb-41e4-ac3c-307d3cad818e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Acquiring lock "9a9711a6-d998-4e3a-9ee0-f003c146de6f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:47 compute-0 nova_compute[192810]: 2025-09-30 21:24:47.856 2 DEBUG oslo_concurrency.lockutils [None req-54afa011-cbfb-41e4-ac3c-307d3cad818e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "9a9711a6-d998-4e3a-9ee0-f003c146de6f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:47 compute-0 nova_compute[192810]: 2025-09-30 21:24:47.856 2 DEBUG oslo_concurrency.lockutils [None req-54afa011-cbfb-41e4-ac3c-307d3cad818e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Acquiring lock "9a9711a6-d998-4e3a-9ee0-f003c146de6f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:47 compute-0 nova_compute[192810]: 2025-09-30 21:24:47.856 2 DEBUG oslo_concurrency.lockutils [None req-54afa011-cbfb-41e4-ac3c-307d3cad818e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "9a9711a6-d998-4e3a-9ee0-f003c146de6f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:47 compute-0 nova_compute[192810]: 2025-09-30 21:24:47.856 2 DEBUG oslo_concurrency.lockutils [None req-54afa011-cbfb-41e4-ac3c-307d3cad818e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "9a9711a6-d998-4e3a-9ee0-f003c146de6f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:24:47 compute-0 nova_compute[192810]: 2025-09-30 21:24:47.869 2 INFO nova.compute.manager [None req-54afa011-cbfb-41e4-ac3c-307d3cad818e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Terminating instance
Sep 30 21:24:47 compute-0 nova_compute[192810]: 2025-09-30 21:24:47.882 2 DEBUG nova.compute.manager [None req-54afa011-cbfb-41e4-ac3c-307d3cad818e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:24:47 compute-0 kernel: tap16e09506-1c (unregistering): left promiscuous mode
Sep 30 21:24:47 compute-0 NetworkManager[51733]: <info>  [1759267487.9074] device (tap16e09506-1c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:24:47 compute-0 ovn_controller[94912]: 2025-09-30T21:24:47Z|00196|binding|INFO|Releasing lport 16e09506-1c57-4d4e-929f-be8c4514eede from this chassis (sb_readonly=0)
Sep 30 21:24:47 compute-0 nova_compute[192810]: 2025-09-30 21:24:47.915 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:47 compute-0 ovn_controller[94912]: 2025-09-30T21:24:47Z|00197|binding|INFO|Setting lport 16e09506-1c57-4d4e-929f-be8c4514eede down in Southbound
Sep 30 21:24:47 compute-0 ovn_controller[94912]: 2025-09-30T21:24:47Z|00198|binding|INFO|Removing iface tap16e09506-1c ovn-installed in OVS
Sep 30 21:24:47 compute-0 nova_compute[192810]: 2025-09-30 21:24:47.920 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:47 compute-0 nova_compute[192810]: 2025-09-30 21:24:47.936 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:24:47.936 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0c:0a:78 10.100.0.5'], port_security=['fa:16:3e:0c:0a:78 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '9a9711a6-d998-4e3a-9ee0-f003c146de6f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f4180897-fb47-4ee3-b86e-380da38f2ec5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '34c754fa0f364622a4433b9ba5718857', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1c375ca4-4abf-4405-95d5-43748e715058', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f21a560c-012c-4374-b9ef-0dc124a433df, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=16e09506-1c57-4d4e-929f-be8c4514eede) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:24:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:24:47.939 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 16e09506-1c57-4d4e-929f-be8c4514eede in datapath f4180897-fb47-4ee3-b86e-380da38f2ec5 unbound from our chassis
Sep 30 21:24:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:24:47.940 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f4180897-fb47-4ee3-b86e-380da38f2ec5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:24:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:24:47.941 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[02bacd67-1ac6-4909-9d17-561fef96446c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:24:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:24:47.941 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5 namespace which is not needed anymore
Sep 30 21:24:47 compute-0 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d00000030.scope: Deactivated successfully.
Sep 30 21:24:47 compute-0 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d00000030.scope: Consumed 5.094s CPU time.
Sep 30 21:24:47 compute-0 systemd-machined[152794]: Machine qemu-24-instance-00000030 terminated.
Sep 30 21:24:48 compute-0 neutron-haproxy-ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5[227102]: [NOTICE]   (227106) : haproxy version is 2.8.14-c23fe91
Sep 30 21:24:48 compute-0 neutron-haproxy-ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5[227102]: [NOTICE]   (227106) : path to executable is /usr/sbin/haproxy
Sep 30 21:24:48 compute-0 neutron-haproxy-ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5[227102]: [WARNING]  (227106) : Exiting Master process...
Sep 30 21:24:48 compute-0 neutron-haproxy-ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5[227102]: [ALERT]    (227106) : Current worker (227108) exited with code 143 (Terminated)
Sep 30 21:24:48 compute-0 neutron-haproxy-ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5[227102]: [WARNING]  (227106) : All workers exited. Exiting... (0)
Sep 30 21:24:48 compute-0 systemd[1]: libpod-f9f7c9573bccd47a245abfec590a9e9cd5929fb5300f24ef43a897d09a265ed4.scope: Deactivated successfully.
Sep 30 21:24:48 compute-0 podman[227191]: 2025-09-30 21:24:48.079049767 +0000 UTC m=+0.046295818 container died f9f7c9573bccd47a245abfec590a9e9cd5929fb5300f24ef43a897d09a265ed4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:24:48 compute-0 nova_compute[192810]: 2025-09-30 21:24:48.105 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:48 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f9f7c9573bccd47a245abfec590a9e9cd5929fb5300f24ef43a897d09a265ed4-userdata-shm.mount: Deactivated successfully.
Sep 30 21:24:48 compute-0 nova_compute[192810]: 2025-09-30 21:24:48.111 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:48 compute-0 systemd[1]: var-lib-containers-storage-overlay-a02a86578bd830f74721e147a5dcb28faad58459c5110790eb87ffe1d7c64049-merged.mount: Deactivated successfully.
Sep 30 21:24:48 compute-0 podman[227191]: 2025-09-30 21:24:48.122516884 +0000 UTC m=+0.089762935 container cleanup f9f7c9573bccd47a245abfec590a9e9cd5929fb5300f24ef43a897d09a265ed4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:24:48 compute-0 systemd[1]: libpod-conmon-f9f7c9573bccd47a245abfec590a9e9cd5929fb5300f24ef43a897d09a265ed4.scope: Deactivated successfully.
Sep 30 21:24:48 compute-0 nova_compute[192810]: 2025-09-30 21:24:48.151 2 INFO nova.virt.libvirt.driver [-] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Instance destroyed successfully.
Sep 30 21:24:48 compute-0 nova_compute[192810]: 2025-09-30 21:24:48.151 2 DEBUG nova.objects.instance [None req-54afa011-cbfb-41e4-ac3c-307d3cad818e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lazy-loading 'resources' on Instance uuid 9a9711a6-d998-4e3a-9ee0-f003c146de6f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:24:48 compute-0 nova_compute[192810]: 2025-09-30 21:24:48.170 2 DEBUG nova.virt.libvirt.vif [None req-54afa011-cbfb-41e4-ac3c-307d3cad818e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:24:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-966346934',display_name='tempest-tempest.common.compute-instance-966346934-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-966346934-1',id=48,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:24:43Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='34c754fa0f364622a4433b9ba5718857',ramdisk_id='',reservation_id='r-zs1xvy19',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',imag
e_min_disk='1',image_min_ram='0',owner_project_name='tempest-MultipleCreateTestJSON-413721927',owner_user_name='tempest-MultipleCreateTestJSON-413721927-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:24:43Z,user_data=None,user_id='1a8900f8597741ad930d414e1db02d76',uuid=9a9711a6-d998-4e3a-9ee0-f003c146de6f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "16e09506-1c57-4d4e-929f-be8c4514eede", "address": "fa:16:3e:0c:0a:78", "network": {"id": "f4180897-fb47-4ee3-b86e-380da38f2ec5", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1027240753-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34c754fa0f364622a4433b9ba5718857", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16e09506-1c", "ovs_interfaceid": "16e09506-1c57-4d4e-929f-be8c4514eede", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:24:48 compute-0 nova_compute[192810]: 2025-09-30 21:24:48.171 2 DEBUG nova.network.os_vif_util [None req-54afa011-cbfb-41e4-ac3c-307d3cad818e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Converting VIF {"id": "16e09506-1c57-4d4e-929f-be8c4514eede", "address": "fa:16:3e:0c:0a:78", "network": {"id": "f4180897-fb47-4ee3-b86e-380da38f2ec5", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1027240753-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "34c754fa0f364622a4433b9ba5718857", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap16e09506-1c", "ovs_interfaceid": "16e09506-1c57-4d4e-929f-be8c4514eede", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:24:48 compute-0 nova_compute[192810]: 2025-09-30 21:24:48.172 2 DEBUG nova.network.os_vif_util [None req-54afa011-cbfb-41e4-ac3c-307d3cad818e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0c:0a:78,bridge_name='br-int',has_traffic_filtering=True,id=16e09506-1c57-4d4e-929f-be8c4514eede,network=Network(f4180897-fb47-4ee3-b86e-380da38f2ec5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16e09506-1c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:24:48 compute-0 nova_compute[192810]: 2025-09-30 21:24:48.172 2 DEBUG os_vif [None req-54afa011-cbfb-41e4-ac3c-307d3cad818e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0c:0a:78,bridge_name='br-int',has_traffic_filtering=True,id=16e09506-1c57-4d4e-929f-be8c4514eede,network=Network(f4180897-fb47-4ee3-b86e-380da38f2ec5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16e09506-1c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:24:48 compute-0 nova_compute[192810]: 2025-09-30 21:24:48.175 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:48 compute-0 nova_compute[192810]: 2025-09-30 21:24:48.175 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap16e09506-1c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:24:48 compute-0 nova_compute[192810]: 2025-09-30 21:24:48.177 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:48 compute-0 nova_compute[192810]: 2025-09-30 21:24:48.179 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:24:48 compute-0 nova_compute[192810]: 2025-09-30 21:24:48.181 2 INFO os_vif [None req-54afa011-cbfb-41e4-ac3c-307d3cad818e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0c:0a:78,bridge_name='br-int',has_traffic_filtering=True,id=16e09506-1c57-4d4e-929f-be8c4514eede,network=Network(f4180897-fb47-4ee3-b86e-380da38f2ec5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap16e09506-1c')
Sep 30 21:24:48 compute-0 nova_compute[192810]: 2025-09-30 21:24:48.182 2 INFO nova.virt.libvirt.driver [None req-54afa011-cbfb-41e4-ac3c-307d3cad818e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Deleting instance files /var/lib/nova/instances/9a9711a6-d998-4e3a-9ee0-f003c146de6f_del
Sep 30 21:24:48 compute-0 nova_compute[192810]: 2025-09-30 21:24:48.183 2 INFO nova.virt.libvirt.driver [None req-54afa011-cbfb-41e4-ac3c-307d3cad818e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Deletion of /var/lib/nova/instances/9a9711a6-d998-4e3a-9ee0-f003c146de6f_del complete
Sep 30 21:24:48 compute-0 podman[227233]: 2025-09-30 21:24:48.188802197 +0000 UTC m=+0.045156650 container remove f9f7c9573bccd47a245abfec590a9e9cd5929fb5300f24ef43a897d09a265ed4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Sep 30 21:24:48 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:24:48.195 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[480ba475-1f9c-4aae-a9eb-80da5ec4bc69]: (4, ('Tue Sep 30 09:24:48 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5 (f9f7c9573bccd47a245abfec590a9e9cd5929fb5300f24ef43a897d09a265ed4)\nf9f7c9573bccd47a245abfec590a9e9cd5929fb5300f24ef43a897d09a265ed4\nTue Sep 30 09:24:48 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5 (f9f7c9573bccd47a245abfec590a9e9cd5929fb5300f24ef43a897d09a265ed4)\nf9f7c9573bccd47a245abfec590a9e9cd5929fb5300f24ef43a897d09a265ed4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:24:48 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:24:48.196 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[df4c1474-73b5-4768-89a9-e07769000e29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:24:48 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:24:48.197 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf4180897-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:24:48 compute-0 nova_compute[192810]: 2025-09-30 21:24:48.199 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:48 compute-0 kernel: tapf4180897-f0: left promiscuous mode
Sep 30 21:24:48 compute-0 nova_compute[192810]: 2025-09-30 21:24:48.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:48 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:24:48.204 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[89058126-bc90-4051-8cbf-3b37ec87168d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:24:48 compute-0 nova_compute[192810]: 2025-09-30 21:24:48.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:48 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:24:48.233 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[6658290c-a7d3-4891-8e8f-a1a6379835b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:24:48 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:24:48.234 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[4e0cc6b3-5f42-4c63-bacc-a4b0cd1c2ae6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:24:48 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:24:48.251 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[eb0aba0a-b907-468e-9575-b083711cb7c1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 417834, 'reachable_time': 27481, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227250, 'error': None, 'target': 'ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:24:48 compute-0 systemd[1]: run-netns-ovnmeta\x2df4180897\x2dfb47\x2d4ee3\x2db86e\x2d380da38f2ec5.mount: Deactivated successfully.
Sep 30 21:24:48 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:24:48.256 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f4180897-fb47-4ee3-b86e-380da38f2ec5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:24:48 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:24:48.256 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[ebf4e901-21d9-4772-9f61-474ec41c7b6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:24:48 compute-0 nova_compute[192810]: 2025-09-30 21:24:48.292 2 INFO nova.compute.manager [None req-54afa011-cbfb-41e4-ac3c-307d3cad818e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Took 0.41 seconds to destroy the instance on the hypervisor.
Sep 30 21:24:48 compute-0 nova_compute[192810]: 2025-09-30 21:24:48.293 2 DEBUG oslo.service.loopingcall [None req-54afa011-cbfb-41e4-ac3c-307d3cad818e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:24:48 compute-0 nova_compute[192810]: 2025-09-30 21:24:48.293 2 DEBUG nova.compute.manager [-] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:24:48 compute-0 nova_compute[192810]: 2025-09-30 21:24:48.293 2 DEBUG nova.network.neutron [-] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:24:48 compute-0 nova_compute[192810]: 2025-09-30 21:24:48.499 2 DEBUG nova.compute.manager [req-f65b8870-2fa6-45f9-9a9c-26606a5eed58 req-6ce8ce76-f257-40fd-a36e-09345456f7da dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Received event network-vif-unplugged-16e09506-1c57-4d4e-929f-be8c4514eede external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:24:48 compute-0 nova_compute[192810]: 2025-09-30 21:24:48.500 2 DEBUG oslo_concurrency.lockutils [req-f65b8870-2fa6-45f9-9a9c-26606a5eed58 req-6ce8ce76-f257-40fd-a36e-09345456f7da dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "9a9711a6-d998-4e3a-9ee0-f003c146de6f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:48 compute-0 nova_compute[192810]: 2025-09-30 21:24:48.501 2 DEBUG oslo_concurrency.lockutils [req-f65b8870-2fa6-45f9-9a9c-26606a5eed58 req-6ce8ce76-f257-40fd-a36e-09345456f7da dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "9a9711a6-d998-4e3a-9ee0-f003c146de6f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:48 compute-0 nova_compute[192810]: 2025-09-30 21:24:48.501 2 DEBUG oslo_concurrency.lockutils [req-f65b8870-2fa6-45f9-9a9c-26606a5eed58 req-6ce8ce76-f257-40fd-a36e-09345456f7da dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "9a9711a6-d998-4e3a-9ee0-f003c146de6f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:24:48 compute-0 nova_compute[192810]: 2025-09-30 21:24:48.502 2 DEBUG nova.compute.manager [req-f65b8870-2fa6-45f9-9a9c-26606a5eed58 req-6ce8ce76-f257-40fd-a36e-09345456f7da dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] No waiting events found dispatching network-vif-unplugged-16e09506-1c57-4d4e-929f-be8c4514eede pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:24:48 compute-0 nova_compute[192810]: 2025-09-30 21:24:48.503 2 DEBUG nova.compute.manager [req-f65b8870-2fa6-45f9-9a9c-26606a5eed58 req-6ce8ce76-f257-40fd-a36e-09345456f7da dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Received event network-vif-unplugged-16e09506-1c57-4d4e-929f-be8c4514eede for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:24:49 compute-0 nova_compute[192810]: 2025-09-30 21:24:49.769 2 DEBUG nova.network.neutron [-] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:24:49 compute-0 nova_compute[192810]: 2025-09-30 21:24:49.791 2 INFO nova.compute.manager [-] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Took 1.50 seconds to deallocate network for instance.
Sep 30 21:24:49 compute-0 nova_compute[192810]: 2025-09-30 21:24:49.882 2 DEBUG oslo_concurrency.lockutils [None req-54afa011-cbfb-41e4-ac3c-307d3cad818e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:49 compute-0 nova_compute[192810]: 2025-09-30 21:24:49.883 2 DEBUG oslo_concurrency.lockutils [None req-54afa011-cbfb-41e4-ac3c-307d3cad818e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:49 compute-0 nova_compute[192810]: 2025-09-30 21:24:49.979 2 DEBUG nova.compute.provider_tree [None req-54afa011-cbfb-41e4-ac3c-307d3cad818e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:24:49 compute-0 nova_compute[192810]: 2025-09-30 21:24:49.987 2 DEBUG oslo_concurrency.lockutils [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Acquiring lock "2aa71932-03fb-4f75-b359-e1ff961fd8f6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:49 compute-0 nova_compute[192810]: 2025-09-30 21:24:49.988 2 DEBUG oslo_concurrency.lockutils [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Lock "2aa71932-03fb-4f75-b359-e1ff961fd8f6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:50 compute-0 nova_compute[192810]: 2025-09-30 21:24:49.999 2 DEBUG nova.scheduler.client.report [None req-54afa011-cbfb-41e4-ac3c-307d3cad818e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:24:50 compute-0 nova_compute[192810]: 2025-09-30 21:24:50.004 2 DEBUG nova.compute.manager [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:24:50 compute-0 nova_compute[192810]: 2025-09-30 21:24:50.029 2 DEBUG oslo_concurrency.lockutils [None req-54afa011-cbfb-41e4-ac3c-307d3cad818e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.146s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:24:50 compute-0 nova_compute[192810]: 2025-09-30 21:24:50.057 2 INFO nova.scheduler.client.report [None req-54afa011-cbfb-41e4-ac3c-307d3cad818e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Deleted allocations for instance 9a9711a6-d998-4e3a-9ee0-f003c146de6f
Sep 30 21:24:50 compute-0 nova_compute[192810]: 2025-09-30 21:24:50.111 2 DEBUG oslo_concurrency.lockutils [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:50 compute-0 nova_compute[192810]: 2025-09-30 21:24:50.112 2 DEBUG oslo_concurrency.lockutils [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:50 compute-0 nova_compute[192810]: 2025-09-30 21:24:50.128 2 DEBUG nova.virt.hardware [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:24:50 compute-0 nova_compute[192810]: 2025-09-30 21:24:50.129 2 INFO nova.compute.claims [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:24:50 compute-0 nova_compute[192810]: 2025-09-30 21:24:50.200 2 DEBUG oslo_concurrency.lockutils [None req-54afa011-cbfb-41e4-ac3c-307d3cad818e 1a8900f8597741ad930d414e1db02d76 34c754fa0f364622a4433b9ba5718857 - - default default] Lock "9a9711a6-d998-4e3a-9ee0-f003c146de6f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.344s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:24:50 compute-0 nova_compute[192810]: 2025-09-30 21:24:50.397 2 DEBUG nova.compute.provider_tree [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:24:50 compute-0 nova_compute[192810]: 2025-09-30 21:24:50.419 2 DEBUG nova.scheduler.client.report [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:24:50 compute-0 nova_compute[192810]: 2025-09-30 21:24:50.446 2 DEBUG oslo_concurrency.lockutils [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.334s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:24:50 compute-0 nova_compute[192810]: 2025-09-30 21:24:50.447 2 DEBUG nova.compute.manager [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:24:50 compute-0 nova_compute[192810]: 2025-09-30 21:24:50.542 2 DEBUG nova.compute.manager [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948
Sep 30 21:24:50 compute-0 nova_compute[192810]: 2025-09-30 21:24:50.591 2 INFO nova.virt.libvirt.driver [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:24:50 compute-0 nova_compute[192810]: 2025-09-30 21:24:50.607 2 DEBUG nova.compute.manager [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:24:50 compute-0 nova_compute[192810]: 2025-09-30 21:24:50.694 2 DEBUG nova.compute.manager [req-7007ebf0-28e6-4273-a17b-8882dcaff698 req-2835e608-c820-4418-9534-f844b29fdabb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Received event network-vif-plugged-16e09506-1c57-4d4e-929f-be8c4514eede external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:24:50 compute-0 nova_compute[192810]: 2025-09-30 21:24:50.694 2 DEBUG oslo_concurrency.lockutils [req-7007ebf0-28e6-4273-a17b-8882dcaff698 req-2835e608-c820-4418-9534-f844b29fdabb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "9a9711a6-d998-4e3a-9ee0-f003c146de6f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:50 compute-0 nova_compute[192810]: 2025-09-30 21:24:50.695 2 DEBUG oslo_concurrency.lockutils [req-7007ebf0-28e6-4273-a17b-8882dcaff698 req-2835e608-c820-4418-9534-f844b29fdabb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "9a9711a6-d998-4e3a-9ee0-f003c146de6f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:50 compute-0 nova_compute[192810]: 2025-09-30 21:24:50.695 2 DEBUG oslo_concurrency.lockutils [req-7007ebf0-28e6-4273-a17b-8882dcaff698 req-2835e608-c820-4418-9534-f844b29fdabb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "9a9711a6-d998-4e3a-9ee0-f003c146de6f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:24:50 compute-0 nova_compute[192810]: 2025-09-30 21:24:50.695 2 DEBUG nova.compute.manager [req-7007ebf0-28e6-4273-a17b-8882dcaff698 req-2835e608-c820-4418-9534-f844b29fdabb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] No waiting events found dispatching network-vif-plugged-16e09506-1c57-4d4e-929f-be8c4514eede pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:24:50 compute-0 nova_compute[192810]: 2025-09-30 21:24:50.696 2 WARNING nova.compute.manager [req-7007ebf0-28e6-4273-a17b-8882dcaff698 req-2835e608-c820-4418-9534-f844b29fdabb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Received unexpected event network-vif-plugged-16e09506-1c57-4d4e-929f-be8c4514eede for instance with vm_state deleted and task_state None.
Sep 30 21:24:50 compute-0 nova_compute[192810]: 2025-09-30 21:24:50.792 2 DEBUG nova.compute.manager [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:24:50 compute-0 nova_compute[192810]: 2025-09-30 21:24:50.794 2 DEBUG nova.virt.libvirt.driver [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:24:50 compute-0 nova_compute[192810]: 2025-09-30 21:24:50.794 2 INFO nova.virt.libvirt.driver [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Creating image(s)
Sep 30 21:24:50 compute-0 nova_compute[192810]: 2025-09-30 21:24:50.795 2 DEBUG oslo_concurrency.lockutils [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Acquiring lock "/var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:50 compute-0 nova_compute[192810]: 2025-09-30 21:24:50.795 2 DEBUG oslo_concurrency.lockutils [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Lock "/var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:50 compute-0 nova_compute[192810]: 2025-09-30 21:24:50.796 2 DEBUG oslo_concurrency.lockutils [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Lock "/var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:24:50 compute-0 nova_compute[192810]: 2025-09-30 21:24:50.812 2 DEBUG oslo_concurrency.processutils [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:24:50 compute-0 nova_compute[192810]: 2025-09-30 21:24:50.871 2 DEBUG oslo_concurrency.processutils [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:24:50 compute-0 nova_compute[192810]: 2025-09-30 21:24:50.873 2 DEBUG oslo_concurrency.lockutils [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:50 compute-0 nova_compute[192810]: 2025-09-30 21:24:50.875 2 DEBUG oslo_concurrency.lockutils [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:50 compute-0 nova_compute[192810]: 2025-09-30 21:24:50.903 2 DEBUG oslo_concurrency.processutils [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:24:50 compute-0 nova_compute[192810]: 2025-09-30 21:24:50.977 2 DEBUG oslo_concurrency.processutils [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:24:50 compute-0 nova_compute[192810]: 2025-09-30 21:24:50.980 2 DEBUG oslo_concurrency.processutils [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:24:51 compute-0 nova_compute[192810]: 2025-09-30 21:24:51.024 2 DEBUG oslo_concurrency.processutils [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:24:51 compute-0 nova_compute[192810]: 2025-09-30 21:24:51.026 2 DEBUG oslo_concurrency.lockutils [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:24:51 compute-0 nova_compute[192810]: 2025-09-30 21:24:51.027 2 DEBUG oslo_concurrency.processutils [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:24:51 compute-0 nova_compute[192810]: 2025-09-30 21:24:51.122 2 DEBUG oslo_concurrency.processutils [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:24:51 compute-0 nova_compute[192810]: 2025-09-30 21:24:51.123 2 DEBUG nova.virt.disk.api [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Checking if we can resize image /var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:24:51 compute-0 nova_compute[192810]: 2025-09-30 21:24:51.124 2 DEBUG oslo_concurrency.processutils [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:24:51 compute-0 nova_compute[192810]: 2025-09-30 21:24:51.186 2 DEBUG oslo_concurrency.processutils [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:24:51 compute-0 nova_compute[192810]: 2025-09-30 21:24:51.188 2 DEBUG nova.virt.disk.api [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Cannot resize image /var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:24:51 compute-0 nova_compute[192810]: 2025-09-30 21:24:51.188 2 DEBUG nova.objects.instance [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Lazy-loading 'migration_context' on Instance uuid 2aa71932-03fb-4f75-b359-e1ff961fd8f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:24:51 compute-0 nova_compute[192810]: 2025-09-30 21:24:51.211 2 DEBUG nova.virt.libvirt.driver [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:24:51 compute-0 nova_compute[192810]: 2025-09-30 21:24:51.212 2 DEBUG nova.virt.libvirt.driver [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Ensure instance console log exists: /var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:24:51 compute-0 nova_compute[192810]: 2025-09-30 21:24:51.212 2 DEBUG oslo_concurrency.lockutils [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:51 compute-0 nova_compute[192810]: 2025-09-30 21:24:51.212 2 DEBUG oslo_concurrency.lockutils [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:51 compute-0 nova_compute[192810]: 2025-09-30 21:24:51.213 2 DEBUG oslo_concurrency.lockutils [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:24:51 compute-0 nova_compute[192810]: 2025-09-30 21:24:51.214 2 DEBUG nova.virt.libvirt.driver [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:24:51 compute-0 nova_compute[192810]: 2025-09-30 21:24:51.219 2 WARNING nova.virt.libvirt.driver [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:24:51 compute-0 nova_compute[192810]: 2025-09-30 21:24:51.224 2 DEBUG nova.virt.libvirt.host [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:24:51 compute-0 nova_compute[192810]: 2025-09-30 21:24:51.224 2 DEBUG nova.virt.libvirt.host [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:24:51 compute-0 nova_compute[192810]: 2025-09-30 21:24:51.232 2 DEBUG nova.virt.libvirt.host [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:24:51 compute-0 nova_compute[192810]: 2025-09-30 21:24:51.233 2 DEBUG nova.virt.libvirt.host [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:24:51 compute-0 nova_compute[192810]: 2025-09-30 21:24:51.234 2 DEBUG nova.virt.libvirt.driver [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:24:51 compute-0 nova_compute[192810]: 2025-09-30 21:24:51.234 2 DEBUG nova.virt.hardware [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:24:51 compute-0 nova_compute[192810]: 2025-09-30 21:24:51.234 2 DEBUG nova.virt.hardware [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:24:51 compute-0 nova_compute[192810]: 2025-09-30 21:24:51.234 2 DEBUG nova.virt.hardware [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:24:51 compute-0 nova_compute[192810]: 2025-09-30 21:24:51.235 2 DEBUG nova.virt.hardware [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:24:51 compute-0 nova_compute[192810]: 2025-09-30 21:24:51.235 2 DEBUG nova.virt.hardware [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:24:51 compute-0 nova_compute[192810]: 2025-09-30 21:24:51.235 2 DEBUG nova.virt.hardware [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:24:51 compute-0 nova_compute[192810]: 2025-09-30 21:24:51.235 2 DEBUG nova.virt.hardware [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:24:51 compute-0 nova_compute[192810]: 2025-09-30 21:24:51.235 2 DEBUG nova.virt.hardware [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:24:51 compute-0 nova_compute[192810]: 2025-09-30 21:24:51.236 2 DEBUG nova.virt.hardware [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:24:51 compute-0 nova_compute[192810]: 2025-09-30 21:24:51.236 2 DEBUG nova.virt.hardware [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:24:51 compute-0 nova_compute[192810]: 2025-09-30 21:24:51.237 2 DEBUG nova.virt.hardware [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:24:51 compute-0 nova_compute[192810]: 2025-09-30 21:24:51.240 2 DEBUG nova.objects.instance [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Lazy-loading 'pci_devices' on Instance uuid 2aa71932-03fb-4f75-b359-e1ff961fd8f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:24:51 compute-0 nova_compute[192810]: 2025-09-30 21:24:51.260 2 DEBUG nova.virt.libvirt.driver [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:24:51 compute-0 nova_compute[192810]:   <uuid>2aa71932-03fb-4f75-b359-e1ff961fd8f6</uuid>
Sep 30 21:24:51 compute-0 nova_compute[192810]:   <name>instance-00000033</name>
Sep 30 21:24:51 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:24:51 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:24:51 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:24:51 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:24:51 compute-0 nova_compute[192810]:       <nova:name>tempest-UnshelveToHostMultiNodesTest-server-1659572675</nova:name>
Sep 30 21:24:51 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:24:51</nova:creationTime>
Sep 30 21:24:51 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:24:51 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:24:51 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:24:51 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:24:51 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:24:51 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:24:51 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:24:51 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:24:51 compute-0 nova_compute[192810]:         <nova:user uuid="bd6d636573fd4d899b5e6f4a1f7e1790">tempest-UnshelveToHostMultiNodesTest-98679670-project-member</nova:user>
Sep 30 21:24:51 compute-0 nova_compute[192810]:         <nova:project uuid="1124d87211d04502bce5e44c25ed5d3c">tempest-UnshelveToHostMultiNodesTest-98679670</nova:project>
Sep 30 21:24:51 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:24:51 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:24:51 compute-0 nova_compute[192810]:       <nova:ports/>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:24:51 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:24:51 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:24:51 compute-0 nova_compute[192810]:     <system>
Sep 30 21:24:51 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:24:51 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:24:51 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:24:51 compute-0 nova_compute[192810]:       <entry name="serial">2aa71932-03fb-4f75-b359-e1ff961fd8f6</entry>
Sep 30 21:24:51 compute-0 nova_compute[192810]:       <entry name="uuid">2aa71932-03fb-4f75-b359-e1ff961fd8f6</entry>
Sep 30 21:24:51 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     </system>
Sep 30 21:24:51 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:24:51 compute-0 nova_compute[192810]:   <os>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:24:51 compute-0 nova_compute[192810]:   </os>
Sep 30 21:24:51 compute-0 nova_compute[192810]:   <features>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:24:51 compute-0 nova_compute[192810]:   </features>
Sep 30 21:24:51 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:24:51 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:24:51 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:24:51 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:24:51 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:24:51 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:24:51 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:24:51 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:24:51 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/disk"/>
Sep 30 21:24:51 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:24:51 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:24:51 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/disk.config"/>
Sep 30 21:24:51 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:24:51 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/console.log" append="off"/>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     <video>
Sep 30 21:24:51 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     </video>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:24:51 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:24:51 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:24:51 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:24:51 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:24:51 compute-0 nova_compute[192810]: </domain>
Sep 30 21:24:51 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:24:51 compute-0 nova_compute[192810]: 2025-09-30 21:24:51.316 2 DEBUG nova.virt.libvirt.driver [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:24:51 compute-0 nova_compute[192810]: 2025-09-30 21:24:51.317 2 DEBUG nova.virt.libvirt.driver [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:24:51 compute-0 nova_compute[192810]: 2025-09-30 21:24:51.317 2 INFO nova.virt.libvirt.driver [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Using config drive
Sep 30 21:24:51 compute-0 nova_compute[192810]: 2025-09-30 21:24:51.560 2 INFO nova.virt.libvirt.driver [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Creating config drive at /var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/disk.config
Sep 30 21:24:51 compute-0 nova_compute[192810]: 2025-09-30 21:24:51.565 2 DEBUG oslo_concurrency.processutils [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpb1jop90j execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:24:51 compute-0 nova_compute[192810]: 2025-09-30 21:24:51.705 2 DEBUG oslo_concurrency.processutils [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpb1jop90j" returned: 0 in 0.140s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:24:51 compute-0 nova_compute[192810]: 2025-09-30 21:24:51.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:51 compute-0 nova_compute[192810]: 2025-09-30 21:24:51.771 2 DEBUG nova.compute.manager [req-a7f8e5d6-eaf0-4019-94bc-1ff84690cf41 req-c9ed3b75-a49e-4db6-9473-f6b1a8123feb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Received event network-vif-deleted-16e09506-1c57-4d4e-929f-be8c4514eede external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:24:51 compute-0 systemd-machined[152794]: New machine qemu-25-instance-00000033.
Sep 30 21:24:51 compute-0 systemd[1]: Started Virtual Machine qemu-25-instance-00000033.
Sep 30 21:24:52 compute-0 nova_compute[192810]: 2025-09-30 21:24:52.521 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267492.520103, 2aa71932-03fb-4f75-b359-e1ff961fd8f6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:24:52 compute-0 nova_compute[192810]: 2025-09-30 21:24:52.523 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] VM Resumed (Lifecycle Event)
Sep 30 21:24:52 compute-0 nova_compute[192810]: 2025-09-30 21:24:52.525 2 DEBUG nova.compute.manager [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:24:52 compute-0 nova_compute[192810]: 2025-09-30 21:24:52.526 2 DEBUG nova.virt.libvirt.driver [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:24:52 compute-0 nova_compute[192810]: 2025-09-30 21:24:52.530 2 INFO nova.virt.libvirt.driver [-] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Instance spawned successfully.
Sep 30 21:24:52 compute-0 nova_compute[192810]: 2025-09-30 21:24:52.531 2 DEBUG nova.virt.libvirt.driver [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:24:52 compute-0 nova_compute[192810]: 2025-09-30 21:24:52.553 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:24:52 compute-0 nova_compute[192810]: 2025-09-30 21:24:52.558 2 DEBUG nova.virt.libvirt.driver [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:24:52 compute-0 nova_compute[192810]: 2025-09-30 21:24:52.559 2 DEBUG nova.virt.libvirt.driver [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:24:52 compute-0 nova_compute[192810]: 2025-09-30 21:24:52.559 2 DEBUG nova.virt.libvirt.driver [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:24:52 compute-0 nova_compute[192810]: 2025-09-30 21:24:52.560 2 DEBUG nova.virt.libvirt.driver [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:24:52 compute-0 nova_compute[192810]: 2025-09-30 21:24:52.560 2 DEBUG nova.virt.libvirt.driver [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:24:52 compute-0 nova_compute[192810]: 2025-09-30 21:24:52.561 2 DEBUG nova.virt.libvirt.driver [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:24:52 compute-0 nova_compute[192810]: 2025-09-30 21:24:52.565 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:24:52 compute-0 nova_compute[192810]: 2025-09-30 21:24:52.610 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:24:52 compute-0 nova_compute[192810]: 2025-09-30 21:24:52.611 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267492.5212264, 2aa71932-03fb-4f75-b359-e1ff961fd8f6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:24:52 compute-0 nova_compute[192810]: 2025-09-30 21:24:52.611 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] VM Started (Lifecycle Event)
Sep 30 21:24:52 compute-0 nova_compute[192810]: 2025-09-30 21:24:52.646 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:24:52 compute-0 nova_compute[192810]: 2025-09-30 21:24:52.650 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:24:52 compute-0 nova_compute[192810]: 2025-09-30 21:24:52.675 2 INFO nova.compute.manager [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Took 1.88 seconds to spawn the instance on the hypervisor.
Sep 30 21:24:52 compute-0 nova_compute[192810]: 2025-09-30 21:24:52.676 2 DEBUG nova.compute.manager [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:24:52 compute-0 nova_compute[192810]: 2025-09-30 21:24:52.678 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:24:52 compute-0 nova_compute[192810]: 2025-09-30 21:24:52.766 2 INFO nova.compute.manager [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Took 2.69 seconds to build instance.
Sep 30 21:24:52 compute-0 nova_compute[192810]: 2025-09-30 21:24:52.784 2 DEBUG oslo_concurrency.lockutils [None req-cc2a14d3-52c5-417a-9ae0-41e0f5ba6d82 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Lock "2aa71932-03fb-4f75-b359-e1ff961fd8f6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.796s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:24:53 compute-0 nova_compute[192810]: 2025-09-30 21:24:53.179 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:54 compute-0 podman[227294]: 2025-09-30 21:24:54.374628844 +0000 UTC m=+0.075372959 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Sep 30 21:24:55 compute-0 nova_compute[192810]: 2025-09-30 21:24:55.878 2 DEBUG oslo_concurrency.lockutils [None req-b2e719dc-21bf-47b2-8e8f-4cb57b9c92a1 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Acquiring lock "2aa71932-03fb-4f75-b359-e1ff961fd8f6" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:24:55 compute-0 nova_compute[192810]: 2025-09-30 21:24:55.879 2 DEBUG oslo_concurrency.lockutils [None req-b2e719dc-21bf-47b2-8e8f-4cb57b9c92a1 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Lock "2aa71932-03fb-4f75-b359-e1ff961fd8f6" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:24:55 compute-0 nova_compute[192810]: 2025-09-30 21:24:55.879 2 INFO nova.compute.manager [None req-b2e719dc-21bf-47b2-8e8f-4cb57b9c92a1 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Shelving
Sep 30 21:24:55 compute-0 nova_compute[192810]: 2025-09-30 21:24:55.945 2 DEBUG nova.virt.libvirt.driver [None req-b2e719dc-21bf-47b2-8e8f-4cb57b9c92a1 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Sep 30 21:24:56 compute-0 podman[227315]: 2025-09-30 21:24:56.328742203 +0000 UTC m=+0.059586917 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 21:24:56 compute-0 podman[227314]: 2025-09-30 21:24:56.3435448 +0000 UTC m=+0.071212515 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2)
Sep 30 21:24:56 compute-0 nova_compute[192810]: 2025-09-30 21:24:56.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:24:58 compute-0 nova_compute[192810]: 2025-09-30 21:24:58.182 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:01 compute-0 nova_compute[192810]: 2025-09-30 21:25:01.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:03 compute-0 nova_compute[192810]: 2025-09-30 21:25:03.150 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267488.149605, 9a9711a6-d998-4e3a-9ee0-f003c146de6f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:25:03 compute-0 nova_compute[192810]: 2025-09-30 21:25:03.151 2 INFO nova.compute.manager [-] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] VM Stopped (Lifecycle Event)
Sep 30 21:25:03 compute-0 nova_compute[192810]: 2025-09-30 21:25:03.183 2 DEBUG nova.compute.manager [None req-f7fcbbc1-a981-43a8-bd8b-ad6a556d5e85 - - - - - -] [instance: 9a9711a6-d998-4e3a-9ee0-f003c146de6f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:25:03 compute-0 nova_compute[192810]: 2025-09-30 21:25:03.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:05 compute-0 podman[227367]: 2025-09-30 21:25:05.325474521 +0000 UTC m=+0.062307305 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:25:05 compute-0 podman[227366]: 2025-09-30 21:25:05.345839695 +0000 UTC m=+0.087100409 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:25:05 compute-0 nova_compute[192810]: 2025-09-30 21:25:05.993 2 DEBUG nova.virt.libvirt.driver [None req-b2e719dc-21bf-47b2-8e8f-4cb57b9c92a1 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Sep 30 21:25:06 compute-0 nova_compute[192810]: 2025-09-30 21:25:06.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:08 compute-0 nova_compute[192810]: 2025-09-30 21:25:08.187 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:08 compute-0 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000033.scope: Deactivated successfully.
Sep 30 21:25:08 compute-0 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000033.scope: Consumed 12.319s CPU time.
Sep 30 21:25:08 compute-0 systemd-machined[152794]: Machine qemu-25-instance-00000033 terminated.
Sep 30 21:25:09 compute-0 nova_compute[192810]: 2025-09-30 21:25:09.006 2 INFO nova.virt.libvirt.driver [None req-b2e719dc-21bf-47b2-8e8f-4cb57b9c92a1 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Instance shutdown successfully after 13 seconds.
Sep 30 21:25:09 compute-0 nova_compute[192810]: 2025-09-30 21:25:09.014 2 INFO nova.virt.libvirt.driver [-] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Instance destroyed successfully.
Sep 30 21:25:09 compute-0 nova_compute[192810]: 2025-09-30 21:25:09.014 2 DEBUG nova.objects.instance [None req-b2e719dc-21bf-47b2-8e8f-4cb57b9c92a1 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Lazy-loading 'numa_topology' on Instance uuid 2aa71932-03fb-4f75-b359-e1ff961fd8f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:25:09 compute-0 nova_compute[192810]: 2025-09-30 21:25:09.748 2 INFO nova.virt.libvirt.driver [None req-b2e719dc-21bf-47b2-8e8f-4cb57b9c92a1 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Beginning cold snapshot process
Sep 30 21:25:10 compute-0 nova_compute[192810]: 2025-09-30 21:25:10.091 2 DEBUG nova.privsep.utils [None req-b2e719dc-21bf-47b2-8e8f-4cb57b9c92a1 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Sep 30 21:25:10 compute-0 nova_compute[192810]: 2025-09-30 21:25:10.092 2 DEBUG oslo_concurrency.processutils [None req-b2e719dc-21bf-47b2-8e8f-4cb57b9c92a1 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/disk /var/lib/nova/instances/snapshots/tmphydgs6c0/3d2f57a0cb8647028f8fc08f6d7198a0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:25:10 compute-0 nova_compute[192810]: 2025-09-30 21:25:10.138 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:25:10.139 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:25:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:25:10.139 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:25:10 compute-0 nova_compute[192810]: 2025-09-30 21:25:10.587 2 DEBUG oslo_concurrency.processutils [None req-b2e719dc-21bf-47b2-8e8f-4cb57b9c92a1 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/disk /var/lib/nova/instances/snapshots/tmphydgs6c0/3d2f57a0cb8647028f8fc08f6d7198a0" returned: 0 in 0.495s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:25:10 compute-0 nova_compute[192810]: 2025-09-30 21:25:10.589 2 INFO nova.virt.libvirt.driver [None req-b2e719dc-21bf-47b2-8e8f-4cb57b9c92a1 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Snapshot extracted, beginning image upload
Sep 30 21:25:11 compute-0 podman[227433]: 2025-09-30 21:25:11.367393502 +0000 UTC m=+0.091441947 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Sep 30 21:25:11 compute-0 nova_compute[192810]: 2025-09-30 21:25:11.791 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:25:13.142 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:25:13 compute-0 nova_compute[192810]: 2025-09-30 21:25:13.236 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:14 compute-0 nova_compute[192810]: 2025-09-30 21:25:14.306 2 INFO nova.virt.libvirt.driver [None req-b2e719dc-21bf-47b2-8e8f-4cb57b9c92a1 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Snapshot image upload complete
Sep 30 21:25:14 compute-0 nova_compute[192810]: 2025-09-30 21:25:14.307 2 DEBUG nova.compute.manager [None req-b2e719dc-21bf-47b2-8e8f-4cb57b9c92a1 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:25:14 compute-0 nova_compute[192810]: 2025-09-30 21:25:14.439 2 INFO nova.compute.manager [None req-b2e719dc-21bf-47b2-8e8f-4cb57b9c92a1 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Shelve offloading
Sep 30 21:25:14 compute-0 nova_compute[192810]: 2025-09-30 21:25:14.464 2 INFO nova.virt.libvirt.driver [-] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Instance destroyed successfully.
Sep 30 21:25:14 compute-0 nova_compute[192810]: 2025-09-30 21:25:14.465 2 DEBUG nova.compute.manager [None req-b2e719dc-21bf-47b2-8e8f-4cb57b9c92a1 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:25:14 compute-0 nova_compute[192810]: 2025-09-30 21:25:14.469 2 DEBUG oslo_concurrency.lockutils [None req-b2e719dc-21bf-47b2-8e8f-4cb57b9c92a1 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Acquiring lock "refresh_cache-2aa71932-03fb-4f75-b359-e1ff961fd8f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:25:14 compute-0 nova_compute[192810]: 2025-09-30 21:25:14.469 2 DEBUG oslo_concurrency.lockutils [None req-b2e719dc-21bf-47b2-8e8f-4cb57b9c92a1 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Acquired lock "refresh_cache-2aa71932-03fb-4f75-b359-e1ff961fd8f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:25:14 compute-0 nova_compute[192810]: 2025-09-30 21:25:14.469 2 DEBUG nova.network.neutron [None req-b2e719dc-21bf-47b2-8e8f-4cb57b9c92a1 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:25:14 compute-0 nova_compute[192810]: 2025-09-30 21:25:14.789 2 DEBUG nova.network.neutron [None req-b2e719dc-21bf-47b2-8e8f-4cb57b9c92a1 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:25:14 compute-0 nova_compute[192810]: 2025-09-30 21:25:14.975 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:15 compute-0 nova_compute[192810]: 2025-09-30 21:25:15.268 2 DEBUG nova.network.neutron [None req-b2e719dc-21bf-47b2-8e8f-4cb57b9c92a1 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:25:15 compute-0 nova_compute[192810]: 2025-09-30 21:25:15.292 2 DEBUG oslo_concurrency.lockutils [None req-b2e719dc-21bf-47b2-8e8f-4cb57b9c92a1 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Releasing lock "refresh_cache-2aa71932-03fb-4f75-b359-e1ff961fd8f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:25:15 compute-0 nova_compute[192810]: 2025-09-30 21:25:15.298 2 INFO nova.virt.libvirt.driver [-] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Instance destroyed successfully.
Sep 30 21:25:15 compute-0 nova_compute[192810]: 2025-09-30 21:25:15.299 2 DEBUG nova.objects.instance [None req-b2e719dc-21bf-47b2-8e8f-4cb57b9c92a1 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Lazy-loading 'resources' on Instance uuid 2aa71932-03fb-4f75-b359-e1ff961fd8f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:25:15 compute-0 nova_compute[192810]: 2025-09-30 21:25:15.315 2 INFO nova.virt.libvirt.driver [None req-b2e719dc-21bf-47b2-8e8f-4cb57b9c92a1 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Deleting instance files /var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6_del
Sep 30 21:25:15 compute-0 nova_compute[192810]: 2025-09-30 21:25:15.320 2 INFO nova.virt.libvirt.driver [None req-b2e719dc-21bf-47b2-8e8f-4cb57b9c92a1 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Deletion of /var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6_del complete
Sep 30 21:25:15 compute-0 sshd-session[227453]: Invalid user deploy from 45.81.23.80 port 57800
Sep 30 21:25:15 compute-0 sshd-session[227453]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:25:15 compute-0 sshd-session[227453]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=45.81.23.80
Sep 30 21:25:15 compute-0 nova_compute[192810]: 2025-09-30 21:25:15.450 2 INFO nova.scheduler.client.report [None req-b2e719dc-21bf-47b2-8e8f-4cb57b9c92a1 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Deleted allocations for instance 2aa71932-03fb-4f75-b359-e1ff961fd8f6
Sep 30 21:25:15 compute-0 nova_compute[192810]: 2025-09-30 21:25:15.517 2 DEBUG oslo_concurrency.lockutils [None req-b2e719dc-21bf-47b2-8e8f-4cb57b9c92a1 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:15 compute-0 nova_compute[192810]: 2025-09-30 21:25:15.518 2 DEBUG oslo_concurrency.lockutils [None req-b2e719dc-21bf-47b2-8e8f-4cb57b9c92a1 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:15 compute-0 nova_compute[192810]: 2025-09-30 21:25:15.551 2 DEBUG nova.compute.provider_tree [None req-b2e719dc-21bf-47b2-8e8f-4cb57b9c92a1 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:25:15 compute-0 nova_compute[192810]: 2025-09-30 21:25:15.569 2 DEBUG nova.scheduler.client.report [None req-b2e719dc-21bf-47b2-8e8f-4cb57b9c92a1 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:25:15 compute-0 nova_compute[192810]: 2025-09-30 21:25:15.597 2 DEBUG oslo_concurrency.lockutils [None req-b2e719dc-21bf-47b2-8e8f-4cb57b9c92a1 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.079s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:15 compute-0 nova_compute[192810]: 2025-09-30 21:25:15.715 2 DEBUG oslo_concurrency.lockutils [None req-b2e719dc-21bf-47b2-8e8f-4cb57b9c92a1 bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Lock "2aa71932-03fb-4f75-b359-e1ff961fd8f6" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 19.836s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:16 compute-0 nova_compute[192810]: 2025-09-30 21:25:16.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:16 compute-0 sshd-session[227453]: Failed password for invalid user deploy from 45.81.23.80 port 57800 ssh2
Sep 30 21:25:17 compute-0 sshd-session[227453]: Received disconnect from 45.81.23.80 port 57800:11: Bye Bye [preauth]
Sep 30 21:25:17 compute-0 sshd-session[227453]: Disconnected from invalid user deploy 45.81.23.80 port 57800 [preauth]
Sep 30 21:25:18 compute-0 nova_compute[192810]: 2025-09-30 21:25:18.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:18 compute-0 podman[227455]: 2025-09-30 21:25:18.338837755 +0000 UTC m=+0.063128476 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:25:18 compute-0 podman[227456]: 2025-09-30 21:25:18.355629461 +0000 UTC m=+0.077478601 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_id=edpm, distribution-scope=public, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, release=1755695350, managed_by=edpm_ansible, maintainer=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, architecture=x86_64)
Sep 30 21:25:19 compute-0 nova_compute[192810]: 2025-09-30 21:25:19.746 2 DEBUG oslo_concurrency.lockutils [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Acquiring lock "2aa71932-03fb-4f75-b359-e1ff961fd8f6" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:19 compute-0 nova_compute[192810]: 2025-09-30 21:25:19.746 2 DEBUG oslo_concurrency.lockutils [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Lock "2aa71932-03fb-4f75-b359-e1ff961fd8f6" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:19 compute-0 nova_compute[192810]: 2025-09-30 21:25:19.747 2 INFO nova.compute.manager [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Unshelving
Sep 30 21:25:19 compute-0 nova_compute[192810]: 2025-09-30 21:25:19.878 2 DEBUG oslo_concurrency.lockutils [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:19 compute-0 nova_compute[192810]: 2025-09-30 21:25:19.879 2 DEBUG oslo_concurrency.lockutils [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:19 compute-0 nova_compute[192810]: 2025-09-30 21:25:19.883 2 DEBUG nova.objects.instance [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Lazy-loading 'pci_requests' on Instance uuid 2aa71932-03fb-4f75-b359-e1ff961fd8f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:25:19 compute-0 nova_compute[192810]: 2025-09-30 21:25:19.904 2 DEBUG nova.objects.instance [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Lazy-loading 'numa_topology' on Instance uuid 2aa71932-03fb-4f75-b359-e1ff961fd8f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:25:19 compute-0 nova_compute[192810]: 2025-09-30 21:25:19.923 2 DEBUG nova.virt.hardware [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:25:19 compute-0 nova_compute[192810]: 2025-09-30 21:25:19.923 2 INFO nova.compute.claims [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:25:20 compute-0 nova_compute[192810]: 2025-09-30 21:25:20.184 2 DEBUG nova.compute.provider_tree [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:25:20 compute-0 nova_compute[192810]: 2025-09-30 21:25:20.199 2 DEBUG nova.scheduler.client.report [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:25:20 compute-0 nova_compute[192810]: 2025-09-30 21:25:20.228 2 DEBUG oslo_concurrency.lockutils [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.349s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:21 compute-0 nova_compute[192810]: 2025-09-30 21:25:21.447 2 DEBUG oslo_concurrency.lockutils [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Acquiring lock "refresh_cache-2aa71932-03fb-4f75-b359-e1ff961fd8f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:25:21 compute-0 nova_compute[192810]: 2025-09-30 21:25:21.448 2 DEBUG oslo_concurrency.lockutils [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Acquired lock "refresh_cache-2aa71932-03fb-4f75-b359-e1ff961fd8f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:25:21 compute-0 nova_compute[192810]: 2025-09-30 21:25:21.449 2 DEBUG nova.network.neutron [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:25:21 compute-0 nova_compute[192810]: 2025-09-30 21:25:21.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:21 compute-0 nova_compute[192810]: 2025-09-30 21:25:21.878 2 DEBUG nova.network.neutron [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:25:22 compute-0 nova_compute[192810]: 2025-09-30 21:25:22.174 2 DEBUG nova.network.neutron [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:25:22 compute-0 nova_compute[192810]: 2025-09-30 21:25:22.198 2 DEBUG oslo_concurrency.lockutils [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Releasing lock "refresh_cache-2aa71932-03fb-4f75-b359-e1ff961fd8f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:25:22 compute-0 nova_compute[192810]: 2025-09-30 21:25:22.200 2 DEBUG nova.virt.libvirt.driver [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:25:22 compute-0 nova_compute[192810]: 2025-09-30 21:25:22.200 2 INFO nova.virt.libvirt.driver [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Creating image(s)
Sep 30 21:25:22 compute-0 nova_compute[192810]: 2025-09-30 21:25:22.201 2 DEBUG oslo_concurrency.lockutils [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Acquiring lock "/var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:22 compute-0 nova_compute[192810]: 2025-09-30 21:25:22.201 2 DEBUG oslo_concurrency.lockutils [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Lock "/var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:22 compute-0 nova_compute[192810]: 2025-09-30 21:25:22.202 2 DEBUG oslo_concurrency.lockutils [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Lock "/var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:22 compute-0 nova_compute[192810]: 2025-09-30 21:25:22.203 2 DEBUG nova.objects.instance [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 2aa71932-03fb-4f75-b359-e1ff961fd8f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:25:22 compute-0 nova_compute[192810]: 2025-09-30 21:25:22.214 2 DEBUG oslo_concurrency.lockutils [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Acquiring lock "edd4aa4c8decf0f5a78c1abc69848feb809aa9e2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:22 compute-0 nova_compute[192810]: 2025-09-30 21:25:22.215 2 DEBUG oslo_concurrency.lockutils [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Lock "edd4aa4c8decf0f5a78c1abc69848feb809aa9e2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:23 compute-0 nova_compute[192810]: 2025-09-30 21:25:23.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:23 compute-0 nova_compute[192810]: 2025-09-30 21:25:23.430 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267508.426719, 2aa71932-03fb-4f75-b359-e1ff961fd8f6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:25:23 compute-0 nova_compute[192810]: 2025-09-30 21:25:23.430 2 INFO nova.compute.manager [-] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] VM Stopped (Lifecycle Event)
Sep 30 21:25:23 compute-0 nova_compute[192810]: 2025-09-30 21:25:23.468 2 DEBUG nova.compute.manager [None req-510e924e-14a6-4b09-a220-713400c40e14 - - - - - -] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:25:24 compute-0 nova_compute[192810]: 2025-09-30 21:25:24.510 2 DEBUG oslo_concurrency.processutils [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/edd4aa4c8decf0f5a78c1abc69848feb809aa9e2.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:25:24 compute-0 nova_compute[192810]: 2025-09-30 21:25:24.614 2 DEBUG oslo_concurrency.processutils [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/edd4aa4c8decf0f5a78c1abc69848feb809aa9e2.part --force-share --output=json" returned: 0 in 0.104s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:25:24 compute-0 nova_compute[192810]: 2025-09-30 21:25:24.616 2 DEBUG nova.virt.images [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] ab266de5-f763-4820-9d1a-6d0e3bbb3d0a was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Sep 30 21:25:24 compute-0 nova_compute[192810]: 2025-09-30 21:25:24.617 2 DEBUG nova.privsep.utils [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Sep 30 21:25:24 compute-0 nova_compute[192810]: 2025-09-30 21:25:24.618 2 DEBUG oslo_concurrency.processutils [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/edd4aa4c8decf0f5a78c1abc69848feb809aa9e2.part /var/lib/nova/instances/_base/edd4aa4c8decf0f5a78c1abc69848feb809aa9e2.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:25:24 compute-0 nova_compute[192810]: 2025-09-30 21:25:24.840 2 DEBUG oslo_concurrency.processutils [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/edd4aa4c8decf0f5a78c1abc69848feb809aa9e2.part /var/lib/nova/instances/_base/edd4aa4c8decf0f5a78c1abc69848feb809aa9e2.converted" returned: 0 in 0.222s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:25:24 compute-0 nova_compute[192810]: 2025-09-30 21:25:24.854 2 DEBUG oslo_concurrency.processutils [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/edd4aa4c8decf0f5a78c1abc69848feb809aa9e2.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:25:24 compute-0 nova_compute[192810]: 2025-09-30 21:25:24.930 2 DEBUG oslo_concurrency.processutils [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/edd4aa4c8decf0f5a78c1abc69848feb809aa9e2.converted --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:25:24 compute-0 nova_compute[192810]: 2025-09-30 21:25:24.932 2 DEBUG oslo_concurrency.lockutils [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Lock "edd4aa4c8decf0f5a78c1abc69848feb809aa9e2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.717s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:24 compute-0 nova_compute[192810]: 2025-09-30 21:25:24.952 2 DEBUG oslo_concurrency.processutils [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/edd4aa4c8decf0f5a78c1abc69848feb809aa9e2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:25:25 compute-0 nova_compute[192810]: 2025-09-30 21:25:25.050 2 DEBUG oslo_concurrency.processutils [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/edd4aa4c8decf0f5a78c1abc69848feb809aa9e2 --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:25:25 compute-0 nova_compute[192810]: 2025-09-30 21:25:25.052 2 DEBUG oslo_concurrency.lockutils [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Acquiring lock "edd4aa4c8decf0f5a78c1abc69848feb809aa9e2" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:25 compute-0 nova_compute[192810]: 2025-09-30 21:25:25.052 2 DEBUG oslo_concurrency.lockutils [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Lock "edd4aa4c8decf0f5a78c1abc69848feb809aa9e2" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:25 compute-0 nova_compute[192810]: 2025-09-30 21:25:25.069 2 DEBUG oslo_concurrency.processutils [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/edd4aa4c8decf0f5a78c1abc69848feb809aa9e2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:25:25 compute-0 nova_compute[192810]: 2025-09-30 21:25:25.154 2 DEBUG oslo_concurrency.processutils [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/edd4aa4c8decf0f5a78c1abc69848feb809aa9e2 --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:25:25 compute-0 nova_compute[192810]: 2025-09-30 21:25:25.155 2 DEBUG oslo_concurrency.processutils [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/edd4aa4c8decf0f5a78c1abc69848feb809aa9e2,backing_fmt=raw /var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:25:25 compute-0 nova_compute[192810]: 2025-09-30 21:25:25.196 2 DEBUG oslo_concurrency.processutils [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/edd4aa4c8decf0f5a78c1abc69848feb809aa9e2,backing_fmt=raw /var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:25:25 compute-0 nova_compute[192810]: 2025-09-30 21:25:25.197 2 DEBUG oslo_concurrency.lockutils [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Lock "edd4aa4c8decf0f5a78c1abc69848feb809aa9e2" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.145s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:25 compute-0 nova_compute[192810]: 2025-09-30 21:25:25.198 2 DEBUG oslo_concurrency.processutils [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/edd4aa4c8decf0f5a78c1abc69848feb809aa9e2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:25:25 compute-0 nova_compute[192810]: 2025-09-30 21:25:25.250 2 DEBUG oslo_concurrency.processutils [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/edd4aa4c8decf0f5a78c1abc69848feb809aa9e2 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:25:25 compute-0 nova_compute[192810]: 2025-09-30 21:25:25.251 2 DEBUG nova.objects.instance [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Lazy-loading 'migration_context' on Instance uuid 2aa71932-03fb-4f75-b359-e1ff961fd8f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:25:25 compute-0 nova_compute[192810]: 2025-09-30 21:25:25.269 2 INFO nova.virt.libvirt.driver [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Rebasing disk image.
Sep 30 21:25:25 compute-0 nova_compute[192810]: 2025-09-30 21:25:25.269 2 DEBUG oslo_concurrency.processutils [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:25:25 compute-0 nova_compute[192810]: 2025-09-30 21:25:25.327 2 DEBUG oslo_concurrency.processutils [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:25:25 compute-0 nova_compute[192810]: 2025-09-30 21:25:25.329 2 DEBUG oslo_concurrency.processutils [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Running cmd (subprocess): qemu-img rebase -b /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a -F raw /var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:25:25 compute-0 podman[227524]: 2025-09-30 21:25:25.372503079 +0000 UTC m=+0.086034102 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, tcib_managed=true, container_name=iscsid)
Sep 30 21:25:26 compute-0 nova_compute[192810]: 2025-09-30 21:25:26.475 2 DEBUG oslo_concurrency.processutils [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] CMD "qemu-img rebase -b /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a -F raw /var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/disk" returned: 0 in 1.146s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:25:26 compute-0 nova_compute[192810]: 2025-09-30 21:25:26.476 2 DEBUG nova.virt.libvirt.driver [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:25:26 compute-0 nova_compute[192810]: 2025-09-30 21:25:26.476 2 DEBUG nova.virt.libvirt.driver [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Ensure instance console log exists: /var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:25:26 compute-0 nova_compute[192810]: 2025-09-30 21:25:26.477 2 DEBUG oslo_concurrency.lockutils [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:26 compute-0 nova_compute[192810]: 2025-09-30 21:25:26.477 2 DEBUG oslo_concurrency.lockutils [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:26 compute-0 nova_compute[192810]: 2025-09-30 21:25:26.478 2 DEBUG oslo_concurrency.lockutils [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:26 compute-0 nova_compute[192810]: 2025-09-30 21:25:26.479 2 DEBUG nova.virt.libvirt.driver [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='acf5914e6fcd838c548c5bec20bf328b',container_format='bare',created_at=2025-09-30T21:24:55Z,direct_url=<?>,disk_format='qcow2',id=ab266de5-f763-4820-9d1a-6d0e3bbb3d0a,min_disk=1,min_ram=0,name='tempest-UnshelveToHostMultiNodesTest-server-1659572675-shelved',owner='1124d87211d04502bce5e44c25ed5d3c',properties=ImageMetaProps,protected=<?>,size=52232192,status='active',tags=<?>,updated_at=2025-09-30T21:25:13Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:25:26 compute-0 nova_compute[192810]: 2025-09-30 21:25:26.484 2 WARNING nova.virt.libvirt.driver [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:25:26 compute-0 nova_compute[192810]: 2025-09-30 21:25:26.487 2 DEBUG nova.virt.libvirt.host [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:25:26 compute-0 nova_compute[192810]: 2025-09-30 21:25:26.488 2 DEBUG nova.virt.libvirt.host [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:25:26 compute-0 nova_compute[192810]: 2025-09-30 21:25:26.491 2 DEBUG nova.virt.libvirt.host [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:25:26 compute-0 nova_compute[192810]: 2025-09-30 21:25:26.492 2 DEBUG nova.virt.libvirt.host [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:25:26 compute-0 nova_compute[192810]: 2025-09-30 21:25:26.493 2 DEBUG nova.virt.libvirt.driver [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:25:26 compute-0 nova_compute[192810]: 2025-09-30 21:25:26.493 2 DEBUG nova.virt.hardware [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='acf5914e6fcd838c548c5bec20bf328b',container_format='bare',created_at=2025-09-30T21:24:55Z,direct_url=<?>,disk_format='qcow2',id=ab266de5-f763-4820-9d1a-6d0e3bbb3d0a,min_disk=1,min_ram=0,name='tempest-UnshelveToHostMultiNodesTest-server-1659572675-shelved',owner='1124d87211d04502bce5e44c25ed5d3c',properties=ImageMetaProps,protected=<?>,size=52232192,status='active',tags=<?>,updated_at=2025-09-30T21:25:13Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:25:26 compute-0 nova_compute[192810]: 2025-09-30 21:25:26.493 2 DEBUG nova.virt.hardware [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:25:26 compute-0 nova_compute[192810]: 2025-09-30 21:25:26.494 2 DEBUG nova.virt.hardware [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:25:26 compute-0 nova_compute[192810]: 2025-09-30 21:25:26.494 2 DEBUG nova.virt.hardware [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:25:26 compute-0 nova_compute[192810]: 2025-09-30 21:25:26.494 2 DEBUG nova.virt.hardware [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:25:26 compute-0 nova_compute[192810]: 2025-09-30 21:25:26.494 2 DEBUG nova.virt.hardware [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:25:26 compute-0 nova_compute[192810]: 2025-09-30 21:25:26.494 2 DEBUG nova.virt.hardware [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:25:26 compute-0 nova_compute[192810]: 2025-09-30 21:25:26.495 2 DEBUG nova.virt.hardware [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:25:26 compute-0 nova_compute[192810]: 2025-09-30 21:25:26.495 2 DEBUG nova.virt.hardware [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:25:26 compute-0 nova_compute[192810]: 2025-09-30 21:25:26.495 2 DEBUG nova.virt.hardware [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:25:26 compute-0 nova_compute[192810]: 2025-09-30 21:25:26.495 2 DEBUG nova.virt.hardware [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:25:26 compute-0 nova_compute[192810]: 2025-09-30 21:25:26.495 2 DEBUG nova.objects.instance [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 2aa71932-03fb-4f75-b359-e1ff961fd8f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:25:26 compute-0 nova_compute[192810]: 2025-09-30 21:25:26.528 2 DEBUG nova.objects.instance [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2aa71932-03fb-4f75-b359-e1ff961fd8f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:25:26 compute-0 nova_compute[192810]: 2025-09-30 21:25:26.552 2 DEBUG nova.virt.libvirt.driver [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:25:26 compute-0 nova_compute[192810]:   <uuid>2aa71932-03fb-4f75-b359-e1ff961fd8f6</uuid>
Sep 30 21:25:26 compute-0 nova_compute[192810]:   <name>instance-00000033</name>
Sep 30 21:25:26 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:25:26 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:25:26 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:25:26 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:25:26 compute-0 nova_compute[192810]:       <nova:name>tempest-UnshelveToHostMultiNodesTest-server-1659572675</nova:name>
Sep 30 21:25:26 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:25:26</nova:creationTime>
Sep 30 21:25:26 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:25:26 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:25:26 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:25:26 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:25:26 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:25:26 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:25:26 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:25:26 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:25:26 compute-0 nova_compute[192810]:         <nova:user uuid="bd6d636573fd4d899b5e6f4a1f7e1790">tempest-UnshelveToHostMultiNodesTest-98679670-project-member</nova:user>
Sep 30 21:25:26 compute-0 nova_compute[192810]:         <nova:project uuid="1124d87211d04502bce5e44c25ed5d3c">tempest-UnshelveToHostMultiNodesTest-98679670</nova:project>
Sep 30 21:25:26 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:25:26 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="ab266de5-f763-4820-9d1a-6d0e3bbb3d0a"/>
Sep 30 21:25:26 compute-0 nova_compute[192810]:       <nova:ports/>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:25:26 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:25:26 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:25:26 compute-0 nova_compute[192810]:     <system>
Sep 30 21:25:26 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:25:26 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:25:26 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:25:26 compute-0 nova_compute[192810]:       <entry name="serial">2aa71932-03fb-4f75-b359-e1ff961fd8f6</entry>
Sep 30 21:25:26 compute-0 nova_compute[192810]:       <entry name="uuid">2aa71932-03fb-4f75-b359-e1ff961fd8f6</entry>
Sep 30 21:25:26 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     </system>
Sep 30 21:25:26 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:25:26 compute-0 nova_compute[192810]:   <os>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:25:26 compute-0 nova_compute[192810]:   </os>
Sep 30 21:25:26 compute-0 nova_compute[192810]:   <features>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:25:26 compute-0 nova_compute[192810]:   </features>
Sep 30 21:25:26 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:25:26 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:25:26 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:25:26 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:25:26 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:25:26 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:25:26 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:25:26 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:25:26 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/disk"/>
Sep 30 21:25:26 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:25:26 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:25:26 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/disk.config"/>
Sep 30 21:25:26 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:25:26 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/console.log" append="off"/>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     <video>
Sep 30 21:25:26 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     </video>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     <input type="keyboard" bus="usb"/>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:25:26 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:25:26 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:25:26 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:25:26 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:25:26 compute-0 nova_compute[192810]: </domain>
Sep 30 21:25:26 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:25:26 compute-0 nova_compute[192810]: 2025-09-30 21:25:26.644 2 DEBUG nova.virt.libvirt.driver [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:25:26 compute-0 nova_compute[192810]: 2025-09-30 21:25:26.645 2 DEBUG nova.virt.libvirt.driver [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:25:26 compute-0 nova_compute[192810]: 2025-09-30 21:25:26.645 2 INFO nova.virt.libvirt.driver [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Using config drive
Sep 30 21:25:26 compute-0 podman[227553]: 2025-09-30 21:25:26.673558908 +0000 UTC m=+0.071716518 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:25:26 compute-0 podman[227555]: 2025-09-30 21:25:26.678297796 +0000 UTC m=+0.073997565 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:25:26 compute-0 nova_compute[192810]: 2025-09-30 21:25:26.677 2 DEBUG nova.objects.instance [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 2aa71932-03fb-4f75-b359-e1ff961fd8f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:25:26 compute-0 nova_compute[192810]: 2025-09-30 21:25:26.751 2 DEBUG nova.objects.instance [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Lazy-loading 'keypairs' on Instance uuid 2aa71932-03fb-4f75-b359-e1ff961fd8f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:25:26 compute-0 nova_compute[192810]: 2025-09-30 21:25:26.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:27 compute-0 nova_compute[192810]: 2025-09-30 21:25:27.118 2 INFO nova.virt.libvirt.driver [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Creating config drive at /var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/disk.config
Sep 30 21:25:27 compute-0 nova_compute[192810]: 2025-09-30 21:25:27.123 2 DEBUG oslo_concurrency.processutils [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnnepp0dh execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:25:27 compute-0 nova_compute[192810]: 2025-09-30 21:25:27.255 2 DEBUG oslo_concurrency.processutils [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnnepp0dh" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:25:27 compute-0 systemd-machined[152794]: New machine qemu-26-instance-00000033.
Sep 30 21:25:27 compute-0 systemd[1]: Started Virtual Machine qemu-26-instance-00000033.
Sep 30 21:25:28 compute-0 nova_compute[192810]: 2025-09-30 21:25:28.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:28 compute-0 nova_compute[192810]: 2025-09-30 21:25:28.324 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267528.324274, 2aa71932-03fb-4f75-b359-e1ff961fd8f6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:25:28 compute-0 nova_compute[192810]: 2025-09-30 21:25:28.325 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] VM Resumed (Lifecycle Event)
Sep 30 21:25:28 compute-0 nova_compute[192810]: 2025-09-30 21:25:28.328 2 DEBUG nova.compute.manager [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:25:28 compute-0 nova_compute[192810]: 2025-09-30 21:25:28.328 2 DEBUG nova.virt.libvirt.driver [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:25:28 compute-0 nova_compute[192810]: 2025-09-30 21:25:28.332 2 INFO nova.virt.libvirt.driver [-] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Instance spawned successfully.
Sep 30 21:25:28 compute-0 nova_compute[192810]: 2025-09-30 21:25:28.348 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:25:28 compute-0 nova_compute[192810]: 2025-09-30 21:25:28.352 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:25:28 compute-0 nova_compute[192810]: 2025-09-30 21:25:28.385 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:25:28 compute-0 nova_compute[192810]: 2025-09-30 21:25:28.386 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267528.3252695, 2aa71932-03fb-4f75-b359-e1ff961fd8f6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:25:28 compute-0 nova_compute[192810]: 2025-09-30 21:25:28.386 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] VM Started (Lifecycle Event)
Sep 30 21:25:28 compute-0 nova_compute[192810]: 2025-09-30 21:25:28.411 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:25:28 compute-0 nova_compute[192810]: 2025-09-30 21:25:28.414 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:25:28 compute-0 nova_compute[192810]: 2025-09-30 21:25:28.435 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:25:29 compute-0 nova_compute[192810]: 2025-09-30 21:25:29.512 2 DEBUG nova.compute.manager [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:25:29 compute-0 nova_compute[192810]: 2025-09-30 21:25:29.650 2 DEBUG oslo_concurrency.lockutils [None req-a4def9ca-4ace-4f6a-9093-8dddf535a25d 47a3c0c272fe46d9b3b28c8ba8eca104 4330c02c7d4c45149ea6d48a9cf52cf3 - - default default] Lock "2aa71932-03fb-4f75-b359-e1ff961fd8f6" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 9.904s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:31 compute-0 nova_compute[192810]: 2025-09-30 21:25:31.652 2 DEBUG oslo_concurrency.lockutils [None req-5de4c99c-d048-4792-9d03-ef275b660bfa bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Acquiring lock "2aa71932-03fb-4f75-b359-e1ff961fd8f6" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:31 compute-0 nova_compute[192810]: 2025-09-30 21:25:31.653 2 DEBUG oslo_concurrency.lockutils [None req-5de4c99c-d048-4792-9d03-ef275b660bfa bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Lock "2aa71932-03fb-4f75-b359-e1ff961fd8f6" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:31 compute-0 nova_compute[192810]: 2025-09-30 21:25:31.653 2 INFO nova.compute.manager [None req-5de4c99c-d048-4792-9d03-ef275b660bfa bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Shelving
Sep 30 21:25:31 compute-0 nova_compute[192810]: 2025-09-30 21:25:31.701 2 DEBUG nova.virt.libvirt.driver [None req-5de4c99c-d048-4792-9d03-ef275b660bfa bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Sep 30 21:25:31 compute-0 nova_compute[192810]: 2025-09-30 21:25:31.838 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:33 compute-0 nova_compute[192810]: 2025-09-30 21:25:33.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:36 compute-0 podman[227626]: 2025-09-30 21:25:36.371926751 +0000 UTC m=+0.091254903 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_managed=true)
Sep 30 21:25:36 compute-0 podman[227625]: 2025-09-30 21:25:36.38806216 +0000 UTC m=+0.115478372 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Sep 30 21:25:36 compute-0 nova_compute[192810]: 2025-09-30 21:25:36.840 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:37 compute-0 nova_compute[192810]: 2025-09-30 21:25:37.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:25:38 compute-0 nova_compute[192810]: 2025-09-30 21:25:38.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:25:38.730 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:25:38.730 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:25:38.730 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:39 compute-0 nova_compute[192810]: 2025-09-30 21:25:39.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:25:41 compute-0 nova_compute[192810]: 2025-09-30 21:25:41.745 2 DEBUG nova.virt.libvirt.driver [None req-5de4c99c-d048-4792-9d03-ef275b660bfa bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Sep 30 21:25:41 compute-0 nova_compute[192810]: 2025-09-30 21:25:41.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:25:41 compute-0 nova_compute[192810]: 2025-09-30 21:25:41.867 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:42 compute-0 podman[227681]: 2025-09-30 21:25:42.365650196 +0000 UTC m=+0.095127018 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923)
Sep 30 21:25:43 compute-0 nova_compute[192810]: 2025-09-30 21:25:43.253 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:43 compute-0 nova_compute[192810]: 2025-09-30 21:25:43.783 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:25:43 compute-0 nova_compute[192810]: 2025-09-30 21:25:43.783 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:25:43 compute-0 nova_compute[192810]: 2025-09-30 21:25:43.802 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:25:43 compute-0 nova_compute[192810]: 2025-09-30 21:25:43.802 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:25:43 compute-0 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d00000033.scope: Deactivated successfully.
Sep 30 21:25:43 compute-0 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d00000033.scope: Consumed 13.774s CPU time.
Sep 30 21:25:43 compute-0 systemd-machined[152794]: Machine qemu-26-instance-00000033 terminated.
Sep 30 21:25:44 compute-0 nova_compute[192810]: 2025-09-30 21:25:44.759 2 INFO nova.virt.libvirt.driver [None req-5de4c99c-d048-4792-9d03-ef275b660bfa bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Instance shutdown successfully after 13 seconds.
Sep 30 21:25:44 compute-0 nova_compute[192810]: 2025-09-30 21:25:44.765 2 INFO nova.virt.libvirt.driver [-] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Instance destroyed successfully.
Sep 30 21:25:44 compute-0 nova_compute[192810]: 2025-09-30 21:25:44.765 2 DEBUG nova.objects.instance [None req-5de4c99c-d048-4792-9d03-ef275b660bfa bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Lazy-loading 'numa_topology' on Instance uuid 2aa71932-03fb-4f75-b359-e1ff961fd8f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:25:44 compute-0 nova_compute[192810]: 2025-09-30 21:25:44.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:25:45 compute-0 nova_compute[192810]: 2025-09-30 21:25:45.213 2 INFO nova.virt.libvirt.driver [None req-5de4c99c-d048-4792-9d03-ef275b660bfa bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Beginning cold snapshot process
Sep 30 21:25:45 compute-0 nova_compute[192810]: 2025-09-30 21:25:45.404 2 DEBUG nova.privsep.utils [None req-5de4c99c-d048-4792-9d03-ef275b660bfa bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Sep 30 21:25:45 compute-0 nova_compute[192810]: 2025-09-30 21:25:45.405 2 DEBUG oslo_concurrency.processutils [None req-5de4c99c-d048-4792-9d03-ef275b660bfa bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/disk /var/lib/nova/instances/snapshots/tmp82fxyu2c/27d93ab6ecc848a8a40f99409ac82a3d execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:25:45 compute-0 nova_compute[192810]: 2025-09-30 21:25:45.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:25:45 compute-0 nova_compute[192810]: 2025-09-30 21:25:45.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:25:45 compute-0 nova_compute[192810]: 2025-09-30 21:25:45.802 2 DEBUG oslo_concurrency.processutils [None req-5de4c99c-d048-4792-9d03-ef275b660bfa bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/disk /var/lib/nova/instances/snapshots/tmp82fxyu2c/27d93ab6ecc848a8a40f99409ac82a3d" returned: 0 in 0.397s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:25:45 compute-0 nova_compute[192810]: 2025-09-30 21:25:45.803 2 INFO nova.virt.libvirt.driver [None req-5de4c99c-d048-4792-9d03-ef275b660bfa bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Snapshot extracted, beginning image upload
Sep 30 21:25:45 compute-0 nova_compute[192810]: 2025-09-30 21:25:45.835 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:45 compute-0 nova_compute[192810]: 2025-09-30 21:25:45.836 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:45 compute-0 nova_compute[192810]: 2025-09-30 21:25:45.836 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:45 compute-0 nova_compute[192810]: 2025-09-30 21:25:45.837 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:25:45 compute-0 nova_compute[192810]: 2025-09-30 21:25:45.922 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:25:46 compute-0 nova_compute[192810]: 2025-09-30 21:25:46.009 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:25:46 compute-0 nova_compute[192810]: 2025-09-30 21:25:46.011 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:25:46 compute-0 nova_compute[192810]: 2025-09-30 21:25:46.092 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:25:46 compute-0 nova_compute[192810]: 2025-09-30 21:25:46.267 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:25:46 compute-0 nova_compute[192810]: 2025-09-30 21:25:46.268 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5744MB free_disk=73.2834358215332GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:25:46 compute-0 nova_compute[192810]: 2025-09-30 21:25:46.269 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:46 compute-0 nova_compute[192810]: 2025-09-30 21:25:46.269 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:46 compute-0 nova_compute[192810]: 2025-09-30 21:25:46.423 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Instance 2aa71932-03fb-4f75-b359-e1ff961fd8f6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:25:46 compute-0 nova_compute[192810]: 2025-09-30 21:25:46.424 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:25:46 compute-0 nova_compute[192810]: 2025-09-30 21:25:46.424 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:25:46 compute-0 nova_compute[192810]: 2025-09-30 21:25:46.483 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:25:46 compute-0 nova_compute[192810]: 2025-09-30 21:25:46.499 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:25:46 compute-0 nova_compute[192810]: 2025-09-30 21:25:46.527 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:25:46 compute-0 nova_compute[192810]: 2025-09-30 21:25:46.527 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.258s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:46 compute-0 nova_compute[192810]: 2025-09-30 21:25:46.869 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:47 compute-0 nova_compute[192810]: 2025-09-30 21:25:47.528 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:25:47 compute-0 nova_compute[192810]: 2025-09-30 21:25:47.528 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:25:47 compute-0 nova_compute[192810]: 2025-09-30 21:25:47.529 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:25:47 compute-0 nova_compute[192810]: 2025-09-30 21:25:47.554 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "refresh_cache-2aa71932-03fb-4f75-b359-e1ff961fd8f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:25:47 compute-0 nova_compute[192810]: 2025-09-30 21:25:47.555 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquired lock "refresh_cache-2aa71932-03fb-4f75-b359-e1ff961fd8f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:25:47 compute-0 nova_compute[192810]: 2025-09-30 21:25:47.555 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Sep 30 21:25:47 compute-0 nova_compute[192810]: 2025-09-30 21:25:47.555 2 DEBUG nova.objects.instance [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lazy-loading 'info_cache' on Instance uuid 2aa71932-03fb-4f75-b359-e1ff961fd8f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:25:47 compute-0 nova_compute[192810]: 2025-09-30 21:25:47.785 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:25:48 compute-0 nova_compute[192810]: 2025-09-30 21:25:48.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:48 compute-0 nova_compute[192810]: 2025-09-30 21:25:48.521 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:25:48 compute-0 nova_compute[192810]: 2025-09-30 21:25:48.552 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Releasing lock "refresh_cache-2aa71932-03fb-4f75-b359-e1ff961fd8f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:25:48 compute-0 nova_compute[192810]: 2025-09-30 21:25:48.553 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Sep 30 21:25:48 compute-0 nova_compute[192810]: 2025-09-30 21:25:48.993 2 INFO nova.virt.libvirt.driver [None req-5de4c99c-d048-4792-9d03-ef275b660bfa bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Snapshot image upload complete
Sep 30 21:25:48 compute-0 nova_compute[192810]: 2025-09-30 21:25:48.994 2 DEBUG nova.compute.manager [None req-5de4c99c-d048-4792-9d03-ef275b660bfa bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:25:49 compute-0 nova_compute[192810]: 2025-09-30 21:25:49.109 2 INFO nova.compute.manager [None req-5de4c99c-d048-4792-9d03-ef275b660bfa bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Shelve offloading
Sep 30 21:25:49 compute-0 nova_compute[192810]: 2025-09-30 21:25:49.143 2 INFO nova.virt.libvirt.driver [-] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Instance destroyed successfully.
Sep 30 21:25:49 compute-0 nova_compute[192810]: 2025-09-30 21:25:49.143 2 DEBUG nova.compute.manager [None req-5de4c99c-d048-4792-9d03-ef275b660bfa bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:25:49 compute-0 nova_compute[192810]: 2025-09-30 21:25:49.145 2 DEBUG oslo_concurrency.lockutils [None req-5de4c99c-d048-4792-9d03-ef275b660bfa bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Acquiring lock "refresh_cache-2aa71932-03fb-4f75-b359-e1ff961fd8f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:25:49 compute-0 nova_compute[192810]: 2025-09-30 21:25:49.146 2 DEBUG oslo_concurrency.lockutils [None req-5de4c99c-d048-4792-9d03-ef275b660bfa bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Acquired lock "refresh_cache-2aa71932-03fb-4f75-b359-e1ff961fd8f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:25:49 compute-0 nova_compute[192810]: 2025-09-30 21:25:49.146 2 DEBUG nova.network.neutron [None req-5de4c99c-d048-4792-9d03-ef275b660bfa bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:25:49 compute-0 podman[227726]: 2025-09-30 21:25:49.312303504 +0000 UTC m=+0.050043300 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., name=ubi9-minimal, distribution-scope=public, io.buildah.version=1.33.7, config_id=edpm, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter)
Sep 30 21:25:49 compute-0 podman[227725]: 2025-09-30 21:25:49.332220096 +0000 UTC m=+0.072956846 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:25:49 compute-0 nova_compute[192810]: 2025-09-30 21:25:49.460 2 DEBUG nova.network.neutron [None req-5de4c99c-d048-4792-9d03-ef275b660bfa bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:25:50 compute-0 nova_compute[192810]: 2025-09-30 21:25:50.459 2 DEBUG nova.network.neutron [None req-5de4c99c-d048-4792-9d03-ef275b660bfa bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:25:50 compute-0 nova_compute[192810]: 2025-09-30 21:25:50.494 2 DEBUG oslo_concurrency.lockutils [None req-5de4c99c-d048-4792-9d03-ef275b660bfa bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Releasing lock "refresh_cache-2aa71932-03fb-4f75-b359-e1ff961fd8f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:25:50 compute-0 nova_compute[192810]: 2025-09-30 21:25:50.503 2 INFO nova.virt.libvirt.driver [-] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Instance destroyed successfully.
Sep 30 21:25:50 compute-0 nova_compute[192810]: 2025-09-30 21:25:50.504 2 DEBUG nova.objects.instance [None req-5de4c99c-d048-4792-9d03-ef275b660bfa bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Lazy-loading 'resources' on Instance uuid 2aa71932-03fb-4f75-b359-e1ff961fd8f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:25:50 compute-0 nova_compute[192810]: 2025-09-30 21:25:50.526 2 INFO nova.virt.libvirt.driver [None req-5de4c99c-d048-4792-9d03-ef275b660bfa bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Deleting instance files /var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6_del
Sep 30 21:25:50 compute-0 nova_compute[192810]: 2025-09-30 21:25:50.536 2 INFO nova.virt.libvirt.driver [None req-5de4c99c-d048-4792-9d03-ef275b660bfa bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Deletion of /var/lib/nova/instances/2aa71932-03fb-4f75-b359-e1ff961fd8f6_del complete
Sep 30 21:25:50 compute-0 nova_compute[192810]: 2025-09-30 21:25:50.718 2 INFO nova.scheduler.client.report [None req-5de4c99c-d048-4792-9d03-ef275b660bfa bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Deleted allocations for instance 2aa71932-03fb-4f75-b359-e1ff961fd8f6
Sep 30 21:25:50 compute-0 nova_compute[192810]: 2025-09-30 21:25:50.799 2 DEBUG oslo_concurrency.lockutils [None req-5de4c99c-d048-4792-9d03-ef275b660bfa bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:50 compute-0 nova_compute[192810]: 2025-09-30 21:25:50.800 2 DEBUG oslo_concurrency.lockutils [None req-5de4c99c-d048-4792-9d03-ef275b660bfa bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:50 compute-0 nova_compute[192810]: 2025-09-30 21:25:50.857 2 DEBUG nova.compute.provider_tree [None req-5de4c99c-d048-4792-9d03-ef275b660bfa bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:25:50 compute-0 nova_compute[192810]: 2025-09-30 21:25:50.872 2 DEBUG nova.scheduler.client.report [None req-5de4c99c-d048-4792-9d03-ef275b660bfa bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:25:50 compute-0 nova_compute[192810]: 2025-09-30 21:25:50.901 2 DEBUG oslo_concurrency.lockutils [None req-5de4c99c-d048-4792-9d03-ef275b660bfa bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:50 compute-0 nova_compute[192810]: 2025-09-30 21:25:50.957 2 DEBUG oslo_concurrency.lockutils [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquiring lock "be420b9b-7858-47b5-93b5-4c99f93efeed" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:50 compute-0 nova_compute[192810]: 2025-09-30 21:25:50.958 2 DEBUG oslo_concurrency.lockutils [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "be420b9b-7858-47b5-93b5-4c99f93efeed" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:50 compute-0 nova_compute[192810]: 2025-09-30 21:25:50.979 2 DEBUG nova.compute.manager [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:25:50 compute-0 nova_compute[192810]: 2025-09-30 21:25:50.984 2 DEBUG oslo_concurrency.lockutils [None req-5de4c99c-d048-4792-9d03-ef275b660bfa bd6d636573fd4d899b5e6f4a1f7e1790 1124d87211d04502bce5e44c25ed5d3c - - default default] Lock "2aa71932-03fb-4f75-b359-e1ff961fd8f6" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 19.331s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:51 compute-0 nova_compute[192810]: 2025-09-30 21:25:51.078 2 DEBUG oslo_concurrency.lockutils [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:51 compute-0 nova_compute[192810]: 2025-09-30 21:25:51.079 2 DEBUG oslo_concurrency.lockutils [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:51 compute-0 nova_compute[192810]: 2025-09-30 21:25:51.089 2 DEBUG nova.virt.hardware [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:25:51 compute-0 nova_compute[192810]: 2025-09-30 21:25:51.090 2 INFO nova.compute.claims [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:25:51 compute-0 nova_compute[192810]: 2025-09-30 21:25:51.258 2 DEBUG nova.compute.provider_tree [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:25:51 compute-0 nova_compute[192810]: 2025-09-30 21:25:51.273 2 DEBUG nova.scheduler.client.report [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:25:51 compute-0 nova_compute[192810]: 2025-09-30 21:25:51.310 2 DEBUG oslo_concurrency.lockutils [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.231s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:51 compute-0 nova_compute[192810]: 2025-09-30 21:25:51.311 2 DEBUG nova.compute.manager [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:25:51 compute-0 nova_compute[192810]: 2025-09-30 21:25:51.364 2 DEBUG nova.compute.manager [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:25:51 compute-0 nova_compute[192810]: 2025-09-30 21:25:51.365 2 DEBUG nova.network.neutron [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:25:51 compute-0 nova_compute[192810]: 2025-09-30 21:25:51.381 2 INFO nova.virt.libvirt.driver [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:25:51 compute-0 nova_compute[192810]: 2025-09-30 21:25:51.403 2 DEBUG nova.compute.manager [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:25:51 compute-0 nova_compute[192810]: 2025-09-30 21:25:51.557 2 DEBUG nova.compute.manager [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:25:51 compute-0 nova_compute[192810]: 2025-09-30 21:25:51.558 2 DEBUG nova.virt.libvirt.driver [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:25:51 compute-0 nova_compute[192810]: 2025-09-30 21:25:51.558 2 INFO nova.virt.libvirt.driver [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Creating image(s)
Sep 30 21:25:51 compute-0 nova_compute[192810]: 2025-09-30 21:25:51.559 2 DEBUG oslo_concurrency.lockutils [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquiring lock "/var/lib/nova/instances/be420b9b-7858-47b5-93b5-4c99f93efeed/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:51 compute-0 nova_compute[192810]: 2025-09-30 21:25:51.559 2 DEBUG oslo_concurrency.lockutils [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "/var/lib/nova/instances/be420b9b-7858-47b5-93b5-4c99f93efeed/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:51 compute-0 nova_compute[192810]: 2025-09-30 21:25:51.559 2 DEBUG oslo_concurrency.lockutils [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "/var/lib/nova/instances/be420b9b-7858-47b5-93b5-4c99f93efeed/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:51 compute-0 nova_compute[192810]: 2025-09-30 21:25:51.572 2 DEBUG oslo_concurrency.processutils [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:25:51 compute-0 nova_compute[192810]: 2025-09-30 21:25:51.652 2 DEBUG oslo_concurrency.processutils [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:25:51 compute-0 nova_compute[192810]: 2025-09-30 21:25:51.653 2 DEBUG oslo_concurrency.lockutils [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:51 compute-0 nova_compute[192810]: 2025-09-30 21:25:51.654 2 DEBUG oslo_concurrency.lockutils [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:51 compute-0 nova_compute[192810]: 2025-09-30 21:25:51.672 2 DEBUG oslo_concurrency.processutils [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:25:51 compute-0 nova_compute[192810]: 2025-09-30 21:25:51.706 2 DEBUG nova.policy [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bfe43dba9d03417182dd245d360568e6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c4bb94b19ac546f195f1f1f35411cce9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:25:51 compute-0 nova_compute[192810]: 2025-09-30 21:25:51.733 2 DEBUG oslo_concurrency.processutils [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:25:51 compute-0 nova_compute[192810]: 2025-09-30 21:25:51.734 2 DEBUG oslo_concurrency.processutils [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/be420b9b-7858-47b5-93b5-4c99f93efeed/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:25:51 compute-0 nova_compute[192810]: 2025-09-30 21:25:51.804 2 DEBUG oslo_concurrency.processutils [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/be420b9b-7858-47b5-93b5-4c99f93efeed/disk 1073741824" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:25:51 compute-0 nova_compute[192810]: 2025-09-30 21:25:51.805 2 DEBUG oslo_concurrency.lockutils [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:51 compute-0 nova_compute[192810]: 2025-09-30 21:25:51.805 2 DEBUG oslo_concurrency.processutils [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:25:51 compute-0 nova_compute[192810]: 2025-09-30 21:25:51.867 2 DEBUG oslo_concurrency.processutils [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:25:51 compute-0 nova_compute[192810]: 2025-09-30 21:25:51.869 2 DEBUG nova.virt.disk.api [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Checking if we can resize image /var/lib/nova/instances/be420b9b-7858-47b5-93b5-4c99f93efeed/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:25:51 compute-0 nova_compute[192810]: 2025-09-30 21:25:51.869 2 DEBUG oslo_concurrency.processutils [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/be420b9b-7858-47b5-93b5-4c99f93efeed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:25:51 compute-0 nova_compute[192810]: 2025-09-30 21:25:51.892 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:51 compute-0 nova_compute[192810]: 2025-09-30 21:25:51.932 2 DEBUG oslo_concurrency.processutils [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/be420b9b-7858-47b5-93b5-4c99f93efeed/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:25:51 compute-0 nova_compute[192810]: 2025-09-30 21:25:51.933 2 DEBUG nova.virt.disk.api [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Cannot resize image /var/lib/nova/instances/be420b9b-7858-47b5-93b5-4c99f93efeed/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:25:51 compute-0 nova_compute[192810]: 2025-09-30 21:25:51.933 2 DEBUG nova.objects.instance [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lazy-loading 'migration_context' on Instance uuid be420b9b-7858-47b5-93b5-4c99f93efeed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:25:51 compute-0 nova_compute[192810]: 2025-09-30 21:25:51.961 2 DEBUG nova.virt.libvirt.driver [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:25:51 compute-0 nova_compute[192810]: 2025-09-30 21:25:51.962 2 DEBUG nova.virt.libvirt.driver [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Ensure instance console log exists: /var/lib/nova/instances/be420b9b-7858-47b5-93b5-4c99f93efeed/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:25:51 compute-0 nova_compute[192810]: 2025-09-30 21:25:51.962 2 DEBUG oslo_concurrency.lockutils [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:51 compute-0 nova_compute[192810]: 2025-09-30 21:25:51.963 2 DEBUG oslo_concurrency.lockutils [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:51 compute-0 nova_compute[192810]: 2025-09-30 21:25:51.963 2 DEBUG oslo_concurrency.lockutils [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:53 compute-0 nova_compute[192810]: 2025-09-30 21:25:53.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:53 compute-0 nova_compute[192810]: 2025-09-30 21:25:53.517 2 DEBUG nova.network.neutron [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Successfully created port: 70dc125e-319e-4e53-8804-bf4802e8790e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:25:55 compute-0 nova_compute[192810]: 2025-09-30 21:25:55.075 2 DEBUG nova.network.neutron [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Successfully updated port: 70dc125e-319e-4e53-8804-bf4802e8790e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:25:55 compute-0 nova_compute[192810]: 2025-09-30 21:25:55.154 2 DEBUG oslo_concurrency.lockutils [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquiring lock "refresh_cache-be420b9b-7858-47b5-93b5-4c99f93efeed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:25:55 compute-0 nova_compute[192810]: 2025-09-30 21:25:55.154 2 DEBUG oslo_concurrency.lockutils [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquired lock "refresh_cache-be420b9b-7858-47b5-93b5-4c99f93efeed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:25:55 compute-0 nova_compute[192810]: 2025-09-30 21:25:55.155 2 DEBUG nova.network.neutron [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:25:55 compute-0 nova_compute[192810]: 2025-09-30 21:25:55.484 2 DEBUG nova.network.neutron [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:25:55 compute-0 nova_compute[192810]: 2025-09-30 21:25:55.968 2 DEBUG nova.compute.manager [req-045aca79-8c4e-4d40-9a7c-d89c54e47c0e req-4b7f2447-689d-4293-b8e1-088fce72c2d5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Received event network-changed-70dc125e-319e-4e53-8804-bf4802e8790e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:25:55 compute-0 nova_compute[192810]: 2025-09-30 21:25:55.969 2 DEBUG nova.compute.manager [req-045aca79-8c4e-4d40-9a7c-d89c54e47c0e req-4b7f2447-689d-4293-b8e1-088fce72c2d5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Refreshing instance network info cache due to event network-changed-70dc125e-319e-4e53-8804-bf4802e8790e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:25:55 compute-0 nova_compute[192810]: 2025-09-30 21:25:55.969 2 DEBUG oslo_concurrency.lockutils [req-045aca79-8c4e-4d40-9a7c-d89c54e47c0e req-4b7f2447-689d-4293-b8e1-088fce72c2d5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-be420b9b-7858-47b5-93b5-4c99f93efeed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:25:56 compute-0 podman[227786]: 2025-09-30 21:25:56.372113795 +0000 UTC m=+0.088587132 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=iscsid, container_name=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 21:25:56 compute-0 nova_compute[192810]: 2025-09-30 21:25:56.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:57 compute-0 podman[227806]: 2025-09-30 21:25:57.335854878 +0000 UTC m=+0.072449745 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.build-date=20250923, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 21:25:57 compute-0 podman[227807]: 2025-09-30 21:25:57.359627007 +0000 UTC m=+0.083962956 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 21:25:57 compute-0 nova_compute[192810]: 2025-09-30 21:25:57.426 2 DEBUG nova.network.neutron [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Updating instance_info_cache with network_info: [{"id": "70dc125e-319e-4e53-8804-bf4802e8790e", "address": "fa:16:3e:d3:56:37", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70dc125e-31", "ovs_interfaceid": "70dc125e-319e-4e53-8804-bf4802e8790e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:25:57 compute-0 nova_compute[192810]: 2025-09-30 21:25:57.452 2 DEBUG oslo_concurrency.lockutils [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Releasing lock "refresh_cache-be420b9b-7858-47b5-93b5-4c99f93efeed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:25:57 compute-0 nova_compute[192810]: 2025-09-30 21:25:57.453 2 DEBUG nova.compute.manager [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Instance network_info: |[{"id": "70dc125e-319e-4e53-8804-bf4802e8790e", "address": "fa:16:3e:d3:56:37", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70dc125e-31", "ovs_interfaceid": "70dc125e-319e-4e53-8804-bf4802e8790e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:25:57 compute-0 nova_compute[192810]: 2025-09-30 21:25:57.453 2 DEBUG oslo_concurrency.lockutils [req-045aca79-8c4e-4d40-9a7c-d89c54e47c0e req-4b7f2447-689d-4293-b8e1-088fce72c2d5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-be420b9b-7858-47b5-93b5-4c99f93efeed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:25:57 compute-0 nova_compute[192810]: 2025-09-30 21:25:57.454 2 DEBUG nova.network.neutron [req-045aca79-8c4e-4d40-9a7c-d89c54e47c0e req-4b7f2447-689d-4293-b8e1-088fce72c2d5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Refreshing network info cache for port 70dc125e-319e-4e53-8804-bf4802e8790e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:25:57 compute-0 nova_compute[192810]: 2025-09-30 21:25:57.460 2 DEBUG nova.virt.libvirt.driver [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Start _get_guest_xml network_info=[{"id": "70dc125e-319e-4e53-8804-bf4802e8790e", "address": "fa:16:3e:d3:56:37", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70dc125e-31", "ovs_interfaceid": "70dc125e-319e-4e53-8804-bf4802e8790e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:25:57 compute-0 nova_compute[192810]: 2025-09-30 21:25:57.466 2 WARNING nova.virt.libvirt.driver [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:25:57 compute-0 nova_compute[192810]: 2025-09-30 21:25:57.470 2 DEBUG nova.virt.libvirt.host [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:25:57 compute-0 nova_compute[192810]: 2025-09-30 21:25:57.471 2 DEBUG nova.virt.libvirt.host [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:25:57 compute-0 nova_compute[192810]: 2025-09-30 21:25:57.478 2 DEBUG nova.virt.libvirt.host [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:25:57 compute-0 nova_compute[192810]: 2025-09-30 21:25:57.478 2 DEBUG nova.virt.libvirt.host [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:25:57 compute-0 nova_compute[192810]: 2025-09-30 21:25:57.479 2 DEBUG nova.virt.libvirt.driver [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:25:57 compute-0 nova_compute[192810]: 2025-09-30 21:25:57.479 2 DEBUG nova.virt.hardware [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:25:57 compute-0 nova_compute[192810]: 2025-09-30 21:25:57.480 2 DEBUG nova.virt.hardware [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:25:57 compute-0 nova_compute[192810]: 2025-09-30 21:25:57.480 2 DEBUG nova.virt.hardware [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:25:57 compute-0 nova_compute[192810]: 2025-09-30 21:25:57.480 2 DEBUG nova.virt.hardware [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:25:57 compute-0 nova_compute[192810]: 2025-09-30 21:25:57.480 2 DEBUG nova.virt.hardware [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:25:57 compute-0 nova_compute[192810]: 2025-09-30 21:25:57.480 2 DEBUG nova.virt.hardware [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:25:57 compute-0 nova_compute[192810]: 2025-09-30 21:25:57.480 2 DEBUG nova.virt.hardware [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:25:57 compute-0 nova_compute[192810]: 2025-09-30 21:25:57.481 2 DEBUG nova.virt.hardware [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:25:57 compute-0 nova_compute[192810]: 2025-09-30 21:25:57.481 2 DEBUG nova.virt.hardware [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:25:57 compute-0 nova_compute[192810]: 2025-09-30 21:25:57.481 2 DEBUG nova.virt.hardware [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:25:57 compute-0 nova_compute[192810]: 2025-09-30 21:25:57.481 2 DEBUG nova.virt.hardware [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:25:57 compute-0 nova_compute[192810]: 2025-09-30 21:25:57.484 2 DEBUG nova.virt.libvirt.vif [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:25:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-278828185',display_name='tempest-DeleteServersTestJSON-server-278828185',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-278828185',id=57,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4bb94b19ac546f195f1f1f35411cce9',ramdisk_id='',reservation_id='r-dern13xv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-314554874',owner_user_name='tempest-DeleteServersTestJSON-314554874
-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:25:51Z,user_data=None,user_id='bfe43dba9d03417182dd245d360568e6',uuid=be420b9b-7858-47b5-93b5-4c99f93efeed,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "70dc125e-319e-4e53-8804-bf4802e8790e", "address": "fa:16:3e:d3:56:37", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70dc125e-31", "ovs_interfaceid": "70dc125e-319e-4e53-8804-bf4802e8790e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:25:57 compute-0 nova_compute[192810]: 2025-09-30 21:25:57.484 2 DEBUG nova.network.os_vif_util [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Converting VIF {"id": "70dc125e-319e-4e53-8804-bf4802e8790e", "address": "fa:16:3e:d3:56:37", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70dc125e-31", "ovs_interfaceid": "70dc125e-319e-4e53-8804-bf4802e8790e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:25:57 compute-0 nova_compute[192810]: 2025-09-30 21:25:57.485 2 DEBUG nova.network.os_vif_util [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:56:37,bridge_name='br-int',has_traffic_filtering=True,id=70dc125e-319e-4e53-8804-bf4802e8790e,network=Network(5569112a-9fb3-4151-add0-95b595cbe309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70dc125e-31') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:25:57 compute-0 nova_compute[192810]: 2025-09-30 21:25:57.486 2 DEBUG nova.objects.instance [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lazy-loading 'pci_devices' on Instance uuid be420b9b-7858-47b5-93b5-4c99f93efeed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:25:57 compute-0 nova_compute[192810]: 2025-09-30 21:25:57.503 2 DEBUG nova.virt.libvirt.driver [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:25:57 compute-0 nova_compute[192810]:   <uuid>be420b9b-7858-47b5-93b5-4c99f93efeed</uuid>
Sep 30 21:25:57 compute-0 nova_compute[192810]:   <name>instance-00000039</name>
Sep 30 21:25:57 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:25:57 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:25:57 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:25:57 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:       <nova:name>tempest-DeleteServersTestJSON-server-278828185</nova:name>
Sep 30 21:25:57 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:25:57</nova:creationTime>
Sep 30 21:25:57 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:25:57 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:25:57 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:25:57 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:25:57 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:25:57 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:25:57 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:25:57 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:25:57 compute-0 nova_compute[192810]:         <nova:user uuid="bfe43dba9d03417182dd245d360568e6">tempest-DeleteServersTestJSON-314554874-project-member</nova:user>
Sep 30 21:25:57 compute-0 nova_compute[192810]:         <nova:project uuid="c4bb94b19ac546f195f1f1f35411cce9">tempest-DeleteServersTestJSON-314554874</nova:project>
Sep 30 21:25:57 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:25:57 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:25:57 compute-0 nova_compute[192810]:         <nova:port uuid="70dc125e-319e-4e53-8804-bf4802e8790e">
Sep 30 21:25:57 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:25:57 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:25:57 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:25:57 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:25:57 compute-0 nova_compute[192810]:     <system>
Sep 30 21:25:57 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:25:57 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:25:57 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:25:57 compute-0 nova_compute[192810]:       <entry name="serial">be420b9b-7858-47b5-93b5-4c99f93efeed</entry>
Sep 30 21:25:57 compute-0 nova_compute[192810]:       <entry name="uuid">be420b9b-7858-47b5-93b5-4c99f93efeed</entry>
Sep 30 21:25:57 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     </system>
Sep 30 21:25:57 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:25:57 compute-0 nova_compute[192810]:   <os>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:   </os>
Sep 30 21:25:57 compute-0 nova_compute[192810]:   <features>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:   </features>
Sep 30 21:25:57 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:25:57 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:25:57 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:25:57 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:25:57 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:25:57 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/be420b9b-7858-47b5-93b5-4c99f93efeed/disk"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:25:57 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/be420b9b-7858-47b5-93b5-4c99f93efeed/disk.config"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:25:57 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:d3:56:37"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:       <target dev="tap70dc125e-31"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:25:57 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/be420b9b-7858-47b5-93b5-4c99f93efeed/console.log" append="off"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     <video>
Sep 30 21:25:57 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     </video>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:25:57 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:25:57 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:25:57 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:25:57 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:25:57 compute-0 nova_compute[192810]: </domain>
Sep 30 21:25:57 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:25:57 compute-0 nova_compute[192810]: 2025-09-30 21:25:57.503 2 DEBUG nova.compute.manager [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Preparing to wait for external event network-vif-plugged-70dc125e-319e-4e53-8804-bf4802e8790e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:25:57 compute-0 nova_compute[192810]: 2025-09-30 21:25:57.503 2 DEBUG oslo_concurrency.lockutils [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquiring lock "be420b9b-7858-47b5-93b5-4c99f93efeed-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:25:57 compute-0 nova_compute[192810]: 2025-09-30 21:25:57.503 2 DEBUG oslo_concurrency.lockutils [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "be420b9b-7858-47b5-93b5-4c99f93efeed-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:25:57 compute-0 nova_compute[192810]: 2025-09-30 21:25:57.504 2 DEBUG oslo_concurrency.lockutils [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "be420b9b-7858-47b5-93b5-4c99f93efeed-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:25:57 compute-0 nova_compute[192810]: 2025-09-30 21:25:57.504 2 DEBUG nova.virt.libvirt.vif [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:25:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-278828185',display_name='tempest-DeleteServersTestJSON-server-278828185',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-278828185',id=57,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4bb94b19ac546f195f1f1f35411cce9',ramdisk_id='',reservation_id='r-dern13xv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-314554874',owner_user_name='tempest-DeleteServersTestJSON
-314554874-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:25:51Z,user_data=None,user_id='bfe43dba9d03417182dd245d360568e6',uuid=be420b9b-7858-47b5-93b5-4c99f93efeed,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "70dc125e-319e-4e53-8804-bf4802e8790e", "address": "fa:16:3e:d3:56:37", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70dc125e-31", "ovs_interfaceid": "70dc125e-319e-4e53-8804-bf4802e8790e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:25:57 compute-0 nova_compute[192810]: 2025-09-30 21:25:57.504 2 DEBUG nova.network.os_vif_util [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Converting VIF {"id": "70dc125e-319e-4e53-8804-bf4802e8790e", "address": "fa:16:3e:d3:56:37", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70dc125e-31", "ovs_interfaceid": "70dc125e-319e-4e53-8804-bf4802e8790e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:25:57 compute-0 nova_compute[192810]: 2025-09-30 21:25:57.505 2 DEBUG nova.network.os_vif_util [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:56:37,bridge_name='br-int',has_traffic_filtering=True,id=70dc125e-319e-4e53-8804-bf4802e8790e,network=Network(5569112a-9fb3-4151-add0-95b595cbe309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70dc125e-31') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:25:57 compute-0 nova_compute[192810]: 2025-09-30 21:25:57.505 2 DEBUG os_vif [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:56:37,bridge_name='br-int',has_traffic_filtering=True,id=70dc125e-319e-4e53-8804-bf4802e8790e,network=Network(5569112a-9fb3-4151-add0-95b595cbe309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70dc125e-31') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:25:57 compute-0 nova_compute[192810]: 2025-09-30 21:25:57.505 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:57 compute-0 nova_compute[192810]: 2025-09-30 21:25:57.506 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:25:57 compute-0 nova_compute[192810]: 2025-09-30 21:25:57.506 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:25:57 compute-0 nova_compute[192810]: 2025-09-30 21:25:57.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:57 compute-0 nova_compute[192810]: 2025-09-30 21:25:57.508 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap70dc125e-31, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:25:57 compute-0 nova_compute[192810]: 2025-09-30 21:25:57.508 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap70dc125e-31, col_values=(('external_ids', {'iface-id': '70dc125e-319e-4e53-8804-bf4802e8790e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d3:56:37', 'vm-uuid': 'be420b9b-7858-47b5-93b5-4c99f93efeed'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:25:57 compute-0 nova_compute[192810]: 2025-09-30 21:25:57.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:57 compute-0 NetworkManager[51733]: <info>  [1759267557.5113] manager: (tap70dc125e-31): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/86)
Sep 30 21:25:57 compute-0 nova_compute[192810]: 2025-09-30 21:25:57.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:25:57 compute-0 nova_compute[192810]: 2025-09-30 21:25:57.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:57 compute-0 nova_compute[192810]: 2025-09-30 21:25:57.521 2 INFO os_vif [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:56:37,bridge_name='br-int',has_traffic_filtering=True,id=70dc125e-319e-4e53-8804-bf4802e8790e,network=Network(5569112a-9fb3-4151-add0-95b595cbe309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70dc125e-31')
Sep 30 21:25:57 compute-0 nova_compute[192810]: 2025-09-30 21:25:57.621 2 DEBUG nova.virt.libvirt.driver [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:25:57 compute-0 nova_compute[192810]: 2025-09-30 21:25:57.622 2 DEBUG nova.virt.libvirt.driver [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:25:57 compute-0 nova_compute[192810]: 2025-09-30 21:25:57.622 2 DEBUG nova.virt.libvirt.driver [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] No VIF found with MAC fa:16:3e:d3:56:37, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:25:57 compute-0 nova_compute[192810]: 2025-09-30 21:25:57.622 2 INFO nova.virt.libvirt.driver [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Using config drive
Sep 30 21:25:58 compute-0 nova_compute[192810]: 2025-09-30 21:25:58.236 2 INFO nova.virt.libvirt.driver [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Creating config drive at /var/lib/nova/instances/be420b9b-7858-47b5-93b5-4c99f93efeed/disk.config
Sep 30 21:25:58 compute-0 nova_compute[192810]: 2025-09-30 21:25:58.247 2 DEBUG oslo_concurrency.processutils [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/be420b9b-7858-47b5-93b5-4c99f93efeed/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgpolunlj execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:25:58 compute-0 nova_compute[192810]: 2025-09-30 21:25:58.393 2 DEBUG oslo_concurrency.processutils [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/be420b9b-7858-47b5-93b5-4c99f93efeed/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgpolunlj" returned: 0 in 0.146s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:25:58 compute-0 kernel: tap70dc125e-31: entered promiscuous mode
Sep 30 21:25:58 compute-0 NetworkManager[51733]: <info>  [1759267558.4824] manager: (tap70dc125e-31): new Tun device (/org/freedesktop/NetworkManager/Devices/87)
Sep 30 21:25:58 compute-0 systemd-udevd[227869]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:25:58 compute-0 nova_compute[192810]: 2025-09-30 21:25:58.530 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:58 compute-0 ovn_controller[94912]: 2025-09-30T21:25:58Z|00199|binding|INFO|Claiming lport 70dc125e-319e-4e53-8804-bf4802e8790e for this chassis.
Sep 30 21:25:58 compute-0 ovn_controller[94912]: 2025-09-30T21:25:58Z|00200|binding|INFO|70dc125e-319e-4e53-8804-bf4802e8790e: Claiming fa:16:3e:d3:56:37 10.100.0.4
Sep 30 21:25:58 compute-0 NetworkManager[51733]: <info>  [1759267558.5466] device (tap70dc125e-31): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:25:58 compute-0 NetworkManager[51733]: <info>  [1759267558.5476] device (tap70dc125e-31): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:25:58 compute-0 systemd-machined[152794]: New machine qemu-27-instance-00000039.
Sep 30 21:25:58 compute-0 systemd[1]: Started Virtual Machine qemu-27-instance-00000039.
Sep 30 21:25:58 compute-0 nova_compute[192810]: 2025-09-30 21:25:58.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:58 compute-0 ovn_controller[94912]: 2025-09-30T21:25:58Z|00201|binding|INFO|Setting lport 70dc125e-319e-4e53-8804-bf4802e8790e ovn-installed in OVS
Sep 30 21:25:58 compute-0 nova_compute[192810]: 2025-09-30 21:25:58.608 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:25:58.752 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d3:56:37 10.100.0.4'], port_security=['fa:16:3e:d3:56:37 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'be420b9b-7858-47b5-93b5-4c99f93efeed', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5569112a-9fb3-4151-add0-95b595cbe309', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4bb94b19ac546f195f1f1f35411cce9', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b0c9a27b-9b95-41a1-8c38-505b25881a53', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ea8c6d09-2e51-451b-abc3-a852f19b487a, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=70dc125e-319e-4e53-8804-bf4802e8790e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:25:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:25:58.753 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 70dc125e-319e-4e53-8804-bf4802e8790e in datapath 5569112a-9fb3-4151-add0-95b595cbe309 bound to our chassis
Sep 30 21:25:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:25:58.754 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5569112a-9fb3-4151-add0-95b595cbe309
Sep 30 21:25:58 compute-0 ovn_controller[94912]: 2025-09-30T21:25:58Z|00202|binding|INFO|Setting lport 70dc125e-319e-4e53-8804-bf4802e8790e up in Southbound
Sep 30 21:25:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:25:58.770 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[7d916826-235a-441b-a73c-79776a3573cb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:25:58.771 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5569112a-91 in ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:25:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:25:58.773 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5569112a-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:25:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:25:58.774 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a7514772-4dce-4136-a0d0-056980c3efa1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:25:58.774 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a83c0633-d362-41da-a99e-79444b285ee0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:25:58.797 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[8dfc51f8-eaa0-4194-9b2a-78f71956e2df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:25:58.813 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[db989a7d-2b40-4b3a-baf2-3062dd068b7b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:25:58.853 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[d5de60e1-0670-4a34-a8f8-eceaadbc2922]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:58 compute-0 NetworkManager[51733]: <info>  [1759267558.8658] manager: (tap5569112a-90): new Veth device (/org/freedesktop/NetworkManager/Devices/88)
Sep 30 21:25:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:25:58.863 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[74653fba-37fe-4e4f-b5b6-114b7f2e5ec0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:58 compute-0 systemd-udevd[227873]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:25:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:25:58.910 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[0c052a09-85ed-47d7-af9c-5090c6582149]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:25:58.916 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[ff78fc75-f973-4c9a-87f4-0bed2782cdd3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:58 compute-0 NetworkManager[51733]: <info>  [1759267558.9459] device (tap5569112a-90): carrier: link connected
Sep 30 21:25:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:25:58.954 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[fddc9936-d470-454d-a79b-bd02d284eeb7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:25:58.986 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[bbc1074a-69a5-4027-aa28-c5bba164906f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5569112a-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b2:01:ab'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 54], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 425457, 'reachable_time': 37273, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227914, 'error': None, 'target': 'ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:59 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:25:59.012 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[22933dc7-5d4b-4c8d-925c-19428a440bd4]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb2:1ab'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 425457, 'tstamp': 425457}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227915, 'error': None, 'target': 'ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:59 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:25:59.037 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[87df6414-f4ef-4186-807a-005f5cf4a8bf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5569112a-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b2:01:ab'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 54], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 425457, 'reachable_time': 37273, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227916, 'error': None, 'target': 'ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:59 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:25:59.075 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[46eee2d3-f1df-4553-bb4a-a9f9acd444c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:59 compute-0 nova_compute[192810]: 2025-09-30 21:25:59.124 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267544.123345, 2aa71932-03fb-4f75-b359-e1ff961fd8f6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:25:59 compute-0 nova_compute[192810]: 2025-09-30 21:25:59.124 2 INFO nova.compute.manager [-] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] VM Stopped (Lifecycle Event)
Sep 30 21:25:59 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:25:59.146 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[f9e9ce67-93c2-4638-816c-877b2ec8dc29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:59 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:25:59.148 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5569112a-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:25:59 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:25:59.148 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:25:59 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:25:59.148 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5569112a-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:25:59 compute-0 nova_compute[192810]: 2025-09-30 21:25:59.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:59 compute-0 NetworkManager[51733]: <info>  [1759267559.1514] manager: (tap5569112a-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/89)
Sep 30 21:25:59 compute-0 kernel: tap5569112a-90: entered promiscuous mode
Sep 30 21:25:59 compute-0 nova_compute[192810]: 2025-09-30 21:25:59.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:59 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:25:59.155 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5569112a-90, col_values=(('external_ids', {'iface-id': 'af49dc17-c7c9-4524-8791-14107f2ff34d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:25:59 compute-0 nova_compute[192810]: 2025-09-30 21:25:59.156 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:59 compute-0 ovn_controller[94912]: 2025-09-30T21:25:59Z|00203|binding|INFO|Releasing lport af49dc17-c7c9-4524-8791-14107f2ff34d from this chassis (sb_readonly=0)
Sep 30 21:25:59 compute-0 nova_compute[192810]: 2025-09-30 21:25:59.157 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:59 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:25:59.159 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5569112a-9fb3-4151-add0-95b595cbe309.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5569112a-9fb3-4151-add0-95b595cbe309.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:25:59 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:25:59.165 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[47490cfd-aa32-4c56-b26c-ea4b261a0ceb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:25:59 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:25:59.165 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:25:59 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:25:59 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:25:59 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-5569112a-9fb3-4151-add0-95b595cbe309
Sep 30 21:25:59 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:25:59 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:25:59 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:25:59 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/5569112a-9fb3-4151-add0-95b595cbe309.pid.haproxy
Sep 30 21:25:59 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:25:59 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:25:59 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:25:59 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:25:59 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:25:59 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:25:59 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:25:59 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:25:59 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:25:59 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:25:59 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:25:59 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:25:59 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:25:59 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:25:59 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:25:59 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:25:59 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:25:59 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:25:59 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:25:59 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:25:59 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID 5569112a-9fb3-4151-add0-95b595cbe309
Sep 30 21:25:59 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:25:59 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:25:59.166 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309', 'env', 'PROCESS_TAG=haproxy-5569112a-9fb3-4151-add0-95b595cbe309', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5569112a-9fb3-4151-add0-95b595cbe309.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:25:59 compute-0 nova_compute[192810]: 2025-09-30 21:25:59.168 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:25:59 compute-0 nova_compute[192810]: 2025-09-30 21:25:59.341 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267559.341106, be420b9b-7858-47b5-93b5-4c99f93efeed => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:25:59 compute-0 nova_compute[192810]: 2025-09-30 21:25:59.342 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] VM Started (Lifecycle Event)
Sep 30 21:25:59 compute-0 nova_compute[192810]: 2025-09-30 21:25:59.435 2 DEBUG nova.compute.manager [None req-30af035f-9279-49e3-8c95-afa738ba4781 - - - - - -] [instance: 2aa71932-03fb-4f75-b359-e1ff961fd8f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:25:59 compute-0 nova_compute[192810]: 2025-09-30 21:25:59.436 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:25:59 compute-0 nova_compute[192810]: 2025-09-30 21:25:59.442 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267559.3412104, be420b9b-7858-47b5-93b5-4c99f93efeed => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:25:59 compute-0 nova_compute[192810]: 2025-09-30 21:25:59.443 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] VM Paused (Lifecycle Event)
Sep 30 21:25:59 compute-0 nova_compute[192810]: 2025-09-30 21:25:59.480 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:25:59 compute-0 nova_compute[192810]: 2025-09-30 21:25:59.483 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:25:59 compute-0 nova_compute[192810]: 2025-09-30 21:25:59.536 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:25:59 compute-0 podman[227948]: 2025-09-30 21:25:59.538206177 +0000 UTC m=+0.059667244 container create c5d5fc43d2cada5048c6400ebf09eefe6f77bb6ba5a284ab34926bae962393b5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:25:59 compute-0 systemd[1]: Started libpod-conmon-c5d5fc43d2cada5048c6400ebf09eefe6f77bb6ba5a284ab34926bae962393b5.scope.
Sep 30 21:25:59 compute-0 podman[227948]: 2025-09-30 21:25:59.505695658 +0000 UTC m=+0.027156825 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:25:59 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:25:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/053e4d1ba9e70d805e0cb23881010fe88909edc4e1c7fd6c2e56410c3f02ee70/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:25:59 compute-0 podman[227948]: 2025-09-30 21:25:59.615764671 +0000 UTC m=+0.137225738 container init c5d5fc43d2cada5048c6400ebf09eefe6f77bb6ba5a284ab34926bae962393b5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:25:59 compute-0 podman[227948]: 2025-09-30 21:25:59.621056224 +0000 UTC m=+0.142517291 container start c5d5fc43d2cada5048c6400ebf09eefe6f77bb6ba5a284ab34926bae962393b5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2)
Sep 30 21:25:59 compute-0 neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309[227963]: [NOTICE]   (227967) : New worker (227969) forked
Sep 30 21:25:59 compute-0 neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309[227963]: [NOTICE]   (227967) : Loading success.
Sep 30 21:26:00 compute-0 nova_compute[192810]: 2025-09-30 21:26:00.327 2 DEBUG nova.compute.manager [req-ed77aae3-dfc3-426d-a70f-8b02d6fc4d39 req-8e1652e1-9d9a-4ccb-8dd7-50ed68632f05 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Received event network-vif-plugged-70dc125e-319e-4e53-8804-bf4802e8790e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:26:00 compute-0 nova_compute[192810]: 2025-09-30 21:26:00.328 2 DEBUG oslo_concurrency.lockutils [req-ed77aae3-dfc3-426d-a70f-8b02d6fc4d39 req-8e1652e1-9d9a-4ccb-8dd7-50ed68632f05 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "be420b9b-7858-47b5-93b5-4c99f93efeed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:00 compute-0 nova_compute[192810]: 2025-09-30 21:26:00.328 2 DEBUG oslo_concurrency.lockutils [req-ed77aae3-dfc3-426d-a70f-8b02d6fc4d39 req-8e1652e1-9d9a-4ccb-8dd7-50ed68632f05 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "be420b9b-7858-47b5-93b5-4c99f93efeed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:00 compute-0 nova_compute[192810]: 2025-09-30 21:26:00.329 2 DEBUG oslo_concurrency.lockutils [req-ed77aae3-dfc3-426d-a70f-8b02d6fc4d39 req-8e1652e1-9d9a-4ccb-8dd7-50ed68632f05 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "be420b9b-7858-47b5-93b5-4c99f93efeed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:00 compute-0 nova_compute[192810]: 2025-09-30 21:26:00.329 2 DEBUG nova.compute.manager [req-ed77aae3-dfc3-426d-a70f-8b02d6fc4d39 req-8e1652e1-9d9a-4ccb-8dd7-50ed68632f05 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Processing event network-vif-plugged-70dc125e-319e-4e53-8804-bf4802e8790e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:26:00 compute-0 nova_compute[192810]: 2025-09-30 21:26:00.330 2 DEBUG nova.compute.manager [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:26:00 compute-0 nova_compute[192810]: 2025-09-30 21:26:00.334 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267560.3340585, be420b9b-7858-47b5-93b5-4c99f93efeed => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:26:00 compute-0 nova_compute[192810]: 2025-09-30 21:26:00.334 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] VM Resumed (Lifecycle Event)
Sep 30 21:26:00 compute-0 nova_compute[192810]: 2025-09-30 21:26:00.336 2 DEBUG nova.virt.libvirt.driver [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:26:00 compute-0 nova_compute[192810]: 2025-09-30 21:26:00.340 2 INFO nova.virt.libvirt.driver [-] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Instance spawned successfully.
Sep 30 21:26:00 compute-0 nova_compute[192810]: 2025-09-30 21:26:00.341 2 DEBUG nova.virt.libvirt.driver [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:26:00 compute-0 nova_compute[192810]: 2025-09-30 21:26:00.362 2 DEBUG nova.virt.libvirt.driver [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:26:00 compute-0 nova_compute[192810]: 2025-09-30 21:26:00.363 2 DEBUG nova.virt.libvirt.driver [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:26:00 compute-0 nova_compute[192810]: 2025-09-30 21:26:00.363 2 DEBUG nova.virt.libvirt.driver [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:26:00 compute-0 nova_compute[192810]: 2025-09-30 21:26:00.364 2 DEBUG nova.virt.libvirt.driver [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:26:00 compute-0 nova_compute[192810]: 2025-09-30 21:26:00.364 2 DEBUG nova.virt.libvirt.driver [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:26:00 compute-0 nova_compute[192810]: 2025-09-30 21:26:00.365 2 DEBUG nova.virt.libvirt.driver [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:26:00 compute-0 nova_compute[192810]: 2025-09-30 21:26:00.392 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:26:00 compute-0 nova_compute[192810]: 2025-09-30 21:26:00.396 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:26:00 compute-0 nova_compute[192810]: 2025-09-30 21:26:00.437 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:26:00 compute-0 nova_compute[192810]: 2025-09-30 21:26:00.483 2 INFO nova.compute.manager [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Took 8.93 seconds to spawn the instance on the hypervisor.
Sep 30 21:26:00 compute-0 nova_compute[192810]: 2025-09-30 21:26:00.484 2 DEBUG nova.compute.manager [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:26:00 compute-0 nova_compute[192810]: 2025-09-30 21:26:00.618 2 INFO nova.compute.manager [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Took 9.57 seconds to build instance.
Sep 30 21:26:00 compute-0 nova_compute[192810]: 2025-09-30 21:26:00.657 2 DEBUG oslo_concurrency.lockutils [None req-b2a60ff8-b2fd-432d-b2f9-8100a9f31a84 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "be420b9b-7858-47b5-93b5-4c99f93efeed" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.699s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:01 compute-0 nova_compute[192810]: 2025-09-30 21:26:01.166 2 DEBUG nova.network.neutron [req-045aca79-8c4e-4d40-9a7c-d89c54e47c0e req-4b7f2447-689d-4293-b8e1-088fce72c2d5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Updated VIF entry in instance network info cache for port 70dc125e-319e-4e53-8804-bf4802e8790e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:26:01 compute-0 nova_compute[192810]: 2025-09-30 21:26:01.167 2 DEBUG nova.network.neutron [req-045aca79-8c4e-4d40-9a7c-d89c54e47c0e req-4b7f2447-689d-4293-b8e1-088fce72c2d5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Updating instance_info_cache with network_info: [{"id": "70dc125e-319e-4e53-8804-bf4802e8790e", "address": "fa:16:3e:d3:56:37", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70dc125e-31", "ovs_interfaceid": "70dc125e-319e-4e53-8804-bf4802e8790e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:26:01 compute-0 nova_compute[192810]: 2025-09-30 21:26:01.192 2 DEBUG oslo_concurrency.lockutils [req-045aca79-8c4e-4d40-9a7c-d89c54e47c0e req-4b7f2447-689d-4293-b8e1-088fce72c2d5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-be420b9b-7858-47b5-93b5-4c99f93efeed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:26:01 compute-0 nova_compute[192810]: 2025-09-30 21:26:01.754 2 DEBUG oslo_concurrency.lockutils [None req-4704e269-98bf-431f-94dd-79cc1cc67744 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquiring lock "be420b9b-7858-47b5-93b5-4c99f93efeed" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:01 compute-0 nova_compute[192810]: 2025-09-30 21:26:01.755 2 DEBUG oslo_concurrency.lockutils [None req-4704e269-98bf-431f-94dd-79cc1cc67744 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "be420b9b-7858-47b5-93b5-4c99f93efeed" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:01 compute-0 nova_compute[192810]: 2025-09-30 21:26:01.755 2 DEBUG nova.compute.manager [None req-4704e269-98bf-431f-94dd-79cc1cc67744 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:26:01 compute-0 nova_compute[192810]: 2025-09-30 21:26:01.759 2 DEBUG nova.compute.manager [None req-4704e269-98bf-431f-94dd-79cc1cc67744 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338
Sep 30 21:26:01 compute-0 nova_compute[192810]: 2025-09-30 21:26:01.760 2 DEBUG nova.objects.instance [None req-4704e269-98bf-431f-94dd-79cc1cc67744 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lazy-loading 'flavor' on Instance uuid be420b9b-7858-47b5-93b5-4c99f93efeed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:26:01 compute-0 nova_compute[192810]: 2025-09-30 21:26:01.803 2 DEBUG nova.objects.instance [None req-4704e269-98bf-431f-94dd-79cc1cc67744 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lazy-loading 'info_cache' on Instance uuid be420b9b-7858-47b5-93b5-4c99f93efeed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:26:01 compute-0 nova_compute[192810]: 2025-09-30 21:26:01.835 2 DEBUG nova.virt.libvirt.driver [None req-4704e269-98bf-431f-94dd-79cc1cc67744 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Sep 30 21:26:01 compute-0 nova_compute[192810]: 2025-09-30 21:26:01.876 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:02 compute-0 nova_compute[192810]: 2025-09-30 21:26:02.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:02 compute-0 nova_compute[192810]: 2025-09-30 21:26:02.840 2 DEBUG nova.compute.manager [req-a0e10b6f-9d9b-4a47-9427-45787ccfba92 req-12fe7068-9573-4c94-bad1-3f3087bdff7d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Received event network-vif-plugged-70dc125e-319e-4e53-8804-bf4802e8790e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:26:02 compute-0 nova_compute[192810]: 2025-09-30 21:26:02.840 2 DEBUG oslo_concurrency.lockutils [req-a0e10b6f-9d9b-4a47-9427-45787ccfba92 req-12fe7068-9573-4c94-bad1-3f3087bdff7d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "be420b9b-7858-47b5-93b5-4c99f93efeed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:02 compute-0 nova_compute[192810]: 2025-09-30 21:26:02.841 2 DEBUG oslo_concurrency.lockutils [req-a0e10b6f-9d9b-4a47-9427-45787ccfba92 req-12fe7068-9573-4c94-bad1-3f3087bdff7d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "be420b9b-7858-47b5-93b5-4c99f93efeed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:02 compute-0 nova_compute[192810]: 2025-09-30 21:26:02.841 2 DEBUG oslo_concurrency.lockutils [req-a0e10b6f-9d9b-4a47-9427-45787ccfba92 req-12fe7068-9573-4c94-bad1-3f3087bdff7d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "be420b9b-7858-47b5-93b5-4c99f93efeed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:02 compute-0 nova_compute[192810]: 2025-09-30 21:26:02.841 2 DEBUG nova.compute.manager [req-a0e10b6f-9d9b-4a47-9427-45787ccfba92 req-12fe7068-9573-4c94-bad1-3f3087bdff7d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] No waiting events found dispatching network-vif-plugged-70dc125e-319e-4e53-8804-bf4802e8790e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:26:02 compute-0 nova_compute[192810]: 2025-09-30 21:26:02.842 2 WARNING nova.compute.manager [req-a0e10b6f-9d9b-4a47-9427-45787ccfba92 req-12fe7068-9573-4c94-bad1-3f3087bdff7d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Received unexpected event network-vif-plugged-70dc125e-319e-4e53-8804-bf4802e8790e for instance with vm_state active and task_state powering-off.
Sep 30 21:26:06 compute-0 nova_compute[192810]: 2025-09-30 21:26:06.878 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:07 compute-0 podman[227979]: 2025-09-30 21:26:07.343685579 +0000 UTC m=+0.072058536 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Sep 30 21:26:07 compute-0 podman[227978]: 2025-09-30 21:26:07.368550015 +0000 UTC m=+0.097314882 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0)
Sep 30 21:26:07 compute-0 nova_compute[192810]: 2025-09-30 21:26:07.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:11 compute-0 unix_chkpwd[228035]: password check failed for user (root)
Sep 30 21:26:11 compute-0 sshd-session[228025]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=45.81.23.80  user=root
Sep 30 21:26:11 compute-0 nova_compute[192810]: 2025-09-30 21:26:11.879 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:11 compute-0 nova_compute[192810]: 2025-09-30 21:26:11.885 2 DEBUG nova.virt.libvirt.driver [None req-4704e269-98bf-431f-94dd-79cc1cc67744 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Sep 30 21:26:12 compute-0 ovn_controller[94912]: 2025-09-30T21:26:12Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d3:56:37 10.100.0.4
Sep 30 21:26:12 compute-0 ovn_controller[94912]: 2025-09-30T21:26:12Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d3:56:37 10.100.0.4
Sep 30 21:26:12 compute-0 nova_compute[192810]: 2025-09-30 21:26:12.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:13 compute-0 podman[228042]: 2025-09-30 21:26:13.333175212 +0000 UTC m=+0.057769286 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 21:26:13 compute-0 sshd-session[228025]: Failed password for root from 45.81.23.80 port 52642 ssh2
Sep 30 21:26:14 compute-0 sshd-session[228025]: Received disconnect from 45.81.23.80 port 52642:11: Bye Bye [preauth]
Sep 30 21:26:14 compute-0 sshd-session[228025]: Disconnected from authenticating user root 45.81.23.80 port 52642 [preauth]
Sep 30 21:26:14 compute-0 kernel: tap70dc125e-31 (unregistering): left promiscuous mode
Sep 30 21:26:14 compute-0 NetworkManager[51733]: <info>  [1759267574.9478] device (tap70dc125e-31): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:26:15 compute-0 nova_compute[192810]: 2025-09-30 21:26:15.011 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:15 compute-0 ovn_controller[94912]: 2025-09-30T21:26:15Z|00204|binding|INFO|Releasing lport 70dc125e-319e-4e53-8804-bf4802e8790e from this chassis (sb_readonly=0)
Sep 30 21:26:15 compute-0 ovn_controller[94912]: 2025-09-30T21:26:15Z|00205|binding|INFO|Setting lport 70dc125e-319e-4e53-8804-bf4802e8790e down in Southbound
Sep 30 21:26:15 compute-0 ovn_controller[94912]: 2025-09-30T21:26:15Z|00206|binding|INFO|Removing iface tap70dc125e-31 ovn-installed in OVS
Sep 30 21:26:15 compute-0 nova_compute[192810]: 2025-09-30 21:26:15.014 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:15 compute-0 nova_compute[192810]: 2025-09-30 21:26:15.023 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:15.032 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d3:56:37 10.100.0.4'], port_security=['fa:16:3e:d3:56:37 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'be420b9b-7858-47b5-93b5-4c99f93efeed', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5569112a-9fb3-4151-add0-95b595cbe309', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4bb94b19ac546f195f1f1f35411cce9', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b0c9a27b-9b95-41a1-8c38-505b25881a53', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ea8c6d09-2e51-451b-abc3-a852f19b487a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=70dc125e-319e-4e53-8804-bf4802e8790e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:26:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:15.033 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 70dc125e-319e-4e53-8804-bf4802e8790e in datapath 5569112a-9fb3-4151-add0-95b595cbe309 unbound from our chassis
Sep 30 21:26:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:15.034 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5569112a-9fb3-4151-add0-95b595cbe309, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:26:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:15.035 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[54084f63-24ae-48d5-9e47-88ee4bf3fa7a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:15.036 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309 namespace which is not needed anymore
Sep 30 21:26:15 compute-0 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d00000039.scope: Deactivated successfully.
Sep 30 21:26:15 compute-0 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d00000039.scope: Consumed 12.523s CPU time.
Sep 30 21:26:15 compute-0 systemd-machined[152794]: Machine qemu-27-instance-00000039 terminated.
Sep 30 21:26:15 compute-0 neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309[227963]: [NOTICE]   (227967) : haproxy version is 2.8.14-c23fe91
Sep 30 21:26:15 compute-0 neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309[227963]: [NOTICE]   (227967) : path to executable is /usr/sbin/haproxy
Sep 30 21:26:15 compute-0 neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309[227963]: [WARNING]  (227967) : Exiting Master process...
Sep 30 21:26:15 compute-0 neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309[227963]: [ALERT]    (227967) : Current worker (227969) exited with code 143 (Terminated)
Sep 30 21:26:15 compute-0 neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309[227963]: [WARNING]  (227967) : All workers exited. Exiting... (0)
Sep 30 21:26:15 compute-0 systemd[1]: libpod-c5d5fc43d2cada5048c6400ebf09eefe6f77bb6ba5a284ab34926bae962393b5.scope: Deactivated successfully.
Sep 30 21:26:15 compute-0 podman[228086]: 2025-09-30 21:26:15.161410819 +0000 UTC m=+0.047252461 container died c5d5fc43d2cada5048c6400ebf09eefe6f77bb6ba5a284ab34926bae962393b5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.build-date=20250923)
Sep 30 21:26:15 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c5d5fc43d2cada5048c6400ebf09eefe6f77bb6ba5a284ab34926bae962393b5-userdata-shm.mount: Deactivated successfully.
Sep 30 21:26:15 compute-0 systemd[1]: var-lib-containers-storage-overlay-053e4d1ba9e70d805e0cb23881010fe88909edc4e1c7fd6c2e56410c3f02ee70-merged.mount: Deactivated successfully.
Sep 30 21:26:15 compute-0 podman[228086]: 2025-09-30 21:26:15.223135714 +0000 UTC m=+0.108977356 container cleanup c5d5fc43d2cada5048c6400ebf09eefe6f77bb6ba5a284ab34926bae962393b5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:26:15 compute-0 systemd[1]: libpod-conmon-c5d5fc43d2cada5048c6400ebf09eefe6f77bb6ba5a284ab34926bae962393b5.scope: Deactivated successfully.
Sep 30 21:26:15 compute-0 podman[228118]: 2025-09-30 21:26:15.290128641 +0000 UTC m=+0.047055046 container remove c5d5fc43d2cada5048c6400ebf09eefe6f77bb6ba5a284ab34926bae962393b5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:26:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:15.294 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[336a3b1f-c389-4610-8873-0451163e7cff]: (4, ('Tue Sep 30 09:26:15 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309 (c5d5fc43d2cada5048c6400ebf09eefe6f77bb6ba5a284ab34926bae962393b5)\nc5d5fc43d2cada5048c6400ebf09eefe6f77bb6ba5a284ab34926bae962393b5\nTue Sep 30 09:26:15 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309 (c5d5fc43d2cada5048c6400ebf09eefe6f77bb6ba5a284ab34926bae962393b5)\nc5d5fc43d2cada5048c6400ebf09eefe6f77bb6ba5a284ab34926bae962393b5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:15.296 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[2e304f3a-0c1c-4213-a13a-5bb9c1a4c44d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:15.297 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5569112a-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:26:15 compute-0 nova_compute[192810]: 2025-09-30 21:26:15.299 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:15 compute-0 kernel: tap5569112a-90: left promiscuous mode
Sep 30 21:26:15 compute-0 nova_compute[192810]: 2025-09-30 21:26:15.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:15.316 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[4031df2b-5f60-47df-bcb9-39bf1926ca92]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:15.346 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[cc14ea0e-e064-4ac1-86dc-b69aff6c7dac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:15.347 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[c7e56c88-5544-4c63-b43b-bbf4c6e2e1f5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:15.364 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[bbf7ed90-79d0-4cda-bc32-07bc28f67959]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 425446, 'reachable_time': 34423, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228154, 'error': None, 'target': 'ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:15.367 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:26:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:15.368 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[9e500f0e-2ecc-4c30-9325-112965bce320]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:15 compute-0 systemd[1]: run-netns-ovnmeta\x2d5569112a\x2d9fb3\x2d4151\x2dadd0\x2d95b595cbe309.mount: Deactivated successfully.
Sep 30 21:26:15 compute-0 nova_compute[192810]: 2025-09-30 21:26:15.902 2 INFO nova.virt.libvirt.driver [None req-4704e269-98bf-431f-94dd-79cc1cc67744 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Instance shutdown successfully after 14 seconds.
Sep 30 21:26:15 compute-0 nova_compute[192810]: 2025-09-30 21:26:15.909 2 INFO nova.virt.libvirt.driver [-] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Instance destroyed successfully.
Sep 30 21:26:15 compute-0 nova_compute[192810]: 2025-09-30 21:26:15.909 2 DEBUG nova.objects.instance [None req-4704e269-98bf-431f-94dd-79cc1cc67744 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lazy-loading 'numa_topology' on Instance uuid be420b9b-7858-47b5-93b5-4c99f93efeed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:26:15 compute-0 nova_compute[192810]: 2025-09-30 21:26:15.926 2 DEBUG nova.compute.manager [None req-4704e269-98bf-431f-94dd-79cc1cc67744 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:26:16 compute-0 nova_compute[192810]: 2025-09-30 21:26:16.032 2 DEBUG oslo_concurrency.lockutils [None req-4704e269-98bf-431f-94dd-79cc1cc67744 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "be420b9b-7858-47b5-93b5-4c99f93efeed" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 14.277s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:16 compute-0 nova_compute[192810]: 2025-09-30 21:26:16.501 2 DEBUG nova.compute.manager [req-4a5b4899-3bcf-493e-aec1-89db70be8101 req-f7f9ae18-c029-4d3c-ac17-bde0bbd1ce80 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Received event network-vif-unplugged-70dc125e-319e-4e53-8804-bf4802e8790e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:26:16 compute-0 nova_compute[192810]: 2025-09-30 21:26:16.501 2 DEBUG oslo_concurrency.lockutils [req-4a5b4899-3bcf-493e-aec1-89db70be8101 req-f7f9ae18-c029-4d3c-ac17-bde0bbd1ce80 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "be420b9b-7858-47b5-93b5-4c99f93efeed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:16 compute-0 nova_compute[192810]: 2025-09-30 21:26:16.502 2 DEBUG oslo_concurrency.lockutils [req-4a5b4899-3bcf-493e-aec1-89db70be8101 req-f7f9ae18-c029-4d3c-ac17-bde0bbd1ce80 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "be420b9b-7858-47b5-93b5-4c99f93efeed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:16 compute-0 nova_compute[192810]: 2025-09-30 21:26:16.502 2 DEBUG oslo_concurrency.lockutils [req-4a5b4899-3bcf-493e-aec1-89db70be8101 req-f7f9ae18-c029-4d3c-ac17-bde0bbd1ce80 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "be420b9b-7858-47b5-93b5-4c99f93efeed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:16 compute-0 nova_compute[192810]: 2025-09-30 21:26:16.502 2 DEBUG nova.compute.manager [req-4a5b4899-3bcf-493e-aec1-89db70be8101 req-f7f9ae18-c029-4d3c-ac17-bde0bbd1ce80 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] No waiting events found dispatching network-vif-unplugged-70dc125e-319e-4e53-8804-bf4802e8790e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:26:16 compute-0 nova_compute[192810]: 2025-09-30 21:26:16.502 2 WARNING nova.compute.manager [req-4a5b4899-3bcf-493e-aec1-89db70be8101 req-f7f9ae18-c029-4d3c-ac17-bde0bbd1ce80 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Received unexpected event network-vif-unplugged-70dc125e-319e-4e53-8804-bf4802e8790e for instance with vm_state stopped and task_state None.
Sep 30 21:26:16 compute-0 nova_compute[192810]: 2025-09-30 21:26:16.503 2 DEBUG nova.compute.manager [req-4a5b4899-3bcf-493e-aec1-89db70be8101 req-f7f9ae18-c029-4d3c-ac17-bde0bbd1ce80 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Received event network-vif-plugged-70dc125e-319e-4e53-8804-bf4802e8790e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:26:16 compute-0 nova_compute[192810]: 2025-09-30 21:26:16.503 2 DEBUG oslo_concurrency.lockutils [req-4a5b4899-3bcf-493e-aec1-89db70be8101 req-f7f9ae18-c029-4d3c-ac17-bde0bbd1ce80 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "be420b9b-7858-47b5-93b5-4c99f93efeed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:16 compute-0 nova_compute[192810]: 2025-09-30 21:26:16.503 2 DEBUG oslo_concurrency.lockutils [req-4a5b4899-3bcf-493e-aec1-89db70be8101 req-f7f9ae18-c029-4d3c-ac17-bde0bbd1ce80 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "be420b9b-7858-47b5-93b5-4c99f93efeed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:16 compute-0 nova_compute[192810]: 2025-09-30 21:26:16.503 2 DEBUG oslo_concurrency.lockutils [req-4a5b4899-3bcf-493e-aec1-89db70be8101 req-f7f9ae18-c029-4d3c-ac17-bde0bbd1ce80 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "be420b9b-7858-47b5-93b5-4c99f93efeed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:16 compute-0 nova_compute[192810]: 2025-09-30 21:26:16.504 2 DEBUG nova.compute.manager [req-4a5b4899-3bcf-493e-aec1-89db70be8101 req-f7f9ae18-c029-4d3c-ac17-bde0bbd1ce80 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] No waiting events found dispatching network-vif-plugged-70dc125e-319e-4e53-8804-bf4802e8790e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:26:16 compute-0 nova_compute[192810]: 2025-09-30 21:26:16.504 2 WARNING nova.compute.manager [req-4a5b4899-3bcf-493e-aec1-89db70be8101 req-f7f9ae18-c029-4d3c-ac17-bde0bbd1ce80 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Received unexpected event network-vif-plugged-70dc125e-319e-4e53-8804-bf4802e8790e for instance with vm_state stopped and task_state None.
Sep 30 21:26:16 compute-0 nova_compute[192810]: 2025-09-30 21:26:16.882 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:17 compute-0 nova_compute[192810]: 2025-09-30 21:26:17.516 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:18 compute-0 nova_compute[192810]: 2025-09-30 21:26:18.314 2 DEBUG oslo_concurrency.lockutils [None req-cbe22f85-bf44-4a27-a10d-5a591e3b8cfc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquiring lock "be420b9b-7858-47b5-93b5-4c99f93efeed" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:18 compute-0 nova_compute[192810]: 2025-09-30 21:26:18.314 2 DEBUG oslo_concurrency.lockutils [None req-cbe22f85-bf44-4a27-a10d-5a591e3b8cfc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "be420b9b-7858-47b5-93b5-4c99f93efeed" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:18 compute-0 nova_compute[192810]: 2025-09-30 21:26:18.315 2 DEBUG oslo_concurrency.lockutils [None req-cbe22f85-bf44-4a27-a10d-5a591e3b8cfc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquiring lock "be420b9b-7858-47b5-93b5-4c99f93efeed-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:18 compute-0 nova_compute[192810]: 2025-09-30 21:26:18.315 2 DEBUG oslo_concurrency.lockutils [None req-cbe22f85-bf44-4a27-a10d-5a591e3b8cfc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "be420b9b-7858-47b5-93b5-4c99f93efeed-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:18 compute-0 nova_compute[192810]: 2025-09-30 21:26:18.315 2 DEBUG oslo_concurrency.lockutils [None req-cbe22f85-bf44-4a27-a10d-5a591e3b8cfc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "be420b9b-7858-47b5-93b5-4c99f93efeed-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:18 compute-0 nova_compute[192810]: 2025-09-30 21:26:18.324 2 INFO nova.compute.manager [None req-cbe22f85-bf44-4a27-a10d-5a591e3b8cfc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Terminating instance
Sep 30 21:26:18 compute-0 nova_compute[192810]: 2025-09-30 21:26:18.333 2 DEBUG nova.compute.manager [None req-cbe22f85-bf44-4a27-a10d-5a591e3b8cfc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:26:18 compute-0 nova_compute[192810]: 2025-09-30 21:26:18.340 2 INFO nova.virt.libvirt.driver [-] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Instance destroyed successfully.
Sep 30 21:26:18 compute-0 nova_compute[192810]: 2025-09-30 21:26:18.340 2 DEBUG nova.objects.instance [None req-cbe22f85-bf44-4a27-a10d-5a591e3b8cfc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lazy-loading 'resources' on Instance uuid be420b9b-7858-47b5-93b5-4c99f93efeed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:26:18 compute-0 nova_compute[192810]: 2025-09-30 21:26:18.354 2 DEBUG nova.virt.libvirt.vif [None req-cbe22f85-bf44-4a27-a10d-5a591e3b8cfc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:25:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-278828185',display_name='tempest-DeleteServersTestJSON-server-278828185',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-278828185',id=57,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:26:00Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='c4bb94b19ac546f195f1f1f35411cce9',ramdisk_id='',reservation_id='r-dern13xv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-DeleteServersTestJSON-314554874',owner_user_name='tempest-DeleteServersTestJSON-314554874-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:26:15Z,user_data=None,user_id='bfe43dba9d03417182dd245d360568e6',uuid=be420b9b-7858-47b5-93b5-4c99f93efeed,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "70dc125e-319e-4e53-8804-bf4802e8790e", "address": "fa:16:3e:d3:56:37", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70dc125e-31", "ovs_interfaceid": "70dc125e-319e-4e53-8804-bf4802e8790e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:26:18 compute-0 nova_compute[192810]: 2025-09-30 21:26:18.355 2 DEBUG nova.network.os_vif_util [None req-cbe22f85-bf44-4a27-a10d-5a591e3b8cfc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Converting VIF {"id": "70dc125e-319e-4e53-8804-bf4802e8790e", "address": "fa:16:3e:d3:56:37", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap70dc125e-31", "ovs_interfaceid": "70dc125e-319e-4e53-8804-bf4802e8790e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:26:18 compute-0 nova_compute[192810]: 2025-09-30 21:26:18.355 2 DEBUG nova.network.os_vif_util [None req-cbe22f85-bf44-4a27-a10d-5a591e3b8cfc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:56:37,bridge_name='br-int',has_traffic_filtering=True,id=70dc125e-319e-4e53-8804-bf4802e8790e,network=Network(5569112a-9fb3-4151-add0-95b595cbe309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70dc125e-31') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:26:18 compute-0 nova_compute[192810]: 2025-09-30 21:26:18.356 2 DEBUG os_vif [None req-cbe22f85-bf44-4a27-a10d-5a591e3b8cfc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:56:37,bridge_name='br-int',has_traffic_filtering=True,id=70dc125e-319e-4e53-8804-bf4802e8790e,network=Network(5569112a-9fb3-4151-add0-95b595cbe309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70dc125e-31') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:26:18 compute-0 nova_compute[192810]: 2025-09-30 21:26:18.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:18 compute-0 nova_compute[192810]: 2025-09-30 21:26:18.358 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap70dc125e-31, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:26:18 compute-0 nova_compute[192810]: 2025-09-30 21:26:18.365 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:18 compute-0 nova_compute[192810]: 2025-09-30 21:26:18.368 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:26:18 compute-0 nova_compute[192810]: 2025-09-30 21:26:18.370 2 INFO os_vif [None req-cbe22f85-bf44-4a27-a10d-5a591e3b8cfc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:56:37,bridge_name='br-int',has_traffic_filtering=True,id=70dc125e-319e-4e53-8804-bf4802e8790e,network=Network(5569112a-9fb3-4151-add0-95b595cbe309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap70dc125e-31')
Sep 30 21:26:18 compute-0 nova_compute[192810]: 2025-09-30 21:26:18.370 2 INFO nova.virt.libvirt.driver [None req-cbe22f85-bf44-4a27-a10d-5a591e3b8cfc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Deleting instance files /var/lib/nova/instances/be420b9b-7858-47b5-93b5-4c99f93efeed_del
Sep 30 21:26:18 compute-0 nova_compute[192810]: 2025-09-30 21:26:18.371 2 INFO nova.virt.libvirt.driver [None req-cbe22f85-bf44-4a27-a10d-5a591e3b8cfc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Deletion of /var/lib/nova/instances/be420b9b-7858-47b5-93b5-4c99f93efeed_del complete
Sep 30 21:26:18 compute-0 nova_compute[192810]: 2025-09-30 21:26:18.465 2 INFO nova.compute.manager [None req-cbe22f85-bf44-4a27-a10d-5a591e3b8cfc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Took 0.13 seconds to destroy the instance on the hypervisor.
Sep 30 21:26:18 compute-0 nova_compute[192810]: 2025-09-30 21:26:18.466 2 DEBUG oslo.service.loopingcall [None req-cbe22f85-bf44-4a27-a10d-5a591e3b8cfc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:26:18 compute-0 nova_compute[192810]: 2025-09-30 21:26:18.466 2 DEBUG nova.compute.manager [-] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:26:18 compute-0 nova_compute[192810]: 2025-09-30 21:26:18.467 2 DEBUG nova.network.neutron [-] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:26:19 compute-0 nova_compute[192810]: 2025-09-30 21:26:19.589 2 DEBUG nova.network.neutron [-] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:26:19 compute-0 nova_compute[192810]: 2025-09-30 21:26:19.606 2 INFO nova.compute.manager [-] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Took 1.14 seconds to deallocate network for instance.
Sep 30 21:26:19 compute-0 nova_compute[192810]: 2025-09-30 21:26:19.686 2 DEBUG oslo_concurrency.lockutils [None req-cbe22f85-bf44-4a27-a10d-5a591e3b8cfc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:19 compute-0 nova_compute[192810]: 2025-09-30 21:26:19.686 2 DEBUG oslo_concurrency.lockutils [None req-cbe22f85-bf44-4a27-a10d-5a591e3b8cfc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:19 compute-0 nova_compute[192810]: 2025-09-30 21:26:19.777 2 DEBUG nova.compute.provider_tree [None req-cbe22f85-bf44-4a27-a10d-5a591e3b8cfc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:26:19 compute-0 nova_compute[192810]: 2025-09-30 21:26:19.804 2 DEBUG nova.scheduler.client.report [None req-cbe22f85-bf44-4a27-a10d-5a591e3b8cfc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:26:19 compute-0 nova_compute[192810]: 2025-09-30 21:26:19.832 2 DEBUG oslo_concurrency.lockutils [None req-cbe22f85-bf44-4a27-a10d-5a591e3b8cfc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.146s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:19 compute-0 nova_compute[192810]: 2025-09-30 21:26:19.877 2 INFO nova.scheduler.client.report [None req-cbe22f85-bf44-4a27-a10d-5a591e3b8cfc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Deleted allocations for instance be420b9b-7858-47b5-93b5-4c99f93efeed
Sep 30 21:26:19 compute-0 nova_compute[192810]: 2025-09-30 21:26:19.955 2 DEBUG oslo_concurrency.lockutils [None req-cbe22f85-bf44-4a27-a10d-5a591e3b8cfc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "be420b9b-7858-47b5-93b5-4c99f93efeed" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.641s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:20 compute-0 nova_compute[192810]: 2025-09-30 21:26:20.308 2 DEBUG nova.compute.manager [req-649ef900-efbe-496d-80e9-b84c420c6d41 req-26bfdd4e-8eb0-45ac-b511-2afa1814f69c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Received event network-vif-deleted-70dc125e-319e-4e53-8804-bf4802e8790e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:26:20 compute-0 podman[228157]: 2025-09-30 21:26:20.339680521 +0000 UTC m=+0.061788857 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=openstack_network_exporter, name=ubi9-minimal, io.openshift.tags=minimal rhel9, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, vcs-type=git, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Sep 30 21:26:20 compute-0 podman[228156]: 2025-09-30 21:26:20.347379055 +0000 UTC m=+0.083813872 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:26:21 compute-0 nova_compute[192810]: 2025-09-30 21:26:21.884 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:23 compute-0 nova_compute[192810]: 2025-09-30 21:26:23.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:23.872 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:26:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:23.873 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:26:23 compute-0 nova_compute[192810]: 2025-09-30 21:26:23.873 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:24 compute-0 nova_compute[192810]: 2025-09-30 21:26:24.016 2 DEBUG oslo_concurrency.lockutils [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquiring lock "87d0ed60-e235-45c5-ae23-1e76a970c679" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:24 compute-0 nova_compute[192810]: 2025-09-30 21:26:24.016 2 DEBUG oslo_concurrency.lockutils [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "87d0ed60-e235-45c5-ae23-1e76a970c679" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:24 compute-0 nova_compute[192810]: 2025-09-30 21:26:24.036 2 DEBUG nova.compute.manager [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:26:24 compute-0 nova_compute[192810]: 2025-09-30 21:26:24.152 2 DEBUG oslo_concurrency.lockutils [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:24 compute-0 nova_compute[192810]: 2025-09-30 21:26:24.153 2 DEBUG oslo_concurrency.lockutils [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:24 compute-0 nova_compute[192810]: 2025-09-30 21:26:24.159 2 DEBUG nova.virt.hardware [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:26:24 compute-0 nova_compute[192810]: 2025-09-30 21:26:24.159 2 INFO nova.compute.claims [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:26:24 compute-0 nova_compute[192810]: 2025-09-30 21:26:24.306 2 DEBUG nova.compute.provider_tree [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:26:24 compute-0 nova_compute[192810]: 2025-09-30 21:26:24.326 2 DEBUG nova.scheduler.client.report [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:26:24 compute-0 nova_compute[192810]: 2025-09-30 21:26:24.353 2 DEBUG oslo_concurrency.lockutils [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.200s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:24 compute-0 nova_compute[192810]: 2025-09-30 21:26:24.354 2 DEBUG nova.compute.manager [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:26:24 compute-0 nova_compute[192810]: 2025-09-30 21:26:24.501 2 DEBUG nova.compute.manager [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:26:24 compute-0 nova_compute[192810]: 2025-09-30 21:26:24.501 2 DEBUG nova.network.neutron [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:26:24 compute-0 nova_compute[192810]: 2025-09-30 21:26:24.527 2 INFO nova.virt.libvirt.driver [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:26:24 compute-0 nova_compute[192810]: 2025-09-30 21:26:24.542 2 DEBUG nova.compute.manager [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:26:24 compute-0 nova_compute[192810]: 2025-09-30 21:26:24.710 2 DEBUG nova.compute.manager [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:26:24 compute-0 nova_compute[192810]: 2025-09-30 21:26:24.713 2 DEBUG nova.virt.libvirt.driver [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:26:24 compute-0 nova_compute[192810]: 2025-09-30 21:26:24.714 2 INFO nova.virt.libvirt.driver [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Creating image(s)
Sep 30 21:26:24 compute-0 nova_compute[192810]: 2025-09-30 21:26:24.715 2 DEBUG oslo_concurrency.lockutils [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquiring lock "/var/lib/nova/instances/87d0ed60-e235-45c5-ae23-1e76a970c679/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:24 compute-0 nova_compute[192810]: 2025-09-30 21:26:24.715 2 DEBUG oslo_concurrency.lockutils [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "/var/lib/nova/instances/87d0ed60-e235-45c5-ae23-1e76a970c679/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:24 compute-0 nova_compute[192810]: 2025-09-30 21:26:24.716 2 DEBUG oslo_concurrency.lockutils [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "/var/lib/nova/instances/87d0ed60-e235-45c5-ae23-1e76a970c679/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:24 compute-0 nova_compute[192810]: 2025-09-30 21:26:24.740 2 DEBUG oslo_concurrency.processutils [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:26:24 compute-0 nova_compute[192810]: 2025-09-30 21:26:24.827 2 DEBUG oslo_concurrency.processutils [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:26:24 compute-0 nova_compute[192810]: 2025-09-30 21:26:24.828 2 DEBUG oslo_concurrency.lockutils [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:24 compute-0 nova_compute[192810]: 2025-09-30 21:26:24.830 2 DEBUG oslo_concurrency.lockutils [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:24 compute-0 nova_compute[192810]: 2025-09-30 21:26:24.851 2 DEBUG oslo_concurrency.processutils [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:26:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:24.875 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:26:24 compute-0 nova_compute[192810]: 2025-09-30 21:26:24.934 2 DEBUG oslo_concurrency.processutils [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:26:24 compute-0 nova_compute[192810]: 2025-09-30 21:26:24.936 2 DEBUG oslo_concurrency.processutils [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/87d0ed60-e235-45c5-ae23-1e76a970c679/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:26:24 compute-0 nova_compute[192810]: 2025-09-30 21:26:24.984 2 DEBUG oslo_concurrency.processutils [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/87d0ed60-e235-45c5-ae23-1e76a970c679/disk 1073741824" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:26:24 compute-0 nova_compute[192810]: 2025-09-30 21:26:24.986 2 DEBUG oslo_concurrency.lockutils [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.156s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:24 compute-0 nova_compute[192810]: 2025-09-30 21:26:24.987 2 DEBUG oslo_concurrency.processutils [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:26:25 compute-0 nova_compute[192810]: 2025-09-30 21:26:25.052 2 DEBUG oslo_concurrency.processutils [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:26:25 compute-0 nova_compute[192810]: 2025-09-30 21:26:25.053 2 DEBUG nova.virt.disk.api [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Checking if we can resize image /var/lib/nova/instances/87d0ed60-e235-45c5-ae23-1e76a970c679/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:26:25 compute-0 nova_compute[192810]: 2025-09-30 21:26:25.054 2 DEBUG oslo_concurrency.processutils [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/87d0ed60-e235-45c5-ae23-1e76a970c679/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:26:25 compute-0 nova_compute[192810]: 2025-09-30 21:26:25.113 2 DEBUG oslo_concurrency.processutils [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/87d0ed60-e235-45c5-ae23-1e76a970c679/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:26:25 compute-0 nova_compute[192810]: 2025-09-30 21:26:25.115 2 DEBUG nova.virt.disk.api [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Cannot resize image /var/lib/nova/instances/87d0ed60-e235-45c5-ae23-1e76a970c679/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:26:25 compute-0 nova_compute[192810]: 2025-09-30 21:26:25.115 2 DEBUG nova.objects.instance [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lazy-loading 'migration_context' on Instance uuid 87d0ed60-e235-45c5-ae23-1e76a970c679 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:26:25 compute-0 nova_compute[192810]: 2025-09-30 21:26:25.149 2 DEBUG nova.virt.libvirt.driver [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:26:25 compute-0 nova_compute[192810]: 2025-09-30 21:26:25.149 2 DEBUG nova.virt.libvirt.driver [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Ensure instance console log exists: /var/lib/nova/instances/87d0ed60-e235-45c5-ae23-1e76a970c679/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:26:25 compute-0 nova_compute[192810]: 2025-09-30 21:26:25.150 2 DEBUG oslo_concurrency.lockutils [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:25 compute-0 nova_compute[192810]: 2025-09-30 21:26:25.150 2 DEBUG oslo_concurrency.lockutils [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:25 compute-0 nova_compute[192810]: 2025-09-30 21:26:25.151 2 DEBUG oslo_concurrency.lockutils [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:25 compute-0 nova_compute[192810]: 2025-09-30 21:26:25.365 2 DEBUG nova.policy [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bfe43dba9d03417182dd245d360568e6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c4bb94b19ac546f195f1f1f35411cce9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:26:26 compute-0 nova_compute[192810]: 2025-09-30 21:26:26.427 2 DEBUG nova.network.neutron [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Successfully created port: 048dc651-e105-4eae-b4c6-92b99b32cd2f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:26:26 compute-0 nova_compute[192810]: 2025-09-30 21:26:26.887 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:27 compute-0 nova_compute[192810]: 2025-09-30 21:26:27.273 2 DEBUG nova.network.neutron [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Successfully updated port: 048dc651-e105-4eae-b4c6-92b99b32cd2f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:26:27 compute-0 nova_compute[192810]: 2025-09-30 21:26:27.291 2 DEBUG oslo_concurrency.lockutils [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquiring lock "refresh_cache-87d0ed60-e235-45c5-ae23-1e76a970c679" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:26:27 compute-0 nova_compute[192810]: 2025-09-30 21:26:27.292 2 DEBUG oslo_concurrency.lockutils [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquired lock "refresh_cache-87d0ed60-e235-45c5-ae23-1e76a970c679" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:26:27 compute-0 nova_compute[192810]: 2025-09-30 21:26:27.292 2 DEBUG nova.network.neutron [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:26:27 compute-0 podman[228216]: 2025-09-30 21:26:27.356579932 +0000 UTC m=+0.078275113 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, tcib_managed=true, container_name=iscsid)
Sep 30 21:26:27 compute-0 nova_compute[192810]: 2025-09-30 21:26:27.394 2 DEBUG nova.compute.manager [req-84f3ce5d-889b-453c-b372-5808b30c880d req-c9bc8204-09c5-4192-ab98-e148807bfdf3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Received event network-changed-048dc651-e105-4eae-b4c6-92b99b32cd2f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:26:27 compute-0 nova_compute[192810]: 2025-09-30 21:26:27.395 2 DEBUG nova.compute.manager [req-84f3ce5d-889b-453c-b372-5808b30c880d req-c9bc8204-09c5-4192-ab98-e148807bfdf3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Refreshing instance network info cache due to event network-changed-048dc651-e105-4eae-b4c6-92b99b32cd2f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:26:27 compute-0 nova_compute[192810]: 2025-09-30 21:26:27.395 2 DEBUG oslo_concurrency.lockutils [req-84f3ce5d-889b-453c-b372-5808b30c880d req-c9bc8204-09c5-4192-ab98-e148807bfdf3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-87d0ed60-e235-45c5-ae23-1e76a970c679" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:26:27 compute-0 podman[228236]: 2025-09-30 21:26:27.444346382 +0000 UTC m=+0.053336154 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:26:27 compute-0 podman[228235]: 2025-09-30 21:26:27.448391884 +0000 UTC m=+0.062137316 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd)
Sep 30 21:26:27 compute-0 nova_compute[192810]: 2025-09-30 21:26:27.480 2 DEBUG nova.network.neutron [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:26:28 compute-0 nova_compute[192810]: 2025-09-30 21:26:28.370 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:29 compute-0 nova_compute[192810]: 2025-09-30 21:26:29.856 2 DEBUG nova.network.neutron [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Updating instance_info_cache with network_info: [{"id": "048dc651-e105-4eae-b4c6-92b99b32cd2f", "address": "fa:16:3e:fd:f5:8a", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap048dc651-e1", "ovs_interfaceid": "048dc651-e105-4eae-b4c6-92b99b32cd2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:26:29 compute-0 nova_compute[192810]: 2025-09-30 21:26:29.945 2 DEBUG oslo_concurrency.lockutils [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Releasing lock "refresh_cache-87d0ed60-e235-45c5-ae23-1e76a970c679" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:26:29 compute-0 nova_compute[192810]: 2025-09-30 21:26:29.945 2 DEBUG nova.compute.manager [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Instance network_info: |[{"id": "048dc651-e105-4eae-b4c6-92b99b32cd2f", "address": "fa:16:3e:fd:f5:8a", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap048dc651-e1", "ovs_interfaceid": "048dc651-e105-4eae-b4c6-92b99b32cd2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:26:29 compute-0 nova_compute[192810]: 2025-09-30 21:26:29.946 2 DEBUG oslo_concurrency.lockutils [req-84f3ce5d-889b-453c-b372-5808b30c880d req-c9bc8204-09c5-4192-ab98-e148807bfdf3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-87d0ed60-e235-45c5-ae23-1e76a970c679" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:26:29 compute-0 nova_compute[192810]: 2025-09-30 21:26:29.946 2 DEBUG nova.network.neutron [req-84f3ce5d-889b-453c-b372-5808b30c880d req-c9bc8204-09c5-4192-ab98-e148807bfdf3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Refreshing network info cache for port 048dc651-e105-4eae-b4c6-92b99b32cd2f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:26:29 compute-0 nova_compute[192810]: 2025-09-30 21:26:29.948 2 DEBUG nova.virt.libvirt.driver [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Start _get_guest_xml network_info=[{"id": "048dc651-e105-4eae-b4c6-92b99b32cd2f", "address": "fa:16:3e:fd:f5:8a", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap048dc651-e1", "ovs_interfaceid": "048dc651-e105-4eae-b4c6-92b99b32cd2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:26:29 compute-0 nova_compute[192810]: 2025-09-30 21:26:29.952 2 WARNING nova.virt.libvirt.driver [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:26:29 compute-0 nova_compute[192810]: 2025-09-30 21:26:29.956 2 DEBUG nova.virt.libvirt.host [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:26:29 compute-0 nova_compute[192810]: 2025-09-30 21:26:29.957 2 DEBUG nova.virt.libvirt.host [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:26:29 compute-0 nova_compute[192810]: 2025-09-30 21:26:29.960 2 DEBUG nova.virt.libvirt.host [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:26:29 compute-0 nova_compute[192810]: 2025-09-30 21:26:29.960 2 DEBUG nova.virt.libvirt.host [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:26:29 compute-0 nova_compute[192810]: 2025-09-30 21:26:29.961 2 DEBUG nova.virt.libvirt.driver [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:26:29 compute-0 nova_compute[192810]: 2025-09-30 21:26:29.961 2 DEBUG nova.virt.hardware [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:26:29 compute-0 nova_compute[192810]: 2025-09-30 21:26:29.962 2 DEBUG nova.virt.hardware [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:26:29 compute-0 nova_compute[192810]: 2025-09-30 21:26:29.962 2 DEBUG nova.virt.hardware [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:26:29 compute-0 nova_compute[192810]: 2025-09-30 21:26:29.962 2 DEBUG nova.virt.hardware [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:26:29 compute-0 nova_compute[192810]: 2025-09-30 21:26:29.963 2 DEBUG nova.virt.hardware [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:26:29 compute-0 nova_compute[192810]: 2025-09-30 21:26:29.963 2 DEBUG nova.virt.hardware [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:26:29 compute-0 nova_compute[192810]: 2025-09-30 21:26:29.963 2 DEBUG nova.virt.hardware [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:26:29 compute-0 nova_compute[192810]: 2025-09-30 21:26:29.963 2 DEBUG nova.virt.hardware [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:26:29 compute-0 nova_compute[192810]: 2025-09-30 21:26:29.964 2 DEBUG nova.virt.hardware [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:26:29 compute-0 nova_compute[192810]: 2025-09-30 21:26:29.964 2 DEBUG nova.virt.hardware [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:26:29 compute-0 nova_compute[192810]: 2025-09-30 21:26:29.964 2 DEBUG nova.virt.hardware [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:26:29 compute-0 nova_compute[192810]: 2025-09-30 21:26:29.968 2 DEBUG nova.virt.libvirt.vif [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:26:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1016171574',display_name='tempest-DeleteServersTestJSON-server-1016171574',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1016171574',id=59,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4bb94b19ac546f195f1f1f35411cce9',ramdisk_id='',reservation_id='r-ps02fatk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-314554874',owner_user_name='tempest-DeleteServersTestJSON-314554
874-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:26:24Z,user_data=None,user_id='bfe43dba9d03417182dd245d360568e6',uuid=87d0ed60-e235-45c5-ae23-1e76a970c679,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "048dc651-e105-4eae-b4c6-92b99b32cd2f", "address": "fa:16:3e:fd:f5:8a", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap048dc651-e1", "ovs_interfaceid": "048dc651-e105-4eae-b4c6-92b99b32cd2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:26:29 compute-0 nova_compute[192810]: 2025-09-30 21:26:29.968 2 DEBUG nova.network.os_vif_util [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Converting VIF {"id": "048dc651-e105-4eae-b4c6-92b99b32cd2f", "address": "fa:16:3e:fd:f5:8a", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap048dc651-e1", "ovs_interfaceid": "048dc651-e105-4eae-b4c6-92b99b32cd2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:26:29 compute-0 nova_compute[192810]: 2025-09-30 21:26:29.969 2 DEBUG nova.network.os_vif_util [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fd:f5:8a,bridge_name='br-int',has_traffic_filtering=True,id=048dc651-e105-4eae-b4c6-92b99b32cd2f,network=Network(5569112a-9fb3-4151-add0-95b595cbe309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap048dc651-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:26:29 compute-0 nova_compute[192810]: 2025-09-30 21:26:29.970 2 DEBUG nova.objects.instance [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 87d0ed60-e235-45c5-ae23-1e76a970c679 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:26:30 compute-0 nova_compute[192810]: 2025-09-30 21:26:30.038 2 DEBUG nova.virt.libvirt.driver [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:26:30 compute-0 nova_compute[192810]:   <uuid>87d0ed60-e235-45c5-ae23-1e76a970c679</uuid>
Sep 30 21:26:30 compute-0 nova_compute[192810]:   <name>instance-0000003b</name>
Sep 30 21:26:30 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:26:30 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:26:30 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:26:30 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:       <nova:name>tempest-DeleteServersTestJSON-server-1016171574</nova:name>
Sep 30 21:26:30 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:26:29</nova:creationTime>
Sep 30 21:26:30 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:26:30 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:26:30 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:26:30 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:26:30 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:26:30 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:26:30 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:26:30 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:26:30 compute-0 nova_compute[192810]:         <nova:user uuid="bfe43dba9d03417182dd245d360568e6">tempest-DeleteServersTestJSON-314554874-project-member</nova:user>
Sep 30 21:26:30 compute-0 nova_compute[192810]:         <nova:project uuid="c4bb94b19ac546f195f1f1f35411cce9">tempest-DeleteServersTestJSON-314554874</nova:project>
Sep 30 21:26:30 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:26:30 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:26:30 compute-0 nova_compute[192810]:         <nova:port uuid="048dc651-e105-4eae-b4c6-92b99b32cd2f">
Sep 30 21:26:30 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:26:30 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:26:30 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:26:30 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:26:30 compute-0 nova_compute[192810]:     <system>
Sep 30 21:26:30 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:26:30 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:26:30 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:26:30 compute-0 nova_compute[192810]:       <entry name="serial">87d0ed60-e235-45c5-ae23-1e76a970c679</entry>
Sep 30 21:26:30 compute-0 nova_compute[192810]:       <entry name="uuid">87d0ed60-e235-45c5-ae23-1e76a970c679</entry>
Sep 30 21:26:30 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     </system>
Sep 30 21:26:30 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:26:30 compute-0 nova_compute[192810]:   <os>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:   </os>
Sep 30 21:26:30 compute-0 nova_compute[192810]:   <features>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:   </features>
Sep 30 21:26:30 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:26:30 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:26:30 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:26:30 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:26:30 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:26:30 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/87d0ed60-e235-45c5-ae23-1e76a970c679/disk"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:26:30 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/87d0ed60-e235-45c5-ae23-1e76a970c679/disk.config"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:26:30 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:fd:f5:8a"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:       <target dev="tap048dc651-e1"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:26:30 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/87d0ed60-e235-45c5-ae23-1e76a970c679/console.log" append="off"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     <video>
Sep 30 21:26:30 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     </video>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:26:30 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:26:30 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:26:30 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:26:30 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:26:30 compute-0 nova_compute[192810]: </domain>
Sep 30 21:26:30 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:26:30 compute-0 nova_compute[192810]: 2025-09-30 21:26:30.039 2 DEBUG nova.compute.manager [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Preparing to wait for external event network-vif-plugged-048dc651-e105-4eae-b4c6-92b99b32cd2f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:26:30 compute-0 nova_compute[192810]: 2025-09-30 21:26:30.039 2 DEBUG oslo_concurrency.lockutils [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquiring lock "87d0ed60-e235-45c5-ae23-1e76a970c679-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:30 compute-0 nova_compute[192810]: 2025-09-30 21:26:30.039 2 DEBUG oslo_concurrency.lockutils [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "87d0ed60-e235-45c5-ae23-1e76a970c679-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:30 compute-0 nova_compute[192810]: 2025-09-30 21:26:30.040 2 DEBUG oslo_concurrency.lockutils [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "87d0ed60-e235-45c5-ae23-1e76a970c679-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:30 compute-0 nova_compute[192810]: 2025-09-30 21:26:30.040 2 DEBUG nova.virt.libvirt.vif [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:26:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1016171574',display_name='tempest-DeleteServersTestJSON-server-1016171574',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1016171574',id=59,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4bb94b19ac546f195f1f1f35411cce9',ramdisk_id='',reservation_id='r-ps02fatk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-314554874',owner_user_name='tempest-DeleteServersTestJ
SON-314554874-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:26:24Z,user_data=None,user_id='bfe43dba9d03417182dd245d360568e6',uuid=87d0ed60-e235-45c5-ae23-1e76a970c679,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "048dc651-e105-4eae-b4c6-92b99b32cd2f", "address": "fa:16:3e:fd:f5:8a", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap048dc651-e1", "ovs_interfaceid": "048dc651-e105-4eae-b4c6-92b99b32cd2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:26:30 compute-0 nova_compute[192810]: 2025-09-30 21:26:30.041 2 DEBUG nova.network.os_vif_util [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Converting VIF {"id": "048dc651-e105-4eae-b4c6-92b99b32cd2f", "address": "fa:16:3e:fd:f5:8a", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap048dc651-e1", "ovs_interfaceid": "048dc651-e105-4eae-b4c6-92b99b32cd2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:26:30 compute-0 nova_compute[192810]: 2025-09-30 21:26:30.041 2 DEBUG nova.network.os_vif_util [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fd:f5:8a,bridge_name='br-int',has_traffic_filtering=True,id=048dc651-e105-4eae-b4c6-92b99b32cd2f,network=Network(5569112a-9fb3-4151-add0-95b595cbe309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap048dc651-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:26:30 compute-0 nova_compute[192810]: 2025-09-30 21:26:30.041 2 DEBUG os_vif [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:f5:8a,bridge_name='br-int',has_traffic_filtering=True,id=048dc651-e105-4eae-b4c6-92b99b32cd2f,network=Network(5569112a-9fb3-4151-add0-95b595cbe309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap048dc651-e1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:26:30 compute-0 nova_compute[192810]: 2025-09-30 21:26:30.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:30 compute-0 nova_compute[192810]: 2025-09-30 21:26:30.042 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:26:30 compute-0 nova_compute[192810]: 2025-09-30 21:26:30.043 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:26:30 compute-0 nova_compute[192810]: 2025-09-30 21:26:30.045 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:30 compute-0 nova_compute[192810]: 2025-09-30 21:26:30.045 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap048dc651-e1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:26:30 compute-0 nova_compute[192810]: 2025-09-30 21:26:30.046 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap048dc651-e1, col_values=(('external_ids', {'iface-id': '048dc651-e105-4eae-b4c6-92b99b32cd2f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fd:f5:8a', 'vm-uuid': '87d0ed60-e235-45c5-ae23-1e76a970c679'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:26:30 compute-0 nova_compute[192810]: 2025-09-30 21:26:30.047 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:30 compute-0 NetworkManager[51733]: <info>  [1759267590.0482] manager: (tap048dc651-e1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/90)
Sep 30 21:26:30 compute-0 nova_compute[192810]: 2025-09-30 21:26:30.049 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:26:30 compute-0 nova_compute[192810]: 2025-09-30 21:26:30.052 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:30 compute-0 nova_compute[192810]: 2025-09-30 21:26:30.053 2 INFO os_vif [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:f5:8a,bridge_name='br-int',has_traffic_filtering=True,id=048dc651-e105-4eae-b4c6-92b99b32cd2f,network=Network(5569112a-9fb3-4151-add0-95b595cbe309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap048dc651-e1')
Sep 30 21:26:30 compute-0 nova_compute[192810]: 2025-09-30 21:26:30.118 2 DEBUG nova.virt.libvirt.driver [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:26:30 compute-0 nova_compute[192810]: 2025-09-30 21:26:30.119 2 DEBUG nova.virt.libvirt.driver [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:26:30 compute-0 nova_compute[192810]: 2025-09-30 21:26:30.119 2 DEBUG nova.virt.libvirt.driver [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] No VIF found with MAC fa:16:3e:fd:f5:8a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:26:30 compute-0 nova_compute[192810]: 2025-09-30 21:26:30.120 2 INFO nova.virt.libvirt.driver [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Using config drive
Sep 30 21:26:30 compute-0 nova_compute[192810]: 2025-09-30 21:26:30.281 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267575.2801514, be420b9b-7858-47b5-93b5-4c99f93efeed => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:26:30 compute-0 nova_compute[192810]: 2025-09-30 21:26:30.282 2 INFO nova.compute.manager [-] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] VM Stopped (Lifecycle Event)
Sep 30 21:26:30 compute-0 nova_compute[192810]: 2025-09-30 21:26:30.304 2 DEBUG nova.compute.manager [None req-73bc46c8-00f2-48ab-96fe-cdd0d418a760 - - - - - -] [instance: be420b9b-7858-47b5-93b5-4c99f93efeed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:26:30 compute-0 nova_compute[192810]: 2025-09-30 21:26:30.540 2 INFO nova.virt.libvirt.driver [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Creating config drive at /var/lib/nova/instances/87d0ed60-e235-45c5-ae23-1e76a970c679/disk.config
Sep 30 21:26:30 compute-0 nova_compute[192810]: 2025-09-30 21:26:30.550 2 DEBUG oslo_concurrency.processutils [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/87d0ed60-e235-45c5-ae23-1e76a970c679/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwvboe2_l execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:26:30 compute-0 nova_compute[192810]: 2025-09-30 21:26:30.695 2 DEBUG oslo_concurrency.processutils [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/87d0ed60-e235-45c5-ae23-1e76a970c679/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwvboe2_l" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:26:30 compute-0 kernel: tap048dc651-e1: entered promiscuous mode
Sep 30 21:26:30 compute-0 NetworkManager[51733]: <info>  [1759267590.7809] manager: (tap048dc651-e1): new Tun device (/org/freedesktop/NetworkManager/Devices/91)
Sep 30 21:26:30 compute-0 systemd-udevd[228297]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:26:30 compute-0 ovn_controller[94912]: 2025-09-30T21:26:30Z|00207|binding|INFO|Claiming lport 048dc651-e105-4eae-b4c6-92b99b32cd2f for this chassis.
Sep 30 21:26:30 compute-0 ovn_controller[94912]: 2025-09-30T21:26:30Z|00208|binding|INFO|048dc651-e105-4eae-b4c6-92b99b32cd2f: Claiming fa:16:3e:fd:f5:8a 10.100.0.9
Sep 30 21:26:30 compute-0 nova_compute[192810]: 2025-09-30 21:26:30.829 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:30.840 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fd:f5:8a 10.100.0.9'], port_security=['fa:16:3e:fd:f5:8a 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '87d0ed60-e235-45c5-ae23-1e76a970c679', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5569112a-9fb3-4151-add0-95b595cbe309', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4bb94b19ac546f195f1f1f35411cce9', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b0c9a27b-9b95-41a1-8c38-505b25881a53', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ea8c6d09-2e51-451b-abc3-a852f19b487a, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=048dc651-e105-4eae-b4c6-92b99b32cd2f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:26:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:30.841 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 048dc651-e105-4eae-b4c6-92b99b32cd2f in datapath 5569112a-9fb3-4151-add0-95b595cbe309 bound to our chassis
Sep 30 21:26:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:30.842 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5569112a-9fb3-4151-add0-95b595cbe309
Sep 30 21:26:30 compute-0 ovn_controller[94912]: 2025-09-30T21:26:30Z|00209|binding|INFO|Setting lport 048dc651-e105-4eae-b4c6-92b99b32cd2f ovn-installed in OVS
Sep 30 21:26:30 compute-0 ovn_controller[94912]: 2025-09-30T21:26:30Z|00210|binding|INFO|Setting lport 048dc651-e105-4eae-b4c6-92b99b32cd2f up in Southbound
Sep 30 21:26:30 compute-0 nova_compute[192810]: 2025-09-30 21:26:30.845 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:30 compute-0 NetworkManager[51733]: <info>  [1759267590.8483] device (tap048dc651-e1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:26:30 compute-0 NetworkManager[51733]: <info>  [1759267590.8489] device (tap048dc651-e1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:26:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:30.855 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[45018cd0-c848-47b3-9a10-3e76891371bd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:30.855 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5569112a-91 in ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:26:30 compute-0 systemd-machined[152794]: New machine qemu-28-instance-0000003b.
Sep 30 21:26:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:30.857 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5569112a-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:26:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:30.857 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[b596db49-f0c5-48aa-973a-77418da4785f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:30.858 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[4b0f94b4-b383-494a-b249-072a4300ddec]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:30.869 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[7e8edf41-e4f4-4009-8219-f39b259a05e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:30 compute-0 systemd[1]: Started Virtual Machine qemu-28-instance-0000003b.
Sep 30 21:26:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:30.896 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[8ab3769f-432a-4f10-955c-03b803407170]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:30.930 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[576351f7-3485-4113-9464-e63bfd1316e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:30.934 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a746dc77-24c2-42de-9a2d-b72ffc286985]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:30 compute-0 systemd-udevd[228302]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:26:30 compute-0 NetworkManager[51733]: <info>  [1759267590.9365] manager: (tap5569112a-90): new Veth device (/org/freedesktop/NetworkManager/Devices/92)
Sep 30 21:26:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:30.966 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[30c4691e-aa84-4827-86a6-83e50d2e0865]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:30.968 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[d3fbf567-76c0-40b0-9c92-18c708515860]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:30 compute-0 NetworkManager[51733]: <info>  [1759267590.9977] device (tap5569112a-90): carrier: link connected
Sep 30 21:26:31 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:31.004 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[866fb43f-d864-4436-b46d-1c83e74b6e1e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:31 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:31.025 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[9d27855e-83c1-4119-b25a-04cb70be629f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5569112a-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b2:01:ab'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 57], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 428662, 'reachable_time': 15811, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228333, 'error': None, 'target': 'ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:31 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:31.045 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[4619852b-290f-4353-98a1-2e1adc213802]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb2:1ab'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 428662, 'tstamp': 428662}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228334, 'error': None, 'target': 'ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:31 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:31.063 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a050f9e6-dd6c-4159-a1db-9464f9a83edc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5569112a-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b2:01:ab'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 57], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 428662, 'reachable_time': 15811, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 228335, 'error': None, 'target': 'ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:31 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:31.100 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[802850c6-637f-4efc-99d6-160662831101]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:31 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:31.179 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[558d27dc-3e0f-4477-8b91-968e228951ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:31 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:31.181 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5569112a-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:26:31 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:31.182 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:26:31 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:31.182 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5569112a-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:26:31 compute-0 kernel: tap5569112a-90: entered promiscuous mode
Sep 30 21:26:31 compute-0 NetworkManager[51733]: <info>  [1759267591.1858] manager: (tap5569112a-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/93)
Sep 30 21:26:31 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:31.188 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5569112a-90, col_values=(('external_ids', {'iface-id': 'af49dc17-c7c9-4524-8791-14107f2ff34d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:26:31 compute-0 nova_compute[192810]: 2025-09-30 21:26:31.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:31 compute-0 ovn_controller[94912]: 2025-09-30T21:26:31Z|00211|binding|INFO|Releasing lport af49dc17-c7c9-4524-8791-14107f2ff34d from this chassis (sb_readonly=0)
Sep 30 21:26:31 compute-0 nova_compute[192810]: 2025-09-30 21:26:31.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:31 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:31.191 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5569112a-9fb3-4151-add0-95b595cbe309.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5569112a-9fb3-4151-add0-95b595cbe309.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:26:31 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:31.193 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[cdd78846-8783-40d0-91a3-bf05454b9346]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:31 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:31.194 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:26:31 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:26:31 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:26:31 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-5569112a-9fb3-4151-add0-95b595cbe309
Sep 30 21:26:31 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:26:31 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:26:31 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:26:31 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/5569112a-9fb3-4151-add0-95b595cbe309.pid.haproxy
Sep 30 21:26:31 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:26:31 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:26:31 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:26:31 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:26:31 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:26:31 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:26:31 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:26:31 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:26:31 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:26:31 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:26:31 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:26:31 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:26:31 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:26:31 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:26:31 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:26:31 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:26:31 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:26:31 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:26:31 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:26:31 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:26:31 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID 5569112a-9fb3-4151-add0-95b595cbe309
Sep 30 21:26:31 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:26:31 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:31.196 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309', 'env', 'PROCESS_TAG=haproxy-5569112a-9fb3-4151-add0-95b595cbe309', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5569112a-9fb3-4151-add0-95b595cbe309.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:26:31 compute-0 nova_compute[192810]: 2025-09-30 21:26:31.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:31 compute-0 podman[228374]: 2025-09-30 21:26:31.602794098 +0000 UTC m=+0.058484354 container create c47cf3019dbb5e3399fb2487b6d192552fa2e453565f4f319755965f3032aa6b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:26:31 compute-0 systemd[1]: Started libpod-conmon-c47cf3019dbb5e3399fb2487b6d192552fa2e453565f4f319755965f3032aa6b.scope.
Sep 30 21:26:31 compute-0 podman[228374]: 2025-09-30 21:26:31.568047012 +0000 UTC m=+0.023737308 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:26:31 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:26:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2a4d060fc8d99706ea207c32df1f02f2b4c37e6be59e52f7b239929457c4b806/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:26:31 compute-0 podman[228374]: 2025-09-30 21:26:31.676622477 +0000 UTC m=+0.132312743 container init c47cf3019dbb5e3399fb2487b6d192552fa2e453565f4f319755965f3032aa6b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2)
Sep 30 21:26:31 compute-0 podman[228374]: 2025-09-30 21:26:31.683538391 +0000 UTC m=+0.139228647 container start c47cf3019dbb5e3399fb2487b6d192552fa2e453565f4f319755965f3032aa6b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0)
Sep 30 21:26:31 compute-0 neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309[228389]: [NOTICE]   (228393) : New worker (228395) forked
Sep 30 21:26:31 compute-0 neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309[228389]: [NOTICE]   (228393) : Loading success.
Sep 30 21:26:31 compute-0 nova_compute[192810]: 2025-09-30 21:26:31.719 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267591.7189326, 87d0ed60-e235-45c5-ae23-1e76a970c679 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:26:31 compute-0 nova_compute[192810]: 2025-09-30 21:26:31.719 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] VM Started (Lifecycle Event)
Sep 30 21:26:31 compute-0 nova_compute[192810]: 2025-09-30 21:26:31.736 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:26:31 compute-0 nova_compute[192810]: 2025-09-30 21:26:31.738 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267591.7196915, 87d0ed60-e235-45c5-ae23-1e76a970c679 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:26:31 compute-0 nova_compute[192810]: 2025-09-30 21:26:31.739 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] VM Paused (Lifecycle Event)
Sep 30 21:26:31 compute-0 nova_compute[192810]: 2025-09-30 21:26:31.754 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:26:31 compute-0 nova_compute[192810]: 2025-09-30 21:26:31.756 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:26:31 compute-0 nova_compute[192810]: 2025-09-30 21:26:31.778 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:26:31 compute-0 nova_compute[192810]: 2025-09-30 21:26:31.894 2 DEBUG nova.compute.manager [req-acea5d2a-f544-417f-af42-eee10e97395a req-92cf8d03-ac04-41e3-8c19-7cafac7cc2af dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Received event network-vif-plugged-048dc651-e105-4eae-b4c6-92b99b32cd2f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:26:31 compute-0 nova_compute[192810]: 2025-09-30 21:26:31.895 2 DEBUG oslo_concurrency.lockutils [req-acea5d2a-f544-417f-af42-eee10e97395a req-92cf8d03-ac04-41e3-8c19-7cafac7cc2af dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "87d0ed60-e235-45c5-ae23-1e76a970c679-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:31 compute-0 nova_compute[192810]: 2025-09-30 21:26:31.896 2 DEBUG oslo_concurrency.lockutils [req-acea5d2a-f544-417f-af42-eee10e97395a req-92cf8d03-ac04-41e3-8c19-7cafac7cc2af dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "87d0ed60-e235-45c5-ae23-1e76a970c679-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:31 compute-0 nova_compute[192810]: 2025-09-30 21:26:31.896 2 DEBUG oslo_concurrency.lockutils [req-acea5d2a-f544-417f-af42-eee10e97395a req-92cf8d03-ac04-41e3-8c19-7cafac7cc2af dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "87d0ed60-e235-45c5-ae23-1e76a970c679-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:31 compute-0 nova_compute[192810]: 2025-09-30 21:26:31.897 2 DEBUG nova.compute.manager [req-acea5d2a-f544-417f-af42-eee10e97395a req-92cf8d03-ac04-41e3-8c19-7cafac7cc2af dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Processing event network-vif-plugged-048dc651-e105-4eae-b4c6-92b99b32cd2f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:26:31 compute-0 nova_compute[192810]: 2025-09-30 21:26:31.898 2 DEBUG nova.compute.manager [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:26:31 compute-0 nova_compute[192810]: 2025-09-30 21:26:31.929 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:31 compute-0 nova_compute[192810]: 2025-09-30 21:26:31.933 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267591.933335, 87d0ed60-e235-45c5-ae23-1e76a970c679 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:26:31 compute-0 nova_compute[192810]: 2025-09-30 21:26:31.934 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] VM Resumed (Lifecycle Event)
Sep 30 21:26:31 compute-0 nova_compute[192810]: 2025-09-30 21:26:31.936 2 DEBUG nova.virt.libvirt.driver [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:26:31 compute-0 nova_compute[192810]: 2025-09-30 21:26:31.941 2 INFO nova.virt.libvirt.driver [-] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Instance spawned successfully.
Sep 30 21:26:31 compute-0 nova_compute[192810]: 2025-09-30 21:26:31.942 2 DEBUG nova.virt.libvirt.driver [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:26:31 compute-0 nova_compute[192810]: 2025-09-30 21:26:31.954 2 DEBUG nova.network.neutron [req-84f3ce5d-889b-453c-b372-5808b30c880d req-c9bc8204-09c5-4192-ab98-e148807bfdf3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Updated VIF entry in instance network info cache for port 048dc651-e105-4eae-b4c6-92b99b32cd2f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:26:31 compute-0 nova_compute[192810]: 2025-09-30 21:26:31.955 2 DEBUG nova.network.neutron [req-84f3ce5d-889b-453c-b372-5808b30c880d req-c9bc8204-09c5-4192-ab98-e148807bfdf3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Updating instance_info_cache with network_info: [{"id": "048dc651-e105-4eae-b4c6-92b99b32cd2f", "address": "fa:16:3e:fd:f5:8a", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap048dc651-e1", "ovs_interfaceid": "048dc651-e105-4eae-b4c6-92b99b32cd2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:26:31 compute-0 nova_compute[192810]: 2025-09-30 21:26:31.960 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:26:31 compute-0 nova_compute[192810]: 2025-09-30 21:26:31.967 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:26:31 compute-0 nova_compute[192810]: 2025-09-30 21:26:31.969 2 DEBUG nova.virt.libvirt.driver [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:26:31 compute-0 nova_compute[192810]: 2025-09-30 21:26:31.969 2 DEBUG nova.virt.libvirt.driver [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:26:31 compute-0 nova_compute[192810]: 2025-09-30 21:26:31.970 2 DEBUG nova.virt.libvirt.driver [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:26:31 compute-0 nova_compute[192810]: 2025-09-30 21:26:31.970 2 DEBUG nova.virt.libvirt.driver [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:26:31 compute-0 nova_compute[192810]: 2025-09-30 21:26:31.970 2 DEBUG nova.virt.libvirt.driver [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:26:31 compute-0 nova_compute[192810]: 2025-09-30 21:26:31.971 2 DEBUG nova.virt.libvirt.driver [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:26:31 compute-0 nova_compute[192810]: 2025-09-30 21:26:31.996 2 DEBUG oslo_concurrency.lockutils [req-84f3ce5d-889b-453c-b372-5808b30c880d req-c9bc8204-09c5-4192-ab98-e148807bfdf3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-87d0ed60-e235-45c5-ae23-1e76a970c679" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:26:32 compute-0 nova_compute[192810]: 2025-09-30 21:26:32.003 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:26:32 compute-0 nova_compute[192810]: 2025-09-30 21:26:32.054 2 INFO nova.compute.manager [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Took 7.34 seconds to spawn the instance on the hypervisor.
Sep 30 21:26:32 compute-0 nova_compute[192810]: 2025-09-30 21:26:32.055 2 DEBUG nova.compute.manager [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:26:32 compute-0 nova_compute[192810]: 2025-09-30 21:26:32.151 2 INFO nova.compute.manager [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Took 8.04 seconds to build instance.
Sep 30 21:26:32 compute-0 nova_compute[192810]: 2025-09-30 21:26:32.181 2 DEBUG oslo_concurrency.lockutils [None req-31a378c9-2476-49a4-8ebd-c6f8dee78efc bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "87d0ed60-e235-45c5-ae23-1e76a970c679" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.165s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:33 compute-0 nova_compute[192810]: 2025-09-30 21:26:33.793 2 DEBUG nova.objects.instance [None req-1b9db8ac-1542-4304-84ba-79df0161a2da bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 87d0ed60-e235-45c5-ae23-1e76a970c679 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:26:33 compute-0 nova_compute[192810]: 2025-09-30 21:26:33.820 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267593.8195713, 87d0ed60-e235-45c5-ae23-1e76a970c679 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:26:33 compute-0 nova_compute[192810]: 2025-09-30 21:26:33.820 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] VM Paused (Lifecycle Event)
Sep 30 21:26:33 compute-0 nova_compute[192810]: 2025-09-30 21:26:33.838 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:26:33 compute-0 nova_compute[192810]: 2025-09-30 21:26:33.842 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:26:33 compute-0 nova_compute[192810]: 2025-09-30 21:26:33.872 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] During sync_power_state the instance has a pending task (suspending). Skip.
Sep 30 21:26:34 compute-0 kernel: tap048dc651-e1 (unregistering): left promiscuous mode
Sep 30 21:26:34 compute-0 NetworkManager[51733]: <info>  [1759267594.2722] device (tap048dc651-e1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:26:34 compute-0 ovn_controller[94912]: 2025-09-30T21:26:34Z|00212|binding|INFO|Releasing lport 048dc651-e105-4eae-b4c6-92b99b32cd2f from this chassis (sb_readonly=0)
Sep 30 21:26:34 compute-0 ovn_controller[94912]: 2025-09-30T21:26:34Z|00213|binding|INFO|Setting lport 048dc651-e105-4eae-b4c6-92b99b32cd2f down in Southbound
Sep 30 21:26:34 compute-0 ovn_controller[94912]: 2025-09-30T21:26:34Z|00214|binding|INFO|Removing iface tap048dc651-e1 ovn-installed in OVS
Sep 30 21:26:34 compute-0 nova_compute[192810]: 2025-09-30 21:26:34.337 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:34 compute-0 nova_compute[192810]: 2025-09-30 21:26:34.357 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:34 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:34.377 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fd:f5:8a 10.100.0.9'], port_security=['fa:16:3e:fd:f5:8a 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '87d0ed60-e235-45c5-ae23-1e76a970c679', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5569112a-9fb3-4151-add0-95b595cbe309', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4bb94b19ac546f195f1f1f35411cce9', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b0c9a27b-9b95-41a1-8c38-505b25881a53', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ea8c6d09-2e51-451b-abc3-a852f19b487a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=048dc651-e105-4eae-b4c6-92b99b32cd2f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:26:34 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:34.378 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 048dc651-e105-4eae-b4c6-92b99b32cd2f in datapath 5569112a-9fb3-4151-add0-95b595cbe309 unbound from our chassis
Sep 30 21:26:34 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:34.379 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5569112a-9fb3-4151-add0-95b595cbe309, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:26:34 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:34.380 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[4a98c2c7-e3e4-46a6-b703-3964dbf66579]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:34 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:34.380 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309 namespace which is not needed anymore
Sep 30 21:26:34 compute-0 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d0000003b.scope: Deactivated successfully.
Sep 30 21:26:34 compute-0 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d0000003b.scope: Consumed 2.801s CPU time.
Sep 30 21:26:34 compute-0 systemd-machined[152794]: Machine qemu-28-instance-0000003b terminated.
Sep 30 21:26:34 compute-0 nova_compute[192810]: 2025-09-30 21:26:34.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:34 compute-0 nova_compute[192810]: 2025-09-30 21:26:34.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:34 compute-0 nova_compute[192810]: 2025-09-30 21:26:34.506 2 DEBUG nova.compute.manager [None req-1b9db8ac-1542-4304-84ba-79df0161a2da bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:26:34 compute-0 neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309[228389]: [NOTICE]   (228393) : haproxy version is 2.8.14-c23fe91
Sep 30 21:26:34 compute-0 neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309[228389]: [NOTICE]   (228393) : path to executable is /usr/sbin/haproxy
Sep 30 21:26:34 compute-0 neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309[228389]: [WARNING]  (228393) : Exiting Master process...
Sep 30 21:26:34 compute-0 neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309[228389]: [ALERT]    (228393) : Current worker (228395) exited with code 143 (Terminated)
Sep 30 21:26:34 compute-0 neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309[228389]: [WARNING]  (228393) : All workers exited. Exiting... (0)
Sep 30 21:26:34 compute-0 systemd[1]: libpod-c47cf3019dbb5e3399fb2487b6d192552fa2e453565f4f319755965f3032aa6b.scope: Deactivated successfully.
Sep 30 21:26:34 compute-0 podman[228432]: 2025-09-30 21:26:34.527015129 +0000 UTC m=+0.052566385 container died c47cf3019dbb5e3399fb2487b6d192552fa2e453565f4f319755965f3032aa6b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Sep 30 21:26:34 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c47cf3019dbb5e3399fb2487b6d192552fa2e453565f4f319755965f3032aa6b-userdata-shm.mount: Deactivated successfully.
Sep 30 21:26:34 compute-0 systemd[1]: var-lib-containers-storage-overlay-2a4d060fc8d99706ea207c32df1f02f2b4c37e6be59e52f7b239929457c4b806-merged.mount: Deactivated successfully.
Sep 30 21:26:34 compute-0 podman[228432]: 2025-09-30 21:26:34.567517689 +0000 UTC m=+0.093068925 container cleanup c47cf3019dbb5e3399fb2487b6d192552fa2e453565f4f319755965f3032aa6b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:26:34 compute-0 systemd[1]: libpod-conmon-c47cf3019dbb5e3399fb2487b6d192552fa2e453565f4f319755965f3032aa6b.scope: Deactivated successfully.
Sep 30 21:26:34 compute-0 nova_compute[192810]: 2025-09-30 21:26:34.618 2 DEBUG nova.compute.manager [req-41f6b595-127f-4137-bfdc-c99b5adb92b8 req-201dc2fe-4cb2-49fb-aefd-be90c74b551f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Received event network-vif-plugged-048dc651-e105-4eae-b4c6-92b99b32cd2f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:26:34 compute-0 nova_compute[192810]: 2025-09-30 21:26:34.618 2 DEBUG oslo_concurrency.lockutils [req-41f6b595-127f-4137-bfdc-c99b5adb92b8 req-201dc2fe-4cb2-49fb-aefd-be90c74b551f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "87d0ed60-e235-45c5-ae23-1e76a970c679-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:34 compute-0 nova_compute[192810]: 2025-09-30 21:26:34.618 2 DEBUG oslo_concurrency.lockutils [req-41f6b595-127f-4137-bfdc-c99b5adb92b8 req-201dc2fe-4cb2-49fb-aefd-be90c74b551f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "87d0ed60-e235-45c5-ae23-1e76a970c679-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:34 compute-0 nova_compute[192810]: 2025-09-30 21:26:34.619 2 DEBUG oslo_concurrency.lockutils [req-41f6b595-127f-4137-bfdc-c99b5adb92b8 req-201dc2fe-4cb2-49fb-aefd-be90c74b551f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "87d0ed60-e235-45c5-ae23-1e76a970c679-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:34 compute-0 nova_compute[192810]: 2025-09-30 21:26:34.619 2 DEBUG nova.compute.manager [req-41f6b595-127f-4137-bfdc-c99b5adb92b8 req-201dc2fe-4cb2-49fb-aefd-be90c74b551f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] No waiting events found dispatching network-vif-plugged-048dc651-e105-4eae-b4c6-92b99b32cd2f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:26:34 compute-0 nova_compute[192810]: 2025-09-30 21:26:34.619 2 WARNING nova.compute.manager [req-41f6b595-127f-4137-bfdc-c99b5adb92b8 req-201dc2fe-4cb2-49fb-aefd-be90c74b551f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Received unexpected event network-vif-plugged-048dc651-e105-4eae-b4c6-92b99b32cd2f for instance with vm_state suspended and task_state None.
Sep 30 21:26:34 compute-0 podman[228475]: 2025-09-30 21:26:34.64102553 +0000 UTC m=+0.050487932 container remove c47cf3019dbb5e3399fb2487b6d192552fa2e453565f4f319755965f3032aa6b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 21:26:34 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:34.648 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[b16c1da7-11e5-4745-a1d7-ebff67a9ce27]: (4, ('Tue Sep 30 09:26:34 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309 (c47cf3019dbb5e3399fb2487b6d192552fa2e453565f4f319755965f3032aa6b)\nc47cf3019dbb5e3399fb2487b6d192552fa2e453565f4f319755965f3032aa6b\nTue Sep 30 09:26:34 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309 (c47cf3019dbb5e3399fb2487b6d192552fa2e453565f4f319755965f3032aa6b)\nc47cf3019dbb5e3399fb2487b6d192552fa2e453565f4f319755965f3032aa6b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:34 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:34.650 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e5bc345e-53ad-4a6d-9855-d97afdb3d3e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:34 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:34.651 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5569112a-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:26:34 compute-0 nova_compute[192810]: 2025-09-30 21:26:34.653 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:34 compute-0 kernel: tap5569112a-90: left promiscuous mode
Sep 30 21:26:34 compute-0 nova_compute[192810]: 2025-09-30 21:26:34.667 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:34 compute-0 nova_compute[192810]: 2025-09-30 21:26:34.674 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:34 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:34.676 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e7845642-b0b8-42dd-9725-4cbaa872e753]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:34 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:34.719 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[aaea4e15-ff4f-49ef-95cc-207a48d5269b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:34 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:34.720 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[bf4eb99f-4add-497c-ac23-35efe5bc067e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:34 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:34.746 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[042a9483-0f54-475e-aab5-624e2f5c1568]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 428654, 'reachable_time': 41095, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228494, 'error': None, 'target': 'ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:34 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:34.749 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:26:34 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:34.749 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[fd6d866c-e5ff-4a2f-af12-501213cee47b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:34 compute-0 systemd[1]: run-netns-ovnmeta\x2d5569112a\x2d9fb3\x2d4151\x2dadd0\x2d95b595cbe309.mount: Deactivated successfully.
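The systemd line above shows the mount unit for the deleted namespace: network namespaces are kept alive as bind mounts under `/run/netns`, and systemd derives the unit name by escaping the mount path (`/` becomes `-`, and any other byte outside `[A-Za-z0-9:_.]` — including the hyphens in the UUID — becomes `\xNN`). A rough sketch of that escaping rule, approximating `systemd-escape --path`:

```python
import string

# Bytes systemd leaves unescaped in unit names (see systemd.unit(5)).
_ALLOWED = set(string.ascii_letters + string.digits + ":_.")


def systemd_escape_path(path: str) -> str:
    """Approximate systemd path escaping: strip surrounding slashes,
    map '/' separators to '-', and escape every other disallowed byte
    (and a leading '.') as lowercase \\xNN."""
    trimmed = path.strip("/")
    out = []
    for i, ch in enumerate(trimmed):
        if ch == "/":
            out.append("-")
        elif ch in _ALLOWED and not (ch == "." and i == 0):
            out.append(ch)
        else:
            out.append("\\x%02x" % ord(ch))
    return "".join(out)
```

Applied to `/run/netns/ovnmeta-5569112a-…`, this yields exactly the `run-netns-ovnmeta\x2d5569112a\x2d….mount` unit name logged when the mount was deactivated.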
Sep 30 21:26:35 compute-0 nova_compute[192810]: 2025-09-30 21:26:35.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:36 compute-0 nova_compute[192810]: 2025-09-30 21:26:36.797 2 DEBUG nova.compute.manager [req-dab5f7ad-b50c-4fac-b769-179913a648fc req-2dc044d2-dd3a-487e-83f3-8d08daab16ab dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Received event network-vif-unplugged-048dc651-e105-4eae-b4c6-92b99b32cd2f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:26:36 compute-0 nova_compute[192810]: 2025-09-30 21:26:36.797 2 DEBUG oslo_concurrency.lockutils [req-dab5f7ad-b50c-4fac-b769-179913a648fc req-2dc044d2-dd3a-487e-83f3-8d08daab16ab dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "87d0ed60-e235-45c5-ae23-1e76a970c679-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:36 compute-0 nova_compute[192810]: 2025-09-30 21:26:36.797 2 DEBUG oslo_concurrency.lockutils [req-dab5f7ad-b50c-4fac-b769-179913a648fc req-2dc044d2-dd3a-487e-83f3-8d08daab16ab dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "87d0ed60-e235-45c5-ae23-1e76a970c679-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:36 compute-0 nova_compute[192810]: 2025-09-30 21:26:36.797 2 DEBUG oslo_concurrency.lockutils [req-dab5f7ad-b50c-4fac-b769-179913a648fc req-2dc044d2-dd3a-487e-83f3-8d08daab16ab dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "87d0ed60-e235-45c5-ae23-1e76a970c679-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:36 compute-0 nova_compute[192810]: 2025-09-30 21:26:36.798 2 DEBUG nova.compute.manager [req-dab5f7ad-b50c-4fac-b769-179913a648fc req-2dc044d2-dd3a-487e-83f3-8d08daab16ab dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] No waiting events found dispatching network-vif-unplugged-048dc651-e105-4eae-b4c6-92b99b32cd2f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:26:36 compute-0 nova_compute[192810]: 2025-09-30 21:26:36.798 2 WARNING nova.compute.manager [req-dab5f7ad-b50c-4fac-b769-179913a648fc req-2dc044d2-dd3a-487e-83f3-8d08daab16ab dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Received unexpected event network-vif-unplugged-048dc651-e105-4eae-b4c6-92b99b32cd2f for instance with vm_state suspended and task_state None.
Sep 30 21:26:36 compute-0 nova_compute[192810]: 2025-09-30 21:26:36.798 2 DEBUG nova.compute.manager [req-dab5f7ad-b50c-4fac-b769-179913a648fc req-2dc044d2-dd3a-487e-83f3-8d08daab16ab dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Received event network-vif-plugged-048dc651-e105-4eae-b4c6-92b99b32cd2f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:26:36 compute-0 nova_compute[192810]: 2025-09-30 21:26:36.798 2 DEBUG oslo_concurrency.lockutils [req-dab5f7ad-b50c-4fac-b769-179913a648fc req-2dc044d2-dd3a-487e-83f3-8d08daab16ab dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "87d0ed60-e235-45c5-ae23-1e76a970c679-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:36 compute-0 nova_compute[192810]: 2025-09-30 21:26:36.799 2 DEBUG oslo_concurrency.lockutils [req-dab5f7ad-b50c-4fac-b769-179913a648fc req-2dc044d2-dd3a-487e-83f3-8d08daab16ab dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "87d0ed60-e235-45c5-ae23-1e76a970c679-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:36 compute-0 nova_compute[192810]: 2025-09-30 21:26:36.799 2 DEBUG oslo_concurrency.lockutils [req-dab5f7ad-b50c-4fac-b769-179913a648fc req-2dc044d2-dd3a-487e-83f3-8d08daab16ab dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "87d0ed60-e235-45c5-ae23-1e76a970c679-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:36 compute-0 nova_compute[192810]: 2025-09-30 21:26:36.799 2 DEBUG nova.compute.manager [req-dab5f7ad-b50c-4fac-b769-179913a648fc req-2dc044d2-dd3a-487e-83f3-8d08daab16ab dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] No waiting events found dispatching network-vif-plugged-048dc651-e105-4eae-b4c6-92b99b32cd2f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:26:36 compute-0 nova_compute[192810]: 2025-09-30 21:26:36.799 2 WARNING nova.compute.manager [req-dab5f7ad-b50c-4fac-b769-179913a648fc req-2dc044d2-dd3a-487e-83f3-8d08daab16ab dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Received unexpected event network-vif-plugged-048dc651-e105-4eae-b4c6-92b99b32cd2f for instance with vm_state suspended and task_state None.
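The "No waiting events found" / "Received unexpected event" pair above reflects nova's waiter registry: external events from Neutron only complete a `threading.Event` that some in-progress operation registered in advance; with the instance suspended and no task running, nothing had registered for `network-vif-unplugged`/`network-vif-plugged`, so the pop finds no waiter. A minimal stdlib sketch of that registry pattern (illustrative names, not nova's actual code):

```python
import threading
from collections import defaultdict


class InstanceEventRegistry:
    """Sketch of the registry behind pop_instance_event: operations
    register a waiter up front; external event delivery pops it. The
    per-instance lock mirrors the '<uuid>-events' lock in the log."""

    def __init__(self):
        self._lock = threading.Lock()
        # instance_uuid -> {event_name: threading.Event}
        self._events = defaultdict(dict)

    def prepare(self, instance_uuid, event_name):
        """Called by an operation that expects an event (e.g. plug)."""
        with self._lock:
            waiter = threading.Event()
            self._events[instance_uuid][event_name] = waiter
            return waiter

    def pop(self, instance_uuid, event_name):
        """Called on event delivery; None means 'no waiting events'."""
        with self._lock:
            return self._events.get(instance_uuid, {}).pop(event_name, None)
```

When `pop` returns `None`, the real manager logs the WARNING seen above instead of signalling anyone.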
Sep 30 21:26:36 compute-0 nova_compute[192810]: 2025-09-30 21:26:36.889 2 DEBUG oslo_concurrency.lockutils [None req-2cf59d88-1fc0-4683-a2be-3b1d032e7dae bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquiring lock "87d0ed60-e235-45c5-ae23-1e76a970c679" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:36 compute-0 nova_compute[192810]: 2025-09-30 21:26:36.889 2 DEBUG oslo_concurrency.lockutils [None req-2cf59d88-1fc0-4683-a2be-3b1d032e7dae bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "87d0ed60-e235-45c5-ae23-1e76a970c679" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:36 compute-0 nova_compute[192810]: 2025-09-30 21:26:36.890 2 DEBUG oslo_concurrency.lockutils [None req-2cf59d88-1fc0-4683-a2be-3b1d032e7dae bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquiring lock "87d0ed60-e235-45c5-ae23-1e76a970c679-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:36 compute-0 nova_compute[192810]: 2025-09-30 21:26:36.890 2 DEBUG oslo_concurrency.lockutils [None req-2cf59d88-1fc0-4683-a2be-3b1d032e7dae bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "87d0ed60-e235-45c5-ae23-1e76a970c679-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:36 compute-0 nova_compute[192810]: 2025-09-30 21:26:36.891 2 DEBUG oslo_concurrency.lockutils [None req-2cf59d88-1fc0-4683-a2be-3b1d032e7dae bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "87d0ed60-e235-45c5-ae23-1e76a970c679-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:36 compute-0 nova_compute[192810]: 2025-09-30 21:26:36.908 2 INFO nova.compute.manager [None req-2cf59d88-1fc0-4683-a2be-3b1d032e7dae bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Terminating instance
Sep 30 21:26:36 compute-0 nova_compute[192810]: 2025-09-30 21:26:36.922 2 DEBUG nova.compute.manager [None req-2cf59d88-1fc0-4683-a2be-3b1d032e7dae bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:26:36 compute-0 nova_compute[192810]: 2025-09-30 21:26:36.929 2 INFO nova.virt.libvirt.driver [-] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Instance destroyed successfully.
Sep 30 21:26:36 compute-0 nova_compute[192810]: 2025-09-30 21:26:36.930 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:36 compute-0 nova_compute[192810]: 2025-09-30 21:26:36.931 2 DEBUG nova.objects.instance [None req-2cf59d88-1fc0-4683-a2be-3b1d032e7dae bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lazy-loading 'resources' on Instance uuid 87d0ed60-e235-45c5-ae23-1e76a970c679 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:26:36 compute-0 nova_compute[192810]: 2025-09-30 21:26:36.946 2 DEBUG nova.virt.libvirt.vif [None req-2cf59d88-1fc0-4683-a2be-3b1d032e7dae bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:26:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1016171574',display_name='tempest-DeleteServersTestJSON-server-1016171574',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1016171574',id=59,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:26:32Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='c4bb94b19ac546f195f1f1f35411cce9',ramdisk_id='',reservation_id='r-ps02fatk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk
='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-314554874',owner_user_name='tempest-DeleteServersTestJSON-314554874-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:26:34Z,user_data=None,user_id='bfe43dba9d03417182dd245d360568e6',uuid=87d0ed60-e235-45c5-ae23-1e76a970c679,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "048dc651-e105-4eae-b4c6-92b99b32cd2f", "address": "fa:16:3e:fd:f5:8a", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap048dc651-e1", "ovs_interfaceid": "048dc651-e105-4eae-b4c6-92b99b32cd2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:26:36 compute-0 nova_compute[192810]: 2025-09-30 21:26:36.947 2 DEBUG nova.network.os_vif_util [None req-2cf59d88-1fc0-4683-a2be-3b1d032e7dae bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Converting VIF {"id": "048dc651-e105-4eae-b4c6-92b99b32cd2f", "address": "fa:16:3e:fd:f5:8a", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap048dc651-e1", "ovs_interfaceid": "048dc651-e105-4eae-b4c6-92b99b32cd2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:26:36 compute-0 nova_compute[192810]: 2025-09-30 21:26:36.947 2 DEBUG nova.network.os_vif_util [None req-2cf59d88-1fc0-4683-a2be-3b1d032e7dae bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fd:f5:8a,bridge_name='br-int',has_traffic_filtering=True,id=048dc651-e105-4eae-b4c6-92b99b32cd2f,network=Network(5569112a-9fb3-4151-add0-95b595cbe309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap048dc651-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:26:36 compute-0 nova_compute[192810]: 2025-09-30 21:26:36.948 2 DEBUG os_vif [None req-2cf59d88-1fc0-4683-a2be-3b1d032e7dae bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:f5:8a,bridge_name='br-int',has_traffic_filtering=True,id=048dc651-e105-4eae-b4c6-92b99b32cd2f,network=Network(5569112a-9fb3-4151-add0-95b595cbe309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap048dc651-e1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:26:36 compute-0 nova_compute[192810]: 2025-09-30 21:26:36.949 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:36 compute-0 nova_compute[192810]: 2025-09-30 21:26:36.949 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap048dc651-e1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
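The `DelPortCommand(..., if_exists=True)` in the transaction above is what makes VIF unplug idempotent: deleting a port that is already gone is a no-op rather than an error, so a retried teardown cannot fail on its own earlier success. A toy command-object sketch of that semantic (names chosen to mirror the log; this is not ovsdbapp's implementation):

```python
class DelPortCommand:
    """Toy version of a queued OVSDB command: deletes a port from a
    bridge's port set, tolerating absence when if_exists is True."""

    def __init__(self, bridge_ports, port, if_exists=True):
        self.bridge_ports = bridge_ports
        self.port = port
        self.if_exists = if_exists
        self.result = None

    def run_idl(self):
        if self.port in self.bridge_ports:
            self.bridge_ports.discard(self.port)
        elif not self.if_exists:
            raise RuntimeError("port %s does not exist" % self.port)
```

Running the command twice against the same bridge state succeeds both times, which is exactly the property the unplug path relies on.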
Sep 30 21:26:36 compute-0 nova_compute[192810]: 2025-09-30 21:26:36.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:36 compute-0 nova_compute[192810]: 2025-09-30 21:26:36.952 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:36 compute-0 nova_compute[192810]: 2025-09-30 21:26:36.954 2 INFO os_vif [None req-2cf59d88-1fc0-4683-a2be-3b1d032e7dae bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:f5:8a,bridge_name='br-int',has_traffic_filtering=True,id=048dc651-e105-4eae-b4c6-92b99b32cd2f,network=Network(5569112a-9fb3-4151-add0-95b595cbe309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap048dc651-e1')
Sep 30 21:26:36 compute-0 nova_compute[192810]: 2025-09-30 21:26:36.955 2 INFO nova.virt.libvirt.driver [None req-2cf59d88-1fc0-4683-a2be-3b1d032e7dae bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Deleting instance files /var/lib/nova/instances/87d0ed60-e235-45c5-ae23-1e76a970c679_del
Sep 30 21:26:36 compute-0 nova_compute[192810]: 2025-09-30 21:26:36.955 2 INFO nova.virt.libvirt.driver [None req-2cf59d88-1fc0-4683-a2be-3b1d032e7dae bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Deletion of /var/lib/nova/instances/87d0ed60-e235-45c5-ae23-1e76a970c679_del complete
Sep 30 21:26:37 compute-0 nova_compute[192810]: 2025-09-30 21:26:37.039 2 INFO nova.compute.manager [None req-2cf59d88-1fc0-4683-a2be-3b1d032e7dae bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Took 0.12 seconds to destroy the instance on the hypervisor.
Sep 30 21:26:37 compute-0 nova_compute[192810]: 2025-09-30 21:26:37.040 2 DEBUG oslo.service.loopingcall [None req-2cf59d88-1fc0-4683-a2be-3b1d032e7dae bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:26:37 compute-0 nova_compute[192810]: 2025-09-30 21:26:37.041 2 DEBUG nova.compute.manager [-] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:26:37 compute-0 nova_compute[192810]: 2025-09-30 21:26:37.041 2 DEBUG nova.network.neutron [-] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:26:37 compute-0 nova_compute[192810]: 2025-09-30 21:26:37.926 2 DEBUG nova.network.neutron [-] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:26:37 compute-0 nova_compute[192810]: 2025-09-30 21:26:37.945 2 INFO nova.compute.manager [-] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Took 0.90 seconds to deallocate network for instance.
Sep 30 21:26:38 compute-0 nova_compute[192810]: 2025-09-30 21:26:38.006 2 DEBUG nova.compute.manager [req-6054900d-b6dc-41ed-95e6-a983db2393ae req-c95f8ad6-8b0b-4e77-a9a1-61e98e723797 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Received event network-vif-deleted-048dc651-e105-4eae-b4c6-92b99b32cd2f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:26:38 compute-0 nova_compute[192810]: 2025-09-30 21:26:38.018 2 DEBUG oslo_concurrency.lockutils [None req-2cf59d88-1fc0-4683-a2be-3b1d032e7dae bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:38 compute-0 nova_compute[192810]: 2025-09-30 21:26:38.018 2 DEBUG oslo_concurrency.lockutils [None req-2cf59d88-1fc0-4683-a2be-3b1d032e7dae bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:38 compute-0 nova_compute[192810]: 2025-09-30 21:26:38.077 2 DEBUG nova.compute.provider_tree [None req-2cf59d88-1fc0-4683-a2be-3b1d032e7dae bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:26:38 compute-0 nova_compute[192810]: 2025-09-30 21:26:38.096 2 DEBUG nova.scheduler.client.report [None req-2cf59d88-1fc0-4683-a2be-3b1d032e7dae bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
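The inventory dict above determines how much the scheduler may place on this node: under standard placement semantics, schedulable capacity per resource class is `(total - reserved) * allocation_ratio`. Worked against the logged values (a sketch of the arithmetic, not placement's code):

```python
def capacity(inv):
    """Schedulable capacity for one inventory record, per the usual
    placement formula: (total - reserved) * allocation_ratio."""
    return int((inv["total"] - inv["reserved"]) * inv["allocation_ratio"])


# Inventory exactly as reported for provider fe423b93-... in the log:
inventory = {
    "VCPU": {"total": 8, "reserved": 0, "allocation_ratio": 4.0},
    "MEMORY_MB": {"total": 7679, "reserved": 512, "allocation_ratio": 1.0},
    "DISK_GB": {"total": 79, "reserved": 1, "allocation_ratio": 0.9},
}
```

So this 8-vCPU host can oversubscribe to 32 VCPUs, offers 7167 MB of schedulable RAM, and undersubscribes disk to 70 GB.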
Sep 30 21:26:38 compute-0 nova_compute[192810]: 2025-09-30 21:26:38.114 2 DEBUG oslo_concurrency.lockutils [None req-2cf59d88-1fc0-4683-a2be-3b1d032e7dae bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.096s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:38 compute-0 nova_compute[192810]: 2025-09-30 21:26:38.134 2 INFO nova.scheduler.client.report [None req-2cf59d88-1fc0-4683-a2be-3b1d032e7dae bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Deleted allocations for instance 87d0ed60-e235-45c5-ae23-1e76a970c679
Sep 30 21:26:38 compute-0 nova_compute[192810]: 2025-09-30 21:26:38.231 2 DEBUG oslo_concurrency.lockutils [None req-2cf59d88-1fc0-4683-a2be-3b1d032e7dae bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "87d0ed60-e235-45c5-ae23-1e76a970c679" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.342s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:38 compute-0 podman[228496]: 2025-09-30 21:26:38.376535713 +0000 UTC m=+0.097105845 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ceilometer_agent_compute)
Sep 30 21:26:38 compute-0 podman[228495]: 2025-09-30 21:26:38.403735218 +0000 UTC m=+0.123493930 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:26:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:38.731 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:38.731 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:38.731 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
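The recurring `waited N.NNNs` / `held N.NNNs` pairs throughout this log come from oslo_concurrency's lock wrapper, which times both the wait for acquisition and the critical section. A rough stdlib sketch of that accounting (message format only approximates lockutils' output):

```python
import threading
import time
from contextlib import contextmanager


@contextmanager
def timed_lock(lock, name, log=print):
    """Acquire `lock`, logging how long we waited for it and, on exit,
    how long we held it -- the pattern behind the lockutils lines."""
    t_wait = time.monotonic()
    with lock:
        waited = time.monotonic() - t_wait
        log('Lock "%s" acquired :: waited %.3fs' % (name, waited))
        t_held = time.monotonic()
        try:
            yield
        finally:
            held = time.monotonic() - t_held
            log('Lock "%s" "released" :: held %.3fs' % (name, held))
```

Near-zero `waited` values (as in almost every line above) indicate the locks are uncontended; the 1.342s `held` on the terminate lock is the full teardown running inside the critical section.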
Sep 30 21:26:38 compute-0 nova_compute[192810]: 2025-09-30 21:26:38.786 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:26:39 compute-0 nova_compute[192810]: 2025-09-30 21:26:39.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:26:41 compute-0 nova_compute[192810]: 2025-09-30 21:26:41.416 2 DEBUG oslo_concurrency.lockutils [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquiring lock "3391dcee-6677-46e1-bda2-82f72ebee7f2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:41 compute-0 nova_compute[192810]: 2025-09-30 21:26:41.417 2 DEBUG oslo_concurrency.lockutils [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "3391dcee-6677-46e1-bda2-82f72ebee7f2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:41 compute-0 nova_compute[192810]: 2025-09-30 21:26:41.439 2 DEBUG nova.compute.manager [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:26:41 compute-0 nova_compute[192810]: 2025-09-30 21:26:41.695 2 DEBUG oslo_concurrency.lockutils [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:41 compute-0 nova_compute[192810]: 2025-09-30 21:26:41.696 2 DEBUG oslo_concurrency.lockutils [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:41 compute-0 nova_compute[192810]: 2025-09-30 21:26:41.704 2 DEBUG nova.virt.hardware [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:26:41 compute-0 nova_compute[192810]: 2025-09-30 21:26:41.705 2 INFO nova.compute.claims [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:26:41 compute-0 nova_compute[192810]: 2025-09-30 21:26:41.858 2 DEBUG nova.compute.provider_tree [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:26:41 compute-0 nova_compute[192810]: 2025-09-30 21:26:41.873 2 DEBUG nova.scheduler.client.report [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:26:41 compute-0 nova_compute[192810]: 2025-09-30 21:26:41.895 2 DEBUG oslo_concurrency.lockutils [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.199s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:41 compute-0 nova_compute[192810]: 2025-09-30 21:26:41.896 2 DEBUG nova.compute.manager [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:26:41 compute-0 nova_compute[192810]: 2025-09-30 21:26:41.931 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:41 compute-0 nova_compute[192810]: 2025-09-30 21:26:41.941 2 DEBUG nova.compute.manager [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:26:41 compute-0 nova_compute[192810]: 2025-09-30 21:26:41.942 2 DEBUG nova.network.neutron [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:26:41 compute-0 nova_compute[192810]: 2025-09-30 21:26:41.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:41 compute-0 nova_compute[192810]: 2025-09-30 21:26:41.959 2 INFO nova.virt.libvirt.driver [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:26:41 compute-0 nova_compute[192810]: 2025-09-30 21:26:41.973 2 DEBUG nova.compute.manager [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:26:42 compute-0 nova_compute[192810]: 2025-09-30 21:26:42.065 2 DEBUG nova.compute.manager [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:26:42 compute-0 nova_compute[192810]: 2025-09-30 21:26:42.066 2 DEBUG nova.virt.libvirt.driver [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:26:42 compute-0 nova_compute[192810]: 2025-09-30 21:26:42.067 2 INFO nova.virt.libvirt.driver [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Creating image(s)
Sep 30 21:26:42 compute-0 nova_compute[192810]: 2025-09-30 21:26:42.067 2 DEBUG oslo_concurrency.lockutils [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquiring lock "/var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:42 compute-0 nova_compute[192810]: 2025-09-30 21:26:42.068 2 DEBUG oslo_concurrency.lockutils [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "/var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:42 compute-0 nova_compute[192810]: 2025-09-30 21:26:42.068 2 DEBUG oslo_concurrency.lockutils [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "/var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:42 compute-0 nova_compute[192810]: 2025-09-30 21:26:42.079 2 DEBUG oslo_concurrency.processutils [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:26:42 compute-0 nova_compute[192810]: 2025-09-30 21:26:42.135 2 DEBUG oslo_concurrency.processutils [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:26:42 compute-0 nova_compute[192810]: 2025-09-30 21:26:42.136 2 DEBUG oslo_concurrency.lockutils [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:42 compute-0 nova_compute[192810]: 2025-09-30 21:26:42.137 2 DEBUG oslo_concurrency.lockutils [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:42 compute-0 nova_compute[192810]: 2025-09-30 21:26:42.147 2 DEBUG oslo_concurrency.processutils [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:26:42 compute-0 nova_compute[192810]: 2025-09-30 21:26:42.178 2 DEBUG nova.policy [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bfe43dba9d03417182dd245d360568e6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c4bb94b19ac546f195f1f1f35411cce9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:26:42 compute-0 nova_compute[192810]: 2025-09-30 21:26:42.202 2 DEBUG oslo_concurrency.processutils [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:26:42 compute-0 nova_compute[192810]: 2025-09-30 21:26:42.203 2 DEBUG oslo_concurrency.processutils [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:26:42 compute-0 nova_compute[192810]: 2025-09-30 21:26:42.244 2 DEBUG oslo_concurrency.processutils [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:26:42 compute-0 nova_compute[192810]: 2025-09-30 21:26:42.245 2 DEBUG oslo_concurrency.lockutils [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:42 compute-0 nova_compute[192810]: 2025-09-30 21:26:42.246 2 DEBUG oslo_concurrency.processutils [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:26:42 compute-0 nova_compute[192810]: 2025-09-30 21:26:42.305 2 DEBUG oslo_concurrency.processutils [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:26:42 compute-0 nova_compute[192810]: 2025-09-30 21:26:42.306 2 DEBUG nova.virt.disk.api [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Checking if we can resize image /var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:26:42 compute-0 nova_compute[192810]: 2025-09-30 21:26:42.307 2 DEBUG oslo_concurrency.processutils [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:26:42 compute-0 nova_compute[192810]: 2025-09-30 21:26:42.360 2 DEBUG oslo_concurrency.processutils [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:26:42 compute-0 nova_compute[192810]: 2025-09-30 21:26:42.361 2 DEBUG nova.virt.disk.api [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Cannot resize image /var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:26:42 compute-0 nova_compute[192810]: 2025-09-30 21:26:42.361 2 DEBUG nova.objects.instance [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lazy-loading 'migration_context' on Instance uuid 3391dcee-6677-46e1-bda2-82f72ebee7f2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:26:42 compute-0 nova_compute[192810]: 2025-09-30 21:26:42.391 2 DEBUG nova.virt.libvirt.driver [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:26:42 compute-0 nova_compute[192810]: 2025-09-30 21:26:42.392 2 DEBUG nova.virt.libvirt.driver [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Ensure instance console log exists: /var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:26:42 compute-0 nova_compute[192810]: 2025-09-30 21:26:42.392 2 DEBUG oslo_concurrency.lockutils [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:42 compute-0 nova_compute[192810]: 2025-09-30 21:26:42.392 2 DEBUG oslo_concurrency.lockutils [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:42 compute-0 nova_compute[192810]: 2025-09-30 21:26:42.393 2 DEBUG oslo_concurrency.lockutils [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:42 compute-0 nova_compute[192810]: 2025-09-30 21:26:42.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:26:43 compute-0 nova_compute[192810]: 2025-09-30 21:26:43.064 2 DEBUG nova.network.neutron [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Successfully created port: dd77e849-d522-43c2-94ac-03e8228ad770 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:26:43 compute-0 nova_compute[192810]: 2025-09-30 21:26:43.786 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:26:43 compute-0 nova_compute[192810]: 2025-09-30 21:26:43.787 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:26:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:26:43.905 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:26:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:26:43.905 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:26:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:26:43.905 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:26:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:26:43.905 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:26:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:26:43.906 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:26:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:26:43.906 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:26:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:26:43.906 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:26:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:26:43.906 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:26:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:26:43.906 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:26:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:26:43.906 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:26:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:26:43.906 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:26:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:26:43.906 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:26:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:26:43.906 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:26:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:26:43.906 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:26:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:26:43.906 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:26:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:26:43.906 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:26:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:26:43.906 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:26:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:26:43.906 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:26:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:26:43.907 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:26:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:26:43.907 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:26:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:26:43.907 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:26:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:26:43.907 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:26:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:26:43.907 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:26:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:26:43.907 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:26:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:26:43.907 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:26:43 compute-0 nova_compute[192810]: 2025-09-30 21:26:43.994 2 DEBUG nova.network.neutron [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Successfully updated port: dd77e849-d522-43c2-94ac-03e8228ad770 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:26:44 compute-0 nova_compute[192810]: 2025-09-30 21:26:44.010 2 DEBUG oslo_concurrency.lockutils [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquiring lock "refresh_cache-3391dcee-6677-46e1-bda2-82f72ebee7f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:26:44 compute-0 nova_compute[192810]: 2025-09-30 21:26:44.010 2 DEBUG oslo_concurrency.lockutils [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquired lock "refresh_cache-3391dcee-6677-46e1-bda2-82f72ebee7f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:26:44 compute-0 nova_compute[192810]: 2025-09-30 21:26:44.011 2 DEBUG nova.network.neutron [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:26:44 compute-0 nova_compute[192810]: 2025-09-30 21:26:44.100 2 DEBUG nova.compute.manager [req-d839dc74-dac9-4e7e-8079-f5bde6ad247a req-69204d14-820a-4028-9d87-69ca19b81935 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Received event network-changed-dd77e849-d522-43c2-94ac-03e8228ad770 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:26:44 compute-0 nova_compute[192810]: 2025-09-30 21:26:44.101 2 DEBUG nova.compute.manager [req-d839dc74-dac9-4e7e-8079-f5bde6ad247a req-69204d14-820a-4028-9d87-69ca19b81935 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Refreshing instance network info cache due to event network-changed-dd77e849-d522-43c2-94ac-03e8228ad770. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:26:44 compute-0 nova_compute[192810]: 2025-09-30 21:26:44.101 2 DEBUG oslo_concurrency.lockutils [req-d839dc74-dac9-4e7e-8079-f5bde6ad247a req-69204d14-820a-4028-9d87-69ca19b81935 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-3391dcee-6677-46e1-bda2-82f72ebee7f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:26:44 compute-0 nova_compute[192810]: 2025-09-30 21:26:44.227 2 DEBUG nova.network.neutron [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:26:44 compute-0 podman[228550]: 2025-09-30 21:26:44.318211764 +0000 UTC m=+0.060078704 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.158 2 DEBUG nova.network.neutron [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Updating instance_info_cache with network_info: [{"id": "dd77e849-d522-43c2-94ac-03e8228ad770", "address": "fa:16:3e:7b:d4:2d", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd77e849-d5", "ovs_interfaceid": "dd77e849-d522-43c2-94ac-03e8228ad770", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.185 2 DEBUG oslo_concurrency.lockutils [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Releasing lock "refresh_cache-3391dcee-6677-46e1-bda2-82f72ebee7f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.186 2 DEBUG nova.compute.manager [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Instance network_info: |[{"id": "dd77e849-d522-43c2-94ac-03e8228ad770", "address": "fa:16:3e:7b:d4:2d", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd77e849-d5", "ovs_interfaceid": "dd77e849-d522-43c2-94ac-03e8228ad770", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.186 2 DEBUG oslo_concurrency.lockutils [req-d839dc74-dac9-4e7e-8079-f5bde6ad247a req-69204d14-820a-4028-9d87-69ca19b81935 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-3391dcee-6677-46e1-bda2-82f72ebee7f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.186 2 DEBUG nova.network.neutron [req-d839dc74-dac9-4e7e-8079-f5bde6ad247a req-69204d14-820a-4028-9d87-69ca19b81935 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Refreshing network info cache for port dd77e849-d522-43c2-94ac-03e8228ad770 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.188 2 DEBUG nova.virt.libvirt.driver [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Start _get_guest_xml network_info=[{"id": "dd77e849-d522-43c2-94ac-03e8228ad770", "address": "fa:16:3e:7b:d4:2d", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd77e849-d5", "ovs_interfaceid": "dd77e849-d522-43c2-94ac-03e8228ad770", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.194 2 WARNING nova.virt.libvirt.driver [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.197 2 DEBUG nova.virt.libvirt.host [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.198 2 DEBUG nova.virt.libvirt.host [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.200 2 DEBUG nova.virt.libvirt.host [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.200 2 DEBUG nova.virt.libvirt.host [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.201 2 DEBUG nova.virt.libvirt.driver [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.201 2 DEBUG nova.virt.hardware [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.202 2 DEBUG nova.virt.hardware [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.202 2 DEBUG nova.virt.hardware [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.202 2 DEBUG nova.virt.hardware [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.202 2 DEBUG nova.virt.hardware [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.203 2 DEBUG nova.virt.hardware [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.203 2 DEBUG nova.virt.hardware [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.203 2 DEBUG nova.virt.hardware [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.203 2 DEBUG nova.virt.hardware [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.203 2 DEBUG nova.virt.hardware [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.204 2 DEBUG nova.virt.hardware [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.207 2 DEBUG nova.virt.libvirt.vif [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:26:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1431604073',display_name='tempest-DeleteServersTestJSON-server-1431604073',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1431604073',id=61,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4bb94b19ac546f195f1f1f35411cce9',ramdisk_id='',reservation_id='r-fu0pukiu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-314554874',owner_user_name='tempest-DeleteServersTestJSON-314554874-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:26:41Z,user_data=None,user_id='bfe43dba9d03417182dd245d360568e6',uuid=3391dcee-6677-46e1-bda2-82f72ebee7f2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dd77e849-d522-43c2-94ac-03e8228ad770", "address": "fa:16:3e:7b:d4:2d", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd77e849-d5", "ovs_interfaceid": "dd77e849-d522-43c2-94ac-03e8228ad770", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.207 2 DEBUG nova.network.os_vif_util [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Converting VIF {"id": "dd77e849-d522-43c2-94ac-03e8228ad770", "address": "fa:16:3e:7b:d4:2d", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd77e849-d5", "ovs_interfaceid": "dd77e849-d522-43c2-94ac-03e8228ad770", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.208 2 DEBUG nova.network.os_vif_util [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7b:d4:2d,bridge_name='br-int',has_traffic_filtering=True,id=dd77e849-d522-43c2-94ac-03e8228ad770,network=Network(5569112a-9fb3-4151-add0-95b595cbe309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd77e849-d5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.208 2 DEBUG nova.objects.instance [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3391dcee-6677-46e1-bda2-82f72ebee7f2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.227 2 DEBUG nova.virt.libvirt.driver [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:26:45 compute-0 nova_compute[192810]:   <uuid>3391dcee-6677-46e1-bda2-82f72ebee7f2</uuid>
Sep 30 21:26:45 compute-0 nova_compute[192810]:   <name>instance-0000003d</name>
Sep 30 21:26:45 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:26:45 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:26:45 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:26:45 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:       <nova:name>tempest-DeleteServersTestJSON-server-1431604073</nova:name>
Sep 30 21:26:45 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:26:45</nova:creationTime>
Sep 30 21:26:45 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:26:45 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:26:45 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:26:45 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:26:45 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:26:45 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:26:45 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:26:45 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:26:45 compute-0 nova_compute[192810]:         <nova:user uuid="bfe43dba9d03417182dd245d360568e6">tempest-DeleteServersTestJSON-314554874-project-member</nova:user>
Sep 30 21:26:45 compute-0 nova_compute[192810]:         <nova:project uuid="c4bb94b19ac546f195f1f1f35411cce9">tempest-DeleteServersTestJSON-314554874</nova:project>
Sep 30 21:26:45 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:26:45 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:26:45 compute-0 nova_compute[192810]:         <nova:port uuid="dd77e849-d522-43c2-94ac-03e8228ad770">
Sep 30 21:26:45 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:26:45 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:26:45 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:26:45 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:26:45 compute-0 nova_compute[192810]:     <system>
Sep 30 21:26:45 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:26:45 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:26:45 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:26:45 compute-0 nova_compute[192810]:       <entry name="serial">3391dcee-6677-46e1-bda2-82f72ebee7f2</entry>
Sep 30 21:26:45 compute-0 nova_compute[192810]:       <entry name="uuid">3391dcee-6677-46e1-bda2-82f72ebee7f2</entry>
Sep 30 21:26:45 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     </system>
Sep 30 21:26:45 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:26:45 compute-0 nova_compute[192810]:   <os>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:   </os>
Sep 30 21:26:45 compute-0 nova_compute[192810]:   <features>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:   </features>
Sep 30 21:26:45 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:26:45 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:26:45 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:26:45 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:26:45 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:26:45 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2/disk"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:26:45 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2/disk.config"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:26:45 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:7b:d4:2d"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:       <target dev="tapdd77e849-d5"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:26:45 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2/console.log" append="off"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     <video>
Sep 30 21:26:45 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     </video>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:26:45 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:26:45 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:26:45 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:26:45 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:26:45 compute-0 nova_compute[192810]: </domain>
Sep 30 21:26:45 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.229 2 DEBUG nova.compute.manager [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Preparing to wait for external event network-vif-plugged-dd77e849-d522-43c2-94ac-03e8228ad770 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.229 2 DEBUG oslo_concurrency.lockutils [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquiring lock "3391dcee-6677-46e1-bda2-82f72ebee7f2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.229 2 DEBUG oslo_concurrency.lockutils [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "3391dcee-6677-46e1-bda2-82f72ebee7f2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.229 2 DEBUG oslo_concurrency.lockutils [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "3391dcee-6677-46e1-bda2-82f72ebee7f2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.230 2 DEBUG nova.virt.libvirt.vif [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:26:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1431604073',display_name='tempest-DeleteServersTestJSON-server-1431604073',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1431604073',id=61,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c4bb94b19ac546f195f1f1f35411cce9',ramdisk_id='',reservation_id='r-fu0pukiu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-314554874',owner_user_name='tempest-DeleteServersTestJ
SON-314554874-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:26:41Z,user_data=None,user_id='bfe43dba9d03417182dd245d360568e6',uuid=3391dcee-6677-46e1-bda2-82f72ebee7f2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dd77e849-d522-43c2-94ac-03e8228ad770", "address": "fa:16:3e:7b:d4:2d", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd77e849-d5", "ovs_interfaceid": "dd77e849-d522-43c2-94ac-03e8228ad770", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.230 2 DEBUG nova.network.os_vif_util [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Converting VIF {"id": "dd77e849-d522-43c2-94ac-03e8228ad770", "address": "fa:16:3e:7b:d4:2d", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd77e849-d5", "ovs_interfaceid": "dd77e849-d522-43c2-94ac-03e8228ad770", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.231 2 DEBUG nova.network.os_vif_util [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7b:d4:2d,bridge_name='br-int',has_traffic_filtering=True,id=dd77e849-d522-43c2-94ac-03e8228ad770,network=Network(5569112a-9fb3-4151-add0-95b595cbe309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd77e849-d5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.231 2 DEBUG os_vif [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7b:d4:2d,bridge_name='br-int',has_traffic_filtering=True,id=dd77e849-d522-43c2-94ac-03e8228ad770,network=Network(5569112a-9fb3-4151-add0-95b595cbe309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd77e849-d5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.232 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.232 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.234 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdd77e849-d5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.235 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdd77e849-d5, col_values=(('external_ids', {'iface-id': 'dd77e849-d522-43c2-94ac-03e8228ad770', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7b:d4:2d', 'vm-uuid': '3391dcee-6677-46e1-bda2-82f72ebee7f2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.236 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:45 compute-0 NetworkManager[51733]: <info>  [1759267605.2371] manager: (tapdd77e849-d5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/94)
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.241 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.241 2 INFO os_vif [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7b:d4:2d,bridge_name='br-int',has_traffic_filtering=True,id=dd77e849-d522-43c2-94ac-03e8228ad770,network=Network(5569112a-9fb3-4151-add0-95b595cbe309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd77e849-d5')
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.313 2 DEBUG nova.virt.libvirt.driver [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.314 2 DEBUG nova.virt.libvirt.driver [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.314 2 DEBUG nova.virt.libvirt.driver [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] No VIF found with MAC fa:16:3e:7b:d4:2d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.314 2 INFO nova.virt.libvirt.driver [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Using config drive
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.723 2 INFO nova.virt.libvirt.driver [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Creating config drive at /var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2/disk.config
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.728 2 DEBUG oslo_concurrency.processutils [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpv0lbjw_j execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.783 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.817 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.817 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.818 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.818 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.851 2 DEBUG oslo_concurrency.processutils [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpv0lbjw_j" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:26:45 compute-0 kernel: tapdd77e849-d5: entered promiscuous mode
Sep 30 21:26:45 compute-0 NetworkManager[51733]: <info>  [1759267605.9053] manager: (tapdd77e849-d5): new Tun device (/org/freedesktop/NetworkManager/Devices/95)
Sep 30 21:26:45 compute-0 ovn_controller[94912]: 2025-09-30T21:26:45Z|00215|binding|INFO|Claiming lport dd77e849-d522-43c2-94ac-03e8228ad770 for this chassis.
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:45 compute-0 ovn_controller[94912]: 2025-09-30T21:26:45Z|00216|binding|INFO|dd77e849-d522-43c2-94ac-03e8228ad770: Claiming fa:16:3e:7b:d4:2d 10.100.0.7
Sep 30 21:26:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:45.913 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7b:d4:2d 10.100.0.7'], port_security=['fa:16:3e:7b:d4:2d 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '3391dcee-6677-46e1-bda2-82f72ebee7f2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5569112a-9fb3-4151-add0-95b595cbe309', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4bb94b19ac546f195f1f1f35411cce9', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b0c9a27b-9b95-41a1-8c38-505b25881a53', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ea8c6d09-2e51-451b-abc3-a852f19b487a, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=dd77e849-d522-43c2-94ac-03e8228ad770) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:26:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:45.915 103867 INFO neutron.agent.ovn.metadata.agent [-] Port dd77e849-d522-43c2-94ac-03e8228ad770 in datapath 5569112a-9fb3-4151-add0-95b595cbe309 bound to our chassis
Sep 30 21:26:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:45.916 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5569112a-9fb3-4151-add0-95b595cbe309
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.919 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:45 compute-0 ovn_controller[94912]: 2025-09-30T21:26:45Z|00217|binding|INFO|Setting lport dd77e849-d522-43c2-94ac-03e8228ad770 ovn-installed in OVS
Sep 30 21:26:45 compute-0 ovn_controller[94912]: 2025-09-30T21:26:45Z|00218|binding|INFO|Setting lport dd77e849-d522-43c2-94ac-03e8228ad770 up in Southbound
Sep 30 21:26:45 compute-0 nova_compute[192810]: 2025-09-30 21:26:45.925 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:45.927 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[84696c8c-687f-41ed-8f83-9cacbd96eec2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:45.928 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5569112a-91 in ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:26:45 compute-0 systemd-udevd[228590]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:26:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:45.932 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5569112a-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:26:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:45.932 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[bbf21ff2-c96f-45da-92d6-ae69430411fb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:45.933 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[0ff1156e-35ff-48bb-932c-49ca21ad8bf4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:45 compute-0 NetworkManager[51733]: <info>  [1759267605.9430] device (tapdd77e849-d5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:26:45 compute-0 NetworkManager[51733]: <info>  [1759267605.9440] device (tapdd77e849-d5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:26:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:45.942 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[7ec94462-b04d-42ad-bb12-fef9c56edd28]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:45 compute-0 systemd-machined[152794]: New machine qemu-29-instance-0000003d.
Sep 30 21:26:45 compute-0 systemd[1]: Started Virtual Machine qemu-29-instance-0000003d.
Sep 30 21:26:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:45.955 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[22dc7cf1-193f-46b5-8e7e-e137f71ca737]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:45.980 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[311355c4-5ae8-45e9-bfd8-b083fd654eec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:45.985 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[58c62e64-fe56-4a57-bfb2-ab831bf53826]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:45 compute-0 systemd-udevd[228595]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:26:45 compute-0 NetworkManager[51733]: <info>  [1759267605.9861] manager: (tap5569112a-90): new Veth device (/org/freedesktop/NetworkManager/Devices/96)
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.007 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:26:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:46.013 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[e018bcb0-f720-46e7-9b9b-65c1cdcfaccc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:46.018 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[aa16ebb5-cf9d-462b-ba22-a61e60681e9f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:46 compute-0 NetworkManager[51733]: <info>  [1759267606.0366] device (tap5569112a-90): carrier: link connected
Sep 30 21:26:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:46.041 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[751660c9-84e7-42cd-84f5-278c72231674]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:46.056 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[3a239de9-c42a-4035-839f-07f407f73827]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5569112a-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b2:01:ab'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 430166, 'reachable_time': 21630, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228625, 'error': None, 'target': 'ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.063 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.064 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:26:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:46.070 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e41cff23-797f-46b2-b50e-3194d1035762]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb2:1ab'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 430166, 'tstamp': 430166}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228627, 'error': None, 'target': 'ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:46.082 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[bcb2fe3d-b50c-4289-bac5-78b40018d55f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5569112a-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b2:01:ab'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 430166, 'reachable_time': 21630, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 228630, 'error': None, 'target': 'ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:46.106 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[86db1718-1560-4c6c-ad1e-3c2717359857]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.117 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:26:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:46.160 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[98838d26-45ad-4802-b555-3fbb6fcde35f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:46.161 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5569112a-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:26:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:46.161 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:26:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:46.162 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5569112a-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:26:46 compute-0 kernel: tap5569112a-90: entered promiscuous mode
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.163 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:46 compute-0 NetworkManager[51733]: <info>  [1759267606.1638] manager: (tap5569112a-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/97)
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:46.166 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5569112a-90, col_values=(('external_ids', {'iface-id': 'af49dc17-c7c9-4524-8791-14107f2ff34d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.167 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:46 compute-0 ovn_controller[94912]: 2025-09-30T21:26:46Z|00219|binding|INFO|Releasing lport af49dc17-c7c9-4524-8791-14107f2ff34d from this chassis (sb_readonly=0)
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:46.180 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5569112a-9fb3-4151-add0-95b595cbe309.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5569112a-9fb3-4151-add0-95b595cbe309.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:26:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:46.181 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[7ed43a0b-0b7e-4acf-9f92-fc6f44910f8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:26:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:46.182 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:26:46 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:26:46 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:26:46 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-5569112a-9fb3-4151-add0-95b595cbe309
Sep 30 21:26:46 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:26:46 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:26:46 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:26:46 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/5569112a-9fb3-4151-add0-95b595cbe309.pid.haproxy
Sep 30 21:26:46 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:26:46 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:26:46 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:26:46 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:26:46 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:26:46 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:26:46 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:26:46 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:26:46 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:26:46 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:26:46 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:26:46 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:26:46 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:26:46 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:26:46 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:26:46 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:26:46 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:26:46 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:26:46 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:26:46 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:26:46 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID 5569112a-9fb3-4151-add0-95b595cbe309
Sep 30 21:26:46 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:26:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:26:46.183 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309', 'env', 'PROCESS_TAG=haproxy-5569112a-9fb3-4151-add0-95b595cbe309', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5569112a-9fb3-4151-add0-95b595cbe309.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.238 2 DEBUG nova.compute.manager [req-aef26668-cc66-4edb-865a-4b774488dfa5 req-04b80b29-658f-4598-848f-1d452f896cc5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Received event network-vif-plugged-dd77e849-d522-43c2-94ac-03e8228ad770 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.239 2 DEBUG oslo_concurrency.lockutils [req-aef26668-cc66-4edb-865a-4b774488dfa5 req-04b80b29-658f-4598-848f-1d452f896cc5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3391dcee-6677-46e1-bda2-82f72ebee7f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.239 2 DEBUG oslo_concurrency.lockutils [req-aef26668-cc66-4edb-865a-4b774488dfa5 req-04b80b29-658f-4598-848f-1d452f896cc5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3391dcee-6677-46e1-bda2-82f72ebee7f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.239 2 DEBUG oslo_concurrency.lockutils [req-aef26668-cc66-4edb-865a-4b774488dfa5 req-04b80b29-658f-4598-848f-1d452f896cc5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3391dcee-6677-46e1-bda2-82f72ebee7f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.240 2 DEBUG nova.compute.manager [req-aef26668-cc66-4edb-865a-4b774488dfa5 req-04b80b29-658f-4598-848f-1d452f896cc5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Processing event network-vif-plugged-dd77e849-d522-43c2-94ac-03e8228ad770 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.271 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.272 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5686MB free_disk=73.35204696655273GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.272 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.273 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.345 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Instance 3391dcee-6677-46e1-bda2-82f72ebee7f2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.346 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.346 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.364 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Refreshing inventories for resource provider fe423b93-de5a-41f7-97d1-9622ea46af54 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.388 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Updating ProviderTree inventory for provider fe423b93-de5a-41f7-97d1-9622ea46af54 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.389 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Updating inventory in ProviderTree for provider fe423b93-de5a-41f7-97d1-9622ea46af54 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.401 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Refreshing aggregate associations for resource provider fe423b93-de5a-41f7-97d1-9622ea46af54, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.435 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Refreshing trait associations for resource provider fe423b93-de5a-41f7-97d1-9622ea46af54, traits: COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_RESCUE_BFV,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_1_2,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE42,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.475 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.491 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.512 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.512 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.240s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:46 compute-0 podman[228670]: 2025-09-30 21:26:46.528695438 +0000 UTC m=+0.055063658 container create 77da00d6d725ccda61263b086b4d3b5c79884f97e203e50a91445c4f9132f21f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Sep 30 21:26:46 compute-0 systemd[1]: Started libpod-conmon-77da00d6d725ccda61263b086b4d3b5c79884f97e203e50a91445c4f9132f21f.scope.
Sep 30 21:26:46 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:26:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/41130f71097117aebf98c0e60e12c474f97d41aad77001c9585480fd2affd1ba/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:26:46 compute-0 podman[228670]: 2025-09-30 21:26:46.494818374 +0000 UTC m=+0.021186644 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:26:46 compute-0 podman[228670]: 2025-09-30 21:26:46.601674286 +0000 UTC m=+0.128042536 container init 77da00d6d725ccda61263b086b4d3b5c79884f97e203e50a91445c4f9132f21f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20250923)
Sep 30 21:26:46 compute-0 podman[228670]: 2025-09-30 21:26:46.608634481 +0000 UTC m=+0.135002701 container start 77da00d6d725ccda61263b086b4d3b5c79884f97e203e50a91445c4f9132f21f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, tcib_managed=true)
Sep 30 21:26:46 compute-0 neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309[228684]: [NOTICE]   (228689) : New worker (228691) forked
Sep 30 21:26:46 compute-0 neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309[228684]: [NOTICE]   (228689) : Loading success.
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.682 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267606.6822872, 3391dcee-6677-46e1-bda2-82f72ebee7f2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.683 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] VM Started (Lifecycle Event)
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.684 2 DEBUG nova.compute.manager [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.689 2 DEBUG nova.virt.libvirt.driver [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.691 2 INFO nova.virt.libvirt.driver [-] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Instance spawned successfully.
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.692 2 DEBUG nova.virt.libvirt.driver [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.711 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.716 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.719 2 DEBUG nova.virt.libvirt.driver [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.720 2 DEBUG nova.virt.libvirt.driver [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.720 2 DEBUG nova.virt.libvirt.driver [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.721 2 DEBUG nova.virt.libvirt.driver [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.721 2 DEBUG nova.virt.libvirt.driver [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.722 2 DEBUG nova.virt.libvirt.driver [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.755 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.756 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267606.682935, 3391dcee-6677-46e1-bda2-82f72ebee7f2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.756 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] VM Paused (Lifecycle Event)
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.758 2 DEBUG nova.network.neutron [req-d839dc74-dac9-4e7e-8079-f5bde6ad247a req-69204d14-820a-4028-9d87-69ca19b81935 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Updated VIF entry in instance network info cache for port dd77e849-d522-43c2-94ac-03e8228ad770. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.758 2 DEBUG nova.network.neutron [req-d839dc74-dac9-4e7e-8079-f5bde6ad247a req-69204d14-820a-4028-9d87-69ca19b81935 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Updating instance_info_cache with network_info: [{"id": "dd77e849-d522-43c2-94ac-03e8228ad770", "address": "fa:16:3e:7b:d4:2d", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd77e849-d5", "ovs_interfaceid": "dd77e849-d522-43c2-94ac-03e8228ad770", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.787 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.788 2 DEBUG oslo_concurrency.lockutils [req-d839dc74-dac9-4e7e-8079-f5bde6ad247a req-69204d14-820a-4028-9d87-69ca19b81935 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-3391dcee-6677-46e1-bda2-82f72ebee7f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.790 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267606.6863363, 3391dcee-6677-46e1-bda2-82f72ebee7f2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.790 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] VM Resumed (Lifecycle Event)
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.810 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.813 2 INFO nova.compute.manager [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Took 4.75 seconds to spawn the instance on the hypervisor.
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.813 2 DEBUG nova.compute.manager [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.814 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.849 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.892 2 INFO nova.compute.manager [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Took 5.39 seconds to build instance.
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.908 2 DEBUG oslo_concurrency.lockutils [None req-ab2d9839-5b1d-46a9-9fc0-c995af5f90c8 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "3391dcee-6677-46e1-bda2-82f72ebee7f2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.491s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:46 compute-0 nova_compute[192810]: 2025-09-30 21:26:46.933 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:47 compute-0 nova_compute[192810]: 2025-09-30 21:26:47.513 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:26:48 compute-0 nova_compute[192810]: 2025-09-30 21:26:48.394 2 DEBUG nova.compute.manager [req-36ff00f1-8daa-4223-96fd-d70a1b8c4679 req-f83e9485-d6d7-42cf-96ef-2ede48d8741d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Received event network-vif-plugged-dd77e849-d522-43c2-94ac-03e8228ad770 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:26:48 compute-0 nova_compute[192810]: 2025-09-30 21:26:48.394 2 DEBUG oslo_concurrency.lockutils [req-36ff00f1-8daa-4223-96fd-d70a1b8c4679 req-f83e9485-d6d7-42cf-96ef-2ede48d8741d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3391dcee-6677-46e1-bda2-82f72ebee7f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:26:48 compute-0 nova_compute[192810]: 2025-09-30 21:26:48.394 2 DEBUG oslo_concurrency.lockutils [req-36ff00f1-8daa-4223-96fd-d70a1b8c4679 req-f83e9485-d6d7-42cf-96ef-2ede48d8741d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3391dcee-6677-46e1-bda2-82f72ebee7f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:26:48 compute-0 nova_compute[192810]: 2025-09-30 21:26:48.394 2 DEBUG oslo_concurrency.lockutils [req-36ff00f1-8daa-4223-96fd-d70a1b8c4679 req-f83e9485-d6d7-42cf-96ef-2ede48d8741d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3391dcee-6677-46e1-bda2-82f72ebee7f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:26:48 compute-0 nova_compute[192810]: 2025-09-30 21:26:48.395 2 DEBUG nova.compute.manager [req-36ff00f1-8daa-4223-96fd-d70a1b8c4679 req-f83e9485-d6d7-42cf-96ef-2ede48d8741d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] No waiting events found dispatching network-vif-plugged-dd77e849-d522-43c2-94ac-03e8228ad770 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:26:48 compute-0 nova_compute[192810]: 2025-09-30 21:26:48.395 2 WARNING nova.compute.manager [req-36ff00f1-8daa-4223-96fd-d70a1b8c4679 req-f83e9485-d6d7-42cf-96ef-2ede48d8741d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Received unexpected event network-vif-plugged-dd77e849-d522-43c2-94ac-03e8228ad770 for instance with vm_state active and task_state None.
Sep 30 21:26:48 compute-0 nova_compute[192810]: 2025-09-30 21:26:48.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:26:48 compute-0 nova_compute[192810]: 2025-09-30 21:26:48.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:26:48 compute-0 nova_compute[192810]: 2025-09-30 21:26:48.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:26:49 compute-0 nova_compute[192810]: 2025-09-30 21:26:49.038 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "refresh_cache-3391dcee-6677-46e1-bda2-82f72ebee7f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:26:49 compute-0 nova_compute[192810]: 2025-09-30 21:26:49.038 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquired lock "refresh_cache-3391dcee-6677-46e1-bda2-82f72ebee7f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:26:49 compute-0 nova_compute[192810]: 2025-09-30 21:26:49.038 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Sep 30 21:26:49 compute-0 nova_compute[192810]: 2025-09-30 21:26:49.038 2 DEBUG nova.objects.instance [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lazy-loading 'info_cache' on Instance uuid 3391dcee-6677-46e1-bda2-82f72ebee7f2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:26:49 compute-0 nova_compute[192810]: 2025-09-30 21:26:49.508 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267594.5063, 87d0ed60-e235-45c5-ae23-1e76a970c679 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:26:49 compute-0 nova_compute[192810]: 2025-09-30 21:26:49.508 2 INFO nova.compute.manager [-] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] VM Stopped (Lifecycle Event)
Sep 30 21:26:49 compute-0 nova_compute[192810]: 2025-09-30 21:26:49.534 2 DEBUG nova.compute.manager [None req-0a2fcc1d-217d-439e-a38b-4916aa7978e0 - - - - - -] [instance: 87d0ed60-e235-45c5-ae23-1e76a970c679] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:26:50 compute-0 nova_compute[192810]: 2025-09-30 21:26:50.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:50 compute-0 nova_compute[192810]: 2025-09-30 21:26:50.570 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Updating instance_info_cache with network_info: [{"id": "dd77e849-d522-43c2-94ac-03e8228ad770", "address": "fa:16:3e:7b:d4:2d", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd77e849-d5", "ovs_interfaceid": "dd77e849-d522-43c2-94ac-03e8228ad770", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:26:50 compute-0 nova_compute[192810]: 2025-09-30 21:26:50.589 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Releasing lock "refresh_cache-3391dcee-6677-46e1-bda2-82f72ebee7f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:26:50 compute-0 nova_compute[192810]: 2025-09-30 21:26:50.590 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Sep 30 21:26:51 compute-0 nova_compute[192810]: 2025-09-30 21:26:51.066 2 DEBUG oslo_concurrency.lockutils [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquiring lock "refresh_cache-3391dcee-6677-46e1-bda2-82f72ebee7f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:26:51 compute-0 nova_compute[192810]: 2025-09-30 21:26:51.067 2 DEBUG oslo_concurrency.lockutils [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquired lock "refresh_cache-3391dcee-6677-46e1-bda2-82f72ebee7f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:26:51 compute-0 nova_compute[192810]: 2025-09-30 21:26:51.067 2 DEBUG nova.network.neutron [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:26:51 compute-0 podman[228700]: 2025-09-30 21:26:51.324778904 +0000 UTC m=+0.055873298 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Sep 30 21:26:51 compute-0 podman[228701]: 2025-09-30 21:26:51.340315705 +0000 UTC m=+0.065853359 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, release=1755695350, com.redhat.component=ubi9-minimal-container, config_id=edpm, distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., vcs-type=git)
Sep 30 21:26:51 compute-0 nova_compute[192810]: 2025-09-30 21:26:51.969 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:52 compute-0 nova_compute[192810]: 2025-09-30 21:26:52.254 2 DEBUG nova.network.neutron [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Updating instance_info_cache with network_info: [{"id": "dd77e849-d522-43c2-94ac-03e8228ad770", "address": "fa:16:3e:7b:d4:2d", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd77e849-d5", "ovs_interfaceid": "dd77e849-d522-43c2-94ac-03e8228ad770", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:26:52 compute-0 nova_compute[192810]: 2025-09-30 21:26:52.271 2 DEBUG oslo_concurrency.lockutils [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Releasing lock "refresh_cache-3391dcee-6677-46e1-bda2-82f72ebee7f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:26:52 compute-0 nova_compute[192810]: 2025-09-30 21:26:52.398 2 DEBUG nova.virt.libvirt.driver [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Sep 30 21:26:52 compute-0 nova_compute[192810]: 2025-09-30 21:26:52.399 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Creating file /var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2/d5a620fb27f84758b720c20d2beb5c5e.tmp on remote host 192.168.122.101 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Sep 30 21:26:52 compute-0 nova_compute[192810]: 2025-09-30 21:26:52.399 2 DEBUG oslo_concurrency.processutils [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2/d5a620fb27f84758b720c20d2beb5c5e.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:26:52 compute-0 nova_compute[192810]: 2025-09-30 21:26:52.912 2 DEBUG oslo_concurrency.processutils [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2/d5a620fb27f84758b720c20d2beb5c5e.tmp" returned: 1 in 0.513s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:26:52 compute-0 nova_compute[192810]: 2025-09-30 21:26:52.913 2 DEBUG oslo_concurrency.processutils [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] 'ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2/d5a620fb27f84758b720c20d2beb5c5e.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Sep 30 21:26:52 compute-0 nova_compute[192810]: 2025-09-30 21:26:52.914 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Creating directory /var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2 on remote host 192.168.122.101 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Sep 30 21:26:52 compute-0 nova_compute[192810]: 2025-09-30 21:26:52.914 2 DEBUG oslo_concurrency.processutils [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:26:53 compute-0 nova_compute[192810]: 2025-09-30 21:26:53.121 2 DEBUG oslo_concurrency.processutils [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2" returned: 0 in 0.207s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:26:53 compute-0 nova_compute[192810]: 2025-09-30 21:26:53.125 2 DEBUG nova.virt.libvirt.driver [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Sep 30 21:26:55 compute-0 nova_compute[192810]: 2025-09-30 21:26:55.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:56 compute-0 nova_compute[192810]: 2025-09-30 21:26:56.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:26:58 compute-0 podman[228758]: 2025-09-30 21:26:58.323389583 +0000 UTC m=+0.055673793 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=iscsid, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=iscsid, io.buildah.version=1.41.3)
Sep 30 21:26:58 compute-0 podman[228759]: 2025-09-30 21:26:58.333276042 +0000 UTC m=+0.061076669 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 21:26:58 compute-0 podman[228757]: 2025-09-30 21:26:58.345017068 +0000 UTC m=+0.069617725 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=multipathd)
Sep 30 21:26:59 compute-0 ovn_controller[94912]: 2025-09-30T21:26:59Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7b:d4:2d 10.100.0.7
Sep 30 21:26:59 compute-0 ovn_controller[94912]: 2025-09-30T21:26:59Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7b:d4:2d 10.100.0.7
Sep 30 21:27:00 compute-0 nova_compute[192810]: 2025-09-30 21:27:00.241 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:01 compute-0 nova_compute[192810]: 2025-09-30 21:27:01.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:03 compute-0 nova_compute[192810]: 2025-09-30 21:27:03.173 2 DEBUG nova.virt.libvirt.driver [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Sep 30 21:27:05 compute-0 nova_compute[192810]: 2025-09-30 21:27:05.243 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:05 compute-0 kernel: tapdd77e849-d5 (unregistering): left promiscuous mode
Sep 30 21:27:05 compute-0 NetworkManager[51733]: <info>  [1759267625.3225] device (tapdd77e849-d5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:27:05 compute-0 nova_compute[192810]: 2025-09-30 21:27:05.330 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:05 compute-0 ovn_controller[94912]: 2025-09-30T21:27:05Z|00220|binding|INFO|Releasing lport dd77e849-d522-43c2-94ac-03e8228ad770 from this chassis (sb_readonly=0)
Sep 30 21:27:05 compute-0 ovn_controller[94912]: 2025-09-30T21:27:05Z|00221|binding|INFO|Setting lport dd77e849-d522-43c2-94ac-03e8228ad770 down in Southbound
Sep 30 21:27:05 compute-0 ovn_controller[94912]: 2025-09-30T21:27:05Z|00222|binding|INFO|Removing iface tapdd77e849-d5 ovn-installed in OVS
Sep 30 21:27:05 compute-0 nova_compute[192810]: 2025-09-30 21:27:05.334 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:05 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:05.343 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7b:d4:2d 10.100.0.7'], port_security=['fa:16:3e:7b:d4:2d 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '3391dcee-6677-46e1-bda2-82f72ebee7f2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5569112a-9fb3-4151-add0-95b595cbe309', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c4bb94b19ac546f195f1f1f35411cce9', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b0c9a27b-9b95-41a1-8c38-505b25881a53', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ea8c6d09-2e51-451b-abc3-a852f19b487a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=dd77e849-d522-43c2-94ac-03e8228ad770) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:27:05 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:05.346 103867 INFO neutron.agent.ovn.metadata.agent [-] Port dd77e849-d522-43c2-94ac-03e8228ad770 in datapath 5569112a-9fb3-4151-add0-95b595cbe309 unbound from our chassis
Sep 30 21:27:05 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:05.348 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5569112a-9fb3-4151-add0-95b595cbe309, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:27:05 compute-0 nova_compute[192810]: 2025-09-30 21:27:05.351 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:05 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:05.351 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[8767cad5-4874-4fe6-9691-bd5a3e4ed9ae]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:05 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:05.352 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309 namespace which is not needed anymore
Sep 30 21:27:05 compute-0 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d0000003d.scope: Deactivated successfully.
Sep 30 21:27:05 compute-0 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d0000003d.scope: Consumed 12.566s CPU time.
Sep 30 21:27:05 compute-0 systemd-machined[152794]: Machine qemu-29-instance-0000003d terminated.
Sep 30 21:27:05 compute-0 neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309[228684]: [NOTICE]   (228689) : haproxy version is 2.8.14-c23fe91
Sep 30 21:27:05 compute-0 neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309[228684]: [NOTICE]   (228689) : path to executable is /usr/sbin/haproxy
Sep 30 21:27:05 compute-0 neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309[228684]: [WARNING]  (228689) : Exiting Master process...
Sep 30 21:27:05 compute-0 neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309[228684]: [ALERT]    (228689) : Current worker (228691) exited with code 143 (Terminated)
Sep 30 21:27:05 compute-0 neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309[228684]: [WARNING]  (228689) : All workers exited. Exiting... (0)
Sep 30 21:27:05 compute-0 systemd[1]: libpod-77da00d6d725ccda61263b086b4d3b5c79884f97e203e50a91445c4f9132f21f.scope: Deactivated successfully.
Sep 30 21:27:05 compute-0 podman[228843]: 2025-09-30 21:27:05.471652231 +0000 UTC m=+0.043794154 container died 77da00d6d725ccda61263b086b4d3b5c79884f97e203e50a91445c4f9132f21f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Sep 30 21:27:05 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-77da00d6d725ccda61263b086b4d3b5c79884f97e203e50a91445c4f9132f21f-userdata-shm.mount: Deactivated successfully.
Sep 30 21:27:05 compute-0 systemd[1]: var-lib-containers-storage-overlay-41130f71097117aebf98c0e60e12c474f97d41aad77001c9585480fd2affd1ba-merged.mount: Deactivated successfully.
Sep 30 21:27:05 compute-0 podman[228843]: 2025-09-30 21:27:05.527793385 +0000 UTC m=+0.099935298 container cleanup 77da00d6d725ccda61263b086b4d3b5c79884f97e203e50a91445c4f9132f21f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923)
Sep 30 21:27:05 compute-0 systemd[1]: libpod-conmon-77da00d6d725ccda61263b086b4d3b5c79884f97e203e50a91445c4f9132f21f.scope: Deactivated successfully.
Sep 30 21:27:05 compute-0 podman[228874]: 2025-09-30 21:27:05.630267216 +0000 UTC m=+0.079634587 container remove 77da00d6d725ccda61263b086b4d3b5c79884f97e203e50a91445c4f9132f21f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Sep 30 21:27:05 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:05.636 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[c224a897-519d-4d9e-a5f4-59574345c7de]: (4, ('Tue Sep 30 09:27:05 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309 (77da00d6d725ccda61263b086b4d3b5c79884f97e203e50a91445c4f9132f21f)\n77da00d6d725ccda61263b086b4d3b5c79884f97e203e50a91445c4f9132f21f\nTue Sep 30 09:27:05 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309 (77da00d6d725ccda61263b086b4d3b5c79884f97e203e50a91445c4f9132f21f)\n77da00d6d725ccda61263b086b4d3b5c79884f97e203e50a91445c4f9132f21f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:05 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:05.638 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[61b0d69e-d77f-45dc-a0f8-1b7d230697f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:05 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:05.639 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5569112a-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:27:05 compute-0 nova_compute[192810]: 2025-09-30 21:27:05.640 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:05 compute-0 kernel: tap5569112a-90: left promiscuous mode
Sep 30 21:27:05 compute-0 nova_compute[192810]: 2025-09-30 21:27:05.655 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:05 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:05.658 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[48032146-c9f8-49b3-9cff-f6daaee49a2d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:05 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:05.685 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[54d5e94e-9d88-4652-9df9-5fdd2d3575a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:05 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:05.686 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[c69f0fdb-51e8-43a5-b177-8d0a34962e3c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:05 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:05.700 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[53af147a-801a-41ae-80de-495f65f20013]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 430159, 'reachable_time': 36461, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228910, 'error': None, 'target': 'ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:05 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:05.704 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5569112a-9fb3-4151-add0-95b595cbe309 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:27:05 compute-0 systemd[1]: run-netns-ovnmeta\x2d5569112a\x2d9fb3\x2d4151\x2dadd0\x2d95b595cbe309.mount: Deactivated successfully.
Sep 30 21:27:05 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:05.704 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[3a1454c2-d436-4496-9055-f569fa6fb06b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:05 compute-0 nova_compute[192810]: 2025-09-30 21:27:05.748 2 DEBUG nova.compute.manager [req-62d626e8-e189-44be-86a2-e0a5eb8b28f1 req-a9949e92-287c-49d1-8f73-1a45037a8a9e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Received event network-vif-unplugged-dd77e849-d522-43c2-94ac-03e8228ad770 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:27:05 compute-0 nova_compute[192810]: 2025-09-30 21:27:05.749 2 DEBUG oslo_concurrency.lockutils [req-62d626e8-e189-44be-86a2-e0a5eb8b28f1 req-a9949e92-287c-49d1-8f73-1a45037a8a9e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3391dcee-6677-46e1-bda2-82f72ebee7f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:27:05 compute-0 nova_compute[192810]: 2025-09-30 21:27:05.749 2 DEBUG oslo_concurrency.lockutils [req-62d626e8-e189-44be-86a2-e0a5eb8b28f1 req-a9949e92-287c-49d1-8f73-1a45037a8a9e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3391dcee-6677-46e1-bda2-82f72ebee7f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:27:05 compute-0 nova_compute[192810]: 2025-09-30 21:27:05.749 2 DEBUG oslo_concurrency.lockutils [req-62d626e8-e189-44be-86a2-e0a5eb8b28f1 req-a9949e92-287c-49d1-8f73-1a45037a8a9e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3391dcee-6677-46e1-bda2-82f72ebee7f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:05 compute-0 nova_compute[192810]: 2025-09-30 21:27:05.749 2 DEBUG nova.compute.manager [req-62d626e8-e189-44be-86a2-e0a5eb8b28f1 req-a9949e92-287c-49d1-8f73-1a45037a8a9e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] No waiting events found dispatching network-vif-unplugged-dd77e849-d522-43c2-94ac-03e8228ad770 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:27:05 compute-0 nova_compute[192810]: 2025-09-30 21:27:05.749 2 WARNING nova.compute.manager [req-62d626e8-e189-44be-86a2-e0a5eb8b28f1 req-a9949e92-287c-49d1-8f73-1a45037a8a9e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Received unexpected event network-vif-unplugged-dd77e849-d522-43c2-94ac-03e8228ad770 for instance with vm_state active and task_state resize_migrating.
Sep 30 21:27:06 compute-0 nova_compute[192810]: 2025-09-30 21:27:06.187 2 INFO nova.virt.libvirt.driver [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Instance shutdown successfully after 13 seconds.
Sep 30 21:27:06 compute-0 nova_compute[192810]: 2025-09-30 21:27:06.196 2 INFO nova.virt.libvirt.driver [-] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Instance destroyed successfully.
Sep 30 21:27:06 compute-0 nova_compute[192810]: 2025-09-30 21:27:06.197 2 DEBUG nova.virt.libvirt.vif [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:26:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1431604073',display_name='tempest-DeleteServersTestJSON-server-1431604073',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1431604073',id=61,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:26:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c4bb94b19ac546f195f1f1f35411cce9',ramdisk_id='',reservation_id='r-fu0pukiu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model
='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-314554874',owner_user_name='tempest-DeleteServersTestJSON-314554874-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:26:50Z,user_data=None,user_id='bfe43dba9d03417182dd245d360568e6',uuid=3391dcee-6677-46e1-bda2-82f72ebee7f2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "dd77e849-d522-43c2-94ac-03e8228ad770", "address": "fa:16:3e:7b:d4:2d", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-250418448-network", "vif_mac": "fa:16:3e:7b:d4:2d"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd77e849-d5", "ovs_interfaceid": "dd77e849-d522-43c2-94ac-03e8228ad770", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:27:06 compute-0 nova_compute[192810]: 2025-09-30 21:27:06.197 2 DEBUG nova.network.os_vif_util [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Converting VIF {"id": "dd77e849-d522-43c2-94ac-03e8228ad770", "address": "fa:16:3e:7b:d4:2d", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-250418448-network", "vif_mac": "fa:16:3e:7b:d4:2d"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd77e849-d5", "ovs_interfaceid": "dd77e849-d522-43c2-94ac-03e8228ad770", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:27:06 compute-0 nova_compute[192810]: 2025-09-30 21:27:06.198 2 DEBUG nova.network.os_vif_util [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7b:d4:2d,bridge_name='br-int',has_traffic_filtering=True,id=dd77e849-d522-43c2-94ac-03e8228ad770,network=Network(5569112a-9fb3-4151-add0-95b595cbe309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd77e849-d5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:27:06 compute-0 nova_compute[192810]: 2025-09-30 21:27:06.199 2 DEBUG os_vif [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7b:d4:2d,bridge_name='br-int',has_traffic_filtering=True,id=dd77e849-d522-43c2-94ac-03e8228ad770,network=Network(5569112a-9fb3-4151-add0-95b595cbe309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd77e849-d5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:27:06 compute-0 nova_compute[192810]: 2025-09-30 21:27:06.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:06 compute-0 nova_compute[192810]: 2025-09-30 21:27:06.200 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdd77e849-d5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:27:06 compute-0 nova_compute[192810]: 2025-09-30 21:27:06.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:06 compute-0 nova_compute[192810]: 2025-09-30 21:27:06.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:27:06 compute-0 nova_compute[192810]: 2025-09-30 21:27:06.207 2 INFO os_vif [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7b:d4:2d,bridge_name='br-int',has_traffic_filtering=True,id=dd77e849-d522-43c2-94ac-03e8228ad770,network=Network(5569112a-9fb3-4151-add0-95b595cbe309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd77e849-d5')
Sep 30 21:27:06 compute-0 nova_compute[192810]: 2025-09-30 21:27:06.211 2 DEBUG oslo_concurrency.processutils [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:27:06 compute-0 nova_compute[192810]: 2025-09-30 21:27:06.269 2 DEBUG oslo_concurrency.processutils [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:27:06 compute-0 nova_compute[192810]: 2025-09-30 21:27:06.271 2 DEBUG oslo_concurrency.processutils [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:27:06 compute-0 sshd-session[228911]: Invalid user cs from 45.81.23.80 port 47492
Sep 30 21:27:06 compute-0 sshd-session[228911]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:27:06 compute-0 sshd-session[228911]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=45.81.23.80
Sep 30 21:27:06 compute-0 nova_compute[192810]: 2025-09-30 21:27:06.327 2 DEBUG oslo_concurrency.processutils [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:27:06 compute-0 nova_compute[192810]: 2025-09-30 21:27:06.329 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Copying file /var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2_resize/disk to 192.168.122.101:/var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Sep 30 21:27:06 compute-0 nova_compute[192810]: 2025-09-30 21:27:06.329 2 DEBUG oslo_concurrency.processutils [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2_resize/disk 192.168.122.101:/var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:27:07 compute-0 nova_compute[192810]: 2025-09-30 21:27:07.020 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:07 compute-0 nova_compute[192810]: 2025-09-30 21:27:07.179 2 DEBUG oslo_concurrency.processutils [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] CMD "scp -r /var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2_resize/disk 192.168.122.101:/var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2/disk" returned: 0 in 0.850s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:27:07 compute-0 nova_compute[192810]: 2025-09-30 21:27:07.179 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Copying file /var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2_resize/disk.config to 192.168.122.101:/var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Sep 30 21:27:07 compute-0 nova_compute[192810]: 2025-09-30 21:27:07.180 2 DEBUG oslo_concurrency.processutils [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2_resize/disk.config 192.168.122.101:/var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:27:07 compute-0 nova_compute[192810]: 2025-09-30 21:27:07.403 2 DEBUG oslo_concurrency.processutils [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] CMD "scp -C -r /var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2_resize/disk.config 192.168.122.101:/var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2/disk.config" returned: 0 in 0.223s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:27:07 compute-0 nova_compute[192810]: 2025-09-30 21:27:07.404 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Copying file /var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2_resize/disk.info to 192.168.122.101:/var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Sep 30 21:27:07 compute-0 nova_compute[192810]: 2025-09-30 21:27:07.404 2 DEBUG oslo_concurrency.processutils [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2_resize/disk.info 192.168.122.101:/var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:27:07 compute-0 nova_compute[192810]: 2025-09-30 21:27:07.616 2 DEBUG oslo_concurrency.processutils [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] CMD "scp -C -r /var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2_resize/disk.info 192.168.122.101:/var/lib/nova/instances/3391dcee-6677-46e1-bda2-82f72ebee7f2/disk.info" returned: 0 in 0.212s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:27:07 compute-0 nova_compute[192810]: 2025-09-30 21:27:07.804 2 DEBUG neutronclient.v2_0.client [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port dd77e849-d522-43c2-94ac-03e8228ad770 for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Sep 30 21:27:07 compute-0 nova_compute[192810]: 2025-09-30 21:27:07.854 2 DEBUG nova.compute.manager [req-d261855a-331b-4134-913d-da018acfaa3d req-a6646880-6ffd-4825-8ee0-7dd268c0e4e8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Received event network-vif-plugged-dd77e849-d522-43c2-94ac-03e8228ad770 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:27:07 compute-0 nova_compute[192810]: 2025-09-30 21:27:07.855 2 DEBUG oslo_concurrency.lockutils [req-d261855a-331b-4134-913d-da018acfaa3d req-a6646880-6ffd-4825-8ee0-7dd268c0e4e8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3391dcee-6677-46e1-bda2-82f72ebee7f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:27:07 compute-0 nova_compute[192810]: 2025-09-30 21:27:07.855 2 DEBUG oslo_concurrency.lockutils [req-d261855a-331b-4134-913d-da018acfaa3d req-a6646880-6ffd-4825-8ee0-7dd268c0e4e8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3391dcee-6677-46e1-bda2-82f72ebee7f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:27:07 compute-0 nova_compute[192810]: 2025-09-30 21:27:07.855 2 DEBUG oslo_concurrency.lockutils [req-d261855a-331b-4134-913d-da018acfaa3d req-a6646880-6ffd-4825-8ee0-7dd268c0e4e8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3391dcee-6677-46e1-bda2-82f72ebee7f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:07 compute-0 nova_compute[192810]: 2025-09-30 21:27:07.855 2 DEBUG nova.compute.manager [req-d261855a-331b-4134-913d-da018acfaa3d req-a6646880-6ffd-4825-8ee0-7dd268c0e4e8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] No waiting events found dispatching network-vif-plugged-dd77e849-d522-43c2-94ac-03e8228ad770 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:27:07 compute-0 nova_compute[192810]: 2025-09-30 21:27:07.855 2 WARNING nova.compute.manager [req-d261855a-331b-4134-913d-da018acfaa3d req-a6646880-6ffd-4825-8ee0-7dd268c0e4e8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Received unexpected event network-vif-plugged-dd77e849-d522-43c2-94ac-03e8228ad770 for instance with vm_state active and task_state resize_migrating.
Sep 30 21:27:07 compute-0 nova_compute[192810]: 2025-09-30 21:27:07.924 2 DEBUG oslo_concurrency.lockutils [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquiring lock "3391dcee-6677-46e1-bda2-82f72ebee7f2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:27:07 compute-0 nova_compute[192810]: 2025-09-30 21:27:07.924 2 DEBUG oslo_concurrency.lockutils [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "3391dcee-6677-46e1-bda2-82f72ebee7f2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:27:07 compute-0 nova_compute[192810]: 2025-09-30 21:27:07.924 2 DEBUG oslo_concurrency.lockutils [None req-b9268017-95c7-4de1-b562-9eae35245d59 bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "3391dcee-6677-46e1-bda2-82f72ebee7f2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:08 compute-0 sshd-session[228911]: Failed password for invalid user cs from 45.81.23.80 port 47492 ssh2
Sep 30 21:27:09 compute-0 podman[228926]: 2025-09-30 21:27:09.332284147 +0000 UTC m=+0.061700185 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Sep 30 21:27:09 compute-0 podman[228925]: 2025-09-30 21:27:09.370777766 +0000 UTC m=+0.101037625 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_controller, org.label-schema.license=GPLv2)
Sep 30 21:27:09 compute-0 nova_compute[192810]: 2025-09-30 21:27:09.936 2 DEBUG nova.compute.manager [req-b0528513-5afb-432f-a0ef-c07fa4e62ffa req-8d2c8b8f-2b4f-451c-bf6d-4dc34948e113 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Received event network-changed-dd77e849-d522-43c2-94ac-03e8228ad770 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:27:09 compute-0 nova_compute[192810]: 2025-09-30 21:27:09.936 2 DEBUG nova.compute.manager [req-b0528513-5afb-432f-a0ef-c07fa4e62ffa req-8d2c8b8f-2b4f-451c-bf6d-4dc34948e113 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Refreshing instance network info cache due to event network-changed-dd77e849-d522-43c2-94ac-03e8228ad770. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:27:09 compute-0 nova_compute[192810]: 2025-09-30 21:27:09.936 2 DEBUG oslo_concurrency.lockutils [req-b0528513-5afb-432f-a0ef-c07fa4e62ffa req-8d2c8b8f-2b4f-451c-bf6d-4dc34948e113 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-3391dcee-6677-46e1-bda2-82f72ebee7f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:27:09 compute-0 nova_compute[192810]: 2025-09-30 21:27:09.936 2 DEBUG oslo_concurrency.lockutils [req-b0528513-5afb-432f-a0ef-c07fa4e62ffa req-8d2c8b8f-2b4f-451c-bf6d-4dc34948e113 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-3391dcee-6677-46e1-bda2-82f72ebee7f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:27:09 compute-0 nova_compute[192810]: 2025-09-30 21:27:09.936 2 DEBUG nova.network.neutron [req-b0528513-5afb-432f-a0ef-c07fa4e62ffa req-8d2c8b8f-2b4f-451c-bf6d-4dc34948e113 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Refreshing network info cache for port dd77e849-d522-43c2-94ac-03e8228ad770 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:27:10 compute-0 sshd-session[228911]: Received disconnect from 45.81.23.80 port 47492:11: Bye Bye [preauth]
Sep 30 21:27:10 compute-0 sshd-session[228911]: Disconnected from invalid user cs 45.81.23.80 port 47492 [preauth]
Sep 30 21:27:11 compute-0 nova_compute[192810]: 2025-09-30 21:27:11.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:11 compute-0 nova_compute[192810]: 2025-09-30 21:27:11.442 2 DEBUG nova.network.neutron [req-b0528513-5afb-432f-a0ef-c07fa4e62ffa req-8d2c8b8f-2b4f-451c-bf6d-4dc34948e113 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Updated VIF entry in instance network info cache for port dd77e849-d522-43c2-94ac-03e8228ad770. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:27:11 compute-0 nova_compute[192810]: 2025-09-30 21:27:11.443 2 DEBUG nova.network.neutron [req-b0528513-5afb-432f-a0ef-c07fa4e62ffa req-8d2c8b8f-2b4f-451c-bf6d-4dc34948e113 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Updating instance_info_cache with network_info: [{"id": "dd77e849-d522-43c2-94ac-03e8228ad770", "address": "fa:16:3e:7b:d4:2d", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd77e849-d5", "ovs_interfaceid": "dd77e849-d522-43c2-94ac-03e8228ad770", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:27:11 compute-0 nova_compute[192810]: 2025-09-30 21:27:11.457 2 DEBUG oslo_concurrency.lockutils [req-b0528513-5afb-432f-a0ef-c07fa4e62ffa req-8d2c8b8f-2b4f-451c-bf6d-4dc34948e113 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-3391dcee-6677-46e1-bda2-82f72ebee7f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:27:12 compute-0 nova_compute[192810]: 2025-09-30 21:27:12.023 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:12 compute-0 nova_compute[192810]: 2025-09-30 21:27:12.055 2 DEBUG nova.compute.manager [req-12483452-e8b8-4a99-a187-908c616a0ad3 req-0b38f5a4-e813-423d-b968-96f59947a324 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Received event network-vif-plugged-dd77e849-d522-43c2-94ac-03e8228ad770 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:27:12 compute-0 nova_compute[192810]: 2025-09-30 21:27:12.055 2 DEBUG oslo_concurrency.lockutils [req-12483452-e8b8-4a99-a187-908c616a0ad3 req-0b38f5a4-e813-423d-b968-96f59947a324 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3391dcee-6677-46e1-bda2-82f72ebee7f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:27:12 compute-0 nova_compute[192810]: 2025-09-30 21:27:12.055 2 DEBUG oslo_concurrency.lockutils [req-12483452-e8b8-4a99-a187-908c616a0ad3 req-0b38f5a4-e813-423d-b968-96f59947a324 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3391dcee-6677-46e1-bda2-82f72ebee7f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:27:12 compute-0 nova_compute[192810]: 2025-09-30 21:27:12.056 2 DEBUG oslo_concurrency.lockutils [req-12483452-e8b8-4a99-a187-908c616a0ad3 req-0b38f5a4-e813-423d-b968-96f59947a324 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3391dcee-6677-46e1-bda2-82f72ebee7f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:12 compute-0 nova_compute[192810]: 2025-09-30 21:27:12.056 2 DEBUG nova.compute.manager [req-12483452-e8b8-4a99-a187-908c616a0ad3 req-0b38f5a4-e813-423d-b968-96f59947a324 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] No waiting events found dispatching network-vif-plugged-dd77e849-d522-43c2-94ac-03e8228ad770 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:27:12 compute-0 nova_compute[192810]: 2025-09-30 21:27:12.056 2 WARNING nova.compute.manager [req-12483452-e8b8-4a99-a187-908c616a0ad3 req-0b38f5a4-e813-423d-b968-96f59947a324 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Received unexpected event network-vif-plugged-dd77e849-d522-43c2-94ac-03e8228ad770 for instance with vm_state active and task_state resize_finish.
Sep 30 21:27:14 compute-0 nova_compute[192810]: 2025-09-30 21:27:14.218 2 DEBUG nova.compute.manager [req-5a7d87ee-93bc-4d1e-8378-0ce27e16fe1d req-8e9b559f-bb5d-4a21-a7a9-f65868b929c0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Received event network-vif-plugged-dd77e849-d522-43c2-94ac-03e8228ad770 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:27:14 compute-0 nova_compute[192810]: 2025-09-30 21:27:14.218 2 DEBUG oslo_concurrency.lockutils [req-5a7d87ee-93bc-4d1e-8378-0ce27e16fe1d req-8e9b559f-bb5d-4a21-a7a9-f65868b929c0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3391dcee-6677-46e1-bda2-82f72ebee7f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:27:14 compute-0 nova_compute[192810]: 2025-09-30 21:27:14.219 2 DEBUG oslo_concurrency.lockutils [req-5a7d87ee-93bc-4d1e-8378-0ce27e16fe1d req-8e9b559f-bb5d-4a21-a7a9-f65868b929c0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3391dcee-6677-46e1-bda2-82f72ebee7f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:27:14 compute-0 nova_compute[192810]: 2025-09-30 21:27:14.219 2 DEBUG oslo_concurrency.lockutils [req-5a7d87ee-93bc-4d1e-8378-0ce27e16fe1d req-8e9b559f-bb5d-4a21-a7a9-f65868b929c0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3391dcee-6677-46e1-bda2-82f72ebee7f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:14 compute-0 nova_compute[192810]: 2025-09-30 21:27:14.219 2 DEBUG nova.compute.manager [req-5a7d87ee-93bc-4d1e-8378-0ce27e16fe1d req-8e9b559f-bb5d-4a21-a7a9-f65868b929c0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] No waiting events found dispatching network-vif-plugged-dd77e849-d522-43c2-94ac-03e8228ad770 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:27:14 compute-0 nova_compute[192810]: 2025-09-30 21:27:14.219 2 WARNING nova.compute.manager [req-5a7d87ee-93bc-4d1e-8378-0ce27e16fe1d req-8e9b559f-bb5d-4a21-a7a9-f65868b929c0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Received unexpected event network-vif-plugged-dd77e849-d522-43c2-94ac-03e8228ad770 for instance with vm_state resized and task_state None.
Sep 30 21:27:14 compute-0 nova_compute[192810]: 2025-09-30 21:27:14.792 2 DEBUG oslo_concurrency.lockutils [None req-847d8ebd-f55b-4e7d-ab61-95bc24bc3def bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquiring lock "3391dcee-6677-46e1-bda2-82f72ebee7f2" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:27:14 compute-0 nova_compute[192810]: 2025-09-30 21:27:14.793 2 DEBUG oslo_concurrency.lockutils [None req-847d8ebd-f55b-4e7d-ab61-95bc24bc3def bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "3391dcee-6677-46e1-bda2-82f72ebee7f2" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:27:14 compute-0 nova_compute[192810]: 2025-09-30 21:27:14.793 2 DEBUG nova.compute.manager [None req-847d8ebd-f55b-4e7d-ab61-95bc24bc3def bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Going to confirm migration 11 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679
Sep 30 21:27:14 compute-0 nova_compute[192810]: 2025-09-30 21:27:14.837 2 DEBUG nova.objects.instance [None req-847d8ebd-f55b-4e7d-ab61-95bc24bc3def bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lazy-loading 'info_cache' on Instance uuid 3391dcee-6677-46e1-bda2-82f72ebee7f2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:27:15 compute-0 nova_compute[192810]: 2025-09-30 21:27:15.275 2 DEBUG neutronclient.v2_0.client [None req-847d8ebd-f55b-4e7d-ab61-95bc24bc3def bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port dd77e849-d522-43c2-94ac-03e8228ad770 for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Sep 30 21:27:15 compute-0 nova_compute[192810]: 2025-09-30 21:27:15.275 2 DEBUG oslo_concurrency.lockutils [None req-847d8ebd-f55b-4e7d-ab61-95bc24bc3def bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquiring lock "refresh_cache-3391dcee-6677-46e1-bda2-82f72ebee7f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:27:15 compute-0 nova_compute[192810]: 2025-09-30 21:27:15.276 2 DEBUG oslo_concurrency.lockutils [None req-847d8ebd-f55b-4e7d-ab61-95bc24bc3def bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquired lock "refresh_cache-3391dcee-6677-46e1-bda2-82f72ebee7f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:27:15 compute-0 nova_compute[192810]: 2025-09-30 21:27:15.276 2 DEBUG nova.network.neutron [None req-847d8ebd-f55b-4e7d-ab61-95bc24bc3def bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:27:15 compute-0 podman[228972]: 2025-09-30 21:27:15.330397507 +0000 UTC m=+0.067826339 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20250923)
Sep 30 21:27:16 compute-0 nova_compute[192810]: 2025-09-30 21:27:16.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:16 compute-0 nova_compute[192810]: 2025-09-30 21:27:16.606 2 DEBUG nova.network.neutron [None req-847d8ebd-f55b-4e7d-ab61-95bc24bc3def bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Updating instance_info_cache with network_info: [{"id": "dd77e849-d522-43c2-94ac-03e8228ad770", "address": "fa:16:3e:7b:d4:2d", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd77e849-d5", "ovs_interfaceid": "dd77e849-d522-43c2-94ac-03e8228ad770", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:27:16 compute-0 nova_compute[192810]: 2025-09-30 21:27:16.632 2 DEBUG oslo_concurrency.lockutils [None req-847d8ebd-f55b-4e7d-ab61-95bc24bc3def bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Releasing lock "refresh_cache-3391dcee-6677-46e1-bda2-82f72ebee7f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:27:16 compute-0 nova_compute[192810]: 2025-09-30 21:27:16.633 2 DEBUG nova.objects.instance [None req-847d8ebd-f55b-4e7d-ab61-95bc24bc3def bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lazy-loading 'migration_context' on Instance uuid 3391dcee-6677-46e1-bda2-82f72ebee7f2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:27:16 compute-0 nova_compute[192810]: 2025-09-30 21:27:16.673 2 DEBUG nova.virt.libvirt.vif [None req-847d8ebd-f55b-4e7d-ab61-95bc24bc3def bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:26:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1431604073',display_name='tempest-DeleteServersTestJSON-server-1431604073',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1431604073',id=61,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:27:12Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c4bb94b19ac546f195f1f1f35411cce9',ramdisk_id='',reservation_id='r-fu0pukiu',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-314554874',owner_user_name='tempest-DeleteServersTestJSON-314554874-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:27:14Z,user_data=None,user_id='bfe43dba9d03417182dd245d360568e6',uuid=3391dcee-6677-46e1-bda2-82f72ebee7f2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "dd77e849-d522-43c2-94ac-03e8228ad770", "address": "fa:16:3e:7b:d4:2d", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd77e849-d5", "ovs_interfaceid": "dd77e849-d522-43c2-94ac-03e8228ad770", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:27:16 compute-0 nova_compute[192810]: 2025-09-30 21:27:16.673 2 DEBUG nova.network.os_vif_util [None req-847d8ebd-f55b-4e7d-ab61-95bc24bc3def bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Converting VIF {"id": "dd77e849-d522-43c2-94ac-03e8228ad770", "address": "fa:16:3e:7b:d4:2d", "network": {"id": "5569112a-9fb3-4151-add0-95b595cbe309", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-250418448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c4bb94b19ac546f195f1f1f35411cce9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd77e849-d5", "ovs_interfaceid": "dd77e849-d522-43c2-94ac-03e8228ad770", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:27:16 compute-0 nova_compute[192810]: 2025-09-30 21:27:16.677 2 DEBUG nova.network.os_vif_util [None req-847d8ebd-f55b-4e7d-ab61-95bc24bc3def bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7b:d4:2d,bridge_name='br-int',has_traffic_filtering=True,id=dd77e849-d522-43c2-94ac-03e8228ad770,network=Network(5569112a-9fb3-4151-add0-95b595cbe309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd77e849-d5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:27:16 compute-0 nova_compute[192810]: 2025-09-30 21:27:16.678 2 DEBUG os_vif [None req-847d8ebd-f55b-4e7d-ab61-95bc24bc3def bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7b:d4:2d,bridge_name='br-int',has_traffic_filtering=True,id=dd77e849-d522-43c2-94ac-03e8228ad770,network=Network(5569112a-9fb3-4151-add0-95b595cbe309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd77e849-d5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:27:16 compute-0 nova_compute[192810]: 2025-09-30 21:27:16.680 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:16 compute-0 nova_compute[192810]: 2025-09-30 21:27:16.681 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdd77e849-d5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:27:16 compute-0 nova_compute[192810]: 2025-09-30 21:27:16.681 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:27:16 compute-0 nova_compute[192810]: 2025-09-30 21:27:16.684 2 INFO os_vif [None req-847d8ebd-f55b-4e7d-ab61-95bc24bc3def bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7b:d4:2d,bridge_name='br-int',has_traffic_filtering=True,id=dd77e849-d522-43c2-94ac-03e8228ad770,network=Network(5569112a-9fb3-4151-add0-95b595cbe309),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd77e849-d5')
Sep 30 21:27:16 compute-0 nova_compute[192810]: 2025-09-30 21:27:16.685 2 DEBUG oslo_concurrency.lockutils [None req-847d8ebd-f55b-4e7d-ab61-95bc24bc3def bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:27:16 compute-0 nova_compute[192810]: 2025-09-30 21:27:16.685 2 DEBUG oslo_concurrency.lockutils [None req-847d8ebd-f55b-4e7d-ab61-95bc24bc3def bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:27:16 compute-0 nova_compute[192810]: 2025-09-30 21:27:16.765 2 DEBUG nova.compute.provider_tree [None req-847d8ebd-f55b-4e7d-ab61-95bc24bc3def bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:27:16 compute-0 nova_compute[192810]: 2025-09-30 21:27:16.782 2 DEBUG nova.scheduler.client.report [None req-847d8ebd-f55b-4e7d-ab61-95bc24bc3def bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:27:16 compute-0 nova_compute[192810]: 2025-09-30 21:27:16.847 2 DEBUG oslo_concurrency.lockutils [None req-847d8ebd-f55b-4e7d-ab61-95bc24bc3def bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.161s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:16 compute-0 nova_compute[192810]: 2025-09-30 21:27:16.994 2 INFO nova.scheduler.client.report [None req-847d8ebd-f55b-4e7d-ab61-95bc24bc3def bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Deleted allocation for migration aa12879e-c6a8-4b86-8b41-bda11bc7ed2a
Sep 30 21:27:17 compute-0 nova_compute[192810]: 2025-09-30 21:27:17.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:17 compute-0 nova_compute[192810]: 2025-09-30 21:27:17.087 2 DEBUG oslo_concurrency.lockutils [None req-847d8ebd-f55b-4e7d-ab61-95bc24bc3def bfe43dba9d03417182dd245d360568e6 c4bb94b19ac546f195f1f1f35411cce9 - - default default] Lock "3391dcee-6677-46e1-bda2-82f72ebee7f2" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 2.295s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:20 compute-0 nova_compute[192810]: 2025-09-30 21:27:20.586 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267625.5852172, 3391dcee-6677-46e1-bda2-82f72ebee7f2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:27:20 compute-0 nova_compute[192810]: 2025-09-30 21:27:20.587 2 INFO nova.compute.manager [-] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] VM Stopped (Lifecycle Event)
Sep 30 21:27:20 compute-0 nova_compute[192810]: 2025-09-30 21:27:20.612 2 DEBUG nova.compute.manager [None req-6dd03629-0138-45ff-960f-6d24811597a2 - - - - - -] [instance: 3391dcee-6677-46e1-bda2-82f72ebee7f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:27:20 compute-0 nova_compute[192810]: 2025-09-30 21:27:20.940 2 DEBUG oslo_concurrency.lockutils [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Acquiring lock "b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:27:20 compute-0 nova_compute[192810]: 2025-09-30 21:27:20.941 2 DEBUG oslo_concurrency.lockutils [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Lock "b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:27:20 compute-0 nova_compute[192810]: 2025-09-30 21:27:20.965 2 DEBUG nova.compute.manager [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:27:21 compute-0 nova_compute[192810]: 2025-09-30 21:27:21.112 2 DEBUG oslo_concurrency.lockutils [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:27:21 compute-0 nova_compute[192810]: 2025-09-30 21:27:21.113 2 DEBUG oslo_concurrency.lockutils [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:27:21 compute-0 nova_compute[192810]: 2025-09-30 21:27:21.119 2 DEBUG nova.virt.hardware [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:27:21 compute-0 nova_compute[192810]: 2025-09-30 21:27:21.119 2 INFO nova.compute.claims [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:27:21 compute-0 nova_compute[192810]: 2025-09-30 21:27:21.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:21 compute-0 nova_compute[192810]: 2025-09-30 21:27:21.308 2 DEBUG nova.compute.provider_tree [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:27:21 compute-0 nova_compute[192810]: 2025-09-30 21:27:21.330 2 DEBUG nova.scheduler.client.report [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:27:21 compute-0 nova_compute[192810]: 2025-09-30 21:27:21.353 2 DEBUG oslo_concurrency.lockutils [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.241s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:21 compute-0 nova_compute[192810]: 2025-09-30 21:27:21.354 2 DEBUG nova.compute.manager [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:27:21 compute-0 nova_compute[192810]: 2025-09-30 21:27:21.419 2 DEBUG nova.compute.manager [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:27:21 compute-0 nova_compute[192810]: 2025-09-30 21:27:21.419 2 DEBUG nova.network.neutron [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:27:21 compute-0 nova_compute[192810]: 2025-09-30 21:27:21.444 2 INFO nova.virt.libvirt.driver [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:27:21 compute-0 nova_compute[192810]: 2025-09-30 21:27:21.463 2 DEBUG nova.compute.manager [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:27:21 compute-0 nova_compute[192810]: 2025-09-30 21:27:21.620 2 DEBUG nova.compute.manager [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:27:21 compute-0 nova_compute[192810]: 2025-09-30 21:27:21.622 2 DEBUG nova.virt.libvirt.driver [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:27:21 compute-0 nova_compute[192810]: 2025-09-30 21:27:21.622 2 INFO nova.virt.libvirt.driver [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Creating image(s)
Sep 30 21:27:21 compute-0 nova_compute[192810]: 2025-09-30 21:27:21.623 2 DEBUG oslo_concurrency.lockutils [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Acquiring lock "/var/lib/nova/instances/b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:27:21 compute-0 nova_compute[192810]: 2025-09-30 21:27:21.623 2 DEBUG oslo_concurrency.lockutils [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Lock "/var/lib/nova/instances/b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:27:21 compute-0 nova_compute[192810]: 2025-09-30 21:27:21.624 2 DEBUG oslo_concurrency.lockutils [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Lock "/var/lib/nova/instances/b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:21 compute-0 nova_compute[192810]: 2025-09-30 21:27:21.624 2 DEBUG oslo_concurrency.lockutils [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Acquiring lock "a9ab869ec5a6bd299955a400c40ca9514f1f8c02" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:27:21 compute-0 nova_compute[192810]: 2025-09-30 21:27:21.624 2 DEBUG oslo_concurrency.lockutils [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Lock "a9ab869ec5a6bd299955a400c40ca9514f1f8c02" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:27:21 compute-0 nova_compute[192810]: 2025-09-30 21:27:21.629 2 DEBUG nova.policy [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3fad9c6706e641d680597333f7ac5955', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '05e2ec21372f4b1eaa7a1b74ff5f8fdc', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:27:22 compute-0 nova_compute[192810]: 2025-09-30 21:27:22.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:22 compute-0 podman[228991]: 2025-09-30 21:27:22.329259054 +0000 UTC m=+0.061401978 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, vendor=Red Hat, Inc., config_id=edpm, io.openshift.expose-services=, maintainer=Red Hat, Inc., name=ubi9-minimal, build-date=2025-08-20T13:12:41, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers)
Sep 30 21:27:22 compute-0 podman[228990]: 2025-09-30 21:27:22.34577552 +0000 UTC m=+0.081653008 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:27:22 compute-0 nova_compute[192810]: 2025-09-30 21:27:22.534 2 DEBUG nova.network.neutron [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Successfully created port: 2beca347-4057-4f9c-acfc-695b632cf134 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:27:22 compute-0 nova_compute[192810]: 2025-09-30 21:27:22.777 2 DEBUG oslo_concurrency.processutils [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a9ab869ec5a6bd299955a400c40ca9514f1f8c02.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:27:22 compute-0 nova_compute[192810]: 2025-09-30 21:27:22.831 2 DEBUG oslo_concurrency.processutils [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a9ab869ec5a6bd299955a400c40ca9514f1f8c02.part --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:27:22 compute-0 nova_compute[192810]: 2025-09-30 21:27:22.832 2 DEBUG nova.virt.images [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] 0189ab7b-07cc-4c2c-8163-82553d5a429b was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Sep 30 21:27:22 compute-0 nova_compute[192810]: 2025-09-30 21:27:22.833 2 DEBUG nova.privsep.utils [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Sep 30 21:27:22 compute-0 nova_compute[192810]: 2025-09-30 21:27:22.834 2 DEBUG oslo_concurrency.processutils [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/a9ab869ec5a6bd299955a400c40ca9514f1f8c02.part /var/lib/nova/instances/_base/a9ab869ec5a6bd299955a400c40ca9514f1f8c02.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:27:22 compute-0 nova_compute[192810]: 2025-09-30 21:27:22.968 2 DEBUG oslo_concurrency.processutils [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/a9ab869ec5a6bd299955a400c40ca9514f1f8c02.part /var/lib/nova/instances/_base/a9ab869ec5a6bd299955a400c40ca9514f1f8c02.converted" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:27:22 compute-0 nova_compute[192810]: 2025-09-30 21:27:22.972 2 DEBUG oslo_concurrency.processutils [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a9ab869ec5a6bd299955a400c40ca9514f1f8c02.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:27:23 compute-0 nova_compute[192810]: 2025-09-30 21:27:23.025 2 DEBUG oslo_concurrency.processutils [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a9ab869ec5a6bd299955a400c40ca9514f1f8c02.converted --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:27:23 compute-0 nova_compute[192810]: 2025-09-30 21:27:23.026 2 DEBUG oslo_concurrency.lockutils [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Lock "a9ab869ec5a6bd299955a400c40ca9514f1f8c02" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.401s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:23 compute-0 nova_compute[192810]: 2025-09-30 21:27:23.037 2 DEBUG oslo_concurrency.processutils [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a9ab869ec5a6bd299955a400c40ca9514f1f8c02 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:27:23 compute-0 nova_compute[192810]: 2025-09-30 21:27:23.088 2 DEBUG oslo_concurrency.processutils [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a9ab869ec5a6bd299955a400c40ca9514f1f8c02 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:27:23 compute-0 nova_compute[192810]: 2025-09-30 21:27:23.089 2 DEBUG oslo_concurrency.lockutils [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Acquiring lock "a9ab869ec5a6bd299955a400c40ca9514f1f8c02" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:27:23 compute-0 nova_compute[192810]: 2025-09-30 21:27:23.090 2 DEBUG oslo_concurrency.lockutils [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Lock "a9ab869ec5a6bd299955a400c40ca9514f1f8c02" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:27:23 compute-0 nova_compute[192810]: 2025-09-30 21:27:23.100 2 DEBUG oslo_concurrency.processutils [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a9ab869ec5a6bd299955a400c40ca9514f1f8c02 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:27:23 compute-0 nova_compute[192810]: 2025-09-30 21:27:23.152 2 DEBUG oslo_concurrency.processutils [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a9ab869ec5a6bd299955a400c40ca9514f1f8c02 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:27:23 compute-0 nova_compute[192810]: 2025-09-30 21:27:23.154 2 DEBUG oslo_concurrency.processutils [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/a9ab869ec5a6bd299955a400c40ca9514f1f8c02,backing_fmt=raw /var/lib/nova/instances/b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:27:23 compute-0 nova_compute[192810]: 2025-09-30 21:27:23.182 2 DEBUG oslo_concurrency.processutils [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/a9ab869ec5a6bd299955a400c40ca9514f1f8c02,backing_fmt=raw /var/lib/nova/instances/b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1/disk 1073741824" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:27:23 compute-0 nova_compute[192810]: 2025-09-30 21:27:23.183 2 DEBUG oslo_concurrency.lockutils [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Lock "a9ab869ec5a6bd299955a400c40ca9514f1f8c02" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.094s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:23 compute-0 nova_compute[192810]: 2025-09-30 21:27:23.184 2 DEBUG oslo_concurrency.processutils [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a9ab869ec5a6bd299955a400c40ca9514f1f8c02 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:27:23 compute-0 nova_compute[192810]: 2025-09-30 21:27:23.237 2 DEBUG oslo_concurrency.processutils [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a9ab869ec5a6bd299955a400c40ca9514f1f8c02 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:27:23 compute-0 nova_compute[192810]: 2025-09-30 21:27:23.238 2 DEBUG nova.objects.instance [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Lazy-loading 'migration_context' on Instance uuid b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:27:23 compute-0 nova_compute[192810]: 2025-09-30 21:27:23.252 2 DEBUG nova.virt.libvirt.driver [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:27:23 compute-0 nova_compute[192810]: 2025-09-30 21:27:23.252 2 DEBUG nova.virt.libvirt.driver [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Ensure instance console log exists: /var/lib/nova/instances/b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:27:23 compute-0 nova_compute[192810]: 2025-09-30 21:27:23.253 2 DEBUG oslo_concurrency.lockutils [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:27:23 compute-0 nova_compute[192810]: 2025-09-30 21:27:23.253 2 DEBUG oslo_concurrency.lockutils [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:27:23 compute-0 nova_compute[192810]: 2025-09-30 21:27:23.253 2 DEBUG oslo_concurrency.lockutils [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:23 compute-0 nova_compute[192810]: 2025-09-30 21:27:23.739 2 DEBUG nova.network.neutron [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Successfully updated port: 2beca347-4057-4f9c-acfc-695b632cf134 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:27:23 compute-0 nova_compute[192810]: 2025-09-30 21:27:23.757 2 DEBUG oslo_concurrency.lockutils [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Acquiring lock "refresh_cache-b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:27:23 compute-0 nova_compute[192810]: 2025-09-30 21:27:23.757 2 DEBUG oslo_concurrency.lockutils [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Acquired lock "refresh_cache-b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:27:23 compute-0 nova_compute[192810]: 2025-09-30 21:27:23.757 2 DEBUG nova.network.neutron [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:27:23 compute-0 nova_compute[192810]: 2025-09-30 21:27:23.985 2 DEBUG nova.network.neutron [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:27:25 compute-0 nova_compute[192810]: 2025-09-30 21:27:25.594 2 DEBUG nova.compute.manager [req-b36afdac-e79c-455a-8c82-c277625ce72d req-f10e79cd-476f-4264-913b-f3960fd0ecd4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Received event network-changed-2beca347-4057-4f9c-acfc-695b632cf134 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:27:25 compute-0 nova_compute[192810]: 2025-09-30 21:27:25.594 2 DEBUG nova.compute.manager [req-b36afdac-e79c-455a-8c82-c277625ce72d req-f10e79cd-476f-4264-913b-f3960fd0ecd4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Refreshing instance network info cache due to event network-changed-2beca347-4057-4f9c-acfc-695b632cf134. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:27:25 compute-0 nova_compute[192810]: 2025-09-30 21:27:25.595 2 DEBUG oslo_concurrency.lockutils [req-b36afdac-e79c-455a-8c82-c277625ce72d req-f10e79cd-476f-4264-913b-f3960fd0ecd4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.116 2 DEBUG nova.network.neutron [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Updating instance_info_cache with network_info: [{"id": "2beca347-4057-4f9c-acfc-695b632cf134", "address": "fa:16:3e:81:98:f3", "network": {"id": "08f005d2-033f-4944-bdc2-a0b577ce731c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-710412215-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05e2ec21372f4b1eaa7a1b74ff5f8fdc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2beca347-40", "ovs_interfaceid": "2beca347-4057-4f9c-acfc-695b632cf134", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.136 2 DEBUG oslo_concurrency.lockutils [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Releasing lock "refresh_cache-b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.137 2 DEBUG nova.compute.manager [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Instance network_info: |[{"id": "2beca347-4057-4f9c-acfc-695b632cf134", "address": "fa:16:3e:81:98:f3", "network": {"id": "08f005d2-033f-4944-bdc2-a0b577ce731c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-710412215-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05e2ec21372f4b1eaa7a1b74ff5f8fdc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2beca347-40", "ovs_interfaceid": "2beca347-4057-4f9c-acfc-695b632cf134", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.137 2 DEBUG oslo_concurrency.lockutils [req-b36afdac-e79c-455a-8c82-c277625ce72d req-f10e79cd-476f-4264-913b-f3960fd0ecd4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.138 2 DEBUG nova.network.neutron [req-b36afdac-e79c-455a-8c82-c277625ce72d req-f10e79cd-476f-4264-913b-f3960fd0ecd4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Refreshing network info cache for port 2beca347-4057-4f9c-acfc-695b632cf134 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.140 2 DEBUG nova.virt.libvirt.driver [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Start _get_guest_xml network_info=[{"id": "2beca347-4057-4f9c-acfc-695b632cf134", "address": "fa:16:3e:81:98:f3", "network": {"id": "08f005d2-033f-4944-bdc2-a0b577ce731c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-710412215-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05e2ec21372f4b1eaa7a1b74ff5f8fdc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2beca347-40", "ovs_interfaceid": "2beca347-4057-4f9c-acfc-695b632cf134", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='8585e91edf8d37af388395b4946b59f4',container_format='bare',created_at=2025-09-30T21:27:12Z,direct_url=<?>,disk_format='qcow2',id=0189ab7b-07cc-4c2c-8163-82553d5a429b,min_disk=1,min_ram=0,name='tempest-test-snap-1895269994',owner='05e2ec21372f4b1eaa7a1b74ff5f8fdc',properties=ImageMetaProps,protected=<?>,size=23330816,status='active',tags=<?>,updated_at=2025-09-30T21:27:15Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '0189ab7b-07cc-4c2c-8163-82553d5a429b'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.144 2 WARNING nova.virt.libvirt.driver [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.150 2 DEBUG nova.virt.libvirt.host [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.151 2 DEBUG nova.virt.libvirt.host [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.157 2 DEBUG nova.virt.libvirt.host [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.157 2 DEBUG nova.virt.libvirt.host [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.158 2 DEBUG nova.virt.libvirt.driver [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.159 2 DEBUG nova.virt.hardware [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='8585e91edf8d37af388395b4946b59f4',container_format='bare',created_at=2025-09-30T21:27:12Z,direct_url=<?>,disk_format='qcow2',id=0189ab7b-07cc-4c2c-8163-82553d5a429b,min_disk=1,min_ram=0,name='tempest-test-snap-1895269994',owner='05e2ec21372f4b1eaa7a1b74ff5f8fdc',properties=ImageMetaProps,protected=<?>,size=23330816,status='active',tags=<?>,updated_at=2025-09-30T21:27:15Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.159 2 DEBUG nova.virt.hardware [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.159 2 DEBUG nova.virt.hardware [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.159 2 DEBUG nova.virt.hardware [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.160 2 DEBUG nova.virt.hardware [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.160 2 DEBUG nova.virt.hardware [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.160 2 DEBUG nova.virt.hardware [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.160 2 DEBUG nova.virt.hardware [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.160 2 DEBUG nova.virt.hardware [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.161 2 DEBUG nova.virt.hardware [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.161 2 DEBUG nova.virt.hardware [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.164 2 DEBUG nova.virt.libvirt.vif [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:27:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-2088105630',display_name='tempest-ImagesTestJSON-server-2088105630',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-2088105630',id=65,image_ref='0189ab7b-07cc-4c2c-8163-82553d5a429b',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='05e2ec21372f4b1eaa7a1b74ff5f8fdc',ramdisk_id='',reservation_id='r-y4ychanc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_boot_roles='reader,member',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='b20e658b-d136-42b9-ab5e-618c3dffbbec',image_min_disk='1',image_min_ram='0',image_owner_id='05e2ec21372f4b1eaa7a1b74ff5f8fdc',image_owner_project_name='tempest-ImagesTestJSON-1876097706',image_owner_user_name='tempest-ImagesTestJSON-1876097706-project-member',image_user_id='3fad9c6706e641d680597333f7ac5955',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-1876097706',owner_user_name='tempest-ImagesTestJSON-1876097706-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:27:21Z,user_data=None,user_id='3fad9c6706e641d680597333f7ac5955',uuid=b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2beca347-4057-4f9c-acfc-695b632cf134", "address": "fa:16:3e:81:98:f3", "network": {"id": "08f005d2-033f-4944-bdc2-a0b577ce731c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-710412215-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05e2ec21372f4b1eaa7a1b74ff5f8fdc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2beca347-40", "ovs_interfaceid": "2beca347-4057-4f9c-acfc-695b632cf134", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.165 2 DEBUG nova.network.os_vif_util [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Converting VIF {"id": "2beca347-4057-4f9c-acfc-695b632cf134", "address": "fa:16:3e:81:98:f3", "network": {"id": "08f005d2-033f-4944-bdc2-a0b577ce731c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-710412215-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05e2ec21372f4b1eaa7a1b74ff5f8fdc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2beca347-40", "ovs_interfaceid": "2beca347-4057-4f9c-acfc-695b632cf134", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.165 2 DEBUG nova.network.os_vif_util [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:81:98:f3,bridge_name='br-int',has_traffic_filtering=True,id=2beca347-4057-4f9c-acfc-695b632cf134,network=Network(08f005d2-033f-4944-bdc2-a0b577ce731c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2beca347-40') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.166 2 DEBUG nova.objects.instance [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Lazy-loading 'pci_devices' on Instance uuid b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.180 2 DEBUG nova.virt.libvirt.driver [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:27:26 compute-0 nova_compute[192810]:   <uuid>b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1</uuid>
Sep 30 21:27:26 compute-0 nova_compute[192810]:   <name>instance-00000041</name>
Sep 30 21:27:26 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:27:26 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:27:26 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:27:26 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:       <nova:name>tempest-ImagesTestJSON-server-2088105630</nova:name>
Sep 30 21:27:26 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:27:26</nova:creationTime>
Sep 30 21:27:26 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:27:26 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:27:26 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:27:26 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:27:26 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:27:26 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:27:26 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:27:26 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:27:26 compute-0 nova_compute[192810]:         <nova:user uuid="3fad9c6706e641d680597333f7ac5955">tempest-ImagesTestJSON-1876097706-project-member</nova:user>
Sep 30 21:27:26 compute-0 nova_compute[192810]:         <nova:project uuid="05e2ec21372f4b1eaa7a1b74ff5f8fdc">tempest-ImagesTestJSON-1876097706</nova:project>
Sep 30 21:27:26 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:27:26 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="0189ab7b-07cc-4c2c-8163-82553d5a429b"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:27:26 compute-0 nova_compute[192810]:         <nova:port uuid="2beca347-4057-4f9c-acfc-695b632cf134">
Sep 30 21:27:26 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:27:26 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:27:26 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:27:26 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:27:26 compute-0 nova_compute[192810]:     <system>
Sep 30 21:27:26 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:27:26 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:27:26 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:27:26 compute-0 nova_compute[192810]:       <entry name="serial">b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1</entry>
Sep 30 21:27:26 compute-0 nova_compute[192810]:       <entry name="uuid">b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1</entry>
Sep 30 21:27:26 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     </system>
Sep 30 21:27:26 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:27:26 compute-0 nova_compute[192810]:   <os>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:   </os>
Sep 30 21:27:26 compute-0 nova_compute[192810]:   <features>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:   </features>
Sep 30 21:27:26 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:27:26 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:27:26 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:27:26 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:27:26 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:27:26 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1/disk"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:27:26 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1/disk.config"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:27:26 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:81:98:f3"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:       <target dev="tap2beca347-40"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:27:26 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1/console.log" append="off"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     <video>
Sep 30 21:27:26 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     </video>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     <input type="keyboard" bus="usb"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:27:26 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:27:26 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:27:26 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:27:26 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:27:26 compute-0 nova_compute[192810]: </domain>
Sep 30 21:27:26 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.181 2 DEBUG nova.compute.manager [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Preparing to wait for external event network-vif-plugged-2beca347-4057-4f9c-acfc-695b632cf134 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.181 2 DEBUG oslo_concurrency.lockutils [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Acquiring lock "b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.182 2 DEBUG oslo_concurrency.lockutils [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Lock "b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.182 2 DEBUG oslo_concurrency.lockutils [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Lock "b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.182 2 DEBUG nova.virt.libvirt.vif [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:27:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-2088105630',display_name='tempest-ImagesTestJSON-server-2088105630',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-2088105630',id=65,image_ref='0189ab7b-07cc-4c2c-8163-82553d5a429b',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='05e2ec21372f4b1eaa7a1b74ff5f8fdc',ramdisk_id='',reservation_id='r-y4ychanc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_boot_roles='reader,member',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vi
f_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='b20e658b-d136-42b9-ab5e-618c3dffbbec',image_min_disk='1',image_min_ram='0',image_owner_id='05e2ec21372f4b1eaa7a1b74ff5f8fdc',image_owner_project_name='tempest-ImagesTestJSON-1876097706',image_owner_user_name='tempest-ImagesTestJSON-1876097706-project-member',image_user_id='3fad9c6706e641d680597333f7ac5955',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-1876097706',owner_user_name='tempest-ImagesTestJSON-1876097706-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:27:21Z,user_data=None,user_id='3fad9c6706e641d680597333f7ac5955',uuid=b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2beca347-4057-4f9c-acfc-695b632cf134", "address": "fa:16:3e:81:98:f3", "network": {"id": "08f005d2-033f-4944-bdc2-a0b577ce731c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-710412215-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05e2ec21372f4b1eaa7a1b74ff5f8fdc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2beca347-40", "ovs_interfaceid": "2beca347-4057-4f9c-acfc-695b632cf134", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.183 2 DEBUG nova.network.os_vif_util [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Converting VIF {"id": "2beca347-4057-4f9c-acfc-695b632cf134", "address": "fa:16:3e:81:98:f3", "network": {"id": "08f005d2-033f-4944-bdc2-a0b577ce731c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-710412215-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05e2ec21372f4b1eaa7a1b74ff5f8fdc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2beca347-40", "ovs_interfaceid": "2beca347-4057-4f9c-acfc-695b632cf134", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.184 2 DEBUG nova.network.os_vif_util [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:81:98:f3,bridge_name='br-int',has_traffic_filtering=True,id=2beca347-4057-4f9c-acfc-695b632cf134,network=Network(08f005d2-033f-4944-bdc2-a0b577ce731c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2beca347-40') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.184 2 DEBUG os_vif [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:81:98:f3,bridge_name='br-int',has_traffic_filtering=True,id=2beca347-4057-4f9c-acfc-695b632cf134,network=Network(08f005d2-033f-4944-bdc2-a0b577ce731c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2beca347-40') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.184 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.185 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.185 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.189 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.189 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2beca347-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.190 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2beca347-40, col_values=(('external_ids', {'iface-id': '2beca347-4057-4f9c-acfc-695b632cf134', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:81:98:f3', 'vm-uuid': 'b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.191 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:26 compute-0 NetworkManager[51733]: <info>  [1759267646.1923] manager: (tap2beca347-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/98)
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.195 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.198 2 INFO os_vif [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:81:98:f3,bridge_name='br-int',has_traffic_filtering=True,id=2beca347-4057-4f9c-acfc-695b632cf134,network=Network(08f005d2-033f-4944-bdc2-a0b577ce731c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2beca347-40')
Sep 30 21:27:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:26.291 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:27:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:26.292 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.371 2 DEBUG nova.virt.libvirt.driver [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.371 2 DEBUG nova.virt.libvirt.driver [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.371 2 DEBUG nova.virt.libvirt.driver [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] No VIF found with MAC fa:16:3e:81:98:f3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.372 2 INFO nova.virt.libvirt.driver [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Using config drive
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.519 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.785 2 INFO nova.virt.libvirt.driver [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Creating config drive at /var/lib/nova/instances/b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1/disk.config
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.790 2 DEBUG oslo_concurrency.processutils [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplkeacaqu execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.915 2 DEBUG oslo_concurrency.processutils [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplkeacaqu" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:27:26 compute-0 kernel: tap2beca347-40: entered promiscuous mode
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.964 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:26 compute-0 NetworkManager[51733]: <info>  [1759267646.9643] manager: (tap2beca347-40): new Tun device (/org/freedesktop/NetworkManager/Devices/99)
Sep 30 21:27:26 compute-0 ovn_controller[94912]: 2025-09-30T21:27:26Z|00223|binding|INFO|Claiming lport 2beca347-4057-4f9c-acfc-695b632cf134 for this chassis.
Sep 30 21:27:26 compute-0 ovn_controller[94912]: 2025-09-30T21:27:26Z|00224|binding|INFO|2beca347-4057-4f9c-acfc-695b632cf134: Claiming fa:16:3e:81:98:f3 10.100.0.10
Sep 30 21:27:26 compute-0 nova_compute[192810]: 2025-09-30 21:27:26.967 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:26.976 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:81:98:f3 10.100.0.10'], port_security=['fa:16:3e:81:98:f3 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-08f005d2-033f-4944-bdc2-a0b577ce731c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '05e2ec21372f4b1eaa7a1b74ff5f8fdc', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9e87fc14-353c-4b88-ab62-0a2c1cd416f2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3f15e30d-dd35-48c7-8d50-71cdc00dcefd, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=2beca347-4057-4f9c-acfc-695b632cf134) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:27:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:26.977 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 2beca347-4057-4f9c-acfc-695b632cf134 in datapath 08f005d2-033f-4944-bdc2-a0b577ce731c bound to our chassis
Sep 30 21:27:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:26.978 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 08f005d2-033f-4944-bdc2-a0b577ce731c
Sep 30 21:27:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:26.989 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[41583ad5-d0a6-4836-9ea7-4ed0a12b201d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:26.989 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap08f005d2-01 in ovnmeta-08f005d2-033f-4944-bdc2-a0b577ce731c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:27:26 compute-0 systemd-udevd[229076]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:27:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:26.991 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap08f005d2-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:27:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:26.991 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[315a1660-1f29-463f-bb40-794c7a78d9a6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:26.992 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[81338fed-e51b-4356-9e38-b169af2dca09]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:27 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:27.002 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[1e0da925-2b2d-4f40-8090-07d6d7edbdef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:27 compute-0 NetworkManager[51733]: <info>  [1759267647.0042] device (tap2beca347-40): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:27:27 compute-0 NetworkManager[51733]: <info>  [1759267647.0052] device (tap2beca347-40): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:27:27 compute-0 systemd-machined[152794]: New machine qemu-30-instance-00000041.
Sep 30 21:27:27 compute-0 nova_compute[192810]: 2025-09-30 21:27:27.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:27 compute-0 nova_compute[192810]: 2025-09-30 21:27:27.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:27 compute-0 ovn_controller[94912]: 2025-09-30T21:27:27Z|00225|binding|INFO|Setting lport 2beca347-4057-4f9c-acfc-695b632cf134 ovn-installed in OVS
Sep 30 21:27:27 compute-0 ovn_controller[94912]: 2025-09-30T21:27:27Z|00226|binding|INFO|Setting lport 2beca347-4057-4f9c-acfc-695b632cf134 up in Southbound
Sep 30 21:27:27 compute-0 nova_compute[192810]: 2025-09-30 21:27:27.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:27 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:27.034 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[1e33e1f3-330f-44d2-91a9-8d1cb8362e13]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:27 compute-0 systemd[1]: Started Virtual Machine qemu-30-instance-00000041.
Sep 30 21:27:27 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:27.062 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[af2d5d92-6e86-432b-83d9-e24ea526fe78]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:27 compute-0 NetworkManager[51733]: <info>  [1759267647.0672] manager: (tap08f005d2-00): new Veth device (/org/freedesktop/NetworkManager/Devices/100)
Sep 30 21:27:27 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:27.066 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[214d6a63-aba9-4188-86b4-cc8da25f4191]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:27 compute-0 systemd-udevd[229081]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:27:27 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:27.099 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[9b561ef6-511e-486d-94d0-34c9057dff59]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:27 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:27.102 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[05685631-bb83-42ed-ac66-5b02b1453f19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:27 compute-0 NetworkManager[51733]: <info>  [1759267647.1222] device (tap08f005d2-00): carrier: link connected
Sep 30 21:27:27 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:27.127 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[4925f874-ae18-4d26-9991-4dc1b123dbf1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:27 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:27.143 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a213d789-7ecd-4cf6-aea1-26cdc5ddf338]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap08f005d2-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:12:6e:73'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 63], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 434274, 'reachable_time': 17262, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229111, 'error': None, 'target': 'ovnmeta-08f005d2-033f-4944-bdc2-a0b577ce731c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:27 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:27.161 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[293072e5-cd00-436e-b2f6-4894230646e2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe12:6e73'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 434274, 'tstamp': 434274}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229112, 'error': None, 'target': 'ovnmeta-08f005d2-033f-4944-bdc2-a0b577ce731c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:27 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:27.176 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[5ffed97e-0cf5-42ac-a17b-89c6087a8008]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap08f005d2-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:12:6e:73'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 63], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 434274, 'reachable_time': 17262, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229113, 'error': None, 'target': 'ovnmeta-08f005d2-033f-4944-bdc2-a0b577ce731c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:27 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:27.205 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[4d197c9b-6f56-4f37-bf34-925f3ce722f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:27 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:27.264 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[d364d8bb-239f-4deb-86ff-89143581f39b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:27 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:27.266 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap08f005d2-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:27:27 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:27.266 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:27:27 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:27.267 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap08f005d2-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:27:27 compute-0 NetworkManager[51733]: <info>  [1759267647.2697] manager: (tap08f005d2-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/101)
Sep 30 21:27:27 compute-0 nova_compute[192810]: 2025-09-30 21:27:27.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:27 compute-0 kernel: tap08f005d2-00: entered promiscuous mode
Sep 30 21:27:27 compute-0 nova_compute[192810]: 2025-09-30 21:27:27.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:27 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:27.329 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap08f005d2-00, col_values=(('external_ids', {'iface-id': 'd8533c1a-185d-4211-a117-18b1e5bc8e34'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:27:27 compute-0 nova_compute[192810]: 2025-09-30 21:27:27.330 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:27 compute-0 ovn_controller[94912]: 2025-09-30T21:27:27Z|00227|binding|INFO|Releasing lport d8533c1a-185d-4211-a117-18b1e5bc8e34 from this chassis (sb_readonly=0)
Sep 30 21:27:27 compute-0 nova_compute[192810]: 2025-09-30 21:27:27.331 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:27 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:27.335 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/08f005d2-033f-4944-bdc2-a0b577ce731c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/08f005d2-033f-4944-bdc2-a0b577ce731c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:27:27 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:27.336 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[d85e1dc7-c440-4eae-bfd3-d020a3cb7730]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:27 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:27.339 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:27:27 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:27:27 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:27:27 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-08f005d2-033f-4944-bdc2-a0b577ce731c
Sep 30 21:27:27 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:27:27 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:27:27 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:27:27 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/08f005d2-033f-4944-bdc2-a0b577ce731c.pid.haproxy
Sep 30 21:27:27 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:27:27 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:27:27 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:27:27 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:27:27 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:27:27 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:27:27 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:27:27 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:27:27 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:27:27 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:27:27 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:27:27 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:27:27 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:27:27 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:27:27 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:27:27 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:27:27 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:27:27 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:27:27 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:27:27 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:27:27 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID 08f005d2-033f-4944-bdc2-a0b577ce731c
Sep 30 21:27:27 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:27:27 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:27.341 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-08f005d2-033f-4944-bdc2-a0b577ce731c', 'env', 'PROCESS_TAG=haproxy-08f005d2-033f-4944-bdc2-a0b577ce731c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/08f005d2-033f-4944-bdc2-a0b577ce731c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:27:27 compute-0 nova_compute[192810]: 2025-09-30 21:27:27.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:27 compute-0 nova_compute[192810]: 2025-09-30 21:27:27.707 2 DEBUG nova.network.neutron [req-b36afdac-e79c-455a-8c82-c277625ce72d req-f10e79cd-476f-4264-913b-f3960fd0ecd4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Updated VIF entry in instance network info cache for port 2beca347-4057-4f9c-acfc-695b632cf134. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:27:27 compute-0 nova_compute[192810]: 2025-09-30 21:27:27.708 2 DEBUG nova.network.neutron [req-b36afdac-e79c-455a-8c82-c277625ce72d req-f10e79cd-476f-4264-913b-f3960fd0ecd4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Updating instance_info_cache with network_info: [{"id": "2beca347-4057-4f9c-acfc-695b632cf134", "address": "fa:16:3e:81:98:f3", "network": {"id": "08f005d2-033f-4944-bdc2-a0b577ce731c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-710412215-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05e2ec21372f4b1eaa7a1b74ff5f8fdc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2beca347-40", "ovs_interfaceid": "2beca347-4057-4f9c-acfc-695b632cf134", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:27:27 compute-0 nova_compute[192810]: 2025-09-30 21:27:27.725 2 DEBUG oslo_concurrency.lockutils [req-b36afdac-e79c-455a-8c82-c277625ce72d req-f10e79cd-476f-4264-913b-f3960fd0ecd4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:27:27 compute-0 podman[229152]: 2025-09-30 21:27:27.740858423 +0000 UTC m=+0.088988922 container create 11ecae22cdd794568c090aa4c4fbd8812c3d8bad0d3850af0cadff69fb7567e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08f005d2-033f-4944-bdc2-a0b577ce731c, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:27:27 compute-0 podman[229152]: 2025-09-30 21:27:27.700402753 +0000 UTC m=+0.048533282 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:27:27 compute-0 systemd[1]: Started libpod-conmon-11ecae22cdd794568c090aa4c4fbd8812c3d8bad0d3850af0cadff69fb7567e0.scope.
Sep 30 21:27:27 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:27:27 compute-0 nova_compute[192810]: 2025-09-30 21:27:27.790 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267647.7900047, b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:27:27 compute-0 nova_compute[192810]: 2025-09-30 21:27:27.790 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] VM Started (Lifecycle Event)
Sep 30 21:27:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/27178727ef424c03a6361a00b37bb00dc2c84d93ab7ff0549c951603bdb75dbc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:27:27 compute-0 nova_compute[192810]: 2025-09-30 21:27:27.794 2 DEBUG nova.compute.manager [req-cb152431-c313-479f-b70c-83acf8c5cde3 req-847b7194-98b5-4d27-a837-cb7a5b4b8476 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Received event network-vif-plugged-2beca347-4057-4f9c-acfc-695b632cf134 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:27:27 compute-0 nova_compute[192810]: 2025-09-30 21:27:27.795 2 DEBUG oslo_concurrency.lockutils [req-cb152431-c313-479f-b70c-83acf8c5cde3 req-847b7194-98b5-4d27-a837-cb7a5b4b8476 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:27:27 compute-0 nova_compute[192810]: 2025-09-30 21:27:27.795 2 DEBUG oslo_concurrency.lockutils [req-cb152431-c313-479f-b70c-83acf8c5cde3 req-847b7194-98b5-4d27-a837-cb7a5b4b8476 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:27:27 compute-0 nova_compute[192810]: 2025-09-30 21:27:27.795 2 DEBUG oslo_concurrency.lockutils [req-cb152431-c313-479f-b70c-83acf8c5cde3 req-847b7194-98b5-4d27-a837-cb7a5b4b8476 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:27 compute-0 nova_compute[192810]: 2025-09-30 21:27:27.795 2 DEBUG nova.compute.manager [req-cb152431-c313-479f-b70c-83acf8c5cde3 req-847b7194-98b5-4d27-a837-cb7a5b4b8476 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Processing event network-vif-plugged-2beca347-4057-4f9c-acfc-695b632cf134 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:27:27 compute-0 nova_compute[192810]: 2025-09-30 21:27:27.795 2 DEBUG nova.compute.manager [req-cb152431-c313-479f-b70c-83acf8c5cde3 req-847b7194-98b5-4d27-a837-cb7a5b4b8476 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Received event network-vif-plugged-2beca347-4057-4f9c-acfc-695b632cf134 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:27:27 compute-0 nova_compute[192810]: 2025-09-30 21:27:27.796 2 DEBUG oslo_concurrency.lockutils [req-cb152431-c313-479f-b70c-83acf8c5cde3 req-847b7194-98b5-4d27-a837-cb7a5b4b8476 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:27:27 compute-0 nova_compute[192810]: 2025-09-30 21:27:27.796 2 DEBUG oslo_concurrency.lockutils [req-cb152431-c313-479f-b70c-83acf8c5cde3 req-847b7194-98b5-4d27-a837-cb7a5b4b8476 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:27:27 compute-0 nova_compute[192810]: 2025-09-30 21:27:27.796 2 DEBUG oslo_concurrency.lockutils [req-cb152431-c313-479f-b70c-83acf8c5cde3 req-847b7194-98b5-4d27-a837-cb7a5b4b8476 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:27 compute-0 nova_compute[192810]: 2025-09-30 21:27:27.797 2 DEBUG nova.compute.manager [req-cb152431-c313-479f-b70c-83acf8c5cde3 req-847b7194-98b5-4d27-a837-cb7a5b4b8476 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] No waiting events found dispatching network-vif-plugged-2beca347-4057-4f9c-acfc-695b632cf134 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:27:27 compute-0 nova_compute[192810]: 2025-09-30 21:27:27.797 2 WARNING nova.compute.manager [req-cb152431-c313-479f-b70c-83acf8c5cde3 req-847b7194-98b5-4d27-a837-cb7a5b4b8476 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Received unexpected event network-vif-plugged-2beca347-4057-4f9c-acfc-695b632cf134 for instance with vm_state building and task_state spawning.
Sep 30 21:27:27 compute-0 nova_compute[192810]: 2025-09-30 21:27:27.798 2 DEBUG nova.compute.manager [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:27:27 compute-0 nova_compute[192810]: 2025-09-30 21:27:27.802 2 DEBUG nova.virt.libvirt.driver [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:27:27 compute-0 podman[229152]: 2025-09-30 21:27:27.802966347 +0000 UTC m=+0.151096876 container init 11ecae22cdd794568c090aa4c4fbd8812c3d8bad0d3850af0cadff69fb7567e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08f005d2-033f-4944-bdc2-a0b577ce731c, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:27:27 compute-0 nova_compute[192810]: 2025-09-30 21:27:27.805 2 INFO nova.virt.libvirt.driver [-] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Instance spawned successfully.
Sep 30 21:27:27 compute-0 nova_compute[192810]: 2025-09-30 21:27:27.805 2 INFO nova.compute.manager [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Took 6.18 seconds to spawn the instance on the hypervisor.
Sep 30 21:27:27 compute-0 nova_compute[192810]: 2025-09-30 21:27:27.806 2 DEBUG nova.compute.manager [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:27:27 compute-0 podman[229152]: 2025-09-30 21:27:27.808843905 +0000 UTC m=+0.156974404 container start 11ecae22cdd794568c090aa4c4fbd8812c3d8bad0d3850af0cadff69fb7567e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08f005d2-033f-4944-bdc2-a0b577ce731c, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:27:27 compute-0 nova_compute[192810]: 2025-09-30 21:27:27.814 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:27:27 compute-0 nova_compute[192810]: 2025-09-30 21:27:27.818 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:27:27 compute-0 neutron-haproxy-ovnmeta-08f005d2-033f-4944-bdc2-a0b577ce731c[229168]: [NOTICE]   (229172) : New worker (229174) forked
Sep 30 21:27:27 compute-0 neutron-haproxy-ovnmeta-08f005d2-033f-4944-bdc2-a0b577ce731c[229168]: [NOTICE]   (229172) : Loading success.
Sep 30 21:27:27 compute-0 nova_compute[192810]: 2025-09-30 21:27:27.854 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:27:27 compute-0 nova_compute[192810]: 2025-09-30 21:27:27.855 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267647.7901957, b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:27:27 compute-0 nova_compute[192810]: 2025-09-30 21:27:27.855 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] VM Paused (Lifecycle Event)
Sep 30 21:27:27 compute-0 nova_compute[192810]: 2025-09-30 21:27:27.883 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:27:27 compute-0 nova_compute[192810]: 2025-09-30 21:27:27.888 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267647.8005862, b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:27:27 compute-0 nova_compute[192810]: 2025-09-30 21:27:27.889 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] VM Resumed (Lifecycle Event)
Sep 30 21:27:27 compute-0 nova_compute[192810]: 2025-09-30 21:27:27.896 2 INFO nova.compute.manager [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Took 6.83 seconds to build instance.
Sep 30 21:27:27 compute-0 nova_compute[192810]: 2025-09-30 21:27:27.917 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:27:27 compute-0 nova_compute[192810]: 2025-09-30 21:27:27.920 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:27:27 compute-0 nova_compute[192810]: 2025-09-30 21:27:27.922 2 DEBUG oslo_concurrency.lockutils [None req-4c2cf9a3-4283-4423-a95a-8b7f4f99a35d 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Lock "b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.981s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:29 compute-0 podman[229185]: 2025-09-30 21:27:29.326289564 +0000 UTC m=+0.057095839 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:27:29 compute-0 podman[229183]: 2025-09-30 21:27:29.33012077 +0000 UTC m=+0.065015378 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:27:29 compute-0 podman[229184]: 2025-09-30 21:27:29.356861524 +0000 UTC m=+0.090302616 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:27:30 compute-0 nova_compute[192810]: 2025-09-30 21:27:30.316 2 DEBUG oslo_concurrency.lockutils [None req-54d07c83-dc51-4f90-be66-5a4a76560dcf 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Acquiring lock "b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:27:30 compute-0 nova_compute[192810]: 2025-09-30 21:27:30.317 2 DEBUG oslo_concurrency.lockutils [None req-54d07c83-dc51-4f90-be66-5a4a76560dcf 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Lock "b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:27:30 compute-0 nova_compute[192810]: 2025-09-30 21:27:30.317 2 DEBUG oslo_concurrency.lockutils [None req-54d07c83-dc51-4f90-be66-5a4a76560dcf 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Acquiring lock "b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:27:30 compute-0 nova_compute[192810]: 2025-09-30 21:27:30.317 2 DEBUG oslo_concurrency.lockutils [None req-54d07c83-dc51-4f90-be66-5a4a76560dcf 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Lock "b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:27:30 compute-0 nova_compute[192810]: 2025-09-30 21:27:30.317 2 DEBUG oslo_concurrency.lockutils [None req-54d07c83-dc51-4f90-be66-5a4a76560dcf 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Lock "b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:30 compute-0 nova_compute[192810]: 2025-09-30 21:27:30.329 2 INFO nova.compute.manager [None req-54d07c83-dc51-4f90-be66-5a4a76560dcf 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Terminating instance
Sep 30 21:27:30 compute-0 nova_compute[192810]: 2025-09-30 21:27:30.341 2 DEBUG nova.compute.manager [None req-54d07c83-dc51-4f90-be66-5a4a76560dcf 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:27:30 compute-0 kernel: tap2beca347-40 (unregistering): left promiscuous mode
Sep 30 21:27:30 compute-0 NetworkManager[51733]: <info>  [1759267650.3650] device (tap2beca347-40): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:27:30 compute-0 ovn_controller[94912]: 2025-09-30T21:27:30Z|00228|binding|INFO|Releasing lport 2beca347-4057-4f9c-acfc-695b632cf134 from this chassis (sb_readonly=0)
Sep 30 21:27:30 compute-0 ovn_controller[94912]: 2025-09-30T21:27:30Z|00229|binding|INFO|Setting lport 2beca347-4057-4f9c-acfc-695b632cf134 down in Southbound
Sep 30 21:27:30 compute-0 ovn_controller[94912]: 2025-09-30T21:27:30Z|00230|binding|INFO|Removing iface tap2beca347-40 ovn-installed in OVS
Sep 30 21:27:30 compute-0 nova_compute[192810]: 2025-09-30 21:27:30.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:30.379 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:81:98:f3 10.100.0.10'], port_security=['fa:16:3e:81:98:f3 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-08f005d2-033f-4944-bdc2-a0b577ce731c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '05e2ec21372f4b1eaa7a1b74ff5f8fdc', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9e87fc14-353c-4b88-ab62-0a2c1cd416f2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3f15e30d-dd35-48c7-8d50-71cdc00dcefd, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=2beca347-4057-4f9c-acfc-695b632cf134) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:27:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:30.380 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 2beca347-4057-4f9c-acfc-695b632cf134 in datapath 08f005d2-033f-4944-bdc2-a0b577ce731c unbound from our chassis
Sep 30 21:27:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:30.381 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 08f005d2-033f-4944-bdc2-a0b577ce731c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:27:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:30.382 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[f321f94d-60e9-42cf-bd6d-5c3398288269]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:30.382 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-08f005d2-033f-4944-bdc2-a0b577ce731c namespace which is not needed anymore
Sep 30 21:27:30 compute-0 nova_compute[192810]: 2025-09-30 21:27:30.388 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:30 compute-0 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d00000041.scope: Deactivated successfully.
Sep 30 21:27:30 compute-0 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d00000041.scope: Consumed 3.205s CPU time.
Sep 30 21:27:30 compute-0 systemd-machined[152794]: Machine qemu-30-instance-00000041 terminated.
Sep 30 21:27:30 compute-0 neutron-haproxy-ovnmeta-08f005d2-033f-4944-bdc2-a0b577ce731c[229168]: [NOTICE]   (229172) : haproxy version is 2.8.14-c23fe91
Sep 30 21:27:30 compute-0 neutron-haproxy-ovnmeta-08f005d2-033f-4944-bdc2-a0b577ce731c[229168]: [NOTICE]   (229172) : path to executable is /usr/sbin/haproxy
Sep 30 21:27:30 compute-0 neutron-haproxy-ovnmeta-08f005d2-033f-4944-bdc2-a0b577ce731c[229168]: [WARNING]  (229172) : Exiting Master process...
Sep 30 21:27:30 compute-0 neutron-haproxy-ovnmeta-08f005d2-033f-4944-bdc2-a0b577ce731c[229168]: [WARNING]  (229172) : Exiting Master process...
Sep 30 21:27:30 compute-0 neutron-haproxy-ovnmeta-08f005d2-033f-4944-bdc2-a0b577ce731c[229168]: [ALERT]    (229172) : Current worker (229174) exited with code 143 (Terminated)
Sep 30 21:27:30 compute-0 neutron-haproxy-ovnmeta-08f005d2-033f-4944-bdc2-a0b577ce731c[229168]: [WARNING]  (229172) : All workers exited. Exiting... (0)
Sep 30 21:27:30 compute-0 systemd[1]: libpod-11ecae22cdd794568c090aa4c4fbd8812c3d8bad0d3850af0cadff69fb7567e0.scope: Deactivated successfully.
Sep 30 21:27:30 compute-0 podman[229265]: 2025-09-30 21:27:30.511427483 +0000 UTC m=+0.042356577 container died 11ecae22cdd794568c090aa4c4fbd8812c3d8bad0d3850af0cadff69fb7567e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08f005d2-033f-4944-bdc2-a0b577ce731c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.schema-version=1.0)
Sep 30 21:27:30 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-11ecae22cdd794568c090aa4c4fbd8812c3d8bad0d3850af0cadff69fb7567e0-userdata-shm.mount: Deactivated successfully.
Sep 30 21:27:30 compute-0 systemd[1]: var-lib-containers-storage-overlay-27178727ef424c03a6361a00b37bb00dc2c84d93ab7ff0549c951603bdb75dbc-merged.mount: Deactivated successfully.
Sep 30 21:27:30 compute-0 podman[229265]: 2025-09-30 21:27:30.548785584 +0000 UTC m=+0.079714668 container cleanup 11ecae22cdd794568c090aa4c4fbd8812c3d8bad0d3850af0cadff69fb7567e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08f005d2-033f-4944-bdc2-a0b577ce731c, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 21:27:30 compute-0 systemd[1]: libpod-conmon-11ecae22cdd794568c090aa4c4fbd8812c3d8bad0d3850af0cadff69fb7567e0.scope: Deactivated successfully.
Sep 30 21:27:30 compute-0 nova_compute[192810]: 2025-09-30 21:27:30.593 2 INFO nova.virt.libvirt.driver [-] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Instance destroyed successfully.
Sep 30 21:27:30 compute-0 nova_compute[192810]: 2025-09-30 21:27:30.594 2 DEBUG nova.objects.instance [None req-54d07c83-dc51-4f90-be66-5a4a76560dcf 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Lazy-loading 'resources' on Instance uuid b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:27:30 compute-0 nova_compute[192810]: 2025-09-30 21:27:30.609 2 DEBUG nova.virt.libvirt.vif [None req-54d07c83-dc51-4f90-be66-5a4a76560dcf 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:27:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-2088105630',display_name='tempest-ImagesTestJSON-server-2088105630',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-imagestestjson-server-2088105630',id=65,image_ref='0189ab7b-07cc-4c2c-8163-82553d5a429b',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:27:27Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='05e2ec21372f4b1eaa7a1b74ff5f8fdc',ramdisk_id='',reservation_id='r-y4ychanc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_boot_roles='reader,member',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='b20e658b-d136-42b9-ab5e-618c3dffbbec',image_min_disk='1',image_min_ram='0',image_owner_id='05e2ec21372f4b1eaa7a1b74ff5f8fdc',image_owner_project_name='tempest-ImagesTestJSON-1876097706',image_owner_user_name='tempest-ImagesTestJSON-1876097706-project-member',image_user_id='3fad9c6706e641d680597333f7ac5955',owner_project_name='tempest-ImagesTestJSON-1876097706',owner_user_name='tempest-ImagesTestJSON-1876097706-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:27:27Z,user_data=None,user_id='3fad9c6706e641d680597333f7ac5955',uuid=b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2beca347-4057-4f9c-acfc-695b632cf134", "address": "fa:16:3e:81:98:f3", "network": {"id": "08f005d2-033f-4944-bdc2-a0b577ce731c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-710412215-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05e2ec21372f4b1eaa7a1b74ff5f8fdc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2beca347-40", "ovs_interfaceid": "2beca347-4057-4f9c-acfc-695b632cf134", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:27:30 compute-0 nova_compute[192810]: 2025-09-30 21:27:30.609 2 DEBUG nova.network.os_vif_util [None req-54d07c83-dc51-4f90-be66-5a4a76560dcf 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Converting VIF {"id": "2beca347-4057-4f9c-acfc-695b632cf134", "address": "fa:16:3e:81:98:f3", "network": {"id": "08f005d2-033f-4944-bdc2-a0b577ce731c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-710412215-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05e2ec21372f4b1eaa7a1b74ff5f8fdc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2beca347-40", "ovs_interfaceid": "2beca347-4057-4f9c-acfc-695b632cf134", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:27:30 compute-0 nova_compute[192810]: 2025-09-30 21:27:30.610 2 DEBUG nova.network.os_vif_util [None req-54d07c83-dc51-4f90-be66-5a4a76560dcf 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:81:98:f3,bridge_name='br-int',has_traffic_filtering=True,id=2beca347-4057-4f9c-acfc-695b632cf134,network=Network(08f005d2-033f-4944-bdc2-a0b577ce731c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2beca347-40') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:27:30 compute-0 nova_compute[192810]: 2025-09-30 21:27:30.610 2 DEBUG os_vif [None req-54d07c83-dc51-4f90-be66-5a4a76560dcf 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:81:98:f3,bridge_name='br-int',has_traffic_filtering=True,id=2beca347-4057-4f9c-acfc-695b632cf134,network=Network(08f005d2-033f-4944-bdc2-a0b577ce731c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2beca347-40') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:27:30 compute-0 nova_compute[192810]: 2025-09-30 21:27:30.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:30 compute-0 nova_compute[192810]: 2025-09-30 21:27:30.612 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2beca347-40, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:27:30 compute-0 nova_compute[192810]: 2025-09-30 21:27:30.613 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:30 compute-0 nova_compute[192810]: 2025-09-30 21:27:30.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:30 compute-0 podman[229298]: 2025-09-30 21:27:30.614703695 +0000 UTC m=+0.040921782 container remove 11ecae22cdd794568c090aa4c4fbd8812c3d8bad0d3850af0cadff69fb7567e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08f005d2-033f-4944-bdc2-a0b577ce731c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 21:27:30 compute-0 nova_compute[192810]: 2025-09-30 21:27:30.616 2 INFO os_vif [None req-54d07c83-dc51-4f90-be66-5a4a76560dcf 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:81:98:f3,bridge_name='br-int',has_traffic_filtering=True,id=2beca347-4057-4f9c-acfc-695b632cf134,network=Network(08f005d2-033f-4944-bdc2-a0b577ce731c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2beca347-40')
Sep 30 21:27:30 compute-0 nova_compute[192810]: 2025-09-30 21:27:30.617 2 INFO nova.virt.libvirt.driver [None req-54d07c83-dc51-4f90-be66-5a4a76560dcf 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Deleting instance files /var/lib/nova/instances/b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1_del
Sep 30 21:27:30 compute-0 nova_compute[192810]: 2025-09-30 21:27:30.617 2 INFO nova.virt.libvirt.driver [None req-54d07c83-dc51-4f90-be66-5a4a76560dcf 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Deletion of /var/lib/nova/instances/b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1_del complete
Sep 30 21:27:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:30.619 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[ed9009b4-625a-45ab-b2b3-9fca49ebecce]: (4, ('Tue Sep 30 09:27:30 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-08f005d2-033f-4944-bdc2-a0b577ce731c (11ecae22cdd794568c090aa4c4fbd8812c3d8bad0d3850af0cadff69fb7567e0)\n11ecae22cdd794568c090aa4c4fbd8812c3d8bad0d3850af0cadff69fb7567e0\nTue Sep 30 09:27:30 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-08f005d2-033f-4944-bdc2-a0b577ce731c (11ecae22cdd794568c090aa4c4fbd8812c3d8bad0d3850af0cadff69fb7567e0)\n11ecae22cdd794568c090aa4c4fbd8812c3d8bad0d3850af0cadff69fb7567e0\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:30.620 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[7cc15eb3-f780-4e57-93dc-b9f21f7a18cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:30.621 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap08f005d2-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:27:30 compute-0 nova_compute[192810]: 2025-09-30 21:27:30.622 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:30 compute-0 kernel: tap08f005d2-00: left promiscuous mode
Sep 30 21:27:30 compute-0 nova_compute[192810]: 2025-09-30 21:27:30.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:30.636 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[48be5d54-8b27-4dfc-aeb3-5f8811df3d81]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:30 compute-0 nova_compute[192810]: 2025-09-30 21:27:30.647 2 DEBUG nova.compute.manager [req-c51b05f6-87e6-4eb7-9e72-634d85289e9a req-61c8e4bb-baab-48d6-8551-115e39e037fe dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Received event network-vif-unplugged-2beca347-4057-4f9c-acfc-695b632cf134 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:27:30 compute-0 nova_compute[192810]: 2025-09-30 21:27:30.647 2 DEBUG oslo_concurrency.lockutils [req-c51b05f6-87e6-4eb7-9e72-634d85289e9a req-61c8e4bb-baab-48d6-8551-115e39e037fe dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:27:30 compute-0 nova_compute[192810]: 2025-09-30 21:27:30.648 2 DEBUG oslo_concurrency.lockutils [req-c51b05f6-87e6-4eb7-9e72-634d85289e9a req-61c8e4bb-baab-48d6-8551-115e39e037fe dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:27:30 compute-0 nova_compute[192810]: 2025-09-30 21:27:30.648 2 DEBUG oslo_concurrency.lockutils [req-c51b05f6-87e6-4eb7-9e72-634d85289e9a req-61c8e4bb-baab-48d6-8551-115e39e037fe dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:30 compute-0 nova_compute[192810]: 2025-09-30 21:27:30.648 2 DEBUG nova.compute.manager [req-c51b05f6-87e6-4eb7-9e72-634d85289e9a req-61c8e4bb-baab-48d6-8551-115e39e037fe dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] No waiting events found dispatching network-vif-unplugged-2beca347-4057-4f9c-acfc-695b632cf134 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:27:30 compute-0 nova_compute[192810]: 2025-09-30 21:27:30.648 2 DEBUG nova.compute.manager [req-c51b05f6-87e6-4eb7-9e72-634d85289e9a req-61c8e4bb-baab-48d6-8551-115e39e037fe dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Received event network-vif-unplugged-2beca347-4057-4f9c-acfc-695b632cf134 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:27:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:30.668 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[27baa741-eb7d-42f9-b2af-2e906657f352]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:30.669 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[fee1699e-216f-408d-a7db-9a2a168d34b0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:30.683 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[f0a616c7-cb3e-4ff1-8453-dec4bcbf06a5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 434268, 'reachable_time': 37795, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229324, 'error': None, 'target': 'ovnmeta-08f005d2-033f-4944-bdc2-a0b577ce731c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:30 compute-0 systemd[1]: run-netns-ovnmeta\x2d08f005d2\x2d033f\x2d4944\x2dbdc2\x2da0b577ce731c.mount: Deactivated successfully.
Sep 30 21:27:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:30.688 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-08f005d2-033f-4944-bdc2-a0b577ce731c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:27:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:30.688 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[ffc16f17-33e1-4c09-9984-58eb59b392cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:27:30 compute-0 nova_compute[192810]: 2025-09-30 21:27:30.732 2 INFO nova.compute.manager [None req-54d07c83-dc51-4f90-be66-5a4a76560dcf 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Took 0.39 seconds to destroy the instance on the hypervisor.
Sep 30 21:27:30 compute-0 nova_compute[192810]: 2025-09-30 21:27:30.732 2 DEBUG oslo.service.loopingcall [None req-54d07c83-dc51-4f90-be66-5a4a76560dcf 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:27:30 compute-0 nova_compute[192810]: 2025-09-30 21:27:30.732 2 DEBUG nova.compute.manager [-] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:27:30 compute-0 nova_compute[192810]: 2025-09-30 21:27:30.733 2 DEBUG nova.network.neutron [-] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:27:31 compute-0 nova_compute[192810]: 2025-09-30 21:27:31.354 2 DEBUG nova.network.neutron [-] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:27:31 compute-0 nova_compute[192810]: 2025-09-30 21:27:31.373 2 INFO nova.compute.manager [-] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Took 0.64 seconds to deallocate network for instance.
Sep 30 21:27:31 compute-0 nova_compute[192810]: 2025-09-30 21:27:31.488 2 DEBUG oslo_concurrency.lockutils [None req-54d07c83-dc51-4f90-be66-5a4a76560dcf 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:27:31 compute-0 nova_compute[192810]: 2025-09-30 21:27:31.488 2 DEBUG oslo_concurrency.lockutils [None req-54d07c83-dc51-4f90-be66-5a4a76560dcf 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:27:31 compute-0 nova_compute[192810]: 2025-09-30 21:27:31.542 2 DEBUG nova.compute.provider_tree [None req-54d07c83-dc51-4f90-be66-5a4a76560dcf 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:27:31 compute-0 nova_compute[192810]: 2025-09-30 21:27:31.562 2 DEBUG nova.scheduler.client.report [None req-54d07c83-dc51-4f90-be66-5a4a76560dcf 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:27:31 compute-0 nova_compute[192810]: 2025-09-30 21:27:31.592 2 DEBUG oslo_concurrency.lockutils [None req-54d07c83-dc51-4f90-be66-5a4a76560dcf 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:31 compute-0 nova_compute[192810]: 2025-09-30 21:27:31.628 2 INFO nova.scheduler.client.report [None req-54d07c83-dc51-4f90-be66-5a4a76560dcf 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Deleted allocations for instance b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1
Sep 30 21:27:31 compute-0 nova_compute[192810]: 2025-09-30 21:27:31.742 2 DEBUG oslo_concurrency.lockutils [None req-54d07c83-dc51-4f90-be66-5a4a76560dcf 3fad9c6706e641d680597333f7ac5955 05e2ec21372f4b1eaa7a1b74ff5f8fdc - - default default] Lock "b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.426s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:32 compute-0 nova_compute[192810]: 2025-09-30 21:27:32.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:32 compute-0 nova_compute[192810]: 2025-09-30 21:27:32.788 2 DEBUG nova.compute.manager [req-5aea4241-d66f-4c21-a7ad-6f8d6b1b843a req-e44058d5-68dd-4712-a70b-94f68989014e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Received event network-vif-plugged-2beca347-4057-4f9c-acfc-695b632cf134 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:27:32 compute-0 nova_compute[192810]: 2025-09-30 21:27:32.788 2 DEBUG oslo_concurrency.lockutils [req-5aea4241-d66f-4c21-a7ad-6f8d6b1b843a req-e44058d5-68dd-4712-a70b-94f68989014e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:27:32 compute-0 nova_compute[192810]: 2025-09-30 21:27:32.789 2 DEBUG oslo_concurrency.lockutils [req-5aea4241-d66f-4c21-a7ad-6f8d6b1b843a req-e44058d5-68dd-4712-a70b-94f68989014e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:27:32 compute-0 nova_compute[192810]: 2025-09-30 21:27:32.789 2 DEBUG oslo_concurrency.lockutils [req-5aea4241-d66f-4c21-a7ad-6f8d6b1b843a req-e44058d5-68dd-4712-a70b-94f68989014e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:32 compute-0 nova_compute[192810]: 2025-09-30 21:27:32.790 2 DEBUG nova.compute.manager [req-5aea4241-d66f-4c21-a7ad-6f8d6b1b843a req-e44058d5-68dd-4712-a70b-94f68989014e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] No waiting events found dispatching network-vif-plugged-2beca347-4057-4f9c-acfc-695b632cf134 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:27:32 compute-0 nova_compute[192810]: 2025-09-30 21:27:32.790 2 WARNING nova.compute.manager [req-5aea4241-d66f-4c21-a7ad-6f8d6b1b843a req-e44058d5-68dd-4712-a70b-94f68989014e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Received unexpected event network-vif-plugged-2beca347-4057-4f9c-acfc-695b632cf134 for instance with vm_state deleted and task_state None.
Sep 30 21:27:32 compute-0 nova_compute[192810]: 2025-09-30 21:27:32.790 2 DEBUG nova.compute.manager [req-5aea4241-d66f-4c21-a7ad-6f8d6b1b843a req-e44058d5-68dd-4712-a70b-94f68989014e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Received event network-vif-deleted-2beca347-4057-4f9c-acfc-695b632cf134 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:27:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:35.295 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:27:35 compute-0 nova_compute[192810]: 2025-09-30 21:27:35.653 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:37 compute-0 nova_compute[192810]: 2025-09-30 21:27:37.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:38.733 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:27:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:38.734 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:27:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:27:38.734 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:39 compute-0 nova_compute[192810]: 2025-09-30 21:27:39.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:27:40 compute-0 podman[229326]: 2025-09-30 21:27:40.382809567 +0000 UTC m=+0.107901349 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute)
Sep 30 21:27:40 compute-0 podman[229325]: 2025-09-30 21:27:40.39721619 +0000 UTC m=+0.123674726 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 21:27:40 compute-0 nova_compute[192810]: 2025-09-30 21:27:40.654 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:41 compute-0 nova_compute[192810]: 2025-09-30 21:27:41.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:27:42 compute-0 nova_compute[192810]: 2025-09-30 21:27:42.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:43 compute-0 nova_compute[192810]: 2025-09-30 21:27:43.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:27:43 compute-0 nova_compute[192810]: 2025-09-30 21:27:43.789 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:27:43 compute-0 nova_compute[192810]: 2025-09-30 21:27:43.789 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:27:45 compute-0 nova_compute[192810]: 2025-09-30 21:27:45.592 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267650.591642, b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:27:45 compute-0 nova_compute[192810]: 2025-09-30 21:27:45.593 2 INFO nova.compute.manager [-] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] VM Stopped (Lifecycle Event)
Sep 30 21:27:45 compute-0 nova_compute[192810]: 2025-09-30 21:27:45.619 2 DEBUG nova.compute.manager [None req-e6b23752-9fff-4135-9883-646cdeceec01 - - - - - -] [instance: b7cbe72d-7ebb-48c0-962a-6ee8f73d41c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:27:45 compute-0 nova_compute[192810]: 2025-09-30 21:27:45.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:45 compute-0 nova_compute[192810]: 2025-09-30 21:27:45.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:27:45 compute-0 nova_compute[192810]: 2025-09-30 21:27:45.813 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:27:45 compute-0 nova_compute[192810]: 2025-09-30 21:27:45.814 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:27:45 compute-0 nova_compute[192810]: 2025-09-30 21:27:45.814 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:45 compute-0 nova_compute[192810]: 2025-09-30 21:27:45.814 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:27:45 compute-0 podman[229371]: 2025-09-30 21:27:45.938894926 +0000 UTC m=+0.081732030 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:27:46 compute-0 nova_compute[192810]: 2025-09-30 21:27:46.002 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:27:46 compute-0 nova_compute[192810]: 2025-09-30 21:27:46.003 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5669MB free_disk=73.31929397583008GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:27:46 compute-0 nova_compute[192810]: 2025-09-30 21:27:46.003 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:27:46 compute-0 nova_compute[192810]: 2025-09-30 21:27:46.004 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:27:46 compute-0 nova_compute[192810]: 2025-09-30 21:27:46.101 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:27:46 compute-0 nova_compute[192810]: 2025-09-30 21:27:46.102 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:27:46 compute-0 nova_compute[192810]: 2025-09-30 21:27:46.140 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:27:46 compute-0 nova_compute[192810]: 2025-09-30 21:27:46.156 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:27:46 compute-0 nova_compute[192810]: 2025-09-30 21:27:46.174 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:27:46 compute-0 nova_compute[192810]: 2025-09-30 21:27:46.174 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:27:47 compute-0 nova_compute[192810]: 2025-09-30 21:27:47.038 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:47 compute-0 nova_compute[192810]: 2025-09-30 21:27:47.175 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:27:47 compute-0 nova_compute[192810]: 2025-09-30 21:27:47.785 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:27:47 compute-0 nova_compute[192810]: 2025-09-30 21:27:47.786 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:27:48 compute-0 nova_compute[192810]: 2025-09-30 21:27:48.784 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:27:48 compute-0 nova_compute[192810]: 2025-09-30 21:27:48.809 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:27:48 compute-0 nova_compute[192810]: 2025-09-30 21:27:48.809 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:27:48 compute-0 nova_compute[192810]: 2025-09-30 21:27:48.810 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:27:48 compute-0 nova_compute[192810]: 2025-09-30 21:27:48.827 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 21:27:49 compute-0 sshd[128205]: Timeout before authentication for connection from 49.64.169.153 to 38.102.83.69, pid = 227769
Sep 30 21:27:50 compute-0 nova_compute[192810]: 2025-09-30 21:27:50.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:52 compute-0 nova_compute[192810]: 2025-09-30 21:27:52.040 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:53 compute-0 podman[229390]: 2025-09-30 21:27:53.330500193 +0000 UTC m=+0.069111502 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:27:53 compute-0 podman[229391]: 2025-09-30 21:27:53.330619526 +0000 UTC m=+0.069142192 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., vcs-type=git, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Sep 30 21:27:55 compute-0 nova_compute[192810]: 2025-09-30 21:27:55.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:57 compute-0 nova_compute[192810]: 2025-09-30 21:27:57.043 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:58 compute-0 nova_compute[192810]: 2025-09-30 21:27:58.057 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:27:59 compute-0 sshd-session[229434]: Invalid user root11 from 45.81.23.80 port 42338
Sep 30 21:27:59 compute-0 sshd-session[229434]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:27:59 compute-0 sshd-session[229434]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=45.81.23.80
Sep 30 21:28:00 compute-0 podman[229436]: 2025-09-30 21:28:00.346338393 +0000 UTC m=+0.075335185 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 21:28:00 compute-0 podman[229438]: 2025-09-30 21:28:00.348215331 +0000 UTC m=+0.081663735 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:28:00 compute-0 podman[229437]: 2025-09-30 21:28:00.362376989 +0000 UTC m=+0.089601506 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3)
Sep 30 21:28:00 compute-0 nova_compute[192810]: 2025-09-30 21:28:00.663 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:01 compute-0 sshd-session[229434]: Failed password for invalid user root11 from 45.81.23.80 port 42338 ssh2
Sep 30 21:28:02 compute-0 nova_compute[192810]: 2025-09-30 21:28:02.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:02 compute-0 sshd-session[229434]: Received disconnect from 45.81.23.80 port 42338:11: Bye Bye [preauth]
Sep 30 21:28:02 compute-0 sshd-session[229434]: Disconnected from invalid user root11 45.81.23.80 port 42338 [preauth]
Sep 30 21:28:02 compute-0 nova_compute[192810]: 2025-09-30 21:28:02.581 2 DEBUG oslo_concurrency.lockutils [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "3e2a9ae7-e454-4121-905d-4472ddfc2410" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:02 compute-0 nova_compute[192810]: 2025-09-30 21:28:02.582 2 DEBUG oslo_concurrency.lockutils [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "3e2a9ae7-e454-4121-905d-4472ddfc2410" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:02 compute-0 nova_compute[192810]: 2025-09-30 21:28:02.597 2 DEBUG nova.compute.manager [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:28:02 compute-0 nova_compute[192810]: 2025-09-30 21:28:02.706 2 DEBUG oslo_concurrency.lockutils [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:02 compute-0 nova_compute[192810]: 2025-09-30 21:28:02.706 2 DEBUG oslo_concurrency.lockutils [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:02 compute-0 nova_compute[192810]: 2025-09-30 21:28:02.715 2 DEBUG nova.virt.hardware [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:28:02 compute-0 nova_compute[192810]: 2025-09-30 21:28:02.715 2 INFO nova.compute.claims [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:28:02 compute-0 nova_compute[192810]: 2025-09-30 21:28:02.831 2 DEBUG nova.compute.provider_tree [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:28:02 compute-0 nova_compute[192810]: 2025-09-30 21:28:02.863 2 DEBUG nova.scheduler.client.report [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:28:02 compute-0 nova_compute[192810]: 2025-09-30 21:28:02.890 2 DEBUG oslo_concurrency.lockutils [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.184s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:02 compute-0 nova_compute[192810]: 2025-09-30 21:28:02.891 2 DEBUG nova.compute.manager [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:28:02 compute-0 nova_compute[192810]: 2025-09-30 21:28:02.947 2 DEBUG nova.compute.manager [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:28:02 compute-0 nova_compute[192810]: 2025-09-30 21:28:02.948 2 DEBUG nova.network.neutron [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:28:02 compute-0 nova_compute[192810]: 2025-09-30 21:28:02.969 2 INFO nova.virt.libvirt.driver [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:28:02 compute-0 nova_compute[192810]: 2025-09-30 21:28:02.987 2 DEBUG nova.compute.manager [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:28:03 compute-0 nova_compute[192810]: 2025-09-30 21:28:03.142 2 DEBUG nova.compute.manager [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:28:03 compute-0 nova_compute[192810]: 2025-09-30 21:28:03.144 2 DEBUG nova.virt.libvirt.driver [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:28:03 compute-0 nova_compute[192810]: 2025-09-30 21:28:03.145 2 INFO nova.virt.libvirt.driver [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Creating image(s)
Sep 30 21:28:03 compute-0 nova_compute[192810]: 2025-09-30 21:28:03.146 2 DEBUG oslo_concurrency.lockutils [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "/var/lib/nova/instances/3e2a9ae7-e454-4121-905d-4472ddfc2410/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:03 compute-0 nova_compute[192810]: 2025-09-30 21:28:03.146 2 DEBUG oslo_concurrency.lockutils [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "/var/lib/nova/instances/3e2a9ae7-e454-4121-905d-4472ddfc2410/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:03 compute-0 nova_compute[192810]: 2025-09-30 21:28:03.147 2 DEBUG oslo_concurrency.lockutils [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "/var/lib/nova/instances/3e2a9ae7-e454-4121-905d-4472ddfc2410/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:03 compute-0 nova_compute[192810]: 2025-09-30 21:28:03.163 2 DEBUG oslo_concurrency.processutils [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:28:03 compute-0 nova_compute[192810]: 2025-09-30 21:28:03.180 2 DEBUG nova.policy [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:28:03 compute-0 nova_compute[192810]: 2025-09-30 21:28:03.213 2 DEBUG oslo_concurrency.processutils [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:28:03 compute-0 nova_compute[192810]: 2025-09-30 21:28:03.214 2 DEBUG oslo_concurrency.lockutils [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:03 compute-0 nova_compute[192810]: 2025-09-30 21:28:03.214 2 DEBUG oslo_concurrency.lockutils [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:03 compute-0 nova_compute[192810]: 2025-09-30 21:28:03.225 2 DEBUG oslo_concurrency.processutils [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:28:03 compute-0 nova_compute[192810]: 2025-09-30 21:28:03.273 2 DEBUG oslo_concurrency.processutils [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:28:03 compute-0 nova_compute[192810]: 2025-09-30 21:28:03.274 2 DEBUG oslo_concurrency.processutils [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/3e2a9ae7-e454-4121-905d-4472ddfc2410/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:28:03 compute-0 nova_compute[192810]: 2025-09-30 21:28:03.307 2 DEBUG oslo_concurrency.processutils [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/3e2a9ae7-e454-4121-905d-4472ddfc2410/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:28:03 compute-0 nova_compute[192810]: 2025-09-30 21:28:03.308 2 DEBUG oslo_concurrency.lockutils [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.093s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:03 compute-0 nova_compute[192810]: 2025-09-30 21:28:03.308 2 DEBUG oslo_concurrency.processutils [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:28:03 compute-0 nova_compute[192810]: 2025-09-30 21:28:03.373 2 DEBUG oslo_concurrency.processutils [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:28:03 compute-0 nova_compute[192810]: 2025-09-30 21:28:03.374 2 DEBUG nova.virt.disk.api [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Checking if we can resize image /var/lib/nova/instances/3e2a9ae7-e454-4121-905d-4472ddfc2410/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:28:03 compute-0 nova_compute[192810]: 2025-09-30 21:28:03.374 2 DEBUG oslo_concurrency.processutils [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3e2a9ae7-e454-4121-905d-4472ddfc2410/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:28:03 compute-0 nova_compute[192810]: 2025-09-30 21:28:03.432 2 DEBUG oslo_concurrency.processutils [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3e2a9ae7-e454-4121-905d-4472ddfc2410/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:28:03 compute-0 nova_compute[192810]: 2025-09-30 21:28:03.433 2 DEBUG nova.virt.disk.api [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Cannot resize image /var/lib/nova/instances/3e2a9ae7-e454-4121-905d-4472ddfc2410/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:28:03 compute-0 nova_compute[192810]: 2025-09-30 21:28:03.434 2 DEBUG nova.objects.instance [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'migration_context' on Instance uuid 3e2a9ae7-e454-4121-905d-4472ddfc2410 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:28:03 compute-0 nova_compute[192810]: 2025-09-30 21:28:03.468 2 DEBUG nova.virt.libvirt.driver [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:28:03 compute-0 nova_compute[192810]: 2025-09-30 21:28:03.468 2 DEBUG nova.virt.libvirt.driver [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Ensure instance console log exists: /var/lib/nova/instances/3e2a9ae7-e454-4121-905d-4472ddfc2410/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:28:03 compute-0 nova_compute[192810]: 2025-09-30 21:28:03.469 2 DEBUG oslo_concurrency.lockutils [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:03 compute-0 nova_compute[192810]: 2025-09-30 21:28:03.469 2 DEBUG oslo_concurrency.lockutils [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:03 compute-0 nova_compute[192810]: 2025-09-30 21:28:03.469 2 DEBUG oslo_concurrency.lockutils [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:03 compute-0 nova_compute[192810]: 2025-09-30 21:28:03.722 2 DEBUG nova.network.neutron [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Successfully created port: 13997735-de00-4f74-bf02-5543abf3fa59 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:28:04 compute-0 nova_compute[192810]: 2025-09-30 21:28:04.414 2 DEBUG nova.network.neutron [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Successfully updated port: 13997735-de00-4f74-bf02-5543abf3fa59 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:28:04 compute-0 nova_compute[192810]: 2025-09-30 21:28:04.436 2 DEBUG oslo_concurrency.lockutils [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "refresh_cache-3e2a9ae7-e454-4121-905d-4472ddfc2410" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:28:04 compute-0 nova_compute[192810]: 2025-09-30 21:28:04.437 2 DEBUG oslo_concurrency.lockutils [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquired lock "refresh_cache-3e2a9ae7-e454-4121-905d-4472ddfc2410" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:28:04 compute-0 nova_compute[192810]: 2025-09-30 21:28:04.437 2 DEBUG nova.network.neutron [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:28:04 compute-0 nova_compute[192810]: 2025-09-30 21:28:04.552 2 DEBUG nova.compute.manager [req-976de5db-1a08-4eb5-9240-793e32b2bf74 req-a944184d-6ed8-4821-bfe9-535390dc338f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Received event network-changed-13997735-de00-4f74-bf02-5543abf3fa59 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:28:04 compute-0 nova_compute[192810]: 2025-09-30 21:28:04.552 2 DEBUG nova.compute.manager [req-976de5db-1a08-4eb5-9240-793e32b2bf74 req-a944184d-6ed8-4821-bfe9-535390dc338f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Refreshing instance network info cache due to event network-changed-13997735-de00-4f74-bf02-5543abf3fa59. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:28:04 compute-0 nova_compute[192810]: 2025-09-30 21:28:04.553 2 DEBUG oslo_concurrency.lockutils [req-976de5db-1a08-4eb5-9240-793e32b2bf74 req-a944184d-6ed8-4821-bfe9-535390dc338f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-3e2a9ae7-e454-4121-905d-4472ddfc2410" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:28:05 compute-0 nova_compute[192810]: 2025-09-30 21:28:05.539 2 DEBUG nova.network.neutron [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:28:05 compute-0 nova_compute[192810]: 2025-09-30 21:28:05.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:07 compute-0 nova_compute[192810]: 2025-09-30 21:28:07.046 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:07 compute-0 nova_compute[192810]: 2025-09-30 21:28:07.535 2 DEBUG nova.network.neutron [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Updating instance_info_cache with network_info: [{"id": "13997735-de00-4f74-bf02-5543abf3fa59", "address": "fa:16:3e:88:bb:88", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13997735-de", "ovs_interfaceid": "13997735-de00-4f74-bf02-5543abf3fa59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:28:07 compute-0 nova_compute[192810]: 2025-09-30 21:28:07.555 2 DEBUG oslo_concurrency.lockutils [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Releasing lock "refresh_cache-3e2a9ae7-e454-4121-905d-4472ddfc2410" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:28:07 compute-0 nova_compute[192810]: 2025-09-30 21:28:07.556 2 DEBUG nova.compute.manager [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Instance network_info: |[{"id": "13997735-de00-4f74-bf02-5543abf3fa59", "address": "fa:16:3e:88:bb:88", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13997735-de", "ovs_interfaceid": "13997735-de00-4f74-bf02-5543abf3fa59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:28:07 compute-0 nova_compute[192810]: 2025-09-30 21:28:07.556 2 DEBUG oslo_concurrency.lockutils [req-976de5db-1a08-4eb5-9240-793e32b2bf74 req-a944184d-6ed8-4821-bfe9-535390dc338f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-3e2a9ae7-e454-4121-905d-4472ddfc2410" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:28:07 compute-0 nova_compute[192810]: 2025-09-30 21:28:07.557 2 DEBUG nova.network.neutron [req-976de5db-1a08-4eb5-9240-793e32b2bf74 req-a944184d-6ed8-4821-bfe9-535390dc338f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Refreshing network info cache for port 13997735-de00-4f74-bf02-5543abf3fa59 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:28:07 compute-0 nova_compute[192810]: 2025-09-30 21:28:07.559 2 DEBUG nova.virt.libvirt.driver [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Start _get_guest_xml network_info=[{"id": "13997735-de00-4f74-bf02-5543abf3fa59", "address": "fa:16:3e:88:bb:88", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13997735-de", "ovs_interfaceid": "13997735-de00-4f74-bf02-5543abf3fa59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:28:07 compute-0 nova_compute[192810]: 2025-09-30 21:28:07.563 2 WARNING nova.virt.libvirt.driver [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:28:07 compute-0 nova_compute[192810]: 2025-09-30 21:28:07.569 2 DEBUG nova.virt.libvirt.host [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:28:07 compute-0 nova_compute[192810]: 2025-09-30 21:28:07.570 2 DEBUG nova.virt.libvirt.host [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:28:07 compute-0 nova_compute[192810]: 2025-09-30 21:28:07.575 2 DEBUG nova.virt.libvirt.host [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:28:07 compute-0 nova_compute[192810]: 2025-09-30 21:28:07.575 2 DEBUG nova.virt.libvirt.host [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:28:07 compute-0 nova_compute[192810]: 2025-09-30 21:28:07.576 2 DEBUG nova.virt.libvirt.driver [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:28:07 compute-0 nova_compute[192810]: 2025-09-30 21:28:07.577 2 DEBUG nova.virt.hardware [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:28:07 compute-0 nova_compute[192810]: 2025-09-30 21:28:07.577 2 DEBUG nova.virt.hardware [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:28:07 compute-0 nova_compute[192810]: 2025-09-30 21:28:07.577 2 DEBUG nova.virt.hardware [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:28:07 compute-0 nova_compute[192810]: 2025-09-30 21:28:07.577 2 DEBUG nova.virt.hardware [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:28:07 compute-0 nova_compute[192810]: 2025-09-30 21:28:07.578 2 DEBUG nova.virt.hardware [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:28:07 compute-0 nova_compute[192810]: 2025-09-30 21:28:07.578 2 DEBUG nova.virt.hardware [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:28:07 compute-0 nova_compute[192810]: 2025-09-30 21:28:07.578 2 DEBUG nova.virt.hardware [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:28:07 compute-0 nova_compute[192810]: 2025-09-30 21:28:07.578 2 DEBUG nova.virt.hardware [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:28:07 compute-0 nova_compute[192810]: 2025-09-30 21:28:07.578 2 DEBUG nova.virt.hardware [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:28:07 compute-0 nova_compute[192810]: 2025-09-30 21:28:07.579 2 DEBUG nova.virt.hardware [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:28:07 compute-0 nova_compute[192810]: 2025-09-30 21:28:07.579 2 DEBUG nova.virt.hardware [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:28:07 compute-0 nova_compute[192810]: 2025-09-30 21:28:07.582 2 DEBUG nova.virt.libvirt.vif [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:28:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-752562791',display_name='tempest-tempest.common.compute-instance-752562791',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-752562791',id=67,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2af578a858a44374a3dc027bbf7c69f2',ramdisk_id='',reservation_id='r-ytwo185j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1867667353',owner_user_name='tempest-ServerActionsTestJSON-1867667353-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:28:03Z,user_data=None,user_id='22ed16bd4ffe4ef8bb21968a857066a1',uuid=3e2a9ae7-e454-4121-905d-4472ddfc2410,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "13997735-de00-4f74-bf02-5543abf3fa59", "address": "fa:16:3e:88:bb:88", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13997735-de", "ovs_interfaceid": "13997735-de00-4f74-bf02-5543abf3fa59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:28:07 compute-0 nova_compute[192810]: 2025-09-30 21:28:07.582 2 DEBUG nova.network.os_vif_util [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converting VIF {"id": "13997735-de00-4f74-bf02-5543abf3fa59", "address": "fa:16:3e:88:bb:88", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13997735-de", "ovs_interfaceid": "13997735-de00-4f74-bf02-5543abf3fa59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:28:07 compute-0 nova_compute[192810]: 2025-09-30 21:28:07.583 2 DEBUG nova.network.os_vif_util [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:88:bb:88,bridge_name='br-int',has_traffic_filtering=True,id=13997735-de00-4f74-bf02-5543abf3fa59,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13997735-de') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:28:07 compute-0 nova_compute[192810]: 2025-09-30 21:28:07.583 2 DEBUG nova.objects.instance [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3e2a9ae7-e454-4121-905d-4472ddfc2410 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:28:07 compute-0 nova_compute[192810]: 2025-09-30 21:28:07.597 2 DEBUG nova.virt.libvirt.driver [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:28:07 compute-0 nova_compute[192810]:   <uuid>3e2a9ae7-e454-4121-905d-4472ddfc2410</uuid>
Sep 30 21:28:07 compute-0 nova_compute[192810]:   <name>instance-00000043</name>
Sep 30 21:28:07 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:28:07 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:28:07 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:28:07 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:       <nova:name>tempest-tempest.common.compute-instance-752562791</nova:name>
Sep 30 21:28:07 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:28:07</nova:creationTime>
Sep 30 21:28:07 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:28:07 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:28:07 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:28:07 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:28:07 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:28:07 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:28:07 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:28:07 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:28:07 compute-0 nova_compute[192810]:         <nova:user uuid="22ed16bd4ffe4ef8bb21968a857066a1">tempest-ServerActionsTestJSON-1867667353-project-member</nova:user>
Sep 30 21:28:07 compute-0 nova_compute[192810]:         <nova:project uuid="2af578a858a44374a3dc027bbf7c69f2">tempest-ServerActionsTestJSON-1867667353</nova:project>
Sep 30 21:28:07 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:28:07 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:28:07 compute-0 nova_compute[192810]:         <nova:port uuid="13997735-de00-4f74-bf02-5543abf3fa59">
Sep 30 21:28:07 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:28:07 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:28:07 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:28:07 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:28:07 compute-0 nova_compute[192810]:     <system>
Sep 30 21:28:07 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:28:07 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:28:07 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:28:07 compute-0 nova_compute[192810]:       <entry name="serial">3e2a9ae7-e454-4121-905d-4472ddfc2410</entry>
Sep 30 21:28:07 compute-0 nova_compute[192810]:       <entry name="uuid">3e2a9ae7-e454-4121-905d-4472ddfc2410</entry>
Sep 30 21:28:07 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     </system>
Sep 30 21:28:07 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:28:07 compute-0 nova_compute[192810]:   <os>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:   </os>
Sep 30 21:28:07 compute-0 nova_compute[192810]:   <features>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:   </features>
Sep 30 21:28:07 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:28:07 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:28:07 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:28:07 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:28:07 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:28:07 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/3e2a9ae7-e454-4121-905d-4472ddfc2410/disk"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:28:07 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/3e2a9ae7-e454-4121-905d-4472ddfc2410/disk.config"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:28:07 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:88:bb:88"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:       <target dev="tap13997735-de"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:28:07 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/3e2a9ae7-e454-4121-905d-4472ddfc2410/console.log" append="off"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     <video>
Sep 30 21:28:07 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     </video>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:28:07 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:28:07 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:28:07 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:28:07 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:28:07 compute-0 nova_compute[192810]: </domain>
Sep 30 21:28:07 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:28:07 compute-0 nova_compute[192810]: 2025-09-30 21:28:07.598 2 DEBUG nova.compute.manager [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Preparing to wait for external event network-vif-plugged-13997735-de00-4f74-bf02-5543abf3fa59 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:28:07 compute-0 nova_compute[192810]: 2025-09-30 21:28:07.599 2 DEBUG oslo_concurrency.lockutils [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "3e2a9ae7-e454-4121-905d-4472ddfc2410-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:07 compute-0 nova_compute[192810]: 2025-09-30 21:28:07.599 2 DEBUG oslo_concurrency.lockutils [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "3e2a9ae7-e454-4121-905d-4472ddfc2410-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:07 compute-0 nova_compute[192810]: 2025-09-30 21:28:07.599 2 DEBUG oslo_concurrency.lockutils [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "3e2a9ae7-e454-4121-905d-4472ddfc2410-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:07 compute-0 nova_compute[192810]: 2025-09-30 21:28:07.600 2 DEBUG nova.virt.libvirt.vif [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:28:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-752562791',display_name='tempest-tempest.common.compute-instance-752562791',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-752562791',id=67,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2af578a858a44374a3dc027bbf7c69f2',ramdisk_id='',reservation_id='r-ytwo185j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1867667353',owner_user_name='tempest-ServerActio
nsTestJSON-1867667353-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:28:03Z,user_data=None,user_id='22ed16bd4ffe4ef8bb21968a857066a1',uuid=3e2a9ae7-e454-4121-905d-4472ddfc2410,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "13997735-de00-4f74-bf02-5543abf3fa59", "address": "fa:16:3e:88:bb:88", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13997735-de", "ovs_interfaceid": "13997735-de00-4f74-bf02-5543abf3fa59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:28:07 compute-0 nova_compute[192810]: 2025-09-30 21:28:07.600 2 DEBUG nova.network.os_vif_util [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converting VIF {"id": "13997735-de00-4f74-bf02-5543abf3fa59", "address": "fa:16:3e:88:bb:88", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13997735-de", "ovs_interfaceid": "13997735-de00-4f74-bf02-5543abf3fa59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:28:07 compute-0 nova_compute[192810]: 2025-09-30 21:28:07.601 2 DEBUG nova.network.os_vif_util [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:88:bb:88,bridge_name='br-int',has_traffic_filtering=True,id=13997735-de00-4f74-bf02-5543abf3fa59,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13997735-de') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:28:07 compute-0 nova_compute[192810]: 2025-09-30 21:28:07.603 2 DEBUG os_vif [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:88:bb:88,bridge_name='br-int',has_traffic_filtering=True,id=13997735-de00-4f74-bf02-5543abf3fa59,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13997735-de') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:28:07 compute-0 nova_compute[192810]: 2025-09-30 21:28:07.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:07 compute-0 nova_compute[192810]: 2025-09-30 21:28:07.604 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:07 compute-0 nova_compute[192810]: 2025-09-30 21:28:07.604 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:28:07 compute-0 nova_compute[192810]: 2025-09-30 21:28:07.606 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:07 compute-0 nova_compute[192810]: 2025-09-30 21:28:07.606 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap13997735-de, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:07 compute-0 nova_compute[192810]: 2025-09-30 21:28:07.607 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap13997735-de, col_values=(('external_ids', {'iface-id': '13997735-de00-4f74-bf02-5543abf3fa59', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:88:bb:88', 'vm-uuid': '3e2a9ae7-e454-4121-905d-4472ddfc2410'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:07 compute-0 NetworkManager[51733]: <info>  [1759267687.6091] manager: (tap13997735-de): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/102)
Sep 30 21:28:07 compute-0 nova_compute[192810]: 2025-09-30 21:28:07.610 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:28:07 compute-0 nova_compute[192810]: 2025-09-30 21:28:07.613 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:07 compute-0 nova_compute[192810]: 2025-09-30 21:28:07.614 2 INFO os_vif [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:88:bb:88,bridge_name='br-int',has_traffic_filtering=True,id=13997735-de00-4f74-bf02-5543abf3fa59,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13997735-de')
Sep 30 21:28:07 compute-0 nova_compute[192810]: 2025-09-30 21:28:07.665 2 DEBUG nova.virt.libvirt.driver [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:28:07 compute-0 nova_compute[192810]: 2025-09-30 21:28:07.666 2 DEBUG nova.virt.libvirt.driver [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:28:07 compute-0 nova_compute[192810]: 2025-09-30 21:28:07.667 2 DEBUG nova.virt.libvirt.driver [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] No VIF found with MAC fa:16:3e:88:bb:88, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:28:07 compute-0 nova_compute[192810]: 2025-09-30 21:28:07.667 2 INFO nova.virt.libvirt.driver [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Using config drive
Sep 30 21:28:08 compute-0 nova_compute[192810]: 2025-09-30 21:28:08.052 2 INFO nova.virt.libvirt.driver [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Creating config drive at /var/lib/nova/instances/3e2a9ae7-e454-4121-905d-4472ddfc2410/disk.config
Sep 30 21:28:08 compute-0 nova_compute[192810]: 2025-09-30 21:28:08.057 2 DEBUG oslo_concurrency.processutils [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3e2a9ae7-e454-4121-905d-4472ddfc2410/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkternvj8 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:28:08 compute-0 nova_compute[192810]: 2025-09-30 21:28:08.180 2 DEBUG oslo_concurrency.processutils [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3e2a9ae7-e454-4121-905d-4472ddfc2410/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkternvj8" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:28:08 compute-0 kernel: tap13997735-de: entered promiscuous mode
Sep 30 21:28:08 compute-0 NetworkManager[51733]: <info>  [1759267688.2492] manager: (tap13997735-de): new Tun device (/org/freedesktop/NetworkManager/Devices/103)
Sep 30 21:28:08 compute-0 ovn_controller[94912]: 2025-09-30T21:28:08Z|00231|binding|INFO|Claiming lport 13997735-de00-4f74-bf02-5543abf3fa59 for this chassis.
Sep 30 21:28:08 compute-0 ovn_controller[94912]: 2025-09-30T21:28:08Z|00232|binding|INFO|13997735-de00-4f74-bf02-5543abf3fa59: Claiming fa:16:3e:88:bb:88 10.100.0.13
Sep 30 21:28:08 compute-0 nova_compute[192810]: 2025-09-30 21:28:08.249 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:08 compute-0 nova_compute[192810]: 2025-09-30 21:28:08.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:08 compute-0 nova_compute[192810]: 2025-09-30 21:28:08.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:08 compute-0 nova_compute[192810]: 2025-09-30 21:28:08.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:08 compute-0 NetworkManager[51733]: <info>  [1759267688.2739] manager: (patch-br-int-to-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/104)
Sep 30 21:28:08 compute-0 NetworkManager[51733]: <info>  [1759267688.2756] manager: (patch-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/105)
Sep 30 21:28:08 compute-0 nova_compute[192810]: 2025-09-30 21:28:08.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:08.276 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:88:bb:88 10.100.0.13'], port_security=['fa:16:3e:88:bb:88 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '3e2a9ae7-e454-4121-905d-4472ddfc2410', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9692dd1-658f-4c07-943c-6bc662046dc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2af578a858a44374a3dc027bbf7c69f2', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5c13c790-c8c7-422e-b90f-52f901ab71f2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a290e6b7-09a2-435f-ae19-df4a5ccfc2d7, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=13997735-de00-4f74-bf02-5543abf3fa59) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:08.279 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 13997735-de00-4f74-bf02-5543abf3fa59 in datapath f9692dd1-658f-4c07-943c-6bc662046dc4 bound to our chassis
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:08.282 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f9692dd1-658f-4c07-943c-6bc662046dc4
Sep 30 21:28:08 compute-0 systemd-udevd[229527]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:28:08 compute-0 systemd-machined[152794]: New machine qemu-31-instance-00000043.
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:08.297 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[2d870619-4351-4e81-8f9e-2f52a5db6f6b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:08.298 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf9692dd1-61 in ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:28:08 compute-0 NetworkManager[51733]: <info>  [1759267688.3029] device (tap13997735-de): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:08.300 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf9692dd1-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:08.301 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[c660c9b7-37bc-496c-94db-c69e916d43d2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:08 compute-0 NetworkManager[51733]: <info>  [1759267688.3051] device (tap13997735-de): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:08.305 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e2ce92a4-d1f7-4801-9781-9bdb9eb27b30]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:08.316 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[860f22e0-41a1-4c79-87c7-e237a62aec87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:08.343 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[5d11c82d-641a-492f-978a-33382d7c38e1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:08.374 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[30db2337-3e35-44c7-ab78-12c65e20dd42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:08 compute-0 systemd[1]: Started Virtual Machine qemu-31-instance-00000043.
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:08.393 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[8618d5cf-b94b-4c67-8e4c-2a46bd37ddda]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:08 compute-0 NetworkManager[51733]: <info>  [1759267688.3953] manager: (tapf9692dd1-60): new Veth device (/org/freedesktop/NetworkManager/Devices/106)
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:08.426 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[c9466d6e-f929-411e-98cd-8b6fa162f8d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:08.431 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[7ea61ab9-91f8-46b8-8509-dff83a1064f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:08 compute-0 nova_compute[192810]: 2025-09-30 21:28:08.443 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:08 compute-0 NetworkManager[51733]: <info>  [1759267688.4539] device (tapf9692dd1-60): carrier: link connected
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:08.459 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[70a8f213-2ed3-4fd0-92ea-fa7f36dd156f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:08 compute-0 nova_compute[192810]: 2025-09-30 21:28:08.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:08 compute-0 ovn_controller[94912]: 2025-09-30T21:28:08Z|00233|binding|INFO|Setting lport 13997735-de00-4f74-bf02-5543abf3fa59 ovn-installed in OVS
Sep 30 21:28:08 compute-0 ovn_controller[94912]: 2025-09-30T21:28:08Z|00234|binding|INFO|Setting lport 13997735-de00-4f74-bf02-5543abf3fa59 up in Southbound
Sep 30 21:28:08 compute-0 nova_compute[192810]: 2025-09-30 21:28:08.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:08.477 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[ea2a5ac0-03d7-4848-afa2-739dc18f38ce]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9692dd1-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:78:70'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 66], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 438407, 'reachable_time': 33617, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229561, 'error': None, 'target': 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:08.492 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[bf2e8e27-760b-4d2d-9e1b-50cb12069434]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed1:7870'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 438407, 'tstamp': 438407}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229562, 'error': None, 'target': 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:08.505 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[fe9f1088-103d-4ce0-990a-9bef8770e95e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9692dd1-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:78:70'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 66], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 438407, 'reachable_time': 33617, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229563, 'error': None, 'target': 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:08.537 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[dc97df3d-fe74-4879-9a32-bad0808e9015]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:08.593 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[f1d28def-6dff-441c-bfe7-7e00cefb4296]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:08.595 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9692dd1-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:08.595 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:08.595 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf9692dd1-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:08 compute-0 NetworkManager[51733]: <info>  [1759267688.5981] manager: (tapf9692dd1-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/107)
Sep 30 21:28:08 compute-0 kernel: tapf9692dd1-60: entered promiscuous mode
Sep 30 21:28:08 compute-0 nova_compute[192810]: 2025-09-30 21:28:08.599 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:08.601 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf9692dd1-60, col_values=(('external_ids', {'iface-id': 'a71d0422-57d0-42fa-887d-fdcb57295fce'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:08 compute-0 ovn_controller[94912]: 2025-09-30T21:28:08Z|00235|binding|INFO|Releasing lport a71d0422-57d0-42fa-887d-fdcb57295fce from this chassis (sb_readonly=0)
Sep 30 21:28:08 compute-0 nova_compute[192810]: 2025-09-30 21:28:08.618 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:08.619 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f9692dd1-658f-4c07-943c-6bc662046dc4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f9692dd1-658f-4c07-943c-6bc662046dc4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:08.620 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[7e8137fc-a1fe-4049-a209-6b093f06d2ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:08.621 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-f9692dd1-658f-4c07-943c-6bc662046dc4
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/f9692dd1-658f-4c07-943c-6bc662046dc4.pid.haproxy
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID f9692dd1-658f-4c07-943c-6bc662046dc4
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:28:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:08.622 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'env', 'PROCESS_TAG=haproxy-f9692dd1-658f-4c07-943c-6bc662046dc4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f9692dd1-658f-4c07-943c-6bc662046dc4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:28:09 compute-0 podman[229602]: 2025-09-30 21:28:09.005377246 +0000 UTC m=+0.045417699 container create a293128c4b7d8483be503ad68d17ab65227d7fe1e8178cd6016a7ae2c34b8c6b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS)
Sep 30 21:28:09 compute-0 systemd[1]: Started libpod-conmon-a293128c4b7d8483be503ad68d17ab65227d7fe1e8178cd6016a7ae2c34b8c6b.scope.
Sep 30 21:28:09 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:28:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47d491aa9b9b218bf15b3810d5d71b6c9f14cbee15ff64c811e9981d76c9ca61/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:28:09 compute-0 podman[229602]: 2025-09-30 21:28:09.069389484 +0000 UTC m=+0.109429957 container init a293128c4b7d8483be503ad68d17ab65227d7fe1e8178cd6016a7ae2c34b8c6b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:28:09 compute-0 podman[229602]: 2025-09-30 21:28:09.077619282 +0000 UTC m=+0.117659735 container start a293128c4b7d8483be503ad68d17ab65227d7fe1e8178cd6016a7ae2c34b8c6b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:28:09 compute-0 podman[229602]: 2025-09-30 21:28:08.981288097 +0000 UTC m=+0.021328570 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:28:09 compute-0 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[229617]: [NOTICE]   (229621) : New worker (229623) forked
Sep 30 21:28:09 compute-0 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[229617]: [NOTICE]   (229621) : Loading success.
Sep 30 21:28:09 compute-0 nova_compute[192810]: 2025-09-30 21:28:09.276 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267689.276497, 3e2a9ae7-e454-4121-905d-4472ddfc2410 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:28:09 compute-0 nova_compute[192810]: 2025-09-30 21:28:09.277 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] VM Started (Lifecycle Event)
Sep 30 21:28:09 compute-0 nova_compute[192810]: 2025-09-30 21:28:09.298 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:28:09 compute-0 nova_compute[192810]: 2025-09-30 21:28:09.301 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267689.2766907, 3e2a9ae7-e454-4121-905d-4472ddfc2410 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:28:09 compute-0 nova_compute[192810]: 2025-09-30 21:28:09.301 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] VM Paused (Lifecycle Event)
Sep 30 21:28:09 compute-0 nova_compute[192810]: 2025-09-30 21:28:09.323 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:28:09 compute-0 nova_compute[192810]: 2025-09-30 21:28:09.326 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:28:09 compute-0 nova_compute[192810]: 2025-09-30 21:28:09.347 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:28:10 compute-0 nova_compute[192810]: 2025-09-30 21:28:10.576 2 DEBUG nova.network.neutron [req-976de5db-1a08-4eb5-9240-793e32b2bf74 req-a944184d-6ed8-4821-bfe9-535390dc338f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Updated VIF entry in instance network info cache for port 13997735-de00-4f74-bf02-5543abf3fa59. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:28:10 compute-0 nova_compute[192810]: 2025-09-30 21:28:10.576 2 DEBUG nova.network.neutron [req-976de5db-1a08-4eb5-9240-793e32b2bf74 req-a944184d-6ed8-4821-bfe9-535390dc338f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Updating instance_info_cache with network_info: [{"id": "13997735-de00-4f74-bf02-5543abf3fa59", "address": "fa:16:3e:88:bb:88", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13997735-de", "ovs_interfaceid": "13997735-de00-4f74-bf02-5543abf3fa59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:28:10 compute-0 nova_compute[192810]: 2025-09-30 21:28:10.607 2 DEBUG oslo_concurrency.lockutils [req-976de5db-1a08-4eb5-9240-793e32b2bf74 req-a944184d-6ed8-4821-bfe9-535390dc338f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-3e2a9ae7-e454-4121-905d-4472ddfc2410" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:28:10 compute-0 nova_compute[192810]: 2025-09-30 21:28:10.672 2 DEBUG nova.compute.manager [req-e061c436-1769-4fdc-bc3e-deee4bab8f5d req-02697eb2-f9d3-47fb-8636-cabdd82e2133 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Received event network-vif-plugged-13997735-de00-4f74-bf02-5543abf3fa59 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:28:10 compute-0 nova_compute[192810]: 2025-09-30 21:28:10.672 2 DEBUG oslo_concurrency.lockutils [req-e061c436-1769-4fdc-bc3e-deee4bab8f5d req-02697eb2-f9d3-47fb-8636-cabdd82e2133 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3e2a9ae7-e454-4121-905d-4472ddfc2410-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:10 compute-0 nova_compute[192810]: 2025-09-30 21:28:10.673 2 DEBUG oslo_concurrency.lockutils [req-e061c436-1769-4fdc-bc3e-deee4bab8f5d req-02697eb2-f9d3-47fb-8636-cabdd82e2133 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3e2a9ae7-e454-4121-905d-4472ddfc2410-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:10 compute-0 nova_compute[192810]: 2025-09-30 21:28:10.673 2 DEBUG oslo_concurrency.lockutils [req-e061c436-1769-4fdc-bc3e-deee4bab8f5d req-02697eb2-f9d3-47fb-8636-cabdd82e2133 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3e2a9ae7-e454-4121-905d-4472ddfc2410-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:10 compute-0 nova_compute[192810]: 2025-09-30 21:28:10.673 2 DEBUG nova.compute.manager [req-e061c436-1769-4fdc-bc3e-deee4bab8f5d req-02697eb2-f9d3-47fb-8636-cabdd82e2133 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Processing event network-vif-plugged-13997735-de00-4f74-bf02-5543abf3fa59 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:28:10 compute-0 nova_compute[192810]: 2025-09-30 21:28:10.674 2 DEBUG nova.compute.manager [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:28:10 compute-0 nova_compute[192810]: 2025-09-30 21:28:10.676 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267690.6765199, 3e2a9ae7-e454-4121-905d-4472ddfc2410 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:28:10 compute-0 nova_compute[192810]: 2025-09-30 21:28:10.676 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] VM Resumed (Lifecycle Event)
Sep 30 21:28:10 compute-0 nova_compute[192810]: 2025-09-30 21:28:10.678 2 DEBUG nova.virt.libvirt.driver [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:28:10 compute-0 nova_compute[192810]: 2025-09-30 21:28:10.680 2 INFO nova.virt.libvirt.driver [-] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Instance spawned successfully.
Sep 30 21:28:10 compute-0 nova_compute[192810]: 2025-09-30 21:28:10.681 2 DEBUG nova.virt.libvirt.driver [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:28:10 compute-0 nova_compute[192810]: 2025-09-30 21:28:10.697 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:28:10 compute-0 nova_compute[192810]: 2025-09-30 21:28:10.704 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:28:10 compute-0 nova_compute[192810]: 2025-09-30 21:28:10.707 2 DEBUG nova.virt.libvirt.driver [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:28:10 compute-0 nova_compute[192810]: 2025-09-30 21:28:10.708 2 DEBUG nova.virt.libvirt.driver [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:28:10 compute-0 nova_compute[192810]: 2025-09-30 21:28:10.708 2 DEBUG nova.virt.libvirt.driver [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:28:10 compute-0 nova_compute[192810]: 2025-09-30 21:28:10.709 2 DEBUG nova.virt.libvirt.driver [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:28:10 compute-0 nova_compute[192810]: 2025-09-30 21:28:10.709 2 DEBUG nova.virt.libvirt.driver [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:28:10 compute-0 nova_compute[192810]: 2025-09-30 21:28:10.709 2 DEBUG nova.virt.libvirt.driver [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:28:10 compute-0 nova_compute[192810]: 2025-09-30 21:28:10.744 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:28:10 compute-0 nova_compute[192810]: 2025-09-30 21:28:10.778 2 INFO nova.compute.manager [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Took 7.64 seconds to spawn the instance on the hypervisor.
Sep 30 21:28:10 compute-0 nova_compute[192810]: 2025-09-30 21:28:10.779 2 DEBUG nova.compute.manager [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:28:10 compute-0 nova_compute[192810]: 2025-09-30 21:28:10.858 2 INFO nova.compute.manager [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Took 8.19 seconds to build instance.
Sep 30 21:28:10 compute-0 nova_compute[192810]: 2025-09-30 21:28:10.878 2 DEBUG oslo_concurrency.lockutils [None req-ca835784-86b5-42c9-934d-3fb1361851c5 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "3e2a9ae7-e454-4121-905d-4472ddfc2410" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.296s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:11 compute-0 sshd[128205]: drop connection #0 from [49.64.169.153]:37582 on [38.102.83.69]:22 penalty: exceeded LoginGraceTime
Sep 30 21:28:11 compute-0 podman[229633]: 2025-09-30 21:28:11.325581398 +0000 UTC m=+0.056927299 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250923)
Sep 30 21:28:11 compute-0 podman[229632]: 2025-09-30 21:28:11.355455963 +0000 UTC m=+0.089922323 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:28:12 compute-0 nova_compute[192810]: 2025-09-30 21:28:12.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:12 compute-0 nova_compute[192810]: 2025-09-30 21:28:12.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:12 compute-0 nova_compute[192810]: 2025-09-30 21:28:12.741 2 DEBUG nova.compute.manager [req-12ecbeb8-782b-4c68-a9b5-4e568325c224 req-1d0ac9b3-c0a2-4313-bace-780435da6e03 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Received event network-vif-plugged-13997735-de00-4f74-bf02-5543abf3fa59 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:28:12 compute-0 nova_compute[192810]: 2025-09-30 21:28:12.741 2 DEBUG oslo_concurrency.lockutils [req-12ecbeb8-782b-4c68-a9b5-4e568325c224 req-1d0ac9b3-c0a2-4313-bace-780435da6e03 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3e2a9ae7-e454-4121-905d-4472ddfc2410-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:12 compute-0 nova_compute[192810]: 2025-09-30 21:28:12.741 2 DEBUG oslo_concurrency.lockutils [req-12ecbeb8-782b-4c68-a9b5-4e568325c224 req-1d0ac9b3-c0a2-4313-bace-780435da6e03 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3e2a9ae7-e454-4121-905d-4472ddfc2410-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:12 compute-0 nova_compute[192810]: 2025-09-30 21:28:12.742 2 DEBUG oslo_concurrency.lockutils [req-12ecbeb8-782b-4c68-a9b5-4e568325c224 req-1d0ac9b3-c0a2-4313-bace-780435da6e03 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3e2a9ae7-e454-4121-905d-4472ddfc2410-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:12 compute-0 nova_compute[192810]: 2025-09-30 21:28:12.742 2 DEBUG nova.compute.manager [req-12ecbeb8-782b-4c68-a9b5-4e568325c224 req-1d0ac9b3-c0a2-4313-bace-780435da6e03 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] No waiting events found dispatching network-vif-plugged-13997735-de00-4f74-bf02-5543abf3fa59 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:28:12 compute-0 nova_compute[192810]: 2025-09-30 21:28:12.742 2 WARNING nova.compute.manager [req-12ecbeb8-782b-4c68-a9b5-4e568325c224 req-1d0ac9b3-c0a2-4313-bace-780435da6e03 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Received unexpected event network-vif-plugged-13997735-de00-4f74-bf02-5543abf3fa59 for instance with vm_state active and task_state None.
Sep 30 21:28:14 compute-0 nova_compute[192810]: 2025-09-30 21:28:14.712 2 INFO nova.compute.manager [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Rebuilding instance
Sep 30 21:28:14 compute-0 nova_compute[192810]: 2025-09-30 21:28:14.999 2 DEBUG nova.compute.manager [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:28:15 compute-0 nova_compute[192810]: 2025-09-30 21:28:15.075 2 DEBUG nova.objects.instance [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'pci_requests' on Instance uuid 3e2a9ae7-e454-4121-905d-4472ddfc2410 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:28:15 compute-0 nova_compute[192810]: 2025-09-30 21:28:15.087 2 DEBUG nova.objects.instance [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3e2a9ae7-e454-4121-905d-4472ddfc2410 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:28:15 compute-0 nova_compute[192810]: 2025-09-30 21:28:15.102 2 DEBUG nova.objects.instance [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'resources' on Instance uuid 3e2a9ae7-e454-4121-905d-4472ddfc2410 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:28:15 compute-0 nova_compute[192810]: 2025-09-30 21:28:15.182 2 DEBUG nova.objects.instance [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'migration_context' on Instance uuid 3e2a9ae7-e454-4121-905d-4472ddfc2410 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:28:15 compute-0 nova_compute[192810]: 2025-09-30 21:28:15.195 2 DEBUG nova.objects.instance [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Sep 30 21:28:15 compute-0 nova_compute[192810]: 2025-09-30 21:28:15.198 2 DEBUG nova.virt.libvirt.driver [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Sep 30 21:28:16 compute-0 podman[229679]: 2025-09-30 21:28:16.320400179 +0000 UTC m=+0.059275719 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true)
Sep 30 21:28:17 compute-0 nova_compute[192810]: 2025-09-30 21:28:17.049 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:17 compute-0 nova_compute[192810]: 2025-09-30 21:28:17.087 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:17 compute-0 nova_compute[192810]: 2025-09-30 21:28:17.610 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:22 compute-0 nova_compute[192810]: 2025-09-30 21:28:22.090 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:22 compute-0 nova_compute[192810]: 2025-09-30 21:28:22.612 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:22 compute-0 ovn_controller[94912]: 2025-09-30T21:28:22Z|00236|binding|INFO|Releasing lport a71d0422-57d0-42fa-887d-fdcb57295fce from this chassis (sb_readonly=0)
Sep 30 21:28:22 compute-0 ovn_controller[94912]: 2025-09-30T21:28:22Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:88:bb:88 10.100.0.13
Sep 30 21:28:22 compute-0 ovn_controller[94912]: 2025-09-30T21:28:22Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:88:bb:88 10.100.0.13
Sep 30 21:28:22 compute-0 nova_compute[192810]: 2025-09-30 21:28:22.704 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:24 compute-0 podman[229713]: 2025-09-30 21:28:24.335475187 +0000 UTC m=+0.054974580 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, container_name=openstack_network_exporter, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=9.6, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.openshift.expose-services=, maintainer=Red Hat, Inc., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9)
Sep 30 21:28:24 compute-0 podman[229712]: 2025-09-30 21:28:24.335500048 +0000 UTC m=+0.058778477 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:28:25 compute-0 nova_compute[192810]: 2025-09-30 21:28:25.245 2 DEBUG nova.virt.libvirt.driver [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Sep 30 21:28:27 compute-0 nova_compute[192810]: 2025-09-30 21:28:27.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:27 compute-0 kernel: tap13997735-de (unregistering): left promiscuous mode
Sep 30 21:28:27 compute-0 NetworkManager[51733]: <info>  [1759267707.4090] device (tap13997735-de): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:28:27 compute-0 nova_compute[192810]: 2025-09-30 21:28:27.418 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:27 compute-0 ovn_controller[94912]: 2025-09-30T21:28:27Z|00237|binding|INFO|Releasing lport 13997735-de00-4f74-bf02-5543abf3fa59 from this chassis (sb_readonly=0)
Sep 30 21:28:27 compute-0 ovn_controller[94912]: 2025-09-30T21:28:27Z|00238|binding|INFO|Setting lport 13997735-de00-4f74-bf02-5543abf3fa59 down in Southbound
Sep 30 21:28:27 compute-0 ovn_controller[94912]: 2025-09-30T21:28:27Z|00239|binding|INFO|Removing iface tap13997735-de ovn-installed in OVS
Sep 30 21:28:27 compute-0 nova_compute[192810]: 2025-09-30 21:28:27.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:27 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:27.427 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:88:bb:88 10.100.0.13'], port_security=['fa:16:3e:88:bb:88 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '3e2a9ae7-e454-4121-905d-4472ddfc2410', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9692dd1-658f-4c07-943c-6bc662046dc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2af578a858a44374a3dc027bbf7c69f2', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5c13c790-c8c7-422e-b90f-52f901ab71f2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a290e6b7-09a2-435f-ae19-df4a5ccfc2d7, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=13997735-de00-4f74-bf02-5543abf3fa59) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:28:27 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:27.428 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 13997735-de00-4f74-bf02-5543abf3fa59 in datapath f9692dd1-658f-4c07-943c-6bc662046dc4 unbound from our chassis
Sep 30 21:28:27 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:27.430 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f9692dd1-658f-4c07-943c-6bc662046dc4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:28:27 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:27.431 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[83dd5cf5-8fd7-4afa-ba44-97e361f2960f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:27 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:27.431 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 namespace which is not needed anymore
Sep 30 21:28:27 compute-0 nova_compute[192810]: 2025-09-30 21:28:27.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:27 compute-0 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d00000043.scope: Deactivated successfully.
Sep 30 21:28:27 compute-0 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d00000043.scope: Consumed 12.324s CPU time.
Sep 30 21:28:27 compute-0 systemd-machined[152794]: Machine qemu-31-instance-00000043 terminated.
Sep 30 21:28:27 compute-0 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[229617]: [NOTICE]   (229621) : haproxy version is 2.8.14-c23fe91
Sep 30 21:28:27 compute-0 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[229617]: [NOTICE]   (229621) : path to executable is /usr/sbin/haproxy
Sep 30 21:28:27 compute-0 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[229617]: [ALERT]    (229621) : Current worker (229623) exited with code 143 (Terminated)
Sep 30 21:28:27 compute-0 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[229617]: [WARNING]  (229621) : All workers exited. Exiting... (0)
Sep 30 21:28:27 compute-0 systemd[1]: libpod-a293128c4b7d8483be503ad68d17ab65227d7fe1e8178cd6016a7ae2c34b8c6b.scope: Deactivated successfully.
Sep 30 21:28:27 compute-0 podman[229780]: 2025-09-30 21:28:27.572278305 +0000 UTC m=+0.048488326 container died a293128c4b7d8483be503ad68d17ab65227d7fe1e8178cd6016a7ae2c34b8c6b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:28:27 compute-0 systemd[1]: var-lib-containers-storage-overlay-47d491aa9b9b218bf15b3810d5d71b6c9f14cbee15ff64c811e9981d76c9ca61-merged.mount: Deactivated successfully.
Sep 30 21:28:27 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a293128c4b7d8483be503ad68d17ab65227d7fe1e8178cd6016a7ae2c34b8c6b-userdata-shm.mount: Deactivated successfully.
Sep 30 21:28:27 compute-0 podman[229780]: 2025-09-30 21:28:27.606265734 +0000 UTC m=+0.082475765 container cleanup a293128c4b7d8483be503ad68d17ab65227d7fe1e8178cd6016a7ae2c34b8c6b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 21:28:27 compute-0 nova_compute[192810]: 2025-09-30 21:28:27.613 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:27 compute-0 systemd[1]: libpod-conmon-a293128c4b7d8483be503ad68d17ab65227d7fe1e8178cd6016a7ae2c34b8c6b.scope: Deactivated successfully.
Sep 30 21:28:27 compute-0 podman[229808]: 2025-09-30 21:28:27.668314423 +0000 UTC m=+0.039831228 container remove a293128c4b7d8483be503ad68d17ab65227d7fe1e8178cd6016a7ae2c34b8c6b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923)
Sep 30 21:28:27 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:27.675 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[f47c4ed8-9b83-46f0-a549-da0d5fed58ee]: (4, ('Tue Sep 30 09:28:27 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 (a293128c4b7d8483be503ad68d17ab65227d7fe1e8178cd6016a7ae2c34b8c6b)\na293128c4b7d8483be503ad68d17ab65227d7fe1e8178cd6016a7ae2c34b8c6b\nTue Sep 30 09:28:27 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 (a293128c4b7d8483be503ad68d17ab65227d7fe1e8178cd6016a7ae2c34b8c6b)\na293128c4b7d8483be503ad68d17ab65227d7fe1e8178cd6016a7ae2c34b8c6b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:27 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:27.676 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[bcb62a82-df64-4216-aacb-d86e3be14e8d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:27 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:27.678 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9692dd1-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:27 compute-0 nova_compute[192810]: 2025-09-30 21:28:27.679 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:27 compute-0 kernel: tapf9692dd1-60: left promiscuous mode
Sep 30 21:28:27 compute-0 nova_compute[192810]: 2025-09-30 21:28:27.695 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:27 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:27.698 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[d2e55c7d-f333-4c45-a509-4c99ce61cb2d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:27 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:27.739 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[afacf8a1-d114-49c2-90fa-4c3ef7643d47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:27 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:27.740 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[6281edd8-62ca-4275-83b8-55ea9a9f56e0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:27 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:27.755 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[73d8f401-fca9-4227-8f01-8c238a57fb5f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 438399, 'reachable_time': 24050, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229842, 'error': None, 'target': 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:27 compute-0 systemd[1]: run-netns-ovnmeta\x2df9692dd1\x2d658f\x2d4c07\x2d943c\x2d6bc662046dc4.mount: Deactivated successfully.
Sep 30 21:28:27 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:27.757 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:28:27 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:27.758 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[95e1ed93-25d3-4662-a9d9-3cd609f3d2a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:27 compute-0 nova_compute[192810]: 2025-09-30 21:28:27.795 2 DEBUG nova.compute.manager [req-a9e3a3b4-5f20-402c-9c4a-b6cf7cd451a1 req-02481a0a-7c22-43f0-a30d-fb4618dac6b6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Received event network-vif-unplugged-13997735-de00-4f74-bf02-5543abf3fa59 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:28:27 compute-0 nova_compute[192810]: 2025-09-30 21:28:27.796 2 DEBUG oslo_concurrency.lockutils [req-a9e3a3b4-5f20-402c-9c4a-b6cf7cd451a1 req-02481a0a-7c22-43f0-a30d-fb4618dac6b6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3e2a9ae7-e454-4121-905d-4472ddfc2410-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:27 compute-0 nova_compute[192810]: 2025-09-30 21:28:27.796 2 DEBUG oslo_concurrency.lockutils [req-a9e3a3b4-5f20-402c-9c4a-b6cf7cd451a1 req-02481a0a-7c22-43f0-a30d-fb4618dac6b6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3e2a9ae7-e454-4121-905d-4472ddfc2410-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:27 compute-0 nova_compute[192810]: 2025-09-30 21:28:27.796 2 DEBUG oslo_concurrency.lockutils [req-a9e3a3b4-5f20-402c-9c4a-b6cf7cd451a1 req-02481a0a-7c22-43f0-a30d-fb4618dac6b6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3e2a9ae7-e454-4121-905d-4472ddfc2410-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:27 compute-0 nova_compute[192810]: 2025-09-30 21:28:27.796 2 DEBUG nova.compute.manager [req-a9e3a3b4-5f20-402c-9c4a-b6cf7cd451a1 req-02481a0a-7c22-43f0-a30d-fb4618dac6b6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] No waiting events found dispatching network-vif-unplugged-13997735-de00-4f74-bf02-5543abf3fa59 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:28:27 compute-0 nova_compute[192810]: 2025-09-30 21:28:27.797 2 WARNING nova.compute.manager [req-a9e3a3b4-5f20-402c-9c4a-b6cf7cd451a1 req-02481a0a-7c22-43f0-a30d-fb4618dac6b6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Received unexpected event network-vif-unplugged-13997735-de00-4f74-bf02-5543abf3fa59 for instance with vm_state active and task_state rebuilding.
Sep 30 21:28:27 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:27.843 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:28:27 compute-0 nova_compute[192810]: 2025-09-30 21:28:27.843 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:27 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:27.844 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.258 2 INFO nova.virt.libvirt.driver [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Instance shutdown successfully after 13 seconds.
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.263 2 INFO nova.virt.libvirt.driver [-] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Instance destroyed successfully.
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.268 2 INFO nova.virt.libvirt.driver [-] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Instance destroyed successfully.
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.268 2 DEBUG nova.virt.libvirt.vif [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:28:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-752562791',display_name='tempest-ServerActionsTestJSON-server-127490077',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-752562791',id=67,image_ref='29834554-3ec3-4459-bfde-932aa778e979',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:28:10Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='2af578a858a44374a3dc027bbf7c69f2',ramdisk_id='',reservation_id='r-ytwo185j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='29834554-3ec3-4459-bfde-932aa778e979',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1867667353',owner_user_name='tempest-ServerActionsTestJSON-1867667353-project-me
mber'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:28:14Z,user_data=None,user_id='22ed16bd4ffe4ef8bb21968a857066a1',uuid=3e2a9ae7-e454-4121-905d-4472ddfc2410,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "13997735-de00-4f74-bf02-5543abf3fa59", "address": "fa:16:3e:88:bb:88", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13997735-de", "ovs_interfaceid": "13997735-de00-4f74-bf02-5543abf3fa59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.269 2 DEBUG nova.network.os_vif_util [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converting VIF {"id": "13997735-de00-4f74-bf02-5543abf3fa59", "address": "fa:16:3e:88:bb:88", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13997735-de", "ovs_interfaceid": "13997735-de00-4f74-bf02-5543abf3fa59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.269 2 DEBUG nova.network.os_vif_util [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:88:bb:88,bridge_name='br-int',has_traffic_filtering=True,id=13997735-de00-4f74-bf02-5543abf3fa59,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13997735-de') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.270 2 DEBUG os_vif [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:88:bb:88,bridge_name='br-int',has_traffic_filtering=True,id=13997735-de00-4f74-bf02-5543abf3fa59,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13997735-de') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.271 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap13997735-de, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.297 2 INFO os_vif [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:88:bb:88,bridge_name='br-int',has_traffic_filtering=True,id=13997735-de00-4f74-bf02-5543abf3fa59,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13997735-de')
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.298 2 INFO nova.virt.libvirt.driver [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Deleting instance files /var/lib/nova/instances/3e2a9ae7-e454-4121-905d-4472ddfc2410_del
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.298 2 INFO nova.virt.libvirt.driver [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Deletion of /var/lib/nova/instances/3e2a9ae7-e454-4121-905d-4472ddfc2410_del complete
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.559 2 DEBUG nova.virt.libvirt.driver [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.559 2 INFO nova.virt.libvirt.driver [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Creating image(s)
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.560 2 DEBUG oslo_concurrency.lockutils [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "/var/lib/nova/instances/3e2a9ae7-e454-4121-905d-4472ddfc2410/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.560 2 DEBUG oslo_concurrency.lockutils [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "/var/lib/nova/instances/3e2a9ae7-e454-4121-905d-4472ddfc2410/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.561 2 DEBUG oslo_concurrency.lockutils [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "/var/lib/nova/instances/3e2a9ae7-e454-4121-905d-4472ddfc2410/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.573 2 DEBUG oslo_concurrency.processutils [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.629 2 DEBUG oslo_concurrency.processutils [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.630 2 DEBUG oslo_concurrency.lockutils [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "d794a27f8e0bfa3eee9759fbfddd316a7671c61e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.631 2 DEBUG oslo_concurrency.lockutils [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "d794a27f8e0bfa3eee9759fbfddd316a7671c61e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.642 2 DEBUG oslo_concurrency.processutils [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.697 2 DEBUG oslo_concurrency.processutils [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.698 2 DEBUG oslo_concurrency.processutils [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e,backing_fmt=raw /var/lib/nova/instances/3e2a9ae7-e454-4121-905d-4472ddfc2410/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.733 2 DEBUG oslo_concurrency.processutils [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e,backing_fmt=raw /var/lib/nova/instances/3e2a9ae7-e454-4121-905d-4472ddfc2410/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.734 2 DEBUG oslo_concurrency.lockutils [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "d794a27f8e0bfa3eee9759fbfddd316a7671c61e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.735 2 DEBUG oslo_concurrency.processutils [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.786 2 DEBUG oslo_concurrency.processutils [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.787 2 DEBUG nova.virt.disk.api [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Checking if we can resize image /var/lib/nova/instances/3e2a9ae7-e454-4121-905d-4472ddfc2410/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.787 2 DEBUG oslo_concurrency.processutils [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3e2a9ae7-e454-4121-905d-4472ddfc2410/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.839 2 DEBUG oslo_concurrency.processutils [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3e2a9ae7-e454-4121-905d-4472ddfc2410/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.840 2 DEBUG nova.virt.disk.api [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Cannot resize image /var/lib/nova/instances/3e2a9ae7-e454-4121-905d-4472ddfc2410/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.840 2 DEBUG nova.virt.libvirt.driver [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.841 2 DEBUG nova.virt.libvirt.driver [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Ensure instance console log exists: /var/lib/nova/instances/3e2a9ae7-e454-4121-905d-4472ddfc2410/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.841 2 DEBUG oslo_concurrency.lockutils [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.841 2 DEBUG oslo_concurrency.lockutils [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.842 2 DEBUG oslo_concurrency.lockutils [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.843 2 DEBUG nova.virt.libvirt.driver [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Start _get_guest_xml network_info=[{"id": "13997735-de00-4f74-bf02-5543abf3fa59", "address": "fa:16:3e:88:bb:88", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13997735-de", "ovs_interfaceid": "13997735-de00-4f74-bf02-5543abf3fa59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:11Z,direct_url=<?>,disk_format='qcow2',id=29834554-3ec3-4459-bfde-932aa778e979,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:13Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.849 2 WARNING nova.virt.libvirt.driver [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.856 2 DEBUG nova.virt.libvirt.host [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.856 2 DEBUG nova.virt.libvirt.host [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.859 2 DEBUG nova.virt.libvirt.host [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.860 2 DEBUG nova.virt.libvirt.host [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.861 2 DEBUG nova.virt.libvirt.driver [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.861 2 DEBUG nova.virt.hardware [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:11Z,direct_url=<?>,disk_format='qcow2',id=29834554-3ec3-4459-bfde-932aa778e979,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:13Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.861 2 DEBUG nova.virt.hardware [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.862 2 DEBUG nova.virt.hardware [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.862 2 DEBUG nova.virt.hardware [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.862 2 DEBUG nova.virt.hardware [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.862 2 DEBUG nova.virt.hardware [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.862 2 DEBUG nova.virt.hardware [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.863 2 DEBUG nova.virt.hardware [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.863 2 DEBUG nova.virt.hardware [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.863 2 DEBUG nova.virt.hardware [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.863 2 DEBUG nova.virt.hardware [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.863 2 DEBUG nova.objects.instance [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 3e2a9ae7-e454-4121-905d-4472ddfc2410 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.880 2 DEBUG nova.virt.libvirt.vif [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-09-30T21:28:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-752562791',display_name='tempest-ServerActionsTestJSON-server-127490077',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-752562791',id=67,image_ref='29834554-3ec3-4459-bfde-932aa778e979',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:28:10Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='2af578a858a44374a3dc027bbf7c69f2',ramdisk_id='',reservation_id='r-ytwo185j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='29834554-3ec3-4459-bfde-932aa778e979',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1867667353',owner_user_name='tempest-ServerActionsTestJSON-1867667353-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:28:28Z,user_data=None,user_id='22ed16bd4ffe4ef8bb21968a857066a1',uuid=3e2a9ae7-e454-4121-905d-4472ddfc2410,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "13997735-de00-4f74-bf02-5543abf3fa59", "address": "fa:16:3e:88:bb:88", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13997735-de", "ovs_interfaceid": "13997735-de00-4f74-bf02-5543abf3fa59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.880 2 DEBUG nova.network.os_vif_util [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converting VIF {"id": "13997735-de00-4f74-bf02-5543abf3fa59", "address": "fa:16:3e:88:bb:88", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13997735-de", "ovs_interfaceid": "13997735-de00-4f74-bf02-5543abf3fa59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.881 2 DEBUG nova.network.os_vif_util [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:88:bb:88,bridge_name='br-int',has_traffic_filtering=True,id=13997735-de00-4f74-bf02-5543abf3fa59,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13997735-de') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.882 2 DEBUG nova.virt.libvirt.driver [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:28:28 compute-0 nova_compute[192810]:   <uuid>3e2a9ae7-e454-4121-905d-4472ddfc2410</uuid>
Sep 30 21:28:28 compute-0 nova_compute[192810]:   <name>instance-00000043</name>
Sep 30 21:28:28 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:28:28 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:28:28 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:28:28 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:       <nova:name>tempest-ServerActionsTestJSON-server-127490077</nova:name>
Sep 30 21:28:28 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:28:28</nova:creationTime>
Sep 30 21:28:28 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:28:28 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:28:28 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:28:28 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:28:28 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:28:28 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:28:28 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:28:28 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:28:28 compute-0 nova_compute[192810]:         <nova:user uuid="22ed16bd4ffe4ef8bb21968a857066a1">tempest-ServerActionsTestJSON-1867667353-project-member</nova:user>
Sep 30 21:28:28 compute-0 nova_compute[192810]:         <nova:project uuid="2af578a858a44374a3dc027bbf7c69f2">tempest-ServerActionsTestJSON-1867667353</nova:project>
Sep 30 21:28:28 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:28:28 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="29834554-3ec3-4459-bfde-932aa778e979"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:28:28 compute-0 nova_compute[192810]:         <nova:port uuid="13997735-de00-4f74-bf02-5543abf3fa59">
Sep 30 21:28:28 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:28:28 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:28:28 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:28:28 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:28:28 compute-0 nova_compute[192810]:     <system>
Sep 30 21:28:28 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:28:28 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:28:28 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:28:28 compute-0 nova_compute[192810]:       <entry name="serial">3e2a9ae7-e454-4121-905d-4472ddfc2410</entry>
Sep 30 21:28:28 compute-0 nova_compute[192810]:       <entry name="uuid">3e2a9ae7-e454-4121-905d-4472ddfc2410</entry>
Sep 30 21:28:28 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     </system>
Sep 30 21:28:28 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:28:28 compute-0 nova_compute[192810]:   <os>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:   </os>
Sep 30 21:28:28 compute-0 nova_compute[192810]:   <features>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:   </features>
Sep 30 21:28:28 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:28:28 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:28:28 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:28:28 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:28:28 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:28:28 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/3e2a9ae7-e454-4121-905d-4472ddfc2410/disk"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:28:28 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/3e2a9ae7-e454-4121-905d-4472ddfc2410/disk.config"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:28:28 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:88:bb:88"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:       <target dev="tap13997735-de"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:28:28 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/3e2a9ae7-e454-4121-905d-4472ddfc2410/console.log" append="off"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     <video>
Sep 30 21:28:28 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     </video>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:28:28 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:28:28 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:28:28 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:28:28 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:28:28 compute-0 nova_compute[192810]: </domain>
Sep 30 21:28:28 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.883 2 DEBUG nova.compute.manager [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Preparing to wait for external event network-vif-plugged-13997735-de00-4f74-bf02-5543abf3fa59 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.883 2 DEBUG oslo_concurrency.lockutils [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "3e2a9ae7-e454-4121-905d-4472ddfc2410-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.883 2 DEBUG oslo_concurrency.lockutils [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "3e2a9ae7-e454-4121-905d-4472ddfc2410-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.884 2 DEBUG oslo_concurrency.lockutils [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "3e2a9ae7-e454-4121-905d-4472ddfc2410-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.884 2 DEBUG nova.virt.libvirt.vif [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-09-30T21:28:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-752562791',display_name='tempest-ServerActionsTestJSON-server-127490077',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-752562791',id=67,image_ref='29834554-3ec3-4459-bfde-932aa778e979',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:28:10Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='2af578a858a44374a3dc027bbf7c69f2',ramdisk_id='',reservation_id='r-ytwo185j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='29834554-3ec3-4459-bfde-932aa778e979',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1867667353',owner_user_name='tempest-ServerActionsTestJSON-1867667353-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:28:28Z,user_data=None,user_id='22ed16bd4ffe4ef8bb21968a857066a1',uuid=3e2a9ae7-e454-4121-905d-4472ddfc2410,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "13997735-de00-4f74-bf02-5543abf3fa59", "address": "fa:16:3e:88:bb:88", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13997735-de", "ovs_interfaceid": "13997735-de00-4f74-bf02-5543abf3fa59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.884 2 DEBUG nova.network.os_vif_util [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converting VIF {"id": "13997735-de00-4f74-bf02-5543abf3fa59", "address": "fa:16:3e:88:bb:88", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13997735-de", "ovs_interfaceid": "13997735-de00-4f74-bf02-5543abf3fa59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.885 2 DEBUG nova.network.os_vif_util [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:88:bb:88,bridge_name='br-int',has_traffic_filtering=True,id=13997735-de00-4f74-bf02-5543abf3fa59,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13997735-de') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.885 2 DEBUG os_vif [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:88:bb:88,bridge_name='br-int',has_traffic_filtering=True,id=13997735-de00-4f74-bf02-5543abf3fa59,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13997735-de') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.886 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.887 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.889 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.889 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap13997735-de, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.890 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap13997735-de, col_values=(('external_ids', {'iface-id': '13997735-de00-4f74-bf02-5543abf3fa59', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:88:bb:88', 'vm-uuid': '3e2a9ae7-e454-4121-905d-4472ddfc2410'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.891 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:28 compute-0 NetworkManager[51733]: <info>  [1759267708.8930] manager: (tap13997735-de): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/108)
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.894 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.897 2 INFO os_vif [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:88:bb:88,bridge_name='br-int',has_traffic_filtering=True,id=13997735-de00-4f74-bf02-5543abf3fa59,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13997735-de')
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.945 2 DEBUG nova.virt.libvirt.driver [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.946 2 DEBUG nova.virt.libvirt.driver [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.946 2 DEBUG nova.virt.libvirt.driver [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] No VIF found with MAC fa:16:3e:88:bb:88, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.946 2 INFO nova.virt.libvirt.driver [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Using config drive
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.963 2 DEBUG nova.objects.instance [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 3e2a9ae7-e454-4121-905d-4472ddfc2410 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:28:28 compute-0 nova_compute[192810]: 2025-09-30 21:28:28.992 2 DEBUG nova.objects.instance [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'keypairs' on Instance uuid 3e2a9ae7-e454-4121-905d-4472ddfc2410 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:28:29 compute-0 nova_compute[192810]: 2025-09-30 21:28:29.525 2 INFO nova.virt.libvirt.driver [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Creating config drive at /var/lib/nova/instances/3e2a9ae7-e454-4121-905d-4472ddfc2410/disk.config
Sep 30 21:28:29 compute-0 nova_compute[192810]: 2025-09-30 21:28:29.529 2 DEBUG oslo_concurrency.processutils [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3e2a9ae7-e454-4121-905d-4472ddfc2410/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpes010tf1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:28:29 compute-0 nova_compute[192810]: 2025-09-30 21:28:29.655 2 DEBUG oslo_concurrency.processutils [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3e2a9ae7-e454-4121-905d-4472ddfc2410/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpes010tf1" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:28:29 compute-0 kernel: tap13997735-de: entered promiscuous mode
Sep 30 21:28:29 compute-0 systemd-udevd[229759]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:28:29 compute-0 NetworkManager[51733]: <info>  [1759267709.7058] manager: (tap13997735-de): new Tun device (/org/freedesktop/NetworkManager/Devices/109)
Sep 30 21:28:29 compute-0 ovn_controller[94912]: 2025-09-30T21:28:29Z|00240|binding|INFO|Claiming lport 13997735-de00-4f74-bf02-5543abf3fa59 for this chassis.
Sep 30 21:28:29 compute-0 ovn_controller[94912]: 2025-09-30T21:28:29Z|00241|binding|INFO|13997735-de00-4f74-bf02-5543abf3fa59: Claiming fa:16:3e:88:bb:88 10.100.0.13
Sep 30 21:28:29 compute-0 nova_compute[192810]: 2025-09-30 21:28:29.706 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:29 compute-0 NetworkManager[51733]: <info>  [1759267709.7157] device (tap13997735-de): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:28:29 compute-0 NetworkManager[51733]: <info>  [1759267709.7172] device (tap13997735-de): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:28:29 compute-0 ovn_controller[94912]: 2025-09-30T21:28:29Z|00242|binding|INFO|Setting lport 13997735-de00-4f74-bf02-5543abf3fa59 ovn-installed in OVS
Sep 30 21:28:29 compute-0 nova_compute[192810]: 2025-09-30 21:28:29.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:29 compute-0 nova_compute[192810]: 2025-09-30 21:28:29.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:29 compute-0 ovn_controller[94912]: 2025-09-30T21:28:29Z|00243|binding|INFO|Setting lport 13997735-de00-4f74-bf02-5543abf3fa59 up in Southbound
Sep 30 21:28:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:29.736 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:88:bb:88 10.100.0.13'], port_security=['fa:16:3e:88:bb:88 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '3e2a9ae7-e454-4121-905d-4472ddfc2410', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9692dd1-658f-4c07-943c-6bc662046dc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2af578a858a44374a3dc027bbf7c69f2', 'neutron:revision_number': '5', 'neutron:security_group_ids': '5c13c790-c8c7-422e-b90f-52f901ab71f2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a290e6b7-09a2-435f-ae19-df4a5ccfc2d7, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=13997735-de00-4f74-bf02-5543abf3fa59) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:28:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:29.737 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 13997735-de00-4f74-bf02-5543abf3fa59 in datapath f9692dd1-658f-4c07-943c-6bc662046dc4 bound to our chassis
Sep 30 21:28:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:29.739 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f9692dd1-658f-4c07-943c-6bc662046dc4
Sep 30 21:28:29 compute-0 systemd-machined[152794]: New machine qemu-32-instance-00000043.
Sep 30 21:28:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:29.749 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[87e31b19-4a85-4514-b20e-db8e30ff36df]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:29.749 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf9692dd1-61 in ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:28:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:29.751 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf9692dd1-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:28:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:29.751 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[42a9010c-0a64-47a3-ac64-f60d27b4cf8d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:29.751 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[92a6b1ee-e8be-4922-868b-b0c69f7c808e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:29 compute-0 systemd[1]: Started Virtual Machine qemu-32-instance-00000043.
Sep 30 21:28:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:29.762 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[96757412-cc9f-457b-81be-4e31bc7736a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:29.784 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[6e5e1330-2c66-41be-8bf1-aa5606eb4af9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:29.813 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[f43cd502-6b34-4f5f-bd46-23699c3112cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:29.818 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a73b6059-56ab-4cee-85f1-0972741b605f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:29 compute-0 NetworkManager[51733]: <info>  [1759267709.8199] manager: (tapf9692dd1-60): new Veth device (/org/freedesktop/NetworkManager/Devices/110)
Sep 30 21:28:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:29.847 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[cc20ad4d-a8b7-4764-ae6f-d4332713f641]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:29.849 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[74c1d938-2141-4188-92e4-3dfa72bc2cea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:29 compute-0 NetworkManager[51733]: <info>  [1759267709.8678] device (tapf9692dd1-60): carrier: link connected
Sep 30 21:28:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:29.872 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[cf4cc503-acf1-4079-aecf-fbac915e7afa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:29.888 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[24522f9f-4f75-44dd-a410-ca7a4f8a1a05]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9692dd1-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:78:70'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 69], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 440549, 'reachable_time': 41006, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229909, 'error': None, 'target': 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:29 compute-0 nova_compute[192810]: 2025-09-30 21:28:29.897 2 DEBUG nova.compute.manager [req-07f5be23-e581-4829-8df0-a734d0da30bc req-03ac51b3-8a65-40f2-ad4a-4d3ad96daf61 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Received event network-vif-plugged-13997735-de00-4f74-bf02-5543abf3fa59 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:28:29 compute-0 nova_compute[192810]: 2025-09-30 21:28:29.897 2 DEBUG oslo_concurrency.lockutils [req-07f5be23-e581-4829-8df0-a734d0da30bc req-03ac51b3-8a65-40f2-ad4a-4d3ad96daf61 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3e2a9ae7-e454-4121-905d-4472ddfc2410-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:29 compute-0 nova_compute[192810]: 2025-09-30 21:28:29.897 2 DEBUG oslo_concurrency.lockutils [req-07f5be23-e581-4829-8df0-a734d0da30bc req-03ac51b3-8a65-40f2-ad4a-4d3ad96daf61 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3e2a9ae7-e454-4121-905d-4472ddfc2410-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:29 compute-0 nova_compute[192810]: 2025-09-30 21:28:29.898 2 DEBUG oslo_concurrency.lockutils [req-07f5be23-e581-4829-8df0-a734d0da30bc req-03ac51b3-8a65-40f2-ad4a-4d3ad96daf61 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3e2a9ae7-e454-4121-905d-4472ddfc2410-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:29 compute-0 nova_compute[192810]: 2025-09-30 21:28:29.898 2 DEBUG nova.compute.manager [req-07f5be23-e581-4829-8df0-a734d0da30bc req-03ac51b3-8a65-40f2-ad4a-4d3ad96daf61 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Processing event network-vif-plugged-13997735-de00-4f74-bf02-5543abf3fa59 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:28:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:29.902 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[b01d0e09-5bf1-48a9-ad84-f59086981910]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed1:7870'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 440549, 'tstamp': 440549}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229910, 'error': None, 'target': 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:29.917 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[444a95fb-4b5c-4245-b119-4322c17811eb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9692dd1-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:78:70'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 69], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 440549, 'reachable_time': 41006, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229911, 'error': None, 'target': 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:29.944 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[4f4a218d-8b1b-42a8-bcb7-ab1ae64ada20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:30.003 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[ef0f7460-cac5-47b5-8772-f9a5a37ecb3c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:30.004 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9692dd1-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:30.005 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:28:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:30.005 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf9692dd1-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:30 compute-0 NetworkManager[51733]: <info>  [1759267710.0074] manager: (tapf9692dd1-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/111)
Sep 30 21:28:30 compute-0 nova_compute[192810]: 2025-09-30 21:28:30.006 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:30 compute-0 kernel: tapf9692dd1-60: entered promiscuous mode
Sep 30 21:28:30 compute-0 nova_compute[192810]: 2025-09-30 21:28:30.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:30.010 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf9692dd1-60, col_values=(('external_ids', {'iface-id': 'a71d0422-57d0-42fa-887d-fdcb57295fce'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:30 compute-0 nova_compute[192810]: 2025-09-30 21:28:30.011 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:30 compute-0 ovn_controller[94912]: 2025-09-30T21:28:30Z|00244|binding|INFO|Releasing lport a71d0422-57d0-42fa-887d-fdcb57295fce from this chassis (sb_readonly=0)
Sep 30 21:28:30 compute-0 nova_compute[192810]: 2025-09-30 21:28:30.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:30.026 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f9692dd1-658f-4c07-943c-6bc662046dc4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f9692dd1-658f-4c07-943c-6bc662046dc4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:28:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:30.027 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[02e9071f-d774-4b0c-8fae-865d28eb35aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:30.028 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:28:30 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:28:30 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:28:30 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-f9692dd1-658f-4c07-943c-6bc662046dc4
Sep 30 21:28:30 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:28:30 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:28:30 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:28:30 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/f9692dd1-658f-4c07-943c-6bc662046dc4.pid.haproxy
Sep 30 21:28:30 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:28:30 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:28:30 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:28:30 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:28:30 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:28:30 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:28:30 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:28:30 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:28:30 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:28:30 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:28:30 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:28:30 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:28:30 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:28:30 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:28:30 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:28:30 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:28:30 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:28:30 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:28:30 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:28:30 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:28:30 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID f9692dd1-658f-4c07-943c-6bc662046dc4
Sep 30 21:28:30 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:28:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:30.028 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'env', 'PROCESS_TAG=haproxy-f9692dd1-658f-4c07-943c-6bc662046dc4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f9692dd1-658f-4c07-943c-6bc662046dc4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:28:30 compute-0 podman[229950]: 2025-09-30 21:28:30.365948034 +0000 UTC m=+0.043885230 container create 485dda209575519b8de46d8f8882005bd9067b22b8186b36ade8ec29c1329ccc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:28:30 compute-0 systemd[1]: Started libpod-conmon-485dda209575519b8de46d8f8882005bd9067b22b8186b36ade8ec29c1329ccc.scope.
Sep 30 21:28:30 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:28:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/698bc3544ab94a2419d135c1a26f9b1167b5ded658562a882b7396d5b48c7d97/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:28:30 compute-0 podman[229950]: 2025-09-30 21:28:30.342585253 +0000 UTC m=+0.020522469 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:28:30 compute-0 podman[229950]: 2025-09-30 21:28:30.445805712 +0000 UTC m=+0.123742938 container init 485dda209575519b8de46d8f8882005bd9067b22b8186b36ade8ec29c1329ccc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0)
Sep 30 21:28:30 compute-0 podman[229950]: 2025-09-30 21:28:30.451443645 +0000 UTC m=+0.129380841 container start 485dda209575519b8de46d8f8882005bd9067b22b8186b36ade8ec29c1329ccc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:28:30 compute-0 podman[229968]: 2025-09-30 21:28:30.464628068 +0000 UTC m=+0.054290643 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:28:30 compute-0 podman[229963]: 2025-09-30 21:28:30.467347897 +0000 UTC m=+0.062452050 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:28:30 compute-0 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[229984]: [NOTICE]   (230021) : New worker (230034) forked
Sep 30 21:28:30 compute-0 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[229984]: [NOTICE]   (230021) : Loading success.
Sep 30 21:28:30 compute-0 podman[229967]: 2025-09-30 21:28:30.477044742 +0000 UTC m=+0.068742799 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:28:30 compute-0 nova_compute[192810]: 2025-09-30 21:28:30.544 2 DEBUG nova.virt.libvirt.host [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Removed pending event for 3e2a9ae7-e454-4121-905d-4472ddfc2410 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Sep 30 21:28:30 compute-0 nova_compute[192810]: 2025-09-30 21:28:30.544 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267710.5436077, 3e2a9ae7-e454-4121-905d-4472ddfc2410 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:28:30 compute-0 nova_compute[192810]: 2025-09-30 21:28:30.544 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] VM Started (Lifecycle Event)
Sep 30 21:28:30 compute-0 nova_compute[192810]: 2025-09-30 21:28:30.546 2 DEBUG nova.compute.manager [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:28:30 compute-0 nova_compute[192810]: 2025-09-30 21:28:30.549 2 DEBUG nova.virt.libvirt.driver [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:28:30 compute-0 nova_compute[192810]: 2025-09-30 21:28:30.552 2 INFO nova.virt.libvirt.driver [-] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Instance spawned successfully.
Sep 30 21:28:30 compute-0 nova_compute[192810]: 2025-09-30 21:28:30.552 2 DEBUG nova.virt.libvirt.driver [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:28:30 compute-0 nova_compute[192810]: 2025-09-30 21:28:30.575 2 DEBUG nova.virt.libvirt.driver [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:28:30 compute-0 nova_compute[192810]: 2025-09-30 21:28:30.576 2 DEBUG nova.virt.libvirt.driver [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:28:30 compute-0 nova_compute[192810]: 2025-09-30 21:28:30.576 2 DEBUG nova.virt.libvirt.driver [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:28:30 compute-0 nova_compute[192810]: 2025-09-30 21:28:30.577 2 DEBUG nova.virt.libvirt.driver [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:28:30 compute-0 nova_compute[192810]: 2025-09-30 21:28:30.577 2 DEBUG nova.virt.libvirt.driver [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:28:30 compute-0 nova_compute[192810]: 2025-09-30 21:28:30.578 2 DEBUG nova.virt.libvirt.driver [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:28:30 compute-0 nova_compute[192810]: 2025-09-30 21:28:30.581 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:28:30 compute-0 nova_compute[192810]: 2025-09-30 21:28:30.583 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:28:30 compute-0 nova_compute[192810]: 2025-09-30 21:28:30.625 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Sep 30 21:28:30 compute-0 nova_compute[192810]: 2025-09-30 21:28:30.626 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267710.544673, 3e2a9ae7-e454-4121-905d-4472ddfc2410 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:28:30 compute-0 nova_compute[192810]: 2025-09-30 21:28:30.626 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] VM Paused (Lifecycle Event)
Sep 30 21:28:30 compute-0 nova_compute[192810]: 2025-09-30 21:28:30.659 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:28:30 compute-0 nova_compute[192810]: 2025-09-30 21:28:30.662 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267710.549106, 3e2a9ae7-e454-4121-905d-4472ddfc2410 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:28:30 compute-0 nova_compute[192810]: 2025-09-30 21:28:30.662 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] VM Resumed (Lifecycle Event)
Sep 30 21:28:30 compute-0 nova_compute[192810]: 2025-09-30 21:28:30.671 2 DEBUG nova.compute.manager [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:28:30 compute-0 nova_compute[192810]: 2025-09-30 21:28:30.696 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:28:30 compute-0 nova_compute[192810]: 2025-09-30 21:28:30.699 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:28:30 compute-0 nova_compute[192810]: 2025-09-30 21:28:30.723 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Sep 30 21:28:30 compute-0 nova_compute[192810]: 2025-09-30 21:28:30.758 2 DEBUG oslo_concurrency.lockutils [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:30 compute-0 nova_compute[192810]: 2025-09-30 21:28:30.759 2 DEBUG oslo_concurrency.lockutils [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:30 compute-0 nova_compute[192810]: 2025-09-30 21:28:30.759 2 DEBUG nova.objects.instance [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Sep 30 21:28:30 compute-0 nova_compute[192810]: 2025-09-30 21:28:30.837 2 DEBUG oslo_concurrency.lockutils [None req-a6e7429e-b7d7-4031-998c-0f3b1ff6981c 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.079s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:31 compute-0 nova_compute[192810]: 2025-09-30 21:28:31.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:32 compute-0 nova_compute[192810]: 2025-09-30 21:28:32.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:32 compute-0 nova_compute[192810]: 2025-09-30 21:28:32.232 2 DEBUG nova.compute.manager [req-30a8d192-a5f3-442a-bdfd-e52efa32ddd5 req-befd313f-44d5-405c-ac26-bd81618a54a4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Received event network-vif-plugged-13997735-de00-4f74-bf02-5543abf3fa59 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:28:32 compute-0 nova_compute[192810]: 2025-09-30 21:28:32.233 2 DEBUG oslo_concurrency.lockutils [req-30a8d192-a5f3-442a-bdfd-e52efa32ddd5 req-befd313f-44d5-405c-ac26-bd81618a54a4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3e2a9ae7-e454-4121-905d-4472ddfc2410-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:32 compute-0 nova_compute[192810]: 2025-09-30 21:28:32.233 2 DEBUG oslo_concurrency.lockutils [req-30a8d192-a5f3-442a-bdfd-e52efa32ddd5 req-befd313f-44d5-405c-ac26-bd81618a54a4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3e2a9ae7-e454-4121-905d-4472ddfc2410-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:32 compute-0 nova_compute[192810]: 2025-09-30 21:28:32.233 2 DEBUG oslo_concurrency.lockutils [req-30a8d192-a5f3-442a-bdfd-e52efa32ddd5 req-befd313f-44d5-405c-ac26-bd81618a54a4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3e2a9ae7-e454-4121-905d-4472ddfc2410-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:32 compute-0 nova_compute[192810]: 2025-09-30 21:28:32.234 2 DEBUG nova.compute.manager [req-30a8d192-a5f3-442a-bdfd-e52efa32ddd5 req-befd313f-44d5-405c-ac26-bd81618a54a4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] No waiting events found dispatching network-vif-plugged-13997735-de00-4f74-bf02-5543abf3fa59 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:28:32 compute-0 nova_compute[192810]: 2025-09-30 21:28:32.234 2 WARNING nova.compute.manager [req-30a8d192-a5f3-442a-bdfd-e52efa32ddd5 req-befd313f-44d5-405c-ac26-bd81618a54a4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Received unexpected event network-vif-plugged-13997735-de00-4f74-bf02-5543abf3fa59 for instance with vm_state active and task_state None.
Sep 30 21:28:32 compute-0 nova_compute[192810]: 2025-09-30 21:28:32.234 2 DEBUG nova.compute.manager [req-30a8d192-a5f3-442a-bdfd-e52efa32ddd5 req-befd313f-44d5-405c-ac26-bd81618a54a4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Received event network-vif-plugged-13997735-de00-4f74-bf02-5543abf3fa59 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:28:32 compute-0 nova_compute[192810]: 2025-09-30 21:28:32.235 2 DEBUG oslo_concurrency.lockutils [req-30a8d192-a5f3-442a-bdfd-e52efa32ddd5 req-befd313f-44d5-405c-ac26-bd81618a54a4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3e2a9ae7-e454-4121-905d-4472ddfc2410-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:32 compute-0 nova_compute[192810]: 2025-09-30 21:28:32.235 2 DEBUG oslo_concurrency.lockutils [req-30a8d192-a5f3-442a-bdfd-e52efa32ddd5 req-befd313f-44d5-405c-ac26-bd81618a54a4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3e2a9ae7-e454-4121-905d-4472ddfc2410-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:32 compute-0 nova_compute[192810]: 2025-09-30 21:28:32.235 2 DEBUG oslo_concurrency.lockutils [req-30a8d192-a5f3-442a-bdfd-e52efa32ddd5 req-befd313f-44d5-405c-ac26-bd81618a54a4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3e2a9ae7-e454-4121-905d-4472ddfc2410-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:32 compute-0 nova_compute[192810]: 2025-09-30 21:28:32.236 2 DEBUG nova.compute.manager [req-30a8d192-a5f3-442a-bdfd-e52efa32ddd5 req-befd313f-44d5-405c-ac26-bd81618a54a4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] No waiting events found dispatching network-vif-plugged-13997735-de00-4f74-bf02-5543abf3fa59 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:28:32 compute-0 nova_compute[192810]: 2025-09-30 21:28:32.236 2 WARNING nova.compute.manager [req-30a8d192-a5f3-442a-bdfd-e52efa32ddd5 req-befd313f-44d5-405c-ac26-bd81618a54a4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Received unexpected event network-vif-plugged-13997735-de00-4f74-bf02-5543abf3fa59 for instance with vm_state active and task_state None.
Sep 30 21:28:33 compute-0 nova_compute[192810]: 2025-09-30 21:28:33.822 2 DEBUG oslo_concurrency.lockutils [None req-10411128-d26a-4b54-bfdc-c6b835707318 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "3e2a9ae7-e454-4121-905d-4472ddfc2410" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:33 compute-0 nova_compute[192810]: 2025-09-30 21:28:33.823 2 DEBUG oslo_concurrency.lockutils [None req-10411128-d26a-4b54-bfdc-c6b835707318 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "3e2a9ae7-e454-4121-905d-4472ddfc2410" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:33 compute-0 nova_compute[192810]: 2025-09-30 21:28:33.823 2 DEBUG oslo_concurrency.lockutils [None req-10411128-d26a-4b54-bfdc-c6b835707318 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "3e2a9ae7-e454-4121-905d-4472ddfc2410-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:33 compute-0 nova_compute[192810]: 2025-09-30 21:28:33.824 2 DEBUG oslo_concurrency.lockutils [None req-10411128-d26a-4b54-bfdc-c6b835707318 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "3e2a9ae7-e454-4121-905d-4472ddfc2410-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:33 compute-0 nova_compute[192810]: 2025-09-30 21:28:33.824 2 DEBUG oslo_concurrency.lockutils [None req-10411128-d26a-4b54-bfdc-c6b835707318 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "3e2a9ae7-e454-4121-905d-4472ddfc2410-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:33 compute-0 nova_compute[192810]: 2025-09-30 21:28:33.834 2 INFO nova.compute.manager [None req-10411128-d26a-4b54-bfdc-c6b835707318 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Terminating instance
Sep 30 21:28:33 compute-0 nova_compute[192810]: 2025-09-30 21:28:33.845 2 DEBUG nova.compute.manager [None req-10411128-d26a-4b54-bfdc-c6b835707318 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:28:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:33.846 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:33 compute-0 kernel: tap13997735-de (unregistering): left promiscuous mode
Sep 30 21:28:33 compute-0 NetworkManager[51733]: <info>  [1759267713.8699] device (tap13997735-de): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:28:33 compute-0 ovn_controller[94912]: 2025-09-30T21:28:33Z|00245|binding|INFO|Releasing lport 13997735-de00-4f74-bf02-5543abf3fa59 from this chassis (sb_readonly=0)
Sep 30 21:28:33 compute-0 nova_compute[192810]: 2025-09-30 21:28:33.884 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:33 compute-0 ovn_controller[94912]: 2025-09-30T21:28:33Z|00246|binding|INFO|Setting lport 13997735-de00-4f74-bf02-5543abf3fa59 down in Southbound
Sep 30 21:28:33 compute-0 ovn_controller[94912]: 2025-09-30T21:28:33Z|00247|binding|INFO|Removing iface tap13997735-de ovn-installed in OVS
Sep 30 21:28:33 compute-0 nova_compute[192810]: 2025-09-30 21:28:33.887 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:33 compute-0 nova_compute[192810]: 2025-09-30 21:28:33.891 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:33.892 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:88:bb:88 10.100.0.13'], port_security=['fa:16:3e:88:bb:88 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '3e2a9ae7-e454-4121-905d-4472ddfc2410', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9692dd1-658f-4c07-943c-6bc662046dc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2af578a858a44374a3dc027bbf7c69f2', 'neutron:revision_number': '6', 'neutron:security_group_ids': '5c13c790-c8c7-422e-b90f-52f901ab71f2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a290e6b7-09a2-435f-ae19-df4a5ccfc2d7, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=13997735-de00-4f74-bf02-5543abf3fa59) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:28:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:33.893 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 13997735-de00-4f74-bf02-5543abf3fa59 in datapath f9692dd1-658f-4c07-943c-6bc662046dc4 unbound from our chassis
Sep 30 21:28:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:33.894 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f9692dd1-658f-4c07-943c-6bc662046dc4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:28:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:33.895 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[c57440ae-1a4e-4bb6-9300-8c7017a20ff7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:33.896 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 namespace which is not needed anymore
Sep 30 21:28:33 compute-0 nova_compute[192810]: 2025-09-30 21:28:33.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:33 compute-0 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d00000043.scope: Deactivated successfully.
Sep 30 21:28:33 compute-0 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d00000043.scope: Consumed 4.050s CPU time.
Sep 30 21:28:33 compute-0 systemd-machined[152794]: Machine qemu-32-instance-00000043 terminated.
Sep 30 21:28:34 compute-0 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[229984]: [NOTICE]   (230021) : haproxy version is 2.8.14-c23fe91
Sep 30 21:28:34 compute-0 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[229984]: [NOTICE]   (230021) : path to executable is /usr/sbin/haproxy
Sep 30 21:28:34 compute-0 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[229984]: [WARNING]  (230021) : Exiting Master process...
Sep 30 21:28:34 compute-0 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[229984]: [ALERT]    (230021) : Current worker (230034) exited with code 143 (Terminated)
Sep 30 21:28:34 compute-0 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[229984]: [WARNING]  (230021) : All workers exited. Exiting... (0)
Sep 30 21:28:34 compute-0 systemd[1]: libpod-485dda209575519b8de46d8f8882005bd9067b22b8186b36ade8ec29c1329ccc.scope: Deactivated successfully.
Sep 30 21:28:34 compute-0 podman[230068]: 2025-09-30 21:28:34.024931333 +0000 UTC m=+0.040844444 container died 485dda209575519b8de46d8f8882005bd9067b22b8186b36ade8ec29c1329ccc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0)
Sep 30 21:28:34 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-485dda209575519b8de46d8f8882005bd9067b22b8186b36ade8ec29c1329ccc-userdata-shm.mount: Deactivated successfully.
Sep 30 21:28:34 compute-0 systemd[1]: var-lib-containers-storage-overlay-698bc3544ab94a2419d135c1a26f9b1167b5ded658562a882b7396d5b48c7d97-merged.mount: Deactivated successfully.
Sep 30 21:28:34 compute-0 podman[230068]: 2025-09-30 21:28:34.062856291 +0000 UTC m=+0.078769402 container cleanup 485dda209575519b8de46d8f8882005bd9067b22b8186b36ade8ec29c1329ccc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true)
Sep 30 21:28:34 compute-0 systemd[1]: libpod-conmon-485dda209575519b8de46d8f8882005bd9067b22b8186b36ade8ec29c1329ccc.scope: Deactivated successfully.
Sep 30 21:28:34 compute-0 nova_compute[192810]: 2025-09-30 21:28:34.115 2 INFO nova.virt.libvirt.driver [-] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Instance destroyed successfully.
Sep 30 21:28:34 compute-0 nova_compute[192810]: 2025-09-30 21:28:34.119 2 DEBUG nova.objects.instance [None req-10411128-d26a-4b54-bfdc-c6b835707318 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'resources' on Instance uuid 3e2a9ae7-e454-4121-905d-4472ddfc2410 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:28:34 compute-0 podman[230102]: 2025-09-30 21:28:34.138577995 +0000 UTC m=+0.056154530 container remove 485dda209575519b8de46d8f8882005bd9067b22b8186b36ade8ec29c1329ccc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 21:28:34 compute-0 nova_compute[192810]: 2025-09-30 21:28:34.142 2 DEBUG nova.virt.libvirt.vif [None req-10411128-d26a-4b54-bfdc-c6b835707318 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-09-30T21:28:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-752562791',display_name='tempest-ServerActionsTestJSON-server-127490077',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-752562791',id=67,image_ref='29834554-3ec3-4459-bfde-932aa778e979',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:28:30Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2af578a858a44374a3dc027bbf7c69f2',ramdisk_id='',reservation_id='r-ytwo185j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='29834554-3ec3-4459-bfde-932aa778e979',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image
_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1867667353',owner_user_name='tempest-ServerActionsTestJSON-1867667353-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:28:30Z,user_data=None,user_id='22ed16bd4ffe4ef8bb21968a857066a1',uuid=3e2a9ae7-e454-4121-905d-4472ddfc2410,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "13997735-de00-4f74-bf02-5543abf3fa59", "address": "fa:16:3e:88:bb:88", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13997735-de", "ovs_interfaceid": "13997735-de00-4f74-bf02-5543abf3fa59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:28:34 compute-0 nova_compute[192810]: 2025-09-30 21:28:34.143 2 DEBUG nova.network.os_vif_util [None req-10411128-d26a-4b54-bfdc-c6b835707318 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converting VIF {"id": "13997735-de00-4f74-bf02-5543abf3fa59", "address": "fa:16:3e:88:bb:88", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap13997735-de", "ovs_interfaceid": "13997735-de00-4f74-bf02-5543abf3fa59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:28:34 compute-0 nova_compute[192810]: 2025-09-30 21:28:34.144 2 DEBUG nova.network.os_vif_util [None req-10411128-d26a-4b54-bfdc-c6b835707318 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:88:bb:88,bridge_name='br-int',has_traffic_filtering=True,id=13997735-de00-4f74-bf02-5543abf3fa59,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13997735-de') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:28:34 compute-0 nova_compute[192810]: 2025-09-30 21:28:34.144 2 DEBUG os_vif [None req-10411128-d26a-4b54-bfdc-c6b835707318 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:88:bb:88,bridge_name='br-int',has_traffic_filtering=True,id=13997735-de00-4f74-bf02-5543abf3fa59,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13997735-de') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:28:34 compute-0 nova_compute[192810]: 2025-09-30 21:28:34.146 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:34 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:34.146 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[17336fa4-ef14-4313-a851-b83fc3ab0237]: (4, ('Tue Sep 30 09:28:33 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 (485dda209575519b8de46d8f8882005bd9067b22b8186b36ade8ec29c1329ccc)\n485dda209575519b8de46d8f8882005bd9067b22b8186b36ade8ec29c1329ccc\nTue Sep 30 09:28:34 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 (485dda209575519b8de46d8f8882005bd9067b22b8186b36ade8ec29c1329ccc)\n485dda209575519b8de46d8f8882005bd9067b22b8186b36ade8ec29c1329ccc\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:34 compute-0 nova_compute[192810]: 2025-09-30 21:28:34.147 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap13997735-de, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:34 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:34.148 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[372a8311-245f-4470-8108-dd4fa2c187fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:34 compute-0 nova_compute[192810]: 2025-09-30 21:28:34.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:34 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:34.149 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9692dd1-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:34 compute-0 nova_compute[192810]: 2025-09-30 21:28:34.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:34 compute-0 kernel: tapf9692dd1-60: left promiscuous mode
Sep 30 21:28:34 compute-0 nova_compute[192810]: 2025-09-30 21:28:34.152 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:34 compute-0 nova_compute[192810]: 2025-09-30 21:28:34.154 2 INFO os_vif [None req-10411128-d26a-4b54-bfdc-c6b835707318 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:88:bb:88,bridge_name='br-int',has_traffic_filtering=True,id=13997735-de00-4f74-bf02-5543abf3fa59,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap13997735-de')
Sep 30 21:28:34 compute-0 nova_compute[192810]: 2025-09-30 21:28:34.155 2 INFO nova.virt.libvirt.driver [None req-10411128-d26a-4b54-bfdc-c6b835707318 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Deleting instance files /var/lib/nova/instances/3e2a9ae7-e454-4121-905d-4472ddfc2410_del
Sep 30 21:28:34 compute-0 nova_compute[192810]: 2025-09-30 21:28:34.156 2 INFO nova.virt.libvirt.driver [None req-10411128-d26a-4b54-bfdc-c6b835707318 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Deletion of /var/lib/nova/instances/3e2a9ae7-e454-4121-905d-4472ddfc2410_del complete
Sep 30 21:28:34 compute-0 nova_compute[192810]: 2025-09-30 21:28:34.166 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:34 compute-0 nova_compute[192810]: 2025-09-30 21:28:34.167 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:34 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:34.169 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[db317eea-0727-458d-ac7c-dd657cce269e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:34 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:34.206 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[7cdb19a0-9ad9-48dc-9934-4982afff6fd2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:34 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:34.207 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[ed272051-e56e-49ab-80b2-32e532f1d904]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:34 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:34.223 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[9b362d23-df4d-4b73-973a-be8b94c24fa1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 440543, 'reachable_time': 20875, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230128, 'error': None, 'target': 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:34 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:34.225 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:28:34 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:34.226 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[6fc0cc98-4a9d-4031-bdf8-9ab29a4fcc3b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:34 compute-0 systemd[1]: run-netns-ovnmeta\x2df9692dd1\x2d658f\x2d4c07\x2d943c\x2d6bc662046dc4.mount: Deactivated successfully.
Sep 30 21:28:34 compute-0 nova_compute[192810]: 2025-09-30 21:28:34.256 2 INFO nova.compute.manager [None req-10411128-d26a-4b54-bfdc-c6b835707318 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Took 0.41 seconds to destroy the instance on the hypervisor.
Sep 30 21:28:34 compute-0 nova_compute[192810]: 2025-09-30 21:28:34.257 2 DEBUG oslo.service.loopingcall [None req-10411128-d26a-4b54-bfdc-c6b835707318 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:28:34 compute-0 nova_compute[192810]: 2025-09-30 21:28:34.257 2 DEBUG nova.compute.manager [-] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:28:34 compute-0 nova_compute[192810]: 2025-09-30 21:28:34.258 2 DEBUG nova.network.neutron [-] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:28:34 compute-0 nova_compute[192810]: 2025-09-30 21:28:34.447 2 DEBUG nova.compute.manager [req-25c49c08-4554-481a-bd2f-9dee3ee59b5a req-c3717e4f-13c1-4402-987d-572d66ac8c37 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Received event network-vif-unplugged-13997735-de00-4f74-bf02-5543abf3fa59 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:28:34 compute-0 nova_compute[192810]: 2025-09-30 21:28:34.448 2 DEBUG oslo_concurrency.lockutils [req-25c49c08-4554-481a-bd2f-9dee3ee59b5a req-c3717e4f-13c1-4402-987d-572d66ac8c37 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3e2a9ae7-e454-4121-905d-4472ddfc2410-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:34 compute-0 nova_compute[192810]: 2025-09-30 21:28:34.448 2 DEBUG oslo_concurrency.lockutils [req-25c49c08-4554-481a-bd2f-9dee3ee59b5a req-c3717e4f-13c1-4402-987d-572d66ac8c37 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3e2a9ae7-e454-4121-905d-4472ddfc2410-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:34 compute-0 nova_compute[192810]: 2025-09-30 21:28:34.448 2 DEBUG oslo_concurrency.lockutils [req-25c49c08-4554-481a-bd2f-9dee3ee59b5a req-c3717e4f-13c1-4402-987d-572d66ac8c37 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3e2a9ae7-e454-4121-905d-4472ddfc2410-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:34 compute-0 nova_compute[192810]: 2025-09-30 21:28:34.449 2 DEBUG nova.compute.manager [req-25c49c08-4554-481a-bd2f-9dee3ee59b5a req-c3717e4f-13c1-4402-987d-572d66ac8c37 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] No waiting events found dispatching network-vif-unplugged-13997735-de00-4f74-bf02-5543abf3fa59 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:28:34 compute-0 nova_compute[192810]: 2025-09-30 21:28:34.449 2 DEBUG nova.compute.manager [req-25c49c08-4554-481a-bd2f-9dee3ee59b5a req-c3717e4f-13c1-4402-987d-572d66ac8c37 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Received event network-vif-unplugged-13997735-de00-4f74-bf02-5543abf3fa59 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:28:34 compute-0 nova_compute[192810]: 2025-09-30 21:28:34.449 2 DEBUG nova.compute.manager [req-25c49c08-4554-481a-bd2f-9dee3ee59b5a req-c3717e4f-13c1-4402-987d-572d66ac8c37 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Received event network-vif-plugged-13997735-de00-4f74-bf02-5543abf3fa59 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:28:34 compute-0 nova_compute[192810]: 2025-09-30 21:28:34.450 2 DEBUG oslo_concurrency.lockutils [req-25c49c08-4554-481a-bd2f-9dee3ee59b5a req-c3717e4f-13c1-4402-987d-572d66ac8c37 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3e2a9ae7-e454-4121-905d-4472ddfc2410-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:34 compute-0 nova_compute[192810]: 2025-09-30 21:28:34.450 2 DEBUG oslo_concurrency.lockutils [req-25c49c08-4554-481a-bd2f-9dee3ee59b5a req-c3717e4f-13c1-4402-987d-572d66ac8c37 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3e2a9ae7-e454-4121-905d-4472ddfc2410-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:34 compute-0 nova_compute[192810]: 2025-09-30 21:28:34.450 2 DEBUG oslo_concurrency.lockutils [req-25c49c08-4554-481a-bd2f-9dee3ee59b5a req-c3717e4f-13c1-4402-987d-572d66ac8c37 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3e2a9ae7-e454-4121-905d-4472ddfc2410-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:34 compute-0 nova_compute[192810]: 2025-09-30 21:28:34.450 2 DEBUG nova.compute.manager [req-25c49c08-4554-481a-bd2f-9dee3ee59b5a req-c3717e4f-13c1-4402-987d-572d66ac8c37 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] No waiting events found dispatching network-vif-plugged-13997735-de00-4f74-bf02-5543abf3fa59 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:28:34 compute-0 nova_compute[192810]: 2025-09-30 21:28:34.451 2 WARNING nova.compute.manager [req-25c49c08-4554-481a-bd2f-9dee3ee59b5a req-c3717e4f-13c1-4402-987d-572d66ac8c37 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Received unexpected event network-vif-plugged-13997735-de00-4f74-bf02-5543abf3fa59 for instance with vm_state active and task_state deleting.
Sep 30 21:28:35 compute-0 nova_compute[192810]: 2025-09-30 21:28:35.119 2 DEBUG nova.network.neutron [-] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:28:35 compute-0 nova_compute[192810]: 2025-09-30 21:28:35.168 2 INFO nova.compute.manager [-] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Took 0.91 seconds to deallocate network for instance.
Sep 30 21:28:35 compute-0 nova_compute[192810]: 2025-09-30 21:28:35.254 2 DEBUG oslo_concurrency.lockutils [None req-10411128-d26a-4b54-bfdc-c6b835707318 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:35 compute-0 nova_compute[192810]: 2025-09-30 21:28:35.255 2 DEBUG oslo_concurrency.lockutils [None req-10411128-d26a-4b54-bfdc-c6b835707318 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:35 compute-0 nova_compute[192810]: 2025-09-30 21:28:35.316 2 DEBUG nova.compute.provider_tree [None req-10411128-d26a-4b54-bfdc-c6b835707318 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:28:35 compute-0 nova_compute[192810]: 2025-09-30 21:28:35.340 2 DEBUG nova.scheduler.client.report [None req-10411128-d26a-4b54-bfdc-c6b835707318 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:28:35 compute-0 nova_compute[192810]: 2025-09-30 21:28:35.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:35 compute-0 nova_compute[192810]: 2025-09-30 21:28:35.366 2 DEBUG oslo_concurrency.lockutils [None req-10411128-d26a-4b54-bfdc-c6b835707318 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:35 compute-0 nova_compute[192810]: 2025-09-30 21:28:35.397 2 INFO nova.scheduler.client.report [None req-10411128-d26a-4b54-bfdc-c6b835707318 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Deleted allocations for instance 3e2a9ae7-e454-4121-905d-4472ddfc2410
Sep 30 21:28:35 compute-0 nova_compute[192810]: 2025-09-30 21:28:35.523 2 DEBUG oslo_concurrency.lockutils [None req-10411128-d26a-4b54-bfdc-c6b835707318 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "3e2a9ae7-e454-4121-905d-4472ddfc2410" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.700s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:36 compute-0 nova_compute[192810]: 2025-09-30 21:28:36.570 2 DEBUG nova.compute.manager [req-07e3ec74-08fd-4d74-bfad-98f61790f2b6 req-926de8b9-cb98-4c3d-bc17-db55904eaac3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Received event network-vif-deleted-13997735-de00-4f74-bf02-5543abf3fa59 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:28:37 compute-0 nova_compute[192810]: 2025-09-30 21:28:37.095 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:37 compute-0 nova_compute[192810]: 2025-09-30 21:28:37.344 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:37 compute-0 nova_compute[192810]: 2025-09-30 21:28:37.524 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:38.734 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:38.734 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:38.734 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:39 compute-0 nova_compute[192810]: 2025-09-30 21:28:39.065 2 DEBUG oslo_concurrency.lockutils [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Acquiring lock "ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:39 compute-0 nova_compute[192810]: 2025-09-30 21:28:39.066 2 DEBUG oslo_concurrency.lockutils [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Lock "ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:39 compute-0 nova_compute[192810]: 2025-09-30 21:28:39.090 2 DEBUG nova.compute.manager [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:28:39 compute-0 nova_compute[192810]: 2025-09-30 21:28:39.152 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:39 compute-0 nova_compute[192810]: 2025-09-30 21:28:39.202 2 DEBUG oslo_concurrency.lockutils [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:39 compute-0 nova_compute[192810]: 2025-09-30 21:28:39.203 2 DEBUG oslo_concurrency.lockutils [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:39 compute-0 nova_compute[192810]: 2025-09-30 21:28:39.211 2 DEBUG nova.virt.hardware [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:28:39 compute-0 nova_compute[192810]: 2025-09-30 21:28:39.211 2 INFO nova.compute.claims [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:28:39 compute-0 nova_compute[192810]: 2025-09-30 21:28:39.336 2 DEBUG nova.compute.provider_tree [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:28:39 compute-0 nova_compute[192810]: 2025-09-30 21:28:39.355 2 DEBUG nova.scheduler.client.report [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:28:39 compute-0 nova_compute[192810]: 2025-09-30 21:28:39.382 2 DEBUG oslo_concurrency.lockutils [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.179s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:39 compute-0 nova_compute[192810]: 2025-09-30 21:28:39.382 2 DEBUG nova.compute.manager [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:28:39 compute-0 nova_compute[192810]: 2025-09-30 21:28:39.442 2 DEBUG nova.compute.manager [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:28:39 compute-0 nova_compute[192810]: 2025-09-30 21:28:39.443 2 DEBUG nova.network.neutron [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:28:39 compute-0 nova_compute[192810]: 2025-09-30 21:28:39.457 2 INFO nova.virt.libvirt.driver [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:28:39 compute-0 nova_compute[192810]: 2025-09-30 21:28:39.480 2 DEBUG nova.compute.manager [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:28:39 compute-0 nova_compute[192810]: 2025-09-30 21:28:39.647 2 DEBUG nova.compute.manager [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:28:39 compute-0 nova_compute[192810]: 2025-09-30 21:28:39.648 2 DEBUG nova.virt.libvirt.driver [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:28:39 compute-0 nova_compute[192810]: 2025-09-30 21:28:39.649 2 INFO nova.virt.libvirt.driver [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Creating image(s)
Sep 30 21:28:39 compute-0 nova_compute[192810]: 2025-09-30 21:28:39.649 2 DEBUG oslo_concurrency.lockutils [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Acquiring lock "/var/lib/nova/instances/ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:39 compute-0 nova_compute[192810]: 2025-09-30 21:28:39.649 2 DEBUG oslo_concurrency.lockutils [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Lock "/var/lib/nova/instances/ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:39 compute-0 nova_compute[192810]: 2025-09-30 21:28:39.650 2 DEBUG oslo_concurrency.lockutils [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Lock "/var/lib/nova/instances/ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:39 compute-0 nova_compute[192810]: 2025-09-30 21:28:39.662 2 DEBUG oslo_concurrency.processutils [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:28:39 compute-0 nova_compute[192810]: 2025-09-30 21:28:39.752 2 DEBUG oslo_concurrency.processutils [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:28:39 compute-0 nova_compute[192810]: 2025-09-30 21:28:39.754 2 DEBUG oslo_concurrency.lockutils [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:39 compute-0 nova_compute[192810]: 2025-09-30 21:28:39.754 2 DEBUG oslo_concurrency.lockutils [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:39 compute-0 nova_compute[192810]: 2025-09-30 21:28:39.766 2 DEBUG oslo_concurrency.processutils [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:28:39 compute-0 nova_compute[192810]: 2025-09-30 21:28:39.818 2 DEBUG oslo_concurrency.processutils [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:28:39 compute-0 nova_compute[192810]: 2025-09-30 21:28:39.820 2 DEBUG oslo_concurrency.processutils [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:28:39 compute-0 nova_compute[192810]: 2025-09-30 21:28:39.843 2 DEBUG nova.policy [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5b7524018e934e07b3b6e81fdcf852aa', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9e09247c56a847ac8eb9b104a34420f0', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:28:39 compute-0 nova_compute[192810]: 2025-09-30 21:28:39.856 2 DEBUG oslo_concurrency.processutils [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:28:39 compute-0 nova_compute[192810]: 2025-09-30 21:28:39.857 2 DEBUG oslo_concurrency.lockutils [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:39 compute-0 nova_compute[192810]: 2025-09-30 21:28:39.857 2 DEBUG oslo_concurrency.processutils [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:28:39 compute-0 nova_compute[192810]: 2025-09-30 21:28:39.912 2 DEBUG oslo_concurrency.processutils [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:28:39 compute-0 nova_compute[192810]: 2025-09-30 21:28:39.914 2 DEBUG nova.virt.disk.api [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Checking if we can resize image /var/lib/nova/instances/ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:28:39 compute-0 nova_compute[192810]: 2025-09-30 21:28:39.914 2 DEBUG oslo_concurrency.processutils [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:28:39 compute-0 nova_compute[192810]: 2025-09-30 21:28:39.970 2 DEBUG oslo_concurrency.processutils [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:28:39 compute-0 nova_compute[192810]: 2025-09-30 21:28:39.971 2 DEBUG nova.virt.disk.api [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Cannot resize image /var/lib/nova/instances/ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:28:39 compute-0 nova_compute[192810]: 2025-09-30 21:28:39.972 2 DEBUG nova.objects.instance [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Lazy-loading 'migration_context' on Instance uuid ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:28:39 compute-0 nova_compute[192810]: 2025-09-30 21:28:39.993 2 DEBUG nova.virt.libvirt.driver [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:28:39 compute-0 nova_compute[192810]: 2025-09-30 21:28:39.994 2 DEBUG nova.virt.libvirt.driver [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Ensure instance console log exists: /var/lib/nova/instances/ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:28:39 compute-0 nova_compute[192810]: 2025-09-30 21:28:39.994 2 DEBUG oslo_concurrency.lockutils [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:39 compute-0 nova_compute[192810]: 2025-09-30 21:28:39.995 2 DEBUG oslo_concurrency.lockutils [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:39 compute-0 nova_compute[192810]: 2025-09-30 21:28:39.995 2 DEBUG oslo_concurrency.lockutils [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:40 compute-0 nova_compute[192810]: 2025-09-30 21:28:40.455 2 DEBUG nova.network.neutron [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Successfully created port: 12eb0d00-3d96-4777-8dc9-ba184484c7f3 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:28:40 compute-0 nova_compute[192810]: 2025-09-30 21:28:40.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:28:41 compute-0 nova_compute[192810]: 2025-09-30 21:28:41.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:28:42 compute-0 nova_compute[192810]: 2025-09-30 21:28:42.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:42 compute-0 podman[230146]: 2025-09-30 21:28:42.382404844 +0000 UTC m=+0.118222999 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Sep 30 21:28:42 compute-0 podman[230145]: 2025-09-30 21:28:42.400203414 +0000 UTC m=+0.128816747 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Sep 30 21:28:42 compute-0 nova_compute[192810]: 2025-09-30 21:28:42.456 2 DEBUG nova.network.neutron [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Successfully updated port: 12eb0d00-3d96-4777-8dc9-ba184484c7f3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:28:42 compute-0 nova_compute[192810]: 2025-09-30 21:28:42.477 2 DEBUG oslo_concurrency.lockutils [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Acquiring lock "refresh_cache-ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:28:42 compute-0 nova_compute[192810]: 2025-09-30 21:28:42.477 2 DEBUG oslo_concurrency.lockutils [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Acquired lock "refresh_cache-ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:28:42 compute-0 nova_compute[192810]: 2025-09-30 21:28:42.477 2 DEBUG nova.network.neutron [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:28:42 compute-0 nova_compute[192810]: 2025-09-30 21:28:42.623 2 DEBUG nova.network.neutron [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:28:43 compute-0 nova_compute[192810]: 2025-09-30 21:28:43.705 2 DEBUG nova.network.neutron [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Updating instance_info_cache with network_info: [{"id": "12eb0d00-3d96-4777-8dc9-ba184484c7f3", "address": "fa:16:3e:d1:8c:fd", "network": {"id": "0d9a0e7d-4f10-48ac-8e5d-13aaaef30422", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-77895882-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9e09247c56a847ac8eb9b104a34420f0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12eb0d00-3d", "ovs_interfaceid": "12eb0d00-3d96-4777-8dc9-ba184484c7f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:28:43 compute-0 nova_compute[192810]: 2025-09-30 21:28:43.752 2 DEBUG oslo_concurrency.lockutils [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Releasing lock "refresh_cache-ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:28:43 compute-0 nova_compute[192810]: 2025-09-30 21:28:43.753 2 DEBUG nova.compute.manager [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Instance network_info: |[{"id": "12eb0d00-3d96-4777-8dc9-ba184484c7f3", "address": "fa:16:3e:d1:8c:fd", "network": {"id": "0d9a0e7d-4f10-48ac-8e5d-13aaaef30422", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-77895882-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9e09247c56a847ac8eb9b104a34420f0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12eb0d00-3d", "ovs_interfaceid": "12eb0d00-3d96-4777-8dc9-ba184484c7f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:28:43 compute-0 nova_compute[192810]: 2025-09-30 21:28:43.758 2 DEBUG nova.virt.libvirt.driver [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Start _get_guest_xml network_info=[{"id": "12eb0d00-3d96-4777-8dc9-ba184484c7f3", "address": "fa:16:3e:d1:8c:fd", "network": {"id": "0d9a0e7d-4f10-48ac-8e5d-13aaaef30422", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-77895882-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9e09247c56a847ac8eb9b104a34420f0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12eb0d00-3d", "ovs_interfaceid": "12eb0d00-3d96-4777-8dc9-ba184484c7f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:28:43 compute-0 nova_compute[192810]: 2025-09-30 21:28:43.765 2 WARNING nova.virt.libvirt.driver [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:28:43 compute-0 nova_compute[192810]: 2025-09-30 21:28:43.769 2 DEBUG nova.virt.libvirt.host [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:28:43 compute-0 nova_compute[192810]: 2025-09-30 21:28:43.770 2 DEBUG nova.virt.libvirt.host [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:28:43 compute-0 nova_compute[192810]: 2025-09-30 21:28:43.773 2 DEBUG nova.virt.libvirt.host [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:28:43 compute-0 nova_compute[192810]: 2025-09-30 21:28:43.774 2 DEBUG nova.virt.libvirt.host [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:28:43 compute-0 nova_compute[192810]: 2025-09-30 21:28:43.775 2 DEBUG nova.virt.libvirt.driver [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:28:43 compute-0 nova_compute[192810]: 2025-09-30 21:28:43.776 2 DEBUG nova.virt.hardware [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:28:43 compute-0 nova_compute[192810]: 2025-09-30 21:28:43.776 2 DEBUG nova.virt.hardware [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:28:43 compute-0 nova_compute[192810]: 2025-09-30 21:28:43.777 2 DEBUG nova.virt.hardware [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:28:43 compute-0 nova_compute[192810]: 2025-09-30 21:28:43.777 2 DEBUG nova.virt.hardware [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:28:43 compute-0 nova_compute[192810]: 2025-09-30 21:28:43.778 2 DEBUG nova.virt.hardware [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:28:43 compute-0 nova_compute[192810]: 2025-09-30 21:28:43.778 2 DEBUG nova.virt.hardware [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:28:43 compute-0 nova_compute[192810]: 2025-09-30 21:28:43.779 2 DEBUG nova.virt.hardware [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:28:43 compute-0 nova_compute[192810]: 2025-09-30 21:28:43.779 2 DEBUG nova.virt.hardware [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:28:43 compute-0 nova_compute[192810]: 2025-09-30 21:28:43.780 2 DEBUG nova.virt.hardware [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:28:43 compute-0 nova_compute[192810]: 2025-09-30 21:28:43.780 2 DEBUG nova.virt.hardware [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:28:43 compute-0 nova_compute[192810]: 2025-09-30 21:28:43.780 2 DEBUG nova.virt.hardware [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:28:43 compute-0 nova_compute[192810]: 2025-09-30 21:28:43.786 2 DEBUG nova.virt.libvirt.vif [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:28:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-1621545046',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-1621545046',id=70,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9e09247c56a847ac8eb9b104a34420f0',ramdisk_id='',reservation_id='r-g0khdglp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesV270Test-1546994059',owner_user_name='tempest-AttachInterfacesV270Test-1546994059-project-member'},tags=TagLi
st,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:28:39Z,user_data=None,user_id='5b7524018e934e07b3b6e81fdcf852aa',uuid=ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "12eb0d00-3d96-4777-8dc9-ba184484c7f3", "address": "fa:16:3e:d1:8c:fd", "network": {"id": "0d9a0e7d-4f10-48ac-8e5d-13aaaef30422", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-77895882-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9e09247c56a847ac8eb9b104a34420f0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12eb0d00-3d", "ovs_interfaceid": "12eb0d00-3d96-4777-8dc9-ba184484c7f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:28:43 compute-0 nova_compute[192810]: 2025-09-30 21:28:43.787 2 DEBUG nova.network.os_vif_util [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Converting VIF {"id": "12eb0d00-3d96-4777-8dc9-ba184484c7f3", "address": "fa:16:3e:d1:8c:fd", "network": {"id": "0d9a0e7d-4f10-48ac-8e5d-13aaaef30422", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-77895882-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9e09247c56a847ac8eb9b104a34420f0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12eb0d00-3d", "ovs_interfaceid": "12eb0d00-3d96-4777-8dc9-ba184484c7f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:28:43 compute-0 nova_compute[192810]: 2025-09-30 21:28:43.788 2 DEBUG nova.network.os_vif_util [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:8c:fd,bridge_name='br-int',has_traffic_filtering=True,id=12eb0d00-3d96-4777-8dc9-ba184484c7f3,network=Network(0d9a0e7d-4f10-48ac-8e5d-13aaaef30422),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap12eb0d00-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:28:43 compute-0 nova_compute[192810]: 2025-09-30 21:28:43.789 2 DEBUG nova.objects.instance [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Lazy-loading 'pci_devices' on Instance uuid ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:28:43 compute-0 nova_compute[192810]: 2025-09-30 21:28:43.792 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:28:43 compute-0 nova_compute[192810]: 2025-09-30 21:28:43.808 2 DEBUG nova.virt.libvirt.driver [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:28:43 compute-0 nova_compute[192810]:   <uuid>ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5</uuid>
Sep 30 21:28:43 compute-0 nova_compute[192810]:   <name>instance-00000046</name>
Sep 30 21:28:43 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:28:43 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:28:43 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:28:43 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:       <nova:name>tempest-AttachInterfacesV270Test-server-1621545046</nova:name>
Sep 30 21:28:43 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:28:43</nova:creationTime>
Sep 30 21:28:43 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:28:43 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:28:43 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:28:43 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:28:43 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:28:43 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:28:43 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:28:43 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:28:43 compute-0 nova_compute[192810]:         <nova:user uuid="5b7524018e934e07b3b6e81fdcf852aa">tempest-AttachInterfacesV270Test-1546994059-project-member</nova:user>
Sep 30 21:28:43 compute-0 nova_compute[192810]:         <nova:project uuid="9e09247c56a847ac8eb9b104a34420f0">tempest-AttachInterfacesV270Test-1546994059</nova:project>
Sep 30 21:28:43 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:28:43 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:28:43 compute-0 nova_compute[192810]:         <nova:port uuid="12eb0d00-3d96-4777-8dc9-ba184484c7f3">
Sep 30 21:28:43 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:28:43 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:28:43 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:28:43 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:28:43 compute-0 nova_compute[192810]:     <system>
Sep 30 21:28:43 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:28:43 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:28:43 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:28:43 compute-0 nova_compute[192810]:       <entry name="serial">ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5</entry>
Sep 30 21:28:43 compute-0 nova_compute[192810]:       <entry name="uuid">ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5</entry>
Sep 30 21:28:43 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     </system>
Sep 30 21:28:43 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:28:43 compute-0 nova_compute[192810]:   <os>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:   </os>
Sep 30 21:28:43 compute-0 nova_compute[192810]:   <features>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:   </features>
Sep 30 21:28:43 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:28:43 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:28:43 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:28:43 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:28:43 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:28:43 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5/disk"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:28:43 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5/disk.config"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:28:43 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:d1:8c:fd"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:       <target dev="tap12eb0d00-3d"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:28:43 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5/console.log" append="off"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     <video>
Sep 30 21:28:43 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     </video>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:28:43 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:28:43 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:28:43 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:28:43 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:28:43 compute-0 nova_compute[192810]: </domain>
Sep 30 21:28:43 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:28:43 compute-0 nova_compute[192810]: 2025-09-30 21:28:43.810 2 DEBUG nova.compute.manager [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Preparing to wait for external event network-vif-plugged-12eb0d00-3d96-4777-8dc9-ba184484c7f3 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:28:43 compute-0 nova_compute[192810]: 2025-09-30 21:28:43.810 2 DEBUG oslo_concurrency.lockutils [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Acquiring lock "ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:43 compute-0 nova_compute[192810]: 2025-09-30 21:28:43.810 2 DEBUG oslo_concurrency.lockutils [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Lock "ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:43 compute-0 nova_compute[192810]: 2025-09-30 21:28:43.811 2 DEBUG oslo_concurrency.lockutils [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Lock "ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:43 compute-0 nova_compute[192810]: 2025-09-30 21:28:43.811 2 DEBUG nova.virt.libvirt.vif [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:28:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-1621545046',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-1621545046',id=70,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9e09247c56a847ac8eb9b104a34420f0',ramdisk_id='',reservation_id='r-g0khdglp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesV270Test-1546994059',owner_user_name='tempest-AttachInterfacesV270Test-1546994059-project-member'},
tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:28:39Z,user_data=None,user_id='5b7524018e934e07b3b6e81fdcf852aa',uuid=ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "12eb0d00-3d96-4777-8dc9-ba184484c7f3", "address": "fa:16:3e:d1:8c:fd", "network": {"id": "0d9a0e7d-4f10-48ac-8e5d-13aaaef30422", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-77895882-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9e09247c56a847ac8eb9b104a34420f0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12eb0d00-3d", "ovs_interfaceid": "12eb0d00-3d96-4777-8dc9-ba184484c7f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:28:43 compute-0 nova_compute[192810]: 2025-09-30 21:28:43.812 2 DEBUG nova.network.os_vif_util [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Converting VIF {"id": "12eb0d00-3d96-4777-8dc9-ba184484c7f3", "address": "fa:16:3e:d1:8c:fd", "network": {"id": "0d9a0e7d-4f10-48ac-8e5d-13aaaef30422", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-77895882-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9e09247c56a847ac8eb9b104a34420f0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12eb0d00-3d", "ovs_interfaceid": "12eb0d00-3d96-4777-8dc9-ba184484c7f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:28:43 compute-0 nova_compute[192810]: 2025-09-30 21:28:43.813 2 DEBUG nova.network.os_vif_util [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:8c:fd,bridge_name='br-int',has_traffic_filtering=True,id=12eb0d00-3d96-4777-8dc9-ba184484c7f3,network=Network(0d9a0e7d-4f10-48ac-8e5d-13aaaef30422),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap12eb0d00-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:28:43 compute-0 nova_compute[192810]: 2025-09-30 21:28:43.813 2 DEBUG os_vif [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:8c:fd,bridge_name='br-int',has_traffic_filtering=True,id=12eb0d00-3d96-4777-8dc9-ba184484c7f3,network=Network(0d9a0e7d-4f10-48ac-8e5d-13aaaef30422),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap12eb0d00-3d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:28:43 compute-0 nova_compute[192810]: 2025-09-30 21:28:43.814 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:43 compute-0 nova_compute[192810]: 2025-09-30 21:28:43.814 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:43 compute-0 nova_compute[192810]: 2025-09-30 21:28:43.814 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:28:43 compute-0 nova_compute[192810]: 2025-09-30 21:28:43.817 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:43 compute-0 nova_compute[192810]: 2025-09-30 21:28:43.817 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap12eb0d00-3d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:43 compute-0 nova_compute[192810]: 2025-09-30 21:28:43.818 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap12eb0d00-3d, col_values=(('external_ids', {'iface-id': '12eb0d00-3d96-4777-8dc9-ba184484c7f3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d1:8c:fd', 'vm-uuid': 'ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:43 compute-0 nova_compute[192810]: 2025-09-30 21:28:43.819 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:43 compute-0 NetworkManager[51733]: <info>  [1759267723.8200] manager: (tap12eb0d00-3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/112)
Sep 30 21:28:43 compute-0 nova_compute[192810]: 2025-09-30 21:28:43.821 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:28:43 compute-0 nova_compute[192810]: 2025-09-30 21:28:43.825 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:43 compute-0 nova_compute[192810]: 2025-09-30 21:28:43.826 2 INFO os_vif [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:8c:fd,bridge_name='br-int',has_traffic_filtering=True,id=12eb0d00-3d96-4777-8dc9-ba184484c7f3,network=Network(0d9a0e7d-4f10-48ac-8e5d-13aaaef30422),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap12eb0d00-3d')
Sep 30 21:28:43 compute-0 nova_compute[192810]: 2025-09-30 21:28:43.881 2 DEBUG nova.virt.libvirt.driver [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:28:43 compute-0 nova_compute[192810]: 2025-09-30 21:28:43.881 2 DEBUG nova.virt.libvirt.driver [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:28:43 compute-0 nova_compute[192810]: 2025-09-30 21:28:43.881 2 DEBUG nova.virt.libvirt.driver [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] No VIF found with MAC fa:16:3e:d1:8c:fd, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:28:43 compute-0 nova_compute[192810]: 2025-09-30 21:28:43.882 2 INFO nova.virt.libvirt.driver [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Using config drive
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.907 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5', 'name': 'tempest-AttachInterfacesV270Test-server-1621545046', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000046', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': '9e09247c56a847ac8eb9b104a34420f0', 'user_id': '5b7524018e934e07b3b6e81fdcf852aa', 'hostId': '8638d139c2086ccb6f36435e3c30d4faba8746577bbea0966694266c', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.908 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.909 12 DEBUG ceilometer.compute.pollsters [-] Instance ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5 was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-00000046, id=ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.909 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.910 12 DEBUG ceilometer.compute.pollsters [-] Instance ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5 was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance <name=instance-00000046, id=ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.910 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.911 12 DEBUG ceilometer.compute.pollsters [-] Instance ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5 was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-00000046, id=ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.911 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.912 12 DEBUG ceilometer.compute.pollsters [-] Instance ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5 was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-00000046, id=ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.912 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.913 12 DEBUG ceilometer.compute.pollsters [-] Instance ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5 was shut off while getting sample of network.incoming.packets.drop: Failed to inspect data of instance <name=instance-00000046, id=ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.913 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.914 12 DEBUG ceilometer.compute.pollsters [-] Instance ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5 was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-00000046, id=ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.914 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.914 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.914 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-AttachInterfacesV270Test-server-1621545046>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-AttachInterfacesV270Test-server-1621545046>]
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.914 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.915 12 DEBUG ceilometer.compute.pollsters [-] Instance ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5 was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-00000046, id=ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.915 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.916 12 DEBUG ceilometer.compute.pollsters [-] Instance ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5 was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance <name=instance-00000046, id=ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.916 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.917 12 DEBUG ceilometer.compute.pollsters [-] Instance ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5 was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance <name=instance-00000046, id=ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.917 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.917 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.917 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-AttachInterfacesV270Test-server-1621545046>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-AttachInterfacesV270Test-server-1621545046>]
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.917 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.918 12 DEBUG ceilometer.compute.pollsters [-] Instance ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5 was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-00000046, id=ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.918 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.919 12 DEBUG ceilometer.compute.pollsters [-] Instance ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5 was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-00000046, id=ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.919 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.919 12 DEBUG ceilometer.compute.pollsters [-] Instance ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5 was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance <name=instance-00000046, id=ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.920 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.920 12 DEBUG ceilometer.compute.pollsters [-] Instance ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5 was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-00000046, id=ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.920 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.921 12 DEBUG ceilometer.compute.pollsters [-] Instance ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5 was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance <name=instance-00000046, id=ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.921 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.922 12 DEBUG ceilometer.compute.pollsters [-] Instance ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5 was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-00000046, id=ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.922 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.923 12 DEBUG ceilometer.compute.pollsters [-] Instance ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5 was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-00000046, id=ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.923 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.923 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.923 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-AttachInterfacesV270Test-server-1621545046>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-AttachInterfacesV270Test-server-1621545046>]
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.923 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.924 12 DEBUG ceilometer.compute.pollsters [-] Instance ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5 was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-00000046, id=ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.924 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.924 12 DEBUG ceilometer.compute.pollsters [-] Instance ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5 was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-00000046, id=ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.925 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.925 12 DEBUG ceilometer.compute.pollsters [-] Instance ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5 was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance <name=instance-00000046, id=ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.925 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.926 12 DEBUG ceilometer.compute.pollsters [-] Instance ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5 was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance <name=instance-00000046, id=ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.926 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.926 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.926 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-AttachInterfacesV270Test-server-1621545046>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-AttachInterfacesV270Test-server-1621545046>]
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.927 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Sep 30 21:28:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:28:43.927 12 DEBUG ceilometer.compute.pollsters [-] Instance ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5 was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-00000046, id=ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:28:44 compute-0 nova_compute[192810]: 2025-09-30 21:28:44.225 2 DEBUG nova.compute.manager [req-53f2ee63-97a9-4a37-991b-e8c81ac59d27 req-1019a0dc-1297-46b8-b6af-954472cde822 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Received event network-changed-12eb0d00-3d96-4777-8dc9-ba184484c7f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:28:44 compute-0 nova_compute[192810]: 2025-09-30 21:28:44.225 2 DEBUG nova.compute.manager [req-53f2ee63-97a9-4a37-991b-e8c81ac59d27 req-1019a0dc-1297-46b8-b6af-954472cde822 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Refreshing instance network info cache due to event network-changed-12eb0d00-3d96-4777-8dc9-ba184484c7f3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:28:44 compute-0 nova_compute[192810]: 2025-09-30 21:28:44.225 2 DEBUG oslo_concurrency.lockutils [req-53f2ee63-97a9-4a37-991b-e8c81ac59d27 req-1019a0dc-1297-46b8-b6af-954472cde822 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:28:44 compute-0 nova_compute[192810]: 2025-09-30 21:28:44.226 2 DEBUG oslo_concurrency.lockutils [req-53f2ee63-97a9-4a37-991b-e8c81ac59d27 req-1019a0dc-1297-46b8-b6af-954472cde822 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:28:44 compute-0 nova_compute[192810]: 2025-09-30 21:28:44.226 2 DEBUG nova.network.neutron [req-53f2ee63-97a9-4a37-991b-e8c81ac59d27 req-1019a0dc-1297-46b8-b6af-954472cde822 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Refreshing network info cache for port 12eb0d00-3d96-4777-8dc9-ba184484c7f3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:28:44 compute-0 nova_compute[192810]: 2025-09-30 21:28:44.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:28:44 compute-0 nova_compute[192810]: 2025-09-30 21:28:44.787 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:28:45 compute-0 nova_compute[192810]: 2025-09-30 21:28:45.700 2 INFO nova.virt.libvirt.driver [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Creating config drive at /var/lib/nova/instances/ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5/disk.config
Sep 30 21:28:45 compute-0 nova_compute[192810]: 2025-09-30 21:28:45.704 2 DEBUG oslo_concurrency.processutils [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpydsk40tt execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:28:45 compute-0 nova_compute[192810]: 2025-09-30 21:28:45.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:28:45 compute-0 nova_compute[192810]: 2025-09-30 21:28:45.827 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:45 compute-0 nova_compute[192810]: 2025-09-30 21:28:45.828 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:45 compute-0 nova_compute[192810]: 2025-09-30 21:28:45.828 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:45 compute-0 nova_compute[192810]: 2025-09-30 21:28:45.828 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:28:45 compute-0 nova_compute[192810]: 2025-09-30 21:28:45.829 2 DEBUG oslo_concurrency.processutils [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpydsk40tt" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:28:45 compute-0 kernel: tap12eb0d00-3d: entered promiscuous mode
Sep 30 21:28:45 compute-0 NetworkManager[51733]: <info>  [1759267725.8855] manager: (tap12eb0d00-3d): new Tun device (/org/freedesktop/NetworkManager/Devices/113)
Sep 30 21:28:45 compute-0 nova_compute[192810]: 2025-09-30 21:28:45.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:45 compute-0 ovn_controller[94912]: 2025-09-30T21:28:45Z|00248|binding|INFO|Claiming lport 12eb0d00-3d96-4777-8dc9-ba184484c7f3 for this chassis.
Sep 30 21:28:45 compute-0 ovn_controller[94912]: 2025-09-30T21:28:45Z|00249|binding|INFO|12eb0d00-3d96-4777-8dc9-ba184484c7f3: Claiming fa:16:3e:d1:8c:fd 10.100.0.7
Sep 30 21:28:45 compute-0 nova_compute[192810]: 2025-09-30 21:28:45.889 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:45.901 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d1:8c:fd 10.100.0.7'], port_security=['fa:16:3e:d1:8c:fd 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0d9a0e7d-4f10-48ac-8e5d-13aaaef30422', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9e09247c56a847ac8eb9b104a34420f0', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c058edda-1206-4967-a260-ebab94597051', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f3657ff2-06aa-4f19-a9a9-2102b82bcb72, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=12eb0d00-3d96-4777-8dc9-ba184484c7f3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:28:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:45.902 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 12eb0d00-3d96-4777-8dc9-ba184484c7f3 in datapath 0d9a0e7d-4f10-48ac-8e5d-13aaaef30422 bound to our chassis
Sep 30 21:28:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:45.903 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0d9a0e7d-4f10-48ac-8e5d-13aaaef30422
Sep 30 21:28:45 compute-0 systemd-udevd[230211]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:28:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:45.915 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[1285780d-27e0-453d-81c6-e855b51eb481]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:45.916 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0d9a0e7d-41 in ovnmeta-0d9a0e7d-4f10-48ac-8e5d-13aaaef30422 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:28:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:45.918 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0d9a0e7d-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:28:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:45.919 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[dd305994-a1a5-4ccc-a289-fc58a7cbee13]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:45.919 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[b442f2e0-5cee-497e-aaf5-690060a407a7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:45 compute-0 NetworkManager[51733]: <info>  [1759267725.9304] device (tap12eb0d00-3d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:28:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:45.929 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[fac04304-1057-45e3-a7aa-18f3a25dd1db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:45 compute-0 NetworkManager[51733]: <info>  [1759267725.9316] device (tap12eb0d00-3d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:28:45 compute-0 systemd-machined[152794]: New machine qemu-33-instance-00000046.
Sep 30 21:28:45 compute-0 nova_compute[192810]: 2025-09-30 21:28:45.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:45 compute-0 ovn_controller[94912]: 2025-09-30T21:28:45Z|00250|binding|INFO|Setting lport 12eb0d00-3d96-4777-8dc9-ba184484c7f3 ovn-installed in OVS
Sep 30 21:28:45 compute-0 ovn_controller[94912]: 2025-09-30T21:28:45Z|00251|binding|INFO|Setting lport 12eb0d00-3d96-4777-8dc9-ba184484c7f3 up in Southbound
Sep 30 21:28:45 compute-0 nova_compute[192810]: 2025-09-30 21:28:45.951 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:45 compute-0 systemd[1]: Started Virtual Machine qemu-33-instance-00000046.
Sep 30 21:28:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:45.955 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[cb23de38-9452-4560-ae75-5e4ea8fdf4ba]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:45.982 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[249d8c5d-af75-4803-8a99-6a879080a2d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:45.986 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[48b0f8ee-08c6-44bf-b33a-483aeae90ddd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:45 compute-0 NetworkManager[51733]: <info>  [1759267725.9876] manager: (tap0d9a0e7d-40): new Veth device (/org/freedesktop/NetworkManager/Devices/114)
Sep 30 21:28:45 compute-0 systemd-udevd[230216]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:28:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:46.022 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[ff8d74f3-7996-469f-82f7-19331b183f15]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:46.026 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[fd3f386c-2921-4655-b6a7-8c83103b897c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:46 compute-0 NetworkManager[51733]: <info>  [1759267726.0478] device (tap0d9a0e7d-40): carrier: link connected
Sep 30 21:28:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:46.054 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[dcf5a3a1-5cef-4812-a98d-46f70cbef64d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:46.067 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[b15df61b-0e5b-4332-8e26-83ac66e5ff66]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0d9a0e7d-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d6:b2:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 72], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 442167, 'reachable_time': 28634, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230246, 'error': None, 'target': 'ovnmeta-0d9a0e7d-4f10-48ac-8e5d-13aaaef30422', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:46 compute-0 nova_compute[192810]: 2025-09-30 21:28:46.080 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:28:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:46.083 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[d42e8509-6369-4116-a9df-54c0b344b4ba]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed6:b292'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 442167, 'tstamp': 442167}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230247, 'error': None, 'target': 'ovnmeta-0d9a0e7d-4f10-48ac-8e5d-13aaaef30422', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:46.100 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[929d3556-7712-4366-8572-34c7680e5cec]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0d9a0e7d-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d6:b2:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 72], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 442167, 'reachable_time': 28634, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230249, 'error': None, 'target': 'ovnmeta-0d9a0e7d-4f10-48ac-8e5d-13aaaef30422', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:46.128 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[8bd4dfab-fce3-4265-872b-c22d19e0b2eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:46 compute-0 nova_compute[192810]: 2025-09-30 21:28:46.134 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:28:46 compute-0 nova_compute[192810]: 2025-09-30 21:28:46.135 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:28:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:46.179 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[d5564d09-d907-41fc-a57a-a6e6bba84789]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:46.180 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0d9a0e7d-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:46.180 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:28:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:46.181 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0d9a0e7d-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:46 compute-0 NetworkManager[51733]: <info>  [1759267726.1831] manager: (tap0d9a0e7d-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/115)
Sep 30 21:28:46 compute-0 nova_compute[192810]: 2025-09-30 21:28:46.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:46 compute-0 kernel: tap0d9a0e7d-40: entered promiscuous mode
Sep 30 21:28:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:46.186 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0d9a0e7d-40, col_values=(('external_ids', {'iface-id': 'beda616e-dd79-43fd-ab3d-456c618879b3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:46.190 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0d9a0e7d-4f10-48ac-8e5d-13aaaef30422.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0d9a0e7d-4f10-48ac-8e5d-13aaaef30422.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:28:46 compute-0 ovn_controller[94912]: 2025-09-30T21:28:46Z|00252|binding|INFO|Releasing lport beda616e-dd79-43fd-ab3d-456c618879b3 from this chassis (sb_readonly=0)
Sep 30 21:28:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:46.191 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[2a9dc6e4-6808-417c-8e1a-7fe71750c71d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:46.191 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:28:46 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:28:46 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:28:46 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-0d9a0e7d-4f10-48ac-8e5d-13aaaef30422
Sep 30 21:28:46 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:28:46 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:28:46 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:28:46 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/0d9a0e7d-4f10-48ac-8e5d-13aaaef30422.pid.haproxy
Sep 30 21:28:46 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:28:46 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:28:46 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:28:46 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:28:46 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:28:46 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:28:46 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:28:46 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:28:46 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:28:46 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:28:46 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:28:46 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:28:46 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:28:46 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:28:46 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:28:46 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:28:46 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:28:46 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:28:46 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:28:46 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:28:46 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID 0d9a0e7d-4f10-48ac-8e5d-13aaaef30422
Sep 30 21:28:46 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:28:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:46.193 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0d9a0e7d-4f10-48ac-8e5d-13aaaef30422', 'env', 'PROCESS_TAG=haproxy-0d9a0e7d-4f10-48ac-8e5d-13aaaef30422', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0d9a0e7d-4f10-48ac-8e5d-13aaaef30422.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:28:46 compute-0 nova_compute[192810]: 2025-09-30 21:28:46.200 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:28:46 compute-0 nova_compute[192810]: 2025-09-30 21:28:46.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:46 compute-0 nova_compute[192810]: 2025-09-30 21:28:46.336 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:28:46 compute-0 nova_compute[192810]: 2025-09-30 21:28:46.337 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5686MB free_disk=73.31784057617188GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:28:46 compute-0 nova_compute[192810]: 2025-09-30 21:28:46.338 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:46 compute-0 nova_compute[192810]: 2025-09-30 21:28:46.338 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:46 compute-0 nova_compute[192810]: 2025-09-30 21:28:46.525 2 DEBUG nova.compute.manager [req-b33a5ebd-3294-4736-b5a6-e20796589b5c req-998fa829-0f05-41b3-95f5-6fbf020cdfdb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Received event network-vif-plugged-12eb0d00-3d96-4777-8dc9-ba184484c7f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:28:46 compute-0 nova_compute[192810]: 2025-09-30 21:28:46.526 2 DEBUG oslo_concurrency.lockutils [req-b33a5ebd-3294-4736-b5a6-e20796589b5c req-998fa829-0f05-41b3-95f5-6fbf020cdfdb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:46 compute-0 nova_compute[192810]: 2025-09-30 21:28:46.526 2 DEBUG oslo_concurrency.lockutils [req-b33a5ebd-3294-4736-b5a6-e20796589b5c req-998fa829-0f05-41b3-95f5-6fbf020cdfdb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:46 compute-0 nova_compute[192810]: 2025-09-30 21:28:46.526 2 DEBUG oslo_concurrency.lockutils [req-b33a5ebd-3294-4736-b5a6-e20796589b5c req-998fa829-0f05-41b3-95f5-6fbf020cdfdb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:46 compute-0 nova_compute[192810]: 2025-09-30 21:28:46.527 2 DEBUG nova.compute.manager [req-b33a5ebd-3294-4736-b5a6-e20796589b5c req-998fa829-0f05-41b3-95f5-6fbf020cdfdb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Processing event network-vif-plugged-12eb0d00-3d96-4777-8dc9-ba184484c7f3 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:28:46 compute-0 nova_compute[192810]: 2025-09-30 21:28:46.539 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Instance ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:28:46 compute-0 nova_compute[192810]: 2025-09-30 21:28:46.539 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:28:46 compute-0 nova_compute[192810]: 2025-09-30 21:28:46.539 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:28:46 compute-0 podman[230293]: 2025-09-30 21:28:46.57184702 +0000 UTC m=+0.048441505 container create 0f8572c07c0c050749574f1d8ae656e22efe8d8cc7a0b1861f18d32c43c60c31 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0d9a0e7d-4f10-48ac-8e5d-13aaaef30422, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0)
Sep 30 21:28:46 compute-0 systemd[1]: Started libpod-conmon-0f8572c07c0c050749574f1d8ae656e22efe8d8cc7a0b1861f18d32c43c60c31.scope.
Sep 30 21:28:46 compute-0 nova_compute[192810]: 2025-09-30 21:28:46.600 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:28:46 compute-0 nova_compute[192810]: 2025-09-30 21:28:46.616 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:28:46 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:28:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c1e721d1c7e83be48a4463fa2423af7b88a121c7c5b30bf1b18b8e14aa032ac/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:28:46 compute-0 podman[230293]: 2025-09-30 21:28:46.54533235 +0000 UTC m=+0.021926845 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:28:46 compute-0 podman[230293]: 2025-09-30 21:28:46.645876251 +0000 UTC m=+0.122470756 container init 0f8572c07c0c050749574f1d8ae656e22efe8d8cc7a0b1861f18d32c43c60c31 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0d9a0e7d-4f10-48ac-8e5d-13aaaef30422, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:28:46 compute-0 nova_compute[192810]: 2025-09-30 21:28:46.655 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:28:46 compute-0 nova_compute[192810]: 2025-09-30 21:28:46.655 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.317s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:46 compute-0 podman[230293]: 2025-09-30 21:28:46.656311295 +0000 UTC m=+0.132905780 container start 0f8572c07c0c050749574f1d8ae656e22efe8d8cc7a0b1861f18d32c43c60c31 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0d9a0e7d-4f10-48ac-8e5d-13aaaef30422, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2)
Sep 30 21:28:46 compute-0 neutron-haproxy-ovnmeta-0d9a0e7d-4f10-48ac-8e5d-13aaaef30422[230307]: [NOTICE]   (230329) : New worker (230331) forked
Sep 30 21:28:46 compute-0 neutron-haproxy-ovnmeta-0d9a0e7d-4f10-48ac-8e5d-13aaaef30422[230307]: [NOTICE]   (230329) : Loading success.
Sep 30 21:28:46 compute-0 podman[230304]: 2025-09-30 21:28:46.684531038 +0000 UTC m=+0.081658145 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:28:46 compute-0 nova_compute[192810]: 2025-09-30 21:28:46.850 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267726.8503647, ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:28:46 compute-0 nova_compute[192810]: 2025-09-30 21:28:46.852 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] VM Started (Lifecycle Event)
Sep 30 21:28:46 compute-0 nova_compute[192810]: 2025-09-30 21:28:46.855 2 DEBUG nova.compute.manager [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:28:46 compute-0 nova_compute[192810]: 2025-09-30 21:28:46.861 2 DEBUG nova.virt.libvirt.driver [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:28:46 compute-0 nova_compute[192810]: 2025-09-30 21:28:46.865 2 INFO nova.virt.libvirt.driver [-] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Instance spawned successfully.
Sep 30 21:28:46 compute-0 nova_compute[192810]: 2025-09-30 21:28:46.865 2 DEBUG nova.virt.libvirt.driver [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:28:46 compute-0 nova_compute[192810]: 2025-09-30 21:28:46.878 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:28:46 compute-0 nova_compute[192810]: 2025-09-30 21:28:46.883 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:28:46 compute-0 nova_compute[192810]: 2025-09-30 21:28:46.901 2 DEBUG nova.virt.libvirt.driver [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:28:46 compute-0 nova_compute[192810]: 2025-09-30 21:28:46.902 2 DEBUG nova.virt.libvirt.driver [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:28:46 compute-0 nova_compute[192810]: 2025-09-30 21:28:46.903 2 DEBUG nova.virt.libvirt.driver [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:28:46 compute-0 nova_compute[192810]: 2025-09-30 21:28:46.904 2 DEBUG nova.virt.libvirt.driver [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:28:46 compute-0 nova_compute[192810]: 2025-09-30 21:28:46.905 2 DEBUG nova.virt.libvirt.driver [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:28:46 compute-0 nova_compute[192810]: 2025-09-30 21:28:46.906 2 DEBUG nova.virt.libvirt.driver [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:28:46 compute-0 nova_compute[192810]: 2025-09-30 21:28:46.912 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:28:46 compute-0 nova_compute[192810]: 2025-09-30 21:28:46.913 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267726.8505962, ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:28:46 compute-0 nova_compute[192810]: 2025-09-30 21:28:46.913 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] VM Paused (Lifecycle Event)
Sep 30 21:28:46 compute-0 nova_compute[192810]: 2025-09-30 21:28:46.939 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:28:46 compute-0 nova_compute[192810]: 2025-09-30 21:28:46.944 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267726.8591816, ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:28:46 compute-0 nova_compute[192810]: 2025-09-30 21:28:46.944 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] VM Resumed (Lifecycle Event)
Sep 30 21:28:46 compute-0 nova_compute[192810]: 2025-09-30 21:28:46.971 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:28:46 compute-0 nova_compute[192810]: 2025-09-30 21:28:46.975 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:28:46 compute-0 nova_compute[192810]: 2025-09-30 21:28:46.992 2 INFO nova.compute.manager [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Took 7.34 seconds to spawn the instance on the hypervisor.
Sep 30 21:28:46 compute-0 nova_compute[192810]: 2025-09-30 21:28:46.993 2 DEBUG nova.compute.manager [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:28:46 compute-0 nova_compute[192810]: 2025-09-30 21:28:46.997 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:28:47 compute-0 nova_compute[192810]: 2025-09-30 21:28:47.090 2 INFO nova.compute.manager [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Took 7.94 seconds to build instance.
Sep 30 21:28:47 compute-0 nova_compute[192810]: 2025-09-30 21:28:47.143 2 DEBUG oslo_concurrency.lockutils [None req-999ea9ab-83e0-4a1b-8162-7ed3e02e0de1 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Lock "ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.076s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:47 compute-0 nova_compute[192810]: 2025-09-30 21:28:47.143 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:48 compute-0 nova_compute[192810]: 2025-09-30 21:28:48.084 2 DEBUG oslo_concurrency.lockutils [None req-87656117-314e-4003-8e06-80ac03754ef0 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Acquiring lock "interface-ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:48 compute-0 nova_compute[192810]: 2025-09-30 21:28:48.085 2 DEBUG oslo_concurrency.lockutils [None req-87656117-314e-4003-8e06-80ac03754ef0 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Lock "interface-ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:48 compute-0 nova_compute[192810]: 2025-09-30 21:28:48.086 2 DEBUG nova.objects.instance [None req-87656117-314e-4003-8e06-80ac03754ef0 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Lazy-loading 'flavor' on Instance uuid ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:28:48 compute-0 nova_compute[192810]: 2025-09-30 21:28:48.132 2 DEBUG nova.objects.instance [None req-87656117-314e-4003-8e06-80ac03754ef0 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Lazy-loading 'pci_requests' on Instance uuid ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:28:48 compute-0 nova_compute[192810]: 2025-09-30 21:28:48.145 2 DEBUG nova.network.neutron [None req-87656117-314e-4003-8e06-80ac03754ef0 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:28:48 compute-0 nova_compute[192810]: 2025-09-30 21:28:48.651 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:28:48 compute-0 nova_compute[192810]: 2025-09-30 21:28:48.651 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:28:48 compute-0 nova_compute[192810]: 2025-09-30 21:28:48.784 2 DEBUG nova.network.neutron [req-53f2ee63-97a9-4a37-991b-e8c81ac59d27 req-1019a0dc-1297-46b8-b6af-954472cde822 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Updated VIF entry in instance network info cache for port 12eb0d00-3d96-4777-8dc9-ba184484c7f3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:28:48 compute-0 nova_compute[192810]: 2025-09-30 21:28:48.785 2 DEBUG nova.network.neutron [req-53f2ee63-97a9-4a37-991b-e8c81ac59d27 req-1019a0dc-1297-46b8-b6af-954472cde822 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Updating instance_info_cache with network_info: [{"id": "12eb0d00-3d96-4777-8dc9-ba184484c7f3", "address": "fa:16:3e:d1:8c:fd", "network": {"id": "0d9a0e7d-4f10-48ac-8e5d-13aaaef30422", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-77895882-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9e09247c56a847ac8eb9b104a34420f0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12eb0d00-3d", "ovs_interfaceid": "12eb0d00-3d96-4777-8dc9-ba184484c7f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:28:48 compute-0 nova_compute[192810]: 2025-09-30 21:28:48.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:28:48 compute-0 nova_compute[192810]: 2025-09-30 21:28:48.787 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:28:48 compute-0 nova_compute[192810]: 2025-09-30 21:28:48.787 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:28:48 compute-0 nova_compute[192810]: 2025-09-30 21:28:48.814 2 DEBUG oslo_concurrency.lockutils [req-53f2ee63-97a9-4a37-991b-e8c81ac59d27 req-1019a0dc-1297-46b8-b6af-954472cde822 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:28:48 compute-0 nova_compute[192810]: 2025-09-30 21:28:48.821 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:48 compute-0 nova_compute[192810]: 2025-09-30 21:28:48.887 2 DEBUG nova.compute.manager [req-ae8467d0-118f-4e9c-a5b6-fcb5977477c6 req-e21acd8a-b000-4ef5-b481-8a6d1f824497 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Received event network-vif-plugged-12eb0d00-3d96-4777-8dc9-ba184484c7f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:28:48 compute-0 nova_compute[192810]: 2025-09-30 21:28:48.887 2 DEBUG oslo_concurrency.lockutils [req-ae8467d0-118f-4e9c-a5b6-fcb5977477c6 req-e21acd8a-b000-4ef5-b481-8a6d1f824497 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:48 compute-0 nova_compute[192810]: 2025-09-30 21:28:48.888 2 DEBUG oslo_concurrency.lockutils [req-ae8467d0-118f-4e9c-a5b6-fcb5977477c6 req-e21acd8a-b000-4ef5-b481-8a6d1f824497 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:48 compute-0 nova_compute[192810]: 2025-09-30 21:28:48.888 2 DEBUG oslo_concurrency.lockutils [req-ae8467d0-118f-4e9c-a5b6-fcb5977477c6 req-e21acd8a-b000-4ef5-b481-8a6d1f824497 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:48 compute-0 nova_compute[192810]: 2025-09-30 21:28:48.888 2 DEBUG nova.compute.manager [req-ae8467d0-118f-4e9c-a5b6-fcb5977477c6 req-e21acd8a-b000-4ef5-b481-8a6d1f824497 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] No waiting events found dispatching network-vif-plugged-12eb0d00-3d96-4777-8dc9-ba184484c7f3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:28:48 compute-0 nova_compute[192810]: 2025-09-30 21:28:48.888 2 WARNING nova.compute.manager [req-ae8467d0-118f-4e9c-a5b6-fcb5977477c6 req-e21acd8a-b000-4ef5-b481-8a6d1f824497 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Received unexpected event network-vif-plugged-12eb0d00-3d96-4777-8dc9-ba184484c7f3 for instance with vm_state active and task_state None.
Sep 30 21:28:49 compute-0 nova_compute[192810]: 2025-09-30 21:28:49.087 2 DEBUG nova.policy [None req-87656117-314e-4003-8e06-80ac03754ef0 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5b7524018e934e07b3b6e81fdcf852aa', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9e09247c56a847ac8eb9b104a34420f0', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:28:49 compute-0 nova_compute[192810]: 2025-09-30 21:28:49.107 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "refresh_cache-ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:28:49 compute-0 nova_compute[192810]: 2025-09-30 21:28:49.108 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquired lock "refresh_cache-ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:28:49 compute-0 nova_compute[192810]: 2025-09-30 21:28:49.108 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Sep 30 21:28:49 compute-0 nova_compute[192810]: 2025-09-30 21:28:49.109 2 DEBUG nova.objects.instance [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lazy-loading 'info_cache' on Instance uuid ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:28:49 compute-0 nova_compute[192810]: 2025-09-30 21:28:49.113 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267714.1114457, 3e2a9ae7-e454-4121-905d-4472ddfc2410 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:28:49 compute-0 nova_compute[192810]: 2025-09-30 21:28:49.113 2 INFO nova.compute.manager [-] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] VM Stopped (Lifecycle Event)
Sep 30 21:28:49 compute-0 nova_compute[192810]: 2025-09-30 21:28:49.140 2 DEBUG nova.compute.manager [None req-be059eb8-1d71-4677-bc4c-994b7cbc7b5e - - - - - -] [instance: 3e2a9ae7-e454-4121-905d-4472ddfc2410] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:28:50 compute-0 nova_compute[192810]: 2025-09-30 21:28:50.258 2 DEBUG nova.network.neutron [None req-87656117-314e-4003-8e06-80ac03754ef0 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Successfully created port: 4690a3a8-44b4-4d33-8d58-e739294549bd _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:28:50 compute-0 sshd-session[230341]: Invalid user smb from 45.81.23.80 port 37182
Sep 30 21:28:50 compute-0 sshd-session[230341]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:28:50 compute-0 sshd-session[230341]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=45.81.23.80
Sep 30 21:28:51 compute-0 nova_compute[192810]: 2025-09-30 21:28:51.234 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Updating instance_info_cache with network_info: [{"id": "12eb0d00-3d96-4777-8dc9-ba184484c7f3", "address": "fa:16:3e:d1:8c:fd", "network": {"id": "0d9a0e7d-4f10-48ac-8e5d-13aaaef30422", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-77895882-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9e09247c56a847ac8eb9b104a34420f0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12eb0d00-3d", "ovs_interfaceid": "12eb0d00-3d96-4777-8dc9-ba184484c7f3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:28:51 compute-0 nova_compute[192810]: 2025-09-30 21:28:51.257 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Releasing lock "refresh_cache-ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:28:51 compute-0 nova_compute[192810]: 2025-09-30 21:28:51.257 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Sep 30 21:28:51 compute-0 nova_compute[192810]: 2025-09-30 21:28:51.257 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:28:51 compute-0 nova_compute[192810]: 2025-09-30 21:28:51.620 2 DEBUG nova.network.neutron [None req-87656117-314e-4003-8e06-80ac03754ef0 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Successfully updated port: 4690a3a8-44b4-4d33-8d58-e739294549bd _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:28:51 compute-0 nova_compute[192810]: 2025-09-30 21:28:51.636 2 DEBUG oslo_concurrency.lockutils [None req-87656117-314e-4003-8e06-80ac03754ef0 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Acquiring lock "refresh_cache-ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:28:51 compute-0 nova_compute[192810]: 2025-09-30 21:28:51.636 2 DEBUG oslo_concurrency.lockutils [None req-87656117-314e-4003-8e06-80ac03754ef0 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Acquired lock "refresh_cache-ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:28:51 compute-0 nova_compute[192810]: 2025-09-30 21:28:51.636 2 DEBUG nova.network.neutron [None req-87656117-314e-4003-8e06-80ac03754ef0 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:28:51 compute-0 nova_compute[192810]: 2025-09-30 21:28:51.924 2 WARNING nova.network.neutron [None req-87656117-314e-4003-8e06-80ac03754ef0 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] 0d9a0e7d-4f10-48ac-8e5d-13aaaef30422 already exists in list: networks containing: ['0d9a0e7d-4f10-48ac-8e5d-13aaaef30422']. ignoring it
Sep 30 21:28:51 compute-0 sshd-session[230341]: Failed password for invalid user smb from 45.81.23.80 port 37182 ssh2
Sep 30 21:28:52 compute-0 nova_compute[192810]: 2025-09-30 21:28:52.146 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:52 compute-0 sshd-session[230341]: Received disconnect from 45.81.23.80 port 37182:11: Bye Bye [preauth]
Sep 30 21:28:52 compute-0 sshd-session[230341]: Disconnected from invalid user smb 45.81.23.80 port 37182 [preauth]
Sep 30 21:28:53 compute-0 nova_compute[192810]: 2025-09-30 21:28:53.637 2 DEBUG nova.compute.manager [req-cade4669-efbd-4a00-b3a3-66fb8c7d4b5f req-e919b523-5299-40d1-b692-cc1aa748e993 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Received event network-changed-4690a3a8-44b4-4d33-8d58-e739294549bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:28:53 compute-0 nova_compute[192810]: 2025-09-30 21:28:53.638 2 DEBUG nova.compute.manager [req-cade4669-efbd-4a00-b3a3-66fb8c7d4b5f req-e919b523-5299-40d1-b692-cc1aa748e993 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Refreshing instance network info cache due to event network-changed-4690a3a8-44b4-4d33-8d58-e739294549bd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:28:53 compute-0 nova_compute[192810]: 2025-09-30 21:28:53.638 2 DEBUG oslo_concurrency.lockutils [req-cade4669-efbd-4a00-b3a3-66fb8c7d4b5f req-e919b523-5299-40d1-b692-cc1aa748e993 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:28:53 compute-0 nova_compute[192810]: 2025-09-30 21:28:53.824 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:54 compute-0 nova_compute[192810]: 2025-09-30 21:28:54.546 2 DEBUG nova.network.neutron [None req-87656117-314e-4003-8e06-80ac03754ef0 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Updating instance_info_cache with network_info: [{"id": "12eb0d00-3d96-4777-8dc9-ba184484c7f3", "address": "fa:16:3e:d1:8c:fd", "network": {"id": "0d9a0e7d-4f10-48ac-8e5d-13aaaef30422", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-77895882-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9e09247c56a847ac8eb9b104a34420f0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12eb0d00-3d", "ovs_interfaceid": "12eb0d00-3d96-4777-8dc9-ba184484c7f3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "4690a3a8-44b4-4d33-8d58-e739294549bd", "address": "fa:16:3e:2c:d9:32", "network": {"id": "0d9a0e7d-4f10-48ac-8e5d-13aaaef30422", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-77895882-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9e09247c56a847ac8eb9b104a34420f0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4690a3a8-44", "ovs_interfaceid": "4690a3a8-44b4-4d33-8d58-e739294549bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:28:54 compute-0 nova_compute[192810]: 2025-09-30 21:28:54.569 2 DEBUG oslo_concurrency.lockutils [None req-87656117-314e-4003-8e06-80ac03754ef0 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Releasing lock "refresh_cache-ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:28:54 compute-0 nova_compute[192810]: 2025-09-30 21:28:54.570 2 DEBUG oslo_concurrency.lockutils [req-cade4669-efbd-4a00-b3a3-66fb8c7d4b5f req-e919b523-5299-40d1-b692-cc1aa748e993 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:28:54 compute-0 nova_compute[192810]: 2025-09-30 21:28:54.570 2 DEBUG nova.network.neutron [req-cade4669-efbd-4a00-b3a3-66fb8c7d4b5f req-e919b523-5299-40d1-b692-cc1aa748e993 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Refreshing network info cache for port 4690a3a8-44b4-4d33-8d58-e739294549bd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:28:54 compute-0 nova_compute[192810]: 2025-09-30 21:28:54.573 2 DEBUG nova.virt.libvirt.vif [None req-87656117-314e-4003-8e06-80ac03754ef0 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:28:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-1621545046',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-1621545046',id=70,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:28:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='9e09247c56a847ac8eb9b104a34420f0',ramdisk_id='',reservation_id='r-g0khdglp',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesV270Test-1546994059',owner_user_name='tempest-AttachInterfacesV270Test-1546994059-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:28:47Z,user_data=None,user_id='5b7524018e934e07b3b6e81fdcf852aa',uuid=ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4690a3a8-44b4-4d33-8d58-e739294549bd", "address": "fa:16:3e:2c:d9:32", "network": {"id": "0d9a0e7d-4f10-48ac-8e5d-13aaaef30422", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-77895882-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9e09247c56a847ac8eb9b104a34420f0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4690a3a8-44", "ovs_interfaceid": "4690a3a8-44b4-4d33-8d58-e739294549bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:28:54 compute-0 nova_compute[192810]: 2025-09-30 21:28:54.574 2 DEBUG nova.network.os_vif_util [None req-87656117-314e-4003-8e06-80ac03754ef0 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Converting VIF {"id": "4690a3a8-44b4-4d33-8d58-e739294549bd", "address": "fa:16:3e:2c:d9:32", "network": {"id": "0d9a0e7d-4f10-48ac-8e5d-13aaaef30422", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-77895882-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9e09247c56a847ac8eb9b104a34420f0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4690a3a8-44", "ovs_interfaceid": "4690a3a8-44b4-4d33-8d58-e739294549bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:28:54 compute-0 nova_compute[192810]: 2025-09-30 21:28:54.574 2 DEBUG nova.network.os_vif_util [None req-87656117-314e-4003-8e06-80ac03754ef0 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2c:d9:32,bridge_name='br-int',has_traffic_filtering=True,id=4690a3a8-44b4-4d33-8d58-e739294549bd,network=Network(0d9a0e7d-4f10-48ac-8e5d-13aaaef30422),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4690a3a8-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:28:54 compute-0 nova_compute[192810]: 2025-09-30 21:28:54.575 2 DEBUG os_vif [None req-87656117-314e-4003-8e06-80ac03754ef0 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2c:d9:32,bridge_name='br-int',has_traffic_filtering=True,id=4690a3a8-44b4-4d33-8d58-e739294549bd,network=Network(0d9a0e7d-4f10-48ac-8e5d-13aaaef30422),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4690a3a8-44') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:28:54 compute-0 nova_compute[192810]: 2025-09-30 21:28:54.575 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:54 compute-0 nova_compute[192810]: 2025-09-30 21:28:54.576 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:54 compute-0 nova_compute[192810]: 2025-09-30 21:28:54.576 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:28:54 compute-0 nova_compute[192810]: 2025-09-30 21:28:54.579 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:54 compute-0 nova_compute[192810]: 2025-09-30 21:28:54.579 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4690a3a8-44, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:54 compute-0 nova_compute[192810]: 2025-09-30 21:28:54.580 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4690a3a8-44, col_values=(('external_ids', {'iface-id': '4690a3a8-44b4-4d33-8d58-e739294549bd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2c:d9:32', 'vm-uuid': 'ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:54 compute-0 nova_compute[192810]: 2025-09-30 21:28:54.581 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:54 compute-0 NetworkManager[51733]: <info>  [1759267734.5826] manager: (tap4690a3a8-44): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/116)
Sep 30 21:28:54 compute-0 nova_compute[192810]: 2025-09-30 21:28:54.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:28:54 compute-0 nova_compute[192810]: 2025-09-30 21:28:54.592 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:54 compute-0 nova_compute[192810]: 2025-09-30 21:28:54.593 2 INFO os_vif [None req-87656117-314e-4003-8e06-80ac03754ef0 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2c:d9:32,bridge_name='br-int',has_traffic_filtering=True,id=4690a3a8-44b4-4d33-8d58-e739294549bd,network=Network(0d9a0e7d-4f10-48ac-8e5d-13aaaef30422),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4690a3a8-44')
Sep 30 21:28:54 compute-0 nova_compute[192810]: 2025-09-30 21:28:54.594 2 DEBUG nova.virt.libvirt.vif [None req-87656117-314e-4003-8e06-80ac03754ef0 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:28:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-1621545046',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-1621545046',id=70,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:28:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='9e09247c56a847ac8eb9b104a34420f0',ramdisk_id='',reservation_id='r-g0khdglp',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesV270Test-1546994059',owner_user_name='tempest-AttachInterfacesV270Test-1546994059-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:28:47Z,user_data=None,user_id='5b7524018e934e07b3b6e81fdcf852aa',uuid=ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4690a3a8-44b4-4d33-8d58-e739294549bd", "address": "fa:16:3e:2c:d9:32", "network": {"id": "0d9a0e7d-4f10-48ac-8e5d-13aaaef30422", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-77895882-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9e09247c56a847ac8eb9b104a34420f0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4690a3a8-44", "ovs_interfaceid": "4690a3a8-44b4-4d33-8d58-e739294549bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:28:54 compute-0 nova_compute[192810]: 2025-09-30 21:28:54.594 2 DEBUG nova.network.os_vif_util [None req-87656117-314e-4003-8e06-80ac03754ef0 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Converting VIF {"id": "4690a3a8-44b4-4d33-8d58-e739294549bd", "address": "fa:16:3e:2c:d9:32", "network": {"id": "0d9a0e7d-4f10-48ac-8e5d-13aaaef30422", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-77895882-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9e09247c56a847ac8eb9b104a34420f0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4690a3a8-44", "ovs_interfaceid": "4690a3a8-44b4-4d33-8d58-e739294549bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:28:54 compute-0 nova_compute[192810]: 2025-09-30 21:28:54.594 2 DEBUG nova.network.os_vif_util [None req-87656117-314e-4003-8e06-80ac03754ef0 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2c:d9:32,bridge_name='br-int',has_traffic_filtering=True,id=4690a3a8-44b4-4d33-8d58-e739294549bd,network=Network(0d9a0e7d-4f10-48ac-8e5d-13aaaef30422),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4690a3a8-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:28:54 compute-0 nova_compute[192810]: 2025-09-30 21:28:54.597 2 DEBUG nova.virt.libvirt.guest [None req-87656117-314e-4003-8e06-80ac03754ef0 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] attach device xml: <interface type="ethernet">
Sep 30 21:28:54 compute-0 nova_compute[192810]:   <mac address="fa:16:3e:2c:d9:32"/>
Sep 30 21:28:54 compute-0 nova_compute[192810]:   <model type="virtio"/>
Sep 30 21:28:54 compute-0 nova_compute[192810]:   <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:28:54 compute-0 nova_compute[192810]:   <mtu size="1442"/>
Sep 30 21:28:54 compute-0 nova_compute[192810]:   <target dev="tap4690a3a8-44"/>
Sep 30 21:28:54 compute-0 nova_compute[192810]: </interface>
Sep 30 21:28:54 compute-0 nova_compute[192810]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Sep 30 21:28:54 compute-0 NetworkManager[51733]: <info>  [1759267734.6080] manager: (tap4690a3a8-44): new Tun device (/org/freedesktop/NetworkManager/Devices/117)
Sep 30 21:28:54 compute-0 kernel: tap4690a3a8-44: entered promiscuous mode
Sep 30 21:28:54 compute-0 nova_compute[192810]: 2025-09-30 21:28:54.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:54 compute-0 ovn_controller[94912]: 2025-09-30T21:28:54Z|00253|binding|INFO|Claiming lport 4690a3a8-44b4-4d33-8d58-e739294549bd for this chassis.
Sep 30 21:28:54 compute-0 ovn_controller[94912]: 2025-09-30T21:28:54Z|00254|binding|INFO|4690a3a8-44b4-4d33-8d58-e739294549bd: Claiming fa:16:3e:2c:d9:32 10.100.0.3
Sep 30 21:28:54 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:54.616 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2c:d9:32 10.100.0.3'], port_security=['fa:16:3e:2c:d9:32 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0d9a0e7d-4f10-48ac-8e5d-13aaaef30422', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9e09247c56a847ac8eb9b104a34420f0', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c058edda-1206-4967-a260-ebab94597051', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f3657ff2-06aa-4f19-a9a9-2102b82bcb72, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=4690a3a8-44b4-4d33-8d58-e739294549bd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:28:54 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:54.617 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 4690a3a8-44b4-4d33-8d58-e739294549bd in datapath 0d9a0e7d-4f10-48ac-8e5d-13aaaef30422 bound to our chassis
Sep 30 21:28:54 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:54.619 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0d9a0e7d-4f10-48ac-8e5d-13aaaef30422
Sep 30 21:28:54 compute-0 ovn_controller[94912]: 2025-09-30T21:28:54Z|00255|binding|INFO|Setting lport 4690a3a8-44b4-4d33-8d58-e739294549bd ovn-installed in OVS
Sep 30 21:28:54 compute-0 ovn_controller[94912]: 2025-09-30T21:28:54Z|00256|binding|INFO|Setting lport 4690a3a8-44b4-4d33-8d58-e739294549bd up in Southbound
Sep 30 21:28:54 compute-0 nova_compute[192810]: 2025-09-30 21:28:54.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:54 compute-0 nova_compute[192810]: 2025-09-30 21:28:54.635 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:54 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:54.635 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[99897194-c37c-419b-acce-8e22b0060471]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:54 compute-0 systemd-udevd[230370]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:28:54 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:54.679 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[6f18f5b8-f11d-43dc-bade-a7f84d626df7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:54 compute-0 NetworkManager[51733]: <info>  [1759267734.6829] device (tap4690a3a8-44): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:28:54 compute-0 NetworkManager[51733]: <info>  [1759267734.6841] device (tap4690a3a8-44): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:28:54 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:54.684 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[83042079-2e2e-4df8-b46f-f2100d9eb7f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:54 compute-0 nova_compute[192810]: 2025-09-30 21:28:54.703 2 DEBUG nova.virt.libvirt.driver [None req-87656117-314e-4003-8e06-80ac03754ef0 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:28:54 compute-0 nova_compute[192810]: 2025-09-30 21:28:54.704 2 DEBUG nova.virt.libvirt.driver [None req-87656117-314e-4003-8e06-80ac03754ef0 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:28:54 compute-0 nova_compute[192810]: 2025-09-30 21:28:54.704 2 DEBUG nova.virt.libvirt.driver [None req-87656117-314e-4003-8e06-80ac03754ef0 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] No VIF found with MAC fa:16:3e:d1:8c:fd, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:28:54 compute-0 nova_compute[192810]: 2025-09-30 21:28:54.704 2 DEBUG nova.virt.libvirt.driver [None req-87656117-314e-4003-8e06-80ac03754ef0 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] No VIF found with MAC fa:16:3e:2c:d9:32, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:28:54 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:54.716 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[bd6c4bc3-625c-4584-b760-2cec97aa99a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:54 compute-0 podman[230348]: 2025-09-30 21:28:54.727568941 +0000 UTC m=+0.079349276 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Sep 30 21:28:54 compute-0 nova_compute[192810]: 2025-09-30 21:28:54.729 2 DEBUG nova.virt.libvirt.guest [None req-87656117-314e-4003-8e06-80ac03754ef0 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:28:54 compute-0 nova_compute[192810]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:28:54 compute-0 nova_compute[192810]:   <nova:name>tempest-AttachInterfacesV270Test-server-1621545046</nova:name>
Sep 30 21:28:54 compute-0 nova_compute[192810]:   <nova:creationTime>2025-09-30 21:28:54</nova:creationTime>
Sep 30 21:28:54 compute-0 nova_compute[192810]:   <nova:flavor name="m1.nano">
Sep 30 21:28:54 compute-0 nova_compute[192810]:     <nova:memory>128</nova:memory>
Sep 30 21:28:54 compute-0 nova_compute[192810]:     <nova:disk>1</nova:disk>
Sep 30 21:28:54 compute-0 nova_compute[192810]:     <nova:swap>0</nova:swap>
Sep 30 21:28:54 compute-0 nova_compute[192810]:     <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:28:54 compute-0 nova_compute[192810]:     <nova:vcpus>1</nova:vcpus>
Sep 30 21:28:54 compute-0 nova_compute[192810]:   </nova:flavor>
Sep 30 21:28:54 compute-0 nova_compute[192810]:   <nova:owner>
Sep 30 21:28:54 compute-0 nova_compute[192810]:     <nova:user uuid="5b7524018e934e07b3b6e81fdcf852aa">tempest-AttachInterfacesV270Test-1546994059-project-member</nova:user>
Sep 30 21:28:54 compute-0 nova_compute[192810]:     <nova:project uuid="9e09247c56a847ac8eb9b104a34420f0">tempest-AttachInterfacesV270Test-1546994059</nova:project>
Sep 30 21:28:54 compute-0 nova_compute[192810]:   </nova:owner>
Sep 30 21:28:54 compute-0 nova_compute[192810]:   <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:28:54 compute-0 nova_compute[192810]:   <nova:ports>
Sep 30 21:28:54 compute-0 nova_compute[192810]:     <nova:port uuid="12eb0d00-3d96-4777-8dc9-ba184484c7f3">
Sep 30 21:28:54 compute-0 nova_compute[192810]:       <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Sep 30 21:28:54 compute-0 nova_compute[192810]:     </nova:port>
Sep 30 21:28:54 compute-0 nova_compute[192810]:     <nova:port uuid="4690a3a8-44b4-4d33-8d58-e739294549bd">
Sep 30 21:28:54 compute-0 nova_compute[192810]:       <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Sep 30 21:28:54 compute-0 nova_compute[192810]:     </nova:port>
Sep 30 21:28:54 compute-0 nova_compute[192810]:   </nova:ports>
Sep 30 21:28:54 compute-0 nova_compute[192810]: </nova:instance>
Sep 30 21:28:54 compute-0 nova_compute[192810]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Sep 30 21:28:54 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:54.738 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[b56ca2fc-a3b7-45fb-a6c4-54c0222c5911]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0d9a0e7d-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d6:b2:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 832, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 832, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 72], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 442167, 'reachable_time': 28634, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230400, 'error': None, 'target': 'ovnmeta-0d9a0e7d-4f10-48ac-8e5d-13aaaef30422', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:54 compute-0 podman[230351]: 2025-09-30 21:28:54.75245367 +0000 UTC m=+0.102045370 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-type=git, build-date=2025-08-20T13:12:41, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a 
stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public)
Sep 30 21:28:54 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:54.755 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[00cfabac-5c47-4b72-9659-37e51f291aa5]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap0d9a0e7d-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 442177, 'tstamp': 442177}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230401, 'error': None, 'target': 'ovnmeta-0d9a0e7d-4f10-48ac-8e5d-13aaaef30422', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap0d9a0e7d-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 442179, 'tstamp': 442179}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230401, 'error': None, 'target': 'ovnmeta-0d9a0e7d-4f10-48ac-8e5d-13aaaef30422', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:54 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:54.757 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0d9a0e7d-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:54 compute-0 nova_compute[192810]: 2025-09-30 21:28:54.760 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:54 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:54.761 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0d9a0e7d-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:54 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:54.762 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:28:54 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:54.762 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0d9a0e7d-40, col_values=(('external_ids', {'iface-id': 'beda616e-dd79-43fd-ab3d-456c618879b3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:54 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:54.763 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:28:54 compute-0 nova_compute[192810]: 2025-09-30 21:28:54.765 2 DEBUG oslo_concurrency.lockutils [None req-87656117-314e-4003-8e06-80ac03754ef0 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Lock "interface-ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 6.680s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:56 compute-0 nova_compute[192810]: 2025-09-30 21:28:56.326 2 DEBUG nova.compute.manager [req-2e53aa6b-5575-4e83-ad19-cce73b62feec req-e17b4804-5551-4ea7-b815-6c7c96b3cd29 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Received event network-vif-plugged-4690a3a8-44b4-4d33-8d58-e739294549bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:28:56 compute-0 nova_compute[192810]: 2025-09-30 21:28:56.327 2 DEBUG oslo_concurrency.lockutils [req-2e53aa6b-5575-4e83-ad19-cce73b62feec req-e17b4804-5551-4ea7-b815-6c7c96b3cd29 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:56 compute-0 nova_compute[192810]: 2025-09-30 21:28:56.327 2 DEBUG oslo_concurrency.lockutils [req-2e53aa6b-5575-4e83-ad19-cce73b62feec req-e17b4804-5551-4ea7-b815-6c7c96b3cd29 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:56 compute-0 nova_compute[192810]: 2025-09-30 21:28:56.327 2 DEBUG oslo_concurrency.lockutils [req-2e53aa6b-5575-4e83-ad19-cce73b62feec req-e17b4804-5551-4ea7-b815-6c7c96b3cd29 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:56 compute-0 nova_compute[192810]: 2025-09-30 21:28:56.327 2 DEBUG nova.compute.manager [req-2e53aa6b-5575-4e83-ad19-cce73b62feec req-e17b4804-5551-4ea7-b815-6c7c96b3cd29 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] No waiting events found dispatching network-vif-plugged-4690a3a8-44b4-4d33-8d58-e739294549bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:28:56 compute-0 nova_compute[192810]: 2025-09-30 21:28:56.328 2 WARNING nova.compute.manager [req-2e53aa6b-5575-4e83-ad19-cce73b62feec req-e17b4804-5551-4ea7-b815-6c7c96b3cd29 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Received unexpected event network-vif-plugged-4690a3a8-44b4-4d33-8d58-e739294549bd for instance with vm_state active and task_state None.
Sep 30 21:28:56 compute-0 nova_compute[192810]: 2025-09-30 21:28:56.756 2 DEBUG nova.network.neutron [req-cade4669-efbd-4a00-b3a3-66fb8c7d4b5f req-e919b523-5299-40d1-b692-cc1aa748e993 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Updated VIF entry in instance network info cache for port 4690a3a8-44b4-4d33-8d58-e739294549bd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:28:56 compute-0 nova_compute[192810]: 2025-09-30 21:28:56.757 2 DEBUG nova.network.neutron [req-cade4669-efbd-4a00-b3a3-66fb8c7d4b5f req-e919b523-5299-40d1-b692-cc1aa748e993 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Updating instance_info_cache with network_info: [{"id": "12eb0d00-3d96-4777-8dc9-ba184484c7f3", "address": "fa:16:3e:d1:8c:fd", "network": {"id": "0d9a0e7d-4f10-48ac-8e5d-13aaaef30422", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-77895882-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9e09247c56a847ac8eb9b104a34420f0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12eb0d00-3d", "ovs_interfaceid": "12eb0d00-3d96-4777-8dc9-ba184484c7f3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "4690a3a8-44b4-4d33-8d58-e739294549bd", "address": "fa:16:3e:2c:d9:32", "network": {"id": "0d9a0e7d-4f10-48ac-8e5d-13aaaef30422", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-77895882-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"9e09247c56a847ac8eb9b104a34420f0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4690a3a8-44", "ovs_interfaceid": "4690a3a8-44b4-4d33-8d58-e739294549bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:28:56 compute-0 nova_compute[192810]: 2025-09-30 21:28:56.770 2 DEBUG oslo_concurrency.lockutils [req-cade4669-efbd-4a00-b3a3-66fb8c7d4b5f req-e919b523-5299-40d1-b692-cc1aa748e993 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.202 2 DEBUG oslo_concurrency.lockutils [None req-83daafae-47c7-4057-8593-d0ee129d967f 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Acquiring lock "ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.202 2 DEBUG oslo_concurrency.lockutils [None req-83daafae-47c7-4057-8593-d0ee129d967f 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Lock "ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.203 2 DEBUG oslo_concurrency.lockutils [None req-83daafae-47c7-4057-8593-d0ee129d967f 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Acquiring lock "ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.203 2 DEBUG oslo_concurrency.lockutils [None req-83daafae-47c7-4057-8593-d0ee129d967f 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Lock "ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.203 2 DEBUG oslo_concurrency.lockutils [None req-83daafae-47c7-4057-8593-d0ee129d967f 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Lock "ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.213 2 INFO nova.compute.manager [None req-83daafae-47c7-4057-8593-d0ee129d967f 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Terminating instance
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.221 2 DEBUG nova.compute.manager [None req-83daafae-47c7-4057-8593-d0ee129d967f 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:28:57 compute-0 kernel: tap12eb0d00-3d (unregistering): left promiscuous mode
Sep 30 21:28:57 compute-0 NetworkManager[51733]: <info>  [1759267737.2466] device (tap12eb0d00-3d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:28:57 compute-0 ovn_controller[94912]: 2025-09-30T21:28:57Z|00257|binding|INFO|Releasing lport 12eb0d00-3d96-4777-8dc9-ba184484c7f3 from this chassis (sb_readonly=0)
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.253 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:57 compute-0 ovn_controller[94912]: 2025-09-30T21:28:57Z|00258|binding|INFO|Setting lport 12eb0d00-3d96-4777-8dc9-ba184484c7f3 down in Southbound
Sep 30 21:28:57 compute-0 ovn_controller[94912]: 2025-09-30T21:28:57Z|00259|binding|INFO|Removing iface tap12eb0d00-3d ovn-installed in OVS
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:57.261 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d1:8c:fd 10.100.0.7'], port_security=['fa:16:3e:d1:8c:fd 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0d9a0e7d-4f10-48ac-8e5d-13aaaef30422', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9e09247c56a847ac8eb9b104a34420f0', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c058edda-1206-4967-a260-ebab94597051', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f3657ff2-06aa-4f19-a9a9-2102b82bcb72, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=12eb0d00-3d96-4777-8dc9-ba184484c7f3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:28:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:57.262 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 12eb0d00-3d96-4777-8dc9-ba184484c7f3 in datapath 0d9a0e7d-4f10-48ac-8e5d-13aaaef30422 unbound from our chassis
Sep 30 21:28:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:57.263 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0d9a0e7d-4f10-48ac-8e5d-13aaaef30422
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:57.281 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[7d0cd1c6-18c7-423e-9fa0-ad5c0ffa3896]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:57 compute-0 kernel: tap4690a3a8-44 (unregistering): left promiscuous mode
Sep 30 21:28:57 compute-0 NetworkManager[51733]: <info>  [1759267737.2887] device (tap4690a3a8-44): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.299 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:57 compute-0 ovn_controller[94912]: 2025-09-30T21:28:57Z|00260|binding|INFO|Releasing lport 4690a3a8-44b4-4d33-8d58-e739294549bd from this chassis (sb_readonly=0)
Sep 30 21:28:57 compute-0 ovn_controller[94912]: 2025-09-30T21:28:57Z|00261|binding|INFO|Setting lport 4690a3a8-44b4-4d33-8d58-e739294549bd down in Southbound
Sep 30 21:28:57 compute-0 ovn_controller[94912]: 2025-09-30T21:28:57Z|00262|binding|INFO|Removing iface tap4690a3a8-44 ovn-installed in OVS
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:57.306 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2c:d9:32 10.100.0.3'], port_security=['fa:16:3e:2c:d9:32 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0d9a0e7d-4f10-48ac-8e5d-13aaaef30422', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9e09247c56a847ac8eb9b104a34420f0', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c058edda-1206-4967-a260-ebab94597051', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f3657ff2-06aa-4f19-a9a9-2102b82bcb72, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=4690a3a8-44b4-4d33-8d58-e739294549bd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.310 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:57.317 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[3248624c-31ea-4807-b937-ecd6804c913a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:57.320 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[3a7c1895-fb11-4c96-ac3d-91d80d77d15e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:57.349 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[2d5ff686-3e90-4028-9f1b-cad5c8ac5693]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:57 compute-0 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d00000046.scope: Deactivated successfully.
Sep 30 21:28:57 compute-0 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d00000046.scope: Consumed 11.164s CPU time.
Sep 30 21:28:57 compute-0 systemd-machined[152794]: Machine qemu-33-instance-00000046 terminated.
Sep 30 21:28:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:57.364 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[4de98fb2-2dc4-437e-b043-bf4b92445060]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0d9a0e7d-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d6:b2:92'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 8, 'rx_bytes': 832, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 8, 'rx_bytes': 832, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 72], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 442167, 'reachable_time': 28634, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230416, 'error': None, 'target': 'ovnmeta-0d9a0e7d-4f10-48ac-8e5d-13aaaef30422', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:57.376 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[ce7ddd2d-7c45-4e20-b8a5-a2cfc575a3df]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap0d9a0e7d-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 442177, 'tstamp': 442177}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230417, 'error': None, 'target': 'ovnmeta-0d9a0e7d-4f10-48ac-8e5d-13aaaef30422', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap0d9a0e7d-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 442179, 'tstamp': 442179}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230417, 'error': None, 'target': 'ovnmeta-0d9a0e7d-4f10-48ac-8e5d-13aaaef30422', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:57.378 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0d9a0e7d-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.385 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:57.385 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0d9a0e7d-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:57.387 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:28:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:57.387 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0d9a0e7d-40, col_values=(('external_ids', {'iface-id': 'beda616e-dd79-43fd-ab3d-456c618879b3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:57.388 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:28:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:57.389 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 4690a3a8-44b4-4d33-8d58-e739294549bd in datapath 0d9a0e7d-4f10-48ac-8e5d-13aaaef30422 unbound from our chassis
Sep 30 21:28:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:57.390 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0d9a0e7d-4f10-48ac-8e5d-13aaaef30422, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:28:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:57.391 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a5278928-1f50-43e9-b556-7f40597f9bb3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:57.391 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0d9a0e7d-4f10-48ac-8e5d-13aaaef30422 namespace which is not needed anymore
Sep 30 21:28:57 compute-0 NetworkManager[51733]: <info>  [1759267737.4517] manager: (tap4690a3a8-44): new Tun device (/org/freedesktop/NetworkManager/Devices/118)
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.494 2 INFO nova.virt.libvirt.driver [-] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Instance destroyed successfully.
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.495 2 DEBUG nova.objects.instance [None req-83daafae-47c7-4057-8593-d0ee129d967f 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Lazy-loading 'resources' on Instance uuid ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.511 2 DEBUG nova.virt.libvirt.vif [None req-83daafae-47c7-4057-8593-d0ee129d967f 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:28:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-1621545046',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-1621545046',id=70,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:28:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9e09247c56a847ac8eb9b104a34420f0',ramdisk_id='',reservation_id='r-g0khdglp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_na
me='tempest-AttachInterfacesV270Test-1546994059',owner_user_name='tempest-AttachInterfacesV270Test-1546994059-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:28:47Z,user_data=None,user_id='5b7524018e934e07b3b6e81fdcf852aa',uuid=ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "12eb0d00-3d96-4777-8dc9-ba184484c7f3", "address": "fa:16:3e:d1:8c:fd", "network": {"id": "0d9a0e7d-4f10-48ac-8e5d-13aaaef30422", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-77895882-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9e09247c56a847ac8eb9b104a34420f0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12eb0d00-3d", "ovs_interfaceid": "12eb0d00-3d96-4777-8dc9-ba184484c7f3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.511 2 DEBUG nova.network.os_vif_util [None req-83daafae-47c7-4057-8593-d0ee129d967f 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Converting VIF {"id": "12eb0d00-3d96-4777-8dc9-ba184484c7f3", "address": "fa:16:3e:d1:8c:fd", "network": {"id": "0d9a0e7d-4f10-48ac-8e5d-13aaaef30422", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-77895882-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9e09247c56a847ac8eb9b104a34420f0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap12eb0d00-3d", "ovs_interfaceid": "12eb0d00-3d96-4777-8dc9-ba184484c7f3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.512 2 DEBUG nova.network.os_vif_util [None req-83daafae-47c7-4057-8593-d0ee129d967f 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d1:8c:fd,bridge_name='br-int',has_traffic_filtering=True,id=12eb0d00-3d96-4777-8dc9-ba184484c7f3,network=Network(0d9a0e7d-4f10-48ac-8e5d-13aaaef30422),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap12eb0d00-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.512 2 DEBUG os_vif [None req-83daafae-47c7-4057-8593-d0ee129d967f 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d1:8c:fd,bridge_name='br-int',has_traffic_filtering=True,id=12eb0d00-3d96-4777-8dc9-ba184484c7f3,network=Network(0d9a0e7d-4f10-48ac-8e5d-13aaaef30422),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap12eb0d00-3d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.513 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.514 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap12eb0d00-3d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:57 compute-0 neutron-haproxy-ovnmeta-0d9a0e7d-4f10-48ac-8e5d-13aaaef30422[230307]: [NOTICE]   (230329) : haproxy version is 2.8.14-c23fe91
Sep 30 21:28:57 compute-0 neutron-haproxy-ovnmeta-0d9a0e7d-4f10-48ac-8e5d-13aaaef30422[230307]: [NOTICE]   (230329) : path to executable is /usr/sbin/haproxy
Sep 30 21:28:57 compute-0 neutron-haproxy-ovnmeta-0d9a0e7d-4f10-48ac-8e5d-13aaaef30422[230307]: [WARNING]  (230329) : Exiting Master process...
Sep 30 21:28:57 compute-0 neutron-haproxy-ovnmeta-0d9a0e7d-4f10-48ac-8e5d-13aaaef30422[230307]: [WARNING]  (230329) : Exiting Master process...
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:28:57 compute-0 neutron-haproxy-ovnmeta-0d9a0e7d-4f10-48ac-8e5d-13aaaef30422[230307]: [ALERT]    (230329) : Current worker (230331) exited with code 143 (Terminated)
Sep 30 21:28:57 compute-0 neutron-haproxy-ovnmeta-0d9a0e7d-4f10-48ac-8e5d-13aaaef30422[230307]: [WARNING]  (230329) : All workers exited. Exiting... (0)
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.519 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:57 compute-0 systemd[1]: libpod-0f8572c07c0c050749574f1d8ae656e22efe8d8cc7a0b1861f18d32c43c60c31.scope: Deactivated successfully.
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.521 2 INFO os_vif [None req-83daafae-47c7-4057-8593-d0ee129d967f 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d1:8c:fd,bridge_name='br-int',has_traffic_filtering=True,id=12eb0d00-3d96-4777-8dc9-ba184484c7f3,network=Network(0d9a0e7d-4f10-48ac-8e5d-13aaaef30422),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap12eb0d00-3d')
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.522 2 DEBUG nova.virt.libvirt.vif [None req-83daafae-47c7-4057-8593-d0ee129d967f 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:28:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachInterfacesV270Test-server-1621545046',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesv270test-server-1621545046',id=70,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:28:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9e09247c56a847ac8eb9b104a34420f0',ramdisk_id='',reservation_id='r-g0khdglp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_na
me='tempest-AttachInterfacesV270Test-1546994059',owner_user_name='tempest-AttachInterfacesV270Test-1546994059-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:28:47Z,user_data=None,user_id='5b7524018e934e07b3b6e81fdcf852aa',uuid=ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4690a3a8-44b4-4d33-8d58-e739294549bd", "address": "fa:16:3e:2c:d9:32", "network": {"id": "0d9a0e7d-4f10-48ac-8e5d-13aaaef30422", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-77895882-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9e09247c56a847ac8eb9b104a34420f0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4690a3a8-44", "ovs_interfaceid": "4690a3a8-44b4-4d33-8d58-e739294549bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.522 2 DEBUG nova.network.os_vif_util [None req-83daafae-47c7-4057-8593-d0ee129d967f 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Converting VIF {"id": "4690a3a8-44b4-4d33-8d58-e739294549bd", "address": "fa:16:3e:2c:d9:32", "network": {"id": "0d9a0e7d-4f10-48ac-8e5d-13aaaef30422", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-77895882-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9e09247c56a847ac8eb9b104a34420f0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4690a3a8-44", "ovs_interfaceid": "4690a3a8-44b4-4d33-8d58-e739294549bd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.523 2 DEBUG nova.network.os_vif_util [None req-83daafae-47c7-4057-8593-d0ee129d967f 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2c:d9:32,bridge_name='br-int',has_traffic_filtering=True,id=4690a3a8-44b4-4d33-8d58-e739294549bd,network=Network(0d9a0e7d-4f10-48ac-8e5d-13aaaef30422),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4690a3a8-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.523 2 DEBUG os_vif [None req-83daafae-47c7-4057-8593-d0ee129d967f 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2c:d9:32,bridge_name='br-int',has_traffic_filtering=True,id=4690a3a8-44b4-4d33-8d58-e739294549bd,network=Network(0d9a0e7d-4f10-48ac-8e5d-13aaaef30422),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4690a3a8-44') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.524 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.525 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4690a3a8-44, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.526 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.527 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.528 2 INFO os_vif [None req-83daafae-47c7-4057-8593-d0ee129d967f 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2c:d9:32,bridge_name='br-int',has_traffic_filtering=True,id=4690a3a8-44b4-4d33-8d58-e739294549bd,network=Network(0d9a0e7d-4f10-48ac-8e5d-13aaaef30422),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4690a3a8-44')
Sep 30 21:28:57 compute-0 podman[230460]: 2025-09-30 21:28:57.528985637 +0000 UTC m=+0.043439389 container died 0f8572c07c0c050749574f1d8ae656e22efe8d8cc7a0b1861f18d32c43c60c31 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0d9a0e7d-4f10-48ac-8e5d-13aaaef30422, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923)
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.529 2 INFO nova.virt.libvirt.driver [None req-83daafae-47c7-4057-8593-d0ee129d967f 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Deleting instance files /var/lib/nova/instances/ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5_del
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.529 2 INFO nova.virt.libvirt.driver [None req-83daafae-47c7-4057-8593-d0ee129d967f 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Deletion of /var/lib/nova/instances/ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5_del complete
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.541 2 DEBUG nova.compute.manager [req-c00cc526-3638-4b96-a1ef-579b18e48b74 req-f70b6ad9-9071-487f-af69-ad2f16ba6571 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Received event network-vif-unplugged-12eb0d00-3d96-4777-8dc9-ba184484c7f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.542 2 DEBUG oslo_concurrency.lockutils [req-c00cc526-3638-4b96-a1ef-579b18e48b74 req-f70b6ad9-9071-487f-af69-ad2f16ba6571 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.542 2 DEBUG oslo_concurrency.lockutils [req-c00cc526-3638-4b96-a1ef-579b18e48b74 req-f70b6ad9-9071-487f-af69-ad2f16ba6571 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.542 2 DEBUG oslo_concurrency.lockutils [req-c00cc526-3638-4b96-a1ef-579b18e48b74 req-f70b6ad9-9071-487f-af69-ad2f16ba6571 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.543 2 DEBUG nova.compute.manager [req-c00cc526-3638-4b96-a1ef-579b18e48b74 req-f70b6ad9-9071-487f-af69-ad2f16ba6571 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] No waiting events found dispatching network-vif-unplugged-12eb0d00-3d96-4777-8dc9-ba184484c7f3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.543 2 DEBUG nova.compute.manager [req-c00cc526-3638-4b96-a1ef-579b18e48b74 req-f70b6ad9-9071-487f-af69-ad2f16ba6571 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Received event network-vif-unplugged-12eb0d00-3d96-4777-8dc9-ba184484c7f3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:28:57 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0f8572c07c0c050749574f1d8ae656e22efe8d8cc7a0b1861f18d32c43c60c31-userdata-shm.mount: Deactivated successfully.
Sep 30 21:28:57 compute-0 systemd[1]: var-lib-containers-storage-overlay-2c1e721d1c7e83be48a4463fa2423af7b88a121c7c5b30bf1b18b8e14aa032ac-merged.mount: Deactivated successfully.
Sep 30 21:28:57 compute-0 podman[230460]: 2025-09-30 21:28:57.571158663 +0000 UTC m=+0.085612415 container cleanup 0f8572c07c0c050749574f1d8ae656e22efe8d8cc7a0b1861f18d32c43c60c31 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0d9a0e7d-4f10-48ac-8e5d-13aaaef30422, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:28:57 compute-0 systemd[1]: libpod-conmon-0f8572c07c0c050749574f1d8ae656e22efe8d8cc7a0b1861f18d32c43c60c31.scope: Deactivated successfully.
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.602 2 INFO nova.compute.manager [None req-83daafae-47c7-4057-8593-d0ee129d967f 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Took 0.38 seconds to destroy the instance on the hypervisor.
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.603 2 DEBUG oslo.service.loopingcall [None req-83daafae-47c7-4057-8593-d0ee129d967f 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.604 2 DEBUG nova.compute.manager [-] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.604 2 DEBUG nova.network.neutron [-] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:28:57 compute-0 podman[230494]: 2025-09-30 21:28:57.631485727 +0000 UTC m=+0.041551351 container remove 0f8572c07c0c050749574f1d8ae656e22efe8d8cc7a0b1861f18d32c43c60c31 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0d9a0e7d-4f10-48ac-8e5d-13aaaef30422, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0)
Sep 30 21:28:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:57.635 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[1b2c8a1b-8a66-45c6-be83-8dfcdb121d57]: (4, ('Tue Sep 30 09:28:57 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-0d9a0e7d-4f10-48ac-8e5d-13aaaef30422 (0f8572c07c0c050749574f1d8ae656e22efe8d8cc7a0b1861f18d32c43c60c31)\n0f8572c07c0c050749574f1d8ae656e22efe8d8cc7a0b1861f18d32c43c60c31\nTue Sep 30 09:28:57 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-0d9a0e7d-4f10-48ac-8e5d-13aaaef30422 (0f8572c07c0c050749574f1d8ae656e22efe8d8cc7a0b1861f18d32c43c60c31)\n0f8572c07c0c050749574f1d8ae656e22efe8d8cc7a0b1861f18d32c43c60c31\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:57.637 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[54e33c01-7afa-46cf-9a8e-d89257fa75a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:57.638 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0d9a0e7d-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:28:57 compute-0 kernel: tap0d9a0e7d-40: left promiscuous mode
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.640 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:57 compute-0 nova_compute[192810]: 2025-09-30 21:28:57.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:28:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:57.653 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[7ac38bac-0f65-48c5-9a49-70cba7646437]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:57.680 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a339344f-6d04-411f-9949-f71f6e1d78e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:57.683 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[ac58e6e2-2284-4493-8173-3f8891f5071a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:57.699 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[12be816c-8661-4558-a023-5266147ed4f1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 442160, 'reachable_time': 40514, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230510, 'error': None, 'target': 'ovnmeta-0d9a0e7d-4f10-48ac-8e5d-13aaaef30422', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:57.701 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0d9a0e7d-4f10-48ac-8e5d-13aaaef30422 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:28:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:28:57.701 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[b8844774-ba61-442c-83e4-2e3399319d7f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:28:57 compute-0 systemd[1]: run-netns-ovnmeta\x2d0d9a0e7d\x2d4f10\x2d48ac\x2d8e5d\x2d13aaaef30422.mount: Deactivated successfully.
Sep 30 21:28:58 compute-0 nova_compute[192810]: 2025-09-30 21:28:58.490 2 DEBUG nova.compute.manager [req-04120feb-63c3-4076-a60c-d1e337b89c6d req-7aa8e431-810b-40a1-9a21-27733c6a33a0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Received event network-vif-plugged-4690a3a8-44b4-4d33-8d58-e739294549bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:28:58 compute-0 nova_compute[192810]: 2025-09-30 21:28:58.491 2 DEBUG oslo_concurrency.lockutils [req-04120feb-63c3-4076-a60c-d1e337b89c6d req-7aa8e431-810b-40a1-9a21-27733c6a33a0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:58 compute-0 nova_compute[192810]: 2025-09-30 21:28:58.491 2 DEBUG oslo_concurrency.lockutils [req-04120feb-63c3-4076-a60c-d1e337b89c6d req-7aa8e431-810b-40a1-9a21-27733c6a33a0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:58 compute-0 nova_compute[192810]: 2025-09-30 21:28:58.491 2 DEBUG oslo_concurrency.lockutils [req-04120feb-63c3-4076-a60c-d1e337b89c6d req-7aa8e431-810b-40a1-9a21-27733c6a33a0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:58 compute-0 nova_compute[192810]: 2025-09-30 21:28:58.491 2 DEBUG nova.compute.manager [req-04120feb-63c3-4076-a60c-d1e337b89c6d req-7aa8e431-810b-40a1-9a21-27733c6a33a0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] No waiting events found dispatching network-vif-plugged-4690a3a8-44b4-4d33-8d58-e739294549bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:28:58 compute-0 nova_compute[192810]: 2025-09-30 21:28:58.492 2 WARNING nova.compute.manager [req-04120feb-63c3-4076-a60c-d1e337b89c6d req-7aa8e431-810b-40a1-9a21-27733c6a33a0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Received unexpected event network-vif-plugged-4690a3a8-44b4-4d33-8d58-e739294549bd for instance with vm_state active and task_state deleting.
Sep 30 21:28:58 compute-0 nova_compute[192810]: 2025-09-30 21:28:58.492 2 DEBUG nova.compute.manager [req-04120feb-63c3-4076-a60c-d1e337b89c6d req-7aa8e431-810b-40a1-9a21-27733c6a33a0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Received event network-vif-unplugged-4690a3a8-44b4-4d33-8d58-e739294549bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:28:58 compute-0 nova_compute[192810]: 2025-09-30 21:28:58.492 2 DEBUG oslo_concurrency.lockutils [req-04120feb-63c3-4076-a60c-d1e337b89c6d req-7aa8e431-810b-40a1-9a21-27733c6a33a0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:58 compute-0 nova_compute[192810]: 2025-09-30 21:28:58.492 2 DEBUG oslo_concurrency.lockutils [req-04120feb-63c3-4076-a60c-d1e337b89c6d req-7aa8e431-810b-40a1-9a21-27733c6a33a0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:58 compute-0 nova_compute[192810]: 2025-09-30 21:28:58.492 2 DEBUG oslo_concurrency.lockutils [req-04120feb-63c3-4076-a60c-d1e337b89c6d req-7aa8e431-810b-40a1-9a21-27733c6a33a0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:58 compute-0 nova_compute[192810]: 2025-09-30 21:28:58.493 2 DEBUG nova.compute.manager [req-04120feb-63c3-4076-a60c-d1e337b89c6d req-7aa8e431-810b-40a1-9a21-27733c6a33a0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] No waiting events found dispatching network-vif-unplugged-4690a3a8-44b4-4d33-8d58-e739294549bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:28:58 compute-0 nova_compute[192810]: 2025-09-30 21:28:58.493 2 DEBUG nova.compute.manager [req-04120feb-63c3-4076-a60c-d1e337b89c6d req-7aa8e431-810b-40a1-9a21-27733c6a33a0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Received event network-vif-unplugged-4690a3a8-44b4-4d33-8d58-e739294549bd for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:28:58 compute-0 nova_compute[192810]: 2025-09-30 21:28:58.493 2 DEBUG nova.compute.manager [req-04120feb-63c3-4076-a60c-d1e337b89c6d req-7aa8e431-810b-40a1-9a21-27733c6a33a0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Received event network-vif-plugged-4690a3a8-44b4-4d33-8d58-e739294549bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:28:58 compute-0 nova_compute[192810]: 2025-09-30 21:28:58.493 2 DEBUG oslo_concurrency.lockutils [req-04120feb-63c3-4076-a60c-d1e337b89c6d req-7aa8e431-810b-40a1-9a21-27733c6a33a0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:58 compute-0 nova_compute[192810]: 2025-09-30 21:28:58.493 2 DEBUG oslo_concurrency.lockutils [req-04120feb-63c3-4076-a60c-d1e337b89c6d req-7aa8e431-810b-40a1-9a21-27733c6a33a0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:58 compute-0 nova_compute[192810]: 2025-09-30 21:28:58.494 2 DEBUG oslo_concurrency.lockutils [req-04120feb-63c3-4076-a60c-d1e337b89c6d req-7aa8e431-810b-40a1-9a21-27733c6a33a0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:58 compute-0 nova_compute[192810]: 2025-09-30 21:28:58.494 2 DEBUG nova.compute.manager [req-04120feb-63c3-4076-a60c-d1e337b89c6d req-7aa8e431-810b-40a1-9a21-27733c6a33a0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] No waiting events found dispatching network-vif-plugged-4690a3a8-44b4-4d33-8d58-e739294549bd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:28:58 compute-0 nova_compute[192810]: 2025-09-30 21:28:58.494 2 WARNING nova.compute.manager [req-04120feb-63c3-4076-a60c-d1e337b89c6d req-7aa8e431-810b-40a1-9a21-27733c6a33a0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Received unexpected event network-vif-plugged-4690a3a8-44b4-4d33-8d58-e739294549bd for instance with vm_state active and task_state deleting.
Sep 30 21:28:58 compute-0 nova_compute[192810]: 2025-09-30 21:28:58.877 2 DEBUG nova.network.neutron [-] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:28:58 compute-0 nova_compute[192810]: 2025-09-30 21:28:58.894 2 INFO nova.compute.manager [-] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Took 1.29 seconds to deallocate network for instance.
Sep 30 21:28:58 compute-0 nova_compute[192810]: 2025-09-30 21:28:58.993 2 DEBUG oslo_concurrency.lockutils [None req-83daafae-47c7-4057-8593-d0ee129d967f 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:58 compute-0 nova_compute[192810]: 2025-09-30 21:28:58.993 2 DEBUG oslo_concurrency.lockutils [None req-83daafae-47c7-4057-8593-d0ee129d967f 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:59 compute-0 nova_compute[192810]: 2025-09-30 21:28:59.042 2 DEBUG nova.compute.provider_tree [None req-83daafae-47c7-4057-8593-d0ee129d967f 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:28:59 compute-0 nova_compute[192810]: 2025-09-30 21:28:59.054 2 DEBUG nova.scheduler.client.report [None req-83daafae-47c7-4057-8593-d0ee129d967f 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:28:59 compute-0 nova_compute[192810]: 2025-09-30 21:28:59.070 2 DEBUG oslo_concurrency.lockutils [None req-83daafae-47c7-4057-8593-d0ee129d967f 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.077s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:59 compute-0 nova_compute[192810]: 2025-09-30 21:28:59.092 2 INFO nova.scheduler.client.report [None req-83daafae-47c7-4057-8593-d0ee129d967f 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Deleted allocations for instance ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5
Sep 30 21:28:59 compute-0 nova_compute[192810]: 2025-09-30 21:28:59.158 2 DEBUG oslo_concurrency.lockutils [None req-83daafae-47c7-4057-8593-d0ee129d967f 5b7524018e934e07b3b6e81fdcf852aa 9e09247c56a847ac8eb9b104a34420f0 - - default default] Lock "ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.956s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:59 compute-0 nova_compute[192810]: 2025-09-30 21:28:59.658 2 DEBUG nova.compute.manager [req-d2beace6-2f8b-4186-a0ad-aca6ca07284f req-892ace07-47d6-4e20-a2c3-e2afe2b7a790 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Received event network-vif-plugged-12eb0d00-3d96-4777-8dc9-ba184484c7f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:28:59 compute-0 nova_compute[192810]: 2025-09-30 21:28:59.659 2 DEBUG oslo_concurrency.lockutils [req-d2beace6-2f8b-4186-a0ad-aca6ca07284f req-892ace07-47d6-4e20-a2c3-e2afe2b7a790 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:28:59 compute-0 nova_compute[192810]: 2025-09-30 21:28:59.659 2 DEBUG oslo_concurrency.lockutils [req-d2beace6-2f8b-4186-a0ad-aca6ca07284f req-892ace07-47d6-4e20-a2c3-e2afe2b7a790 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:28:59 compute-0 nova_compute[192810]: 2025-09-30 21:28:59.660 2 DEBUG oslo_concurrency.lockutils [req-d2beace6-2f8b-4186-a0ad-aca6ca07284f req-892ace07-47d6-4e20-a2c3-e2afe2b7a790 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:28:59 compute-0 nova_compute[192810]: 2025-09-30 21:28:59.660 2 DEBUG nova.compute.manager [req-d2beace6-2f8b-4186-a0ad-aca6ca07284f req-892ace07-47d6-4e20-a2c3-e2afe2b7a790 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] No waiting events found dispatching network-vif-plugged-12eb0d00-3d96-4777-8dc9-ba184484c7f3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:28:59 compute-0 nova_compute[192810]: 2025-09-30 21:28:59.660 2 WARNING nova.compute.manager [req-d2beace6-2f8b-4186-a0ad-aca6ca07284f req-892ace07-47d6-4e20-a2c3-e2afe2b7a790 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Received unexpected event network-vif-plugged-12eb0d00-3d96-4777-8dc9-ba184484c7f3 for instance with vm_state deleted and task_state None.
Sep 30 21:29:00 compute-0 nova_compute[192810]: 2025-09-30 21:29:00.042 2 DEBUG nova.compute.manager [req-90311d4c-6655-4643-a06c-db218a7284ec req-3802fcbc-575b-4c3c-ac20-eae1c21e894d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Received event network-vif-deleted-12eb0d00-3d96-4777-8dc9-ba184484c7f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:29:00 compute-0 nova_compute[192810]: 2025-09-30 21:29:00.043 2 DEBUG nova.compute.manager [req-90311d4c-6655-4643-a06c-db218a7284ec req-3802fcbc-575b-4c3c-ac20-eae1c21e894d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Received event network-vif-deleted-4690a3a8-44b4-4d33-8d58-e739294549bd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:29:01 compute-0 podman[230514]: 2025-09-30 21:29:01.328134328 +0000 UTC m=+0.053169615 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 21:29:01 compute-0 podman[230513]: 2025-09-30 21:29:01.334691984 +0000 UTC m=+0.067584660 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, container_name=iscsid, org.label-schema.schema-version=1.0, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923)
Sep 30 21:29:01 compute-0 podman[230512]: 2025-09-30 21:29:01.35751475 +0000 UTC m=+0.085364488 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Sep 30 21:29:02 compute-0 nova_compute[192810]: 2025-09-30 21:29:02.149 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:02 compute-0 nova_compute[192810]: 2025-09-30 21:29:02.526 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:03 compute-0 nova_compute[192810]: 2025-09-30 21:29:03.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:07 compute-0 nova_compute[192810]: 2025-09-30 21:29:07.151 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:07 compute-0 nova_compute[192810]: 2025-09-30 21:29:07.527 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:08 compute-0 nova_compute[192810]: 2025-09-30 21:29:08.243 2 DEBUG oslo_concurrency.lockutils [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "3d704b04-4399-45ea-bc89-6cf74b4dec72" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:08 compute-0 nova_compute[192810]: 2025-09-30 21:29:08.243 2 DEBUG oslo_concurrency.lockutils [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "3d704b04-4399-45ea-bc89-6cf74b4dec72" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:08 compute-0 nova_compute[192810]: 2025-09-30 21:29:08.256 2 DEBUG nova.compute.manager [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:29:08 compute-0 nova_compute[192810]: 2025-09-30 21:29:08.345 2 DEBUG oslo_concurrency.lockutils [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:08 compute-0 nova_compute[192810]: 2025-09-30 21:29:08.345 2 DEBUG oslo_concurrency.lockutils [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:08 compute-0 nova_compute[192810]: 2025-09-30 21:29:08.351 2 DEBUG nova.virt.hardware [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:29:08 compute-0 nova_compute[192810]: 2025-09-30 21:29:08.352 2 INFO nova.compute.claims [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:29:08 compute-0 nova_compute[192810]: 2025-09-30 21:29:08.476 2 DEBUG nova.compute.provider_tree [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:29:08 compute-0 nova_compute[192810]: 2025-09-30 21:29:08.487 2 DEBUG nova.scheduler.client.report [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:29:08 compute-0 nova_compute[192810]: 2025-09-30 21:29:08.504 2 DEBUG oslo_concurrency.lockutils [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:08 compute-0 nova_compute[192810]: 2025-09-30 21:29:08.505 2 DEBUG nova.compute.manager [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:29:08 compute-0 nova_compute[192810]: 2025-09-30 21:29:08.570 2 DEBUG nova.compute.manager [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:29:08 compute-0 nova_compute[192810]: 2025-09-30 21:29:08.570 2 DEBUG nova.network.neutron [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:29:08 compute-0 nova_compute[192810]: 2025-09-30 21:29:08.587 2 INFO nova.virt.libvirt.driver [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:29:08 compute-0 nova_compute[192810]: 2025-09-30 21:29:08.602 2 DEBUG nova.compute.manager [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:29:08 compute-0 nova_compute[192810]: 2025-09-30 21:29:08.779 2 DEBUG nova.compute.manager [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:29:08 compute-0 nova_compute[192810]: 2025-09-30 21:29:08.781 2 DEBUG nova.virt.libvirt.driver [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:29:08 compute-0 nova_compute[192810]: 2025-09-30 21:29:08.781 2 INFO nova.virt.libvirt.driver [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Creating image(s)
Sep 30 21:29:08 compute-0 nova_compute[192810]: 2025-09-30 21:29:08.782 2 DEBUG oslo_concurrency.lockutils [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "/var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:08 compute-0 nova_compute[192810]: 2025-09-30 21:29:08.782 2 DEBUG oslo_concurrency.lockutils [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "/var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:08 compute-0 nova_compute[192810]: 2025-09-30 21:29:08.783 2 DEBUG oslo_concurrency.lockutils [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "/var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:08 compute-0 nova_compute[192810]: 2025-09-30 21:29:08.794 2 DEBUG oslo_concurrency.processutils [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:29:08 compute-0 nova_compute[192810]: 2025-09-30 21:29:08.849 2 DEBUG oslo_concurrency.processutils [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:29:08 compute-0 nova_compute[192810]: 2025-09-30 21:29:08.850 2 DEBUG oslo_concurrency.lockutils [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:08 compute-0 nova_compute[192810]: 2025-09-30 21:29:08.851 2 DEBUG oslo_concurrency.lockutils [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:08 compute-0 nova_compute[192810]: 2025-09-30 21:29:08.876 2 DEBUG oslo_concurrency.processutils [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:29:08 compute-0 nova_compute[192810]: 2025-09-30 21:29:08.971 2 DEBUG oslo_concurrency.processutils [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:29:08 compute-0 nova_compute[192810]: 2025-09-30 21:29:08.973 2 DEBUG oslo_concurrency.processutils [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:29:09 compute-0 nova_compute[192810]: 2025-09-30 21:29:09.028 2 DEBUG oslo_concurrency.processutils [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72/disk 1073741824" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:29:09 compute-0 nova_compute[192810]: 2025-09-30 21:29:09.030 2 DEBUG oslo_concurrency.lockutils [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.179s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:09 compute-0 nova_compute[192810]: 2025-09-30 21:29:09.031 2 DEBUG oslo_concurrency.processutils [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:29:09 compute-0 nova_compute[192810]: 2025-09-30 21:29:09.090 2 DEBUG oslo_concurrency.processutils [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:29:09 compute-0 nova_compute[192810]: 2025-09-30 21:29:09.091 2 DEBUG nova.virt.disk.api [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Checking if we can resize image /var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:29:09 compute-0 nova_compute[192810]: 2025-09-30 21:29:09.092 2 DEBUG oslo_concurrency.processutils [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:29:09 compute-0 nova_compute[192810]: 2025-09-30 21:29:09.149 2 DEBUG oslo_concurrency.processutils [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:29:09 compute-0 nova_compute[192810]: 2025-09-30 21:29:09.149 2 DEBUG nova.virt.disk.api [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Cannot resize image /var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:29:09 compute-0 nova_compute[192810]: 2025-09-30 21:29:09.150 2 DEBUG nova.objects.instance [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'migration_context' on Instance uuid 3d704b04-4399-45ea-bc89-6cf74b4dec72 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:29:09 compute-0 nova_compute[192810]: 2025-09-30 21:29:09.179 2 DEBUG nova.virt.libvirt.driver [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:29:09 compute-0 nova_compute[192810]: 2025-09-30 21:29:09.180 2 DEBUG nova.virt.libvirt.driver [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Ensure instance console log exists: /var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:29:09 compute-0 nova_compute[192810]: 2025-09-30 21:29:09.180 2 DEBUG oslo_concurrency.lockutils [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:09 compute-0 nova_compute[192810]: 2025-09-30 21:29:09.181 2 DEBUG oslo_concurrency.lockutils [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:09 compute-0 nova_compute[192810]: 2025-09-30 21:29:09.182 2 DEBUG oslo_concurrency.lockutils [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:09 compute-0 nova_compute[192810]: 2025-09-30 21:29:09.619 2 DEBUG nova.policy [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:29:12 compute-0 nova_compute[192810]: 2025-09-30 21:29:12.153 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:12 compute-0 nova_compute[192810]: 2025-09-30 21:29:12.401 2 DEBUG nova.network.neutron [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Successfully created port: 69480967-0a46-4a33-95ec-9633390ec042 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:29:12 compute-0 nova_compute[192810]: 2025-09-30 21:29:12.493 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267737.4922059, ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:29:12 compute-0 nova_compute[192810]: 2025-09-30 21:29:12.493 2 INFO nova.compute.manager [-] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] VM Stopped (Lifecycle Event)
Sep 30 21:29:12 compute-0 nova_compute[192810]: 2025-09-30 21:29:12.510 2 DEBUG nova.compute.manager [None req-80b951ab-9467-4d73-9006-3af092178530 - - - - - -] [instance: ab1f4ef2-232c-4bf9-81d4-0d3c2172c7e5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:29:12 compute-0 nova_compute[192810]: 2025-09-30 21:29:12.529 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:13 compute-0 podman[230591]: 2025-09-30 21:29:13.336766929 +0000 UTC m=+0.076811752 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:29:13 compute-0 podman[230592]: 2025-09-30 21:29:13.342506104 +0000 UTC m=+0.080942976 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:29:13 compute-0 nova_compute[192810]: 2025-09-30 21:29:13.610 2 DEBUG nova.network.neutron [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Successfully updated port: 69480967-0a46-4a33-95ec-9633390ec042 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:29:13 compute-0 nova_compute[192810]: 2025-09-30 21:29:13.630 2 DEBUG oslo_concurrency.lockutils [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "refresh_cache-3d704b04-4399-45ea-bc89-6cf74b4dec72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:29:13 compute-0 nova_compute[192810]: 2025-09-30 21:29:13.630 2 DEBUG oslo_concurrency.lockutils [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquired lock "refresh_cache-3d704b04-4399-45ea-bc89-6cf74b4dec72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:29:13 compute-0 nova_compute[192810]: 2025-09-30 21:29:13.630 2 DEBUG nova.network.neutron [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:29:14 compute-0 nova_compute[192810]: 2025-09-30 21:29:14.026 2 DEBUG nova.network.neutron [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:29:15 compute-0 nova_compute[192810]: 2025-09-30 21:29:15.227 2 DEBUG nova.compute.manager [req-5b381892-a7e0-4a4d-b9fa-30be0e92b03c req-aac27c43-e366-4289-9c30-207b0309dcd7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Received event network-changed-69480967-0a46-4a33-95ec-9633390ec042 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:29:15 compute-0 nova_compute[192810]: 2025-09-30 21:29:15.227 2 DEBUG nova.compute.manager [req-5b381892-a7e0-4a4d-b9fa-30be0e92b03c req-aac27c43-e366-4289-9c30-207b0309dcd7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Refreshing instance network info cache due to event network-changed-69480967-0a46-4a33-95ec-9633390ec042. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:29:15 compute-0 nova_compute[192810]: 2025-09-30 21:29:15.227 2 DEBUG oslo_concurrency.lockutils [req-5b381892-a7e0-4a4d-b9fa-30be0e92b03c req-aac27c43-e366-4289-9c30-207b0309dcd7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-3d704b04-4399-45ea-bc89-6cf74b4dec72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:29:16 compute-0 nova_compute[192810]: 2025-09-30 21:29:16.748 2 DEBUG nova.network.neutron [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Updating instance_info_cache with network_info: [{"id": "69480967-0a46-4a33-95ec-9633390ec042", "address": "fa:16:3e:d1:d8:eb", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69480967-0a", "ovs_interfaceid": "69480967-0a46-4a33-95ec-9633390ec042", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:29:16 compute-0 nova_compute[192810]: 2025-09-30 21:29:16.770 2 DEBUG oslo_concurrency.lockutils [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Releasing lock "refresh_cache-3d704b04-4399-45ea-bc89-6cf74b4dec72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:29:16 compute-0 nova_compute[192810]: 2025-09-30 21:29:16.771 2 DEBUG nova.compute.manager [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Instance network_info: |[{"id": "69480967-0a46-4a33-95ec-9633390ec042", "address": "fa:16:3e:d1:d8:eb", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69480967-0a", "ovs_interfaceid": "69480967-0a46-4a33-95ec-9633390ec042", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:29:16 compute-0 nova_compute[192810]: 2025-09-30 21:29:16.771 2 DEBUG oslo_concurrency.lockutils [req-5b381892-a7e0-4a4d-b9fa-30be0e92b03c req-aac27c43-e366-4289-9c30-207b0309dcd7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-3d704b04-4399-45ea-bc89-6cf74b4dec72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:29:16 compute-0 nova_compute[192810]: 2025-09-30 21:29:16.772 2 DEBUG nova.network.neutron [req-5b381892-a7e0-4a4d-b9fa-30be0e92b03c req-aac27c43-e366-4289-9c30-207b0309dcd7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Refreshing network info cache for port 69480967-0a46-4a33-95ec-9633390ec042 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:29:16 compute-0 nova_compute[192810]: 2025-09-30 21:29:16.777 2 DEBUG nova.virt.libvirt.driver [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Start _get_guest_xml network_info=[{"id": "69480967-0a46-4a33-95ec-9633390ec042", "address": "fa:16:3e:d1:d8:eb", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69480967-0a", "ovs_interfaceid": "69480967-0a46-4a33-95ec-9633390ec042", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:29:16 compute-0 nova_compute[192810]: 2025-09-30 21:29:16.784 2 WARNING nova.virt.libvirt.driver [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:29:16 compute-0 nova_compute[192810]: 2025-09-30 21:29:16.795 2 DEBUG nova.virt.libvirt.host [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:29:16 compute-0 nova_compute[192810]: 2025-09-30 21:29:16.796 2 DEBUG nova.virt.libvirt.host [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:29:16 compute-0 nova_compute[192810]: 2025-09-30 21:29:16.802 2 DEBUG nova.virt.libvirt.host [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:29:16 compute-0 nova_compute[192810]: 2025-09-30 21:29:16.803 2 DEBUG nova.virt.libvirt.host [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:29:16 compute-0 nova_compute[192810]: 2025-09-30 21:29:16.804 2 DEBUG nova.virt.libvirt.driver [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:29:16 compute-0 nova_compute[192810]: 2025-09-30 21:29:16.805 2 DEBUG nova.virt.hardware [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:29:16 compute-0 nova_compute[192810]: 2025-09-30 21:29:16.806 2 DEBUG nova.virt.hardware [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:29:16 compute-0 nova_compute[192810]: 2025-09-30 21:29:16.806 2 DEBUG nova.virt.hardware [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:29:16 compute-0 nova_compute[192810]: 2025-09-30 21:29:16.806 2 DEBUG nova.virt.hardware [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:29:16 compute-0 nova_compute[192810]: 2025-09-30 21:29:16.807 2 DEBUG nova.virt.hardware [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:29:16 compute-0 nova_compute[192810]: 2025-09-30 21:29:16.807 2 DEBUG nova.virt.hardware [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:29:16 compute-0 nova_compute[192810]: 2025-09-30 21:29:16.808 2 DEBUG nova.virt.hardware [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:29:16 compute-0 nova_compute[192810]: 2025-09-30 21:29:16.808 2 DEBUG nova.virt.hardware [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:29:16 compute-0 nova_compute[192810]: 2025-09-30 21:29:16.809 2 DEBUG nova.virt.hardware [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:29:16 compute-0 nova_compute[192810]: 2025-09-30 21:29:16.809 2 DEBUG nova.virt.hardware [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:29:16 compute-0 nova_compute[192810]: 2025-09-30 21:29:16.810 2 DEBUG nova.virt.hardware [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:29:16 compute-0 nova_compute[192810]: 2025-09-30 21:29:16.815 2 DEBUG nova.virt.libvirt.vif [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:29:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-22148059',display_name='tempest-ServerActionsTestJSON-server-22148059',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-22148059',id=73,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBADj3eZ6JfWn1sD61WsBF2lWMwpE7XLjMHeX5D51ZTuvFj593BvRFZjp02OuEwvTUJEH79lLLcgJlYP5+6PE14q16iBV+2oZvdFvdVW4CAPM3S7plfjHeuzOdoE0D4V+KA==',key_name='tempest-keypair-557988176',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2af578a858a44374a3dc027bbf7c69f2',ramdisk_id='',reservation_id='r-617x3lpp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1867667353',owner_user_name='tempest-ServerActionsTestJSON-1867667353-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:29:08Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='22ed16bd4ffe4ef8bb21968a857066a1',uuid=3d704b04-4399-45ea-bc89-6cf74b4dec72,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "69480967-0a46-4a33-95ec-9633390ec042", "address": "fa:16:3e:d1:d8:eb", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69480967-0a", "ovs_interfaceid": "69480967-0a46-4a33-95ec-9633390ec042", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:29:16 compute-0 nova_compute[192810]: 2025-09-30 21:29:16.816 2 DEBUG nova.network.os_vif_util [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converting VIF {"id": "69480967-0a46-4a33-95ec-9633390ec042", "address": "fa:16:3e:d1:d8:eb", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69480967-0a", "ovs_interfaceid": "69480967-0a46-4a33-95ec-9633390ec042", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:29:16 compute-0 nova_compute[192810]: 2025-09-30 21:29:16.817 2 DEBUG nova.network.os_vif_util [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:d8:eb,bridge_name='br-int',has_traffic_filtering=True,id=69480967-0a46-4a33-95ec-9633390ec042,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69480967-0a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:29:16 compute-0 nova_compute[192810]: 2025-09-30 21:29:16.819 2 DEBUG nova.objects.instance [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3d704b04-4399-45ea-bc89-6cf74b4dec72 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:29:16 compute-0 nova_compute[192810]: 2025-09-30 21:29:16.833 2 DEBUG nova.virt.libvirt.driver [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:29:16 compute-0 nova_compute[192810]:   <uuid>3d704b04-4399-45ea-bc89-6cf74b4dec72</uuid>
Sep 30 21:29:16 compute-0 nova_compute[192810]:   <name>instance-00000049</name>
Sep 30 21:29:16 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:29:16 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:29:16 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:29:16 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:       <nova:name>tempest-ServerActionsTestJSON-server-22148059</nova:name>
Sep 30 21:29:16 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:29:16</nova:creationTime>
Sep 30 21:29:16 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:29:16 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:29:16 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:29:16 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:29:16 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:29:16 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:29:16 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:29:16 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:29:16 compute-0 nova_compute[192810]:         <nova:user uuid="22ed16bd4ffe4ef8bb21968a857066a1">tempest-ServerActionsTestJSON-1867667353-project-member</nova:user>
Sep 30 21:29:16 compute-0 nova_compute[192810]:         <nova:project uuid="2af578a858a44374a3dc027bbf7c69f2">tempest-ServerActionsTestJSON-1867667353</nova:project>
Sep 30 21:29:16 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:29:16 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:29:16 compute-0 nova_compute[192810]:         <nova:port uuid="69480967-0a46-4a33-95ec-9633390ec042">
Sep 30 21:29:16 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:29:16 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:29:16 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:29:16 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:29:16 compute-0 nova_compute[192810]:     <system>
Sep 30 21:29:16 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:29:16 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:29:16 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:29:16 compute-0 nova_compute[192810]:       <entry name="serial">3d704b04-4399-45ea-bc89-6cf74b4dec72</entry>
Sep 30 21:29:16 compute-0 nova_compute[192810]:       <entry name="uuid">3d704b04-4399-45ea-bc89-6cf74b4dec72</entry>
Sep 30 21:29:16 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     </system>
Sep 30 21:29:16 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:29:16 compute-0 nova_compute[192810]:   <os>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:   </os>
Sep 30 21:29:16 compute-0 nova_compute[192810]:   <features>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:   </features>
Sep 30 21:29:16 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:29:16 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:29:16 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:29:16 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:29:16 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:29:16 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72/disk"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:29:16 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72/disk.config"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:29:16 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:d1:d8:eb"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:       <target dev="tap69480967-0a"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:29:16 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72/console.log" append="off"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     <video>
Sep 30 21:29:16 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     </video>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:29:16 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:29:16 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:29:16 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:29:16 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:29:16 compute-0 nova_compute[192810]: </domain>
Sep 30 21:29:16 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:29:16 compute-0 nova_compute[192810]: 2025-09-30 21:29:16.836 2 DEBUG nova.compute.manager [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Preparing to wait for external event network-vif-plugged-69480967-0a46-4a33-95ec-9633390ec042 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:29:16 compute-0 nova_compute[192810]: 2025-09-30 21:29:16.836 2 DEBUG oslo_concurrency.lockutils [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "3d704b04-4399-45ea-bc89-6cf74b4dec72-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:16 compute-0 nova_compute[192810]: 2025-09-30 21:29:16.836 2 DEBUG oslo_concurrency.lockutils [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "3d704b04-4399-45ea-bc89-6cf74b4dec72-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:16 compute-0 nova_compute[192810]: 2025-09-30 21:29:16.837 2 DEBUG oslo_concurrency.lockutils [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "3d704b04-4399-45ea-bc89-6cf74b4dec72-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:16 compute-0 nova_compute[192810]: 2025-09-30 21:29:16.837 2 DEBUG nova.virt.libvirt.vif [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:29:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-22148059',display_name='tempest-ServerActionsTestJSON-server-22148059',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-22148059',id=73,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBADj3eZ6JfWn1sD61WsBF2lWMwpE7XLjMHeX5D51ZTuvFj593BvRFZjp02OuEwvTUJEH79lLLcgJlYP5+6PE14q16iBV+2oZvdFvdVW4CAPM3S7plfjHeuzOdoE0D4V+KA==',key_name='tempest-keypair-557988176',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2af578a858a44374a3dc027bbf7c69f2',ramdisk_id='',reservation_id='r-617x3lpp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1867667353',owner_user_name='tempest-ServerActionsTestJSON-1867667353-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:29:08Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='22ed16bd4ffe4ef8bb21968a857066a1',uuid=3d704b04-4399-45ea-bc89-6cf74b4dec72,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "69480967-0a46-4a33-95ec-9633390ec042", "address": "fa:16:3e:d1:d8:eb", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69480967-0a", "ovs_interfaceid": "69480967-0a46-4a33-95ec-9633390ec042", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:29:16 compute-0 nova_compute[192810]: 2025-09-30 21:29:16.838 2 DEBUG nova.network.os_vif_util [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converting VIF {"id": "69480967-0a46-4a33-95ec-9633390ec042", "address": "fa:16:3e:d1:d8:eb", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69480967-0a", "ovs_interfaceid": "69480967-0a46-4a33-95ec-9633390ec042", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:29:16 compute-0 nova_compute[192810]: 2025-09-30 21:29:16.838 2 DEBUG nova.network.os_vif_util [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:d8:eb,bridge_name='br-int',has_traffic_filtering=True,id=69480967-0a46-4a33-95ec-9633390ec042,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69480967-0a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:29:16 compute-0 nova_compute[192810]: 2025-09-30 21:29:16.839 2 DEBUG os_vif [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:d8:eb,bridge_name='br-int',has_traffic_filtering=True,id=69480967-0a46-4a33-95ec-9633390ec042,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69480967-0a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:29:16 compute-0 nova_compute[192810]: 2025-09-30 21:29:16.839 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:16 compute-0 nova_compute[192810]: 2025-09-30 21:29:16.840 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:29:16 compute-0 nova_compute[192810]: 2025-09-30 21:29:16.840 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:29:16 compute-0 nova_compute[192810]: 2025-09-30 21:29:16.844 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:16 compute-0 nova_compute[192810]: 2025-09-30 21:29:16.844 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap69480967-0a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:29:16 compute-0 nova_compute[192810]: 2025-09-30 21:29:16.845 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap69480967-0a, col_values=(('external_ids', {'iface-id': '69480967-0a46-4a33-95ec-9633390ec042', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d1:d8:eb', 'vm-uuid': '3d704b04-4399-45ea-bc89-6cf74b4dec72'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:29:16 compute-0 nova_compute[192810]: 2025-09-30 21:29:16.846 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:16 compute-0 NetworkManager[51733]: <info>  [1759267756.8474] manager: (tap69480967-0a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/119)
Sep 30 21:29:16 compute-0 nova_compute[192810]: 2025-09-30 21:29:16.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:29:16 compute-0 nova_compute[192810]: 2025-09-30 21:29:16.855 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:16 compute-0 nova_compute[192810]: 2025-09-30 21:29:16.856 2 INFO os_vif [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:d8:eb,bridge_name='br-int',has_traffic_filtering=True,id=69480967-0a46-4a33-95ec-9633390ec042,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69480967-0a')
Sep 30 21:29:16 compute-0 nova_compute[192810]: 2025-09-30 21:29:16.907 2 DEBUG nova.virt.libvirt.driver [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:29:16 compute-0 nova_compute[192810]: 2025-09-30 21:29:16.908 2 DEBUG nova.virt.libvirt.driver [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:29:16 compute-0 nova_compute[192810]: 2025-09-30 21:29:16.908 2 DEBUG nova.virt.libvirt.driver [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] No VIF found with MAC fa:16:3e:d1:d8:eb, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:29:16 compute-0 nova_compute[192810]: 2025-09-30 21:29:16.908 2 INFO nova.virt.libvirt.driver [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Using config drive
Sep 30 21:29:16 compute-0 podman[230641]: 2025-09-30 21:29:16.955204724 +0000 UTC m=+0.057392362 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250923, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:29:17 compute-0 nova_compute[192810]: 2025-09-30 21:29:17.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:17 compute-0 nova_compute[192810]: 2025-09-30 21:29:17.317 2 INFO nova.virt.libvirt.driver [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Creating config drive at /var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72/disk.config
Sep 30 21:29:17 compute-0 nova_compute[192810]: 2025-09-30 21:29:17.322 2 DEBUG oslo_concurrency.processutils [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpy_6ick0r execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:29:17 compute-0 nova_compute[192810]: 2025-09-30 21:29:17.447 2 DEBUG oslo_concurrency.processutils [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpy_6ick0r" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:29:17 compute-0 kernel: tap69480967-0a: entered promiscuous mode
Sep 30 21:29:17 compute-0 NetworkManager[51733]: <info>  [1759267757.5230] manager: (tap69480967-0a): new Tun device (/org/freedesktop/NetworkManager/Devices/120)
Sep 30 21:29:17 compute-0 ovn_controller[94912]: 2025-09-30T21:29:17Z|00263|binding|INFO|Claiming lport 69480967-0a46-4a33-95ec-9633390ec042 for this chassis.
Sep 30 21:29:17 compute-0 ovn_controller[94912]: 2025-09-30T21:29:17Z|00264|binding|INFO|69480967-0a46-4a33-95ec-9633390ec042: Claiming fa:16:3e:d1:d8:eb 10.100.0.14
Sep 30 21:29:17 compute-0 nova_compute[192810]: 2025-09-30 21:29:17.524 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:17 compute-0 nova_compute[192810]: 2025-09-30 21:29:17.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:17.539 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d1:d8:eb 10.100.0.14'], port_security=['fa:16:3e:d1:d8:eb 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '3d704b04-4399-45ea-bc89-6cf74b4dec72', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9692dd1-658f-4c07-943c-6bc662046dc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2af578a858a44374a3dc027bbf7c69f2', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5518a7d3-faed-4617-b7cb-cfdf96df8ee0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a290e6b7-09a2-435f-ae19-df4a5ccfc2d7, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=69480967-0a46-4a33-95ec-9633390ec042) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:17.541 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 69480967-0a46-4a33-95ec-9633390ec042 in datapath f9692dd1-658f-4c07-943c-6bc662046dc4 bound to our chassis
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:17.543 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f9692dd1-658f-4c07-943c-6bc662046dc4
Sep 30 21:29:17 compute-0 systemd-udevd[230675]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:17.558 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[3ce89770-68fc-4c29-81c4-32b795df9482]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:17.560 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf9692dd1-61 in ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:17.561 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf9692dd1-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:17.561 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[21c01c51-7a94-4845-8fa0-0c04307cee14]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:17.563 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[3ec3473a-0532-4702-a272-c52dca17f9d5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:17 compute-0 systemd-machined[152794]: New machine qemu-34-instance-00000049.
Sep 30 21:29:17 compute-0 NetworkManager[51733]: <info>  [1759267757.5747] device (tap69480967-0a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:29:17 compute-0 NetworkManager[51733]: <info>  [1759267757.5755] device (tap69480967-0a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:29:17 compute-0 systemd[1]: Started Virtual Machine qemu-34-instance-00000049.
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:17.576 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[f91e5874-f5f6-4299-8b8d-da402a89ff74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:17 compute-0 ovn_controller[94912]: 2025-09-30T21:29:17Z|00265|binding|INFO|Setting lport 69480967-0a46-4a33-95ec-9633390ec042 ovn-installed in OVS
Sep 30 21:29:17 compute-0 ovn_controller[94912]: 2025-09-30T21:29:17Z|00266|binding|INFO|Setting lport 69480967-0a46-4a33-95ec-9633390ec042 up in Southbound
Sep 30 21:29:17 compute-0 nova_compute[192810]: 2025-09-30 21:29:17.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:17 compute-0 nova_compute[192810]: 2025-09-30 21:29:17.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:17.604 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[dadc070b-8100-4933-89e0-0244465c2c3b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:17.636 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[eef6a85e-8588-4187-b886-f1a35d870cba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:17.642 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[028c65d3-ecee-4cc9-a9cb-693fa3b96e1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:17 compute-0 NetworkManager[51733]: <info>  [1759267757.6430] manager: (tapf9692dd1-60): new Veth device (/org/freedesktop/NetworkManager/Devices/121)
Sep 30 21:29:17 compute-0 systemd-udevd[230679]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:17.679 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[234fca7a-5391-4c60-b260-032f1f8cfa64]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:17.682 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[618f5ae1-6a83-4a0f-9eae-a6575a7e9a7f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:17 compute-0 NetworkManager[51733]: <info>  [1759267757.7062] device (tapf9692dd1-60): carrier: link connected
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:17.712 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[d7996695-8319-47d4-be88-79c8d82bb302]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:17.731 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[3230be22-0d3d-441e-b30a-1c20339ef4df]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9692dd1-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:78:70'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 77], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 445333, 'reachable_time': 28712, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230708, 'error': None, 'target': 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:17.746 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[7849d23a-b4ec-4ced-b227-72ea57a063cf]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed1:7870'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 445333, 'tstamp': 445333}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230710, 'error': None, 'target': 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:17.761 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[64049186-a4e2-4042-9ef3-7e25bfa86f4c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9692dd1-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:78:70'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 77], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 445333, 'reachable_time': 28712, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230712, 'error': None, 'target': 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:17 compute-0 nova_compute[192810]: 2025-09-30 21:29:17.778 2 DEBUG nova.network.neutron [req-5b381892-a7e0-4a4d-b9fa-30be0e92b03c req-aac27c43-e366-4289-9c30-207b0309dcd7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Updated VIF entry in instance network info cache for port 69480967-0a46-4a33-95ec-9633390ec042. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:29:17 compute-0 nova_compute[192810]: 2025-09-30 21:29:17.779 2 DEBUG nova.network.neutron [req-5b381892-a7e0-4a4d-b9fa-30be0e92b03c req-aac27c43-e366-4289-9c30-207b0309dcd7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Updating instance_info_cache with network_info: [{"id": "69480967-0a46-4a33-95ec-9633390ec042", "address": "fa:16:3e:d1:d8:eb", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69480967-0a", "ovs_interfaceid": "69480967-0a46-4a33-95ec-9633390ec042", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:17.795 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[7e578231-d26c-4aad-ba9a-6509b4cd9bb0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:17 compute-0 nova_compute[192810]: 2025-09-30 21:29:17.799 2 DEBUG oslo_concurrency.lockutils [req-5b381892-a7e0-4a4d-b9fa-30be0e92b03c req-aac27c43-e366-4289-9c30-207b0309dcd7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-3d704b04-4399-45ea-bc89-6cf74b4dec72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:17.856 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[5c7cc9a1-6022-4270-886e-1670d5ac6b33]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:17.857 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9692dd1-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:17.857 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:17.858 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf9692dd1-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:29:17 compute-0 nova_compute[192810]: 2025-09-30 21:29:17.859 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:17 compute-0 NetworkManager[51733]: <info>  [1759267757.8601] manager: (tapf9692dd1-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/122)
Sep 30 21:29:17 compute-0 kernel: tapf9692dd1-60: entered promiscuous mode
Sep 30 21:29:17 compute-0 nova_compute[192810]: 2025-09-30 21:29:17.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:17.862 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf9692dd1-60, col_values=(('external_ids', {'iface-id': 'a71d0422-57d0-42fa-887d-fdcb57295fce'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:29:17 compute-0 nova_compute[192810]: 2025-09-30 21:29:17.863 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:17 compute-0 ovn_controller[94912]: 2025-09-30T21:29:17Z|00267|binding|INFO|Releasing lport a71d0422-57d0-42fa-887d-fdcb57295fce from this chassis (sb_readonly=0)
Sep 30 21:29:17 compute-0 nova_compute[192810]: 2025-09-30 21:29:17.878 2 DEBUG nova.compute.manager [req-b05ccd22-eb3a-4bdf-a3e4-afae738b613c req-13bbac07-b24e-4504-a3aa-3273e9581177 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Received event network-vif-plugged-69480967-0a46-4a33-95ec-9633390ec042 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:17.878 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f9692dd1-658f-4c07-943c-6bc662046dc4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f9692dd1-658f-4c07-943c-6bc662046dc4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:29:17 compute-0 nova_compute[192810]: 2025-09-30 21:29:17.879 2 DEBUG oslo_concurrency.lockutils [req-b05ccd22-eb3a-4bdf-a3e4-afae738b613c req-13bbac07-b24e-4504-a3aa-3273e9581177 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3d704b04-4399-45ea-bc89-6cf74b4dec72-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:17 compute-0 nova_compute[192810]: 2025-09-30 21:29:17.879 2 DEBUG oslo_concurrency.lockutils [req-b05ccd22-eb3a-4bdf-a3e4-afae738b613c req-13bbac07-b24e-4504-a3aa-3273e9581177 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3d704b04-4399-45ea-bc89-6cf74b4dec72-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:17 compute-0 nova_compute[192810]: 2025-09-30 21:29:17.879 2 DEBUG oslo_concurrency.lockutils [req-b05ccd22-eb3a-4bdf-a3e4-afae738b613c req-13bbac07-b24e-4504-a3aa-3273e9581177 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3d704b04-4399-45ea-bc89-6cf74b4dec72-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:17 compute-0 nova_compute[192810]: 2025-09-30 21:29:17.879 2 DEBUG nova.compute.manager [req-b05ccd22-eb3a-4bdf-a3e4-afae738b613c req-13bbac07-b24e-4504-a3aa-3273e9581177 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Processing event network-vif-plugged-69480967-0a46-4a33-95ec-9633390ec042 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:29:17 compute-0 nova_compute[192810]: 2025-09-30 21:29:17.880 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:17.880 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[9b4a6202-6eb8-4349-834f-742937ad758e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:17.880 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-f9692dd1-658f-4c07-943c-6bc662046dc4
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/f9692dd1-658f-4c07-943c-6bc662046dc4.pid.haproxy
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID f9692dd1-658f-4c07-943c-6bc662046dc4
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:29:17 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:17.881 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'env', 'PROCESS_TAG=haproxy-f9692dd1-658f-4c07-943c-6bc662046dc4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f9692dd1-658f-4c07-943c-6bc662046dc4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:29:18 compute-0 nova_compute[192810]: 2025-09-30 21:29:18.196 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267758.196564, 3d704b04-4399-45ea-bc89-6cf74b4dec72 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:29:18 compute-0 nova_compute[192810]: 2025-09-30 21:29:18.197 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] VM Started (Lifecycle Event)
Sep 30 21:29:18 compute-0 nova_compute[192810]: 2025-09-30 21:29:18.199 2 DEBUG nova.compute.manager [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:29:18 compute-0 nova_compute[192810]: 2025-09-30 21:29:18.202 2 DEBUG nova.virt.libvirt.driver [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:29:18 compute-0 nova_compute[192810]: 2025-09-30 21:29:18.205 2 INFO nova.virt.libvirt.driver [-] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Instance spawned successfully.
Sep 30 21:29:18 compute-0 nova_compute[192810]: 2025-09-30 21:29:18.206 2 DEBUG nova.virt.libvirt.driver [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:29:18 compute-0 nova_compute[192810]: 2025-09-30 21:29:18.217 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:29:18 compute-0 nova_compute[192810]: 2025-09-30 21:29:18.222 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:29:18 compute-0 nova_compute[192810]: 2025-09-30 21:29:18.227 2 DEBUG nova.virt.libvirt.driver [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:29:18 compute-0 nova_compute[192810]: 2025-09-30 21:29:18.227 2 DEBUG nova.virt.libvirt.driver [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:29:18 compute-0 nova_compute[192810]: 2025-09-30 21:29:18.228 2 DEBUG nova.virt.libvirt.driver [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:29:18 compute-0 nova_compute[192810]: 2025-09-30 21:29:18.228 2 DEBUG nova.virt.libvirt.driver [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:29:18 compute-0 nova_compute[192810]: 2025-09-30 21:29:18.229 2 DEBUG nova.virt.libvirt.driver [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:29:18 compute-0 nova_compute[192810]: 2025-09-30 21:29:18.229 2 DEBUG nova.virt.libvirt.driver [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:29:18 compute-0 nova_compute[192810]: 2025-09-30 21:29:18.239 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:29:18 compute-0 nova_compute[192810]: 2025-09-30 21:29:18.239 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267758.197303, 3d704b04-4399-45ea-bc89-6cf74b4dec72 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:29:18 compute-0 nova_compute[192810]: 2025-09-30 21:29:18.240 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] VM Paused (Lifecycle Event)
Sep 30 21:29:18 compute-0 nova_compute[192810]: 2025-09-30 21:29:18.260 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:29:18 compute-0 nova_compute[192810]: 2025-09-30 21:29:18.263 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267758.2021565, 3d704b04-4399-45ea-bc89-6cf74b4dec72 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:29:18 compute-0 nova_compute[192810]: 2025-09-30 21:29:18.263 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] VM Resumed (Lifecycle Event)
Sep 30 21:29:18 compute-0 podman[230749]: 2025-09-30 21:29:18.268377654 +0000 UTC m=+0.061051054 container create 58b50800ea9b6bd3da8e9d8fbcc6f94989e960832cb14553d25965bb922647cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Sep 30 21:29:18 compute-0 nova_compute[192810]: 2025-09-30 21:29:18.293 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:29:18 compute-0 nova_compute[192810]: 2025-09-30 21:29:18.295 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:29:18 compute-0 systemd[1]: Started libpod-conmon-58b50800ea9b6bd3da8e9d8fbcc6f94989e960832cb14553d25965bb922647cb.scope.
Sep 30 21:29:18 compute-0 nova_compute[192810]: 2025-09-30 21:29:18.316 2 INFO nova.compute.manager [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Took 9.54 seconds to spawn the instance on the hypervisor.
Sep 30 21:29:18 compute-0 nova_compute[192810]: 2025-09-30 21:29:18.316 2 DEBUG nova.compute.manager [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:29:18 compute-0 nova_compute[192810]: 2025-09-30 21:29:18.321 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:29:18 compute-0 podman[230749]: 2025-09-30 21:29:18.229131862 +0000 UTC m=+0.021805282 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:29:18 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:29:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c2f8134a762b8d15a3a37f7d84c752b6011dfba38259f8f5fa12367c8248011/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:29:18 compute-0 podman[230749]: 2025-09-30 21:29:18.368141255 +0000 UTC m=+0.160814655 container init 58b50800ea9b6bd3da8e9d8fbcc6f94989e960832cb14553d25965bb922647cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Sep 30 21:29:18 compute-0 podman[230749]: 2025-09-30 21:29:18.373417949 +0000 UTC m=+0.166091349 container start 58b50800ea9b6bd3da8e9d8fbcc6f94989e960832cb14553d25965bb922647cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2)
Sep 30 21:29:18 compute-0 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[230764]: [NOTICE]   (230768) : New worker (230770) forked
Sep 30 21:29:18 compute-0 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[230764]: [NOTICE]   (230768) : Loading success.
Sep 30 21:29:18 compute-0 nova_compute[192810]: 2025-09-30 21:29:18.409 2 INFO nova.compute.manager [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Took 10.10 seconds to build instance.
Sep 30 21:29:18 compute-0 nova_compute[192810]: 2025-09-30 21:29:18.466 2 DEBUG oslo_concurrency.lockutils [None req-fe5cf2a0-02e6-4de8-acfe-442834b4287e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "3d704b04-4399-45ea-bc89-6cf74b4dec72" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.223s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:20 compute-0 nova_compute[192810]: 2025-09-30 21:29:20.718 2 DEBUG nova.compute.manager [req-0cfd898d-599c-4a36-9774-3f047fc396d7 req-acd95ed0-ce1f-4854-a74b-d004ad39e2fb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Received event network-vif-plugged-69480967-0a46-4a33-95ec-9633390ec042 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:29:20 compute-0 nova_compute[192810]: 2025-09-30 21:29:20.718 2 DEBUG oslo_concurrency.lockutils [req-0cfd898d-599c-4a36-9774-3f047fc396d7 req-acd95ed0-ce1f-4854-a74b-d004ad39e2fb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3d704b04-4399-45ea-bc89-6cf74b4dec72-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:20 compute-0 nova_compute[192810]: 2025-09-30 21:29:20.719 2 DEBUG oslo_concurrency.lockutils [req-0cfd898d-599c-4a36-9774-3f047fc396d7 req-acd95ed0-ce1f-4854-a74b-d004ad39e2fb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3d704b04-4399-45ea-bc89-6cf74b4dec72-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:20 compute-0 nova_compute[192810]: 2025-09-30 21:29:20.719 2 DEBUG oslo_concurrency.lockutils [req-0cfd898d-599c-4a36-9774-3f047fc396d7 req-acd95ed0-ce1f-4854-a74b-d004ad39e2fb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3d704b04-4399-45ea-bc89-6cf74b4dec72-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:20 compute-0 nova_compute[192810]: 2025-09-30 21:29:20.719 2 DEBUG nova.compute.manager [req-0cfd898d-599c-4a36-9774-3f047fc396d7 req-acd95ed0-ce1f-4854-a74b-d004ad39e2fb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] No waiting events found dispatching network-vif-plugged-69480967-0a46-4a33-95ec-9633390ec042 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:29:20 compute-0 nova_compute[192810]: 2025-09-30 21:29:20.719 2 WARNING nova.compute.manager [req-0cfd898d-599c-4a36-9774-3f047fc396d7 req-acd95ed0-ce1f-4854-a74b-d004ad39e2fb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Received unexpected event network-vif-plugged-69480967-0a46-4a33-95ec-9633390ec042 for instance with vm_state active and task_state None.
Sep 30 21:29:21 compute-0 nova_compute[192810]: 2025-09-30 21:29:21.848 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:22 compute-0 nova_compute[192810]: 2025-09-30 21:29:22.156 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:22 compute-0 nova_compute[192810]: 2025-09-30 21:29:22.531 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:22 compute-0 NetworkManager[51733]: <info>  [1759267762.5337] manager: (patch-br-int-to-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/123)
Sep 30 21:29:22 compute-0 NetworkManager[51733]: <info>  [1759267762.5350] manager: (patch-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/124)
Sep 30 21:29:22 compute-0 nova_compute[192810]: 2025-09-30 21:29:22.683 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:22 compute-0 ovn_controller[94912]: 2025-09-30T21:29:22Z|00268|binding|INFO|Releasing lport a71d0422-57d0-42fa-887d-fdcb57295fce from this chassis (sb_readonly=0)
Sep 30 21:29:22 compute-0 nova_compute[192810]: 2025-09-30 21:29:22.698 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:24 compute-0 nova_compute[192810]: 2025-09-30 21:29:24.973 2 DEBUG nova.compute.manager [req-5143c5e6-7733-4950-be25-d33839682b8b req-3baaa7ea-6385-41dd-a45d-a67ab38693ba dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Received event network-changed-69480967-0a46-4a33-95ec-9633390ec042 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:29:24 compute-0 nova_compute[192810]: 2025-09-30 21:29:24.973 2 DEBUG nova.compute.manager [req-5143c5e6-7733-4950-be25-d33839682b8b req-3baaa7ea-6385-41dd-a45d-a67ab38693ba dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Refreshing instance network info cache due to event network-changed-69480967-0a46-4a33-95ec-9633390ec042. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:29:24 compute-0 nova_compute[192810]: 2025-09-30 21:29:24.974 2 DEBUG oslo_concurrency.lockutils [req-5143c5e6-7733-4950-be25-d33839682b8b req-3baaa7ea-6385-41dd-a45d-a67ab38693ba dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-3d704b04-4399-45ea-bc89-6cf74b4dec72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:29:24 compute-0 nova_compute[192810]: 2025-09-30 21:29:24.974 2 DEBUG oslo_concurrency.lockutils [req-5143c5e6-7733-4950-be25-d33839682b8b req-3baaa7ea-6385-41dd-a45d-a67ab38693ba dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-3d704b04-4399-45ea-bc89-6cf74b4dec72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:29:24 compute-0 nova_compute[192810]: 2025-09-30 21:29:24.974 2 DEBUG nova.network.neutron [req-5143c5e6-7733-4950-be25-d33839682b8b req-3baaa7ea-6385-41dd-a45d-a67ab38693ba dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Refreshing network info cache for port 69480967-0a46-4a33-95ec-9633390ec042 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:29:25 compute-0 podman[230781]: 2025-09-30 21:29:25.325840538 +0000 UTC m=+0.055731530 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:29:25 compute-0 podman[230782]: 2025-09-30 21:29:25.330775342 +0000 UTC m=+0.060881890 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, config_id=edpm, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Sep 30 21:29:26 compute-0 nova_compute[192810]: 2025-09-30 21:29:26.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:27 compute-0 nova_compute[192810]: 2025-09-30 21:29:27.158 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:28 compute-0 nova_compute[192810]: 2025-09-30 21:29:28.071 2 DEBUG nova.network.neutron [req-5143c5e6-7733-4950-be25-d33839682b8b req-3baaa7ea-6385-41dd-a45d-a67ab38693ba dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Updated VIF entry in instance network info cache for port 69480967-0a46-4a33-95ec-9633390ec042. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:29:28 compute-0 nova_compute[192810]: 2025-09-30 21:29:28.072 2 DEBUG nova.network.neutron [req-5143c5e6-7733-4950-be25-d33839682b8b req-3baaa7ea-6385-41dd-a45d-a67ab38693ba dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Updating instance_info_cache with network_info: [{"id": "69480967-0a46-4a33-95ec-9633390ec042", "address": "fa:16:3e:d1:d8:eb", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69480967-0a", "ovs_interfaceid": "69480967-0a46-4a33-95ec-9633390ec042", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:29:28 compute-0 nova_compute[192810]: 2025-09-30 21:29:28.287 2 DEBUG oslo_concurrency.lockutils [req-5143c5e6-7733-4950-be25-d33839682b8b req-3baaa7ea-6385-41dd-a45d-a67ab38693ba dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-3d704b04-4399-45ea-bc89-6cf74b4dec72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:29:29 compute-0 nova_compute[192810]: 2025-09-30 21:29:29.124 2 DEBUG oslo_concurrency.lockutils [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Acquiring lock "326f57c1-51b5-4fdb-ac31-2d754684b734" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:29 compute-0 nova_compute[192810]: 2025-09-30 21:29:29.125 2 DEBUG oslo_concurrency.lockutils [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Lock "326f57c1-51b5-4fdb-ac31-2d754684b734" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:29 compute-0 nova_compute[192810]: 2025-09-30 21:29:29.155 2 DEBUG nova.compute.manager [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:29:29 compute-0 nova_compute[192810]: 2025-09-30 21:29:29.264 2 DEBUG oslo_concurrency.lockutils [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:29 compute-0 nova_compute[192810]: 2025-09-30 21:29:29.265 2 DEBUG oslo_concurrency.lockutils [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:29 compute-0 nova_compute[192810]: 2025-09-30 21:29:29.273 2 DEBUG nova.virt.hardware [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:29:29 compute-0 nova_compute[192810]: 2025-09-30 21:29:29.273 2 INFO nova.compute.claims [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:29:29 compute-0 nova_compute[192810]: 2025-09-30 21:29:29.509 2 DEBUG nova.compute.provider_tree [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:29:29 compute-0 nova_compute[192810]: 2025-09-30 21:29:29.558 2 DEBUG nova.scheduler.client.report [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:29:29 compute-0 nova_compute[192810]: 2025-09-30 21:29:29.699 2 DEBUG oslo_concurrency.lockutils [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.434s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:29 compute-0 nova_compute[192810]: 2025-09-30 21:29:29.699 2 DEBUG nova.compute.manager [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:29:29 compute-0 nova_compute[192810]: 2025-09-30 21:29:29.767 2 DEBUG nova.compute.manager [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:29:29 compute-0 nova_compute[192810]: 2025-09-30 21:29:29.768 2 DEBUG nova.network.neutron [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:29:29 compute-0 nova_compute[192810]: 2025-09-30 21:29:29.783 2 INFO nova.virt.libvirt.driver [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:29:29 compute-0 nova_compute[192810]: 2025-09-30 21:29:29.812 2 DEBUG nova.compute.manager [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:29:29 compute-0 nova_compute[192810]: 2025-09-30 21:29:29.941 2 DEBUG nova.compute.manager [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:29:29 compute-0 nova_compute[192810]: 2025-09-30 21:29:29.942 2 DEBUG nova.virt.libvirt.driver [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:29:29 compute-0 nova_compute[192810]: 2025-09-30 21:29:29.942 2 INFO nova.virt.libvirt.driver [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Creating image(s)
Sep 30 21:29:29 compute-0 nova_compute[192810]: 2025-09-30 21:29:29.943 2 DEBUG oslo_concurrency.lockutils [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Acquiring lock "/var/lib/nova/instances/326f57c1-51b5-4fdb-ac31-2d754684b734/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:29 compute-0 nova_compute[192810]: 2025-09-30 21:29:29.943 2 DEBUG oslo_concurrency.lockutils [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Lock "/var/lib/nova/instances/326f57c1-51b5-4fdb-ac31-2d754684b734/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:29 compute-0 nova_compute[192810]: 2025-09-30 21:29:29.944 2 DEBUG oslo_concurrency.lockutils [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Lock "/var/lib/nova/instances/326f57c1-51b5-4fdb-ac31-2d754684b734/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:29 compute-0 nova_compute[192810]: 2025-09-30 21:29:29.955 2 DEBUG oslo_concurrency.processutils [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:29:29 compute-0 nova_compute[192810]: 2025-09-30 21:29:29.972 2 DEBUG nova.policy [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e2eec3e45b7946af8db04ebcaf028f7a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ff84dd76a06841f3bdcc13f83407b7ba', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:29:30 compute-0 nova_compute[192810]: 2025-09-30 21:29:30.011 2 DEBUG oslo_concurrency.processutils [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:29:30 compute-0 nova_compute[192810]: 2025-09-30 21:29:30.011 2 DEBUG oslo_concurrency.lockutils [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:30 compute-0 nova_compute[192810]: 2025-09-30 21:29:30.012 2 DEBUG oslo_concurrency.lockutils [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:30 compute-0 nova_compute[192810]: 2025-09-30 21:29:30.022 2 DEBUG oslo_concurrency.processutils [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:29:30 compute-0 nova_compute[192810]: 2025-09-30 21:29:30.084 2 DEBUG oslo_concurrency.processutils [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:29:30 compute-0 nova_compute[192810]: 2025-09-30 21:29:30.085 2 DEBUG oslo_concurrency.processutils [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/326f57c1-51b5-4fdb-ac31-2d754684b734/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:29:30 compute-0 nova_compute[192810]: 2025-09-30 21:29:30.117 2 DEBUG oslo_concurrency.processutils [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/326f57c1-51b5-4fdb-ac31-2d754684b734/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:29:30 compute-0 nova_compute[192810]: 2025-09-30 21:29:30.118 2 DEBUG oslo_concurrency.lockutils [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:30 compute-0 nova_compute[192810]: 2025-09-30 21:29:30.119 2 DEBUG oslo_concurrency.processutils [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:29:30 compute-0 nova_compute[192810]: 2025-09-30 21:29:30.190 2 DEBUG oslo_concurrency.processutils [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:29:30 compute-0 nova_compute[192810]: 2025-09-30 21:29:30.191 2 DEBUG nova.virt.disk.api [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Checking if we can resize image /var/lib/nova/instances/326f57c1-51b5-4fdb-ac31-2d754684b734/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:29:30 compute-0 nova_compute[192810]: 2025-09-30 21:29:30.192 2 DEBUG oslo_concurrency.processutils [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/326f57c1-51b5-4fdb-ac31-2d754684b734/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:29:30 compute-0 nova_compute[192810]: 2025-09-30 21:29:30.268 2 DEBUG oslo_concurrency.processutils [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/326f57c1-51b5-4fdb-ac31-2d754684b734/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:29:30 compute-0 nova_compute[192810]: 2025-09-30 21:29:30.269 2 DEBUG nova.virt.disk.api [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Cannot resize image /var/lib/nova/instances/326f57c1-51b5-4fdb-ac31-2d754684b734/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:29:30 compute-0 nova_compute[192810]: 2025-09-30 21:29:30.269 2 DEBUG nova.objects.instance [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Lazy-loading 'migration_context' on Instance uuid 326f57c1-51b5-4fdb-ac31-2d754684b734 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:29:30 compute-0 nova_compute[192810]: 2025-09-30 21:29:30.347 2 DEBUG nova.virt.libvirt.driver [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:29:30 compute-0 nova_compute[192810]: 2025-09-30 21:29:30.348 2 DEBUG nova.virt.libvirt.driver [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Ensure instance console log exists: /var/lib/nova/instances/326f57c1-51b5-4fdb-ac31-2d754684b734/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:29:30 compute-0 nova_compute[192810]: 2025-09-30 21:29:30.348 2 DEBUG oslo_concurrency.lockutils [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:30 compute-0 nova_compute[192810]: 2025-09-30 21:29:30.349 2 DEBUG oslo_concurrency.lockutils [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:30 compute-0 nova_compute[192810]: 2025-09-30 21:29:30.349 2 DEBUG oslo_concurrency.lockutils [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:30.754 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:29:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:30.755 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:29:30 compute-0 nova_compute[192810]: 2025-09-30 21:29:30.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:30 compute-0 nova_compute[192810]: 2025-09-30 21:29:30.864 2 DEBUG nova.network.neutron [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Successfully created port: 653ddbd9-e7e8-481f-a8cf-875e0ca3e139 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:29:31 compute-0 ovn_controller[94912]: 2025-09-30T21:29:31Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d1:d8:eb 10.100.0.14
Sep 30 21:29:31 compute-0 ovn_controller[94912]: 2025-09-30T21:29:31Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d1:d8:eb 10.100.0.14
Sep 30 21:29:31 compute-0 nova_compute[192810]: 2025-09-30 21:29:31.899 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:32 compute-0 nova_compute[192810]: 2025-09-30 21:29:32.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:32 compute-0 podman[230858]: 2025-09-30 21:29:32.333699868 +0000 UTC m=+0.068764559 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 21:29:32 compute-0 podman[230856]: 2025-09-30 21:29:32.344222684 +0000 UTC m=+0.083852691 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:29:32 compute-0 podman[230857]: 2025-09-30 21:29:32.363148952 +0000 UTC m=+0.092424397 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=iscsid, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:29:32 compute-0 nova_compute[192810]: 2025-09-30 21:29:32.619 2 DEBUG nova.network.neutron [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Successfully updated port: 653ddbd9-e7e8-481f-a8cf-875e0ca3e139 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:29:32 compute-0 nova_compute[192810]: 2025-09-30 21:29:32.637 2 DEBUG oslo_concurrency.lockutils [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Acquiring lock "refresh_cache-326f57c1-51b5-4fdb-ac31-2d754684b734" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:29:32 compute-0 nova_compute[192810]: 2025-09-30 21:29:32.638 2 DEBUG oslo_concurrency.lockutils [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Acquired lock "refresh_cache-326f57c1-51b5-4fdb-ac31-2d754684b734" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:29:32 compute-0 nova_compute[192810]: 2025-09-30 21:29:32.638 2 DEBUG nova.network.neutron [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:29:32 compute-0 nova_compute[192810]: 2025-09-30 21:29:32.726 2 DEBUG nova.compute.manager [req-d9221796-41fa-4296-b031-7083643cbdae req-8c882806-4074-42b1-a030-864ec89adb59 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Received event network-changed-653ddbd9-e7e8-481f-a8cf-875e0ca3e139 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:29:32 compute-0 nova_compute[192810]: 2025-09-30 21:29:32.726 2 DEBUG nova.compute.manager [req-d9221796-41fa-4296-b031-7083643cbdae req-8c882806-4074-42b1-a030-864ec89adb59 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Refreshing instance network info cache due to event network-changed-653ddbd9-e7e8-481f-a8cf-875e0ca3e139. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:29:32 compute-0 nova_compute[192810]: 2025-09-30 21:29:32.727 2 DEBUG oslo_concurrency.lockutils [req-d9221796-41fa-4296-b031-7083643cbdae req-8c882806-4074-42b1-a030-864ec89adb59 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-326f57c1-51b5-4fdb-ac31-2d754684b734" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:29:32 compute-0 nova_compute[192810]: 2025-09-30 21:29:32.809 2 DEBUG nova.network.neutron [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:29:33 compute-0 nova_compute[192810]: 2025-09-30 21:29:33.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:29:33 compute-0 nova_compute[192810]: 2025-09-30 21:29:33.787 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Sep 30 21:29:33 compute-0 nova_compute[192810]: 2025-09-30 21:29:33.801 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.007 2 DEBUG nova.network.neutron [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Updating instance_info_cache with network_info: [{"id": "653ddbd9-e7e8-481f-a8cf-875e0ca3e139", "address": "fa:16:3e:4e:25:7b", "network": {"id": "52d4dc6e-c68c-4c9f-816b-010b74312e25", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1072711906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ff84dd76a06841f3bdcc13f83407b7ba", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap653ddbd9-e7", "ovs_interfaceid": "653ddbd9-e7e8-481f-a8cf-875e0ca3e139", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.049 2 DEBUG oslo_concurrency.lockutils [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Releasing lock "refresh_cache-326f57c1-51b5-4fdb-ac31-2d754684b734" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.050 2 DEBUG nova.compute.manager [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Instance network_info: |[{"id": "653ddbd9-e7e8-481f-a8cf-875e0ca3e139", "address": "fa:16:3e:4e:25:7b", "network": {"id": "52d4dc6e-c68c-4c9f-816b-010b74312e25", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1072711906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ff84dd76a06841f3bdcc13f83407b7ba", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap653ddbd9-e7", "ovs_interfaceid": "653ddbd9-e7e8-481f-a8cf-875e0ca3e139", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.051 2 DEBUG oslo_concurrency.lockutils [req-d9221796-41fa-4296-b031-7083643cbdae req-8c882806-4074-42b1-a030-864ec89adb59 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-326f57c1-51b5-4fdb-ac31-2d754684b734" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.051 2 DEBUG nova.network.neutron [req-d9221796-41fa-4296-b031-7083643cbdae req-8c882806-4074-42b1-a030-864ec89adb59 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Refreshing network info cache for port 653ddbd9-e7e8-481f-a8cf-875e0ca3e139 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.055 2 DEBUG nova.virt.libvirt.driver [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Start _get_guest_xml network_info=[{"id": "653ddbd9-e7e8-481f-a8cf-875e0ca3e139", "address": "fa:16:3e:4e:25:7b", "network": {"id": "52d4dc6e-c68c-4c9f-816b-010b74312e25", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1072711906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ff84dd76a06841f3bdcc13f83407b7ba", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap653ddbd9-e7", "ovs_interfaceid": "653ddbd9-e7e8-481f-a8cf-875e0ca3e139", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.060 2 WARNING nova.virt.libvirt.driver [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.068 2 DEBUG nova.virt.libvirt.host [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.069 2 DEBUG nova.virt.libvirt.host [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.072 2 DEBUG nova.virt.libvirt.host [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.072 2 DEBUG nova.virt.libvirt.host [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.073 2 DEBUG nova.virt.libvirt.driver [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.074 2 DEBUG nova.virt.hardware [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.074 2 DEBUG nova.virt.hardware [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.074 2 DEBUG nova.virt.hardware [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.074 2 DEBUG nova.virt.hardware [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.075 2 DEBUG nova.virt.hardware [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.075 2 DEBUG nova.virt.hardware [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.075 2 DEBUG nova.virt.hardware [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.075 2 DEBUG nova.virt.hardware [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.076 2 DEBUG nova.virt.hardware [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.076 2 DEBUG nova.virt.hardware [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.076 2 DEBUG nova.virt.hardware [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.079 2 DEBUG nova.virt.libvirt.vif [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] vif_type=ovs instance=Instance(access_ip_v4=2.2.2.2,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:29:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='guest-instance-1.domain.com',display_name='guest-instance-1.domain.com',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='guest-instance-1-domain-com',id=75,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM/1IBUkkyaU5oPysNsJ8gPw3vUlBbwFlnKGKVOWQScO88HvqBZTV0CnXUyBRjZdY1EyVymgB+KaUMUZGHsrL7tRlWYMuPLiCwJG5fIXmNt4g9/kThbknLPhlR/gWcBiVA==',key_name='tempest-keypair-1094843309',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ff84dd76a06841f3bdcc13f83407b7ba',ramdisk_id='',reservation_id='r-7cs01tb6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_m
in_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestFqdnHostnames-1444957193',owner_user_name='tempest-ServersTestFqdnHostnames-1444957193-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:29:29Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e2eec3e45b7946af8db04ebcaf028f7a',uuid=326f57c1-51b5-4fdb-ac31-2d754684b734,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "653ddbd9-e7e8-481f-a8cf-875e0ca3e139", "address": "fa:16:3e:4e:25:7b", "network": {"id": "52d4dc6e-c68c-4c9f-816b-010b74312e25", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1072711906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ff84dd76a06841f3bdcc13f83407b7ba", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap653ddbd9-e7", "ovs_interfaceid": "653ddbd9-e7e8-481f-a8cf-875e0ca3e139", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.080 2 DEBUG nova.network.os_vif_util [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Converting VIF {"id": "653ddbd9-e7e8-481f-a8cf-875e0ca3e139", "address": "fa:16:3e:4e:25:7b", "network": {"id": "52d4dc6e-c68c-4c9f-816b-010b74312e25", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1072711906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ff84dd76a06841f3bdcc13f83407b7ba", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap653ddbd9-e7", "ovs_interfaceid": "653ddbd9-e7e8-481f-a8cf-875e0ca3e139", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.080 2 DEBUG nova.network.os_vif_util [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4e:25:7b,bridge_name='br-int',has_traffic_filtering=True,id=653ddbd9-e7e8-481f-a8cf-875e0ca3e139,network=Network(52d4dc6e-c68c-4c9f-816b-010b74312e25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap653ddbd9-e7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.081 2 DEBUG nova.objects.instance [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Lazy-loading 'pci_devices' on Instance uuid 326f57c1-51b5-4fdb-ac31-2d754684b734 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.096 2 DEBUG nova.virt.libvirt.driver [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:29:34 compute-0 nova_compute[192810]:   <uuid>326f57c1-51b5-4fdb-ac31-2d754684b734</uuid>
Sep 30 21:29:34 compute-0 nova_compute[192810]:   <name>instance-0000004b</name>
Sep 30 21:29:34 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:29:34 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:29:34 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:29:34 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:       <nova:name>guest-instance-1.domain.com</nova:name>
Sep 30 21:29:34 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:29:34</nova:creationTime>
Sep 30 21:29:34 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:29:34 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:29:34 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:29:34 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:29:34 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:29:34 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:29:34 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:29:34 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:29:34 compute-0 nova_compute[192810]:         <nova:user uuid="e2eec3e45b7946af8db04ebcaf028f7a">tempest-ServersTestFqdnHostnames-1444957193-project-member</nova:user>
Sep 30 21:29:34 compute-0 nova_compute[192810]:         <nova:project uuid="ff84dd76a06841f3bdcc13f83407b7ba">tempest-ServersTestFqdnHostnames-1444957193</nova:project>
Sep 30 21:29:34 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:29:34 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:29:34 compute-0 nova_compute[192810]:         <nova:port uuid="653ddbd9-e7e8-481f-a8cf-875e0ca3e139">
Sep 30 21:29:34 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:29:34 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:29:34 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:29:34 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:29:34 compute-0 nova_compute[192810]:     <system>
Sep 30 21:29:34 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:29:34 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:29:34 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:29:34 compute-0 nova_compute[192810]:       <entry name="serial">326f57c1-51b5-4fdb-ac31-2d754684b734</entry>
Sep 30 21:29:34 compute-0 nova_compute[192810]:       <entry name="uuid">326f57c1-51b5-4fdb-ac31-2d754684b734</entry>
Sep 30 21:29:34 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     </system>
Sep 30 21:29:34 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:29:34 compute-0 nova_compute[192810]:   <os>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:   </os>
Sep 30 21:29:34 compute-0 nova_compute[192810]:   <features>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:   </features>
Sep 30 21:29:34 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:29:34 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:29:34 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:29:34 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:29:34 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:29:34 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/326f57c1-51b5-4fdb-ac31-2d754684b734/disk"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:29:34 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/326f57c1-51b5-4fdb-ac31-2d754684b734/disk.config"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:29:34 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:4e:25:7b"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:       <target dev="tap653ddbd9-e7"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:29:34 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/326f57c1-51b5-4fdb-ac31-2d754684b734/console.log" append="off"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     <video>
Sep 30 21:29:34 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     </video>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:29:34 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:29:34 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:29:34 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:29:34 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:29:34 compute-0 nova_compute[192810]: </domain>
Sep 30 21:29:34 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.097 2 DEBUG nova.compute.manager [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Preparing to wait for external event network-vif-plugged-653ddbd9-e7e8-481f-a8cf-875e0ca3e139 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.097 2 DEBUG oslo_concurrency.lockutils [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Acquiring lock "326f57c1-51b5-4fdb-ac31-2d754684b734-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.097 2 DEBUG oslo_concurrency.lockutils [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Lock "326f57c1-51b5-4fdb-ac31-2d754684b734-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.098 2 DEBUG oslo_concurrency.lockutils [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Lock "326f57c1-51b5-4fdb-ac31-2d754684b734-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.098 2 DEBUG nova.virt.libvirt.vif [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] vif_type=ovs instance=Instance(access_ip_v4=2.2.2.2,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:29:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='guest-instance-1.domain.com',display_name='guest-instance-1.domain.com',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='guest-instance-1-domain-com',id=75,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM/1IBUkkyaU5oPysNsJ8gPw3vUlBbwFlnKGKVOWQScO88HvqBZTV0CnXUyBRjZdY1EyVymgB+KaUMUZGHsrL7tRlWYMuPLiCwJG5fIXmNt4g9/kThbknLPhlR/gWcBiVA==',key_name='tempest-keypair-1094843309',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ff84dd76a06841f3bdcc13f83407b7ba',ramdisk_id='',reservation_id='r-7cs01tb6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestFqdnHostnames-1444957193',owner_user_name='tempest-ServersTestFqdnHostnames-1444957193-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:29:29Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e2eec3e45b7946af8db04ebcaf028f7a',uuid=326f57c1-51b5-4fdb-ac31-2d754684b734,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "653ddbd9-e7e8-481f-a8cf-875e0ca3e139", "address": "fa:16:3e:4e:25:7b", "network": {"id": "52d4dc6e-c68c-4c9f-816b-010b74312e25", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1072711906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ff84dd76a06841f3bdcc13f83407b7ba", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap653ddbd9-e7", "ovs_interfaceid": "653ddbd9-e7e8-481f-a8cf-875e0ca3e139", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.099 2 DEBUG nova.network.os_vif_util [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Converting VIF {"id": "653ddbd9-e7e8-481f-a8cf-875e0ca3e139", "address": "fa:16:3e:4e:25:7b", "network": {"id": "52d4dc6e-c68c-4c9f-816b-010b74312e25", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1072711906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ff84dd76a06841f3bdcc13f83407b7ba", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap653ddbd9-e7", "ovs_interfaceid": "653ddbd9-e7e8-481f-a8cf-875e0ca3e139", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.099 2 DEBUG nova.network.os_vif_util [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4e:25:7b,bridge_name='br-int',has_traffic_filtering=True,id=653ddbd9-e7e8-481f-a8cf-875e0ca3e139,network=Network(52d4dc6e-c68c-4c9f-816b-010b74312e25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap653ddbd9-e7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.100 2 DEBUG os_vif [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:25:7b,bridge_name='br-int',has_traffic_filtering=True,id=653ddbd9-e7e8-481f-a8cf-875e0ca3e139,network=Network(52d4dc6e-c68c-4c9f-816b-010b74312e25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap653ddbd9-e7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.100 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.100 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.101 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.103 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap653ddbd9-e7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.104 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap653ddbd9-e7, col_values=(('external_ids', {'iface-id': '653ddbd9-e7e8-481f-a8cf-875e0ca3e139', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4e:25:7b', 'vm-uuid': '326f57c1-51b5-4fdb-ac31-2d754684b734'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.105 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:29:34 compute-0 NetworkManager[51733]: <info>  [1759267774.1077] manager: (tap653ddbd9-e7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/125)
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.114 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.114 2 INFO os_vif [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:25:7b,bridge_name='br-int',has_traffic_filtering=True,id=653ddbd9-e7e8-481f-a8cf-875e0ca3e139,network=Network(52d4dc6e-c68c-4c9f-816b-010b74312e25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap653ddbd9-e7')
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.177 2 DEBUG nova.virt.libvirt.driver [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.178 2 DEBUG nova.virt.libvirt.driver [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.178 2 DEBUG nova.virt.libvirt.driver [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] No VIF found with MAC fa:16:3e:4e:25:7b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.178 2 INFO nova.virt.libvirt.driver [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Using config drive
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.773 2 INFO nova.virt.libvirt.driver [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Creating config drive at /var/lib/nova/instances/326f57c1-51b5-4fdb-ac31-2d754684b734/disk.config
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.779 2 DEBUG oslo_concurrency.processutils [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/326f57c1-51b5-4fdb-ac31-2d754684b734/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptuz7ny30 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:29:34 compute-0 nova_compute[192810]: 2025-09-30 21:29:34.923 2 DEBUG oslo_concurrency.processutils [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/326f57c1-51b5-4fdb-ac31-2d754684b734/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptuz7ny30" returned: 0 in 0.144s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:29:35 compute-0 kernel: tap653ddbd9-e7: entered promiscuous mode
Sep 30 21:29:35 compute-0 NetworkManager[51733]: <info>  [1759267775.0031] manager: (tap653ddbd9-e7): new Tun device (/org/freedesktop/NetworkManager/Devices/126)
Sep 30 21:29:35 compute-0 ovn_controller[94912]: 2025-09-30T21:29:35Z|00269|binding|INFO|Claiming lport 653ddbd9-e7e8-481f-a8cf-875e0ca3e139 for this chassis.
Sep 30 21:29:35 compute-0 nova_compute[192810]: 2025-09-30 21:29:35.004 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:35 compute-0 ovn_controller[94912]: 2025-09-30T21:29:35Z|00270|binding|INFO|653ddbd9-e7e8-481f-a8cf-875e0ca3e139: Claiming fa:16:3e:4e:25:7b 10.100.0.8
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:35.016 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4e:25:7b 10.100.0.8'], port_security=['fa:16:3e:4e:25:7b 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '326f57c1-51b5-4fdb-ac31-2d754684b734', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-52d4dc6e-c68c-4c9f-816b-010b74312e25', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ff84dd76a06841f3bdcc13f83407b7ba', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b0495b63-c487-4aa6-9461-7d07efab628b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0dfab36b-33d9-41d6-bc0e-727f1f2b3d16, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=653ddbd9-e7e8-481f-a8cf-875e0ca3e139) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:35.017 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 653ddbd9-e7e8-481f-a8cf-875e0ca3e139 in datapath 52d4dc6e-c68c-4c9f-816b-010b74312e25 bound to our chassis
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:35.018 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 52d4dc6e-c68c-4c9f-816b-010b74312e25
Sep 30 21:29:35 compute-0 nova_compute[192810]: 2025-09-30 21:29:35.022 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:35 compute-0 ovn_controller[94912]: 2025-09-30T21:29:35Z|00271|binding|INFO|Setting lport 653ddbd9-e7e8-481f-a8cf-875e0ca3e139 ovn-installed in OVS
Sep 30 21:29:35 compute-0 ovn_controller[94912]: 2025-09-30T21:29:35Z|00272|binding|INFO|Setting lport 653ddbd9-e7e8-481f-a8cf-875e0ca3e139 up in Southbound
Sep 30 21:29:35 compute-0 nova_compute[192810]: 2025-09-30 21:29:35.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:35 compute-0 nova_compute[192810]: 2025-09-30 21:29:35.026 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:35.031 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[b9165ace-fae4-4519-bcd7-b95167df48b5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:35.032 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap52d4dc6e-c1 in ovnmeta-52d4dc6e-c68c-4c9f-816b-010b74312e25 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:35.034 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap52d4dc6e-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:35.034 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[1409581b-5f8f-4c7b-84d2-ada69520c06d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:35.035 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[f5e9a3c7-b12e-40b5-97ab-e46d9cc579c9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:35.046 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[66fea460-9cc9-46ea-9cb6-4a0a00ce8a9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:35 compute-0 systemd-machined[152794]: New machine qemu-35-instance-0000004b.
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:35.062 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[420314b7-34e0-40d3-84ba-f1eb4fe92f7c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:35 compute-0 systemd[1]: Started Virtual Machine qemu-35-instance-0000004b.
Sep 30 21:29:35 compute-0 systemd-udevd[230942]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:35.093 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[db2babcb-e08f-46d2-a2db-a39462300a15]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:35 compute-0 NetworkManager[51733]: <info>  [1759267775.0967] device (tap653ddbd9-e7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:29:35 compute-0 NetworkManager[51733]: <info>  [1759267775.0978] device (tap653ddbd9-e7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:29:35 compute-0 NetworkManager[51733]: <info>  [1759267775.1014] manager: (tap52d4dc6e-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/127)
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:35.101 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[cbd53e01-8fc3-4ad9-9f7e-2b306635173c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:35.134 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[5842563d-4b73-40d5-9e22-e62825e0ba38]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:35.138 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[a34ba532-8778-4d91-84b9-976b0b305a6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:35 compute-0 NetworkManager[51733]: <info>  [1759267775.1585] device (tap52d4dc6e-c0): carrier: link connected
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:35.169 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[dd91b7fc-3a9d-4222-8344-4e0d158a9944]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:35.187 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[dae75bda-87ce-4745-9241-57b278b05311]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap52d4dc6e-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:aa:ea:53'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 79], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 447078, 'reachable_time': 33851, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230970, 'error': None, 'target': 'ovnmeta-52d4dc6e-c68c-4c9f-816b-010b74312e25', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:35.207 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[70186422-74cf-4de7-b584-b97477b7b2a7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feaa:ea53'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 447078, 'tstamp': 447078}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230971, 'error': None, 'target': 'ovnmeta-52d4dc6e-c68c-4c9f-816b-010b74312e25', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:35.224 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[ee152cef-e4b6-4f78-a86d-a3008cffcd89]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap52d4dc6e-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:aa:ea:53'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 79], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 447078, 'reachable_time': 33851, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230972, 'error': None, 'target': 'ovnmeta-52d4dc6e-c68c-4c9f-816b-010b74312e25', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:35.254 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[f031beb2-1148-496b-aee1-e1596cfb573b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:35.314 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[89017d5b-afcd-4061-8eab-775b068742a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:35.316 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap52d4dc6e-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:35.316 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:35.317 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap52d4dc6e-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:29:35 compute-0 NetworkManager[51733]: <info>  [1759267775.3202] manager: (tap52d4dc6e-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/128)
Sep 30 21:29:35 compute-0 kernel: tap52d4dc6e-c0: entered promiscuous mode
Sep 30 21:29:35 compute-0 nova_compute[192810]: 2025-09-30 21:29:35.321 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:35.330 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap52d4dc6e-c0, col_values=(('external_ids', {'iface-id': 'ac9dab42-dfa4-4e22-8469-858d37bf4155'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:29:35 compute-0 ovn_controller[94912]: 2025-09-30T21:29:35Z|00273|binding|INFO|Releasing lport ac9dab42-dfa4-4e22-8469-858d37bf4155 from this chassis (sb_readonly=0)
Sep 30 21:29:35 compute-0 nova_compute[192810]: 2025-09-30 21:29:35.333 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:35.338 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/52d4dc6e-c68c-4c9f-816b-010b74312e25.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/52d4dc6e-c68c-4c9f-816b-010b74312e25.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:35.339 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[1130e347-211e-44d7-8816-3f6106e04883]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:35.341 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-52d4dc6e-c68c-4c9f-816b-010b74312e25
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/52d4dc6e-c68c-4c9f-816b-010b74312e25.pid.haproxy
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID 52d4dc6e-c68c-4c9f-816b-010b74312e25
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:29:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:35.341 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-52d4dc6e-c68c-4c9f-816b-010b74312e25', 'env', 'PROCESS_TAG=haproxy-52d4dc6e-c68c-4c9f-816b-010b74312e25', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/52d4dc6e-c68c-4c9f-816b-010b74312e25.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:29:35 compute-0 nova_compute[192810]: 2025-09-30 21:29:35.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:35 compute-0 podman[231004]: 2025-09-30 21:29:35.692234083 +0000 UTC m=+0.056881619 container create 336b98c445aaf651001f7262435eeb7206266b3f6416e3c7a10bc921d2e67ad4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-52d4dc6e-c68c-4c9f-816b-010b74312e25, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:29:35 compute-0 systemd[1]: Started libpod-conmon-336b98c445aaf651001f7262435eeb7206266b3f6416e3c7a10bc921d2e67ad4.scope.
Sep 30 21:29:35 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:29:35 compute-0 podman[231004]: 2025-09-30 21:29:35.658784067 +0000 UTC m=+0.023431623 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:29:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52048684c3e20a0db3fe965d1319d21e0244c2767d946654c8fcdc1d66fc23e1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:29:35 compute-0 podman[231004]: 2025-09-30 21:29:35.771453235 +0000 UTC m=+0.136100791 container init 336b98c445aaf651001f7262435eeb7206266b3f6416e3c7a10bc921d2e67ad4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-52d4dc6e-c68c-4c9f-816b-010b74312e25, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923)
Sep 30 21:29:35 compute-0 podman[231004]: 2025-09-30 21:29:35.779473668 +0000 UTC m=+0.144121224 container start 336b98c445aaf651001f7262435eeb7206266b3f6416e3c7a10bc921d2e67ad4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-52d4dc6e-c68c-4c9f-816b-010b74312e25, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:29:35 compute-0 neutron-haproxy-ovnmeta-52d4dc6e-c68c-4c9f-816b-010b74312e25[231025]: [NOTICE]   (231030) : New worker (231032) forked
Sep 30 21:29:35 compute-0 neutron-haproxy-ovnmeta-52d4dc6e-c68c-4c9f-816b-010b74312e25[231025]: [NOTICE]   (231030) : Loading success.
Sep 30 21:29:35 compute-0 nova_compute[192810]: 2025-09-30 21:29:35.869 2 DEBUG nova.compute.manager [req-f50be279-06e1-4cde-83db-69d178bcabc1 req-3ed583e9-e24c-4e20-8307-b799da73a2ab dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Received event network-vif-plugged-653ddbd9-e7e8-481f-a8cf-875e0ca3e139 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:29:35 compute-0 nova_compute[192810]: 2025-09-30 21:29:35.869 2 DEBUG oslo_concurrency.lockutils [req-f50be279-06e1-4cde-83db-69d178bcabc1 req-3ed583e9-e24c-4e20-8307-b799da73a2ab dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "326f57c1-51b5-4fdb-ac31-2d754684b734-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:35 compute-0 nova_compute[192810]: 2025-09-30 21:29:35.869 2 DEBUG oslo_concurrency.lockutils [req-f50be279-06e1-4cde-83db-69d178bcabc1 req-3ed583e9-e24c-4e20-8307-b799da73a2ab dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "326f57c1-51b5-4fdb-ac31-2d754684b734-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:35 compute-0 nova_compute[192810]: 2025-09-30 21:29:35.869 2 DEBUG oslo_concurrency.lockutils [req-f50be279-06e1-4cde-83db-69d178bcabc1 req-3ed583e9-e24c-4e20-8307-b799da73a2ab dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "326f57c1-51b5-4fdb-ac31-2d754684b734-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:35 compute-0 nova_compute[192810]: 2025-09-30 21:29:35.870 2 DEBUG nova.compute.manager [req-f50be279-06e1-4cde-83db-69d178bcabc1 req-3ed583e9-e24c-4e20-8307-b799da73a2ab dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Processing event network-vif-plugged-653ddbd9-e7e8-481f-a8cf-875e0ca3e139 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:29:36 compute-0 nova_compute[192810]: 2025-09-30 21:29:36.168 2 DEBUG nova.compute.manager [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:29:36 compute-0 nova_compute[192810]: 2025-09-30 21:29:36.169 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267776.1683083, 326f57c1-51b5-4fdb-ac31-2d754684b734 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:29:36 compute-0 nova_compute[192810]: 2025-09-30 21:29:36.169 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] VM Started (Lifecycle Event)
Sep 30 21:29:36 compute-0 nova_compute[192810]: 2025-09-30 21:29:36.176 2 DEBUG nova.virt.libvirt.driver [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:29:36 compute-0 nova_compute[192810]: 2025-09-30 21:29:36.180 2 INFO nova.virt.libvirt.driver [-] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Instance spawned successfully.
Sep 30 21:29:36 compute-0 nova_compute[192810]: 2025-09-30 21:29:36.181 2 DEBUG nova.virt.libvirt.driver [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:29:36 compute-0 nova_compute[192810]: 2025-09-30 21:29:36.187 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:29:36 compute-0 nova_compute[192810]: 2025-09-30 21:29:36.189 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:29:36 compute-0 nova_compute[192810]: 2025-09-30 21:29:36.196 2 DEBUG nova.virt.libvirt.driver [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:29:36 compute-0 nova_compute[192810]: 2025-09-30 21:29:36.197 2 DEBUG nova.virt.libvirt.driver [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:29:36 compute-0 nova_compute[192810]: 2025-09-30 21:29:36.197 2 DEBUG nova.virt.libvirt.driver [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:29:36 compute-0 nova_compute[192810]: 2025-09-30 21:29:36.197 2 DEBUG nova.virt.libvirt.driver [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:29:36 compute-0 nova_compute[192810]: 2025-09-30 21:29:36.198 2 DEBUG nova.virt.libvirt.driver [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:29:36 compute-0 nova_compute[192810]: 2025-09-30 21:29:36.198 2 DEBUG nova.virt.libvirt.driver [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:29:36 compute-0 nova_compute[192810]: 2025-09-30 21:29:36.204 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:29:36 compute-0 nova_compute[192810]: 2025-09-30 21:29:36.205 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267776.1684792, 326f57c1-51b5-4fdb-ac31-2d754684b734 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:29:36 compute-0 nova_compute[192810]: 2025-09-30 21:29:36.205 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] VM Paused (Lifecycle Event)
Sep 30 21:29:36 compute-0 nova_compute[192810]: 2025-09-30 21:29:36.231 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:29:36 compute-0 nova_compute[192810]: 2025-09-30 21:29:36.235 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267776.1720333, 326f57c1-51b5-4fdb-ac31-2d754684b734 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:29:36 compute-0 nova_compute[192810]: 2025-09-30 21:29:36.235 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] VM Resumed (Lifecycle Event)
Sep 30 21:29:36 compute-0 nova_compute[192810]: 2025-09-30 21:29:36.249 2 INFO nova.compute.manager [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Took 6.31 seconds to spawn the instance on the hypervisor.
Sep 30 21:29:36 compute-0 nova_compute[192810]: 2025-09-30 21:29:36.249 2 DEBUG nova.compute.manager [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:29:36 compute-0 nova_compute[192810]: 2025-09-30 21:29:36.255 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:29:36 compute-0 nova_compute[192810]: 2025-09-30 21:29:36.258 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:29:36 compute-0 nova_compute[192810]: 2025-09-30 21:29:36.280 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:29:36 compute-0 nova_compute[192810]: 2025-09-30 21:29:36.323 2 INFO nova.compute.manager [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Took 7.09 seconds to build instance.
Sep 30 21:29:36 compute-0 nova_compute[192810]: 2025-09-30 21:29:36.339 2 DEBUG oslo_concurrency.lockutils [None req-60993228-7952-4dc9-8b5f-108dfae573ad e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Lock "326f57c1-51b5-4fdb-ac31-2d754684b734" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.214s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:36 compute-0 nova_compute[192810]: 2025-09-30 21:29:36.431 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:29:36 compute-0 nova_compute[192810]: 2025-09-30 21:29:36.452 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Triggering sync for uuid 3d704b04-4399-45ea-bc89-6cf74b4dec72 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Sep 30 21:29:36 compute-0 nova_compute[192810]: 2025-09-30 21:29:36.453 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Triggering sync for uuid 326f57c1-51b5-4fdb-ac31-2d754684b734 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Sep 30 21:29:36 compute-0 nova_compute[192810]: 2025-09-30 21:29:36.453 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "3d704b04-4399-45ea-bc89-6cf74b4dec72" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:36 compute-0 nova_compute[192810]: 2025-09-30 21:29:36.453 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "3d704b04-4399-45ea-bc89-6cf74b4dec72" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:36 compute-0 nova_compute[192810]: 2025-09-30 21:29:36.454 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "326f57c1-51b5-4fdb-ac31-2d754684b734" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:36 compute-0 nova_compute[192810]: 2025-09-30 21:29:36.454 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "326f57c1-51b5-4fdb-ac31-2d754684b734" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:36 compute-0 nova_compute[192810]: 2025-09-30 21:29:36.484 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "3d704b04-4399-45ea-bc89-6cf74b4dec72" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.030s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:36 compute-0 nova_compute[192810]: 2025-09-30 21:29:36.486 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "326f57c1-51b5-4fdb-ac31-2d754684b734" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.032s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:36 compute-0 nova_compute[192810]: 2025-09-30 21:29:36.660 2 DEBUG nova.network.neutron [req-d9221796-41fa-4296-b031-7083643cbdae req-8c882806-4074-42b1-a030-864ec89adb59 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Updated VIF entry in instance network info cache for port 653ddbd9-e7e8-481f-a8cf-875e0ca3e139. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:29:36 compute-0 nova_compute[192810]: 2025-09-30 21:29:36.660 2 DEBUG nova.network.neutron [req-d9221796-41fa-4296-b031-7083643cbdae req-8c882806-4074-42b1-a030-864ec89adb59 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Updating instance_info_cache with network_info: [{"id": "653ddbd9-e7e8-481f-a8cf-875e0ca3e139", "address": "fa:16:3e:4e:25:7b", "network": {"id": "52d4dc6e-c68c-4c9f-816b-010b74312e25", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1072711906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ff84dd76a06841f3bdcc13f83407b7ba", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap653ddbd9-e7", "ovs_interfaceid": "653ddbd9-e7e8-481f-a8cf-875e0ca3e139", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:29:36 compute-0 nova_compute[192810]: 2025-09-30 21:29:36.694 2 DEBUG oslo_concurrency.lockutils [req-d9221796-41fa-4296-b031-7083643cbdae req-8c882806-4074-42b1-a030-864ec89adb59 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-326f57c1-51b5-4fdb-ac31-2d754684b734" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:29:37 compute-0 nova_compute[192810]: 2025-09-30 21:29:37.163 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:37 compute-0 nova_compute[192810]: 2025-09-30 21:29:37.965 2 DEBUG nova.compute.manager [req-454f0e56-fdd4-4dbd-a522-5ea0746fd970 req-82528f37-a816-494b-926f-f520f4cbbfb8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Received event network-vif-plugged-653ddbd9-e7e8-481f-a8cf-875e0ca3e139 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:29:37 compute-0 nova_compute[192810]: 2025-09-30 21:29:37.965 2 DEBUG oslo_concurrency.lockutils [req-454f0e56-fdd4-4dbd-a522-5ea0746fd970 req-82528f37-a816-494b-926f-f520f4cbbfb8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "326f57c1-51b5-4fdb-ac31-2d754684b734-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:37 compute-0 nova_compute[192810]: 2025-09-30 21:29:37.966 2 DEBUG oslo_concurrency.lockutils [req-454f0e56-fdd4-4dbd-a522-5ea0746fd970 req-82528f37-a816-494b-926f-f520f4cbbfb8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "326f57c1-51b5-4fdb-ac31-2d754684b734-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:37 compute-0 nova_compute[192810]: 2025-09-30 21:29:37.967 2 DEBUG oslo_concurrency.lockutils [req-454f0e56-fdd4-4dbd-a522-5ea0746fd970 req-82528f37-a816-494b-926f-f520f4cbbfb8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "326f57c1-51b5-4fdb-ac31-2d754684b734-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:37 compute-0 nova_compute[192810]: 2025-09-30 21:29:37.967 2 DEBUG nova.compute.manager [req-454f0e56-fdd4-4dbd-a522-5ea0746fd970 req-82528f37-a816-494b-926f-f520f4cbbfb8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] No waiting events found dispatching network-vif-plugged-653ddbd9-e7e8-481f-a8cf-875e0ca3e139 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:29:37 compute-0 nova_compute[192810]: 2025-09-30 21:29:37.968 2 WARNING nova.compute.manager [req-454f0e56-fdd4-4dbd-a522-5ea0746fd970 req-82528f37-a816-494b-926f-f520f4cbbfb8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Received unexpected event network-vif-plugged-653ddbd9-e7e8-481f-a8cf-875e0ca3e139 for instance with vm_state active and task_state None.
Sep 30 21:29:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:38.734 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:38.735 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:38.735 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:39 compute-0 nova_compute[192810]: 2025-09-30 21:29:39.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:39 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:39.756 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:29:39 compute-0 nova_compute[192810]: 2025-09-30 21:29:39.982 2 DEBUG nova.compute.manager [req-1095a6ba-5fff-4439-95c4-6552c6458c08 req-7a6e0a65-2e0d-4058-8f7e-eb5436ea26dd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Received event network-changed-653ddbd9-e7e8-481f-a8cf-875e0ca3e139 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:29:39 compute-0 nova_compute[192810]: 2025-09-30 21:29:39.982 2 DEBUG nova.compute.manager [req-1095a6ba-5fff-4439-95c4-6552c6458c08 req-7a6e0a65-2e0d-4058-8f7e-eb5436ea26dd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Refreshing instance network info cache due to event network-changed-653ddbd9-e7e8-481f-a8cf-875e0ca3e139. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:29:39 compute-0 nova_compute[192810]: 2025-09-30 21:29:39.983 2 DEBUG oslo_concurrency.lockutils [req-1095a6ba-5fff-4439-95c4-6552c6458c08 req-7a6e0a65-2e0d-4058-8f7e-eb5436ea26dd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-326f57c1-51b5-4fdb-ac31-2d754684b734" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:29:39 compute-0 nova_compute[192810]: 2025-09-30 21:29:39.983 2 DEBUG oslo_concurrency.lockutils [req-1095a6ba-5fff-4439-95c4-6552c6458c08 req-7a6e0a65-2e0d-4058-8f7e-eb5436ea26dd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-326f57c1-51b5-4fdb-ac31-2d754684b734" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:29:39 compute-0 nova_compute[192810]: 2025-09-30 21:29:39.983 2 DEBUG nova.network.neutron [req-1095a6ba-5fff-4439-95c4-6552c6458c08 req-7a6e0a65-2e0d-4058-8f7e-eb5436ea26dd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Refreshing network info cache for port 653ddbd9-e7e8-481f-a8cf-875e0ca3e139 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:29:40 compute-0 nova_compute[192810]: 2025-09-30 21:29:40.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:29:40 compute-0 nova_compute[192810]: 2025-09-30 21:29:40.787 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Sep 30 21:29:41 compute-0 nova_compute[192810]: 2025-09-30 21:29:41.688 2 DEBUG oslo_concurrency.lockutils [None req-a5c6d55b-bade-4b62-a880-5d0b8ab2123e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "refresh_cache-3d704b04-4399-45ea-bc89-6cf74b4dec72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:29:41 compute-0 nova_compute[192810]: 2025-09-30 21:29:41.689 2 DEBUG oslo_concurrency.lockutils [None req-a5c6d55b-bade-4b62-a880-5d0b8ab2123e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquired lock "refresh_cache-3d704b04-4399-45ea-bc89-6cf74b4dec72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:29:41 compute-0 nova_compute[192810]: 2025-09-30 21:29:41.689 2 DEBUG nova.network.neutron [None req-a5c6d55b-bade-4b62-a880-5d0b8ab2123e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:29:41 compute-0 nova_compute[192810]: 2025-09-30 21:29:41.733 2 DEBUG nova.network.neutron [req-1095a6ba-5fff-4439-95c4-6552c6458c08 req-7a6e0a65-2e0d-4058-8f7e-eb5436ea26dd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Updated VIF entry in instance network info cache for port 653ddbd9-e7e8-481f-a8cf-875e0ca3e139. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:29:41 compute-0 nova_compute[192810]: 2025-09-30 21:29:41.733 2 DEBUG nova.network.neutron [req-1095a6ba-5fff-4439-95c4-6552c6458c08 req-7a6e0a65-2e0d-4058-8f7e-eb5436ea26dd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Updating instance_info_cache with network_info: [{"id": "653ddbd9-e7e8-481f-a8cf-875e0ca3e139", "address": "fa:16:3e:4e:25:7b", "network": {"id": "52d4dc6e-c68c-4c9f-816b-010b74312e25", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1072711906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ff84dd76a06841f3bdcc13f83407b7ba", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap653ddbd9-e7", "ovs_interfaceid": "653ddbd9-e7e8-481f-a8cf-875e0ca3e139", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:29:41 compute-0 nova_compute[192810]: 2025-09-30 21:29:41.757 2 DEBUG oslo_concurrency.lockutils [req-1095a6ba-5fff-4439-95c4-6552c6458c08 req-7a6e0a65-2e0d-4058-8f7e-eb5436ea26dd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-326f57c1-51b5-4fdb-ac31-2d754684b734" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:29:41 compute-0 nova_compute[192810]: 2025-09-30 21:29:41.800 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:29:41 compute-0 sshd-session[231041]: Invalid user wd from 45.81.23.80 port 60260
Sep 30 21:29:41 compute-0 sshd-session[231041]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:29:41 compute-0 sshd-session[231041]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=45.81.23.80
Sep 30 21:29:42 compute-0 nova_compute[192810]: 2025-09-30 21:29:42.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:43 compute-0 nova_compute[192810]: 2025-09-30 21:29:43.760 2 DEBUG nova.network.neutron [None req-a5c6d55b-bade-4b62-a880-5d0b8ab2123e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Updating instance_info_cache with network_info: [{"id": "69480967-0a46-4a33-95ec-9633390ec042", "address": "fa:16:3e:d1:d8:eb", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69480967-0a", "ovs_interfaceid": "69480967-0a46-4a33-95ec-9633390ec042", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:29:43 compute-0 nova_compute[192810]: 2025-09-30 21:29:43.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:29:43 compute-0 nova_compute[192810]: 2025-09-30 21:29:43.792 2 DEBUG oslo_concurrency.lockutils [None req-a5c6d55b-bade-4b62-a880-5d0b8ab2123e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Releasing lock "refresh_cache-3d704b04-4399-45ea-bc89-6cf74b4dec72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:29:43 compute-0 sshd-session[231041]: Failed password for invalid user wd from 45.81.23.80 port 60260 ssh2
Sep 30 21:29:43 compute-0 nova_compute[192810]: 2025-09-30 21:29:43.945 2 DEBUG nova.virt.libvirt.driver [None req-a5c6d55b-bade-4b62-a880-5d0b8ab2123e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Sep 30 21:29:43 compute-0 nova_compute[192810]: 2025-09-30 21:29:43.946 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-a5c6d55b-bade-4b62-a880-5d0b8ab2123e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Creating file /var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72/10f946080a5d4d298936b24f6930a1d4.tmp on remote host 192.168.122.102 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Sep 30 21:29:43 compute-0 nova_compute[192810]: 2025-09-30 21:29:43.946 2 DEBUG oslo_concurrency.processutils [None req-a5c6d55b-bade-4b62-a880-5d0b8ab2123e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72/10f946080a5d4d298936b24f6930a1d4.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:29:44 compute-0 nova_compute[192810]: 2025-09-30 21:29:44.111 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:44 compute-0 podman[231045]: 2025-09-30 21:29:44.319203206 +0000 UTC m=+0.059933156 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Sep 30 21:29:44 compute-0 podman[231044]: 2025-09-30 21:29:44.342937326 +0000 UTC m=+0.083835490 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2)
Sep 30 21:29:44 compute-0 nova_compute[192810]: 2025-09-30 21:29:44.457 2 DEBUG oslo_concurrency.processutils [None req-a5c6d55b-bade-4b62-a880-5d0b8ab2123e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72/10f946080a5d4d298936b24f6930a1d4.tmp" returned: 1 in 0.511s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:29:44 compute-0 nova_compute[192810]: 2025-09-30 21:29:44.457 2 DEBUG oslo_concurrency.processutils [None req-a5c6d55b-bade-4b62-a880-5d0b8ab2123e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] 'ssh -o BatchMode=yes 192.168.122.102 touch /var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72/10f946080a5d4d298936b24f6930a1d4.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Sep 30 21:29:44 compute-0 nova_compute[192810]: 2025-09-30 21:29:44.457 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-a5c6d55b-bade-4b62-a880-5d0b8ab2123e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Creating directory /var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72 on remote host 192.168.122.102 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Sep 30 21:29:44 compute-0 nova_compute[192810]: 2025-09-30 21:29:44.458 2 DEBUG oslo_concurrency.processutils [None req-a5c6d55b-bade-4b62-a880-5d0b8ab2123e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:29:44 compute-0 nova_compute[192810]: 2025-09-30 21:29:44.667 2 DEBUG oslo_concurrency.processutils [None req-a5c6d55b-bade-4b62-a880-5d0b8ab2123e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.102 mkdir -p /var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72" returned: 0 in 0.209s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:29:44 compute-0 nova_compute[192810]: 2025-09-30 21:29:44.672 2 DEBUG nova.virt.libvirt.driver [None req-a5c6d55b-bade-4b62-a880-5d0b8ab2123e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Sep 30 21:29:44 compute-0 nova_compute[192810]: 2025-09-30 21:29:44.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:29:44 compute-0 nova_compute[192810]: 2025-09-30 21:29:44.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:29:45 compute-0 sshd-session[231041]: Received disconnect from 45.81.23.80 port 60260:11: Bye Bye [preauth]
Sep 30 21:29:45 compute-0 sshd-session[231041]: Disconnected from invalid user wd 45.81.23.80 port 60260 [preauth]
Sep 30 21:29:45 compute-0 nova_compute[192810]: 2025-09-30 21:29:45.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:29:46 compute-0 nova_compute[192810]: 2025-09-30 21:29:46.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:29:46 compute-0 nova_compute[192810]: 2025-09-30 21:29:46.812 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:46 compute-0 nova_compute[192810]: 2025-09-30 21:29:46.813 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:46 compute-0 nova_compute[192810]: 2025-09-30 21:29:46.813 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:46 compute-0 nova_compute[192810]: 2025-09-30 21:29:46.814 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:29:46 compute-0 kernel: tap69480967-0a (unregistering): left promiscuous mode
Sep 30 21:29:46 compute-0 NetworkManager[51733]: <info>  [1759267786.8414] device (tap69480967-0a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:29:46 compute-0 ovn_controller[94912]: 2025-09-30T21:29:46Z|00274|binding|INFO|Releasing lport 69480967-0a46-4a33-95ec-9633390ec042 from this chassis (sb_readonly=0)
Sep 30 21:29:46 compute-0 ovn_controller[94912]: 2025-09-30T21:29:46Z|00275|binding|INFO|Setting lport 69480967-0a46-4a33-95ec-9633390ec042 down in Southbound
Sep 30 21:29:46 compute-0 ovn_controller[94912]: 2025-09-30T21:29:46Z|00276|binding|INFO|Removing iface tap69480967-0a ovn-installed in OVS
Sep 30 21:29:46 compute-0 nova_compute[192810]: 2025-09-30 21:29:46.850 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:46.860 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d1:d8:eb 10.100.0.14'], port_security=['fa:16:3e:d1:d8:eb 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '3d704b04-4399-45ea-bc89-6cf74b4dec72', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9692dd1-658f-4c07-943c-6bc662046dc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2af578a858a44374a3dc027bbf7c69f2', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5518a7d3-faed-4617-b7cb-cfdf96df8ee0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.177'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a290e6b7-09a2-435f-ae19-df4a5ccfc2d7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=69480967-0a46-4a33-95ec-9633390ec042) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:29:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:46.861 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 69480967-0a46-4a33-95ec-9633390ec042 in datapath f9692dd1-658f-4c07-943c-6bc662046dc4 unbound from our chassis
Sep 30 21:29:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:46.863 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f9692dd1-658f-4c07-943c-6bc662046dc4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:29:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:46.864 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[b061f191-6ff3-4039-b232-bc8609269e70]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:46.865 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 namespace which is not needed anymore
Sep 30 21:29:46 compute-0 nova_compute[192810]: 2025-09-30 21:29:46.868 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:46 compute-0 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d00000049.scope: Deactivated successfully.
Sep 30 21:29:46 compute-0 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d00000049.scope: Consumed 13.575s CPU time.
Sep 30 21:29:46 compute-0 systemd-machined[152794]: Machine qemu-34-instance-00000049 terminated.
Sep 30 21:29:46 compute-0 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[230764]: [NOTICE]   (230768) : haproxy version is 2.8.14-c23fe91
Sep 30 21:29:46 compute-0 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[230764]: [NOTICE]   (230768) : path to executable is /usr/sbin/haproxy
Sep 30 21:29:46 compute-0 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[230764]: [WARNING]  (230768) : Exiting Master process...
Sep 30 21:29:46 compute-0 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[230764]: [ALERT]    (230768) : Current worker (230770) exited with code 143 (Terminated)
Sep 30 21:29:46 compute-0 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[230764]: [WARNING]  (230768) : All workers exited. Exiting... (0)
Sep 30 21:29:46 compute-0 systemd[1]: libpod-58b50800ea9b6bd3da8e9d8fbcc6f94989e960832cb14553d25965bb922647cb.scope: Deactivated successfully.
Sep 30 21:29:46 compute-0 podman[231116]: 2025-09-30 21:29:46.996601595 +0000 UTC m=+0.043775527 container died 58b50800ea9b6bd3da8e9d8fbcc6f94989e960832cb14553d25965bb922647cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Sep 30 21:29:47 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-58b50800ea9b6bd3da8e9d8fbcc6f94989e960832cb14553d25965bb922647cb-userdata-shm.mount: Deactivated successfully.
Sep 30 21:29:47 compute-0 systemd[1]: var-lib-containers-storage-overlay-6c2f8134a762b8d15a3a37f7d84c752b6011dfba38259f8f5fa12367c8248011-merged.mount: Deactivated successfully.
Sep 30 21:29:47 compute-0 podman[231116]: 2025-09-30 21:29:47.045897302 +0000 UTC m=+0.093071234 container cleanup 58b50800ea9b6bd3da8e9d8fbcc6f94989e960832cb14553d25965bb922647cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2)
Sep 30 21:29:47 compute-0 systemd[1]: libpod-conmon-58b50800ea9b6bd3da8e9d8fbcc6f94989e960832cb14553d25965bb922647cb.scope: Deactivated successfully.
Sep 30 21:29:47 compute-0 nova_compute[192810]: 2025-09-30 21:29:47.076 2 DEBUG nova.compute.manager [req-4bbf5738-55dc-4c69-9c92-5dde7368e9e4 req-1398f919-069c-4956-a5ab-fb85ed57aede dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Received event network-vif-unplugged-69480967-0a46-4a33-95ec-9633390ec042 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:29:47 compute-0 nova_compute[192810]: 2025-09-30 21:29:47.077 2 DEBUG oslo_concurrency.lockutils [req-4bbf5738-55dc-4c69-9c92-5dde7368e9e4 req-1398f919-069c-4956-a5ab-fb85ed57aede dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3d704b04-4399-45ea-bc89-6cf74b4dec72-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:47 compute-0 nova_compute[192810]: 2025-09-30 21:29:47.077 2 DEBUG oslo_concurrency.lockutils [req-4bbf5738-55dc-4c69-9c92-5dde7368e9e4 req-1398f919-069c-4956-a5ab-fb85ed57aede dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3d704b04-4399-45ea-bc89-6cf74b4dec72-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:47 compute-0 nova_compute[192810]: 2025-09-30 21:29:47.078 2 DEBUG oslo_concurrency.lockutils [req-4bbf5738-55dc-4c69-9c92-5dde7368e9e4 req-1398f919-069c-4956-a5ab-fb85ed57aede dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3d704b04-4399-45ea-bc89-6cf74b4dec72-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:47 compute-0 nova_compute[192810]: 2025-09-30 21:29:47.078 2 DEBUG nova.compute.manager [req-4bbf5738-55dc-4c69-9c92-5dde7368e9e4 req-1398f919-069c-4956-a5ab-fb85ed57aede dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] No waiting events found dispatching network-vif-unplugged-69480967-0a46-4a33-95ec-9633390ec042 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:29:47 compute-0 nova_compute[192810]: 2025-09-30 21:29:47.078 2 WARNING nova.compute.manager [req-4bbf5738-55dc-4c69-9c92-5dde7368e9e4 req-1398f919-069c-4956-a5ab-fb85ed57aede dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Received unexpected event network-vif-unplugged-69480967-0a46-4a33-95ec-9633390ec042 for instance with vm_state active and task_state resize_migrating.
Sep 30 21:29:47 compute-0 nova_compute[192810]: 2025-09-30 21:29:47.084 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:29:47 compute-0 podman[231129]: 2025-09-30 21:29:47.106834312 +0000 UTC m=+0.090027567 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS)
Sep 30 21:29:47 compute-0 podman[231158]: 2025-09-30 21:29:47.128795707 +0000 UTC m=+0.059907415 container remove 58b50800ea9b6bd3da8e9d8fbcc6f94989e960832cb14553d25965bb922647cb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20250923)
Sep 30 21:29:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:47.144 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e12ba3d2-d14d-4ad7-a77a-e45534bd0dd0]: (4, ('Tue Sep 30 09:29:46 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 (58b50800ea9b6bd3da8e9d8fbcc6f94989e960832cb14553d25965bb922647cb)\n58b50800ea9b6bd3da8e9d8fbcc6f94989e960832cb14553d25965bb922647cb\nTue Sep 30 09:29:47 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 (58b50800ea9b6bd3da8e9d8fbcc6f94989e960832cb14553d25965bb922647cb)\n58b50800ea9b6bd3da8e9d8fbcc6f94989e960832cb14553d25965bb922647cb\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:47.146 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[b0f5c690-228e-4188-8318-d9de1c4ee79c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:47.147 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9692dd1-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:29:47 compute-0 nova_compute[192810]: 2025-09-30 21:29:47.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:47 compute-0 kernel: tapf9692dd1-60: left promiscuous mode
Sep 30 21:29:47 compute-0 nova_compute[192810]: 2025-09-30 21:29:47.151 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:29:47 compute-0 nova_compute[192810]: 2025-09-30 21:29:47.152 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:29:47 compute-0 nova_compute[192810]: 2025-09-30 21:29:47.169 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:47.175 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[1751d134-243a-4b83-8297-fc1c3b7d56f7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:47.206 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[435d9cc2-01f2-4904-9eec-14a12d17b13a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:47.207 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[b50acf43-30e0-43cc-8255-e0dbd89b7b93]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:47 compute-0 nova_compute[192810]: 2025-09-30 21:29:47.209 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:29:47 compute-0 nova_compute[192810]: 2025-09-30 21:29:47.214 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/326f57c1-51b5-4fdb-ac31-2d754684b734/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:29:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:47.223 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[c1408da0-e1a0-4896-be27-6d165935b079]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 445325, 'reachable_time': 32427, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231204, 'error': None, 'target': 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:47.225 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:29:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:47.226 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[280111c4-a27f-4a4b-8950-d8eea8fa425c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:47 compute-0 systemd[1]: run-netns-ovnmeta\x2df9692dd1\x2d658f\x2d4c07\x2d943c\x2d6bc662046dc4.mount: Deactivated successfully.
Sep 30 21:29:47 compute-0 nova_compute[192810]: 2025-09-30 21:29:47.286 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/326f57c1-51b5-4fdb-ac31-2d754684b734/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:29:47 compute-0 nova_compute[192810]: 2025-09-30 21:29:47.287 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/326f57c1-51b5-4fdb-ac31-2d754684b734/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:29:47 compute-0 nova_compute[192810]: 2025-09-30 21:29:47.365 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/326f57c1-51b5-4fdb-ac31-2d754684b734/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:29:47 compute-0 nova_compute[192810]: 2025-09-30 21:29:47.527 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:29:47 compute-0 nova_compute[192810]: 2025-09-30 21:29:47.529 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5431MB free_disk=73.28860473632812GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:29:47 compute-0 nova_compute[192810]: 2025-09-30 21:29:47.529 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:47 compute-0 nova_compute[192810]: 2025-09-30 21:29:47.530 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:47 compute-0 nova_compute[192810]: 2025-09-30 21:29:47.584 2 INFO nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Updating resource usage from migration 2b7e9cbb-3537-441b-ad89-6d7f39e2f29a
Sep 30 21:29:47 compute-0 nova_compute[192810]: 2025-09-30 21:29:47.607 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Instance 326f57c1-51b5-4fdb-ac31-2d754684b734 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:29:47 compute-0 nova_compute[192810]: 2025-09-30 21:29:47.607 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Migration 2b7e9cbb-3537-441b-ad89-6d7f39e2f29a is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Sep 30 21:29:47 compute-0 nova_compute[192810]: 2025-09-30 21:29:47.607 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:29:47 compute-0 nova_compute[192810]: 2025-09-30 21:29:47.608 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:29:47 compute-0 nova_compute[192810]: 2025-09-30 21:29:47.688 2 INFO nova.virt.libvirt.driver [None req-a5c6d55b-bade-4b62-a880-5d0b8ab2123e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Instance shutdown successfully after 3 seconds.
Sep 30 21:29:47 compute-0 nova_compute[192810]: 2025-09-30 21:29:47.693 2 INFO nova.virt.libvirt.driver [-] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Instance destroyed successfully.
Sep 30 21:29:47 compute-0 nova_compute[192810]: 2025-09-30 21:29:47.694 2 DEBUG nova.virt.libvirt.vif [None req-a5c6d55b-bade-4b62-a880-5d0b8ab2123e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:29:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-22148059',display_name='tempest-ServerActionsTestJSON-server-22148059',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-22148059',id=73,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBADj3eZ6JfWn1sD61WsBF2lWMwpE7XLjMHeX5D51ZTuvFj593BvRFZjp02OuEwvTUJEH79lLLcgJlYP5+6PE14q16iBV+2oZvdFvdVW4CAPM3S7plfjHeuzOdoE0D4V+KA==',key_name='tempest-keypair-557988176',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:29:18Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='2af578a858a44374a3dc027bbf7c69f2',ramdisk_id='',reservation_id='r-617x3lpp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1867667353',owner_user_name='tempest-ServerActionsTestJSON-1867667353-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:29:41Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='22ed16bd4ffe4ef8bb21968a857066a1',uuid=3d704b04-4399-45ea-bc89-6cf74b4dec72,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "69480967-0a46-4a33-95ec-9633390ec042", "address": "fa:16:3e:d1:d8:eb", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-253023348-network", "vif_mac": "fa:16:3e:d1:d8:eb"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69480967-0a", "ovs_interfaceid": "69480967-0a46-4a33-95ec-9633390ec042", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:29:47 compute-0 nova_compute[192810]: 2025-09-30 21:29:47.694 2 DEBUG nova.network.os_vif_util [None req-a5c6d55b-bade-4b62-a880-5d0b8ab2123e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converting VIF {"id": "69480967-0a46-4a33-95ec-9633390ec042", "address": "fa:16:3e:d1:d8:eb", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-253023348-network", "vif_mac": "fa:16:3e:d1:d8:eb"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69480967-0a", "ovs_interfaceid": "69480967-0a46-4a33-95ec-9633390ec042", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:29:47 compute-0 nova_compute[192810]: 2025-09-30 21:29:47.694 2 DEBUG nova.network.os_vif_util [None req-a5c6d55b-bade-4b62-a880-5d0b8ab2123e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d1:d8:eb,bridge_name='br-int',has_traffic_filtering=True,id=69480967-0a46-4a33-95ec-9633390ec042,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69480967-0a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:29:47 compute-0 nova_compute[192810]: 2025-09-30 21:29:47.695 2 DEBUG os_vif [None req-a5c6d55b-bade-4b62-a880-5d0b8ab2123e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d1:d8:eb,bridge_name='br-int',has_traffic_filtering=True,id=69480967-0a46-4a33-95ec-9633390ec042,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69480967-0a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:29:47 compute-0 nova_compute[192810]: 2025-09-30 21:29:47.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:47 compute-0 nova_compute[192810]: 2025-09-30 21:29:47.696 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap69480967-0a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:29:47 compute-0 nova_compute[192810]: 2025-09-30 21:29:47.701 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:29:47 compute-0 nova_compute[192810]: 2025-09-30 21:29:47.703 2 INFO os_vif [None req-a5c6d55b-bade-4b62-a880-5d0b8ab2123e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d1:d8:eb,bridge_name='br-int',has_traffic_filtering=True,id=69480967-0a46-4a33-95ec-9633390ec042,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69480967-0a')
Sep 30 21:29:47 compute-0 nova_compute[192810]: 2025-09-30 21:29:47.706 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:29:47 compute-0 nova_compute[192810]: 2025-09-30 21:29:47.708 2 DEBUG oslo_concurrency.processutils [None req-a5c6d55b-bade-4b62-a880-5d0b8ab2123e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:29:47 compute-0 nova_compute[192810]: 2025-09-30 21:29:47.725 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:29:47 compute-0 nova_compute[192810]: 2025-09-30 21:29:47.745 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:29:47 compute-0 nova_compute[192810]: 2025-09-30 21:29:47.746 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.216s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:47 compute-0 nova_compute[192810]: 2025-09-30 21:29:47.764 2 DEBUG oslo_concurrency.processutils [None req-a5c6d55b-bade-4b62-a880-5d0b8ab2123e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:29:47 compute-0 nova_compute[192810]: 2025-09-30 21:29:47.765 2 DEBUG oslo_concurrency.processutils [None req-a5c6d55b-bade-4b62-a880-5d0b8ab2123e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:29:47 compute-0 nova_compute[192810]: 2025-09-30 21:29:47.840 2 DEBUG oslo_concurrency.processutils [None req-a5c6d55b-bade-4b62-a880-5d0b8ab2123e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:29:47 compute-0 nova_compute[192810]: 2025-09-30 21:29:47.842 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-a5c6d55b-bade-4b62-a880-5d0b8ab2123e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Copying file /var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72_resize/disk to 192.168.122.102:/var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Sep 30 21:29:47 compute-0 nova_compute[192810]: 2025-09-30 21:29:47.842 2 DEBUG oslo_concurrency.processutils [None req-a5c6d55b-bade-4b62-a880-5d0b8ab2123e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72_resize/disk 192.168.122.102:/var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:29:47 compute-0 ovn_controller[94912]: 2025-09-30T21:29:47Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:4e:25:7b 10.100.0.8
Sep 30 21:29:47 compute-0 ovn_controller[94912]: 2025-09-30T21:29:47Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:4e:25:7b 10.100.0.8
Sep 30 21:29:48 compute-0 nova_compute[192810]: 2025-09-30 21:29:48.536 2 DEBUG oslo_concurrency.processutils [None req-a5c6d55b-bade-4b62-a880-5d0b8ab2123e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "scp -r /var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72_resize/disk 192.168.122.102:/var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72/disk" returned: 0 in 0.694s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:29:48 compute-0 nova_compute[192810]: 2025-09-30 21:29:48.538 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-a5c6d55b-bade-4b62-a880-5d0b8ab2123e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Copying file /var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72_resize/disk.config to 192.168.122.102:/var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Sep 30 21:29:48 compute-0 nova_compute[192810]: 2025-09-30 21:29:48.538 2 DEBUG oslo_concurrency.processutils [None req-a5c6d55b-bade-4b62-a880-5d0b8ab2123e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72_resize/disk.config 192.168.122.102:/var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:29:48 compute-0 nova_compute[192810]: 2025-09-30 21:29:48.742 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:29:48 compute-0 nova_compute[192810]: 2025-09-30 21:29:48.803 2 DEBUG oslo_concurrency.processutils [None req-a5c6d55b-bade-4b62-a880-5d0b8ab2123e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "scp -C -r /var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72_resize/disk.config 192.168.122.102:/var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72/disk.config" returned: 0 in 0.265s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:29:48 compute-0 nova_compute[192810]: 2025-09-30 21:29:48.804 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-a5c6d55b-bade-4b62-a880-5d0b8ab2123e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Copying file /var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72_resize/disk.info to 192.168.122.102:/var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Sep 30 21:29:48 compute-0 nova_compute[192810]: 2025-09-30 21:29:48.804 2 DEBUG oslo_concurrency.processutils [None req-a5c6d55b-bade-4b62-a880-5d0b8ab2123e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72_resize/disk.info 192.168.122.102:/var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:29:49 compute-0 nova_compute[192810]: 2025-09-30 21:29:49.101 2 DEBUG oslo_concurrency.processutils [None req-a5c6d55b-bade-4b62-a880-5d0b8ab2123e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "scp -C -r /var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72_resize/disk.info 192.168.122.102:/var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72/disk.info" returned: 0 in 0.296s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:29:49 compute-0 nova_compute[192810]: 2025-09-30 21:29:49.651 2 DEBUG neutronclient.v2_0.client [None req-a5c6d55b-bade-4b62-a880-5d0b8ab2123e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 69480967-0a46-4a33-95ec-9633390ec042 for host compute-2.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Sep 30 21:29:49 compute-0 nova_compute[192810]: 2025-09-30 21:29:49.759 2 DEBUG oslo_concurrency.lockutils [None req-a5c6d55b-bade-4b62-a880-5d0b8ab2123e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "3d704b04-4399-45ea-bc89-6cf74b4dec72-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:49 compute-0 nova_compute[192810]: 2025-09-30 21:29:49.760 2 DEBUG oslo_concurrency.lockutils [None req-a5c6d55b-bade-4b62-a880-5d0b8ab2123e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "3d704b04-4399-45ea-bc89-6cf74b4dec72-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:49 compute-0 nova_compute[192810]: 2025-09-30 21:29:49.760 2 DEBUG oslo_concurrency.lockutils [None req-a5c6d55b-bade-4b62-a880-5d0b8ab2123e 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "3d704b04-4399-45ea-bc89-6cf74b4dec72-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:49 compute-0 nova_compute[192810]: 2025-09-30 21:29:49.786 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:29:49 compute-0 nova_compute[192810]: 2025-09-30 21:29:49.787 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:29:49 compute-0 nova_compute[192810]: 2025-09-30 21:29:49.787 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:29:50 compute-0 nova_compute[192810]: 2025-09-30 21:29:50.685 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "refresh_cache-326f57c1-51b5-4fdb-ac31-2d754684b734" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:29:50 compute-0 nova_compute[192810]: 2025-09-30 21:29:50.685 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquired lock "refresh_cache-326f57c1-51b5-4fdb-ac31-2d754684b734" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:29:50 compute-0 nova_compute[192810]: 2025-09-30 21:29:50.686 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Sep 30 21:29:50 compute-0 nova_compute[192810]: 2025-09-30 21:29:50.686 2 DEBUG nova.objects.instance [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lazy-loading 'info_cache' on Instance uuid 326f57c1-51b5-4fdb-ac31-2d754684b734 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:29:50 compute-0 nova_compute[192810]: 2025-09-30 21:29:50.796 2 DEBUG nova.compute.manager [req-142dbf34-e52c-4398-a273-0b56156e0840 req-65897a9d-378a-419c-b810-44cef728def9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Received event network-vif-plugged-69480967-0a46-4a33-95ec-9633390ec042 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:29:50 compute-0 nova_compute[192810]: 2025-09-30 21:29:50.797 2 DEBUG oslo_concurrency.lockutils [req-142dbf34-e52c-4398-a273-0b56156e0840 req-65897a9d-378a-419c-b810-44cef728def9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3d704b04-4399-45ea-bc89-6cf74b4dec72-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:50 compute-0 nova_compute[192810]: 2025-09-30 21:29:50.797 2 DEBUG oslo_concurrency.lockutils [req-142dbf34-e52c-4398-a273-0b56156e0840 req-65897a9d-378a-419c-b810-44cef728def9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3d704b04-4399-45ea-bc89-6cf74b4dec72-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:50 compute-0 nova_compute[192810]: 2025-09-30 21:29:50.798 2 DEBUG oslo_concurrency.lockutils [req-142dbf34-e52c-4398-a273-0b56156e0840 req-65897a9d-378a-419c-b810-44cef728def9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3d704b04-4399-45ea-bc89-6cf74b4dec72-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:50 compute-0 nova_compute[192810]: 2025-09-30 21:29:50.798 2 DEBUG nova.compute.manager [req-142dbf34-e52c-4398-a273-0b56156e0840 req-65897a9d-378a-419c-b810-44cef728def9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] No waiting events found dispatching network-vif-plugged-69480967-0a46-4a33-95ec-9633390ec042 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:29:50 compute-0 nova_compute[192810]: 2025-09-30 21:29:50.798 2 WARNING nova.compute.manager [req-142dbf34-e52c-4398-a273-0b56156e0840 req-65897a9d-378a-419c-b810-44cef728def9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Received unexpected event network-vif-plugged-69480967-0a46-4a33-95ec-9633390ec042 for instance with vm_state active and task_state resize_migrated.
Sep 30 21:29:52 compute-0 nova_compute[192810]: 2025-09-30 21:29:52.182 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:52 compute-0 nova_compute[192810]: 2025-09-30 21:29:52.699 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:52 compute-0 nova_compute[192810]: 2025-09-30 21:29:52.873 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Updating instance_info_cache with network_info: [{"id": "653ddbd9-e7e8-481f-a8cf-875e0ca3e139", "address": "fa:16:3e:4e:25:7b", "network": {"id": "52d4dc6e-c68c-4c9f-816b-010b74312e25", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1072711906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ff84dd76a06841f3bdcc13f83407b7ba", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap653ddbd9-e7", "ovs_interfaceid": "653ddbd9-e7e8-481f-a8cf-875e0ca3e139", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:29:52 compute-0 nova_compute[192810]: 2025-09-30 21:29:52.892 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Releasing lock "refresh_cache-326f57c1-51b5-4fdb-ac31-2d754684b734" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:29:52 compute-0 nova_compute[192810]: 2025-09-30 21:29:52.892 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Sep 30 21:29:52 compute-0 nova_compute[192810]: 2025-09-30 21:29:52.892 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:29:52 compute-0 nova_compute[192810]: 2025-09-30 21:29:52.893 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:29:52 compute-0 nova_compute[192810]: 2025-09-30 21:29:52.893 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:29:53 compute-0 nova_compute[192810]: 2025-09-30 21:29:53.831 2 DEBUG nova.compute.manager [req-6e3f028a-9eba-489e-889b-5e5b78ed9eb7 req-4151bbe9-0df9-4d70-892d-3e732a4481a8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Received event network-changed-69480967-0a46-4a33-95ec-9633390ec042 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:29:53 compute-0 nova_compute[192810]: 2025-09-30 21:29:53.831 2 DEBUG nova.compute.manager [req-6e3f028a-9eba-489e-889b-5e5b78ed9eb7 req-4151bbe9-0df9-4d70-892d-3e732a4481a8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Refreshing instance network info cache due to event network-changed-69480967-0a46-4a33-95ec-9633390ec042. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:29:53 compute-0 nova_compute[192810]: 2025-09-30 21:29:53.831 2 DEBUG oslo_concurrency.lockutils [req-6e3f028a-9eba-489e-889b-5e5b78ed9eb7 req-4151bbe9-0df9-4d70-892d-3e732a4481a8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-3d704b04-4399-45ea-bc89-6cf74b4dec72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:29:53 compute-0 nova_compute[192810]: 2025-09-30 21:29:53.832 2 DEBUG oslo_concurrency.lockutils [req-6e3f028a-9eba-489e-889b-5e5b78ed9eb7 req-4151bbe9-0df9-4d70-892d-3e732a4481a8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-3d704b04-4399-45ea-bc89-6cf74b4dec72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:29:53 compute-0 nova_compute[192810]: 2025-09-30 21:29:53.832 2 DEBUG nova.network.neutron [req-6e3f028a-9eba-489e-889b-5e5b78ed9eb7 req-4151bbe9-0df9-4d70-892d-3e732a4481a8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Refreshing network info cache for port 69480967-0a46-4a33-95ec-9633390ec042 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:29:54 compute-0 nova_compute[192810]: 2025-09-30 21:29:54.898 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:29:55 compute-0 nova_compute[192810]: 2025-09-30 21:29:55.260 2 DEBUG oslo_concurrency.lockutils [None req-2ed6b1d5-6966-4ec4-9649-2472c7bb9167 e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Acquiring lock "326f57c1-51b5-4fdb-ac31-2d754684b734" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:55 compute-0 nova_compute[192810]: 2025-09-30 21:29:55.260 2 DEBUG oslo_concurrency.lockutils [None req-2ed6b1d5-6966-4ec4-9649-2472c7bb9167 e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Lock "326f57c1-51b5-4fdb-ac31-2d754684b734" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:55 compute-0 nova_compute[192810]: 2025-09-30 21:29:55.261 2 DEBUG oslo_concurrency.lockutils [None req-2ed6b1d5-6966-4ec4-9649-2472c7bb9167 e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Acquiring lock "326f57c1-51b5-4fdb-ac31-2d754684b734-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:55 compute-0 nova_compute[192810]: 2025-09-30 21:29:55.261 2 DEBUG oslo_concurrency.lockutils [None req-2ed6b1d5-6966-4ec4-9649-2472c7bb9167 e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Lock "326f57c1-51b5-4fdb-ac31-2d754684b734-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:55 compute-0 nova_compute[192810]: 2025-09-30 21:29:55.261 2 DEBUG oslo_concurrency.lockutils [None req-2ed6b1d5-6966-4ec4-9649-2472c7bb9167 e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Lock "326f57c1-51b5-4fdb-ac31-2d754684b734-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:55 compute-0 nova_compute[192810]: 2025-09-30 21:29:55.273 2 INFO nova.compute.manager [None req-2ed6b1d5-6966-4ec4-9649-2472c7bb9167 e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Terminating instance
Sep 30 21:29:55 compute-0 nova_compute[192810]: 2025-09-30 21:29:55.285 2 DEBUG nova.compute.manager [None req-2ed6b1d5-6966-4ec4-9649-2472c7bb9167 e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:29:55 compute-0 kernel: tap653ddbd9-e7 (unregistering): left promiscuous mode
Sep 30 21:29:55 compute-0 NetworkManager[51733]: <info>  [1759267795.3062] device (tap653ddbd9-e7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:29:55 compute-0 nova_compute[192810]: 2025-09-30 21:29:55.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:55 compute-0 ovn_controller[94912]: 2025-09-30T21:29:55Z|00277|binding|INFO|Releasing lport 653ddbd9-e7e8-481f-a8cf-875e0ca3e139 from this chassis (sb_readonly=0)
Sep 30 21:29:55 compute-0 ovn_controller[94912]: 2025-09-30T21:29:55Z|00278|binding|INFO|Setting lport 653ddbd9-e7e8-481f-a8cf-875e0ca3e139 down in Southbound
Sep 30 21:29:55 compute-0 ovn_controller[94912]: 2025-09-30T21:29:55Z|00279|binding|INFO|Removing iface tap653ddbd9-e7 ovn-installed in OVS
Sep 30 21:29:55 compute-0 nova_compute[192810]: 2025-09-30 21:29:55.315 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:55 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:55.322 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4e:25:7b 10.100.0.8'], port_security=['fa:16:3e:4e:25:7b 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '326f57c1-51b5-4fdb-ac31-2d754684b734', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-52d4dc6e-c68c-4c9f-816b-010b74312e25', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ff84dd76a06841f3bdcc13f83407b7ba', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b0495b63-c487-4aa6-9461-7d07efab628b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.188'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0dfab36b-33d9-41d6-bc0e-727f1f2b3d16, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=653ddbd9-e7e8-481f-a8cf-875e0ca3e139) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:29:55 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:55.324 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 653ddbd9-e7e8-481f-a8cf-875e0ca3e139 in datapath 52d4dc6e-c68c-4c9f-816b-010b74312e25 unbound from our chassis
Sep 30 21:29:55 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:55.325 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 52d4dc6e-c68c-4c9f-816b-010b74312e25, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:29:55 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:55.327 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[fa1419a2-cf1b-4823-b49b-ca935c4d0f2a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:55 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:55.328 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-52d4dc6e-c68c-4c9f-816b-010b74312e25 namespace which is not needed anymore
Sep 30 21:29:55 compute-0 nova_compute[192810]: 2025-09-30 21:29:55.339 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:55 compute-0 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d0000004b.scope: Deactivated successfully.
Sep 30 21:29:55 compute-0 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d0000004b.scope: Consumed 13.518s CPU time.
Sep 30 21:29:55 compute-0 systemd-machined[152794]: Machine qemu-35-instance-0000004b terminated.
Sep 30 21:29:55 compute-0 podman[231259]: 2025-09-30 21:29:55.480510561 +0000 UTC m=+0.064704666 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, vendor=Red Hat, Inc., version=9.6, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41)
Sep 30 21:29:55 compute-0 podman[231258]: 2025-09-30 21:29:55.48716556 +0000 UTC m=+0.071810706 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Sep 30 21:29:55 compute-0 neutron-haproxy-ovnmeta-52d4dc6e-c68c-4c9f-816b-010b74312e25[231025]: [NOTICE]   (231030) : haproxy version is 2.8.14-c23fe91
Sep 30 21:29:55 compute-0 neutron-haproxy-ovnmeta-52d4dc6e-c68c-4c9f-816b-010b74312e25[231025]: [NOTICE]   (231030) : path to executable is /usr/sbin/haproxy
Sep 30 21:29:55 compute-0 neutron-haproxy-ovnmeta-52d4dc6e-c68c-4c9f-816b-010b74312e25[231025]: [WARNING]  (231030) : Exiting Master process...
Sep 30 21:29:55 compute-0 neutron-haproxy-ovnmeta-52d4dc6e-c68c-4c9f-816b-010b74312e25[231025]: [ALERT]    (231030) : Current worker (231032) exited with code 143 (Terminated)
Sep 30 21:29:55 compute-0 neutron-haproxy-ovnmeta-52d4dc6e-c68c-4c9f-816b-010b74312e25[231025]: [WARNING]  (231030) : All workers exited. Exiting... (0)
Sep 30 21:29:55 compute-0 systemd[1]: libpod-336b98c445aaf651001f7262435eeb7206266b3f6416e3c7a10bc921d2e67ad4.scope: Deactivated successfully.
Sep 30 21:29:55 compute-0 conmon[231025]: conmon 336b98c445aaf651001f <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-336b98c445aaf651001f7262435eeb7206266b3f6416e3c7a10bc921d2e67ad4.scope/container/memory.events
Sep 30 21:29:55 compute-0 podman[231279]: 2025-09-30 21:29:55.511680859 +0000 UTC m=+0.064948052 container died 336b98c445aaf651001f7262435eeb7206266b3f6416e3c7a10bc921d2e67ad4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-52d4dc6e-c68c-4c9f-816b-010b74312e25, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923)
Sep 30 21:29:55 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-336b98c445aaf651001f7262435eeb7206266b3f6416e3c7a10bc921d2e67ad4-userdata-shm.mount: Deactivated successfully.
Sep 30 21:29:55 compute-0 systemd[1]: var-lib-containers-storage-overlay-52048684c3e20a0db3fe965d1319d21e0244c2767d946654c8fcdc1d66fc23e1-merged.mount: Deactivated successfully.
Sep 30 21:29:55 compute-0 podman[231279]: 2025-09-30 21:29:55.566378292 +0000 UTC m=+0.119645455 container cleanup 336b98c445aaf651001f7262435eeb7206266b3f6416e3c7a10bc921d2e67ad4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-52d4dc6e-c68c-4c9f-816b-010b74312e25, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Sep 30 21:29:55 compute-0 nova_compute[192810]: 2025-09-30 21:29:55.569 2 INFO nova.virt.libvirt.driver [-] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Instance destroyed successfully.
Sep 30 21:29:55 compute-0 nova_compute[192810]: 2025-09-30 21:29:55.569 2 DEBUG nova.objects.instance [None req-2ed6b1d5-6966-4ec4-9649-2472c7bb9167 e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Lazy-loading 'resources' on Instance uuid 326f57c1-51b5-4fdb-ac31-2d754684b734 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:29:55 compute-0 systemd[1]: libpod-conmon-336b98c445aaf651001f7262435eeb7206266b3f6416e3c7a10bc921d2e67ad4.scope: Deactivated successfully.
Sep 30 21:29:55 compute-0 nova_compute[192810]: 2025-09-30 21:29:55.587 2 DEBUG nova.virt.libvirt.vif [None req-2ed6b1d5-6966-4ec4-9649-2472c7bb9167 e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] vif_type=ovs instance=Instance(access_ip_v4=2.2.2.2,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:29:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='guest-instance-1.domain.com',display_name='guest-instance-1.domain.com',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='guest-instance-1-domain-com',id=75,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM/1IBUkkyaU5oPysNsJ8gPw3vUlBbwFlnKGKVOWQScO88HvqBZTV0CnXUyBRjZdY1EyVymgB+KaUMUZGHsrL7tRlWYMuPLiCwJG5fIXmNt4g9/kThbknLPhlR/gWcBiVA==',key_name='tempest-keypair-1094843309',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:29:36Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ff84dd76a06841f3bdcc13f83407b7ba',ramdisk_id='',reservation_id='r-7cs01tb6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus=
'usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestFqdnHostnames-1444957193',owner_user_name='tempest-ServersTestFqdnHostnames-1444957193-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:29:36Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='e2eec3e45b7946af8db04ebcaf028f7a',uuid=326f57c1-51b5-4fdb-ac31-2d754684b734,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "653ddbd9-e7e8-481f-a8cf-875e0ca3e139", "address": "fa:16:3e:4e:25:7b", "network": {"id": "52d4dc6e-c68c-4c9f-816b-010b74312e25", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1072711906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ff84dd76a06841f3bdcc13f83407b7ba", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap653ddbd9-e7", "ovs_interfaceid": "653ddbd9-e7e8-481f-a8cf-875e0ca3e139", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:29:55 compute-0 nova_compute[192810]: 2025-09-30 21:29:55.588 2 DEBUG nova.network.os_vif_util [None req-2ed6b1d5-6966-4ec4-9649-2472c7bb9167 e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Converting VIF {"id": "653ddbd9-e7e8-481f-a8cf-875e0ca3e139", "address": "fa:16:3e:4e:25:7b", "network": {"id": "52d4dc6e-c68c-4c9f-816b-010b74312e25", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1072711906-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ff84dd76a06841f3bdcc13f83407b7ba", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap653ddbd9-e7", "ovs_interfaceid": "653ddbd9-e7e8-481f-a8cf-875e0ca3e139", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:29:55 compute-0 nova_compute[192810]: 2025-09-30 21:29:55.589 2 DEBUG nova.network.os_vif_util [None req-2ed6b1d5-6966-4ec4-9649-2472c7bb9167 e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4e:25:7b,bridge_name='br-int',has_traffic_filtering=True,id=653ddbd9-e7e8-481f-a8cf-875e0ca3e139,network=Network(52d4dc6e-c68c-4c9f-816b-010b74312e25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap653ddbd9-e7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:29:55 compute-0 nova_compute[192810]: 2025-09-30 21:29:55.589 2 DEBUG os_vif [None req-2ed6b1d5-6966-4ec4-9649-2472c7bb9167 e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:4e:25:7b,bridge_name='br-int',has_traffic_filtering=True,id=653ddbd9-e7e8-481f-a8cf-875e0ca3e139,network=Network(52d4dc6e-c68c-4c9f-816b-010b74312e25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap653ddbd9-e7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:29:55 compute-0 nova_compute[192810]: 2025-09-30 21:29:55.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:55 compute-0 nova_compute[192810]: 2025-09-30 21:29:55.591 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap653ddbd9-e7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:29:55 compute-0 nova_compute[192810]: 2025-09-30 21:29:55.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:55 compute-0 nova_compute[192810]: 2025-09-30 21:29:55.594 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:55 compute-0 nova_compute[192810]: 2025-09-30 21:29:55.596 2 INFO os_vif [None req-2ed6b1d5-6966-4ec4-9649-2472c7bb9167 e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:4e:25:7b,bridge_name='br-int',has_traffic_filtering=True,id=653ddbd9-e7e8-481f-a8cf-875e0ca3e139,network=Network(52d4dc6e-c68c-4c9f-816b-010b74312e25),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap653ddbd9-e7')
Sep 30 21:29:55 compute-0 nova_compute[192810]: 2025-09-30 21:29:55.597 2 INFO nova.virt.libvirt.driver [None req-2ed6b1d5-6966-4ec4-9649-2472c7bb9167 e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Deleting instance files /var/lib/nova/instances/326f57c1-51b5-4fdb-ac31-2d754684b734_del
Sep 30 21:29:55 compute-0 nova_compute[192810]: 2025-09-30 21:29:55.597 2 INFO nova.virt.libvirt.driver [None req-2ed6b1d5-6966-4ec4-9649-2472c7bb9167 e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Deletion of /var/lib/nova/instances/326f57c1-51b5-4fdb-ac31-2d754684b734_del complete
Sep 30 21:29:55 compute-0 podman[231345]: 2025-09-30 21:29:55.628757258 +0000 UTC m=+0.040489694 container remove 336b98c445aaf651001f7262435eeb7206266b3f6416e3c7a10bc921d2e67ad4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-52d4dc6e-c68c-4c9f-816b-010b74312e25, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 21:29:55 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:55.633 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[26aefeb0-d1d1-4964-b175-71e6c0fd54d9]: (4, ('Tue Sep 30 09:29:55 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-52d4dc6e-c68c-4c9f-816b-010b74312e25 (336b98c445aaf651001f7262435eeb7206266b3f6416e3c7a10bc921d2e67ad4)\n336b98c445aaf651001f7262435eeb7206266b3f6416e3c7a10bc921d2e67ad4\nTue Sep 30 09:29:55 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-52d4dc6e-c68c-4c9f-816b-010b74312e25 (336b98c445aaf651001f7262435eeb7206266b3f6416e3c7a10bc921d2e67ad4)\n336b98c445aaf651001f7262435eeb7206266b3f6416e3c7a10bc921d2e67ad4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:55 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:55.635 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[d12facd4-197b-4111-8053-3460e193534c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:55 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:55.636 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap52d4dc6e-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:29:55 compute-0 nova_compute[192810]: 2025-09-30 21:29:55.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:55 compute-0 kernel: tap52d4dc6e-c0: left promiscuous mode
Sep 30 21:29:55 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:55.642 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[18627ba8-1cd7-405d-821d-3e68197cc1b0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:55 compute-0 nova_compute[192810]: 2025-09-30 21:29:55.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:55 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:55.671 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[ff598afe-ef2c-4828-b75e-46c60341df1f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:55 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:55.673 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[916c37c1-57ac-4bf4-9ea2-8cfecda0bc09]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:55 compute-0 nova_compute[192810]: 2025-09-30 21:29:55.685 2 DEBUG nova.compute.manager [req-80d1562b-9f42-4f31-ae92-13d7e537aa77 req-ee504e3d-a62c-4f6a-a84e-caefa5299251 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Received event network-vif-unplugged-653ddbd9-e7e8-481f-a8cf-875e0ca3e139 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:29:55 compute-0 nova_compute[192810]: 2025-09-30 21:29:55.686 2 DEBUG oslo_concurrency.lockutils [req-80d1562b-9f42-4f31-ae92-13d7e537aa77 req-ee504e3d-a62c-4f6a-a84e-caefa5299251 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "326f57c1-51b5-4fdb-ac31-2d754684b734-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:55 compute-0 nova_compute[192810]: 2025-09-30 21:29:55.686 2 DEBUG oslo_concurrency.lockutils [req-80d1562b-9f42-4f31-ae92-13d7e537aa77 req-ee504e3d-a62c-4f6a-a84e-caefa5299251 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "326f57c1-51b5-4fdb-ac31-2d754684b734-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:55 compute-0 nova_compute[192810]: 2025-09-30 21:29:55.686 2 DEBUG oslo_concurrency.lockutils [req-80d1562b-9f42-4f31-ae92-13d7e537aa77 req-ee504e3d-a62c-4f6a-a84e-caefa5299251 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "326f57c1-51b5-4fdb-ac31-2d754684b734-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:55 compute-0 nova_compute[192810]: 2025-09-30 21:29:55.686 2 DEBUG nova.compute.manager [req-80d1562b-9f42-4f31-ae92-13d7e537aa77 req-ee504e3d-a62c-4f6a-a84e-caefa5299251 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] No waiting events found dispatching network-vif-unplugged-653ddbd9-e7e8-481f-a8cf-875e0ca3e139 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:29:55 compute-0 nova_compute[192810]: 2025-09-30 21:29:55.687 2 DEBUG nova.compute.manager [req-80d1562b-9f42-4f31-ae92-13d7e537aa77 req-ee504e3d-a62c-4f6a-a84e-caefa5299251 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Received event network-vif-unplugged-653ddbd9-e7e8-481f-a8cf-875e0ca3e139 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:29:55 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:55.687 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[d633b147-b0db-4b81-9f73-c8e63dc8b074]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 447071, 'reachable_time': 16304, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231359, 'error': None, 'target': 'ovnmeta-52d4dc6e-c68c-4c9f-816b-010b74312e25', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:55 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:55.688 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-52d4dc6e-c68c-4c9f-816b-010b74312e25 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:29:55 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:29:55.689 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[9564573e-42df-4876-bd09-5dbe03af8e0c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:29:55 compute-0 systemd[1]: run-netns-ovnmeta\x2d52d4dc6e\x2dc68c\x2d4c9f\x2d816b\x2d010b74312e25.mount: Deactivated successfully.
Sep 30 21:29:55 compute-0 nova_compute[192810]: 2025-09-30 21:29:55.694 2 INFO nova.compute.manager [None req-2ed6b1d5-6966-4ec4-9649-2472c7bb9167 e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Took 0.41 seconds to destroy the instance on the hypervisor.
Sep 30 21:29:55 compute-0 nova_compute[192810]: 2025-09-30 21:29:55.694 2 DEBUG oslo.service.loopingcall [None req-2ed6b1d5-6966-4ec4-9649-2472c7bb9167 e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:29:55 compute-0 nova_compute[192810]: 2025-09-30 21:29:55.695 2 DEBUG nova.compute.manager [-] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:29:55 compute-0 nova_compute[192810]: 2025-09-30 21:29:55.695 2 DEBUG nova.network.neutron [-] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:29:56 compute-0 nova_compute[192810]: 2025-09-30 21:29:56.734 2 DEBUG nova.network.neutron [req-6e3f028a-9eba-489e-889b-5e5b78ed9eb7 req-4151bbe9-0df9-4d70-892d-3e732a4481a8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Updated VIF entry in instance network info cache for port 69480967-0a46-4a33-95ec-9633390ec042. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:29:56 compute-0 nova_compute[192810]: 2025-09-30 21:29:56.735 2 DEBUG nova.network.neutron [req-6e3f028a-9eba-489e-889b-5e5b78ed9eb7 req-4151bbe9-0df9-4d70-892d-3e732a4481a8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Updating instance_info_cache with network_info: [{"id": "69480967-0a46-4a33-95ec-9633390ec042", "address": "fa:16:3e:d1:d8:eb", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69480967-0a", "ovs_interfaceid": "69480967-0a46-4a33-95ec-9633390ec042", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:29:56 compute-0 nova_compute[192810]: 2025-09-30 21:29:56.768 2 DEBUG oslo_concurrency.lockutils [req-6e3f028a-9eba-489e-889b-5e5b78ed9eb7 req-4151bbe9-0df9-4d70-892d-3e732a4481a8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-3d704b04-4399-45ea-bc89-6cf74b4dec72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:29:57 compute-0 nova_compute[192810]: 2025-09-30 21:29:57.184 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:29:57 compute-0 nova_compute[192810]: 2025-09-30 21:29:57.833 2 DEBUG nova.compute.manager [req-fab29cdf-3b86-4574-8bac-dfa47ea5a987 req-d0a799c3-4441-4f9a-88ef-b3d00df1255c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Received event network-vif-plugged-653ddbd9-e7e8-481f-a8cf-875e0ca3e139 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:29:57 compute-0 nova_compute[192810]: 2025-09-30 21:29:57.834 2 DEBUG oslo_concurrency.lockutils [req-fab29cdf-3b86-4574-8bac-dfa47ea5a987 req-d0a799c3-4441-4f9a-88ef-b3d00df1255c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "326f57c1-51b5-4fdb-ac31-2d754684b734-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:57 compute-0 nova_compute[192810]: 2025-09-30 21:29:57.834 2 DEBUG oslo_concurrency.lockutils [req-fab29cdf-3b86-4574-8bac-dfa47ea5a987 req-d0a799c3-4441-4f9a-88ef-b3d00df1255c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "326f57c1-51b5-4fdb-ac31-2d754684b734-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:57 compute-0 nova_compute[192810]: 2025-09-30 21:29:57.834 2 DEBUG oslo_concurrency.lockutils [req-fab29cdf-3b86-4574-8bac-dfa47ea5a987 req-d0a799c3-4441-4f9a-88ef-b3d00df1255c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "326f57c1-51b5-4fdb-ac31-2d754684b734-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:57 compute-0 nova_compute[192810]: 2025-09-30 21:29:57.835 2 DEBUG nova.compute.manager [req-fab29cdf-3b86-4574-8bac-dfa47ea5a987 req-d0a799c3-4441-4f9a-88ef-b3d00df1255c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] No waiting events found dispatching network-vif-plugged-653ddbd9-e7e8-481f-a8cf-875e0ca3e139 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:29:57 compute-0 nova_compute[192810]: 2025-09-30 21:29:57.835 2 WARNING nova.compute.manager [req-fab29cdf-3b86-4574-8bac-dfa47ea5a987 req-d0a799c3-4441-4f9a-88ef-b3d00df1255c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Received unexpected event network-vif-plugged-653ddbd9-e7e8-481f-a8cf-875e0ca3e139 for instance with vm_state active and task_state deleting.
Sep 30 21:29:57 compute-0 nova_compute[192810]: 2025-09-30 21:29:57.835 2 DEBUG nova.compute.manager [req-fab29cdf-3b86-4574-8bac-dfa47ea5a987 req-d0a799c3-4441-4f9a-88ef-b3d00df1255c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Received event network-vif-plugged-69480967-0a46-4a33-95ec-9633390ec042 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:29:57 compute-0 nova_compute[192810]: 2025-09-30 21:29:57.836 2 DEBUG oslo_concurrency.lockutils [req-fab29cdf-3b86-4574-8bac-dfa47ea5a987 req-d0a799c3-4441-4f9a-88ef-b3d00df1255c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3d704b04-4399-45ea-bc89-6cf74b4dec72-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:57 compute-0 nova_compute[192810]: 2025-09-30 21:29:57.836 2 DEBUG oslo_concurrency.lockutils [req-fab29cdf-3b86-4574-8bac-dfa47ea5a987 req-d0a799c3-4441-4f9a-88ef-b3d00df1255c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3d704b04-4399-45ea-bc89-6cf74b4dec72-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:57 compute-0 nova_compute[192810]: 2025-09-30 21:29:57.836 2 DEBUG oslo_concurrency.lockutils [req-fab29cdf-3b86-4574-8bac-dfa47ea5a987 req-d0a799c3-4441-4f9a-88ef-b3d00df1255c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3d704b04-4399-45ea-bc89-6cf74b4dec72-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:57 compute-0 nova_compute[192810]: 2025-09-30 21:29:57.836 2 DEBUG nova.compute.manager [req-fab29cdf-3b86-4574-8bac-dfa47ea5a987 req-d0a799c3-4441-4f9a-88ef-b3d00df1255c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] No waiting events found dispatching network-vif-plugged-69480967-0a46-4a33-95ec-9633390ec042 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:29:57 compute-0 nova_compute[192810]: 2025-09-30 21:29:57.837 2 WARNING nova.compute.manager [req-fab29cdf-3b86-4574-8bac-dfa47ea5a987 req-d0a799c3-4441-4f9a-88ef-b3d00df1255c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Received unexpected event network-vif-plugged-69480967-0a46-4a33-95ec-9633390ec042 for instance with vm_state resized and task_state None.
Sep 30 21:29:57 compute-0 nova_compute[192810]: 2025-09-30 21:29:57.837 2 DEBUG nova.compute.manager [req-fab29cdf-3b86-4574-8bac-dfa47ea5a987 req-d0a799c3-4441-4f9a-88ef-b3d00df1255c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Received event network-vif-plugged-69480967-0a46-4a33-95ec-9633390ec042 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:29:57 compute-0 nova_compute[192810]: 2025-09-30 21:29:57.837 2 DEBUG oslo_concurrency.lockutils [req-fab29cdf-3b86-4574-8bac-dfa47ea5a987 req-d0a799c3-4441-4f9a-88ef-b3d00df1255c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3d704b04-4399-45ea-bc89-6cf74b4dec72-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:57 compute-0 nova_compute[192810]: 2025-09-30 21:29:57.837 2 DEBUG oslo_concurrency.lockutils [req-fab29cdf-3b86-4574-8bac-dfa47ea5a987 req-d0a799c3-4441-4f9a-88ef-b3d00df1255c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3d704b04-4399-45ea-bc89-6cf74b4dec72-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:57 compute-0 nova_compute[192810]: 2025-09-30 21:29:57.838 2 DEBUG oslo_concurrency.lockutils [req-fab29cdf-3b86-4574-8bac-dfa47ea5a987 req-d0a799c3-4441-4f9a-88ef-b3d00df1255c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3d704b04-4399-45ea-bc89-6cf74b4dec72-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:57 compute-0 nova_compute[192810]: 2025-09-30 21:29:57.838 2 DEBUG nova.compute.manager [req-fab29cdf-3b86-4574-8bac-dfa47ea5a987 req-d0a799c3-4441-4f9a-88ef-b3d00df1255c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] No waiting events found dispatching network-vif-plugged-69480967-0a46-4a33-95ec-9633390ec042 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:29:57 compute-0 nova_compute[192810]: 2025-09-30 21:29:57.838 2 WARNING nova.compute.manager [req-fab29cdf-3b86-4574-8bac-dfa47ea5a987 req-d0a799c3-4441-4f9a-88ef-b3d00df1255c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Received unexpected event network-vif-plugged-69480967-0a46-4a33-95ec-9633390ec042 for instance with vm_state resized and task_state None.
Sep 30 21:29:58 compute-0 nova_compute[192810]: 2025-09-30 21:29:58.131 2 DEBUG nova.network.neutron [-] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:29:58 compute-0 nova_compute[192810]: 2025-09-30 21:29:58.153 2 INFO nova.compute.manager [-] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Took 2.46 seconds to deallocate network for instance.
Sep 30 21:29:58 compute-0 nova_compute[192810]: 2025-09-30 21:29:58.218 2 DEBUG oslo_concurrency.lockutils [None req-2ed6b1d5-6966-4ec4-9649-2472c7bb9167 e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:29:58 compute-0 nova_compute[192810]: 2025-09-30 21:29:58.218 2 DEBUG oslo_concurrency.lockutils [None req-2ed6b1d5-6966-4ec4-9649-2472c7bb9167 e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:29:58 compute-0 nova_compute[192810]: 2025-09-30 21:29:58.225 2 DEBUG nova.compute.manager [req-1b1063d0-dab1-4939-9643-8966eed85fba req-92ea98f7-2f1e-4fbb-a45c-f8a1656958cb dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Received event network-vif-deleted-653ddbd9-e7e8-481f-a8cf-875e0ca3e139 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:29:58 compute-0 nova_compute[192810]: 2025-09-30 21:29:58.294 2 DEBUG nova.compute.provider_tree [None req-2ed6b1d5-6966-4ec4-9649-2472c7bb9167 e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:29:58 compute-0 nova_compute[192810]: 2025-09-30 21:29:58.308 2 DEBUG nova.scheduler.client.report [None req-2ed6b1d5-6966-4ec4-9649-2472c7bb9167 e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:29:58 compute-0 nova_compute[192810]: 2025-09-30 21:29:58.326 2 DEBUG oslo_concurrency.lockutils [None req-2ed6b1d5-6966-4ec4-9649-2472c7bb9167 e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:29:58 compute-0 nova_compute[192810]: 2025-09-30 21:29:58.358 2 INFO nova.scheduler.client.report [None req-2ed6b1d5-6966-4ec4-9649-2472c7bb9167 e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Deleted allocations for instance 326f57c1-51b5-4fdb-ac31-2d754684b734
Sep 30 21:29:58 compute-0 nova_compute[192810]: 2025-09-30 21:29:58.476 2 DEBUG oslo_concurrency.lockutils [None req-2ed6b1d5-6966-4ec4-9649-2472c7bb9167 e2eec3e45b7946af8db04ebcaf028f7a ff84dd76a06841f3bdcc13f83407b7ba - - default default] Lock "326f57c1-51b5-4fdb-ac31-2d754684b734" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.216s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:00 compute-0 nova_compute[192810]: 2025-09-30 21:30:00.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:01 compute-0 nova_compute[192810]: 2025-09-30 21:30:01.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:01 compute-0 nova_compute[192810]: 2025-09-30 21:30:01.721 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:02 compute-0 nova_compute[192810]: 2025-09-30 21:30:02.131 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267787.129874, 3d704b04-4399-45ea-bc89-6cf74b4dec72 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:30:02 compute-0 nova_compute[192810]: 2025-09-30 21:30:02.131 2 INFO nova.compute.manager [-] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] VM Stopped (Lifecycle Event)
Sep 30 21:30:02 compute-0 nova_compute[192810]: 2025-09-30 21:30:02.154 2 DEBUG nova.compute.manager [None req-483a0c1a-2a09-456b-8941-ead392718a6b - - - - - -] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:30:02 compute-0 nova_compute[192810]: 2025-09-30 21:30:02.156 2 DEBUG nova.compute.manager [None req-483a0c1a-2a09-456b-8941-ead392718a6b - - - - - -] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:30:02 compute-0 nova_compute[192810]: 2025-09-30 21:30:02.177 2 INFO nova.compute.manager [None req-483a0c1a-2a09-456b-8941-ead392718a6b - - - - - -] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] During the sync_power process the instance has moved from host compute-2.ctlplane.example.com to host compute-0.ctlplane.example.com
Sep 30 21:30:02 compute-0 nova_compute[192810]: 2025-09-30 21:30:02.186 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:02 compute-0 nova_compute[192810]: 2025-09-30 21:30:02.400 2 INFO nova.compute.manager [None req-6c4d314b-eafa-4012-ac4d-cf348dff93af 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Swapping old allocation on dict_keys(['fe423b93-de5a-41f7-97d1-9622ea46af54']) held by migration 2b7e9cbb-3537-441b-ad89-6d7f39e2f29a for instance
Sep 30 21:30:02 compute-0 nova_compute[192810]: 2025-09-30 21:30:02.447 2 DEBUG nova.scheduler.client.report [None req-6c4d314b-eafa-4012-ac4d-cf348dff93af 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Overwriting current allocation {'allocations': {'43534f47-0785-4193-83a7-711a984caccc': {'resources': {'VCPU': 1, 'MEMORY_MB': 192, 'DISK_GB': 1}, 'generation': 36}}, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'consumer_generation': 1} on consumer 3d704b04-4399-45ea-bc89-6cf74b4dec72 move_allocations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:2018
Sep 30 21:30:02 compute-0 nova_compute[192810]: 2025-09-30 21:30:02.669 2 INFO nova.network.neutron [None req-6c4d314b-eafa-4012-ac4d-cf348dff93af 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Updating port 69480967-0a46-4a33-95ec-9633390ec042 with attributes {'binding:host_id': 'compute-0.ctlplane.example.com', 'device_owner': 'compute:nova'}
Sep 30 21:30:03 compute-0 nova_compute[192810]: 2025-09-30 21:30:03.259 2 DEBUG nova.compute.manager [req-afa72bb2-5b41-4cba-961e-511196a20d06 req-9a124992-7737-490e-bde7-2728b4acf79a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Received event network-vif-unplugged-69480967-0a46-4a33-95ec-9633390ec042 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:30:03 compute-0 nova_compute[192810]: 2025-09-30 21:30:03.259 2 DEBUG oslo_concurrency.lockutils [req-afa72bb2-5b41-4cba-961e-511196a20d06 req-9a124992-7737-490e-bde7-2728b4acf79a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3d704b04-4399-45ea-bc89-6cf74b4dec72-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:03 compute-0 nova_compute[192810]: 2025-09-30 21:30:03.259 2 DEBUG oslo_concurrency.lockutils [req-afa72bb2-5b41-4cba-961e-511196a20d06 req-9a124992-7737-490e-bde7-2728b4acf79a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3d704b04-4399-45ea-bc89-6cf74b4dec72-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:03 compute-0 nova_compute[192810]: 2025-09-30 21:30:03.259 2 DEBUG oslo_concurrency.lockutils [req-afa72bb2-5b41-4cba-961e-511196a20d06 req-9a124992-7737-490e-bde7-2728b4acf79a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3d704b04-4399-45ea-bc89-6cf74b4dec72-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:03 compute-0 nova_compute[192810]: 2025-09-30 21:30:03.260 2 DEBUG nova.compute.manager [req-afa72bb2-5b41-4cba-961e-511196a20d06 req-9a124992-7737-490e-bde7-2728b4acf79a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] No waiting events found dispatching network-vif-unplugged-69480967-0a46-4a33-95ec-9633390ec042 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:30:03 compute-0 nova_compute[192810]: 2025-09-30 21:30:03.260 2 WARNING nova.compute.manager [req-afa72bb2-5b41-4cba-961e-511196a20d06 req-9a124992-7737-490e-bde7-2728b4acf79a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Received unexpected event network-vif-unplugged-69480967-0a46-4a33-95ec-9633390ec042 for instance with vm_state resized and task_state resize_reverting.
Sep 30 21:30:03 compute-0 nova_compute[192810]: 2025-09-30 21:30:03.260 2 DEBUG nova.compute.manager [req-afa72bb2-5b41-4cba-961e-511196a20d06 req-9a124992-7737-490e-bde7-2728b4acf79a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Received event network-vif-plugged-69480967-0a46-4a33-95ec-9633390ec042 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:30:03 compute-0 nova_compute[192810]: 2025-09-30 21:30:03.260 2 DEBUG oslo_concurrency.lockutils [req-afa72bb2-5b41-4cba-961e-511196a20d06 req-9a124992-7737-490e-bde7-2728b4acf79a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3d704b04-4399-45ea-bc89-6cf74b4dec72-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:03 compute-0 nova_compute[192810]: 2025-09-30 21:30:03.260 2 DEBUG oslo_concurrency.lockutils [req-afa72bb2-5b41-4cba-961e-511196a20d06 req-9a124992-7737-490e-bde7-2728b4acf79a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3d704b04-4399-45ea-bc89-6cf74b4dec72-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:03 compute-0 nova_compute[192810]: 2025-09-30 21:30:03.260 2 DEBUG oslo_concurrency.lockutils [req-afa72bb2-5b41-4cba-961e-511196a20d06 req-9a124992-7737-490e-bde7-2728b4acf79a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3d704b04-4399-45ea-bc89-6cf74b4dec72-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:03 compute-0 nova_compute[192810]: 2025-09-30 21:30:03.261 2 DEBUG nova.compute.manager [req-afa72bb2-5b41-4cba-961e-511196a20d06 req-9a124992-7737-490e-bde7-2728b4acf79a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] No waiting events found dispatching network-vif-plugged-69480967-0a46-4a33-95ec-9633390ec042 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:30:03 compute-0 nova_compute[192810]: 2025-09-30 21:30:03.261 2 WARNING nova.compute.manager [req-afa72bb2-5b41-4cba-961e-511196a20d06 req-9a124992-7737-490e-bde7-2728b4acf79a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Received unexpected event network-vif-plugged-69480967-0a46-4a33-95ec-9633390ec042 for instance with vm_state resized and task_state resize_reverting.
Sep 30 21:30:03 compute-0 podman[231362]: 2025-09-30 21:30:03.350262396 +0000 UTC m=+0.075027048 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923)
Sep 30 21:30:03 compute-0 podman[231361]: 2025-09-30 21:30:03.35714596 +0000 UTC m=+0.082177728 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 21:30:03 compute-0 podman[231363]: 2025-09-30 21:30:03.38761326 +0000 UTC m=+0.109837098 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 21:30:04 compute-0 nova_compute[192810]: 2025-09-30 21:30:04.435 2 DEBUG oslo_concurrency.lockutils [None req-6c4d314b-eafa-4012-ac4d-cf348dff93af 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "refresh_cache-3d704b04-4399-45ea-bc89-6cf74b4dec72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:30:04 compute-0 nova_compute[192810]: 2025-09-30 21:30:04.435 2 DEBUG oslo_concurrency.lockutils [None req-6c4d314b-eafa-4012-ac4d-cf348dff93af 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquired lock "refresh_cache-3d704b04-4399-45ea-bc89-6cf74b4dec72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:30:04 compute-0 nova_compute[192810]: 2025-09-30 21:30:04.436 2 DEBUG nova.network.neutron [None req-6c4d314b-eafa-4012-ac4d-cf348dff93af 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:30:04 compute-0 nova_compute[192810]: 2025-09-30 21:30:04.554 2 DEBUG nova.compute.manager [req-ea21fc4b-d278-4a87-bf01-f8f0e4f4698e req-ebb8f029-7a0b-4fe4-bb6f-005cdc0b1063 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Received event network-changed-69480967-0a46-4a33-95ec-9633390ec042 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:30:04 compute-0 nova_compute[192810]: 2025-09-30 21:30:04.555 2 DEBUG nova.compute.manager [req-ea21fc4b-d278-4a87-bf01-f8f0e4f4698e req-ebb8f029-7a0b-4fe4-bb6f-005cdc0b1063 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Refreshing instance network info cache due to event network-changed-69480967-0a46-4a33-95ec-9633390ec042. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:30:04 compute-0 nova_compute[192810]: 2025-09-30 21:30:04.555 2 DEBUG oslo_concurrency.lockutils [req-ea21fc4b-d278-4a87-bf01-f8f0e4f4698e req-ebb8f029-7a0b-4fe4-bb6f-005cdc0b1063 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-3d704b04-4399-45ea-bc89-6cf74b4dec72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:30:05 compute-0 nova_compute[192810]: 2025-09-30 21:30:05.597 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.436 2 DEBUG nova.network.neutron [None req-6c4d314b-eafa-4012-ac4d-cf348dff93af 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Updating instance_info_cache with network_info: [{"id": "69480967-0a46-4a33-95ec-9633390ec042", "address": "fa:16:3e:d1:d8:eb", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69480967-0a", "ovs_interfaceid": "69480967-0a46-4a33-95ec-9633390ec042", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.460 2 DEBUG oslo_concurrency.lockutils [None req-6c4d314b-eafa-4012-ac4d-cf348dff93af 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Releasing lock "refresh_cache-3d704b04-4399-45ea-bc89-6cf74b4dec72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.461 2 DEBUG nova.virt.libvirt.driver [None req-6c4d314b-eafa-4012-ac4d-cf348dff93af 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Starting finish_revert_migration finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11843
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.465 2 DEBUG oslo_concurrency.lockutils [req-ea21fc4b-d278-4a87-bf01-f8f0e4f4698e req-ebb8f029-7a0b-4fe4-bb6f-005cdc0b1063 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-3d704b04-4399-45ea-bc89-6cf74b4dec72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.466 2 DEBUG nova.network.neutron [req-ea21fc4b-d278-4a87-bf01-f8f0e4f4698e req-ebb8f029-7a0b-4fe4-bb6f-005cdc0b1063 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Refreshing network info cache for port 69480967-0a46-4a33-95ec-9633390ec042 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.476 2 DEBUG nova.virt.libvirt.driver [None req-6c4d314b-eafa-4012-ac4d-cf348dff93af 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Start _get_guest_xml network_info=[{"id": "69480967-0a46-4a33-95ec-9633390ec042", "address": "fa:16:3e:d1:d8:eb", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69480967-0a", "ovs_interfaceid": "69480967-0a46-4a33-95ec-9633390ec042", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.483 2 WARNING nova.virt.libvirt.driver [None req-6c4d314b-eafa-4012-ac4d-cf348dff93af 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.494 2 DEBUG nova.virt.libvirt.host [None req-6c4d314b-eafa-4012-ac4d-cf348dff93af 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.495 2 DEBUG nova.virt.libvirt.host [None req-6c4d314b-eafa-4012-ac4d-cf348dff93af 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.540 2 DEBUG nova.virt.libvirt.host [None req-6c4d314b-eafa-4012-ac4d-cf348dff93af 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.541 2 DEBUG nova.virt.libvirt.host [None req-6c4d314b-eafa-4012-ac4d-cf348dff93af 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.543 2 DEBUG nova.virt.libvirt.driver [None req-6c4d314b-eafa-4012-ac4d-cf348dff93af 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.544 2 DEBUG nova.virt.hardware [None req-6c4d314b-eafa-4012-ac4d-cf348dff93af 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.544 2 DEBUG nova.virt.hardware [None req-6c4d314b-eafa-4012-ac4d-cf348dff93af 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.545 2 DEBUG nova.virt.hardware [None req-6c4d314b-eafa-4012-ac4d-cf348dff93af 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.546 2 DEBUG nova.virt.hardware [None req-6c4d314b-eafa-4012-ac4d-cf348dff93af 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.546 2 DEBUG nova.virt.hardware [None req-6c4d314b-eafa-4012-ac4d-cf348dff93af 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.546 2 DEBUG nova.virt.hardware [None req-6c4d314b-eafa-4012-ac4d-cf348dff93af 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.547 2 DEBUG nova.virt.hardware [None req-6c4d314b-eafa-4012-ac4d-cf348dff93af 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.548 2 DEBUG nova.virt.hardware [None req-6c4d314b-eafa-4012-ac4d-cf348dff93af 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.548 2 DEBUG nova.virt.hardware [None req-6c4d314b-eafa-4012-ac4d-cf348dff93af 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.549 2 DEBUG nova.virt.hardware [None req-6c4d314b-eafa-4012-ac4d-cf348dff93af 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.549 2 DEBUG nova.virt.hardware [None req-6c4d314b-eafa-4012-ac4d-cf348dff93af 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.550 2 DEBUG nova.objects.instance [None req-6c4d314b-eafa-4012-ac4d-cf348dff93af 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 3d704b04-4399-45ea-bc89-6cf74b4dec72 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.569 2 DEBUG oslo_concurrency.processutils [None req-6c4d314b-eafa-4012-ac4d-cf348dff93af 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.664 2 DEBUG oslo_concurrency.processutils [None req-6c4d314b-eafa-4012-ac4d-cf348dff93af 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72/disk.config --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.666 2 DEBUG oslo_concurrency.lockutils [None req-6c4d314b-eafa-4012-ac4d-cf348dff93af 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "/var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.666 2 DEBUG oslo_concurrency.lockutils [None req-6c4d314b-eafa-4012-ac4d-cf348dff93af 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "/var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.667 2 DEBUG oslo_concurrency.lockutils [None req-6c4d314b-eafa-4012-ac4d-cf348dff93af 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "/var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.669 2 DEBUG nova.virt.libvirt.vif [None req-6c4d314b-eafa-4012-ac4d-cf348dff93af 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-09-30T21:29:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-22148059',display_name='tempest-ServerActionsTestJSON-server-22148059',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-22148059',id=73,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBADj3eZ6JfWn1sD61WsBF2lWMwpE7XLjMHeX5D51ZTuvFj593BvRFZjp02OuEwvTUJEH79lLLcgJlYP5+6PE14q16iBV+2oZvdFvdVW4CAPM3S7plfjHeuzOdoE0D4V+KA==',key_name='tempest-keypair-557988176',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:29:56Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='2af578a858a44374a3dc027bbf7c69f2',ramdisk_id='',reservation_id='r-617x3lpp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1867667353',owner_user_name='tempest-ServerActionsTestJSON-1867667353-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:30:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='22ed16bd4ffe4ef8bb21968a857066a1',uuid=3d704b04-4399-45ea-bc89-6cf74b4dec72,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "69480967-0a46-4a33-95ec-9633390ec042", "address": "fa:16:3e:d1:d8:eb", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69480967-0a", "ovs_interfaceid": "69480967-0a46-4a33-95ec-9633390ec042", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.669 2 DEBUG nova.network.os_vif_util [None req-6c4d314b-eafa-4012-ac4d-cf348dff93af 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converting VIF {"id": "69480967-0a46-4a33-95ec-9633390ec042", "address": "fa:16:3e:d1:d8:eb", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69480967-0a", "ovs_interfaceid": "69480967-0a46-4a33-95ec-9633390ec042", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.670 2 DEBUG nova.network.os_vif_util [None req-6c4d314b-eafa-4012-ac4d-cf348dff93af 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:d8:eb,bridge_name='br-int',has_traffic_filtering=True,id=69480967-0a46-4a33-95ec-9633390ec042,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69480967-0a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.673 2 DEBUG nova.virt.libvirt.driver [None req-6c4d314b-eafa-4012-ac4d-cf348dff93af 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:30:06 compute-0 nova_compute[192810]:   <uuid>3d704b04-4399-45ea-bc89-6cf74b4dec72</uuid>
Sep 30 21:30:06 compute-0 nova_compute[192810]:   <name>instance-00000049</name>
Sep 30 21:30:06 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:30:06 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:30:06 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:30:06 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:       <nova:name>tempest-ServerActionsTestJSON-server-22148059</nova:name>
Sep 30 21:30:06 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:30:06</nova:creationTime>
Sep 30 21:30:06 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:30:06 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:30:06 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:30:06 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:30:06 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:30:06 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:30:06 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:30:06 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:30:06 compute-0 nova_compute[192810]:         <nova:user uuid="22ed16bd4ffe4ef8bb21968a857066a1">tempest-ServerActionsTestJSON-1867667353-project-member</nova:user>
Sep 30 21:30:06 compute-0 nova_compute[192810]:         <nova:project uuid="2af578a858a44374a3dc027bbf7c69f2">tempest-ServerActionsTestJSON-1867667353</nova:project>
Sep 30 21:30:06 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:30:06 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:30:06 compute-0 nova_compute[192810]:         <nova:port uuid="69480967-0a46-4a33-95ec-9633390ec042">
Sep 30 21:30:06 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:30:06 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:30:06 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:30:06 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:30:06 compute-0 nova_compute[192810]:     <system>
Sep 30 21:30:06 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:30:06 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:30:06 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:30:06 compute-0 nova_compute[192810]:       <entry name="serial">3d704b04-4399-45ea-bc89-6cf74b4dec72</entry>
Sep 30 21:30:06 compute-0 nova_compute[192810]:       <entry name="uuid">3d704b04-4399-45ea-bc89-6cf74b4dec72</entry>
Sep 30 21:30:06 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     </system>
Sep 30 21:30:06 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:30:06 compute-0 nova_compute[192810]:   <os>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:   </os>
Sep 30 21:30:06 compute-0 nova_compute[192810]:   <features>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:   </features>
Sep 30 21:30:06 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:30:06 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:30:06 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:30:06 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:30:06 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:30:06 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72/disk"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:30:06 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72/disk.config"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:30:06 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:d1:d8:eb"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:       <target dev="tap69480967-0a"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:30:06 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72/console.log" append="off"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     <video>
Sep 30 21:30:06 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     </video>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     <input type="keyboard" bus="usb"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:30:06 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:30:06 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:30:06 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:30:06 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:30:06 compute-0 nova_compute[192810]: </domain>
Sep 30 21:30:06 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.674 2 DEBUG nova.compute.manager [None req-6c4d314b-eafa-4012-ac4d-cf348dff93af 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Preparing to wait for external event network-vif-plugged-69480967-0a46-4a33-95ec-9633390ec042 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.675 2 DEBUG oslo_concurrency.lockutils [None req-6c4d314b-eafa-4012-ac4d-cf348dff93af 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "3d704b04-4399-45ea-bc89-6cf74b4dec72-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.675 2 DEBUG oslo_concurrency.lockutils [None req-6c4d314b-eafa-4012-ac4d-cf348dff93af 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "3d704b04-4399-45ea-bc89-6cf74b4dec72-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.675 2 DEBUG oslo_concurrency.lockutils [None req-6c4d314b-eafa-4012-ac4d-cf348dff93af 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "3d704b04-4399-45ea-bc89-6cf74b4dec72-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.676 2 DEBUG nova.virt.libvirt.vif [None req-6c4d314b-eafa-4012-ac4d-cf348dff93af 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-09-30T21:29:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-22148059',display_name='tempest-ServerActionsTestJSON-server-22148059',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-22148059',id=73,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBADj3eZ6JfWn1sD61WsBF2lWMwpE7XLjMHeX5D51ZTuvFj593BvRFZjp02OuEwvTUJEH79lLLcgJlYP5+6PE14q16iBV+2oZvdFvdVW4CAPM3S7plfjHeuzOdoE0D4V+KA==',key_name='tempest-keypair-557988176',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:29:56Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='2af578a858a44374a3dc027bbf7c69f2',ramdisk_id='',reservation_id='r-617x3lpp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1867667353',owner_user_name='tempest-ServerActionsTestJSON-1867667353-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:30:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='22ed16bd4ffe4ef8bb21968a857066a1',uuid=3d704b04-4399-45ea-bc89-6cf74b4dec72,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "69480967-0a46-4a33-95ec-9633390ec042", "address": "fa:16:3e:d1:d8:eb", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69480967-0a", "ovs_interfaceid": "69480967-0a46-4a33-95ec-9633390ec042", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.676 2 DEBUG nova.network.os_vif_util [None req-6c4d314b-eafa-4012-ac4d-cf348dff93af 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converting VIF {"id": "69480967-0a46-4a33-95ec-9633390ec042", "address": "fa:16:3e:d1:d8:eb", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69480967-0a", "ovs_interfaceid": "69480967-0a46-4a33-95ec-9633390ec042", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.677 2 DEBUG nova.network.os_vif_util [None req-6c4d314b-eafa-4012-ac4d-cf348dff93af 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:d8:eb,bridge_name='br-int',has_traffic_filtering=True,id=69480967-0a46-4a33-95ec-9633390ec042,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69480967-0a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.677 2 DEBUG os_vif [None req-6c4d314b-eafa-4012-ac4d-cf348dff93af 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:d8:eb,bridge_name='br-int',has_traffic_filtering=True,id=69480967-0a46-4a33-95ec-9633390ec042,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69480967-0a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.678 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.679 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.679 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.681 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.681 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap69480967-0a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.682 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap69480967-0a, col_values=(('external_ids', {'iface-id': '69480967-0a46-4a33-95ec-9633390ec042', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d1:d8:eb', 'vm-uuid': '3d704b04-4399-45ea-bc89-6cf74b4dec72'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.683 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:06 compute-0 NetworkManager[51733]: <info>  [1759267806.6861] manager: (tap69480967-0a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/129)
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.686 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.692 2 INFO os_vif [None req-6c4d314b-eafa-4012-ac4d-cf348dff93af 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:d8:eb,bridge_name='br-int',has_traffic_filtering=True,id=69480967-0a46-4a33-95ec-9633390ec042,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69480967-0a')
Sep 30 21:30:06 compute-0 kernel: tap69480967-0a: entered promiscuous mode
Sep 30 21:30:06 compute-0 NetworkManager[51733]: <info>  [1759267806.7587] manager: (tap69480967-0a): new Tun device (/org/freedesktop/NetworkManager/Devices/130)
Sep 30 21:30:06 compute-0 ovn_controller[94912]: 2025-09-30T21:30:06Z|00280|binding|INFO|Claiming lport 69480967-0a46-4a33-95ec-9633390ec042 for this chassis.
Sep 30 21:30:06 compute-0 ovn_controller[94912]: 2025-09-30T21:30:06Z|00281|binding|INFO|69480967-0a46-4a33-95ec-9633390ec042: Claiming fa:16:3e:d1:d8:eb 10.100.0.14
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.761 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:06 compute-0 NetworkManager[51733]: <info>  [1759267806.7716] manager: (patch-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/131)
Sep 30 21:30:06 compute-0 NetworkManager[51733]: <info>  [1759267806.7722] manager: (patch-br-int-to-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/132)
Sep 30 21:30:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:06.772 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d1:d8:eb 10.100.0.14'], port_security=['fa:16:3e:d1:d8:eb 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '3d704b04-4399-45ea-bc89-6cf74b4dec72', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9692dd1-658f-4c07-943c-6bc662046dc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2af578a858a44374a3dc027bbf7c69f2', 'neutron:revision_number': '10', 'neutron:security_group_ids': '5518a7d3-faed-4617-b7cb-cfdf96df8ee0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.177'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a290e6b7-09a2-435f-ae19-df4a5ccfc2d7, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=69480967-0a46-4a33-95ec-9633390ec042) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:30:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:06.773 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 69480967-0a46-4a33-95ec-9633390ec042 in datapath f9692dd1-658f-4c07-943c-6bc662046dc4 bound to our chassis
Sep 30 21:30:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:06.774 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f9692dd1-658f-4c07-943c-6bc662046dc4
Sep 30 21:30:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:06.784 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[ff449ffc-a2c8-428c-aeef-8e885c0ca39c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:06.785 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf9692dd1-61 in ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:30:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:06.787 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf9692dd1-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:30:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:06.787 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[91d95138-567e-4ecb-8d1d-7b087d7d22fa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:06.787 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[6dfe1c3d-5248-4879-a9cb-20c9801de432]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:06 compute-0 systemd-machined[152794]: New machine qemu-36-instance-00000049.
Sep 30 21:30:06 compute-0 systemd-udevd[231446]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:30:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:06.797 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[4992602b-6e72-4127-bfbe-bb885a5ac32c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:06 compute-0 NetworkManager[51733]: <info>  [1759267806.8113] device (tap69480967-0a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:30:06 compute-0 NetworkManager[51733]: <info>  [1759267806.8122] device (tap69480967-0a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:30:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:06.830 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[5242dc33-0abc-4964-8514-6054f7cbbf4f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:06.855 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[085934b2-fb25-418c-9d16-b2f536a6e28c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:06.874 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[1f6bde77-3957-4982-b885-1fbbb6fdcebb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:06 compute-0 systemd[1]: Started Virtual Machine qemu-36-instance-00000049.
Sep 30 21:30:06 compute-0 NetworkManager[51733]: <info>  [1759267806.8756] manager: (tapf9692dd1-60): new Veth device (/org/freedesktop/NetworkManager/Devices/133)
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.884 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:06.901 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[7f3f4ade-8928-4cd7-a057-2e495d7ac1c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:06 compute-0 ovn_controller[94912]: 2025-09-30T21:30:06Z|00282|binding|INFO|Setting lport 69480967-0a46-4a33-95ec-9633390ec042 ovn-installed in OVS
Sep 30 21:30:06 compute-0 ovn_controller[94912]: 2025-09-30T21:30:06Z|00283|binding|INFO|Setting lport 69480967-0a46-4a33-95ec-9633390ec042 up in Southbound
Sep 30 21:30:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:06.903 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[9be5907b-b801-487b-9176-338198b089f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:06 compute-0 nova_compute[192810]: 2025-09-30 21:30:06.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:06 compute-0 NetworkManager[51733]: <info>  [1759267806.9254] device (tapf9692dd1-60): carrier: link connected
Sep 30 21:30:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:06.931 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[28401ddc-4e9e-461c-a36a-5d63985d8339]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:06.946 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e6000aa1-c015-4dab-96c4-caf28611d8ba]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9692dd1-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:78:70'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 83], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 450254, 'reachable_time': 18315, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231476, 'error': None, 'target': 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:06.965 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[029daa94-1952-4d0b-82b0-7a1245a425a7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed1:7870'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 450254, 'tstamp': 450254}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231478, 'error': None, 'target': 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:06.982 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e8670a91-3452-424f-be89-e80beeb6ece4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9692dd1-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:78:70'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 83], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 450254, 'reachable_time': 18315, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231479, 'error': None, 'target': 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:07.014 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[b23f85f4-b392-4aee-8cf7-44320d39cb40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:07.076 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[1ddf4770-bcb9-4410-b8cf-e2701144d387]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:07.077 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9692dd1-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:30:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:07.077 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:30:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:07.078 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf9692dd1-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:30:07 compute-0 kernel: tapf9692dd1-60: entered promiscuous mode
Sep 30 21:30:07 compute-0 nova_compute[192810]: 2025-09-30 21:30:07.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:07 compute-0 NetworkManager[51733]: <info>  [1759267807.0817] manager: (tapf9692dd1-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/134)
Sep 30 21:30:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:07.084 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf9692dd1-60, col_values=(('external_ids', {'iface-id': 'a71d0422-57d0-42fa-887d-fdcb57295fce'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:30:07 compute-0 ovn_controller[94912]: 2025-09-30T21:30:07Z|00284|binding|INFO|Releasing lport a71d0422-57d0-42fa-887d-fdcb57295fce from this chassis (sb_readonly=0)
Sep 30 21:30:07 compute-0 nova_compute[192810]: 2025-09-30 21:30:07.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:07.087 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f9692dd1-658f-4c07-943c-6bc662046dc4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f9692dd1-658f-4c07-943c-6bc662046dc4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:30:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:07.088 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[9eccaa9a-be09-481c-8d4c-b9f1b599c4a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:07.089 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:30:07 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:30:07 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:30:07 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-f9692dd1-658f-4c07-943c-6bc662046dc4
Sep 30 21:30:07 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:30:07 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:30:07 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:30:07 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/f9692dd1-658f-4c07-943c-6bc662046dc4.pid.haproxy
Sep 30 21:30:07 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:30:07 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:30:07 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:30:07 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:30:07 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:30:07 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:30:07 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:30:07 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:30:07 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:30:07 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:30:07 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:30:07 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:30:07 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:30:07 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:30:07 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:30:07 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:30:07 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:30:07 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:30:07 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:30:07 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:30:07 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID f9692dd1-658f-4c07-943c-6bc662046dc4
Sep 30 21:30:07 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:30:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:07.089 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'env', 'PROCESS_TAG=haproxy-f9692dd1-658f-4c07-943c-6bc662046dc4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f9692dd1-658f-4c07-943c-6bc662046dc4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:30:07 compute-0 nova_compute[192810]: 2025-09-30 21:30:07.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:07 compute-0 nova_compute[192810]: 2025-09-30 21:30:07.158 2 DEBUG nova.compute.manager [req-9bf6e126-2508-4377-b29f-408bf6269193 req-1d2678c9-1493-4623-bba8-eaa7b7cd1e26 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Received event network-vif-plugged-69480967-0a46-4a33-95ec-9633390ec042 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:30:07 compute-0 nova_compute[192810]: 2025-09-30 21:30:07.159 2 DEBUG oslo_concurrency.lockutils [req-9bf6e126-2508-4377-b29f-408bf6269193 req-1d2678c9-1493-4623-bba8-eaa7b7cd1e26 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3d704b04-4399-45ea-bc89-6cf74b4dec72-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:07 compute-0 nova_compute[192810]: 2025-09-30 21:30:07.159 2 DEBUG oslo_concurrency.lockutils [req-9bf6e126-2508-4377-b29f-408bf6269193 req-1d2678c9-1493-4623-bba8-eaa7b7cd1e26 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3d704b04-4399-45ea-bc89-6cf74b4dec72-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:07 compute-0 nova_compute[192810]: 2025-09-30 21:30:07.159 2 DEBUG oslo_concurrency.lockutils [req-9bf6e126-2508-4377-b29f-408bf6269193 req-1d2678c9-1493-4623-bba8-eaa7b7cd1e26 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3d704b04-4399-45ea-bc89-6cf74b4dec72-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:07 compute-0 nova_compute[192810]: 2025-09-30 21:30:07.160 2 DEBUG nova.compute.manager [req-9bf6e126-2508-4377-b29f-408bf6269193 req-1d2678c9-1493-4623-bba8-eaa7b7cd1e26 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Processing event network-vif-plugged-69480967-0a46-4a33-95ec-9633390ec042 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:30:07 compute-0 nova_compute[192810]: 2025-09-30 21:30:07.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:07 compute-0 podman[231518]: 2025-09-30 21:30:07.433252841 +0000 UTC m=+0.046900706 container create a7f6acbaab1e4178cdaa3f2ad94dc785dde71b593b452cf731649fb3ca803cd3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Sep 30 21:30:07 compute-0 systemd[1]: Started libpod-conmon-a7f6acbaab1e4178cdaa3f2ad94dc785dde71b593b452cf731649fb3ca803cd3.scope.
Sep 30 21:30:07 compute-0 podman[231518]: 2025-09-30 21:30:07.410159748 +0000 UTC m=+0.023807633 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:30:07 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:30:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c14a4eae6649a66cc190d15f0f725a7365f54967cfcf0a67af3469b1e0f683fd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:30:07 compute-0 podman[231518]: 2025-09-30 21:30:07.522364344 +0000 UTC m=+0.136012219 container init a7f6acbaab1e4178cdaa3f2ad94dc785dde71b593b452cf731649fb3ca803cd3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS)
Sep 30 21:30:07 compute-0 podman[231518]: 2025-09-30 21:30:07.527129174 +0000 UTC m=+0.140777039 container start a7f6acbaab1e4178cdaa3f2ad94dc785dde71b593b452cf731649fb3ca803cd3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2)
Sep 30 21:30:07 compute-0 nova_compute[192810]: 2025-09-30 21:30:07.543 2 DEBUG nova.compute.manager [None req-6c4d314b-eafa-4012-ac4d-cf348dff93af 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:30:07 compute-0 nova_compute[192810]: 2025-09-30 21:30:07.544 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267807.5430722, 3d704b04-4399-45ea-bc89-6cf74b4dec72 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:30:07 compute-0 nova_compute[192810]: 2025-09-30 21:30:07.545 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] VM Started (Lifecycle Event)
Sep 30 21:30:07 compute-0 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[231534]: [NOTICE]   (231538) : New worker (231540) forked
Sep 30 21:30:07 compute-0 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[231534]: [NOTICE]   (231538) : Loading success.
Sep 30 21:30:07 compute-0 nova_compute[192810]: 2025-09-30 21:30:07.554 2 INFO nova.virt.libvirt.driver [-] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Instance running successfully.
Sep 30 21:30:07 compute-0 nova_compute[192810]: 2025-09-30 21:30:07.555 2 DEBUG nova.virt.libvirt.driver [None req-6c4d314b-eafa-4012-ac4d-cf348dff93af 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] finish_revert_migration finished successfully. finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11887
Sep 30 21:30:07 compute-0 nova_compute[192810]: 2025-09-30 21:30:07.562 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:30:07 compute-0 nova_compute[192810]: 2025-09-30 21:30:07.566 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Synchronizing instance power state after lifecycle event "Started"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:30:07 compute-0 nova_compute[192810]: 2025-09-30 21:30:07.584 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] During sync_power_state the instance has a pending task (resize_reverting). Skip.
Sep 30 21:30:07 compute-0 nova_compute[192810]: 2025-09-30 21:30:07.585 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267807.5438774, 3d704b04-4399-45ea-bc89-6cf74b4dec72 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:30:07 compute-0 nova_compute[192810]: 2025-09-30 21:30:07.585 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] VM Paused (Lifecycle Event)
Sep 30 21:30:07 compute-0 nova_compute[192810]: 2025-09-30 21:30:07.624 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:30:07 compute-0 nova_compute[192810]: 2025-09-30 21:30:07.628 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267807.5477018, 3d704b04-4399-45ea-bc89-6cf74b4dec72 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:30:07 compute-0 nova_compute[192810]: 2025-09-30 21:30:07.628 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] VM Resumed (Lifecycle Event)
Sep 30 21:30:07 compute-0 nova_compute[192810]: 2025-09-30 21:30:07.657 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:30:07 compute-0 nova_compute[192810]: 2025-09-30 21:30:07.661 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:30:07 compute-0 nova_compute[192810]: 2025-09-30 21:30:07.667 2 INFO nova.compute.manager [None req-6c4d314b-eafa-4012-ac4d-cf348dff93af 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Updating instance to original state: 'active'
Sep 30 21:30:07 compute-0 nova_compute[192810]: 2025-09-30 21:30:07.707 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] During sync_power_state the instance has a pending task (resize_reverting). Skip.
Sep 30 21:30:08 compute-0 nova_compute[192810]: 2025-09-30 21:30:08.484 2 DEBUG nova.network.neutron [req-ea21fc4b-d278-4a87-bf01-f8f0e4f4698e req-ebb8f029-7a0b-4fe4-bb6f-005cdc0b1063 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Updated VIF entry in instance network info cache for port 69480967-0a46-4a33-95ec-9633390ec042. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:30:08 compute-0 nova_compute[192810]: 2025-09-30 21:30:08.485 2 DEBUG nova.network.neutron [req-ea21fc4b-d278-4a87-bf01-f8f0e4f4698e req-ebb8f029-7a0b-4fe4-bb6f-005cdc0b1063 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Updating instance_info_cache with network_info: [{"id": "69480967-0a46-4a33-95ec-9633390ec042", "address": "fa:16:3e:d1:d8:eb", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69480967-0a", "ovs_interfaceid": "69480967-0a46-4a33-95ec-9633390ec042", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:30:08 compute-0 nova_compute[192810]: 2025-09-30 21:30:08.624 2 DEBUG oslo_concurrency.lockutils [req-ea21fc4b-d278-4a87-bf01-f8f0e4f4698e req-ebb8f029-7a0b-4fe4-bb6f-005cdc0b1063 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-3d704b04-4399-45ea-bc89-6cf74b4dec72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:30:09 compute-0 nova_compute[192810]: 2025-09-30 21:30:09.341 2 DEBUG nova.compute.manager [req-6bc33ce8-5ae3-4d5a-99ca-f54de2a79431 req-ab233fe7-b0c8-4fbe-a6d7-146e8d5fa9e0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Received event network-vif-plugged-69480967-0a46-4a33-95ec-9633390ec042 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:30:09 compute-0 nova_compute[192810]: 2025-09-30 21:30:09.342 2 DEBUG oslo_concurrency.lockutils [req-6bc33ce8-5ae3-4d5a-99ca-f54de2a79431 req-ab233fe7-b0c8-4fbe-a6d7-146e8d5fa9e0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3d704b04-4399-45ea-bc89-6cf74b4dec72-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:09 compute-0 nova_compute[192810]: 2025-09-30 21:30:09.343 2 DEBUG oslo_concurrency.lockutils [req-6bc33ce8-5ae3-4d5a-99ca-f54de2a79431 req-ab233fe7-b0c8-4fbe-a6d7-146e8d5fa9e0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3d704b04-4399-45ea-bc89-6cf74b4dec72-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:09 compute-0 nova_compute[192810]: 2025-09-30 21:30:09.343 2 DEBUG oslo_concurrency.lockutils [req-6bc33ce8-5ae3-4d5a-99ca-f54de2a79431 req-ab233fe7-b0c8-4fbe-a6d7-146e8d5fa9e0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3d704b04-4399-45ea-bc89-6cf74b4dec72-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:09 compute-0 nova_compute[192810]: 2025-09-30 21:30:09.343 2 DEBUG nova.compute.manager [req-6bc33ce8-5ae3-4d5a-99ca-f54de2a79431 req-ab233fe7-b0c8-4fbe-a6d7-146e8d5fa9e0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] No waiting events found dispatching network-vif-plugged-69480967-0a46-4a33-95ec-9633390ec042 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:30:09 compute-0 nova_compute[192810]: 2025-09-30 21:30:09.344 2 WARNING nova.compute.manager [req-6bc33ce8-5ae3-4d5a-99ca-f54de2a79431 req-ab233fe7-b0c8-4fbe-a6d7-146e8d5fa9e0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Received unexpected event network-vif-plugged-69480967-0a46-4a33-95ec-9633390ec042 for instance with vm_state active and task_state None.
Sep 30 21:30:09 compute-0 nova_compute[192810]: 2025-09-30 21:30:09.998 2 DEBUG oslo_concurrency.lockutils [None req-6990096d-80e0-42c5-ae17-ef6a5e884a6b 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "3d704b04-4399-45ea-bc89-6cf74b4dec72" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:09 compute-0 nova_compute[192810]: 2025-09-30 21:30:09.999 2 DEBUG oslo_concurrency.lockutils [None req-6990096d-80e0-42c5-ae17-ef6a5e884a6b 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "3d704b04-4399-45ea-bc89-6cf74b4dec72" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:10 compute-0 nova_compute[192810]: 2025-09-30 21:30:10.000 2 DEBUG oslo_concurrency.lockutils [None req-6990096d-80e0-42c5-ae17-ef6a5e884a6b 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "3d704b04-4399-45ea-bc89-6cf74b4dec72-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:10 compute-0 nova_compute[192810]: 2025-09-30 21:30:10.000 2 DEBUG oslo_concurrency.lockutils [None req-6990096d-80e0-42c5-ae17-ef6a5e884a6b 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "3d704b04-4399-45ea-bc89-6cf74b4dec72-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:10 compute-0 nova_compute[192810]: 2025-09-30 21:30:10.000 2 DEBUG oslo_concurrency.lockutils [None req-6990096d-80e0-42c5-ae17-ef6a5e884a6b 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "3d704b04-4399-45ea-bc89-6cf74b4dec72-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:10 compute-0 nova_compute[192810]: 2025-09-30 21:30:10.013 2 INFO nova.compute.manager [None req-6990096d-80e0-42c5-ae17-ef6a5e884a6b 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Terminating instance
Sep 30 21:30:10 compute-0 nova_compute[192810]: 2025-09-30 21:30:10.022 2 DEBUG nova.compute.manager [None req-6990096d-80e0-42c5-ae17-ef6a5e884a6b 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:30:10 compute-0 kernel: tap69480967-0a (unregistering): left promiscuous mode
Sep 30 21:30:10 compute-0 NetworkManager[51733]: <info>  [1759267810.0402] device (tap69480967-0a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:30:10 compute-0 ovn_controller[94912]: 2025-09-30T21:30:10Z|00285|binding|INFO|Releasing lport 69480967-0a46-4a33-95ec-9633390ec042 from this chassis (sb_readonly=0)
Sep 30 21:30:10 compute-0 nova_compute[192810]: 2025-09-30 21:30:10.052 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:10 compute-0 ovn_controller[94912]: 2025-09-30T21:30:10Z|00286|binding|INFO|Setting lport 69480967-0a46-4a33-95ec-9633390ec042 down in Southbound
Sep 30 21:30:10 compute-0 ovn_controller[94912]: 2025-09-30T21:30:10Z|00287|binding|INFO|Removing iface tap69480967-0a ovn-installed in OVS
Sep 30 21:30:10 compute-0 nova_compute[192810]: 2025-09-30 21:30:10.055 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:10.064 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d1:d8:eb 10.100.0.14'], port_security=['fa:16:3e:d1:d8:eb 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '3d704b04-4399-45ea-bc89-6cf74b4dec72', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9692dd1-658f-4c07-943c-6bc662046dc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2af578a858a44374a3dc027bbf7c69f2', 'neutron:revision_number': '12', 'neutron:security_group_ids': '5518a7d3-faed-4617-b7cb-cfdf96df8ee0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.177', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a290e6b7-09a2-435f-ae19-df4a5ccfc2d7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=69480967-0a46-4a33-95ec-9633390ec042) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:30:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:10.066 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 69480967-0a46-4a33-95ec-9633390ec042 in datapath f9692dd1-658f-4c07-943c-6bc662046dc4 unbound from our chassis
Sep 30 21:30:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:10.067 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f9692dd1-658f-4c07-943c-6bc662046dc4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:30:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:10.068 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[91e3d50a-0973-4c33-8559-90f7a8696a76]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:10.068 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 namespace which is not needed anymore
Sep 30 21:30:10 compute-0 nova_compute[192810]: 2025-09-30 21:30:10.071 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:10 compute-0 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d00000049.scope: Deactivated successfully.
Sep 30 21:30:10 compute-0 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d00000049.scope: Consumed 3.188s CPU time.
Sep 30 21:30:10 compute-0 systemd-machined[152794]: Machine qemu-36-instance-00000049 terminated.
Sep 30 21:30:10 compute-0 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[231534]: [NOTICE]   (231538) : haproxy version is 2.8.14-c23fe91
Sep 30 21:30:10 compute-0 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[231534]: [NOTICE]   (231538) : path to executable is /usr/sbin/haproxy
Sep 30 21:30:10 compute-0 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[231534]: [WARNING]  (231538) : Exiting Master process...
Sep 30 21:30:10 compute-0 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[231534]: [ALERT]    (231538) : Current worker (231540) exited with code 143 (Terminated)
Sep 30 21:30:10 compute-0 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[231534]: [WARNING]  (231538) : All workers exited. Exiting... (0)
Sep 30 21:30:10 compute-0 systemd[1]: libpod-a7f6acbaab1e4178cdaa3f2ad94dc785dde71b593b452cf731649fb3ca803cd3.scope: Deactivated successfully.
Sep 30 21:30:10 compute-0 podman[231574]: 2025-09-30 21:30:10.196735847 +0000 UTC m=+0.044935407 container died a7f6acbaab1e4178cdaa3f2ad94dc785dde71b593b452cf731649fb3ca803cd3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923)
Sep 30 21:30:10 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a7f6acbaab1e4178cdaa3f2ad94dc785dde71b593b452cf731649fb3ca803cd3-userdata-shm.mount: Deactivated successfully.
Sep 30 21:30:10 compute-0 systemd[1]: var-lib-containers-storage-overlay-c14a4eae6649a66cc190d15f0f725a7365f54967cfcf0a67af3469b1e0f683fd-merged.mount: Deactivated successfully.
Sep 30 21:30:10 compute-0 podman[231574]: 2025-09-30 21:30:10.245088389 +0000 UTC m=+0.093287949 container cleanup a7f6acbaab1e4178cdaa3f2ad94dc785dde71b593b452cf731649fb3ca803cd3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0)
Sep 30 21:30:10 compute-0 nova_compute[192810]: 2025-09-30 21:30:10.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:10 compute-0 systemd[1]: libpod-conmon-a7f6acbaab1e4178cdaa3f2ad94dc785dde71b593b452cf731649fb3ca803cd3.scope: Deactivated successfully.
Sep 30 21:30:10 compute-0 nova_compute[192810]: 2025-09-30 21:30:10.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:10 compute-0 nova_compute[192810]: 2025-09-30 21:30:10.286 2 INFO nova.virt.libvirt.driver [-] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Instance destroyed successfully.
Sep 30 21:30:10 compute-0 nova_compute[192810]: 2025-09-30 21:30:10.286 2 DEBUG nova.objects.instance [None req-6990096d-80e0-42c5-ae17-ef6a5e884a6b 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'resources' on Instance uuid 3d704b04-4399-45ea-bc89-6cf74b4dec72 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:30:10 compute-0 nova_compute[192810]: 2025-09-30 21:30:10.306 2 DEBUG nova.virt.libvirt.vif [None req-6990096d-80e0-42c5-ae17-ef6a5e884a6b 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-09-30T21:29:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-22148059',display_name='tempest-ServerActionsTestJSON-server-22148059',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-22148059',id=73,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBADj3eZ6JfWn1sD61WsBF2lWMwpE7XLjMHeX5D51ZTuvFj593BvRFZjp02OuEwvTUJEH79lLLcgJlYP5+6PE14q16iBV+2oZvdFvdVW4CAPM3S7plfjHeuzOdoE0D4V+KA==',key_name='tempest-keypair-557988176',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:30:07Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2af578a858a44374a3dc027bbf7c69f2',ramdisk_id='',reservation_id='r-617x3lpp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1867667353',owner_user_name='tempest-ServerActionsTestJSON-1867667353-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:30:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='22ed16bd4ffe4ef8bb21968a857066a1',uuid=3d704b04-4399-45ea-bc89-6cf74b4dec72,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "69480967-0a46-4a33-95ec-9633390ec042", "address": "fa:16:3e:d1:d8:eb", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69480967-0a", "ovs_interfaceid": "69480967-0a46-4a33-95ec-9633390ec042", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:30:10 compute-0 nova_compute[192810]: 2025-09-30 21:30:10.306 2 DEBUG nova.network.os_vif_util [None req-6990096d-80e0-42c5-ae17-ef6a5e884a6b 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converting VIF {"id": "69480967-0a46-4a33-95ec-9633390ec042", "address": "fa:16:3e:d1:d8:eb", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69480967-0a", "ovs_interfaceid": "69480967-0a46-4a33-95ec-9633390ec042", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:30:10 compute-0 nova_compute[192810]: 2025-09-30 21:30:10.307 2 DEBUG nova.network.os_vif_util [None req-6990096d-80e0-42c5-ae17-ef6a5e884a6b 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:d8:eb,bridge_name='br-int',has_traffic_filtering=True,id=69480967-0a46-4a33-95ec-9633390ec042,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69480967-0a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:30:10 compute-0 podman[231609]: 2025-09-30 21:30:10.310750018 +0000 UTC m=+0.039445708 container remove a7f6acbaab1e4178cdaa3f2ad94dc785dde71b593b452cf731649fb3ca803cd3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 21:30:10 compute-0 nova_compute[192810]: 2025-09-30 21:30:10.307 2 DEBUG os_vif [None req-6990096d-80e0-42c5-ae17-ef6a5e884a6b 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:d8:eb,bridge_name='br-int',has_traffic_filtering=True,id=69480967-0a46-4a33-95ec-9633390ec042,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69480967-0a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:30:10 compute-0 nova_compute[192810]: 2025-09-30 21:30:10.309 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:10 compute-0 nova_compute[192810]: 2025-09-30 21:30:10.309 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap69480967-0a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:30:10 compute-0 nova_compute[192810]: 2025-09-30 21:30:10.311 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:10 compute-0 nova_compute[192810]: 2025-09-30 21:30:10.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:30:10 compute-0 nova_compute[192810]: 2025-09-30 21:30:10.316 2 INFO os_vif [None req-6990096d-80e0-42c5-ae17-ef6a5e884a6b 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:d8:eb,bridge_name='br-int',has_traffic_filtering=True,id=69480967-0a46-4a33-95ec-9633390ec042,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69480967-0a')
Sep 30 21:30:10 compute-0 nova_compute[192810]: 2025-09-30 21:30:10.317 2 INFO nova.virt.libvirt.driver [None req-6990096d-80e0-42c5-ae17-ef6a5e884a6b 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Deleting instance files /var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72_del
Sep 30 21:30:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:10.319 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[dbcbb54d-4674-420a-854d-22e07b270a7d]: (4, ('Tue Sep 30 09:30:10 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 (a7f6acbaab1e4178cdaa3f2ad94dc785dde71b593b452cf731649fb3ca803cd3)\na7f6acbaab1e4178cdaa3f2ad94dc785dde71b593b452cf731649fb3ca803cd3\nTue Sep 30 09:30:10 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 (a7f6acbaab1e4178cdaa3f2ad94dc785dde71b593b452cf731649fb3ca803cd3)\na7f6acbaab1e4178cdaa3f2ad94dc785dde71b593b452cf731649fb3ca803cd3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:10.320 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[b6a958c7-164d-40fd-97da-3d42e459cb36]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:10.321 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9692dd1-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:30:10 compute-0 nova_compute[192810]: 2025-09-30 21:30:10.323 2 INFO nova.virt.libvirt.driver [None req-6990096d-80e0-42c5-ae17-ef6a5e884a6b 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Deletion of /var/lib/nova/instances/3d704b04-4399-45ea-bc89-6cf74b4dec72_del complete
Sep 30 21:30:10 compute-0 kernel: tapf9692dd1-60: left promiscuous mode
Sep 30 21:30:10 compute-0 nova_compute[192810]: 2025-09-30 21:30:10.327 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:10 compute-0 nova_compute[192810]: 2025-09-30 21:30:10.335 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:10.337 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[aadb7a70-97e1-4aa0-9b48-94c98b1a0dae]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:10.363 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[56085846-a907-4781-b5fc-59408d355a4c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:10.364 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[0ae88aaa-b040-4f64-90b9-ec823f3f60dc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:10.385 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[2c79d083-9d3e-4197-a8fb-6289f3deb2a8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 450247, 'reachable_time': 42611, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231635, 'error': None, 'target': 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:10.388 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:30:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:10.388 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[aff519a6-f210-431c-969e-99259045e56b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:10 compute-0 systemd[1]: run-netns-ovnmeta\x2df9692dd1\x2d658f\x2d4c07\x2d943c\x2d6bc662046dc4.mount: Deactivated successfully.
Sep 30 21:30:10 compute-0 nova_compute[192810]: 2025-09-30 21:30:10.436 2 INFO nova.compute.manager [None req-6990096d-80e0-42c5-ae17-ef6a5e884a6b 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Took 0.41 seconds to destroy the instance on the hypervisor.
Sep 30 21:30:10 compute-0 nova_compute[192810]: 2025-09-30 21:30:10.437 2 DEBUG oslo.service.loopingcall [None req-6990096d-80e0-42c5-ae17-ef6a5e884a6b 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:30:10 compute-0 nova_compute[192810]: 2025-09-30 21:30:10.438 2 DEBUG nova.compute.manager [-] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:30:10 compute-0 nova_compute[192810]: 2025-09-30 21:30:10.438 2 DEBUG nova.network.neutron [-] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:30:10 compute-0 nova_compute[192810]: 2025-09-30 21:30:10.567 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267795.5660326, 326f57c1-51b5-4fdb-ac31-2d754684b734 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:30:10 compute-0 nova_compute[192810]: 2025-09-30 21:30:10.567 2 INFO nova.compute.manager [-] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] VM Stopped (Lifecycle Event)
Sep 30 21:30:10 compute-0 nova_compute[192810]: 2025-09-30 21:30:10.586 2 DEBUG nova.compute.manager [None req-1d4486a9-cc4c-415b-a4c9-8216b47ede1b - - - - - -] [instance: 326f57c1-51b5-4fdb-ac31-2d754684b734] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:30:11 compute-0 nova_compute[192810]: 2025-09-30 21:30:11.368 2 DEBUG nova.network.neutron [-] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:30:11 compute-0 nova_compute[192810]: 2025-09-30 21:30:11.394 2 INFO nova.compute.manager [-] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Took 0.96 seconds to deallocate network for instance.
Sep 30 21:30:11 compute-0 nova_compute[192810]: 2025-09-30 21:30:11.469 2 DEBUG nova.compute.manager [req-b6a0b7b2-b8ac-46cc-9c97-89189ee36603 req-cde5fbe2-77d9-45f5-84f6-9e4c3345f0b6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Received event network-vif-unplugged-69480967-0a46-4a33-95ec-9633390ec042 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:30:11 compute-0 nova_compute[192810]: 2025-09-30 21:30:11.470 2 DEBUG oslo_concurrency.lockutils [req-b6a0b7b2-b8ac-46cc-9c97-89189ee36603 req-cde5fbe2-77d9-45f5-84f6-9e4c3345f0b6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3d704b04-4399-45ea-bc89-6cf74b4dec72-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:11 compute-0 nova_compute[192810]: 2025-09-30 21:30:11.470 2 DEBUG oslo_concurrency.lockutils [req-b6a0b7b2-b8ac-46cc-9c97-89189ee36603 req-cde5fbe2-77d9-45f5-84f6-9e4c3345f0b6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3d704b04-4399-45ea-bc89-6cf74b4dec72-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:11 compute-0 nova_compute[192810]: 2025-09-30 21:30:11.470 2 DEBUG oslo_concurrency.lockutils [req-b6a0b7b2-b8ac-46cc-9c97-89189ee36603 req-cde5fbe2-77d9-45f5-84f6-9e4c3345f0b6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3d704b04-4399-45ea-bc89-6cf74b4dec72-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:11 compute-0 nova_compute[192810]: 2025-09-30 21:30:11.470 2 DEBUG nova.compute.manager [req-b6a0b7b2-b8ac-46cc-9c97-89189ee36603 req-cde5fbe2-77d9-45f5-84f6-9e4c3345f0b6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] No waiting events found dispatching network-vif-unplugged-69480967-0a46-4a33-95ec-9633390ec042 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:30:11 compute-0 nova_compute[192810]: 2025-09-30 21:30:11.470 2 DEBUG nova.compute.manager [req-b6a0b7b2-b8ac-46cc-9c97-89189ee36603 req-cde5fbe2-77d9-45f5-84f6-9e4c3345f0b6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Received event network-vif-unplugged-69480967-0a46-4a33-95ec-9633390ec042 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:30:11 compute-0 nova_compute[192810]: 2025-09-30 21:30:11.471 2 DEBUG nova.compute.manager [req-b6a0b7b2-b8ac-46cc-9c97-89189ee36603 req-cde5fbe2-77d9-45f5-84f6-9e4c3345f0b6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Received event network-vif-plugged-69480967-0a46-4a33-95ec-9633390ec042 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:30:11 compute-0 nova_compute[192810]: 2025-09-30 21:30:11.471 2 DEBUG oslo_concurrency.lockutils [req-b6a0b7b2-b8ac-46cc-9c97-89189ee36603 req-cde5fbe2-77d9-45f5-84f6-9e4c3345f0b6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3d704b04-4399-45ea-bc89-6cf74b4dec72-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:11 compute-0 nova_compute[192810]: 2025-09-30 21:30:11.471 2 DEBUG oslo_concurrency.lockutils [req-b6a0b7b2-b8ac-46cc-9c97-89189ee36603 req-cde5fbe2-77d9-45f5-84f6-9e4c3345f0b6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3d704b04-4399-45ea-bc89-6cf74b4dec72-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:11 compute-0 nova_compute[192810]: 2025-09-30 21:30:11.471 2 DEBUG oslo_concurrency.lockutils [req-b6a0b7b2-b8ac-46cc-9c97-89189ee36603 req-cde5fbe2-77d9-45f5-84f6-9e4c3345f0b6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3d704b04-4399-45ea-bc89-6cf74b4dec72-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:11 compute-0 nova_compute[192810]: 2025-09-30 21:30:11.471 2 DEBUG nova.compute.manager [req-b6a0b7b2-b8ac-46cc-9c97-89189ee36603 req-cde5fbe2-77d9-45f5-84f6-9e4c3345f0b6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] No waiting events found dispatching network-vif-plugged-69480967-0a46-4a33-95ec-9633390ec042 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:30:11 compute-0 nova_compute[192810]: 2025-09-30 21:30:11.472 2 WARNING nova.compute.manager [req-b6a0b7b2-b8ac-46cc-9c97-89189ee36603 req-cde5fbe2-77d9-45f5-84f6-9e4c3345f0b6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Received unexpected event network-vif-plugged-69480967-0a46-4a33-95ec-9633390ec042 for instance with vm_state active and task_state deleting.
Sep 30 21:30:11 compute-0 nova_compute[192810]: 2025-09-30 21:30:11.485 2 DEBUG nova.compute.manager [req-720a5ca2-b9d0-46f1-96f0-7eacede91aff req-006af92b-1ff5-405b-a12d-90e851d515a2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Received event network-vif-deleted-69480967-0a46-4a33-95ec-9633390ec042 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:30:11 compute-0 nova_compute[192810]: 2025-09-30 21:30:11.488 2 DEBUG oslo_concurrency.lockutils [None req-6990096d-80e0-42c5-ae17-ef6a5e884a6b 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:11 compute-0 nova_compute[192810]: 2025-09-30 21:30:11.488 2 DEBUG oslo_concurrency.lockutils [None req-6990096d-80e0-42c5-ae17-ef6a5e884a6b 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:11 compute-0 nova_compute[192810]: 2025-09-30 21:30:11.547 2 DEBUG nova.compute.provider_tree [None req-6990096d-80e0-42c5-ae17-ef6a5e884a6b 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:30:11 compute-0 nova_compute[192810]: 2025-09-30 21:30:11.567 2 DEBUG nova.scheduler.client.report [None req-6990096d-80e0-42c5-ae17-ef6a5e884a6b 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:30:11 compute-0 nova_compute[192810]: 2025-09-30 21:30:11.591 2 DEBUG oslo_concurrency.lockutils [None req-6990096d-80e0-42c5-ae17-ef6a5e884a6b 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:11 compute-0 nova_compute[192810]: 2025-09-30 21:30:11.622 2 INFO nova.scheduler.client.report [None req-6990096d-80e0-42c5-ae17-ef6a5e884a6b 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Deleted allocations for instance 3d704b04-4399-45ea-bc89-6cf74b4dec72
Sep 30 21:30:11 compute-0 nova_compute[192810]: 2025-09-30 21:30:11.744 2 DEBUG oslo_concurrency.lockutils [None req-6990096d-80e0-42c5-ae17-ef6a5e884a6b 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "3d704b04-4399-45ea-bc89-6cf74b4dec72" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.745s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:12 compute-0 nova_compute[192810]: 2025-09-30 21:30:12.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:12 compute-0 nova_compute[192810]: 2025-09-30 21:30:12.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:15 compute-0 nova_compute[192810]: 2025-09-30 21:30:15.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:15 compute-0 podman[231636]: 2025-09-30 21:30:15.352485786 +0000 UTC m=+0.086088927 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:30:15 compute-0 podman[231637]: 2025-09-30 21:30:15.367491375 +0000 UTC m=+0.088002535 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20250923)
Sep 30 21:30:15 compute-0 nova_compute[192810]: 2025-09-30 21:30:15.798 2 DEBUG oslo_concurrency.lockutils [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:15 compute-0 nova_compute[192810]: 2025-09-30 21:30:15.798 2 DEBUG oslo_concurrency.lockutils [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:15 compute-0 nova_compute[192810]: 2025-09-30 21:30:15.816 2 DEBUG nova.compute.manager [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:30:15 compute-0 nova_compute[192810]: 2025-09-30 21:30:15.940 2 DEBUG oslo_concurrency.lockutils [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:15 compute-0 nova_compute[192810]: 2025-09-30 21:30:15.941 2 DEBUG oslo_concurrency.lockutils [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:15 compute-0 nova_compute[192810]: 2025-09-30 21:30:15.950 2 DEBUG nova.virt.hardware [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:30:15 compute-0 nova_compute[192810]: 2025-09-30 21:30:15.951 2 INFO nova.compute.claims [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:30:16 compute-0 nova_compute[192810]: 2025-09-30 21:30:16.135 2 DEBUG nova.compute.provider_tree [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:30:16 compute-0 nova_compute[192810]: 2025-09-30 21:30:16.150 2 DEBUG nova.scheduler.client.report [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:30:16 compute-0 nova_compute[192810]: 2025-09-30 21:30:16.186 2 DEBUG oslo_concurrency.lockutils [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.246s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:16 compute-0 nova_compute[192810]: 2025-09-30 21:30:16.187 2 DEBUG nova.compute.manager [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:30:16 compute-0 nova_compute[192810]: 2025-09-30 21:30:16.240 2 DEBUG nova.compute.manager [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:30:16 compute-0 nova_compute[192810]: 2025-09-30 21:30:16.241 2 DEBUG nova.network.neutron [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:30:16 compute-0 nova_compute[192810]: 2025-09-30 21:30:16.260 2 INFO nova.virt.libvirt.driver [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:30:16 compute-0 nova_compute[192810]: 2025-09-30 21:30:16.276 2 DEBUG nova.compute.manager [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:30:16 compute-0 nova_compute[192810]: 2025-09-30 21:30:16.432 2 DEBUG nova.compute.manager [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:30:16 compute-0 nova_compute[192810]: 2025-09-30 21:30:16.433 2 DEBUG nova.virt.libvirt.driver [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:30:16 compute-0 nova_compute[192810]: 2025-09-30 21:30:16.433 2 INFO nova.virt.libvirt.driver [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Creating image(s)
Sep 30 21:30:16 compute-0 nova_compute[192810]: 2025-09-30 21:30:16.433 2 DEBUG oslo_concurrency.lockutils [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "/var/lib/nova/instances/408b1a8f-ed4d-4d93-a98c-564af2bd678d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:16 compute-0 nova_compute[192810]: 2025-09-30 21:30:16.434 2 DEBUG oslo_concurrency.lockutils [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "/var/lib/nova/instances/408b1a8f-ed4d-4d93-a98c-564af2bd678d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:16 compute-0 nova_compute[192810]: 2025-09-30 21:30:16.434 2 DEBUG oslo_concurrency.lockutils [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "/var/lib/nova/instances/408b1a8f-ed4d-4d93-a98c-564af2bd678d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:16 compute-0 nova_compute[192810]: 2025-09-30 21:30:16.446 2 DEBUG oslo_concurrency.processutils [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:16 compute-0 nova_compute[192810]: 2025-09-30 21:30:16.492 2 DEBUG nova.policy [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2af578a858a44374a3dc027bbf7c69f2', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:30:16 compute-0 nova_compute[192810]: 2025-09-30 21:30:16.500 2 DEBUG oslo_concurrency.processutils [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:16 compute-0 nova_compute[192810]: 2025-09-30 21:30:16.500 2 DEBUG oslo_concurrency.lockutils [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:16 compute-0 nova_compute[192810]: 2025-09-30 21:30:16.501 2 DEBUG oslo_concurrency.lockutils [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:16 compute-0 nova_compute[192810]: 2025-09-30 21:30:16.516 2 DEBUG oslo_concurrency.processutils [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:16 compute-0 nova_compute[192810]: 2025-09-30 21:30:16.568 2 DEBUG oslo_concurrency.processutils [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:16 compute-0 nova_compute[192810]: 2025-09-30 21:30:16.569 2 DEBUG oslo_concurrency.processutils [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/408b1a8f-ed4d-4d93-a98c-564af2bd678d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:16 compute-0 nova_compute[192810]: 2025-09-30 21:30:16.605 2 DEBUG oslo_concurrency.processutils [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/408b1a8f-ed4d-4d93-a98c-564af2bd678d/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:16 compute-0 nova_compute[192810]: 2025-09-30 21:30:16.607 2 DEBUG oslo_concurrency.lockutils [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:16 compute-0 nova_compute[192810]: 2025-09-30 21:30:16.607 2 DEBUG oslo_concurrency.processutils [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:16 compute-0 nova_compute[192810]: 2025-09-30 21:30:16.660 2 DEBUG oslo_concurrency.processutils [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:16 compute-0 nova_compute[192810]: 2025-09-30 21:30:16.661 2 DEBUG nova.virt.disk.api [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Checking if we can resize image /var/lib/nova/instances/408b1a8f-ed4d-4d93-a98c-564af2bd678d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:30:16 compute-0 nova_compute[192810]: 2025-09-30 21:30:16.661 2 DEBUG oslo_concurrency.processutils [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/408b1a8f-ed4d-4d93-a98c-564af2bd678d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:16 compute-0 nova_compute[192810]: 2025-09-30 21:30:16.711 2 DEBUG oslo_concurrency.processutils [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/408b1a8f-ed4d-4d93-a98c-564af2bd678d/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:16 compute-0 nova_compute[192810]: 2025-09-30 21:30:16.712 2 DEBUG nova.virt.disk.api [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Cannot resize image /var/lib/nova/instances/408b1a8f-ed4d-4d93-a98c-564af2bd678d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:30:16 compute-0 nova_compute[192810]: 2025-09-30 21:30:16.712 2 DEBUG nova.objects.instance [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'migration_context' on Instance uuid 408b1a8f-ed4d-4d93-a98c-564af2bd678d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:30:16 compute-0 nova_compute[192810]: 2025-09-30 21:30:16.728 2 DEBUG nova.virt.libvirt.driver [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:30:16 compute-0 nova_compute[192810]: 2025-09-30 21:30:16.728 2 DEBUG nova.virt.libvirt.driver [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Ensure instance console log exists: /var/lib/nova/instances/408b1a8f-ed4d-4d93-a98c-564af2bd678d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:30:16 compute-0 nova_compute[192810]: 2025-09-30 21:30:16.728 2 DEBUG oslo_concurrency.lockutils [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:16 compute-0 nova_compute[192810]: 2025-09-30 21:30:16.729 2 DEBUG oslo_concurrency.lockutils [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:16 compute-0 nova_compute[192810]: 2025-09-30 21:30:16.729 2 DEBUG oslo_concurrency.lockutils [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:17 compute-0 nova_compute[192810]: 2025-09-30 21:30:17.193 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:17 compute-0 podman[231698]: 2025-09-30 21:30:17.313322717 +0000 UTC m=+0.057336300 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2)
Sep 30 21:30:17 compute-0 nova_compute[192810]: 2025-09-30 21:30:17.343 2 DEBUG nova.network.neutron [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Successfully created port: a4ba92a3-e019-4765-a096-89a660f1932c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:30:17 compute-0 nova_compute[192810]: 2025-09-30 21:30:17.521 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:18 compute-0 nova_compute[192810]: 2025-09-30 21:30:18.903 2 DEBUG nova.network.neutron [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Successfully updated port: a4ba92a3-e019-4765-a096-89a660f1932c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:30:18 compute-0 nova_compute[192810]: 2025-09-30 21:30:18.935 2 DEBUG oslo_concurrency.lockutils [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "refresh_cache-408b1a8f-ed4d-4d93-a98c-564af2bd678d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:30:18 compute-0 nova_compute[192810]: 2025-09-30 21:30:18.936 2 DEBUG oslo_concurrency.lockutils [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquired lock "refresh_cache-408b1a8f-ed4d-4d93-a98c-564af2bd678d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:30:18 compute-0 nova_compute[192810]: 2025-09-30 21:30:18.936 2 DEBUG nova.network.neutron [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:30:19 compute-0 nova_compute[192810]: 2025-09-30 21:30:19.002 2 DEBUG nova.compute.manager [req-1dae2988-23ce-4496-aa7f-9de638e86aa5 req-9e6ba6d7-37c9-44d9-8d6a-fa54e20c19b7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Received event network-changed-a4ba92a3-e019-4765-a096-89a660f1932c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:30:19 compute-0 nova_compute[192810]: 2025-09-30 21:30:19.002 2 DEBUG nova.compute.manager [req-1dae2988-23ce-4496-aa7f-9de638e86aa5 req-9e6ba6d7-37c9-44d9-8d6a-fa54e20c19b7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Refreshing instance network info cache due to event network-changed-a4ba92a3-e019-4765-a096-89a660f1932c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:30:19 compute-0 nova_compute[192810]: 2025-09-30 21:30:19.003 2 DEBUG oslo_concurrency.lockutils [req-1dae2988-23ce-4496-aa7f-9de638e86aa5 req-9e6ba6d7-37c9-44d9-8d6a-fa54e20c19b7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-408b1a8f-ed4d-4d93-a98c-564af2bd678d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:30:19 compute-0 nova_compute[192810]: 2025-09-30 21:30:19.205 2 DEBUG nova.network.neutron [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.248 2 DEBUG nova.network.neutron [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Updating instance_info_cache with network_info: [{"id": "a4ba92a3-e019-4765-a096-89a660f1932c", "address": "fa:16:3e:07:78:65", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4ba92a3-e0", "ovs_interfaceid": "a4ba92a3-e019-4765-a096-89a660f1932c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.326 2 DEBUG oslo_concurrency.lockutils [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Acquiring lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.326 2 DEBUG oslo_concurrency.lockutils [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.351 2 DEBUG oslo_concurrency.lockutils [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Releasing lock "refresh_cache-408b1a8f-ed4d-4d93-a98c-564af2bd678d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.352 2 DEBUG nova.compute.manager [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Instance network_info: |[{"id": "a4ba92a3-e019-4765-a096-89a660f1932c", "address": "fa:16:3e:07:78:65", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4ba92a3-e0", "ovs_interfaceid": "a4ba92a3-e019-4765-a096-89a660f1932c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.352 2 DEBUG oslo_concurrency.lockutils [req-1dae2988-23ce-4496-aa7f-9de638e86aa5 req-9e6ba6d7-37c9-44d9-8d6a-fa54e20c19b7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-408b1a8f-ed4d-4d93-a98c-564af2bd678d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.352 2 DEBUG nova.network.neutron [req-1dae2988-23ce-4496-aa7f-9de638e86aa5 req-9e6ba6d7-37c9-44d9-8d6a-fa54e20c19b7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Refreshing network info cache for port a4ba92a3-e019-4765-a096-89a660f1932c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.354 2 DEBUG nova.virt.libvirt.driver [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Start _get_guest_xml network_info=[{"id": "a4ba92a3-e019-4765-a096-89a660f1932c", "address": "fa:16:3e:07:78:65", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4ba92a3-e0", "ovs_interfaceid": "a4ba92a3-e019-4765-a096-89a660f1932c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.358 2 WARNING nova.virt.libvirt.driver [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.362 2 DEBUG nova.virt.libvirt.host [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.362 2 DEBUG nova.virt.libvirt.host [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.372 2 DEBUG nova.virt.libvirt.host [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.373 2 DEBUG nova.virt.libvirt.host [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.374 2 DEBUG nova.virt.libvirt.driver [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.374 2 DEBUG nova.virt.hardware [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.375 2 DEBUG nova.virt.hardware [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.375 2 DEBUG nova.virt.hardware [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.375 2 DEBUG nova.virt.hardware [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.375 2 DEBUG nova.virt.hardware [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.375 2 DEBUG nova.virt.hardware [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.376 2 DEBUG nova.virt.hardware [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.376 2 DEBUG nova.virt.hardware [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.376 2 DEBUG nova.virt.hardware [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.376 2 DEBUG nova.virt.hardware [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.377 2 DEBUG nova.virt.hardware [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.379 2 DEBUG nova.virt.libvirt.vif [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:30:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-398758596',display_name='tempest-ServerActionsTestJSON-server-398758596',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-398758596',id=78,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBADj3eZ6JfWn1sD61WsBF2lWMwpE7XLjMHeX5D51ZTuvFj593BvRFZjp02OuEwvTUJEH79lLLcgJlYP5+6PE14q16iBV+2oZvdFvdVW4CAPM3S7plfjHeuzOdoE0D4V+KA==',key_name='tempest-keypair-557988176',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2af578a858a44374a3dc027bbf7c69f2',ramdisk_id='',reservation_id='r-yikggmvz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1867667353',owner_user_name='tempest-ServerActionsTestJSON-1867667353-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:30:16Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='22ed16bd4ffe4ef8bb21968a857066a1',uuid=408b1a8f-ed4d-4d93-a98c-564af2bd678d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a4ba92a3-e019-4765-a096-89a660f1932c", "address": "fa:16:3e:07:78:65", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4ba92a3-e0", "ovs_interfaceid": "a4ba92a3-e019-4765-a096-89a660f1932c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.380 2 DEBUG nova.network.os_vif_util [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converting VIF {"id": "a4ba92a3-e019-4765-a096-89a660f1932c", "address": "fa:16:3e:07:78:65", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4ba92a3-e0", "ovs_interfaceid": "a4ba92a3-e019-4765-a096-89a660f1932c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.380 2 DEBUG nova.network.os_vif_util [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:07:78:65,bridge_name='br-int',has_traffic_filtering=True,id=a4ba92a3-e019-4765-a096-89a660f1932c,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa4ba92a3-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.381 2 DEBUG nova.objects.instance [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'pci_devices' on Instance uuid 408b1a8f-ed4d-4d93-a98c-564af2bd678d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.387 2 DEBUG nova.compute.manager [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.426 2 DEBUG nova.virt.libvirt.driver [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:30:20 compute-0 nova_compute[192810]:   <uuid>408b1a8f-ed4d-4d93-a98c-564af2bd678d</uuid>
Sep 30 21:30:20 compute-0 nova_compute[192810]:   <name>instance-0000004e</name>
Sep 30 21:30:20 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:30:20 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:30:20 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:30:20 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:       <nova:name>tempest-ServerActionsTestJSON-server-398758596</nova:name>
Sep 30 21:30:20 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:30:20</nova:creationTime>
Sep 30 21:30:20 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:30:20 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:30:20 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:30:20 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:30:20 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:30:20 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:30:20 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:30:20 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:30:20 compute-0 nova_compute[192810]:         <nova:user uuid="22ed16bd4ffe4ef8bb21968a857066a1">tempest-ServerActionsTestJSON-1867667353-project-member</nova:user>
Sep 30 21:30:20 compute-0 nova_compute[192810]:         <nova:project uuid="2af578a858a44374a3dc027bbf7c69f2">tempest-ServerActionsTestJSON-1867667353</nova:project>
Sep 30 21:30:20 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:30:20 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:30:20 compute-0 nova_compute[192810]:         <nova:port uuid="a4ba92a3-e019-4765-a096-89a660f1932c">
Sep 30 21:30:20 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:30:20 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:30:20 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:30:20 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:30:20 compute-0 nova_compute[192810]:     <system>
Sep 30 21:30:20 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:30:20 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:30:20 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:30:20 compute-0 nova_compute[192810]:       <entry name="serial">408b1a8f-ed4d-4d93-a98c-564af2bd678d</entry>
Sep 30 21:30:20 compute-0 nova_compute[192810]:       <entry name="uuid">408b1a8f-ed4d-4d93-a98c-564af2bd678d</entry>
Sep 30 21:30:20 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     </system>
Sep 30 21:30:20 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:30:20 compute-0 nova_compute[192810]:   <os>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:   </os>
Sep 30 21:30:20 compute-0 nova_compute[192810]:   <features>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:   </features>
Sep 30 21:30:20 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:30:20 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:30:20 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:30:20 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:30:20 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:30:20 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/408b1a8f-ed4d-4d93-a98c-564af2bd678d/disk"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:30:20 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/408b1a8f-ed4d-4d93-a98c-564af2bd678d/disk.config"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:30:20 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:07:78:65"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:       <target dev="tapa4ba92a3-e0"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:30:20 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/408b1a8f-ed4d-4d93-a98c-564af2bd678d/console.log" append="off"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     <video>
Sep 30 21:30:20 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     </video>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:30:20 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:30:20 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:30:20 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:30:20 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:30:20 compute-0 nova_compute[192810]: </domain>
Sep 30 21:30:20 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.428 2 DEBUG nova.compute.manager [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Preparing to wait for external event network-vif-plugged-a4ba92a3-e019-4765-a096-89a660f1932c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.428 2 DEBUG oslo_concurrency.lockutils [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.428 2 DEBUG oslo_concurrency.lockutils [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.428 2 DEBUG oslo_concurrency.lockutils [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.429 2 DEBUG nova.virt.libvirt.vif [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:30:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-398758596',display_name='tempest-ServerActionsTestJSON-server-398758596',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-398758596',id=78,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBADj3eZ6JfWn1sD61WsBF2lWMwpE7XLjMHeX5D51ZTuvFj593BvRFZjp02OuEwvTUJEH79lLLcgJlYP5+6PE14q16iBV+2oZvdFvdVW4CAPM3S7plfjHeuzOdoE0D4V+KA==',key_name='tempest-keypair-557988176',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2af578a858a44374a3dc027bbf7c69f2',ramdisk_id='',reservation_id='r-yikggmvz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1867667353',owner_user_name='tempest-ServerActionsTestJSON-1867667353-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:30:16Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='22ed16bd4ffe4ef8bb21968a857066a1',uuid=408b1a8f-ed4d-4d93-a98c-564af2bd678d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a4ba92a3-e019-4765-a096-89a660f1932c", "address": "fa:16:3e:07:78:65", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4ba92a3-e0", "ovs_interfaceid": "a4ba92a3-e019-4765-a096-89a660f1932c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.429 2 DEBUG nova.network.os_vif_util [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converting VIF {"id": "a4ba92a3-e019-4765-a096-89a660f1932c", "address": "fa:16:3e:07:78:65", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4ba92a3-e0", "ovs_interfaceid": "a4ba92a3-e019-4765-a096-89a660f1932c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.430 2 DEBUG nova.network.os_vif_util [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:07:78:65,bridge_name='br-int',has_traffic_filtering=True,id=a4ba92a3-e019-4765-a096-89a660f1932c,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa4ba92a3-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.430 2 DEBUG os_vif [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:07:78:65,bridge_name='br-int',has_traffic_filtering=True,id=a4ba92a3-e019-4765-a096-89a660f1932c,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa4ba92a3-e0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.431 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.431 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.431 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.434 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa4ba92a3-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.434 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa4ba92a3-e0, col_values=(('external_ids', {'iface-id': 'a4ba92a3-e019-4765-a096-89a660f1932c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:07:78:65', 'vm-uuid': '408b1a8f-ed4d-4d93-a98c-564af2bd678d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:20 compute-0 NetworkManager[51733]: <info>  [1759267820.4374] manager: (tapa4ba92a3-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/135)
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.446 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.447 2 INFO os_vif [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:07:78:65,bridge_name='br-int',has_traffic_filtering=True,id=a4ba92a3-e019-4765-a096-89a660f1932c,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa4ba92a3-e0')
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.538 2 DEBUG nova.virt.libvirt.driver [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.539 2 DEBUG nova.virt.libvirt.driver [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.539 2 DEBUG nova.virt.libvirt.driver [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] No VIF found with MAC fa:16:3e:07:78:65, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.539 2 INFO nova.virt.libvirt.driver [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Using config drive
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.579 2 DEBUG oslo_concurrency.lockutils [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.579 2 DEBUG oslo_concurrency.lockutils [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.584 2 DEBUG nova.virt.hardware [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.585 2 INFO nova.compute.claims [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.851 2 DEBUG nova.compute.provider_tree [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.888 2 DEBUG nova.scheduler.client.report [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.893 2 INFO nova.virt.libvirt.driver [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Creating config drive at /var/lib/nova/instances/408b1a8f-ed4d-4d93-a98c-564af2bd678d/disk.config
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.899 2 DEBUG oslo_concurrency.processutils [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/408b1a8f-ed4d-4d93-a98c-564af2bd678d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp94f2_3j6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.951 2 DEBUG oslo_concurrency.lockutils [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.372s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:20 compute-0 nova_compute[192810]: 2025-09-30 21:30:20.953 2 DEBUG nova.compute.manager [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:30:21 compute-0 nova_compute[192810]: 2025-09-30 21:30:21.039 2 DEBUG oslo_concurrency.processutils [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/408b1a8f-ed4d-4d93-a98c-564af2bd678d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp94f2_3j6" returned: 0 in 0.140s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:21 compute-0 nova_compute[192810]: 2025-09-30 21:30:21.053 2 DEBUG nova.compute.manager [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:30:21 compute-0 nova_compute[192810]: 2025-09-30 21:30:21.054 2 DEBUG nova.network.neutron [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:30:21 compute-0 kernel: tapa4ba92a3-e0: entered promiscuous mode
Sep 30 21:30:21 compute-0 NetworkManager[51733]: <info>  [1759267821.0939] manager: (tapa4ba92a3-e0): new Tun device (/org/freedesktop/NetworkManager/Devices/136)
Sep 30 21:30:21 compute-0 ovn_controller[94912]: 2025-09-30T21:30:21Z|00288|binding|INFO|Claiming lport a4ba92a3-e019-4765-a096-89a660f1932c for this chassis.
Sep 30 21:30:21 compute-0 ovn_controller[94912]: 2025-09-30T21:30:21Z|00289|binding|INFO|a4ba92a3-e019-4765-a096-89a660f1932c: Claiming fa:16:3e:07:78:65 10.100.0.4
Sep 30 21:30:21 compute-0 nova_compute[192810]: 2025-09-30 21:30:21.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:21 compute-0 nova_compute[192810]: 2025-09-30 21:30:21.096 2 INFO nova.virt.libvirt.driver [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:21.109 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:07:78:65 10.100.0.4'], port_security=['fa:16:3e:07:78:65 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '408b1a8f-ed4d-4d93-a98c-564af2bd678d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9692dd1-658f-4c07-943c-6bc662046dc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2af578a858a44374a3dc027bbf7c69f2', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5518a7d3-faed-4617-b7cb-cfdf96df8ee0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a290e6b7-09a2-435f-ae19-df4a5ccfc2d7, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=a4ba92a3-e019-4765-a096-89a660f1932c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:21.110 103867 INFO neutron.agent.ovn.metadata.agent [-] Port a4ba92a3-e019-4765-a096-89a660f1932c in datapath f9692dd1-658f-4c07-943c-6bc662046dc4 bound to our chassis
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:21.111 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f9692dd1-658f-4c07-943c-6bc662046dc4
Sep 30 21:30:21 compute-0 ovn_controller[94912]: 2025-09-30T21:30:21Z|00290|binding|INFO|Setting lport a4ba92a3-e019-4765-a096-89a660f1932c up in Southbound
Sep 30 21:30:21 compute-0 nova_compute[192810]: 2025-09-30 21:30:21.112 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:21 compute-0 ovn_controller[94912]: 2025-09-30T21:30:21Z|00291|binding|INFO|Setting lport a4ba92a3-e019-4765-a096-89a660f1932c ovn-installed in OVS
Sep 30 21:30:21 compute-0 systemd-udevd[231735]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:21.121 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e8a786da-e427-44f7-9b4e-508def068339]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:21.122 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf9692dd1-61 in ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:21.124 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf9692dd1-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:21.124 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[759e936a-a909-4d2b-a658-e86c8c47da6a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:21.125 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[7c6bbe27-e484-40cc-b4c9-f92b1d96056f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:21 compute-0 NetworkManager[51733]: <info>  [1759267821.1332] device (tapa4ba92a3-e0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:30:21 compute-0 NetworkManager[51733]: <info>  [1759267821.1338] device (tapa4ba92a3-e0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:21.135 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[a07f9585-2a68-470c-b9fa-35fe249f08de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:21 compute-0 nova_compute[192810]: 2025-09-30 21:30:21.133 2 DEBUG nova.compute.manager [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:30:21 compute-0 systemd-machined[152794]: New machine qemu-37-instance-0000004e.
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:21.148 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[c9bf6ffc-4c1a-4d0d-bca1-5e1c3d158c4f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:21 compute-0 systemd[1]: Started Virtual Machine qemu-37-instance-0000004e.
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:21.174 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[44467f93-6d48-400e-a930-4bbbd294251a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:21.179 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[196d223b-921f-4e11-9e76-75403cdc9edd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:21 compute-0 NetworkManager[51733]: <info>  [1759267821.1800] manager: (tapf9692dd1-60): new Veth device (/org/freedesktop/NetworkManager/Devices/137)
Sep 30 21:30:21 compute-0 systemd-udevd[231740]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:21.210 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[1b84000b-a75f-4018-9b71-f54761e938e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:21.212 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[6f05100d-a451-44fe-8f83-7b324c9d931e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:21 compute-0 NetworkManager[51733]: <info>  [1759267821.2330] device (tapf9692dd1-60): carrier: link connected
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:21.237 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[b0ec27b5-18d9-4876-a7b3-0a9ac645fb62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:21.250 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[bacc4be2-0de2-4457-8b2b-14976b237901]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9692dd1-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:78:70'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 86], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 451685, 'reachable_time': 18548, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231770, 'error': None, 'target': 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:21.263 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[8bad7d9a-ec6d-43c9-84a5-d0028962e775]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed1:7870'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 451685, 'tstamp': 451685}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231771, 'error': None, 'target': 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:21.276 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[d9e136f3-db04-42fc-a958-d5383b0278b6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9692dd1-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:78:70'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 86], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 451685, 'reachable_time': 18548, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231772, 'error': None, 'target': 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:21 compute-0 nova_compute[192810]: 2025-09-30 21:30:21.280 2 DEBUG nova.compute.manager [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:30:21 compute-0 nova_compute[192810]: 2025-09-30 21:30:21.281 2 DEBUG nova.virt.libvirt.driver [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:30:21 compute-0 nova_compute[192810]: 2025-09-30 21:30:21.281 2 INFO nova.virt.libvirt.driver [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Creating image(s)
Sep 30 21:30:21 compute-0 nova_compute[192810]: 2025-09-30 21:30:21.281 2 DEBUG oslo_concurrency.lockutils [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Acquiring lock "/var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:21 compute-0 nova_compute[192810]: 2025-09-30 21:30:21.282 2 DEBUG oslo_concurrency.lockutils [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lock "/var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:21 compute-0 nova_compute[192810]: 2025-09-30 21:30:21.282 2 DEBUG oslo_concurrency.lockutils [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lock "/var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:21 compute-0 nova_compute[192810]: 2025-09-30 21:30:21.295 2 DEBUG nova.policy [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '648f7bb37eeb4003825636f9a7c1f92a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '72559935caa44fd9b779b6770f00199f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:30:21 compute-0 nova_compute[192810]: 2025-09-30 21:30:21.298 2 DEBUG oslo_concurrency.processutils [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:21.304 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[ad2792a1-8e21-42db-9720-960ca69c7ca3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:21.369 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[585c9305-b758-4d47-b428-3919b408875a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:21.370 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9692dd1-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:30:21 compute-0 nova_compute[192810]: 2025-09-30 21:30:21.370 2 DEBUG oslo_concurrency.processutils [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:21.371 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:30:21 compute-0 nova_compute[192810]: 2025-09-30 21:30:21.371 2 DEBUG oslo_concurrency.lockutils [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:21.371 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf9692dd1-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:30:21 compute-0 nova_compute[192810]: 2025-09-30 21:30:21.371 2 DEBUG oslo_concurrency.lockutils [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:21 compute-0 NetworkManager[51733]: <info>  [1759267821.3743] manager: (tapf9692dd1-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/138)
Sep 30 21:30:21 compute-0 kernel: tapf9692dd1-60: entered promiscuous mode
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:21.377 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf9692dd1-60, col_values=(('external_ids', {'iface-id': 'a71d0422-57d0-42fa-887d-fdcb57295fce'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:30:21 compute-0 ovn_controller[94912]: 2025-09-30T21:30:21Z|00292|binding|INFO|Releasing lport a71d0422-57d0-42fa-887d-fdcb57295fce from this chassis (sb_readonly=0)
Sep 30 21:30:21 compute-0 nova_compute[192810]: 2025-09-30 21:30:21.386 2 DEBUG oslo_concurrency.processutils [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:21.393 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f9692dd1-658f-4c07-943c-6bc662046dc4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f9692dd1-658f-4c07-943c-6bc662046dc4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:21.393 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[329cfc69-959a-45ee-a854-7a942563cba3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:21.394 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-f9692dd1-658f-4c07-943c-6bc662046dc4
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/f9692dd1-658f-4c07-943c-6bc662046dc4.pid.haproxy
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID f9692dd1-658f-4c07-943c-6bc662046dc4
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:30:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:21.395 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'env', 'PROCESS_TAG=haproxy-f9692dd1-658f-4c07-943c-6bc662046dc4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f9692dd1-658f-4c07-943c-6bc662046dc4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:30:21 compute-0 nova_compute[192810]: 2025-09-30 21:30:21.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:21 compute-0 nova_compute[192810]: 2025-09-30 21:30:21.411 2 DEBUG nova.compute.manager [req-9ba59b68-eccc-4ee9-afc1-26116e4dfa4e req-183894a4-e6e6-4613-ac63-6860248abe0f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Received event network-vif-plugged-a4ba92a3-e019-4765-a096-89a660f1932c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:30:21 compute-0 nova_compute[192810]: 2025-09-30 21:30:21.411 2 DEBUG oslo_concurrency.lockutils [req-9ba59b68-eccc-4ee9-afc1-26116e4dfa4e req-183894a4-e6e6-4613-ac63-6860248abe0f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:21 compute-0 nova_compute[192810]: 2025-09-30 21:30:21.411 2 DEBUG oslo_concurrency.lockutils [req-9ba59b68-eccc-4ee9-afc1-26116e4dfa4e req-183894a4-e6e6-4613-ac63-6860248abe0f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:21 compute-0 nova_compute[192810]: 2025-09-30 21:30:21.412 2 DEBUG oslo_concurrency.lockutils [req-9ba59b68-eccc-4ee9-afc1-26116e4dfa4e req-183894a4-e6e6-4613-ac63-6860248abe0f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:21 compute-0 nova_compute[192810]: 2025-09-30 21:30:21.412 2 DEBUG nova.compute.manager [req-9ba59b68-eccc-4ee9-afc1-26116e4dfa4e req-183894a4-e6e6-4613-ac63-6860248abe0f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Processing event network-vif-plugged-a4ba92a3-e019-4765-a096-89a660f1932c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:30:21 compute-0 nova_compute[192810]: 2025-09-30 21:30:21.442 2 DEBUG oslo_concurrency.processutils [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:21 compute-0 nova_compute[192810]: 2025-09-30 21:30:21.443 2 DEBUG oslo_concurrency.processutils [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:21 compute-0 nova_compute[192810]: 2025-09-30 21:30:21.477 2 DEBUG oslo_concurrency.processutils [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:21 compute-0 nova_compute[192810]: 2025-09-30 21:30:21.478 2 DEBUG oslo_concurrency.lockutils [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:21 compute-0 nova_compute[192810]: 2025-09-30 21:30:21.479 2 DEBUG oslo_concurrency.processutils [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:21 compute-0 nova_compute[192810]: 2025-09-30 21:30:21.533 2 DEBUG oslo_concurrency.processutils [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:21 compute-0 nova_compute[192810]: 2025-09-30 21:30:21.534 2 DEBUG nova.virt.disk.api [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Checking if we can resize image /var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:30:21 compute-0 nova_compute[192810]: 2025-09-30 21:30:21.534 2 DEBUG oslo_concurrency.processutils [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:21 compute-0 nova_compute[192810]: 2025-09-30 21:30:21.587 2 DEBUG oslo_concurrency.processutils [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:21 compute-0 nova_compute[192810]: 2025-09-30 21:30:21.588 2 DEBUG nova.virt.disk.api [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Cannot resize image /var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:30:21 compute-0 nova_compute[192810]: 2025-09-30 21:30:21.589 2 DEBUG nova.objects.instance [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lazy-loading 'migration_context' on Instance uuid c3cd73be-ae82-4c19-8ab7-ec9b06134032 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:30:21 compute-0 nova_compute[192810]: 2025-09-30 21:30:21.603 2 DEBUG nova.virt.libvirt.driver [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:30:21 compute-0 nova_compute[192810]: 2025-09-30 21:30:21.604 2 DEBUG nova.virt.libvirt.driver [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Ensure instance console log exists: /var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:30:21 compute-0 nova_compute[192810]: 2025-09-30 21:30:21.604 2 DEBUG oslo_concurrency.lockutils [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:21 compute-0 nova_compute[192810]: 2025-09-30 21:30:21.604 2 DEBUG oslo_concurrency.lockutils [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:21 compute-0 nova_compute[192810]: 2025-09-30 21:30:21.605 2 DEBUG oslo_concurrency.lockutils [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:21 compute-0 podman[231826]: 2025-09-30 21:30:21.722156576 +0000 UTC m=+0.044286511 container create 602ca3529f0c37590ba704daa718db092051d7d2df4cf480bb304dd0c64b1d86 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, tcib_managed=true, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 21:30:21 compute-0 nova_compute[192810]: 2025-09-30 21:30:21.726 2 DEBUG nova.network.neutron [req-1dae2988-23ce-4496-aa7f-9de638e86aa5 req-9e6ba6d7-37c9-44d9-8d6a-fa54e20c19b7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Updated VIF entry in instance network info cache for port a4ba92a3-e019-4765-a096-89a660f1932c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:30:21 compute-0 nova_compute[192810]: 2025-09-30 21:30:21.727 2 DEBUG nova.network.neutron [req-1dae2988-23ce-4496-aa7f-9de638e86aa5 req-9e6ba6d7-37c9-44d9-8d6a-fa54e20c19b7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Updating instance_info_cache with network_info: [{"id": "a4ba92a3-e019-4765-a096-89a660f1932c", "address": "fa:16:3e:07:78:65", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4ba92a3-e0", "ovs_interfaceid": "a4ba92a3-e019-4765-a096-89a660f1932c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:30:21 compute-0 systemd[1]: Started libpod-conmon-602ca3529f0c37590ba704daa718db092051d7d2df4cf480bb304dd0c64b1d86.scope.
Sep 30 21:30:21 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:30:21 compute-0 podman[231826]: 2025-09-30 21:30:21.698259432 +0000 UTC m=+0.020389397 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:30:21 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c9d429bcc8948c4421ece2b6ac76a8848cd463615938373a4802dd21ae8e3451/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:30:21 compute-0 podman[231826]: 2025-09-30 21:30:21.806286202 +0000 UTC m=+0.128416167 container init 602ca3529f0c37590ba704daa718db092051d7d2df4cf480bb304dd0c64b1d86 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:30:21 compute-0 podman[231826]: 2025-09-30 21:30:21.812862438 +0000 UTC m=+0.134992383 container start 602ca3529f0c37590ba704daa718db092051d7d2df4cf480bb304dd0c64b1d86 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923)
Sep 30 21:30:21 compute-0 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[231841]: [NOTICE]   (231845) : New worker (231847) forked
Sep 30 21:30:21 compute-0 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[231841]: [NOTICE]   (231845) : Loading success.
Sep 30 21:30:22 compute-0 nova_compute[192810]: 2025-09-30 21:30:22.060 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267822.060215, 408b1a8f-ed4d-4d93-a98c-564af2bd678d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:30:22 compute-0 nova_compute[192810]: 2025-09-30 21:30:22.061 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] VM Started (Lifecycle Event)
Sep 30 21:30:22 compute-0 nova_compute[192810]: 2025-09-30 21:30:22.064 2 DEBUG nova.compute.manager [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:30:22 compute-0 nova_compute[192810]: 2025-09-30 21:30:22.072 2 DEBUG nova.virt.libvirt.driver [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:30:22 compute-0 nova_compute[192810]: 2025-09-30 21:30:22.075 2 INFO nova.virt.libvirt.driver [-] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Instance spawned successfully.
Sep 30 21:30:22 compute-0 nova_compute[192810]: 2025-09-30 21:30:22.076 2 DEBUG nova.virt.libvirt.driver [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:30:22 compute-0 nova_compute[192810]: 2025-09-30 21:30:22.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:25 compute-0 nova_compute[192810]: 2025-09-30 21:30:25.285 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267810.2833223, 3d704b04-4399-45ea-bc89-6cf74b4dec72 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:30:25 compute-0 nova_compute[192810]: 2025-09-30 21:30:25.285 2 INFO nova.compute.manager [-] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] VM Stopped (Lifecycle Event)
Sep 30 21:30:25 compute-0 nova_compute[192810]: 2025-09-30 21:30:25.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:26 compute-0 podman[231856]: 2025-09-30 21:30:26.343527339 +0000 UTC m=+0.067671541 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Sep 30 21:30:26 compute-0 podman[231857]: 2025-09-30 21:30:26.361185975 +0000 UTC m=+0.085205124 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., architecture=x86_64, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, io.openshift.expose-services=, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, distribution-scope=public, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, config_id=edpm, container_name=openstack_network_exporter, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Sep 30 21:30:27 compute-0 nova_compute[192810]: 2025-09-30 21:30:27.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:27 compute-0 nova_compute[192810]: 2025-09-30 21:30:27.774 2 DEBUG oslo_concurrency.lockutils [req-1dae2988-23ce-4496-aa7f-9de638e86aa5 req-9e6ba6d7-37c9-44d9-8d6a-fa54e20c19b7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-408b1a8f-ed4d-4d93-a98c-564af2bd678d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:30:30 compute-0 nova_compute[192810]: 2025-09-30 21:30:30.346 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:30:30 compute-0 nova_compute[192810]: 2025-09-30 21:30:30.349 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:30:30 compute-0 nova_compute[192810]: 2025-09-30 21:30:30.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:30 compute-0 nova_compute[192810]: 2025-09-30 21:30:30.513 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:30:30 compute-0 nova_compute[192810]: 2025-09-30 21:30:30.514 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267822.0606785, 408b1a8f-ed4d-4d93-a98c-564af2bd678d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:30:30 compute-0 nova_compute[192810]: 2025-09-30 21:30:30.515 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] VM Paused (Lifecycle Event)
Sep 30 21:30:30 compute-0 nova_compute[192810]: 2025-09-30 21:30:30.554 2 DEBUG nova.compute.manager [None req-eb8fcb99-59f5-4f0d-a75b-f5b419e93b0c - - - - - -] [instance: 3d704b04-4399-45ea-bc89-6cf74b4dec72] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:30:30 compute-0 nova_compute[192810]: 2025-09-30 21:30:30.556 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:30:30 compute-0 nova_compute[192810]: 2025-09-30 21:30:30.562 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267822.0671017, 408b1a8f-ed4d-4d93-a98c-564af2bd678d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:30:30 compute-0 nova_compute[192810]: 2025-09-30 21:30:30.563 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] VM Resumed (Lifecycle Event)
Sep 30 21:30:30 compute-0 nova_compute[192810]: 2025-09-30 21:30:30.567 2 DEBUG nova.virt.libvirt.driver [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:30:30 compute-0 nova_compute[192810]: 2025-09-30 21:30:30.568 2 DEBUG nova.virt.libvirt.driver [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:30:30 compute-0 nova_compute[192810]: 2025-09-30 21:30:30.568 2 DEBUG nova.virt.libvirt.driver [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:30:30 compute-0 nova_compute[192810]: 2025-09-30 21:30:30.569 2 DEBUG nova.virt.libvirt.driver [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:30:30 compute-0 nova_compute[192810]: 2025-09-30 21:30:30.569 2 DEBUG nova.virt.libvirt.driver [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:30:30 compute-0 nova_compute[192810]: 2025-09-30 21:30:30.570 2 DEBUG nova.virt.libvirt.driver [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:30:30 compute-0 nova_compute[192810]: 2025-09-30 21:30:30.609 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:30:30 compute-0 nova_compute[192810]: 2025-09-30 21:30:30.612 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:30:30 compute-0 nova_compute[192810]: 2025-09-30 21:30:30.657 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:30:30 compute-0 nova_compute[192810]: 2025-09-30 21:30:30.723 2 INFO nova.compute.manager [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Took 14.29 seconds to spawn the instance on the hypervisor.
Sep 30 21:30:30 compute-0 nova_compute[192810]: 2025-09-30 21:30:30.724 2 DEBUG nova.compute.manager [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:30:30 compute-0 nova_compute[192810]: 2025-09-30 21:30:30.795 2 DEBUG nova.compute.manager [req-20a2828b-3d57-4b56-8c14-57e8f4b525d7 req-420543ef-777d-4077-b52e-0cdbeacf7b0c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Received event network-vif-plugged-a4ba92a3-e019-4765-a096-89a660f1932c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:30:30 compute-0 nova_compute[192810]: 2025-09-30 21:30:30.796 2 DEBUG oslo_concurrency.lockutils [req-20a2828b-3d57-4b56-8c14-57e8f4b525d7 req-420543ef-777d-4077-b52e-0cdbeacf7b0c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:30 compute-0 nova_compute[192810]: 2025-09-30 21:30:30.796 2 DEBUG oslo_concurrency.lockutils [req-20a2828b-3d57-4b56-8c14-57e8f4b525d7 req-420543ef-777d-4077-b52e-0cdbeacf7b0c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:30 compute-0 nova_compute[192810]: 2025-09-30 21:30:30.796 2 DEBUG oslo_concurrency.lockutils [req-20a2828b-3d57-4b56-8c14-57e8f4b525d7 req-420543ef-777d-4077-b52e-0cdbeacf7b0c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:30 compute-0 nova_compute[192810]: 2025-09-30 21:30:30.797 2 DEBUG nova.compute.manager [req-20a2828b-3d57-4b56-8c14-57e8f4b525d7 req-420543ef-777d-4077-b52e-0cdbeacf7b0c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] No waiting events found dispatching network-vif-plugged-a4ba92a3-e019-4765-a096-89a660f1932c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:30:30 compute-0 nova_compute[192810]: 2025-09-30 21:30:30.797 2 WARNING nova.compute.manager [req-20a2828b-3d57-4b56-8c14-57e8f4b525d7 req-420543ef-777d-4077-b52e-0cdbeacf7b0c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Received unexpected event network-vif-plugged-a4ba92a3-e019-4765-a096-89a660f1932c for instance with vm_state building and task_state spawning.
Sep 30 21:30:30 compute-0 nova_compute[192810]: 2025-09-30 21:30:30.840 2 INFO nova.compute.manager [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Took 14.96 seconds to build instance.
Sep 30 21:30:30 compute-0 nova_compute[192810]: 2025-09-30 21:30:30.862 2 DEBUG oslo_concurrency.lockutils [None req-573661e5-c030-4cd6-98bc-d277f3e42d74 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.064s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:30 compute-0 nova_compute[192810]: 2025-09-30 21:30:30.884 2 DEBUG nova.network.neutron [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Successfully created port: c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:30:32 compute-0 nova_compute[192810]: 2025-09-30 21:30:32.034 2 DEBUG nova.network.neutron [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Successfully updated port: c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:30:32 compute-0 nova_compute[192810]: 2025-09-30 21:30:32.051 2 DEBUG oslo_concurrency.lockutils [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Acquiring lock "refresh_cache-c3cd73be-ae82-4c19-8ab7-ec9b06134032" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:30:32 compute-0 nova_compute[192810]: 2025-09-30 21:30:32.051 2 DEBUG oslo_concurrency.lockutils [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Acquired lock "refresh_cache-c3cd73be-ae82-4c19-8ab7-ec9b06134032" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:30:32 compute-0 nova_compute[192810]: 2025-09-30 21:30:32.051 2 DEBUG nova.network.neutron [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:30:32 compute-0 nova_compute[192810]: 2025-09-30 21:30:32.178 2 DEBUG nova.compute.manager [req-acadfa7e-d1d3-456c-ab35-29632a560654 req-04da2625-994f-4f54-b394-e9653f8a833d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Received event network-changed-c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:30:32 compute-0 nova_compute[192810]: 2025-09-30 21:30:32.179 2 DEBUG nova.compute.manager [req-acadfa7e-d1d3-456c-ab35-29632a560654 req-04da2625-994f-4f54-b394-e9653f8a833d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Refreshing instance network info cache due to event network-changed-c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:30:32 compute-0 nova_compute[192810]: 2025-09-30 21:30:32.180 2 DEBUG oslo_concurrency.lockutils [req-acadfa7e-d1d3-456c-ab35-29632a560654 req-04da2625-994f-4f54-b394-e9653f8a833d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-c3cd73be-ae82-4c19-8ab7-ec9b06134032" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:30:32 compute-0 nova_compute[192810]: 2025-09-30 21:30:32.199 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:32 compute-0 nova_compute[192810]: 2025-09-30 21:30:32.250 2 DEBUG nova.network.neutron [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:30:33 compute-0 nova_compute[192810]: 2025-09-30 21:30:33.185 2 DEBUG nova.network.neutron [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Updating instance_info_cache with network_info: [{"id": "c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad", "address": "fa:16:3e:8a:8e:e0", "network": {"id": "a145b225-510f-43a7-8cc6-fccae3ed647e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-43539478-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "72559935caa44fd9b779b6770f00199f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2d3f3fa-4e", "ovs_interfaceid": "c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:30:33 compute-0 nova_compute[192810]: 2025-09-30 21:30:33.731 2 DEBUG oslo_concurrency.lockutils [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Releasing lock "refresh_cache-c3cd73be-ae82-4c19-8ab7-ec9b06134032" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:30:33 compute-0 nova_compute[192810]: 2025-09-30 21:30:33.732 2 DEBUG nova.compute.manager [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Instance network_info: |[{"id": "c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad", "address": "fa:16:3e:8a:8e:e0", "network": {"id": "a145b225-510f-43a7-8cc6-fccae3ed647e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-43539478-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "72559935caa44fd9b779b6770f00199f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2d3f3fa-4e", "ovs_interfaceid": "c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:30:33 compute-0 nova_compute[192810]: 2025-09-30 21:30:33.733 2 DEBUG oslo_concurrency.lockutils [req-acadfa7e-d1d3-456c-ab35-29632a560654 req-04da2625-994f-4f54-b394-e9653f8a833d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-c3cd73be-ae82-4c19-8ab7-ec9b06134032" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:30:33 compute-0 nova_compute[192810]: 2025-09-30 21:30:33.733 2 DEBUG nova.network.neutron [req-acadfa7e-d1d3-456c-ab35-29632a560654 req-04da2625-994f-4f54-b394-e9653f8a833d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Refreshing network info cache for port c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:30:33 compute-0 nova_compute[192810]: 2025-09-30 21:30:33.736 2 DEBUG nova.virt.libvirt.driver [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Start _get_guest_xml network_info=[{"id": "c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad", "address": "fa:16:3e:8a:8e:e0", "network": {"id": "a145b225-510f-43a7-8cc6-fccae3ed647e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-43539478-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "72559935caa44fd9b779b6770f00199f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2d3f3fa-4e", "ovs_interfaceid": "c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:30:33 compute-0 nova_compute[192810]: 2025-09-30 21:30:33.740 2 WARNING nova.virt.libvirt.driver [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:30:33 compute-0 nova_compute[192810]: 2025-09-30 21:30:33.747 2 DEBUG nova.virt.libvirt.host [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:30:33 compute-0 nova_compute[192810]: 2025-09-30 21:30:33.748 2 DEBUG nova.virt.libvirt.host [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:30:33 compute-0 nova_compute[192810]: 2025-09-30 21:30:33.751 2 DEBUG nova.virt.libvirt.host [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:30:33 compute-0 nova_compute[192810]: 2025-09-30 21:30:33.752 2 DEBUG nova.virt.libvirt.host [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:30:33 compute-0 nova_compute[192810]: 2025-09-30 21:30:33.753 2 DEBUG nova.virt.libvirt.driver [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:30:33 compute-0 nova_compute[192810]: 2025-09-30 21:30:33.753 2 DEBUG nova.virt.hardware [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:30:33 compute-0 nova_compute[192810]: 2025-09-30 21:30:33.754 2 DEBUG nova.virt.hardware [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:30:33 compute-0 nova_compute[192810]: 2025-09-30 21:30:33.754 2 DEBUG nova.virt.hardware [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:30:33 compute-0 nova_compute[192810]: 2025-09-30 21:30:33.754 2 DEBUG nova.virt.hardware [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:30:33 compute-0 nova_compute[192810]: 2025-09-30 21:30:33.754 2 DEBUG nova.virt.hardware [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:30:33 compute-0 nova_compute[192810]: 2025-09-30 21:30:33.755 2 DEBUG nova.virt.hardware [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:30:33 compute-0 nova_compute[192810]: 2025-09-30 21:30:33.755 2 DEBUG nova.virt.hardware [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:30:33 compute-0 nova_compute[192810]: 2025-09-30 21:30:33.755 2 DEBUG nova.virt.hardware [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:30:33 compute-0 nova_compute[192810]: 2025-09-30 21:30:33.756 2 DEBUG nova.virt.hardware [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:30:33 compute-0 nova_compute[192810]: 2025-09-30 21:30:33.756 2 DEBUG nova.virt.hardware [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:30:33 compute-0 nova_compute[192810]: 2025-09-30 21:30:33.756 2 DEBUG nova.virt.hardware [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:30:33 compute-0 nova_compute[192810]: 2025-09-30 21:30:33.759 2 DEBUG nova.virt.libvirt.vif [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:30:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-543993129',display_name='tempest-ServerDiskConfigTestJSON-server-543993129',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-543993129',id=81,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='72559935caa44fd9b779b6770f00199f',ramdisk_id='',reservation_id='r-z9ljve0a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-1133643549',owner_user_name='tempest-ServerDiskConfigTestJSON-1133643549-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:30:21Z,user_data=None,user_id='648f7bb37eeb4003825636f9a7c1f92a',uuid=c3cd73be-ae82-4c19-8ab7-ec9b06134032,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad", "address": "fa:16:3e:8a:8e:e0", "network": {"id": "a145b225-510f-43a7-8cc6-fccae3ed647e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-43539478-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "72559935caa44fd9b779b6770f00199f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2d3f3fa-4e", "ovs_interfaceid": "c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:30:33 compute-0 nova_compute[192810]: 2025-09-30 21:30:33.760 2 DEBUG nova.network.os_vif_util [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Converting VIF {"id": "c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad", "address": "fa:16:3e:8a:8e:e0", "network": {"id": "a145b225-510f-43a7-8cc6-fccae3ed647e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-43539478-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "72559935caa44fd9b779b6770f00199f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2d3f3fa-4e", "ovs_interfaceid": "c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:30:33 compute-0 nova_compute[192810]: 2025-09-30 21:30:33.760 2 DEBUG nova.network.os_vif_util [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8a:8e:e0,bridge_name='br-int',has_traffic_filtering=True,id=c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad,network=Network(a145b225-510f-43a7-8cc6-fccae3ed647e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2d3f3fa-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:30:33 compute-0 nova_compute[192810]: 2025-09-30 21:30:33.761 2 DEBUG nova.objects.instance [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lazy-loading 'pci_devices' on Instance uuid c3cd73be-ae82-4c19-8ab7-ec9b06134032 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:30:33 compute-0 nova_compute[192810]: 2025-09-30 21:30:33.781 2 DEBUG nova.virt.libvirt.driver [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:30:33 compute-0 nova_compute[192810]:   <uuid>c3cd73be-ae82-4c19-8ab7-ec9b06134032</uuid>
Sep 30 21:30:33 compute-0 nova_compute[192810]:   <name>instance-00000051</name>
Sep 30 21:30:33 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:30:33 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:30:33 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:30:33 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:       <nova:name>tempest-ServerDiskConfigTestJSON-server-543993129</nova:name>
Sep 30 21:30:33 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:30:33</nova:creationTime>
Sep 30 21:30:33 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:30:33 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:30:33 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:30:33 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:30:33 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:30:33 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:30:33 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:30:33 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:30:33 compute-0 nova_compute[192810]:         <nova:user uuid="648f7bb37eeb4003825636f9a7c1f92a">tempest-ServerDiskConfigTestJSON-1133643549-project-member</nova:user>
Sep 30 21:30:33 compute-0 nova_compute[192810]:         <nova:project uuid="72559935caa44fd9b779b6770f00199f">tempest-ServerDiskConfigTestJSON-1133643549</nova:project>
Sep 30 21:30:33 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:30:33 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:30:33 compute-0 nova_compute[192810]:         <nova:port uuid="c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad">
Sep 30 21:30:33 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:30:33 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:30:33 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:30:33 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:30:33 compute-0 nova_compute[192810]:     <system>
Sep 30 21:30:33 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:30:33 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:30:33 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:30:33 compute-0 nova_compute[192810]:       <entry name="serial">c3cd73be-ae82-4c19-8ab7-ec9b06134032</entry>
Sep 30 21:30:33 compute-0 nova_compute[192810]:       <entry name="uuid">c3cd73be-ae82-4c19-8ab7-ec9b06134032</entry>
Sep 30 21:30:33 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     </system>
Sep 30 21:30:33 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:30:33 compute-0 nova_compute[192810]:   <os>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:   </os>
Sep 30 21:30:33 compute-0 nova_compute[192810]:   <features>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:   </features>
Sep 30 21:30:33 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:30:33 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:30:33 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:30:33 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:30:33 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:30:33 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:30:33 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk.config"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:30:33 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:8a:8e:e0"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:       <target dev="tapc2d3f3fa-4e"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:30:33 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032/console.log" append="off"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     <video>
Sep 30 21:30:33 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     </video>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:30:33 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:30:33 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:30:33 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:30:33 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:30:33 compute-0 nova_compute[192810]: </domain>
Sep 30 21:30:33 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:30:33 compute-0 nova_compute[192810]: 2025-09-30 21:30:33.782 2 DEBUG nova.compute.manager [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Preparing to wait for external event network-vif-plugged-c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:30:33 compute-0 nova_compute[192810]: 2025-09-30 21:30:33.783 2 DEBUG oslo_concurrency.lockutils [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Acquiring lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:33 compute-0 nova_compute[192810]: 2025-09-30 21:30:33.783 2 DEBUG oslo_concurrency.lockutils [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:33 compute-0 nova_compute[192810]: 2025-09-30 21:30:33.783 2 DEBUG oslo_concurrency.lockutils [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:33 compute-0 nova_compute[192810]: 2025-09-30 21:30:33.784 2 DEBUG nova.virt.libvirt.vif [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:30:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-543993129',display_name='tempest-ServerDiskConfigTestJSON-server-543993129',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-543993129',id=81,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='72559935caa44fd9b779b6770f00199f',ramdisk_id='',reservation_id='r-z9ljve0a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-1133643549',owner_user_name='tempest-ServerDiskConfigTestJSON-1133643549-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:30:21Z,user_data=None,user_id='648f7bb37eeb4003825636f9a7c1f92a',uuid=c3cd73be-ae82-4c19-8ab7-ec9b06134032,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad", "address": "fa:16:3e:8a:8e:e0", "network": {"id": "a145b225-510f-43a7-8cc6-fccae3ed647e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-43539478-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "72559935caa44fd9b779b6770f00199f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2d3f3fa-4e", "ovs_interfaceid": "c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:30:33 compute-0 nova_compute[192810]: 2025-09-30 21:30:33.784 2 DEBUG nova.network.os_vif_util [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Converting VIF {"id": "c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad", "address": "fa:16:3e:8a:8e:e0", "network": {"id": "a145b225-510f-43a7-8cc6-fccae3ed647e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-43539478-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "72559935caa44fd9b779b6770f00199f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2d3f3fa-4e", "ovs_interfaceid": "c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:30:33 compute-0 nova_compute[192810]: 2025-09-30 21:30:33.785 2 DEBUG nova.network.os_vif_util [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8a:8e:e0,bridge_name='br-int',has_traffic_filtering=True,id=c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad,network=Network(a145b225-510f-43a7-8cc6-fccae3ed647e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2d3f3fa-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:30:33 compute-0 nova_compute[192810]: 2025-09-30 21:30:33.785 2 DEBUG os_vif [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8a:8e:e0,bridge_name='br-int',has_traffic_filtering=True,id=c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad,network=Network(a145b225-510f-43a7-8cc6-fccae3ed647e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2d3f3fa-4e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:30:33 compute-0 nova_compute[192810]: 2025-09-30 21:30:33.786 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:33 compute-0 nova_compute[192810]: 2025-09-30 21:30:33.786 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:30:33 compute-0 nova_compute[192810]: 2025-09-30 21:30:33.786 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:30:33 compute-0 nova_compute[192810]: 2025-09-30 21:30:33.790 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:33 compute-0 nova_compute[192810]: 2025-09-30 21:30:33.790 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc2d3f3fa-4e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:30:33 compute-0 nova_compute[192810]: 2025-09-30 21:30:33.791 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc2d3f3fa-4e, col_values=(('external_ids', {'iface-id': 'c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8a:8e:e0', 'vm-uuid': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:30:33 compute-0 nova_compute[192810]: 2025-09-30 21:30:33.792 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:33 compute-0 NetworkManager[51733]: <info>  [1759267833.7934] manager: (tapc2d3f3fa-4e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/139)
Sep 30 21:30:33 compute-0 nova_compute[192810]: 2025-09-30 21:30:33.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:30:33 compute-0 nova_compute[192810]: 2025-09-30 21:30:33.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:33 compute-0 nova_compute[192810]: 2025-09-30 21:30:33.800 2 INFO os_vif [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8a:8e:e0,bridge_name='br-int',has_traffic_filtering=True,id=c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad,network=Network(a145b225-510f-43a7-8cc6-fccae3ed647e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2d3f3fa-4e')
Sep 30 21:30:33 compute-0 ovn_controller[94912]: 2025-09-30T21:30:33Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:07:78:65 10.100.0.4
Sep 30 21:30:33 compute-0 ovn_controller[94912]: 2025-09-30T21:30:33Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:07:78:65 10.100.0.4
Sep 30 21:30:33 compute-0 podman[231919]: 2025-09-30 21:30:33.898701632 +0000 UTC m=+0.068394299 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=multipathd)
Sep 30 21:30:33 compute-0 podman[231926]: 2025-09-30 21:30:33.908733766 +0000 UTC m=+0.058793857 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 21:30:33 compute-0 nova_compute[192810]: 2025-09-30 21:30:33.917 2 DEBUG nova.virt.libvirt.driver [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:30:33 compute-0 nova_compute[192810]: 2025-09-30 21:30:33.917 2 DEBUG nova.virt.libvirt.driver [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:30:33 compute-0 nova_compute[192810]: 2025-09-30 21:30:33.918 2 DEBUG nova.virt.libvirt.driver [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] No VIF found with MAC fa:16:3e:8a:8e:e0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:30:33 compute-0 nova_compute[192810]: 2025-09-30 21:30:33.918 2 INFO nova.virt.libvirt.driver [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Using config drive
Sep 30 21:30:33 compute-0 podman[231920]: 2025-09-30 21:30:33.923890239 +0000 UTC m=+0.086722153 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Sep 30 21:30:36 compute-0 unix_chkpwd[231984]: password check failed for user (root)
Sep 30 21:30:36 compute-0 sshd-session[231982]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=45.81.23.80  user=root
Sep 30 21:30:36 compute-0 nova_compute[192810]: 2025-09-30 21:30:36.672 2 INFO nova.virt.libvirt.driver [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Creating config drive at /var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk.config
Sep 30 21:30:36 compute-0 nova_compute[192810]: 2025-09-30 21:30:36.679 2 DEBUG oslo_concurrency.processutils [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxbg5ewi8 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:36 compute-0 nova_compute[192810]: 2025-09-30 21:30:36.807 2 DEBUG oslo_concurrency.processutils [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxbg5ewi8" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:36 compute-0 kernel: tapc2d3f3fa-4e: entered promiscuous mode
Sep 30 21:30:36 compute-0 NetworkManager[51733]: <info>  [1759267836.8617] manager: (tapc2d3f3fa-4e): new Tun device (/org/freedesktop/NetworkManager/Devices/140)
Sep 30 21:30:36 compute-0 nova_compute[192810]: 2025-09-30 21:30:36.862 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:36 compute-0 ovn_controller[94912]: 2025-09-30T21:30:36Z|00293|binding|INFO|Claiming lport c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad for this chassis.
Sep 30 21:30:36 compute-0 ovn_controller[94912]: 2025-09-30T21:30:36Z|00294|binding|INFO|c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad: Claiming fa:16:3e:8a:8e:e0 10.100.0.3
Sep 30 21:30:36 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:36.874 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8a:8e:e0 10.100.0.3'], port_security=['fa:16:3e:8a:8e:e0 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a145b225-510f-43a7-8cc6-fccae3ed647e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '72559935caa44fd9b779b6770f00199f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3a098193-23af-4fd8-a818-c9a9c1a46706', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cf4fe4d7-2316-4403-a18b-3b0227898f0d, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:30:36 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:36.875 103867 INFO neutron.agent.ovn.metadata.agent [-] Port c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad in datapath a145b225-510f-43a7-8cc6-fccae3ed647e bound to our chassis
Sep 30 21:30:36 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:36.877 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a145b225-510f-43a7-8cc6-fccae3ed647e
Sep 30 21:30:36 compute-0 ovn_controller[94912]: 2025-09-30T21:30:36Z|00295|binding|INFO|Setting lport c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad ovn-installed in OVS
Sep 30 21:30:36 compute-0 ovn_controller[94912]: 2025-09-30T21:30:36Z|00296|binding|INFO|Setting lport c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad up in Southbound
Sep 30 21:30:36 compute-0 nova_compute[192810]: 2025-09-30 21:30:36.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:36 compute-0 nova_compute[192810]: 2025-09-30 21:30:36.885 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:36 compute-0 systemd-udevd[232001]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:30:36 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:36.890 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[1b8a75e1-13c8-4189-abb0-e742e37bc081]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:36 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:36.892 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa145b225-51 in ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:30:36 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:36.896 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa145b225-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:30:36 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:36.896 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[1ca209c1-3ff6-4520-b327-ae588bce4480]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:36 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:36.897 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[0bb858ed-d2e6-478f-9773-7c9df463c3c7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:36 compute-0 NetworkManager[51733]: <info>  [1759267836.9037] device (tapc2d3f3fa-4e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:30:36 compute-0 NetworkManager[51733]: <info>  [1759267836.9047] device (tapc2d3f3fa-4e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:30:36 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:36.907 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[00f529eb-b5da-40d2-9fa6-5b5757b2fc69]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:36 compute-0 systemd-machined[152794]: New machine qemu-38-instance-00000051.
Sep 30 21:30:36 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:36.931 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[47d5a979-253d-4d59-92ff-c65ea4c3179d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:36 compute-0 systemd[1]: Started Virtual Machine qemu-38-instance-00000051.
Sep 30 21:30:36 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:36.957 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[f87c0c5d-71f5-4c03-9a01-760e37b3c727]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:36 compute-0 systemd-udevd[232007]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:30:36 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:36.961 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[3ea3bffc-b0bd-49ee-afa4-7267507ec3b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:36 compute-0 NetworkManager[51733]: <info>  [1759267836.9631] manager: (tapa145b225-50): new Veth device (/org/freedesktop/NetworkManager/Devices/141)
Sep 30 21:30:36 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:36.994 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[2165caee-2d52-4a9f-8ec5-e944585dfddb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:36 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:36.997 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[74954530-65bf-4d5b-8f5e-8f5640d06aaa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:37 compute-0 NetworkManager[51733]: <info>  [1759267837.0190] device (tapa145b225-50): carrier: link connected
Sep 30 21:30:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:37.024 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[df760c7e-ffff-4c74-b99b-a4f7b4077624]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:37.040 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[fadb7d5e-648a-47b7-a47b-5ca58787b894]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa145b225-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:69:43:a1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 88], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 453264, 'reachable_time': 35956, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232036, 'error': None, 'target': 'ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:37.054 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[07efce81-e153-4c86-8698-7796907f6d46]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe69:43a1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 453264, 'tstamp': 453264}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232037, 'error': None, 'target': 'ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:37.069 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[50eaf51a-efb3-4ebc-ace2-cd6d3699f648]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa145b225-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:69:43:a1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 88], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 453264, 'reachable_time': 35956, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232038, 'error': None, 'target': 'ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:37.101 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[fb75b63f-ccf1-4187-98b2-611d11348020]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:37.151 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[5e47a049-4bb4-4b05-a97e-7de2751967de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:37.152 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa145b225-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:30:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:37.153 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:30:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:37.153 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa145b225-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:30:37 compute-0 NetworkManager[51733]: <info>  [1759267837.1557] manager: (tapa145b225-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/142)
Sep 30 21:30:37 compute-0 nova_compute[192810]: 2025-09-30 21:30:37.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:37 compute-0 kernel: tapa145b225-50: entered promiscuous mode
Sep 30 21:30:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:37.158 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa145b225-50, col_values=(('external_ids', {'iface-id': '0f20179e-1a66-48c7-97c7-a3ccb2b25749'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:30:37 compute-0 ovn_controller[94912]: 2025-09-30T21:30:37Z|00297|binding|INFO|Releasing lport 0f20179e-1a66-48c7-97c7-a3ccb2b25749 from this chassis (sb_readonly=0)
Sep 30 21:30:37 compute-0 nova_compute[192810]: 2025-09-30 21:30:37.158 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:37 compute-0 nova_compute[192810]: 2025-09-30 21:30:37.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:37.172 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a145b225-510f-43a7-8cc6-fccae3ed647e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a145b225-510f-43a7-8cc6-fccae3ed647e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:30:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:37.173 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[ecf2b135-81ed-4b43-9c97-d66829fe9f9a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:37.174 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:30:37 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:30:37 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:30:37 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-a145b225-510f-43a7-8cc6-fccae3ed647e
Sep 30 21:30:37 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:30:37 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:30:37 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:30:37 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/a145b225-510f-43a7-8cc6-fccae3ed647e.pid.haproxy
Sep 30 21:30:37 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:30:37 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:30:37 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:30:37 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:30:37 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:30:37 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:30:37 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:30:37 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:30:37 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:30:37 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:30:37 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:30:37 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:30:37 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:30:37 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:30:37 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:30:37 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:30:37 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:30:37 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:30:37 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:30:37 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:30:37 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID a145b225-510f-43a7-8cc6-fccae3ed647e
Sep 30 21:30:37 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:30:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:37.176 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e', 'env', 'PROCESS_TAG=haproxy-a145b225-510f-43a7-8cc6-fccae3ed647e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a145b225-510f-43a7-8cc6-fccae3ed647e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:30:37 compute-0 nova_compute[192810]: 2025-09-30 21:30:37.199 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:37 compute-0 nova_compute[192810]: 2025-09-30 21:30:37.481 2 DEBUG nova.compute.manager [req-da3e35b1-6cc9-41bb-8383-e19a44ce94d3 req-844397a5-dda5-45c1-b778-f366731ae593 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Received event network-vif-plugged-c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:30:37 compute-0 nova_compute[192810]: 2025-09-30 21:30:37.481 2 DEBUG oslo_concurrency.lockutils [req-da3e35b1-6cc9-41bb-8383-e19a44ce94d3 req-844397a5-dda5-45c1-b778-f366731ae593 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:37 compute-0 nova_compute[192810]: 2025-09-30 21:30:37.481 2 DEBUG oslo_concurrency.lockutils [req-da3e35b1-6cc9-41bb-8383-e19a44ce94d3 req-844397a5-dda5-45c1-b778-f366731ae593 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:37 compute-0 nova_compute[192810]: 2025-09-30 21:30:37.481 2 DEBUG oslo_concurrency.lockutils [req-da3e35b1-6cc9-41bb-8383-e19a44ce94d3 req-844397a5-dda5-45c1-b778-f366731ae593 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:37 compute-0 nova_compute[192810]: 2025-09-30 21:30:37.482 2 DEBUG nova.compute.manager [req-da3e35b1-6cc9-41bb-8383-e19a44ce94d3 req-844397a5-dda5-45c1-b778-f366731ae593 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Processing event network-vif-plugged-c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:30:37 compute-0 nova_compute[192810]: 2025-09-30 21:30:37.512 2 DEBUG nova.compute.manager [req-3a03046a-93cb-48b8-9bf1-f3cef3d1df9a req-2484fa86-e406-4e75-bcdf-c0c740720212 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Received event network-changed-a4ba92a3-e019-4765-a096-89a660f1932c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:30:37 compute-0 nova_compute[192810]: 2025-09-30 21:30:37.512 2 DEBUG nova.compute.manager [req-3a03046a-93cb-48b8-9bf1-f3cef3d1df9a req-2484fa86-e406-4e75-bcdf-c0c740720212 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Refreshing instance network info cache due to event network-changed-a4ba92a3-e019-4765-a096-89a660f1932c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:30:37 compute-0 nova_compute[192810]: 2025-09-30 21:30:37.513 2 DEBUG oslo_concurrency.lockutils [req-3a03046a-93cb-48b8-9bf1-f3cef3d1df9a req-2484fa86-e406-4e75-bcdf-c0c740720212 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-408b1a8f-ed4d-4d93-a98c-564af2bd678d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:30:37 compute-0 nova_compute[192810]: 2025-09-30 21:30:37.513 2 DEBUG oslo_concurrency.lockutils [req-3a03046a-93cb-48b8-9bf1-f3cef3d1df9a req-2484fa86-e406-4e75-bcdf-c0c740720212 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-408b1a8f-ed4d-4d93-a98c-564af2bd678d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:30:37 compute-0 nova_compute[192810]: 2025-09-30 21:30:37.513 2 DEBUG nova.network.neutron [req-3a03046a-93cb-48b8-9bf1-f3cef3d1df9a req-2484fa86-e406-4e75-bcdf-c0c740720212 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Refreshing network info cache for port a4ba92a3-e019-4765-a096-89a660f1932c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:30:37 compute-0 podman[232077]: 2025-09-30 21:30:37.522488212 +0000 UTC m=+0.056141870 container create a478e7f45e03a2a3eefdbecde6903f2e9ce3a85e0290efffd9abc086d4b828f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, tcib_managed=true)
Sep 30 21:30:37 compute-0 systemd[1]: Started libpod-conmon-a478e7f45e03a2a3eefdbecde6903f2e9ce3a85e0290efffd9abc086d4b828f1.scope.
Sep 30 21:30:37 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:30:37 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f0f85283081545c8f34240ccb646dd6dc16291aa55451faae4d04c5e5ba2b1c8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:30:37 compute-0 podman[232077]: 2025-09-30 21:30:37.491867058 +0000 UTC m=+0.025520746 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:30:37 compute-0 podman[232077]: 2025-09-30 21:30:37.598827961 +0000 UTC m=+0.132481719 container init a478e7f45e03a2a3eefdbecde6903f2e9ce3a85e0290efffd9abc086d4b828f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:30:37 compute-0 podman[232077]: 2025-09-30 21:30:37.604132065 +0000 UTC m=+0.137785763 container start a478e7f45e03a2a3eefdbecde6903f2e9ce3a85e0290efffd9abc086d4b828f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:30:37 compute-0 neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e[232093]: [NOTICE]   (232097) : New worker (232099) forked
Sep 30 21:30:37 compute-0 neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e[232093]: [NOTICE]   (232097) : Loading success.
Sep 30 21:30:37 compute-0 nova_compute[192810]: 2025-09-30 21:30:37.674 2 DEBUG nova.compute.manager [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:30:37 compute-0 nova_compute[192810]: 2025-09-30 21:30:37.675 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267837.673693, c3cd73be-ae82-4c19-8ab7-ec9b06134032 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:30:37 compute-0 nova_compute[192810]: 2025-09-30 21:30:37.675 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] VM Started (Lifecycle Event)
Sep 30 21:30:37 compute-0 nova_compute[192810]: 2025-09-30 21:30:37.680 2 DEBUG nova.virt.libvirt.driver [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:30:37 compute-0 nova_compute[192810]: 2025-09-30 21:30:37.688 2 INFO nova.virt.libvirt.driver [-] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Instance spawned successfully.
Sep 30 21:30:37 compute-0 nova_compute[192810]: 2025-09-30 21:30:37.689 2 DEBUG nova.virt.libvirt.driver [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:30:37 compute-0 sshd-session[231982]: Failed password for root from 45.81.23.80 port 55106 ssh2
Sep 30 21:30:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:38.736 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:38.741 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:38.742 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:38 compute-0 nova_compute[192810]: 2025-09-30 21:30:38.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:39 compute-0 nova_compute[192810]: 2025-09-30 21:30:39.300 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:30:39 compute-0 nova_compute[192810]: 2025-09-30 21:30:39.305 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:30:39 compute-0 nova_compute[192810]: 2025-09-30 21:30:39.662 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:30:39 compute-0 nova_compute[192810]: 2025-09-30 21:30:39.663 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267837.6739001, c3cd73be-ae82-4c19-8ab7-ec9b06134032 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:30:39 compute-0 nova_compute[192810]: 2025-09-30 21:30:39.663 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] VM Paused (Lifecycle Event)
Sep 30 21:30:39 compute-0 nova_compute[192810]: 2025-09-30 21:30:39.668 2 DEBUG nova.virt.libvirt.driver [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:30:39 compute-0 nova_compute[192810]: 2025-09-30 21:30:39.668 2 DEBUG nova.virt.libvirt.driver [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:30:39 compute-0 nova_compute[192810]: 2025-09-30 21:30:39.668 2 DEBUG nova.virt.libvirt.driver [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:30:39 compute-0 nova_compute[192810]: 2025-09-30 21:30:39.669 2 DEBUG nova.virt.libvirt.driver [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:30:39 compute-0 nova_compute[192810]: 2025-09-30 21:30:39.669 2 DEBUG nova.virt.libvirt.driver [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:30:39 compute-0 nova_compute[192810]: 2025-09-30 21:30:39.669 2 DEBUG nova.virt.libvirt.driver [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:30:39 compute-0 nova_compute[192810]: 2025-09-30 21:30:39.694 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:30:39 compute-0 nova_compute[192810]: 2025-09-30 21:30:39.697 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267837.680716, c3cd73be-ae82-4c19-8ab7-ec9b06134032 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:30:39 compute-0 nova_compute[192810]: 2025-09-30 21:30:39.697 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] VM Resumed (Lifecycle Event)
Sep 30 21:30:39 compute-0 nova_compute[192810]: 2025-09-30 21:30:39.731 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:30:39 compute-0 nova_compute[192810]: 2025-09-30 21:30:39.736 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:30:39 compute-0 nova_compute[192810]: 2025-09-30 21:30:39.743 2 INFO nova.compute.manager [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Took 18.46 seconds to spawn the instance on the hypervisor.
Sep 30 21:30:39 compute-0 nova_compute[192810]: 2025-09-30 21:30:39.744 2 DEBUG nova.compute.manager [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:30:39 compute-0 nova_compute[192810]: 2025-09-30 21:30:39.774 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:30:39 compute-0 nova_compute[192810]: 2025-09-30 21:30:39.834 2 INFO nova.compute.manager [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Took 19.31 seconds to build instance.
Sep 30 21:30:39 compute-0 sshd-session[231982]: Received disconnect from 45.81.23.80 port 55106:11: Bye Bye [preauth]
Sep 30 21:30:39 compute-0 sshd-session[231982]: Disconnected from authenticating user root 45.81.23.80 port 55106 [preauth]
Sep 30 21:30:39 compute-0 nova_compute[192810]: 2025-09-30 21:30:39.868 2 DEBUG oslo_concurrency.lockutils [None req-1f146ce2-1be7-4013-8cc4-19e5de8f901c 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.542s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:40 compute-0 nova_compute[192810]: 2025-09-30 21:30:40.825 2 DEBUG oslo_concurrency.lockutils [None req-4d0a7fbe-5f86-4b97-a3c4-1ca244745203 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:40 compute-0 nova_compute[192810]: 2025-09-30 21:30:40.826 2 DEBUG oslo_concurrency.lockutils [None req-4d0a7fbe-5f86-4b97-a3c4-1ca244745203 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:40 compute-0 nova_compute[192810]: 2025-09-30 21:30:40.826 2 DEBUG nova.compute.manager [None req-4d0a7fbe-5f86-4b97-a3c4-1ca244745203 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:30:40 compute-0 nova_compute[192810]: 2025-09-30 21:30:40.832 2 DEBUG nova.compute.manager [None req-4d0a7fbe-5f86-4b97-a3c4-1ca244745203 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338
Sep 30 21:30:40 compute-0 nova_compute[192810]: 2025-09-30 21:30:40.835 2 DEBUG nova.objects.instance [None req-4d0a7fbe-5f86-4b97-a3c4-1ca244745203 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'flavor' on Instance uuid 408b1a8f-ed4d-4d93-a98c-564af2bd678d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:30:40 compute-0 nova_compute[192810]: 2025-09-30 21:30:40.848 2 DEBUG nova.network.neutron [req-acadfa7e-d1d3-456c-ab35-29632a560654 req-04da2625-994f-4f54-b394-e9653f8a833d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Updated VIF entry in instance network info cache for port c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:30:40 compute-0 nova_compute[192810]: 2025-09-30 21:30:40.849 2 DEBUG nova.network.neutron [req-acadfa7e-d1d3-456c-ab35-29632a560654 req-04da2625-994f-4f54-b394-e9653f8a833d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Updating instance_info_cache with network_info: [{"id": "c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad", "address": "fa:16:3e:8a:8e:e0", "network": {"id": "a145b225-510f-43a7-8cc6-fccae3ed647e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-43539478-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "72559935caa44fd9b779b6770f00199f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2d3f3fa-4e", "ovs_interfaceid": "c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:30:40 compute-0 nova_compute[192810]: 2025-09-30 21:30:40.924 2 DEBUG nova.objects.instance [None req-4d0a7fbe-5f86-4b97-a3c4-1ca244745203 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'info_cache' on Instance uuid 408b1a8f-ed4d-4d93-a98c-564af2bd678d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:30:40 compute-0 nova_compute[192810]: 2025-09-30 21:30:40.930 2 DEBUG oslo_concurrency.lockutils [req-acadfa7e-d1d3-456c-ab35-29632a560654 req-04da2625-994f-4f54-b394-e9653f8a833d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-c3cd73be-ae82-4c19-8ab7-ec9b06134032" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:30:41 compute-0 nova_compute[192810]: 2025-09-30 21:30:41.061 2 DEBUG nova.compute.manager [req-69fbd67a-9aa2-4b7e-bb99-2c60125a49f4 req-0131a044-4e4f-4ce2-920a-15028b6ea3fc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Received event network-vif-plugged-c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:30:41 compute-0 nova_compute[192810]: 2025-09-30 21:30:41.062 2 DEBUG oslo_concurrency.lockutils [req-69fbd67a-9aa2-4b7e-bb99-2c60125a49f4 req-0131a044-4e4f-4ce2-920a-15028b6ea3fc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:41 compute-0 nova_compute[192810]: 2025-09-30 21:30:41.062 2 DEBUG oslo_concurrency.lockutils [req-69fbd67a-9aa2-4b7e-bb99-2c60125a49f4 req-0131a044-4e4f-4ce2-920a-15028b6ea3fc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:41 compute-0 nova_compute[192810]: 2025-09-30 21:30:41.062 2 DEBUG oslo_concurrency.lockutils [req-69fbd67a-9aa2-4b7e-bb99-2c60125a49f4 req-0131a044-4e4f-4ce2-920a-15028b6ea3fc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:41 compute-0 nova_compute[192810]: 2025-09-30 21:30:41.062 2 DEBUG nova.compute.manager [req-69fbd67a-9aa2-4b7e-bb99-2c60125a49f4 req-0131a044-4e4f-4ce2-920a-15028b6ea3fc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] No waiting events found dispatching network-vif-plugged-c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:30:41 compute-0 nova_compute[192810]: 2025-09-30 21:30:41.063 2 WARNING nova.compute.manager [req-69fbd67a-9aa2-4b7e-bb99-2c60125a49f4 req-0131a044-4e4f-4ce2-920a-15028b6ea3fc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Received unexpected event network-vif-plugged-c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad for instance with vm_state active and task_state None.
Sep 30 21:30:41 compute-0 nova_compute[192810]: 2025-09-30 21:30:41.079 2 DEBUG nova.virt.libvirt.driver [None req-4d0a7fbe-5f86-4b97-a3c4-1ca244745203 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Sep 30 21:30:42 compute-0 nova_compute[192810]: 2025-09-30 21:30:42.201 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:42 compute-0 nova_compute[192810]: 2025-09-30 21:30:42.588 2 DEBUG nova.network.neutron [req-3a03046a-93cb-48b8-9bf1-f3cef3d1df9a req-2484fa86-e406-4e75-bcdf-c0c740720212 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Updated VIF entry in instance network info cache for port a4ba92a3-e019-4765-a096-89a660f1932c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:30:42 compute-0 nova_compute[192810]: 2025-09-30 21:30:42.588 2 DEBUG nova.network.neutron [req-3a03046a-93cb-48b8-9bf1-f3cef3d1df9a req-2484fa86-e406-4e75-bcdf-c0c740720212 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Updating instance_info_cache with network_info: [{"id": "a4ba92a3-e019-4765-a096-89a660f1932c", "address": "fa:16:3e:07:78:65", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4ba92a3-e0", "ovs_interfaceid": "a4ba92a3-e019-4765-a096-89a660f1932c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:30:42 compute-0 nova_compute[192810]: 2025-09-30 21:30:42.622 2 DEBUG oslo_concurrency.lockutils [req-3a03046a-93cb-48b8-9bf1-f3cef3d1df9a req-2484fa86-e406-4e75-bcdf-c0c740720212 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-408b1a8f-ed4d-4d93-a98c-564af2bd678d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:30:42 compute-0 nova_compute[192810]: 2025-09-30 21:30:42.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:30:43 compute-0 kernel: tapa4ba92a3-e0 (unregistering): left promiscuous mode
Sep 30 21:30:43 compute-0 NetworkManager[51733]: <info>  [1759267843.2376] device (tapa4ba92a3-e0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:30:43 compute-0 nova_compute[192810]: 2025-09-30 21:30:43.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:43 compute-0 ovn_controller[94912]: 2025-09-30T21:30:43Z|00298|binding|INFO|Releasing lport a4ba92a3-e019-4765-a096-89a660f1932c from this chassis (sb_readonly=0)
Sep 30 21:30:43 compute-0 ovn_controller[94912]: 2025-09-30T21:30:43Z|00299|binding|INFO|Setting lport a4ba92a3-e019-4765-a096-89a660f1932c down in Southbound
Sep 30 21:30:43 compute-0 ovn_controller[94912]: 2025-09-30T21:30:43Z|00300|binding|INFO|Removing iface tapa4ba92a3-e0 ovn-installed in OVS
Sep 30 21:30:43 compute-0 nova_compute[192810]: 2025-09-30 21:30:43.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:43 compute-0 nova_compute[192810]: 2025-09-30 21:30:43.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:43.271 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:07:78:65 10.100.0.4'], port_security=['fa:16:3e:07:78:65 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '408b1a8f-ed4d-4d93-a98c-564af2bd678d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9692dd1-658f-4c07-943c-6bc662046dc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2af578a858a44374a3dc027bbf7c69f2', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5518a7d3-faed-4617-b7cb-cfdf96df8ee0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.177'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a290e6b7-09a2-435f-ae19-df4a5ccfc2d7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=a4ba92a3-e019-4765-a096-89a660f1932c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:30:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:43.273 103867 INFO neutron.agent.ovn.metadata.agent [-] Port a4ba92a3-e019-4765-a096-89a660f1932c in datapath f9692dd1-658f-4c07-943c-6bc662046dc4 unbound from our chassis
Sep 30 21:30:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:43.275 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f9692dd1-658f-4c07-943c-6bc662046dc4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:30:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:43.277 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[9d18c218-d125-494b-9759-21a82ec293f4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:43.278 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 namespace which is not needed anymore
Sep 30 21:30:43 compute-0 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d0000004e.scope: Deactivated successfully.
Sep 30 21:30:43 compute-0 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d0000004e.scope: Consumed 12.916s CPU time.
Sep 30 21:30:43 compute-0 systemd-machined[152794]: Machine qemu-37-instance-0000004e terminated.
Sep 30 21:30:43 compute-0 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[231841]: [NOTICE]   (231845) : haproxy version is 2.8.14-c23fe91
Sep 30 21:30:43 compute-0 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[231841]: [NOTICE]   (231845) : path to executable is /usr/sbin/haproxy
Sep 30 21:30:43 compute-0 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[231841]: [WARNING]  (231845) : Exiting Master process...
Sep 30 21:30:43 compute-0 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[231841]: [ALERT]    (231845) : Current worker (231847) exited with code 143 (Terminated)
Sep 30 21:30:43 compute-0 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[231841]: [WARNING]  (231845) : All workers exited. Exiting... (0)
Sep 30 21:30:43 compute-0 systemd[1]: libpod-602ca3529f0c37590ba704daa718db092051d7d2df4cf480bb304dd0c64b1d86.scope: Deactivated successfully.
Sep 30 21:30:43 compute-0 podman[232131]: 2025-09-30 21:30:43.429407906 +0000 UTC m=+0.051730889 container died 602ca3529f0c37590ba704daa718db092051d7d2df4cf480bb304dd0c64b1d86 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 21:30:43 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-602ca3529f0c37590ba704daa718db092051d7d2df4cf480bb304dd0c64b1d86-userdata-shm.mount: Deactivated successfully.
Sep 30 21:30:43 compute-0 systemd[1]: var-lib-containers-storage-overlay-c9d429bcc8948c4421ece2b6ac76a8848cd463615938373a4802dd21ae8e3451-merged.mount: Deactivated successfully.
Sep 30 21:30:43 compute-0 podman[232131]: 2025-09-30 21:30:43.466844842 +0000 UTC m=+0.089167805 container cleanup 602ca3529f0c37590ba704daa718db092051d7d2df4cf480bb304dd0c64b1d86 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.build-date=20250923)
Sep 30 21:30:43 compute-0 systemd[1]: libpod-conmon-602ca3529f0c37590ba704daa718db092051d7d2df4cf480bb304dd0c64b1d86.scope: Deactivated successfully.
Sep 30 21:30:43 compute-0 nova_compute[192810]: 2025-09-30 21:30:43.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:43 compute-0 nova_compute[192810]: 2025-09-30 21:30:43.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:43 compute-0 podman[232165]: 2025-09-30 21:30:43.555167584 +0000 UTC m=+0.048879576 container remove 602ca3529f0c37590ba704daa718db092051d7d2df4cf480bb304dd0c64b1d86 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:30:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:43.562 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[526a47cf-7dfd-4d9f-a476-ea5ffb5888c6]: (4, ('Tue Sep 30 09:30:43 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 (602ca3529f0c37590ba704daa718db092051d7d2df4cf480bb304dd0c64b1d86)\n602ca3529f0c37590ba704daa718db092051d7d2df4cf480bb304dd0c64b1d86\nTue Sep 30 09:30:43 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 (602ca3529f0c37590ba704daa718db092051d7d2df4cf480bb304dd0c64b1d86)\n602ca3529f0c37590ba704daa718db092051d7d2df4cf480bb304dd0c64b1d86\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:43.564 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e82cb6ea-34e3-4e33-a570-26cee6989702]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:43.565 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9692dd1-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:30:43 compute-0 nova_compute[192810]: 2025-09-30 21:30:43.567 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:43 compute-0 kernel: tapf9692dd1-60: left promiscuous mode
Sep 30 21:30:43 compute-0 nova_compute[192810]: 2025-09-30 21:30:43.582 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:43.586 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[de1a8138-b823-47c1-b571-f09a731ac73a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:43.613 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[13fe0f5b-b78e-424a-b33e-79798c8b6fd5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:43.614 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[3570f36f-3470-4fdc-9b4b-2bc9d25d87ca]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:43.632 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[dfefa559-3a37-490d-bc4e-802270ac0758]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 451679, 'reachable_time': 35547, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232196, 'error': None, 'target': 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:43.635 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:30:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:43.635 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[b71c9269-9e63-4b22-8d35-9c9fa11bb673]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:43 compute-0 systemd[1]: run-netns-ovnmeta\x2df9692dd1\x2d658f\x2d4c07\x2d943c\x2d6bc662046dc4.mount: Deactivated successfully.
Sep 30 21:30:43 compute-0 nova_compute[192810]: 2025-09-30 21:30:43.796 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.908 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032', 'name': 'tempest-ServerDiskConfigTestJSON-server-543993129', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000051', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '72559935caa44fd9b779b6770f00199f', 'user_id': '648f7bb37eeb4003825636f9a7c1f92a', 'hostId': 'b4220d6e0d402c78f756d282be96947c3d4b60aff41bc94c1773ddcf', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.911 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '408b1a8f-ed4d-4d93-a98c-564af2bd678d', 'name': 'tempest-ServerActionsTestJSON-server-398758596', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000004e', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': '2af578a858a44374a3dc027bbf7c69f2', 'user_id': '22ed16bd4ffe4ef8bb21968a857066a1', 'hostId': 'c29ffc71cc596b4549b6636be1edd7fd7781f162454a909dea63d863', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.912 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.912 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.912 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-543993129>, <NovaLikeServer: tempest-ServerActionsTestJSON-server-398758596>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-543993129>, <NovaLikeServer: tempest-ServerActionsTestJSON-server-398758596>]
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.912 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.935 12 DEBUG ceilometer.compute.pollsters [-] c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.936 12 DEBUG ceilometer.compute.pollsters [-] c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.937 12 DEBUG ceilometer.compute.pollsters [-] Instance 408b1a8f-ed4d-4d93-a98c-564af2bd678d was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-0000004e, id=408b1a8f-ed4d-4d93-a98c-564af2bd678d>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '02870c07-71c6-4046-84c7-fb39d472cfe4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '648f7bb37eeb4003825636f9a7c1f92a', 'user_name': None, 'project_id': '72559935caa44fd9b779b6770f00199f', 'project_name': None, 'resource_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032-vda', 'timestamp': '2025-09-30T21:30:43.912863', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-543993129', 'name': 'instance-00000051', 'instance_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032', 'instance_type': 'm1.nano', 'host': 'b4220d6e0d402c78f756d282be96947c3d4b60aff41bc94c1773ddcf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b887e262-9e44-11f0-a153-fa163e09b122', 'monotonic_time': 4539.600348636, 'message_signature': '1bc032006b6163f8022cd5479581f2187f9809d71bed6d97334e88b4d442b49a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '648f7bb37eeb4003825636f9a7c1f92a', 'user_name': None, 'project_id': '72559935caa44fd9b779b6770f00199f', 'project_name': 
None, 'resource_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032-sda', 'timestamp': '2025-09-30T21:30:43.912863', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-543993129', 'name': 'instance-00000051', 'instance_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032', 'instance_type': 'm1.nano', 'host': 'b4220d6e0d402c78f756d282be96947c3d4b60aff41bc94c1773ddcf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b887fa18-9e44-11f0-a153-fa163e09b122', 'monotonic_time': 4539.600348636, 'message_signature': 'c033b12f6f202adbe06b4ae0e419ec5971e0c85c9f6ec3ef7e52451c24d9b4d7'}]}, 'timestamp': '2025-09-30 21:30:43.938121', '_unique_id': '4dd1b135e4ea4b3b94f59398fc28b6a6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.939 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.941 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.956 12 DEBUG ceilometer.compute.pollsters [-] c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.957 12 DEBUG ceilometer.compute.pollsters [-] c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:30:43 compute-0 nova_compute[192810]: 2025-09-30 21:30:43.957 2 DEBUG nova.compute.manager [req-84e3afe3-cdd4-4a56-9b1b-21e6699f096d req-b2e004aa-e1f1-43f0-a648-761119abff6c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Received event network-vif-unplugged-a4ba92a3-e019-4765-a096-89a660f1932c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:30:43 compute-0 nova_compute[192810]: 2025-09-30 21:30:43.958 2 DEBUG oslo_concurrency.lockutils [req-84e3afe3-cdd4-4a56-9b1b-21e6699f096d req-b2e004aa-e1f1-43f0-a648-761119abff6c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.958 12 DEBUG ceilometer.compute.pollsters [-] Instance 408b1a8f-ed4d-4d93-a98c-564af2bd678d was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-0000004e, id=408b1a8f-ed4d-4d93-a98c-564af2bd678d>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:30:43 compute-0 nova_compute[192810]: 2025-09-30 21:30:43.959 2 DEBUG oslo_concurrency.lockutils [req-84e3afe3-cdd4-4a56-9b1b-21e6699f096d req-b2e004aa-e1f1-43f0-a648-761119abff6c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:43 compute-0 nova_compute[192810]: 2025-09-30 21:30:43.960 2 DEBUG oslo_concurrency.lockutils [req-84e3afe3-cdd4-4a56-9b1b-21e6699f096d req-b2e004aa-e1f1-43f0-a648-761119abff6c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:43 compute-0 nova_compute[192810]: 2025-09-30 21:30:43.960 2 DEBUG nova.compute.manager [req-84e3afe3-cdd4-4a56-9b1b-21e6699f096d req-b2e004aa-e1f1-43f0-a648-761119abff6c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] No waiting events found dispatching network-vif-unplugged-a4ba92a3-e019-4765-a096-89a660f1932c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:30:43 compute-0 nova_compute[192810]: 2025-09-30 21:30:43.961 2 WARNING nova.compute.manager [req-84e3afe3-cdd4-4a56-9b1b-21e6699f096d req-b2e004aa-e1f1-43f0-a648-761119abff6c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Received unexpected event network-vif-unplugged-a4ba92a3-e019-4765-a096-89a660f1932c for instance with vm_state active and task_state powering-off.
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '13ae2550-25c8-4b68-bdca-b1280090e569', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '648f7bb37eeb4003825636f9a7c1f92a', 'user_name': None, 'project_id': '72559935caa44fd9b779b6770f00199f', 'project_name': None, 'resource_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032-vda', 'timestamp': '2025-09-30T21:30:43.941843', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-543993129', 'name': 'instance-00000051', 'instance_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032', 'instance_type': 'm1.nano', 'host': 'b4220d6e0d402c78f756d282be96947c3d4b60aff41bc94c1773ddcf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b88b0ca8-9e44-11f0-a153-fa163e09b122', 'monotonic_time': 4539.629362299, 'message_signature': '9622f19b3550086fe6c5a576be4cb47ca1cd303217bb03f06c17c7726769784a'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '648f7bb37eeb4003825636f9a7c1f92a', 'user_name': None, 'project_id': '72559935caa44fd9b779b6770f00199f', 'project_name': None, 'resource_id': 
'c3cd73be-ae82-4c19-8ab7-ec9b06134032-sda', 'timestamp': '2025-09-30T21:30:43.941843', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-543993129', 'name': 'instance-00000051', 'instance_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032', 'instance_type': 'm1.nano', 'host': 'b4220d6e0d402c78f756d282be96947c3d4b60aff41bc94c1773ddcf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b88b2b0c-9e44-11f0-a153-fa163e09b122', 'monotonic_time': 4539.629362299, 'message_signature': '9cb21171260597e12d9b67334b740c661fbdfd517c5572bc7899196f9ac9eb71'}]}, 'timestamp': '2025-09-30 21:30:43.959267', '_unique_id': 'ec892ef1de8f4779917f2ff2a543cd84'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.960 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.963 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.970 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for c3cd73be-ae82-4c19-8ab7-ec9b06134032 / tapc2d3f3fa-4e inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.970 12 DEBUG ceilometer.compute.pollsters [-] c3cd73be-ae82-4c19-8ab7-ec9b06134032/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.971 12 DEBUG ceilometer.compute.pollsters [-] Instance 408b1a8f-ed4d-4d93-a98c-564af2bd678d was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-0000004e, id=408b1a8f-ed4d-4d93-a98c-564af2bd678d>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '800aa83b-4414-41ad-9da6-44a1b391106e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '648f7bb37eeb4003825636f9a7c1f92a', 'user_name': None, 'project_id': '72559935caa44fd9b779b6770f00199f', 'project_name': None, 'resource_id': 'instance-00000051-c3cd73be-ae82-4c19-8ab7-ec9b06134032-tapc2d3f3fa-4e', 'timestamp': '2025-09-30T21:30:43.963578', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-543993129', 'name': 'tapc2d3f3fa-4e', 'instance_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032', 'instance_type': 'm1.nano', 'host': 'b4220d6e0d402c78f756d282be96947c3d4b60aff41bc94c1773ddcf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8a:8e:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc2d3f3fa-4e'}, 'message_id': 'b88d2a24-9e44-11f0-a153-fa163e09b122', 'monotonic_time': 4539.651124939, 'message_signature': 'a8796548a895df74d7080cd3b1518c175bfdfa8f17ccbee1297acd6bde17c9bb'}]}, 'timestamp': '2025-09-30 21:30:43.972135', '_unique_id': 'ef9df88ee5614b9a991a778a3fdaf2b5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.973 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.974 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.974 12 DEBUG ceilometer.compute.pollsters [-] c3cd73be-ae82-4c19-8ab7-ec9b06134032/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.978 12 DEBUG ceilometer.compute.pollsters [-] Instance 408b1a8f-ed4d-4d93-a98c-564af2bd678d was shut off while getting sample of network.incoming.packets.drop: Failed to inspect data of instance <name=instance-0000004e, id=408b1a8f-ed4d-4d93-a98c-564af2bd678d>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aa65c084-6fd0-4887-9a34-5129e2927bf0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '648f7bb37eeb4003825636f9a7c1f92a', 'user_name': None, 'project_id': '72559935caa44fd9b779b6770f00199f', 'project_name': None, 'resource_id': 'instance-00000051-c3cd73be-ae82-4c19-8ab7-ec9b06134032-tapc2d3f3fa-4e', 'timestamp': '2025-09-30T21:30:43.974691', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-543993129', 'name': 'tapc2d3f3fa-4e', 'instance_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032', 'instance_type': 'm1.nano', 'host': 'b4220d6e0d402c78f756d282be96947c3d4b60aff41bc94c1773ddcf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8a:8e:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc2d3f3fa-4e'}, 'message_id': 'b88dc722-9e44-11f0-a153-fa163e09b122', 'monotonic_time': 4539.651124939, 'message_signature': 'cb52f8776bc939effcddeb07a5e377b0598cd8b3651577bcd7f9137e35c88739'}]}, 'timestamp': '2025-09-30 21:30:43.979202', '_unique_id': 'cc518880d33f4d22996f23baa2986ef9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.980 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.981 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.981 12 DEBUG ceilometer.compute.pollsters [-] c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.982 12 DEBUG ceilometer.compute.pollsters [-] c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.986 12 DEBUG ceilometer.compute.pollsters [-] Instance 408b1a8f-ed4d-4d93-a98c-564af2bd678d was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-0000004e, id=408b1a8f-ed4d-4d93-a98c-564af2bd678d>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9e215d50-cd7d-4d83-8f59-0743e9d93583', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '648f7bb37eeb4003825636f9a7c1f92a', 'user_name': None, 'project_id': '72559935caa44fd9b779b6770f00199f', 'project_name': None, 'resource_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032-vda', 'timestamp': '2025-09-30T21:30:43.981715', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-543993129', 'name': 'instance-00000051', 'instance_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032', 'instance_type': 'm1.nano', 'host': 'b4220d6e0d402c78f756d282be96947c3d4b60aff41bc94c1773ddcf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b88edf90-9e44-11f0-a153-fa163e09b122', 'monotonic_time': 4539.629362299, 'message_signature': 'df9232d1cee2d870409c329f28b4bf2b2dd03a37c04c37bc8033459e34fe9820'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '648f7bb37eeb4003825636f9a7c1f92a', 'user_name': None, 'project_id': '72559935caa44fd9b779b6770f00199f', 'project_name': None, 'resource_id': 
'c3cd73be-ae82-4c19-8ab7-ec9b06134032-sda', 'timestamp': '2025-09-30T21:30:43.981715', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-543993129', 'name': 'instance-00000051', 'instance_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032', 'instance_type': 'm1.nano', 'host': 'b4220d6e0d402c78f756d282be96947c3d4b60aff41bc94c1773ddcf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b88eec56-9e44-11f0-a153-fa163e09b122', 'monotonic_time': 4539.629362299, 'message_signature': '5f585f6a9fe2ec19c36e13d4d402a7855dccc30ba099023419a43d11c5050698'}]}, 'timestamp': '2025-09-30 21:30:43.986454', '_unique_id': '06010c93d101439d980d9a6dba1b1428'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.987 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.988 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.988 12 DEBUG ceilometer.compute.pollsters [-] c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.989 12 DEBUG ceilometer.compute.pollsters [-] c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.993 12 DEBUG ceilometer.compute.pollsters [-] Instance 408b1a8f-ed4d-4d93-a98c-564af2bd678d was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-0000004e, id=408b1a8f-ed4d-4d93-a98c-564af2bd678d>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8cb704d5-3b33-47a7-8046-cc6cf3abe24a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '648f7bb37eeb4003825636f9a7c1f92a', 'user_name': None, 'project_id': '72559935caa44fd9b779b6770f00199f', 'project_name': None, 'resource_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032-vda', 'timestamp': '2025-09-30T21:30:43.988675', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-543993129', 'name': 'instance-00000051', 'instance_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032', 'instance_type': 'm1.nano', 'host': 'b4220d6e0d402c78f756d282be96947c3d4b60aff41bc94c1773ddcf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b88fe926-9e44-11f0-a153-fa163e09b122', 'monotonic_time': 4539.600348636, 'message_signature': '43f114a7e61005f3e2f6d4885de71a0feb23295e25d7ac32297d8c4af6aace7b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '648f7bb37eeb4003825636f9a7c1f92a', 'user_name': None, 'project_id': '72559935caa44fd9b779b6770f00199f', 'project_name': 
None, 'resource_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032-sda', 'timestamp': '2025-09-30T21:30:43.988675', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-543993129', 'name': 'instance-00000051', 'instance_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032', 'instance_type': 'm1.nano', 'host': 'b4220d6e0d402c78f756d282be96947c3d4b60aff41bc94c1773ddcf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b88ff6c8-9e44-11f0-a153-fa163e09b122', 'monotonic_time': 4539.600348636, 'message_signature': 'ac536c161874db40bca9d1c8558b7b42b6f66e3cd708c4eb78ffd4fe3dabc82d'}]}, 'timestamp': '2025-09-30 21:30:43.993528', '_unique_id': 'ae829bf89e4944698279aa2ebccaad80'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.994 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.995 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.995 12 DEBUG ceilometer.compute.pollsters [-] c3cd73be-ae82-4c19-8ab7-ec9b06134032/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.997 12 DEBUG ceilometer.compute.pollsters [-] Instance 408b1a8f-ed4d-4d93-a98c-564af2bd678d was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance <name=instance-0000004e, id=408b1a8f-ed4d-4d93-a98c-564af2bd678d>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5fca9f61-2e02-4f46-9e65-8e8946efeb92', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '648f7bb37eeb4003825636f9a7c1f92a', 'user_name': None, 'project_id': '72559935caa44fd9b779b6770f00199f', 'project_name': None, 'resource_id': 'instance-00000051-c3cd73be-ae82-4c19-8ab7-ec9b06134032-tapc2d3f3fa-4e', 'timestamp': '2025-09-30T21:30:43.995625', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-543993129', 'name': 'tapc2d3f3fa-4e', 'instance_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032', 'instance_type': 'm1.nano', 'host': 'b4220d6e0d402c78f756d282be96947c3d4b60aff41bc94c1773ddcf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8a:8e:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc2d3f3fa-4e'}, 'message_id': 'b890f60e-9e44-11f0-a153-fa163e09b122', 'monotonic_time': 4539.651124939, 'message_signature': '866a54176124832ded2f7136d5cd10465a537d2ebaf2dcaadb5d61cfc850bbf1'}]}, 'timestamp': '2025-09-30 21:30:43.997713', '_unique_id': '619f824aceb24517ae411c00899375a9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.998 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.999 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Sep 30 21:30:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:43.999 12 DEBUG ceilometer.compute.pollsters [-] c3cd73be-ae82-4c19-8ab7-ec9b06134032/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 DEBUG ceilometer.compute.pollsters [-] Instance 408b1a8f-ed4d-4d93-a98c-564af2bd678d was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance <name=instance-0000004e, id=408b1a8f-ed4d-4d93-a98c-564af2bd678d>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4b516324-28b6-4549-95e6-f2e6a1a8ab8f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '648f7bb37eeb4003825636f9a7c1f92a', 'user_name': None, 'project_id': '72559935caa44fd9b779b6770f00199f', 'project_name': None, 'resource_id': 'instance-00000051-c3cd73be-ae82-4c19-8ab7-ec9b06134032-tapc2d3f3fa-4e', 'timestamp': '2025-09-30T21:30:43.999195', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-543993129', 'name': 'tapc2d3f3fa-4e', 'instance_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032', 'instance_type': 'm1.nano', 'host': 'b4220d6e0d402c78f756d282be96947c3d4b60aff41bc94c1773ddcf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8a:8e:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc2d3f3fa-4e'}, 'message_id': 'b89180ec-9e44-11f0-a153-fa163e09b122', 'monotonic_time': 4539.651124939, 'message_signature': 'dfa818ce775f8182906544619a73ff486abc684391293bcc0fd0b704c4403a2b'}]}, 'timestamp': '2025-09-30 21:30:44.000231', '_unique_id': '9e48b5572f264703ab9bb8f940c3bd23'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.000 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.001 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.016 12 DEBUG ceilometer.compute.pollsters [-] c3cd73be-ae82-4c19-8ab7-ec9b06134032/cpu volume: 6080000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.017 12 DEBUG ceilometer.compute.pollsters [-] Instance 408b1a8f-ed4d-4d93-a98c-564af2bd678d was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-0000004e, id=408b1a8f-ed4d-4d93-a98c-564af2bd678d>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a5fb36a1-d1bd-4483-980a-dbe1d9cc25d1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6080000000, 'user_id': '648f7bb37eeb4003825636f9a7c1f92a', 'user_name': None, 'project_id': '72559935caa44fd9b779b6770f00199f', 'project_name': None, 'resource_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032', 'timestamp': '2025-09-30T21:30:44.001824', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-543993129', 'name': 'instance-00000051', 'instance_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032', 'instance_type': 'm1.nano', 'host': 'b4220d6e0d402c78f756d282be96947c3d4b60aff41bc94c1773ddcf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'b8941da2-9e44-11f0-a153-fa163e09b122', 'monotonic_time': 4539.703535744, 'message_signature': '53ef28f689a768261db63a6b0f32a497ff16e11ba79006fa6d69e8ec162d209b'}]}, 'timestamp': '2025-09-30 21:30:44.017509', '_unique_id': 'a82168044b244688a08e53b8239b7d39'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.018 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.019 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.019 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.019 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-543993129>, <NovaLikeServer: tempest-ServerActionsTestJSON-server-398758596>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-543993129>, <NovaLikeServer: tempest-ServerActionsTestJSON-server-398758596>]
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.019 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.019 12 DEBUG ceilometer.compute.pollsters [-] c3cd73be-ae82-4c19-8ab7-ec9b06134032/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.020 12 DEBUG ceilometer.compute.pollsters [-] Instance 408b1a8f-ed4d-4d93-a98c-564af2bd678d was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance <name=instance-0000004e, id=408b1a8f-ed4d-4d93-a98c-564af2bd678d>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '91c2845b-b869-4dca-9472-7f94ba5de1d3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '648f7bb37eeb4003825636f9a7c1f92a', 'user_name': None, 'project_id': '72559935caa44fd9b779b6770f00199f', 'project_name': None, 'resource_id': 'instance-00000051-c3cd73be-ae82-4c19-8ab7-ec9b06134032-tapc2d3f3fa-4e', 'timestamp': '2025-09-30T21:30:44.019759', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-543993129', 'name': 'tapc2d3f3fa-4e', 'instance_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032', 'instance_type': 'm1.nano', 'host': 'b4220d6e0d402c78f756d282be96947c3d4b60aff41bc94c1773ddcf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8a:8e:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc2d3f3fa-4e'}, 'message_id': 'b894a3ee-9e44-11f0-a153-fa163e09b122', 'monotonic_time': 4539.651124939, 'message_signature': 'd603211912a81ef635305bcaa58f3bf7f6abb6f627b82e106e841e36867e58ec'}]}, 'timestamp': '2025-09-30 21:30:44.020710', '_unique_id': '178c427e19a248758e7d9c0e80c583a2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.021 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.022 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.022 12 DEBUG ceilometer.compute.pollsters [-] c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.022 12 DEBUG ceilometer.compute.pollsters [-] c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.023 12 DEBUG ceilometer.compute.pollsters [-] Instance 408b1a8f-ed4d-4d93-a98c-564af2bd678d was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-0000004e, id=408b1a8f-ed4d-4d93-a98c-564af2bd678d>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '60dbec6b-154e-44e1-8133-b1c5e940dbca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '648f7bb37eeb4003825636f9a7c1f92a', 'user_name': None, 'project_id': '72559935caa44fd9b779b6770f00199f', 'project_name': None, 'resource_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032-vda', 'timestamp': '2025-09-30T21:30:44.022223', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-543993129', 'name': 'instance-00000051', 'instance_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032', 'instance_type': 'm1.nano', 'host': 'b4220d6e0d402c78f756d282be96947c3d4b60aff41bc94c1773ddcf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b895053c-9e44-11f0-a153-fa163e09b122', 'monotonic_time': 4539.600348636, 'message_signature': 'bb5cf5daffd59951a51031b09736b5a9f3bad4ce5160cd4ae01311cc69190435'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '648f7bb37eeb4003825636f9a7c1f92a', 'user_name': None, 'project_id': '72559935caa44fd9b779b6770f00199f', 'project_name': None, 'resource_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032-sda', 'timestamp': '2025-09-30T21:30:44.022223', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-543993129', 'name': 'instance-00000051', 'instance_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032', 'instance_type': 'm1.nano', 'host': 'b4220d6e0d402c78f756d282be96947c3d4b60aff41bc94c1773ddcf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b895102c-9e44-11f0-a153-fa163e09b122', 'monotonic_time': 4539.600348636, 'message_signature': 'b1cb45bda3918ec54e974272ee7e91c1e75d715bf4ce87785a483a5bcb085c7c'}]}, 'timestamp': '2025-09-30 21:30:44.023549', '_unique_id': '51e290283cb94d73871a2279016cf68a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.024 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.025 12 DEBUG ceilometer.compute.pollsters [-] c3cd73be-ae82-4c19-8ab7-ec9b06134032/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.025 12 DEBUG ceilometer.compute.pollsters [-] Instance 408b1a8f-ed4d-4d93-a98c-564af2bd678d was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance <name=instance-0000004e, id=408b1a8f-ed4d-4d93-a98c-564af2bd678d>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4a7581e9-5522-48b1-bfd3-e7d30825f39f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '648f7bb37eeb4003825636f9a7c1f92a', 'user_name': None, 'project_id': '72559935caa44fd9b779b6770f00199f', 'project_name': None, 'resource_id': 'instance-00000051-c3cd73be-ae82-4c19-8ab7-ec9b06134032-tapc2d3f3fa-4e', 'timestamp': '2025-09-30T21:30:44.025070', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-543993129', 'name': 'tapc2d3f3fa-4e', 'instance_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032', 'instance_type': 'm1.nano', 'host': 'b4220d6e0d402c78f756d282be96947c3d4b60aff41bc94c1773ddcf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8a:8e:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc2d3f3fa-4e'}, 'message_id': 'b8957382-9e44-11f0-a153-fa163e09b122', 'monotonic_time': 4539.651124939, 'message_signature': '770ff343a8b8ea50856d2584758ca7a2412a3c3c4469f554d4df54dbba14405b'}]}, 'timestamp': '2025-09-30 21:30:44.026027', '_unique_id': 'e936391b8b6046899a5391751a568432'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.026 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.027 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.027 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.027 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-543993129>, <NovaLikeServer: tempest-ServerActionsTestJSON-server-398758596>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-543993129>, <NovaLikeServer: tempest-ServerActionsTestJSON-server-398758596>]
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.027 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.027 12 DEBUG ceilometer.compute.pollsters [-] c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.028 12 DEBUG ceilometer.compute.pollsters [-] c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.028 12 DEBUG ceilometer.compute.pollsters [-] Instance 408b1a8f-ed4d-4d93-a98c-564af2bd678d was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-0000004e, id=408b1a8f-ed4d-4d93-a98c-564af2bd678d>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f2ee71dd-a2ff-40dc-83e6-121cff7dc6ff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '648f7bb37eeb4003825636f9a7c1f92a', 'user_name': None, 'project_id': '72559935caa44fd9b779b6770f00199f', 'project_name': None, 'resource_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032-vda', 'timestamp': '2025-09-30T21:30:44.027871', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-543993129', 'name': 'instance-00000051', 'instance_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032', 'instance_type': 'm1.nano', 'host': 'b4220d6e0d402c78f756d282be96947c3d4b60aff41bc94c1773ddcf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b895e074-9e44-11f0-a153-fa163e09b122', 'monotonic_time': 4539.600348636, 'message_signature': '058504e4d63d7b7d88259d73b40130d9ca6c7b30e06d35b05e512cadadf4508c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '648f7bb37eeb4003825636f9a7c1f92a', 'user_name': None, 'project_id': '72559935caa44fd9b779b6770f00199f', 'project_name': None, 'resource_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032-sda', 'timestamp': '2025-09-30T21:30:44.027871', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-543993129', 'name': 'instance-00000051', 'instance_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032', 'instance_type': 'm1.nano', 'host': 'b4220d6e0d402c78f756d282be96947c3d4b60aff41bc94c1773ddcf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b895ea2e-9e44-11f0-a153-fa163e09b122', 'monotonic_time': 4539.600348636, 'message_signature': 'fff556e2f6dfadc81a6fe9ff35f866700a66706c27626d7957657d317bb2f65b'}]}, 'timestamp': '2025-09-30 21:30:44.029012', '_unique_id': '1a3eeb360e604e6d851731411178efbf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.029 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.030 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.030 12 DEBUG ceilometer.compute.pollsters [-] c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk.device.read.latency volume: 505192245 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.030 12 DEBUG ceilometer.compute.pollsters [-] c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk.device.read.latency volume: 4671198 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.031 12 DEBUG ceilometer.compute.pollsters [-] Instance 408b1a8f-ed4d-4d93-a98c-564af2bd678d was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-0000004e, id=408b1a8f-ed4d-4d93-a98c-564af2bd678d>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c7d83366-5328-4a4a-8cb8-e0178dc8a763', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 505192245, 'user_id': '648f7bb37eeb4003825636f9a7c1f92a', 'user_name': None, 'project_id': '72559935caa44fd9b779b6770f00199f', 'project_name': None, 'resource_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032-vda', 'timestamp': '2025-09-30T21:30:44.030402', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-543993129', 'name': 'instance-00000051', 'instance_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032', 'instance_type': 'm1.nano', 'host': 'b4220d6e0d402c78f756d282be96947c3d4b60aff41bc94c1773ddcf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b8964406-9e44-11f0-a153-fa163e09b122', 'monotonic_time': 4539.600348636, 'message_signature': '5ad7a921f48b3b1692f1b9e6acf3c50005c499259910403691c9a8e3ed3b16ac'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4671198, 'user_id': '648f7bb37eeb4003825636f9a7c1f92a', 'user_name': None, 'project_id': '72559935caa44fd9b779b6770f00199f', 'project_name': None, 'resource_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032-sda', 'timestamp': '2025-09-30T21:30:44.030402', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-543993129', 'name': 'instance-00000051', 'instance_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032', 'instance_type': 'm1.nano', 'host': 'b4220d6e0d402c78f756d282be96947c3d4b60aff41bc94c1773ddcf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b8964d34-9e44-11f0-a153-fa163e09b122', 'monotonic_time': 4539.600348636, 'message_signature': '2ce1b4d095dda97c2bf9a18d8a05d6af402716d6fa4430086dcdf7a155787adb'}]}, 'timestamp': '2025-09-30 21:30:44.031517', '_unique_id': 'e91b817fd03a4b57a6c85d575a458411'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.032 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.033 12 DEBUG ceilometer.compute.pollsters [-] c3cd73be-ae82-4c19-8ab7-ec9b06134032/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.033 12 DEBUG ceilometer.compute.pollsters [-] Instance 408b1a8f-ed4d-4d93-a98c-564af2bd678d was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-0000004e, id=408b1a8f-ed4d-4d93-a98c-564af2bd678d>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '24e04c27-223b-4199-8156-71628c383460', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '648f7bb37eeb4003825636f9a7c1f92a', 'user_name': None, 'project_id': '72559935caa44fd9b779b6770f00199f', 'project_name': None, 'resource_id': 'instance-00000051-c3cd73be-ae82-4c19-8ab7-ec9b06134032-tapc2d3f3fa-4e', 'timestamp': '2025-09-30T21:30:44.033049', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-543993129', 'name': 'tapc2d3f3fa-4e', 'instance_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032', 'instance_type': 'm1.nano', 'host': 'b4220d6e0d402c78f756d282be96947c3d4b60aff41bc94c1773ddcf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8a:8e:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc2d3f3fa-4e'}, 'message_id': 'b896ab76-9e44-11f0-a153-fa163e09b122', 'monotonic_time': 4539.651124939, 'message_signature': '930e4049f047ba437991036d6b38361a8776af2d3da7653eaf22825dcccf1988'}]}, 'timestamp': '2025-09-30 21:30:44.034069', '_unique_id': '0fed6cf41339496da1b21ee85b13eec5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.034 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.035 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.035 12 DEBUG ceilometer.compute.pollsters [-] c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.035 12 DEBUG ceilometer.compute.pollsters [-] c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.036 12 DEBUG ceilometer.compute.pollsters [-] Instance 408b1a8f-ed4d-4d93-a98c-564af2bd678d was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-0000004e, id=408b1a8f-ed4d-4d93-a98c-564af2bd678d>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '025c7c15-2b89-46ba-9994-4ec5e03ff496', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '648f7bb37eeb4003825636f9a7c1f92a', 'user_name': None, 'project_id': '72559935caa44fd9b779b6770f00199f', 'project_name': None, 'resource_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032-vda', 'timestamp': '2025-09-30T21:30:44.035503', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-543993129', 'name': 'instance-00000051', 'instance_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032', 'instance_type': 'm1.nano', 'host': 'b4220d6e0d402c78f756d282be96947c3d4b60aff41bc94c1773ddcf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b8970b48-9e44-11f0-a153-fa163e09b122', 'monotonic_time': 4539.600348636, 'message_signature': '625a9272698aa864ef8360affb5597a4c37e20b2dcafca50e458e718a6513f7e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '648f7bb37eeb4003825636f9a7c1f92a', 'user_name': None, 'project_id': '72559935caa44fd9b779b6770f00199f', 'project_name': None, 'resource_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032-sda', 'timestamp': '2025-09-30T21:30:44.035503', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-543993129', 'name': 'instance-00000051', 'instance_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032', 'instance_type': 'm1.nano', 'host': 'b4220d6e0d402c78f756d282be96947c3d4b60aff41bc94c1773ddcf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b8971462-9e44-11f0-a153-fa163e09b122', 'monotonic_time': 4539.600348636, 'message_signature': '6237b3acf4abcc7880ce81c04410c9dff1a8c200d4e7f6b9acdb0d2977aad2ea'}]}, 'timestamp': '2025-09-30 21:30:44.036755', '_unique_id': 'eb973a4cac0f4711bf6034d42174df45'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.037 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.038 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.038 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.038 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-543993129>, <NovaLikeServer: tempest-ServerActionsTestJSON-server-398758596>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerDiskConfigTestJSON-server-543993129>, <NovaLikeServer: tempest-ServerActionsTestJSON-server-398758596>]
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.038 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.038 12 DEBUG ceilometer.compute.pollsters [-] c3cd73be-ae82-4c19-8ab7-ec9b06134032/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 DEBUG ceilometer.compute.pollsters [-] Instance 408b1a8f-ed4d-4d93-a98c-564af2bd678d was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance <name=instance-0000004e, id=408b1a8f-ed4d-4d93-a98c-564af2bd678d>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '302df5d9-4755-4cb1-a3ad-fe787c7bfb8c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 110, 'user_id': '648f7bb37eeb4003825636f9a7c1f92a', 'user_name': None, 'project_id': '72559935caa44fd9b779b6770f00199f', 'project_name': None, 'resource_id': 'instance-00000051-c3cd73be-ae82-4c19-8ab7-ec9b06134032-tapc2d3f3fa-4e', 'timestamp': '2025-09-30T21:30:44.038630', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-543993129', 'name': 'tapc2d3f3fa-4e', 'instance_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032', 'instance_type': 'm1.nano', 'host': 'b4220d6e0d402c78f756d282be96947c3d4b60aff41bc94c1773ddcf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8a:8e:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc2d3f3fa-4e'}, 'message_id': 'b89784e2-9e44-11f0-a153-fa163e09b122', 'monotonic_time': 4539.651124939, 'message_signature': 'db480b34da160ea2b7b8e160689a6a06f55a560e9582409a608c622b0fb76279'}]}, 'timestamp': '2025-09-30 21:30:44.039457', '_unique_id': 'c91751c8aabc46848380f920981f4fa7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.039 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.040 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.040 12 DEBUG ceilometer.compute.pollsters [-] c3cd73be-ae82-4c19-8ab7-ec9b06134032/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.041 12 DEBUG ceilometer.compute.pollsters [-] Instance 408b1a8f-ed4d-4d93-a98c-564af2bd678d was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance <name=instance-0000004e, id=408b1a8f-ed4d-4d93-a98c-564af2bd678d>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7ad248a5-6c1d-4e53-b02f-ad9a2a987119', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '648f7bb37eeb4003825636f9a7c1f92a', 'user_name': None, 'project_id': '72559935caa44fd9b779b6770f00199f', 'project_name': None, 'resource_id': 'instance-00000051-c3cd73be-ae82-4c19-8ab7-ec9b06134032-tapc2d3f3fa-4e', 'timestamp': '2025-09-30T21:30:44.040913', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-543993129', 'name': 'tapc2d3f3fa-4e', 'instance_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032', 'instance_type': 'm1.nano', 'host': 'b4220d6e0d402c78f756d282be96947c3d4b60aff41bc94c1773ddcf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8a:8e:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc2d3f3fa-4e'}, 'message_id': 'b897de06-9e44-11f0-a153-fa163e09b122', 'monotonic_time': 4539.651124939, 'message_signature': '0242700357d2823319d7e200eb49613aa45c48b03fb71e2e05e34e18bed0c1d9'}]}, 'timestamp': '2025-09-30 21:30:44.041751', '_unique_id': 'e2c797737b774eda9342a66fcc7f930a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.042 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.043 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.043 12 DEBUG ceilometer.compute.pollsters [-] c3cd73be-ae82-4c19-8ab7-ec9b06134032/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.043 12 DEBUG ceilometer.compute.pollsters [-] Instance 408b1a8f-ed4d-4d93-a98c-564af2bd678d was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance <name=instance-0000004e, id=408b1a8f-ed4d-4d93-a98c-564af2bd678d>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6837438d-c653-4df5-acf5-9efc73aeb7ba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '648f7bb37eeb4003825636f9a7c1f92a', 'user_name': None, 'project_id': '72559935caa44fd9b779b6770f00199f', 'project_name': None, 'resource_id': 'instance-00000051-c3cd73be-ae82-4c19-8ab7-ec9b06134032-tapc2d3f3fa-4e', 'timestamp': '2025-09-30T21:30:44.043226', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-543993129', 'name': 'tapc2d3f3fa-4e', 'instance_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032', 'instance_type': 'm1.nano', 'host': 'b4220d6e0d402c78f756d282be96947c3d4b60aff41bc94c1773ddcf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8a:8e:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc2d3f3fa-4e'}, 'message_id': 'b898386a-9e44-11f0-a153-fa163e09b122', 'monotonic_time': 4539.651124939, 'message_signature': '478a0091603865891cacde61cde5eaf9a14817c385c09ce6de06aa6c78ff38de'}]}, 'timestamp': '2025-09-30 21:30:44.044095', '_unique_id': '667a07e792714c28971e16fcf71b9278'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.044 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.045 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.045 12 DEBUG ceilometer.compute.pollsters [-] c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.045 12 DEBUG ceilometer.compute.pollsters [-] c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.046 12 DEBUG ceilometer.compute.pollsters [-] Instance 408b1a8f-ed4d-4d93-a98c-564af2bd678d was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-0000004e, id=408b1a8f-ed4d-4d93-a98c-564af2bd678d>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd8781500-4bc2-4831-8c58-df73e87f88bc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '648f7bb37eeb4003825636f9a7c1f92a', 'user_name': None, 'project_id': '72559935caa44fd9b779b6770f00199f', 'project_name': None, 'resource_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032-vda', 'timestamp': '2025-09-30T21:30:44.045653', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-543993129', 'name': 'instance-00000051', 'instance_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032', 'instance_type': 'm1.nano', 'host': 'b4220d6e0d402c78f756d282be96947c3d4b60aff41bc94c1773ddcf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b8989742-9e44-11f0-a153-fa163e09b122', 'monotonic_time': 4539.629362299, 'message_signature': '2a5ec2444d4490769aaf71e6567834df8a12f139355c971b3b8e766ef98c6519'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '648f7bb37eeb4003825636f9a7c1f92a', 'user_name': None, 'project_id': '72559935caa44fd9b779b6770f00199f', 'project_name': None, 'resource_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032-sda', 'timestamp': '2025-09-30T21:30:44.045653', 'resource_metadata': {'display_name': 'tempest-ServerDiskConfigTestJSON-server-543993129', 'name': 'instance-00000051', 'instance_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032', 'instance_type': 'm1.nano', 'host': 'b4220d6e0d402c78f756d282be96947c3d4b60aff41bc94c1773ddcf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b898a0ca-9e44-11f0-a153-fa163e09b122', 'monotonic_time': 4539.629362299, 'message_signature': 'de41b7754c985482cd744fb2615bc8edf7a6eb5b1476000e3b0edbc25391fd14'}]}, 'timestamp': '2025-09-30 21:30:44.046684', '_unique_id': '9c637e91d2654df2919461493c2d08a1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.047 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.048 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.048 12 DEBUG ceilometer.compute.pollsters [-] c3cd73be-ae82-4c19-8ab7-ec9b06134032/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.048 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance c3cd73be-ae82-4c19-8ab7-ec9b06134032: ceilometer.compute.pollsters.NoVolumeException
Sep 30 21:30:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:30:44.049 12 DEBUG ceilometer.compute.pollsters [-] Instance 408b1a8f-ed4d-4d93-a98c-564af2bd678d was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-0000004e, id=408b1a8f-ed4d-4d93-a98c-564af2bd678d>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:30:44 compute-0 nova_compute[192810]: 2025-09-30 21:30:44.095 2 INFO nova.virt.libvirt.driver [None req-4d0a7fbe-5f86-4b97-a3c4-1ca244745203 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Instance shutdown successfully after 3 seconds.
Sep 30 21:30:44 compute-0 nova_compute[192810]: 2025-09-30 21:30:44.100 2 INFO nova.virt.libvirt.driver [-] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Instance destroyed successfully.
Sep 30 21:30:44 compute-0 nova_compute[192810]: 2025-09-30 21:30:44.100 2 DEBUG nova.objects.instance [None req-4d0a7fbe-5f86-4b97-a3c4-1ca244745203 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'numa_topology' on Instance uuid 408b1a8f-ed4d-4d93-a98c-564af2bd678d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:30:44 compute-0 nova_compute[192810]: 2025-09-30 21:30:44.116 2 DEBUG nova.compute.manager [None req-4d0a7fbe-5f86-4b97-a3c4-1ca244745203 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:30:44 compute-0 nova_compute[192810]: 2025-09-30 21:30:44.253 2 DEBUG oslo_concurrency.lockutils [None req-4d0a7fbe-5f86-4b97-a3c4-1ca244745203 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 3.427s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:44 compute-0 nova_compute[192810]: 2025-09-30 21:30:44.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:30:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:45.162 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:30:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:45.163 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:30:45 compute-0 nova_compute[192810]: 2025-09-30 21:30:45.164 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:46 compute-0 nova_compute[192810]: 2025-09-30 21:30:46.124 2 DEBUG nova.compute.manager [req-3f24d37f-2f02-4801-a4ab-ca6ba0fa9e7c req-fa164fc4-9c37-4145-a712-9b9e4f09fc57 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Received event network-vif-plugged-a4ba92a3-e019-4765-a096-89a660f1932c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:30:46 compute-0 nova_compute[192810]: 2025-09-30 21:30:46.125 2 DEBUG oslo_concurrency.lockutils [req-3f24d37f-2f02-4801-a4ab-ca6ba0fa9e7c req-fa164fc4-9c37-4145-a712-9b9e4f09fc57 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:46 compute-0 nova_compute[192810]: 2025-09-30 21:30:46.125 2 DEBUG oslo_concurrency.lockutils [req-3f24d37f-2f02-4801-a4ab-ca6ba0fa9e7c req-fa164fc4-9c37-4145-a712-9b9e4f09fc57 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:46 compute-0 nova_compute[192810]: 2025-09-30 21:30:46.125 2 DEBUG oslo_concurrency.lockutils [req-3f24d37f-2f02-4801-a4ab-ca6ba0fa9e7c req-fa164fc4-9c37-4145-a712-9b9e4f09fc57 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:46 compute-0 nova_compute[192810]: 2025-09-30 21:30:46.126 2 DEBUG nova.compute.manager [req-3f24d37f-2f02-4801-a4ab-ca6ba0fa9e7c req-fa164fc4-9c37-4145-a712-9b9e4f09fc57 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] No waiting events found dispatching network-vif-plugged-a4ba92a3-e019-4765-a096-89a660f1932c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:30:46 compute-0 nova_compute[192810]: 2025-09-30 21:30:46.126 2 WARNING nova.compute.manager [req-3f24d37f-2f02-4801-a4ab-ca6ba0fa9e7c req-fa164fc4-9c37-4145-a712-9b9e4f09fc57 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Received unexpected event network-vif-plugged-a4ba92a3-e019-4765-a096-89a660f1932c for instance with vm_state stopped and task_state None.
Sep 30 21:30:46 compute-0 podman[232198]: 2025-09-30 21:30:46.331564896 +0000 UTC m=+0.057616807 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2)
Sep 30 21:30:46 compute-0 podman[232197]: 2025-09-30 21:30:46.366670523 +0000 UTC m=+0.097475444 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 21:30:46 compute-0 nova_compute[192810]: 2025-09-30 21:30:46.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:30:46 compute-0 nova_compute[192810]: 2025-09-30 21:30:46.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:30:46 compute-0 nova_compute[192810]: 2025-09-30 21:30:46.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:30:47 compute-0 nova_compute[192810]: 2025-09-30 21:30:47.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:48 compute-0 podman[232243]: 2025-09-30 21:30:48.355833469 +0000 UTC m=+0.087158974 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 21:30:48 compute-0 nova_compute[192810]: 2025-09-30 21:30:48.783 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:30:48 compute-0 nova_compute[192810]: 2025-09-30 21:30:48.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:30:48 compute-0 nova_compute[192810]: 2025-09-30 21:30:48.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:48 compute-0 nova_compute[192810]: 2025-09-30 21:30:48.879 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:48 compute-0 nova_compute[192810]: 2025-09-30 21:30:48.880 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:48 compute-0 nova_compute[192810]: 2025-09-30 21:30:48.880 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:48 compute-0 nova_compute[192810]: 2025-09-30 21:30:48.880 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:30:48 compute-0 nova_compute[192810]: 2025-09-30 21:30:48.974 2 DEBUG nova.objects.instance [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'flavor' on Instance uuid 408b1a8f-ed4d-4d93-a98c-564af2bd678d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:30:48 compute-0 nova_compute[192810]: 2025-09-30 21:30:48.983 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:49 compute-0 nova_compute[192810]: 2025-09-30 21:30:49.013 2 DEBUG nova.objects.instance [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'info_cache' on Instance uuid 408b1a8f-ed4d-4d93-a98c-564af2bd678d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:30:49 compute-0 nova_compute[192810]: 2025-09-30 21:30:49.046 2 DEBUG oslo_concurrency.lockutils [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "refresh_cache-408b1a8f-ed4d-4d93-a98c-564af2bd678d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:30:49 compute-0 nova_compute[192810]: 2025-09-30 21:30:49.046 2 DEBUG oslo_concurrency.lockutils [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquired lock "refresh_cache-408b1a8f-ed4d-4d93-a98c-564af2bd678d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:30:49 compute-0 nova_compute[192810]: 2025-09-30 21:30:49.048 2 DEBUG nova.network.neutron [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:30:49 compute-0 nova_compute[192810]: 2025-09-30 21:30:49.067 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:49 compute-0 nova_compute[192810]: 2025-09-30 21:30:49.067 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:49 compute-0 nova_compute[192810]: 2025-09-30 21:30:49.132 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:49 compute-0 nova_compute[192810]: 2025-09-30 21:30:49.137 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/408b1a8f-ed4d-4d93-a98c-564af2bd678d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:49 compute-0 nova_compute[192810]: 2025-09-30 21:30:49.194 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/408b1a8f-ed4d-4d93-a98c-564af2bd678d/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:49 compute-0 nova_compute[192810]: 2025-09-30 21:30:49.195 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/408b1a8f-ed4d-4d93-a98c-564af2bd678d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:49 compute-0 nova_compute[192810]: 2025-09-30 21:30:49.287 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/408b1a8f-ed4d-4d93-a98c-564af2bd678d/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:49 compute-0 nova_compute[192810]: 2025-09-30 21:30:49.425 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:30:49 compute-0 nova_compute[192810]: 2025-09-30 21:30:49.426 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5505MB free_disk=73.26266860961914GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:30:49 compute-0 nova_compute[192810]: 2025-09-30 21:30:49.426 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:49 compute-0 nova_compute[192810]: 2025-09-30 21:30:49.426 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:49 compute-0 nova_compute[192810]: 2025-09-30 21:30:49.524 2 INFO nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Updating resource usage from migration 4741a102-4774-4853-b0df-0b90788b2a9d
Sep 30 21:30:49 compute-0 nova_compute[192810]: 2025-09-30 21:30:49.564 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Instance 408b1a8f-ed4d-4d93-a98c-564af2bd678d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:30:49 compute-0 nova_compute[192810]: 2025-09-30 21:30:49.565 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Migration 4741a102-4774-4853-b0df-0b90788b2a9d is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Sep 30 21:30:49 compute-0 nova_compute[192810]: 2025-09-30 21:30:49.565 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:30:49 compute-0 nova_compute[192810]: 2025-09-30 21:30:49.565 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:30:49 compute-0 nova_compute[192810]: 2025-09-30 21:30:49.635 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:30:49 compute-0 nova_compute[192810]: 2025-09-30 21:30:49.658 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:30:49 compute-0 nova_compute[192810]: 2025-09-30 21:30:49.711 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:30:49 compute-0 nova_compute[192810]: 2025-09-30 21:30:49.711 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.285s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:49 compute-0 nova_compute[192810]: 2025-09-30 21:30:49.939 2 DEBUG oslo_concurrency.lockutils [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Acquiring lock "refresh_cache-c3cd73be-ae82-4c19-8ab7-ec9b06134032" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:30:49 compute-0 nova_compute[192810]: 2025-09-30 21:30:49.940 2 DEBUG oslo_concurrency.lockutils [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Acquired lock "refresh_cache-c3cd73be-ae82-4c19-8ab7-ec9b06134032" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:30:49 compute-0 nova_compute[192810]: 2025-09-30 21:30:49.940 2 DEBUG nova.network.neutron [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:30:50 compute-0 nova_compute[192810]: 2025-09-30 21:30:50.712 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:30:50 compute-0 nova_compute[192810]: 2025-09-30 21:30:50.713 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:30:50 compute-0 nova_compute[192810]: 2025-09-30 21:30:50.713 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:30:50 compute-0 nova_compute[192810]: 2025-09-30 21:30:50.764 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "refresh_cache-408b1a8f-ed4d-4d93-a98c-564af2bd678d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:30:50 compute-0 ovn_controller[94912]: 2025-09-30T21:30:50Z|00032|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:8a:8e:e0 10.100.0.3
Sep 30 21:30:50 compute-0 ovn_controller[94912]: 2025-09-30T21:30:50Z|00033|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8a:8e:e0 10.100.0.3
Sep 30 21:30:52 compute-0 nova_compute[192810]: 2025-09-30 21:30:52.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:52 compute-0 nova_compute[192810]: 2025-09-30 21:30:52.798 2 DEBUG nova.network.neutron [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Updating instance_info_cache with network_info: [{"id": "a4ba92a3-e019-4765-a096-89a660f1932c", "address": "fa:16:3e:07:78:65", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4ba92a3-e0", "ovs_interfaceid": "a4ba92a3-e019-4765-a096-89a660f1932c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:30:52 compute-0 nova_compute[192810]: 2025-09-30 21:30:52.828 2 DEBUG oslo_concurrency.lockutils [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Releasing lock "refresh_cache-408b1a8f-ed4d-4d93-a98c-564af2bd678d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:30:52 compute-0 nova_compute[192810]: 2025-09-30 21:30:52.830 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquired lock "refresh_cache-408b1a8f-ed4d-4d93-a98c-564af2bd678d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:30:52 compute-0 nova_compute[192810]: 2025-09-30 21:30:52.830 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Sep 30 21:30:52 compute-0 nova_compute[192810]: 2025-09-30 21:30:52.830 2 DEBUG nova.objects.instance [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lazy-loading 'info_cache' on Instance uuid 408b1a8f-ed4d-4d93-a98c-564af2bd678d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:30:52 compute-0 nova_compute[192810]: 2025-09-30 21:30:52.862 2 INFO nova.virt.libvirt.driver [-] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Instance destroyed successfully.
Sep 30 21:30:52 compute-0 nova_compute[192810]: 2025-09-30 21:30:52.863 2 DEBUG nova.objects.instance [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'numa_topology' on Instance uuid 408b1a8f-ed4d-4d93-a98c-564af2bd678d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:30:52 compute-0 nova_compute[192810]: 2025-09-30 21:30:52.875 2 DEBUG nova.objects.instance [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'resources' on Instance uuid 408b1a8f-ed4d-4d93-a98c-564af2bd678d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:30:52 compute-0 nova_compute[192810]: 2025-09-30 21:30:52.886 2 DEBUG nova.virt.libvirt.vif [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:30:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-398758596',display_name='tempest-ServerActionsTestJSON-server-398758596',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-398758596',id=78,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBADj3eZ6JfWn1sD61WsBF2lWMwpE7XLjMHeX5D51ZTuvFj593BvRFZjp02OuEwvTUJEH79lLLcgJlYP5+6PE14q16iBV+2oZvdFvdVW4CAPM3S7plfjHeuzOdoE0D4V+KA==',key_name='tempest-keypair-557988176',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:30:30Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='2af578a858a44374a3dc027bbf7c69f2',ramdisk_id='',reservation_id='r-yikggmvz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1867667353',owner_user_name='tempest-ServerActionsTestJSON-1867667353-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:30:44Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='22ed16bd4ffe4ef8bb21968a857066a1',uuid=408b1a8f-ed4d-4d93-a98c-564af2bd678d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "a4ba92a3-e019-4765-a096-89a660f1932c", "address": "fa:16:3e:07:78:65", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4ba92a3-e0", "ovs_interfaceid": "a4ba92a3-e019-4765-a096-89a660f1932c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:30:52 compute-0 nova_compute[192810]: 2025-09-30 21:30:52.887 2 DEBUG nova.network.os_vif_util [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converting VIF {"id": "a4ba92a3-e019-4765-a096-89a660f1932c", "address": "fa:16:3e:07:78:65", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4ba92a3-e0", "ovs_interfaceid": "a4ba92a3-e019-4765-a096-89a660f1932c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:30:52 compute-0 nova_compute[192810]: 2025-09-30 21:30:52.887 2 DEBUG nova.network.os_vif_util [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:07:78:65,bridge_name='br-int',has_traffic_filtering=True,id=a4ba92a3-e019-4765-a096-89a660f1932c,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa4ba92a3-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:30:52 compute-0 nova_compute[192810]: 2025-09-30 21:30:52.888 2 DEBUG os_vif [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:07:78:65,bridge_name='br-int',has_traffic_filtering=True,id=a4ba92a3-e019-4765-a096-89a660f1932c,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa4ba92a3-e0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:30:52 compute-0 nova_compute[192810]: 2025-09-30 21:30:52.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:52 compute-0 nova_compute[192810]: 2025-09-30 21:30:52.890 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa4ba92a3-e0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:30:52 compute-0 nova_compute[192810]: 2025-09-30 21:30:52.891 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:52 compute-0 nova_compute[192810]: 2025-09-30 21:30:52.893 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:52 compute-0 nova_compute[192810]: 2025-09-30 21:30:52.896 2 INFO os_vif [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:07:78:65,bridge_name='br-int',has_traffic_filtering=True,id=a4ba92a3-e019-4765-a096-89a660f1932c,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa4ba92a3-e0')
Sep 30 21:30:52 compute-0 nova_compute[192810]: 2025-09-30 21:30:52.901 2 DEBUG nova.virt.libvirt.driver [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Start _get_guest_xml network_info=[{"id": "a4ba92a3-e019-4765-a096-89a660f1932c", "address": "fa:16:3e:07:78:65", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4ba92a3-e0", "ovs_interfaceid": "a4ba92a3-e019-4765-a096-89a660f1932c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:30:52 compute-0 nova_compute[192810]: 2025-09-30 21:30:52.905 2 WARNING nova.virt.libvirt.driver [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:30:52 compute-0 nova_compute[192810]: 2025-09-30 21:30:52.912 2 DEBUG nova.virt.libvirt.host [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:30:52 compute-0 nova_compute[192810]: 2025-09-30 21:30:52.913 2 DEBUG nova.virt.libvirt.host [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:30:52 compute-0 nova_compute[192810]: 2025-09-30 21:30:52.919 2 DEBUG nova.virt.libvirt.host [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:30:52 compute-0 nova_compute[192810]: 2025-09-30 21:30:52.920 2 DEBUG nova.virt.libvirt.host [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:30:52 compute-0 nova_compute[192810]: 2025-09-30 21:30:52.921 2 DEBUG nova.virt.libvirt.driver [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:30:52 compute-0 nova_compute[192810]: 2025-09-30 21:30:52.921 2 DEBUG nova.virt.hardware [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:30:52 compute-0 nova_compute[192810]: 2025-09-30 21:30:52.921 2 DEBUG nova.virt.hardware [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:30:52 compute-0 nova_compute[192810]: 2025-09-30 21:30:52.921 2 DEBUG nova.virt.hardware [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:30:52 compute-0 nova_compute[192810]: 2025-09-30 21:30:52.922 2 DEBUG nova.virt.hardware [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:30:52 compute-0 nova_compute[192810]: 2025-09-30 21:30:52.922 2 DEBUG nova.virt.hardware [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:30:52 compute-0 nova_compute[192810]: 2025-09-30 21:30:52.922 2 DEBUG nova.virt.hardware [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:30:52 compute-0 nova_compute[192810]: 2025-09-30 21:30:52.922 2 DEBUG nova.virt.hardware [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:30:52 compute-0 nova_compute[192810]: 2025-09-30 21:30:52.922 2 DEBUG nova.virt.hardware [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:30:52 compute-0 nova_compute[192810]: 2025-09-30 21:30:52.923 2 DEBUG nova.virt.hardware [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:30:52 compute-0 nova_compute[192810]: 2025-09-30 21:30:52.923 2 DEBUG nova.virt.hardware [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:30:52 compute-0 nova_compute[192810]: 2025-09-30 21:30:52.923 2 DEBUG nova.virt.hardware [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:30:52 compute-0 nova_compute[192810]: 2025-09-30 21:30:52.923 2 DEBUG nova.objects.instance [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 408b1a8f-ed4d-4d93-a98c-564af2bd678d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:30:52 compute-0 nova_compute[192810]: 2025-09-30 21:30:52.947 2 DEBUG oslo_concurrency.processutils [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/408b1a8f-ed4d-4d93-a98c-564af2bd678d/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:53 compute-0 nova_compute[192810]: 2025-09-30 21:30:53.001 2 DEBUG oslo_concurrency.processutils [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/408b1a8f-ed4d-4d93-a98c-564af2bd678d/disk.config --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:53 compute-0 nova_compute[192810]: 2025-09-30 21:30:53.002 2 DEBUG oslo_concurrency.lockutils [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "/var/lib/nova/instances/408b1a8f-ed4d-4d93-a98c-564af2bd678d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:53 compute-0 nova_compute[192810]: 2025-09-30 21:30:53.002 2 DEBUG oslo_concurrency.lockutils [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "/var/lib/nova/instances/408b1a8f-ed4d-4d93-a98c-564af2bd678d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:53 compute-0 nova_compute[192810]: 2025-09-30 21:30:53.003 2 DEBUG oslo_concurrency.lockutils [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "/var/lib/nova/instances/408b1a8f-ed4d-4d93-a98c-564af2bd678d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:53 compute-0 nova_compute[192810]: 2025-09-30 21:30:53.005 2 DEBUG nova.virt.libvirt.vif [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:30:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-398758596',display_name='tempest-ServerActionsTestJSON-server-398758596',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-398758596',id=78,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBADj3eZ6JfWn1sD61WsBF2lWMwpE7XLjMHeX5D51ZTuvFj593BvRFZjp02OuEwvTUJEH79lLLcgJlYP5+6PE14q16iBV+2oZvdFvdVW4CAPM3S7plfjHeuzOdoE0D4V+KA==',key_name='tempest-keypair-557988176',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:30:30Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='2af578a858a44374a3dc027bbf7c69f2',ramdisk_id='',reservation_id='r-yikggmvz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1867667353',owner_user_name='tempest-ServerActionsTestJSON-1867667353-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:30:44Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='22ed16bd4ffe4ef8bb21968a857066a1',uuid=408b1a8f-ed4d-4d93-a98c-564af2bd678d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "a4ba92a3-e019-4765-a096-89a660f1932c", "address": "fa:16:3e:07:78:65", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4ba92a3-e0", "ovs_interfaceid": "a4ba92a3-e019-4765-a096-89a660f1932c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:30:53 compute-0 nova_compute[192810]: 2025-09-30 21:30:53.005 2 DEBUG nova.network.os_vif_util [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converting VIF {"id": "a4ba92a3-e019-4765-a096-89a660f1932c", "address": "fa:16:3e:07:78:65", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4ba92a3-e0", "ovs_interfaceid": "a4ba92a3-e019-4765-a096-89a660f1932c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:30:53 compute-0 nova_compute[192810]: 2025-09-30 21:30:53.006 2 DEBUG nova.network.os_vif_util [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:07:78:65,bridge_name='br-int',has_traffic_filtering=True,id=a4ba92a3-e019-4765-a096-89a660f1932c,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa4ba92a3-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:30:53 compute-0 nova_compute[192810]: 2025-09-30 21:30:53.006 2 DEBUG nova.objects.instance [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'pci_devices' on Instance uuid 408b1a8f-ed4d-4d93-a98c-564af2bd678d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:30:53 compute-0 nova_compute[192810]: 2025-09-30 21:30:53.023 2 DEBUG nova.virt.libvirt.driver [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:30:53 compute-0 nova_compute[192810]:   <uuid>408b1a8f-ed4d-4d93-a98c-564af2bd678d</uuid>
Sep 30 21:30:53 compute-0 nova_compute[192810]:   <name>instance-0000004e</name>
Sep 30 21:30:53 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:30:53 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:30:53 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:30:53 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:       <nova:name>tempest-ServerActionsTestJSON-server-398758596</nova:name>
Sep 30 21:30:53 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:30:52</nova:creationTime>
Sep 30 21:30:53 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:30:53 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:30:53 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:30:53 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:30:53 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:30:53 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:30:53 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:30:53 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:30:53 compute-0 nova_compute[192810]:         <nova:user uuid="22ed16bd4ffe4ef8bb21968a857066a1">tempest-ServerActionsTestJSON-1867667353-project-member</nova:user>
Sep 30 21:30:53 compute-0 nova_compute[192810]:         <nova:project uuid="2af578a858a44374a3dc027bbf7c69f2">tempest-ServerActionsTestJSON-1867667353</nova:project>
Sep 30 21:30:53 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:30:53 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:30:53 compute-0 nova_compute[192810]:         <nova:port uuid="a4ba92a3-e019-4765-a096-89a660f1932c">
Sep 30 21:30:53 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:30:53 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:30:53 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:30:53 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:30:53 compute-0 nova_compute[192810]:     <system>
Sep 30 21:30:53 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:30:53 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:30:53 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:30:53 compute-0 nova_compute[192810]:       <entry name="serial">408b1a8f-ed4d-4d93-a98c-564af2bd678d</entry>
Sep 30 21:30:53 compute-0 nova_compute[192810]:       <entry name="uuid">408b1a8f-ed4d-4d93-a98c-564af2bd678d</entry>
Sep 30 21:30:53 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     </system>
Sep 30 21:30:53 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:30:53 compute-0 nova_compute[192810]:   <os>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:   </os>
Sep 30 21:30:53 compute-0 nova_compute[192810]:   <features>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:   </features>
Sep 30 21:30:53 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:30:53 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:30:53 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:30:53 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:30:53 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:30:53 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/408b1a8f-ed4d-4d93-a98c-564af2bd678d/disk"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:30:53 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/408b1a8f-ed4d-4d93-a98c-564af2bd678d/disk.config"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:30:53 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:07:78:65"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:       <target dev="tapa4ba92a3-e0"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:30:53 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/408b1a8f-ed4d-4d93-a98c-564af2bd678d/console.log" append="off"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     <video>
Sep 30 21:30:53 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     </video>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     <input type="keyboard" bus="usb"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:30:53 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:30:53 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:30:53 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:30:53 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:30:53 compute-0 nova_compute[192810]: </domain>
Sep 30 21:30:53 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:30:53 compute-0 nova_compute[192810]: 2025-09-30 21:30:53.024 2 DEBUG oslo_concurrency.processutils [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/408b1a8f-ed4d-4d93-a98c-564af2bd678d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:53 compute-0 nova_compute[192810]: 2025-09-30 21:30:53.081 2 DEBUG oslo_concurrency.processutils [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/408b1a8f-ed4d-4d93-a98c-564af2bd678d/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:53 compute-0 nova_compute[192810]: 2025-09-30 21:30:53.082 2 DEBUG oslo_concurrency.processutils [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/408b1a8f-ed4d-4d93-a98c-564af2bd678d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:53 compute-0 nova_compute[192810]: 2025-09-30 21:30:53.140 2 DEBUG oslo_concurrency.processutils [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/408b1a8f-ed4d-4d93-a98c-564af2bd678d/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:53 compute-0 nova_compute[192810]: 2025-09-30 21:30:53.142 2 DEBUG nova.objects.instance [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 408b1a8f-ed4d-4d93-a98c-564af2bd678d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:30:53 compute-0 nova_compute[192810]: 2025-09-30 21:30:53.160 2 DEBUG oslo_concurrency.processutils [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:53 compute-0 nova_compute[192810]: 2025-09-30 21:30:53.215 2 DEBUG oslo_concurrency.processutils [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:53 compute-0 nova_compute[192810]: 2025-09-30 21:30:53.216 2 DEBUG nova.virt.disk.api [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Checking if we can resize image /var/lib/nova/instances/408b1a8f-ed4d-4d93-a98c-564af2bd678d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:30:53 compute-0 nova_compute[192810]: 2025-09-30 21:30:53.216 2 DEBUG oslo_concurrency.processutils [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/408b1a8f-ed4d-4d93-a98c-564af2bd678d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:53 compute-0 nova_compute[192810]: 2025-09-30 21:30:53.309 2 DEBUG oslo_concurrency.processutils [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/408b1a8f-ed4d-4d93-a98c-564af2bd678d/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:53 compute-0 nova_compute[192810]: 2025-09-30 21:30:53.310 2 DEBUG nova.virt.disk.api [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Cannot resize image /var/lib/nova/instances/408b1a8f-ed4d-4d93-a98c-564af2bd678d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:30:53 compute-0 nova_compute[192810]: 2025-09-30 21:30:53.311 2 DEBUG nova.objects.instance [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'migration_context' on Instance uuid 408b1a8f-ed4d-4d93-a98c-564af2bd678d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:30:53 compute-0 nova_compute[192810]: 2025-09-30 21:30:53.326 2 DEBUG nova.virt.libvirt.vif [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:30:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-398758596',display_name='tempest-ServerActionsTestJSON-server-398758596',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-398758596',id=78,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBADj3eZ6JfWn1sD61WsBF2lWMwpE7XLjMHeX5D51ZTuvFj593BvRFZjp02OuEwvTUJEH79lLLcgJlYP5+6PE14q16iBV+2oZvdFvdVW4CAPM3S7plfjHeuzOdoE0D4V+KA==',key_name='tempest-keypair-557988176',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:30:30Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='2af578a858a44374a3dc027bbf7c69f2',ramdisk_id='',reservation_id='r-yikggmvz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1867667353',owner_user_name='tempest-ServerActionsTestJSON-1867667353-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:30:44Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='22ed16bd4ffe4ef8bb21968a857066a1',uuid=408b1a8f-ed4d-4d93-a98c-564af2bd678d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "a4ba92a3-e019-4765-a096-89a660f1932c", "address": "fa:16:3e:07:78:65", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": 
[], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4ba92a3-e0", "ovs_interfaceid": "a4ba92a3-e019-4765-a096-89a660f1932c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:30:53 compute-0 nova_compute[192810]: 2025-09-30 21:30:53.327 2 DEBUG nova.network.os_vif_util [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converting VIF {"id": "a4ba92a3-e019-4765-a096-89a660f1932c", "address": "fa:16:3e:07:78:65", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4ba92a3-e0", "ovs_interfaceid": "a4ba92a3-e019-4765-a096-89a660f1932c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:30:53 compute-0 nova_compute[192810]: 2025-09-30 21:30:53.328 2 DEBUG nova.network.os_vif_util [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:07:78:65,bridge_name='br-int',has_traffic_filtering=True,id=a4ba92a3-e019-4765-a096-89a660f1932c,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa4ba92a3-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:30:53 compute-0 nova_compute[192810]: 2025-09-30 21:30:53.328 2 DEBUG os_vif [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:07:78:65,bridge_name='br-int',has_traffic_filtering=True,id=a4ba92a3-e019-4765-a096-89a660f1932c,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa4ba92a3-e0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:30:53 compute-0 nova_compute[192810]: 2025-09-30 21:30:53.329 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:53 compute-0 nova_compute[192810]: 2025-09-30 21:30:53.330 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:30:53 compute-0 nova_compute[192810]: 2025-09-30 21:30:53.330 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:30:53 compute-0 nova_compute[192810]: 2025-09-30 21:30:53.332 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:53 compute-0 nova_compute[192810]: 2025-09-30 21:30:53.333 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa4ba92a3-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:30:53 compute-0 nova_compute[192810]: 2025-09-30 21:30:53.334 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa4ba92a3-e0, col_values=(('external_ids', {'iface-id': 'a4ba92a3-e019-4765-a096-89a660f1932c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:07:78:65', 'vm-uuid': '408b1a8f-ed4d-4d93-a98c-564af2bd678d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:30:53 compute-0 nova_compute[192810]: 2025-09-30 21:30:53.335 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:53 compute-0 NetworkManager[51733]: <info>  [1759267853.3365] manager: (tapa4ba92a3-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/143)
Sep 30 21:30:53 compute-0 nova_compute[192810]: 2025-09-30 21:30:53.338 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:30:53 compute-0 nova_compute[192810]: 2025-09-30 21:30:53.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:53 compute-0 nova_compute[192810]: 2025-09-30 21:30:53.342 2 INFO os_vif [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:07:78:65,bridge_name='br-int',has_traffic_filtering=True,id=a4ba92a3-e019-4765-a096-89a660f1932c,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa4ba92a3-e0')
Sep 30 21:30:53 compute-0 kernel: tapa4ba92a3-e0: entered promiscuous mode
Sep 30 21:30:53 compute-0 NetworkManager[51733]: <info>  [1759267853.4078] manager: (tapa4ba92a3-e0): new Tun device (/org/freedesktop/NetworkManager/Devices/144)
Sep 30 21:30:53 compute-0 ovn_controller[94912]: 2025-09-30T21:30:53Z|00301|binding|INFO|Claiming lport a4ba92a3-e019-4765-a096-89a660f1932c for this chassis.
Sep 30 21:30:53 compute-0 ovn_controller[94912]: 2025-09-30T21:30:53Z|00302|binding|INFO|a4ba92a3-e019-4765-a096-89a660f1932c: Claiming fa:16:3e:07:78:65 10.100.0.4
Sep 30 21:30:53 compute-0 nova_compute[192810]: 2025-09-30 21:30:53.410 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:53.416 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:07:78:65 10.100.0.4'], port_security=['fa:16:3e:07:78:65 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '408b1a8f-ed4d-4d93-a98c-564af2bd678d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9692dd1-658f-4c07-943c-6bc662046dc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2af578a858a44374a3dc027bbf7c69f2', 'neutron:revision_number': '5', 'neutron:security_group_ids': '5518a7d3-faed-4617-b7cb-cfdf96df8ee0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.177'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a290e6b7-09a2-435f-ae19-df4a5ccfc2d7, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=a4ba92a3-e019-4765-a096-89a660f1932c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:53.417 103867 INFO neutron.agent.ovn.metadata.agent [-] Port a4ba92a3-e019-4765-a096-89a660f1932c in datapath f9692dd1-658f-4c07-943c-6bc662046dc4 bound to our chassis
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:53.418 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f9692dd1-658f-4c07-943c-6bc662046dc4
Sep 30 21:30:53 compute-0 ovn_controller[94912]: 2025-09-30T21:30:53Z|00303|binding|INFO|Setting lport a4ba92a3-e019-4765-a096-89a660f1932c ovn-installed in OVS
Sep 30 21:30:53 compute-0 ovn_controller[94912]: 2025-09-30T21:30:53Z|00304|binding|INFO|Setting lport a4ba92a3-e019-4765-a096-89a660f1932c up in Southbound
Sep 30 21:30:53 compute-0 nova_compute[192810]: 2025-09-30 21:30:53.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:53.429 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[06d536fd-46e8-4d5c-b275-21f0b6ad447d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:53.430 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf9692dd1-61 in ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:53.433 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf9692dd1-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:53.433 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[32906aa3-44e3-491b-b5af-958ae730df4c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:53.434 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[45d518cf-6f54-4b81-87c6-bf23de6d26c3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:53 compute-0 systemd-udevd[232322]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:53.448 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[d4210b54-c5ee-43f8-aa9d-a9aed2260a04]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:53 compute-0 systemd-machined[152794]: New machine qemu-39-instance-0000004e.
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:53.462 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[cfa559e8-dcab-4f96-9b69-016fc99b70e7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:53 compute-0 NetworkManager[51733]: <info>  [1759267853.4643] device (tapa4ba92a3-e0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:30:53 compute-0 NetworkManager[51733]: <info>  [1759267853.4651] device (tapa4ba92a3-e0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:30:53 compute-0 systemd[1]: Started Virtual Machine qemu-39-instance-0000004e.
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:53.490 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[50bcda56-d9d8-4297-926c-fbff101b7f51]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:53.495 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[fa7ffc62-8e51-40cb-b69e-4b72a272e493]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:53 compute-0 systemd-udevd[232325]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:30:53 compute-0 NetworkManager[51733]: <info>  [1759267853.5022] manager: (tapf9692dd1-60): new Veth device (/org/freedesktop/NetworkManager/Devices/145)
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:53.527 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[27f30a89-c1cc-4194-865f-91d7d8ae599a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:53.530 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[4583de7c-7398-441a-b229-ee2b74ac6be8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:53 compute-0 NetworkManager[51733]: <info>  [1759267853.5512] device (tapf9692dd1-60): carrier: link connected
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:53.557 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[8ccba771-1c06-4917-8537-1349c6b31138]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:53.576 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[13111702-7a16-4fd9-a8ed-df50fce7922a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9692dd1-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:78:70'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 91], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 454917, 'reachable_time': 44526, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 168, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 168, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232353, 'error': None, 'target': 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:53.590 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[91c326b7-3504-4d39-8cc4-7baa25c6807b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed1:7870'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 454917, 'tstamp': 454917}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232354, 'error': None, 'target': 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:53.607 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[3174ae57-1430-4e22-9a1c-0382599b9ffc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9692dd1-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:78:70'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 91], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 454917, 'reachable_time': 44526, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 168, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 168, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232355, 'error': None, 'target': 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:53.636 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[49624e56-3ab9-4f7b-9e46-6bbe45f3706c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:53.691 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e16b0d4e-3ac5-458a-8270-b440b210c4b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:53.693 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9692dd1-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:53.693 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:53.693 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf9692dd1-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:30:53 compute-0 NetworkManager[51733]: <info>  [1759267853.6957] manager: (tapf9692dd1-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/146)
Sep 30 21:30:53 compute-0 nova_compute[192810]: 2025-09-30 21:30:53.695 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:53 compute-0 kernel: tapf9692dd1-60: entered promiscuous mode
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:53.697 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf9692dd1-60, col_values=(('external_ids', {'iface-id': 'a71d0422-57d0-42fa-887d-fdcb57295fce'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:30:53 compute-0 nova_compute[192810]: 2025-09-30 21:30:53.698 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:53 compute-0 ovn_controller[94912]: 2025-09-30T21:30:53Z|00305|binding|INFO|Releasing lport a71d0422-57d0-42fa-887d-fdcb57295fce from this chassis (sb_readonly=0)
Sep 30 21:30:53 compute-0 nova_compute[192810]: 2025-09-30 21:30:53.709 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:53.710 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f9692dd1-658f-4c07-943c-6bc662046dc4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f9692dd1-658f-4c07-943c-6bc662046dc4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:53.710 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[b9651c99-4588-4f90-a91a-60db9b633888]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:53.711 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-f9692dd1-658f-4c07-943c-6bc662046dc4
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/f9692dd1-658f-4c07-943c-6bc662046dc4.pid.haproxy
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID f9692dd1-658f-4c07-943c-6bc662046dc4
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:30:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:53.712 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'env', 'PROCESS_TAG=haproxy-f9692dd1-658f-4c07-943c-6bc662046dc4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f9692dd1-658f-4c07-943c-6bc662046dc4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:30:54 compute-0 podman[232394]: 2025-09-30 21:30:54.069725984 +0000 UTC m=+0.048458896 container create 6f2506df022d1f587dffbc5554a0008f639fe556cfd212667b5bc7c820dae6ca (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0)
Sep 30 21:30:54 compute-0 systemd[1]: Started libpod-conmon-6f2506df022d1f587dffbc5554a0008f639fe556cfd212667b5bc7c820dae6ca.scope.
Sep 30 21:30:54 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:30:54 compute-0 podman[232394]: 2025-09-30 21:30:54.043615575 +0000 UTC m=+0.022348487 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:30:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/041d679a0a40966d6c2c90c0c203ea0c5a5618b70ffd9f86ca78d9b35c77cdab/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:30:54 compute-0 nova_compute[192810]: 2025-09-30 21:30:54.147 2 DEBUG nova.virt.libvirt.host [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Removed pending event for 408b1a8f-ed4d-4d93-a98c-564af2bd678d due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Sep 30 21:30:54 compute-0 nova_compute[192810]: 2025-09-30 21:30:54.148 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267854.147295, 408b1a8f-ed4d-4d93-a98c-564af2bd678d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:30:54 compute-0 nova_compute[192810]: 2025-09-30 21:30:54.148 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] VM Resumed (Lifecycle Event)
Sep 30 21:30:54 compute-0 nova_compute[192810]: 2025-09-30 21:30:54.150 2 DEBUG nova.compute.manager [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:30:54 compute-0 nova_compute[192810]: 2025-09-30 21:30:54.153 2 INFO nova.virt.libvirt.driver [-] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Instance rebooted successfully.
Sep 30 21:30:54 compute-0 nova_compute[192810]: 2025-09-30 21:30:54.153 2 DEBUG nova.compute.manager [None req-c3ca9518-bba0-4107-8e1d-9f9b3ef6d409 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:30:54 compute-0 podman[232394]: 2025-09-30 21:30:54.156710293 +0000 UTC m=+0.135443225 container init 6f2506df022d1f587dffbc5554a0008f639fe556cfd212667b5bc7c820dae6ca (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Sep 30 21:30:54 compute-0 podman[232394]: 2025-09-30 21:30:54.161879024 +0000 UTC m=+0.140611936 container start 6f2506df022d1f587dffbc5554a0008f639fe556cfd212667b5bc7c820dae6ca (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:30:54 compute-0 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[232409]: [NOTICE]   (232413) : New worker (232415) forked
Sep 30 21:30:54 compute-0 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[232409]: [NOTICE]   (232413) : Loading success.
Sep 30 21:30:54 compute-0 nova_compute[192810]: 2025-09-30 21:30:54.201 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:30:54 compute-0 nova_compute[192810]: 2025-09-30 21:30:54.204 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:30:54 compute-0 nova_compute[192810]: 2025-09-30 21:30:54.250 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267854.147463, 408b1a8f-ed4d-4d93-a98c-564af2bd678d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:30:54 compute-0 nova_compute[192810]: 2025-09-30 21:30:54.251 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] VM Started (Lifecycle Event)
Sep 30 21:30:54 compute-0 nova_compute[192810]: 2025-09-30 21:30:54.271 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:30:54 compute-0 nova_compute[192810]: 2025-09-30 21:30:54.274 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:30:54 compute-0 nova_compute[192810]: 2025-09-30 21:30:54.836 2 DEBUG nova.network.neutron [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Updating instance_info_cache with network_info: [{"id": "c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad", "address": "fa:16:3e:8a:8e:e0", "network": {"id": "a145b225-510f-43a7-8cc6-fccae3ed647e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-43539478-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "72559935caa44fd9b779b6770f00199f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2d3f3fa-4e", "ovs_interfaceid": "c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:30:54 compute-0 nova_compute[192810]: 2025-09-30 21:30:54.891 2 DEBUG oslo_concurrency.lockutils [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Releasing lock "refresh_cache-c3cd73be-ae82-4c19-8ab7-ec9b06134032" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:30:55 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:55.165 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:30:56 compute-0 nova_compute[192810]: 2025-09-30 21:30:56.285 2 DEBUG nova.virt.libvirt.driver [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Sep 30 21:30:56 compute-0 nova_compute[192810]: 2025-09-30 21:30:56.285 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Creating file /var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032/248cca7b43324927b4001d764729ded4.tmp on remote host 192.168.122.101 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Sep 30 21:30:56 compute-0 nova_compute[192810]: 2025-09-30 21:30:56.286 2 DEBUG oslo_concurrency.processutils [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032/248cca7b43324927b4001d764729ded4.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:56 compute-0 nova_compute[192810]: 2025-09-30 21:30:56.822 2 DEBUG oslo_concurrency.processutils [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032/248cca7b43324927b4001d764729ded4.tmp" returned: 1 in 0.536s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:56 compute-0 nova_compute[192810]: 2025-09-30 21:30:56.822 2 DEBUG oslo_concurrency.processutils [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] 'ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032/248cca7b43324927b4001d764729ded4.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Sep 30 21:30:56 compute-0 nova_compute[192810]: 2025-09-30 21:30:56.823 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Creating directory /var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032 on remote host 192.168.122.101 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Sep 30 21:30:56 compute-0 nova_compute[192810]: 2025-09-30 21:30:56.823 2 DEBUG oslo_concurrency.processutils [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:30:56 compute-0 nova_compute[192810]: 2025-09-30 21:30:56.963 2 DEBUG nova.compute.manager [req-494ac31a-fec9-4497-a96a-065ed1eff9da req-5d62f88b-3642-4141-82e2-85f11fcd128a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Received event network-vif-plugged-a4ba92a3-e019-4765-a096-89a660f1932c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:30:56 compute-0 nova_compute[192810]: 2025-09-30 21:30:56.963 2 DEBUG oslo_concurrency.lockutils [req-494ac31a-fec9-4497-a96a-065ed1eff9da req-5d62f88b-3642-4141-82e2-85f11fcd128a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:56 compute-0 nova_compute[192810]: 2025-09-30 21:30:56.964 2 DEBUG oslo_concurrency.lockutils [req-494ac31a-fec9-4497-a96a-065ed1eff9da req-5d62f88b-3642-4141-82e2-85f11fcd128a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:56 compute-0 nova_compute[192810]: 2025-09-30 21:30:56.964 2 DEBUG oslo_concurrency.lockutils [req-494ac31a-fec9-4497-a96a-065ed1eff9da req-5d62f88b-3642-4141-82e2-85f11fcd128a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:56 compute-0 nova_compute[192810]: 2025-09-30 21:30:56.964 2 DEBUG nova.compute.manager [req-494ac31a-fec9-4497-a96a-065ed1eff9da req-5d62f88b-3642-4141-82e2-85f11fcd128a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] No waiting events found dispatching network-vif-plugged-a4ba92a3-e019-4765-a096-89a660f1932c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:30:56 compute-0 nova_compute[192810]: 2025-09-30 21:30:56.964 2 WARNING nova.compute.manager [req-494ac31a-fec9-4497-a96a-065ed1eff9da req-5d62f88b-3642-4141-82e2-85f11fcd128a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Received unexpected event network-vif-plugged-a4ba92a3-e019-4765-a096-89a660f1932c for instance with vm_state active and task_state None.
Sep 30 21:30:56 compute-0 nova_compute[192810]: 2025-09-30 21:30:56.964 2 DEBUG nova.compute.manager [req-494ac31a-fec9-4497-a96a-065ed1eff9da req-5d62f88b-3642-4141-82e2-85f11fcd128a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Received event network-vif-plugged-a4ba92a3-e019-4765-a096-89a660f1932c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:30:56 compute-0 nova_compute[192810]: 2025-09-30 21:30:56.965 2 DEBUG oslo_concurrency.lockutils [req-494ac31a-fec9-4497-a96a-065ed1eff9da req-5d62f88b-3642-4141-82e2-85f11fcd128a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:30:56 compute-0 nova_compute[192810]: 2025-09-30 21:30:56.965 2 DEBUG oslo_concurrency.lockutils [req-494ac31a-fec9-4497-a96a-065ed1eff9da req-5d62f88b-3642-4141-82e2-85f11fcd128a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:30:56 compute-0 nova_compute[192810]: 2025-09-30 21:30:56.965 2 DEBUG oslo_concurrency.lockutils [req-494ac31a-fec9-4497-a96a-065ed1eff9da req-5d62f88b-3642-4141-82e2-85f11fcd128a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:30:56 compute-0 nova_compute[192810]: 2025-09-30 21:30:56.965 2 DEBUG nova.compute.manager [req-494ac31a-fec9-4497-a96a-065ed1eff9da req-5d62f88b-3642-4141-82e2-85f11fcd128a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] No waiting events found dispatching network-vif-plugged-a4ba92a3-e019-4765-a096-89a660f1932c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:30:56 compute-0 nova_compute[192810]: 2025-09-30 21:30:56.965 2 WARNING nova.compute.manager [req-494ac31a-fec9-4497-a96a-065ed1eff9da req-5d62f88b-3642-4141-82e2-85f11fcd128a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Received unexpected event network-vif-plugged-a4ba92a3-e019-4765-a096-89a660f1932c for instance with vm_state active and task_state None.
Sep 30 21:30:57 compute-0 nova_compute[192810]: 2025-09-30 21:30:57.048 2 DEBUG oslo_concurrency.processutils [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032" returned: 0 in 0.225s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:30:57 compute-0 nova_compute[192810]: 2025-09-30 21:30:57.051 2 DEBUG nova.virt.libvirt.driver [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Sep 30 21:30:57 compute-0 nova_compute[192810]: 2025-09-30 21:30:57.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:57 compute-0 podman[232427]: 2025-09-30 21:30:57.314690569 +0000 UTC m=+0.051501423 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:30:57 compute-0 podman[232428]: 2025-09-30 21:30:57.317398157 +0000 UTC m=+0.053495753 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, version=9.6, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-type=git, vendor=Red Hat, Inc.)
Sep 30 21:30:57 compute-0 nova_compute[192810]: 2025-09-30 21:30:57.392 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Updating instance_info_cache with network_info: [{"id": "a4ba92a3-e019-4765-a096-89a660f1932c", "address": "fa:16:3e:07:78:65", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4ba92a3-e0", "ovs_interfaceid": "a4ba92a3-e019-4765-a096-89a660f1932c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:30:57 compute-0 nova_compute[192810]: 2025-09-30 21:30:57.433 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Releasing lock "refresh_cache-408b1a8f-ed4d-4d93-a98c-564af2bd678d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:30:57 compute-0 nova_compute[192810]: 2025-09-30 21:30:57.434 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Sep 30 21:30:57 compute-0 nova_compute[192810]: 2025-09-30 21:30:57.434 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:30:57 compute-0 nova_compute[192810]: 2025-09-30 21:30:57.434 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:30:58 compute-0 nova_compute[192810]: 2025-09-30 21:30:58.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:59 compute-0 kernel: tapc2d3f3fa-4e (unregistering): left promiscuous mode
Sep 30 21:30:59 compute-0 NetworkManager[51733]: <info>  [1759267859.2439] device (tapc2d3f3fa-4e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:30:59 compute-0 nova_compute[192810]: 2025-09-30 21:30:59.249 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:59 compute-0 ovn_controller[94912]: 2025-09-30T21:30:59Z|00306|binding|INFO|Releasing lport c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad from this chassis (sb_readonly=0)
Sep 30 21:30:59 compute-0 ovn_controller[94912]: 2025-09-30T21:30:59Z|00307|binding|INFO|Setting lport c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad down in Southbound
Sep 30 21:30:59 compute-0 ovn_controller[94912]: 2025-09-30T21:30:59Z|00308|binding|INFO|Removing iface tapc2d3f3fa-4e ovn-installed in OVS
Sep 30 21:30:59 compute-0 nova_compute[192810]: 2025-09-30 21:30:59.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:59 compute-0 nova_compute[192810]: 2025-09-30 21:30:59.261 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:59 compute-0 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d00000051.scope: Deactivated successfully.
Sep 30 21:30:59 compute-0 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d00000051.scope: Consumed 12.909s CPU time.
Sep 30 21:30:59 compute-0 systemd-machined[152794]: Machine qemu-38-instance-00000051 terminated.
Sep 30 21:30:59 compute-0 kernel: tapc2d3f3fa-4e: entered promiscuous mode
Sep 30 21:30:59 compute-0 systemd-udevd[232473]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:30:59 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:59.479 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8a:8e:e0 10.100.0.3'], port_security=['fa:16:3e:8a:8e:e0 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a145b225-510f-43a7-8cc6-fccae3ed647e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '72559935caa44fd9b779b6770f00199f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3a098193-23af-4fd8-a818-c9a9c1a46706', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cf4fe4d7-2316-4403-a18b-3b0227898f0d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:30:59 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:59.481 103867 INFO neutron.agent.ovn.metadata.agent [-] Port c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad in datapath a145b225-510f-43a7-8cc6-fccae3ed647e unbound from our chassis
Sep 30 21:30:59 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:59.482 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a145b225-510f-43a7-8cc6-fccae3ed647e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:30:59 compute-0 NetworkManager[51733]: <info>  [1759267859.4836] manager: (tapc2d3f3fa-4e): new Tun device (/org/freedesktop/NetworkManager/Devices/147)
Sep 30 21:30:59 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:59.483 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[cab4b615-b93f-41da-a66e-3d41ecf0a11d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:59 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:59.483 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e namespace which is not needed anymore
Sep 30 21:30:59 compute-0 nova_compute[192810]: 2025-09-30 21:30:59.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:59 compute-0 kernel: tapc2d3f3fa-4e (unregistering): left promiscuous mode
Sep 30 21:30:59 compute-0 ovn_controller[94912]: 2025-09-30T21:30:59Z|00309|binding|INFO|Claiming lport c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad for this chassis.
Sep 30 21:30:59 compute-0 ovn_controller[94912]: 2025-09-30T21:30:59Z|00310|binding|INFO|c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad: Claiming fa:16:3e:8a:8e:e0 10.100.0.3
Sep 30 21:30:59 compute-0 ovn_controller[94912]: 2025-09-30T21:30:59Z|00311|binding|INFO|Setting lport c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad ovn-installed in OVS
Sep 30 21:30:59 compute-0 ovn_controller[94912]: 2025-09-30T21:30:59Z|00312|if_status|INFO|Dropped 1 log messages in last 514 seconds (most recently, 514 seconds ago) due to excessive rate
Sep 30 21:30:59 compute-0 ovn_controller[94912]: 2025-09-30T21:30:59Z|00313|if_status|INFO|Not setting lport c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad down as sb is readonly
Sep 30 21:30:59 compute-0 nova_compute[192810]: 2025-09-30 21:30:59.518 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:59 compute-0 neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e[232093]: [NOTICE]   (232097) : haproxy version is 2.8.14-c23fe91
Sep 30 21:30:59 compute-0 neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e[232093]: [NOTICE]   (232097) : path to executable is /usr/sbin/haproxy
Sep 30 21:30:59 compute-0 neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e[232093]: [WARNING]  (232097) : Exiting Master process...
Sep 30 21:30:59 compute-0 neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e[232093]: [ALERT]    (232097) : Current worker (232099) exited with code 143 (Terminated)
Sep 30 21:30:59 compute-0 neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e[232093]: [WARNING]  (232097) : All workers exited. Exiting... (0)
Sep 30 21:30:59 compute-0 systemd[1]: libpod-a478e7f45e03a2a3eefdbecde6903f2e9ce3a85e0290efffd9abc086d4b828f1.scope: Deactivated successfully.
Sep 30 21:30:59 compute-0 podman[232502]: 2025-09-30 21:30:59.624011556 +0000 UTC m=+0.043487140 container died a478e7f45e03a2a3eefdbecde6903f2e9ce3a85e0290efffd9abc086d4b828f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:30:59 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a478e7f45e03a2a3eefdbecde6903f2e9ce3a85e0290efffd9abc086d4b828f1-userdata-shm.mount: Deactivated successfully.
Sep 30 21:30:59 compute-0 systemd[1]: var-lib-containers-storage-overlay-f0f85283081545c8f34240ccb646dd6dc16291aa55451faae4d04c5e5ba2b1c8-merged.mount: Deactivated successfully.
Sep 30 21:30:59 compute-0 podman[232502]: 2025-09-30 21:30:59.668336696 +0000 UTC m=+0.087812280 container cleanup a478e7f45e03a2a3eefdbecde6903f2e9ce3a85e0290efffd9abc086d4b828f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Sep 30 21:30:59 compute-0 systemd[1]: libpod-conmon-a478e7f45e03a2a3eefdbecde6903f2e9ce3a85e0290efffd9abc086d4b828f1.scope: Deactivated successfully.
Sep 30 21:30:59 compute-0 podman[232532]: 2025-09-30 21:30:59.732111148 +0000 UTC m=+0.043076860 container remove a478e7f45e03a2a3eefdbecde6903f2e9ce3a85e0290efffd9abc086d4b828f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3)
Sep 30 21:30:59 compute-0 ovn_controller[94912]: 2025-09-30T21:30:59Z|00314|binding|INFO|Releasing lport c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad from this chassis (sb_readonly=0)
Sep 30 21:30:59 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:59.739 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8a:8e:e0 10.100.0.3'], port_security=['fa:16:3e:8a:8e:e0 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a145b225-510f-43a7-8cc6-fccae3ed647e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '72559935caa44fd9b779b6770f00199f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3a098193-23af-4fd8-a818-c9a9c1a46706', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cf4fe4d7-2316-4403-a18b-3b0227898f0d, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:30:59 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:59.742 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[fe92c77d-c3ed-4264-92ed-88c880c1d8f0]: (4, ('Tue Sep 30 09:30:59 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e (a478e7f45e03a2a3eefdbecde6903f2e9ce3a85e0290efffd9abc086d4b828f1)\na478e7f45e03a2a3eefdbecde6903f2e9ce3a85e0290efffd9abc086d4b828f1\nTue Sep 30 09:30:59 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e (a478e7f45e03a2a3eefdbecde6903f2e9ce3a85e0290efffd9abc086d4b828f1)\na478e7f45e03a2a3eefdbecde6903f2e9ce3a85e0290efffd9abc086d4b828f1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:59 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:59.743 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[972aefa1-296a-4bcb-b34e-bb505c640bed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:59 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:59.744 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa145b225-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:30:59 compute-0 nova_compute[192810]: 2025-09-30 21:30:59.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:59 compute-0 nova_compute[192810]: 2025-09-30 21:30:59.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:59 compute-0 kernel: tapa145b225-50: left promiscuous mode
Sep 30 21:30:59 compute-0 nova_compute[192810]: 2025-09-30 21:30:59.803 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:59 compute-0 nova_compute[192810]: 2025-09-30 21:30:59.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:30:59 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:59.808 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[5aa4f6f6-5a6d-414a-a7d6-9393618f103a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:59 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:59.842 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[bfe6c3ad-fcde-477a-a6b8-d83d365a92ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:59 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:59.843 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[2f1352ab-ddba-4c48-91c1-a57d85418cfd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:59 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:59.856 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[4d7d4f8f-06b7-465b-b90e-87e498c05bc6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 453257, 'reachable_time': 24855, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232552, 'error': None, 'target': 'ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:59 compute-0 systemd[1]: run-netns-ovnmeta\x2da145b225\x2d510f\x2d43a7\x2d8cc6\x2dfccae3ed647e.mount: Deactivated successfully.
Sep 30 21:30:59 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:59.860 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:30:59 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:59.860 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[d3e7d250-357e-42a9-9d65-cfe6118844ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:59 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:59.861 103867 INFO neutron.agent.ovn.metadata.agent [-] Port c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad in datapath a145b225-510f-43a7-8cc6-fccae3ed647e bound to our chassis
Sep 30 21:30:59 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:59.862 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a145b225-510f-43a7-8cc6-fccae3ed647e
Sep 30 21:30:59 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:59.873 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[7a681172-2782-4526-b7c3-2da02a020102]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:59 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:59.874 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa145b225-51 in ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:30:59 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:59.876 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa145b225-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:30:59 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:59.876 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[c0bff25b-db16-4dc9-bb56-a7c6d6ba6dd1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:59 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:59.876 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[8a6adeb8-2948-4fc7-bb0c-82a7c78084b3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:59 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:59.886 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[36890db4-179e-4483-bf91-5aee303cf879]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:59 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:59.908 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[7997afa1-d349-42bb-82fb-12a9d7cbff12]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:59 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:59.942 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[ab074b70-7882-4ff8-ad03-149754c6a445]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:59 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:59.947 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[30ad3ad6-d6b2-464a-bfa8-de085fdeb130]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:59 compute-0 NetworkManager[51733]: <info>  [1759267859.9500] manager: (tapa145b225-50): new Veth device (/org/freedesktop/NetworkManager/Devices/148)
Sep 30 21:30:59 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:59.961 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8a:8e:e0 10.100.0.3'], port_security=['fa:16:3e:8a:8e:e0 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'c3cd73be-ae82-4c19-8ab7-ec9b06134032', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a145b225-510f-43a7-8cc6-fccae3ed647e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '72559935caa44fd9b779b6770f00199f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3a098193-23af-4fd8-a818-c9a9c1a46706', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cf4fe4d7-2316-4403-a18b-3b0227898f0d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:30:59 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:59.987 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[47543bbb-d8d8-4c1b-a9a5-b987087211c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:30:59 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:30:59.990 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[1bb07f22-0773-4716-b74b-a1372bf54d2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:00 compute-0 NetworkManager[51733]: <info>  [1759267860.0224] device (tapa145b225-50): carrier: link connected
Sep 30 21:31:00 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:00.030 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[9336af8b-612d-45dd-944d-4d42f7837166]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:00 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:00.048 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[8cb52a3b-7532-4379-aac7-3413a06c3f69]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa145b225-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:69:43:a1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 93], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 455564, 'reachable_time': 24290, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232579, 'error': None, 'target': 'ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:00 compute-0 nova_compute[192810]: 2025-09-30 21:31:00.068 2 INFO nova.virt.libvirt.driver [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Instance shutdown successfully after 3 seconds.
Sep 30 21:31:00 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:00.070 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[53dbf4f1-c75d-401e-a66c-c69183555e91]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe69:43a1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 455564, 'tstamp': 455564}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232580, 'error': None, 'target': 'ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:00 compute-0 nova_compute[192810]: 2025-09-30 21:31:00.075 2 INFO nova.virt.libvirt.driver [-] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Instance destroyed successfully.
Sep 30 21:31:00 compute-0 nova_compute[192810]: 2025-09-30 21:31:00.077 2 DEBUG nova.virt.libvirt.vif [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:30:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-543993129',display_name='tempest-ServerDiskConfigTestJSON-server-543993129',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-543993129',id=81,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:30:39Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='72559935caa44fd9b779b6770f00199f',ramdisk_id='',reservation_id='r-z9ljve0a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-1133643549',owner_user_name='tempest-ServerDiskConfigTestJSON-1133643549-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:30:49Z,user_data=None,user_id='648f7bb37eeb4003825636f9a7c1f92a',uuid=c3cd73be-ae82-4c19-8ab7-ec9b06134032,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad", "address": "fa:16:3e:8a:8e:e0", "network": {"id": "a145b225-510f-43a7-8cc6-fccae3ed647e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-43539478-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-43539478-network", "vif_mac": "fa:16:3e:8a:8e:e0"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "72559935caa44fd9b779b6770f00199f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2d3f3fa-4e", "ovs_interfaceid": "c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:31:00 compute-0 nova_compute[192810]: 2025-09-30 21:31:00.077 2 DEBUG nova.network.os_vif_util [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Converting VIF {"id": "c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad", "address": "fa:16:3e:8a:8e:e0", "network": {"id": "a145b225-510f-43a7-8cc6-fccae3ed647e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-43539478-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-43539478-network", "vif_mac": "fa:16:3e:8a:8e:e0"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "72559935caa44fd9b779b6770f00199f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2d3f3fa-4e", "ovs_interfaceid": "c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:31:00 compute-0 nova_compute[192810]: 2025-09-30 21:31:00.078 2 DEBUG nova.network.os_vif_util [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8a:8e:e0,bridge_name='br-int',has_traffic_filtering=True,id=c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad,network=Network(a145b225-510f-43a7-8cc6-fccae3ed647e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2d3f3fa-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:31:00 compute-0 nova_compute[192810]: 2025-09-30 21:31:00.079 2 DEBUG os_vif [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8a:8e:e0,bridge_name='br-int',has_traffic_filtering=True,id=c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad,network=Network(a145b225-510f-43a7-8cc6-fccae3ed647e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2d3f3fa-4e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:31:00 compute-0 nova_compute[192810]: 2025-09-30 21:31:00.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:00 compute-0 nova_compute[192810]: 2025-09-30 21:31:00.081 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc2d3f3fa-4e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:31:00 compute-0 nova_compute[192810]: 2025-09-30 21:31:00.082 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:00 compute-0 nova_compute[192810]: 2025-09-30 21:31:00.085 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:31:00 compute-0 nova_compute[192810]: 2025-09-30 21:31:00.087 2 INFO os_vif [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8a:8e:e0,bridge_name='br-int',has_traffic_filtering=True,id=c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad,network=Network(a145b225-510f-43a7-8cc6-fccae3ed647e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2d3f3fa-4e')
Sep 30 21:31:00 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:00.089 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[93a96748-e600-4342-a562-ff54afcbec32]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa145b225-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:69:43:a1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 93], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 455564, 'reachable_time': 24290, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232581, 'error': None, 'target': 'ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:00 compute-0 nova_compute[192810]: 2025-09-30 21:31:00.091 2 DEBUG oslo_concurrency.processutils [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:31:00 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:00.124 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[c9cbfdf2-f94c-45d1-a190-0ac23daff131]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:00 compute-0 nova_compute[192810]: 2025-09-30 21:31:00.174 2 DEBUG oslo_concurrency.processutils [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:31:00 compute-0 nova_compute[192810]: 2025-09-30 21:31:00.176 2 DEBUG oslo_concurrency.processutils [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:31:00 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:00.192 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[479a621f-30cd-48db-bb2b-1da7669f9d9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:00 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:00.200 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa145b225-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:31:00 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:00.200 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:31:00 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:00.201 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa145b225-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:31:00 compute-0 nova_compute[192810]: 2025-09-30 21:31:00.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:00 compute-0 kernel: tapa145b225-50: entered promiscuous mode
Sep 30 21:31:00 compute-0 NetworkManager[51733]: <info>  [1759267860.2034] manager: (tapa145b225-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/149)
Sep 30 21:31:00 compute-0 nova_compute[192810]: 2025-09-30 21:31:00.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:00 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:00.206 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa145b225-50, col_values=(('external_ids', {'iface-id': '0f20179e-1a66-48c7-97c7-a3ccb2b25749'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:31:00 compute-0 nova_compute[192810]: 2025-09-30 21:31:00.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:00 compute-0 ovn_controller[94912]: 2025-09-30T21:31:00Z|00315|binding|INFO|Releasing lport 0f20179e-1a66-48c7-97c7-a3ccb2b25749 from this chassis (sb_readonly=0)
Sep 30 21:31:00 compute-0 nova_compute[192810]: 2025-09-30 21:31:00.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:00 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:00.222 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a145b225-510f-43a7-8cc6-fccae3ed647e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a145b225-510f-43a7-8cc6-fccae3ed647e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:31:00 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:00.223 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[3144781a-bd2a-4d07-b3c5-fe76426fc12f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:00 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:00.224 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:31:00 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:31:00 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:31:00 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-a145b225-510f-43a7-8cc6-fccae3ed647e
Sep 30 21:31:00 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:31:00 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:31:00 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:31:00 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/a145b225-510f-43a7-8cc6-fccae3ed647e.pid.haproxy
Sep 30 21:31:00 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:31:00 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:31:00 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:31:00 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:31:00 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:31:00 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:31:00 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:31:00 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:31:00 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:31:00 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:31:00 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:31:00 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:31:00 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:31:00 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:31:00 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:31:00 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:31:00 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:31:00 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:31:00 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:31:00 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:31:00 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID a145b225-510f-43a7-8cc6-fccae3ed647e
Sep 30 21:31:00 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:31:00 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:00.224 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e', 'env', 'PROCESS_TAG=haproxy-a145b225-510f-43a7-8cc6-fccae3ed647e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a145b225-510f-43a7-8cc6-fccae3ed647e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:31:00 compute-0 nova_compute[192810]: 2025-09-30 21:31:00.260 2 DEBUG oslo_concurrency.processutils [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:31:00 compute-0 nova_compute[192810]: 2025-09-30 21:31:00.261 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Copying file /var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032_resize/disk to 192.168.122.101:/var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Sep 30 21:31:00 compute-0 nova_compute[192810]: 2025-09-30 21:31:00.261 2 DEBUG oslo_concurrency.processutils [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032_resize/disk 192.168.122.101:/var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:31:00 compute-0 podman[232621]: 2025-09-30 21:31:00.583609829 +0000 UTC m=+0.060889160 container create b15f24149284354a54365ad65c96ce49235a5fbca7bc75f65164a7ba40f774ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, tcib_managed=true)
Sep 30 21:31:00 compute-0 systemd[1]: Started libpod-conmon-b15f24149284354a54365ad65c96ce49235a5fbca7bc75f65164a7ba40f774ec.scope.
Sep 30 21:31:00 compute-0 podman[232621]: 2025-09-30 21:31:00.54762973 +0000 UTC m=+0.024909121 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:31:00 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:31:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/984736239f0d027d166fe2b9957ed34fb763e92ff44b32ab9b1791101c60a12c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:31:00 compute-0 podman[232621]: 2025-09-30 21:31:00.693707742 +0000 UTC m=+0.170987093 container init b15f24149284354a54365ad65c96ce49235a5fbca7bc75f65164a7ba40f774ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:31:00 compute-0 podman[232621]: 2025-09-30 21:31:00.699286053 +0000 UTC m=+0.176565384 container start b15f24149284354a54365ad65c96ce49235a5fbca7bc75f65164a7ba40f774ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:31:00 compute-0 neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e[232636]: [NOTICE]   (232640) : New worker (232642) forked
Sep 30 21:31:00 compute-0 neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e[232636]: [NOTICE]   (232640) : Loading success.
Sep 30 21:31:00 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:00.756 103867 INFO neutron.agent.ovn.metadata.agent [-] Port c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad in datapath a145b225-510f-43a7-8cc6-fccae3ed647e unbound from our chassis
Sep 30 21:31:00 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:00.757 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a145b225-510f-43a7-8cc6-fccae3ed647e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:31:00 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:00.758 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[c0cdae36-b312-4568-bae3-f9b597567931]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:00 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:00.759 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e namespace which is not needed anymore
Sep 30 21:31:00 compute-0 nova_compute[192810]: 2025-09-30 21:31:00.816 2 DEBUG oslo_concurrency.processutils [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] CMD "scp -r /var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032_resize/disk 192.168.122.101:/var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk" returned: 0 in 0.555s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:31:00 compute-0 nova_compute[192810]: 2025-09-30 21:31:00.817 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Copying file /var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032_resize/disk.config to 192.168.122.101:/var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Sep 30 21:31:00 compute-0 nova_compute[192810]: 2025-09-30 21:31:00.818 2 DEBUG oslo_concurrency.processutils [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032_resize/disk.config 192.168.122.101:/var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:31:00 compute-0 neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e[232636]: [NOTICE]   (232640) : haproxy version is 2.8.14-c23fe91
Sep 30 21:31:00 compute-0 neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e[232636]: [NOTICE]   (232640) : path to executable is /usr/sbin/haproxy
Sep 30 21:31:00 compute-0 neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e[232636]: [WARNING]  (232640) : Exiting Master process...
Sep 30 21:31:00 compute-0 neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e[232636]: [ALERT]    (232640) : Current worker (232642) exited with code 143 (Terminated)
Sep 30 21:31:00 compute-0 neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e[232636]: [WARNING]  (232640) : All workers exited. Exiting... (0)
Sep 30 21:31:00 compute-0 systemd[1]: libpod-b15f24149284354a54365ad65c96ce49235a5fbca7bc75f65164a7ba40f774ec.scope: Deactivated successfully.
Sep 30 21:31:00 compute-0 podman[232669]: 2025-09-30 21:31:00.877491167 +0000 UTC m=+0.044291320 container died b15f24149284354a54365ad65c96ce49235a5fbca7bc75f65164a7ba40f774ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:31:00 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b15f24149284354a54365ad65c96ce49235a5fbca7bc75f65164a7ba40f774ec-userdata-shm.mount: Deactivated successfully.
Sep 30 21:31:00 compute-0 systemd[1]: var-lib-containers-storage-overlay-984736239f0d027d166fe2b9957ed34fb763e92ff44b32ab9b1791101c60a12c-merged.mount: Deactivated successfully.
Sep 30 21:31:00 compute-0 podman[232669]: 2025-09-30 21:31:00.959738636 +0000 UTC m=+0.126538769 container cleanup b15f24149284354a54365ad65c96ce49235a5fbca7bc75f65164a7ba40f774ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.build-date=20250923)
Sep 30 21:31:00 compute-0 systemd[1]: libpod-conmon-b15f24149284354a54365ad65c96ce49235a5fbca7bc75f65164a7ba40f774ec.scope: Deactivated successfully.
Sep 30 21:31:01 compute-0 podman[232700]: 2025-09-30 21:31:01.041515733 +0000 UTC m=+0.065117777 container remove b15f24149284354a54365ad65c96ce49235a5fbca7bc75f65164a7ba40f774ec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3)
Sep 30 21:31:01 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:01.047 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[6ecbf07b-d54d-47e4-9dd8-e31b7aaba15c]: (4, ('Tue Sep 30 09:31:00 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e (b15f24149284354a54365ad65c96ce49235a5fbca7bc75f65164a7ba40f774ec)\nb15f24149284354a54365ad65c96ce49235a5fbca7bc75f65164a7ba40f774ec\nTue Sep 30 09:31:00 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e (b15f24149284354a54365ad65c96ce49235a5fbca7bc75f65164a7ba40f774ec)\nb15f24149284354a54365ad65c96ce49235a5fbca7bc75f65164a7ba40f774ec\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:01 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:01.049 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[5d75f438-2d4f-4f29-a974-3fe2f73315b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:01 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:01.051 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa145b225-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:31:01 compute-0 nova_compute[192810]: 2025-09-30 21:31:01.053 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:01 compute-0 kernel: tapa145b225-50: left promiscuous mode
Sep 30 21:31:01 compute-0 nova_compute[192810]: 2025-09-30 21:31:01.067 2 DEBUG oslo_concurrency.processutils [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] CMD "scp -C -r /var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032_resize/disk.config 192.168.122.101:/var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk.config" returned: 0 in 0.249s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:31:01 compute-0 nova_compute[192810]: 2025-09-30 21:31:01.067 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Copying file /var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032_resize/disk.info to 192.168.122.101:/var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Sep 30 21:31:01 compute-0 nova_compute[192810]: 2025-09-30 21:31:01.068 2 DEBUG oslo_concurrency.processutils [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032_resize/disk.info 192.168.122.101:/var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:31:01 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:01.080 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[f937435d-fbe7-4eeb-8efa-5bd77a0e8d81]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:01 compute-0 nova_compute[192810]: 2025-09-30 21:31:01.083 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:01 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:01.104 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[4d89f8b8-c58b-45f9-ba7b-627e6781abf1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:01 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:01.105 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[494819d1-9f21-41d8-ac1a-0d7051a6cf61]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:01 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:01.120 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[3764edc1-12da-494f-8e5e-e76501936626]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 455556, 'reachable_time': 31096, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232718, 'error': None, 'target': 'ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:01 compute-0 systemd[1]: run-netns-ovnmeta\x2da145b225\x2d510f\x2d43a7\x2d8cc6\x2dfccae3ed647e.mount: Deactivated successfully.
Sep 30 21:31:01 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:01.125 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a145b225-510f-43a7-8cc6-fccae3ed647e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:31:01 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:01.125 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[6a433026-c1f7-4a39-971a-28c9a03dc41e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:01 compute-0 nova_compute[192810]: 2025-09-30 21:31:01.288 2 DEBUG oslo_concurrency.processutils [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] CMD "scp -C -r /var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032_resize/disk.info 192.168.122.101:/var/lib/nova/instances/c3cd73be-ae82-4c19-8ab7-ec9b06134032/disk.info" returned: 0 in 0.221s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:31:01 compute-0 nova_compute[192810]: 2025-09-30 21:31:01.502 2 DEBUG nova.compute.manager [req-798abc8d-9e8b-4a6c-8dc7-f4e91f40bd1e req-7784293f-925f-4df7-8a1b-2ebefe33e4f7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Received event network-vif-unplugged-c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:31:01 compute-0 nova_compute[192810]: 2025-09-30 21:31:01.503 2 DEBUG oslo_concurrency.lockutils [req-798abc8d-9e8b-4a6c-8dc7-f4e91f40bd1e req-7784293f-925f-4df7-8a1b-2ebefe33e4f7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:31:01 compute-0 nova_compute[192810]: 2025-09-30 21:31:01.503 2 DEBUG oslo_concurrency.lockutils [req-798abc8d-9e8b-4a6c-8dc7-f4e91f40bd1e req-7784293f-925f-4df7-8a1b-2ebefe33e4f7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:31:01 compute-0 nova_compute[192810]: 2025-09-30 21:31:01.503 2 DEBUG oslo_concurrency.lockutils [req-798abc8d-9e8b-4a6c-8dc7-f4e91f40bd1e req-7784293f-925f-4df7-8a1b-2ebefe33e4f7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:31:01 compute-0 nova_compute[192810]: 2025-09-30 21:31:01.503 2 DEBUG nova.compute.manager [req-798abc8d-9e8b-4a6c-8dc7-f4e91f40bd1e req-7784293f-925f-4df7-8a1b-2ebefe33e4f7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] No waiting events found dispatching network-vif-unplugged-c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:31:01 compute-0 nova_compute[192810]: 2025-09-30 21:31:01.504 2 WARNING nova.compute.manager [req-798abc8d-9e8b-4a6c-8dc7-f4e91f40bd1e req-7784293f-925f-4df7-8a1b-2ebefe33e4f7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Received unexpected event network-vif-unplugged-c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad for instance with vm_state active and task_state resize_migrating.
Sep 30 21:31:01 compute-0 nova_compute[192810]: 2025-09-30 21:31:01.585 2 DEBUG neutronclient.v2_0.client [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Sep 30 21:31:01 compute-0 nova_compute[192810]: 2025-09-30 21:31:01.601 2 DEBUG nova.objects.instance [None req-259eb02f-0b7e-4b88-9ded-822e5d38be6f 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'pci_devices' on Instance uuid 408b1a8f-ed4d-4d93-a98c-564af2bd678d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:31:01 compute-0 nova_compute[192810]: 2025-09-30 21:31:01.621 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267861.6210313, 408b1a8f-ed4d-4d93-a98c-564af2bd678d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:31:01 compute-0 nova_compute[192810]: 2025-09-30 21:31:01.621 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] VM Paused (Lifecycle Event)
Sep 30 21:31:01 compute-0 nova_compute[192810]: 2025-09-30 21:31:01.671 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:31:01 compute-0 nova_compute[192810]: 2025-09-30 21:31:01.675 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:31:01 compute-0 nova_compute[192810]: 2025-09-30 21:31:01.718 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] During sync_power_state the instance has a pending task (suspending). Skip.
Sep 30 21:31:01 compute-0 nova_compute[192810]: 2025-09-30 21:31:01.790 2 DEBUG oslo_concurrency.lockutils [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Acquiring lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:31:01 compute-0 nova_compute[192810]: 2025-09-30 21:31:01.790 2 DEBUG oslo_concurrency.lockutils [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:31:01 compute-0 nova_compute[192810]: 2025-09-30 21:31:01.790 2 DEBUG oslo_concurrency.lockutils [None req-8421c264-34bc-4012-bed3-16126fc76d88 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:31:02 compute-0 kernel: tapa4ba92a3-e0 (unregistering): left promiscuous mode
Sep 30 21:31:02 compute-0 NetworkManager[51733]: <info>  [1759267862.1079] device (tapa4ba92a3-e0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:31:02 compute-0 nova_compute[192810]: 2025-09-30 21:31:02.121 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:02 compute-0 ovn_controller[94912]: 2025-09-30T21:31:02Z|00316|binding|INFO|Releasing lport a4ba92a3-e019-4765-a096-89a660f1932c from this chassis (sb_readonly=0)
Sep 30 21:31:02 compute-0 ovn_controller[94912]: 2025-09-30T21:31:02Z|00317|binding|INFO|Setting lport a4ba92a3-e019-4765-a096-89a660f1932c down in Southbound
Sep 30 21:31:02 compute-0 ovn_controller[94912]: 2025-09-30T21:31:02Z|00318|binding|INFO|Removing iface tapa4ba92a3-e0 ovn-installed in OVS
Sep 30 21:31:02 compute-0 nova_compute[192810]: 2025-09-30 21:31:02.124 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:02 compute-0 nova_compute[192810]: 2025-09-30 21:31:02.133 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:02.135 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:07:78:65 10.100.0.4'], port_security=['fa:16:3e:07:78:65 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '408b1a8f-ed4d-4d93-a98c-564af2bd678d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9692dd1-658f-4c07-943c-6bc662046dc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2af578a858a44374a3dc027bbf7c69f2', 'neutron:revision_number': '6', 'neutron:security_group_ids': '5518a7d3-faed-4617-b7cb-cfdf96df8ee0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.177', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a290e6b7-09a2-435f-ae19-df4a5ccfc2d7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=a4ba92a3-e019-4765-a096-89a660f1932c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:31:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:02.137 103867 INFO neutron.agent.ovn.metadata.agent [-] Port a4ba92a3-e019-4765-a096-89a660f1932c in datapath f9692dd1-658f-4c07-943c-6bc662046dc4 unbound from our chassis
Sep 30 21:31:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:02.138 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f9692dd1-658f-4c07-943c-6bc662046dc4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:31:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:02.139 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[590219a7-968e-41b7-ab41-e520068c3bd6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:02.139 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 namespace which is not needed anymore
Sep 30 21:31:02 compute-0 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d0000004e.scope: Deactivated successfully.
Sep 30 21:31:02 compute-0 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d0000004e.scope: Consumed 8.388s CPU time.
Sep 30 21:31:02 compute-0 systemd-machined[152794]: Machine qemu-39-instance-0000004e terminated.
Sep 30 21:31:02 compute-0 nova_compute[192810]: 2025-09-30 21:31:02.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:02 compute-0 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[232409]: [NOTICE]   (232413) : haproxy version is 2.8.14-c23fe91
Sep 30 21:31:02 compute-0 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[232409]: [NOTICE]   (232413) : path to executable is /usr/sbin/haproxy
Sep 30 21:31:02 compute-0 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[232409]: [WARNING]  (232413) : Exiting Master process...
Sep 30 21:31:02 compute-0 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[232409]: [WARNING]  (232413) : Exiting Master process...
Sep 30 21:31:02 compute-0 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[232409]: [ALERT]    (232413) : Current worker (232415) exited with code 143 (Terminated)
Sep 30 21:31:02 compute-0 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[232409]: [WARNING]  (232413) : All workers exited. Exiting... (0)
Sep 30 21:31:02 compute-0 systemd[1]: libpod-6f2506df022d1f587dffbc5554a0008f639fe556cfd212667b5bc7c820dae6ca.scope: Deactivated successfully.
Sep 30 21:31:02 compute-0 podman[232744]: 2025-09-30 21:31:02.282757805 +0000 UTC m=+0.057351081 container died 6f2506df022d1f587dffbc5554a0008f639fe556cfd212667b5bc7c820dae6ca (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_managed=true)
Sep 30 21:31:02 compute-0 NetworkManager[51733]: <info>  [1759267862.3105] manager: (tapa4ba92a3-e0): new Tun device (/org/freedesktop/NetworkManager/Devices/150)
Sep 30 21:31:02 compute-0 kernel: tapa4ba92a3-e0: entered promiscuous mode
Sep 30 21:31:02 compute-0 kernel: tapa4ba92a3-e0 (unregistering): left promiscuous mode
Sep 30 21:31:02 compute-0 nova_compute[192810]: 2025-09-30 21:31:02.324 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:02 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6f2506df022d1f587dffbc5554a0008f639fe556cfd212667b5bc7c820dae6ca-userdata-shm.mount: Deactivated successfully.
Sep 30 21:31:02 compute-0 systemd[1]: var-lib-containers-storage-overlay-041d679a0a40966d6c2c90c0c203ea0c5a5618b70ffd9f86ca78d9b35c77cdab-merged.mount: Deactivated successfully.
Sep 30 21:31:02 compute-0 nova_compute[192810]: 2025-09-30 21:31:02.358 2 DEBUG nova.compute.manager [None req-259eb02f-0b7e-4b88-9ded-822e5d38be6f 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:31:02 compute-0 podman[232744]: 2025-09-30 21:31:02.369063416 +0000 UTC m=+0.143656692 container cleanup 6f2506df022d1f587dffbc5554a0008f639fe556cfd212667b5bc7c820dae6ca (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS)
Sep 30 21:31:02 compute-0 systemd[1]: libpod-conmon-6f2506df022d1f587dffbc5554a0008f639fe556cfd212667b5bc7c820dae6ca.scope: Deactivated successfully.
Sep 30 21:31:02 compute-0 podman[232789]: 2025-09-30 21:31:02.496797415 +0000 UTC m=+0.100317677 container remove 6f2506df022d1f587dffbc5554a0008f639fe556cfd212667b5bc7c820dae6ca (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 21:31:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:02.503 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[894d98aa-23bb-4e1b-b28e-8b1e1ae31cb8]: (4, ('Tue Sep 30 09:31:02 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 (6f2506df022d1f587dffbc5554a0008f639fe556cfd212667b5bc7c820dae6ca)\n6f2506df022d1f587dffbc5554a0008f639fe556cfd212667b5bc7c820dae6ca\nTue Sep 30 09:31:02 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 (6f2506df022d1f587dffbc5554a0008f639fe556cfd212667b5bc7c820dae6ca)\n6f2506df022d1f587dffbc5554a0008f639fe556cfd212667b5bc7c820dae6ca\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:02.506 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[9db8a254-119d-4908-aa0b-bb29be6b7bdd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:02.507 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9692dd1-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:31:02 compute-0 nova_compute[192810]: 2025-09-30 21:31:02.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:02 compute-0 kernel: tapf9692dd1-60: left promiscuous mode
Sep 30 21:31:02 compute-0 nova_compute[192810]: 2025-09-30 21:31:02.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:02.525 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[34e4f7bc-491e-4aff-93a8-12bc2a998924]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:02.554 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[f5819eb8-8a65-4fdd-9c57-3578418bbad8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:02.555 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[6c44ce0c-2149-4f7f-98ee-2a0b07a301cb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:02.569 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[0800b080-2a0a-49ce-947a-a085b6b4c84b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 454910, 'reachable_time': 32713, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232809, 'error': None, 'target': 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:02.572 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:31:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:02.572 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[bca683ab-f9f2-49e6-a57b-c0c5fcbe61b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:02 compute-0 systemd[1]: run-netns-ovnmeta\x2df9692dd1\x2d658f\x2d4c07\x2d943c\x2d6bc662046dc4.mount: Deactivated successfully.
Sep 30 21:31:02 compute-0 nova_compute[192810]: 2025-09-30 21:31:02.945 2 DEBUG nova.compute.manager [req-67cad2f7-b822-4cd2-839d-e0bdfe226268 req-8133d674-0b09-4d90-8a9c-482e4719e870 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Received event network-vif-unplugged-a4ba92a3-e019-4765-a096-89a660f1932c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:31:02 compute-0 nova_compute[192810]: 2025-09-30 21:31:02.945 2 DEBUG oslo_concurrency.lockutils [req-67cad2f7-b822-4cd2-839d-e0bdfe226268 req-8133d674-0b09-4d90-8a9c-482e4719e870 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:31:02 compute-0 nova_compute[192810]: 2025-09-30 21:31:02.946 2 DEBUG oslo_concurrency.lockutils [req-67cad2f7-b822-4cd2-839d-e0bdfe226268 req-8133d674-0b09-4d90-8a9c-482e4719e870 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:31:02 compute-0 nova_compute[192810]: 2025-09-30 21:31:02.946 2 DEBUG oslo_concurrency.lockutils [req-67cad2f7-b822-4cd2-839d-e0bdfe226268 req-8133d674-0b09-4d90-8a9c-482e4719e870 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:31:02 compute-0 nova_compute[192810]: 2025-09-30 21:31:02.947 2 DEBUG nova.compute.manager [req-67cad2f7-b822-4cd2-839d-e0bdfe226268 req-8133d674-0b09-4d90-8a9c-482e4719e870 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] No waiting events found dispatching network-vif-unplugged-a4ba92a3-e019-4765-a096-89a660f1932c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:31:02 compute-0 nova_compute[192810]: 2025-09-30 21:31:02.947 2 WARNING nova.compute.manager [req-67cad2f7-b822-4cd2-839d-e0bdfe226268 req-8133d674-0b09-4d90-8a9c-482e4719e870 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Received unexpected event network-vif-unplugged-a4ba92a3-e019-4765-a096-89a660f1932c for instance with vm_state suspended and task_state None.
Sep 30 21:31:04 compute-0 podman[232812]: 2025-09-30 21:31:04.317368148 +0000 UTC m=+0.049884642 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:31:04 compute-0 podman[232811]: 2025-09-30 21:31:04.322350634 +0000 UTC m=+0.057563696 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923)
Sep 30 21:31:04 compute-0 podman[232810]: 2025-09-30 21:31:04.323207926 +0000 UTC m=+0.059477785 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 21:31:04 compute-0 nova_compute[192810]: 2025-09-30 21:31:04.402 2 DEBUG nova.compute.manager [req-a5adcd54-40dd-443f-a74c-25658b5032ab req-38d6870c-9276-4ac4-bf78-3ec499bf6cd0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Received event network-vif-plugged-c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:31:04 compute-0 nova_compute[192810]: 2025-09-30 21:31:04.402 2 DEBUG oslo_concurrency.lockutils [req-a5adcd54-40dd-443f-a74c-25658b5032ab req-38d6870c-9276-4ac4-bf78-3ec499bf6cd0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:31:04 compute-0 nova_compute[192810]: 2025-09-30 21:31:04.402 2 DEBUG oslo_concurrency.lockutils [req-a5adcd54-40dd-443f-a74c-25658b5032ab req-38d6870c-9276-4ac4-bf78-3ec499bf6cd0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:31:04 compute-0 nova_compute[192810]: 2025-09-30 21:31:04.403 2 DEBUG oslo_concurrency.lockutils [req-a5adcd54-40dd-443f-a74c-25658b5032ab req-38d6870c-9276-4ac4-bf78-3ec499bf6cd0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:31:04 compute-0 nova_compute[192810]: 2025-09-30 21:31:04.403 2 DEBUG nova.compute.manager [req-a5adcd54-40dd-443f-a74c-25658b5032ab req-38d6870c-9276-4ac4-bf78-3ec499bf6cd0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] No waiting events found dispatching network-vif-plugged-c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:31:04 compute-0 nova_compute[192810]: 2025-09-30 21:31:04.403 2 WARNING nova.compute.manager [req-a5adcd54-40dd-443f-a74c-25658b5032ab req-38d6870c-9276-4ac4-bf78-3ec499bf6cd0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Received unexpected event network-vif-plugged-c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad for instance with vm_state active and task_state resize_migrated.
Sep 30 21:31:05 compute-0 nova_compute[192810]: 2025-09-30 21:31:05.082 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:05 compute-0 nova_compute[192810]: 2025-09-30 21:31:05.206 2 DEBUG nova.compute.manager [req-db2e6f66-04b1-4a64-bf9e-43345a68bf2c req-f35d50d2-49e0-49d3-857e-313530151076 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Received event network-vif-plugged-a4ba92a3-e019-4765-a096-89a660f1932c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:31:05 compute-0 nova_compute[192810]: 2025-09-30 21:31:05.206 2 DEBUG oslo_concurrency.lockutils [req-db2e6f66-04b1-4a64-bf9e-43345a68bf2c req-f35d50d2-49e0-49d3-857e-313530151076 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:31:05 compute-0 nova_compute[192810]: 2025-09-30 21:31:05.206 2 DEBUG oslo_concurrency.lockutils [req-db2e6f66-04b1-4a64-bf9e-43345a68bf2c req-f35d50d2-49e0-49d3-857e-313530151076 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:31:05 compute-0 nova_compute[192810]: 2025-09-30 21:31:05.206 2 DEBUG oslo_concurrency.lockutils [req-db2e6f66-04b1-4a64-bf9e-43345a68bf2c req-f35d50d2-49e0-49d3-857e-313530151076 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:31:05 compute-0 nova_compute[192810]: 2025-09-30 21:31:05.207 2 DEBUG nova.compute.manager [req-db2e6f66-04b1-4a64-bf9e-43345a68bf2c req-f35d50d2-49e0-49d3-857e-313530151076 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] No waiting events found dispatching network-vif-plugged-a4ba92a3-e019-4765-a096-89a660f1932c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:31:05 compute-0 nova_compute[192810]: 2025-09-30 21:31:05.207 2 WARNING nova.compute.manager [req-db2e6f66-04b1-4a64-bf9e-43345a68bf2c req-f35d50d2-49e0-49d3-857e-313530151076 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Received unexpected event network-vif-plugged-a4ba92a3-e019-4765-a096-89a660f1932c for instance with vm_state suspended and task_state None.
Sep 30 21:31:05 compute-0 nova_compute[192810]: 2025-09-30 21:31:05.948 2 INFO nova.compute.manager [None req-15984835-8e93-4bc9-b61a-57431fef8377 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Resuming
Sep 30 21:31:05 compute-0 nova_compute[192810]: 2025-09-30 21:31:05.949 2 DEBUG nova.objects.instance [None req-15984835-8e93-4bc9-b61a-57431fef8377 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'flavor' on Instance uuid 408b1a8f-ed4d-4d93-a98c-564af2bd678d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:31:05 compute-0 nova_compute[192810]: 2025-09-30 21:31:05.990 2 DEBUG oslo_concurrency.lockutils [None req-15984835-8e93-4bc9-b61a-57431fef8377 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "refresh_cache-408b1a8f-ed4d-4d93-a98c-564af2bd678d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:31:05 compute-0 nova_compute[192810]: 2025-09-30 21:31:05.991 2 DEBUG oslo_concurrency.lockutils [None req-15984835-8e93-4bc9-b61a-57431fef8377 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquired lock "refresh_cache-408b1a8f-ed4d-4d93-a98c-564af2bd678d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:31:05 compute-0 nova_compute[192810]: 2025-09-30 21:31:05.991 2 DEBUG nova.network.neutron [None req-15984835-8e93-4bc9-b61a-57431fef8377 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:31:06 compute-0 nova_compute[192810]: 2025-09-30 21:31:06.538 2 DEBUG nova.compute.manager [req-8ed4f990-ba21-465f-a602-d0e829d91a0f req-31869dde-428f-4b50-8f67-ef0065562346 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Received event network-changed-c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:31:06 compute-0 nova_compute[192810]: 2025-09-30 21:31:06.538 2 DEBUG nova.compute.manager [req-8ed4f990-ba21-465f-a602-d0e829d91a0f req-31869dde-428f-4b50-8f67-ef0065562346 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Refreshing instance network info cache due to event network-changed-c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:31:06 compute-0 nova_compute[192810]: 2025-09-30 21:31:06.539 2 DEBUG oslo_concurrency.lockutils [req-8ed4f990-ba21-465f-a602-d0e829d91a0f req-31869dde-428f-4b50-8f67-ef0065562346 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-c3cd73be-ae82-4c19-8ab7-ec9b06134032" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:31:06 compute-0 nova_compute[192810]: 2025-09-30 21:31:06.539 2 DEBUG oslo_concurrency.lockutils [req-8ed4f990-ba21-465f-a602-d0e829d91a0f req-31869dde-428f-4b50-8f67-ef0065562346 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-c3cd73be-ae82-4c19-8ab7-ec9b06134032" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:31:06 compute-0 nova_compute[192810]: 2025-09-30 21:31:06.539 2 DEBUG nova.network.neutron [req-8ed4f990-ba21-465f-a602-d0e829d91a0f req-31869dde-428f-4b50-8f67-ef0065562346 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Refreshing network info cache for port c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:31:07 compute-0 nova_compute[192810]: 2025-09-30 21:31:07.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:07 compute-0 nova_compute[192810]: 2025-09-30 21:31:07.757 2 DEBUG nova.network.neutron [None req-15984835-8e93-4bc9-b61a-57431fef8377 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Updating instance_info_cache with network_info: [{"id": "a4ba92a3-e019-4765-a096-89a660f1932c", "address": "fa:16:3e:07:78:65", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4ba92a3-e0", "ovs_interfaceid": "a4ba92a3-e019-4765-a096-89a660f1932c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:31:07 compute-0 nova_compute[192810]: 2025-09-30 21:31:07.780 2 DEBUG oslo_concurrency.lockutils [None req-15984835-8e93-4bc9-b61a-57431fef8377 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Releasing lock "refresh_cache-408b1a8f-ed4d-4d93-a98c-564af2bd678d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:31:07 compute-0 nova_compute[192810]: 2025-09-30 21:31:07.784 2 DEBUG nova.virt.libvirt.vif [None req-15984835-8e93-4bc9-b61a-57431fef8377 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:30:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-398758596',display_name='tempest-ServerActionsTestJSON-server-398758596',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-398758596',id=78,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBADj3eZ6JfWn1sD61WsBF2lWMwpE7XLjMHeX5D51ZTuvFj593BvRFZjp02OuEwvTUJEH79lLLcgJlYP5+6PE14q16iBV+2oZvdFvdVW4CAPM3S7plfjHeuzOdoE0D4V+KA==',key_name='tempest-keypair-557988176',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:30:30Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='2af578a858a44374a3dc027bbf7c69f2',ramdisk_id='',reservation_id='r-yikggmvz',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1867667353',owner_user_name='tempest-ServerActionsTestJSON-1867667353-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:31:02Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='22ed16bd4ffe4ef8bb21968a857066a1',uuid=408b1a8f-ed4d-4d93-a98c-564af2bd678d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "a4ba92a3-e019-4765-a096-89a660f1932c", "address": "fa:16:3e:07:78:65", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4ba92a3-e0", "ovs_interfaceid": "a4ba92a3-e019-4765-a096-89a660f1932c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:31:07 compute-0 nova_compute[192810]: 2025-09-30 21:31:07.784 2 DEBUG nova.network.os_vif_util [None req-15984835-8e93-4bc9-b61a-57431fef8377 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converting VIF {"id": "a4ba92a3-e019-4765-a096-89a660f1932c", "address": "fa:16:3e:07:78:65", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4ba92a3-e0", "ovs_interfaceid": "a4ba92a3-e019-4765-a096-89a660f1932c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:31:07 compute-0 nova_compute[192810]: 2025-09-30 21:31:07.785 2 DEBUG nova.network.os_vif_util [None req-15984835-8e93-4bc9-b61a-57431fef8377 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:07:78:65,bridge_name='br-int',has_traffic_filtering=True,id=a4ba92a3-e019-4765-a096-89a660f1932c,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa4ba92a3-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:31:07 compute-0 nova_compute[192810]: 2025-09-30 21:31:07.785 2 DEBUG os_vif [None req-15984835-8e93-4bc9-b61a-57431fef8377 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:07:78:65,bridge_name='br-int',has_traffic_filtering=True,id=a4ba92a3-e019-4765-a096-89a660f1932c,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa4ba92a3-e0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:31:07 compute-0 nova_compute[192810]: 2025-09-30 21:31:07.786 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:07 compute-0 nova_compute[192810]: 2025-09-30 21:31:07.786 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:31:07 compute-0 nova_compute[192810]: 2025-09-30 21:31:07.787 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:31:07 compute-0 nova_compute[192810]: 2025-09-30 21:31:07.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:07 compute-0 nova_compute[192810]: 2025-09-30 21:31:07.789 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa4ba92a3-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:31:07 compute-0 nova_compute[192810]: 2025-09-30 21:31:07.789 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa4ba92a3-e0, col_values=(('external_ids', {'iface-id': 'a4ba92a3-e019-4765-a096-89a660f1932c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:07:78:65', 'vm-uuid': '408b1a8f-ed4d-4d93-a98c-564af2bd678d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:31:07 compute-0 nova_compute[192810]: 2025-09-30 21:31:07.790 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:31:07 compute-0 nova_compute[192810]: 2025-09-30 21:31:07.790 2 INFO os_vif [None req-15984835-8e93-4bc9-b61a-57431fef8377 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:07:78:65,bridge_name='br-int',has_traffic_filtering=True,id=a4ba92a3-e019-4765-a096-89a660f1932c,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa4ba92a3-e0')
Sep 30 21:31:07 compute-0 nova_compute[192810]: 2025-09-30 21:31:07.816 2 DEBUG nova.objects.instance [None req-15984835-8e93-4bc9-b61a-57431fef8377 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'numa_topology' on Instance uuid 408b1a8f-ed4d-4d93-a98c-564af2bd678d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:31:07 compute-0 kernel: tapa4ba92a3-e0: entered promiscuous mode
Sep 30 21:31:07 compute-0 NetworkManager[51733]: <info>  [1759267867.8878] manager: (tapa4ba92a3-e0): new Tun device (/org/freedesktop/NetworkManager/Devices/151)
Sep 30 21:31:07 compute-0 systemd-udevd[232886]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:31:07 compute-0 nova_compute[192810]: 2025-09-30 21:31:07.936 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:07 compute-0 ovn_controller[94912]: 2025-09-30T21:31:07Z|00319|binding|INFO|Claiming lport a4ba92a3-e019-4765-a096-89a660f1932c for this chassis.
Sep 30 21:31:07 compute-0 ovn_controller[94912]: 2025-09-30T21:31:07Z|00320|binding|INFO|a4ba92a3-e019-4765-a096-89a660f1932c: Claiming fa:16:3e:07:78:65 10.100.0.4
Sep 30 21:31:07 compute-0 NetworkManager[51733]: <info>  [1759267867.9480] device (tapa4ba92a3-e0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:31:07 compute-0 NetworkManager[51733]: <info>  [1759267867.9494] device (tapa4ba92a3-e0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:31:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:07.947 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:07:78:65 10.100.0.4'], port_security=['fa:16:3e:07:78:65 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '408b1a8f-ed4d-4d93-a98c-564af2bd678d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9692dd1-658f-4c07-943c-6bc662046dc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2af578a858a44374a3dc027bbf7c69f2', 'neutron:revision_number': '7', 'neutron:security_group_ids': '5518a7d3-faed-4617-b7cb-cfdf96df8ee0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.177'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a290e6b7-09a2-435f-ae19-df4a5ccfc2d7, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=a4ba92a3-e019-4765-a096-89a660f1932c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:31:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:07.948 103867 INFO neutron.agent.ovn.metadata.agent [-] Port a4ba92a3-e019-4765-a096-89a660f1932c in datapath f9692dd1-658f-4c07-943c-6bc662046dc4 bound to our chassis
Sep 30 21:31:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:07.949 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f9692dd1-658f-4c07-943c-6bc662046dc4
Sep 30 21:31:07 compute-0 ovn_controller[94912]: 2025-09-30T21:31:07Z|00321|binding|INFO|Setting lport a4ba92a3-e019-4765-a096-89a660f1932c ovn-installed in OVS
Sep 30 21:31:07 compute-0 ovn_controller[94912]: 2025-09-30T21:31:07Z|00322|binding|INFO|Setting lport a4ba92a3-e019-4765-a096-89a660f1932c up in Southbound
Sep 30 21:31:07 compute-0 nova_compute[192810]: 2025-09-30 21:31:07.955 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:07.960 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[52aea3d1-21e9-4344-ba0b-8db3fa54523e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:07.961 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf9692dd1-61 in ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:31:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:07.963 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf9692dd1-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:31:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:07.963 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[781705b1-625f-4e39-be7f-4b55db0b9d16]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:07 compute-0 systemd-machined[152794]: New machine qemu-40-instance-0000004e.
Sep 30 21:31:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:07.964 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[f8574d95-53f8-487e-bc74-d8a6f4bbb006]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:07.973 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[5ff3d1bd-9e6a-41fe-9278-1e6141a159c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:07 compute-0 systemd[1]: Started Virtual Machine qemu-40-instance-0000004e.
Sep 30 21:31:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:07.997 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[5f947197-ee14-4f88-a9f9-e5368a95eca8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:08.022 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[4eda4681-4ed7-4f64-80ce-b4e8da8df4c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:08.028 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[9db3815f-9ab7-43be-8ace-b93a11e1e885]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:08 compute-0 NetworkManager[51733]: <info>  [1759267868.0311] manager: (tapf9692dd1-60): new Veth device (/org/freedesktop/NetworkManager/Devices/152)
Sep 30 21:31:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:08.061 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[2ecd1635-6da1-48fd-afd7-d62c758264cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:08.063 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[dfbfb5b5-5ec3-4f49-9b7c-a99ccd3975b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:08 compute-0 NetworkManager[51733]: <info>  [1759267868.0830] device (tapf9692dd1-60): carrier: link connected
Sep 30 21:31:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:08.087 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[e5cacf1e-7a80-421e-bc50-7d7a2012314a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:08.101 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e28b9f69-cc3c-47e1-9103-1a1e276b2759]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9692dd1-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:78:70'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 96], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 456370, 'reachable_time': 36412, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232922, 'error': None, 'target': 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:08.115 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[344610aa-ffe3-4a9c-9468-bd9bd667fe7e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed1:7870'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 456370, 'tstamp': 456370}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232923, 'error': None, 'target': 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:08.128 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[6a35abcf-308d-4bf5-aa5e-e2aa426747c7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9692dd1-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d1:78:70'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 96], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 456370, 'reachable_time': 36412, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232924, 'error': None, 'target': 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:08.157 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[8f07c3c9-baf0-43b5-86c3-f97a10708fb0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:08.207 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[14e77d32-732a-4d67-a798-08de2670e272]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:08.209 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9692dd1-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:31:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:08.209 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:31:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:08.209 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf9692dd1-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:31:08 compute-0 NetworkManager[51733]: <info>  [1759267868.2119] manager: (tapf9692dd1-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/153)
Sep 30 21:31:08 compute-0 nova_compute[192810]: 2025-09-30 21:31:08.211 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:08 compute-0 kernel: tapf9692dd1-60: entered promiscuous mode
Sep 30 21:31:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:08.216 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf9692dd1-60, col_values=(('external_ids', {'iface-id': 'a71d0422-57d0-42fa-887d-fdcb57295fce'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:31:08 compute-0 nova_compute[192810]: 2025-09-30 21:31:08.215 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:08 compute-0 ovn_controller[94912]: 2025-09-30T21:31:08Z|00323|binding|INFO|Releasing lport a71d0422-57d0-42fa-887d-fdcb57295fce from this chassis (sb_readonly=0)
Sep 30 21:31:08 compute-0 nova_compute[192810]: 2025-09-30 21:31:08.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:08 compute-0 nova_compute[192810]: 2025-09-30 21:31:08.233 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:08 compute-0 nova_compute[192810]: 2025-09-30 21:31:08.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:08.234 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f9692dd1-658f-4c07-943c-6bc662046dc4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f9692dd1-658f-4c07-943c-6bc662046dc4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:31:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:08.235 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[3e931506-3b06-4195-a7e7-6c992533cc94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:08.236 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:31:08 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:31:08 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:31:08 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-f9692dd1-658f-4c07-943c-6bc662046dc4
Sep 30 21:31:08 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:31:08 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:31:08 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:31:08 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/f9692dd1-658f-4c07-943c-6bc662046dc4.pid.haproxy
Sep 30 21:31:08 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:31:08 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:31:08 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:31:08 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:31:08 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:31:08 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:31:08 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:31:08 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:31:08 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:31:08 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:31:08 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:31:08 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:31:08 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:31:08 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:31:08 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:31:08 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:31:08 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:31:08 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:31:08 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:31:08 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:31:08 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID f9692dd1-658f-4c07-943c-6bc662046dc4
Sep 30 21:31:08 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:31:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:08.238 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'env', 'PROCESS_TAG=haproxy-f9692dd1-658f-4c07-943c-6bc662046dc4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f9692dd1-658f-4c07-943c-6bc662046dc4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:31:08 compute-0 nova_compute[192810]: 2025-09-30 21:31:08.253 2 DEBUG nova.network.neutron [req-8ed4f990-ba21-465f-a602-d0e829d91a0f req-31869dde-428f-4b50-8f67-ef0065562346 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Updated VIF entry in instance network info cache for port c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:31:08 compute-0 nova_compute[192810]: 2025-09-30 21:31:08.254 2 DEBUG nova.network.neutron [req-8ed4f990-ba21-465f-a602-d0e829d91a0f req-31869dde-428f-4b50-8f67-ef0065562346 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Updating instance_info_cache with network_info: [{"id": "c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad", "address": "fa:16:3e:8a:8e:e0", "network": {"id": "a145b225-510f-43a7-8cc6-fccae3ed647e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-43539478-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "72559935caa44fd9b779b6770f00199f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2d3f3fa-4e", "ovs_interfaceid": "c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:31:08 compute-0 nova_compute[192810]: 2025-09-30 21:31:08.294 2 DEBUG oslo_concurrency.lockutils [req-8ed4f990-ba21-465f-a602-d0e829d91a0f req-31869dde-428f-4b50-8f67-ef0065562346 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-c3cd73be-ae82-4c19-8ab7-ec9b06134032" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:31:08 compute-0 nova_compute[192810]: 2025-09-30 21:31:08.343 2 DEBUG nova.compute.manager [req-36b786e6-a389-4168-b5f9-1ef7a3e1e39c req-4fc20f9a-8dfb-4f23-b5de-67aeefd64b2f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Received event network-vif-plugged-a4ba92a3-e019-4765-a096-89a660f1932c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:31:08 compute-0 nova_compute[192810]: 2025-09-30 21:31:08.344 2 DEBUG oslo_concurrency.lockutils [req-36b786e6-a389-4168-b5f9-1ef7a3e1e39c req-4fc20f9a-8dfb-4f23-b5de-67aeefd64b2f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:31:08 compute-0 nova_compute[192810]: 2025-09-30 21:31:08.344 2 DEBUG oslo_concurrency.lockutils [req-36b786e6-a389-4168-b5f9-1ef7a3e1e39c req-4fc20f9a-8dfb-4f23-b5de-67aeefd64b2f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:31:08 compute-0 nova_compute[192810]: 2025-09-30 21:31:08.344 2 DEBUG oslo_concurrency.lockutils [req-36b786e6-a389-4168-b5f9-1ef7a3e1e39c req-4fc20f9a-8dfb-4f23-b5de-67aeefd64b2f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:31:08 compute-0 nova_compute[192810]: 2025-09-30 21:31:08.344 2 DEBUG nova.compute.manager [req-36b786e6-a389-4168-b5f9-1ef7a3e1e39c req-4fc20f9a-8dfb-4f23-b5de-67aeefd64b2f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] No waiting events found dispatching network-vif-plugged-a4ba92a3-e019-4765-a096-89a660f1932c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:31:08 compute-0 nova_compute[192810]: 2025-09-30 21:31:08.344 2 WARNING nova.compute.manager [req-36b786e6-a389-4168-b5f9-1ef7a3e1e39c req-4fc20f9a-8dfb-4f23-b5de-67aeefd64b2f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Received unexpected event network-vif-plugged-a4ba92a3-e019-4765-a096-89a660f1932c for instance with vm_state suspended and task_state resuming.
Sep 30 21:31:08 compute-0 podman[232963]: 2025-09-30 21:31:08.568216587 +0000 UTC m=+0.045391729 container create dd46d29921e0784269279014f458457af0b19b53cdf3692522b5d5e9220ddfd8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:31:08 compute-0 systemd[1]: Started libpod-conmon-dd46d29921e0784269279014f458457af0b19b53cdf3692522b5d5e9220ddfd8.scope.
Sep 30 21:31:08 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:31:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c7ecb2903dc4514168a22a5015309bcf0cf80b6b152adc9252657eb881b5aaa4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:31:08 compute-0 podman[232963]: 2025-09-30 21:31:08.542276811 +0000 UTC m=+0.019451983 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:31:08 compute-0 podman[232963]: 2025-09-30 21:31:08.646843764 +0000 UTC m=+0.124018916 container init dd46d29921e0784269279014f458457af0b19b53cdf3692522b5d5e9220ddfd8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, tcib_managed=true)
Sep 30 21:31:08 compute-0 podman[232963]: 2025-09-30 21:31:08.652716393 +0000 UTC m=+0.129891535 container start dd46d29921e0784269279014f458457af0b19b53cdf3692522b5d5e9220ddfd8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:31:08 compute-0 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[232978]: [NOTICE]   (232982) : New worker (232984) forked
Sep 30 21:31:08 compute-0 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[232978]: [NOTICE]   (232982) : Loading success.
Sep 30 21:31:08 compute-0 nova_compute[192810]: 2025-09-30 21:31:08.695 2 DEBUG nova.compute.manager [req-f8606941-c79e-4a01-8448-7c3a4f0cb01d req-c871c9c9-8d6d-4aea-915b-1cf1dd094fac dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Received event network-vif-plugged-c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:31:08 compute-0 nova_compute[192810]: 2025-09-30 21:31:08.696 2 DEBUG oslo_concurrency.lockutils [req-f8606941-c79e-4a01-8448-7c3a4f0cb01d req-c871c9c9-8d6d-4aea-915b-1cf1dd094fac dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:31:08 compute-0 nova_compute[192810]: 2025-09-30 21:31:08.696 2 DEBUG oslo_concurrency.lockutils [req-f8606941-c79e-4a01-8448-7c3a4f0cb01d req-c871c9c9-8d6d-4aea-915b-1cf1dd094fac dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:31:08 compute-0 nova_compute[192810]: 2025-09-30 21:31:08.696 2 DEBUG oslo_concurrency.lockutils [req-f8606941-c79e-4a01-8448-7c3a4f0cb01d req-c871c9c9-8d6d-4aea-915b-1cf1dd094fac dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:31:08 compute-0 nova_compute[192810]: 2025-09-30 21:31:08.696 2 DEBUG nova.compute.manager [req-f8606941-c79e-4a01-8448-7c3a4f0cb01d req-c871c9c9-8d6d-4aea-915b-1cf1dd094fac dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] No waiting events found dispatching network-vif-plugged-c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:31:08 compute-0 nova_compute[192810]: 2025-09-30 21:31:08.697 2 WARNING nova.compute.manager [req-f8606941-c79e-4a01-8448-7c3a4f0cb01d req-c871c9c9-8d6d-4aea-915b-1cf1dd094fac dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Received unexpected event network-vif-plugged-c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad for instance with vm_state active and task_state resize_finish.
Sep 30 21:31:08 compute-0 nova_compute[192810]: 2025-09-30 21:31:08.697 2 DEBUG nova.compute.manager [req-f8606941-c79e-4a01-8448-7c3a4f0cb01d req-c871c9c9-8d6d-4aea-915b-1cf1dd094fac dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Received event network-vif-plugged-c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:31:08 compute-0 nova_compute[192810]: 2025-09-30 21:31:08.697 2 DEBUG oslo_concurrency.lockutils [req-f8606941-c79e-4a01-8448-7c3a4f0cb01d req-c871c9c9-8d6d-4aea-915b-1cf1dd094fac dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:31:08 compute-0 nova_compute[192810]: 2025-09-30 21:31:08.697 2 DEBUG oslo_concurrency.lockutils [req-f8606941-c79e-4a01-8448-7c3a4f0cb01d req-c871c9c9-8d6d-4aea-915b-1cf1dd094fac dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:31:08 compute-0 nova_compute[192810]: 2025-09-30 21:31:08.697 2 DEBUG oslo_concurrency.lockutils [req-f8606941-c79e-4a01-8448-7c3a4f0cb01d req-c871c9c9-8d6d-4aea-915b-1cf1dd094fac dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:31:08 compute-0 nova_compute[192810]: 2025-09-30 21:31:08.698 2 DEBUG nova.compute.manager [req-f8606941-c79e-4a01-8448-7c3a4f0cb01d req-c871c9c9-8d6d-4aea-915b-1cf1dd094fac dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] No waiting events found dispatching network-vif-plugged-c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:31:08 compute-0 nova_compute[192810]: 2025-09-30 21:31:08.698 2 WARNING nova.compute.manager [req-f8606941-c79e-4a01-8448-7c3a4f0cb01d req-c871c9c9-8d6d-4aea-915b-1cf1dd094fac dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Received unexpected event network-vif-plugged-c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad for instance with vm_state active and task_state resize_finish.
Sep 30 21:31:08 compute-0 nova_compute[192810]: 2025-09-30 21:31:08.836 2 DEBUG nova.virt.libvirt.host [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Removed pending event for 408b1a8f-ed4d-4d93-a98c-564af2bd678d due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Sep 30 21:31:08 compute-0 nova_compute[192810]: 2025-09-30 21:31:08.837 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267868.8364985, 408b1a8f-ed4d-4d93-a98c-564af2bd678d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:31:08 compute-0 nova_compute[192810]: 2025-09-30 21:31:08.837 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] VM Started (Lifecycle Event)
Sep 30 21:31:08 compute-0 nova_compute[192810]: 2025-09-30 21:31:08.858 2 DEBUG nova.compute.manager [None req-15984835-8e93-4bc9-b61a-57431fef8377 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:31:08 compute-0 nova_compute[192810]: 2025-09-30 21:31:08.858 2 DEBUG nova.objects.instance [None req-15984835-8e93-4bc9-b61a-57431fef8377 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'pci_devices' on Instance uuid 408b1a8f-ed4d-4d93-a98c-564af2bd678d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:31:08 compute-0 nova_compute[192810]: 2025-09-30 21:31:08.862 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:31:08 compute-0 nova_compute[192810]: 2025-09-30 21:31:08.866 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:31:08 compute-0 nova_compute[192810]: 2025-09-30 21:31:08.873 2 INFO nova.virt.libvirt.driver [-] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Instance running successfully.
Sep 30 21:31:08 compute-0 virtqemud[192233]: argument unsupported: QEMU guest agent is not configured
Sep 30 21:31:08 compute-0 nova_compute[192810]: 2025-09-30 21:31:08.875 2 DEBUG nova.virt.libvirt.guest [None req-15984835-8e93-4bc9-b61a-57431fef8377 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Sep 30 21:31:08 compute-0 nova_compute[192810]: 2025-09-30 21:31:08.875 2 DEBUG nova.compute.manager [None req-15984835-8e93-4bc9-b61a-57431fef8377 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:31:08 compute-0 nova_compute[192810]: 2025-09-30 21:31:08.902 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] During sync_power_state the instance has a pending task (resuming). Skip.
Sep 30 21:31:08 compute-0 nova_compute[192810]: 2025-09-30 21:31:08.902 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267868.8437054, 408b1a8f-ed4d-4d93-a98c-564af2bd678d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:31:08 compute-0 nova_compute[192810]: 2025-09-30 21:31:08.902 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] VM Resumed (Lifecycle Event)
Sep 30 21:31:08 compute-0 nova_compute[192810]: 2025-09-30 21:31:08.936 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:31:08 compute-0 nova_compute[192810]: 2025-09-30 21:31:08.939 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:31:10 compute-0 nova_compute[192810]: 2025-09-30 21:31:10.083 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:10 compute-0 nova_compute[192810]: 2025-09-30 21:31:10.454 2 DEBUG nova.compute.manager [req-71cc8251-b88d-4452-ada9-91d5a595a5c0 req-d80ecc16-87f2-4211-a782-474d29fcc035 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Received event network-vif-plugged-a4ba92a3-e019-4765-a096-89a660f1932c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:31:10 compute-0 nova_compute[192810]: 2025-09-30 21:31:10.454 2 DEBUG oslo_concurrency.lockutils [req-71cc8251-b88d-4452-ada9-91d5a595a5c0 req-d80ecc16-87f2-4211-a782-474d29fcc035 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:31:10 compute-0 nova_compute[192810]: 2025-09-30 21:31:10.455 2 DEBUG oslo_concurrency.lockutils [req-71cc8251-b88d-4452-ada9-91d5a595a5c0 req-d80ecc16-87f2-4211-a782-474d29fcc035 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:31:10 compute-0 nova_compute[192810]: 2025-09-30 21:31:10.455 2 DEBUG oslo_concurrency.lockutils [req-71cc8251-b88d-4452-ada9-91d5a595a5c0 req-d80ecc16-87f2-4211-a782-474d29fcc035 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:31:10 compute-0 nova_compute[192810]: 2025-09-30 21:31:10.455 2 DEBUG nova.compute.manager [req-71cc8251-b88d-4452-ada9-91d5a595a5c0 req-d80ecc16-87f2-4211-a782-474d29fcc035 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] No waiting events found dispatching network-vif-plugged-a4ba92a3-e019-4765-a096-89a660f1932c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:31:10 compute-0 nova_compute[192810]: 2025-09-30 21:31:10.455 2 WARNING nova.compute.manager [req-71cc8251-b88d-4452-ada9-91d5a595a5c0 req-d80ecc16-87f2-4211-a782-474d29fcc035 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Received unexpected event network-vif-plugged-a4ba92a3-e019-4765-a096-89a660f1932c for instance with vm_state active and task_state None.
Sep 30 21:31:11 compute-0 nova_compute[192810]: 2025-09-30 21:31:11.240 2 DEBUG oslo_concurrency.lockutils [None req-89cb21f3-424a-4e0d-b013-f84e83720362 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Acquiring lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:31:11 compute-0 nova_compute[192810]: 2025-09-30 21:31:11.241 2 DEBUG oslo_concurrency.lockutils [None req-89cb21f3-424a-4e0d-b013-f84e83720362 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:31:11 compute-0 nova_compute[192810]: 2025-09-30 21:31:11.241 2 DEBUG nova.compute.manager [None req-89cb21f3-424a-4e0d-b013-f84e83720362 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Going to confirm migration 15 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679
Sep 30 21:31:11 compute-0 nova_compute[192810]: 2025-09-30 21:31:11.333 2 DEBUG nova.objects.instance [None req-89cb21f3-424a-4e0d-b013-f84e83720362 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lazy-loading 'info_cache' on Instance uuid c3cd73be-ae82-4c19-8ab7-ec9b06134032 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:31:11 compute-0 nova_compute[192810]: 2025-09-30 21:31:11.802 2 DEBUG neutronclient.v2_0.client [None req-89cb21f3-424a-4e0d-b013-f84e83720362 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Sep 30 21:31:11 compute-0 nova_compute[192810]: 2025-09-30 21:31:11.803 2 DEBUG oslo_concurrency.lockutils [None req-89cb21f3-424a-4e0d-b013-f84e83720362 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Acquiring lock "refresh_cache-c3cd73be-ae82-4c19-8ab7-ec9b06134032" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:31:11 compute-0 nova_compute[192810]: 2025-09-30 21:31:11.803 2 DEBUG oslo_concurrency.lockutils [None req-89cb21f3-424a-4e0d-b013-f84e83720362 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Acquired lock "refresh_cache-c3cd73be-ae82-4c19-8ab7-ec9b06134032" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:31:11 compute-0 nova_compute[192810]: 2025-09-30 21:31:11.803 2 DEBUG nova.network.neutron [None req-89cb21f3-424a-4e0d-b013-f84e83720362 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:31:12 compute-0 nova_compute[192810]: 2025-09-30 21:31:12.210 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:12 compute-0 nova_compute[192810]: 2025-09-30 21:31:12.571 2 DEBUG oslo_concurrency.lockutils [None req-5f7735e4-3e86-486f-9f5f-59301045c509 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:31:12 compute-0 nova_compute[192810]: 2025-09-30 21:31:12.572 2 DEBUG oslo_concurrency.lockutils [None req-5f7735e4-3e86-486f-9f5f-59301045c509 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:31:12 compute-0 nova_compute[192810]: 2025-09-30 21:31:12.572 2 DEBUG oslo_concurrency.lockutils [None req-5f7735e4-3e86-486f-9f5f-59301045c509 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:31:12 compute-0 nova_compute[192810]: 2025-09-30 21:31:12.572 2 DEBUG oslo_concurrency.lockutils [None req-5f7735e4-3e86-486f-9f5f-59301045c509 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:31:12 compute-0 nova_compute[192810]: 2025-09-30 21:31:12.572 2 DEBUG oslo_concurrency.lockutils [None req-5f7735e4-3e86-486f-9f5f-59301045c509 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:31:12 compute-0 nova_compute[192810]: 2025-09-30 21:31:12.586 2 INFO nova.compute.manager [None req-5f7735e4-3e86-486f-9f5f-59301045c509 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Terminating instance
Sep 30 21:31:12 compute-0 nova_compute[192810]: 2025-09-30 21:31:12.601 2 DEBUG nova.compute.manager [None req-5f7735e4-3e86-486f-9f5f-59301045c509 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:31:12 compute-0 kernel: tapa4ba92a3-e0 (unregistering): left promiscuous mode
Sep 30 21:31:12 compute-0 NetworkManager[51733]: <info>  [1759267872.6269] device (tapa4ba92a3-e0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:31:12 compute-0 ovn_controller[94912]: 2025-09-30T21:31:12Z|00324|binding|INFO|Releasing lport a4ba92a3-e019-4765-a096-89a660f1932c from this chassis (sb_readonly=0)
Sep 30 21:31:12 compute-0 ovn_controller[94912]: 2025-09-30T21:31:12Z|00325|binding|INFO|Setting lport a4ba92a3-e019-4765-a096-89a660f1932c down in Southbound
Sep 30 21:31:12 compute-0 nova_compute[192810]: 2025-09-30 21:31:12.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:12 compute-0 ovn_controller[94912]: 2025-09-30T21:31:12Z|00326|binding|INFO|Removing iface tapa4ba92a3-e0 ovn-installed in OVS
Sep 30 21:31:12 compute-0 nova_compute[192810]: 2025-09-30 21:31:12.640 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:12.651 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:07:78:65 10.100.0.4'], port_security=['fa:16:3e:07:78:65 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '408b1a8f-ed4d-4d93-a98c-564af2bd678d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9692dd1-658f-4c07-943c-6bc662046dc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2af578a858a44374a3dc027bbf7c69f2', 'neutron:revision_number': '8', 'neutron:security_group_ids': '5518a7d3-faed-4617-b7cb-cfdf96df8ee0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.177', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a290e6b7-09a2-435f-ae19-df4a5ccfc2d7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=a4ba92a3-e019-4765-a096-89a660f1932c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:31:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:12.652 103867 INFO neutron.agent.ovn.metadata.agent [-] Port a4ba92a3-e019-4765-a096-89a660f1932c in datapath f9692dd1-658f-4c07-943c-6bc662046dc4 unbound from our chassis
Sep 30 21:31:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:12.653 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f9692dd1-658f-4c07-943c-6bc662046dc4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:31:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:12.654 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a2118891-f04f-4f03-8e58-c5654fd969e8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:12.656 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 namespace which is not needed anymore
Sep 30 21:31:12 compute-0 nova_compute[192810]: 2025-09-30 21:31:12.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:12 compute-0 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d0000004e.scope: Deactivated successfully.
Sep 30 21:31:12 compute-0 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d0000004e.scope: Consumed 4.328s CPU time.
Sep 30 21:31:12 compute-0 systemd-machined[152794]: Machine qemu-40-instance-0000004e terminated.
Sep 30 21:31:12 compute-0 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[232978]: [NOTICE]   (232982) : haproxy version is 2.8.14-c23fe91
Sep 30 21:31:12 compute-0 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[232978]: [NOTICE]   (232982) : path to executable is /usr/sbin/haproxy
Sep 30 21:31:12 compute-0 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[232978]: [ALERT]    (232982) : Current worker (232984) exited with code 143 (Terminated)
Sep 30 21:31:12 compute-0 neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4[232978]: [WARNING]  (232982) : All workers exited. Exiting... (0)
Sep 30 21:31:12 compute-0 systemd[1]: libpod-dd46d29921e0784269279014f458457af0b19b53cdf3692522b5d5e9220ddfd8.scope: Deactivated successfully.
Sep 30 21:31:12 compute-0 podman[233017]: 2025-09-30 21:31:12.813857783 +0000 UTC m=+0.055081953 container died dd46d29921e0784269279014f458457af0b19b53cdf3692522b5d5e9220ddfd8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:31:12 compute-0 nova_compute[192810]: 2025-09-30 21:31:12.820 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:12 compute-0 nova_compute[192810]: 2025-09-30 21:31:12.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:12 compute-0 nova_compute[192810]: 2025-09-30 21:31:12.862 2 INFO nova.virt.libvirt.driver [-] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Instance destroyed successfully.
Sep 30 21:31:12 compute-0 nova_compute[192810]: 2025-09-30 21:31:12.863 2 DEBUG nova.objects.instance [None req-5f7735e4-3e86-486f-9f5f-59301045c509 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lazy-loading 'resources' on Instance uuid 408b1a8f-ed4d-4d93-a98c-564af2bd678d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:31:12 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-dd46d29921e0784269279014f458457af0b19b53cdf3692522b5d5e9220ddfd8-userdata-shm.mount: Deactivated successfully.
Sep 30 21:31:12 compute-0 systemd[1]: var-lib-containers-storage-overlay-c7ecb2903dc4514168a22a5015309bcf0cf80b6b152adc9252657eb881b5aaa4-merged.mount: Deactivated successfully.
Sep 30 21:31:12 compute-0 nova_compute[192810]: 2025-09-30 21:31:12.882 2 DEBUG nova.virt.libvirt.vif [None req-5f7735e4-3e86-486f-9f5f-59301045c509 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:30:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-398758596',display_name='tempest-ServerActionsTestJSON-server-398758596',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-398758596',id=78,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBADj3eZ6JfWn1sD61WsBF2lWMwpE7XLjMHeX5D51ZTuvFj593BvRFZjp02OuEwvTUJEH79lLLcgJlYP5+6PE14q16iBV+2oZvdFvdVW4CAPM3S7plfjHeuzOdoE0D4V+KA==',key_name='tempest-keypair-557988176',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:30:30Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2af578a858a44374a3dc027bbf7c69f2',ramdisk_id='',reservation_id='r-yikggmvz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1867667353',owner_user_name='tempest-ServerActionsTestJSON-1867667353-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:31:08Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='22ed16bd4ffe4ef8bb21968a857066a1',uuid=408b1a8f-ed4d-4d93-a98c-564af2bd678d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a4ba92a3-e019-4765-a096-89a660f1932c", "address": "fa:16:3e:07:78:65", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4ba92a3-e0", "ovs_interfaceid": "a4ba92a3-e019-4765-a096-89a660f1932c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:31:12 compute-0 nova_compute[192810]: 2025-09-30 21:31:12.883 2 DEBUG nova.network.os_vif_util [None req-5f7735e4-3e86-486f-9f5f-59301045c509 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converting VIF {"id": "a4ba92a3-e019-4765-a096-89a660f1932c", "address": "fa:16:3e:07:78:65", "network": {"id": "f9692dd1-658f-4c07-943c-6bc662046dc4", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-253023348-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2af578a858a44374a3dc027bbf7c69f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4ba92a3-e0", "ovs_interfaceid": "a4ba92a3-e019-4765-a096-89a660f1932c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:31:12 compute-0 podman[233017]: 2025-09-30 21:31:12.883520754 +0000 UTC m=+0.124744924 container cleanup dd46d29921e0784269279014f458457af0b19b53cdf3692522b5d5e9220ddfd8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Sep 30 21:31:12 compute-0 nova_compute[192810]: 2025-09-30 21:31:12.884 2 DEBUG nova.network.os_vif_util [None req-5f7735e4-3e86-486f-9f5f-59301045c509 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:07:78:65,bridge_name='br-int',has_traffic_filtering=True,id=a4ba92a3-e019-4765-a096-89a660f1932c,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa4ba92a3-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:31:12 compute-0 nova_compute[192810]: 2025-09-30 21:31:12.885 2 DEBUG os_vif [None req-5f7735e4-3e86-486f-9f5f-59301045c509 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:07:78:65,bridge_name='br-int',has_traffic_filtering=True,id=a4ba92a3-e019-4765-a096-89a660f1932c,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa4ba92a3-e0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:31:12 compute-0 nova_compute[192810]: 2025-09-30 21:31:12.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:12 compute-0 nova_compute[192810]: 2025-09-30 21:31:12.889 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa4ba92a3-e0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:31:12 compute-0 nova_compute[192810]: 2025-09-30 21:31:12.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:12 compute-0 systemd[1]: libpod-conmon-dd46d29921e0784269279014f458457af0b19b53cdf3692522b5d5e9220ddfd8.scope: Deactivated successfully.
Sep 30 21:31:12 compute-0 nova_compute[192810]: 2025-09-30 21:31:12.892 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:12 compute-0 nova_compute[192810]: 2025-09-30 21:31:12.894 2 INFO os_vif [None req-5f7735e4-3e86-486f-9f5f-59301045c509 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:07:78:65,bridge_name='br-int',has_traffic_filtering=True,id=a4ba92a3-e019-4765-a096-89a660f1932c,network=Network(f9692dd1-658f-4c07-943c-6bc662046dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa4ba92a3-e0')
Sep 30 21:31:12 compute-0 nova_compute[192810]: 2025-09-30 21:31:12.894 2 INFO nova.virt.libvirt.driver [None req-5f7735e4-3e86-486f-9f5f-59301045c509 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Deleting instance files /var/lib/nova/instances/408b1a8f-ed4d-4d93-a98c-564af2bd678d_del
Sep 30 21:31:12 compute-0 nova_compute[192810]: 2025-09-30 21:31:12.895 2 INFO nova.virt.libvirt.driver [None req-5f7735e4-3e86-486f-9f5f-59301045c509 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Deletion of /var/lib/nova/instances/408b1a8f-ed4d-4d93-a98c-564af2bd678d_del complete
Sep 30 21:31:12 compute-0 podman[233060]: 2025-09-30 21:31:12.978827033 +0000 UTC m=+0.075937090 container remove dd46d29921e0784269279014f458457af0b19b53cdf3692522b5d5e9220ddfd8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Sep 30 21:31:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:12.983 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[f5520a46-0895-4b65-8928-539d3206a075]: (4, ('Tue Sep 30 09:31:12 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 (dd46d29921e0784269279014f458457af0b19b53cdf3692522b5d5e9220ddfd8)\ndd46d29921e0784269279014f458457af0b19b53cdf3692522b5d5e9220ddfd8\nTue Sep 30 09:31:12 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 (dd46d29921e0784269279014f458457af0b19b53cdf3692522b5d5e9220ddfd8)\ndd46d29921e0784269279014f458457af0b19b53cdf3692522b5d5e9220ddfd8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:12.984 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[9235cb8c-6f63-48de-8632-222f882c8ea8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:12.985 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9692dd1-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:31:12 compute-0 kernel: tapf9692dd1-60: left promiscuous mode
Sep 30 21:31:12 compute-0 nova_compute[192810]: 2025-09-30 21:31:12.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:12 compute-0 nova_compute[192810]: 2025-09-30 21:31:12.998 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:13.001 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[836c8bfe-f3bf-4e35-a7bf-78eba8481acb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:13 compute-0 nova_compute[192810]: 2025-09-30 21:31:13.020 2 INFO nova.compute.manager [None req-5f7735e4-3e86-486f-9f5f-59301045c509 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Took 0.42 seconds to destroy the instance on the hypervisor.
Sep 30 21:31:13 compute-0 nova_compute[192810]: 2025-09-30 21:31:13.021 2 DEBUG oslo.service.loopingcall [None req-5f7735e4-3e86-486f-9f5f-59301045c509 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:31:13 compute-0 nova_compute[192810]: 2025-09-30 21:31:13.021 2 DEBUG nova.compute.manager [-] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:31:13 compute-0 nova_compute[192810]: 2025-09-30 21:31:13.021 2 DEBUG nova.network.neutron [-] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:31:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:13.042 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[8433a10a-7a27-489e-9583-0c9970aa5609]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:13.044 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[0f363698-d74b-471c-813b-aaf80c065936]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:13.060 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[b8ed25a6-002a-4123-9079-3a76f9dd86d7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 456364, 'reachable_time': 34225, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233075, 'error': None, 'target': 'ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:13.062 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f9692dd1-658f-4c07-943c-6bc662046dc4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:31:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:13.062 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[caca7411-87f6-4b10-9f37-454fe0f0f241]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:31:13 compute-0 systemd[1]: run-netns-ovnmeta\x2df9692dd1\x2d658f\x2d4c07\x2d943c\x2d6bc662046dc4.mount: Deactivated successfully.
Sep 30 21:31:13 compute-0 nova_compute[192810]: 2025-09-30 21:31:13.274 2 DEBUG nova.compute.manager [req-6846c98b-e87f-4f2e-bc28-8a9378e4431b req-573be713-ebab-4b5a-9f6e-511711cdab60 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Received event network-vif-unplugged-a4ba92a3-e019-4765-a096-89a660f1932c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:31:13 compute-0 nova_compute[192810]: 2025-09-30 21:31:13.275 2 DEBUG oslo_concurrency.lockutils [req-6846c98b-e87f-4f2e-bc28-8a9378e4431b req-573be713-ebab-4b5a-9f6e-511711cdab60 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:31:13 compute-0 nova_compute[192810]: 2025-09-30 21:31:13.275 2 DEBUG oslo_concurrency.lockutils [req-6846c98b-e87f-4f2e-bc28-8a9378e4431b req-573be713-ebab-4b5a-9f6e-511711cdab60 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:31:13 compute-0 nova_compute[192810]: 2025-09-30 21:31:13.276 2 DEBUG oslo_concurrency.lockutils [req-6846c98b-e87f-4f2e-bc28-8a9378e4431b req-573be713-ebab-4b5a-9f6e-511711cdab60 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:31:13 compute-0 nova_compute[192810]: 2025-09-30 21:31:13.276 2 DEBUG nova.compute.manager [req-6846c98b-e87f-4f2e-bc28-8a9378e4431b req-573be713-ebab-4b5a-9f6e-511711cdab60 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] No waiting events found dispatching network-vif-unplugged-a4ba92a3-e019-4765-a096-89a660f1932c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:31:13 compute-0 nova_compute[192810]: 2025-09-30 21:31:13.277 2 DEBUG nova.compute.manager [req-6846c98b-e87f-4f2e-bc28-8a9378e4431b req-573be713-ebab-4b5a-9f6e-511711cdab60 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Received event network-vif-unplugged-a4ba92a3-e019-4765-a096-89a660f1932c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:31:14 compute-0 nova_compute[192810]: 2025-09-30 21:31:14.003 2 DEBUG nova.network.neutron [None req-89cb21f3-424a-4e0d-b013-f84e83720362 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Updating instance_info_cache with network_info: [{"id": "c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad", "address": "fa:16:3e:8a:8e:e0", "network": {"id": "a145b225-510f-43a7-8cc6-fccae3ed647e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-43539478-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "72559935caa44fd9b779b6770f00199f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2d3f3fa-4e", "ovs_interfaceid": "c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:31:14 compute-0 nova_compute[192810]: 2025-09-30 21:31:14.040 2 DEBUG oslo_concurrency.lockutils [None req-89cb21f3-424a-4e0d-b013-f84e83720362 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Releasing lock "refresh_cache-c3cd73be-ae82-4c19-8ab7-ec9b06134032" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:31:14 compute-0 nova_compute[192810]: 2025-09-30 21:31:14.040 2 DEBUG nova.objects.instance [None req-89cb21f3-424a-4e0d-b013-f84e83720362 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lazy-loading 'migration_context' on Instance uuid c3cd73be-ae82-4c19-8ab7-ec9b06134032 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:31:14 compute-0 nova_compute[192810]: 2025-09-30 21:31:14.068 2 DEBUG nova.virt.libvirt.vif [None req-89cb21f3-424a-4e0d-b013-f84e83720362 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:30:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-543993129',display_name='tempest-ServerDiskConfigTestJSON-server-543993129',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-543993129',id=81,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:31:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='72559935caa44fd9b779b6770f00199f',ramdisk_id='',reservation_id='r-z9ljve0a',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-1133643549',owner_user_name='tempest-ServerDiskConfigTestJSON-1133643549-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:31:09Z,user_data=None,user_id='648f7bb37eeb4003825636f9a7c1f92a',uuid=c3cd73be-ae82-4c19-8ab7-ec9b06134032,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad", "address": "fa:16:3e:8a:8e:e0", "network": {"id": "a145b225-510f-43a7-8cc6-fccae3ed647e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-43539478-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "72559935caa44fd9b779b6770f00199f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2d3f3fa-4e", "ovs_interfaceid": "c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:31:14 compute-0 nova_compute[192810]: 2025-09-30 21:31:14.069 2 DEBUG nova.network.os_vif_util [None req-89cb21f3-424a-4e0d-b013-f84e83720362 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Converting VIF {"id": "c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad", "address": "fa:16:3e:8a:8e:e0", "network": {"id": "a145b225-510f-43a7-8cc6-fccae3ed647e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-43539478-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "72559935caa44fd9b779b6770f00199f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2d3f3fa-4e", "ovs_interfaceid": "c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:31:14 compute-0 nova_compute[192810]: 2025-09-30 21:31:14.070 2 DEBUG nova.network.os_vif_util [None req-89cb21f3-424a-4e0d-b013-f84e83720362 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8a:8e:e0,bridge_name='br-int',has_traffic_filtering=True,id=c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad,network=Network(a145b225-510f-43a7-8cc6-fccae3ed647e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2d3f3fa-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:31:14 compute-0 nova_compute[192810]: 2025-09-30 21:31:14.070 2 DEBUG os_vif [None req-89cb21f3-424a-4e0d-b013-f84e83720362 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8a:8e:e0,bridge_name='br-int',has_traffic_filtering=True,id=c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad,network=Network(a145b225-510f-43a7-8cc6-fccae3ed647e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2d3f3fa-4e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:31:14 compute-0 nova_compute[192810]: 2025-09-30 21:31:14.071 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:14 compute-0 nova_compute[192810]: 2025-09-30 21:31:14.072 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc2d3f3fa-4e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:31:14 compute-0 nova_compute[192810]: 2025-09-30 21:31:14.072 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:31:14 compute-0 nova_compute[192810]: 2025-09-30 21:31:14.074 2 INFO os_vif [None req-89cb21f3-424a-4e0d-b013-f84e83720362 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8a:8e:e0,bridge_name='br-int',has_traffic_filtering=True,id=c2d3f3fa-4e3d-473d-85fa-3e2e453fd6ad,network=Network(a145b225-510f-43a7-8cc6-fccae3ed647e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2d3f3fa-4e')
Sep 30 21:31:14 compute-0 nova_compute[192810]: 2025-09-30 21:31:14.075 2 DEBUG oslo_concurrency.lockutils [None req-89cb21f3-424a-4e0d-b013-f84e83720362 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:31:14 compute-0 nova_compute[192810]: 2025-09-30 21:31:14.075 2 DEBUG oslo_concurrency.lockutils [None req-89cb21f3-424a-4e0d-b013-f84e83720362 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:31:14 compute-0 nova_compute[192810]: 2025-09-30 21:31:14.243 2 DEBUG nova.compute.provider_tree [None req-89cb21f3-424a-4e0d-b013-f84e83720362 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:31:14 compute-0 nova_compute[192810]: 2025-09-30 21:31:14.269 2 DEBUG nova.scheduler.client.report [None req-89cb21f3-424a-4e0d-b013-f84e83720362 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:31:14 compute-0 nova_compute[192810]: 2025-09-30 21:31:14.331 2 DEBUG oslo_concurrency.lockutils [None req-89cb21f3-424a-4e0d-b013-f84e83720362 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.255s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:31:14 compute-0 nova_compute[192810]: 2025-09-30 21:31:14.535 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267859.5347056, c3cd73be-ae82-4c19-8ab7-ec9b06134032 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:31:14 compute-0 nova_compute[192810]: 2025-09-30 21:31:14.536 2 INFO nova.compute.manager [-] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] VM Stopped (Lifecycle Event)
Sep 30 21:31:14 compute-0 nova_compute[192810]: 2025-09-30 21:31:14.556 2 DEBUG nova.compute.manager [None req-a29b2a16-694c-409e-a1fe-cac8aa6666a8 - - - - - -] [instance: c3cd73be-ae82-4c19-8ab7-ec9b06134032] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:31:14 compute-0 nova_compute[192810]: 2025-09-30 21:31:14.559 2 INFO nova.scheduler.client.report [None req-89cb21f3-424a-4e0d-b013-f84e83720362 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Deleted allocation for migration 4741a102-4774-4853-b0df-0b90788b2a9d
Sep 30 21:31:14 compute-0 sshd[128205]: Timeout before authentication for connection from 113.240.110.90 to 38.102.83.69, pid = 230589
Sep 30 21:31:14 compute-0 nova_compute[192810]: 2025-09-30 21:31:14.862 2 DEBUG oslo_concurrency.lockutils [None req-89cb21f3-424a-4e0d-b013-f84e83720362 648f7bb37eeb4003825636f9a7c1f92a 72559935caa44fd9b779b6770f00199f - - default default] Lock "c3cd73be-ae82-4c19-8ab7-ec9b06134032" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 3.621s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:31:15 compute-0 nova_compute[192810]: 2025-09-30 21:31:15.029 2 DEBUG nova.network.neutron [-] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:31:15 compute-0 nova_compute[192810]: 2025-09-30 21:31:15.067 2 INFO nova.compute.manager [-] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Took 2.05 seconds to deallocate network for instance.
Sep 30 21:31:15 compute-0 nova_compute[192810]: 2025-09-30 21:31:15.194 2 DEBUG oslo_concurrency.lockutils [None req-5f7735e4-3e86-486f-9f5f-59301045c509 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:31:15 compute-0 nova_compute[192810]: 2025-09-30 21:31:15.194 2 DEBUG oslo_concurrency.lockutils [None req-5f7735e4-3e86-486f-9f5f-59301045c509 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:31:15 compute-0 nova_compute[192810]: 2025-09-30 21:31:15.269 2 DEBUG nova.compute.provider_tree [None req-5f7735e4-3e86-486f-9f5f-59301045c509 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:31:15 compute-0 nova_compute[192810]: 2025-09-30 21:31:15.285 2 DEBUG nova.scheduler.client.report [None req-5f7735e4-3e86-486f-9f5f-59301045c509 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:31:15 compute-0 nova_compute[192810]: 2025-09-30 21:31:15.316 2 DEBUG oslo_concurrency.lockutils [None req-5f7735e4-3e86-486f-9f5f-59301045c509 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:31:15 compute-0 nova_compute[192810]: 2025-09-30 21:31:15.344 2 INFO nova.scheduler.client.report [None req-5f7735e4-3e86-486f-9f5f-59301045c509 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Deleted allocations for instance 408b1a8f-ed4d-4d93-a98c-564af2bd678d
Sep 30 21:31:15 compute-0 nova_compute[192810]: 2025-09-30 21:31:15.454 2 DEBUG oslo_concurrency.lockutils [None req-5f7735e4-3e86-486f-9f5f-59301045c509 22ed16bd4ffe4ef8bb21968a857066a1 2af578a858a44374a3dc027bbf7c69f2 - - default default] Lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.883s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:31:15 compute-0 nova_compute[192810]: 2025-09-30 21:31:15.539 2 DEBUG nova.compute.manager [req-d0ed1189-b065-4fc1-abae-0c0736aa284c req-57401e3e-249e-4593-9428-82d39e75801f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Received event network-vif-plugged-a4ba92a3-e019-4765-a096-89a660f1932c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:31:15 compute-0 nova_compute[192810]: 2025-09-30 21:31:15.540 2 DEBUG oslo_concurrency.lockutils [req-d0ed1189-b065-4fc1-abae-0c0736aa284c req-57401e3e-249e-4593-9428-82d39e75801f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:31:15 compute-0 nova_compute[192810]: 2025-09-30 21:31:15.540 2 DEBUG oslo_concurrency.lockutils [req-d0ed1189-b065-4fc1-abae-0c0736aa284c req-57401e3e-249e-4593-9428-82d39e75801f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:31:15 compute-0 nova_compute[192810]: 2025-09-30 21:31:15.540 2 DEBUG oslo_concurrency.lockutils [req-d0ed1189-b065-4fc1-abae-0c0736aa284c req-57401e3e-249e-4593-9428-82d39e75801f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "408b1a8f-ed4d-4d93-a98c-564af2bd678d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:31:15 compute-0 nova_compute[192810]: 2025-09-30 21:31:15.540 2 DEBUG nova.compute.manager [req-d0ed1189-b065-4fc1-abae-0c0736aa284c req-57401e3e-249e-4593-9428-82d39e75801f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] No waiting events found dispatching network-vif-plugged-a4ba92a3-e019-4765-a096-89a660f1932c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:31:15 compute-0 nova_compute[192810]: 2025-09-30 21:31:15.541 2 WARNING nova.compute.manager [req-d0ed1189-b065-4fc1-abae-0c0736aa284c req-57401e3e-249e-4593-9428-82d39e75801f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Received unexpected event network-vif-plugged-a4ba92a3-e019-4765-a096-89a660f1932c for instance with vm_state deleted and task_state None.
Sep 30 21:31:15 compute-0 nova_compute[192810]: 2025-09-30 21:31:15.541 2 DEBUG nova.compute.manager [req-d0ed1189-b065-4fc1-abae-0c0736aa284c req-57401e3e-249e-4593-9428-82d39e75801f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Received event network-vif-deleted-a4ba92a3-e019-4765-a096-89a660f1932c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:31:17 compute-0 nova_compute[192810]: 2025-09-30 21:31:17.212 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:17 compute-0 podman[233077]: 2025-09-30 21:31:17.322381984 +0000 UTC m=+0.057331510 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2)
Sep 30 21:31:17 compute-0 podman[233076]: 2025-09-30 21:31:17.349454668 +0000 UTC m=+0.085350158 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller)
Sep 30 21:31:17 compute-0 nova_compute[192810]: 2025-09-30 21:31:17.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:19 compute-0 podman[233118]: 2025-09-30 21:31:19.310326588 +0000 UTC m=+0.052200280 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:31:21 compute-0 nova_compute[192810]: 2025-09-30 21:31:21.961 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:22 compute-0 nova_compute[192810]: 2025-09-30 21:31:22.109 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:22 compute-0 nova_compute[192810]: 2025-09-30 21:31:22.212 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:22 compute-0 nova_compute[192810]: 2025-09-30 21:31:22.928 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:27 compute-0 nova_compute[192810]: 2025-09-30 21:31:27.215 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:27 compute-0 sshd[128205]: drop connection #0 from [113.240.110.90]:53116 on [38.102.83.69]:22 penalty: exceeded LoginGraceTime
Sep 30 21:31:27 compute-0 nova_compute[192810]: 2025-09-30 21:31:27.858 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267872.8568947, 408b1a8f-ed4d-4d93-a98c-564af2bd678d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:31:27 compute-0 nova_compute[192810]: 2025-09-30 21:31:27.858 2 INFO nova.compute.manager [-] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] VM Stopped (Lifecycle Event)
Sep 30 21:31:27 compute-0 nova_compute[192810]: 2025-09-30 21:31:27.881 2 DEBUG nova.compute.manager [None req-9a5e62a7-56a2-47b1-9301-a51f83ce3c53 - - - - - -] [instance: 408b1a8f-ed4d-4d93-a98c-564af2bd678d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:31:27 compute-0 nova_compute[192810]: 2025-09-30 21:31:27.930 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:28 compute-0 podman[233140]: 2025-09-30 21:31:28.317647483 +0000 UTC m=+0.052653441 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, version=9.6, io.buildah.version=1.33.7, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, managed_by=edpm_ansible, name=ubi9-minimal, distribution-scope=public)
Sep 30 21:31:28 compute-0 podman[233139]: 2025-09-30 21:31:28.336635253 +0000 UTC m=+0.075060378 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:31:31 compute-0 unix_chkpwd[233187]: password check failed for user (root)
Sep 30 21:31:31 compute-0 sshd-session[233185]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=45.81.23.80  user=root
Sep 30 21:31:32 compute-0 nova_compute[192810]: 2025-09-30 21:31:32.216 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:32 compute-0 sshd-session[233185]: Failed password for root from 45.81.23.80 port 49952 ssh2
Sep 30 21:31:32 compute-0 nova_compute[192810]: 2025-09-30 21:31:32.977 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:33 compute-0 sshd-session[233185]: Received disconnect from 45.81.23.80 port 49952:11: Bye Bye [preauth]
Sep 30 21:31:33 compute-0 sshd-session[233185]: Disconnected from authenticating user root 45.81.23.80 port 49952 [preauth]
Sep 30 21:31:35 compute-0 podman[233190]: 2025-09-30 21:31:35.313408599 +0000 UTC m=+0.049121202 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:31:35 compute-0 podman[233188]: 2025-09-30 21:31:35.323392741 +0000 UTC m=+0.063406143 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=multipathd, managed_by=edpm_ansible)
Sep 30 21:31:35 compute-0 podman[233189]: 2025-09-30 21:31:35.336727908 +0000 UTC m=+0.063960917 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:31:37 compute-0 nova_compute[192810]: 2025-09-30 21:31:37.218 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:37 compute-0 nova_compute[192810]: 2025-09-30 21:31:37.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:38.737 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:31:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:38.737 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:31:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:38.738 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:31:42 compute-0 nova_compute[192810]: 2025-09-30 21:31:42.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:42 compute-0 nova_compute[192810]: 2025-09-30 21:31:42.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:43 compute-0 nova_compute[192810]: 2025-09-30 21:31:43.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:31:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:45.764 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:31:45 compute-0 nova_compute[192810]: 2025-09-30 21:31:45.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:45.766 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:31:45 compute-0 nova_compute[192810]: 2025-09-30 21:31:45.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:31:47 compute-0 nova_compute[192810]: 2025-09-30 21:31:47.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:47 compute-0 nova_compute[192810]: 2025-09-30 21:31:47.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:31:47 compute-0 nova_compute[192810]: 2025-09-30 21:31:47.983 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:48 compute-0 podman[233250]: 2025-09-30 21:31:48.357910542 +0000 UTC m=+0.092557590 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true)
Sep 30 21:31:48 compute-0 podman[233251]: 2025-09-30 21:31:48.363646397 +0000 UTC m=+0.093486274 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Sep 30 21:31:48 compute-0 nova_compute[192810]: 2025-09-30 21:31:48.786 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:31:48 compute-0 nova_compute[192810]: 2025-09-30 21:31:48.786 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:31:49 compute-0 nova_compute[192810]: 2025-09-30 21:31:49.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:31:49 compute-0 nova_compute[192810]: 2025-09-30 21:31:49.821 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:31:49 compute-0 nova_compute[192810]: 2025-09-30 21:31:49.821 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:31:49 compute-0 nova_compute[192810]: 2025-09-30 21:31:49.821 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:31:49 compute-0 nova_compute[192810]: 2025-09-30 21:31:49.822 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:31:49 compute-0 podman[233296]: 2025-09-30 21:31:49.940793669 +0000 UTC m=+0.087123943 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 21:31:50 compute-0 nova_compute[192810]: 2025-09-30 21:31:50.014 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:31:50 compute-0 nova_compute[192810]: 2025-09-30 21:31:50.015 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5675MB free_disk=73.31907272338867GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:31:50 compute-0 nova_compute[192810]: 2025-09-30 21:31:50.015 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:31:50 compute-0 nova_compute[192810]: 2025-09-30 21:31:50.016 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:31:50 compute-0 nova_compute[192810]: 2025-09-30 21:31:50.129 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:31:50 compute-0 nova_compute[192810]: 2025-09-30 21:31:50.129 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:31:50 compute-0 nova_compute[192810]: 2025-09-30 21:31:50.147 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Refreshing inventories for resource provider fe423b93-de5a-41f7-97d1-9622ea46af54 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Sep 30 21:31:50 compute-0 nova_compute[192810]: 2025-09-30 21:31:50.290 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Updating ProviderTree inventory for provider fe423b93-de5a-41f7-97d1-9622ea46af54 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Sep 30 21:31:50 compute-0 nova_compute[192810]: 2025-09-30 21:31:50.291 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Updating inventory in ProviderTree for provider fe423b93-de5a-41f7-97d1-9622ea46af54 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Sep 30 21:31:50 compute-0 nova_compute[192810]: 2025-09-30 21:31:50.322 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Refreshing aggregate associations for resource provider fe423b93-de5a-41f7-97d1-9622ea46af54, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Sep 30 21:31:50 compute-0 nova_compute[192810]: 2025-09-30 21:31:50.373 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Refreshing trait associations for resource provider fe423b93-de5a-41f7-97d1-9622ea46af54, traits: COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_RESCUE_BFV,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_1_2,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE42,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Sep 30 21:31:50 compute-0 nova_compute[192810]: 2025-09-30 21:31:50.421 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:31:50 compute-0 nova_compute[192810]: 2025-09-30 21:31:50.441 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:31:50 compute-0 nova_compute[192810]: 2025-09-30 21:31:50.466 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:31:50 compute-0 nova_compute[192810]: 2025-09-30 21:31:50.467 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.451s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:31:51 compute-0 nova_compute[192810]: 2025-09-30 21:31:51.462 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:31:51 compute-0 nova_compute[192810]: 2025-09-30 21:31:51.463 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:31:51 compute-0 nova_compute[192810]: 2025-09-30 21:31:51.463 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:31:51 compute-0 nova_compute[192810]: 2025-09-30 21:31:51.490 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 21:31:51 compute-0 nova_compute[192810]: 2025-09-30 21:31:51.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:31:52 compute-0 nova_compute[192810]: 2025-09-30 21:31:52.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:31:52.767 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:31:52 compute-0 nova_compute[192810]: 2025-09-30 21:31:52.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:53 compute-0 nova_compute[192810]: 2025-09-30 21:31:53.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:31:54 compute-0 nova_compute[192810]: 2025-09-30 21:31:54.783 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:31:57 compute-0 nova_compute[192810]: 2025-09-30 21:31:57.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:58 compute-0 nova_compute[192810]: 2025-09-30 21:31:58.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:31:59 compute-0 podman[233315]: 2025-09-30 21:31:59.316523826 +0000 UTC m=+0.054297993 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 21:31:59 compute-0 podman[233316]: 2025-09-30 21:31:59.336617154 +0000 UTC m=+0.066352908 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.openshift.expose-services=, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., distribution-scope=public, io.openshift.tags=minimal rhel9, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, build-date=2025-08-20T13:12:41)
Sep 30 21:32:02 compute-0 nova_compute[192810]: 2025-09-30 21:32:02.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:03 compute-0 nova_compute[192810]: 2025-09-30 21:32:03.021 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:06 compute-0 podman[233366]: 2025-09-30 21:32:06.341952569 +0000 UTC m=+0.065110566 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 21:32:06 compute-0 podman[233364]: 2025-09-30 21:32:06.364039928 +0000 UTC m=+0.099412864 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 21:32:06 compute-0 podman[233365]: 2025-09-30 21:32:06.372263126 +0000 UTC m=+0.105522778 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:32:06 compute-0 unix_chkpwd[233426]: password check failed for user (root)
Sep 30 21:32:06 compute-0 sshd-session[233362]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.95.116  user=root
Sep 30 21:32:07 compute-0 nova_compute[192810]: 2025-09-30 21:32:07.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:08 compute-0 nova_compute[192810]: 2025-09-30 21:32:08.023 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:09 compute-0 sshd-session[233362]: Failed password for root from 80.94.95.116 port 38456 ssh2
Sep 30 21:32:10 compute-0 sshd-session[233362]: Connection closed by authenticating user root 80.94.95.116 port 38456 [preauth]
Sep 30 21:32:12 compute-0 nova_compute[192810]: 2025-09-30 21:32:12.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:13 compute-0 nova_compute[192810]: 2025-09-30 21:32:13.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:16 compute-0 ovn_controller[94912]: 2025-09-30T21:32:16Z|00327|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Sep 30 21:32:16 compute-0 nova_compute[192810]: 2025-09-30 21:32:16.920 2 DEBUG oslo_concurrency.lockutils [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Acquiring lock "f4cfcb8f-ae48-41bf-b39c-597a639f3a68" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:32:16 compute-0 nova_compute[192810]: 2025-09-30 21:32:16.920 2 DEBUG oslo_concurrency.lockutils [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Lock "f4cfcb8f-ae48-41bf-b39c-597a639f3a68" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:32:16 compute-0 nova_compute[192810]: 2025-09-30 21:32:16.944 2 DEBUG nova.compute.manager [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:32:17 compute-0 nova_compute[192810]: 2025-09-30 21:32:17.050 2 DEBUG oslo_concurrency.lockutils [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:32:17 compute-0 nova_compute[192810]: 2025-09-30 21:32:17.051 2 DEBUG oslo_concurrency.lockutils [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:32:17 compute-0 nova_compute[192810]: 2025-09-30 21:32:17.057 2 DEBUG nova.virt.hardware [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:32:17 compute-0 nova_compute[192810]: 2025-09-30 21:32:17.058 2 INFO nova.compute.claims [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:32:17 compute-0 nova_compute[192810]: 2025-09-30 21:32:17.269 2 DEBUG nova.compute.provider_tree [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:32:17 compute-0 nova_compute[192810]: 2025-09-30 21:32:17.283 2 DEBUG nova.scheduler.client.report [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:32:17 compute-0 nova_compute[192810]: 2025-09-30 21:32:17.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:17 compute-0 nova_compute[192810]: 2025-09-30 21:32:17.303 2 DEBUG oslo_concurrency.lockutils [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.252s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:32:17 compute-0 nova_compute[192810]: 2025-09-30 21:32:17.304 2 DEBUG nova.compute.manager [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:32:17 compute-0 nova_compute[192810]: 2025-09-30 21:32:17.392 2 DEBUG nova.compute.manager [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:32:17 compute-0 nova_compute[192810]: 2025-09-30 21:32:17.392 2 DEBUG nova.network.neutron [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:32:17 compute-0 nova_compute[192810]: 2025-09-30 21:32:17.415 2 INFO nova.virt.libvirt.driver [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:32:17 compute-0 nova_compute[192810]: 2025-09-30 21:32:17.434 2 DEBUG nova.compute.manager [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:32:17 compute-0 nova_compute[192810]: 2025-09-30 21:32:17.563 2 DEBUG nova.compute.manager [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:32:17 compute-0 nova_compute[192810]: 2025-09-30 21:32:17.564 2 DEBUG nova.virt.libvirt.driver [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:32:17 compute-0 nova_compute[192810]: 2025-09-30 21:32:17.564 2 INFO nova.virt.libvirt.driver [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Creating image(s)
Sep 30 21:32:17 compute-0 nova_compute[192810]: 2025-09-30 21:32:17.565 2 DEBUG oslo_concurrency.lockutils [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Acquiring lock "/var/lib/nova/instances/f4cfcb8f-ae48-41bf-b39c-597a639f3a68/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:32:17 compute-0 nova_compute[192810]: 2025-09-30 21:32:17.565 2 DEBUG oslo_concurrency.lockutils [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Lock "/var/lib/nova/instances/f4cfcb8f-ae48-41bf-b39c-597a639f3a68/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:32:17 compute-0 nova_compute[192810]: 2025-09-30 21:32:17.565 2 DEBUG oslo_concurrency.lockutils [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Lock "/var/lib/nova/instances/f4cfcb8f-ae48-41bf-b39c-597a639f3a68/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:32:17 compute-0 nova_compute[192810]: 2025-09-30 21:32:17.576 2 DEBUG oslo_concurrency.processutils [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:32:17 compute-0 nova_compute[192810]: 2025-09-30 21:32:17.595 2 DEBUG nova.policy [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '002d2f39090c4281bae7f4aba96cef9f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9f2302f440e044e8be2476ceca24a473', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:32:17 compute-0 nova_compute[192810]: 2025-09-30 21:32:17.637 2 DEBUG oslo_concurrency.processutils [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:32:17 compute-0 nova_compute[192810]: 2025-09-30 21:32:17.638 2 DEBUG oslo_concurrency.lockutils [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:32:17 compute-0 nova_compute[192810]: 2025-09-30 21:32:17.638 2 DEBUG oslo_concurrency.lockutils [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:32:17 compute-0 nova_compute[192810]: 2025-09-30 21:32:17.658 2 DEBUG oslo_concurrency.processutils [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:32:17 compute-0 nova_compute[192810]: 2025-09-30 21:32:17.747 2 DEBUG oslo_concurrency.processutils [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:32:17 compute-0 nova_compute[192810]: 2025-09-30 21:32:17.749 2 DEBUG oslo_concurrency.processutils [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/f4cfcb8f-ae48-41bf-b39c-597a639f3a68/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:32:17 compute-0 nova_compute[192810]: 2025-09-30 21:32:17.898 2 DEBUG oslo_concurrency.processutils [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/f4cfcb8f-ae48-41bf-b39c-597a639f3a68/disk 1073741824" returned: 0 in 0.150s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:32:17 compute-0 nova_compute[192810]: 2025-09-30 21:32:17.899 2 DEBUG oslo_concurrency.lockutils [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.261s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:32:17 compute-0 nova_compute[192810]: 2025-09-30 21:32:17.900 2 DEBUG oslo_concurrency.processutils [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:32:17 compute-0 nova_compute[192810]: 2025-09-30 21:32:17.956 2 DEBUG oslo_concurrency.processutils [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:32:17 compute-0 nova_compute[192810]: 2025-09-30 21:32:17.957 2 DEBUG nova.virt.disk.api [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Checking if we can resize image /var/lib/nova/instances/f4cfcb8f-ae48-41bf-b39c-597a639f3a68/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:32:17 compute-0 nova_compute[192810]: 2025-09-30 21:32:17.957 2 DEBUG oslo_concurrency.processutils [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f4cfcb8f-ae48-41bf-b39c-597a639f3a68/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:32:18 compute-0 nova_compute[192810]: 2025-09-30 21:32:18.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:18 compute-0 nova_compute[192810]: 2025-09-30 21:32:18.029 2 DEBUG oslo_concurrency.processutils [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f4cfcb8f-ae48-41bf-b39c-597a639f3a68/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:32:18 compute-0 nova_compute[192810]: 2025-09-30 21:32:18.030 2 DEBUG nova.virt.disk.api [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Cannot resize image /var/lib/nova/instances/f4cfcb8f-ae48-41bf-b39c-597a639f3a68/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:32:18 compute-0 nova_compute[192810]: 2025-09-30 21:32:18.031 2 DEBUG nova.objects.instance [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Lazy-loading 'migration_context' on Instance uuid f4cfcb8f-ae48-41bf-b39c-597a639f3a68 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:32:18 compute-0 nova_compute[192810]: 2025-09-30 21:32:18.045 2 DEBUG nova.virt.libvirt.driver [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:32:18 compute-0 nova_compute[192810]: 2025-09-30 21:32:18.046 2 DEBUG nova.virt.libvirt.driver [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Ensure instance console log exists: /var/lib/nova/instances/f4cfcb8f-ae48-41bf-b39c-597a639f3a68/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:32:18 compute-0 nova_compute[192810]: 2025-09-30 21:32:18.047 2 DEBUG oslo_concurrency.lockutils [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:32:18 compute-0 nova_compute[192810]: 2025-09-30 21:32:18.047 2 DEBUG oslo_concurrency.lockutils [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:32:18 compute-0 nova_compute[192810]: 2025-09-30 21:32:18.048 2 DEBUG oslo_concurrency.lockutils [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:32:18 compute-0 nova_compute[192810]: 2025-09-30 21:32:18.519 2 DEBUG nova.network.neutron [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Successfully created port: c5c9974c-d1b4-4d55-b299-5bd39f80dba6 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:32:19 compute-0 podman[233443]: 2025-09-30 21:32:19.326514235 +0000 UTC m=+0.062052769 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=edpm)
Sep 30 21:32:19 compute-0 podman[233442]: 2025-09-30 21:32:19.350305037 +0000 UTC m=+0.091256664 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Sep 30 21:32:19 compute-0 nova_compute[192810]: 2025-09-30 21:32:19.454 2 DEBUG nova.network.neutron [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Successfully updated port: c5c9974c-d1b4-4d55-b299-5bd39f80dba6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:32:19 compute-0 nova_compute[192810]: 2025-09-30 21:32:19.486 2 DEBUG oslo_concurrency.lockutils [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Acquiring lock "refresh_cache-f4cfcb8f-ae48-41bf-b39c-597a639f3a68" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:32:19 compute-0 nova_compute[192810]: 2025-09-30 21:32:19.486 2 DEBUG oslo_concurrency.lockutils [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Acquired lock "refresh_cache-f4cfcb8f-ae48-41bf-b39c-597a639f3a68" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:32:19 compute-0 nova_compute[192810]: 2025-09-30 21:32:19.486 2 DEBUG nova.network.neutron [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:32:19 compute-0 nova_compute[192810]: 2025-09-30 21:32:19.560 2 DEBUG nova.compute.manager [req-4a513257-7117-49ee-a801-a8f38f319789 req-b62eb758-289d-435c-b91c-8d57179abfa1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Received event network-changed-c5c9974c-d1b4-4d55-b299-5bd39f80dba6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:32:19 compute-0 nova_compute[192810]: 2025-09-30 21:32:19.560 2 DEBUG nova.compute.manager [req-4a513257-7117-49ee-a801-a8f38f319789 req-b62eb758-289d-435c-b91c-8d57179abfa1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Refreshing instance network info cache due to event network-changed-c5c9974c-d1b4-4d55-b299-5bd39f80dba6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:32:19 compute-0 nova_compute[192810]: 2025-09-30 21:32:19.560 2 DEBUG oslo_concurrency.lockutils [req-4a513257-7117-49ee-a801-a8f38f319789 req-b62eb758-289d-435c-b91c-8d57179abfa1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-f4cfcb8f-ae48-41bf-b39c-597a639f3a68" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:32:19 compute-0 nova_compute[192810]: 2025-09-30 21:32:19.636 2 DEBUG nova.network.neutron [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:32:20 compute-0 podman[233486]: 2025-09-30 21:32:20.320282641 +0000 UTC m=+0.058331349 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Sep 30 21:32:20 compute-0 nova_compute[192810]: 2025-09-30 21:32:20.685 2 DEBUG nova.network.neutron [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Updating instance_info_cache with network_info: [{"id": "c5c9974c-d1b4-4d55-b299-5bd39f80dba6", "address": "fa:16:3e:e6:b3:a9", "network": {"id": "ef8c2eaa-a409-4066-a902-4b34ecd6ca56", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1075020687-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f2302f440e044e8be2476ceca24a473", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5c9974c-d1", "ovs_interfaceid": "c5c9974c-d1b4-4d55-b299-5bd39f80dba6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:32:20 compute-0 nova_compute[192810]: 2025-09-30 21:32:20.734 2 DEBUG oslo_concurrency.lockutils [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Releasing lock "refresh_cache-f4cfcb8f-ae48-41bf-b39c-597a639f3a68" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:32:20 compute-0 nova_compute[192810]: 2025-09-30 21:32:20.734 2 DEBUG nova.compute.manager [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Instance network_info: |[{"id": "c5c9974c-d1b4-4d55-b299-5bd39f80dba6", "address": "fa:16:3e:e6:b3:a9", "network": {"id": "ef8c2eaa-a409-4066-a902-4b34ecd6ca56", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1075020687-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f2302f440e044e8be2476ceca24a473", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5c9974c-d1", "ovs_interfaceid": "c5c9974c-d1b4-4d55-b299-5bd39f80dba6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:32:20 compute-0 nova_compute[192810]: 2025-09-30 21:32:20.735 2 DEBUG oslo_concurrency.lockutils [req-4a513257-7117-49ee-a801-a8f38f319789 req-b62eb758-289d-435c-b91c-8d57179abfa1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-f4cfcb8f-ae48-41bf-b39c-597a639f3a68" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:32:20 compute-0 nova_compute[192810]: 2025-09-30 21:32:20.735 2 DEBUG nova.network.neutron [req-4a513257-7117-49ee-a801-a8f38f319789 req-b62eb758-289d-435c-b91c-8d57179abfa1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Refreshing network info cache for port c5c9974c-d1b4-4d55-b299-5bd39f80dba6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:32:20 compute-0 nova_compute[192810]: 2025-09-30 21:32:20.737 2 DEBUG nova.virt.libvirt.driver [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Start _get_guest_xml network_info=[{"id": "c5c9974c-d1b4-4d55-b299-5bd39f80dba6", "address": "fa:16:3e:e6:b3:a9", "network": {"id": "ef8c2eaa-a409-4066-a902-4b34ecd6ca56", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1075020687-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f2302f440e044e8be2476ceca24a473", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5c9974c-d1", "ovs_interfaceid": "c5c9974c-d1b4-4d55-b299-5bd39f80dba6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:32:20 compute-0 nova_compute[192810]: 2025-09-30 21:32:20.742 2 WARNING nova.virt.libvirt.driver [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:32:20 compute-0 nova_compute[192810]: 2025-09-30 21:32:20.746 2 DEBUG nova.virt.libvirt.host [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:32:20 compute-0 nova_compute[192810]: 2025-09-30 21:32:20.746 2 DEBUG nova.virt.libvirt.host [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:32:20 compute-0 nova_compute[192810]: 2025-09-30 21:32:20.749 2 DEBUG nova.virt.libvirt.host [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:32:20 compute-0 nova_compute[192810]: 2025-09-30 21:32:20.750 2 DEBUG nova.virt.libvirt.host [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:32:20 compute-0 nova_compute[192810]: 2025-09-30 21:32:20.750 2 DEBUG nova.virt.libvirt.driver [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:32:20 compute-0 nova_compute[192810]: 2025-09-30 21:32:20.751 2 DEBUG nova.virt.hardware [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:32:20 compute-0 nova_compute[192810]: 2025-09-30 21:32:20.751 2 DEBUG nova.virt.hardware [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:32:20 compute-0 nova_compute[192810]: 2025-09-30 21:32:20.751 2 DEBUG nova.virt.hardware [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:32:20 compute-0 nova_compute[192810]: 2025-09-30 21:32:20.751 2 DEBUG nova.virt.hardware [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:32:20 compute-0 nova_compute[192810]: 2025-09-30 21:32:20.752 2 DEBUG nova.virt.hardware [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:32:20 compute-0 nova_compute[192810]: 2025-09-30 21:32:20.752 2 DEBUG nova.virt.hardware [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:32:20 compute-0 nova_compute[192810]: 2025-09-30 21:32:20.752 2 DEBUG nova.virt.hardware [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:32:20 compute-0 nova_compute[192810]: 2025-09-30 21:32:20.752 2 DEBUG nova.virt.hardware [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:32:20 compute-0 nova_compute[192810]: 2025-09-30 21:32:20.753 2 DEBUG nova.virt.hardware [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:32:20 compute-0 nova_compute[192810]: 2025-09-30 21:32:20.753 2 DEBUG nova.virt.hardware [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:32:20 compute-0 nova_compute[192810]: 2025-09-30 21:32:20.753 2 DEBUG nova.virt.hardware [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:32:20 compute-0 nova_compute[192810]: 2025-09-30 21:32:20.756 2 DEBUG nova.virt.libvirt.vif [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:32:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-2099014552',display_name='tempest-SecurityGroupsTestJSON-server-2099014552',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-2099014552',id=86,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9f2302f440e044e8be2476ceca24a473',ramdisk_id='',reservation_id='r-by2uio6p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-SecurityGroupsTestJSON-1418534006',owner_user_name='tempest-SecurityGroupsTestJSON-1418534006-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:32:17Z,user_data=None,user_id='002d2f39090c4281bae7f4aba96cef9f',uuid=f4cfcb8f-ae48-41bf-b39c-597a639f3a68,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c5c9974c-d1b4-4d55-b299-5bd39f80dba6", "address": "fa:16:3e:e6:b3:a9", "network": {"id": "ef8c2eaa-a409-4066-a902-4b34ecd6ca56", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1075020687-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f2302f440e044e8be2476ceca24a473", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5c9974c-d1", "ovs_interfaceid": "c5c9974c-d1b4-4d55-b299-5bd39f80dba6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:32:20 compute-0 nova_compute[192810]: 2025-09-30 21:32:20.756 2 DEBUG nova.network.os_vif_util [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Converting VIF {"id": "c5c9974c-d1b4-4d55-b299-5bd39f80dba6", "address": "fa:16:3e:e6:b3:a9", "network": {"id": "ef8c2eaa-a409-4066-a902-4b34ecd6ca56", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1075020687-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f2302f440e044e8be2476ceca24a473", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5c9974c-d1", "ovs_interfaceid": "c5c9974c-d1b4-4d55-b299-5bd39f80dba6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:32:20 compute-0 nova_compute[192810]: 2025-09-30 21:32:20.757 2 DEBUG nova.network.os_vif_util [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e6:b3:a9,bridge_name='br-int',has_traffic_filtering=True,id=c5c9974c-d1b4-4d55-b299-5bd39f80dba6,network=Network(ef8c2eaa-a409-4066-a902-4b34ecd6ca56),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5c9974c-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:32:20 compute-0 nova_compute[192810]: 2025-09-30 21:32:20.758 2 DEBUG nova.objects.instance [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Lazy-loading 'pci_devices' on Instance uuid f4cfcb8f-ae48-41bf-b39c-597a639f3a68 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:32:20 compute-0 nova_compute[192810]: 2025-09-30 21:32:20.780 2 DEBUG nova.virt.libvirt.driver [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:32:20 compute-0 nova_compute[192810]:   <uuid>f4cfcb8f-ae48-41bf-b39c-597a639f3a68</uuid>
Sep 30 21:32:20 compute-0 nova_compute[192810]:   <name>instance-00000056</name>
Sep 30 21:32:20 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:32:20 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:32:20 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:32:20 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:       <nova:name>tempest-SecurityGroupsTestJSON-server-2099014552</nova:name>
Sep 30 21:32:20 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:32:20</nova:creationTime>
Sep 30 21:32:20 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:32:20 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:32:20 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:32:20 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:32:20 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:32:20 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:32:20 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:32:20 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:32:20 compute-0 nova_compute[192810]:         <nova:user uuid="002d2f39090c4281bae7f4aba96cef9f">tempest-SecurityGroupsTestJSON-1418534006-project-member</nova:user>
Sep 30 21:32:20 compute-0 nova_compute[192810]:         <nova:project uuid="9f2302f440e044e8be2476ceca24a473">tempest-SecurityGroupsTestJSON-1418534006</nova:project>
Sep 30 21:32:20 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:32:20 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:32:20 compute-0 nova_compute[192810]:         <nova:port uuid="c5c9974c-d1b4-4d55-b299-5bd39f80dba6">
Sep 30 21:32:20 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:32:20 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:32:20 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:32:20 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:32:20 compute-0 nova_compute[192810]:     <system>
Sep 30 21:32:20 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:32:20 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:32:20 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:32:20 compute-0 nova_compute[192810]:       <entry name="serial">f4cfcb8f-ae48-41bf-b39c-597a639f3a68</entry>
Sep 30 21:32:20 compute-0 nova_compute[192810]:       <entry name="uuid">f4cfcb8f-ae48-41bf-b39c-597a639f3a68</entry>
Sep 30 21:32:20 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     </system>
Sep 30 21:32:20 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:32:20 compute-0 nova_compute[192810]:   <os>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:   </os>
Sep 30 21:32:20 compute-0 nova_compute[192810]:   <features>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:   </features>
Sep 30 21:32:20 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:32:20 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:32:20 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:32:20 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:32:20 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:32:20 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/f4cfcb8f-ae48-41bf-b39c-597a639f3a68/disk"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:32:20 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/f4cfcb8f-ae48-41bf-b39c-597a639f3a68/disk.config"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:32:20 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:e6:b3:a9"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:       <target dev="tapc5c9974c-d1"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:32:20 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/f4cfcb8f-ae48-41bf-b39c-597a639f3a68/console.log" append="off"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     <video>
Sep 30 21:32:20 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     </video>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:32:20 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:32:20 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:32:20 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:32:20 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:32:20 compute-0 nova_compute[192810]: </domain>
Sep 30 21:32:20 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:32:20 compute-0 nova_compute[192810]: 2025-09-30 21:32:20.782 2 DEBUG nova.compute.manager [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Preparing to wait for external event network-vif-plugged-c5c9974c-d1b4-4d55-b299-5bd39f80dba6 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:32:20 compute-0 nova_compute[192810]: 2025-09-30 21:32:20.782 2 DEBUG oslo_concurrency.lockutils [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Acquiring lock "f4cfcb8f-ae48-41bf-b39c-597a639f3a68-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:32:20 compute-0 nova_compute[192810]: 2025-09-30 21:32:20.782 2 DEBUG oslo_concurrency.lockutils [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Lock "f4cfcb8f-ae48-41bf-b39c-597a639f3a68-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:32:20 compute-0 nova_compute[192810]: 2025-09-30 21:32:20.782 2 DEBUG oslo_concurrency.lockutils [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Lock "f4cfcb8f-ae48-41bf-b39c-597a639f3a68-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:32:20 compute-0 nova_compute[192810]: 2025-09-30 21:32:20.783 2 DEBUG nova.virt.libvirt.vif [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:32:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-2099014552',display_name='tempest-SecurityGroupsTestJSON-server-2099014552',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-2099014552',id=86,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9f2302f440e044e8be2476ceca24a473',ramdisk_id='',reservation_id='r-by2uio6p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-SecurityGroupsTestJSON-1418534006',owner_user_name='tempest-SecurityGroupsTestJSON-1418534006-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:32:17Z,user_data=None,user_id='002d2f39090c4281bae7f4aba96cef9f',uuid=f4cfcb8f-ae48-41bf-b39c-597a639f3a68,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c5c9974c-d1b4-4d55-b299-5bd39f80dba6", "address": "fa:16:3e:e6:b3:a9", "network": {"id": "ef8c2eaa-a409-4066-a902-4b34ecd6ca56", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1075020687-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f2302f440e044e8be2476ceca24a473", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5c9974c-d1", "ovs_interfaceid": "c5c9974c-d1b4-4d55-b299-5bd39f80dba6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:32:20 compute-0 nova_compute[192810]: 2025-09-30 21:32:20.783 2 DEBUG nova.network.os_vif_util [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Converting VIF {"id": "c5c9974c-d1b4-4d55-b299-5bd39f80dba6", "address": "fa:16:3e:e6:b3:a9", "network": {"id": "ef8c2eaa-a409-4066-a902-4b34ecd6ca56", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1075020687-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f2302f440e044e8be2476ceca24a473", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5c9974c-d1", "ovs_interfaceid": "c5c9974c-d1b4-4d55-b299-5bd39f80dba6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:32:20 compute-0 nova_compute[192810]: 2025-09-30 21:32:20.784 2 DEBUG nova.network.os_vif_util [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e6:b3:a9,bridge_name='br-int',has_traffic_filtering=True,id=c5c9974c-d1b4-4d55-b299-5bd39f80dba6,network=Network(ef8c2eaa-a409-4066-a902-4b34ecd6ca56),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5c9974c-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:32:20 compute-0 nova_compute[192810]: 2025-09-30 21:32:20.784 2 DEBUG os_vif [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e6:b3:a9,bridge_name='br-int',has_traffic_filtering=True,id=c5c9974c-d1b4-4d55-b299-5bd39f80dba6,network=Network(ef8c2eaa-a409-4066-a902-4b34ecd6ca56),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5c9974c-d1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:32:20 compute-0 nova_compute[192810]: 2025-09-30 21:32:20.784 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:20 compute-0 nova_compute[192810]: 2025-09-30 21:32:20.785 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:32:20 compute-0 nova_compute[192810]: 2025-09-30 21:32:20.785 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:32:20 compute-0 nova_compute[192810]: 2025-09-30 21:32:20.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:20 compute-0 nova_compute[192810]: 2025-09-30 21:32:20.788 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc5c9974c-d1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:32:20 compute-0 nova_compute[192810]: 2025-09-30 21:32:20.789 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc5c9974c-d1, col_values=(('external_ids', {'iface-id': 'c5c9974c-d1b4-4d55-b299-5bd39f80dba6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e6:b3:a9', 'vm-uuid': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:32:20 compute-0 NetworkManager[51733]: <info>  [1759267940.7912] manager: (tapc5c9974c-d1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/154)
Sep 30 21:32:20 compute-0 nova_compute[192810]: 2025-09-30 21:32:20.790 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:20 compute-0 nova_compute[192810]: 2025-09-30 21:32:20.794 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:32:20 compute-0 nova_compute[192810]: 2025-09-30 21:32:20.796 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:20 compute-0 nova_compute[192810]: 2025-09-30 21:32:20.797 2 INFO os_vif [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e6:b3:a9,bridge_name='br-int',has_traffic_filtering=True,id=c5c9974c-d1b4-4d55-b299-5bd39f80dba6,network=Network(ef8c2eaa-a409-4066-a902-4b34ecd6ca56),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5c9974c-d1')
Sep 30 21:32:20 compute-0 nova_compute[192810]: 2025-09-30 21:32:20.901 2 DEBUG nova.virt.libvirt.driver [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:32:20 compute-0 nova_compute[192810]: 2025-09-30 21:32:20.901 2 DEBUG nova.virt.libvirt.driver [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:32:20 compute-0 nova_compute[192810]: 2025-09-30 21:32:20.902 2 DEBUG nova.virt.libvirt.driver [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] No VIF found with MAC fa:16:3e:e6:b3:a9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:32:20 compute-0 nova_compute[192810]: 2025-09-30 21:32:20.902 2 INFO nova.virt.libvirt.driver [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Using config drive
Sep 30 21:32:21 compute-0 nova_compute[192810]: 2025-09-30 21:32:21.601 2 INFO nova.virt.libvirt.driver [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Creating config drive at /var/lib/nova/instances/f4cfcb8f-ae48-41bf-b39c-597a639f3a68/disk.config
Sep 30 21:32:21 compute-0 nova_compute[192810]: 2025-09-30 21:32:21.605 2 DEBUG oslo_concurrency.processutils [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f4cfcb8f-ae48-41bf-b39c-597a639f3a68/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsnsdxzy0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:32:21 compute-0 nova_compute[192810]: 2025-09-30 21:32:21.730 2 DEBUG oslo_concurrency.processutils [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f4cfcb8f-ae48-41bf-b39c-597a639f3a68/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsnsdxzy0" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:32:21 compute-0 kernel: tapc5c9974c-d1: entered promiscuous mode
Sep 30 21:32:21 compute-0 NetworkManager[51733]: <info>  [1759267941.7884] manager: (tapc5c9974c-d1): new Tun device (/org/freedesktop/NetworkManager/Devices/155)
Sep 30 21:32:21 compute-0 ovn_controller[94912]: 2025-09-30T21:32:21Z|00328|binding|INFO|Claiming lport c5c9974c-d1b4-4d55-b299-5bd39f80dba6 for this chassis.
Sep 30 21:32:21 compute-0 ovn_controller[94912]: 2025-09-30T21:32:21Z|00329|binding|INFO|c5c9974c-d1b4-4d55-b299-5bd39f80dba6: Claiming fa:16:3e:e6:b3:a9 10.100.0.9
Sep 30 21:32:21 compute-0 nova_compute[192810]: 2025-09-30 21:32:21.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:21 compute-0 nova_compute[192810]: 2025-09-30 21:32:21.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:21 compute-0 nova_compute[192810]: 2025-09-30 21:32:21.797 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:21.803 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e6:b3:a9 10.100.0.9'], port_security=['fa:16:3e:e6:b3:a9 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ef8c2eaa-a409-4066-a902-4b34ecd6ca56', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f2302f440e044e8be2476ceca24a473', 'neutron:revision_number': '2', 'neutron:security_group_ids': '25f79e3a-3cea-479f-b28e-51d8fb5663ce', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e3322f30-0ba7-4080-8269-173d99dc2779, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=c5c9974c-d1b4-4d55-b299-5bd39f80dba6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:32:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:21.804 103867 INFO neutron.agent.ovn.metadata.agent [-] Port c5c9974c-d1b4-4d55-b299-5bd39f80dba6 in datapath ef8c2eaa-a409-4066-a902-4b34ecd6ca56 bound to our chassis
Sep 30 21:32:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:21.805 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ef8c2eaa-a409-4066-a902-4b34ecd6ca56
Sep 30 21:32:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:21.815 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[0aa60509-8d56-41d7-983b-c18a310df869]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:32:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:21.816 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapef8c2eaa-a1 in ovnmeta-ef8c2eaa-a409-4066-a902-4b34ecd6ca56 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:32:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:21.818 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapef8c2eaa-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:32:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:21.818 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[c4452567-5a3c-4625-8b57-e8aacca27755]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:32:21 compute-0 systemd-udevd[233523]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:32:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:21.818 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[14842eba-c39a-4952-b6c5-28b9bf4ac614]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:32:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:21.829 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[1aab5f85-9f5a-45c7-8028-e4be0e360e0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:32:21 compute-0 systemd-machined[152794]: New machine qemu-41-instance-00000056.
Sep 30 21:32:21 compute-0 NetworkManager[51733]: <info>  [1759267941.8336] device (tapc5c9974c-d1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:32:21 compute-0 NetworkManager[51733]: <info>  [1759267941.8343] device (tapc5c9974c-d1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:32:21 compute-0 nova_compute[192810]: 2025-09-30 21:32:21.847 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:21 compute-0 ovn_controller[94912]: 2025-09-30T21:32:21Z|00330|binding|INFO|Setting lport c5c9974c-d1b4-4d55-b299-5bd39f80dba6 ovn-installed in OVS
Sep 30 21:32:21 compute-0 ovn_controller[94912]: 2025-09-30T21:32:21Z|00331|binding|INFO|Setting lport c5c9974c-d1b4-4d55-b299-5bd39f80dba6 up in Southbound
Sep 30 21:32:21 compute-0 nova_compute[192810]: 2025-09-30 21:32:21.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:21 compute-0 systemd[1]: Started Virtual Machine qemu-41-instance-00000056.
Sep 30 21:32:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:21.860 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[8e6a7e0e-e7fa-4812-8bf3-95e659ac764a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:32:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:21.884 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[38f0589f-ef8e-4a1e-b890-5f001fdd40d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:32:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:21.890 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[b15ba192-e4d3-44e4-8fdd-052a4f6e8b90]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:32:21 compute-0 NetworkManager[51733]: <info>  [1759267941.8909] manager: (tapef8c2eaa-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/156)
Sep 30 21:32:21 compute-0 systemd-udevd[233527]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:32:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:21.918 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[f7ee7c18-beff-436d-9af6-e1d5066f9058]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:32:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:21.921 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[e8b68572-7560-444b-bb9c-e1ab3fb0f1b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:32:21 compute-0 NetworkManager[51733]: <info>  [1759267941.9406] device (tapef8c2eaa-a0): carrier: link connected
Sep 30 21:32:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:21.945 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[45a0a288-8af0-45cb-8bb6-5909275cfcfb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:32:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:21.962 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[66d4bd5f-e29d-4ca5-9755-aa3982617578]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapef8c2eaa-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3e:da:29'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 99], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 463756, 'reachable_time': 43787, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233556, 'error': None, 'target': 'ovnmeta-ef8c2eaa-a409-4066-a902-4b34ecd6ca56', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:32:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:21.980 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[cf25d50b-5b30-4afe-b54c-5ba627d72429]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe3e:da29'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 463756, 'tstamp': 463756}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233557, 'error': None, 'target': 'ovnmeta-ef8c2eaa-a409-4066-a902-4b34ecd6ca56', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:32:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:21.995 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[426b8cb7-e3fe-4e6e-a2a4-4d483dc3c7a5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapef8c2eaa-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3e:da:29'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 99], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 463756, 'reachable_time': 43787, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233558, 'error': None, 'target': 'ovnmeta-ef8c2eaa-a409-4066-a902-4b34ecd6ca56', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:32:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:22.023 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[6236a3b4-ae83-48c7-a59a-4235264080c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:32:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:22.077 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[d044d0c8-964d-40c4-98a3-2e8a3c121e63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:32:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:22.078 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapef8c2eaa-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:32:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:22.078 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:32:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:22.079 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapef8c2eaa-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:32:22 compute-0 kernel: tapef8c2eaa-a0: entered promiscuous mode
Sep 30 21:32:22 compute-0 nova_compute[192810]: 2025-09-30 21:32:22.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:22 compute-0 NetworkManager[51733]: <info>  [1759267942.0821] manager: (tapef8c2eaa-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/157)
Sep 30 21:32:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:22.087 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapef8c2eaa-a0, col_values=(('external_ids', {'iface-id': '62495f0c-d7e5-48f2-b38c-bbf4e1f3b705'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:32:22 compute-0 nova_compute[192810]: 2025-09-30 21:32:22.090 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:22 compute-0 ovn_controller[94912]: 2025-09-30T21:32:22Z|00332|binding|INFO|Releasing lport 62495f0c-d7e5-48f2-b38c-bbf4e1f3b705 from this chassis (sb_readonly=0)
Sep 30 21:32:22 compute-0 nova_compute[192810]: 2025-09-30 21:32:22.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:22.093 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ef8c2eaa-a409-4066-a902-4b34ecd6ca56.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ef8c2eaa-a409-4066-a902-4b34ecd6ca56.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:32:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:22.094 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[d4e2511e-d297-4c8b-b630-529eed1a0488]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:32:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:22.095 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:32:22 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:32:22 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:32:22 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-ef8c2eaa-a409-4066-a902-4b34ecd6ca56
Sep 30 21:32:22 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:32:22 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:32:22 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:32:22 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/ef8c2eaa-a409-4066-a902-4b34ecd6ca56.pid.haproxy
Sep 30 21:32:22 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:32:22 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:32:22 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:32:22 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:32:22 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:32:22 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:32:22 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:32:22 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:32:22 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:32:22 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:32:22 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:32:22 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:32:22 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:32:22 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:32:22 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:32:22 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:32:22 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:32:22 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:32:22 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:32:22 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:32:22 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID ef8c2eaa-a409-4066-a902-4b34ecd6ca56
Sep 30 21:32:22 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:32:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:22.095 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ef8c2eaa-a409-4066-a902-4b34ecd6ca56', 'env', 'PROCESS_TAG=haproxy-ef8c2eaa-a409-4066-a902-4b34ecd6ca56', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ef8c2eaa-a409-4066-a902-4b34ecd6ca56.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:32:22 compute-0 nova_compute[192810]: 2025-09-30 21:32:22.101 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:22 compute-0 nova_compute[192810]: 2025-09-30 21:32:22.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:22 compute-0 podman[233590]: 2025-09-30 21:32:22.437351211 +0000 UTC m=+0.028298353 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:32:23 compute-0 podman[233590]: 2025-09-30 21:32:23.029805647 +0000 UTC m=+0.620752769 container create 5971f12b356d56a88372bbf44d5a8a7ff313de934d10ff7f88544bed520a201c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ef8c2eaa-a409-4066-a902-4b34ecd6ca56, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:32:23 compute-0 nova_compute[192810]: 2025-09-30 21:32:23.089 2 DEBUG nova.compute.manager [req-81556ebf-bdd1-4cbd-8a9e-cfc165776365 req-cf55ca72-1a71-4412-96f7-5309a4e90718 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Received event network-vif-plugged-c5c9974c-d1b4-4d55-b299-5bd39f80dba6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:32:23 compute-0 nova_compute[192810]: 2025-09-30 21:32:23.089 2 DEBUG oslo_concurrency.lockutils [req-81556ebf-bdd1-4cbd-8a9e-cfc165776365 req-cf55ca72-1a71-4412-96f7-5309a4e90718 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "f4cfcb8f-ae48-41bf-b39c-597a639f3a68-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:32:23 compute-0 nova_compute[192810]: 2025-09-30 21:32:23.090 2 DEBUG oslo_concurrency.lockutils [req-81556ebf-bdd1-4cbd-8a9e-cfc165776365 req-cf55ca72-1a71-4412-96f7-5309a4e90718 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "f4cfcb8f-ae48-41bf-b39c-597a639f3a68-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:32:23 compute-0 nova_compute[192810]: 2025-09-30 21:32:23.090 2 DEBUG oslo_concurrency.lockutils [req-81556ebf-bdd1-4cbd-8a9e-cfc165776365 req-cf55ca72-1a71-4412-96f7-5309a4e90718 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "f4cfcb8f-ae48-41bf-b39c-597a639f3a68-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:32:23 compute-0 nova_compute[192810]: 2025-09-30 21:32:23.090 2 DEBUG nova.compute.manager [req-81556ebf-bdd1-4cbd-8a9e-cfc165776365 req-cf55ca72-1a71-4412-96f7-5309a4e90718 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Processing event network-vif-plugged-c5c9974c-d1b4-4d55-b299-5bd39f80dba6 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:32:23 compute-0 systemd[1]: Started libpod-conmon-5971f12b356d56a88372bbf44d5a8a7ff313de934d10ff7f88544bed520a201c.scope.
Sep 30 21:32:23 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:32:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f018d9171b43cb8cdbf31dd1252e55d17f8c04c45fb43ff9c938773d2221aae1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:32:23 compute-0 podman[233590]: 2025-09-30 21:32:23.33316623 +0000 UTC m=+0.924113372 container init 5971f12b356d56a88372bbf44d5a8a7ff313de934d10ff7f88544bed520a201c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ef8c2eaa-a409-4066-a902-4b34ecd6ca56, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 21:32:23 compute-0 podman[233590]: 2025-09-30 21:32:23.339105915 +0000 UTC m=+0.930053037 container start 5971f12b356d56a88372bbf44d5a8a7ff313de934d10ff7f88544bed520a201c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ef8c2eaa-a409-4066-a902-4b34ecd6ca56, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Sep 30 21:32:23 compute-0 neutron-haproxy-ovnmeta-ef8c2eaa-a409-4066-a902-4b34ecd6ca56[233612]: [NOTICE]   (233616) : New worker (233618) forked
Sep 30 21:32:23 compute-0 neutron-haproxy-ovnmeta-ef8c2eaa-a409-4066-a902-4b34ecd6ca56[233612]: [NOTICE]   (233616) : Loading success.
Sep 30 21:32:23 compute-0 nova_compute[192810]: 2025-09-30 21:32:23.421 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267943.4204793, f4cfcb8f-ae48-41bf-b39c-597a639f3a68 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:32:23 compute-0 nova_compute[192810]: 2025-09-30 21:32:23.421 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] VM Started (Lifecycle Event)
Sep 30 21:32:23 compute-0 nova_compute[192810]: 2025-09-30 21:32:23.424 2 DEBUG nova.compute.manager [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:32:23 compute-0 nova_compute[192810]: 2025-09-30 21:32:23.428 2 DEBUG nova.virt.libvirt.driver [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:32:23 compute-0 nova_compute[192810]: 2025-09-30 21:32:23.431 2 INFO nova.virt.libvirt.driver [-] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Instance spawned successfully.
Sep 30 21:32:23 compute-0 nova_compute[192810]: 2025-09-30 21:32:23.432 2 DEBUG nova.virt.libvirt.driver [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:32:23 compute-0 nova_compute[192810]: 2025-09-30 21:32:23.452 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:32:23 compute-0 nova_compute[192810]: 2025-09-30 21:32:23.458 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:32:23 compute-0 nova_compute[192810]: 2025-09-30 21:32:23.461 2 DEBUG nova.virt.libvirt.driver [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:32:23 compute-0 nova_compute[192810]: 2025-09-30 21:32:23.462 2 DEBUG nova.virt.libvirt.driver [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:32:23 compute-0 nova_compute[192810]: 2025-09-30 21:32:23.462 2 DEBUG nova.virt.libvirt.driver [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:32:23 compute-0 nova_compute[192810]: 2025-09-30 21:32:23.463 2 DEBUG nova.virt.libvirt.driver [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:32:23 compute-0 nova_compute[192810]: 2025-09-30 21:32:23.463 2 DEBUG nova.virt.libvirt.driver [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:32:23 compute-0 nova_compute[192810]: 2025-09-30 21:32:23.464 2 DEBUG nova.virt.libvirt.driver [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:32:23 compute-0 nova_compute[192810]: 2025-09-30 21:32:23.493 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:32:23 compute-0 nova_compute[192810]: 2025-09-30 21:32:23.494 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267943.4240816, f4cfcb8f-ae48-41bf-b39c-597a639f3a68 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:32:23 compute-0 nova_compute[192810]: 2025-09-30 21:32:23.494 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] VM Paused (Lifecycle Event)
Sep 30 21:32:23 compute-0 nova_compute[192810]: 2025-09-30 21:32:23.524 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:32:23 compute-0 nova_compute[192810]: 2025-09-30 21:32:23.527 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267943.427925, f4cfcb8f-ae48-41bf-b39c-597a639f3a68 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:32:23 compute-0 nova_compute[192810]: 2025-09-30 21:32:23.527 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] VM Resumed (Lifecycle Event)
Sep 30 21:32:23 compute-0 nova_compute[192810]: 2025-09-30 21:32:23.551 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:32:23 compute-0 nova_compute[192810]: 2025-09-30 21:32:23.552 2 INFO nova.compute.manager [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Took 5.99 seconds to spawn the instance on the hypervisor.
Sep 30 21:32:23 compute-0 nova_compute[192810]: 2025-09-30 21:32:23.553 2 DEBUG nova.compute.manager [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:32:23 compute-0 nova_compute[192810]: 2025-09-30 21:32:23.557 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:32:23 compute-0 nova_compute[192810]: 2025-09-30 21:32:23.582 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:32:23 compute-0 nova_compute[192810]: 2025-09-30 21:32:23.632 2 INFO nova.compute.manager [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Took 6.63 seconds to build instance.
Sep 30 21:32:23 compute-0 nova_compute[192810]: 2025-09-30 21:32:23.659 2 DEBUG nova.network.neutron [req-4a513257-7117-49ee-a801-a8f38f319789 req-b62eb758-289d-435c-b91c-8d57179abfa1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Updated VIF entry in instance network info cache for port c5c9974c-d1b4-4d55-b299-5bd39f80dba6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:32:23 compute-0 nova_compute[192810]: 2025-09-30 21:32:23.660 2 DEBUG nova.network.neutron [req-4a513257-7117-49ee-a801-a8f38f319789 req-b62eb758-289d-435c-b91c-8d57179abfa1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Updating instance_info_cache with network_info: [{"id": "c5c9974c-d1b4-4d55-b299-5bd39f80dba6", "address": "fa:16:3e:e6:b3:a9", "network": {"id": "ef8c2eaa-a409-4066-a902-4b34ecd6ca56", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1075020687-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f2302f440e044e8be2476ceca24a473", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5c9974c-d1", "ovs_interfaceid": "c5c9974c-d1b4-4d55-b299-5bd39f80dba6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:32:23 compute-0 nova_compute[192810]: 2025-09-30 21:32:23.673 2 DEBUG oslo_concurrency.lockutils [None req-574c7f4e-c0df-458d-908b-dda104de1ce5 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Lock "f4cfcb8f-ae48-41bf-b39c-597a639f3a68" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.753s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:32:23 compute-0 nova_compute[192810]: 2025-09-30 21:32:23.677 2 DEBUG oslo_concurrency.lockutils [req-4a513257-7117-49ee-a801-a8f38f319789 req-b62eb758-289d-435c-b91c-8d57179abfa1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-f4cfcb8f-ae48-41bf-b39c-597a639f3a68" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:32:25 compute-0 nova_compute[192810]: 2025-09-30 21:32:25.537 2 DEBUG nova.compute.manager [req-9b34f2f8-78e4-492f-97a6-bb6359420794 req-1fcfcff0-0517-4a3e-9029-b1a64ad57d7d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Received event network-vif-plugged-c5c9974c-d1b4-4d55-b299-5bd39f80dba6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:32:25 compute-0 nova_compute[192810]: 2025-09-30 21:32:25.537 2 DEBUG oslo_concurrency.lockutils [req-9b34f2f8-78e4-492f-97a6-bb6359420794 req-1fcfcff0-0517-4a3e-9029-b1a64ad57d7d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "f4cfcb8f-ae48-41bf-b39c-597a639f3a68-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:32:25 compute-0 nova_compute[192810]: 2025-09-30 21:32:25.538 2 DEBUG oslo_concurrency.lockutils [req-9b34f2f8-78e4-492f-97a6-bb6359420794 req-1fcfcff0-0517-4a3e-9029-b1a64ad57d7d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "f4cfcb8f-ae48-41bf-b39c-597a639f3a68-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:32:25 compute-0 nova_compute[192810]: 2025-09-30 21:32:25.539 2 DEBUG oslo_concurrency.lockutils [req-9b34f2f8-78e4-492f-97a6-bb6359420794 req-1fcfcff0-0517-4a3e-9029-b1a64ad57d7d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "f4cfcb8f-ae48-41bf-b39c-597a639f3a68-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:32:25 compute-0 nova_compute[192810]: 2025-09-30 21:32:25.539 2 DEBUG nova.compute.manager [req-9b34f2f8-78e4-492f-97a6-bb6359420794 req-1fcfcff0-0517-4a3e-9029-b1a64ad57d7d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] No waiting events found dispatching network-vif-plugged-c5c9974c-d1b4-4d55-b299-5bd39f80dba6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:32:25 compute-0 nova_compute[192810]: 2025-09-30 21:32:25.539 2 WARNING nova.compute.manager [req-9b34f2f8-78e4-492f-97a6-bb6359420794 req-1fcfcff0-0517-4a3e-9029-b1a64ad57d7d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Received unexpected event network-vif-plugged-c5c9974c-d1b4-4d55-b299-5bd39f80dba6 for instance with vm_state active and task_state None.
Sep 30 21:32:25 compute-0 nova_compute[192810]: 2025-09-30 21:32:25.791 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:26 compute-0 sshd-session[233627]: Invalid user test from 45.81.23.80 port 44798
Sep 30 21:32:26 compute-0 sshd-session[233627]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:32:26 compute-0 sshd-session[233627]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=45.81.23.80
Sep 30 21:32:26 compute-0 nova_compute[192810]: 2025-09-30 21:32:26.935 2 DEBUG nova.compute.manager [req-b47d8854-6461-4399-b041-8ada354f9ad0 req-9e1d7705-7e3b-4e9c-98e3-94e033fc05f4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Received event network-changed-c5c9974c-d1b4-4d55-b299-5bd39f80dba6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:32:26 compute-0 nova_compute[192810]: 2025-09-30 21:32:26.936 2 DEBUG nova.compute.manager [req-b47d8854-6461-4399-b041-8ada354f9ad0 req-9e1d7705-7e3b-4e9c-98e3-94e033fc05f4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Refreshing instance network info cache due to event network-changed-c5c9974c-d1b4-4d55-b299-5bd39f80dba6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:32:26 compute-0 nova_compute[192810]: 2025-09-30 21:32:26.936 2 DEBUG oslo_concurrency.lockutils [req-b47d8854-6461-4399-b041-8ada354f9ad0 req-9e1d7705-7e3b-4e9c-98e3-94e033fc05f4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-f4cfcb8f-ae48-41bf-b39c-597a639f3a68" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:32:26 compute-0 nova_compute[192810]: 2025-09-30 21:32:26.937 2 DEBUG oslo_concurrency.lockutils [req-b47d8854-6461-4399-b041-8ada354f9ad0 req-9e1d7705-7e3b-4e9c-98e3-94e033fc05f4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-f4cfcb8f-ae48-41bf-b39c-597a639f3a68" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:32:26 compute-0 nova_compute[192810]: 2025-09-30 21:32:26.937 2 DEBUG nova.network.neutron [req-b47d8854-6461-4399-b041-8ada354f9ad0 req-9e1d7705-7e3b-4e9c-98e3-94e033fc05f4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Refreshing network info cache for port c5c9974c-d1b4-4d55-b299-5bd39f80dba6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:32:27 compute-0 nova_compute[192810]: 2025-09-30 21:32:27.351 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:28 compute-0 sshd-session[233627]: Failed password for invalid user test from 45.81.23.80 port 44798 ssh2
Sep 30 21:32:28 compute-0 nova_compute[192810]: 2025-09-30 21:32:28.207 2 DEBUG nova.compute.manager [req-93a8f9bd-7c37-49c8-865c-3d9aaa5fe6e3 req-cba75cc5-c212-46cc-aa20-968fa2924f7d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Received event network-changed-c5c9974c-d1b4-4d55-b299-5bd39f80dba6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:32:28 compute-0 nova_compute[192810]: 2025-09-30 21:32:28.208 2 DEBUG nova.compute.manager [req-93a8f9bd-7c37-49c8-865c-3d9aaa5fe6e3 req-cba75cc5-c212-46cc-aa20-968fa2924f7d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Refreshing instance network info cache due to event network-changed-c5c9974c-d1b4-4d55-b299-5bd39f80dba6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:32:28 compute-0 nova_compute[192810]: 2025-09-30 21:32:28.209 2 DEBUG oslo_concurrency.lockutils [req-93a8f9bd-7c37-49c8-865c-3d9aaa5fe6e3 req-cba75cc5-c212-46cc-aa20-968fa2924f7d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-f4cfcb8f-ae48-41bf-b39c-597a639f3a68" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:32:28 compute-0 sshd-session[233627]: Received disconnect from 45.81.23.80 port 44798:11: Bye Bye [preauth]
Sep 30 21:32:28 compute-0 sshd-session[233627]: Disconnected from invalid user test 45.81.23.80 port 44798 [preauth]
Sep 30 21:32:29 compute-0 nova_compute[192810]: 2025-09-30 21:32:29.028 2 DEBUG nova.network.neutron [req-b47d8854-6461-4399-b041-8ada354f9ad0 req-9e1d7705-7e3b-4e9c-98e3-94e033fc05f4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Updated VIF entry in instance network info cache for port c5c9974c-d1b4-4d55-b299-5bd39f80dba6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:32:29 compute-0 nova_compute[192810]: 2025-09-30 21:32:29.029 2 DEBUG nova.network.neutron [req-b47d8854-6461-4399-b041-8ada354f9ad0 req-9e1d7705-7e3b-4e9c-98e3-94e033fc05f4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Updating instance_info_cache with network_info: [{"id": "c5c9974c-d1b4-4d55-b299-5bd39f80dba6", "address": "fa:16:3e:e6:b3:a9", "network": {"id": "ef8c2eaa-a409-4066-a902-4b34ecd6ca56", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1075020687-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f2302f440e044e8be2476ceca24a473", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5c9974c-d1", "ovs_interfaceid": "c5c9974c-d1b4-4d55-b299-5bd39f80dba6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:32:29 compute-0 nova_compute[192810]: 2025-09-30 21:32:29.054 2 DEBUG oslo_concurrency.lockutils [req-b47d8854-6461-4399-b041-8ada354f9ad0 req-9e1d7705-7e3b-4e9c-98e3-94e033fc05f4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-f4cfcb8f-ae48-41bf-b39c-597a639f3a68" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:32:29 compute-0 nova_compute[192810]: 2025-09-30 21:32:29.055 2 DEBUG oslo_concurrency.lockutils [req-93a8f9bd-7c37-49c8-865c-3d9aaa5fe6e3 req-cba75cc5-c212-46cc-aa20-968fa2924f7d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-f4cfcb8f-ae48-41bf-b39c-597a639f3a68" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:32:29 compute-0 nova_compute[192810]: 2025-09-30 21:32:29.055 2 DEBUG nova.network.neutron [req-93a8f9bd-7c37-49c8-865c-3d9aaa5fe6e3 req-cba75cc5-c212-46cc-aa20-968fa2924f7d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Refreshing network info cache for port c5c9974c-d1b4-4d55-b299-5bd39f80dba6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:32:30 compute-0 podman[233629]: 2025-09-30 21:32:30.338473727 +0000 UTC m=+0.067542463 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Sep 30 21:32:30 compute-0 podman[233630]: 2025-09-30 21:32:30.33857308 +0000 UTC m=+0.066910179 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., name=ubi9-minimal, version=9.6, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_id=edpm, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41)
Sep 30 21:32:30 compute-0 nova_compute[192810]: 2025-09-30 21:32:30.778 2 DEBUG nova.network.neutron [req-93a8f9bd-7c37-49c8-865c-3d9aaa5fe6e3 req-cba75cc5-c212-46cc-aa20-968fa2924f7d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Updated VIF entry in instance network info cache for port c5c9974c-d1b4-4d55-b299-5bd39f80dba6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:32:30 compute-0 nova_compute[192810]: 2025-09-30 21:32:30.778 2 DEBUG nova.network.neutron [req-93a8f9bd-7c37-49c8-865c-3d9aaa5fe6e3 req-cba75cc5-c212-46cc-aa20-968fa2924f7d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Updating instance_info_cache with network_info: [{"id": "c5c9974c-d1b4-4d55-b299-5bd39f80dba6", "address": "fa:16:3e:e6:b3:a9", "network": {"id": "ef8c2eaa-a409-4066-a902-4b34ecd6ca56", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1075020687-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f2302f440e044e8be2476ceca24a473", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5c9974c-d1", "ovs_interfaceid": "c5c9974c-d1b4-4d55-b299-5bd39f80dba6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:32:30 compute-0 nova_compute[192810]: 2025-09-30 21:32:30.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:30 compute-0 nova_compute[192810]: 2025-09-30 21:32:30.794 2 DEBUG oslo_concurrency.lockutils [req-93a8f9bd-7c37-49c8-865c-3d9aaa5fe6e3 req-cba75cc5-c212-46cc-aa20-968fa2924f7d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-f4cfcb8f-ae48-41bf-b39c-597a639f3a68" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:32:32 compute-0 nova_compute[192810]: 2025-09-30 21:32:32.353 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:35 compute-0 nova_compute[192810]: 2025-09-30 21:32:35.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:37 compute-0 podman[233684]: 2025-09-30 21:32:37.321286114 +0000 UTC m=+0.051809039 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 21:32:37 compute-0 podman[233686]: 2025-09-30 21:32:37.323095028 +0000 UTC m=+0.047106543 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:32:37 compute-0 podman[233685]: 2025-09-30 21:32:37.338215428 +0000 UTC m=+0.063283559 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923)
Sep 30 21:32:37 compute-0 nova_compute[192810]: 2025-09-30 21:32:37.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:38.737 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:32:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:38.738 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:32:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:38.739 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:32:38 compute-0 ovn_controller[94912]: 2025-09-30T21:32:38Z|00034|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e6:b3:a9 10.100.0.9
Sep 30 21:32:38 compute-0 ovn_controller[94912]: 2025-09-30T21:32:38Z|00035|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e6:b3:a9 10.100.0.9
Sep 30 21:32:40 compute-0 nova_compute[192810]: 2025-09-30 21:32:40.797 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:42 compute-0 nova_compute[192810]: 2025-09-30 21:32:42.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.908 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68', 'name': 'tempest-SecurityGroupsTestJSON-server-2099014552', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000056', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9f2302f440e044e8be2476ceca24a473', 'user_id': '002d2f39090c4281bae7f4aba96cef9f', 'hostId': 'e34f1991d2ab58e10cd16222df4b062d46d987e683d8ba935040bd54', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.909 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.929 12 DEBUG ceilometer.compute.pollsters [-] f4cfcb8f-ae48-41bf-b39c-597a639f3a68/disk.device.write.bytes volume: 72851456 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.930 12 DEBUG ceilometer.compute.pollsters [-] f4cfcb8f-ae48-41bf-b39c-597a639f3a68/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6bb35b42-ba58-4e12-92b9-0f32ecd80144', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72851456, 'user_id': '002d2f39090c4281bae7f4aba96cef9f', 'user_name': None, 'project_id': '9f2302f440e044e8be2476ceca24a473', 'project_name': None, 'resource_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68-vda', 'timestamp': '2025-09-30T21:32:43.909726', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-2099014552', 'name': 'instance-00000056', 'instance_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68', 'instance_type': 'm1.nano', 'host': 'e34f1991d2ab58e10cd16222df4b062d46d987e683d8ba935040bd54', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '000d8092-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4659.597224991, 'message_signature': 'f263e4e28dcf2de6fe2b1d136da2dcf54c86b8ee22714cc53e6375f15652f72e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '002d2f39090c4281bae7f4aba96cef9f', 'user_name': None, 'project_id': '9f2302f440e044e8be2476ceca24a473', 'project_name': None, 'resource_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68-sda', 'timestamp': '2025-09-30T21:32:43.909726', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-2099014552', 'name': 'instance-00000056', 'instance_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68', 'instance_type': 'm1.nano', 'host': 'e34f1991d2ab58e10cd16222df4b062d46d987e683d8ba935040bd54', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '000d8e2a-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4659.597224991, 'message_signature': '3c400f8c386199496468bad3bfa52053d36a7015387dc0ac84d3ad601a75ef94'}]}, 'timestamp': '2025-09-30 21:32:43.930693', '_unique_id': '142690987cb44637a5329e065263f2d3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.932 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.933 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.935 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for f4cfcb8f-ae48-41bf-b39c-597a639f3a68 / tapc5c9974c-d1 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.935 12 DEBUG ceilometer.compute.pollsters [-] f4cfcb8f-ae48-41bf-b39c-597a639f3a68/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '137d4d3b-8930-4d32-96d9-dd8f43c11e6a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '002d2f39090c4281bae7f4aba96cef9f', 'user_name': None, 'project_id': '9f2302f440e044e8be2476ceca24a473', 'project_name': None, 'resource_id': 'instance-00000056-f4cfcb8f-ae48-41bf-b39c-597a639f3a68-tapc5c9974c-d1', 'timestamp': '2025-09-30T21:32:43.933297', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-2099014552', 'name': 'tapc5c9974c-d1', 'instance_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68', 'instance_type': 'm1.nano', 'host': 'e34f1991d2ab58e10cd16222df4b062d46d987e683d8ba935040bd54', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e6:b3:a9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc5c9974c-d1'}, 'message_id': '000e56e8-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4659.620805968, 'message_signature': '48d72607aa2a67dba213ddb8287b549863e4f95c366f2f6aed72a4864e789c2e'}]}, 'timestamp': '2025-09-30 21:32:43.935847', '_unique_id': 'd1f5eb23f86e4798b47a91eeb3c94640'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.936 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.937 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.937 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.937 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-SecurityGroupsTestJSON-server-2099014552>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-SecurityGroupsTestJSON-server-2099014552>]
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.937 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.937 12 DEBUG ceilometer.compute.pollsters [-] f4cfcb8f-ae48-41bf-b39c-597a639f3a68/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b9b4a05b-5799-4314-8b56-79f3b54916be', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '002d2f39090c4281bae7f4aba96cef9f', 'user_name': None, 'project_id': '9f2302f440e044e8be2476ceca24a473', 'project_name': None, 'resource_id': 'instance-00000056-f4cfcb8f-ae48-41bf-b39c-597a639f3a68-tapc5c9974c-d1', 'timestamp': '2025-09-30T21:32:43.937427', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-2099014552', 'name': 'tapc5c9974c-d1', 'instance_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68', 'instance_type': 'm1.nano', 'host': 'e34f1991d2ab58e10cd16222df4b062d46d987e683d8ba935040bd54', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e6:b3:a9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc5c9974c-d1'}, 'message_id': '000e9e64-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4659.620805968, 'message_signature': '73a3ac077f96b2371dbffd7c5e12883287d7b3ee531ef6a11e52483cc08ed380'}]}, 'timestamp': '2025-09-30 21:32:43.937731', '_unique_id': '4072fc872b6441a8afc923597c72e117'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.938 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.939 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.939 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-SecurityGroupsTestJSON-server-2099014552>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-SecurityGroupsTestJSON-server-2099014552>]
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.939 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.939 12 DEBUG ceilometer.compute.pollsters [-] f4cfcb8f-ae48-41bf-b39c-597a639f3a68/disk.device.write.requests volume: 309 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.939 12 DEBUG ceilometer.compute.pollsters [-] f4cfcb8f-ae48-41bf-b39c-597a639f3a68/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e9458ef4-068b-4737-9e4e-40c2a6cea988', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 309, 'user_id': '002d2f39090c4281bae7f4aba96cef9f', 'user_name': None, 'project_id': '9f2302f440e044e8be2476ceca24a473', 'project_name': None, 'resource_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68-vda', 'timestamp': '2025-09-30T21:32:43.939352', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-2099014552', 'name': 'instance-00000056', 'instance_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68', 'instance_type': 'm1.nano', 'host': 'e34f1991d2ab58e10cd16222df4b062d46d987e683d8ba935040bd54', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '000eeac2-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4659.597224991, 'message_signature': '0d2009c3e514b63cdc079901535140873910d63429f9227e421ab7461a3162ea'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '002d2f39090c4281bae7f4aba96cef9f', 'user_name': None, 'project_id': '9f2302f440e044e8be2476ceca24a473', 'project_name': None, 'resource_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68-sda', 'timestamp': '2025-09-30T21:32:43.939352', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-2099014552', 'name': 'instance-00000056', 'instance_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68', 'instance_type': 'm1.nano', 'host': 'e34f1991d2ab58e10cd16222df4b062d46d987e683d8ba935040bd54', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '000ef5b2-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4659.597224991, 'message_signature': '1c1aebed056a9faaa455d3c22c4792a34552dad0eb36c4dadded267b88611433'}]}, 'timestamp': '2025-09-30 21:32:43.939879', '_unique_id': 'bc41d7bde8254713b39eb42c97898425'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.940 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 DEBUG ceilometer.compute.pollsters [-] f4cfcb8f-ae48-41bf-b39c-597a639f3a68/network.incoming.bytes volume: 1562 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd92fc9a5-68b6-43ab-a23c-0cc8ef8c2610', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1562, 'user_id': '002d2f39090c4281bae7f4aba96cef9f', 'user_name': None, 'project_id': '9f2302f440e044e8be2476ceca24a473', 'project_name': None, 'resource_id': 'instance-00000056-f4cfcb8f-ae48-41bf-b39c-597a639f3a68-tapc5c9974c-d1', 'timestamp': '2025-09-30T21:32:43.941100', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-2099014552', 'name': 'tapc5c9974c-d1', 'instance_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68', 'instance_type': 'm1.nano', 'host': 'e34f1991d2ab58e10cd16222df4b062d46d987e683d8ba935040bd54', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e6:b3:a9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc5c9974c-d1'}, 'message_id': '000f2df2-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4659.620805968, 'message_signature': '433252ec43cbc8e0cb71301a96a77bd3e2bdb4abbdecb55af0b0e91f66dd9174'}]}, 'timestamp': '2025-09-30 21:32:43.941334', '_unique_id': '68daa2a91fd5404492b6c061ff611175'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.941 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.942 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.958 12 DEBUG ceilometer.compute.pollsters [-] f4cfcb8f-ae48-41bf-b39c-597a639f3a68/cpu volume: 13650000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1fdc4f6f-584f-43a3-a194-dd95f2819d71', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13650000000, 'user_id': '002d2f39090c4281bae7f4aba96cef9f', 'user_name': None, 'project_id': '9f2302f440e044e8be2476ceca24a473', 'project_name': None, 'resource_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68', 'timestamp': '2025-09-30T21:32:43.942380', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-2099014552', 'name': 'instance-00000056', 'instance_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68', 'instance_type': 'm1.nano', 'host': 'e34f1991d2ab58e10cd16222df4b062d46d987e683d8ba935040bd54', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '0011e3e4-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4659.646088487, 'message_signature': '42168d3550c8a3ac293c9af2ebc3c02b462f67490ecaed6fe27752b45bd93478'}]}, 'timestamp': '2025-09-30 21:32:43.959150', '_unique_id': 'eb7f0feb0f6f47019c4c7ff35129dc83'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.959 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.960 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 DEBUG ceilometer.compute.pollsters [-] f4cfcb8f-ae48-41bf-b39c-597a639f3a68/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '02310668-5543-4d5d-aea1-b75f28d0af95', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '002d2f39090c4281bae7f4aba96cef9f', 'user_name': None, 'project_id': '9f2302f440e044e8be2476ceca24a473', 'project_name': None, 'resource_id': 'instance-00000056-f4cfcb8f-ae48-41bf-b39c-597a639f3a68-tapc5c9974c-d1', 'timestamp': '2025-09-30T21:32:43.961049', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-2099014552', 'name': 'tapc5c9974c-d1', 'instance_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68', 'instance_type': 'm1.nano', 'host': 'e34f1991d2ab58e10cd16222df4b062d46d987e683d8ba935040bd54', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e6:b3:a9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc5c9974c-d1'}, 'message_id': '00123b0a-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4659.620805968, 'message_signature': '5ff407e5c06f47a7daa3060040d50fa48f66a3b69020e340973cfc823032d7d1'}]}, 'timestamp': '2025-09-30 21:32:43.961338', '_unique_id': '7800afcadefe458a88d471a4697ad5a6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.961 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.962 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.962 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.962 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-SecurityGroupsTestJSON-server-2099014552>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-SecurityGroupsTestJSON-server-2099014552>]
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.963 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.975 12 DEBUG ceilometer.compute.pollsters [-] f4cfcb8f-ae48-41bf-b39c-597a639f3a68/disk.device.usage volume: 29884416 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.975 12 DEBUG ceilometer.compute.pollsters [-] f4cfcb8f-ae48-41bf-b39c-597a639f3a68/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0d420372-5116-4ccc-bd49-22111c29d198', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29884416, 'user_id': '002d2f39090c4281bae7f4aba96cef9f', 'user_name': None, 'project_id': '9f2302f440e044e8be2476ceca24a473', 'project_name': None, 'resource_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68-vda', 'timestamp': '2025-09-30T21:32:43.963215', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-2099014552', 'name': 'instance-00000056', 'instance_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68', 'instance_type': 'm1.nano', 'host': 'e34f1991d2ab58e10cd16222df4b062d46d987e683d8ba935040bd54', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '00147410-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4659.65071514, 'message_signature': '042cf12b80480f099dc0a1300b527ee9511535388a4ff7180afa27731d90f085'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '002d2f39090c4281bae7f4aba96cef9f', 'user_name': None, 'project_id': '9f2302f440e044e8be2476ceca24a473', 'project_name': None, 'resource_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68-sda', 'timestamp': '2025-09-30T21:32:43.963215', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-2099014552', 'name': 'instance-00000056', 'instance_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68', 'instance_type': 'm1.nano', 'host': 'e34f1991d2ab58e10cd16222df4b062d46d987e683d8ba935040bd54', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '00147f6e-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4659.65071514, 'message_signature': '99d6ae299b647e56976706441bcbf960ce6e922f458c822907af2a42cb4ba8c0'}]}, 'timestamp': '2025-09-30 21:32:43.976183', '_unique_id': 'c5f585942b014f81bcc61e842f9457d3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.976 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.977 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.978 12 DEBUG ceilometer.compute.pollsters [-] f4cfcb8f-ae48-41bf-b39c-597a639f3a68/disk.device.read.bytes volume: 30743040 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.978 12 DEBUG ceilometer.compute.pollsters [-] f4cfcb8f-ae48-41bf-b39c-597a639f3a68/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd86092b5-bb5b-41aa-906f-52228a75cb5e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30743040, 'user_id': '002d2f39090c4281bae7f4aba96cef9f', 'user_name': None, 'project_id': '9f2302f440e044e8be2476ceca24a473', 'project_name': None, 'resource_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68-vda', 'timestamp': '2025-09-30T21:32:43.978006', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-2099014552', 'name': 'instance-00000056', 'instance_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68', 'instance_type': 'm1.nano', 'host': 'e34f1991d2ab58e10cd16222df4b062d46d987e683d8ba935040bd54', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0014d0cc-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4659.597224991, 'message_signature': 'e046a848c6b2383a948b4389a461f7b9e3bec3bcdcf4efab5e82c2e16e5d2a4d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '002d2f39090c4281bae7f4aba96cef9f', 'user_name': None, 'project_id': '9f2302f440e044e8be2476ceca24a473', 'project_name': None, 'resource_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68-sda', 'timestamp': '2025-09-30T21:32:43.978006', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-2099014552', 'name': 'instance-00000056', 'instance_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68', 'instance_type': 'm1.nano', 'host': 'e34f1991d2ab58e10cd16222df4b062d46d987e683d8ba935040bd54', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0014d9d2-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4659.597224991, 'message_signature': '5cc2c06794883bc2ac0843a2c4bd05823b2c16b8274e5b95e7b70439e2398751'}]}, 'timestamp': '2025-09-30 21:32:43.978486', '_unique_id': '26c3ae58c8c748a097e0c910a798e37d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:32:43 compute-0 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.979 12 DEBUG ceilometer.compute.pollsters [-] f4cfcb8f-ae48-41bf-b39c-597a639f3a68/network.outgoing.packets volume: 12 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '94c06c8c-243c-4acc-ae64-24fd8ab3678b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 12, 'user_id': '002d2f39090c4281bae7f4aba96cef9f', 'user_name': None, 'project_id': '9f2302f440e044e8be2476ceca24a473', 'project_name': None, 'resource_id': 'instance-00000056-f4cfcb8f-ae48-41bf-b39c-597a639f3a68-tapc5c9974c-d1', 'timestamp': '2025-09-30T21:32:43.979940', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-2099014552', 'name': 'tapc5c9974c-d1', 'instance_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68', 'instance_type': 'm1.nano', 'host': 'e34f1991d2ab58e10cd16222df4b062d46d987e683d8ba935040bd54', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e6:b3:a9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc5c9974c-d1'}, 'message_id': '00151cbc-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4659.620805968, 'message_signature': 'cb6c2c56eb0584f90d455b2b78646270ae0656870a03a315fd5bb8e0a0ba17d4'}]}, 'timestamp': '2025-09-30 21:32:43.980214', '_unique_id': 'c7c8fb31e1f24599a4d44818fb8452e4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.980 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.981 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.981 12 DEBUG ceilometer.compute.pollsters [-] f4cfcb8f-ae48-41bf-b39c-597a639f3a68/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b8f790f8-0e08-48fb-807d-814f7256144d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '002d2f39090c4281bae7f4aba96cef9f', 'user_name': None, 'project_id': '9f2302f440e044e8be2476ceca24a473', 'project_name': None, 'resource_id': 'instance-00000056-f4cfcb8f-ae48-41bf-b39c-597a639f3a68-tapc5c9974c-d1', 'timestamp': '2025-09-30T21:32:43.981658', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-2099014552', 'name': 'tapc5c9974c-d1', 'instance_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68', 'instance_type': 'm1.nano', 'host': 'e34f1991d2ab58e10cd16222df4b062d46d987e683d8ba935040bd54', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e6:b3:a9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc5c9974c-d1'}, 'message_id': '00155f88-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4659.620805968, 'message_signature': '820ef192562ff2fd03475193b3af833b19546a74b547022835872d36b341185a'}]}, 'timestamp': '2025-09-30 21:32:43.981922', '_unique_id': '3eb80167466046b58c20d8752bb6e2e3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.982 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.983 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.983 12 DEBUG ceilometer.compute.pollsters [-] f4cfcb8f-ae48-41bf-b39c-597a639f3a68/network.outgoing.bytes volume: 1396 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aa8b3696-330d-49af-b236-2e93b05b1357', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1396, 'user_id': '002d2f39090c4281bae7f4aba96cef9f', 'user_name': None, 'project_id': '9f2302f440e044e8be2476ceca24a473', 'project_name': None, 'resource_id': 'instance-00000056-f4cfcb8f-ae48-41bf-b39c-597a639f3a68-tapc5c9974c-d1', 'timestamp': '2025-09-30T21:32:43.983384', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-2099014552', 'name': 'tapc5c9974c-d1', 'instance_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68', 'instance_type': 'm1.nano', 'host': 'e34f1991d2ab58e10cd16222df4b062d46d987e683d8ba935040bd54', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e6:b3:a9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc5c9974c-d1'}, 'message_id': '0015a2e0-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4659.620805968, 'message_signature': '22fba1caee5e64b1e774fcc2596f8fe0736a7fe5d983aca1ca464f1238d97475'}]}, 'timestamp': '2025-09-30 21:32:43.983673', '_unique_id': '8871c08bb3a8453d94506490a575e056'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.984 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 DEBUG ceilometer.compute.pollsters [-] f4cfcb8f-ae48-41bf-b39c-597a639f3a68/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'db1d1d0a-9eac-4a1a-9ee7-cbe1ceb0e477', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '002d2f39090c4281bae7f4aba96cef9f', 'user_name': None, 'project_id': '9f2302f440e044e8be2476ceca24a473', 'project_name': None, 'resource_id': 'instance-00000056-f4cfcb8f-ae48-41bf-b39c-597a639f3a68-tapc5c9974c-d1', 'timestamp': '2025-09-30T21:32:43.985036', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-2099014552', 'name': 'tapc5c9974c-d1', 'instance_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68', 'instance_type': 'm1.nano', 'host': 'e34f1991d2ab58e10cd16222df4b062d46d987e683d8ba935040bd54', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e6:b3:a9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc5c9974c-d1'}, 'message_id': '0015e368-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4659.620805968, 'message_signature': '1f0d0e5fd1b651b0dabf559e5a6c0a052f02f2b30dd98cafc98fdc2e83db7914'}]}, 'timestamp': '2025-09-30 21:32:43.985297', '_unique_id': '6db114024da94400a1632f132eb7e983'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.985 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.986 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.986 12 DEBUG ceilometer.compute.pollsters [-] f4cfcb8f-ae48-41bf-b39c-597a639f3a68/disk.device.read.requests volume: 1107 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.986 12 DEBUG ceilometer.compute.pollsters [-] f4cfcb8f-ae48-41bf-b39c-597a639f3a68/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bc8ec57b-3541-4b2f-9ee7-ae688c5aed1d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1107, 'user_id': '002d2f39090c4281bae7f4aba96cef9f', 'user_name': None, 'project_id': '9f2302f440e044e8be2476ceca24a473', 'project_name': None, 'resource_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68-vda', 'timestamp': '2025-09-30T21:32:43.986678', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-2099014552', 'name': 'instance-00000056', 'instance_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68', 'instance_type': 'm1.nano', 'host': 'e34f1991d2ab58e10cd16222df4b062d46d987e683d8ba935040bd54', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0016247c-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4659.597224991, 'message_signature': 'b9c5bc6276dd6a18f2315760ea8b4d26c3f02ba30ba08e9ab86af8a3bfe92018'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '002d2f39090c4281bae7f4aba96cef9f', 'user_name': None, 'project_id': '9f2302f440e044e8be2476ceca24a473', 'project_name': 
None, 'resource_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68-sda', 'timestamp': '2025-09-30T21:32:43.986678', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-2099014552', 'name': 'instance-00000056', 'instance_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68', 'instance_type': 'm1.nano', 'host': 'e34f1991d2ab58e10cd16222df4b062d46d987e683d8ba935040bd54', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '00162d8c-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4659.597224991, 'message_signature': 'de31c03e524b27e79fc306a61735c01e524e73c021c2fbd7edd017d36bb01899'}]}, 'timestamp': '2025-09-30 21:32:43.987183', '_unique_id': 'a2896ed3745c4f3d90672ff4d6d0773a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.987 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.988 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.988 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.988 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-SecurityGroupsTestJSON-server-2099014552>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-SecurityGroupsTestJSON-server-2099014552>]
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.988 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 DEBUG ceilometer.compute.pollsters [-] f4cfcb8f-ae48-41bf-b39c-597a639f3a68/disk.device.write.latency volume: 2052714671 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 DEBUG ceilometer.compute.pollsters [-] f4cfcb8f-ae48-41bf-b39c-597a639f3a68/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '00a7a531-26f2-4d53-8c9c-6c41ac634c10', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2052714671, 'user_id': '002d2f39090c4281bae7f4aba96cef9f', 'user_name': None, 'project_id': '9f2302f440e044e8be2476ceca24a473', 'project_name': None, 'resource_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68-vda', 'timestamp': '2025-09-30T21:32:43.989034', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-2099014552', 'name': 'instance-00000056', 'instance_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68', 'instance_type': 'm1.nano', 'host': 'e34f1991d2ab58e10cd16222df4b062d46d987e683d8ba935040bd54', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '00167f3a-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4659.597224991, 'message_signature': '7dde2d028dbab5e29cc71cc62d24ba568864bf0709f10157e17cfe530712588c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '002d2f39090c4281bae7f4aba96cef9f', 'user_name': None, 'project_id': '9f2302f440e044e8be2476ceca24a473', 'project_name': None, 
'resource_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68-sda', 'timestamp': '2025-09-30T21:32:43.989034', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-2099014552', 'name': 'instance-00000056', 'instance_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68', 'instance_type': 'm1.nano', 'host': 'e34f1991d2ab58e10cd16222df4b062d46d987e683d8ba935040bd54', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0016884a-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4659.597224991, 'message_signature': '9578976b85df940d1d94394d4d43fe3cb58bd87bfb973a0eaf571c076cf6d99f'}]}, 'timestamp': '2025-09-30 21:32:43.989505', '_unique_id': '0b62c3e132bd49e39cd177f5cb3dabfc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.989 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.990 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.990 12 DEBUG ceilometer.compute.pollsters [-] f4cfcb8f-ae48-41bf-b39c-597a639f3a68/network.incoming.packets volume: 10 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '07ba2a4d-e25e-450c-8c25-1d61d9eb805f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 10, 'user_id': '002d2f39090c4281bae7f4aba96cef9f', 'user_name': None, 'project_id': '9f2302f440e044e8be2476ceca24a473', 'project_name': None, 'resource_id': 'instance-00000056-f4cfcb8f-ae48-41bf-b39c-597a639f3a68-tapc5c9974c-d1', 'timestamp': '2025-09-30T21:32:43.990951', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-2099014552', 'name': 'tapc5c9974c-d1', 'instance_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68', 'instance_type': 'm1.nano', 'host': 'e34f1991d2ab58e10cd16222df4b062d46d987e683d8ba935040bd54', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e6:b3:a9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc5c9974c-d1'}, 'message_id': '0016ca80-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4659.620805968, 'message_signature': 'a5e5b9fff07737ab3418e9d421be29161e0868b87ee0943b6239f7b4db8c0e46'}]}, 'timestamp': '2025-09-30 21:32:43.991215', '_unique_id': 'e93726bf42b74fe48ae3d879815bce47'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.991 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.992 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.992 12 DEBUG ceilometer.compute.pollsters [-] f4cfcb8f-ae48-41bf-b39c-597a639f3a68/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.992 12 DEBUG ceilometer.compute.pollsters [-] f4cfcb8f-ae48-41bf-b39c-597a639f3a68/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '39ac9310-40bd-4f54-976d-ce9bcb9d3b6e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '002d2f39090c4281bae7f4aba96cef9f', 'user_name': None, 'project_id': '9f2302f440e044e8be2476ceca24a473', 'project_name': None, 'resource_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68-vda', 'timestamp': '2025-09-30T21:32:43.992582', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-2099014552', 'name': 'instance-00000056', 'instance_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68', 'instance_type': 'm1.nano', 'host': 'e34f1991d2ab58e10cd16222df4b062d46d987e683d8ba935040bd54', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '00170a18-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4659.65071514, 'message_signature': 'd6386438a0455ebf496f69568dd69b6f1ef693e229b630748e6c6e33d276a250'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '002d2f39090c4281bae7f4aba96cef9f', 'user_name': None, 'project_id': '9f2302f440e044e8be2476ceca24a473', 'project_name': None, 'resource_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68-sda', 'timestamp': '2025-09-30T21:32:43.992582', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-2099014552', 'name': 'instance-00000056', 'instance_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68', 'instance_type': 'm1.nano', 'host': 'e34f1991d2ab58e10cd16222df4b062d46d987e683d8ba935040bd54', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '00171350-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4659.65071514, 'message_signature': '4a0d8ed70d95481ee643515e0879ef0eb38bb8644374bcfbdb9a496fdda6f26d'}]}, 'timestamp': '2025-09-30 21:32:43.993068', '_unique_id': '08529bc2fd2443ff811e856faf582509'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.993 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.994 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.994 12 DEBUG ceilometer.compute.pollsters [-] f4cfcb8f-ae48-41bf-b39c-597a639f3a68/memory.usage volume: 40.41015625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9dfa3ee1-a593-4f90-b3c8-4c6f44971896', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.41015625, 'user_id': '002d2f39090c4281bae7f4aba96cef9f', 'user_name': None, 'project_id': '9f2302f440e044e8be2476ceca24a473', 'project_name': None, 'resource_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68', 'timestamp': '2025-09-30T21:32:43.994491', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-2099014552', 'name': 'instance-00000056', 'instance_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68', 'instance_type': 'm1.nano', 'host': 'e34f1991d2ab58e10cd16222df4b062d46d987e683d8ba935040bd54', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '00175590-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4659.646088487, 'message_signature': 'ac450eee4144a459554fe79f6ae98f3a175a67283ee43506df5f643746a6c2c8'}]}, 'timestamp': '2025-09-30 21:32:43.994794', '_unique_id': '2e7c6d754ba44264b5f2bb1af7629ea3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.995 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 DEBUG ceilometer.compute.pollsters [-] f4cfcb8f-ae48-41bf-b39c-597a639f3a68/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e89f2425-e5de-4557-8ff6-1ff54cc4be81', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '002d2f39090c4281bae7f4aba96cef9f', 'user_name': None, 'project_id': '9f2302f440e044e8be2476ceca24a473', 'project_name': None, 'resource_id': 'instance-00000056-f4cfcb8f-ae48-41bf-b39c-597a639f3a68-tapc5c9974c-d1', 'timestamp': '2025-09-30T21:32:43.996146', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-2099014552', 'name': 'tapc5c9974c-d1', 'instance_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68', 'instance_type': 'm1.nano', 'host': 'e34f1991d2ab58e10cd16222df4b062d46d987e683d8ba935040bd54', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e6:b3:a9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc5c9974c-d1'}, 'message_id': '00179532-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4659.620805968, 'message_signature': 'b87ddacc6e61478b0873899e50caf7fe585a3c1a4890ff75cf4a57183a16d62d'}]}, 'timestamp': '2025-09-30 21:32:43.996401', '_unique_id': '4ad239f97e804b58a89f9a86ea31b5cb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.996 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.997 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.997 12 DEBUG ceilometer.compute.pollsters [-] f4cfcb8f-ae48-41bf-b39c-597a639f3a68/disk.device.allocation volume: 30547968 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 DEBUG ceilometer.compute.pollsters [-] f4cfcb8f-ae48-41bf-b39c-597a639f3a68/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7393f071-9df1-474f-bb13-65137630e78d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30547968, 'user_id': '002d2f39090c4281bae7f4aba96cef9f', 'user_name': None, 'project_id': '9f2302f440e044e8be2476ceca24a473', 'project_name': None, 'resource_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68-vda', 'timestamp': '2025-09-30T21:32:43.997827', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-2099014552', 'name': 'instance-00000056', 'instance_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68', 'instance_type': 'm1.nano', 'host': 'e34f1991d2ab58e10cd16222df4b062d46d987e683d8ba935040bd54', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0017d6e6-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4659.65071514, 'message_signature': '2c8cbfae06abcdba30ee5da57f4d90c0fc761787293c2128646298f99f89447a'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '002d2f39090c4281bae7f4aba96cef9f', 'user_name': None, 'project_id': '9f2302f440e044e8be2476ceca24a473', 'project_name': None, 'resource_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68-sda', 'timestamp': '2025-09-30T21:32:43.997827', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-2099014552', 'name': 'instance-00000056', 'instance_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68', 'instance_type': 'm1.nano', 'host': 'e34f1991d2ab58e10cd16222df4b062d46d987e683d8ba935040bd54', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0017e01e-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4659.65071514, 'message_signature': '6a6681a97fae5e17328775a662e2096dea21138adf3dd92233df82e2ad389f23'}]}, 'timestamp': '2025-09-30 21:32:43.998309', '_unique_id': 'a7953f5f859741c982da3d9f0cfbd202'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.998 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.999 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Sep 30 21:32:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:43.999 12 DEBUG ceilometer.compute.pollsters [-] f4cfcb8f-ae48-41bf-b39c-597a639f3a68/disk.device.read.latency volume: 2761512325 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 DEBUG ceilometer.compute.pollsters [-] f4cfcb8f-ae48-41bf-b39c-597a639f3a68/disk.device.read.latency volume: 118816686 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '358799b9-bae7-41f6-ba7c-e6d9f61e36f7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2761512325, 'user_id': '002d2f39090c4281bae7f4aba96cef9f', 'user_name': None, 'project_id': '9f2302f440e044e8be2476ceca24a473', 'project_name': None, 'resource_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68-vda', 'timestamp': '2025-09-30T21:32:43.999861', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-2099014552', 'name': 'instance-00000056', 'instance_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68', 'instance_type': 'm1.nano', 'host': 'e34f1991d2ab58e10cd16222df4b062d46d987e683d8ba935040bd54', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '00182678-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4659.597224991, 'message_signature': 'd986554577b7b862bab35b103e38442b0d95f97d3698a8c60441d8f1aec900e2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 118816686, 'user_id': '002d2f39090c4281bae7f4aba96cef9f', 'user_name': None, 'project_id': '9f2302f440e044e8be2476ceca24a473', 'project_name': None, 'resource_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68-sda', 'timestamp': '2025-09-30T21:32:43.999861', 'resource_metadata': {'display_name': 'tempest-SecurityGroupsTestJSON-server-2099014552', 'name': 'instance-00000056', 'instance_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68', 'instance_type': 'm1.nano', 'host': 'e34f1991d2ab58e10cd16222df4b062d46d987e683d8ba935040bd54', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '00182f92-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4659.597224991, 'message_signature': 'b1e7dbe39c1560c0650a4c69e29ed08400e1dda7d08ba04371a8402624c4330f'}]}, 'timestamp': '2025-09-30 21:32:44.000342', '_unique_id': '14b4fd98e5e044cdbf18e97ddeeec9f5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:32:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:32:44.000 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:32:44 compute-0 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 21:32:45 compute-0 nova_compute[192810]: 2025-09-30 21:32:45.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:32:45 compute-0 nova_compute[192810]: 2025-09-30 21:32:45.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:45.989 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:32:45 compute-0 nova_compute[192810]: 2025-09-30 21:32:45.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:45.990 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:32:47 compute-0 nova_compute[192810]: 2025-09-30 21:32:47.437 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:47 compute-0 nova_compute[192810]: 2025-09-30 21:32:47.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:32:49 compute-0 nova_compute[192810]: 2025-09-30 21:32:49.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:32:49 compute-0 nova_compute[192810]: 2025-09-30 21:32:49.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:32:49 compute-0 nova_compute[192810]: 2025-09-30 21:32:49.918 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:32:49 compute-0 nova_compute[192810]: 2025-09-30 21:32:49.918 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:32:49 compute-0 nova_compute[192810]: 2025-09-30 21:32:49.918 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:32:49 compute-0 nova_compute[192810]: 2025-09-30 21:32:49.919 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:32:50 compute-0 podman[233750]: 2025-09-30 21:32:50.048380732 +0000 UTC m=+0.083227698 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:32:50 compute-0 podman[233749]: 2025-09-30 21:32:50.05118817 +0000 UTC m=+0.085211896 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS)
Sep 30 21:32:50 compute-0 nova_compute[192810]: 2025-09-30 21:32:50.077 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f4cfcb8f-ae48-41bf-b39c-597a639f3a68/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:32:50 compute-0 nova_compute[192810]: 2025-09-30 21:32:50.133 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f4cfcb8f-ae48-41bf-b39c-597a639f3a68/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:32:50 compute-0 nova_compute[192810]: 2025-09-30 21:32:50.134 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f4cfcb8f-ae48-41bf-b39c-597a639f3a68/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:32:50 compute-0 nova_compute[192810]: 2025-09-30 21:32:50.184 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f4cfcb8f-ae48-41bf-b39c-597a639f3a68/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:32:50 compute-0 nova_compute[192810]: 2025-09-30 21:32:50.316 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:32:50 compute-0 nova_compute[192810]: 2025-09-30 21:32:50.317 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5572MB free_disk=73.29000091552734GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:32:50 compute-0 nova_compute[192810]: 2025-09-30 21:32:50.318 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:32:50 compute-0 nova_compute[192810]: 2025-09-30 21:32:50.318 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:32:50 compute-0 nova_compute[192810]: 2025-09-30 21:32:50.565 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Instance f4cfcb8f-ae48-41bf-b39c-597a639f3a68 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:32:50 compute-0 nova_compute[192810]: 2025-09-30 21:32:50.565 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:32:50 compute-0 nova_compute[192810]: 2025-09-30 21:32:50.565 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:32:50 compute-0 nova_compute[192810]: 2025-09-30 21:32:50.619 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:32:50 compute-0 nova_compute[192810]: 2025-09-30 21:32:50.634 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:32:50 compute-0 nova_compute[192810]: 2025-09-30 21:32:50.655 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:32:50 compute-0 nova_compute[192810]: 2025-09-30 21:32:50.655 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.337s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:32:50 compute-0 nova_compute[192810]: 2025-09-30 21:32:50.800 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:51 compute-0 podman[233799]: 2025-09-30 21:32:51.3034564 +0000 UTC m=+0.043285300 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 21:32:51 compute-0 nova_compute[192810]: 2025-09-30 21:32:51.655 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:32:51 compute-0 nova_compute[192810]: 2025-09-30 21:32:51.656 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:32:51 compute-0 nova_compute[192810]: 2025-09-30 21:32:51.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:32:51 compute-0 nova_compute[192810]: 2025-09-30 21:32:51.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:32:51 compute-0 nova_compute[192810]: 2025-09-30 21:32:51.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:32:52 compute-0 nova_compute[192810]: 2025-09-30 21:32:52.038 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "refresh_cache-f4cfcb8f-ae48-41bf-b39c-597a639f3a68" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:32:52 compute-0 nova_compute[192810]: 2025-09-30 21:32:52.038 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquired lock "refresh_cache-f4cfcb8f-ae48-41bf-b39c-597a639f3a68" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:32:52 compute-0 nova_compute[192810]: 2025-09-30 21:32:52.038 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Sep 30 21:32:52 compute-0 nova_compute[192810]: 2025-09-30 21:32:52.039 2 DEBUG nova.objects.instance [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lazy-loading 'info_cache' on Instance uuid f4cfcb8f-ae48-41bf-b39c-597a639f3a68 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:32:52 compute-0 nova_compute[192810]: 2025-09-30 21:32:52.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:54 compute-0 nova_compute[192810]: 2025-09-30 21:32:54.087 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Updating instance_info_cache with network_info: [{"id": "c5c9974c-d1b4-4d55-b299-5bd39f80dba6", "address": "fa:16:3e:e6:b3:a9", "network": {"id": "ef8c2eaa-a409-4066-a902-4b34ecd6ca56", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1075020687-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f2302f440e044e8be2476ceca24a473", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5c9974c-d1", "ovs_interfaceid": "c5c9974c-d1b4-4d55-b299-5bd39f80dba6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:32:54 compute-0 nova_compute[192810]: 2025-09-30 21:32:54.108 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Releasing lock "refresh_cache-f4cfcb8f-ae48-41bf-b39c-597a639f3a68" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:32:54 compute-0 nova_compute[192810]: 2025-09-30 21:32:54.108 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Sep 30 21:32:54 compute-0 nova_compute[192810]: 2025-09-30 21:32:54.108 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:32:54 compute-0 nova_compute[192810]: 2025-09-30 21:32:54.244 2 DEBUG oslo_concurrency.lockutils [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Acquiring lock "f92e449c-90c2-4cba-a8c1-ba6b1c82d770" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:32:54 compute-0 nova_compute[192810]: 2025-09-30 21:32:54.245 2 DEBUG oslo_concurrency.lockutils [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "f92e449c-90c2-4cba-a8c1-ba6b1c82d770" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:32:54 compute-0 nova_compute[192810]: 2025-09-30 21:32:54.266 2 DEBUG nova.compute.manager [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:32:54 compute-0 nova_compute[192810]: 2025-09-30 21:32:54.357 2 DEBUG oslo_concurrency.lockutils [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:32:54 compute-0 nova_compute[192810]: 2025-09-30 21:32:54.358 2 DEBUG oslo_concurrency.lockutils [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:32:54 compute-0 nova_compute[192810]: 2025-09-30 21:32:54.372 2 DEBUG nova.virt.hardware [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:32:54 compute-0 nova_compute[192810]: 2025-09-30 21:32:54.373 2 INFO nova.compute.claims [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:32:54 compute-0 nova_compute[192810]: 2025-09-30 21:32:54.537 2 DEBUG nova.compute.provider_tree [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:32:54 compute-0 nova_compute[192810]: 2025-09-30 21:32:54.556 2 DEBUG nova.scheduler.client.report [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:32:54 compute-0 nova_compute[192810]: 2025-09-30 21:32:54.587 2 DEBUG oslo_concurrency.lockutils [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.229s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:32:54 compute-0 nova_compute[192810]: 2025-09-30 21:32:54.588 2 DEBUG nova.compute.manager [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:32:54 compute-0 nova_compute[192810]: 2025-09-30 21:32:54.647 2 DEBUG nova.compute.manager [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:32:54 compute-0 nova_compute[192810]: 2025-09-30 21:32:54.648 2 DEBUG nova.network.neutron [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:32:54 compute-0 nova_compute[192810]: 2025-09-30 21:32:54.668 2 INFO nova.virt.libvirt.driver [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:32:54 compute-0 nova_compute[192810]: 2025-09-30 21:32:54.702 2 DEBUG nova.compute.manager [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:32:54 compute-0 nova_compute[192810]: 2025-09-30 21:32:54.842 2 DEBUG nova.compute.manager [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:32:54 compute-0 nova_compute[192810]: 2025-09-30 21:32:54.843 2 DEBUG nova.virt.libvirt.driver [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:32:54 compute-0 nova_compute[192810]: 2025-09-30 21:32:54.844 2 INFO nova.virt.libvirt.driver [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Creating image(s)
Sep 30 21:32:54 compute-0 nova_compute[192810]: 2025-09-30 21:32:54.844 2 DEBUG oslo_concurrency.lockutils [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Acquiring lock "/var/lib/nova/instances/f92e449c-90c2-4cba-a8c1-ba6b1c82d770/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:32:54 compute-0 nova_compute[192810]: 2025-09-30 21:32:54.845 2 DEBUG oslo_concurrency.lockutils [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "/var/lib/nova/instances/f92e449c-90c2-4cba-a8c1-ba6b1c82d770/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:32:54 compute-0 nova_compute[192810]: 2025-09-30 21:32:54.845 2 DEBUG oslo_concurrency.lockutils [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "/var/lib/nova/instances/f92e449c-90c2-4cba-a8c1-ba6b1c82d770/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:32:54 compute-0 nova_compute[192810]: 2025-09-30 21:32:54.857 2 DEBUG oslo_concurrency.processutils [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:32:54 compute-0 nova_compute[192810]: 2025-09-30 21:32:54.881 2 DEBUG nova.policy [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:32:54 compute-0 nova_compute[192810]: 2025-09-30 21:32:54.909 2 DEBUG oslo_concurrency.processutils [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:32:54 compute-0 nova_compute[192810]: 2025-09-30 21:32:54.909 2 DEBUG oslo_concurrency.lockutils [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:32:54 compute-0 nova_compute[192810]: 2025-09-30 21:32:54.910 2 DEBUG oslo_concurrency.lockutils [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:32:54 compute-0 nova_compute[192810]: 2025-09-30 21:32:54.921 2 DEBUG oslo_concurrency.processutils [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:32:54 compute-0 nova_compute[192810]: 2025-09-30 21:32:54.983 2 DEBUG oslo_concurrency.processutils [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:32:54 compute-0 nova_compute[192810]: 2025-09-30 21:32:54.984 2 DEBUG oslo_concurrency.processutils [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/f92e449c-90c2-4cba-a8c1-ba6b1c82d770/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:32:54 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:54.991 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:32:55 compute-0 nova_compute[192810]: 2025-09-30 21:32:55.025 2 DEBUG oslo_concurrency.processutils [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/f92e449c-90c2-4cba-a8c1-ba6b1c82d770/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:32:55 compute-0 nova_compute[192810]: 2025-09-30 21:32:55.025 2 DEBUG oslo_concurrency.lockutils [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:32:55 compute-0 nova_compute[192810]: 2025-09-30 21:32:55.026 2 DEBUG oslo_concurrency.processutils [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:32:55 compute-0 nova_compute[192810]: 2025-09-30 21:32:55.104 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:32:55 compute-0 nova_compute[192810]: 2025-09-30 21:32:55.126 2 DEBUG oslo_concurrency.processutils [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:32:55 compute-0 nova_compute[192810]: 2025-09-30 21:32:55.127 2 DEBUG nova.virt.disk.api [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Checking if we can resize image /var/lib/nova/instances/f92e449c-90c2-4cba-a8c1-ba6b1c82d770/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:32:55 compute-0 nova_compute[192810]: 2025-09-30 21:32:55.128 2 DEBUG oslo_concurrency.processutils [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f92e449c-90c2-4cba-a8c1-ba6b1c82d770/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:32:55 compute-0 nova_compute[192810]: 2025-09-30 21:32:55.183 2 DEBUG oslo_concurrency.processutils [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f92e449c-90c2-4cba-a8c1-ba6b1c82d770/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:32:55 compute-0 nova_compute[192810]: 2025-09-30 21:32:55.184 2 DEBUG nova.virt.disk.api [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Cannot resize image /var/lib/nova/instances/f92e449c-90c2-4cba-a8c1-ba6b1c82d770/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:32:55 compute-0 nova_compute[192810]: 2025-09-30 21:32:55.185 2 DEBUG nova.objects.instance [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lazy-loading 'migration_context' on Instance uuid f92e449c-90c2-4cba-a8c1-ba6b1c82d770 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:32:55 compute-0 nova_compute[192810]: 2025-09-30 21:32:55.203 2 DEBUG nova.virt.libvirt.driver [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:32:55 compute-0 nova_compute[192810]: 2025-09-30 21:32:55.204 2 DEBUG nova.virt.libvirt.driver [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Ensure instance console log exists: /var/lib/nova/instances/f92e449c-90c2-4cba-a8c1-ba6b1c82d770/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:32:55 compute-0 nova_compute[192810]: 2025-09-30 21:32:55.204 2 DEBUG oslo_concurrency.lockutils [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:32:55 compute-0 nova_compute[192810]: 2025-09-30 21:32:55.205 2 DEBUG oslo_concurrency.lockutils [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:32:55 compute-0 nova_compute[192810]: 2025-09-30 21:32:55.205 2 DEBUG oslo_concurrency.lockutils [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:32:55 compute-0 nova_compute[192810]: 2025-09-30 21:32:55.595 2 DEBUG nova.network.neutron [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Successfully created port: e2cc5531-350c-4845-bde9-9797e9628421 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:32:55 compute-0 nova_compute[192810]: 2025-09-30 21:32:55.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:32:55 compute-0 nova_compute[192810]: 2025-09-30 21:32:55.803 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:56 compute-0 nova_compute[192810]: 2025-09-30 21:32:56.430 2 DEBUG nova.network.neutron [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Successfully updated port: e2cc5531-350c-4845-bde9-9797e9628421 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:32:56 compute-0 nova_compute[192810]: 2025-09-30 21:32:56.457 2 DEBUG oslo_concurrency.lockutils [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Acquiring lock "refresh_cache-f92e449c-90c2-4cba-a8c1-ba6b1c82d770" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:32:56 compute-0 nova_compute[192810]: 2025-09-30 21:32:56.457 2 DEBUG oslo_concurrency.lockutils [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Acquired lock "refresh_cache-f92e449c-90c2-4cba-a8c1-ba6b1c82d770" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:32:56 compute-0 nova_compute[192810]: 2025-09-30 21:32:56.458 2 DEBUG nova.network.neutron [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:32:56 compute-0 nova_compute[192810]: 2025-09-30 21:32:56.619 2 DEBUG nova.compute.manager [req-31c4ba8b-378a-4741-963a-ab36d56a182a req-3c6b0bf4-9c4f-433b-baac-29d887e0bc4f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Received event network-changed-e2cc5531-350c-4845-bde9-9797e9628421 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:32:56 compute-0 nova_compute[192810]: 2025-09-30 21:32:56.620 2 DEBUG nova.compute.manager [req-31c4ba8b-378a-4741-963a-ab36d56a182a req-3c6b0bf4-9c4f-433b-baac-29d887e0bc4f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Refreshing instance network info cache due to event network-changed-e2cc5531-350c-4845-bde9-9797e9628421. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:32:56 compute-0 nova_compute[192810]: 2025-09-30 21:32:56.620 2 DEBUG oslo_concurrency.lockutils [req-31c4ba8b-378a-4741-963a-ab36d56a182a req-3c6b0bf4-9c4f-433b-baac-29d887e0bc4f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-f92e449c-90c2-4cba-a8c1-ba6b1c82d770" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:32:56 compute-0 nova_compute[192810]: 2025-09-30 21:32:56.846 2 DEBUG nova.network.neutron [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:32:57 compute-0 nova_compute[192810]: 2025-09-30 21:32:57.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:57 compute-0 nova_compute[192810]: 2025-09-30 21:32:57.870 2 DEBUG nova.network.neutron [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Updating instance_info_cache with network_info: [{"id": "e2cc5531-350c-4845-bde9-9797e9628421", "address": "fa:16:3e:3f:a2:bc", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape2cc5531-35", "ovs_interfaceid": "e2cc5531-350c-4845-bde9-9797e9628421", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:32:57 compute-0 nova_compute[192810]: 2025-09-30 21:32:57.897 2 DEBUG oslo_concurrency.lockutils [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Releasing lock "refresh_cache-f92e449c-90c2-4cba-a8c1-ba6b1c82d770" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:32:57 compute-0 nova_compute[192810]: 2025-09-30 21:32:57.897 2 DEBUG nova.compute.manager [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Instance network_info: |[{"id": "e2cc5531-350c-4845-bde9-9797e9628421", "address": "fa:16:3e:3f:a2:bc", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape2cc5531-35", "ovs_interfaceid": "e2cc5531-350c-4845-bde9-9797e9628421", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:32:57 compute-0 nova_compute[192810]: 2025-09-30 21:32:57.897 2 DEBUG oslo_concurrency.lockutils [req-31c4ba8b-378a-4741-963a-ab36d56a182a req-3c6b0bf4-9c4f-433b-baac-29d887e0bc4f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-f92e449c-90c2-4cba-a8c1-ba6b1c82d770" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:32:57 compute-0 nova_compute[192810]: 2025-09-30 21:32:57.898 2 DEBUG nova.network.neutron [req-31c4ba8b-378a-4741-963a-ab36d56a182a req-3c6b0bf4-9c4f-433b-baac-29d887e0bc4f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Refreshing network info cache for port e2cc5531-350c-4845-bde9-9797e9628421 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:32:57 compute-0 nova_compute[192810]: 2025-09-30 21:32:57.901 2 DEBUG nova.virt.libvirt.driver [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Start _get_guest_xml network_info=[{"id": "e2cc5531-350c-4845-bde9-9797e9628421", "address": "fa:16:3e:3f:a2:bc", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape2cc5531-35", "ovs_interfaceid": "e2cc5531-350c-4845-bde9-9797e9628421", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:32:57 compute-0 nova_compute[192810]: 2025-09-30 21:32:57.907 2 WARNING nova.virt.libvirt.driver [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:32:57 compute-0 nova_compute[192810]: 2025-09-30 21:32:57.912 2 DEBUG nova.virt.libvirt.host [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:32:57 compute-0 nova_compute[192810]: 2025-09-30 21:32:57.912 2 DEBUG nova.virt.libvirt.host [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:32:57 compute-0 nova_compute[192810]: 2025-09-30 21:32:57.918 2 DEBUG nova.virt.libvirt.host [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:32:57 compute-0 nova_compute[192810]: 2025-09-30 21:32:57.919 2 DEBUG nova.virt.libvirt.host [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:32:57 compute-0 nova_compute[192810]: 2025-09-30 21:32:57.920 2 DEBUG nova.virt.libvirt.driver [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:32:57 compute-0 nova_compute[192810]: 2025-09-30 21:32:57.920 2 DEBUG nova.virt.hardware [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:32:57 compute-0 nova_compute[192810]: 2025-09-30 21:32:57.920 2 DEBUG nova.virt.hardware [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:32:57 compute-0 nova_compute[192810]: 2025-09-30 21:32:57.920 2 DEBUG nova.virt.hardware [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:32:57 compute-0 nova_compute[192810]: 2025-09-30 21:32:57.921 2 DEBUG nova.virt.hardware [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:32:57 compute-0 nova_compute[192810]: 2025-09-30 21:32:57.921 2 DEBUG nova.virt.hardware [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:32:57 compute-0 nova_compute[192810]: 2025-09-30 21:32:57.921 2 DEBUG nova.virt.hardware [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:32:57 compute-0 nova_compute[192810]: 2025-09-30 21:32:57.921 2 DEBUG nova.virt.hardware [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:32:57 compute-0 nova_compute[192810]: 2025-09-30 21:32:57.921 2 DEBUG nova.virt.hardware [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:32:57 compute-0 nova_compute[192810]: 2025-09-30 21:32:57.922 2 DEBUG nova.virt.hardware [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:32:57 compute-0 nova_compute[192810]: 2025-09-30 21:32:57.922 2 DEBUG nova.virt.hardware [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:32:57 compute-0 nova_compute[192810]: 2025-09-30 21:32:57.922 2 DEBUG nova.virt.hardware [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:32:57 compute-0 nova_compute[192810]: 2025-09-30 21:32:57.925 2 DEBUG nova.virt.libvirt.vif [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:32:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1093237178',display_name='tempest-ServerActionsTestOtherB-server-1093237178',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1093237178',id=90,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d876c85b6ca5418eb657e48391a6503b',ramdisk_id='',reservation_id='r-hu7yzh19',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-463525410',owner_user_name='tempest-ServerActionsTestOth
erB-463525410-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:32:54Z,user_data=None,user_id='b9b3e9f2523944539f57a1ff5d565cb4',uuid=f92e449c-90c2-4cba-a8c1-ba6b1c82d770,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e2cc5531-350c-4845-bde9-9797e9628421", "address": "fa:16:3e:3f:a2:bc", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape2cc5531-35", "ovs_interfaceid": "e2cc5531-350c-4845-bde9-9797e9628421", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:32:57 compute-0 nova_compute[192810]: 2025-09-30 21:32:57.925 2 DEBUG nova.network.os_vif_util [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Converting VIF {"id": "e2cc5531-350c-4845-bde9-9797e9628421", "address": "fa:16:3e:3f:a2:bc", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape2cc5531-35", "ovs_interfaceid": "e2cc5531-350c-4845-bde9-9797e9628421", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:32:57 compute-0 nova_compute[192810]: 2025-09-30 21:32:57.926 2 DEBUG nova.network.os_vif_util [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3f:a2:bc,bridge_name='br-int',has_traffic_filtering=True,id=e2cc5531-350c-4845-bde9-9797e9628421,network=Network(91c84c55-96ab-4682-a6e7-9e96514ca8a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape2cc5531-35') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:32:57 compute-0 nova_compute[192810]: 2025-09-30 21:32:57.926 2 DEBUG nova.objects.instance [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lazy-loading 'pci_devices' on Instance uuid f92e449c-90c2-4cba-a8c1-ba6b1c82d770 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:32:57 compute-0 nova_compute[192810]: 2025-09-30 21:32:57.940 2 DEBUG nova.virt.libvirt.driver [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:32:57 compute-0 nova_compute[192810]:   <uuid>f92e449c-90c2-4cba-a8c1-ba6b1c82d770</uuid>
Sep 30 21:32:57 compute-0 nova_compute[192810]:   <name>instance-0000005a</name>
Sep 30 21:32:57 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:32:57 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:32:57 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:32:57 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:       <nova:name>tempest-ServerActionsTestOtherB-server-1093237178</nova:name>
Sep 30 21:32:57 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:32:57</nova:creationTime>
Sep 30 21:32:57 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:32:57 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:32:57 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:32:57 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:32:57 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:32:57 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:32:57 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:32:57 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:32:57 compute-0 nova_compute[192810]:         <nova:user uuid="b9b3e9f2523944539f57a1ff5d565cb4">tempest-ServerActionsTestOtherB-463525410-project-member</nova:user>
Sep 30 21:32:57 compute-0 nova_compute[192810]:         <nova:project uuid="d876c85b6ca5418eb657e48391a6503b">tempest-ServerActionsTestOtherB-463525410</nova:project>
Sep 30 21:32:57 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:32:57 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:32:57 compute-0 nova_compute[192810]:         <nova:port uuid="e2cc5531-350c-4845-bde9-9797e9628421">
Sep 30 21:32:57 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:32:57 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:32:57 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:32:57 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:32:57 compute-0 nova_compute[192810]:     <system>
Sep 30 21:32:57 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:32:57 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:32:57 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:32:57 compute-0 nova_compute[192810]:       <entry name="serial">f92e449c-90c2-4cba-a8c1-ba6b1c82d770</entry>
Sep 30 21:32:57 compute-0 nova_compute[192810]:       <entry name="uuid">f92e449c-90c2-4cba-a8c1-ba6b1c82d770</entry>
Sep 30 21:32:57 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     </system>
Sep 30 21:32:57 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:32:57 compute-0 nova_compute[192810]:   <os>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:   </os>
Sep 30 21:32:57 compute-0 nova_compute[192810]:   <features>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:   </features>
Sep 30 21:32:57 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:32:57 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:32:57 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:32:57 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:32:57 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:32:57 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/f92e449c-90c2-4cba-a8c1-ba6b1c82d770/disk"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:32:57 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/f92e449c-90c2-4cba-a8c1-ba6b1c82d770/disk.config"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:32:57 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:3f:a2:bc"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:       <target dev="tape2cc5531-35"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:32:57 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/f92e449c-90c2-4cba-a8c1-ba6b1c82d770/console.log" append="off"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     <video>
Sep 30 21:32:57 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     </video>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:32:57 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:32:57 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:32:57 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:32:57 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:32:57 compute-0 nova_compute[192810]: </domain>
Sep 30 21:32:57 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:32:57 compute-0 nova_compute[192810]: 2025-09-30 21:32:57.941 2 DEBUG nova.compute.manager [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Preparing to wait for external event network-vif-plugged-e2cc5531-350c-4845-bde9-9797e9628421 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:32:57 compute-0 nova_compute[192810]: 2025-09-30 21:32:57.942 2 DEBUG oslo_concurrency.lockutils [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Acquiring lock "f92e449c-90c2-4cba-a8c1-ba6b1c82d770-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:32:57 compute-0 nova_compute[192810]: 2025-09-30 21:32:57.942 2 DEBUG oslo_concurrency.lockutils [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "f92e449c-90c2-4cba-a8c1-ba6b1c82d770-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:32:57 compute-0 nova_compute[192810]: 2025-09-30 21:32:57.942 2 DEBUG oslo_concurrency.lockutils [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "f92e449c-90c2-4cba-a8c1-ba6b1c82d770-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:32:57 compute-0 nova_compute[192810]: 2025-09-30 21:32:57.943 2 DEBUG nova.virt.libvirt.vif [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:32:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1093237178',display_name='tempest-ServerActionsTestOtherB-server-1093237178',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1093237178',id=90,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d876c85b6ca5418eb657e48391a6503b',ramdisk_id='',reservation_id='r-hu7yzh19',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-463525410',owner_user_name='tempest-ServerActi
onsTestOtherB-463525410-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:32:54Z,user_data=None,user_id='b9b3e9f2523944539f57a1ff5d565cb4',uuid=f92e449c-90c2-4cba-a8c1-ba6b1c82d770,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e2cc5531-350c-4845-bde9-9797e9628421", "address": "fa:16:3e:3f:a2:bc", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape2cc5531-35", "ovs_interfaceid": "e2cc5531-350c-4845-bde9-9797e9628421", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:32:57 compute-0 nova_compute[192810]: 2025-09-30 21:32:57.943 2 DEBUG nova.network.os_vif_util [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Converting VIF {"id": "e2cc5531-350c-4845-bde9-9797e9628421", "address": "fa:16:3e:3f:a2:bc", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape2cc5531-35", "ovs_interfaceid": "e2cc5531-350c-4845-bde9-9797e9628421", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:32:57 compute-0 nova_compute[192810]: 2025-09-30 21:32:57.944 2 DEBUG nova.network.os_vif_util [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3f:a2:bc,bridge_name='br-int',has_traffic_filtering=True,id=e2cc5531-350c-4845-bde9-9797e9628421,network=Network(91c84c55-96ab-4682-a6e7-9e96514ca8a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape2cc5531-35') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:32:57 compute-0 nova_compute[192810]: 2025-09-30 21:32:57.944 2 DEBUG os_vif [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3f:a2:bc,bridge_name='br-int',has_traffic_filtering=True,id=e2cc5531-350c-4845-bde9-9797e9628421,network=Network(91c84c55-96ab-4682-a6e7-9e96514ca8a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape2cc5531-35') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:32:57 compute-0 nova_compute[192810]: 2025-09-30 21:32:57.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:57 compute-0 nova_compute[192810]: 2025-09-30 21:32:57.945 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:32:57 compute-0 nova_compute[192810]: 2025-09-30 21:32:57.945 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:32:57 compute-0 nova_compute[192810]: 2025-09-30 21:32:57.947 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:57 compute-0 nova_compute[192810]: 2025-09-30 21:32:57.947 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape2cc5531-35, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:32:57 compute-0 nova_compute[192810]: 2025-09-30 21:32:57.948 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape2cc5531-35, col_values=(('external_ids', {'iface-id': 'e2cc5531-350c-4845-bde9-9797e9628421', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3f:a2:bc', 'vm-uuid': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:32:57 compute-0 nova_compute[192810]: 2025-09-30 21:32:57.949 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:57 compute-0 NetworkManager[51733]: <info>  [1759267977.9500] manager: (tape2cc5531-35): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/158)
Sep 30 21:32:57 compute-0 nova_compute[192810]: 2025-09-30 21:32:57.952 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:32:57 compute-0 nova_compute[192810]: 2025-09-30 21:32:57.955 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:57 compute-0 nova_compute[192810]: 2025-09-30 21:32:57.955 2 INFO os_vif [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3f:a2:bc,bridge_name='br-int',has_traffic_filtering=True,id=e2cc5531-350c-4845-bde9-9797e9628421,network=Network(91c84c55-96ab-4682-a6e7-9e96514ca8a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape2cc5531-35')
Sep 30 21:32:58 compute-0 nova_compute[192810]: 2025-09-30 21:32:58.010 2 DEBUG nova.virt.libvirt.driver [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:32:58 compute-0 nova_compute[192810]: 2025-09-30 21:32:58.011 2 DEBUG nova.virt.libvirt.driver [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:32:58 compute-0 nova_compute[192810]: 2025-09-30 21:32:58.011 2 DEBUG nova.virt.libvirt.driver [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] No VIF found with MAC fa:16:3e:3f:a2:bc, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:32:58 compute-0 nova_compute[192810]: 2025-09-30 21:32:58.011 2 INFO nova.virt.libvirt.driver [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Using config drive
Sep 30 21:32:58 compute-0 nova_compute[192810]: 2025-09-30 21:32:58.384 2 INFO nova.virt.libvirt.driver [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Creating config drive at /var/lib/nova/instances/f92e449c-90c2-4cba-a8c1-ba6b1c82d770/disk.config
Sep 30 21:32:58 compute-0 nova_compute[192810]: 2025-09-30 21:32:58.388 2 DEBUG oslo_concurrency.processutils [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f92e449c-90c2-4cba-a8c1-ba6b1c82d770/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpg4ogp6l3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:32:58 compute-0 nova_compute[192810]: 2025-09-30 21:32:58.510 2 DEBUG oslo_concurrency.processutils [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f92e449c-90c2-4cba-a8c1-ba6b1c82d770/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpg4ogp6l3" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:32:58 compute-0 kernel: tape2cc5531-35: entered promiscuous mode
Sep 30 21:32:58 compute-0 NetworkManager[51733]: <info>  [1759267978.5591] manager: (tape2cc5531-35): new Tun device (/org/freedesktop/NetworkManager/Devices/159)
Sep 30 21:32:58 compute-0 ovn_controller[94912]: 2025-09-30T21:32:58Z|00333|binding|INFO|Claiming lport e2cc5531-350c-4845-bde9-9797e9628421 for this chassis.
Sep 30 21:32:58 compute-0 ovn_controller[94912]: 2025-09-30T21:32:58Z|00334|binding|INFO|e2cc5531-350c-4845-bde9-9797e9628421: Claiming fa:16:3e:3f:a2:bc 10.100.0.7
Sep 30 21:32:58 compute-0 nova_compute[192810]: 2025-09-30 21:32:58.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:58 compute-0 nova_compute[192810]: 2025-09-30 21:32:58.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:58 compute-0 nova_compute[192810]: 2025-09-30 21:32:58.575 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:58 compute-0 NetworkManager[51733]: <info>  [1759267978.5765] manager: (patch-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/160)
Sep 30 21:32:58 compute-0 NetworkManager[51733]: <info>  [1759267978.5773] manager: (patch-br-int-to-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/161)
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:58.582 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3f:a2:bc 10.100.0.7'], port_security=['fa:16:3e:3f:a2:bc 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-91c84c55-96ab-4682-a6e7-9e96514ca8a5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd876c85b6ca5418eb657e48391a6503b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '59aeae25-c90e-4b40-9e86-3ac03fe94073', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b003c3b3-124e-4f30-8c82-ee588d17c214, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=e2cc5531-350c-4845-bde9-9797e9628421) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:58.584 103867 INFO neutron.agent.ovn.metadata.agent [-] Port e2cc5531-350c-4845-bde9-9797e9628421 in datapath 91c84c55-96ab-4682-a6e7-9e96514ca8a5 bound to our chassis
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:58.585 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 91c84c55-96ab-4682-a6e7-9e96514ca8a5
Sep 30 21:32:58 compute-0 systemd-udevd[233851]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:32:58 compute-0 systemd-machined[152794]: New machine qemu-42-instance-0000005a.
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:58.596 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[d2d71d95-2eb7-4ffd-a64e-0dee3bd96277]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:58.597 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap91c84c55-91 in ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:58.599 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap91c84c55-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:58.599 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[42a282d7-98a9-4758-ba9f-205a71f4e3ca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:32:58 compute-0 NetworkManager[51733]: <info>  [1759267978.6004] device (tape2cc5531-35): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:32:58 compute-0 NetworkManager[51733]: <info>  [1759267978.6016] device (tape2cc5531-35): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:58.602 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[b5ef5229-7ec8-4134-93e0-42837ebf7db6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:58.611 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[99d090cf-bb3c-4eb5-b567-7e9af3d1715d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:32:58 compute-0 systemd[1]: Started Virtual Machine qemu-42-instance-0000005a.
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:58.643 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[6cdc0c06-dc14-49d5-8dca-e43e37e8b822]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:58.667 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[11afeaea-f602-4fc2-9b1e-50753878d436]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:58.686 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[f2e62033-e727-4155-8cd4-860def1ad97c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:32:58 compute-0 NetworkManager[51733]: <info>  [1759267978.6886] manager: (tap91c84c55-90): new Veth device (/org/freedesktop/NetworkManager/Devices/162)
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:58.713 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[8307bdc3-dc82-46c3-a186-a8aa23ccd042]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:58.715 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[94c97583-d9fc-4d2d-9bc8-fc85820bd5ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:32:58 compute-0 nova_compute[192810]: 2025-09-30 21:32:58.718 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:58 compute-0 nova_compute[192810]: 2025-09-30 21:32:58.721 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:58 compute-0 NetworkManager[51733]: <info>  [1759267978.7345] device (tap91c84c55-90): carrier: link connected
Sep 30 21:32:58 compute-0 ovn_controller[94912]: 2025-09-30T21:32:58Z|00335|binding|INFO|Releasing lport 62495f0c-d7e5-48f2-b38c-bbf4e1f3b705 from this chassis (sb_readonly=0)
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:58.740 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[27a0740e-10cd-4c55-90b2-a7f9a63026c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:32:58 compute-0 nova_compute[192810]: 2025-09-30 21:32:58.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:58 compute-0 ovn_controller[94912]: 2025-09-30T21:32:58Z|00336|binding|INFO|Setting lport e2cc5531-350c-4845-bde9-9797e9628421 ovn-installed in OVS
Sep 30 21:32:58 compute-0 ovn_controller[94912]: 2025-09-30T21:32:58Z|00337|binding|INFO|Setting lport e2cc5531-350c-4845-bde9-9797e9628421 up in Southbound
Sep 30 21:32:58 compute-0 nova_compute[192810]: 2025-09-30 21:32:58.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:58.756 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[3888708a-dea6-4be4-ba4a-f6763af594d4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap91c84c55-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f9:a7:ab'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 101], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 467435, 'reachable_time': 22852, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233885, 'error': None, 'target': 'ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:58.770 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[6f8d540c-d0a1-4e54-8643-cdd599c95ec1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef9:a7ab'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 467435, 'tstamp': 467435}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233886, 'error': None, 'target': 'ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:58.785 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[68c649a2-f0c9-41d0-ac91-ccfd5c6542cb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap91c84c55-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f9:a7:ab'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 101], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 467435, 'reachable_time': 22852, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 168, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 168, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233887, 'error': None, 'target': 'ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:58.810 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a9f14487-097e-40b8-b061-658835630b71]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:58.864 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[7c7604a1-dd0b-4ba9-9850-84e23414ac7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:58.865 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap91c84c55-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:58.865 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:58.866 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap91c84c55-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:32:58 compute-0 kernel: tap91c84c55-90: entered promiscuous mode
Sep 30 21:32:58 compute-0 nova_compute[192810]: 2025-09-30 21:32:58.867 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:58 compute-0 NetworkManager[51733]: <info>  [1759267978.8685] manager: (tap91c84c55-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/163)
Sep 30 21:32:58 compute-0 nova_compute[192810]: 2025-09-30 21:32:58.870 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:58.873 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap91c84c55-90, col_values=(('external_ids', {'iface-id': '3996e682-c20c-41c5-9547-9688a18f316c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:32:58 compute-0 nova_compute[192810]: 2025-09-30 21:32:58.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:58 compute-0 ovn_controller[94912]: 2025-09-30T21:32:58Z|00338|binding|INFO|Releasing lport 3996e682-c20c-41c5-9547-9688a18f316c from this chassis (sb_readonly=0)
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:58.875 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/91c84c55-96ab-4682-a6e7-9e96514ca8a5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/91c84c55-96ab-4682-a6e7-9e96514ca8a5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:58.876 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[29128cdc-68ba-4cb8-85ad-f5353cf0aef0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:58.877 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-91c84c55-96ab-4682-a6e7-9e96514ca8a5
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/91c84c55-96ab-4682-a6e7-9e96514ca8a5.pid.haproxy
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID 91c84c55-96ab-4682-a6e7-9e96514ca8a5
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:32:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:32:58.878 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5', 'env', 'PROCESS_TAG=haproxy-91c84c55-96ab-4682-a6e7-9e96514ca8a5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/91c84c55-96ab-4682-a6e7-9e96514ca8a5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:32:58 compute-0 nova_compute[192810]: 2025-09-30 21:32:58.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:32:59 compute-0 nova_compute[192810]: 2025-09-30 21:32:59.018 2 DEBUG nova.compute.manager [req-dcd26390-f9fe-49e4-a83c-919dc9e2063a req-2ce1c779-bba1-4322-adbd-ee968a1bd301 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Received event network-vif-plugged-e2cc5531-350c-4845-bde9-9797e9628421 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:32:59 compute-0 nova_compute[192810]: 2025-09-30 21:32:59.018 2 DEBUG oslo_concurrency.lockutils [req-dcd26390-f9fe-49e4-a83c-919dc9e2063a req-2ce1c779-bba1-4322-adbd-ee968a1bd301 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "f92e449c-90c2-4cba-a8c1-ba6b1c82d770-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:32:59 compute-0 nova_compute[192810]: 2025-09-30 21:32:59.019 2 DEBUG oslo_concurrency.lockutils [req-dcd26390-f9fe-49e4-a83c-919dc9e2063a req-2ce1c779-bba1-4322-adbd-ee968a1bd301 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "f92e449c-90c2-4cba-a8c1-ba6b1c82d770-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:32:59 compute-0 nova_compute[192810]: 2025-09-30 21:32:59.019 2 DEBUG oslo_concurrency.lockutils [req-dcd26390-f9fe-49e4-a83c-919dc9e2063a req-2ce1c779-bba1-4322-adbd-ee968a1bd301 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "f92e449c-90c2-4cba-a8c1-ba6b1c82d770-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:32:59 compute-0 nova_compute[192810]: 2025-09-30 21:32:59.019 2 DEBUG nova.compute.manager [req-dcd26390-f9fe-49e4-a83c-919dc9e2063a req-2ce1c779-bba1-4322-adbd-ee968a1bd301 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Processing event network-vif-plugged-e2cc5531-350c-4845-bde9-9797e9628421 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:32:59 compute-0 nova_compute[192810]: 2025-09-30 21:32:59.175 2 DEBUG nova.network.neutron [req-31c4ba8b-378a-4741-963a-ab36d56a182a req-3c6b0bf4-9c4f-433b-baac-29d887e0bc4f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Updated VIF entry in instance network info cache for port e2cc5531-350c-4845-bde9-9797e9628421. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:32:59 compute-0 nova_compute[192810]: 2025-09-30 21:32:59.176 2 DEBUG nova.network.neutron [req-31c4ba8b-378a-4741-963a-ab36d56a182a req-3c6b0bf4-9c4f-433b-baac-29d887e0bc4f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Updating instance_info_cache with network_info: [{"id": "e2cc5531-350c-4845-bde9-9797e9628421", "address": "fa:16:3e:3f:a2:bc", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape2cc5531-35", "ovs_interfaceid": "e2cc5531-350c-4845-bde9-9797e9628421", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:32:59 compute-0 nova_compute[192810]: 2025-09-30 21:32:59.195 2 DEBUG oslo_concurrency.lockutils [req-31c4ba8b-378a-4741-963a-ab36d56a182a req-3c6b0bf4-9c4f-433b-baac-29d887e0bc4f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-f92e449c-90c2-4cba-a8c1-ba6b1c82d770" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:32:59 compute-0 podman[233925]: 2025-09-30 21:32:59.232509501 +0000 UTC m=+0.047226376 container create 5553e48e4d6703cd90d262aabd6853c48c8f9c86c3e9d5a6caeb1531dfdde7f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:32:59 compute-0 systemd[1]: Started libpod-conmon-5553e48e4d6703cd90d262aabd6853c48c8f9c86c3e9d5a6caeb1531dfdde7f3.scope.
Sep 30 21:32:59 compute-0 nova_compute[192810]: 2025-09-30 21:32:59.286 2 DEBUG nova.compute.manager [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:32:59 compute-0 nova_compute[192810]: 2025-09-30 21:32:59.287 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267979.2859552, f92e449c-90c2-4cba-a8c1-ba6b1c82d770 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:32:59 compute-0 nova_compute[192810]: 2025-09-30 21:32:59.287 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] VM Started (Lifecycle Event)
Sep 30 21:32:59 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:32:59 compute-0 nova_compute[192810]: 2025-09-30 21:32:59.292 2 DEBUG nova.virt.libvirt.driver [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:32:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dea51daa6c05ad8e442a6c36df4ddf9123bca49c77b97eb2592a3f966898bb23/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:32:59 compute-0 nova_compute[192810]: 2025-09-30 21:32:59.298 2 INFO nova.virt.libvirt.driver [-] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Instance spawned successfully.
Sep 30 21:32:59 compute-0 nova_compute[192810]: 2025-09-30 21:32:59.299 2 DEBUG nova.virt.libvirt.driver [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:32:59 compute-0 podman[233925]: 2025-09-30 21:32:59.207348696 +0000 UTC m=+0.022065601 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:32:59 compute-0 podman[233925]: 2025-09-30 21:32:59.305274292 +0000 UTC m=+0.119991177 container init 5553e48e4d6703cd90d262aabd6853c48c8f9c86c3e9d5a6caeb1531dfdde7f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_managed=true)
Sep 30 21:32:59 compute-0 podman[233925]: 2025-09-30 21:32:59.31092027 +0000 UTC m=+0.125637145 container start 5553e48e4d6703cd90d262aabd6853c48c8f9c86c3e9d5a6caeb1531dfdde7f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250923)
Sep 30 21:32:59 compute-0 nova_compute[192810]: 2025-09-30 21:32:59.313 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:32:59 compute-0 nova_compute[192810]: 2025-09-30 21:32:59.318 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:32:59 compute-0 nova_compute[192810]: 2025-09-30 21:32:59.322 2 DEBUG nova.virt.libvirt.driver [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:32:59 compute-0 nova_compute[192810]: 2025-09-30 21:32:59.323 2 DEBUG nova.virt.libvirt.driver [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:32:59 compute-0 nova_compute[192810]: 2025-09-30 21:32:59.323 2 DEBUG nova.virt.libvirt.driver [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:32:59 compute-0 nova_compute[192810]: 2025-09-30 21:32:59.324 2 DEBUG nova.virt.libvirt.driver [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:32:59 compute-0 nova_compute[192810]: 2025-09-30 21:32:59.324 2 DEBUG nova.virt.libvirt.driver [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:32:59 compute-0 nova_compute[192810]: 2025-09-30 21:32:59.325 2 DEBUG nova.virt.libvirt.driver [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:32:59 compute-0 neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5[233941]: [NOTICE]   (233945) : New worker (233947) forked
Sep 30 21:32:59 compute-0 neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5[233941]: [NOTICE]   (233945) : Loading success.
Sep 30 21:32:59 compute-0 nova_compute[192810]: 2025-09-30 21:32:59.358 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:32:59 compute-0 nova_compute[192810]: 2025-09-30 21:32:59.359 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267979.2873986, f92e449c-90c2-4cba-a8c1-ba6b1c82d770 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:32:59 compute-0 nova_compute[192810]: 2025-09-30 21:32:59.359 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] VM Paused (Lifecycle Event)
Sep 30 21:32:59 compute-0 nova_compute[192810]: 2025-09-30 21:32:59.405 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:32:59 compute-0 nova_compute[192810]: 2025-09-30 21:32:59.408 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759267979.2908044, f92e449c-90c2-4cba-a8c1-ba6b1c82d770 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:32:59 compute-0 nova_compute[192810]: 2025-09-30 21:32:59.408 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] VM Resumed (Lifecycle Event)
Sep 30 21:32:59 compute-0 nova_compute[192810]: 2025-09-30 21:32:59.593 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:32:59 compute-0 nova_compute[192810]: 2025-09-30 21:32:59.597 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:32:59 compute-0 nova_compute[192810]: 2025-09-30 21:32:59.602 2 INFO nova.compute.manager [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Took 4.76 seconds to spawn the instance on the hypervisor.
Sep 30 21:32:59 compute-0 nova_compute[192810]: 2025-09-30 21:32:59.602 2 DEBUG nova.compute.manager [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:32:59 compute-0 nova_compute[192810]: 2025-09-30 21:32:59.631 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:32:59 compute-0 nova_compute[192810]: 2025-09-30 21:32:59.697 2 INFO nova.compute.manager [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Took 5.38 seconds to build instance.
Sep 30 21:32:59 compute-0 nova_compute[192810]: 2025-09-30 21:32:59.719 2 DEBUG oslo_concurrency.lockutils [None req-f666a567-21cc-4ff4-b148-5d7e746d0ff6 b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "f92e449c-90c2-4cba-a8c1-ba6b1c82d770" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.474s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:01 compute-0 podman[233957]: 2025-09-30 21:33:01.321650748 +0000 UTC m=+0.060331296 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, name=ubi9-minimal, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., release=1755695350, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers)
Sep 30 21:33:01 compute-0 podman[233956]: 2025-09-30 21:33:01.343318159 +0000 UTC m=+0.082002367 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Sep 30 21:33:01 compute-0 nova_compute[192810]: 2025-09-30 21:33:01.423 2 DEBUG nova.compute.manager [req-dc0e7916-1f08-4c11-b544-d434cc268e6f req-52d58648-f236-4f1c-be00-2fa245a7db26 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Received event network-vif-plugged-e2cc5531-350c-4845-bde9-9797e9628421 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:33:01 compute-0 nova_compute[192810]: 2025-09-30 21:33:01.424 2 DEBUG oslo_concurrency.lockutils [req-dc0e7916-1f08-4c11-b544-d434cc268e6f req-52d58648-f236-4f1c-be00-2fa245a7db26 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "f92e449c-90c2-4cba-a8c1-ba6b1c82d770-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:01 compute-0 nova_compute[192810]: 2025-09-30 21:33:01.424 2 DEBUG oslo_concurrency.lockutils [req-dc0e7916-1f08-4c11-b544-d434cc268e6f req-52d58648-f236-4f1c-be00-2fa245a7db26 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "f92e449c-90c2-4cba-a8c1-ba6b1c82d770-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:01 compute-0 nova_compute[192810]: 2025-09-30 21:33:01.424 2 DEBUG oslo_concurrency.lockutils [req-dc0e7916-1f08-4c11-b544-d434cc268e6f req-52d58648-f236-4f1c-be00-2fa245a7db26 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "f92e449c-90c2-4cba-a8c1-ba6b1c82d770-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:01 compute-0 nova_compute[192810]: 2025-09-30 21:33:01.424 2 DEBUG nova.compute.manager [req-dc0e7916-1f08-4c11-b544-d434cc268e6f req-52d58648-f236-4f1c-be00-2fa245a7db26 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] No waiting events found dispatching network-vif-plugged-e2cc5531-350c-4845-bde9-9797e9628421 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:33:01 compute-0 nova_compute[192810]: 2025-09-30 21:33:01.424 2 WARNING nova.compute.manager [req-dc0e7916-1f08-4c11-b544-d434cc268e6f req-52d58648-f236-4f1c-be00-2fa245a7db26 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Received unexpected event network-vif-plugged-e2cc5531-350c-4845-bde9-9797e9628421 for instance with vm_state active and task_state None.
Sep 30 21:33:01 compute-0 nova_compute[192810]: 2025-09-30 21:33:01.627 2 INFO nova.compute.manager [None req-695023a1-89da-45ff-9b5d-04804ee16e0a b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Get console output
Sep 30 21:33:02 compute-0 nova_compute[192810]: 2025-09-30 21:33:02.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:02 compute-0 nova_compute[192810]: 2025-09-30 21:33:02.949 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:07 compute-0 nova_compute[192810]: 2025-09-30 21:33:07.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:07 compute-0 nova_compute[192810]: 2025-09-30 21:33:07.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:08 compute-0 podman[234003]: 2025-09-30 21:33:08.324306982 +0000 UTC m=+0.062935291 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, container_name=iscsid, org.label-schema.schema-version=1.0)
Sep 30 21:33:08 compute-0 podman[234002]: 2025-09-30 21:33:08.324549518 +0000 UTC m=+0.066257243 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:33:08 compute-0 podman[234004]: 2025-09-30 21:33:08.345378267 +0000 UTC m=+0.070764022 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:33:11 compute-0 ovn_controller[94912]: 2025-09-30T21:33:11Z|00036|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:3f:a2:bc 10.100.0.7
Sep 30 21:33:11 compute-0 ovn_controller[94912]: 2025-09-30T21:33:11Z|00037|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:3f:a2:bc 10.100.0.7
Sep 30 21:33:12 compute-0 nova_compute[192810]: 2025-09-30 21:33:12.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:12 compute-0 nova_compute[192810]: 2025-09-30 21:33:12.953 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:14 compute-0 nova_compute[192810]: 2025-09-30 21:33:14.724 2 DEBUG oslo_concurrency.lockutils [None req-f0ebc0b7-2f27-4cc0-9c25-cfa2ec27f9f4 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Acquiring lock "f4cfcb8f-ae48-41bf-b39c-597a639f3a68" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:14 compute-0 nova_compute[192810]: 2025-09-30 21:33:14.725 2 DEBUG oslo_concurrency.lockutils [None req-f0ebc0b7-2f27-4cc0-9c25-cfa2ec27f9f4 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Lock "f4cfcb8f-ae48-41bf-b39c-597a639f3a68" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:14 compute-0 nova_compute[192810]: 2025-09-30 21:33:14.725 2 DEBUG oslo_concurrency.lockutils [None req-f0ebc0b7-2f27-4cc0-9c25-cfa2ec27f9f4 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Acquiring lock "f4cfcb8f-ae48-41bf-b39c-597a639f3a68-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:14 compute-0 nova_compute[192810]: 2025-09-30 21:33:14.726 2 DEBUG oslo_concurrency.lockutils [None req-f0ebc0b7-2f27-4cc0-9c25-cfa2ec27f9f4 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Lock "f4cfcb8f-ae48-41bf-b39c-597a639f3a68-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:14 compute-0 nova_compute[192810]: 2025-09-30 21:33:14.726 2 DEBUG oslo_concurrency.lockutils [None req-f0ebc0b7-2f27-4cc0-9c25-cfa2ec27f9f4 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Lock "f4cfcb8f-ae48-41bf-b39c-597a639f3a68-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:14 compute-0 nova_compute[192810]: 2025-09-30 21:33:14.736 2 INFO nova.compute.manager [None req-f0ebc0b7-2f27-4cc0-9c25-cfa2ec27f9f4 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Terminating instance
Sep 30 21:33:14 compute-0 nova_compute[192810]: 2025-09-30 21:33:14.745 2 DEBUG nova.compute.manager [None req-f0ebc0b7-2f27-4cc0-9c25-cfa2ec27f9f4 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:33:14 compute-0 kernel: tapc5c9974c-d1 (unregistering): left promiscuous mode
Sep 30 21:33:14 compute-0 NetworkManager[51733]: <info>  [1759267994.7870] device (tapc5c9974c-d1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:33:14 compute-0 ovn_controller[94912]: 2025-09-30T21:33:14Z|00339|binding|INFO|Releasing lport c5c9974c-d1b4-4d55-b299-5bd39f80dba6 from this chassis (sb_readonly=0)
Sep 30 21:33:14 compute-0 ovn_controller[94912]: 2025-09-30T21:33:14Z|00340|binding|INFO|Setting lport c5c9974c-d1b4-4d55-b299-5bd39f80dba6 down in Southbound
Sep 30 21:33:14 compute-0 ovn_controller[94912]: 2025-09-30T21:33:14Z|00341|binding|INFO|Removing iface tapc5c9974c-d1 ovn-installed in OVS
Sep 30 21:33:14 compute-0 nova_compute[192810]: 2025-09-30 21:33:14.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:14 compute-0 nova_compute[192810]: 2025-09-30 21:33:14.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:14 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:33:14.807 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e6:b3:a9 10.100.0.9'], port_security=['fa:16:3e:e6:b3:a9 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'f4cfcb8f-ae48-41bf-b39c-597a639f3a68', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ef8c2eaa-a409-4066-a902-4b34ecd6ca56', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f2302f440e044e8be2476ceca24a473', 'neutron:revision_number': '6', 'neutron:security_group_ids': '25f79e3a-3cea-479f-b28e-51d8fb5663ce 81413aa3-1119-4001-8732-d2f30500db0a 99a2c2d8-b44e-4bd2-b1ff-606bba0b2d20', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e3322f30-0ba7-4080-8269-173d99dc2779, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=c5c9974c-d1b4-4d55-b299-5bd39f80dba6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:33:14 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:33:14.811 103867 INFO neutron.agent.ovn.metadata.agent [-] Port c5c9974c-d1b4-4d55-b299-5bd39f80dba6 in datapath ef8c2eaa-a409-4066-a902-4b34ecd6ca56 unbound from our chassis
Sep 30 21:33:14 compute-0 nova_compute[192810]: 2025-09-30 21:33:14.810 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:14 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:33:14.812 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ef8c2eaa-a409-4066-a902-4b34ecd6ca56, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:33:14 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:33:14.813 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[b3c404c0-9e4e-4405-81ec-e9cf1476c7a1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:14 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:33:14.814 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ef8c2eaa-a409-4066-a902-4b34ecd6ca56 namespace which is not needed anymore
Sep 30 21:33:14 compute-0 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d00000056.scope: Deactivated successfully.
Sep 30 21:33:14 compute-0 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d00000056.scope: Consumed 17.018s CPU time.
Sep 30 21:33:14 compute-0 systemd-machined[152794]: Machine qemu-41-instance-00000056 terminated.
Sep 30 21:33:14 compute-0 neutron-haproxy-ovnmeta-ef8c2eaa-a409-4066-a902-4b34ecd6ca56[233612]: [NOTICE]   (233616) : haproxy version is 2.8.14-c23fe91
Sep 30 21:33:14 compute-0 neutron-haproxy-ovnmeta-ef8c2eaa-a409-4066-a902-4b34ecd6ca56[233612]: [NOTICE]   (233616) : path to executable is /usr/sbin/haproxy
Sep 30 21:33:14 compute-0 neutron-haproxy-ovnmeta-ef8c2eaa-a409-4066-a902-4b34ecd6ca56[233612]: [WARNING]  (233616) : Exiting Master process...
Sep 30 21:33:14 compute-0 neutron-haproxy-ovnmeta-ef8c2eaa-a409-4066-a902-4b34ecd6ca56[233612]: [WARNING]  (233616) : Exiting Master process...
Sep 30 21:33:14 compute-0 neutron-haproxy-ovnmeta-ef8c2eaa-a409-4066-a902-4b34ecd6ca56[233612]: [ALERT]    (233616) : Current worker (233618) exited with code 143 (Terminated)
Sep 30 21:33:14 compute-0 neutron-haproxy-ovnmeta-ef8c2eaa-a409-4066-a902-4b34ecd6ca56[233612]: [WARNING]  (233616) : All workers exited. Exiting... (0)
Sep 30 21:33:14 compute-0 systemd[1]: libpod-5971f12b356d56a88372bbf44d5a8a7ff313de934d10ff7f88544bed520a201c.scope: Deactivated successfully.
Sep 30 21:33:14 compute-0 podman[234101]: 2025-09-30 21:33:14.936222724 +0000 UTC m=+0.043736771 container died 5971f12b356d56a88372bbf44d5a8a7ff313de934d10ff7f88544bed520a201c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ef8c2eaa-a409-4066-a902-4b34ecd6ca56, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20250923)
Sep 30 21:33:14 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5971f12b356d56a88372bbf44d5a8a7ff313de934d10ff7f88544bed520a201c-userdata-shm.mount: Deactivated successfully.
Sep 30 21:33:14 compute-0 systemd[1]: var-lib-containers-storage-overlay-f018d9171b43cb8cdbf31dd1252e55d17f8c04c45fb43ff9c938773d2221aae1-merged.mount: Deactivated successfully.
Sep 30 21:33:14 compute-0 nova_compute[192810]: 2025-09-30 21:33:14.967 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:14 compute-0 nova_compute[192810]: 2025-09-30 21:33:14.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:14 compute-0 podman[234101]: 2025-09-30 21:33:14.974248924 +0000 UTC m=+0.081763011 container cleanup 5971f12b356d56a88372bbf44d5a8a7ff313de934d10ff7f88544bed520a201c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ef8c2eaa-a409-4066-a902-4b34ecd6ca56, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Sep 30 21:33:14 compute-0 systemd[1]: libpod-conmon-5971f12b356d56a88372bbf44d5a8a7ff313de934d10ff7f88544bed520a201c.scope: Deactivated successfully.
Sep 30 21:33:15 compute-0 nova_compute[192810]: 2025-09-30 21:33:15.002 2 INFO nova.virt.libvirt.driver [-] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Instance destroyed successfully.
Sep 30 21:33:15 compute-0 nova_compute[192810]: 2025-09-30 21:33:15.002 2 DEBUG nova.objects.instance [None req-f0ebc0b7-2f27-4cc0-9c25-cfa2ec27f9f4 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Lazy-loading 'resources' on Instance uuid f4cfcb8f-ae48-41bf-b39c-597a639f3a68 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:33:15 compute-0 nova_compute[192810]: 2025-09-30 21:33:15.016 2 DEBUG nova.virt.libvirt.vif [None req-f0ebc0b7-2f27-4cc0-9c25-cfa2ec27f9f4 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:32:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-SecurityGroupsTestJSON-server-2099014552',display_name='tempest-SecurityGroupsTestJSON-server-2099014552',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-securitygroupstestjson-server-2099014552',id=86,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:32:23Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9f2302f440e044e8be2476ceca24a473',ramdisk_id='',reservation_id='r-by2uio6p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-SecurityGroupsTestJSON-1418534006',owner_user_name='tempest-SecurityGroupsTestJSON-1418534006-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:32:23Z,user_data=None,user_id='002d2f39090c4281bae7f4aba96cef9f',uuid=f4cfcb8f-ae48-41bf-b39c-597a639f3a68,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c5c9974c-d1b4-4d55-b299-5bd39f80dba6", "address": "fa:16:3e:e6:b3:a9", "network": {"id": "ef8c2eaa-a409-4066-a902-4b34ecd6ca56", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1075020687-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f2302f440e044e8be2476ceca24a473", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5c9974c-d1", "ovs_interfaceid": "c5c9974c-d1b4-4d55-b299-5bd39f80dba6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:33:15 compute-0 nova_compute[192810]: 2025-09-30 21:33:15.016 2 DEBUG nova.network.os_vif_util [None req-f0ebc0b7-2f27-4cc0-9c25-cfa2ec27f9f4 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Converting VIF {"id": "c5c9974c-d1b4-4d55-b299-5bd39f80dba6", "address": "fa:16:3e:e6:b3:a9", "network": {"id": "ef8c2eaa-a409-4066-a902-4b34ecd6ca56", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-1075020687-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f2302f440e044e8be2476ceca24a473", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5c9974c-d1", "ovs_interfaceid": "c5c9974c-d1b4-4d55-b299-5bd39f80dba6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:33:15 compute-0 nova_compute[192810]: 2025-09-30 21:33:15.017 2 DEBUG nova.network.os_vif_util [None req-f0ebc0b7-2f27-4cc0-9c25-cfa2ec27f9f4 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e6:b3:a9,bridge_name='br-int',has_traffic_filtering=True,id=c5c9974c-d1b4-4d55-b299-5bd39f80dba6,network=Network(ef8c2eaa-a409-4066-a902-4b34ecd6ca56),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5c9974c-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:33:15 compute-0 nova_compute[192810]: 2025-09-30 21:33:15.017 2 DEBUG os_vif [None req-f0ebc0b7-2f27-4cc0-9c25-cfa2ec27f9f4 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e6:b3:a9,bridge_name='br-int',has_traffic_filtering=True,id=c5c9974c-d1b4-4d55-b299-5bd39f80dba6,network=Network(ef8c2eaa-a409-4066-a902-4b34ecd6ca56),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5c9974c-d1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:33:15 compute-0 nova_compute[192810]: 2025-09-30 21:33:15.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:15 compute-0 nova_compute[192810]: 2025-09-30 21:33:15.019 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc5c9974c-d1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:33:15 compute-0 nova_compute[192810]: 2025-09-30 21:33:15.020 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:15 compute-0 nova_compute[192810]: 2025-09-30 21:33:15.021 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:33:15 compute-0 nova_compute[192810]: 2025-09-30 21:33:15.024 2 INFO os_vif [None req-f0ebc0b7-2f27-4cc0-9c25-cfa2ec27f9f4 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e6:b3:a9,bridge_name='br-int',has_traffic_filtering=True,id=c5c9974c-d1b4-4d55-b299-5bd39f80dba6,network=Network(ef8c2eaa-a409-4066-a902-4b34ecd6ca56),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5c9974c-d1')
Sep 30 21:33:15 compute-0 nova_compute[192810]: 2025-09-30 21:33:15.024 2 INFO nova.virt.libvirt.driver [None req-f0ebc0b7-2f27-4cc0-9c25-cfa2ec27f9f4 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Deleting instance files /var/lib/nova/instances/f4cfcb8f-ae48-41bf-b39c-597a639f3a68_del
Sep 30 21:33:15 compute-0 nova_compute[192810]: 2025-09-30 21:33:15.025 2 INFO nova.virt.libvirt.driver [None req-f0ebc0b7-2f27-4cc0-9c25-cfa2ec27f9f4 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Deletion of /var/lib/nova/instances/f4cfcb8f-ae48-41bf-b39c-597a639f3a68_del complete
Sep 30 21:33:15 compute-0 podman[234148]: 2025-09-30 21:33:15.078493405 +0000 UTC m=+0.056479273 container remove 5971f12b356d56a88372bbf44d5a8a7ff313de934d10ff7f88544bed520a201c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ef8c2eaa-a409-4066-a902-4b34ecd6ca56, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Sep 30 21:33:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:33:15.087 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[fa3cb23f-0687-465f-ab4f-f24714629b37]: (4, ('Tue Sep 30 09:33:14 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ef8c2eaa-a409-4066-a902-4b34ecd6ca56 (5971f12b356d56a88372bbf44d5a8a7ff313de934d10ff7f88544bed520a201c)\n5971f12b356d56a88372bbf44d5a8a7ff313de934d10ff7f88544bed520a201c\nTue Sep 30 09:33:14 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ef8c2eaa-a409-4066-a902-4b34ecd6ca56 (5971f12b356d56a88372bbf44d5a8a7ff313de934d10ff7f88544bed520a201c)\n5971f12b356d56a88372bbf44d5a8a7ff313de934d10ff7f88544bed520a201c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:33:15.090 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[176ec1a8-b147-446d-ac68-c3be82a95989]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:33:15.090 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapef8c2eaa-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:33:15 compute-0 nova_compute[192810]: 2025-09-30 21:33:15.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:15 compute-0 kernel: tapef8c2eaa-a0: left promiscuous mode
Sep 30 21:33:15 compute-0 nova_compute[192810]: 2025-09-30 21:33:15.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:15 compute-0 nova_compute[192810]: 2025-09-30 21:33:15.105 2 INFO nova.compute.manager [None req-f0ebc0b7-2f27-4cc0-9c25-cfa2ec27f9f4 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Took 0.36 seconds to destroy the instance on the hypervisor.
Sep 30 21:33:15 compute-0 nova_compute[192810]: 2025-09-30 21:33:15.105 2 DEBUG oslo.service.loopingcall [None req-f0ebc0b7-2f27-4cc0-9c25-cfa2ec27f9f4 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:33:15 compute-0 nova_compute[192810]: 2025-09-30 21:33:15.106 2 DEBUG nova.compute.manager [-] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:33:15 compute-0 nova_compute[192810]: 2025-09-30 21:33:15.106 2 DEBUG nova.network.neutron [-] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:33:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:33:15.108 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[fee8a9fe-5def-4ed1-9939-b64e09996e8d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:15 compute-0 nova_compute[192810]: 2025-09-30 21:33:15.134 2 DEBUG nova.compute.manager [req-8040ceb3-c48f-4280-8401-133b5c0dfb5a req-e1deb85e-f393-48e0-8b0b-f5d73be70e23 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Received event network-vif-unplugged-c5c9974c-d1b4-4d55-b299-5bd39f80dba6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:33:15 compute-0 nova_compute[192810]: 2025-09-30 21:33:15.134 2 DEBUG oslo_concurrency.lockutils [req-8040ceb3-c48f-4280-8401-133b5c0dfb5a req-e1deb85e-f393-48e0-8b0b-f5d73be70e23 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "f4cfcb8f-ae48-41bf-b39c-597a639f3a68-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:15 compute-0 nova_compute[192810]: 2025-09-30 21:33:15.134 2 DEBUG oslo_concurrency.lockutils [req-8040ceb3-c48f-4280-8401-133b5c0dfb5a req-e1deb85e-f393-48e0-8b0b-f5d73be70e23 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "f4cfcb8f-ae48-41bf-b39c-597a639f3a68-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:15 compute-0 nova_compute[192810]: 2025-09-30 21:33:15.134 2 DEBUG oslo_concurrency.lockutils [req-8040ceb3-c48f-4280-8401-133b5c0dfb5a req-e1deb85e-f393-48e0-8b0b-f5d73be70e23 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "f4cfcb8f-ae48-41bf-b39c-597a639f3a68-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:15 compute-0 nova_compute[192810]: 2025-09-30 21:33:15.134 2 DEBUG nova.compute.manager [req-8040ceb3-c48f-4280-8401-133b5c0dfb5a req-e1deb85e-f393-48e0-8b0b-f5d73be70e23 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] No waiting events found dispatching network-vif-unplugged-c5c9974c-d1b4-4d55-b299-5bd39f80dba6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:33:15 compute-0 nova_compute[192810]: 2025-09-30 21:33:15.134 2 DEBUG nova.compute.manager [req-8040ceb3-c48f-4280-8401-133b5c0dfb5a req-e1deb85e-f393-48e0-8b0b-f5d73be70e23 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Received event network-vif-unplugged-c5c9974c-d1b4-4d55-b299-5bd39f80dba6 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:33:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:33:15.143 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a4c746e4-2b2b-415a-bb4a-baa3cdb3118b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:33:15.144 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[937b17fc-28ee-4aa5-9aba-2eb5262fd3f1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:33:15.159 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[387e5c25-1fd4-47ef-8c1c-c7eed0e0ced5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 463750, 'reachable_time': 36183, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234164, 'error': None, 'target': 'ovnmeta-ef8c2eaa-a409-4066-a902-4b34ecd6ca56', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:33:15.160 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ef8c2eaa-a409-4066-a902-4b34ecd6ca56 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:33:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:33:15.161 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[e8a64a0d-e799-42a0-b884-217b429fa495]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:33:15 compute-0 systemd[1]: run-netns-ovnmeta\x2def8c2eaa\x2da409\x2d4066\x2da902\x2d4b34ecd6ca56.mount: Deactivated successfully.
Sep 30 21:33:15 compute-0 nova_compute[192810]: 2025-09-30 21:33:15.808 2 DEBUG nova.network.neutron [-] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:33:15 compute-0 nova_compute[192810]: 2025-09-30 21:33:15.836 2 INFO nova.compute.manager [-] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Took 0.73 seconds to deallocate network for instance.
Sep 30 21:33:15 compute-0 nova_compute[192810]: 2025-09-30 21:33:15.914 2 DEBUG nova.compute.manager [req-f2d746a7-ccbe-47bc-9452-09eb422d282f req-9a8aeca4-26d0-4ad4-b667-27fdc785a4a3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Received event network-vif-deleted-c5c9974c-d1b4-4d55-b299-5bd39f80dba6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:33:15 compute-0 nova_compute[192810]: 2025-09-30 21:33:15.925 2 DEBUG oslo_concurrency.lockutils [None req-f0ebc0b7-2f27-4cc0-9c25-cfa2ec27f9f4 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:15 compute-0 nova_compute[192810]: 2025-09-30 21:33:15.926 2 DEBUG oslo_concurrency.lockutils [None req-f0ebc0b7-2f27-4cc0-9c25-cfa2ec27f9f4 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:15 compute-0 ovn_controller[94912]: 2025-09-30T21:33:15Z|00342|binding|INFO|Releasing lport 3996e682-c20c-41c5-9547-9688a18f316c from this chassis (sb_readonly=0)
Sep 30 21:33:16 compute-0 nova_compute[192810]: 2025-09-30 21:33:16.016 2 DEBUG nova.compute.provider_tree [None req-f0ebc0b7-2f27-4cc0-9c25-cfa2ec27f9f4 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:33:16 compute-0 nova_compute[192810]: 2025-09-30 21:33:16.026 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:16 compute-0 nova_compute[192810]: 2025-09-30 21:33:16.034 2 DEBUG nova.scheduler.client.report [None req-f0ebc0b7-2f27-4cc0-9c25-cfa2ec27f9f4 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:33:16 compute-0 nova_compute[192810]: 2025-09-30 21:33:16.068 2 DEBUG oslo_concurrency.lockutils [None req-f0ebc0b7-2f27-4cc0-9c25-cfa2ec27f9f4 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:16 compute-0 nova_compute[192810]: 2025-09-30 21:33:16.095 2 INFO nova.scheduler.client.report [None req-f0ebc0b7-2f27-4cc0-9c25-cfa2ec27f9f4 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Deleted allocations for instance f4cfcb8f-ae48-41bf-b39c-597a639f3a68
Sep 30 21:33:16 compute-0 nova_compute[192810]: 2025-09-30 21:33:16.193 2 DEBUG oslo_concurrency.lockutils [None req-f0ebc0b7-2f27-4cc0-9c25-cfa2ec27f9f4 002d2f39090c4281bae7f4aba96cef9f 9f2302f440e044e8be2476ceca24a473 - - default default] Lock "f4cfcb8f-ae48-41bf-b39c-597a639f3a68" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.468s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:17 compute-0 nova_compute[192810]: 2025-09-30 21:33:17.278 2 DEBUG nova.compute.manager [req-2001f54c-d8c7-4bd2-ae92-e09cb5623bca req-194296be-c70d-49a1-9c57-c702eb34bccf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Received event network-vif-plugged-c5c9974c-d1b4-4d55-b299-5bd39f80dba6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:33:17 compute-0 nova_compute[192810]: 2025-09-30 21:33:17.278 2 DEBUG oslo_concurrency.lockutils [req-2001f54c-d8c7-4bd2-ae92-e09cb5623bca req-194296be-c70d-49a1-9c57-c702eb34bccf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "f4cfcb8f-ae48-41bf-b39c-597a639f3a68-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:17 compute-0 nova_compute[192810]: 2025-09-30 21:33:17.278 2 DEBUG oslo_concurrency.lockutils [req-2001f54c-d8c7-4bd2-ae92-e09cb5623bca req-194296be-c70d-49a1-9c57-c702eb34bccf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "f4cfcb8f-ae48-41bf-b39c-597a639f3a68-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:17 compute-0 nova_compute[192810]: 2025-09-30 21:33:17.279 2 DEBUG oslo_concurrency.lockutils [req-2001f54c-d8c7-4bd2-ae92-e09cb5623bca req-194296be-c70d-49a1-9c57-c702eb34bccf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "f4cfcb8f-ae48-41bf-b39c-597a639f3a68-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:17 compute-0 nova_compute[192810]: 2025-09-30 21:33:17.279 2 DEBUG nova.compute.manager [req-2001f54c-d8c7-4bd2-ae92-e09cb5623bca req-194296be-c70d-49a1-9c57-c702eb34bccf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] No waiting events found dispatching network-vif-plugged-c5c9974c-d1b4-4d55-b299-5bd39f80dba6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:33:17 compute-0 nova_compute[192810]: 2025-09-30 21:33:17.279 2 WARNING nova.compute.manager [req-2001f54c-d8c7-4bd2-ae92-e09cb5623bca req-194296be-c70d-49a1-9c57-c702eb34bccf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Received unexpected event network-vif-plugged-c5c9974c-d1b4-4d55-b299-5bd39f80dba6 for instance with vm_state deleted and task_state None.
Sep 30 21:33:17 compute-0 nova_compute[192810]: 2025-09-30 21:33:17.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:20 compute-0 nova_compute[192810]: 2025-09-30 21:33:20.021 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:20 compute-0 ovn_controller[94912]: 2025-09-30T21:33:20Z|00343|binding|INFO|Releasing lport 3996e682-c20c-41c5-9547-9688a18f316c from this chassis (sb_readonly=0)
Sep 30 21:33:20 compute-0 nova_compute[192810]: 2025-09-30 21:33:20.249 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:20 compute-0 podman[234167]: 2025-09-30 21:33:20.34149454 +0000 UTC m=+0.062724026 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 21:33:20 compute-0 podman[234166]: 2025-09-30 21:33:20.367149658 +0000 UTC m=+0.090664509 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Sep 30 21:33:22 compute-0 podman[234212]: 2025-09-30 21:33:22.33468599 +0000 UTC m=+0.068350853 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:33:22 compute-0 nova_compute[192810]: 2025-09-30 21:33:22.545 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:23 compute-0 nova_compute[192810]: 2025-09-30 21:33:23.058 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:23 compute-0 unix_chkpwd[234234]: password check failed for user (root)
Sep 30 21:33:23 compute-0 sshd-session[234232]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=45.81.23.80  user=root
Sep 30 21:33:24 compute-0 sshd-session[234232]: Failed password for root from 45.81.23.80 port 39644 ssh2
Sep 30 21:33:25 compute-0 nova_compute[192810]: 2025-09-30 21:33:25.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:25 compute-0 sshd-session[234232]: Received disconnect from 45.81.23.80 port 39644:11: Bye Bye [preauth]
Sep 30 21:33:25 compute-0 sshd-session[234232]: Disconnected from authenticating user root 45.81.23.80 port 39644 [preauth]
Sep 30 21:33:25 compute-0 nova_compute[192810]: 2025-09-30 21:33:25.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:27 compute-0 nova_compute[192810]: 2025-09-30 21:33:27.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:27 compute-0 nova_compute[192810]: 2025-09-30 21:33:27.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:30 compute-0 nova_compute[192810]: 2025-09-30 21:33:30.002 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759267994.9999874, f4cfcb8f-ae48-41bf-b39c-597a639f3a68 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:33:30 compute-0 nova_compute[192810]: 2025-09-30 21:33:30.003 2 INFO nova.compute.manager [-] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] VM Stopped (Lifecycle Event)
Sep 30 21:33:30 compute-0 nova_compute[192810]: 2025-09-30 21:33:30.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:30 compute-0 nova_compute[192810]: 2025-09-30 21:33:30.028 2 DEBUG nova.compute.manager [None req-d6446528-0dc7-43bf-a632-bd1eba7d15b4 - - - - - -] [instance: f4cfcb8f-ae48-41bf-b39c-597a639f3a68] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:33:32 compute-0 podman[234235]: 2025-09-30 21:33:32.329340303 +0000 UTC m=+0.064539760 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:33:32 compute-0 podman[234236]: 2025-09-30 21:33:32.341635964 +0000 UTC m=+0.072534246 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, distribution-scope=public, name=ubi9-minimal, maintainer=Red Hat, Inc.)
Sep 30 21:33:32 compute-0 nova_compute[192810]: 2025-09-30 21:33:32.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:32 compute-0 ovn_controller[94912]: 2025-09-30T21:33:32Z|00344|binding|INFO|Releasing lport 3996e682-c20c-41c5-9547-9688a18f316c from this chassis (sb_readonly=0)
Sep 30 21:33:32 compute-0 nova_compute[192810]: 2025-09-30 21:33:32.917 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:34 compute-0 nova_compute[192810]: 2025-09-30 21:33:34.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:35 compute-0 nova_compute[192810]: 2025-09-30 21:33:35.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:37 compute-0 nova_compute[192810]: 2025-09-30 21:33:37.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:33:38.739 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:33:38.739 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:33:38.740 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:39 compute-0 podman[234280]: 2025-09-30 21:33:39.33837114 +0000 UTC m=+0.074862713 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:33:39 compute-0 podman[234282]: 2025-09-30 21:33:39.344776516 +0000 UTC m=+0.070264640 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 21:33:39 compute-0 podman[234281]: 2025-09-30 21:33:39.357762094 +0000 UTC m=+0.090436454 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid)
Sep 30 21:33:40 compute-0 nova_compute[192810]: 2025-09-30 21:33:40.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:42 compute-0 nova_compute[192810]: 2025-09-30 21:33:42.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:43 compute-0 nova_compute[192810]: 2025-09-30 21:33:43.178 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:45 compute-0 nova_compute[192810]: 2025-09-30 21:33:45.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:45 compute-0 nova_compute[192810]: 2025-09-30 21:33:45.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:33:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:33:46.115 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:33:46 compute-0 nova_compute[192810]: 2025-09-30 21:33:46.115 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:33:46.116 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:33:47 compute-0 nova_compute[192810]: 2025-09-30 21:33:47.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:47 compute-0 nova_compute[192810]: 2025-09-30 21:33:47.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:48 compute-0 nova_compute[192810]: 2025-09-30 21:33:48.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:33:49 compute-0 nova_compute[192810]: 2025-09-30 21:33:49.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:33:50 compute-0 nova_compute[192810]: 2025-09-30 21:33:50.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:51 compute-0 podman[234347]: 2025-09-30 21:33:51.320643884 +0000 UTC m=+0.052917966 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=edpm)
Sep 30 21:33:51 compute-0 podman[234346]: 2025-09-30 21:33:51.348303151 +0000 UTC m=+0.084599601 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:33:51 compute-0 nova_compute[192810]: 2025-09-30 21:33:51.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:33:51 compute-0 nova_compute[192810]: 2025-09-30 21:33:51.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:33:51 compute-0 nova_compute[192810]: 2025-09-30 21:33:51.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:33:51 compute-0 nova_compute[192810]: 2025-09-30 21:33:51.822 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:51 compute-0 nova_compute[192810]: 2025-09-30 21:33:51.822 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:51 compute-0 nova_compute[192810]: 2025-09-30 21:33:51.822 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:51 compute-0 nova_compute[192810]: 2025-09-30 21:33:51.822 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:33:51 compute-0 nova_compute[192810]: 2025-09-30 21:33:51.874 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f92e449c-90c2-4cba-a8c1-ba6b1c82d770/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:33:51 compute-0 nova_compute[192810]: 2025-09-30 21:33:51.929 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f92e449c-90c2-4cba-a8c1-ba6b1c82d770/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:33:51 compute-0 nova_compute[192810]: 2025-09-30 21:33:51.930 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f92e449c-90c2-4cba-a8c1-ba6b1c82d770/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:33:51 compute-0 nova_compute[192810]: 2025-09-30 21:33:51.990 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f92e449c-90c2-4cba-a8c1-ba6b1c82d770/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:33:52 compute-0 nova_compute[192810]: 2025-09-30 21:33:52.150 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:33:52 compute-0 nova_compute[192810]: 2025-09-30 21:33:52.151 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5573MB free_disk=73.28979110717773GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:33:52 compute-0 nova_compute[192810]: 2025-09-30 21:33:52.151 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:33:52 compute-0 nova_compute[192810]: 2025-09-30 21:33:52.152 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:33:52 compute-0 nova_compute[192810]: 2025-09-30 21:33:52.238 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Instance f92e449c-90c2-4cba-a8c1-ba6b1c82d770 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:33:52 compute-0 nova_compute[192810]: 2025-09-30 21:33:52.239 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:33:52 compute-0 nova_compute[192810]: 2025-09-30 21:33:52.239 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:33:52 compute-0 nova_compute[192810]: 2025-09-30 21:33:52.288 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:33:52 compute-0 nova_compute[192810]: 2025-09-30 21:33:52.302 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:33:52 compute-0 nova_compute[192810]: 2025-09-30 21:33:52.324 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:33:52 compute-0 nova_compute[192810]: 2025-09-30 21:33:52.325 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:33:52 compute-0 nova_compute[192810]: 2025-09-30 21:33:52.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:53 compute-0 podman[234397]: 2025-09-30 21:33:53.311546787 +0000 UTC m=+0.047126514 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923)
Sep 30 21:33:54 compute-0 nova_compute[192810]: 2025-09-30 21:33:54.320 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:33:54 compute-0 nova_compute[192810]: 2025-09-30 21:33:54.320 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:33:54 compute-0 nova_compute[192810]: 2025-09-30 21:33:54.320 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:33:54 compute-0 nova_compute[192810]: 2025-09-30 21:33:54.320 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:33:54 compute-0 nova_compute[192810]: 2025-09-30 21:33:54.859 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "refresh_cache-f92e449c-90c2-4cba-a8c1-ba6b1c82d770" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:33:54 compute-0 nova_compute[192810]: 2025-09-30 21:33:54.859 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquired lock "refresh_cache-f92e449c-90c2-4cba-a8c1-ba6b1c82d770" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:33:54 compute-0 nova_compute[192810]: 2025-09-30 21:33:54.859 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Sep 30 21:33:54 compute-0 nova_compute[192810]: 2025-09-30 21:33:54.860 2 DEBUG nova.objects.instance [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lazy-loading 'info_cache' on Instance uuid f92e449c-90c2-4cba-a8c1-ba6b1c82d770 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:33:55 compute-0 nova_compute[192810]: 2025-09-30 21:33:55.041 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:33:56.118 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:33:57 compute-0 nova_compute[192810]: 2025-09-30 21:33:57.195 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Updating instance_info_cache with network_info: [{"id": "e2cc5531-350c-4845-bde9-9797e9628421", "address": "fa:16:3e:3f:a2:bc", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape2cc5531-35", "ovs_interfaceid": "e2cc5531-350c-4845-bde9-9797e9628421", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:33:57 compute-0 nova_compute[192810]: 2025-09-30 21:33:57.226 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Releasing lock "refresh_cache-f92e449c-90c2-4cba-a8c1-ba6b1c82d770" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:33:57 compute-0 nova_compute[192810]: 2025-09-30 21:33:57.227 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Sep 30 21:33:57 compute-0 nova_compute[192810]: 2025-09-30 21:33:57.227 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:33:57 compute-0 nova_compute[192810]: 2025-09-30 21:33:57.228 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:33:57 compute-0 nova_compute[192810]: 2025-09-30 21:33:57.556 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:33:57 compute-0 nova_compute[192810]: 2025-09-30 21:33:57.690 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:34:00 compute-0 nova_compute[192810]: 2025-09-30 21:34:00.045 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:02 compute-0 nova_compute[192810]: 2025-09-30 21:34:02.557 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:03 compute-0 podman[234419]: 2025-09-30 21:34:03.32175923 +0000 UTC m=+0.061244779 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, managed_by=edpm_ansible, name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_id=edpm, io.openshift.expose-services=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Sep 30 21:34:03 compute-0 podman[234418]: 2025-09-30 21:34:03.324446266 +0000 UTC m=+0.064066309 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Sep 30 21:34:05 compute-0 nova_compute[192810]: 2025-09-30 21:34:05.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:05 compute-0 nova_compute[192810]: 2025-09-30 21:34:05.663 2 DEBUG oslo_concurrency.lockutils [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Acquiring lock "4b2e5fac-6b68-407e-a326-34ec3b940ed7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:05 compute-0 nova_compute[192810]: 2025-09-30 21:34:05.663 2 DEBUG oslo_concurrency.lockutils [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "4b2e5fac-6b68-407e-a326-34ec3b940ed7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:05 compute-0 nova_compute[192810]: 2025-09-30 21:34:05.695 2 DEBUG nova.compute.manager [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:34:05 compute-0 nova_compute[192810]: 2025-09-30 21:34:05.792 2 DEBUG oslo_concurrency.lockutils [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:05 compute-0 nova_compute[192810]: 2025-09-30 21:34:05.793 2 DEBUG oslo_concurrency.lockutils [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:05 compute-0 nova_compute[192810]: 2025-09-30 21:34:05.799 2 DEBUG nova.virt.hardware [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:34:05 compute-0 nova_compute[192810]: 2025-09-30 21:34:05.799 2 INFO nova.compute.claims [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:34:05 compute-0 nova_compute[192810]: 2025-09-30 21:34:05.935 2 DEBUG nova.compute.provider_tree [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:34:05 compute-0 nova_compute[192810]: 2025-09-30 21:34:05.951 2 DEBUG nova.scheduler.client.report [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:34:05 compute-0 nova_compute[192810]: 2025-09-30 21:34:05.978 2 DEBUG oslo_concurrency.lockutils [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.185s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:05 compute-0 nova_compute[192810]: 2025-09-30 21:34:05.978 2 DEBUG nova.compute.manager [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:34:06 compute-0 nova_compute[192810]: 2025-09-30 21:34:06.227 2 DEBUG nova.compute.manager [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:34:06 compute-0 nova_compute[192810]: 2025-09-30 21:34:06.227 2 DEBUG nova.network.neutron [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:34:06 compute-0 nova_compute[192810]: 2025-09-30 21:34:06.256 2 INFO nova.virt.libvirt.driver [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:34:06 compute-0 nova_compute[192810]: 2025-09-30 21:34:06.272 2 DEBUG nova.compute.manager [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:34:06 compute-0 nova_compute[192810]: 2025-09-30 21:34:06.778 2 DEBUG nova.compute.manager [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:34:06 compute-0 nova_compute[192810]: 2025-09-30 21:34:06.780 2 DEBUG nova.virt.libvirt.driver [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:34:06 compute-0 nova_compute[192810]: 2025-09-30 21:34:06.780 2 INFO nova.virt.libvirt.driver [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Creating image(s)
Sep 30 21:34:06 compute-0 nova_compute[192810]: 2025-09-30 21:34:06.781 2 DEBUG oslo_concurrency.lockutils [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Acquiring lock "/var/lib/nova/instances/4b2e5fac-6b68-407e-a326-34ec3b940ed7/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:06 compute-0 nova_compute[192810]: 2025-09-30 21:34:06.781 2 DEBUG oslo_concurrency.lockutils [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "/var/lib/nova/instances/4b2e5fac-6b68-407e-a326-34ec3b940ed7/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:06 compute-0 nova_compute[192810]: 2025-09-30 21:34:06.782 2 DEBUG oslo_concurrency.lockutils [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "/var/lib/nova/instances/4b2e5fac-6b68-407e-a326-34ec3b940ed7/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:06 compute-0 nova_compute[192810]: 2025-09-30 21:34:06.794 2 DEBUG oslo_concurrency.processutils [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:34:06 compute-0 nova_compute[192810]: 2025-09-30 21:34:06.877 2 DEBUG oslo_concurrency.processutils [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:34:06 compute-0 nova_compute[192810]: 2025-09-30 21:34:06.878 2 DEBUG oslo_concurrency.lockutils [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:06 compute-0 nova_compute[192810]: 2025-09-30 21:34:06.878 2 DEBUG oslo_concurrency.lockutils [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:06 compute-0 nova_compute[192810]: 2025-09-30 21:34:06.889 2 DEBUG oslo_concurrency.processutils [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:34:06 compute-0 nova_compute[192810]: 2025-09-30 21:34:06.910 2 DEBUG nova.policy [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c27a02706b9d43ffaab2c5fa833fec04', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b33d27d5088343569f4459643d0da580', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:34:06 compute-0 nova_compute[192810]: 2025-09-30 21:34:06.961 2 DEBUG oslo_concurrency.processutils [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:34:06 compute-0 nova_compute[192810]: 2025-09-30 21:34:06.962 2 DEBUG oslo_concurrency.processutils [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/4b2e5fac-6b68-407e-a326-34ec3b940ed7/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:34:06 compute-0 nova_compute[192810]: 2025-09-30 21:34:06.996 2 DEBUG oslo_concurrency.processutils [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/4b2e5fac-6b68-407e-a326-34ec3b940ed7/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:34:06 compute-0 nova_compute[192810]: 2025-09-30 21:34:06.997 2 DEBUG oslo_concurrency.lockutils [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:06 compute-0 nova_compute[192810]: 2025-09-30 21:34:06.998 2 DEBUG oslo_concurrency.processutils [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:34:07 compute-0 nova_compute[192810]: 2025-09-30 21:34:07.060 2 DEBUG oslo_concurrency.processutils [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:34:07 compute-0 nova_compute[192810]: 2025-09-30 21:34:07.061 2 DEBUG nova.virt.disk.api [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Checking if we can resize image /var/lib/nova/instances/4b2e5fac-6b68-407e-a326-34ec3b940ed7/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:34:07 compute-0 nova_compute[192810]: 2025-09-30 21:34:07.061 2 DEBUG oslo_concurrency.processutils [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4b2e5fac-6b68-407e-a326-34ec3b940ed7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:34:07 compute-0 nova_compute[192810]: 2025-09-30 21:34:07.112 2 DEBUG oslo_concurrency.processutils [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4b2e5fac-6b68-407e-a326-34ec3b940ed7/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:34:07 compute-0 nova_compute[192810]: 2025-09-30 21:34:07.113 2 DEBUG nova.virt.disk.api [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Cannot resize image /var/lib/nova/instances/4b2e5fac-6b68-407e-a326-34ec3b940ed7/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:34:07 compute-0 nova_compute[192810]: 2025-09-30 21:34:07.113 2 DEBUG nova.objects.instance [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lazy-loading 'migration_context' on Instance uuid 4b2e5fac-6b68-407e-a326-34ec3b940ed7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:34:07 compute-0 nova_compute[192810]: 2025-09-30 21:34:07.130 2 DEBUG nova.virt.libvirt.driver [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:34:07 compute-0 nova_compute[192810]: 2025-09-30 21:34:07.131 2 DEBUG nova.virt.libvirt.driver [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Ensure instance console log exists: /var/lib/nova/instances/4b2e5fac-6b68-407e-a326-34ec3b940ed7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:34:07 compute-0 nova_compute[192810]: 2025-09-30 21:34:07.131 2 DEBUG oslo_concurrency.lockutils [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:07 compute-0 nova_compute[192810]: 2025-09-30 21:34:07.132 2 DEBUG oslo_concurrency.lockutils [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:07 compute-0 nova_compute[192810]: 2025-09-30 21:34:07.132 2 DEBUG oslo_concurrency.lockutils [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:07 compute-0 nova_compute[192810]: 2025-09-30 21:34:07.559 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:07 compute-0 nova_compute[192810]: 2025-09-30 21:34:07.975 2 DEBUG nova.network.neutron [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Successfully created port: df0efe7c-eb87-4cc3-ac77-a663d146fa66 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:34:09 compute-0 nova_compute[192810]: 2025-09-30 21:34:09.204 2 DEBUG nova.network.neutron [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Successfully updated port: df0efe7c-eb87-4cc3-ac77-a663d146fa66 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:34:09 compute-0 nova_compute[192810]: 2025-09-30 21:34:09.231 2 DEBUG oslo_concurrency.lockutils [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Acquiring lock "refresh_cache-4b2e5fac-6b68-407e-a326-34ec3b940ed7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:34:09 compute-0 nova_compute[192810]: 2025-09-30 21:34:09.232 2 DEBUG oslo_concurrency.lockutils [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Acquired lock "refresh_cache-4b2e5fac-6b68-407e-a326-34ec3b940ed7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:34:09 compute-0 nova_compute[192810]: 2025-09-30 21:34:09.232 2 DEBUG nova.network.neutron [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:34:09 compute-0 nova_compute[192810]: 2025-09-30 21:34:09.372 2 DEBUG nova.compute.manager [req-50430622-d552-402f-8c59-1a3b8d176f8a req-b029192e-3dad-41e4-a7ab-b55065c043c6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Received event network-changed-df0efe7c-eb87-4cc3-ac77-a663d146fa66 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:34:09 compute-0 nova_compute[192810]: 2025-09-30 21:34:09.373 2 DEBUG nova.compute.manager [req-50430622-d552-402f-8c59-1a3b8d176f8a req-b029192e-3dad-41e4-a7ab-b55065c043c6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Refreshing instance network info cache due to event network-changed-df0efe7c-eb87-4cc3-ac77-a663d146fa66. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:34:09 compute-0 nova_compute[192810]: 2025-09-30 21:34:09.373 2 DEBUG oslo_concurrency.lockutils [req-50430622-d552-402f-8c59-1a3b8d176f8a req-b029192e-3dad-41e4-a7ab-b55065c043c6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-4b2e5fac-6b68-407e-a326-34ec3b940ed7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:34:09 compute-0 nova_compute[192810]: 2025-09-30 21:34:09.481 2 DEBUG nova.network.neutron [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:34:10 compute-0 nova_compute[192810]: 2025-09-30 21:34:10.055 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:10 compute-0 podman[234481]: 2025-09-30 21:34:10.327723434 +0000 UTC m=+0.054676259 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 21:34:10 compute-0 podman[234479]: 2025-09-30 21:34:10.345422487 +0000 UTC m=+0.079357783 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 21:34:10 compute-0 podman[234480]: 2025-09-30 21:34:10.347340994 +0000 UTC m=+0.074626697 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:34:10 compute-0 nova_compute[192810]: 2025-09-30 21:34:10.905 2 DEBUG nova.network.neutron [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Updating instance_info_cache with network_info: [{"id": "df0efe7c-eb87-4cc3-ac77-a663d146fa66", "address": "fa:16:3e:ec:55:eb", "network": {"id": "05b270a8-0653-4995-ab43-826254451140", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1761374743-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b33d27d5088343569f4459643d0da580", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf0efe7c-eb", "ovs_interfaceid": "df0efe7c-eb87-4cc3-ac77-a663d146fa66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:34:10 compute-0 nova_compute[192810]: 2025-09-30 21:34:10.939 2 DEBUG oslo_concurrency.lockutils [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Releasing lock "refresh_cache-4b2e5fac-6b68-407e-a326-34ec3b940ed7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:34:10 compute-0 nova_compute[192810]: 2025-09-30 21:34:10.939 2 DEBUG nova.compute.manager [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Instance network_info: |[{"id": "df0efe7c-eb87-4cc3-ac77-a663d146fa66", "address": "fa:16:3e:ec:55:eb", "network": {"id": "05b270a8-0653-4995-ab43-826254451140", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1761374743-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b33d27d5088343569f4459643d0da580", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf0efe7c-eb", "ovs_interfaceid": "df0efe7c-eb87-4cc3-ac77-a663d146fa66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:34:10 compute-0 nova_compute[192810]: 2025-09-30 21:34:10.939 2 DEBUG oslo_concurrency.lockutils [req-50430622-d552-402f-8c59-1a3b8d176f8a req-b029192e-3dad-41e4-a7ab-b55065c043c6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-4b2e5fac-6b68-407e-a326-34ec3b940ed7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:34:10 compute-0 nova_compute[192810]: 2025-09-30 21:34:10.940 2 DEBUG nova.network.neutron [req-50430622-d552-402f-8c59-1a3b8d176f8a req-b029192e-3dad-41e4-a7ab-b55065c043c6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Refreshing network info cache for port df0efe7c-eb87-4cc3-ac77-a663d146fa66 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:34:10 compute-0 nova_compute[192810]: 2025-09-30 21:34:10.942 2 DEBUG nova.virt.libvirt.driver [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Start _get_guest_xml network_info=[{"id": "df0efe7c-eb87-4cc3-ac77-a663d146fa66", "address": "fa:16:3e:ec:55:eb", "network": {"id": "05b270a8-0653-4995-ab43-826254451140", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1761374743-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b33d27d5088343569f4459643d0da580", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf0efe7c-eb", "ovs_interfaceid": "df0efe7c-eb87-4cc3-ac77-a663d146fa66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:34:10 compute-0 nova_compute[192810]: 2025-09-30 21:34:10.946 2 WARNING nova.virt.libvirt.driver [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:34:10 compute-0 nova_compute[192810]: 2025-09-30 21:34:10.950 2 DEBUG nova.virt.libvirt.host [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:34:10 compute-0 nova_compute[192810]: 2025-09-30 21:34:10.950 2 DEBUG nova.virt.libvirt.host [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:34:10 compute-0 nova_compute[192810]: 2025-09-30 21:34:10.953 2 DEBUG nova.virt.libvirt.host [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:34:10 compute-0 nova_compute[192810]: 2025-09-30 21:34:10.953 2 DEBUG nova.virt.libvirt.host [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:34:10 compute-0 nova_compute[192810]: 2025-09-30 21:34:10.954 2 DEBUG nova.virt.libvirt.driver [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:34:10 compute-0 nova_compute[192810]: 2025-09-30 21:34:10.954 2 DEBUG nova.virt.hardware [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:34:10 compute-0 nova_compute[192810]: 2025-09-30 21:34:10.955 2 DEBUG nova.virt.hardware [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:34:10 compute-0 nova_compute[192810]: 2025-09-30 21:34:10.955 2 DEBUG nova.virt.hardware [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:34:10 compute-0 nova_compute[192810]: 2025-09-30 21:34:10.955 2 DEBUG nova.virt.hardware [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:34:10 compute-0 nova_compute[192810]: 2025-09-30 21:34:10.955 2 DEBUG nova.virt.hardware [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:34:10 compute-0 nova_compute[192810]: 2025-09-30 21:34:10.955 2 DEBUG nova.virt.hardware [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:34:10 compute-0 nova_compute[192810]: 2025-09-30 21:34:10.956 2 DEBUG nova.virt.hardware [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:34:10 compute-0 nova_compute[192810]: 2025-09-30 21:34:10.956 2 DEBUG nova.virt.hardware [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:34:10 compute-0 nova_compute[192810]: 2025-09-30 21:34:10.956 2 DEBUG nova.virt.hardware [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:34:10 compute-0 nova_compute[192810]: 2025-09-30 21:34:10.956 2 DEBUG nova.virt.hardware [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:34:10 compute-0 nova_compute[192810]: 2025-09-30 21:34:10.957 2 DEBUG nova.virt.hardware [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:34:10 compute-0 nova_compute[192810]: 2025-09-30 21:34:10.960 2 DEBUG nova.virt.libvirt.vif [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:34:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-277035110',display_name='tempest-ServerActionsTestOtherA-server-277035110',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-277035110',id=96,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b33d27d5088343569f4459643d0da580',ramdisk_id='',reservation_id='r-07uljxyt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-553860193',owner_user_name='tempest-ServerActionsTestOtherA
-553860193-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:34:06Z,user_data=None,user_id='c27a02706b9d43ffaab2c5fa833fec04',uuid=4b2e5fac-6b68-407e-a326-34ec3b940ed7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "df0efe7c-eb87-4cc3-ac77-a663d146fa66", "address": "fa:16:3e:ec:55:eb", "network": {"id": "05b270a8-0653-4995-ab43-826254451140", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1761374743-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b33d27d5088343569f4459643d0da580", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf0efe7c-eb", "ovs_interfaceid": "df0efe7c-eb87-4cc3-ac77-a663d146fa66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:34:10 compute-0 nova_compute[192810]: 2025-09-30 21:34:10.960 2 DEBUG nova.network.os_vif_util [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Converting VIF {"id": "df0efe7c-eb87-4cc3-ac77-a663d146fa66", "address": "fa:16:3e:ec:55:eb", "network": {"id": "05b270a8-0653-4995-ab43-826254451140", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1761374743-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b33d27d5088343569f4459643d0da580", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf0efe7c-eb", "ovs_interfaceid": "df0efe7c-eb87-4cc3-ac77-a663d146fa66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:34:10 compute-0 nova_compute[192810]: 2025-09-30 21:34:10.961 2 DEBUG nova.network.os_vif_util [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ec:55:eb,bridge_name='br-int',has_traffic_filtering=True,id=df0efe7c-eb87-4cc3-ac77-a663d146fa66,network=Network(05b270a8-0653-4995-ab43-826254451140),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf0efe7c-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:34:10 compute-0 nova_compute[192810]: 2025-09-30 21:34:10.962 2 DEBUG nova.objects.instance [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4b2e5fac-6b68-407e-a326-34ec3b940ed7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:34:10 compute-0 nova_compute[192810]: 2025-09-30 21:34:10.984 2 DEBUG nova.virt.libvirt.driver [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:34:10 compute-0 nova_compute[192810]:   <uuid>4b2e5fac-6b68-407e-a326-34ec3b940ed7</uuid>
Sep 30 21:34:10 compute-0 nova_compute[192810]:   <name>instance-00000060</name>
Sep 30 21:34:10 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:34:10 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:34:10 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:34:10 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:       <nova:name>tempest-ServerActionsTestOtherA-server-277035110</nova:name>
Sep 30 21:34:10 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:34:10</nova:creationTime>
Sep 30 21:34:10 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:34:10 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:34:10 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:34:10 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:34:10 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:34:10 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:34:10 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:34:10 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:34:10 compute-0 nova_compute[192810]:         <nova:user uuid="c27a02706b9d43ffaab2c5fa833fec04">tempest-ServerActionsTestOtherA-553860193-project-member</nova:user>
Sep 30 21:34:10 compute-0 nova_compute[192810]:         <nova:project uuid="b33d27d5088343569f4459643d0da580">tempest-ServerActionsTestOtherA-553860193</nova:project>
Sep 30 21:34:10 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:34:10 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:34:10 compute-0 nova_compute[192810]:         <nova:port uuid="df0efe7c-eb87-4cc3-ac77-a663d146fa66">
Sep 30 21:34:10 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:34:10 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:34:10 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:34:10 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:34:10 compute-0 nova_compute[192810]:     <system>
Sep 30 21:34:10 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:34:10 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:34:10 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:34:10 compute-0 nova_compute[192810]:       <entry name="serial">4b2e5fac-6b68-407e-a326-34ec3b940ed7</entry>
Sep 30 21:34:10 compute-0 nova_compute[192810]:       <entry name="uuid">4b2e5fac-6b68-407e-a326-34ec3b940ed7</entry>
Sep 30 21:34:10 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     </system>
Sep 30 21:34:10 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:34:10 compute-0 nova_compute[192810]:   <os>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:   </os>
Sep 30 21:34:10 compute-0 nova_compute[192810]:   <features>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:   </features>
Sep 30 21:34:10 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:34:10 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:34:10 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:34:10 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:34:10 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:34:10 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/4b2e5fac-6b68-407e-a326-34ec3b940ed7/disk"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:34:10 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/4b2e5fac-6b68-407e-a326-34ec3b940ed7/disk.config"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:34:10 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:ec:55:eb"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:       <target dev="tapdf0efe7c-eb"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:34:10 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/4b2e5fac-6b68-407e-a326-34ec3b940ed7/console.log" append="off"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     <video>
Sep 30 21:34:10 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     </video>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:34:10 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:34:10 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:34:10 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:34:10 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:34:10 compute-0 nova_compute[192810]: </domain>
Sep 30 21:34:10 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:34:10 compute-0 nova_compute[192810]: 2025-09-30 21:34:10.985 2 DEBUG nova.compute.manager [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Preparing to wait for external event network-vif-plugged-df0efe7c-eb87-4cc3-ac77-a663d146fa66 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:34:10 compute-0 nova_compute[192810]: 2025-09-30 21:34:10.986 2 DEBUG oslo_concurrency.lockutils [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Acquiring lock "4b2e5fac-6b68-407e-a326-34ec3b940ed7-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:10 compute-0 nova_compute[192810]: 2025-09-30 21:34:10.986 2 DEBUG oslo_concurrency.lockutils [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "4b2e5fac-6b68-407e-a326-34ec3b940ed7-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:10 compute-0 nova_compute[192810]: 2025-09-30 21:34:10.986 2 DEBUG oslo_concurrency.lockutils [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "4b2e5fac-6b68-407e-a326-34ec3b940ed7-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:10 compute-0 nova_compute[192810]: 2025-09-30 21:34:10.987 2 DEBUG nova.virt.libvirt.vif [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:34:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-277035110',display_name='tempest-ServerActionsTestOtherA-server-277035110',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-277035110',id=96,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b33d27d5088343569f4459643d0da580',ramdisk_id='',reservation_id='r-07uljxyt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-553860193',owner_user_name='tempest-ServerActions
TestOtherA-553860193-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:34:06Z,user_data=None,user_id='c27a02706b9d43ffaab2c5fa833fec04',uuid=4b2e5fac-6b68-407e-a326-34ec3b940ed7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "df0efe7c-eb87-4cc3-ac77-a663d146fa66", "address": "fa:16:3e:ec:55:eb", "network": {"id": "05b270a8-0653-4995-ab43-826254451140", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1761374743-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b33d27d5088343569f4459643d0da580", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf0efe7c-eb", "ovs_interfaceid": "df0efe7c-eb87-4cc3-ac77-a663d146fa66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:34:10 compute-0 nova_compute[192810]: 2025-09-30 21:34:10.987 2 DEBUG nova.network.os_vif_util [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Converting VIF {"id": "df0efe7c-eb87-4cc3-ac77-a663d146fa66", "address": "fa:16:3e:ec:55:eb", "network": {"id": "05b270a8-0653-4995-ab43-826254451140", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1761374743-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b33d27d5088343569f4459643d0da580", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf0efe7c-eb", "ovs_interfaceid": "df0efe7c-eb87-4cc3-ac77-a663d146fa66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:34:10 compute-0 nova_compute[192810]: 2025-09-30 21:34:10.988 2 DEBUG nova.network.os_vif_util [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ec:55:eb,bridge_name='br-int',has_traffic_filtering=True,id=df0efe7c-eb87-4cc3-ac77-a663d146fa66,network=Network(05b270a8-0653-4995-ab43-826254451140),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf0efe7c-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:34:10 compute-0 nova_compute[192810]: 2025-09-30 21:34:10.989 2 DEBUG os_vif [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ec:55:eb,bridge_name='br-int',has_traffic_filtering=True,id=df0efe7c-eb87-4cc3-ac77-a663d146fa66,network=Network(05b270a8-0653-4995-ab43-826254451140),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf0efe7c-eb') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:34:10 compute-0 nova_compute[192810]: 2025-09-30 21:34:10.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:10 compute-0 nova_compute[192810]: 2025-09-30 21:34:10.989 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:34:10 compute-0 nova_compute[192810]: 2025-09-30 21:34:10.990 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:34:10 compute-0 nova_compute[192810]: 2025-09-30 21:34:10.993 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:10 compute-0 nova_compute[192810]: 2025-09-30 21:34:10.993 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdf0efe7c-eb, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:34:10 compute-0 nova_compute[192810]: 2025-09-30 21:34:10.993 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdf0efe7c-eb, col_values=(('external_ids', {'iface-id': 'df0efe7c-eb87-4cc3-ac77-a663d146fa66', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ec:55:eb', 'vm-uuid': '4b2e5fac-6b68-407e-a326-34ec3b940ed7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:34:10 compute-0 nova_compute[192810]: 2025-09-30 21:34:10.995 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:10 compute-0 NetworkManager[51733]: <info>  [1759268050.9967] manager: (tapdf0efe7c-eb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/164)
Sep 30 21:34:10 compute-0 nova_compute[192810]: 2025-09-30 21:34:10.997 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:34:11 compute-0 nova_compute[192810]: 2025-09-30 21:34:11.003 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:11 compute-0 nova_compute[192810]: 2025-09-30 21:34:11.004 2 INFO os_vif [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ec:55:eb,bridge_name='br-int',has_traffic_filtering=True,id=df0efe7c-eb87-4cc3-ac77-a663d146fa66,network=Network(05b270a8-0653-4995-ab43-826254451140),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf0efe7c-eb')
Sep 30 21:34:11 compute-0 nova_compute[192810]: 2025-09-30 21:34:11.103 2 DEBUG nova.virt.libvirt.driver [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:34:11 compute-0 nova_compute[192810]: 2025-09-30 21:34:11.104 2 DEBUG nova.virt.libvirt.driver [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:34:11 compute-0 nova_compute[192810]: 2025-09-30 21:34:11.104 2 DEBUG nova.virt.libvirt.driver [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] No VIF found with MAC fa:16:3e:ec:55:eb, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:34:11 compute-0 nova_compute[192810]: 2025-09-30 21:34:11.104 2 INFO nova.virt.libvirt.driver [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Using config drive
Sep 30 21:34:11 compute-0 nova_compute[192810]: 2025-09-30 21:34:11.817 2 INFO nova.virt.libvirt.driver [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Creating config drive at /var/lib/nova/instances/4b2e5fac-6b68-407e-a326-34ec3b940ed7/disk.config
Sep 30 21:34:11 compute-0 nova_compute[192810]: 2025-09-30 21:34:11.822 2 DEBUG oslo_concurrency.processutils [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4b2e5fac-6b68-407e-a326-34ec3b940ed7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqoxlonc1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:34:11 compute-0 nova_compute[192810]: 2025-09-30 21:34:11.946 2 DEBUG oslo_concurrency.processutils [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4b2e5fac-6b68-407e-a326-34ec3b940ed7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqoxlonc1" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:34:11 compute-0 kernel: tapdf0efe7c-eb: entered promiscuous mode
Sep 30 21:34:12 compute-0 NetworkManager[51733]: <info>  [1759268052.0004] manager: (tapdf0efe7c-eb): new Tun device (/org/freedesktop/NetworkManager/Devices/165)
Sep 30 21:34:12 compute-0 ovn_controller[94912]: 2025-09-30T21:34:12Z|00345|binding|INFO|Claiming lport df0efe7c-eb87-4cc3-ac77-a663d146fa66 for this chassis.
Sep 30 21:34:12 compute-0 ovn_controller[94912]: 2025-09-30T21:34:12Z|00346|binding|INFO|df0efe7c-eb87-4cc3-ac77-a663d146fa66: Claiming fa:16:3e:ec:55:eb 10.100.0.7
Sep 30 21:34:12 compute-0 nova_compute[192810]: 2025-09-30 21:34:12.001 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:12.007 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ec:55:eb 10.100.0.7'], port_security=['fa:16:3e:ec:55:eb 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '4b2e5fac-6b68-407e-a326-34ec3b940ed7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-05b270a8-0653-4995-ab43-826254451140', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b33d27d5088343569f4459643d0da580', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e317a017-785e-4ba5-91f5-e79d51ecd764', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=babcc0e7-e8b1-4d4c-a7eb-2970ba0dde6e, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=df0efe7c-eb87-4cc3-ac77-a663d146fa66) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:12.008 103867 INFO neutron.agent.ovn.metadata.agent [-] Port df0efe7c-eb87-4cc3-ac77-a663d146fa66 in datapath 05b270a8-0653-4995-ab43-826254451140 bound to our chassis
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:12.010 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 05b270a8-0653-4995-ab43-826254451140
Sep 30 21:34:12 compute-0 ovn_controller[94912]: 2025-09-30T21:34:12Z|00347|binding|INFO|Setting lport df0efe7c-eb87-4cc3-ac77-a663d146fa66 ovn-installed in OVS
Sep 30 21:34:12 compute-0 ovn_controller[94912]: 2025-09-30T21:34:12Z|00348|binding|INFO|Setting lport df0efe7c-eb87-4cc3-ac77-a663d146fa66 up in Southbound
Sep 30 21:34:12 compute-0 nova_compute[192810]: 2025-09-30 21:34:12.017 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:12.020 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[c541cddc-23b4-4697-b0d6-e460effd7221]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:12.021 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap05b270a8-01 in ovnmeta-05b270a8-0653-4995-ab43-826254451140 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:34:12 compute-0 nova_compute[192810]: 2025-09-30 21:34:12.020 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:12.022 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap05b270a8-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:12.022 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[f2c6dceb-9d48-490b-b6fe-b9a246a09bb7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:12.023 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a745a12f-a168-46e6-8100-cb1f1712acbe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:12.033 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[f0b731ae-8f4f-4016-abce-d3c63efb1cc7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:12 compute-0 systemd-udevd[234564]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:34:12 compute-0 systemd-machined[152794]: New machine qemu-43-instance-00000060.
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:12.055 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[923e42e1-e90b-4a68-90f0-a5af1ec1d433]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:12 compute-0 NetworkManager[51733]: <info>  [1759268052.0595] device (tapdf0efe7c-eb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:34:12 compute-0 NetworkManager[51733]: <info>  [1759268052.0605] device (tapdf0efe7c-eb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:34:12 compute-0 systemd[1]: Started Virtual Machine qemu-43-instance-00000060.
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:12.083 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[fbfc81d8-c1a5-4f1a-aa4a-2a28df71b768]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:12 compute-0 NetworkManager[51733]: <info>  [1759268052.0891] manager: (tap05b270a8-00): new Veth device (/org/freedesktop/NetworkManager/Devices/166)
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:12.088 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[90f6cdcd-fd8c-467e-b8ef-aae0639d37d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:12.117 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[8478a0c4-8483-48b9-a4cf-c1b71dc56dfd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:12.120 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[07a171e0-19b3-4498-9af8-3d0037aee214]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:12 compute-0 NetworkManager[51733]: <info>  [1759268052.1430] device (tap05b270a8-00): carrier: link connected
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:12.146 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[c43db08e-9d34-402f-a0c3-e4938ac2995c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:12.161 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a41cd860-4242-45f5-8510-82317951f543]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap05b270a8-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:88:8b:80'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 104], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 474776, 'reachable_time': 17251, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234594, 'error': None, 'target': 'ovnmeta-05b270a8-0653-4995-ab43-826254451140', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:12.175 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[b1a66586-a611-40a8-95cc-b27bed9641d3]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe88:8b80'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 474776, 'tstamp': 474776}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234595, 'error': None, 'target': 'ovnmeta-05b270a8-0653-4995-ab43-826254451140', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:12.190 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[0f97b00f-f703-4eff-878a-777f3f6de206]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap05b270a8-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:88:8b:80'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 104], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 474776, 'reachable_time': 17251, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234596, 'error': None, 'target': 'ovnmeta-05b270a8-0653-4995-ab43-826254451140', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:12.217 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[c4965d72-8188-4701-a7b5-4b976cd28f09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:12.271 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[2b465ca2-03ed-4047-aec4-7316d0be067f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:12.273 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap05b270a8-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:12.273 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:12.274 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap05b270a8-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:34:12 compute-0 nova_compute[192810]: 2025-09-30 21:34:12.325 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:12 compute-0 kernel: tap05b270a8-00: entered promiscuous mode
Sep 30 21:34:12 compute-0 NetworkManager[51733]: <info>  [1759268052.3357] manager: (tap05b270a8-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/167)
Sep 30 21:34:12 compute-0 nova_compute[192810]: 2025-09-30 21:34:12.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:12.339 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap05b270a8-00, col_values=(('external_ids', {'iface-id': '28bea95f-2c3a-4d33-8bad-3373e1efde4f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:34:12 compute-0 nova_compute[192810]: 2025-09-30 21:34:12.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:12 compute-0 ovn_controller[94912]: 2025-09-30T21:34:12Z|00349|binding|INFO|Releasing lport 28bea95f-2c3a-4d33-8bad-3373e1efde4f from this chassis (sb_readonly=0)
Sep 30 21:34:12 compute-0 nova_compute[192810]: 2025-09-30 21:34:12.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:12.343 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/05b270a8-0653-4995-ab43-826254451140.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/05b270a8-0653-4995-ab43-826254451140.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:12.344 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[09f11f7f-f291-4fa3-b44f-29e7c37a72b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:12.344 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-05b270a8-0653-4995-ab43-826254451140
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/05b270a8-0653-4995-ab43-826254451140.pid.haproxy
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID 05b270a8-0653-4995-ab43-826254451140
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:34:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:12.345 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-05b270a8-0653-4995-ab43-826254451140', 'env', 'PROCESS_TAG=haproxy-05b270a8-0653-4995-ab43-826254451140', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/05b270a8-0653-4995-ab43-826254451140.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:34:12 compute-0 nova_compute[192810]: 2025-09-30 21:34:12.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:12 compute-0 nova_compute[192810]: 2025-09-30 21:34:12.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:12 compute-0 podman[234642]: 2025-09-30 21:34:12.742671264 +0000 UTC m=+0.062296595 container create 04fe7af1b458a3cbd13fc5c307d90329350be0d99b844fcaf5fb31c208c50993 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-05b270a8-0653-4995-ab43-826254451140, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Sep 30 21:34:12 compute-0 systemd[1]: Started libpod-conmon-04fe7af1b458a3cbd13fc5c307d90329350be0d99b844fcaf5fb31c208c50993.scope.
Sep 30 21:34:12 compute-0 podman[234642]: 2025-09-30 21:34:12.701909937 +0000 UTC m=+0.021535308 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:34:12 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:34:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/549c4c186955a4eb97523b6973359fddfd6512b4f46024949d50d0f7ee57266b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:34:12 compute-0 podman[234642]: 2025-09-30 21:34:12.820451177 +0000 UTC m=+0.140076518 container init 04fe7af1b458a3cbd13fc5c307d90329350be0d99b844fcaf5fb31c208c50993 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-05b270a8-0653-4995-ab43-826254451140, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Sep 30 21:34:12 compute-0 podman[234642]: 2025-09-30 21:34:12.827484709 +0000 UTC m=+0.147110040 container start 04fe7af1b458a3cbd13fc5c307d90329350be0d99b844fcaf5fb31c208c50993 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-05b270a8-0653-4995-ab43-826254451140, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Sep 30 21:34:12 compute-0 neutron-haproxy-ovnmeta-05b270a8-0653-4995-ab43-826254451140[234657]: [NOTICE]   (234661) : New worker (234663) forked
Sep 30 21:34:12 compute-0 neutron-haproxy-ovnmeta-05b270a8-0653-4995-ab43-826254451140[234657]: [NOTICE]   (234661) : Loading success.
Sep 30 21:34:12 compute-0 nova_compute[192810]: 2025-09-30 21:34:12.877 2 DEBUG nova.network.neutron [req-50430622-d552-402f-8c59-1a3b8d176f8a req-b029192e-3dad-41e4-a7ab-b55065c043c6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Updated VIF entry in instance network info cache for port df0efe7c-eb87-4cc3-ac77-a663d146fa66. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:34:12 compute-0 nova_compute[192810]: 2025-09-30 21:34:12.878 2 DEBUG nova.network.neutron [req-50430622-d552-402f-8c59-1a3b8d176f8a req-b029192e-3dad-41e4-a7ab-b55065c043c6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Updating instance_info_cache with network_info: [{"id": "df0efe7c-eb87-4cc3-ac77-a663d146fa66", "address": "fa:16:3e:ec:55:eb", "network": {"id": "05b270a8-0653-4995-ab43-826254451140", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1761374743-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b33d27d5088343569f4459643d0da580", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf0efe7c-eb", "ovs_interfaceid": "df0efe7c-eb87-4cc3-ac77-a663d146fa66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:34:12 compute-0 nova_compute[192810]: 2025-09-30 21:34:12.902 2 DEBUG oslo_concurrency.lockutils [req-50430622-d552-402f-8c59-1a3b8d176f8a req-b029192e-3dad-41e4-a7ab-b55065c043c6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-4b2e5fac-6b68-407e-a326-34ec3b940ed7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:34:13 compute-0 nova_compute[192810]: 2025-09-30 21:34:13.419 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268053.418546, 4b2e5fac-6b68-407e-a326-34ec3b940ed7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:34:13 compute-0 nova_compute[192810]: 2025-09-30 21:34:13.420 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] VM Started (Lifecycle Event)
Sep 30 21:34:13 compute-0 nova_compute[192810]: 2025-09-30 21:34:13.445 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:34:13 compute-0 nova_compute[192810]: 2025-09-30 21:34:13.448 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268053.4192905, 4b2e5fac-6b68-407e-a326-34ec3b940ed7 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:34:13 compute-0 nova_compute[192810]: 2025-09-30 21:34:13.448 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] VM Paused (Lifecycle Event)
Sep 30 21:34:13 compute-0 nova_compute[192810]: 2025-09-30 21:34:13.468 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:34:13 compute-0 nova_compute[192810]: 2025-09-30 21:34:13.470 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:34:13 compute-0 nova_compute[192810]: 2025-09-30 21:34:13.489 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:34:14 compute-0 nova_compute[192810]: 2025-09-30 21:34:14.330 2 DEBUG nova.compute.manager [req-fd4e5928-fa8a-4e1a-a964-9a49f2195fec req-b738efd4-565d-4802-9167-86082369c6dc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Received event network-vif-plugged-df0efe7c-eb87-4cc3-ac77-a663d146fa66 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:34:14 compute-0 nova_compute[192810]: 2025-09-30 21:34:14.332 2 DEBUG oslo_concurrency.lockutils [req-fd4e5928-fa8a-4e1a-a964-9a49f2195fec req-b738efd4-565d-4802-9167-86082369c6dc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "4b2e5fac-6b68-407e-a326-34ec3b940ed7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:14 compute-0 nova_compute[192810]: 2025-09-30 21:34:14.332 2 DEBUG oslo_concurrency.lockutils [req-fd4e5928-fa8a-4e1a-a964-9a49f2195fec req-b738efd4-565d-4802-9167-86082369c6dc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "4b2e5fac-6b68-407e-a326-34ec3b940ed7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:14 compute-0 nova_compute[192810]: 2025-09-30 21:34:14.333 2 DEBUG oslo_concurrency.lockutils [req-fd4e5928-fa8a-4e1a-a964-9a49f2195fec req-b738efd4-565d-4802-9167-86082369c6dc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "4b2e5fac-6b68-407e-a326-34ec3b940ed7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:14 compute-0 nova_compute[192810]: 2025-09-30 21:34:14.333 2 DEBUG nova.compute.manager [req-fd4e5928-fa8a-4e1a-a964-9a49f2195fec req-b738efd4-565d-4802-9167-86082369c6dc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Processing event network-vif-plugged-df0efe7c-eb87-4cc3-ac77-a663d146fa66 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:34:14 compute-0 nova_compute[192810]: 2025-09-30 21:34:14.333 2 DEBUG nova.compute.manager [req-fd4e5928-fa8a-4e1a-a964-9a49f2195fec req-b738efd4-565d-4802-9167-86082369c6dc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Received event network-vif-plugged-df0efe7c-eb87-4cc3-ac77-a663d146fa66 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:34:14 compute-0 nova_compute[192810]: 2025-09-30 21:34:14.333 2 DEBUG oslo_concurrency.lockutils [req-fd4e5928-fa8a-4e1a-a964-9a49f2195fec req-b738efd4-565d-4802-9167-86082369c6dc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "4b2e5fac-6b68-407e-a326-34ec3b940ed7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:14 compute-0 nova_compute[192810]: 2025-09-30 21:34:14.334 2 DEBUG oslo_concurrency.lockutils [req-fd4e5928-fa8a-4e1a-a964-9a49f2195fec req-b738efd4-565d-4802-9167-86082369c6dc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "4b2e5fac-6b68-407e-a326-34ec3b940ed7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:14 compute-0 nova_compute[192810]: 2025-09-30 21:34:14.334 2 DEBUG oslo_concurrency.lockutils [req-fd4e5928-fa8a-4e1a-a964-9a49f2195fec req-b738efd4-565d-4802-9167-86082369c6dc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "4b2e5fac-6b68-407e-a326-34ec3b940ed7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:14 compute-0 nova_compute[192810]: 2025-09-30 21:34:14.334 2 DEBUG nova.compute.manager [req-fd4e5928-fa8a-4e1a-a964-9a49f2195fec req-b738efd4-565d-4802-9167-86082369c6dc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] No waiting events found dispatching network-vif-plugged-df0efe7c-eb87-4cc3-ac77-a663d146fa66 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:34:14 compute-0 nova_compute[192810]: 2025-09-30 21:34:14.334 2 WARNING nova.compute.manager [req-fd4e5928-fa8a-4e1a-a964-9a49f2195fec req-b738efd4-565d-4802-9167-86082369c6dc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Received unexpected event network-vif-plugged-df0efe7c-eb87-4cc3-ac77-a663d146fa66 for instance with vm_state building and task_state spawning.
Sep 30 21:34:14 compute-0 nova_compute[192810]: 2025-09-30 21:34:14.335 2 DEBUG nova.compute.manager [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:34:14 compute-0 nova_compute[192810]: 2025-09-30 21:34:14.340 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268054.3397434, 4b2e5fac-6b68-407e-a326-34ec3b940ed7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:34:14 compute-0 nova_compute[192810]: 2025-09-30 21:34:14.340 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] VM Resumed (Lifecycle Event)
Sep 30 21:34:14 compute-0 nova_compute[192810]: 2025-09-30 21:34:14.341 2 DEBUG nova.virt.libvirt.driver [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:34:14 compute-0 nova_compute[192810]: 2025-09-30 21:34:14.344 2 INFO nova.virt.libvirt.driver [-] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Instance spawned successfully.
Sep 30 21:34:14 compute-0 nova_compute[192810]: 2025-09-30 21:34:14.344 2 DEBUG nova.virt.libvirt.driver [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:34:14 compute-0 nova_compute[192810]: 2025-09-30 21:34:14.374 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:34:14 compute-0 nova_compute[192810]: 2025-09-30 21:34:14.379 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:34:14 compute-0 nova_compute[192810]: 2025-09-30 21:34:14.398 2 DEBUG nova.virt.libvirt.driver [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:34:14 compute-0 nova_compute[192810]: 2025-09-30 21:34:14.399 2 DEBUG nova.virt.libvirt.driver [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:34:14 compute-0 nova_compute[192810]: 2025-09-30 21:34:14.400 2 DEBUG nova.virt.libvirt.driver [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:34:14 compute-0 nova_compute[192810]: 2025-09-30 21:34:14.400 2 DEBUG nova.virt.libvirt.driver [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:34:14 compute-0 nova_compute[192810]: 2025-09-30 21:34:14.401 2 DEBUG nova.virt.libvirt.driver [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:34:14 compute-0 nova_compute[192810]: 2025-09-30 21:34:14.402 2 DEBUG nova.virt.libvirt.driver [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:34:14 compute-0 nova_compute[192810]: 2025-09-30 21:34:14.502 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:34:14 compute-0 nova_compute[192810]: 2025-09-30 21:34:14.844 2 INFO nova.compute.manager [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Took 8.07 seconds to spawn the instance on the hypervisor.
Sep 30 21:34:14 compute-0 nova_compute[192810]: 2025-09-30 21:34:14.845 2 DEBUG nova.compute.manager [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:34:15 compute-0 nova_compute[192810]: 2025-09-30 21:34:15.177 2 INFO nova.compute.manager [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Took 9.42 seconds to build instance.
Sep 30 21:34:15 compute-0 nova_compute[192810]: 2025-09-30 21:34:15.343 2 DEBUG oslo_concurrency.lockutils [None req-70f62bcd-feea-41d3-b70b-3b93a0687804 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "4b2e5fac-6b68-407e-a326-34ec3b940ed7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.679s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:15 compute-0 nova_compute[192810]: 2025-09-30 21:34:15.997 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:16 compute-0 ovn_controller[94912]: 2025-09-30T21:34:16Z|00350|binding|INFO|Releasing lport 3996e682-c20c-41c5-9547-9688a18f316c from this chassis (sb_readonly=0)
Sep 30 21:34:16 compute-0 ovn_controller[94912]: 2025-09-30T21:34:16Z|00351|binding|INFO|Releasing lport 28bea95f-2c3a-4d33-8bad-3373e1efde4f from this chassis (sb_readonly=0)
Sep 30 21:34:16 compute-0 nova_compute[192810]: 2025-09-30 21:34:16.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:17 compute-0 nova_compute[192810]: 2025-09-30 21:34:17.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:17 compute-0 nova_compute[192810]: 2025-09-30 21:34:17.784 2 DEBUG nova.compute.manager [req-991361bb-c7e4-48c9-9976-57bce38d5c73 req-19017f9e-9528-4b1b-bc83-dc1efc3426f1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Received event network-changed-df0efe7c-eb87-4cc3-ac77-a663d146fa66 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:34:17 compute-0 nova_compute[192810]: 2025-09-30 21:34:17.784 2 DEBUG nova.compute.manager [req-991361bb-c7e4-48c9-9976-57bce38d5c73 req-19017f9e-9528-4b1b-bc83-dc1efc3426f1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Refreshing instance network info cache due to event network-changed-df0efe7c-eb87-4cc3-ac77-a663d146fa66. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:34:17 compute-0 nova_compute[192810]: 2025-09-30 21:34:17.784 2 DEBUG oslo_concurrency.lockutils [req-991361bb-c7e4-48c9-9976-57bce38d5c73 req-19017f9e-9528-4b1b-bc83-dc1efc3426f1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-4b2e5fac-6b68-407e-a326-34ec3b940ed7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:34:17 compute-0 nova_compute[192810]: 2025-09-30 21:34:17.785 2 DEBUG oslo_concurrency.lockutils [req-991361bb-c7e4-48c9-9976-57bce38d5c73 req-19017f9e-9528-4b1b-bc83-dc1efc3426f1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-4b2e5fac-6b68-407e-a326-34ec3b940ed7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:34:17 compute-0 nova_compute[192810]: 2025-09-30 21:34:17.785 2 DEBUG nova.network.neutron [req-991361bb-c7e4-48c9-9976-57bce38d5c73 req-19017f9e-9528-4b1b-bc83-dc1efc3426f1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Refreshing network info cache for port df0efe7c-eb87-4cc3-ac77-a663d146fa66 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:34:17 compute-0 unix_chkpwd[234681]: password check failed for user (root)
Sep 30 21:34:17 compute-0 sshd-session[234679]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=45.81.23.80  user=root
Sep 30 21:34:19 compute-0 sshd-session[234679]: Failed password for root from 45.81.23.80 port 34490 ssh2
Sep 30 21:34:20 compute-0 nova_compute[192810]: 2025-09-30 21:34:20.658 2 DEBUG nova.network.neutron [req-991361bb-c7e4-48c9-9976-57bce38d5c73 req-19017f9e-9528-4b1b-bc83-dc1efc3426f1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Updated VIF entry in instance network info cache for port df0efe7c-eb87-4cc3-ac77-a663d146fa66. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:34:20 compute-0 nova_compute[192810]: 2025-09-30 21:34:20.658 2 DEBUG nova.network.neutron [req-991361bb-c7e4-48c9-9976-57bce38d5c73 req-19017f9e-9528-4b1b-bc83-dc1efc3426f1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Updating instance_info_cache with network_info: [{"id": "df0efe7c-eb87-4cc3-ac77-a663d146fa66", "address": "fa:16:3e:ec:55:eb", "network": {"id": "05b270a8-0653-4995-ab43-826254451140", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1761374743-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b33d27d5088343569f4459643d0da580", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf0efe7c-eb", "ovs_interfaceid": "df0efe7c-eb87-4cc3-ac77-a663d146fa66", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:34:20 compute-0 nova_compute[192810]: 2025-09-30 21:34:20.678 2 DEBUG oslo_concurrency.lockutils [req-991361bb-c7e4-48c9-9976-57bce38d5c73 req-19017f9e-9528-4b1b-bc83-dc1efc3426f1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-4b2e5fac-6b68-407e-a326-34ec3b940ed7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:34:20 compute-0 nova_compute[192810]: 2025-09-30 21:34:20.999 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:21 compute-0 sshd-session[234679]: Received disconnect from 45.81.23.80 port 34490:11: Bye Bye [preauth]
Sep 30 21:34:21 compute-0 sshd-session[234679]: Disconnected from authenticating user root 45.81.23.80 port 34490 [preauth]
Sep 30 21:34:22 compute-0 podman[234684]: 2025-09-30 21:34:22.320455826 +0000 UTC m=+0.055803537 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 21:34:22 compute-0 podman[234683]: 2025-09-30 21:34:22.398224319 +0000 UTC m=+0.133716873 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_controller, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS)
Sep 30 21:34:22 compute-0 nova_compute[192810]: 2025-09-30 21:34:22.569 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:23 compute-0 nova_compute[192810]: 2025-09-30 21:34:23.556 2 DEBUG oslo_concurrency.lockutils [None req-be222e5f-3d43-4b74-9f91-10e65e1fe131 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Acquiring lock "4b2e5fac-6b68-407e-a326-34ec3b940ed7" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:23 compute-0 nova_compute[192810]: 2025-09-30 21:34:23.557 2 DEBUG oslo_concurrency.lockutils [None req-be222e5f-3d43-4b74-9f91-10e65e1fe131 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "4b2e5fac-6b68-407e-a326-34ec3b940ed7" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:23 compute-0 nova_compute[192810]: 2025-09-30 21:34:23.557 2 DEBUG oslo_concurrency.lockutils [None req-be222e5f-3d43-4b74-9f91-10e65e1fe131 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Acquiring lock "4b2e5fac-6b68-407e-a326-34ec3b940ed7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:23 compute-0 nova_compute[192810]: 2025-09-30 21:34:23.558 2 DEBUG oslo_concurrency.lockutils [None req-be222e5f-3d43-4b74-9f91-10e65e1fe131 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "4b2e5fac-6b68-407e-a326-34ec3b940ed7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:23 compute-0 nova_compute[192810]: 2025-09-30 21:34:23.558 2 DEBUG oslo_concurrency.lockutils [None req-be222e5f-3d43-4b74-9f91-10e65e1fe131 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "4b2e5fac-6b68-407e-a326-34ec3b940ed7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:23 compute-0 nova_compute[192810]: 2025-09-30 21:34:23.573 2 INFO nova.compute.manager [None req-be222e5f-3d43-4b74-9f91-10e65e1fe131 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Terminating instance
Sep 30 21:34:23 compute-0 nova_compute[192810]: 2025-09-30 21:34:23.589 2 DEBUG nova.compute.manager [None req-be222e5f-3d43-4b74-9f91-10e65e1fe131 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:34:23 compute-0 kernel: tapdf0efe7c-eb (unregistering): left promiscuous mode
Sep 30 21:34:23 compute-0 NetworkManager[51733]: <info>  [1759268063.6069] device (tapdf0efe7c-eb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:34:23 compute-0 nova_compute[192810]: 2025-09-30 21:34:23.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:23 compute-0 ovn_controller[94912]: 2025-09-30T21:34:23Z|00352|binding|INFO|Releasing lport df0efe7c-eb87-4cc3-ac77-a663d146fa66 from this chassis (sb_readonly=0)
Sep 30 21:34:23 compute-0 ovn_controller[94912]: 2025-09-30T21:34:23Z|00353|binding|INFO|Setting lport df0efe7c-eb87-4cc3-ac77-a663d146fa66 down in Southbound
Sep 30 21:34:23 compute-0 ovn_controller[94912]: 2025-09-30T21:34:23Z|00354|binding|INFO|Removing iface tapdf0efe7c-eb ovn-installed in OVS
Sep 30 21:34:23 compute-0 nova_compute[192810]: 2025-09-30 21:34:23.620 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:23.631 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ec:55:eb 10.100.0.7'], port_security=['fa:16:3e:ec:55:eb 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '4b2e5fac-6b68-407e-a326-34ec3b940ed7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-05b270a8-0653-4995-ab43-826254451140', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b33d27d5088343569f4459643d0da580', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=babcc0e7-e8b1-4d4c-a7eb-2970ba0dde6e, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=df0efe7c-eb87-4cc3-ac77-a663d146fa66) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:34:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:23.633 103867 INFO neutron.agent.ovn.metadata.agent [-] Port df0efe7c-eb87-4cc3-ac77-a663d146fa66 in datapath 05b270a8-0653-4995-ab43-826254451140 unbound from our chassis
Sep 30 21:34:23 compute-0 nova_compute[192810]: 2025-09-30 21:34:23.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:23.634 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 05b270a8-0653-4995-ab43-826254451140, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:34:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:23.635 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[163164ca-ac68-4244-950c-948f10569022]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:23.635 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-05b270a8-0653-4995-ab43-826254451140 namespace which is not needed anymore
Sep 30 21:34:23 compute-0 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d00000060.scope: Deactivated successfully.
Sep 30 21:34:23 compute-0 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d00000060.scope: Consumed 10.684s CPU time.
Sep 30 21:34:23 compute-0 systemd-machined[152794]: Machine qemu-43-instance-00000060 terminated.
Sep 30 21:34:23 compute-0 podman[234730]: 2025-09-30 21:34:23.687626257 +0000 UTC m=+0.053046679 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Sep 30 21:34:23 compute-0 neutron-haproxy-ovnmeta-05b270a8-0653-4995-ab43-826254451140[234657]: [NOTICE]   (234661) : haproxy version is 2.8.14-c23fe91
Sep 30 21:34:23 compute-0 neutron-haproxy-ovnmeta-05b270a8-0653-4995-ab43-826254451140[234657]: [NOTICE]   (234661) : path to executable is /usr/sbin/haproxy
Sep 30 21:34:23 compute-0 neutron-haproxy-ovnmeta-05b270a8-0653-4995-ab43-826254451140[234657]: [WARNING]  (234661) : Exiting Master process...
Sep 30 21:34:23 compute-0 neutron-haproxy-ovnmeta-05b270a8-0653-4995-ab43-826254451140[234657]: [ALERT]    (234661) : Current worker (234663) exited with code 143 (Terminated)
Sep 30 21:34:23 compute-0 neutron-haproxy-ovnmeta-05b270a8-0653-4995-ab43-826254451140[234657]: [WARNING]  (234661) : All workers exited. Exiting... (0)
Sep 30 21:34:23 compute-0 systemd[1]: libpod-04fe7af1b458a3cbd13fc5c307d90329350be0d99b844fcaf5fb31c208c50993.scope: Deactivated successfully.
Sep 30 21:34:23 compute-0 podman[234770]: 2025-09-30 21:34:23.798285895 +0000 UTC m=+0.074202827 container died 04fe7af1b458a3cbd13fc5c307d90329350be0d99b844fcaf5fb31c208c50993 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-05b270a8-0653-4995-ab43-826254451140, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923)
Sep 30 21:34:23 compute-0 nova_compute[192810]: 2025-09-30 21:34:23.834 2 DEBUG nova.compute.manager [req-293ebfdc-c14a-44e1-bb82-e00ddf5660e1 req-57f66e92-b783-4b06-847a-7da7348c199f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Received event network-vif-unplugged-df0efe7c-eb87-4cc3-ac77-a663d146fa66 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:34:23 compute-0 nova_compute[192810]: 2025-09-30 21:34:23.834 2 DEBUG oslo_concurrency.lockutils [req-293ebfdc-c14a-44e1-bb82-e00ddf5660e1 req-57f66e92-b783-4b06-847a-7da7348c199f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "4b2e5fac-6b68-407e-a326-34ec3b940ed7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:23 compute-0 nova_compute[192810]: 2025-09-30 21:34:23.835 2 DEBUG oslo_concurrency.lockutils [req-293ebfdc-c14a-44e1-bb82-e00ddf5660e1 req-57f66e92-b783-4b06-847a-7da7348c199f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "4b2e5fac-6b68-407e-a326-34ec3b940ed7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:23 compute-0 nova_compute[192810]: 2025-09-30 21:34:23.835 2 DEBUG oslo_concurrency.lockutils [req-293ebfdc-c14a-44e1-bb82-e00ddf5660e1 req-57f66e92-b783-4b06-847a-7da7348c199f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "4b2e5fac-6b68-407e-a326-34ec3b940ed7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:23 compute-0 nova_compute[192810]: 2025-09-30 21:34:23.835 2 DEBUG nova.compute.manager [req-293ebfdc-c14a-44e1-bb82-e00ddf5660e1 req-57f66e92-b783-4b06-847a-7da7348c199f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] No waiting events found dispatching network-vif-unplugged-df0efe7c-eb87-4cc3-ac77-a663d146fa66 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:34:23 compute-0 nova_compute[192810]: 2025-09-30 21:34:23.835 2 DEBUG nova.compute.manager [req-293ebfdc-c14a-44e1-bb82-e00ddf5660e1 req-57f66e92-b783-4b06-847a-7da7348c199f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Received event network-vif-unplugged-df0efe7c-eb87-4cc3-ac77-a663d146fa66 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:34:23 compute-0 nova_compute[192810]: 2025-09-30 21:34:23.846 2 INFO nova.virt.libvirt.driver [-] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Instance destroyed successfully.
Sep 30 21:34:23 compute-0 nova_compute[192810]: 2025-09-30 21:34:23.847 2 DEBUG nova.objects.instance [None req-be222e5f-3d43-4b74-9f91-10e65e1fe131 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lazy-loading 'resources' on Instance uuid 4b2e5fac-6b68-407e-a326-34ec3b940ed7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:34:23 compute-0 nova_compute[192810]: 2025-09-30 21:34:23.863 2 DEBUG nova.virt.libvirt.vif [None req-be222e5f-3d43-4b74-9f91-10e65e1fe131 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:34:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-277035110',display_name='tempest-ServerActionsTestOtherA-server-277035110',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-277035110',id=96,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:34:14Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b33d27d5088343569f4459643d0da580',ramdisk_id='',reservation_id='r-07uljxyt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-553860193',owner_user_name='tempest-ServerActionsTestOtherA-553860193-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:34:14Z,user_data=None,user_id='c27a02706b9d43ffaab2c5fa833fec04',uuid=4b2e5fac-6b68-407e-a326-34ec3b940ed7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "df0efe7c-eb87-4cc3-ac77-a663d146fa66", "address": "fa:16:3e:ec:55:eb", "network": {"id": "05b270a8-0653-4995-ab43-826254451140", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1761374743-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b33d27d5088343569f4459643d0da580", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf0efe7c-eb", "ovs_interfaceid": "df0efe7c-eb87-4cc3-ac77-a663d146fa66", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:34:23 compute-0 nova_compute[192810]: 2025-09-30 21:34:23.863 2 DEBUG nova.network.os_vif_util [None req-be222e5f-3d43-4b74-9f91-10e65e1fe131 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Converting VIF {"id": "df0efe7c-eb87-4cc3-ac77-a663d146fa66", "address": "fa:16:3e:ec:55:eb", "network": {"id": "05b270a8-0653-4995-ab43-826254451140", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1761374743-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b33d27d5088343569f4459643d0da580", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdf0efe7c-eb", "ovs_interfaceid": "df0efe7c-eb87-4cc3-ac77-a663d146fa66", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:34:23 compute-0 nova_compute[192810]: 2025-09-30 21:34:23.864 2 DEBUG nova.network.os_vif_util [None req-be222e5f-3d43-4b74-9f91-10e65e1fe131 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ec:55:eb,bridge_name='br-int',has_traffic_filtering=True,id=df0efe7c-eb87-4cc3-ac77-a663d146fa66,network=Network(05b270a8-0653-4995-ab43-826254451140),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf0efe7c-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:34:23 compute-0 nova_compute[192810]: 2025-09-30 21:34:23.865 2 DEBUG os_vif [None req-be222e5f-3d43-4b74-9f91-10e65e1fe131 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ec:55:eb,bridge_name='br-int',has_traffic_filtering=True,id=df0efe7c-eb87-4cc3-ac77-a663d146fa66,network=Network(05b270a8-0653-4995-ab43-826254451140),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf0efe7c-eb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:34:23 compute-0 nova_compute[192810]: 2025-09-30 21:34:23.866 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:23 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-04fe7af1b458a3cbd13fc5c307d90329350be0d99b844fcaf5fb31c208c50993-userdata-shm.mount: Deactivated successfully.
Sep 30 21:34:23 compute-0 nova_compute[192810]: 2025-09-30 21:34:23.866 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdf0efe7c-eb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:34:23 compute-0 systemd[1]: var-lib-containers-storage-overlay-549c4c186955a4eb97523b6973359fddfd6512b4f46024949d50d0f7ee57266b-merged.mount: Deactivated successfully.
Sep 30 21:34:23 compute-0 nova_compute[192810]: 2025-09-30 21:34:23.870 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:23 compute-0 nova_compute[192810]: 2025-09-30 21:34:23.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:34:23 compute-0 nova_compute[192810]: 2025-09-30 21:34:23.874 2 INFO os_vif [None req-be222e5f-3d43-4b74-9f91-10e65e1fe131 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ec:55:eb,bridge_name='br-int',has_traffic_filtering=True,id=df0efe7c-eb87-4cc3-ac77-a663d146fa66,network=Network(05b270a8-0653-4995-ab43-826254451140),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdf0efe7c-eb')
Sep 30 21:34:23 compute-0 nova_compute[192810]: 2025-09-30 21:34:23.874 2 INFO nova.virt.libvirt.driver [None req-be222e5f-3d43-4b74-9f91-10e65e1fe131 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Deleting instance files /var/lib/nova/instances/4b2e5fac-6b68-407e-a326-34ec3b940ed7_del
Sep 30 21:34:23 compute-0 nova_compute[192810]: 2025-09-30 21:34:23.875 2 INFO nova.virt.libvirt.driver [None req-be222e5f-3d43-4b74-9f91-10e65e1fe131 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Deletion of /var/lib/nova/instances/4b2e5fac-6b68-407e-a326-34ec3b940ed7_del complete
Sep 30 21:34:23 compute-0 podman[234770]: 2025-09-30 21:34:23.911010803 +0000 UTC m=+0.186927745 container cleanup 04fe7af1b458a3cbd13fc5c307d90329350be0d99b844fcaf5fb31c208c50993 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-05b270a8-0653-4995-ab43-826254451140, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 21:34:23 compute-0 nova_compute[192810]: 2025-09-30 21:34:23.915 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:23.915 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:34:23 compute-0 systemd[1]: libpod-conmon-04fe7af1b458a3cbd13fc5c307d90329350be0d99b844fcaf5fb31c208c50993.scope: Deactivated successfully.
Sep 30 21:34:23 compute-0 nova_compute[192810]: 2025-09-30 21:34:23.950 2 INFO nova.compute.manager [None req-be222e5f-3d43-4b74-9f91-10e65e1fe131 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Took 0.36 seconds to destroy the instance on the hypervisor.
Sep 30 21:34:23 compute-0 nova_compute[192810]: 2025-09-30 21:34:23.951 2 DEBUG oslo.service.loopingcall [None req-be222e5f-3d43-4b74-9f91-10e65e1fe131 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:34:23 compute-0 nova_compute[192810]: 2025-09-30 21:34:23.951 2 DEBUG nova.compute.manager [-] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:34:23 compute-0 nova_compute[192810]: 2025-09-30 21:34:23.952 2 DEBUG nova.network.neutron [-] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:34:24 compute-0 podman[234816]: 2025-09-30 21:34:24.022812099 +0000 UTC m=+0.088552668 container remove 04fe7af1b458a3cbd13fc5c307d90329350be0d99b844fcaf5fb31c208c50993 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-05b270a8-0653-4995-ab43-826254451140, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:34:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:24.027 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[1bb0b494-1898-428b-b5be-51f80c6fa26a]: (4, ('Tue Sep 30 09:34:23 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-05b270a8-0653-4995-ab43-826254451140 (04fe7af1b458a3cbd13fc5c307d90329350be0d99b844fcaf5fb31c208c50993)\n04fe7af1b458a3cbd13fc5c307d90329350be0d99b844fcaf5fb31c208c50993\nTue Sep 30 09:34:23 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-05b270a8-0653-4995-ab43-826254451140 (04fe7af1b458a3cbd13fc5c307d90329350be0d99b844fcaf5fb31c208c50993)\n04fe7af1b458a3cbd13fc5c307d90329350be0d99b844fcaf5fb31c208c50993\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:24.029 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[7f941990-7633-4197-93d7-68e3851f7ea2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:24.030 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap05b270a8-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:34:24 compute-0 kernel: tap05b270a8-00: left promiscuous mode
Sep 30 21:34:24 compute-0 nova_compute[192810]: 2025-09-30 21:34:24.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:24.036 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[41f73aab-bfa9-4901-88e4-2bca54664faf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:24 compute-0 nova_compute[192810]: 2025-09-30 21:34:24.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:24.067 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[5da1214b-a00f-498b-9855-974e9f1ee7e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:24.068 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[0baef6f6-1d64-40f3-8864-23bfcbc27750]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:24.082 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a2096da5-331d-4288-8536-d0a4829becf1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 474770, 'reachable_time': 39696, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234831, 'error': None, 'target': 'ovnmeta-05b270a8-0653-4995-ab43-826254451140', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:24.084 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-05b270a8-0653-4995-ab43-826254451140 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:34:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:24.085 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[d8cb80a0-ca40-4dc8-a449-4873ab7c3059]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:24.085 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:34:24 compute-0 systemd[1]: run-netns-ovnmeta\x2d05b270a8\x2d0653\x2d4995\x2dab43\x2d826254451140.mount: Deactivated successfully.
Sep 30 21:34:25 compute-0 nova_compute[192810]: 2025-09-30 21:34:25.939 2 DEBUG nova.compute.manager [req-1056923f-c6a4-4f19-b110-c3da2f45b8c3 req-b5f30001-0f79-44ae-bd12-8753db56f426 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Received event network-vif-plugged-df0efe7c-eb87-4cc3-ac77-a663d146fa66 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:34:25 compute-0 nova_compute[192810]: 2025-09-30 21:34:25.940 2 DEBUG oslo_concurrency.lockutils [req-1056923f-c6a4-4f19-b110-c3da2f45b8c3 req-b5f30001-0f79-44ae-bd12-8753db56f426 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "4b2e5fac-6b68-407e-a326-34ec3b940ed7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:25 compute-0 nova_compute[192810]: 2025-09-30 21:34:25.941 2 DEBUG oslo_concurrency.lockutils [req-1056923f-c6a4-4f19-b110-c3da2f45b8c3 req-b5f30001-0f79-44ae-bd12-8753db56f426 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "4b2e5fac-6b68-407e-a326-34ec3b940ed7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:25 compute-0 nova_compute[192810]: 2025-09-30 21:34:25.941 2 DEBUG oslo_concurrency.lockutils [req-1056923f-c6a4-4f19-b110-c3da2f45b8c3 req-b5f30001-0f79-44ae-bd12-8753db56f426 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "4b2e5fac-6b68-407e-a326-34ec3b940ed7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:25 compute-0 nova_compute[192810]: 2025-09-30 21:34:25.942 2 DEBUG nova.compute.manager [req-1056923f-c6a4-4f19-b110-c3da2f45b8c3 req-b5f30001-0f79-44ae-bd12-8753db56f426 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] No waiting events found dispatching network-vif-plugged-df0efe7c-eb87-4cc3-ac77-a663d146fa66 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:34:25 compute-0 nova_compute[192810]: 2025-09-30 21:34:25.942 2 WARNING nova.compute.manager [req-1056923f-c6a4-4f19-b110-c3da2f45b8c3 req-b5f30001-0f79-44ae-bd12-8753db56f426 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Received unexpected event network-vif-plugged-df0efe7c-eb87-4cc3-ac77-a663d146fa66 for instance with vm_state active and task_state deleting.
Sep 30 21:34:26 compute-0 nova_compute[192810]: 2025-09-30 21:34:26.180 2 DEBUG nova.network.neutron [-] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:34:26 compute-0 nova_compute[192810]: 2025-09-30 21:34:26.202 2 INFO nova.compute.manager [-] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Took 2.25 seconds to deallocate network for instance.
Sep 30 21:34:26 compute-0 nova_compute[192810]: 2025-09-30 21:34:26.258 2 DEBUG nova.compute.manager [req-c8115cdb-8c02-4eb1-b1a5-1481e3c8ff58 req-c74d2d0d-ee99-433e-a77e-b1a106767c28 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Received event network-vif-deleted-df0efe7c-eb87-4cc3-ac77-a663d146fa66 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:34:26 compute-0 nova_compute[192810]: 2025-09-30 21:34:26.277 2 DEBUG oslo_concurrency.lockutils [None req-be222e5f-3d43-4b74-9f91-10e65e1fe131 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:26 compute-0 nova_compute[192810]: 2025-09-30 21:34:26.277 2 DEBUG oslo_concurrency.lockutils [None req-be222e5f-3d43-4b74-9f91-10e65e1fe131 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:26 compute-0 nova_compute[192810]: 2025-09-30 21:34:26.341 2 DEBUG nova.compute.provider_tree [None req-be222e5f-3d43-4b74-9f91-10e65e1fe131 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:34:26 compute-0 nova_compute[192810]: 2025-09-30 21:34:26.362 2 DEBUG nova.scheduler.client.report [None req-be222e5f-3d43-4b74-9f91-10e65e1fe131 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:34:26 compute-0 nova_compute[192810]: 2025-09-30 21:34:26.383 2 DEBUG oslo_concurrency.lockutils [None req-be222e5f-3d43-4b74-9f91-10e65e1fe131 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:26 compute-0 nova_compute[192810]: 2025-09-30 21:34:26.425 2 INFO nova.scheduler.client.report [None req-be222e5f-3d43-4b74-9f91-10e65e1fe131 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Deleted allocations for instance 4b2e5fac-6b68-407e-a326-34ec3b940ed7
Sep 30 21:34:26 compute-0 nova_compute[192810]: 2025-09-30 21:34:26.495 2 DEBUG oslo_concurrency.lockutils [None req-be222e5f-3d43-4b74-9f91-10e65e1fe131 c27a02706b9d43ffaab2c5fa833fec04 b33d27d5088343569f4459643d0da580 - - default default] Lock "4b2e5fac-6b68-407e-a326-34ec3b940ed7" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.938s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:27 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:27.086 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:34:27 compute-0 nova_compute[192810]: 2025-09-30 21:34:27.571 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:28 compute-0 nova_compute[192810]: 2025-09-30 21:34:28.871 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:29 compute-0 nova_compute[192810]: 2025-09-30 21:34:29.311 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:32 compute-0 nova_compute[192810]: 2025-09-30 21:34:32.573 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:33 compute-0 sshd-session[234832]: Connection closed by 121.15.157.194 port 48930
Sep 30 21:34:33 compute-0 nova_compute[192810]: 2025-09-30 21:34:33.873 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:34 compute-0 podman[234833]: 2025-09-30 21:34:34.31538602 +0000 UTC m=+0.058123713 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Sep 30 21:34:34 compute-0 podman[234834]: 2025-09-30 21:34:34.32194037 +0000 UTC m=+0.061195198 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, version=9.6, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.buildah.version=1.33.7, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, name=ubi9-minimal, architecture=x86_64, vcs-type=git)
Sep 30 21:34:34 compute-0 ovn_controller[94912]: 2025-09-30T21:34:34Z|00355|binding|INFO|Releasing lport 3996e682-c20c-41c5-9547-9688a18f316c from this chassis (sb_readonly=0)
Sep 30 21:34:34 compute-0 nova_compute[192810]: 2025-09-30 21:34:34.973 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:35 compute-0 nova_compute[192810]: 2025-09-30 21:34:35.940 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:37 compute-0 nova_compute[192810]: 2025-09-30 21:34:37.576 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:38.739 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:38.740 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:38.740 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:38 compute-0 nova_compute[192810]: 2025-09-30 21:34:38.845 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268063.8440459, 4b2e5fac-6b68-407e-a326-34ec3b940ed7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:34:38 compute-0 nova_compute[192810]: 2025-09-30 21:34:38.845 2 INFO nova.compute.manager [-] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] VM Stopped (Lifecycle Event)
Sep 30 21:34:38 compute-0 nova_compute[192810]: 2025-09-30 21:34:38.875 2 DEBUG nova.compute.manager [None req-3a3156f1-7823-404f-b456-73e10bf5b48c - - - - - -] [instance: 4b2e5fac-6b68-407e-a326-34ec3b940ed7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:34:38 compute-0 nova_compute[192810]: 2025-09-30 21:34:38.877 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:40 compute-0 nova_compute[192810]: 2025-09-30 21:34:40.247 2 DEBUG oslo_concurrency.lockutils [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquiring lock "4255e358-6db2-4947-a3d6-4e045f9235fb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:40 compute-0 nova_compute[192810]: 2025-09-30 21:34:40.247 2 DEBUG oslo_concurrency.lockutils [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "4255e358-6db2-4947-a3d6-4e045f9235fb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:40 compute-0 nova_compute[192810]: 2025-09-30 21:34:40.271 2 DEBUG nova.compute.manager [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:34:40 compute-0 nova_compute[192810]: 2025-09-30 21:34:40.380 2 DEBUG oslo_concurrency.lockutils [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:40 compute-0 nova_compute[192810]: 2025-09-30 21:34:40.381 2 DEBUG oslo_concurrency.lockutils [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:40 compute-0 nova_compute[192810]: 2025-09-30 21:34:40.386 2 DEBUG nova.virt.hardware [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:34:40 compute-0 nova_compute[192810]: 2025-09-30 21:34:40.387 2 INFO nova.compute.claims [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:34:40 compute-0 nova_compute[192810]: 2025-09-30 21:34:40.480 2 DEBUG oslo_concurrency.lockutils [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Acquiring lock "09b1225c-028f-48a8-b366-03be5cdd4f19" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:40 compute-0 nova_compute[192810]: 2025-09-30 21:34:40.480 2 DEBUG oslo_concurrency.lockutils [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Lock "09b1225c-028f-48a8-b366-03be5cdd4f19" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:40 compute-0 nova_compute[192810]: 2025-09-30 21:34:40.495 2 DEBUG nova.compute.manager [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:34:40 compute-0 nova_compute[192810]: 2025-09-30 21:34:40.530 2 DEBUG nova.compute.provider_tree [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:34:40 compute-0 nova_compute[192810]: 2025-09-30 21:34:40.551 2 DEBUG nova.scheduler.client.report [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:34:40 compute-0 nova_compute[192810]: 2025-09-30 21:34:40.583 2 DEBUG oslo_concurrency.lockutils [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.203s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:40 compute-0 nova_compute[192810]: 2025-09-30 21:34:40.584 2 DEBUG nova.compute.manager [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:34:40 compute-0 nova_compute[192810]: 2025-09-30 21:34:40.621 2 DEBUG oslo_concurrency.lockutils [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:40 compute-0 nova_compute[192810]: 2025-09-30 21:34:40.622 2 DEBUG oslo_concurrency.lockutils [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:40 compute-0 nova_compute[192810]: 2025-09-30 21:34:40.627 2 DEBUG nova.virt.hardware [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:34:40 compute-0 nova_compute[192810]: 2025-09-30 21:34:40.627 2 INFO nova.compute.claims [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:34:40 compute-0 nova_compute[192810]: 2025-09-30 21:34:40.646 2 DEBUG nova.compute.manager [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:34:40 compute-0 nova_compute[192810]: 2025-09-30 21:34:40.646 2 DEBUG nova.network.neutron [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:34:40 compute-0 nova_compute[192810]: 2025-09-30 21:34:40.674 2 INFO nova.virt.libvirt.driver [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:34:40 compute-0 nova_compute[192810]: 2025-09-30 21:34:40.701 2 DEBUG nova.compute.manager [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:34:40 compute-0 nova_compute[192810]: 2025-09-30 21:34:40.788 2 DEBUG nova.compute.provider_tree [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:34:40 compute-0 nova_compute[192810]: 2025-09-30 21:34:40.812 2 DEBUG nova.scheduler.client.report [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:34:40 compute-0 nova_compute[192810]: 2025-09-30 21:34:40.849 2 DEBUG nova.compute.manager [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:34:40 compute-0 nova_compute[192810]: 2025-09-30 21:34:40.850 2 DEBUG nova.virt.libvirt.driver [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:34:40 compute-0 nova_compute[192810]: 2025-09-30 21:34:40.850 2 INFO nova.virt.libvirt.driver [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Creating image(s)
Sep 30 21:34:40 compute-0 nova_compute[192810]: 2025-09-30 21:34:40.850 2 DEBUG oslo_concurrency.lockutils [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquiring lock "/var/lib/nova/instances/4255e358-6db2-4947-a3d6-4e045f9235fb/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:40 compute-0 nova_compute[192810]: 2025-09-30 21:34:40.851 2 DEBUG oslo_concurrency.lockutils [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "/var/lib/nova/instances/4255e358-6db2-4947-a3d6-4e045f9235fb/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:40 compute-0 nova_compute[192810]: 2025-09-30 21:34:40.851 2 DEBUG oslo_concurrency.lockutils [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "/var/lib/nova/instances/4255e358-6db2-4947-a3d6-4e045f9235fb/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:40 compute-0 nova_compute[192810]: 2025-09-30 21:34:40.862 2 DEBUG oslo_concurrency.lockutils [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.241s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:40 compute-0 nova_compute[192810]: 2025-09-30 21:34:40.862 2 DEBUG nova.compute.manager [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:34:40 compute-0 nova_compute[192810]: 2025-09-30 21:34:40.865 2 DEBUG oslo_concurrency.processutils [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:34:40 compute-0 nova_compute[192810]: 2025-09-30 21:34:40.919 2 DEBUG oslo_concurrency.processutils [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:34:40 compute-0 nova_compute[192810]: 2025-09-30 21:34:40.921 2 DEBUG oslo_concurrency.lockutils [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:40 compute-0 nova_compute[192810]: 2025-09-30 21:34:40.921 2 DEBUG oslo_concurrency.lockutils [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:40 compute-0 nova_compute[192810]: 2025-09-30 21:34:40.937 2 DEBUG oslo_concurrency.processutils [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:34:40 compute-0 nova_compute[192810]: 2025-09-30 21:34:40.958 2 DEBUG nova.policy [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:34:40 compute-0 nova_compute[192810]: 2025-09-30 21:34:40.967 2 DEBUG nova.compute.manager [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948
Sep 30 21:34:40 compute-0 nova_compute[192810]: 2025-09-30 21:34:40.980 2 INFO nova.virt.libvirt.driver [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:34:40 compute-0 nova_compute[192810]: 2025-09-30 21:34:40.992 2 DEBUG oslo_concurrency.processutils [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:34:40 compute-0 nova_compute[192810]: 2025-09-30 21:34:40.993 2 DEBUG oslo_concurrency.processutils [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/4255e358-6db2-4947-a3d6-4e045f9235fb/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.013 2 DEBUG nova.compute.manager [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.026 2 DEBUG oslo_concurrency.processutils [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/4255e358-6db2-4947-a3d6-4e045f9235fb/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.027 2 DEBUG oslo_concurrency.lockutils [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.027 2 DEBUG oslo_concurrency.processutils [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.099 2 DEBUG oslo_concurrency.processutils [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.101 2 DEBUG nova.virt.disk.api [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Checking if we can resize image /var/lib/nova/instances/4255e358-6db2-4947-a3d6-4e045f9235fb/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.101 2 DEBUG oslo_concurrency.processutils [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4255e358-6db2-4947-a3d6-4e045f9235fb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.155 2 DEBUG nova.compute.manager [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.158 2 DEBUG nova.virt.libvirt.driver [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.158 2 INFO nova.virt.libvirt.driver [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] Creating image(s)
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.159 2 DEBUG oslo_concurrency.lockutils [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Acquiring lock "/var/lib/nova/instances/09b1225c-028f-48a8-b366-03be5cdd4f19/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.159 2 DEBUG oslo_concurrency.lockutils [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Lock "/var/lib/nova/instances/09b1225c-028f-48a8-b366-03be5cdd4f19/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.160 2 DEBUG oslo_concurrency.lockutils [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Lock "/var/lib/nova/instances/09b1225c-028f-48a8-b366-03be5cdd4f19/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.177 2 DEBUG oslo_concurrency.processutils [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4255e358-6db2-4947-a3d6-4e045f9235fb/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.178 2 DEBUG nova.virt.disk.api [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Cannot resize image /var/lib/nova/instances/4255e358-6db2-4947-a3d6-4e045f9235fb/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.178 2 DEBUG nova.objects.instance [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lazy-loading 'migration_context' on Instance uuid 4255e358-6db2-4947-a3d6-4e045f9235fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.180 2 DEBUG oslo_concurrency.processutils [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.211 2 DEBUG nova.virt.libvirt.driver [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.212 2 DEBUG nova.virt.libvirt.driver [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Ensure instance console log exists: /var/lib/nova/instances/4255e358-6db2-4947-a3d6-4e045f9235fb/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.212 2 DEBUG oslo_concurrency.lockutils [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.213 2 DEBUG oslo_concurrency.lockutils [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.213 2 DEBUG oslo_concurrency.lockutils [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.232 2 DEBUG oslo_concurrency.processutils [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.233 2 DEBUG oslo_concurrency.lockutils [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.233 2 DEBUG oslo_concurrency.lockutils [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.248 2 DEBUG oslo_concurrency.processutils [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.323 2 DEBUG oslo_concurrency.processutils [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.324 2 DEBUG oslo_concurrency.processutils [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/09b1225c-028f-48a8-b366-03be5cdd4f19/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:34:41 compute-0 podman[234899]: 2025-09-30 21:34:41.329586394 +0000 UTC m=+0.053440649 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:34:41 compute-0 podman[234896]: 2025-09-30 21:34:41.329657225 +0000 UTC m=+0.057906397 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:34:41 compute-0 podman[234898]: 2025-09-30 21:34:41.329726357 +0000 UTC m=+0.056340899 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.build-date=20250923, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.365 2 DEBUG oslo_concurrency.processutils [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/09b1225c-028f-48a8-b366-03be5cdd4f19/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.366 2 DEBUG oslo_concurrency.lockutils [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.367 2 DEBUG oslo_concurrency.processutils [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.421 2 DEBUG oslo_concurrency.processutils [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.422 2 DEBUG nova.virt.disk.api [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Checking if we can resize image /var/lib/nova/instances/09b1225c-028f-48a8-b366-03be5cdd4f19/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.423 2 DEBUG oslo_concurrency.processutils [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/09b1225c-028f-48a8-b366-03be5cdd4f19/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.477 2 DEBUG oslo_concurrency.processutils [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/09b1225c-028f-48a8-b366-03be5cdd4f19/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.478 2 DEBUG nova.virt.disk.api [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Cannot resize image /var/lib/nova/instances/09b1225c-028f-48a8-b366-03be5cdd4f19/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.479 2 DEBUG nova.objects.instance [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Lazy-loading 'migration_context' on Instance uuid 09b1225c-028f-48a8-b366-03be5cdd4f19 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.498 2 DEBUG nova.virt.libvirt.driver [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.499 2 DEBUG nova.virt.libvirt.driver [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] Ensure instance console log exists: /var/lib/nova/instances/09b1225c-028f-48a8-b366-03be5cdd4f19/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.499 2 DEBUG oslo_concurrency.lockutils [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.499 2 DEBUG oslo_concurrency.lockutils [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.500 2 DEBUG oslo_concurrency.lockutils [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.501 2 DEBUG nova.virt.libvirt.driver [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.506 2 WARNING nova.virt.libvirt.driver [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.509 2 DEBUG nova.virt.libvirt.host [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.510 2 DEBUG nova.virt.libvirt.host [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.512 2 DEBUG nova.virt.libvirt.host [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.512 2 DEBUG nova.virt.libvirt.host [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.513 2 DEBUG nova.virt.libvirt.driver [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.514 2 DEBUG nova.virt.hardware [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.514 2 DEBUG nova.virt.hardware [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.514 2 DEBUG nova.virt.hardware [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.515 2 DEBUG nova.virt.hardware [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.515 2 DEBUG nova.virt.hardware [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.515 2 DEBUG nova.virt.hardware [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.515 2 DEBUG nova.virt.hardware [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.516 2 DEBUG nova.virt.hardware [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.516 2 DEBUG nova.virt.hardware [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.516 2 DEBUG nova.virt.hardware [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.516 2 DEBUG nova.virt.hardware [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.519 2 DEBUG nova.objects.instance [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 09b1225c-028f-48a8-b366-03be5cdd4f19 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.534 2 DEBUG nova.virt.libvirt.driver [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:34:41 compute-0 nova_compute[192810]:   <uuid>09b1225c-028f-48a8-b366-03be5cdd4f19</uuid>
Sep 30 21:34:41 compute-0 nova_compute[192810]:   <name>instance-00000063</name>
Sep 30 21:34:41 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:34:41 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:34:41 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:34:41 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:34:41 compute-0 nova_compute[192810]:       <nova:name>tempest-ServersAaction247Test-server-938531012</nova:name>
Sep 30 21:34:41 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:34:41</nova:creationTime>
Sep 30 21:34:41 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:34:41 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:34:41 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:34:41 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:34:41 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:34:41 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:34:41 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:34:41 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:34:41 compute-0 nova_compute[192810]:         <nova:user uuid="cced9253a90a431bbc6bc5439d274109">tempest-ServersAaction247Test-1229742565-project-member</nova:user>
Sep 30 21:34:41 compute-0 nova_compute[192810]:         <nova:project uuid="123dde9c899d4e498c5a20bf76fe32a4">tempest-ServersAaction247Test-1229742565</nova:project>
Sep 30 21:34:41 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:34:41 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:34:41 compute-0 nova_compute[192810]:       <nova:ports/>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:34:41 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:34:41 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:34:41 compute-0 nova_compute[192810]:     <system>
Sep 30 21:34:41 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:34:41 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:34:41 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:34:41 compute-0 nova_compute[192810]:       <entry name="serial">09b1225c-028f-48a8-b366-03be5cdd4f19</entry>
Sep 30 21:34:41 compute-0 nova_compute[192810]:       <entry name="uuid">09b1225c-028f-48a8-b366-03be5cdd4f19</entry>
Sep 30 21:34:41 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     </system>
Sep 30 21:34:41 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:34:41 compute-0 nova_compute[192810]:   <os>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:34:41 compute-0 nova_compute[192810]:   </os>
Sep 30 21:34:41 compute-0 nova_compute[192810]:   <features>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:34:41 compute-0 nova_compute[192810]:   </features>
Sep 30 21:34:41 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:34:41 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:34:41 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:34:41 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:34:41 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:34:41 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:34:41 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:34:41 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:34:41 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/09b1225c-028f-48a8-b366-03be5cdd4f19/disk"/>
Sep 30 21:34:41 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:34:41 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:34:41 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/09b1225c-028f-48a8-b366-03be5cdd4f19/disk.config"/>
Sep 30 21:34:41 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:34:41 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/09b1225c-028f-48a8-b366-03be5cdd4f19/console.log" append="off"/>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     <video>
Sep 30 21:34:41 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     </video>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:34:41 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:34:41 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:34:41 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:34:41 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:34:41 compute-0 nova_compute[192810]: </domain>
Sep 30 21:34:41 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.592 2 DEBUG nova.virt.libvirt.driver [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.593 2 DEBUG nova.virt.libvirt.driver [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.593 2 INFO nova.virt.libvirt.driver [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] Using config drive
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.857 2 INFO nova.virt.libvirt.driver [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] Creating config drive at /var/lib/nova/instances/09b1225c-028f-48a8-b366-03be5cdd4f19/disk.config
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.862 2 DEBUG oslo_concurrency.processutils [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/09b1225c-028f-48a8-b366-03be5cdd4f19/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpy6bm8fh8 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.956 2 DEBUG nova.network.neutron [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Successfully created port: b0db3e99-94f0-42a8-a410-081e33fd0dee _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:34:41 compute-0 nova_compute[192810]: 2025-09-30 21:34:41.985 2 DEBUG oslo_concurrency.processutils [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/09b1225c-028f-48a8-b366-03be5cdd4f19/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpy6bm8fh8" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:34:42 compute-0 systemd-machined[152794]: New machine qemu-44-instance-00000063.
Sep 30 21:34:42 compute-0 systemd[1]: Started Virtual Machine qemu-44-instance-00000063.
Sep 30 21:34:42 compute-0 nova_compute[192810]: 2025-09-30 21:34:42.577 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:42 compute-0 ovn_controller[94912]: 2025-09-30T21:34:42Z|00356|binding|INFO|Releasing lport 3996e682-c20c-41c5-9547-9688a18f316c from this chassis (sb_readonly=0)
Sep 30 21:34:42 compute-0 nova_compute[192810]: 2025-09-30 21:34:42.820 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268082.8202903, 09b1225c-028f-48a8-b366-03be5cdd4f19 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:34:42 compute-0 nova_compute[192810]: 2025-09-30 21:34:42.821 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] VM Resumed (Lifecycle Event)
Sep 30 21:34:42 compute-0 nova_compute[192810]: 2025-09-30 21:34:42.823 2 DEBUG nova.compute.manager [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:34:42 compute-0 nova_compute[192810]: 2025-09-30 21:34:42.823 2 DEBUG nova.virt.libvirt.driver [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:34:42 compute-0 nova_compute[192810]: 2025-09-30 21:34:42.826 2 INFO nova.virt.libvirt.driver [-] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] Instance spawned successfully.
Sep 30 21:34:42 compute-0 nova_compute[192810]: 2025-09-30 21:34:42.827 2 DEBUG nova.virt.libvirt.driver [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:34:42 compute-0 nova_compute[192810]: 2025-09-30 21:34:42.838 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:42 compute-0 nova_compute[192810]: 2025-09-30 21:34:42.855 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:34:42 compute-0 nova_compute[192810]: 2025-09-30 21:34:42.861 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:34:42 compute-0 nova_compute[192810]: 2025-09-30 21:34:42.863 2 DEBUG nova.virt.libvirt.driver [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:34:42 compute-0 nova_compute[192810]: 2025-09-30 21:34:42.864 2 DEBUG nova.virt.libvirt.driver [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:34:42 compute-0 nova_compute[192810]: 2025-09-30 21:34:42.864 2 DEBUG nova.virt.libvirt.driver [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:34:42 compute-0 nova_compute[192810]: 2025-09-30 21:34:42.864 2 DEBUG nova.virt.libvirt.driver [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:34:42 compute-0 nova_compute[192810]: 2025-09-30 21:34:42.864 2 DEBUG nova.virt.libvirt.driver [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:34:42 compute-0 nova_compute[192810]: 2025-09-30 21:34:42.865 2 DEBUG nova.virt.libvirt.driver [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:34:42 compute-0 nova_compute[192810]: 2025-09-30 21:34:42.897 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:34:42 compute-0 nova_compute[192810]: 2025-09-30 21:34:42.897 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268082.8229547, 09b1225c-028f-48a8-b366-03be5cdd4f19 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:34:42 compute-0 nova_compute[192810]: 2025-09-30 21:34:42.897 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] VM Started (Lifecycle Event)
Sep 30 21:34:42 compute-0 nova_compute[192810]: 2025-09-30 21:34:42.919 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:34:42 compute-0 nova_compute[192810]: 2025-09-30 21:34:42.921 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:34:42 compute-0 nova_compute[192810]: 2025-09-30 21:34:42.942 2 INFO nova.compute.manager [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] Took 1.79 seconds to spawn the instance on the hypervisor.
Sep 30 21:34:42 compute-0 nova_compute[192810]: 2025-09-30 21:34:42.942 2 DEBUG nova.compute.manager [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:34:42 compute-0 nova_compute[192810]: 2025-09-30 21:34:42.947 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:34:43 compute-0 nova_compute[192810]: 2025-09-30 21:34:43.022 2 INFO nova.compute.manager [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] Took 2.44 seconds to build instance.
Sep 30 21:34:43 compute-0 nova_compute[192810]: 2025-09-30 21:34:43.052 2 DEBUG oslo_concurrency.lockutils [None req-dcb9b94e-8f46-47ab-8fff-73f910025d4a cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Lock "09b1225c-028f-48a8-b366-03be5cdd4f19" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.572s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:43 compute-0 nova_compute[192810]: 2025-09-30 21:34:43.258 2 DEBUG nova.network.neutron [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Successfully updated port: b0db3e99-94f0-42a8-a410-081e33fd0dee _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:34:43 compute-0 nova_compute[192810]: 2025-09-30 21:34:43.286 2 DEBUG oslo_concurrency.lockutils [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquiring lock "refresh_cache-4255e358-6db2-4947-a3d6-4e045f9235fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:34:43 compute-0 nova_compute[192810]: 2025-09-30 21:34:43.286 2 DEBUG oslo_concurrency.lockutils [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquired lock "refresh_cache-4255e358-6db2-4947-a3d6-4e045f9235fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:34:43 compute-0 nova_compute[192810]: 2025-09-30 21:34:43.287 2 DEBUG nova.network.neutron [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:34:43 compute-0 nova_compute[192810]: 2025-09-30 21:34:43.389 2 DEBUG nova.compute.manager [req-7f997027-7ff3-4fea-ae79-79874d0fb95e req-7fead06f-3f44-4bf0-8d41-7f6c14add399 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Received event network-changed-b0db3e99-94f0-42a8-a410-081e33fd0dee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:34:43 compute-0 nova_compute[192810]: 2025-09-30 21:34:43.390 2 DEBUG nova.compute.manager [req-7f997027-7ff3-4fea-ae79-79874d0fb95e req-7fead06f-3f44-4bf0-8d41-7f6c14add399 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Refreshing instance network info cache due to event network-changed-b0db3e99-94f0-42a8-a410-081e33fd0dee. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:34:43 compute-0 nova_compute[192810]: 2025-09-30 21:34:43.390 2 DEBUG oslo_concurrency.lockutils [req-7f997027-7ff3-4fea-ae79-79874d0fb95e req-7fead06f-3f44-4bf0-8d41-7f6c14add399 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-4255e358-6db2-4947-a3d6-4e045f9235fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:34:43 compute-0 nova_compute[192810]: 2025-09-30 21:34:43.459 2 DEBUG nova.network.neutron [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.908 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '09b1225c-028f-48a8-b366-03be5cdd4f19', 'name': 'tempest-ServersAaction247Test-server-938531012', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000063', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '123dde9c899d4e498c5a20bf76fe32a4', 'user_id': 'cced9253a90a431bbc6bc5439d274109', 'hostId': '52e7d4e05f4c9504977a8d66267b8313b45589c11eee6d0f11b29a3a', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.910 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770', 'name': 'tempest-ServerActionsTestOtherB-server-1093237178', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000005a', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'd876c85b6ca5418eb657e48391a6503b', 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'hostId': 'f125e72f2a4296991d4d9d11243ea6408a2a2d8d9fbf13e48a5607bf', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.911 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.911 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.911 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServersAaction247Test-server-938531012>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-1093237178>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersAaction247Test-server-938531012>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-1093237178>]
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.911 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Sep 30 21:34:43 compute-0 nova_compute[192810]: 2025-09-30 21:34:43.930 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.949 12 DEBUG ceilometer.compute.pollsters [-] 09b1225c-028f-48a8-b366-03be5cdd4f19/disk.device.read.requests volume: 689 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.950 12 DEBUG ceilometer.compute.pollsters [-] 09b1225c-028f-48a8-b366-03be5cdd4f19/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.968 12 DEBUG ceilometer.compute.pollsters [-] f92e449c-90c2-4cba-a8c1-ba6b1c82d770/disk.device.read.requests volume: 1099 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.968 12 DEBUG ceilometer.compute.pollsters [-] f92e449c-90c2-4cba-a8c1-ba6b1c82d770/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2cb67428-30da-4861-afd5-3fae94075a98', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 689, 'user_id': 'cced9253a90a431bbc6bc5439d274109', 'user_name': None, 'project_id': '123dde9c899d4e498c5a20bf76fe32a4', 'project_name': None, 'resource_id': '09b1225c-028f-48a8-b366-03be5cdd4f19-vda', 'timestamp': '2025-09-30T21:34:43.911894', 'resource_metadata': {'display_name': 'tempest-ServersAaction247Test-server-938531012', 'name': 'instance-00000063', 'instance_id': '09b1225c-028f-48a8-b366-03be5cdd4f19', 'instance_type': 'm1.nano', 'host': '52e7d4e05f4c9504977a8d66267b8313b45589c11eee6d0f11b29a3a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '47972864-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4779.599386479, 'message_signature': 'cb4c91fec100992897da789a55ea785113d5db32754c85d27be8c9cea2f5a98d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cced9253a90a431bbc6bc5439d274109', 'user_name': None, 'project_id': '123dde9c899d4e498c5a20bf76fe32a4', 'project_name': None, 
'resource_id': '09b1225c-028f-48a8-b366-03be5cdd4f19-sda', 'timestamp': '2025-09-30T21:34:43.911894', 'resource_metadata': {'display_name': 'tempest-ServersAaction247Test-server-938531012', 'name': 'instance-00000063', 'instance_id': '09b1225c-028f-48a8-b366-03be5cdd4f19', 'instance_type': 'm1.nano', 'host': '52e7d4e05f4c9504977a8d66267b8313b45589c11eee6d0f11b29a3a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4797373c-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4779.599386479, 'message_signature': 'b1de3b08422c554c3fe925cb911045c45c868c2f22e72db6fc6b20e758588b1d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1099, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770-vda', 'timestamp': '2025-09-30T21:34:43.911894', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1093237178', 'name': 'instance-0000005a', 'instance_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770', 'instance_type': 'm1.nano', 'host': 'f125e72f2a4296991d4d9d11243ea6408a2a2d8d9fbf13e48a5607bf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4799eb44-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4779.638586408, 'message_signature': '03880557fd9934075489fd64ad9f9e8d15c5e623b796d77a283a12714b1c5236'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770-sda', 'timestamp': '2025-09-30T21:34:43.911894', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1093237178', 'name': 'instance-0000005a', 'instance_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770', 'instance_type': 'm1.nano', 'host': 'f125e72f2a4296991d4d9d11243ea6408a2a2d8d9fbf13e48a5607bf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4799f99a-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4779.638586408, 'message_signature': '1e76fd1c470d562f909e2869ac40f4e354e20a63c366d285fdbe514c75e72211'}]}, 'timestamp': '2025-09-30 21:34:43.969209', '_unique_id': 'ffddbefea5ce4b99b1da552575aec8c8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.970 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.971 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.971 12 DEBUG ceilometer.compute.pollsters [-] 09b1225c-028f-48a8-b366-03be5cdd4f19/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.972 12 DEBUG ceilometer.compute.pollsters [-] 09b1225c-028f-48a8-b366-03be5cdd4f19/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.972 12 DEBUG ceilometer.compute.pollsters [-] f92e449c-90c2-4cba-a8c1-ba6b1c82d770/disk.device.write.latency volume: 1492336707 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.972 12 DEBUG ceilometer.compute.pollsters [-] f92e449c-90c2-4cba-a8c1-ba6b1c82d770/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7d7f4198-3bfc-442e-b088-34a1b2eb6b2e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'cced9253a90a431bbc6bc5439d274109', 'user_name': None, 'project_id': '123dde9c899d4e498c5a20bf76fe32a4', 'project_name': None, 'resource_id': '09b1225c-028f-48a8-b366-03be5cdd4f19-vda', 'timestamp': '2025-09-30T21:34:43.971768', 'resource_metadata': {'display_name': 'tempest-ServersAaction247Test-server-938531012', 'name': 'instance-00000063', 'instance_id': '09b1225c-028f-48a8-b366-03be5cdd4f19', 'instance_type': 'm1.nano', 'host': '52e7d4e05f4c9504977a8d66267b8313b45589c11eee6d0f11b29a3a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '479a6a2e-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4779.599386479, 'message_signature': '37fb037c374868681793779589a0e9fb77d5a5dda18594cd8490760d4ac27d52'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'cced9253a90a431bbc6bc5439d274109', 'user_name': None, 'project_id': '123dde9c899d4e498c5a20bf76fe32a4', 'project_name': None, 'resource_id': '09b1225c-028f-48a8-b366-03be5cdd4f19-sda', 'timestamp': '2025-09-30T21:34:43.971768', 'resource_metadata': {'display_name': 'tempest-ServersAaction247Test-server-938531012', 'name': 'instance-00000063', 'instance_id': '09b1225c-028f-48a8-b366-03be5cdd4f19', 'instance_type': 'm1.nano', 'host': '52e7d4e05f4c9504977a8d66267b8313b45589c11eee6d0f11b29a3a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '479a73c0-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4779.599386479, 'message_signature': '1bf3b4b6d24cb712115e79c60a0a07091aa7fc54e1d8d6383404cfcc193d4f13'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1492336707, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770-vda', 'timestamp': '2025-09-30T21:34:43.971768', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1093237178', 'name': 'instance-0000005a', 'instance_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770', 'instance_type': 'm1.nano', 'host': 'f125e72f2a4296991d4d9d11243ea6408a2a2d8d9fbf13e48a5607bf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '479a7cda-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4779.638586408, 'message_signature': 'bae68794904d6320eb63649ae427147f560027a81463cd81d3510299a1ea5a58'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770-sda', 'timestamp': '2025-09-30T21:34:43.971768', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1093237178', 'name': 'instance-0000005a', 'instance_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770', 'instance_type': 'm1.nano', 'host': 'f125e72f2a4296991d4d9d11243ea6408a2a2d8d9fbf13e48a5607bf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '479a86ee-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4779.638586408, 'message_signature': 'b657cf1f79ebbb103bf22fd7d4fff0faf4a4ba3850fc4e7fdd3b601b4290e0b2'}]}, 'timestamp': '2025-09-30 21:34:43.972802', '_unique_id': '1d8e40dfd7c947deaa53ceebdb4ed151'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.973 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.974 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.977 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for f92e449c-90c2-4cba-a8c1-ba6b1c82d770 / tape2cc5531-35 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.978 12 DEBUG ceilometer.compute.pollsters [-] f92e449c-90c2-4cba-a8c1-ba6b1c82d770/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '033c5c44-9d99-4c59-a429-f4fb7e9f39e5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'instance-0000005a-f92e449c-90c2-4cba-a8c1-ba6b1c82d770-tape2cc5531-35', 'timestamp': '2025-09-30T21:34:43.974732', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1093237178', 'name': 'tape2cc5531-35', 'instance_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770', 'instance_type': 'm1.nano', 'host': 'f125e72f2a4296991d4d9d11243ea6408a2a2d8d9fbf13e48a5607bf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3f:a2:bc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape2cc5531-35'}, 'message_id': '479b6ae6-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4779.663691293, 'message_signature': 'ca2b1511da60061fdc852d9b7513a43777ad9ea0728c48911d7d42cf353daac7'}]}, 'timestamp': '2025-09-30 21:34:43.978746', '_unique_id': '5f3f578201c8494ebf2d7f37c3fc8a03'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.979 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.980 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Sep 30 21:34:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:43.993 12 DEBUG ceilometer.compute.pollsters [-] 09b1225c-028f-48a8-b366-03be5cdd4f19/cpu volume: 1120000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.008 12 DEBUG ceilometer.compute.pollsters [-] f92e449c-90c2-4cba-a8c1-ba6b1c82d770/cpu volume: 11480000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '825be93d-a81f-4fbb-a261-a7bc3182e175', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1120000000, 'user_id': 'cced9253a90a431bbc6bc5439d274109', 'user_name': None, 'project_id': '123dde9c899d4e498c5a20bf76fe32a4', 'project_name': None, 'resource_id': '09b1225c-028f-48a8-b366-03be5cdd4f19', 'timestamp': '2025-09-30T21:34:43.980710', 'resource_metadata': {'display_name': 'tempest-ServersAaction247Test-server-938531012', 'name': 'instance-00000063', 'instance_id': '09b1225c-028f-48a8-b366-03be5cdd4f19', 'instance_type': 'm1.nano', 'host': '52e7d4e05f4c9504977a8d66267b8313b45589c11eee6d0f11b29a3a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '479dc6d8-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4779.680992866, 'message_signature': 'd4a70fd657c6d6e7a9774274dadcb70a5be1259f0ae239b5e53012610ec3e78c'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11480000000, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770', 'timestamp': '2025-09-30T21:34:43.980710', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1093237178', 'name': 'instance-0000005a', 'instance_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770', 'instance_type': 'm1.nano', 'host': 'f125e72f2a4296991d4d9d11243ea6408a2a2d8d9fbf13e48a5607bf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '47a004de-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4779.695614684, 'message_signature': '2ce82992824b467931eaa892f4dfcde0d087dbf24ea6c2ec2ef963c3cb039700'}]}, 'timestamp': '2025-09-30 21:34:44.008894', '_unique_id': 'c260fae3d872409c92fd59e798481060'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.009 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.011 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.011 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.011 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServersAaction247Test-server-938531012>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-1093237178>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersAaction247Test-server-938531012>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-1093237178>]
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.011 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.012 12 DEBUG ceilometer.compute.pollsters [-] 09b1225c-028f-48a8-b366-03be5cdd4f19/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.012 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 09b1225c-028f-48a8-b366-03be5cdd4f19: ceilometer.compute.pollsters.NoVolumeException
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.012 12 DEBUG ceilometer.compute.pollsters [-] f92e449c-90c2-4cba-a8c1-ba6b1c82d770/memory.usage volume: 46.65234375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2dbc8249-82bf-4598-90e0-86bb34124984', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 46.65234375, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770', 'timestamp': '2025-09-30T21:34:44.011979', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1093237178', 'name': 'instance-0000005a', 'instance_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770', 'instance_type': 'm1.nano', 'host': 'f125e72f2a4296991d4d9d11243ea6408a2a2d8d9fbf13e48a5607bf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '47a0950c-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4779.695614684, 'message_signature': 'b622a27bfb9bd505800b08cddba001004a99aafd4b2ba7368df38869df7a0802'}]}, 'timestamp': '2025-09-30 21:34:44.012492', '_unique_id': '03b7354e389b4b3f851cbf3d88c7ef7d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.013 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 DEBUG ceilometer.compute.pollsters [-] f92e449c-90c2-4cba-a8c1-ba6b1c82d770/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '11b9a4e0-fc46-4a46-8f1d-22348423986a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'instance-0000005a-f92e449c-90c2-4cba-a8c1-ba6b1c82d770-tape2cc5531-35', 'timestamp': '2025-09-30T21:34:44.014175', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1093237178', 'name': 'tape2cc5531-35', 'instance_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770', 'instance_type': 'm1.nano', 'host': 'f125e72f2a4296991d4d9d11243ea6408a2a2d8d9fbf13e48a5607bf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3f:a2:bc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape2cc5531-35'}, 'message_id': '47a0e232-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4779.663691293, 'message_signature': '226dba426207c9e52d84c8af598697b4f88b9ba81343f026765126de141caa4b'}]}, 'timestamp': '2025-09-30 21:34:44.014460', '_unique_id': '110bf5c0153b4a4a861da0bc06943263'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.014 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.015 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.015 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.016 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServersAaction247Test-server-938531012>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-1093237178>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersAaction247Test-server-938531012>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-1093237178>]
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.016 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.016 12 DEBUG ceilometer.compute.pollsters [-] f92e449c-90c2-4cba-a8c1-ba6b1c82d770/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd173b104-bb5c-4ddd-a7f7-b37470a880f1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'instance-0000005a-f92e449c-90c2-4cba-a8c1-ba6b1c82d770-tape2cc5531-35', 'timestamp': '2025-09-30T21:34:44.016359', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1093237178', 'name': 'tape2cc5531-35', 'instance_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770', 'instance_type': 'm1.nano', 'host': 'f125e72f2a4296991d4d9d11243ea6408a2a2d8d9fbf13e48a5607bf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3f:a2:bc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape2cc5531-35'}, 'message_id': '47a1370a-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4779.663691293, 'message_signature': '9bbe8b805605ae06f126213a0c3876b227f9cb2d0f6922d690b3f21b1ca417ac'}]}, 'timestamp': '2025-09-30 21:34:44.016646', '_unique_id': '21e2372028a64a47934e75c8a136ae70'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.017 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 DEBUG ceilometer.compute.pollsters [-] f92e449c-90c2-4cba-a8c1-ba6b1c82d770/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b1b7257d-8048-47dd-945b-a62918ec217e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'instance-0000005a-f92e449c-90c2-4cba-a8c1-ba6b1c82d770-tape2cc5531-35', 'timestamp': '2025-09-30T21:34:44.018102', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1093237178', 'name': 'tape2cc5531-35', 'instance_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770', 'instance_type': 'm1.nano', 'host': 'f125e72f2a4296991d4d9d11243ea6408a2a2d8d9fbf13e48a5607bf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3f:a2:bc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape2cc5531-35'}, 'message_id': '47a17b66-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4779.663691293, 'message_signature': '0c66c39d37d422cbb13a8149b089a635fcafab720af5b6259324f61a14f44efa'}]}, 'timestamp': '2025-09-30 21:34:44.018376', '_unique_id': '4e0ad0ab88614c8e992ce32147e0c771'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.018 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.019 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.027 12 DEBUG ceilometer.compute.pollsters [-] 09b1225c-028f-48a8-b366-03be5cdd4f19/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.028 12 DEBUG ceilometer.compute.pollsters [-] 09b1225c-028f-48a8-b366-03be5cdd4f19/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.035 12 DEBUG ceilometer.compute.pollsters [-] f92e449c-90c2-4cba-a8c1-ba6b1c82d770/disk.device.allocation volume: 30744576 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.036 12 DEBUG ceilometer.compute.pollsters [-] f92e449c-90c2-4cba-a8c1-ba6b1c82d770/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8ca26fa3-52ff-43e3-b9d0-96cfd69aa1f3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'cced9253a90a431bbc6bc5439d274109', 'user_name': None, 'project_id': '123dde9c899d4e498c5a20bf76fe32a4', 'project_name': None, 'resource_id': '09b1225c-028f-48a8-b366-03be5cdd4f19-vda', 'timestamp': '2025-09-30T21:34:44.019806', 'resource_metadata': {'display_name': 'tempest-ServersAaction247Test-server-938531012', 'name': 'instance-00000063', 'instance_id': '09b1225c-028f-48a8-b366-03be5cdd4f19', 'instance_type': 'm1.nano', 'host': '52e7d4e05f4c9504977a8d66267b8313b45589c11eee6d0f11b29a3a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '47a2fe46-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4779.7073105, 'message_signature': 'ec47c522005540b7f3657dd4f3a27301c181596fcd5c760387f099d75e858358'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'cced9253a90a431bbc6bc5439d274109', 'user_name': None, 'project_id': '123dde9c899d4e498c5a20bf76fe32a4', 'project_name': None, 'resource_id': '09b1225c-028f-48a8-b366-03be5cdd4f19-sda', 'timestamp': '2025-09-30T21:34:44.019806', 'resource_metadata': {'display_name': 'tempest-ServersAaction247Test-server-938531012', 'name': 'instance-00000063', 'instance_id': '09b1225c-028f-48a8-b366-03be5cdd4f19', 'instance_type': 'm1.nano', 'host': '52e7d4e05f4c9504977a8d66267b8313b45589c11eee6d0f11b29a3a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '47a30bb6-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4779.7073105, 'message_signature': '9085b19132e3c14450e7747735882cd200e8c873932ae30258b0e5ed5f222ffa'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30744576, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770-vda', 'timestamp': '2025-09-30T21:34:44.019806', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1093237178', 'name': 'instance-0000005a', 'instance_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770', 'instance_type': 'm1.nano', 'host': 'f125e72f2a4296991d4d9d11243ea6408a2a2d8d9fbf13e48a5607bf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '47a438ce-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4779.716120285, 'message_signature': 'ca68cde374ca31157bab687e849c8e81884ffb872ee2b3bd5bd7d08b8eba9da7'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770-sda', 'timestamp': '2025-09-30T21:34:44.019806', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1093237178', 'name': 'instance-0000005a', 'instance_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770', 'instance_type': 'm1.nano', 'host': 'f125e72f2a4296991d4d9d11243ea6408a2a2d8d9fbf13e48a5607bf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '47a443a0-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4779.716120285, 'message_signature': 'a83dd3f446c4992153057eaac2c40a1e2feaa4a813b8a105de92f6b7ab493b53'}]}, 'timestamp': '2025-09-30 21:34:44.036625', '_unique_id': '15256bedc42c453faed5bffa538b5aa8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.037 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 DEBUG ceilometer.compute.pollsters [-] f92e449c-90c2-4cba-a8c1-ba6b1c82d770/network.incoming.packets volume: 13 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7803c655-3dd4-4146-a84b-8fd8a01f51c4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 13, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'instance-0000005a-f92e449c-90c2-4cba-a8c1-ba6b1c82d770-tape2cc5531-35', 'timestamp': '2025-09-30T21:34:44.039117', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1093237178', 'name': 'tape2cc5531-35', 'instance_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770', 'instance_type': 'm1.nano', 'host': 'f125e72f2a4296991d4d9d11243ea6408a2a2d8d9fbf13e48a5607bf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3f:a2:bc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape2cc5531-35'}, 'message_id': '47a4b0b0-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4779.663691293, 'message_signature': '21fb77bd3bf70e42292602eba761e039d5ecfd54775b4805a230984a2f22eae9'}]}, 'timestamp': '2025-09-30 21:34:44.039403', '_unique_id': '5bad4f1112f44a87861affe95ef47aed'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.039 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.040 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.040 12 DEBUG ceilometer.compute.pollsters [-] 09b1225c-028f-48a8-b366-03be5cdd4f19/disk.device.read.bytes volume: 21485056 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.041 12 DEBUG ceilometer.compute.pollsters [-] 09b1225c-028f-48a8-b366-03be5cdd4f19/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.041 12 DEBUG ceilometer.compute.pollsters [-] f92e449c-90c2-4cba-a8c1-ba6b1c82d770/disk.device.read.bytes volume: 30501376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.041 12 DEBUG ceilometer.compute.pollsters [-] f92e449c-90c2-4cba-a8c1-ba6b1c82d770/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '176d96ce-0e5a-474a-8d08-7edc30982529', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 21485056, 'user_id': 'cced9253a90a431bbc6bc5439d274109', 'user_name': None, 'project_id': '123dde9c899d4e498c5a20bf76fe32a4', 'project_name': None, 'resource_id': '09b1225c-028f-48a8-b366-03be5cdd4f19-vda', 'timestamp': '2025-09-30T21:34:44.040903', 'resource_metadata': {'display_name': 'tempest-ServersAaction247Test-server-938531012', 'name': 'instance-00000063', 'instance_id': '09b1225c-028f-48a8-b366-03be5cdd4f19', 'instance_type': 'm1.nano', 'host': '52e7d4e05f4c9504977a8d66267b8313b45589c11eee6d0f11b29a3a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '47a4f598-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4779.599386479, 'message_signature': 'e169c2227a43f0c6a460ba488e2f4e0700de3d2624151c1d11c026d466eb5141'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'cced9253a90a431bbc6bc5439d274109', 'user_name': None, 'project_id': '123dde9c899d4e498c5a20bf76fe32a4', 'project_name': None, 'resource_id': '09b1225c-028f-48a8-b366-03be5cdd4f19-sda', 'timestamp': '2025-09-30T21:34:44.040903', 'resource_metadata': {'display_name': 'tempest-ServersAaction247Test-server-938531012', 'name': 'instance-00000063', 'instance_id': '09b1225c-028f-48a8-b366-03be5cdd4f19', 'instance_type': 'm1.nano', 'host': '52e7d4e05f4c9504977a8d66267b8313b45589c11eee6d0f11b29a3a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '47a4ff8e-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4779.599386479, 'message_signature': 'b9c39dbc5d5a11f25379760639031b855e065683a821e099017c5bb4b42d1ee7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30501376, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770-vda', 'timestamp': '2025-09-30T21:34:44.040903', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1093237178', 'name': 'instance-0000005a', 'instance_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770', 'instance_type': 'm1.nano', 'host': 'f125e72f2a4296991d4d9d11243ea6408a2a2d8d9fbf13e48a5607bf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '47a5086c-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4779.638586408, 'message_signature': 'a6e45021b3d00ef041dd07e73a6328e66e64949717c84fe0b290a6164199b721'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770-sda', 'timestamp': '2025-09-30T21:34:44.040903', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1093237178', 'name': 'instance-0000005a', 'instance_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770', 'instance_type': 'm1.nano', 'host': 'f125e72f2a4296991d4d9d11243ea6408a2a2d8d9fbf13e48a5607bf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '47a51208-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4779.638586408, 'message_signature': '4f223e5f1961d46c8e16e8e8781c0d99064a77acc032c7c0c5acb3abcb2b49f1'}]}, 'timestamp': '2025-09-30 21:34:44.041875', '_unique_id': 'f3da3c5fc448413fae2559827aaad4fe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.042 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.043 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.043 12 DEBUG ceilometer.compute.pollsters [-] f92e449c-90c2-4cba-a8c1-ba6b1c82d770/network.incoming.bytes volume: 1736 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1fb722b4-6a06-4a2d-b33d-0604a7cdf8bf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1736, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'instance-0000005a-f92e449c-90c2-4cba-a8c1-ba6b1c82d770-tape2cc5531-35', 'timestamp': '2025-09-30T21:34:44.043344', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1093237178', 'name': 'tape2cc5531-35', 'instance_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770', 'instance_type': 'm1.nano', 'host': 'f125e72f2a4296991d4d9d11243ea6408a2a2d8d9fbf13e48a5607bf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3f:a2:bc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape2cc5531-35'}, 'message_id': '47a5551a-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4779.663691293, 'message_signature': 'f2bce5da2047746ab1a10a60687ee125f8024cd1a9169b9dcdd9a1a913b9a75d'}]}, 'timestamp': '2025-09-30 21:34:44.043622', '_unique_id': '5adec155d29f43488b849419e5cbe815'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.044 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.045 12 DEBUG ceilometer.compute.pollsters [-] 09b1225c-028f-48a8-b366-03be5cdd4f19/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.045 12 DEBUG ceilometer.compute.pollsters [-] 09b1225c-028f-48a8-b366-03be5cdd4f19/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.045 12 DEBUG ceilometer.compute.pollsters [-] f92e449c-90c2-4cba-a8c1-ba6b1c82d770/disk.device.write.bytes volume: 73015296 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.045 12 DEBUG ceilometer.compute.pollsters [-] f92e449c-90c2-4cba-a8c1-ba6b1c82d770/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '851f3502-9282-405c-adb5-a7f4e5d58bc9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cced9253a90a431bbc6bc5439d274109', 'user_name': None, 'project_id': '123dde9c899d4e498c5a20bf76fe32a4', 'project_name': None, 'resource_id': '09b1225c-028f-48a8-b366-03be5cdd4f19-vda', 'timestamp': '2025-09-30T21:34:44.044999', 'resource_metadata': {'display_name': 'tempest-ServersAaction247Test-server-938531012', 'name': 'instance-00000063', 'instance_id': '09b1225c-028f-48a8-b366-03be5cdd4f19', 'instance_type': 'm1.nano', 'host': '52e7d4e05f4c9504977a8d66267b8313b45589c11eee6d0f11b29a3a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '47a59584-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4779.599386479, 'message_signature': 'f86a092e0d3035c5a45a75338c33939e5ed5da8d9090440204b54c9ba1654fd3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cced9253a90a431bbc6bc5439d274109', 'user_name': None, 'project_id': '123dde9c899d4e498c5a20bf76fe32a4', 'project_name': None, 'resource_id': 
'09b1225c-028f-48a8-b366-03be5cdd4f19-sda', 'timestamp': '2025-09-30T21:34:44.044999', 'resource_metadata': {'display_name': 'tempest-ServersAaction247Test-server-938531012', 'name': 'instance-00000063', 'instance_id': '09b1225c-028f-48a8-b366-03be5cdd4f19', 'instance_type': 'm1.nano', 'host': '52e7d4e05f4c9504977a8d66267b8313b45589c11eee6d0f11b29a3a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '47a59e8a-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4779.599386479, 'message_signature': 'fe7c303f41e9b589cf57047e9db801742662fac9698c0f90cf441b8d6aab7874'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73015296, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770-vda', 'timestamp': '2025-09-30T21:34:44.044999', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1093237178', 'name': 'instance-0000005a', 'instance_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770', 'instance_type': 'm1.nano', 'host': 'f125e72f2a4296991d4d9d11243ea6408a2a2d8d9fbf13e48a5607bf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '47a5a7f4-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4779.638586408, 'message_signature': '8b96eb62ca35d111d94e19c864a7b6f3444852f4ad69ce2c3e77103024a83988'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770-sda', 'timestamp': '2025-09-30T21:34:44.044999', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1093237178', 'name': 'instance-0000005a', 'instance_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770', 'instance_type': 'm1.nano', 'host': 'f125e72f2a4296991d4d9d11243ea6408a2a2d8d9fbf13e48a5607bf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '47a5b0be-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4779.638586408, 'message_signature': 'ab8e5d1c7fc9ab0377f2a854cf79743dbb993c52b80544d709fadcd4822d1e1d'}]}, 'timestamp': '2025-09-30 21:34:44.045939', '_unique_id': '59e931bbaa0948a9813498f6a523d626'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.046 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.047 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.047 12 DEBUG ceilometer.compute.pollsters [-] 09b1225c-028f-48a8-b366-03be5cdd4f19/disk.device.read.latency volume: 413255380 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.047 12 DEBUG ceilometer.compute.pollsters [-] 09b1225c-028f-48a8-b366-03be5cdd4f19/disk.device.read.latency volume: 2195283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.047 12 DEBUG ceilometer.compute.pollsters [-] f92e449c-90c2-4cba-a8c1-ba6b1c82d770/disk.device.read.latency volume: 506238688 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 DEBUG ceilometer.compute.pollsters [-] f92e449c-90c2-4cba-a8c1-ba6b1c82d770/disk.device.read.latency volume: 101739007 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f4e5d55a-0606-4ebb-8393-1248ebe431ce', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 413255380, 'user_id': 'cced9253a90a431bbc6bc5439d274109', 'user_name': None, 'project_id': '123dde9c899d4e498c5a20bf76fe32a4', 'project_name': None, 'resource_id': '09b1225c-028f-48a8-b366-03be5cdd4f19-vda', 'timestamp': '2025-09-30T21:34:44.047496', 'resource_metadata': {'display_name': 'tempest-ServersAaction247Test-server-938531012', 'name': 'instance-00000063', 'instance_id': '09b1225c-028f-48a8-b366-03be5cdd4f19', 'instance_type': 'm1.nano', 'host': '52e7d4e05f4c9504977a8d66267b8313b45589c11eee6d0f11b29a3a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '47a5f7c2-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4779.599386479, 'message_signature': '8ade1e5f35cace355b85b5eb216c89e28b5df06d54c223be7b0e7e9344bc71d3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2195283, 'user_id': 'cced9253a90a431bbc6bc5439d274109', 'user_name': None, 'project_id': '123dde9c899d4e498c5a20bf76fe32a4', 'project_name': None, 
'resource_id': '09b1225c-028f-48a8-b366-03be5cdd4f19-sda', 'timestamp': '2025-09-30T21:34:44.047496', 'resource_metadata': {'display_name': 'tempest-ServersAaction247Test-server-938531012', 'name': 'instance-00000063', 'instance_id': '09b1225c-028f-48a8-b366-03be5cdd4f19', 'instance_type': 'm1.nano', 'host': '52e7d4e05f4c9504977a8d66267b8313b45589c11eee6d0f11b29a3a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '47a600f0-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4779.599386479, 'message_signature': '70a9c43dbf9d99b7cf14846a81c8ba597014f1a9707663744eb96aac56c8503c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 506238688, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770-vda', 'timestamp': '2025-09-30T21:34:44.047496', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1093237178', 'name': 'instance-0000005a', 'instance_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770', 'instance_type': 'm1.nano', 'host': 'f125e72f2a4296991d4d9d11243ea6408a2a2d8d9fbf13e48a5607bf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '47a609b0-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4779.638586408, 'message_signature': '30a551322400fca724e7293ddb948fa7f78dee213a88a0c61a281210321bc733'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 101739007, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770-sda', 'timestamp': '2025-09-30T21:34:44.047496', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1093237178', 'name': 'instance-0000005a', 'instance_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770', 'instance_type': 'm1.nano', 'host': 'f125e72f2a4296991d4d9d11243ea6408a2a2d8d9fbf13e48a5607bf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '47a61248-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4779.638586408, 'message_signature': '09c574d03c144c6f73a6f033fe993082ed92a07083956e4c34e6cb99b256735c'}]}, 'timestamp': '2025-09-30 21:34:44.048435', '_unique_id': '0e69269749444fbf9726f1c7c003b06b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.048 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.049 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.049 12 DEBUG ceilometer.compute.pollsters [-] 09b1225c-028f-48a8-b366-03be5cdd4f19/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.050 12 DEBUG ceilometer.compute.pollsters [-] 09b1225c-028f-48a8-b366-03be5cdd4f19/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.050 12 DEBUG ceilometer.compute.pollsters [-] f92e449c-90c2-4cba-a8c1-ba6b1c82d770/disk.device.write.requests volume: 338 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.050 12 DEBUG ceilometer.compute.pollsters [-] f92e449c-90c2-4cba-a8c1-ba6b1c82d770/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7e8fc386-63b1-478c-9c6e-3165b26918b3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'cced9253a90a431bbc6bc5439d274109', 'user_name': None, 'project_id': '123dde9c899d4e498c5a20bf76fe32a4', 'project_name': None, 'resource_id': '09b1225c-028f-48a8-b366-03be5cdd4f19-vda', 'timestamp': '2025-09-30T21:34:44.049932', 'resource_metadata': {'display_name': 'tempest-ServersAaction247Test-server-938531012', 'name': 'instance-00000063', 'instance_id': '09b1225c-028f-48a8-b366-03be5cdd4f19', 'instance_type': 'm1.nano', 'host': '52e7d4e05f4c9504977a8d66267b8313b45589c11eee6d0f11b29a3a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '47a65776-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4779.599386479, 'message_signature': 'bc82ba0b74c3885f94f5390edfaf209137bede9bedb814094f666b3eb8f43fcd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'cced9253a90a431bbc6bc5439d274109', 'user_name': None, 'project_id': '123dde9c899d4e498c5a20bf76fe32a4', 'project_name': None, 
'resource_id': '09b1225c-028f-48a8-b366-03be5cdd4f19-sda', 'timestamp': '2025-09-30T21:34:44.049932', 'resource_metadata': {'display_name': 'tempest-ServersAaction247Test-server-938531012', 'name': 'instance-00000063', 'instance_id': '09b1225c-028f-48a8-b366-03be5cdd4f19', 'instance_type': 'm1.nano', 'host': '52e7d4e05f4c9504977a8d66267b8313b45589c11eee6d0f11b29a3a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '47a6664e-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4779.599386479, 'message_signature': '13ae24ff1a217186611a4a26abe652f64911faa362854eb3db361aab8863cdec'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 338, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770-vda', 'timestamp': '2025-09-30T21:34:44.049932', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1093237178', 'name': 'instance-0000005a', 'instance_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770', 'instance_type': 'm1.nano', 'host': 'f125e72f2a4296991d4d9d11243ea6408a2a2d8d9fbf13e48a5607bf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '47a66fe0-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4779.638586408, 'message_signature': '78690713637dbfc484342576f57e5a00d99237e4c59cd1b65ec105932e4eaece'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770-sda', 'timestamp': '2025-09-30T21:34:44.049932', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1093237178', 'name': 'instance-0000005a', 'instance_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770', 'instance_type': 'm1.nano', 'host': 'f125e72f2a4296991d4d9d11243ea6408a2a2d8d9fbf13e48a5607bf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '47a67878-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4779.638586408, 'message_signature': 'dd42c0df60243f2002db33495c19c6b53faeec03b0b7898941d87969eb7965d1'}]}, 'timestamp': '2025-09-30 21:34:44.051051', '_unique_id': 'dfdc39cf876f42edba5be08aa5cf2565'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.051 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.052 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.052 12 DEBUG ceilometer.compute.pollsters [-] f92e449c-90c2-4cba-a8c1-ba6b1c82d770/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fcbae660-ab19-447b-b0e1-44da0b23738f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'instance-0000005a-f92e449c-90c2-4cba-a8c1-ba6b1c82d770-tape2cc5531-35', 'timestamp': '2025-09-30T21:34:44.052562', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1093237178', 'name': 'tape2cc5531-35', 'instance_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770', 'instance_type': 'm1.nano', 'host': 'f125e72f2a4296991d4d9d11243ea6408a2a2d8d9fbf13e48a5607bf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3f:a2:bc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape2cc5531-35'}, 'message_id': '47a6bd7e-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4779.663691293, 'message_signature': 'ca2b201232d7726a3f7b301a18690a037e347dfd5ace7685774583765eced7dd'}]}, 'timestamp': '2025-09-30 21:34:44.052833', '_unique_id': '2d6f4722195d4564afbad63c6b01566e'}: kombu.exceptions.OperationalError: [Errno 111] 
Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.053 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.054 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.054 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.054 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServersAaction247Test-server-938531012>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-1093237178>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServersAaction247Test-server-938531012>, <NovaLikeServer: tempest-ServerActionsTestOtherB-server-1093237178>]
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.054 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.054 12 DEBUG ceilometer.compute.pollsters [-] 09b1225c-028f-48a8-b366-03be5cdd4f19/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.054 12 DEBUG ceilometer.compute.pollsters [-] 09b1225c-028f-48a8-b366-03be5cdd4f19/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.055 12 DEBUG ceilometer.compute.pollsters [-] f92e449c-90c2-4cba-a8c1-ba6b1c82d770/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.055 12 DEBUG ceilometer.compute.pollsters [-] f92e449c-90c2-4cba-a8c1-ba6b1c82d770/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7b54c050-151f-4de8-b9b7-cea3ad394afe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cced9253a90a431bbc6bc5439d274109', 'user_name': None, 'project_id': '123dde9c899d4e498c5a20bf76fe32a4', 'project_name': None, 'resource_id': '09b1225c-028f-48a8-b366-03be5cdd4f19-vda', 'timestamp': '2025-09-30T21:34:44.054700', 'resource_metadata': {'display_name': 'tempest-ServersAaction247Test-server-938531012', 'name': 'instance-00000063', 'instance_id': '09b1225c-028f-48a8-b366-03be5cdd4f19', 'instance_type': 'm1.nano', 'host': '52e7d4e05f4c9504977a8d66267b8313b45589c11eee6d0f11b29a3a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '47a7108a-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4779.7073105, 'message_signature': '54f632e9a599b55acaa0325c805c591a8807bdf81cbbba4c7ae51c49857ecc48'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'cced9253a90a431bbc6bc5439d274109', 'user_name': None, 'project_id': '123dde9c899d4e498c5a20bf76fe32a4', 'project_name': None, 'resource_id': '09b1225c-028f-48a8-b366-03be5cdd4f19-sda', 'timestamp': '2025-09-30T21:34:44.054700', 'resource_metadata': {'display_name': 'tempest-ServersAaction247Test-server-938531012', 'name': 'instance-00000063', 'instance_id': '09b1225c-028f-48a8-b366-03be5cdd4f19', 'instance_type': 'm1.nano', 'host': '52e7d4e05f4c9504977a8d66267b8313b45589c11eee6d0f11b29a3a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '47a719c2-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4779.7073105, 'message_signature': 'f27c185b62dadb353c7df7cda91079c4a310abc00aa090a5d995b9f0691c2c8b'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770-vda', 'timestamp': '2025-09-30T21:34:44.054700', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1093237178', 'name': 'instance-0000005a', 'instance_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770', 'instance_type': 'm1.nano', 'host': 'f125e72f2a4296991d4d9d11243ea6408a2a2d8d9fbf13e48a5607bf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '47a728cc-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4779.716120285, 'message_signature': '2a2b95295ff81d332d80635a1870c67c597fc32b86813c64778f004b15c551f7'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770-sda', 'timestamp': '2025-09-30T21:34:44.054700', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1093237178', 'name': 'instance-0000005a', 'instance_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770', 'instance_type': 'm1.nano', 'host': 'f125e72f2a4296991d4d9d11243ea6408a2a2d8d9fbf13e48a5607bf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '47a73254-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4779.716120285, 'message_signature': 'b4ed5b173a33a9d8c42f1ce3f021c8d039e42f53f72e07d179132c7bab034ff5'}]}, 'timestamp': '2025-09-30 21:34:44.055810', '_unique_id': '3a42e77e922e46c9923e3d6758b407cc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.056 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.057 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.057 12 DEBUG ceilometer.compute.pollsters [-] f92e449c-90c2-4cba-a8c1-ba6b1c82d770/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7d1c5c33-7234-48b4-b465-adc93d31642d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'instance-0000005a-f92e449c-90c2-4cba-a8c1-ba6b1c82d770-tape2cc5531-35', 'timestamp': '2025-09-30T21:34:44.057455', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1093237178', 'name': 'tape2cc5531-35', 'instance_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770', 'instance_type': 'm1.nano', 'host': 'f125e72f2a4296991d4d9d11243ea6408a2a2d8d9fbf13e48a5607bf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3f:a2:bc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape2cc5531-35'}, 'message_id': '47a77d9a-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4779.663691293, 'message_signature': 'b49c9c57d0d9ad020ca61f6f0f9fa6e0e23c575af971e77f287bdc2c7b17beb6'}]}, 'timestamp': '2025-09-30 21:34:44.057751', '_unique_id': '318a62e8e5994250b5b1bbaf53498105'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.058 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.059 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.059 12 DEBUG ceilometer.compute.pollsters [-] 09b1225c-028f-48a8-b366-03be5cdd4f19/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.059 12 DEBUG ceilometer.compute.pollsters [-] 09b1225c-028f-48a8-b366-03be5cdd4f19/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.059 12 DEBUG ceilometer.compute.pollsters [-] f92e449c-90c2-4cba-a8c1-ba6b1c82d770/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.059 12 DEBUG ceilometer.compute.pollsters [-] f92e449c-90c2-4cba-a8c1-ba6b1c82d770/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '47b98d4b-4b6b-494b-a849-ec0d062fb5ab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'cced9253a90a431bbc6bc5439d274109', 'user_name': None, 'project_id': '123dde9c899d4e498c5a20bf76fe32a4', 'project_name': None, 'resource_id': '09b1225c-028f-48a8-b366-03be5cdd4f19-vda', 'timestamp': '2025-09-30T21:34:44.059199', 'resource_metadata': {'display_name': 'tempest-ServersAaction247Test-server-938531012', 'name': 'instance-00000063', 'instance_id': '09b1225c-028f-48a8-b366-03be5cdd4f19', 'instance_type': 'm1.nano', 'host': '52e7d4e05f4c9504977a8d66267b8313b45589c11eee6d0f11b29a3a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '47a7c03e-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4779.7073105, 'message_signature': '4bab53c28f5cd3c15c6d32d7f38ff3b5985db1f47a02154ef01ef9b933657e8f'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'cced9253a90a431bbc6bc5439d274109', 'user_name': None, 'project_id': '123dde9c899d4e498c5a20bf76fe32a4', 'project_name': None, 'resource_id': '09b1225c-028f-48a8-b366-03be5cdd4f19-sda', 'timestamp': '2025-09-30T21:34:44.059199', 'resource_metadata': {'display_name': 'tempest-ServersAaction247Test-server-938531012', 'name': 'instance-00000063', 'instance_id': '09b1225c-028f-48a8-b366-03be5cdd4f19', 'instance_type': 'm1.nano', 'host': '52e7d4e05f4c9504977a8d66267b8313b45589c11eee6d0f11b29a3a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '47a7c9f8-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4779.7073105, 'message_signature': '0a75d86328c54043734a21b013de73f0b99ebe1316d7cbfebeca2420d8a33791'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770-vda', 'timestamp': '2025-09-30T21:34:44.059199', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1093237178', 'name': 'instance-0000005a', 'instance_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770', 'instance_type': 'm1.nano', 'host': 'f125e72f2a4296991d4d9d11243ea6408a2a2d8d9fbf13e48a5607bf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '47a7d2ea-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4779.716120285, 'message_signature': '4e39fe595f05727afee36b529fa94a97306ed3dfba02412360ae5bf80345e18b'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770-sda', 'timestamp': '2025-09-30T21:34:44.059199', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1093237178', 'name': 'instance-0000005a', 'instance_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770', 'instance_type': 'm1.nano', 'host': 'f125e72f2a4296991d4d9d11243ea6408a2a2d8d9fbf13e48a5607bf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '47a7db78-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4779.716120285, 'message_signature': '11d3eadb954dadec5786526dde661d88471b2dad6a1e1213ff5caa9427e6ec3a'}]}, 'timestamp': '2025-09-30 21:34:44.060144', '_unique_id': 'f5187c508a514dda98e6212b88c7a2c5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.060 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.061 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.061 12 DEBUG ceilometer.compute.pollsters [-] f92e449c-90c2-4cba-a8c1-ba6b1c82d770/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bcf84b9f-ff03-4c5e-98e4-e7c72fbb40c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'instance-0000005a-f92e449c-90c2-4cba-a8c1-ba6b1c82d770-tape2cc5531-35', 'timestamp': '2025-09-30T21:34:44.061659', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1093237178', 'name': 'tape2cc5531-35', 'instance_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770', 'instance_type': 'm1.nano', 'host': 'f125e72f2a4296991d4d9d11243ea6408a2a2d8d9fbf13e48a5607bf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3f:a2:bc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape2cc5531-35'}, 'message_id': '47a82074-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4779.663691293, 'message_signature': '7355c5827add6529dba3e49a4a5c6207c4a1768aff703f0766792b99198275f5'}]}, 'timestamp': '2025-09-30 21:34:44.061919', '_unique_id': '62e79963e90b443caa5348832df4e604'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.062 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.063 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.063 12 DEBUG ceilometer.compute.pollsters [-] f92e449c-90c2-4cba-a8c1-ba6b1c82d770/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c204d0cd-9878-40a3-add2-e7e2fe3182d5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b9b3e9f2523944539f57a1ff5d565cb4', 'user_name': None, 'project_id': 'd876c85b6ca5418eb657e48391a6503b', 'project_name': None, 'resource_id': 'instance-0000005a-f92e449c-90c2-4cba-a8c1-ba6b1c82d770-tape2cc5531-35', 'timestamp': '2025-09-30T21:34:44.063331', 'resource_metadata': {'display_name': 'tempest-ServerActionsTestOtherB-server-1093237178', 'name': 'tape2cc5531-35', 'instance_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770', 'instance_type': 'm1.nano', 'host': 'f125e72f2a4296991d4d9d11243ea6408a2a2d8d9fbf13e48a5607bf', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3f:a2:bc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape2cc5531-35'}, 'message_id': '47a861b0-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4779.663691293, 'message_signature': '09bf4fbe4ccab44fa89a19bd73c00996c4ac62a1735245552d6ce58d4be08cea'}]}, 'timestamp': '2025-09-30 21:34:44.063604', '_unique_id': '9f2a807925bd4941ab759a5241a0d63d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:34:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:34:44.064 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:34:44 compute-0 nova_compute[192810]: 2025-09-30 21:34:44.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:34:44 compute-0 nova_compute[192810]: 2025-09-30 21:34:44.787 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Sep 30 21:34:44 compute-0 nova_compute[192810]: 2025-09-30 21:34:44.807 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Sep 30 21:34:44 compute-0 nova_compute[192810]: 2025-09-30 21:34:44.807 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:34:44 compute-0 nova_compute[192810]: 2025-09-30 21:34:44.807 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Sep 30 21:34:44 compute-0 nova_compute[192810]: 2025-09-30 21:34:44.963 2 DEBUG nova.network.neutron [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Updating instance_info_cache with network_info: [{"id": "b0db3e99-94f0-42a8-a410-081e33fd0dee", "address": "fa:16:3e:03:b5:6d", "network": {"id": "f5a6396a-b7b7-4ff1-a2af-27477fea2815", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8978d2df88a5434c8794b659033cca5e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0db3e99-94", "ovs_interfaceid": "b0db3e99-94f0-42a8-a410-081e33fd0dee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:34:44 compute-0 nova_compute[192810]: 2025-09-30 21:34:44.998 2 DEBUG oslo_concurrency.lockutils [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Releasing lock "refresh_cache-4255e358-6db2-4947-a3d6-4e045f9235fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:34:44 compute-0 nova_compute[192810]: 2025-09-30 21:34:44.998 2 DEBUG nova.compute.manager [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Instance network_info: |[{"id": "b0db3e99-94f0-42a8-a410-081e33fd0dee", "address": "fa:16:3e:03:b5:6d", "network": {"id": "f5a6396a-b7b7-4ff1-a2af-27477fea2815", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8978d2df88a5434c8794b659033cca5e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0db3e99-94", "ovs_interfaceid": "b0db3e99-94f0-42a8-a410-081e33fd0dee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:34:44 compute-0 nova_compute[192810]: 2025-09-30 21:34:44.999 2 DEBUG oslo_concurrency.lockutils [req-7f997027-7ff3-4fea-ae79-79874d0fb95e req-7fead06f-3f44-4bf0-8d41-7f6c14add399 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-4255e358-6db2-4947-a3d6-4e045f9235fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:34:44 compute-0 nova_compute[192810]: 2025-09-30 21:34:44.999 2 DEBUG nova.network.neutron [req-7f997027-7ff3-4fea-ae79-79874d0fb95e req-7fead06f-3f44-4bf0-8d41-7f6c14add399 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Refreshing network info cache for port b0db3e99-94f0-42a8-a410-081e33fd0dee _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.002 2 DEBUG nova.virt.libvirt.driver [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Start _get_guest_xml network_info=[{"id": "b0db3e99-94f0-42a8-a410-081e33fd0dee", "address": "fa:16:3e:03:b5:6d", "network": {"id": "f5a6396a-b7b7-4ff1-a2af-27477fea2815", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8978d2df88a5434c8794b659033cca5e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0db3e99-94", "ovs_interfaceid": "b0db3e99-94f0-42a8-a410-081e33fd0dee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.006 2 WARNING nova.virt.libvirt.driver [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.011 2 DEBUG nova.virt.libvirt.host [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.012 2 DEBUG nova.virt.libvirt.host [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.020 2 DEBUG nova.virt.libvirt.host [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.021 2 DEBUG nova.virt.libvirt.host [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.022 2 DEBUG nova.virt.libvirt.driver [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.022 2 DEBUG nova.virt.hardware [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.022 2 DEBUG nova.virt.hardware [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.023 2 DEBUG nova.virt.hardware [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.023 2 DEBUG nova.virt.hardware [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.023 2 DEBUG nova.virt.hardware [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.023 2 DEBUG nova.virt.hardware [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.023 2 DEBUG nova.virt.hardware [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.024 2 DEBUG nova.virt.hardware [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.024 2 DEBUG nova.virt.hardware [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.024 2 DEBUG nova.virt.hardware [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.024 2 DEBUG nova.virt.hardware [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.027 2 DEBUG nova.virt.libvirt.vif [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:34:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-475541027',display_name='tempest-ServerStableDeviceRescueTest-server-475541027',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-475541027',id=98,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8978d2df88a5434c8794b659033cca5e',ramdisk_id='',reservation_id='r-1jbnwm8v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-1939201844',owner_user_name='tempest-ServerStableDeviceRescueTest-1939201844-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:34:40Z,user_data=None,user_id='8b1ebef014c145cbbe1e367bfd2c2ba3',uuid=4255e358-6db2-4947-a3d6-4e045f9235fb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b0db3e99-94f0-42a8-a410-081e33fd0dee", "address": "fa:16:3e:03:b5:6d", "network": {"id": "f5a6396a-b7b7-4ff1-a2af-27477fea2815", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8978d2df88a5434c8794b659033cca5e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0db3e99-94", "ovs_interfaceid": "b0db3e99-94f0-42a8-a410-081e33fd0dee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.027 2 DEBUG nova.network.os_vif_util [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Converting VIF {"id": "b0db3e99-94f0-42a8-a410-081e33fd0dee", "address": "fa:16:3e:03:b5:6d", "network": {"id": "f5a6396a-b7b7-4ff1-a2af-27477fea2815", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8978d2df88a5434c8794b659033cca5e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0db3e99-94", "ovs_interfaceid": "b0db3e99-94f0-42a8-a410-081e33fd0dee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.028 2 DEBUG nova.network.os_vif_util [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:03:b5:6d,bridge_name='br-int',has_traffic_filtering=True,id=b0db3e99-94f0-42a8-a410-081e33fd0dee,network=Network(f5a6396a-b7b7-4ff1-a2af-27477fea2815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0db3e99-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.028 2 DEBUG nova.objects.instance [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lazy-loading 'pci_devices' on Instance uuid 4255e358-6db2-4947-a3d6-4e045f9235fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.044 2 DEBUG nova.virt.libvirt.driver [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:34:45 compute-0 nova_compute[192810]:   <uuid>4255e358-6db2-4947-a3d6-4e045f9235fb</uuid>
Sep 30 21:34:45 compute-0 nova_compute[192810]:   <name>instance-00000062</name>
Sep 30 21:34:45 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:34:45 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:34:45 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:34:45 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:       <nova:name>tempest-ServerStableDeviceRescueTest-server-475541027</nova:name>
Sep 30 21:34:45 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:34:45</nova:creationTime>
Sep 30 21:34:45 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:34:45 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:34:45 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:34:45 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:34:45 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:34:45 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:34:45 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:34:45 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:34:45 compute-0 nova_compute[192810]:         <nova:user uuid="8b1ebef014c145cbbe1e367bfd2c2ba3">tempest-ServerStableDeviceRescueTest-1939201844-project-member</nova:user>
Sep 30 21:34:45 compute-0 nova_compute[192810]:         <nova:project uuid="8978d2df88a5434c8794b659033cca5e">tempest-ServerStableDeviceRescueTest-1939201844</nova:project>
Sep 30 21:34:45 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:34:45 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:34:45 compute-0 nova_compute[192810]:         <nova:port uuid="b0db3e99-94f0-42a8-a410-081e33fd0dee">
Sep 30 21:34:45 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:34:45 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:34:45 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:34:45 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:34:45 compute-0 nova_compute[192810]:     <system>
Sep 30 21:34:45 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:34:45 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:34:45 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:34:45 compute-0 nova_compute[192810]:       <entry name="serial">4255e358-6db2-4947-a3d6-4e045f9235fb</entry>
Sep 30 21:34:45 compute-0 nova_compute[192810]:       <entry name="uuid">4255e358-6db2-4947-a3d6-4e045f9235fb</entry>
Sep 30 21:34:45 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     </system>
Sep 30 21:34:45 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:34:45 compute-0 nova_compute[192810]:   <os>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:   </os>
Sep 30 21:34:45 compute-0 nova_compute[192810]:   <features>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:   </features>
Sep 30 21:34:45 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:34:45 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:34:45 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:34:45 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:34:45 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:34:45 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/4255e358-6db2-4947-a3d6-4e045f9235fb/disk"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:34:45 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/4255e358-6db2-4947-a3d6-4e045f9235fb/disk.config"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:34:45 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:03:b5:6d"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:       <target dev="tapb0db3e99-94"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:34:45 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/4255e358-6db2-4947-a3d6-4e045f9235fb/console.log" append="off"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     <video>
Sep 30 21:34:45 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     </video>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:34:45 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:34:45 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:34:45 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:34:45 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:34:45 compute-0 nova_compute[192810]: </domain>
Sep 30 21:34:45 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.045 2 DEBUG nova.compute.manager [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Preparing to wait for external event network-vif-plugged-b0db3e99-94f0-42a8-a410-081e33fd0dee prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.045 2 DEBUG oslo_concurrency.lockutils [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquiring lock "4255e358-6db2-4947-a3d6-4e045f9235fb-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.045 2 DEBUG oslo_concurrency.lockutils [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "4255e358-6db2-4947-a3d6-4e045f9235fb-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.045 2 DEBUG oslo_concurrency.lockutils [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "4255e358-6db2-4947-a3d6-4e045f9235fb-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.046 2 DEBUG nova.virt.libvirt.vif [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:34:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-475541027',display_name='tempest-ServerStableDeviceRescueTest-server-475541027',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-475541027',id=98,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8978d2df88a5434c8794b659033cca5e',ramdisk_id='',reservation_id='r-1jbnwm8v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-1939201844',owner_user_name='tempest-ServerStableDeviceRescueTest-1939201844-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:34:40Z,user_data=None,user_id='8b1ebef014c145cbbe1e367bfd2c2ba3',uuid=4255e358-6db2-4947-a3d6-4e045f9235fb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b0db3e99-94f0-42a8-a410-081e33fd0dee", "address": "fa:16:3e:03:b5:6d", "network": {"id": "f5a6396a-b7b7-4ff1-a2af-27477fea2815", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8978d2df88a5434c8794b659033cca5e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0db3e99-94", "ovs_interfaceid": "b0db3e99-94f0-42a8-a410-081e33fd0dee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.046 2 DEBUG nova.network.os_vif_util [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Converting VIF {"id": "b0db3e99-94f0-42a8-a410-081e33fd0dee", "address": "fa:16:3e:03:b5:6d", "network": {"id": "f5a6396a-b7b7-4ff1-a2af-27477fea2815", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8978d2df88a5434c8794b659033cca5e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0db3e99-94", "ovs_interfaceid": "b0db3e99-94f0-42a8-a410-081e33fd0dee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.047 2 DEBUG nova.network.os_vif_util [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:03:b5:6d,bridge_name='br-int',has_traffic_filtering=True,id=b0db3e99-94f0-42a8-a410-081e33fd0dee,network=Network(f5a6396a-b7b7-4ff1-a2af-27477fea2815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0db3e99-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.047 2 DEBUG os_vif [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:b5:6d,bridge_name='br-int',has_traffic_filtering=True,id=b0db3e99-94f0-42a8-a410-081e33fd0dee,network=Network(f5a6396a-b7b7-4ff1-a2af-27477fea2815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0db3e99-94') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.047 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.048 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.048 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.051 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.051 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb0db3e99-94, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.052 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb0db3e99-94, col_values=(('external_ids', {'iface-id': 'b0db3e99-94f0-42a8-a410-081e33fd0dee', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:03:b5:6d', 'vm-uuid': '4255e358-6db2-4947-a3d6-4e045f9235fb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.053 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:45 compute-0 NetworkManager[51733]: <info>  [1759268085.0546] manager: (tapb0db3e99-94): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/168)
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.056 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.059 2 INFO os_vif [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:b5:6d,bridge_name='br-int',has_traffic_filtering=True,id=b0db3e99-94f0-42a8-a410-081e33fd0dee,network=Network(f5a6396a-b7b7-4ff1-a2af-27477fea2815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0db3e99-94')
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.131 2 DEBUG nova.virt.libvirt.driver [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.131 2 DEBUG nova.virt.libvirt.driver [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.131 2 DEBUG nova.virt.libvirt.driver [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] No VIF found with MAC fa:16:3e:03:b5:6d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.132 2 INFO nova.virt.libvirt.driver [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Using config drive
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.151 2 DEBUG nova.compute.manager [None req-79caa997-127a-4c5d-9d41-3f53d1036697 cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.226 2 INFO nova.compute.manager [None req-79caa997-127a-4c5d-9d41-3f53d1036697 cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] instance snapshotting
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.227 2 DEBUG nova.objects.instance [None req-79caa997-127a-4c5d-9d41-3f53d1036697 cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Lazy-loading 'flavor' on Instance uuid 09b1225c-028f-48a8-b366-03be5cdd4f19 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.412 2 DEBUG oslo_concurrency.lockutils [None req-dd55377f-6fdb-4809-a445-c7ba9a4ca9e8 cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Acquiring lock "09b1225c-028f-48a8-b366-03be5cdd4f19" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.413 2 DEBUG oslo_concurrency.lockutils [None req-dd55377f-6fdb-4809-a445-c7ba9a4ca9e8 cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Lock "09b1225c-028f-48a8-b366-03be5cdd4f19" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.413 2 DEBUG oslo_concurrency.lockutils [None req-dd55377f-6fdb-4809-a445-c7ba9a4ca9e8 cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Acquiring lock "09b1225c-028f-48a8-b366-03be5cdd4f19-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.413 2 DEBUG oslo_concurrency.lockutils [None req-dd55377f-6fdb-4809-a445-c7ba9a4ca9e8 cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Lock "09b1225c-028f-48a8-b366-03be5cdd4f19-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.413 2 DEBUG oslo_concurrency.lockutils [None req-dd55377f-6fdb-4809-a445-c7ba9a4ca9e8 cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Lock "09b1225c-028f-48a8-b366-03be5cdd4f19-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.422 2 INFO nova.compute.manager [None req-dd55377f-6fdb-4809-a445-c7ba9a4ca9e8 cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] Terminating instance
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.431 2 DEBUG oslo_concurrency.lockutils [None req-dd55377f-6fdb-4809-a445-c7ba9a4ca9e8 cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Acquiring lock "refresh_cache-09b1225c-028f-48a8-b366-03be5cdd4f19" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.432 2 DEBUG oslo_concurrency.lockutils [None req-dd55377f-6fdb-4809-a445-c7ba9a4ca9e8 cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Acquired lock "refresh_cache-09b1225c-028f-48a8-b366-03be5cdd4f19" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.432 2 DEBUG nova.network.neutron [None req-dd55377f-6fdb-4809-a445-c7ba9a4ca9e8 cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.567 2 INFO nova.virt.libvirt.driver [None req-79caa997-127a-4c5d-9d41-3f53d1036697 cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] Beginning live snapshot process
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.621 2 DEBUG nova.compute.manager [None req-79caa997-127a-4c5d-9d41-3f53d1036697 cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] Instance disappeared during snapshot _snapshot_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:4390
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.651 2 DEBUG nova.network.neutron [None req-dd55377f-6fdb-4809-a445-c7ba9a4ca9e8 cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.656 2 INFO nova.virt.libvirt.driver [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Creating config drive at /var/lib/nova/instances/4255e358-6db2-4947-a3d6-4e045f9235fb/disk.config
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.662 2 DEBUG oslo_concurrency.processutils [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4255e358-6db2-4947-a3d6-4e045f9235fb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4y8xbhv6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.786 2 DEBUG oslo_concurrency.processutils [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4255e358-6db2-4947-a3d6-4e045f9235fb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4y8xbhv6" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:34:45 compute-0 kernel: tapb0db3e99-94: entered promiscuous mode
Sep 30 21:34:45 compute-0 ovn_controller[94912]: 2025-09-30T21:34:45Z|00357|binding|INFO|Claiming lport b0db3e99-94f0-42a8-a410-081e33fd0dee for this chassis.
Sep 30 21:34:45 compute-0 ovn_controller[94912]: 2025-09-30T21:34:45Z|00358|binding|INFO|b0db3e99-94f0-42a8-a410-081e33fd0dee: Claiming fa:16:3e:03:b5:6d 10.100.0.6
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.840 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:45 compute-0 NetworkManager[51733]: <info>  [1759268085.8434] manager: (tapb0db3e99-94): new Tun device (/org/freedesktop/NetworkManager/Devices/169)
Sep 30 21:34:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:45.855 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:03:b5:6d 10.100.0.6'], port_security=['fa:16:3e:03:b5:6d 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '4255e358-6db2-4947-a3d6-4e045f9235fb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8978d2df88a5434c8794b659033cca5e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '93b1b45c-82db-437e-88d0-4d5f76771b04', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e6b1a10b-a890-44de-9d6b-4b24b7ba0344, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=b0db3e99-94f0-42a8-a410-081e33fd0dee) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:34:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:45.855 103867 INFO neutron.agent.ovn.metadata.agent [-] Port b0db3e99-94f0-42a8-a410-081e33fd0dee in datapath f5a6396a-b7b7-4ff1-a2af-27477fea2815 bound to our chassis
Sep 30 21:34:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:45.857 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f5a6396a-b7b7-4ff1-a2af-27477fea2815
Sep 30 21:34:45 compute-0 ovn_controller[94912]: 2025-09-30T21:34:45Z|00359|binding|INFO|Setting lport b0db3e99-94f0-42a8-a410-081e33fd0dee ovn-installed in OVS
Sep 30 21:34:45 compute-0 ovn_controller[94912]: 2025-09-30T21:34:45Z|00360|binding|INFO|Setting lport b0db3e99-94f0-42a8-a410-081e33fd0dee up in Southbound
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:45 compute-0 nova_compute[192810]: 2025-09-30 21:34:45.863 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:45.867 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[ed7f115f-6c70-4f03-8bc1-fd8223786dcd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:45.868 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf5a6396a-b1 in ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:34:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:45.870 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf5a6396a-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:34:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:45.870 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[c461ac6f-62c0-42aa-8e2f-3de0baa4bd2f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:45.871 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[12f560cb-cbeb-443e-a0c3-6087fd7c37fa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:45.882 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[dede98e3-2e39-4427-8531-8b20526c242f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:45 compute-0 systemd-machined[152794]: New machine qemu-45-instance-00000062.
Sep 30 21:34:45 compute-0 systemd[1]: Started Virtual Machine qemu-45-instance-00000062.
Sep 30 21:34:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:45.907 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[fb9666d2-5570-4809-a9cd-74d830029e3e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:45 compute-0 systemd-udevd[235019]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:34:45 compute-0 NetworkManager[51733]: <info>  [1759268085.9217] device (tapb0db3e99-94): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:34:45 compute-0 NetworkManager[51733]: <info>  [1759268085.9228] device (tapb0db3e99-94): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:34:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:45.938 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[62764670-90b4-46e4-a55b-f173d223c43d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:45 compute-0 NetworkManager[51733]: <info>  [1759268085.9467] manager: (tapf5a6396a-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/170)
Sep 30 21:34:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:45.948 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[1ab86d9e-6a30-4102-9714-52182b14d8d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:45.976 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[c6351d78-f21e-4e40-b3bf-9d13d08cb7f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:45.979 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[a41caa06-ff29-487a-9b8d-b50813561734]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:46 compute-0 NetworkManager[51733]: <info>  [1759268086.0029] device (tapf5a6396a-b0): carrier: link connected
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.006 2 DEBUG nova.network.neutron [None req-dd55377f-6fdb-4809-a445-c7ba9a4ca9e8 cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:34:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:46.008 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[a2ce4d45-fc4e-428e-a05c-ff4dc6ee83ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:46.023 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[bbb04a76-d841-466c-a6a0-3b47a188e3ad]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf5a6396a-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:66:d5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 107], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 478162, 'reachable_time': 16849, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 192, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 192, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235049, 'error': None, 'target': 'ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.025 2 DEBUG oslo_concurrency.lockutils [None req-dd55377f-6fdb-4809-a445-c7ba9a4ca9e8 cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Releasing lock "refresh_cache-09b1225c-028f-48a8-b366-03be5cdd4f19" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.025 2 DEBUG nova.compute.manager [None req-dd55377f-6fdb-4809-a445-c7ba9a4ca9e8 cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:34:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:46.037 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[5636e3c9-89a8-47cc-ae1e-a892cf156d8d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe91:66d5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 478162, 'tstamp': 478162}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235050, 'error': None, 'target': 'ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:46.055 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[46dd2e19-ff35-4394-8551-a975ee972118]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf5a6396a-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:66:d5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 3, 'tx_packets': 1, 'rx_bytes': 306, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 3, 'tx_packets': 1, 'rx_bytes': 306, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 107], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 478162, 'reachable_time': 16849, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 3, 'inoctets': 264, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 3, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 264, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 3, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 235051, 'error': None, 'target': 'ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:46 compute-0 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d00000063.scope: Deactivated successfully.
Sep 30 21:34:46 compute-0 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d00000063.scope: Consumed 3.930s CPU time.
Sep 30 21:34:46 compute-0 systemd-machined[152794]: Machine qemu-44-instance-00000063 terminated.
Sep 30 21:34:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:46.086 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[0a1713b8-df46-4c67-8420-e637115136ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:46.149 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[16db4995-f1be-4a0c-b708-f30869a4c541]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:46.150 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf5a6396a-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:34:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:46.150 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:34:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:46.151 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf5a6396a-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:34:46 compute-0 NetworkManager[51733]: <info>  [1759268086.1541] manager: (tapf5a6396a-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/171)
Sep 30 21:34:46 compute-0 kernel: tapf5a6396a-b0: entered promiscuous mode
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.156 2 DEBUG nova.compute.manager [req-f24b837c-54fd-41d6-9998-03a32a3ad0b1 req-74dc3311-cd54-4c45-a23a-a33b02f601e3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Received event network-vif-plugged-b0db3e99-94f0-42a8-a410-081e33fd0dee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:34:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:46.157 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf5a6396a-b0, col_values=(('external_ids', {'iface-id': '10034ee7-d74d-45f3-b835-201b62e1bcd6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.157 2 DEBUG oslo_concurrency.lockutils [req-f24b837c-54fd-41d6-9998-03a32a3ad0b1 req-74dc3311-cd54-4c45-a23a-a33b02f601e3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "4255e358-6db2-4947-a3d6-4e045f9235fb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:46 compute-0 ovn_controller[94912]: 2025-09-30T21:34:46Z|00361|binding|INFO|Releasing lport 10034ee7-d74d-45f3-b835-201b62e1bcd6 from this chassis (sb_readonly=0)
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.158 2 DEBUG oslo_concurrency.lockutils [req-f24b837c-54fd-41d6-9998-03a32a3ad0b1 req-74dc3311-cd54-4c45-a23a-a33b02f601e3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "4255e358-6db2-4947-a3d6-4e045f9235fb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.158 2 DEBUG oslo_concurrency.lockutils [req-f24b837c-54fd-41d6-9998-03a32a3ad0b1 req-74dc3311-cd54-4c45-a23a-a33b02f601e3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "4255e358-6db2-4947-a3d6-4e045f9235fb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.159 2 DEBUG nova.compute.manager [req-f24b837c-54fd-41d6-9998-03a32a3ad0b1 req-74dc3311-cd54-4c45-a23a-a33b02f601e3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Processing event network-vif-plugged-b0db3e99-94f0-42a8-a410-081e33fd0dee _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:34:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:46.159 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f5a6396a-b7b7-4ff1-a2af-27477fea2815.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f5a6396a-b7b7-4ff1-a2af-27477fea2815.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:46.163 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[7c92423b-8559-4660-b317-a66f4c73a543]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:34:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:46.164 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:34:46 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:34:46 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:34:46 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-f5a6396a-b7b7-4ff1-a2af-27477fea2815
Sep 30 21:34:46 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:34:46 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:34:46 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:34:46 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/f5a6396a-b7b7-4ff1-a2af-27477fea2815.pid.haproxy
Sep 30 21:34:46 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:34:46 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:34:46 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:34:46 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:34:46 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:34:46 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:34:46 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:34:46 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:34:46 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:34:46 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:34:46 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:34:46 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:34:46 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:34:46 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:34:46 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:34:46 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:34:46 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:34:46 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:34:46 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:34:46 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:34:46 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID f5a6396a-b7b7-4ff1-a2af-27477fea2815
Sep 30 21:34:46 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:34:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:34:46.165 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'env', 'PROCESS_TAG=haproxy-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f5a6396a-b7b7-4ff1-a2af-27477fea2815.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.170 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.268 2 INFO nova.virt.libvirt.driver [-] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] Instance destroyed successfully.
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.269 2 DEBUG nova.objects.instance [None req-dd55377f-6fdb-4809-a445-c7ba9a4ca9e8 cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Lazy-loading 'resources' on Instance uuid 09b1225c-028f-48a8-b366-03be5cdd4f19 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.283 2 INFO nova.virt.libvirt.driver [None req-dd55377f-6fdb-4809-a445-c7ba9a4ca9e8 cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] Deleting instance files /var/lib/nova/instances/09b1225c-028f-48a8-b366-03be5cdd4f19_del
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.285 2 INFO nova.virt.libvirt.driver [None req-dd55377f-6fdb-4809-a445-c7ba9a4ca9e8 cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] Deletion of /var/lib/nova/instances/09b1225c-028f-48a8-b366-03be5cdd4f19_del complete
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.373 2 INFO nova.compute.manager [None req-dd55377f-6fdb-4809-a445-c7ba9a4ca9e8 cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] Took 0.35 seconds to destroy the instance on the hypervisor.
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.374 2 DEBUG oslo.service.loopingcall [None req-dd55377f-6fdb-4809-a445-c7ba9a4ca9e8 cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.375 2 DEBUG nova.compute.manager [-] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.375 2 DEBUG nova.network.neutron [-] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:34:46 compute-0 podman[235098]: 2025-09-30 21:34:46.528060401 +0000 UTC m=+0.050568059 container create 3d2965743fefd64d2e07a09950fe643862eb570be843fee5806dd30a9e57ca16 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.570 2 DEBUG nova.network.neutron [-] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:34:46 compute-0 systemd[1]: Started libpod-conmon-3d2965743fefd64d2e07a09950fe643862eb570be843fee5806dd30a9e57ca16.scope.
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.578 2 DEBUG nova.network.neutron [req-7f997027-7ff3-4fea-ae79-79874d0fb95e req-7fead06f-3f44-4bf0-8d41-7f6c14add399 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Updated VIF entry in instance network info cache for port b0db3e99-94f0-42a8-a410-081e33fd0dee. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.578 2 DEBUG nova.network.neutron [req-7f997027-7ff3-4fea-ae79-79874d0fb95e req-7fead06f-3f44-4bf0-8d41-7f6c14add399 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Updating instance_info_cache with network_info: [{"id": "b0db3e99-94f0-42a8-a410-081e33fd0dee", "address": "fa:16:3e:03:b5:6d", "network": {"id": "f5a6396a-b7b7-4ff1-a2af-27477fea2815", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8978d2df88a5434c8794b659033cca5e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0db3e99-94", "ovs_interfaceid": "b0db3e99-94f0-42a8-a410-081e33fd0dee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.583 2 DEBUG nova.network.neutron [-] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.592 2 DEBUG oslo_concurrency.lockutils [req-7f997027-7ff3-4fea-ae79-79874d0fb95e req-7fead06f-3f44-4bf0-8d41-7f6c14add399 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-4255e358-6db2-4947-a3d6-4e045f9235fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:34:46 compute-0 podman[235098]: 2025-09-30 21:34:46.498446336 +0000 UTC m=+0.020954024 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:34:46 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.605 2 INFO nova.compute.manager [-] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] Took 0.23 seconds to deallocate network for instance.
Sep 30 21:34:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6f1af6fdcbb33ed59232f7eeccb1a282eefef0893600eb8ec7beb7677de50dc8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.609 2 DEBUG nova.compute.manager [None req-79caa997-127a-4c5d-9d41-3f53d1036697 cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] Found 0 images (rotation: 2) _rotate_backups /usr/lib/python3.9/site-packages/nova/compute/manager.py:4450
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.620 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268086.6193326, 4255e358-6db2-4947-a3d6-4e045f9235fb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.620 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] VM Started (Lifecycle Event)
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.622 2 DEBUG nova.compute.manager [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:34:46 compute-0 podman[235098]: 2025-09-30 21:34:46.624839279 +0000 UTC m=+0.147346967 container init 3d2965743fefd64d2e07a09950fe643862eb570be843fee5806dd30a9e57ca16 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923)
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.627 2 DEBUG nova.virt.libvirt.driver [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.630 2 INFO nova.virt.libvirt.driver [-] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Instance spawned successfully.
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.630 2 DEBUG nova.virt.libvirt.driver [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:34:46 compute-0 podman[235098]: 2025-09-30 21:34:46.630907107 +0000 UTC m=+0.153414745 container start 3d2965743fefd64d2e07a09950fe643862eb570be843fee5806dd30a9e57ca16 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.643 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.645 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:34:46 compute-0 neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815[235113]: [NOTICE]   (235117) : New worker (235119) forked
Sep 30 21:34:46 compute-0 neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815[235113]: [NOTICE]   (235117) : Loading success.
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.665 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.665 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268086.6198277, 4255e358-6db2-4947-a3d6-4e045f9235fb => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.666 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] VM Paused (Lifecycle Event)
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.669 2 DEBUG nova.virt.libvirt.driver [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.669 2 DEBUG nova.virt.libvirt.driver [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.670 2 DEBUG nova.virt.libvirt.driver [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.670 2 DEBUG nova.virt.libvirt.driver [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.671 2 DEBUG nova.virt.libvirt.driver [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.671 2 DEBUG nova.virt.libvirt.driver [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.701 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.704 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268086.6266935, 4255e358-6db2-4947-a3d6-4e045f9235fb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.704 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] VM Resumed (Lifecycle Event)
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.735 2 DEBUG oslo_concurrency.lockutils [None req-dd55377f-6fdb-4809-a445-c7ba9a4ca9e8 cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.736 2 DEBUG oslo_concurrency.lockutils [None req-dd55377f-6fdb-4809-a445-c7ba9a4ca9e8 cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.739 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.745 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.768 2 INFO nova.compute.manager [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Took 5.92 seconds to spawn the instance on the hypervisor.
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.768 2 DEBUG nova.compute.manager [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.769 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.823 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.854 2 INFO nova.compute.manager [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Took 6.52 seconds to build instance.
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.859 2 DEBUG nova.compute.provider_tree [None req-dd55377f-6fdb-4809-a445-c7ba9a4ca9e8 cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.870 2 DEBUG oslo_concurrency.lockutils [None req-5c06b988-a3a8-4623-a57e-0c45d2e7c5ad 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "4255e358-6db2-4947-a3d6-4e045f9235fb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.623s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.871 2 DEBUG nova.scheduler.client.report [None req-dd55377f-6fdb-4809-a445-c7ba9a4ca9e8 cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.888 2 DEBUG oslo_concurrency.lockutils [None req-dd55377f-6fdb-4809-a445-c7ba9a4ca9e8 cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.152s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.919 2 INFO nova.scheduler.client.report [None req-dd55377f-6fdb-4809-a445-c7ba9a4ca9e8 cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Deleted allocations for instance 09b1225c-028f-48a8-b366-03be5cdd4f19
Sep 30 21:34:46 compute-0 nova_compute[192810]: 2025-09-30 21:34:46.998 2 DEBUG oslo_concurrency.lockutils [None req-dd55377f-6fdb-4809-a445-c7ba9a4ca9e8 cced9253a90a431bbc6bc5439d274109 123dde9c899d4e498c5a20bf76fe32a4 - - default default] Lock "09b1225c-028f-48a8-b366-03be5cdd4f19" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.585s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:47 compute-0 nova_compute[192810]: 2025-09-30 21:34:47.580 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:48 compute-0 nova_compute[192810]: 2025-09-30 21:34:48.238 2 DEBUG nova.compute.manager [req-62844294-a289-456f-95d9-705f2f4af2da req-68838dde-7714-4891-9e4a-25b1de66326d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Received event network-vif-plugged-b0db3e99-94f0-42a8-a410-081e33fd0dee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:34:48 compute-0 nova_compute[192810]: 2025-09-30 21:34:48.238 2 DEBUG oslo_concurrency.lockutils [req-62844294-a289-456f-95d9-705f2f4af2da req-68838dde-7714-4891-9e4a-25b1de66326d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "4255e358-6db2-4947-a3d6-4e045f9235fb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:48 compute-0 nova_compute[192810]: 2025-09-30 21:34:48.239 2 DEBUG oslo_concurrency.lockutils [req-62844294-a289-456f-95d9-705f2f4af2da req-68838dde-7714-4891-9e4a-25b1de66326d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "4255e358-6db2-4947-a3d6-4e045f9235fb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:48 compute-0 nova_compute[192810]: 2025-09-30 21:34:48.239 2 DEBUG oslo_concurrency.lockutils [req-62844294-a289-456f-95d9-705f2f4af2da req-68838dde-7714-4891-9e4a-25b1de66326d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "4255e358-6db2-4947-a3d6-4e045f9235fb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:48 compute-0 nova_compute[192810]: 2025-09-30 21:34:48.239 2 DEBUG nova.compute.manager [req-62844294-a289-456f-95d9-705f2f4af2da req-68838dde-7714-4891-9e4a-25b1de66326d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] No waiting events found dispatching network-vif-plugged-b0db3e99-94f0-42a8-a410-081e33fd0dee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:34:48 compute-0 nova_compute[192810]: 2025-09-30 21:34:48.239 2 WARNING nova.compute.manager [req-62844294-a289-456f-95d9-705f2f4af2da req-68838dde-7714-4891-9e4a-25b1de66326d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Received unexpected event network-vif-plugged-b0db3e99-94f0-42a8-a410-081e33fd0dee for instance with vm_state active and task_state None.
Sep 30 21:34:48 compute-0 nova_compute[192810]: 2025-09-30 21:34:48.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:34:49 compute-0 nova_compute[192810]: 2025-09-30 21:34:49.463 2 DEBUG nova.compute.manager [None req-981e3d4e-bb68-478f-a1c0-3f2203f30409 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:34:49 compute-0 nova_compute[192810]: 2025-09-30 21:34:49.524 2 INFO nova.compute.manager [None req-981e3d4e-bb68-478f-a1c0-3f2203f30409 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] instance snapshotting
Sep 30 21:34:49 compute-0 nova_compute[192810]: 2025-09-30 21:34:49.607 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:49 compute-0 nova_compute[192810]: 2025-09-30 21:34:49.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:34:49 compute-0 nova_compute[192810]: 2025-09-30 21:34:49.896 2 INFO nova.virt.libvirt.driver [None req-981e3d4e-bb68-478f-a1c0-3f2203f30409 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Beginning live snapshot process
Sep 30 21:34:50 compute-0 nova_compute[192810]: 2025-09-30 21:34:50.053 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:50 compute-0 virtqemud[192233]: invalid argument: disk vda does not have an active block job
Sep 30 21:34:50 compute-0 nova_compute[192810]: 2025-09-30 21:34:50.090 2 DEBUG oslo_concurrency.processutils [None req-981e3d4e-bb68-478f-a1c0-3f2203f30409 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4255e358-6db2-4947-a3d6-4e045f9235fb/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:34:50 compute-0 nova_compute[192810]: 2025-09-30 21:34:50.174 2 DEBUG oslo_concurrency.processutils [None req-981e3d4e-bb68-478f-a1c0-3f2203f30409 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4255e358-6db2-4947-a3d6-4e045f9235fb/disk --force-share --output=json -f qcow2" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:34:50 compute-0 nova_compute[192810]: 2025-09-30 21:34:50.175 2 DEBUG oslo_concurrency.processutils [None req-981e3d4e-bb68-478f-a1c0-3f2203f30409 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4255e358-6db2-4947-a3d6-4e045f9235fb/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:34:50 compute-0 nova_compute[192810]: 2025-09-30 21:34:50.238 2 DEBUG oslo_concurrency.processutils [None req-981e3d4e-bb68-478f-a1c0-3f2203f30409 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4255e358-6db2-4947-a3d6-4e045f9235fb/disk --force-share --output=json -f qcow2" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:34:50 compute-0 nova_compute[192810]: 2025-09-30 21:34:50.256 2 DEBUG oslo_concurrency.processutils [None req-981e3d4e-bb68-478f-a1c0-3f2203f30409 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:34:50 compute-0 nova_compute[192810]: 2025-09-30 21:34:50.310 2 DEBUG oslo_concurrency.processutils [None req-981e3d4e-bb68-478f-a1c0-3f2203f30409 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:34:50 compute-0 nova_compute[192810]: 2025-09-30 21:34:50.311 2 DEBUG oslo_concurrency.processutils [None req-981e3d4e-bb68-478f-a1c0-3f2203f30409 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpq5ykmasf/815c46d6a81a4cf19ccc695ef16fe75a.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:34:50 compute-0 nova_compute[192810]: 2025-09-30 21:34:50.350 2 DEBUG oslo_concurrency.processutils [None req-981e3d4e-bb68-478f-a1c0-3f2203f30409 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpq5ykmasf/815c46d6a81a4cf19ccc695ef16fe75a.delta 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:34:50 compute-0 nova_compute[192810]: 2025-09-30 21:34:50.351 2 INFO nova.virt.libvirt.driver [None req-981e3d4e-bb68-478f-a1c0-3f2203f30409 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Quiescing instance not available: QEMU guest agent is not enabled.
Sep 30 21:34:50 compute-0 nova_compute[192810]: 2025-09-30 21:34:50.401 2 DEBUG nova.virt.libvirt.guest [None req-981e3d4e-bb68-478f-a1c0-3f2203f30409 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] COPY block job progress, current cursor: 0 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Sep 30 21:34:50 compute-0 nova_compute[192810]: 2025-09-30 21:34:50.905 2 DEBUG nova.virt.libvirt.guest [None req-981e3d4e-bb68-478f-a1c0-3f2203f30409 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Sep 30 21:34:50 compute-0 nova_compute[192810]: 2025-09-30 21:34:50.908 2 INFO nova.virt.libvirt.driver [None req-981e3d4e-bb68-478f-a1c0-3f2203f30409 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Skipping quiescing instance: QEMU guest agent is not enabled.
Sep 30 21:34:50 compute-0 nova_compute[192810]: 2025-09-30 21:34:50.945 2 DEBUG nova.privsep.utils [None req-981e3d4e-bb68-478f-a1c0-3f2203f30409 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Sep 30 21:34:50 compute-0 nova_compute[192810]: 2025-09-30 21:34:50.946 2 DEBUG oslo_concurrency.processutils [None req-981e3d4e-bb68-478f-a1c0-3f2203f30409 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpq5ykmasf/815c46d6a81a4cf19ccc695ef16fe75a.delta /var/lib/nova/instances/snapshots/tmpq5ykmasf/815c46d6a81a4cf19ccc695ef16fe75a execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:34:51 compute-0 nova_compute[192810]: 2025-09-30 21:34:51.104 2 DEBUG oslo_concurrency.processutils [None req-981e3d4e-bb68-478f-a1c0-3f2203f30409 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpq5ykmasf/815c46d6a81a4cf19ccc695ef16fe75a.delta /var/lib/nova/instances/snapshots/tmpq5ykmasf/815c46d6a81a4cf19ccc695ef16fe75a" returned: 0 in 0.158s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:34:51 compute-0 nova_compute[192810]: 2025-09-30 21:34:51.105 2 INFO nova.virt.libvirt.driver [None req-981e3d4e-bb68-478f-a1c0-3f2203f30409 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Snapshot extracted, beginning image upload
Sep 30 21:34:51 compute-0 nova_compute[192810]: 2025-09-30 21:34:51.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:34:51 compute-0 nova_compute[192810]: 2025-09-30 21:34:51.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:34:52 compute-0 nova_compute[192810]: 2025-09-30 21:34:52.581 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:53 compute-0 podman[235154]: 2025-09-30 21:34:53.35977363 +0000 UTC m=+0.092828012 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923)
Sep 30 21:34:53 compute-0 podman[235155]: 2025-09-30 21:34:53.3744743 +0000 UTC m=+0.095194430 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Sep 30 21:34:53 compute-0 nova_compute[192810]: 2025-09-30 21:34:53.455 2 INFO nova.virt.libvirt.driver [None req-981e3d4e-bb68-478f-a1c0-3f2203f30409 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Snapshot image upload complete
Sep 30 21:34:53 compute-0 nova_compute[192810]: 2025-09-30 21:34:53.456 2 INFO nova.compute.manager [None req-981e3d4e-bb68-478f-a1c0-3f2203f30409 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Took 3.92 seconds to snapshot the instance on the hypervisor.
Sep 30 21:34:53 compute-0 nova_compute[192810]: 2025-09-30 21:34:53.783 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:34:53 compute-0 nova_compute[192810]: 2025-09-30 21:34:53.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:34:53 compute-0 nova_compute[192810]: 2025-09-30 21:34:53.812 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:53 compute-0 nova_compute[192810]: 2025-09-30 21:34:53.813 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:53 compute-0 nova_compute[192810]: 2025-09-30 21:34:53.813 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:53 compute-0 nova_compute[192810]: 2025-09-30 21:34:53.813 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:34:53 compute-0 nova_compute[192810]: 2025-09-30 21:34:53.882 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f92e449c-90c2-4cba-a8c1-ba6b1c82d770/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:34:53 compute-0 nova_compute[192810]: 2025-09-30 21:34:53.940 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f92e449c-90c2-4cba-a8c1-ba6b1c82d770/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:34:53 compute-0 nova_compute[192810]: 2025-09-30 21:34:53.941 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f92e449c-90c2-4cba-a8c1-ba6b1c82d770/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:34:53 compute-0 nova_compute[192810]: 2025-09-30 21:34:53.993 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f92e449c-90c2-4cba-a8c1-ba6b1c82d770/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:34:53 compute-0 nova_compute[192810]: 2025-09-30 21:34:53.998 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4255e358-6db2-4947-a3d6-4e045f9235fb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:34:54 compute-0 nova_compute[192810]: 2025-09-30 21:34:54.051 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4255e358-6db2-4947-a3d6-4e045f9235fb/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:34:54 compute-0 nova_compute[192810]: 2025-09-30 21:34:54.052 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4255e358-6db2-4947-a3d6-4e045f9235fb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:34:54 compute-0 nova_compute[192810]: 2025-09-30 21:34:54.108 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4255e358-6db2-4947-a3d6-4e045f9235fb/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:34:54 compute-0 nova_compute[192810]: 2025-09-30 21:34:54.270 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:34:54 compute-0 nova_compute[192810]: 2025-09-30 21:34:54.271 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5395MB free_disk=73.28899383544922GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:34:54 compute-0 nova_compute[192810]: 2025-09-30 21:34:54.272 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:34:54 compute-0 nova_compute[192810]: 2025-09-30 21:34:54.272 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:34:54 compute-0 podman[235212]: 2025-09-30 21:34:54.31650386 +0000 UTC m=+0.054713740 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:34:54 compute-0 nova_compute[192810]: 2025-09-30 21:34:54.365 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Instance f92e449c-90c2-4cba-a8c1-ba6b1c82d770 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:34:54 compute-0 nova_compute[192810]: 2025-09-30 21:34:54.365 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Instance 4255e358-6db2-4947-a3d6-4e045f9235fb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:34:54 compute-0 nova_compute[192810]: 2025-09-30 21:34:54.365 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:34:54 compute-0 nova_compute[192810]: 2025-09-30 21:34:54.365 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:34:54 compute-0 nova_compute[192810]: 2025-09-30 21:34:54.556 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:34:54 compute-0 nova_compute[192810]: 2025-09-30 21:34:54.572 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:34:54 compute-0 nova_compute[192810]: 2025-09-30 21:34:54.596 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:34:54 compute-0 nova_compute[192810]: 2025-09-30 21:34:54.596 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.324s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:34:54 compute-0 nova_compute[192810]: 2025-09-30 21:34:54.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:55 compute-0 nova_compute[192810]: 2025-09-30 21:34:55.055 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:55 compute-0 nova_compute[192810]: 2025-09-30 21:34:55.597 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:34:55 compute-0 nova_compute[192810]: 2025-09-30 21:34:55.598 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:34:55 compute-0 nova_compute[192810]: 2025-09-30 21:34:55.598 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:34:55 compute-0 nova_compute[192810]: 2025-09-30 21:34:55.875 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "refresh_cache-f92e449c-90c2-4cba-a8c1-ba6b1c82d770" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:34:55 compute-0 nova_compute[192810]: 2025-09-30 21:34:55.877 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquired lock "refresh_cache-f92e449c-90c2-4cba-a8c1-ba6b1c82d770" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:34:55 compute-0 nova_compute[192810]: 2025-09-30 21:34:55.877 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Sep 30 21:34:55 compute-0 nova_compute[192810]: 2025-09-30 21:34:55.878 2 DEBUG nova.objects.instance [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lazy-loading 'info_cache' on Instance uuid f92e449c-90c2-4cba-a8c1-ba6b1c82d770 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:34:56 compute-0 nova_compute[192810]: 2025-09-30 21:34:56.745 2 INFO nova.compute.manager [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Rescuing
Sep 30 21:34:56 compute-0 nova_compute[192810]: 2025-09-30 21:34:56.746 2 DEBUG oslo_concurrency.lockutils [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquiring lock "refresh_cache-4255e358-6db2-4947-a3d6-4e045f9235fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:34:56 compute-0 nova_compute[192810]: 2025-09-30 21:34:56.746 2 DEBUG oslo_concurrency.lockutils [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquired lock "refresh_cache-4255e358-6db2-4947-a3d6-4e045f9235fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:34:56 compute-0 nova_compute[192810]: 2025-09-30 21:34:56.746 2 DEBUG nova.network.neutron [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:34:57 compute-0 nova_compute[192810]: 2025-09-30 21:34:57.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:58 compute-0 nova_compute[192810]: 2025-09-30 21:34:58.952 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Updating instance_info_cache with network_info: [{"id": "e2cc5531-350c-4845-bde9-9797e9628421", "address": "fa:16:3e:3f:a2:bc", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape2cc5531-35", "ovs_interfaceid": "e2cc5531-350c-4845-bde9-9797e9628421", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:34:58 compute-0 nova_compute[192810]: 2025-09-30 21:34:58.989 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Releasing lock "refresh_cache-f92e449c-90c2-4cba-a8c1-ba6b1c82d770" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:34:58 compute-0 nova_compute[192810]: 2025-09-30 21:34:58.990 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Sep 30 21:34:58 compute-0 nova_compute[192810]: 2025-09-30 21:34:58.990 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:34:58 compute-0 nova_compute[192810]: 2025-09-30 21:34:58.990 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:34:59 compute-0 nova_compute[192810]: 2025-09-30 21:34:59.046 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:34:59 compute-0 ovn_controller[94912]: 2025-09-30T21:34:59Z|00038|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:03:b5:6d 10.100.0.6
Sep 30 21:34:59 compute-0 ovn_controller[94912]: 2025-09-30T21:34:59Z|00039|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:03:b5:6d 10.100.0.6
Sep 30 21:34:59 compute-0 nova_compute[192810]: 2025-09-30 21:34:59.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:34:59 compute-0 nova_compute[192810]: 2025-09-30 21:34:59.878 2 DEBUG nova.network.neutron [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Updating instance_info_cache with network_info: [{"id": "b0db3e99-94f0-42a8-a410-081e33fd0dee", "address": "fa:16:3e:03:b5:6d", "network": {"id": "f5a6396a-b7b7-4ff1-a2af-27477fea2815", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8978d2df88a5434c8794b659033cca5e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0db3e99-94", "ovs_interfaceid": "b0db3e99-94f0-42a8-a410-081e33fd0dee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:34:59 compute-0 nova_compute[192810]: 2025-09-30 21:34:59.902 2 DEBUG oslo_concurrency.lockutils [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Releasing lock "refresh_cache-4255e358-6db2-4947-a3d6-4e045f9235fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:35:00 compute-0 nova_compute[192810]: 2025-09-30 21:35:00.058 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:00 compute-0 nova_compute[192810]: 2025-09-30 21:35:00.261 2 DEBUG nova.virt.libvirt.driver [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Sep 30 21:35:01 compute-0 nova_compute[192810]: 2025-09-30 21:35:01.265 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268086.2647495, 09b1225c-028f-48a8-b366-03be5cdd4f19 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:35:01 compute-0 nova_compute[192810]: 2025-09-30 21:35:01.266 2 INFO nova.compute.manager [-] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] VM Stopped (Lifecycle Event)
Sep 30 21:35:01 compute-0 nova_compute[192810]: 2025-09-30 21:35:01.284 2 DEBUG nova.compute.manager [None req-ce8f25fb-6438-40c1-a85a-50bb7c25fc29 - - - - - -] [instance: 09b1225c-028f-48a8-b366-03be5cdd4f19] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:35:02 compute-0 kernel: tapb0db3e99-94 (unregistering): left promiscuous mode
Sep 30 21:35:02 compute-0 NetworkManager[51733]: <info>  [1759268102.4110] device (tapb0db3e99-94): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:35:02 compute-0 nova_compute[192810]: 2025-09-30 21:35:02.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:02 compute-0 ovn_controller[94912]: 2025-09-30T21:35:02Z|00362|binding|INFO|Releasing lport b0db3e99-94f0-42a8-a410-081e33fd0dee from this chassis (sb_readonly=0)
Sep 30 21:35:02 compute-0 ovn_controller[94912]: 2025-09-30T21:35:02Z|00363|binding|INFO|Setting lport b0db3e99-94f0-42a8-a410-081e33fd0dee down in Southbound
Sep 30 21:35:02 compute-0 ovn_controller[94912]: 2025-09-30T21:35:02Z|00364|binding|INFO|Removing iface tapb0db3e99-94 ovn-installed in OVS
Sep 30 21:35:02 compute-0 nova_compute[192810]: 2025-09-30 21:35:02.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:02 compute-0 nova_compute[192810]: 2025-09-30 21:35:02.433 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:02.433 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:03:b5:6d 10.100.0.6'], port_security=['fa:16:3e:03:b5:6d 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '4255e358-6db2-4947-a3d6-4e045f9235fb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8978d2df88a5434c8794b659033cca5e', 'neutron:revision_number': '4', 'neutron:security_group_ids': '93b1b45c-82db-437e-88d0-4d5f76771b04', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e6b1a10b-a890-44de-9d6b-4b24b7ba0344, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=b0db3e99-94f0-42a8-a410-081e33fd0dee) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:35:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:02.436 103867 INFO neutron.agent.ovn.metadata.agent [-] Port b0db3e99-94f0-42a8-a410-081e33fd0dee in datapath f5a6396a-b7b7-4ff1-a2af-27477fea2815 unbound from our chassis
Sep 30 21:35:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:02.438 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f5a6396a-b7b7-4ff1-a2af-27477fea2815, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:35:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:02.440 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[73c095bd-a642-4252-9802-a17b25ae74de]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:02.440 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815 namespace which is not needed anymore
Sep 30 21:35:02 compute-0 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d00000062.scope: Deactivated successfully.
Sep 30 21:35:02 compute-0 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d00000062.scope: Consumed 12.598s CPU time.
Sep 30 21:35:02 compute-0 systemd-machined[152794]: Machine qemu-45-instance-00000062 terminated.
Sep 30 21:35:02 compute-0 neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815[235113]: [NOTICE]   (235117) : haproxy version is 2.8.14-c23fe91
Sep 30 21:35:02 compute-0 neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815[235113]: [NOTICE]   (235117) : path to executable is /usr/sbin/haproxy
Sep 30 21:35:02 compute-0 neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815[235113]: [WARNING]  (235117) : Exiting Master process...
Sep 30 21:35:02 compute-0 neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815[235113]: [WARNING]  (235117) : Exiting Master process...
Sep 30 21:35:02 compute-0 neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815[235113]: [ALERT]    (235117) : Current worker (235119) exited with code 143 (Terminated)
Sep 30 21:35:02 compute-0 neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815[235113]: [WARNING]  (235117) : All workers exited. Exiting... (0)
Sep 30 21:35:02 compute-0 systemd[1]: libpod-3d2965743fefd64d2e07a09950fe643862eb570be843fee5806dd30a9e57ca16.scope: Deactivated successfully.
Sep 30 21:35:02 compute-0 podman[235274]: 2025-09-30 21:35:02.56585253 +0000 UTC m=+0.047039731 container died 3d2965743fefd64d2e07a09950fe643862eb570be843fee5806dd30a9e57ca16 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923)
Sep 30 21:35:02 compute-0 nova_compute[192810]: 2025-09-30 21:35:02.584 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:02 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3d2965743fefd64d2e07a09950fe643862eb570be843fee5806dd30a9e57ca16-userdata-shm.mount: Deactivated successfully.
Sep 30 21:35:02 compute-0 systemd[1]: var-lib-containers-storage-overlay-6f1af6fdcbb33ed59232f7eeccb1a282eefef0893600eb8ec7beb7677de50dc8-merged.mount: Deactivated successfully.
Sep 30 21:35:02 compute-0 podman[235274]: 2025-09-30 21:35:02.607653763 +0000 UTC m=+0.088840984 container cleanup 3d2965743fefd64d2e07a09950fe643862eb570be843fee5806dd30a9e57ca16 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 21:35:02 compute-0 systemd[1]: libpod-conmon-3d2965743fefd64d2e07a09950fe643862eb570be843fee5806dd30a9e57ca16.scope: Deactivated successfully.
Sep 30 21:35:02 compute-0 nova_compute[192810]: 2025-09-30 21:35:02.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:02 compute-0 nova_compute[192810]: 2025-09-30 21:35:02.653 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:02 compute-0 podman[235305]: 2025-09-30 21:35:02.663279354 +0000 UTC m=+0.037515779 container remove 3d2965743fefd64d2e07a09950fe643862eb570be843fee5806dd30a9e57ca16 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923)
Sep 30 21:35:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:02.667 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[5faa41bf-d675-4ad2-b2ac-45f07723c319]: (4, ('Tue Sep 30 09:35:02 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815 (3d2965743fefd64d2e07a09950fe643862eb570be843fee5806dd30a9e57ca16)\n3d2965743fefd64d2e07a09950fe643862eb570be843fee5806dd30a9e57ca16\nTue Sep 30 09:35:02 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815 (3d2965743fefd64d2e07a09950fe643862eb570be843fee5806dd30a9e57ca16)\n3d2965743fefd64d2e07a09950fe643862eb570be843fee5806dd30a9e57ca16\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:02.669 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[6da1b905-52f2-4fb7-a2ac-cf86233811ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:02.669 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf5a6396a-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:35:02 compute-0 nova_compute[192810]: 2025-09-30 21:35:02.671 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:02 compute-0 kernel: tapf5a6396a-b0: left promiscuous mode
Sep 30 21:35:02 compute-0 nova_compute[192810]: 2025-09-30 21:35:02.685 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:02.687 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[7e140c8e-9194-4e42-8927-0450e531e573]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:02.719 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a8787678-bd43-4705-9dbf-c34228cd4595]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:02.720 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[b2f77ae4-4e76-49cb-b1a5-121762359bf8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:02.733 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[d529208e-edd2-4707-80a2-1c7c953b0fed]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 478155, 'reachable_time': 27249, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235339, 'error': None, 'target': 'ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:02.735 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:35:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:02.735 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[a9678193-7bc2-414d-8861-56c2e5864fe0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:02 compute-0 systemd[1]: run-netns-ovnmeta\x2df5a6396a\x2db7b7\x2d4ff1\x2da2af\x2d27477fea2815.mount: Deactivated successfully.
Sep 30 21:35:03 compute-0 nova_compute[192810]: 2025-09-30 21:35:03.139 2 DEBUG nova.compute.manager [req-6f95c9ba-acad-47b1-876e-8a8cbb61605f req-fd6d32af-7f2b-4e08-80b8-0c267ad5dd3e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Received event network-vif-unplugged-b0db3e99-94f0-42a8-a410-081e33fd0dee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:35:03 compute-0 nova_compute[192810]: 2025-09-30 21:35:03.140 2 DEBUG oslo_concurrency.lockutils [req-6f95c9ba-acad-47b1-876e-8a8cbb61605f req-fd6d32af-7f2b-4e08-80b8-0c267ad5dd3e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "4255e358-6db2-4947-a3d6-4e045f9235fb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:03 compute-0 nova_compute[192810]: 2025-09-30 21:35:03.140 2 DEBUG oslo_concurrency.lockutils [req-6f95c9ba-acad-47b1-876e-8a8cbb61605f req-fd6d32af-7f2b-4e08-80b8-0c267ad5dd3e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "4255e358-6db2-4947-a3d6-4e045f9235fb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:03 compute-0 nova_compute[192810]: 2025-09-30 21:35:03.140 2 DEBUG oslo_concurrency.lockutils [req-6f95c9ba-acad-47b1-876e-8a8cbb61605f req-fd6d32af-7f2b-4e08-80b8-0c267ad5dd3e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "4255e358-6db2-4947-a3d6-4e045f9235fb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:03 compute-0 nova_compute[192810]: 2025-09-30 21:35:03.140 2 DEBUG nova.compute.manager [req-6f95c9ba-acad-47b1-876e-8a8cbb61605f req-fd6d32af-7f2b-4e08-80b8-0c267ad5dd3e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] No waiting events found dispatching network-vif-unplugged-b0db3e99-94f0-42a8-a410-081e33fd0dee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:35:03 compute-0 nova_compute[192810]: 2025-09-30 21:35:03.140 2 WARNING nova.compute.manager [req-6f95c9ba-acad-47b1-876e-8a8cbb61605f req-fd6d32af-7f2b-4e08-80b8-0c267ad5dd3e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Received unexpected event network-vif-unplugged-b0db3e99-94f0-42a8-a410-081e33fd0dee for instance with vm_state active and task_state rescuing.
Sep 30 21:35:03 compute-0 nova_compute[192810]: 2025-09-30 21:35:03.276 2 INFO nova.virt.libvirt.driver [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Instance shutdown successfully after 3 seconds.
Sep 30 21:35:03 compute-0 nova_compute[192810]: 2025-09-30 21:35:03.280 2 INFO nova.virt.libvirt.driver [-] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Instance destroyed successfully.
Sep 30 21:35:03 compute-0 nova_compute[192810]: 2025-09-30 21:35:03.280 2 DEBUG nova.objects.instance [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lazy-loading 'numa_topology' on Instance uuid 4255e358-6db2-4947-a3d6-4e045f9235fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:35:03 compute-0 nova_compute[192810]: 2025-09-30 21:35:03.301 2 INFO nova.virt.libvirt.driver [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Attempting a stable device rescue
Sep 30 21:35:03 compute-0 nova_compute[192810]: 2025-09-30 21:35:03.565 2 DEBUG nova.virt.libvirt.driver [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'scsi', 'dev': 'sdb', 'type': 'disk'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314
Sep 30 21:35:03 compute-0 nova_compute[192810]: 2025-09-30 21:35:03.570 2 DEBUG nova.virt.libvirt.driver [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Sep 30 21:35:03 compute-0 nova_compute[192810]: 2025-09-30 21:35:03.571 2 INFO nova.virt.libvirt.driver [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Creating image(s)
Sep 30 21:35:03 compute-0 nova_compute[192810]: 2025-09-30 21:35:03.573 2 DEBUG oslo_concurrency.lockutils [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquiring lock "/var/lib/nova/instances/4255e358-6db2-4947-a3d6-4e045f9235fb/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:03 compute-0 nova_compute[192810]: 2025-09-30 21:35:03.573 2 DEBUG oslo_concurrency.lockutils [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "/var/lib/nova/instances/4255e358-6db2-4947-a3d6-4e045f9235fb/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:03 compute-0 nova_compute[192810]: 2025-09-30 21:35:03.575 2 DEBUG oslo_concurrency.lockutils [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "/var/lib/nova/instances/4255e358-6db2-4947-a3d6-4e045f9235fb/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:03 compute-0 nova_compute[192810]: 2025-09-30 21:35:03.575 2 DEBUG nova.objects.instance [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lazy-loading 'trusted_certs' on Instance uuid 4255e358-6db2-4947-a3d6-4e045f9235fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:35:03 compute-0 nova_compute[192810]: 2025-09-30 21:35:03.591 2 DEBUG oslo_concurrency.lockutils [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquiring lock "6b8d62627514116d042a4973bfd575f038df9d61" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:03 compute-0 nova_compute[192810]: 2025-09-30 21:35:03.592 2 DEBUG oslo_concurrency.lockutils [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "6b8d62627514116d042a4973bfd575f038df9d61" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:04 compute-0 nova_compute[192810]: 2025-09-30 21:35:04.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.035 2 DEBUG oslo_concurrency.processutils [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6b8d62627514116d042a4973bfd575f038df9d61.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.131 2 DEBUG oslo_concurrency.processutils [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6b8d62627514116d042a4973bfd575f038df9d61.part --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.132 2 DEBUG nova.virt.images [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] 01dc9202-d763-4a02-ba08-632f5a9921d6 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.133 2 DEBUG nova.privsep.utils [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.134 2 DEBUG oslo_concurrency.processutils [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/6b8d62627514116d042a4973bfd575f038df9d61.part /var/lib/nova/instances/_base/6b8d62627514116d042a4973bfd575f038df9d61.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.233 2 DEBUG nova.compute.manager [req-cefb4a1f-239e-4486-8345-468708f30d3d req-7ebc789c-dd61-4ab4-ab82-17afec224626 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Received event network-vif-plugged-b0db3e99-94f0-42a8-a410-081e33fd0dee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.233 2 DEBUG oslo_concurrency.lockutils [req-cefb4a1f-239e-4486-8345-468708f30d3d req-7ebc789c-dd61-4ab4-ab82-17afec224626 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "4255e358-6db2-4947-a3d6-4e045f9235fb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.233 2 DEBUG oslo_concurrency.lockutils [req-cefb4a1f-239e-4486-8345-468708f30d3d req-7ebc789c-dd61-4ab4-ab82-17afec224626 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "4255e358-6db2-4947-a3d6-4e045f9235fb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.234 2 DEBUG oslo_concurrency.lockutils [req-cefb4a1f-239e-4486-8345-468708f30d3d req-7ebc789c-dd61-4ab4-ab82-17afec224626 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "4255e358-6db2-4947-a3d6-4e045f9235fb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.234 2 DEBUG nova.compute.manager [req-cefb4a1f-239e-4486-8345-468708f30d3d req-7ebc789c-dd61-4ab4-ab82-17afec224626 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] No waiting events found dispatching network-vif-plugged-b0db3e99-94f0-42a8-a410-081e33fd0dee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.234 2 WARNING nova.compute.manager [req-cefb4a1f-239e-4486-8345-468708f30d3d req-7ebc789c-dd61-4ab4-ab82-17afec224626 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Received unexpected event network-vif-plugged-b0db3e99-94f0-42a8-a410-081e33fd0dee for instance with vm_state active and task_state rescuing.
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.248 2 DEBUG oslo_concurrency.processutils [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/6b8d62627514116d042a4973bfd575f038df9d61.part /var/lib/nova/instances/_base/6b8d62627514116d042a4973bfd575f038df9d61.converted" returned: 0 in 0.115s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.257 2 DEBUG oslo_concurrency.processutils [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6b8d62627514116d042a4973bfd575f038df9d61.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:35:05 compute-0 podman[235350]: 2025-09-30 21:35:05.337742633 +0000 UTC m=+0.057728233 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.343 2 DEBUG oslo_concurrency.processutils [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6b8d62627514116d042a4973bfd575f038df9d61.converted --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.397 2 DEBUG oslo_concurrency.lockutils [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "6b8d62627514116d042a4973bfd575f038df9d61" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.805s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.411 2 DEBUG oslo_concurrency.lockutils [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquiring lock "6b8d62627514116d042a4973bfd575f038df9d61" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.412 2 DEBUG oslo_concurrency.lockutils [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "6b8d62627514116d042a4973bfd575f038df9d61" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:05 compute-0 podman[235352]: 2025-09-30 21:35:05.417123575 +0000 UTC m=+0.121704809 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.openshift.expose-services=, version=9.6, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, release=1755695350, 
container_name=openstack_network_exporter, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal)
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.423 2 DEBUG oslo_concurrency.processutils [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6b8d62627514116d042a4973bfd575f038df9d61 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.481 2 DEBUG oslo_concurrency.processutils [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6b8d62627514116d042a4973bfd575f038df9d61 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.482 2 DEBUG oslo_concurrency.processutils [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/6b8d62627514116d042a4973bfd575f038df9d61,backing_fmt=raw /var/lib/nova/instances/4255e358-6db2-4947-a3d6-4e045f9235fb/disk.rescue execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.515 2 DEBUG oslo_concurrency.processutils [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/6b8d62627514116d042a4973bfd575f038df9d61,backing_fmt=raw /var/lib/nova/instances/4255e358-6db2-4947-a3d6-4e045f9235fb/disk.rescue" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.517 2 DEBUG oslo_concurrency.lockutils [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "6b8d62627514116d042a4973bfd575f038df9d61" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.517 2 DEBUG nova.objects.instance [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lazy-loading 'migration_context' on Instance uuid 4255e358-6db2-4947-a3d6-4e045f9235fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.530 2 DEBUG nova.virt.libvirt.driver [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.533 2 DEBUG nova.virt.libvirt.driver [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Start _get_guest_xml network_info=[{"id": "b0db3e99-94f0-42a8-a410-081e33fd0dee", "address": "fa:16:3e:03:b5:6d", "network": {"id": "f5a6396a-b7b7-4ff1-a2af-27477fea2815", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "vif_mac": "fa:16:3e:03:b5:6d"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8978d2df88a5434c8794b659033cca5e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0db3e99-94", "ovs_interfaceid": "b0db3e99-94f0-42a8-a410-081e33fd0dee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'scsi', 'dev': 'sdb', 'type': 'disk'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': '01dc9202-d763-4a02-ba08-632f5a9921d6', 'kernel_id': '', 'ramdisk_id': ''} block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.534 2 DEBUG nova.objects.instance [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lazy-loading 'resources' on Instance uuid 4255e358-6db2-4947-a3d6-4e045f9235fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.547 2 WARNING nova.virt.libvirt.driver [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.553 2 DEBUG nova.virt.libvirt.host [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.554 2 DEBUG nova.virt.libvirt.host [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.557 2 DEBUG nova.virt.libvirt.host [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.557 2 DEBUG nova.virt.libvirt.host [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.559 2 DEBUG nova.virt.libvirt.driver [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.559 2 DEBUG nova.virt.hardware [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.560 2 DEBUG nova.virt.hardware [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.560 2 DEBUG nova.virt.hardware [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.560 2 DEBUG nova.virt.hardware [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.560 2 DEBUG nova.virt.hardware [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.561 2 DEBUG nova.virt.hardware [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.561 2 DEBUG nova.virt.hardware [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.561 2 DEBUG nova.virt.hardware [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.562 2 DEBUG nova.virt.hardware [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.562 2 DEBUG nova.virt.hardware [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.562 2 DEBUG nova.virt.hardware [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.563 2 DEBUG nova.objects.instance [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lazy-loading 'vcpu_model' on Instance uuid 4255e358-6db2-4947-a3d6-4e045f9235fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.580 2 DEBUG oslo_concurrency.processutils [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4255e358-6db2-4947-a3d6-4e045f9235fb/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.646 2 DEBUG oslo_concurrency.processutils [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4255e358-6db2-4947-a3d6-4e045f9235fb/disk.config --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.647 2 DEBUG oslo_concurrency.lockutils [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquiring lock "/var/lib/nova/instances/4255e358-6db2-4947-a3d6-4e045f9235fb/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.647 2 DEBUG oslo_concurrency.lockutils [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "/var/lib/nova/instances/4255e358-6db2-4947-a3d6-4e045f9235fb/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.648 2 DEBUG oslo_concurrency.lockutils [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "/var/lib/nova/instances/4255e358-6db2-4947-a3d6-4e045f9235fb/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.649 2 DEBUG nova.virt.libvirt.vif [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:34:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-475541027',display_name='tempest-ServerStableDeviceRescueTest-server-475541027',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-475541027',id=98,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:34:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8978d2df88a5434c8794b659033cca5e',ramdisk_id='',reservation_id='r-1jbnwm8v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='vir
tio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-1939201844',owner_user_name='tempest-ServerStableDeviceRescueTest-1939201844-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:34:53Z,user_data=None,user_id='8b1ebef014c145cbbe1e367bfd2c2ba3',uuid=4255e358-6db2-4947-a3d6-4e045f9235fb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b0db3e99-94f0-42a8-a410-081e33fd0dee", "address": "fa:16:3e:03:b5:6d", "network": {"id": "f5a6396a-b7b7-4ff1-a2af-27477fea2815", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "vif_mac": "fa:16:3e:03:b5:6d"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8978d2df88a5434c8794b659033cca5e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0db3e99-94", "ovs_interfaceid": "b0db3e99-94f0-42a8-a410-081e33fd0dee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.650 2 DEBUG nova.network.os_vif_util [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Converting VIF {"id": "b0db3e99-94f0-42a8-a410-081e33fd0dee", "address": "fa:16:3e:03:b5:6d", "network": {"id": "f5a6396a-b7b7-4ff1-a2af-27477fea2815", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "vif_mac": "fa:16:3e:03:b5:6d"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8978d2df88a5434c8794b659033cca5e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0db3e99-94", "ovs_interfaceid": "b0db3e99-94f0-42a8-a410-081e33fd0dee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.651 2 DEBUG nova.network.os_vif_util [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:03:b5:6d,bridge_name='br-int',has_traffic_filtering=True,id=b0db3e99-94f0-42a8-a410-081e33fd0dee,network=Network(f5a6396a-b7b7-4ff1-a2af-27477fea2815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0db3e99-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.652 2 DEBUG nova.objects.instance [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lazy-loading 'pci_devices' on Instance uuid 4255e358-6db2-4947-a3d6-4e045f9235fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.669 2 DEBUG nova.virt.libvirt.driver [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:35:05 compute-0 nova_compute[192810]:   <uuid>4255e358-6db2-4947-a3d6-4e045f9235fb</uuid>
Sep 30 21:35:05 compute-0 nova_compute[192810]:   <name>instance-00000062</name>
Sep 30 21:35:05 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:35:05 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:35:05 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:35:05 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:       <nova:name>tempest-ServerStableDeviceRescueTest-server-475541027</nova:name>
Sep 30 21:35:05 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:35:05</nova:creationTime>
Sep 30 21:35:05 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:35:05 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:35:05 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:35:05 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:35:05 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:35:05 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:35:05 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:35:05 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:35:05 compute-0 nova_compute[192810]:         <nova:user uuid="8b1ebef014c145cbbe1e367bfd2c2ba3">tempest-ServerStableDeviceRescueTest-1939201844-project-member</nova:user>
Sep 30 21:35:05 compute-0 nova_compute[192810]:         <nova:project uuid="8978d2df88a5434c8794b659033cca5e">tempest-ServerStableDeviceRescueTest-1939201844</nova:project>
Sep 30 21:35:05 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:35:05 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:35:05 compute-0 nova_compute[192810]:         <nova:port uuid="b0db3e99-94f0-42a8-a410-081e33fd0dee">
Sep 30 21:35:05 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:35:05 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:35:05 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:35:05 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:35:05 compute-0 nova_compute[192810]:     <system>
Sep 30 21:35:05 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:35:05 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:35:05 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:35:05 compute-0 nova_compute[192810]:       <entry name="serial">4255e358-6db2-4947-a3d6-4e045f9235fb</entry>
Sep 30 21:35:05 compute-0 nova_compute[192810]:       <entry name="uuid">4255e358-6db2-4947-a3d6-4e045f9235fb</entry>
Sep 30 21:35:05 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     </system>
Sep 30 21:35:05 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:35:05 compute-0 nova_compute[192810]:   <os>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:   </os>
Sep 30 21:35:05 compute-0 nova_compute[192810]:   <features>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:   </features>
Sep 30 21:35:05 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:35:05 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:35:05 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:35:05 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:35:05 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:35:05 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/4255e358-6db2-4947-a3d6-4e045f9235fb/disk"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:35:05 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/4255e358-6db2-4947-a3d6-4e045f9235fb/disk.config"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:35:05 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/4255e358-6db2-4947-a3d6-4e045f9235fb/disk.rescue"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:       <target dev="sdb" bus="scsi"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:       <boot order="1"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:35:05 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:03:b5:6d"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:       <target dev="tapb0db3e99-94"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:35:05 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/4255e358-6db2-4947-a3d6-4e045f9235fb/console.log" append="off"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     <video>
Sep 30 21:35:05 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     </video>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:35:05 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:35:05 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:35:05 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:35:05 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:35:05 compute-0 nova_compute[192810]: </domain>
Sep 30 21:35:05 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.677 2 INFO nova.virt.libvirt.driver [-] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Instance destroyed successfully.
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.769 2 DEBUG nova.virt.libvirt.driver [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.770 2 DEBUG nova.virt.libvirt.driver [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.770 2 DEBUG nova.virt.libvirt.driver [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] No BDM found with device name sdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.770 2 DEBUG nova.virt.libvirt.driver [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] No VIF found with MAC fa:16:3e:03:b5:6d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.771 2 INFO nova.virt.libvirt.driver [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Using config drive
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.796 2 DEBUG nova.objects.instance [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lazy-loading 'ec2_ids' on Instance uuid 4255e358-6db2-4947-a3d6-4e045f9235fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:35:05 compute-0 nova_compute[192810]: 2025-09-30 21:35:05.838 2 DEBUG nova.objects.instance [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lazy-loading 'keypairs' on Instance uuid 4255e358-6db2-4947-a3d6-4e045f9235fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:35:06 compute-0 nova_compute[192810]: 2025-09-30 21:35:06.878 2 INFO nova.virt.libvirt.driver [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Creating config drive at /var/lib/nova/instances/4255e358-6db2-4947-a3d6-4e045f9235fb/disk.config.rescue
Sep 30 21:35:06 compute-0 nova_compute[192810]: 2025-09-30 21:35:06.882 2 DEBUG oslo_concurrency.processutils [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4255e358-6db2-4947-a3d6-4e045f9235fb/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzqa2gl5n execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:35:07 compute-0 nova_compute[192810]: 2025-09-30 21:35:07.014 2 DEBUG oslo_concurrency.processutils [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4255e358-6db2-4947-a3d6-4e045f9235fb/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzqa2gl5n" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:35:07 compute-0 kernel: tapb0db3e99-94: entered promiscuous mode
Sep 30 21:35:07 compute-0 NetworkManager[51733]: <info>  [1759268107.0973] manager: (tapb0db3e99-94): new Tun device (/org/freedesktop/NetworkManager/Devices/172)
Sep 30 21:35:07 compute-0 ovn_controller[94912]: 2025-09-30T21:35:07Z|00365|binding|INFO|Claiming lport b0db3e99-94f0-42a8-a410-081e33fd0dee for this chassis.
Sep 30 21:35:07 compute-0 ovn_controller[94912]: 2025-09-30T21:35:07Z|00366|binding|INFO|b0db3e99-94f0-42a8-a410-081e33fd0dee: Claiming fa:16:3e:03:b5:6d 10.100.0.6
Sep 30 21:35:07 compute-0 nova_compute[192810]: 2025-09-30 21:35:07.099 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:07.107 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:03:b5:6d 10.100.0.6'], port_security=['fa:16:3e:03:b5:6d 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '4255e358-6db2-4947-a3d6-4e045f9235fb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8978d2df88a5434c8794b659033cca5e', 'neutron:revision_number': '5', 'neutron:security_group_ids': '93b1b45c-82db-437e-88d0-4d5f76771b04', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e6b1a10b-a890-44de-9d6b-4b24b7ba0344, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=b0db3e99-94f0-42a8-a410-081e33fd0dee) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:07.108 103867 INFO neutron.agent.ovn.metadata.agent [-] Port b0db3e99-94f0-42a8-a410-081e33fd0dee in datapath f5a6396a-b7b7-4ff1-a2af-27477fea2815 bound to our chassis
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:07.109 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f5a6396a-b7b7-4ff1-a2af-27477fea2815
Sep 30 21:35:07 compute-0 ovn_controller[94912]: 2025-09-30T21:35:07Z|00367|binding|INFO|Setting lport b0db3e99-94f0-42a8-a410-081e33fd0dee ovn-installed in OVS
Sep 30 21:35:07 compute-0 ovn_controller[94912]: 2025-09-30T21:35:07Z|00368|binding|INFO|Setting lport b0db3e99-94f0-42a8-a410-081e33fd0dee up in Southbound
Sep 30 21:35:07 compute-0 nova_compute[192810]: 2025-09-30 21:35:07.117 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:07 compute-0 nova_compute[192810]: 2025-09-30 21:35:07.119 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:07.121 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[2fb18449-42ab-465e-b634-683d6dd9d30c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:07.122 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf5a6396a-b1 in ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:07.124 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf5a6396a-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:07.124 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[85c6b7f6-5c2e-4b24-b6d8-6a73fb1ee837]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:07.124 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[bf1ed051-303e-4433-bd72-cd3a4e4b489b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:07 compute-0 systemd-udevd[235425]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:07.137 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[467ae828-b43e-4e7a-af36-963d5520e2bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:07 compute-0 NetworkManager[51733]: <info>  [1759268107.1403] device (tapb0db3e99-94): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:35:07 compute-0 NetworkManager[51733]: <info>  [1759268107.1411] device (tapb0db3e99-94): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:35:07 compute-0 systemd-machined[152794]: New machine qemu-46-instance-00000062.
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:07.161 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[52ec2eda-773e-4524-82f0-6a5940771aa3]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:07 compute-0 systemd[1]: Started Virtual Machine qemu-46-instance-00000062.
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:07.187 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[85820fa6-0a4e-45a4-a96d-280a6b938a9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:07 compute-0 systemd-udevd[235431]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:35:07 compute-0 NetworkManager[51733]: <info>  [1759268107.1925] manager: (tapf5a6396a-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/173)
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:07.191 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[7f4cea6d-10e0-4739-b1b7-3bd1e2fc468b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:07.224 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[532b9c02-dfb8-42c3-b3ce-a6498a6e9fe5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:07.227 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[ef7d57ee-9dd0-4bc2-8b58-e71f29b2a936]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:07 compute-0 NetworkManager[51733]: <info>  [1759268107.2484] device (tapf5a6396a-b0): carrier: link connected
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:07.254 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[dec07a74-e951-48e3-abeb-2f54b7c5e64a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:07.270 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[1df50e49-2063-4f6d-a0fa-a72db2dfced7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf5a6396a-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:66:d5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 110], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 480287, 'reachable_time': 42213, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235460, 'error': None, 'target': 'ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:07.286 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[c607bab1-3dce-4f6b-bf39-f78b2f227b6e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe91:66d5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 480287, 'tstamp': 480287}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235461, 'error': None, 'target': 'ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:07.303 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[dd2213c1-5ba6-4f6d-a0c8-36207b974bbe]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf5a6396a-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:66:d5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 110], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 480287, 'reachable_time': 42213, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 235462, 'error': None, 'target': 'ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:07.331 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[9a0c5f5c-3da1-4c24-a5dd-40273ec33fd9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:07.379 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[532773cd-1ba2-4bcc-933f-68677d60425e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:07.380 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf5a6396a-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:07.380 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:07.381 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf5a6396a-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:35:07 compute-0 nova_compute[192810]: 2025-09-30 21:35:07.382 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:07 compute-0 kernel: tapf5a6396a-b0: entered promiscuous mode
Sep 30 21:35:07 compute-0 nova_compute[192810]: 2025-09-30 21:35:07.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:07 compute-0 NetworkManager[51733]: <info>  [1759268107.3850] manager: (tapf5a6396a-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/174)
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:07.388 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf5a6396a-b0, col_values=(('external_ids', {'iface-id': '10034ee7-d74d-45f3-b835-201b62e1bcd6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:35:07 compute-0 ovn_controller[94912]: 2025-09-30T21:35:07Z|00369|binding|INFO|Releasing lport 10034ee7-d74d-45f3-b835-201b62e1bcd6 from this chassis (sb_readonly=0)
Sep 30 21:35:07 compute-0 nova_compute[192810]: 2025-09-30 21:35:07.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:07.393 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f5a6396a-b7b7-4ff1-a2af-27477fea2815.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f5a6396a-b7b7-4ff1-a2af-27477fea2815.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:07.393 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[5639f7e1-8c24-45db-ab6f-9392a8ac3382]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:07.394 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-f5a6396a-b7b7-4ff1-a2af-27477fea2815
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/f5a6396a-b7b7-4ff1-a2af-27477fea2815.pid.haproxy
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID f5a6396a-b7b7-4ff1-a2af-27477fea2815
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:35:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:07.394 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'env', 'PROCESS_TAG=haproxy-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f5a6396a-b7b7-4ff1-a2af-27477fea2815.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:35:07 compute-0 nova_compute[192810]: 2025-09-30 21:35:07.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:07 compute-0 nova_compute[192810]: 2025-09-30 21:35:07.541 2 DEBUG nova.compute.manager [req-692a3a31-bf2a-4506-8518-e0b1c699a3e0 req-b8bb9305-92ae-4ad0-aa58-276cbe322d66 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Received event network-vif-plugged-b0db3e99-94f0-42a8-a410-081e33fd0dee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:35:07 compute-0 nova_compute[192810]: 2025-09-30 21:35:07.541 2 DEBUG oslo_concurrency.lockutils [req-692a3a31-bf2a-4506-8518-e0b1c699a3e0 req-b8bb9305-92ae-4ad0-aa58-276cbe322d66 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "4255e358-6db2-4947-a3d6-4e045f9235fb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:07 compute-0 nova_compute[192810]: 2025-09-30 21:35:07.541 2 DEBUG oslo_concurrency.lockutils [req-692a3a31-bf2a-4506-8518-e0b1c699a3e0 req-b8bb9305-92ae-4ad0-aa58-276cbe322d66 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "4255e358-6db2-4947-a3d6-4e045f9235fb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:07 compute-0 nova_compute[192810]: 2025-09-30 21:35:07.542 2 DEBUG oslo_concurrency.lockutils [req-692a3a31-bf2a-4506-8518-e0b1c699a3e0 req-b8bb9305-92ae-4ad0-aa58-276cbe322d66 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "4255e358-6db2-4947-a3d6-4e045f9235fb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:07 compute-0 nova_compute[192810]: 2025-09-30 21:35:07.542 2 DEBUG nova.compute.manager [req-692a3a31-bf2a-4506-8518-e0b1c699a3e0 req-b8bb9305-92ae-4ad0-aa58-276cbe322d66 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] No waiting events found dispatching network-vif-plugged-b0db3e99-94f0-42a8-a410-081e33fd0dee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:35:07 compute-0 nova_compute[192810]: 2025-09-30 21:35:07.542 2 WARNING nova.compute.manager [req-692a3a31-bf2a-4506-8518-e0b1c699a3e0 req-b8bb9305-92ae-4ad0-aa58-276cbe322d66 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Received unexpected event network-vif-plugged-b0db3e99-94f0-42a8-a410-081e33fd0dee for instance with vm_state active and task_state rescuing.
Sep 30 21:35:07 compute-0 nova_compute[192810]: 2025-09-30 21:35:07.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:07 compute-0 podman[235501]: 2025-09-30 21:35:07.778108734 +0000 UTC m=+0.058597075 container create 95a418b8c3ec39cc9389c9bea41d80c6b1bf3292b4623aef8575891cc9260e51 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923)
Sep 30 21:35:07 compute-0 systemd[1]: Started libpod-conmon-95a418b8c3ec39cc9389c9bea41d80c6b1bf3292b4623aef8575891cc9260e51.scope.
Sep 30 21:35:07 compute-0 podman[235501]: 2025-09-30 21:35:07.749406522 +0000 UTC m=+0.029894913 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:35:07 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:35:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5f0cdacc25b9795f90c5d4d2ccce1e275ed4fa10915bd548b66684a420803bd1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:35:07 compute-0 podman[235501]: 2025-09-30 21:35:07.863809331 +0000 UTC m=+0.144297722 container init 95a418b8c3ec39cc9389c9bea41d80c6b1bf3292b4623aef8575891cc9260e51 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:35:07 compute-0 podman[235501]: 2025-09-30 21:35:07.873720234 +0000 UTC m=+0.154208595 container start 95a418b8c3ec39cc9389c9bea41d80c6b1bf3292b4623aef8575891cc9260e51 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:35:07 compute-0 neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815[235516]: [NOTICE]   (235520) : New worker (235522) forked
Sep 30 21:35:07 compute-0 neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815[235516]: [NOTICE]   (235520) : Loading success.
Sep 30 21:35:07 compute-0 nova_compute[192810]: 2025-09-30 21:35:07.920 2 DEBUG nova.virt.libvirt.host [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Removed pending event for 4255e358-6db2-4947-a3d6-4e045f9235fb due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Sep 30 21:35:07 compute-0 nova_compute[192810]: 2025-09-30 21:35:07.921 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268107.9203026, 4255e358-6db2-4947-a3d6-4e045f9235fb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:35:07 compute-0 nova_compute[192810]: 2025-09-30 21:35:07.921 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] VM Resumed (Lifecycle Event)
Sep 30 21:35:07 compute-0 nova_compute[192810]: 2025-09-30 21:35:07.937 2 DEBUG nova.compute.manager [None req-a60ed96d-c5e2-4650-9023-37ff990d4a0b 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:35:07 compute-0 nova_compute[192810]: 2025-09-30 21:35:07.943 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:35:07 compute-0 nova_compute[192810]: 2025-09-30 21:35:07.945 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:35:07 compute-0 nova_compute[192810]: 2025-09-30 21:35:07.971 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] During sync_power_state the instance has a pending task (rescuing). Skip.
Sep 30 21:35:07 compute-0 nova_compute[192810]: 2025-09-30 21:35:07.971 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268107.9214556, 4255e358-6db2-4947-a3d6-4e045f9235fb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:35:07 compute-0 nova_compute[192810]: 2025-09-30 21:35:07.971 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] VM Started (Lifecycle Event)
Sep 30 21:35:07 compute-0 nova_compute[192810]: 2025-09-30 21:35:07.994 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:35:07 compute-0 nova_compute[192810]: 2025-09-30 21:35:07.997 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:35:09 compute-0 nova_compute[192810]: 2025-09-30 21:35:09.670 2 DEBUG nova.compute.manager [req-eb4875d4-de57-415e-9743-67ae4f9ed593 req-7cb16565-b488-4a24-8c77-85cb5581f028 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Received event network-vif-plugged-b0db3e99-94f0-42a8-a410-081e33fd0dee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:35:09 compute-0 nova_compute[192810]: 2025-09-30 21:35:09.671 2 DEBUG oslo_concurrency.lockutils [req-eb4875d4-de57-415e-9743-67ae4f9ed593 req-7cb16565-b488-4a24-8c77-85cb5581f028 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "4255e358-6db2-4947-a3d6-4e045f9235fb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:09 compute-0 nova_compute[192810]: 2025-09-30 21:35:09.672 2 DEBUG oslo_concurrency.lockutils [req-eb4875d4-de57-415e-9743-67ae4f9ed593 req-7cb16565-b488-4a24-8c77-85cb5581f028 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "4255e358-6db2-4947-a3d6-4e045f9235fb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:09 compute-0 nova_compute[192810]: 2025-09-30 21:35:09.672 2 DEBUG oslo_concurrency.lockutils [req-eb4875d4-de57-415e-9743-67ae4f9ed593 req-7cb16565-b488-4a24-8c77-85cb5581f028 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "4255e358-6db2-4947-a3d6-4e045f9235fb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:09 compute-0 nova_compute[192810]: 2025-09-30 21:35:09.672 2 DEBUG nova.compute.manager [req-eb4875d4-de57-415e-9743-67ae4f9ed593 req-7cb16565-b488-4a24-8c77-85cb5581f028 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] No waiting events found dispatching network-vif-plugged-b0db3e99-94f0-42a8-a410-081e33fd0dee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:35:09 compute-0 nova_compute[192810]: 2025-09-30 21:35:09.673 2 WARNING nova.compute.manager [req-eb4875d4-de57-415e-9743-67ae4f9ed593 req-7cb16565-b488-4a24-8c77-85cb5581f028 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Received unexpected event network-vif-plugged-b0db3e99-94f0-42a8-a410-081e33fd0dee for instance with vm_state rescued and task_state None.
Sep 30 21:35:10 compute-0 nova_compute[192810]: 2025-09-30 21:35:10.070 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:10 compute-0 nova_compute[192810]: 2025-09-30 21:35:10.140 2 INFO nova.compute.manager [None req-14a472cd-4b6b-45a0-b614-61a2944128ac 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Unrescuing
Sep 30 21:35:10 compute-0 nova_compute[192810]: 2025-09-30 21:35:10.140 2 DEBUG oslo_concurrency.lockutils [None req-14a472cd-4b6b-45a0-b614-61a2944128ac 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquiring lock "refresh_cache-4255e358-6db2-4947-a3d6-4e045f9235fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:35:10 compute-0 nova_compute[192810]: 2025-09-30 21:35:10.140 2 DEBUG oslo_concurrency.lockutils [None req-14a472cd-4b6b-45a0-b614-61a2944128ac 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquired lock "refresh_cache-4255e358-6db2-4947-a3d6-4e045f9235fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:35:10 compute-0 nova_compute[192810]: 2025-09-30 21:35:10.141 2 DEBUG nova.network.neutron [None req-14a472cd-4b6b-45a0-b614-61a2944128ac 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:35:12 compute-0 podman[235532]: 2025-09-30 21:35:12.324014204 +0000 UTC m=+0.059756453 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, config_id=iscsid, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Sep 30 21:35:12 compute-0 podman[235531]: 2025-09-30 21:35:12.324457325 +0000 UTC m=+0.061009373 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250923)
Sep 30 21:35:12 compute-0 podman[235533]: 2025-09-30 21:35:12.357530835 +0000 UTC m=+0.090688850 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 21:35:12 compute-0 nova_compute[192810]: 2025-09-30 21:35:12.590 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:13 compute-0 nova_compute[192810]: 2025-09-30 21:35:13.314 2 DEBUG nova.network.neutron [None req-14a472cd-4b6b-45a0-b614-61a2944128ac 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Updating instance_info_cache with network_info: [{"id": "b0db3e99-94f0-42a8-a410-081e33fd0dee", "address": "fa:16:3e:03:b5:6d", "network": {"id": "f5a6396a-b7b7-4ff1-a2af-27477fea2815", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8978d2df88a5434c8794b659033cca5e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0db3e99-94", "ovs_interfaceid": "b0db3e99-94f0-42a8-a410-081e33fd0dee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:35:13 compute-0 nova_compute[192810]: 2025-09-30 21:35:13.336 2 DEBUG oslo_concurrency.lockutils [None req-14a472cd-4b6b-45a0-b614-61a2944128ac 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Releasing lock "refresh_cache-4255e358-6db2-4947-a3d6-4e045f9235fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:35:13 compute-0 nova_compute[192810]: 2025-09-30 21:35:13.337 2 DEBUG nova.objects.instance [None req-14a472cd-4b6b-45a0-b614-61a2944128ac 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lazy-loading 'flavor' on Instance uuid 4255e358-6db2-4947-a3d6-4e045f9235fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:35:13 compute-0 kernel: tapb0db3e99-94 (unregistering): left promiscuous mode
Sep 30 21:35:13 compute-0 NetworkManager[51733]: <info>  [1759268113.4231] device (tapb0db3e99-94): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:35:13 compute-0 nova_compute[192810]: 2025-09-30 21:35:13.430 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:13 compute-0 ovn_controller[94912]: 2025-09-30T21:35:13Z|00370|binding|INFO|Releasing lport b0db3e99-94f0-42a8-a410-081e33fd0dee from this chassis (sb_readonly=0)
Sep 30 21:35:13 compute-0 ovn_controller[94912]: 2025-09-30T21:35:13Z|00371|binding|INFO|Setting lport b0db3e99-94f0-42a8-a410-081e33fd0dee down in Southbound
Sep 30 21:35:13 compute-0 ovn_controller[94912]: 2025-09-30T21:35:13Z|00372|binding|INFO|Removing iface tapb0db3e99-94 ovn-installed in OVS
Sep 30 21:35:13 compute-0 nova_compute[192810]: 2025-09-30 21:35:13.443 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:13.446 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:03:b5:6d 10.100.0.6'], port_security=['fa:16:3e:03:b5:6d 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '4255e358-6db2-4947-a3d6-4e045f9235fb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8978d2df88a5434c8794b659033cca5e', 'neutron:revision_number': '6', 'neutron:security_group_ids': '93b1b45c-82db-437e-88d0-4d5f76771b04', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e6b1a10b-a890-44de-9d6b-4b24b7ba0344, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=b0db3e99-94f0-42a8-a410-081e33fd0dee) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:35:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:13.447 103867 INFO neutron.agent.ovn.metadata.agent [-] Port b0db3e99-94f0-42a8-a410-081e33fd0dee in datapath f5a6396a-b7b7-4ff1-a2af-27477fea2815 unbound from our chassis
Sep 30 21:35:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:13.449 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f5a6396a-b7b7-4ff1-a2af-27477fea2815, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:35:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:13.450 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[2400bc17-594a-4cbd-9c33-042d3b5602f9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:13.450 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815 namespace which is not needed anymore
Sep 30 21:35:13 compute-0 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d00000062.scope: Deactivated successfully.
Sep 30 21:35:13 compute-0 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d00000062.scope: Consumed 6.279s CPU time.
Sep 30 21:35:13 compute-0 systemd-machined[152794]: Machine qemu-46-instance-00000062 terminated.
Sep 30 21:35:13 compute-0 neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815[235516]: [NOTICE]   (235520) : haproxy version is 2.8.14-c23fe91
Sep 30 21:35:13 compute-0 neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815[235516]: [NOTICE]   (235520) : path to executable is /usr/sbin/haproxy
Sep 30 21:35:13 compute-0 neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815[235516]: [WARNING]  (235520) : Exiting Master process...
Sep 30 21:35:13 compute-0 neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815[235516]: [ALERT]    (235520) : Current worker (235522) exited with code 143 (Terminated)
Sep 30 21:35:13 compute-0 neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815[235516]: [WARNING]  (235520) : All workers exited. Exiting... (0)
Sep 30 21:35:13 compute-0 systemd[1]: libpod-95a418b8c3ec39cc9389c9bea41d80c6b1bf3292b4623aef8575891cc9260e51.scope: Deactivated successfully.
Sep 30 21:35:13 compute-0 podman[235611]: 2025-09-30 21:35:13.586739741 +0000 UTC m=+0.054348140 container died 95a418b8c3ec39cc9389c9bea41d80c6b1bf3292b4623aef8575891cc9260e51 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Sep 30 21:35:13 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-95a418b8c3ec39cc9389c9bea41d80c6b1bf3292b4623aef8575891cc9260e51-userdata-shm.mount: Deactivated successfully.
Sep 30 21:35:13 compute-0 systemd[1]: var-lib-containers-storage-overlay-5f0cdacc25b9795f90c5d4d2ccce1e275ed4fa10915bd548b66684a420803bd1-merged.mount: Deactivated successfully.
Sep 30 21:35:13 compute-0 podman[235611]: 2025-09-30 21:35:13.629489027 +0000 UTC m=+0.097097426 container cleanup 95a418b8c3ec39cc9389c9bea41d80c6b1bf3292b4623aef8575891cc9260e51 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20250923)
Sep 30 21:35:13 compute-0 systemd[1]: libpod-conmon-95a418b8c3ec39cc9389c9bea41d80c6b1bf3292b4623aef8575891cc9260e51.scope: Deactivated successfully.
Sep 30 21:35:13 compute-0 nova_compute[192810]: 2025-09-30 21:35:13.675 2 INFO nova.virt.libvirt.driver [-] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Instance destroyed successfully.
Sep 30 21:35:13 compute-0 nova_compute[192810]: 2025-09-30 21:35:13.675 2 DEBUG nova.objects.instance [None req-14a472cd-4b6b-45a0-b614-61a2944128ac 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lazy-loading 'numa_topology' on Instance uuid 4255e358-6db2-4947-a3d6-4e045f9235fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:35:13 compute-0 podman[235654]: 2025-09-30 21:35:13.691207688 +0000 UTC m=+0.043573098 container remove 95a418b8c3ec39cc9389c9bea41d80c6b1bf3292b4623aef8575891cc9260e51 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Sep 30 21:35:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:13.696 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[c7e84b12-3067-405b-b31d-c7f0e52bbf95]: (4, ('Tue Sep 30 09:35:13 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815 (95a418b8c3ec39cc9389c9bea41d80c6b1bf3292b4623aef8575891cc9260e51)\n95a418b8c3ec39cc9389c9bea41d80c6b1bf3292b4623aef8575891cc9260e51\nTue Sep 30 09:35:13 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815 (95a418b8c3ec39cc9389c9bea41d80c6b1bf3292b4623aef8575891cc9260e51)\n95a418b8c3ec39cc9389c9bea41d80c6b1bf3292b4623aef8575891cc9260e51\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:13.697 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[9cda66bc-a64f-43cd-a763-7a497ac33bb6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:13.698 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf5a6396a-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:35:13 compute-0 nova_compute[192810]: 2025-09-30 21:35:13.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:13 compute-0 kernel: tapf5a6396a-b0: left promiscuous mode
Sep 30 21:35:13 compute-0 nova_compute[192810]: 2025-09-30 21:35:13.714 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:13.718 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[42fabc7e-5428-4cb4-adf1-37abe49ce07c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:13.752 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[99461e45-c65a-455b-a26e-853cc443c04e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:13.754 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[972ccbe2-a09d-4e93-807f-335cba5ad79d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:13 compute-0 kernel: tapb0db3e99-94: entered promiscuous mode
Sep 30 21:35:13 compute-0 NetworkManager[51733]: <info>  [1759268113.7685] manager: (tapb0db3e99-94): new Tun device (/org/freedesktop/NetworkManager/Devices/175)
Sep 30 21:35:13 compute-0 nova_compute[192810]: 2025-09-30 21:35:13.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:13 compute-0 ovn_controller[94912]: 2025-09-30T21:35:13Z|00373|binding|INFO|Claiming lport b0db3e99-94f0-42a8-a410-081e33fd0dee for this chassis.
Sep 30 21:35:13 compute-0 ovn_controller[94912]: 2025-09-30T21:35:13Z|00374|binding|INFO|b0db3e99-94f0-42a8-a410-081e33fd0dee: Claiming fa:16:3e:03:b5:6d 10.100.0.6
Sep 30 21:35:13 compute-0 systemd-udevd[235590]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:35:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:13.771 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[02e6dbf4-733e-4c6f-bf13-eececde006b7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 480280, 'reachable_time': 31001, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235694, 'error': None, 'target': 'ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:13 compute-0 systemd[1]: run-netns-ovnmeta\x2df5a6396a\x2db7b7\x2d4ff1\x2da2af\x2d27477fea2815.mount: Deactivated successfully.
Sep 30 21:35:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:13.782 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:03:b5:6d 10.100.0.6'], port_security=['fa:16:3e:03:b5:6d 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '4255e358-6db2-4947-a3d6-4e045f9235fb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8978d2df88a5434c8794b659033cca5e', 'neutron:revision_number': '6', 'neutron:security_group_ids': '93b1b45c-82db-437e-88d0-4d5f76771b04', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e6b1a10b-a890-44de-9d6b-4b24b7ba0344, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=b0db3e99-94f0-42a8-a410-081e33fd0dee) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:35:13 compute-0 NetworkManager[51733]: <info>  [1759268113.7851] device (tapb0db3e99-94): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:35:13 compute-0 NetworkManager[51733]: <info>  [1759268113.7860] device (tapb0db3e99-94): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:35:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:13.786 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:35:13 compute-0 ovn_controller[94912]: 2025-09-30T21:35:13Z|00375|binding|INFO|Setting lport b0db3e99-94f0-42a8-a410-081e33fd0dee ovn-installed in OVS
Sep 30 21:35:13 compute-0 ovn_controller[94912]: 2025-09-30T21:35:13Z|00376|binding|INFO|Setting lport b0db3e99-94f0-42a8-a410-081e33fd0dee up in Southbound
Sep 30 21:35:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:13.786 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[0686e9b2-f1d1-400a-b0d1-b82e41cc3cbd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:13.788 103867 INFO neutron.agent.ovn.metadata.agent [-] Port b0db3e99-94f0-42a8-a410-081e33fd0dee in datapath f5a6396a-b7b7-4ff1-a2af-27477fea2815 bound to our chassis
Sep 30 21:35:13 compute-0 nova_compute[192810]: 2025-09-30 21:35:13.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:13 compute-0 nova_compute[192810]: 2025-09-30 21:35:13.792 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:13.791 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f5a6396a-b7b7-4ff1-a2af-27477fea2815
Sep 30 21:35:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:13.802 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[c94ee432-d1c2-4cee-8c75-3188a026e62c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:13.804 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf5a6396a-b1 in ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:35:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:13.806 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf5a6396a-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:35:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:13.806 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[6562b78b-bce9-4f4b-958b-fcea08f69f03]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:13.806 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[850e5107-0b40-4fad-89d5-fd90373f0387]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:13 compute-0 systemd-machined[152794]: New machine qemu-47-instance-00000062.
Sep 30 21:35:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:13.818 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[c5442ef1-5f8d-4186-85ed-d5ac79e556c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:13.832 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[28321bf8-3579-456f-9570-0fa990a49c91]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:13 compute-0 systemd[1]: Started Virtual Machine qemu-47-instance-00000062.
Sep 30 21:35:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:13.876 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[44f4a43b-847d-4d04-8cab-b0c56adb088c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:13 compute-0 NetworkManager[51733]: <info>  [1759268113.8827] manager: (tapf5a6396a-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/176)
Sep 30 21:35:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:13.881 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[ad237df7-6386-4666-a817-82792d09696a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:13.920 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[ac54a595-b74a-46d5-9c94-46b734a3da25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:13.923 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[3c5cb84c-5cb0-4dab-84a5-fd71d065187b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:13 compute-0 NetworkManager[51733]: <info>  [1759268113.9480] device (tapf5a6396a-b0): carrier: link connected
Sep 30 21:35:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:13.954 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[447c8d41-9837-4184-b9b6-590e9df838bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:13.969 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[8bb08de3-5a1c-4683-909d-32feaa6df137]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf5a6396a-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:66:d5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 113], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 480957, 'reachable_time': 18506, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235734, 'error': None, 'target': 'ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:13.985 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[b7fdfd09-9e85-4a9e-b1e2-e8d9dfca2aaf]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe91:66d5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 480957, 'tstamp': 480957}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235735, 'error': None, 'target': 'ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:14 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:14.001 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[d18ba2e2-ab06-4f90-b998-f3df6731c76c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf5a6396a-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:66:d5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 113], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 480957, 'reachable_time': 18506, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 235736, 'error': None, 'target': 'ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:14 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:14.031 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[3a76472a-ac3d-4f0e-a033-3e718974e9c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:14 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:14.084 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[8e0e59aa-ebdf-499b-8f2b-fa04eecd2c76]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:14 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:14.085 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf5a6396a-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:35:14 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:14.085 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:35:14 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:14.086 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf5a6396a-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:35:14 compute-0 nova_compute[192810]: 2025-09-30 21:35:14.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:14 compute-0 NetworkManager[51733]: <info>  [1759268114.0883] manager: (tapf5a6396a-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/177)
Sep 30 21:35:14 compute-0 kernel: tapf5a6396a-b0: entered promiscuous mode
Sep 30 21:35:14 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:14.092 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf5a6396a-b0, col_values=(('external_ids', {'iface-id': '10034ee7-d74d-45f3-b835-201b62e1bcd6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:35:14 compute-0 ovn_controller[94912]: 2025-09-30T21:35:14Z|00377|binding|INFO|Releasing lport 10034ee7-d74d-45f3-b835-201b62e1bcd6 from this chassis (sb_readonly=0)
Sep 30 21:35:14 compute-0 nova_compute[192810]: 2025-09-30 21:35:14.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:14 compute-0 nova_compute[192810]: 2025-09-30 21:35:14.105 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:14 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:14.106 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f5a6396a-b7b7-4ff1-a2af-27477fea2815.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f5a6396a-b7b7-4ff1-a2af-27477fea2815.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:35:14 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:14.107 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[d69b992e-1ceb-4143-ad93-3d087b71d3af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:14 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:14.107 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:35:14 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:35:14 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:35:14 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-f5a6396a-b7b7-4ff1-a2af-27477fea2815
Sep 30 21:35:14 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:35:14 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:35:14 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:35:14 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/f5a6396a-b7b7-4ff1-a2af-27477fea2815.pid.haproxy
Sep 30 21:35:14 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:35:14 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:35:14 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:35:14 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:35:14 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:35:14 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:35:14 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:35:14 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:35:14 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:35:14 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:35:14 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:35:14 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:35:14 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:35:14 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:35:14 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:35:14 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:35:14 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:35:14 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:35:14 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:35:14 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:35:14 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID f5a6396a-b7b7-4ff1-a2af-27477fea2815
Sep 30 21:35:14 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:35:14 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:14.108 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'env', 'PROCESS_TAG=haproxy-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f5a6396a-b7b7-4ff1-a2af-27477fea2815.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:35:14 compute-0 unix_chkpwd[235753]: password check failed for user (root)
Sep 30 21:35:14 compute-0 sshd-session[235684]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=45.81.23.80  user=root
Sep 30 21:35:14 compute-0 nova_compute[192810]: 2025-09-30 21:35:14.315 2 DEBUG nova.compute.manager [req-3769f18e-d706-4880-bf68-085c65850e7a req-fea0bbf4-8583-454a-887b-7502839b9fd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Received event network-vif-unplugged-b0db3e99-94f0-42a8-a410-081e33fd0dee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:35:14 compute-0 nova_compute[192810]: 2025-09-30 21:35:14.315 2 DEBUG oslo_concurrency.lockutils [req-3769f18e-d706-4880-bf68-085c65850e7a req-fea0bbf4-8583-454a-887b-7502839b9fd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "4255e358-6db2-4947-a3d6-4e045f9235fb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:14 compute-0 nova_compute[192810]: 2025-09-30 21:35:14.315 2 DEBUG oslo_concurrency.lockutils [req-3769f18e-d706-4880-bf68-085c65850e7a req-fea0bbf4-8583-454a-887b-7502839b9fd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "4255e358-6db2-4947-a3d6-4e045f9235fb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:14 compute-0 nova_compute[192810]: 2025-09-30 21:35:14.315 2 DEBUG oslo_concurrency.lockutils [req-3769f18e-d706-4880-bf68-085c65850e7a req-fea0bbf4-8583-454a-887b-7502839b9fd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "4255e358-6db2-4947-a3d6-4e045f9235fb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:14 compute-0 nova_compute[192810]: 2025-09-30 21:35:14.316 2 DEBUG nova.compute.manager [req-3769f18e-d706-4880-bf68-085c65850e7a req-fea0bbf4-8583-454a-887b-7502839b9fd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] No waiting events found dispatching network-vif-unplugged-b0db3e99-94f0-42a8-a410-081e33fd0dee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:35:14 compute-0 nova_compute[192810]: 2025-09-30 21:35:14.316 2 WARNING nova.compute.manager [req-3769f18e-d706-4880-bf68-085c65850e7a req-fea0bbf4-8583-454a-887b-7502839b9fd9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Received unexpected event network-vif-unplugged-b0db3e99-94f0-42a8-a410-081e33fd0dee for instance with vm_state rescued and task_state unrescuing.
Sep 30 21:35:14 compute-0 podman[235776]: 2025-09-30 21:35:14.465665466 +0000 UTC m=+0.045657608 container create e6a1859221fb154e59c5d16b5f8786764eebf83d35b259ced80e5ce0e24bd5ad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 21:35:14 compute-0 systemd[1]: Started libpod-conmon-e6a1859221fb154e59c5d16b5f8786764eebf83d35b259ced80e5ce0e24bd5ad.scope.
Sep 30 21:35:14 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:35:14 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f30699ad1b94d270b7acb13d7d964de370b5ee1ea379ad52fff596e94239a7cc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:35:14 compute-0 podman[235776]: 2025-09-30 21:35:14.44209888 +0000 UTC m=+0.022091052 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:35:14 compute-0 podman[235776]: 2025-09-30 21:35:14.541996574 +0000 UTC m=+0.121988746 container init e6a1859221fb154e59c5d16b5f8786764eebf83d35b259ced80e5ce0e24bd5ad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250923)
Sep 30 21:35:14 compute-0 podman[235776]: 2025-09-30 21:35:14.547162901 +0000 UTC m=+0.127155053 container start e6a1859221fb154e59c5d16b5f8786764eebf83d35b259ced80e5ce0e24bd5ad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Sep 30 21:35:14 compute-0 nova_compute[192810]: 2025-09-30 21:35:14.556 2 DEBUG nova.virt.libvirt.host [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Removed pending event for 4255e358-6db2-4947-a3d6-4e045f9235fb due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Sep 30 21:35:14 compute-0 nova_compute[192810]: 2025-09-30 21:35:14.556 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268114.5558445, 4255e358-6db2-4947-a3d6-4e045f9235fb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:35:14 compute-0 nova_compute[192810]: 2025-09-30 21:35:14.556 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] VM Resumed (Lifecycle Event)
Sep 30 21:35:14 compute-0 nova_compute[192810]: 2025-09-30 21:35:14.559 2 DEBUG nova.compute.manager [None req-14a472cd-4b6b-45a0-b614-61a2944128ac 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:35:14 compute-0 neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815[235791]: [NOTICE]   (235795) : New worker (235797) forked
Sep 30 21:35:14 compute-0 neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815[235791]: [NOTICE]   (235795) : Loading success.
Sep 30 21:35:14 compute-0 nova_compute[192810]: 2025-09-30 21:35:14.688 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:35:14 compute-0 nova_compute[192810]: 2025-09-30 21:35:14.691 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:35:14 compute-0 nova_compute[192810]: 2025-09-30 21:35:14.724 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268114.5565312, 4255e358-6db2-4947-a3d6-4e045f9235fb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:35:14 compute-0 nova_compute[192810]: 2025-09-30 21:35:14.724 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] VM Started (Lifecycle Event)
Sep 30 21:35:14 compute-0 nova_compute[192810]: 2025-09-30 21:35:14.760 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:35:14 compute-0 nova_compute[192810]: 2025-09-30 21:35:14.763 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:35:15 compute-0 nova_compute[192810]: 2025-09-30 21:35:15.072 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:16 compute-0 sshd-session[235684]: Failed password for root from 45.81.23.80 port 57568 ssh2
Sep 30 21:35:16 compute-0 nova_compute[192810]: 2025-09-30 21:35:16.409 2 DEBUG nova.compute.manager [req-137a19c8-2ce8-4a4c-abf7-2c07112fec43 req-f21fbb9d-c812-492e-b40f-43ef94345d75 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Received event network-vif-plugged-b0db3e99-94f0-42a8-a410-081e33fd0dee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:35:16 compute-0 nova_compute[192810]: 2025-09-30 21:35:16.410 2 DEBUG oslo_concurrency.lockutils [req-137a19c8-2ce8-4a4c-abf7-2c07112fec43 req-f21fbb9d-c812-492e-b40f-43ef94345d75 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "4255e358-6db2-4947-a3d6-4e045f9235fb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:16 compute-0 nova_compute[192810]: 2025-09-30 21:35:16.410 2 DEBUG oslo_concurrency.lockutils [req-137a19c8-2ce8-4a4c-abf7-2c07112fec43 req-f21fbb9d-c812-492e-b40f-43ef94345d75 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "4255e358-6db2-4947-a3d6-4e045f9235fb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:16 compute-0 nova_compute[192810]: 2025-09-30 21:35:16.411 2 DEBUG oslo_concurrency.lockutils [req-137a19c8-2ce8-4a4c-abf7-2c07112fec43 req-f21fbb9d-c812-492e-b40f-43ef94345d75 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "4255e358-6db2-4947-a3d6-4e045f9235fb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:16 compute-0 nova_compute[192810]: 2025-09-30 21:35:16.411 2 DEBUG nova.compute.manager [req-137a19c8-2ce8-4a4c-abf7-2c07112fec43 req-f21fbb9d-c812-492e-b40f-43ef94345d75 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] No waiting events found dispatching network-vif-plugged-b0db3e99-94f0-42a8-a410-081e33fd0dee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:35:16 compute-0 nova_compute[192810]: 2025-09-30 21:35:16.412 2 WARNING nova.compute.manager [req-137a19c8-2ce8-4a4c-abf7-2c07112fec43 req-f21fbb9d-c812-492e-b40f-43ef94345d75 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Received unexpected event network-vif-plugged-b0db3e99-94f0-42a8-a410-081e33fd0dee for instance with vm_state active and task_state None.
Sep 30 21:35:16 compute-0 nova_compute[192810]: 2025-09-30 21:35:16.412 2 DEBUG nova.compute.manager [req-137a19c8-2ce8-4a4c-abf7-2c07112fec43 req-f21fbb9d-c812-492e-b40f-43ef94345d75 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Received event network-vif-plugged-b0db3e99-94f0-42a8-a410-081e33fd0dee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:35:16 compute-0 nova_compute[192810]: 2025-09-30 21:35:16.413 2 DEBUG oslo_concurrency.lockutils [req-137a19c8-2ce8-4a4c-abf7-2c07112fec43 req-f21fbb9d-c812-492e-b40f-43ef94345d75 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "4255e358-6db2-4947-a3d6-4e045f9235fb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:16 compute-0 nova_compute[192810]: 2025-09-30 21:35:16.413 2 DEBUG oslo_concurrency.lockutils [req-137a19c8-2ce8-4a4c-abf7-2c07112fec43 req-f21fbb9d-c812-492e-b40f-43ef94345d75 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "4255e358-6db2-4947-a3d6-4e045f9235fb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:16 compute-0 nova_compute[192810]: 2025-09-30 21:35:16.414 2 DEBUG oslo_concurrency.lockutils [req-137a19c8-2ce8-4a4c-abf7-2c07112fec43 req-f21fbb9d-c812-492e-b40f-43ef94345d75 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "4255e358-6db2-4947-a3d6-4e045f9235fb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:16 compute-0 nova_compute[192810]: 2025-09-30 21:35:16.414 2 DEBUG nova.compute.manager [req-137a19c8-2ce8-4a4c-abf7-2c07112fec43 req-f21fbb9d-c812-492e-b40f-43ef94345d75 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] No waiting events found dispatching network-vif-plugged-b0db3e99-94f0-42a8-a410-081e33fd0dee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:35:16 compute-0 nova_compute[192810]: 2025-09-30 21:35:16.415 2 WARNING nova.compute.manager [req-137a19c8-2ce8-4a4c-abf7-2c07112fec43 req-f21fbb9d-c812-492e-b40f-43ef94345d75 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Received unexpected event network-vif-plugged-b0db3e99-94f0-42a8-a410-081e33fd0dee for instance with vm_state active and task_state None.
Sep 30 21:35:16 compute-0 nova_compute[192810]: 2025-09-30 21:35:16.415 2 DEBUG nova.compute.manager [req-137a19c8-2ce8-4a4c-abf7-2c07112fec43 req-f21fbb9d-c812-492e-b40f-43ef94345d75 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Received event network-vif-plugged-b0db3e99-94f0-42a8-a410-081e33fd0dee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:35:16 compute-0 nova_compute[192810]: 2025-09-30 21:35:16.416 2 DEBUG oslo_concurrency.lockutils [req-137a19c8-2ce8-4a4c-abf7-2c07112fec43 req-f21fbb9d-c812-492e-b40f-43ef94345d75 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "4255e358-6db2-4947-a3d6-4e045f9235fb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:16 compute-0 nova_compute[192810]: 2025-09-30 21:35:16.416 2 DEBUG oslo_concurrency.lockutils [req-137a19c8-2ce8-4a4c-abf7-2c07112fec43 req-f21fbb9d-c812-492e-b40f-43ef94345d75 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "4255e358-6db2-4947-a3d6-4e045f9235fb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:16 compute-0 nova_compute[192810]: 2025-09-30 21:35:16.417 2 DEBUG oslo_concurrency.lockutils [req-137a19c8-2ce8-4a4c-abf7-2c07112fec43 req-f21fbb9d-c812-492e-b40f-43ef94345d75 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "4255e358-6db2-4947-a3d6-4e045f9235fb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:16 compute-0 nova_compute[192810]: 2025-09-30 21:35:16.417 2 DEBUG nova.compute.manager [req-137a19c8-2ce8-4a4c-abf7-2c07112fec43 req-f21fbb9d-c812-492e-b40f-43ef94345d75 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] No waiting events found dispatching network-vif-plugged-b0db3e99-94f0-42a8-a410-081e33fd0dee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:35:16 compute-0 nova_compute[192810]: 2025-09-30 21:35:16.418 2 WARNING nova.compute.manager [req-137a19c8-2ce8-4a4c-abf7-2c07112fec43 req-f21fbb9d-c812-492e-b40f-43ef94345d75 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Received unexpected event network-vif-plugged-b0db3e99-94f0-42a8-a410-081e33fd0dee for instance with vm_state active and task_state None.
Sep 30 21:35:17 compute-0 nova_compute[192810]: 2025-09-30 21:35:17.592 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:18 compute-0 sshd-session[235684]: Received disconnect from 45.81.23.80 port 57568:11: Bye Bye [preauth]
Sep 30 21:35:18 compute-0 sshd-session[235684]: Disconnected from authenticating user root 45.81.23.80 port 57568 [preauth]
Sep 30 21:35:20 compute-0 nova_compute[192810]: 2025-09-30 21:35:20.076 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:20 compute-0 nova_compute[192810]: 2025-09-30 21:35:20.341 2 DEBUG oslo_concurrency.lockutils [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquiring lock "128d5729-9e60-43d9-b1a4-fa8fb6e0e619" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:20 compute-0 nova_compute[192810]: 2025-09-30 21:35:20.341 2 DEBUG oslo_concurrency.lockutils [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "128d5729-9e60-43d9-b1a4-fa8fb6e0e619" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:20 compute-0 nova_compute[192810]: 2025-09-30 21:35:20.385 2 DEBUG nova.compute.manager [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:35:20 compute-0 nova_compute[192810]: 2025-09-30 21:35:20.547 2 DEBUG oslo_concurrency.lockutils [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:20 compute-0 nova_compute[192810]: 2025-09-30 21:35:20.548 2 DEBUG oslo_concurrency.lockutils [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:20 compute-0 nova_compute[192810]: 2025-09-30 21:35:20.556 2 DEBUG nova.virt.hardware [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:35:20 compute-0 nova_compute[192810]: 2025-09-30 21:35:20.557 2 INFO nova.compute.claims [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:35:20 compute-0 nova_compute[192810]: 2025-09-30 21:35:20.704 2 DEBUG nova.compute.provider_tree [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:35:20 compute-0 nova_compute[192810]: 2025-09-30 21:35:20.722 2 DEBUG nova.scheduler.client.report [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:35:20 compute-0 nova_compute[192810]: 2025-09-30 21:35:20.747 2 DEBUG oslo_concurrency.lockutils [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.199s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:20 compute-0 nova_compute[192810]: 2025-09-30 21:35:20.747 2 DEBUG nova.compute.manager [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:35:20 compute-0 nova_compute[192810]: 2025-09-30 21:35:20.838 2 DEBUG nova.compute.manager [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:35:20 compute-0 nova_compute[192810]: 2025-09-30 21:35:20.838 2 DEBUG nova.network.neutron [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:35:20 compute-0 nova_compute[192810]: 2025-09-30 21:35:20.877 2 INFO nova.virt.libvirt.driver [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:35:20 compute-0 nova_compute[192810]: 2025-09-30 21:35:20.915 2 DEBUG nova.compute.manager [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:35:21 compute-0 nova_compute[192810]: 2025-09-30 21:35:21.135 2 DEBUG nova.compute.manager [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:35:21 compute-0 nova_compute[192810]: 2025-09-30 21:35:21.137 2 DEBUG nova.virt.libvirt.driver [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:35:21 compute-0 nova_compute[192810]: 2025-09-30 21:35:21.137 2 INFO nova.virt.libvirt.driver [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Creating image(s)
Sep 30 21:35:21 compute-0 nova_compute[192810]: 2025-09-30 21:35:21.138 2 DEBUG oslo_concurrency.lockutils [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquiring lock "/var/lib/nova/instances/128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:21 compute-0 nova_compute[192810]: 2025-09-30 21:35:21.138 2 DEBUG oslo_concurrency.lockutils [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "/var/lib/nova/instances/128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:21 compute-0 nova_compute[192810]: 2025-09-30 21:35:21.140 2 DEBUG oslo_concurrency.lockutils [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "/var/lib/nova/instances/128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:21 compute-0 nova_compute[192810]: 2025-09-30 21:35:21.157 2 DEBUG oslo_concurrency.processutils [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:35:21 compute-0 nova_compute[192810]: 2025-09-30 21:35:21.228 2 DEBUG nova.policy [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:35:21 compute-0 nova_compute[192810]: 2025-09-30 21:35:21.232 2 DEBUG oslo_concurrency.processutils [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:35:21 compute-0 nova_compute[192810]: 2025-09-30 21:35:21.232 2 DEBUG oslo_concurrency.lockutils [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:21 compute-0 nova_compute[192810]: 2025-09-30 21:35:21.233 2 DEBUG oslo_concurrency.lockutils [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:21 compute-0 nova_compute[192810]: 2025-09-30 21:35:21.250 2 DEBUG oslo_concurrency.processutils [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:35:21 compute-0 nova_compute[192810]: 2025-09-30 21:35:21.313 2 DEBUG oslo_concurrency.processutils [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:35:21 compute-0 nova_compute[192810]: 2025-09-30 21:35:21.314 2 DEBUG oslo_concurrency.processutils [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:35:21 compute-0 nova_compute[192810]: 2025-09-30 21:35:21.347 2 DEBUG oslo_concurrency.processutils [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:35:21 compute-0 nova_compute[192810]: 2025-09-30 21:35:21.348 2 DEBUG oslo_concurrency.lockutils [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:21 compute-0 nova_compute[192810]: 2025-09-30 21:35:21.348 2 DEBUG oslo_concurrency.processutils [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:35:21 compute-0 nova_compute[192810]: 2025-09-30 21:35:21.400 2 DEBUG oslo_concurrency.processutils [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:35:21 compute-0 nova_compute[192810]: 2025-09-30 21:35:21.402 2 DEBUG nova.virt.disk.api [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Checking if we can resize image /var/lib/nova/instances/128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:35:21 compute-0 nova_compute[192810]: 2025-09-30 21:35:21.402 2 DEBUG oslo_concurrency.processutils [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:35:21 compute-0 nova_compute[192810]: 2025-09-30 21:35:21.461 2 DEBUG oslo_concurrency.processutils [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:35:21 compute-0 nova_compute[192810]: 2025-09-30 21:35:21.462 2 DEBUG nova.virt.disk.api [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Cannot resize image /var/lib/nova/instances/128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:35:21 compute-0 nova_compute[192810]: 2025-09-30 21:35:21.463 2 DEBUG nova.objects.instance [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lazy-loading 'migration_context' on Instance uuid 128d5729-9e60-43d9-b1a4-fa8fb6e0e619 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:35:21 compute-0 nova_compute[192810]: 2025-09-30 21:35:21.486 2 DEBUG nova.virt.libvirt.driver [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:35:21 compute-0 nova_compute[192810]: 2025-09-30 21:35:21.486 2 DEBUG nova.virt.libvirt.driver [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Ensure instance console log exists: /var/lib/nova/instances/128d5729-9e60-43d9-b1a4-fa8fb6e0e619/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:35:21 compute-0 nova_compute[192810]: 2025-09-30 21:35:21.487 2 DEBUG oslo_concurrency.lockutils [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:21 compute-0 nova_compute[192810]: 2025-09-30 21:35:21.487 2 DEBUG oslo_concurrency.lockutils [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:21 compute-0 nova_compute[192810]: 2025-09-30 21:35:21.488 2 DEBUG oslo_concurrency.lockutils [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:22.098 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:35:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:22.100 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:35:22 compute-0 nova_compute[192810]: 2025-09-30 21:35:22.099 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:22 compute-0 nova_compute[192810]: 2025-09-30 21:35:22.368 2 DEBUG nova.network.neutron [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Successfully created port: b275e55b-7c62-4b31-a65b-129241b9139d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:35:22 compute-0 nova_compute[192810]: 2025-09-30 21:35:22.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:24 compute-0 nova_compute[192810]: 2025-09-30 21:35:24.014 2 DEBUG nova.network.neutron [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Successfully updated port: b275e55b-7c62-4b31-a65b-129241b9139d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:35:24 compute-0 nova_compute[192810]: 2025-09-30 21:35:24.031 2 DEBUG oslo_concurrency.lockutils [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquiring lock "refresh_cache-128d5729-9e60-43d9-b1a4-fa8fb6e0e619" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:35:24 compute-0 nova_compute[192810]: 2025-09-30 21:35:24.031 2 DEBUG oslo_concurrency.lockutils [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquired lock "refresh_cache-128d5729-9e60-43d9-b1a4-fa8fb6e0e619" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:35:24 compute-0 nova_compute[192810]: 2025-09-30 21:35:24.032 2 DEBUG nova.network.neutron [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:35:24 compute-0 nova_compute[192810]: 2025-09-30 21:35:24.143 2 DEBUG nova.compute.manager [req-0dcd19de-d2e6-4a8d-afe8-cf9213d98b69 req-22869ad2-de7d-4627-8428-8d357336e58d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Received event network-changed-b275e55b-7c62-4b31-a65b-129241b9139d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:35:24 compute-0 nova_compute[192810]: 2025-09-30 21:35:24.143 2 DEBUG nova.compute.manager [req-0dcd19de-d2e6-4a8d-afe8-cf9213d98b69 req-22869ad2-de7d-4627-8428-8d357336e58d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Refreshing instance network info cache due to event network-changed-b275e55b-7c62-4b31-a65b-129241b9139d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:35:24 compute-0 nova_compute[192810]: 2025-09-30 21:35:24.143 2 DEBUG oslo_concurrency.lockutils [req-0dcd19de-d2e6-4a8d-afe8-cf9213d98b69 req-22869ad2-de7d-4627-8428-8d357336e58d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-128d5729-9e60-43d9-b1a4-fa8fb6e0e619" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:35:24 compute-0 nova_compute[192810]: 2025-09-30 21:35:24.218 2 DEBUG nova.network.neutron [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:35:24 compute-0 podman[235824]: 2025-09-30 21:35:24.33533884 +0000 UTC m=+0.065424662 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2)
Sep 30 21:35:24 compute-0 podman[235823]: 2025-09-30 21:35:24.368593644 +0000 UTC m=+0.094428842 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 21:35:24 compute-0 podman[235861]: 2025-09-30 21:35:24.439487079 +0000 UTC m=+0.081807613 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.078 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.160 2 DEBUG nova.network.neutron [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Updating instance_info_cache with network_info: [{"id": "b275e55b-7c62-4b31-a65b-129241b9139d", "address": "fa:16:3e:86:d7:f5", "network": {"id": "f5a6396a-b7b7-4ff1-a2af-27477fea2815", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8978d2df88a5434c8794b659033cca5e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb275e55b-7c", "ovs_interfaceid": "b275e55b-7c62-4b31-a65b-129241b9139d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.204 2 DEBUG oslo_concurrency.lockutils [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Releasing lock "refresh_cache-128d5729-9e60-43d9-b1a4-fa8fb6e0e619" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.205 2 DEBUG nova.compute.manager [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Instance network_info: |[{"id": "b275e55b-7c62-4b31-a65b-129241b9139d", "address": "fa:16:3e:86:d7:f5", "network": {"id": "f5a6396a-b7b7-4ff1-a2af-27477fea2815", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8978d2df88a5434c8794b659033cca5e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb275e55b-7c", "ovs_interfaceid": "b275e55b-7c62-4b31-a65b-129241b9139d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.205 2 DEBUG oslo_concurrency.lockutils [req-0dcd19de-d2e6-4a8d-afe8-cf9213d98b69 req-22869ad2-de7d-4627-8428-8d357336e58d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-128d5729-9e60-43d9-b1a4-fa8fb6e0e619" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.205 2 DEBUG nova.network.neutron [req-0dcd19de-d2e6-4a8d-afe8-cf9213d98b69 req-22869ad2-de7d-4627-8428-8d357336e58d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Refreshing network info cache for port b275e55b-7c62-4b31-a65b-129241b9139d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.208 2 DEBUG nova.virt.libvirt.driver [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Start _get_guest_xml network_info=[{"id": "b275e55b-7c62-4b31-a65b-129241b9139d", "address": "fa:16:3e:86:d7:f5", "network": {"id": "f5a6396a-b7b7-4ff1-a2af-27477fea2815", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8978d2df88a5434c8794b659033cca5e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb275e55b-7c", "ovs_interfaceid": "b275e55b-7c62-4b31-a65b-129241b9139d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.212 2 WARNING nova.virt.libvirt.driver [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.217 2 DEBUG nova.virt.libvirt.host [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.218 2 DEBUG nova.virt.libvirt.host [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.221 2 DEBUG nova.virt.libvirt.host [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.221 2 DEBUG nova.virt.libvirt.host [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.222 2 DEBUG nova.virt.libvirt.driver [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.222 2 DEBUG nova.virt.hardware [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.223 2 DEBUG nova.virt.hardware [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.223 2 DEBUG nova.virt.hardware [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.223 2 DEBUG nova.virt.hardware [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.223 2 DEBUG nova.virt.hardware [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.224 2 DEBUG nova.virt.hardware [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.224 2 DEBUG nova.virt.hardware [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.224 2 DEBUG nova.virt.hardware [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.224 2 DEBUG nova.virt.hardware [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.224 2 DEBUG nova.virt.hardware [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.225 2 DEBUG nova.virt.hardware [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.228 2 DEBUG nova.virt.libvirt.vif [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:35:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-522911333',display_name='tempest-ServerStableDeviceRescueTest-server-522911333',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-522911333',id=102,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8978d2df88a5434c8794b659033cca5e',ramdisk_id='',reservation_id='r-glkb8fbj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-1939201844',owner_user_name='tempest-ServerStableDeviceRescueTest-1939201844-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:35:20Z,user_data=None,user_id='8b1ebef014c145cbbe1e367bfd2c2ba3',uuid=128d5729-9e60-43d9-b1a4-fa8fb6e0e619,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b275e55b-7c62-4b31-a65b-129241b9139d", "address": "fa:16:3e:86:d7:f5", "network": {"id": "f5a6396a-b7b7-4ff1-a2af-27477fea2815", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8978d2df88a5434c8794b659033cca5e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb275e55b-7c", "ovs_interfaceid": "b275e55b-7c62-4b31-a65b-129241b9139d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.228 2 DEBUG nova.network.os_vif_util [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Converting VIF {"id": "b275e55b-7c62-4b31-a65b-129241b9139d", "address": "fa:16:3e:86:d7:f5", "network": {"id": "f5a6396a-b7b7-4ff1-a2af-27477fea2815", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8978d2df88a5434c8794b659033cca5e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb275e55b-7c", "ovs_interfaceid": "b275e55b-7c62-4b31-a65b-129241b9139d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.229 2 DEBUG nova.network.os_vif_util [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:86:d7:f5,bridge_name='br-int',has_traffic_filtering=True,id=b275e55b-7c62-4b31-a65b-129241b9139d,network=Network(f5a6396a-b7b7-4ff1-a2af-27477fea2815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb275e55b-7c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.230 2 DEBUG nova.objects.instance [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lazy-loading 'pci_devices' on Instance uuid 128d5729-9e60-43d9-b1a4-fa8fb6e0e619 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.247 2 DEBUG nova.virt.libvirt.driver [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:35:25 compute-0 nova_compute[192810]:   <uuid>128d5729-9e60-43d9-b1a4-fa8fb6e0e619</uuid>
Sep 30 21:35:25 compute-0 nova_compute[192810]:   <name>instance-00000066</name>
Sep 30 21:35:25 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:35:25 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:35:25 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:35:25 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:       <nova:name>tempest-ServerStableDeviceRescueTest-server-522911333</nova:name>
Sep 30 21:35:25 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:35:25</nova:creationTime>
Sep 30 21:35:25 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:35:25 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:35:25 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:35:25 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:35:25 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:35:25 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:35:25 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:35:25 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:35:25 compute-0 nova_compute[192810]:         <nova:user uuid="8b1ebef014c145cbbe1e367bfd2c2ba3">tempest-ServerStableDeviceRescueTest-1939201844-project-member</nova:user>
Sep 30 21:35:25 compute-0 nova_compute[192810]:         <nova:project uuid="8978d2df88a5434c8794b659033cca5e">tempest-ServerStableDeviceRescueTest-1939201844</nova:project>
Sep 30 21:35:25 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:35:25 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:35:25 compute-0 nova_compute[192810]:         <nova:port uuid="b275e55b-7c62-4b31-a65b-129241b9139d">
Sep 30 21:35:25 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:35:25 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:35:25 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:35:25 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:35:25 compute-0 nova_compute[192810]:     <system>
Sep 30 21:35:25 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:35:25 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:35:25 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:35:25 compute-0 nova_compute[192810]:       <entry name="serial">128d5729-9e60-43d9-b1a4-fa8fb6e0e619</entry>
Sep 30 21:35:25 compute-0 nova_compute[192810]:       <entry name="uuid">128d5729-9e60-43d9-b1a4-fa8fb6e0e619</entry>
Sep 30 21:35:25 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     </system>
Sep 30 21:35:25 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:35:25 compute-0 nova_compute[192810]:   <os>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:   </os>
Sep 30 21:35:25 compute-0 nova_compute[192810]:   <features>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:   </features>
Sep 30 21:35:25 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:35:25 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:35:25 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:35:25 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:35:25 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:35:25 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:35:25 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk.config"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:35:25 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:86:d7:f5"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:       <target dev="tapb275e55b-7c"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:35:25 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/128d5729-9e60-43d9-b1a4-fa8fb6e0e619/console.log" append="off"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     <video>
Sep 30 21:35:25 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     </video>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:35:25 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:35:25 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:35:25 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:35:25 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:35:25 compute-0 nova_compute[192810]: </domain>
Sep 30 21:35:25 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.248 2 DEBUG nova.compute.manager [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Preparing to wait for external event network-vif-plugged-b275e55b-7c62-4b31-a65b-129241b9139d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.248 2 DEBUG oslo_concurrency.lockutils [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquiring lock "128d5729-9e60-43d9-b1a4-fa8fb6e0e619-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.248 2 DEBUG oslo_concurrency.lockutils [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "128d5729-9e60-43d9-b1a4-fa8fb6e0e619-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.249 2 DEBUG oslo_concurrency.lockutils [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "128d5729-9e60-43d9-b1a4-fa8fb6e0e619-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.249 2 DEBUG nova.virt.libvirt.vif [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:35:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-522911333',display_name='tempest-ServerStableDeviceRescueTest-server-522911333',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-522911333',id=102,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8978d2df88a5434c8794b659033cca5e',ramdisk_id='',reservation_id='r-glkb8fbj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-1939201844',owner_user_name='tempest-ServerStableDeviceRescueTest-1939201844-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:35:20Z,user_data=None,user_id='8b1ebef014c145cbbe1e367bfd2c2ba3',uuid=128d5729-9e60-43d9-b1a4-fa8fb6e0e619,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b275e55b-7c62-4b31-a65b-129241b9139d", "address": "fa:16:3e:86:d7:f5", "network": {"id": "f5a6396a-b7b7-4ff1-a2af-27477fea2815", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8978d2df88a5434c8794b659033cca5e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb275e55b-7c", "ovs_interfaceid": "b275e55b-7c62-4b31-a65b-129241b9139d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.249 2 DEBUG nova.network.os_vif_util [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Converting VIF {"id": "b275e55b-7c62-4b31-a65b-129241b9139d", "address": "fa:16:3e:86:d7:f5", "network": {"id": "f5a6396a-b7b7-4ff1-a2af-27477fea2815", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8978d2df88a5434c8794b659033cca5e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb275e55b-7c", "ovs_interfaceid": "b275e55b-7c62-4b31-a65b-129241b9139d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.250 2 DEBUG nova.network.os_vif_util [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:86:d7:f5,bridge_name='br-int',has_traffic_filtering=True,id=b275e55b-7c62-4b31-a65b-129241b9139d,network=Network(f5a6396a-b7b7-4ff1-a2af-27477fea2815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb275e55b-7c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.250 2 DEBUG os_vif [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:86:d7:f5,bridge_name='br-int',has_traffic_filtering=True,id=b275e55b-7c62-4b31-a65b-129241b9139d,network=Network(f5a6396a-b7b7-4ff1-a2af-27477fea2815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb275e55b-7c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.251 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.251 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.255 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb275e55b-7c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.255 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb275e55b-7c, col_values=(('external_ids', {'iface-id': 'b275e55b-7c62-4b31-a65b-129241b9139d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:86:d7:f5', 'vm-uuid': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:25 compute-0 NetworkManager[51733]: <info>  [1759268125.2579] manager: (tapb275e55b-7c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/178)
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.265 2 INFO os_vif [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:86:d7:f5,bridge_name='br-int',has_traffic_filtering=True,id=b275e55b-7c62-4b31-a65b-129241b9139d,network=Network(f5a6396a-b7b7-4ff1-a2af-27477fea2815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb275e55b-7c')
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.332 2 DEBUG nova.virt.libvirt.driver [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.333 2 DEBUG nova.virt.libvirt.driver [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.333 2 DEBUG nova.virt.libvirt.driver [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] No VIF found with MAC fa:16:3e:86:d7:f5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.334 2 INFO nova.virt.libvirt.driver [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Using config drive
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.768 2 INFO nova.virt.libvirt.driver [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Creating config drive at /var/lib/nova/instances/128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk.config
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.772 2 DEBUG oslo_concurrency.processutils [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjerfpfuw execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.896 2 DEBUG oslo_concurrency.processutils [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjerfpfuw" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:35:25 compute-0 kernel: tapb275e55b-7c: entered promiscuous mode
Sep 30 21:35:25 compute-0 NetworkManager[51733]: <info>  [1759268125.9959] manager: (tapb275e55b-7c): new Tun device (/org/freedesktop/NetworkManager/Devices/179)
Sep 30 21:35:25 compute-0 nova_compute[192810]: 2025-09-30 21:35:25.998 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:25 compute-0 ovn_controller[94912]: 2025-09-30T21:35:25Z|00378|binding|INFO|Claiming lport b275e55b-7c62-4b31-a65b-129241b9139d for this chassis.
Sep 30 21:35:26 compute-0 ovn_controller[94912]: 2025-09-30T21:35:26Z|00379|binding|INFO|b275e55b-7c62-4b31-a65b-129241b9139d: Claiming fa:16:3e:86:d7:f5 10.100.0.10
Sep 30 21:35:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:26.013 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:86:d7:f5 10.100.0.10'], port_security=['fa:16:3e:86:d7:f5 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8978d2df88a5434c8794b659033cca5e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '93b1b45c-82db-437e-88d0-4d5f76771b04', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e6b1a10b-a890-44de-9d6b-4b24b7ba0344, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=b275e55b-7c62-4b31-a65b-129241b9139d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:35:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:26.014 103867 INFO neutron.agent.ovn.metadata.agent [-] Port b275e55b-7c62-4b31-a65b-129241b9139d in datapath f5a6396a-b7b7-4ff1-a2af-27477fea2815 bound to our chassis
Sep 30 21:35:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:26.016 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f5a6396a-b7b7-4ff1-a2af-27477fea2815
Sep 30 21:35:26 compute-0 ovn_controller[94912]: 2025-09-30T21:35:26Z|00380|binding|INFO|Setting lport b275e55b-7c62-4b31-a65b-129241b9139d ovn-installed in OVS
Sep 30 21:35:26 compute-0 ovn_controller[94912]: 2025-09-30T21:35:26Z|00381|binding|INFO|Setting lport b275e55b-7c62-4b31-a65b-129241b9139d up in Southbound
Sep 30 21:35:26 compute-0 nova_compute[192810]: 2025-09-30 21:35:26.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:26 compute-0 nova_compute[192810]: 2025-09-30 21:35:26.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:26.042 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[12c669cc-2617-4f3b-9e85-b430ff8d1076]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:26 compute-0 systemd-udevd[235918]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:35:26 compute-0 NetworkManager[51733]: <info>  [1759268126.0692] device (tapb275e55b-7c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:35:26 compute-0 NetworkManager[51733]: <info>  [1759268126.0703] device (tapb275e55b-7c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:35:26 compute-0 systemd-machined[152794]: New machine qemu-48-instance-00000066.
Sep 30 21:35:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:26.091 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[19f22428-1e01-4e4b-930d-2d90c53317e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:26.096 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[c48dcc94-a652-47d2-9f9d-6a3a4265d1f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:26 compute-0 systemd[1]: Started Virtual Machine qemu-48-instance-00000066.
Sep 30 21:35:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:26.136 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[7ad991d4-5a84-431e-9c5d-775c87ad6418]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:26.167 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[03aeff4c-a3c6-468c-b0ae-c25192aacbdb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf5a6396a-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:66:d5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 832, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 832, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 113], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 480957, 'reachable_time': 18506, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235926, 'error': None, 'target': 'ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:26.192 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[af01415c-472c-4b34-9e4f-31d147eb3e33]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf5a6396a-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 480967, 'tstamp': 480967}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235932, 'error': None, 'target': 'ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf5a6396a-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 480970, 'tstamp': 480970}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 235932, 'error': None, 'target': 'ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:26.193 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf5a6396a-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:35:26 compute-0 nova_compute[192810]: 2025-09-30 21:35:26.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:26 compute-0 nova_compute[192810]: 2025-09-30 21:35:26.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:26.196 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf5a6396a-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:35:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:26.196 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:35:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:26.197 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf5a6396a-b0, col_values=(('external_ids', {'iface-id': '10034ee7-d74d-45f3-b835-201b62e1bcd6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:35:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:26.197 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:35:26 compute-0 nova_compute[192810]: 2025-09-30 21:35:26.556 2 DEBUG nova.network.neutron [req-0dcd19de-d2e6-4a8d-afe8-cf9213d98b69 req-22869ad2-de7d-4627-8428-8d357336e58d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Updated VIF entry in instance network info cache for port b275e55b-7c62-4b31-a65b-129241b9139d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:35:26 compute-0 nova_compute[192810]: 2025-09-30 21:35:26.557 2 DEBUG nova.network.neutron [req-0dcd19de-d2e6-4a8d-afe8-cf9213d98b69 req-22869ad2-de7d-4627-8428-8d357336e58d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Updating instance_info_cache with network_info: [{"id": "b275e55b-7c62-4b31-a65b-129241b9139d", "address": "fa:16:3e:86:d7:f5", "network": {"id": "f5a6396a-b7b7-4ff1-a2af-27477fea2815", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8978d2df88a5434c8794b659033cca5e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb275e55b-7c", "ovs_interfaceid": "b275e55b-7c62-4b31-a65b-129241b9139d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:35:26 compute-0 nova_compute[192810]: 2025-09-30 21:35:26.578 2 DEBUG oslo_concurrency.lockutils [req-0dcd19de-d2e6-4a8d-afe8-cf9213d98b69 req-22869ad2-de7d-4627-8428-8d357336e58d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-128d5729-9e60-43d9-b1a4-fa8fb6e0e619" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:35:26 compute-0 ovn_controller[94912]: 2025-09-30T21:35:26Z|00382|binding|INFO|Releasing lport 10034ee7-d74d-45f3-b835-201b62e1bcd6 from this chassis (sb_readonly=0)
Sep 30 21:35:26 compute-0 ovn_controller[94912]: 2025-09-30T21:35:26Z|00383|binding|INFO|Releasing lport 3996e682-c20c-41c5-9547-9688a18f316c from this chassis (sb_readonly=0)
Sep 30 21:35:26 compute-0 nova_compute[192810]: 2025-09-30 21:35:26.823 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268126.8228748, 128d5729-9e60-43d9-b1a4-fa8fb6e0e619 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:35:26 compute-0 nova_compute[192810]: 2025-09-30 21:35:26.824 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] VM Started (Lifecycle Event)
Sep 30 21:35:26 compute-0 nova_compute[192810]: 2025-09-30 21:35:26.851 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:35:26 compute-0 nova_compute[192810]: 2025-09-30 21:35:26.871 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:26 compute-0 nova_compute[192810]: 2025-09-30 21:35:26.873 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268126.8253884, 128d5729-9e60-43d9-b1a4-fa8fb6e0e619 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:35:26 compute-0 nova_compute[192810]: 2025-09-30 21:35:26.873 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] VM Paused (Lifecycle Event)
Sep 30 21:35:26 compute-0 nova_compute[192810]: 2025-09-30 21:35:26.894 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:35:26 compute-0 nova_compute[192810]: 2025-09-30 21:35:26.897 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:35:26 compute-0 nova_compute[192810]: 2025-09-30 21:35:26.925 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:35:26 compute-0 ovn_controller[94912]: 2025-09-30T21:35:26Z|00384|binding|INFO|Releasing lport 10034ee7-d74d-45f3-b835-201b62e1bcd6 from this chassis (sb_readonly=0)
Sep 30 21:35:26 compute-0 ovn_controller[94912]: 2025-09-30T21:35:26Z|00385|binding|INFO|Releasing lport 3996e682-c20c-41c5-9547-9688a18f316c from this chassis (sb_readonly=0)
Sep 30 21:35:26 compute-0 nova_compute[192810]: 2025-09-30 21:35:26.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:27 compute-0 ovn_controller[94912]: 2025-09-30T21:35:27Z|00040|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:03:b5:6d 10.100.0.6
Sep 30 21:35:27 compute-0 ovn_controller[94912]: 2025-09-30T21:35:27Z|00041|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:03:b5:6d 10.100.0.6
Sep 30 21:35:27 compute-0 nova_compute[192810]: 2025-09-30 21:35:27.574 2 DEBUG nova.compute.manager [req-e8840a95-4511-4ae3-ab06-3a77d1e1154d req-d8ceb576-1f47-4162-9df4-8780db45ce10 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Received event network-vif-plugged-b275e55b-7c62-4b31-a65b-129241b9139d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:35:27 compute-0 nova_compute[192810]: 2025-09-30 21:35:27.574 2 DEBUG oslo_concurrency.lockutils [req-e8840a95-4511-4ae3-ab06-3a77d1e1154d req-d8ceb576-1f47-4162-9df4-8780db45ce10 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "128d5729-9e60-43d9-b1a4-fa8fb6e0e619-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:27 compute-0 nova_compute[192810]: 2025-09-30 21:35:27.574 2 DEBUG oslo_concurrency.lockutils [req-e8840a95-4511-4ae3-ab06-3a77d1e1154d req-d8ceb576-1f47-4162-9df4-8780db45ce10 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128d5729-9e60-43d9-b1a4-fa8fb6e0e619-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:27 compute-0 nova_compute[192810]: 2025-09-30 21:35:27.574 2 DEBUG oslo_concurrency.lockutils [req-e8840a95-4511-4ae3-ab06-3a77d1e1154d req-d8ceb576-1f47-4162-9df4-8780db45ce10 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128d5729-9e60-43d9-b1a4-fa8fb6e0e619-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:27 compute-0 nova_compute[192810]: 2025-09-30 21:35:27.575 2 DEBUG nova.compute.manager [req-e8840a95-4511-4ae3-ab06-3a77d1e1154d req-d8ceb576-1f47-4162-9df4-8780db45ce10 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Processing event network-vif-plugged-b275e55b-7c62-4b31-a65b-129241b9139d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:35:27 compute-0 nova_compute[192810]: 2025-09-30 21:35:27.575 2 DEBUG nova.compute.manager [req-e8840a95-4511-4ae3-ab06-3a77d1e1154d req-d8ceb576-1f47-4162-9df4-8780db45ce10 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Received event network-vif-plugged-b275e55b-7c62-4b31-a65b-129241b9139d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:35:27 compute-0 nova_compute[192810]: 2025-09-30 21:35:27.575 2 DEBUG oslo_concurrency.lockutils [req-e8840a95-4511-4ae3-ab06-3a77d1e1154d req-d8ceb576-1f47-4162-9df4-8780db45ce10 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "128d5729-9e60-43d9-b1a4-fa8fb6e0e619-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:27 compute-0 nova_compute[192810]: 2025-09-30 21:35:27.575 2 DEBUG oslo_concurrency.lockutils [req-e8840a95-4511-4ae3-ab06-3a77d1e1154d req-d8ceb576-1f47-4162-9df4-8780db45ce10 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128d5729-9e60-43d9-b1a4-fa8fb6e0e619-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:27 compute-0 nova_compute[192810]: 2025-09-30 21:35:27.575 2 DEBUG oslo_concurrency.lockutils [req-e8840a95-4511-4ae3-ab06-3a77d1e1154d req-d8ceb576-1f47-4162-9df4-8780db45ce10 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128d5729-9e60-43d9-b1a4-fa8fb6e0e619-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:27 compute-0 nova_compute[192810]: 2025-09-30 21:35:27.576 2 DEBUG nova.compute.manager [req-e8840a95-4511-4ae3-ab06-3a77d1e1154d req-d8ceb576-1f47-4162-9df4-8780db45ce10 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] No waiting events found dispatching network-vif-plugged-b275e55b-7c62-4b31-a65b-129241b9139d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:35:27 compute-0 nova_compute[192810]: 2025-09-30 21:35:27.576 2 WARNING nova.compute.manager [req-e8840a95-4511-4ae3-ab06-3a77d1e1154d req-d8ceb576-1f47-4162-9df4-8780db45ce10 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Received unexpected event network-vif-plugged-b275e55b-7c62-4b31-a65b-129241b9139d for instance with vm_state building and task_state spawning.
Sep 30 21:35:27 compute-0 nova_compute[192810]: 2025-09-30 21:35:27.576 2 DEBUG nova.compute.manager [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:35:27 compute-0 nova_compute[192810]: 2025-09-30 21:35:27.579 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268127.5796244, 128d5729-9e60-43d9-b1a4-fa8fb6e0e619 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:35:27 compute-0 nova_compute[192810]: 2025-09-30 21:35:27.580 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] VM Resumed (Lifecycle Event)
Sep 30 21:35:27 compute-0 nova_compute[192810]: 2025-09-30 21:35:27.581 2 DEBUG nova.virt.libvirt.driver [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:35:27 compute-0 nova_compute[192810]: 2025-09-30 21:35:27.584 2 INFO nova.virt.libvirt.driver [-] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Instance spawned successfully.
Sep 30 21:35:27 compute-0 nova_compute[192810]: 2025-09-30 21:35:27.585 2 DEBUG nova.virt.libvirt.driver [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:35:27 compute-0 nova_compute[192810]: 2025-09-30 21:35:27.597 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:27 compute-0 nova_compute[192810]: 2025-09-30 21:35:27.605 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:35:27 compute-0 nova_compute[192810]: 2025-09-30 21:35:27.612 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:35:27 compute-0 nova_compute[192810]: 2025-09-30 21:35:27.618 2 DEBUG nova.virt.libvirt.driver [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:35:27 compute-0 nova_compute[192810]: 2025-09-30 21:35:27.618 2 DEBUG nova.virt.libvirt.driver [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:35:27 compute-0 nova_compute[192810]: 2025-09-30 21:35:27.619 2 DEBUG nova.virt.libvirt.driver [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:35:27 compute-0 nova_compute[192810]: 2025-09-30 21:35:27.619 2 DEBUG nova.virt.libvirt.driver [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:35:27 compute-0 nova_compute[192810]: 2025-09-30 21:35:27.620 2 DEBUG nova.virt.libvirt.driver [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:35:27 compute-0 nova_compute[192810]: 2025-09-30 21:35:27.620 2 DEBUG nova.virt.libvirt.driver [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:35:27 compute-0 nova_compute[192810]: 2025-09-30 21:35:27.664 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:35:27 compute-0 nova_compute[192810]: 2025-09-30 21:35:27.693 2 INFO nova.compute.manager [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Took 6.56 seconds to spawn the instance on the hypervisor.
Sep 30 21:35:27 compute-0 nova_compute[192810]: 2025-09-30 21:35:27.694 2 DEBUG nova.compute.manager [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:35:27 compute-0 nova_compute[192810]: 2025-09-30 21:35:27.785 2 INFO nova.compute.manager [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Took 7.32 seconds to build instance.
Sep 30 21:35:27 compute-0 nova_compute[192810]: 2025-09-30 21:35:27.805 2 DEBUG oslo_concurrency.lockutils [None req-8bb13da3-77e8-4756-9da1-9f2f4b3be449 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "128d5729-9e60-43d9-b1a4-fa8fb6e0e619" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.464s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:28 compute-0 nova_compute[192810]: 2025-09-30 21:35:28.518 2 DEBUG oslo_concurrency.lockutils [None req-db63d0ac-0edf-4a8a-b8ca-900894f1595e b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Acquiring lock "f92e449c-90c2-4cba-a8c1-ba6b1c82d770" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:28 compute-0 nova_compute[192810]: 2025-09-30 21:35:28.519 2 DEBUG oslo_concurrency.lockutils [None req-db63d0ac-0edf-4a8a-b8ca-900894f1595e b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "f92e449c-90c2-4cba-a8c1-ba6b1c82d770" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:28 compute-0 nova_compute[192810]: 2025-09-30 21:35:28.520 2 DEBUG oslo_concurrency.lockutils [None req-db63d0ac-0edf-4a8a-b8ca-900894f1595e b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Acquiring lock "f92e449c-90c2-4cba-a8c1-ba6b1c82d770-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:28 compute-0 nova_compute[192810]: 2025-09-30 21:35:28.520 2 DEBUG oslo_concurrency.lockutils [None req-db63d0ac-0edf-4a8a-b8ca-900894f1595e b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "f92e449c-90c2-4cba-a8c1-ba6b1c82d770-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:28 compute-0 nova_compute[192810]: 2025-09-30 21:35:28.520 2 DEBUG oslo_concurrency.lockutils [None req-db63d0ac-0edf-4a8a-b8ca-900894f1595e b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "f92e449c-90c2-4cba-a8c1-ba6b1c82d770-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:28 compute-0 nova_compute[192810]: 2025-09-30 21:35:28.532 2 INFO nova.compute.manager [None req-db63d0ac-0edf-4a8a-b8ca-900894f1595e b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Terminating instance
Sep 30 21:35:28 compute-0 nova_compute[192810]: 2025-09-30 21:35:28.544 2 DEBUG nova.compute.manager [None req-db63d0ac-0edf-4a8a-b8ca-900894f1595e b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:35:28 compute-0 kernel: tape2cc5531-35 (unregistering): left promiscuous mode
Sep 30 21:35:28 compute-0 NetworkManager[51733]: <info>  [1759268128.5783] device (tape2cc5531-35): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:35:28 compute-0 ovn_controller[94912]: 2025-09-30T21:35:28Z|00386|binding|INFO|Releasing lport e2cc5531-350c-4845-bde9-9797e9628421 from this chassis (sb_readonly=0)
Sep 30 21:35:28 compute-0 ovn_controller[94912]: 2025-09-30T21:35:28Z|00387|binding|INFO|Setting lport e2cc5531-350c-4845-bde9-9797e9628421 down in Southbound
Sep 30 21:35:28 compute-0 ovn_controller[94912]: 2025-09-30T21:35:28Z|00388|binding|INFO|Removing iface tape2cc5531-35 ovn-installed in OVS
Sep 30 21:35:28 compute-0 nova_compute[192810]: 2025-09-30 21:35:28.597 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:28 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:28.602 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3f:a2:bc 10.100.0.7'], port_security=['fa:16:3e:3f:a2:bc 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'f92e449c-90c2-4cba-a8c1-ba6b1c82d770', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-91c84c55-96ab-4682-a6e7-9e96514ca8a5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd876c85b6ca5418eb657e48391a6503b', 'neutron:revision_number': '4', 'neutron:security_group_ids': '59aeae25-c90e-4b40-9e86-3ac03fe94073', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b003c3b3-124e-4f30-8c82-ee588d17c214, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=e2cc5531-350c-4845-bde9-9797e9628421) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:35:28 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:28.604 103867 INFO neutron.agent.ovn.metadata.agent [-] Port e2cc5531-350c-4845-bde9-9797e9628421 in datapath 91c84c55-96ab-4682-a6e7-9e96514ca8a5 unbound from our chassis
Sep 30 21:35:28 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:28.605 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 91c84c55-96ab-4682-a6e7-9e96514ca8a5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:35:28 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:28.616 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[975fabb6-3358-40f4-b0b3-6bf6f838115b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:28 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:28.617 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5 namespace which is not needed anymore
Sep 30 21:35:28 compute-0 nova_compute[192810]: 2025-09-30 21:35:28.621 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:28 compute-0 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d0000005a.scope: Deactivated successfully.
Sep 30 21:35:28 compute-0 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d0000005a.scope: Consumed 17.299s CPU time.
Sep 30 21:35:28 compute-0 systemd-machined[152794]: Machine qemu-42-instance-0000005a terminated.
Sep 30 21:35:28 compute-0 neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5[233941]: [NOTICE]   (233945) : haproxy version is 2.8.14-c23fe91
Sep 30 21:35:28 compute-0 neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5[233941]: [NOTICE]   (233945) : path to executable is /usr/sbin/haproxy
Sep 30 21:35:28 compute-0 neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5[233941]: [WARNING]  (233945) : Exiting Master process...
Sep 30 21:35:28 compute-0 neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5[233941]: [ALERT]    (233945) : Current worker (233947) exited with code 143 (Terminated)
Sep 30 21:35:28 compute-0 neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5[233941]: [WARNING]  (233945) : All workers exited. Exiting... (0)
Sep 30 21:35:28 compute-0 systemd[1]: libpod-5553e48e4d6703cd90d262aabd6853c48c8f9c86c3e9d5a6caeb1531dfdde7f3.scope: Deactivated successfully.
Sep 30 21:35:28 compute-0 podman[235964]: 2025-09-30 21:35:28.81178191 +0000 UTC m=+0.079006214 container died 5553e48e4d6703cd90d262aabd6853c48c8f9c86c3e9d5a6caeb1531dfdde7f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:35:28 compute-0 nova_compute[192810]: 2025-09-30 21:35:28.832 2 INFO nova.virt.libvirt.driver [-] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Instance destroyed successfully.
Sep 30 21:35:28 compute-0 nova_compute[192810]: 2025-09-30 21:35:28.833 2 DEBUG nova.objects.instance [None req-db63d0ac-0edf-4a8a-b8ca-900894f1595e b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lazy-loading 'resources' on Instance uuid f92e449c-90c2-4cba-a8c1-ba6b1c82d770 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:35:28 compute-0 nova_compute[192810]: 2025-09-30 21:35:28.846 2 DEBUG nova.virt.libvirt.vif [None req-db63d0ac-0edf-4a8a-b8ca-900894f1595e b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:32:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1093237178',display_name='tempest-ServerActionsTestOtherB-server-1093237178',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1093237178',id=90,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:32:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d876c85b6ca5418eb657e48391a6503b',ramdisk_id='',reservation_id='r-hu7yzh19',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_mi
n_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-463525410',owner_user_name='tempest-ServerActionsTestOtherB-463525410-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:32:59Z,user_data=None,user_id='b9b3e9f2523944539f57a1ff5d565cb4',uuid=f92e449c-90c2-4cba-a8c1-ba6b1c82d770,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e2cc5531-350c-4845-bde9-9797e9628421", "address": "fa:16:3e:3f:a2:bc", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape2cc5531-35", "ovs_interfaceid": "e2cc5531-350c-4845-bde9-9797e9628421", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:35:28 compute-0 nova_compute[192810]: 2025-09-30 21:35:28.847 2 DEBUG nova.network.os_vif_util [None req-db63d0ac-0edf-4a8a-b8ca-900894f1595e b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Converting VIF {"id": "e2cc5531-350c-4845-bde9-9797e9628421", "address": "fa:16:3e:3f:a2:bc", "network": {"id": "91c84c55-96ab-4682-a6e7-9e96514ca8a5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1696557468-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d876c85b6ca5418eb657e48391a6503b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape2cc5531-35", "ovs_interfaceid": "e2cc5531-350c-4845-bde9-9797e9628421", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:35:28 compute-0 nova_compute[192810]: 2025-09-30 21:35:28.848 2 DEBUG nova.network.os_vif_util [None req-db63d0ac-0edf-4a8a-b8ca-900894f1595e b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:3f:a2:bc,bridge_name='br-int',has_traffic_filtering=True,id=e2cc5531-350c-4845-bde9-9797e9628421,network=Network(91c84c55-96ab-4682-a6e7-9e96514ca8a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape2cc5531-35') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:35:28 compute-0 nova_compute[192810]: 2025-09-30 21:35:28.848 2 DEBUG os_vif [None req-db63d0ac-0edf-4a8a-b8ca-900894f1595e b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:3f:a2:bc,bridge_name='br-int',has_traffic_filtering=True,id=e2cc5531-350c-4845-bde9-9797e9628421,network=Network(91c84c55-96ab-4682-a6e7-9e96514ca8a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape2cc5531-35') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:35:28 compute-0 nova_compute[192810]: 2025-09-30 21:35:28.850 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:28 compute-0 nova_compute[192810]: 2025-09-30 21:35:28.850 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape2cc5531-35, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:35:28 compute-0 nova_compute[192810]: 2025-09-30 21:35:28.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:35:28 compute-0 nova_compute[192810]: 2025-09-30 21:35:28.857 2 INFO os_vif [None req-db63d0ac-0edf-4a8a-b8ca-900894f1595e b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:3f:a2:bc,bridge_name='br-int',has_traffic_filtering=True,id=e2cc5531-350c-4845-bde9-9797e9628421,network=Network(91c84c55-96ab-4682-a6e7-9e96514ca8a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape2cc5531-35')
Sep 30 21:35:28 compute-0 nova_compute[192810]: 2025-09-30 21:35:28.857 2 INFO nova.virt.libvirt.driver [None req-db63d0ac-0edf-4a8a-b8ca-900894f1595e b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Deleting instance files /var/lib/nova/instances/f92e449c-90c2-4cba-a8c1-ba6b1c82d770_del
Sep 30 21:35:28 compute-0 nova_compute[192810]: 2025-09-30 21:35:28.858 2 INFO nova.virt.libvirt.driver [None req-db63d0ac-0edf-4a8a-b8ca-900894f1595e b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Deletion of /var/lib/nova/instances/f92e449c-90c2-4cba-a8c1-ba6b1c82d770_del complete
Sep 30 21:35:28 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5553e48e4d6703cd90d262aabd6853c48c8f9c86c3e9d5a6caeb1531dfdde7f3-userdata-shm.mount: Deactivated successfully.
Sep 30 21:35:28 compute-0 systemd[1]: var-lib-containers-storage-overlay-dea51daa6c05ad8e442a6c36df4ddf9123bca49c77b97eb2592a3f966898bb23-merged.mount: Deactivated successfully.
Sep 30 21:35:28 compute-0 nova_compute[192810]: 2025-09-30 21:35:28.923 2 INFO nova.compute.manager [None req-db63d0ac-0edf-4a8a-b8ca-900894f1595e b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Took 0.38 seconds to destroy the instance on the hypervisor.
Sep 30 21:35:28 compute-0 nova_compute[192810]: 2025-09-30 21:35:28.924 2 DEBUG oslo.service.loopingcall [None req-db63d0ac-0edf-4a8a-b8ca-900894f1595e b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:35:28 compute-0 nova_compute[192810]: 2025-09-30 21:35:28.924 2 DEBUG nova.compute.manager [-] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:35:28 compute-0 nova_compute[192810]: 2025-09-30 21:35:28.924 2 DEBUG nova.network.neutron [-] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:35:28 compute-0 podman[235964]: 2025-09-30 21:35:28.92541277 +0000 UTC m=+0.192637094 container cleanup 5553e48e4d6703cd90d262aabd6853c48c8f9c86c3e9d5a6caeb1531dfdde7f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923)
Sep 30 21:35:28 compute-0 systemd[1]: libpod-conmon-5553e48e4d6703cd90d262aabd6853c48c8f9c86c3e9d5a6caeb1531dfdde7f3.scope: Deactivated successfully.
Sep 30 21:35:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:29.101 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:35:29 compute-0 podman[236007]: 2025-09-30 21:35:29.122867542 +0000 UTC m=+0.161836631 container remove 5553e48e4d6703cd90d262aabd6853c48c8f9c86c3e9d5a6caeb1531dfdde7f3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2)
Sep 30 21:35:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:29.132 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[4a851bfc-6bc4-4bdb-80eb-efa16c3bc35f]: (4, ('Tue Sep 30 09:35:28 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5 (5553e48e4d6703cd90d262aabd6853c48c8f9c86c3e9d5a6caeb1531dfdde7f3)\n5553e48e4d6703cd90d262aabd6853c48c8f9c86c3e9d5a6caeb1531dfdde7f3\nTue Sep 30 09:35:28 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5 (5553e48e4d6703cd90d262aabd6853c48c8f9c86c3e9d5a6caeb1531dfdde7f3)\n5553e48e4d6703cd90d262aabd6853c48c8f9c86c3e9d5a6caeb1531dfdde7f3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:29.134 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a747e236-8f93-433d-8af4-3df47436cb9f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:29.136 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap91c84c55-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:35:29 compute-0 nova_compute[192810]: 2025-09-30 21:35:29.140 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:29 compute-0 kernel: tap91c84c55-90: left promiscuous mode
Sep 30 21:35:29 compute-0 nova_compute[192810]: 2025-09-30 21:35:29.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:29.157 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[97f25ad8-c047-47d1-83dd-2bcfc4f59ed3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:29.186 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[da1e8e8b-eb2a-4b82-a12c-4e010930b97b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:29.187 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[f9e1d516-9f47-43ad-988c-f79733287af7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:29.220 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[b3a632fa-b88c-4262-b524-24b539d7c9fb]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 467428, 'reachable_time': 32188, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236022, 'error': None, 'target': 'ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:29.222 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-91c84c55-96ab-4682-a6e7-9e96514ca8a5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:35:29 compute-0 systemd[1]: run-netns-ovnmeta\x2d91c84c55\x2d96ab\x2d4682\x2da6e7\x2d9e96514ca8a5.mount: Deactivated successfully.
Sep 30 21:35:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:29.223 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[d8313566-0b0f-4d49-a9e3-288909f7acdf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:29 compute-0 nova_compute[192810]: 2025-09-30 21:35:29.596 2 DEBUG nova.network.neutron [-] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:35:29 compute-0 nova_compute[192810]: 2025-09-30 21:35:29.615 2 INFO nova.compute.manager [-] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Took 0.69 seconds to deallocate network for instance.
Sep 30 21:35:29 compute-0 nova_compute[192810]: 2025-09-30 21:35:29.665 2 DEBUG nova.compute.manager [req-a9d30953-ade5-4969-833d-4776da93207a req-bdccf24b-8f66-4f3c-918e-6fd337ce5af3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Received event network-vif-unplugged-e2cc5531-350c-4845-bde9-9797e9628421 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:35:29 compute-0 nova_compute[192810]: 2025-09-30 21:35:29.665 2 DEBUG oslo_concurrency.lockutils [req-a9d30953-ade5-4969-833d-4776da93207a req-bdccf24b-8f66-4f3c-918e-6fd337ce5af3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "f92e449c-90c2-4cba-a8c1-ba6b1c82d770-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:29 compute-0 nova_compute[192810]: 2025-09-30 21:35:29.666 2 DEBUG oslo_concurrency.lockutils [req-a9d30953-ade5-4969-833d-4776da93207a req-bdccf24b-8f66-4f3c-918e-6fd337ce5af3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "f92e449c-90c2-4cba-a8c1-ba6b1c82d770-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:29 compute-0 nova_compute[192810]: 2025-09-30 21:35:29.666 2 DEBUG oslo_concurrency.lockutils [req-a9d30953-ade5-4969-833d-4776da93207a req-bdccf24b-8f66-4f3c-918e-6fd337ce5af3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "f92e449c-90c2-4cba-a8c1-ba6b1c82d770-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:29 compute-0 nova_compute[192810]: 2025-09-30 21:35:29.666 2 DEBUG nova.compute.manager [req-a9d30953-ade5-4969-833d-4776da93207a req-bdccf24b-8f66-4f3c-918e-6fd337ce5af3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] No waiting events found dispatching network-vif-unplugged-e2cc5531-350c-4845-bde9-9797e9628421 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:35:29 compute-0 nova_compute[192810]: 2025-09-30 21:35:29.667 2 DEBUG nova.compute.manager [req-a9d30953-ade5-4969-833d-4776da93207a req-bdccf24b-8f66-4f3c-918e-6fd337ce5af3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Received event network-vif-unplugged-e2cc5531-350c-4845-bde9-9797e9628421 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:35:29 compute-0 nova_compute[192810]: 2025-09-30 21:35:29.667 2 DEBUG nova.compute.manager [req-a9d30953-ade5-4969-833d-4776da93207a req-bdccf24b-8f66-4f3c-918e-6fd337ce5af3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Received event network-vif-plugged-e2cc5531-350c-4845-bde9-9797e9628421 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:35:29 compute-0 nova_compute[192810]: 2025-09-30 21:35:29.667 2 DEBUG oslo_concurrency.lockutils [req-a9d30953-ade5-4969-833d-4776da93207a req-bdccf24b-8f66-4f3c-918e-6fd337ce5af3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "f92e449c-90c2-4cba-a8c1-ba6b1c82d770-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:29 compute-0 nova_compute[192810]: 2025-09-30 21:35:29.668 2 DEBUG oslo_concurrency.lockutils [req-a9d30953-ade5-4969-833d-4776da93207a req-bdccf24b-8f66-4f3c-918e-6fd337ce5af3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "f92e449c-90c2-4cba-a8c1-ba6b1c82d770-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:29 compute-0 nova_compute[192810]: 2025-09-30 21:35:29.668 2 DEBUG oslo_concurrency.lockutils [req-a9d30953-ade5-4969-833d-4776da93207a req-bdccf24b-8f66-4f3c-918e-6fd337ce5af3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "f92e449c-90c2-4cba-a8c1-ba6b1c82d770-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:29 compute-0 nova_compute[192810]: 2025-09-30 21:35:29.668 2 DEBUG nova.compute.manager [req-a9d30953-ade5-4969-833d-4776da93207a req-bdccf24b-8f66-4f3c-918e-6fd337ce5af3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] No waiting events found dispatching network-vif-plugged-e2cc5531-350c-4845-bde9-9797e9628421 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:35:29 compute-0 nova_compute[192810]: 2025-09-30 21:35:29.668 2 WARNING nova.compute.manager [req-a9d30953-ade5-4969-833d-4776da93207a req-bdccf24b-8f66-4f3c-918e-6fd337ce5af3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Received unexpected event network-vif-plugged-e2cc5531-350c-4845-bde9-9797e9628421 for instance with vm_state active and task_state deleting.
Sep 30 21:35:29 compute-0 nova_compute[192810]: 2025-09-30 21:35:29.687 2 DEBUG oslo_concurrency.lockutils [None req-db63d0ac-0edf-4a8a-b8ca-900894f1595e b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:29 compute-0 nova_compute[192810]: 2025-09-30 21:35:29.688 2 DEBUG oslo_concurrency.lockutils [None req-db63d0ac-0edf-4a8a-b8ca-900894f1595e b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:29 compute-0 nova_compute[192810]: 2025-09-30 21:35:29.714 2 DEBUG nova.compute.manager [req-3de35961-088e-4fc9-962d-c1da9c378822 req-bc08ca08-4f33-45db-88b4-47f471f9a7ec dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Received event network-vif-deleted-e2cc5531-350c-4845-bde9-9797e9628421 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:35:29 compute-0 nova_compute[192810]: 2025-09-30 21:35:29.778 2 DEBUG nova.compute.provider_tree [None req-db63d0ac-0edf-4a8a-b8ca-900894f1595e b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:35:29 compute-0 nova_compute[192810]: 2025-09-30 21:35:29.797 2 DEBUG nova.scheduler.client.report [None req-db63d0ac-0edf-4a8a-b8ca-900894f1595e b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:35:29 compute-0 nova_compute[192810]: 2025-09-30 21:35:29.822 2 DEBUG oslo_concurrency.lockutils [None req-db63d0ac-0edf-4a8a-b8ca-900894f1595e b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:29 compute-0 nova_compute[192810]: 2025-09-30 21:35:29.862 2 INFO nova.scheduler.client.report [None req-db63d0ac-0edf-4a8a-b8ca-900894f1595e b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Deleted allocations for instance f92e449c-90c2-4cba-a8c1-ba6b1c82d770
Sep 30 21:35:29 compute-0 nova_compute[192810]: 2025-09-30 21:35:29.927 2 DEBUG oslo_concurrency.lockutils [None req-db63d0ac-0edf-4a8a-b8ca-900894f1595e b9b3e9f2523944539f57a1ff5d565cb4 d876c85b6ca5418eb657e48391a6503b - - default default] Lock "f92e449c-90c2-4cba-a8c1-ba6b1c82d770" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.408s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:30 compute-0 nova_compute[192810]: 2025-09-30 21:35:30.968 2 DEBUG nova.compute.manager [None req-efe8ddc2-8bdc-4ad3-86c6-5445a00f8459 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:35:31 compute-0 nova_compute[192810]: 2025-09-30 21:35:31.020 2 INFO nova.compute.manager [None req-efe8ddc2-8bdc-4ad3-86c6-5445a00f8459 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] instance snapshotting
Sep 30 21:35:31 compute-0 nova_compute[192810]: 2025-09-30 21:35:31.315 2 INFO nova.virt.libvirt.driver [None req-efe8ddc2-8bdc-4ad3-86c6-5445a00f8459 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Beginning live snapshot process
Sep 30 21:35:31 compute-0 virtqemud[192233]: invalid argument: disk vda does not have an active block job
Sep 30 21:35:31 compute-0 nova_compute[192810]: 2025-09-30 21:35:31.604 2 DEBUG oslo_concurrency.processutils [None req-efe8ddc2-8bdc-4ad3-86c6-5445a00f8459 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:35:31 compute-0 nova_compute[192810]: 2025-09-30 21:35:31.663 2 DEBUG oslo_concurrency.processutils [None req-efe8ddc2-8bdc-4ad3-86c6-5445a00f8459 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk --force-share --output=json -f qcow2" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:35:31 compute-0 nova_compute[192810]: 2025-09-30 21:35:31.664 2 DEBUG oslo_concurrency.processutils [None req-efe8ddc2-8bdc-4ad3-86c6-5445a00f8459 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:35:31 compute-0 nova_compute[192810]: 2025-09-30 21:35:31.722 2 DEBUG oslo_concurrency.processutils [None req-efe8ddc2-8bdc-4ad3-86c6-5445a00f8459 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk --force-share --output=json -f qcow2" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:35:31 compute-0 nova_compute[192810]: 2025-09-30 21:35:31.734 2 DEBUG oslo_concurrency.processutils [None req-efe8ddc2-8bdc-4ad3-86c6-5445a00f8459 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:35:31 compute-0 nova_compute[192810]: 2025-09-30 21:35:31.796 2 DEBUG oslo_concurrency.processutils [None req-efe8ddc2-8bdc-4ad3-86c6-5445a00f8459 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:35:31 compute-0 nova_compute[192810]: 2025-09-30 21:35:31.797 2 DEBUG oslo_concurrency.processutils [None req-efe8ddc2-8bdc-4ad3-86c6-5445a00f8459 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp2v779k22/e38e493ee6de4368aef6d0eb59e86829.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:35:31 compute-0 nova_compute[192810]: 2025-09-30 21:35:31.827 2 DEBUG oslo_concurrency.processutils [None req-efe8ddc2-8bdc-4ad3-86c6-5445a00f8459 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/snapshots/tmp2v779k22/e38e493ee6de4368aef6d0eb59e86829.delta 1073741824" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:35:31 compute-0 nova_compute[192810]: 2025-09-30 21:35:31.828 2 INFO nova.virt.libvirt.driver [None req-efe8ddc2-8bdc-4ad3-86c6-5445a00f8459 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Quiescing instance not available: QEMU guest agent is not enabled.
Sep 30 21:35:31 compute-0 nova_compute[192810]: 2025-09-30 21:35:31.876 2 DEBUG nova.virt.libvirt.guest [None req-efe8ddc2-8bdc-4ad3-86c6-5445a00f8459 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] COPY block job progress, current cursor: 0 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Sep 30 21:35:32 compute-0 nova_compute[192810]: 2025-09-30 21:35:32.380 2 DEBUG nova.virt.libvirt.guest [None req-efe8ddc2-8bdc-4ad3-86c6-5445a00f8459 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] COPY block job progress, current cursor: 1 final cursor: 1 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Sep 30 21:35:32 compute-0 nova_compute[192810]: 2025-09-30 21:35:32.383 2 INFO nova.virt.libvirt.driver [None req-efe8ddc2-8bdc-4ad3-86c6-5445a00f8459 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Skipping quiescing instance: QEMU guest agent is not enabled.
Sep 30 21:35:32 compute-0 nova_compute[192810]: 2025-09-30 21:35:32.425 2 DEBUG nova.privsep.utils [None req-efe8ddc2-8bdc-4ad3-86c6-5445a00f8459 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Sep 30 21:35:32 compute-0 nova_compute[192810]: 2025-09-30 21:35:32.426 2 DEBUG oslo_concurrency.processutils [None req-efe8ddc2-8bdc-4ad3-86c6-5445a00f8459 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp2v779k22/e38e493ee6de4368aef6d0eb59e86829.delta /var/lib/nova/instances/snapshots/tmp2v779k22/e38e493ee6de4368aef6d0eb59e86829 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:35:32 compute-0 nova_compute[192810]: 2025-09-30 21:35:32.557 2 DEBUG oslo_concurrency.processutils [None req-efe8ddc2-8bdc-4ad3-86c6-5445a00f8459 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmp2v779k22/e38e493ee6de4368aef6d0eb59e86829.delta /var/lib/nova/instances/snapshots/tmp2v779k22/e38e493ee6de4368aef6d0eb59e86829" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:35:32 compute-0 nova_compute[192810]: 2025-09-30 21:35:32.558 2 INFO nova.virt.libvirt.driver [None req-efe8ddc2-8bdc-4ad3-86c6-5445a00f8459 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Snapshot extracted, beginning image upload
Sep 30 21:35:32 compute-0 nova_compute[192810]: 2025-09-30 21:35:32.597 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:33 compute-0 nova_compute[192810]: 2025-09-30 21:35:33.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:34 compute-0 nova_compute[192810]: 2025-09-30 21:35:34.810 2 INFO nova.virt.libvirt.driver [None req-efe8ddc2-8bdc-4ad3-86c6-5445a00f8459 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Snapshot image upload complete
Sep 30 21:35:34 compute-0 nova_compute[192810]: 2025-09-30 21:35:34.811 2 INFO nova.compute.manager [None req-efe8ddc2-8bdc-4ad3-86c6-5445a00f8459 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Took 3.78 seconds to snapshot the instance on the hypervisor.
Sep 30 21:35:35 compute-0 ovn_controller[94912]: 2025-09-30T21:35:35Z|00389|binding|INFO|Releasing lport 10034ee7-d74d-45f3-b835-201b62e1bcd6 from this chassis (sb_readonly=0)
Sep 30 21:35:35 compute-0 nova_compute[192810]: 2025-09-30 21:35:35.667 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:36 compute-0 podman[236051]: 2025-09-30 21:35:36.36362106 +0000 UTC m=+0.084426057 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:35:36 compute-0 podman[236052]: 2025-09-30 21:35:36.372092107 +0000 UTC m=+0.101315000 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, build-date=2025-08-20T13:12:41, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6)
Sep 30 21:35:37 compute-0 nova_compute[192810]: 2025-09-30 21:35:37.600 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:38.740 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:38.741 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:38.741 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:38 compute-0 nova_compute[192810]: 2025-09-30 21:35:38.855 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:39 compute-0 nova_compute[192810]: 2025-09-30 21:35:39.025 2 INFO nova.compute.manager [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Rescuing
Sep 30 21:35:39 compute-0 nova_compute[192810]: 2025-09-30 21:35:39.025 2 DEBUG oslo_concurrency.lockutils [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquiring lock "refresh_cache-128d5729-9e60-43d9-b1a4-fa8fb6e0e619" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:35:39 compute-0 nova_compute[192810]: 2025-09-30 21:35:39.025 2 DEBUG oslo_concurrency.lockutils [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquired lock "refresh_cache-128d5729-9e60-43d9-b1a4-fa8fb6e0e619" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:35:39 compute-0 nova_compute[192810]: 2025-09-30 21:35:39.026 2 DEBUG nova.network.neutron [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:35:41 compute-0 ovn_controller[94912]: 2025-09-30T21:35:41Z|00042|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:86:d7:f5 10.100.0.10
Sep 30 21:35:41 compute-0 ovn_controller[94912]: 2025-09-30T21:35:41Z|00043|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:86:d7:f5 10.100.0.10
Sep 30 21:35:42 compute-0 nova_compute[192810]: 2025-09-30 21:35:42.603 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:42 compute-0 nova_compute[192810]: 2025-09-30 21:35:42.617 2 DEBUG nova.network.neutron [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Updating instance_info_cache with network_info: [{"id": "b275e55b-7c62-4b31-a65b-129241b9139d", "address": "fa:16:3e:86:d7:f5", "network": {"id": "f5a6396a-b7b7-4ff1-a2af-27477fea2815", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8978d2df88a5434c8794b659033cca5e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb275e55b-7c", "ovs_interfaceid": "b275e55b-7c62-4b31-a65b-129241b9139d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:35:42 compute-0 nova_compute[192810]: 2025-09-30 21:35:42.645 2 DEBUG oslo_concurrency.lockutils [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Releasing lock "refresh_cache-128d5729-9e60-43d9-b1a4-fa8fb6e0e619" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:35:43 compute-0 nova_compute[192810]: 2025-09-30 21:35:43.252 2 DEBUG nova.virt.libvirt.driver [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Sep 30 21:35:43 compute-0 podman[236113]: 2025-09-30 21:35:43.327958496 +0000 UTC m=+0.051860820 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 21:35:43 compute-0 podman[236112]: 2025-09-30 21:35:43.330951789 +0000 UTC m=+0.059443376 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=iscsid, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20250923, io.buildah.version=1.41.3)
Sep 30 21:35:43 compute-0 podman[236111]: 2025-09-30 21:35:43.330934028 +0000 UTC m=+0.060849070 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:35:43 compute-0 nova_compute[192810]: 2025-09-30 21:35:43.831 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268128.8297803, f92e449c-90c2-4cba-a8c1-ba6b1c82d770 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:35:43 compute-0 nova_compute[192810]: 2025-09-30 21:35:43.831 2 INFO nova.compute.manager [-] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] VM Stopped (Lifecycle Event)
Sep 30 21:35:43 compute-0 nova_compute[192810]: 2025-09-30 21:35:43.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:43 compute-0 nova_compute[192810]: 2025-09-30 21:35:43.866 2 DEBUG nova.compute.manager [None req-25da7944-c232-455a-8082-e1da112061b3 - - - - - -] [instance: f92e449c-90c2-4cba-a8c1-ba6b1c82d770] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:35:45 compute-0 kernel: tapb275e55b-7c (unregistering): left promiscuous mode
Sep 30 21:35:45 compute-0 NetworkManager[51733]: <info>  [1759268145.4529] device (tapb275e55b-7c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:35:45 compute-0 ovn_controller[94912]: 2025-09-30T21:35:45Z|00390|binding|INFO|Releasing lport b275e55b-7c62-4b31-a65b-129241b9139d from this chassis (sb_readonly=0)
Sep 30 21:35:45 compute-0 nova_compute[192810]: 2025-09-30 21:35:45.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:45 compute-0 ovn_controller[94912]: 2025-09-30T21:35:45Z|00391|binding|INFO|Setting lport b275e55b-7c62-4b31-a65b-129241b9139d down in Southbound
Sep 30 21:35:45 compute-0 ovn_controller[94912]: 2025-09-30T21:35:45Z|00392|binding|INFO|Removing iface tapb275e55b-7c ovn-installed in OVS
Sep 30 21:35:45 compute-0 nova_compute[192810]: 2025-09-30 21:35:45.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:45.473 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:86:d7:f5 10.100.0.10'], port_security=['fa:16:3e:86:d7:f5 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8978d2df88a5434c8794b659033cca5e', 'neutron:revision_number': '4', 'neutron:security_group_ids': '93b1b45c-82db-437e-88d0-4d5f76771b04', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e6b1a10b-a890-44de-9d6b-4b24b7ba0344, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=b275e55b-7c62-4b31-a65b-129241b9139d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:35:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:45.474 103867 INFO neutron.agent.ovn.metadata.agent [-] Port b275e55b-7c62-4b31-a65b-129241b9139d in datapath f5a6396a-b7b7-4ff1-a2af-27477fea2815 unbound from our chassis
Sep 30 21:35:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:45.475 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f5a6396a-b7b7-4ff1-a2af-27477fea2815
Sep 30 21:35:45 compute-0 nova_compute[192810]: 2025-09-30 21:35:45.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:45.496 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a29838b8-7504-476a-bdc8-e5f028be830d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:45 compute-0 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d00000066.scope: Deactivated successfully.
Sep 30 21:35:45 compute-0 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d00000066.scope: Consumed 12.987s CPU time.
Sep 30 21:35:45 compute-0 systemd-machined[152794]: Machine qemu-48-instance-00000066 terminated.
Sep 30 21:35:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:45.526 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[c747186f-8919-4107-afb3-fd5fe100bb9a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:45.530 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[96a72110-40d3-4da1-9216-afa36e3e0d76]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:45.557 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[e5b97ef9-4428-4f7c-959d-4a7abaa55279]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:45.573 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e70a6177-2baf-478e-bfdd-7248f9ca3e35]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf5a6396a-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:66:d5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 916, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 916, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 113], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 480957, 'reachable_time': 18506, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236186, 'error': None, 'target': 'ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:45.587 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[bec0f80c-5283-43d6-b63b-1509ac41106f]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf5a6396a-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 480967, 'tstamp': 480967}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236187, 'error': None, 'target': 'ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf5a6396a-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 480970, 'tstamp': 480970}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236187, 'error': None, 'target': 'ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:45.589 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf5a6396a-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:35:45 compute-0 nova_compute[192810]: 2025-09-30 21:35:45.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:45 compute-0 nova_compute[192810]: 2025-09-30 21:35:45.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:45.595 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf5a6396a-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:35:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:45.596 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:35:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:45.596 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf5a6396a-b0, col_values=(('external_ids', {'iface-id': '10034ee7-d74d-45f3-b835-201b62e1bcd6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:35:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:45.596 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:35:45 compute-0 nova_compute[192810]: 2025-09-30 21:35:45.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:45 compute-0 nova_compute[192810]: 2025-09-30 21:35:45.698 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:46 compute-0 nova_compute[192810]: 2025-09-30 21:35:46.161 2 DEBUG nova.compute.manager [req-1d0e225c-df2a-4e79-b070-cfb90bb98e53 req-3f8cd0cf-4a38-4562-880d-8c5897c99b46 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Received event network-vif-unplugged-b275e55b-7c62-4b31-a65b-129241b9139d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:35:46 compute-0 nova_compute[192810]: 2025-09-30 21:35:46.161 2 DEBUG oslo_concurrency.lockutils [req-1d0e225c-df2a-4e79-b070-cfb90bb98e53 req-3f8cd0cf-4a38-4562-880d-8c5897c99b46 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "128d5729-9e60-43d9-b1a4-fa8fb6e0e619-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:46 compute-0 nova_compute[192810]: 2025-09-30 21:35:46.161 2 DEBUG oslo_concurrency.lockutils [req-1d0e225c-df2a-4e79-b070-cfb90bb98e53 req-3f8cd0cf-4a38-4562-880d-8c5897c99b46 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128d5729-9e60-43d9-b1a4-fa8fb6e0e619-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:46 compute-0 nova_compute[192810]: 2025-09-30 21:35:46.162 2 DEBUG oslo_concurrency.lockutils [req-1d0e225c-df2a-4e79-b070-cfb90bb98e53 req-3f8cd0cf-4a38-4562-880d-8c5897c99b46 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128d5729-9e60-43d9-b1a4-fa8fb6e0e619-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:46 compute-0 nova_compute[192810]: 2025-09-30 21:35:46.162 2 DEBUG nova.compute.manager [req-1d0e225c-df2a-4e79-b070-cfb90bb98e53 req-3f8cd0cf-4a38-4562-880d-8c5897c99b46 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] No waiting events found dispatching network-vif-unplugged-b275e55b-7c62-4b31-a65b-129241b9139d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:35:46 compute-0 nova_compute[192810]: 2025-09-30 21:35:46.162 2 WARNING nova.compute.manager [req-1d0e225c-df2a-4e79-b070-cfb90bb98e53 req-3f8cd0cf-4a38-4562-880d-8c5897c99b46 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Received unexpected event network-vif-unplugged-b275e55b-7c62-4b31-a65b-129241b9139d for instance with vm_state active and task_state rescuing.
Sep 30 21:35:46 compute-0 nova_compute[192810]: 2025-09-30 21:35:46.269 2 INFO nova.virt.libvirt.driver [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Instance shutdown successfully after 3 seconds.
Sep 30 21:35:46 compute-0 nova_compute[192810]: 2025-09-30 21:35:46.273 2 INFO nova.virt.libvirt.driver [-] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Instance destroyed successfully.
Sep 30 21:35:46 compute-0 nova_compute[192810]: 2025-09-30 21:35:46.273 2 DEBUG nova.objects.instance [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lazy-loading 'numa_topology' on Instance uuid 128d5729-9e60-43d9-b1a4-fa8fb6e0e619 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:35:46 compute-0 nova_compute[192810]: 2025-09-30 21:35:46.289 2 INFO nova.virt.libvirt.driver [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Attempting a stable device rescue
Sep 30 21:35:46 compute-0 nova_compute[192810]: 2025-09-30 21:35:46.726 2 DEBUG nova.virt.libvirt.driver [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'usb', 'dev': 'sdb', 'type': 'disk'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314
Sep 30 21:35:46 compute-0 nova_compute[192810]: 2025-09-30 21:35:46.729 2 DEBUG nova.virt.libvirt.driver [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Sep 30 21:35:46 compute-0 nova_compute[192810]: 2025-09-30 21:35:46.730 2 INFO nova.virt.libvirt.driver [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Creating image(s)
Sep 30 21:35:46 compute-0 nova_compute[192810]: 2025-09-30 21:35:46.730 2 DEBUG oslo_concurrency.lockutils [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquiring lock "/var/lib/nova/instances/128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:46 compute-0 nova_compute[192810]: 2025-09-30 21:35:46.731 2 DEBUG oslo_concurrency.lockutils [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "/var/lib/nova/instances/128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:46 compute-0 nova_compute[192810]: 2025-09-30 21:35:46.731 2 DEBUG oslo_concurrency.lockutils [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "/var/lib/nova/instances/128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:46 compute-0 nova_compute[192810]: 2025-09-30 21:35:46.731 2 DEBUG nova.objects.instance [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lazy-loading 'trusted_certs' on Instance uuid 128d5729-9e60-43d9-b1a4-fa8fb6e0e619 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:35:46 compute-0 nova_compute[192810]: 2025-09-30 21:35:46.745 2 DEBUG oslo_concurrency.lockutils [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquiring lock "840763c52fc7ef035fd3f982d581e13eeabd4dd6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:46 compute-0 nova_compute[192810]: 2025-09-30 21:35:46.745 2 DEBUG oslo_concurrency.lockutils [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "840763c52fc7ef035fd3f982d581e13eeabd4dd6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:46 compute-0 nova_compute[192810]: 2025-09-30 21:35:46.804 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:35:47 compute-0 nova_compute[192810]: 2025-09-30 21:35:47.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:48 compute-0 nova_compute[192810]: 2025-09-30 21:35:48.302 2 DEBUG nova.compute.manager [req-5e42f756-1825-49f5-97b2-8e46c692fb80 req-714a18fe-828c-4a13-bed0-0558bfbbb27e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Received event network-vif-plugged-b275e55b-7c62-4b31-a65b-129241b9139d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:35:48 compute-0 nova_compute[192810]: 2025-09-30 21:35:48.303 2 DEBUG oslo_concurrency.lockutils [req-5e42f756-1825-49f5-97b2-8e46c692fb80 req-714a18fe-828c-4a13-bed0-0558bfbbb27e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "128d5729-9e60-43d9-b1a4-fa8fb6e0e619-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:48 compute-0 nova_compute[192810]: 2025-09-30 21:35:48.303 2 DEBUG oslo_concurrency.lockutils [req-5e42f756-1825-49f5-97b2-8e46c692fb80 req-714a18fe-828c-4a13-bed0-0558bfbbb27e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128d5729-9e60-43d9-b1a4-fa8fb6e0e619-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:48 compute-0 nova_compute[192810]: 2025-09-30 21:35:48.303 2 DEBUG oslo_concurrency.lockutils [req-5e42f756-1825-49f5-97b2-8e46c692fb80 req-714a18fe-828c-4a13-bed0-0558bfbbb27e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128d5729-9e60-43d9-b1a4-fa8fb6e0e619-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:48 compute-0 nova_compute[192810]: 2025-09-30 21:35:48.304 2 DEBUG nova.compute.manager [req-5e42f756-1825-49f5-97b2-8e46c692fb80 req-714a18fe-828c-4a13-bed0-0558bfbbb27e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] No waiting events found dispatching network-vif-plugged-b275e55b-7c62-4b31-a65b-129241b9139d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:35:48 compute-0 nova_compute[192810]: 2025-09-30 21:35:48.304 2 WARNING nova.compute.manager [req-5e42f756-1825-49f5-97b2-8e46c692fb80 req-714a18fe-828c-4a13-bed0-0558bfbbb27e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Received unexpected event network-vif-plugged-b275e55b-7c62-4b31-a65b-129241b9139d for instance with vm_state active and task_state rescuing.
Sep 30 21:35:48 compute-0 nova_compute[192810]: 2025-09-30 21:35:48.511 2 DEBUG oslo_concurrency.processutils [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/840763c52fc7ef035fd3f982d581e13eeabd4dd6.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:35:48 compute-0 nova_compute[192810]: 2025-09-30 21:35:48.580 2 DEBUG oslo_concurrency.processutils [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/840763c52fc7ef035fd3f982d581e13eeabd4dd6.part --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:35:48 compute-0 nova_compute[192810]: 2025-09-30 21:35:48.581 2 DEBUG nova.virt.images [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] 7497a8b9-5d02-4413-9b0c-140c8fc32c90 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Sep 30 21:35:48 compute-0 nova_compute[192810]: 2025-09-30 21:35:48.582 2 DEBUG nova.privsep.utils [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Sep 30 21:35:48 compute-0 nova_compute[192810]: 2025-09-30 21:35:48.583 2 DEBUG oslo_concurrency.processutils [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/840763c52fc7ef035fd3f982d581e13eeabd4dd6.part /var/lib/nova/instances/_base/840763c52fc7ef035fd3f982d581e13eeabd4dd6.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:35:48 compute-0 nova_compute[192810]: 2025-09-30 21:35:48.706 2 DEBUG oslo_concurrency.processutils [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/840763c52fc7ef035fd3f982d581e13eeabd4dd6.part /var/lib/nova/instances/_base/840763c52fc7ef035fd3f982d581e13eeabd4dd6.converted" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:35:48 compute-0 nova_compute[192810]: 2025-09-30 21:35:48.712 2 DEBUG oslo_concurrency.processutils [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/840763c52fc7ef035fd3f982d581e13eeabd4dd6.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:35:48 compute-0 nova_compute[192810]: 2025-09-30 21:35:48.786 2 DEBUG oslo_concurrency.processutils [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/840763c52fc7ef035fd3f982d581e13eeabd4dd6.converted --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:35:48 compute-0 nova_compute[192810]: 2025-09-30 21:35:48.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:35:48 compute-0 nova_compute[192810]: 2025-09-30 21:35:48.788 2 DEBUG oslo_concurrency.lockutils [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "840763c52fc7ef035fd3f982d581e13eeabd4dd6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.043s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:48 compute-0 nova_compute[192810]: 2025-09-30 21:35:48.806 2 DEBUG oslo_concurrency.lockutils [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquiring lock "840763c52fc7ef035fd3f982d581e13eeabd4dd6" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:48 compute-0 nova_compute[192810]: 2025-09-30 21:35:48.807 2 DEBUG oslo_concurrency.lockutils [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "840763c52fc7ef035fd3f982d581e13eeabd4dd6" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:48 compute-0 nova_compute[192810]: 2025-09-30 21:35:48.820 2 DEBUG oslo_concurrency.processutils [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/840763c52fc7ef035fd3f982d581e13eeabd4dd6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:35:48 compute-0 nova_compute[192810]: 2025-09-30 21:35:48.860 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:48 compute-0 nova_compute[192810]: 2025-09-30 21:35:48.891 2 DEBUG oslo_concurrency.processutils [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/840763c52fc7ef035fd3f982d581e13eeabd4dd6 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:35:48 compute-0 nova_compute[192810]: 2025-09-30 21:35:48.892 2 DEBUG oslo_concurrency.processutils [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/840763c52fc7ef035fd3f982d581e13eeabd4dd6,backing_fmt=raw /var/lib/nova/instances/128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk.rescue execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:35:48 compute-0 nova_compute[192810]: 2025-09-30 21:35:48.933 2 DEBUG oslo_concurrency.processutils [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/840763c52fc7ef035fd3f982d581e13eeabd4dd6,backing_fmt=raw /var/lib/nova/instances/128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk.rescue" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:35:48 compute-0 nova_compute[192810]: 2025-09-30 21:35:48.934 2 DEBUG oslo_concurrency.lockutils [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "840763c52fc7ef035fd3f982d581e13eeabd4dd6" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:48 compute-0 nova_compute[192810]: 2025-09-30 21:35:48.934 2 DEBUG nova.objects.instance [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lazy-loading 'migration_context' on Instance uuid 128d5729-9e60-43d9-b1a4-fa8fb6e0e619 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:35:48 compute-0 nova_compute[192810]: 2025-09-30 21:35:48.962 2 DEBUG nova.virt.libvirt.driver [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:35:48 compute-0 nova_compute[192810]: 2025-09-30 21:35:48.965 2 DEBUG nova.virt.libvirt.driver [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Start _get_guest_xml network_info=[{"id": "b275e55b-7c62-4b31-a65b-129241b9139d", "address": "fa:16:3e:86:d7:f5", "network": {"id": "f5a6396a-b7b7-4ff1-a2af-27477fea2815", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "vif_mac": "fa:16:3e:86:d7:f5"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8978d2df88a5434c8794b659033cca5e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb275e55b-7c", "ovs_interfaceid": "b275e55b-7c62-4b31-a65b-129241b9139d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'usb', 'dev': 'sdb', 'type': 'disk'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': '7497a8b9-5d02-4413-9b0c-140c8fc32c90', 'kernel_id': '', 'ramdisk_id': ''} block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:35:48 compute-0 nova_compute[192810]: 2025-09-30 21:35:48.965 2 DEBUG nova.objects.instance [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lazy-loading 'resources' on Instance uuid 128d5729-9e60-43d9-b1a4-fa8fb6e0e619 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:35:48 compute-0 nova_compute[192810]: 2025-09-30 21:35:48.987 2 WARNING nova.virt.libvirt.driver [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:35:48 compute-0 nova_compute[192810]: 2025-09-30 21:35:48.993 2 DEBUG nova.virt.libvirt.host [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:35:48 compute-0 nova_compute[192810]: 2025-09-30 21:35:48.993 2 DEBUG nova.virt.libvirt.host [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:35:49 compute-0 nova_compute[192810]: 2025-09-30 21:35:49.002 2 DEBUG nova.virt.libvirt.host [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:35:49 compute-0 nova_compute[192810]: 2025-09-30 21:35:49.003 2 DEBUG nova.virt.libvirt.host [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:35:49 compute-0 nova_compute[192810]: 2025-09-30 21:35:49.004 2 DEBUG nova.virt.libvirt.driver [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:35:49 compute-0 nova_compute[192810]: 2025-09-30 21:35:49.005 2 DEBUG nova.virt.hardware [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:35:49 compute-0 nova_compute[192810]: 2025-09-30 21:35:49.005 2 DEBUG nova.virt.hardware [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:35:49 compute-0 nova_compute[192810]: 2025-09-30 21:35:49.005 2 DEBUG nova.virt.hardware [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:35:49 compute-0 nova_compute[192810]: 2025-09-30 21:35:49.006 2 DEBUG nova.virt.hardware [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:35:49 compute-0 nova_compute[192810]: 2025-09-30 21:35:49.006 2 DEBUG nova.virt.hardware [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:35:49 compute-0 nova_compute[192810]: 2025-09-30 21:35:49.007 2 DEBUG nova.virt.hardware [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:35:49 compute-0 nova_compute[192810]: 2025-09-30 21:35:49.007 2 DEBUG nova.virt.hardware [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:35:49 compute-0 nova_compute[192810]: 2025-09-30 21:35:49.007 2 DEBUG nova.virt.hardware [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:35:49 compute-0 nova_compute[192810]: 2025-09-30 21:35:49.008 2 DEBUG nova.virt.hardware [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:35:49 compute-0 nova_compute[192810]: 2025-09-30 21:35:49.008 2 DEBUG nova.virt.hardware [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:35:49 compute-0 nova_compute[192810]: 2025-09-30 21:35:49.008 2 DEBUG nova.virt.hardware [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:35:49 compute-0 nova_compute[192810]: 2025-09-30 21:35:49.009 2 DEBUG nova.objects.instance [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lazy-loading 'vcpu_model' on Instance uuid 128d5729-9e60-43d9-b1a4-fa8fb6e0e619 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:35:49 compute-0 nova_compute[192810]: 2025-09-30 21:35:49.027 2 DEBUG oslo_concurrency.processutils [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:35:49 compute-0 nova_compute[192810]: 2025-09-30 21:35:49.084 2 DEBUG oslo_concurrency.processutils [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk.config --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:35:49 compute-0 nova_compute[192810]: 2025-09-30 21:35:49.085 2 DEBUG oslo_concurrency.lockutils [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquiring lock "/var/lib/nova/instances/128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:49 compute-0 nova_compute[192810]: 2025-09-30 21:35:49.085 2 DEBUG oslo_concurrency.lockutils [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "/var/lib/nova/instances/128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:49 compute-0 nova_compute[192810]: 2025-09-30 21:35:49.086 2 DEBUG oslo_concurrency.lockutils [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "/var/lib/nova/instances/128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:49 compute-0 nova_compute[192810]: 2025-09-30 21:35:49.088 2 DEBUG nova.virt.libvirt.vif [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:35:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-522911333',display_name='tempest-ServerStableDeviceRescueTest-server-522911333',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-522911333',id=102,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:35:27Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8978d2df88a5434c8794b659033cca5e',ramdisk_id='',reservation_id='r-glkb8fbj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='vi
rtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-1939201844',owner_user_name='tempest-ServerStableDeviceRescueTest-1939201844-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:35:34Z,user_data=None,user_id='8b1ebef014c145cbbe1e367bfd2c2ba3',uuid=128d5729-9e60-43d9-b1a4-fa8fb6e0e619,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b275e55b-7c62-4b31-a65b-129241b9139d", "address": "fa:16:3e:86:d7:f5", "network": {"id": "f5a6396a-b7b7-4ff1-a2af-27477fea2815", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "vif_mac": "fa:16:3e:86:d7:f5"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8978d2df88a5434c8794b659033cca5e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb275e55b-7c", "ovs_interfaceid": "b275e55b-7c62-4b31-a65b-129241b9139d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:35:49 compute-0 nova_compute[192810]: 2025-09-30 21:35:49.088 2 DEBUG nova.network.os_vif_util [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Converting VIF {"id": "b275e55b-7c62-4b31-a65b-129241b9139d", "address": "fa:16:3e:86:d7:f5", "network": {"id": "f5a6396a-b7b7-4ff1-a2af-27477fea2815", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "vif_mac": "fa:16:3e:86:d7:f5"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8978d2df88a5434c8794b659033cca5e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb275e55b-7c", "ovs_interfaceid": "b275e55b-7c62-4b31-a65b-129241b9139d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:35:49 compute-0 nova_compute[192810]: 2025-09-30 21:35:49.089 2 DEBUG nova.network.os_vif_util [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:86:d7:f5,bridge_name='br-int',has_traffic_filtering=True,id=b275e55b-7c62-4b31-a65b-129241b9139d,network=Network(f5a6396a-b7b7-4ff1-a2af-27477fea2815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb275e55b-7c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:35:49 compute-0 nova_compute[192810]: 2025-09-30 21:35:49.091 2 DEBUG nova.objects.instance [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lazy-loading 'pci_devices' on Instance uuid 128d5729-9e60-43d9-b1a4-fa8fb6e0e619 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:35:49 compute-0 nova_compute[192810]: 2025-09-30 21:35:49.109 2 DEBUG nova.virt.libvirt.driver [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:35:49 compute-0 nova_compute[192810]:   <uuid>128d5729-9e60-43d9-b1a4-fa8fb6e0e619</uuid>
Sep 30 21:35:49 compute-0 nova_compute[192810]:   <name>instance-00000066</name>
Sep 30 21:35:49 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:35:49 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:35:49 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:35:49 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:       <nova:name>tempest-ServerStableDeviceRescueTest-server-522911333</nova:name>
Sep 30 21:35:49 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:35:48</nova:creationTime>
Sep 30 21:35:49 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:35:49 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:35:49 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:35:49 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:35:49 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:35:49 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:35:49 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:35:49 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:35:49 compute-0 nova_compute[192810]:         <nova:user uuid="8b1ebef014c145cbbe1e367bfd2c2ba3">tempest-ServerStableDeviceRescueTest-1939201844-project-member</nova:user>
Sep 30 21:35:49 compute-0 nova_compute[192810]:         <nova:project uuid="8978d2df88a5434c8794b659033cca5e">tempest-ServerStableDeviceRescueTest-1939201844</nova:project>
Sep 30 21:35:49 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:35:49 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:35:49 compute-0 nova_compute[192810]:         <nova:port uuid="b275e55b-7c62-4b31-a65b-129241b9139d">
Sep 30 21:35:49 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:35:49 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:35:49 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:35:49 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:35:49 compute-0 nova_compute[192810]:     <system>
Sep 30 21:35:49 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:35:49 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:35:49 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:35:49 compute-0 nova_compute[192810]:       <entry name="serial">128d5729-9e60-43d9-b1a4-fa8fb6e0e619</entry>
Sep 30 21:35:49 compute-0 nova_compute[192810]:       <entry name="uuid">128d5729-9e60-43d9-b1a4-fa8fb6e0e619</entry>
Sep 30 21:35:49 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     </system>
Sep 30 21:35:49 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:35:49 compute-0 nova_compute[192810]:   <os>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:   </os>
Sep 30 21:35:49 compute-0 nova_compute[192810]:   <features>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:   </features>
Sep 30 21:35:49 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:35:49 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:35:49 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:35:49 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:35:49 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:35:49 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:35:49 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk.config"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:35:49 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk.rescue"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:       <target dev="sdb" bus="usb"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:       <boot order="1"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:35:49 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:86:d7:f5"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:       <target dev="tapb275e55b-7c"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:35:49 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/128d5729-9e60-43d9-b1a4-fa8fb6e0e619/console.log" append="off"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     <video>
Sep 30 21:35:49 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     </video>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:35:49 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:35:49 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:35:49 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:35:49 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:35:49 compute-0 nova_compute[192810]: </domain>
Sep 30 21:35:49 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:35:49 compute-0 nova_compute[192810]: 2025-09-30 21:35:49.119 2 INFO nova.virt.libvirt.driver [-] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Instance destroyed successfully.
Sep 30 21:35:49 compute-0 nova_compute[192810]: 2025-09-30 21:35:49.179 2 DEBUG nova.virt.libvirt.driver [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:35:49 compute-0 nova_compute[192810]: 2025-09-30 21:35:49.180 2 DEBUG nova.virt.libvirt.driver [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:35:49 compute-0 nova_compute[192810]: 2025-09-30 21:35:49.181 2 DEBUG nova.virt.libvirt.driver [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] No BDM found with device name sdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:35:49 compute-0 nova_compute[192810]: 2025-09-30 21:35:49.181 2 DEBUG nova.virt.libvirt.driver [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] No VIF found with MAC fa:16:3e:86:d7:f5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:35:49 compute-0 nova_compute[192810]: 2025-09-30 21:35:49.182 2 INFO nova.virt.libvirt.driver [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Using config drive
Sep 30 21:35:49 compute-0 nova_compute[192810]: 2025-09-30 21:35:49.204 2 DEBUG nova.objects.instance [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lazy-loading 'ec2_ids' on Instance uuid 128d5729-9e60-43d9-b1a4-fa8fb6e0e619 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:35:49 compute-0 nova_compute[192810]: 2025-09-30 21:35:49.265 2 DEBUG nova.objects.instance [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lazy-loading 'keypairs' on Instance uuid 128d5729-9e60-43d9-b1a4-fa8fb6e0e619 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:35:49 compute-0 nova_compute[192810]: 2025-09-30 21:35:49.782 2 INFO nova.virt.libvirt.driver [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Creating config drive at /var/lib/nova/instances/128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk.config.rescue
Sep 30 21:35:49 compute-0 nova_compute[192810]: 2025-09-30 21:35:49.789 2 DEBUG oslo_concurrency.processutils [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0f8x_1gl execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:35:49 compute-0 nova_compute[192810]: 2025-09-30 21:35:49.914 2 DEBUG oslo_concurrency.processutils [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0f8x_1gl" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:35:49 compute-0 kernel: tapb275e55b-7c: entered promiscuous mode
Sep 30 21:35:49 compute-0 NetworkManager[51733]: <info>  [1759268149.9787] manager: (tapb275e55b-7c): new Tun device (/org/freedesktop/NetworkManager/Devices/180)
Sep 30 21:35:49 compute-0 ovn_controller[94912]: 2025-09-30T21:35:49Z|00393|binding|INFO|Claiming lport b275e55b-7c62-4b31-a65b-129241b9139d for this chassis.
Sep 30 21:35:49 compute-0 ovn_controller[94912]: 2025-09-30T21:35:49Z|00394|binding|INFO|b275e55b-7c62-4b31-a65b-129241b9139d: Claiming fa:16:3e:86:d7:f5 10.100.0.10
Sep 30 21:35:49 compute-0 nova_compute[192810]: 2025-09-30 21:35:49.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:49 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:49.991 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:86:d7:f5 10.100.0.10'], port_security=['fa:16:3e:86:d7:f5 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8978d2df88a5434c8794b659033cca5e', 'neutron:revision_number': '5', 'neutron:security_group_ids': '93b1b45c-82db-437e-88d0-4d5f76771b04', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e6b1a10b-a890-44de-9d6b-4b24b7ba0344, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=b275e55b-7c62-4b31-a65b-129241b9139d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:35:49 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:49.992 103867 INFO neutron.agent.ovn.metadata.agent [-] Port b275e55b-7c62-4b31-a65b-129241b9139d in datapath f5a6396a-b7b7-4ff1-a2af-27477fea2815 bound to our chassis
Sep 30 21:35:49 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:49.993 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f5a6396a-b7b7-4ff1-a2af-27477fea2815
Sep 30 21:35:49 compute-0 ovn_controller[94912]: 2025-09-30T21:35:49Z|00395|binding|INFO|Setting lport b275e55b-7c62-4b31-a65b-129241b9139d ovn-installed in OVS
Sep 30 21:35:49 compute-0 ovn_controller[94912]: 2025-09-30T21:35:49Z|00396|binding|INFO|Setting lport b275e55b-7c62-4b31-a65b-129241b9139d up in Southbound
Sep 30 21:35:50 compute-0 nova_compute[192810]: 2025-09-30 21:35:49.999 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:50 compute-0 nova_compute[192810]: 2025-09-30 21:35:50.003 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:50 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:50.009 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[7a725f26-597e-4474-b717-4b3711f5ece4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:50 compute-0 systemd-udevd[236248]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:35:50 compute-0 systemd-machined[152794]: New machine qemu-49-instance-00000066.
Sep 30 21:35:50 compute-0 NetworkManager[51733]: <info>  [1759268150.0208] device (tapb275e55b-7c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:35:50 compute-0 NetworkManager[51733]: <info>  [1759268150.0223] device (tapb275e55b-7c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:35:50 compute-0 systemd[1]: Started Virtual Machine qemu-49-instance-00000066.
Sep 30 21:35:50 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:50.037 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[cadb7c9f-60dc-4f7f-bf98-52b3ece93789]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:50 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:50.040 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[3717e6a8-5cbd-456e-ba78-ab12f46ed85e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:50 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:50.069 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[4c44373a-aeb3-46ef-b4f1-6b8269e76a9f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:50 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:50.088 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[ed35376b-39f7-409a-b0cc-af14765175a1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf5a6396a-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:66:d5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 916, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 916, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 113], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 480957, 'reachable_time': 18506, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236260, 'error': None, 'target': 'ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:50 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:50.108 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[d7a218ae-f7ab-4412-8efc-c7e5897f9e81]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf5a6396a-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 480967, 'tstamp': 480967}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236262, 'error': None, 'target': 'ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf5a6396a-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 480970, 'tstamp': 480970}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236262, 'error': None, 'target': 'ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:50 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:50.110 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf5a6396a-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:35:50 compute-0 nova_compute[192810]: 2025-09-30 21:35:50.112 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:50 compute-0 nova_compute[192810]: 2025-09-30 21:35:50.114 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:50 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:50.115 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf5a6396a-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:35:50 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:50.115 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:35:50 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:50.116 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf5a6396a-b0, col_values=(('external_ids', {'iface-id': '10034ee7-d74d-45f3-b835-201b62e1bcd6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:35:50 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:50.116 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:35:50 compute-0 sshd[128205]: Timeout before authentication for connection from 113.240.110.90 to 38.102.83.69, pid = 234344
Sep 30 21:35:50 compute-0 nova_compute[192810]: 2025-09-30 21:35:50.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:35:50 compute-0 nova_compute[192810]: 2025-09-30 21:35:50.793 2 DEBUG nova.virt.libvirt.host [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Removed pending event for 128d5729-9e60-43d9-b1a4-fa8fb6e0e619 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Sep 30 21:35:50 compute-0 nova_compute[192810]: 2025-09-30 21:35:50.794 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268150.7928643, 128d5729-9e60-43d9-b1a4-fa8fb6e0e619 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:35:50 compute-0 nova_compute[192810]: 2025-09-30 21:35:50.795 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] VM Resumed (Lifecycle Event)
Sep 30 21:35:50 compute-0 nova_compute[192810]: 2025-09-30 21:35:50.821 2 DEBUG nova.compute.manager [None req-4617c2ef-c756-4da6-b835-d0417bd40a37 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:35:50 compute-0 nova_compute[192810]: 2025-09-30 21:35:50.824 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:35:50 compute-0 nova_compute[192810]: 2025-09-30 21:35:50.833 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:35:50 compute-0 nova_compute[192810]: 2025-09-30 21:35:50.887 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] During sync_power_state the instance has a pending task (rescuing). Skip.
Sep 30 21:35:50 compute-0 nova_compute[192810]: 2025-09-30 21:35:50.887 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268150.794454, 128d5729-9e60-43d9-b1a4-fa8fb6e0e619 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:35:50 compute-0 nova_compute[192810]: 2025-09-30 21:35:50.887 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] VM Started (Lifecycle Event)
Sep 30 21:35:50 compute-0 nova_compute[192810]: 2025-09-30 21:35:50.918 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:35:50 compute-0 nova_compute[192810]: 2025-09-30 21:35:50.923 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:35:52 compute-0 nova_compute[192810]: 2025-09-30 21:35:52.605 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:52 compute-0 nova_compute[192810]: 2025-09-30 21:35:52.787 2 INFO nova.compute.manager [None req-b81d218a-5141-4c1d-bd91-9f057d93f759 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Unrescuing
Sep 30 21:35:52 compute-0 nova_compute[192810]: 2025-09-30 21:35:52.788 2 DEBUG oslo_concurrency.lockutils [None req-b81d218a-5141-4c1d-bd91-9f057d93f759 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquiring lock "refresh_cache-128d5729-9e60-43d9-b1a4-fa8fb6e0e619" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:35:52 compute-0 nova_compute[192810]: 2025-09-30 21:35:52.788 2 DEBUG oslo_concurrency.lockutils [None req-b81d218a-5141-4c1d-bd91-9f057d93f759 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquired lock "refresh_cache-128d5729-9e60-43d9-b1a4-fa8fb6e0e619" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:35:52 compute-0 nova_compute[192810]: 2025-09-30 21:35:52.788 2 DEBUG nova.network.neutron [None req-b81d218a-5141-4c1d-bd91-9f057d93f759 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:35:53 compute-0 nova_compute[192810]: 2025-09-30 21:35:53.783 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:35:53 compute-0 nova_compute[192810]: 2025-09-30 21:35:53.786 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:35:53 compute-0 nova_compute[192810]: 2025-09-30 21:35:53.787 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:35:53 compute-0 nova_compute[192810]: 2025-09-30 21:35:53.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:35:53 compute-0 nova_compute[192810]: 2025-09-30 21:35:53.814 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:53 compute-0 nova_compute[192810]: 2025-09-30 21:35:53.814 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:53 compute-0 nova_compute[192810]: 2025-09-30 21:35:53.814 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:53 compute-0 nova_compute[192810]: 2025-09-30 21:35:53.814 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:35:53 compute-0 nova_compute[192810]: 2025-09-30 21:35:53.863 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:53 compute-0 nova_compute[192810]: 2025-09-30 21:35:53.900 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:35:53 compute-0 nova_compute[192810]: 2025-09-30 21:35:53.958 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:35:53 compute-0 nova_compute[192810]: 2025-09-30 21:35:53.959 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:35:54 compute-0 nova_compute[192810]: 2025-09-30 21:35:54.018 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:35:54 compute-0 nova_compute[192810]: 2025-09-30 21:35:54.019 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk.rescue --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:35:54 compute-0 nova_compute[192810]: 2025-09-30 21:35:54.073 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk.rescue --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:35:54 compute-0 nova_compute[192810]: 2025-09-30 21:35:54.074 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk.rescue --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:35:54 compute-0 nova_compute[192810]: 2025-09-30 21:35:54.126 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk.rescue --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:35:54 compute-0 nova_compute[192810]: 2025-09-30 21:35:54.131 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4255e358-6db2-4947-a3d6-4e045f9235fb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:35:54 compute-0 nova_compute[192810]: 2025-09-30 21:35:54.183 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4255e358-6db2-4947-a3d6-4e045f9235fb/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:35:54 compute-0 nova_compute[192810]: 2025-09-30 21:35:54.185 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4255e358-6db2-4947-a3d6-4e045f9235fb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:35:54 compute-0 nova_compute[192810]: 2025-09-30 21:35:54.244 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4255e358-6db2-4947-a3d6-4e045f9235fb/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:35:54 compute-0 nova_compute[192810]: 2025-09-30 21:35:54.413 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:35:54 compute-0 nova_compute[192810]: 2025-09-30 21:35:54.415 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5517MB free_disk=73.1916618347168GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:35:54 compute-0 nova_compute[192810]: 2025-09-30 21:35:54.415 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:54 compute-0 nova_compute[192810]: 2025-09-30 21:35:54.416 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:54 compute-0 nova_compute[192810]: 2025-09-30 21:35:54.508 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Instance 4255e358-6db2-4947-a3d6-4e045f9235fb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:35:54 compute-0 nova_compute[192810]: 2025-09-30 21:35:54.509 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Instance 128d5729-9e60-43d9-b1a4-fa8fb6e0e619 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:35:54 compute-0 nova_compute[192810]: 2025-09-30 21:35:54.509 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:35:54 compute-0 nova_compute[192810]: 2025-09-30 21:35:54.509 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:35:54 compute-0 nova_compute[192810]: 2025-09-30 21:35:54.567 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:35:54 compute-0 nova_compute[192810]: 2025-09-30 21:35:54.579 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:35:54 compute-0 nova_compute[192810]: 2025-09-30 21:35:54.599 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:35:54 compute-0 nova_compute[192810]: 2025-09-30 21:35:54.600 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.184s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:55 compute-0 nova_compute[192810]: 2025-09-30 21:35:55.311 2 DEBUG nova.compute.manager [req-c6cba577-d6de-4a70-933c-155dc2b40dc6 req-dd5e8614-bc79-4357-9588-294a34c1b00b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Received event network-vif-plugged-b275e55b-7c62-4b31-a65b-129241b9139d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:35:55 compute-0 nova_compute[192810]: 2025-09-30 21:35:55.311 2 DEBUG oslo_concurrency.lockutils [req-c6cba577-d6de-4a70-933c-155dc2b40dc6 req-dd5e8614-bc79-4357-9588-294a34c1b00b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "128d5729-9e60-43d9-b1a4-fa8fb6e0e619-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:55 compute-0 nova_compute[192810]: 2025-09-30 21:35:55.311 2 DEBUG oslo_concurrency.lockutils [req-c6cba577-d6de-4a70-933c-155dc2b40dc6 req-dd5e8614-bc79-4357-9588-294a34c1b00b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128d5729-9e60-43d9-b1a4-fa8fb6e0e619-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:55 compute-0 nova_compute[192810]: 2025-09-30 21:35:55.312 2 DEBUG oslo_concurrency.lockutils [req-c6cba577-d6de-4a70-933c-155dc2b40dc6 req-dd5e8614-bc79-4357-9588-294a34c1b00b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128d5729-9e60-43d9-b1a4-fa8fb6e0e619-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:55 compute-0 nova_compute[192810]: 2025-09-30 21:35:55.312 2 DEBUG nova.compute.manager [req-c6cba577-d6de-4a70-933c-155dc2b40dc6 req-dd5e8614-bc79-4357-9588-294a34c1b00b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] No waiting events found dispatching network-vif-plugged-b275e55b-7c62-4b31-a65b-129241b9139d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:35:55 compute-0 nova_compute[192810]: 2025-09-30 21:35:55.312 2 WARNING nova.compute.manager [req-c6cba577-d6de-4a70-933c-155dc2b40dc6 req-dd5e8614-bc79-4357-9588-294a34c1b00b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Received unexpected event network-vif-plugged-b275e55b-7c62-4b31-a65b-129241b9139d for instance with vm_state rescued and task_state unrescuing.
Sep 30 21:35:55 compute-0 podman[236290]: 2025-09-30 21:35:55.320505771 +0000 UTC m=+0.056054782 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:35:55 compute-0 podman[236291]: 2025-09-30 21:35:55.327674437 +0000 UTC m=+0.061894746 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 21:35:55 compute-0 podman[236289]: 2025-09-30 21:35:55.375242571 +0000 UTC m=+0.104311684 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0)
Sep 30 21:35:55 compute-0 nova_compute[192810]: 2025-09-30 21:35:55.600 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:35:55 compute-0 nova_compute[192810]: 2025-09-30 21:35:55.601 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:35:55 compute-0 nova_compute[192810]: 2025-09-30 21:35:55.911 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "refresh_cache-4255e358-6db2-4947-a3d6-4e045f9235fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:35:55 compute-0 nova_compute[192810]: 2025-09-30 21:35:55.912 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquired lock "refresh_cache-4255e358-6db2-4947-a3d6-4e045f9235fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:35:55 compute-0 nova_compute[192810]: 2025-09-30 21:35:55.912 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Sep 30 21:35:55 compute-0 nova_compute[192810]: 2025-09-30 21:35:55.937 2 DEBUG nova.network.neutron [None req-b81d218a-5141-4c1d-bd91-9f057d93f759 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Updating instance_info_cache with network_info: [{"id": "b275e55b-7c62-4b31-a65b-129241b9139d", "address": "fa:16:3e:86:d7:f5", "network": {"id": "f5a6396a-b7b7-4ff1-a2af-27477fea2815", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8978d2df88a5434c8794b659033cca5e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb275e55b-7c", "ovs_interfaceid": "b275e55b-7c62-4b31-a65b-129241b9139d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:35:55 compute-0 nova_compute[192810]: 2025-09-30 21:35:55.971 2 DEBUG oslo_concurrency.lockutils [None req-b81d218a-5141-4c1d-bd91-9f057d93f759 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Releasing lock "refresh_cache-128d5729-9e60-43d9-b1a4-fa8fb6e0e619" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:35:55 compute-0 nova_compute[192810]: 2025-09-30 21:35:55.971 2 DEBUG nova.objects.instance [None req-b81d218a-5141-4c1d-bd91-9f057d93f759 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lazy-loading 'flavor' on Instance uuid 128d5729-9e60-43d9-b1a4-fa8fb6e0e619 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:35:56 compute-0 kernel: tapb275e55b-7c (unregistering): left promiscuous mode
Sep 30 21:35:56 compute-0 NetworkManager[51733]: <info>  [1759268156.0482] device (tapb275e55b-7c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:35:56 compute-0 nova_compute[192810]: 2025-09-30 21:35:56.056 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:56 compute-0 ovn_controller[94912]: 2025-09-30T21:35:56Z|00397|binding|INFO|Releasing lport b275e55b-7c62-4b31-a65b-129241b9139d from this chassis (sb_readonly=0)
Sep 30 21:35:56 compute-0 ovn_controller[94912]: 2025-09-30T21:35:56Z|00398|binding|INFO|Setting lport b275e55b-7c62-4b31-a65b-129241b9139d down in Southbound
Sep 30 21:35:56 compute-0 ovn_controller[94912]: 2025-09-30T21:35:56Z|00399|binding|INFO|Removing iface tapb275e55b-7c ovn-installed in OVS
Sep 30 21:35:56 compute-0 nova_compute[192810]: 2025-09-30 21:35:56.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:56.069 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:86:d7:f5 10.100.0.10'], port_security=['fa:16:3e:86:d7:f5 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8978d2df88a5434c8794b659033cca5e', 'neutron:revision_number': '6', 'neutron:security_group_ids': '93b1b45c-82db-437e-88d0-4d5f76771b04', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e6b1a10b-a890-44de-9d6b-4b24b7ba0344, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=b275e55b-7c62-4b31-a65b-129241b9139d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:35:56 compute-0 nova_compute[192810]: 2025-09-30 21:35:56.069 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:56.071 103867 INFO neutron.agent.ovn.metadata.agent [-] Port b275e55b-7c62-4b31-a65b-129241b9139d in datapath f5a6396a-b7b7-4ff1-a2af-27477fea2815 unbound from our chassis
Sep 30 21:35:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:56.073 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f5a6396a-b7b7-4ff1-a2af-27477fea2815
Sep 30 21:35:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:56.086 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[41b3fc9a-0580-427c-8ae5-15a16b9ba3ab]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:56 compute-0 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d00000066.scope: Deactivated successfully.
Sep 30 21:35:56 compute-0 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d00000066.scope: Consumed 6.016s CPU time.
Sep 30 21:35:56 compute-0 systemd-machined[152794]: Machine qemu-49-instance-00000066 terminated.
Sep 30 21:35:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:56.115 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[8b7f27f1-e114-4dfb-9234-508f7e2d000f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:56.117 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[d73cd54f-1ada-4de7-a6b7-d90d1002dd42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:56.146 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[eb5a7adc-f7f5-465d-9475-559dee2ba0d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:56.166 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[bbe7b509-06e7-4642-adcc-25a47b40d1ac]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf5a6396a-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:66:d5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 11, 'rx_bytes': 916, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 11, 'rx_bytes': 916, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 113], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 480957, 'reachable_time': 18506, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236366, 'error': None, 'target': 'ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:56.180 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[7d575852-2cf7-49a6-9202-4af1fc69e4f8]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf5a6396a-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 480967, 'tstamp': 480967}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236367, 'error': None, 'target': 'ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf5a6396a-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 480970, 'tstamp': 480970}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236367, 'error': None, 'target': 'ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:56.181 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf5a6396a-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:35:56 compute-0 nova_compute[192810]: 2025-09-30 21:35:56.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:56 compute-0 nova_compute[192810]: 2025-09-30 21:35:56.187 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:56.187 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf5a6396a-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:35:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:56.188 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:35:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:56.188 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf5a6396a-b0, col_values=(('external_ids', {'iface-id': '10034ee7-d74d-45f3-b835-201b62e1bcd6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:35:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:56.188 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:35:56 compute-0 nova_compute[192810]: 2025-09-30 21:35:56.324 2 INFO nova.virt.libvirt.driver [-] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Instance destroyed successfully.
Sep 30 21:35:56 compute-0 nova_compute[192810]: 2025-09-30 21:35:56.324 2 DEBUG nova.objects.instance [None req-b81d218a-5141-4c1d-bd91-9f057d93f759 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lazy-loading 'numa_topology' on Instance uuid 128d5729-9e60-43d9-b1a4-fa8fb6e0e619 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:35:56 compute-0 kernel: tapb275e55b-7c: entered promiscuous mode
Sep 30 21:35:56 compute-0 NetworkManager[51733]: <info>  [1759268156.4681] manager: (tapb275e55b-7c): new Tun device (/org/freedesktop/NetworkManager/Devices/181)
Sep 30 21:35:56 compute-0 systemd-udevd[236358]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:35:56 compute-0 ovn_controller[94912]: 2025-09-30T21:35:56Z|00400|binding|INFO|Claiming lport b275e55b-7c62-4b31-a65b-129241b9139d for this chassis.
Sep 30 21:35:56 compute-0 ovn_controller[94912]: 2025-09-30T21:35:56Z|00401|binding|INFO|b275e55b-7c62-4b31-a65b-129241b9139d: Claiming fa:16:3e:86:d7:f5 10.100.0.10
Sep 30 21:35:56 compute-0 nova_compute[192810]: 2025-09-30 21:35:56.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:56 compute-0 NetworkManager[51733]: <info>  [1759268156.4781] device (tapb275e55b-7c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:35:56 compute-0 NetworkManager[51733]: <info>  [1759268156.4792] device (tapb275e55b-7c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:35:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:56.478 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:86:d7:f5 10.100.0.10'], port_security=['fa:16:3e:86:d7:f5 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8978d2df88a5434c8794b659033cca5e', 'neutron:revision_number': '6', 'neutron:security_group_ids': '93b1b45c-82db-437e-88d0-4d5f76771b04', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e6b1a10b-a890-44de-9d6b-4b24b7ba0344, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=b275e55b-7c62-4b31-a65b-129241b9139d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:35:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:56.479 103867 INFO neutron.agent.ovn.metadata.agent [-] Port b275e55b-7c62-4b31-a65b-129241b9139d in datapath f5a6396a-b7b7-4ff1-a2af-27477fea2815 bound to our chassis
Sep 30 21:35:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:56.480 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f5a6396a-b7b7-4ff1-a2af-27477fea2815
Sep 30 21:35:56 compute-0 ovn_controller[94912]: 2025-09-30T21:35:56Z|00402|binding|INFO|Setting lport b275e55b-7c62-4b31-a65b-129241b9139d ovn-installed in OVS
Sep 30 21:35:56 compute-0 ovn_controller[94912]: 2025-09-30T21:35:56Z|00403|binding|INFO|Setting lport b275e55b-7c62-4b31-a65b-129241b9139d up in Southbound
Sep 30 21:35:56 compute-0 nova_compute[192810]: 2025-09-30 21:35:56.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:56 compute-0 nova_compute[192810]: 2025-09-30 21:35:56.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:56.496 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[270ad970-2aba-47c0-8ccc-3f6d5ed6e2cf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:56 compute-0 systemd-machined[152794]: New machine qemu-50-instance-00000066.
Sep 30 21:35:56 compute-0 systemd[1]: Started Virtual Machine qemu-50-instance-00000066.
Sep 30 21:35:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:56.529 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[0e0a2bce-5b5a-4de3-a891-2d0c55d80375]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:56.533 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[b0dab745-0d73-4ce0-916c-7d464ba2cb15]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:56.573 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[a9b95d48-236e-4483-a9de-027d938ad031]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:56.597 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[80fea88f-4651-4ee7-8e7b-a192b06dd2e9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf5a6396a-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:66:d5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 13, 'rx_bytes': 916, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 13, 'rx_bytes': 916, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 113], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 480957, 'reachable_time': 18506, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236416, 'error': None, 'target': 'ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:56.618 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[4c182c1c-ed66-4e87-ad6b-e9734e188070]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf5a6396a-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 480967, 'tstamp': 480967}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236418, 'error': None, 'target': 'ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf5a6396a-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 480970, 'tstamp': 480970}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236418, 'error': None, 'target': 'ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:35:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:56.620 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf5a6396a-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:35:56 compute-0 nova_compute[192810]: 2025-09-30 21:35:56.622 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:56 compute-0 nova_compute[192810]: 2025-09-30 21:35:56.623 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:56.625 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf5a6396a-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:35:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:56.625 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:35:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:56.625 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf5a6396a-b0, col_values=(('external_ids', {'iface-id': '10034ee7-d74d-45f3-b835-201b62e1bcd6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:35:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:35:56.626 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:35:57 compute-0 nova_compute[192810]: 2025-09-30 21:35:57.537 2 DEBUG nova.compute.manager [req-296f44bc-01e2-4dac-9c79-4ca833a0de8a req-ea603fff-1d2e-49f8-b242-c29fc6ecff9a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Received event network-vif-plugged-b275e55b-7c62-4b31-a65b-129241b9139d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:35:57 compute-0 nova_compute[192810]: 2025-09-30 21:35:57.538 2 DEBUG oslo_concurrency.lockutils [req-296f44bc-01e2-4dac-9c79-4ca833a0de8a req-ea603fff-1d2e-49f8-b242-c29fc6ecff9a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "128d5729-9e60-43d9-b1a4-fa8fb6e0e619-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:57 compute-0 nova_compute[192810]: 2025-09-30 21:35:57.538 2 DEBUG oslo_concurrency.lockutils [req-296f44bc-01e2-4dac-9c79-4ca833a0de8a req-ea603fff-1d2e-49f8-b242-c29fc6ecff9a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128d5729-9e60-43d9-b1a4-fa8fb6e0e619-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:57 compute-0 nova_compute[192810]: 2025-09-30 21:35:57.539 2 DEBUG oslo_concurrency.lockutils [req-296f44bc-01e2-4dac-9c79-4ca833a0de8a req-ea603fff-1d2e-49f8-b242-c29fc6ecff9a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128d5729-9e60-43d9-b1a4-fa8fb6e0e619-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:57 compute-0 nova_compute[192810]: 2025-09-30 21:35:57.539 2 DEBUG nova.compute.manager [req-296f44bc-01e2-4dac-9c79-4ca833a0de8a req-ea603fff-1d2e-49f8-b242-c29fc6ecff9a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] No waiting events found dispatching network-vif-plugged-b275e55b-7c62-4b31-a65b-129241b9139d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:35:57 compute-0 nova_compute[192810]: 2025-09-30 21:35:57.539 2 WARNING nova.compute.manager [req-296f44bc-01e2-4dac-9c79-4ca833a0de8a req-ea603fff-1d2e-49f8-b242-c29fc6ecff9a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Received unexpected event network-vif-plugged-b275e55b-7c62-4b31-a65b-129241b9139d for instance with vm_state rescued and task_state unrescuing.
Sep 30 21:35:57 compute-0 nova_compute[192810]: 2025-09-30 21:35:57.539 2 DEBUG nova.compute.manager [req-296f44bc-01e2-4dac-9c79-4ca833a0de8a req-ea603fff-1d2e-49f8-b242-c29fc6ecff9a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Received event network-vif-unplugged-b275e55b-7c62-4b31-a65b-129241b9139d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:35:57 compute-0 nova_compute[192810]: 2025-09-30 21:35:57.539 2 DEBUG oslo_concurrency.lockutils [req-296f44bc-01e2-4dac-9c79-4ca833a0de8a req-ea603fff-1d2e-49f8-b242-c29fc6ecff9a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "128d5729-9e60-43d9-b1a4-fa8fb6e0e619-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:57 compute-0 nova_compute[192810]: 2025-09-30 21:35:57.540 2 DEBUG oslo_concurrency.lockutils [req-296f44bc-01e2-4dac-9c79-4ca833a0de8a req-ea603fff-1d2e-49f8-b242-c29fc6ecff9a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128d5729-9e60-43d9-b1a4-fa8fb6e0e619-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:57 compute-0 nova_compute[192810]: 2025-09-30 21:35:57.540 2 DEBUG oslo_concurrency.lockutils [req-296f44bc-01e2-4dac-9c79-4ca833a0de8a req-ea603fff-1d2e-49f8-b242-c29fc6ecff9a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128d5729-9e60-43d9-b1a4-fa8fb6e0e619-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:57 compute-0 nova_compute[192810]: 2025-09-30 21:35:57.540 2 DEBUG nova.compute.manager [req-296f44bc-01e2-4dac-9c79-4ca833a0de8a req-ea603fff-1d2e-49f8-b242-c29fc6ecff9a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] No waiting events found dispatching network-vif-unplugged-b275e55b-7c62-4b31-a65b-129241b9139d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:35:57 compute-0 nova_compute[192810]: 2025-09-30 21:35:57.540 2 WARNING nova.compute.manager [req-296f44bc-01e2-4dac-9c79-4ca833a0de8a req-ea603fff-1d2e-49f8-b242-c29fc6ecff9a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Received unexpected event network-vif-unplugged-b275e55b-7c62-4b31-a65b-129241b9139d for instance with vm_state rescued and task_state unrescuing.
Sep 30 21:35:57 compute-0 nova_compute[192810]: 2025-09-30 21:35:57.540 2 DEBUG nova.compute.manager [req-296f44bc-01e2-4dac-9c79-4ca833a0de8a req-ea603fff-1d2e-49f8-b242-c29fc6ecff9a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Received event network-vif-plugged-b275e55b-7c62-4b31-a65b-129241b9139d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:35:57 compute-0 nova_compute[192810]: 2025-09-30 21:35:57.541 2 DEBUG oslo_concurrency.lockutils [req-296f44bc-01e2-4dac-9c79-4ca833a0de8a req-ea603fff-1d2e-49f8-b242-c29fc6ecff9a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "128d5729-9e60-43d9-b1a4-fa8fb6e0e619-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:57 compute-0 nova_compute[192810]: 2025-09-30 21:35:57.541 2 DEBUG oslo_concurrency.lockutils [req-296f44bc-01e2-4dac-9c79-4ca833a0de8a req-ea603fff-1d2e-49f8-b242-c29fc6ecff9a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128d5729-9e60-43d9-b1a4-fa8fb6e0e619-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:57 compute-0 nova_compute[192810]: 2025-09-30 21:35:57.541 2 DEBUG oslo_concurrency.lockutils [req-296f44bc-01e2-4dac-9c79-4ca833a0de8a req-ea603fff-1d2e-49f8-b242-c29fc6ecff9a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128d5729-9e60-43d9-b1a4-fa8fb6e0e619-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:57 compute-0 nova_compute[192810]: 2025-09-30 21:35:57.541 2 DEBUG nova.compute.manager [req-296f44bc-01e2-4dac-9c79-4ca833a0de8a req-ea603fff-1d2e-49f8-b242-c29fc6ecff9a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] No waiting events found dispatching network-vif-plugged-b275e55b-7c62-4b31-a65b-129241b9139d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:35:57 compute-0 nova_compute[192810]: 2025-09-30 21:35:57.542 2 WARNING nova.compute.manager [req-296f44bc-01e2-4dac-9c79-4ca833a0de8a req-ea603fff-1d2e-49f8-b242-c29fc6ecff9a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Received unexpected event network-vif-plugged-b275e55b-7c62-4b31-a65b-129241b9139d for instance with vm_state rescued and task_state unrescuing.
Sep 30 21:35:57 compute-0 nova_compute[192810]: 2025-09-30 21:35:57.542 2 DEBUG nova.compute.manager [req-296f44bc-01e2-4dac-9c79-4ca833a0de8a req-ea603fff-1d2e-49f8-b242-c29fc6ecff9a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Received event network-vif-plugged-b275e55b-7c62-4b31-a65b-129241b9139d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:35:57 compute-0 nova_compute[192810]: 2025-09-30 21:35:57.542 2 DEBUG oslo_concurrency.lockutils [req-296f44bc-01e2-4dac-9c79-4ca833a0de8a req-ea603fff-1d2e-49f8-b242-c29fc6ecff9a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "128d5729-9e60-43d9-b1a4-fa8fb6e0e619-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:57 compute-0 nova_compute[192810]: 2025-09-30 21:35:57.542 2 DEBUG oslo_concurrency.lockutils [req-296f44bc-01e2-4dac-9c79-4ca833a0de8a req-ea603fff-1d2e-49f8-b242-c29fc6ecff9a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128d5729-9e60-43d9-b1a4-fa8fb6e0e619-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:57 compute-0 nova_compute[192810]: 2025-09-30 21:35:57.542 2 DEBUG oslo_concurrency.lockutils [req-296f44bc-01e2-4dac-9c79-4ca833a0de8a req-ea603fff-1d2e-49f8-b242-c29fc6ecff9a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128d5729-9e60-43d9-b1a4-fa8fb6e0e619-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:57 compute-0 nova_compute[192810]: 2025-09-30 21:35:57.543 2 DEBUG nova.compute.manager [req-296f44bc-01e2-4dac-9c79-4ca833a0de8a req-ea603fff-1d2e-49f8-b242-c29fc6ecff9a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] No waiting events found dispatching network-vif-plugged-b275e55b-7c62-4b31-a65b-129241b9139d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:35:57 compute-0 nova_compute[192810]: 2025-09-30 21:35:57.543 2 WARNING nova.compute.manager [req-296f44bc-01e2-4dac-9c79-4ca833a0de8a req-ea603fff-1d2e-49f8-b242-c29fc6ecff9a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Received unexpected event network-vif-plugged-b275e55b-7c62-4b31-a65b-129241b9139d for instance with vm_state rescued and task_state unrescuing.
Sep 30 21:35:57 compute-0 nova_compute[192810]: 2025-09-30 21:35:57.606 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:57 compute-0 nova_compute[192810]: 2025-09-30 21:35:57.618 2 DEBUG nova.virt.libvirt.host [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Removed pending event for 128d5729-9e60-43d9-b1a4-fa8fb6e0e619 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Sep 30 21:35:57 compute-0 nova_compute[192810]: 2025-09-30 21:35:57.618 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268157.6182842, 128d5729-9e60-43d9-b1a4-fa8fb6e0e619 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:35:57 compute-0 nova_compute[192810]: 2025-09-30 21:35:57.618 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] VM Resumed (Lifecycle Event)
Sep 30 21:35:57 compute-0 nova_compute[192810]: 2025-09-30 21:35:57.621 2 DEBUG nova.compute.manager [None req-b81d218a-5141-4c1d-bd91-9f057d93f759 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:35:57 compute-0 nova_compute[192810]: 2025-09-30 21:35:57.644 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:35:57 compute-0 nova_compute[192810]: 2025-09-30 21:35:57.647 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:35:57 compute-0 nova_compute[192810]: 2025-09-30 21:35:57.678 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] During sync_power_state the instance has a pending task (unrescuing). Skip.
Sep 30 21:35:57 compute-0 nova_compute[192810]: 2025-09-30 21:35:57.679 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268157.6203291, 128d5729-9e60-43d9-b1a4-fa8fb6e0e619 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:35:57 compute-0 nova_compute[192810]: 2025-09-30 21:35:57.679 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] VM Started (Lifecycle Event)
Sep 30 21:35:57 compute-0 nova_compute[192810]: 2025-09-30 21:35:57.700 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:35:57 compute-0 nova_compute[192810]: 2025-09-30 21:35:57.703 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:35:57 compute-0 nova_compute[192810]: 2025-09-30 21:35:57.975 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Updating instance_info_cache with network_info: [{"id": "b0db3e99-94f0-42a8-a410-081e33fd0dee", "address": "fa:16:3e:03:b5:6d", "network": {"id": "f5a6396a-b7b7-4ff1-a2af-27477fea2815", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8978d2df88a5434c8794b659033cca5e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0db3e99-94", "ovs_interfaceid": "b0db3e99-94f0-42a8-a410-081e33fd0dee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:35:58 compute-0 nova_compute[192810]: 2025-09-30 21:35:58.015 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Releasing lock "refresh_cache-4255e358-6db2-4947-a3d6-4e045f9235fb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:35:58 compute-0 nova_compute[192810]: 2025-09-30 21:35:58.015 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Sep 30 21:35:58 compute-0 nova_compute[192810]: 2025-09-30 21:35:58.015 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:35:58 compute-0 nova_compute[192810]: 2025-09-30 21:35:58.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:35:58 compute-0 nova_compute[192810]: 2025-09-30 21:35:58.812 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:35:58 compute-0 nova_compute[192810]: 2025-09-30 21:35:58.867 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:35:59 compute-0 nova_compute[192810]: 2025-09-30 21:35:59.734 2 DEBUG nova.compute.manager [req-470b4a1f-bd61-4762-b88b-4e706c009e89 req-38813aaa-4298-4d48-9aea-ba2160ce9dae dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Received event network-vif-plugged-b275e55b-7c62-4b31-a65b-129241b9139d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:35:59 compute-0 nova_compute[192810]: 2025-09-30 21:35:59.735 2 DEBUG oslo_concurrency.lockutils [req-470b4a1f-bd61-4762-b88b-4e706c009e89 req-38813aaa-4298-4d48-9aea-ba2160ce9dae dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "128d5729-9e60-43d9-b1a4-fa8fb6e0e619-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:35:59 compute-0 nova_compute[192810]: 2025-09-30 21:35:59.735 2 DEBUG oslo_concurrency.lockutils [req-470b4a1f-bd61-4762-b88b-4e706c009e89 req-38813aaa-4298-4d48-9aea-ba2160ce9dae dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128d5729-9e60-43d9-b1a4-fa8fb6e0e619-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:35:59 compute-0 nova_compute[192810]: 2025-09-30 21:35:59.736 2 DEBUG oslo_concurrency.lockutils [req-470b4a1f-bd61-4762-b88b-4e706c009e89 req-38813aaa-4298-4d48-9aea-ba2160ce9dae dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128d5729-9e60-43d9-b1a4-fa8fb6e0e619-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:35:59 compute-0 nova_compute[192810]: 2025-09-30 21:35:59.737 2 DEBUG nova.compute.manager [req-470b4a1f-bd61-4762-b88b-4e706c009e89 req-38813aaa-4298-4d48-9aea-ba2160ce9dae dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] No waiting events found dispatching network-vif-plugged-b275e55b-7c62-4b31-a65b-129241b9139d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:35:59 compute-0 nova_compute[192810]: 2025-09-30 21:35:59.737 2 WARNING nova.compute.manager [req-470b4a1f-bd61-4762-b88b-4e706c009e89 req-38813aaa-4298-4d48-9aea-ba2160ce9dae dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Received unexpected event network-vif-plugged-b275e55b-7c62-4b31-a65b-129241b9139d for instance with vm_state active and task_state None.
Sep 30 21:36:02 compute-0 nova_compute[192810]: 2025-09-30 21:36:02.607 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:03 compute-0 sshd[128205]: drop connection #0 from [113.240.110.90]:49876 on [38.102.83.69]:22 penalty: exceeded LoginGraceTime
Sep 30 21:36:03 compute-0 nova_compute[192810]: 2025-09-30 21:36:03.870 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:05 compute-0 nova_compute[192810]: 2025-09-30 21:36:05.539 2 DEBUG oslo_concurrency.lockutils [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Acquiring lock "fc7fafcc-8033-4e17-b63e-a79fced7ec46" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:05 compute-0 nova_compute[192810]: 2025-09-30 21:36:05.540 2 DEBUG oslo_concurrency.lockutils [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lock "fc7fafcc-8033-4e17-b63e-a79fced7ec46" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:05 compute-0 nova_compute[192810]: 2025-09-30 21:36:05.566 2 DEBUG nova.compute.manager [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:36:05 compute-0 nova_compute[192810]: 2025-09-30 21:36:05.745 2 DEBUG oslo_concurrency.lockutils [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:05 compute-0 nova_compute[192810]: 2025-09-30 21:36:05.746 2 DEBUG oslo_concurrency.lockutils [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:05 compute-0 nova_compute[192810]: 2025-09-30 21:36:05.753 2 DEBUG nova.virt.hardware [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:36:05 compute-0 nova_compute[192810]: 2025-09-30 21:36:05.754 2 INFO nova.compute.claims [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:36:05 compute-0 nova_compute[192810]: 2025-09-30 21:36:05.992 2 DEBUG nova.compute.provider_tree [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:36:06 compute-0 nova_compute[192810]: 2025-09-30 21:36:06.031 2 DEBUG nova.scheduler.client.report [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:36:06 compute-0 nova_compute[192810]: 2025-09-30 21:36:06.067 2 DEBUG oslo_concurrency.lockutils [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.320s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:06 compute-0 nova_compute[192810]: 2025-09-30 21:36:06.067 2 DEBUG nova.compute.manager [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:36:06 compute-0 nova_compute[192810]: 2025-09-30 21:36:06.219 2 DEBUG nova.compute.manager [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:36:06 compute-0 nova_compute[192810]: 2025-09-30 21:36:06.219 2 DEBUG nova.network.neutron [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:36:06 compute-0 nova_compute[192810]: 2025-09-30 21:36:06.234 2 INFO nova.virt.libvirt.driver [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:36:06 compute-0 nova_compute[192810]: 2025-09-30 21:36:06.280 2 DEBUG nova.compute.manager [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:36:06 compute-0 nova_compute[192810]: 2025-09-30 21:36:06.595 2 DEBUG nova.compute.manager [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:36:06 compute-0 nova_compute[192810]: 2025-09-30 21:36:06.597 2 DEBUG nova.virt.libvirt.driver [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:36:06 compute-0 nova_compute[192810]: 2025-09-30 21:36:06.598 2 INFO nova.virt.libvirt.driver [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Creating image(s)
Sep 30 21:36:06 compute-0 nova_compute[192810]: 2025-09-30 21:36:06.599 2 DEBUG oslo_concurrency.lockutils [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Acquiring lock "/var/lib/nova/instances/fc7fafcc-8033-4e17-b63e-a79fced7ec46/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:06 compute-0 nova_compute[192810]: 2025-09-30 21:36:06.599 2 DEBUG oslo_concurrency.lockutils [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lock "/var/lib/nova/instances/fc7fafcc-8033-4e17-b63e-a79fced7ec46/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:06 compute-0 nova_compute[192810]: 2025-09-30 21:36:06.600 2 DEBUG oslo_concurrency.lockutils [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lock "/var/lib/nova/instances/fc7fafcc-8033-4e17-b63e-a79fced7ec46/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:06 compute-0 nova_compute[192810]: 2025-09-30 21:36:06.611 2 DEBUG oslo_concurrency.processutils [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:36:06 compute-0 nova_compute[192810]: 2025-09-30 21:36:06.663 2 DEBUG oslo_concurrency.processutils [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:36:06 compute-0 nova_compute[192810]: 2025-09-30 21:36:06.664 2 DEBUG oslo_concurrency.lockutils [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Acquiring lock "d794a27f8e0bfa3eee9759fbfddd316a7671c61e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:06 compute-0 nova_compute[192810]: 2025-09-30 21:36:06.665 2 DEBUG oslo_concurrency.lockutils [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lock "d794a27f8e0bfa3eee9759fbfddd316a7671c61e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:06 compute-0 nova_compute[192810]: 2025-09-30 21:36:06.678 2 DEBUG oslo_concurrency.processutils [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:36:06 compute-0 nova_compute[192810]: 2025-09-30 21:36:06.734 2 DEBUG oslo_concurrency.processutils [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:36:06 compute-0 nova_compute[192810]: 2025-09-30 21:36:06.735 2 DEBUG oslo_concurrency.processutils [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e,backing_fmt=raw /var/lib/nova/instances/fc7fafcc-8033-4e17-b63e-a79fced7ec46/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:36:06 compute-0 nova_compute[192810]: 2025-09-30 21:36:06.753 2 DEBUG nova.policy [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:36:06 compute-0 nova_compute[192810]: 2025-09-30 21:36:06.769 2 DEBUG oslo_concurrency.processutils [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e,backing_fmt=raw /var/lib/nova/instances/fc7fafcc-8033-4e17-b63e-a79fced7ec46/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:36:06 compute-0 nova_compute[192810]: 2025-09-30 21:36:06.770 2 DEBUG oslo_concurrency.lockutils [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lock "d794a27f8e0bfa3eee9759fbfddd316a7671c61e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:06 compute-0 nova_compute[192810]: 2025-09-30 21:36:06.770 2 DEBUG oslo_concurrency.processutils [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:36:06 compute-0 nova_compute[192810]: 2025-09-30 21:36:06.824 2 DEBUG oslo_concurrency.processutils [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:36:06 compute-0 nova_compute[192810]: 2025-09-30 21:36:06.825 2 DEBUG nova.virt.disk.api [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Checking if we can resize image /var/lib/nova/instances/fc7fafcc-8033-4e17-b63e-a79fced7ec46/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:36:06 compute-0 nova_compute[192810]: 2025-09-30 21:36:06.826 2 DEBUG oslo_concurrency.processutils [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fc7fafcc-8033-4e17-b63e-a79fced7ec46/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:36:06 compute-0 sshd-session[236427]: Invalid user kid from 45.81.23.80 port 52414
Sep 30 21:36:06 compute-0 sshd-session[236427]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:36:06 compute-0 sshd-session[236427]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=45.81.23.80
Sep 30 21:36:06 compute-0 nova_compute[192810]: 2025-09-30 21:36:06.888 2 DEBUG oslo_concurrency.processutils [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fc7fafcc-8033-4e17-b63e-a79fced7ec46/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:36:06 compute-0 nova_compute[192810]: 2025-09-30 21:36:06.889 2 DEBUG nova.virt.disk.api [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Cannot resize image /var/lib/nova/instances/fc7fafcc-8033-4e17-b63e-a79fced7ec46/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:36:06 compute-0 nova_compute[192810]: 2025-09-30 21:36:06.889 2 DEBUG nova.objects.instance [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lazy-loading 'migration_context' on Instance uuid fc7fafcc-8033-4e17-b63e-a79fced7ec46 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:36:06 compute-0 podman[236439]: 2025-09-30 21:36:06.905621829 +0000 UTC m=+0.060527962 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:36:06 compute-0 podman[236443]: 2025-09-30 21:36:06.905892555 +0000 UTC m=+0.057093768 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, container_name=openstack_network_exporter, io.buildah.version=1.33.7, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., name=ubi9-minimal, version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9)
Sep 30 21:36:06 compute-0 nova_compute[192810]: 2025-09-30 21:36:06.912 2 DEBUG nova.virt.libvirt.driver [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:36:06 compute-0 nova_compute[192810]: 2025-09-30 21:36:06.912 2 DEBUG nova.virt.libvirt.driver [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Ensure instance console log exists: /var/lib/nova/instances/fc7fafcc-8033-4e17-b63e-a79fced7ec46/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:36:06 compute-0 nova_compute[192810]: 2025-09-30 21:36:06.912 2 DEBUG oslo_concurrency.lockutils [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:06 compute-0 nova_compute[192810]: 2025-09-30 21:36:06.913 2 DEBUG oslo_concurrency.lockutils [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:06 compute-0 nova_compute[192810]: 2025-09-30 21:36:06.913 2 DEBUG oslo_concurrency.lockutils [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:07 compute-0 nova_compute[192810]: 2025-09-30 21:36:07.608 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:08 compute-0 nova_compute[192810]: 2025-09-30 21:36:08.354 2 DEBUG nova.network.neutron [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Successfully created port: e0da584f-ea7a-4ac9-a63b-aa5731a67762 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:36:08 compute-0 sshd-session[236427]: Failed password for invalid user kid from 45.81.23.80 port 52414 ssh2
Sep 30 21:36:08 compute-0 nova_compute[192810]: 2025-09-30 21:36:08.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:08 compute-0 ovn_controller[94912]: 2025-09-30T21:36:08Z|00044|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:86:d7:f5 10.100.0.10
Sep 30 21:36:08 compute-0 ovn_controller[94912]: 2025-09-30T21:36:08Z|00045|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:86:d7:f5 10.100.0.10
Sep 30 21:36:09 compute-0 sshd-session[236427]: Received disconnect from 45.81.23.80 port 52414:11: Bye Bye [preauth]
Sep 30 21:36:09 compute-0 sshd-session[236427]: Disconnected from invalid user kid 45.81.23.80 port 52414 [preauth]
Sep 30 21:36:09 compute-0 nova_compute[192810]: 2025-09-30 21:36:09.593 2 DEBUG nova.network.neutron [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Successfully updated port: e0da584f-ea7a-4ac9-a63b-aa5731a67762 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:36:09 compute-0 nova_compute[192810]: 2025-09-30 21:36:09.644 2 DEBUG oslo_concurrency.lockutils [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Acquiring lock "refresh_cache-fc7fafcc-8033-4e17-b63e-a79fced7ec46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:36:09 compute-0 nova_compute[192810]: 2025-09-30 21:36:09.644 2 DEBUG oslo_concurrency.lockutils [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Acquired lock "refresh_cache-fc7fafcc-8033-4e17-b63e-a79fced7ec46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:36:09 compute-0 nova_compute[192810]: 2025-09-30 21:36:09.645 2 DEBUG nova.network.neutron [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:36:09 compute-0 nova_compute[192810]: 2025-09-30 21:36:09.736 2 DEBUG nova.compute.manager [req-323fb957-328b-4c80-b4fc-3a35cc37b033 req-55ce33ba-7cd6-4d6a-ae8f-65800dd2adad dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Received event network-changed-e0da584f-ea7a-4ac9-a63b-aa5731a67762 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:36:09 compute-0 nova_compute[192810]: 2025-09-30 21:36:09.736 2 DEBUG nova.compute.manager [req-323fb957-328b-4c80-b4fc-3a35cc37b033 req-55ce33ba-7cd6-4d6a-ae8f-65800dd2adad dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Refreshing instance network info cache due to event network-changed-e0da584f-ea7a-4ac9-a63b-aa5731a67762. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:36:09 compute-0 nova_compute[192810]: 2025-09-30 21:36:09.737 2 DEBUG oslo_concurrency.lockutils [req-323fb957-328b-4c80-b4fc-3a35cc37b033 req-55ce33ba-7cd6-4d6a-ae8f-65800dd2adad dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-fc7fafcc-8033-4e17-b63e-a79fced7ec46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:36:09 compute-0 nova_compute[192810]: 2025-09-30 21:36:09.915 2 DEBUG nova.network.neutron [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:36:11 compute-0 nova_compute[192810]: 2025-09-30 21:36:11.354 2 DEBUG nova.network.neutron [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Updating instance_info_cache with network_info: [{"id": "e0da584f-ea7a-4ac9-a63b-aa5731a67762", "address": "fa:16:3e:6c:d3:b3", "network": {"id": "20111f98-bf7f-4696-b726-3e06c68cfed2", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-2086275832-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "17bd9c2628a94a0b83c4cae3f51b3f7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0da584f-ea", "ovs_interfaceid": "e0da584f-ea7a-4ac9-a63b-aa5731a67762", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:36:11 compute-0 nova_compute[192810]: 2025-09-30 21:36:11.403 2 DEBUG oslo_concurrency.lockutils [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Releasing lock "refresh_cache-fc7fafcc-8033-4e17-b63e-a79fced7ec46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:36:11 compute-0 nova_compute[192810]: 2025-09-30 21:36:11.403 2 DEBUG nova.compute.manager [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Instance network_info: |[{"id": "e0da584f-ea7a-4ac9-a63b-aa5731a67762", "address": "fa:16:3e:6c:d3:b3", "network": {"id": "20111f98-bf7f-4696-b726-3e06c68cfed2", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-2086275832-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "17bd9c2628a94a0b83c4cae3f51b3f7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0da584f-ea", "ovs_interfaceid": "e0da584f-ea7a-4ac9-a63b-aa5731a67762", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:36:11 compute-0 nova_compute[192810]: 2025-09-30 21:36:11.404 2 DEBUG oslo_concurrency.lockutils [req-323fb957-328b-4c80-b4fc-3a35cc37b033 req-55ce33ba-7cd6-4d6a-ae8f-65800dd2adad dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-fc7fafcc-8033-4e17-b63e-a79fced7ec46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:36:11 compute-0 nova_compute[192810]: 2025-09-30 21:36:11.404 2 DEBUG nova.network.neutron [req-323fb957-328b-4c80-b4fc-3a35cc37b033 req-55ce33ba-7cd6-4d6a-ae8f-65800dd2adad dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Refreshing network info cache for port e0da584f-ea7a-4ac9-a63b-aa5731a67762 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:36:11 compute-0 nova_compute[192810]: 2025-09-30 21:36:11.407 2 DEBUG nova.virt.libvirt.driver [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Start _get_guest_xml network_info=[{"id": "e0da584f-ea7a-4ac9-a63b-aa5731a67762", "address": "fa:16:3e:6c:d3:b3", "network": {"id": "20111f98-bf7f-4696-b726-3e06c68cfed2", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-2086275832-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "17bd9c2628a94a0b83c4cae3f51b3f7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0da584f-ea", "ovs_interfaceid": "e0da584f-ea7a-4ac9-a63b-aa5731a67762", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:11Z,direct_url=<?>,disk_format='qcow2',id=29834554-3ec3-4459-bfde-932aa778e979,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:13Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '29834554-3ec3-4459-bfde-932aa778e979'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:36:11 compute-0 nova_compute[192810]: 2025-09-30 21:36:11.413 2 WARNING nova.virt.libvirt.driver [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:36:11 compute-0 nova_compute[192810]: 2025-09-30 21:36:11.418 2 DEBUG nova.virt.libvirt.host [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:36:11 compute-0 nova_compute[192810]: 2025-09-30 21:36:11.419 2 DEBUG nova.virt.libvirt.host [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:36:11 compute-0 nova_compute[192810]: 2025-09-30 21:36:11.429 2 DEBUG nova.virt.libvirt.host [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:36:11 compute-0 nova_compute[192810]: 2025-09-30 21:36:11.430 2 DEBUG nova.virt.libvirt.host [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:36:11 compute-0 nova_compute[192810]: 2025-09-30 21:36:11.431 2 DEBUG nova.virt.libvirt.driver [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:36:11 compute-0 nova_compute[192810]: 2025-09-30 21:36:11.431 2 DEBUG nova.virt.hardware [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:11Z,direct_url=<?>,disk_format='qcow2',id=29834554-3ec3-4459-bfde-932aa778e979,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:13Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:36:11 compute-0 nova_compute[192810]: 2025-09-30 21:36:11.432 2 DEBUG nova.virt.hardware [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:36:11 compute-0 nova_compute[192810]: 2025-09-30 21:36:11.432 2 DEBUG nova.virt.hardware [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:36:11 compute-0 nova_compute[192810]: 2025-09-30 21:36:11.432 2 DEBUG nova.virt.hardware [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:36:11 compute-0 nova_compute[192810]: 2025-09-30 21:36:11.432 2 DEBUG nova.virt.hardware [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:36:11 compute-0 nova_compute[192810]: 2025-09-30 21:36:11.433 2 DEBUG nova.virt.hardware [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:36:11 compute-0 nova_compute[192810]: 2025-09-30 21:36:11.433 2 DEBUG nova.virt.hardware [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:36:11 compute-0 nova_compute[192810]: 2025-09-30 21:36:11.433 2 DEBUG nova.virt.hardware [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:36:11 compute-0 nova_compute[192810]: 2025-09-30 21:36:11.433 2 DEBUG nova.virt.hardware [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:36:11 compute-0 nova_compute[192810]: 2025-09-30 21:36:11.433 2 DEBUG nova.virt.hardware [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:36:11 compute-0 nova_compute[192810]: 2025-09-30 21:36:11.434 2 DEBUG nova.virt.hardware [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:36:11 compute-0 nova_compute[192810]: 2025-09-30 21:36:11.438 2 DEBUG nova.virt.libvirt.vif [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:36:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1640893469',display_name='tempest-ListServerFiltersTestJSON-instance-1640893469',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1640893469',id=107,image_ref='29834554-3ec3-4459-bfde-932aa778e979',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='17bd9c2628a94a0b83c4cae3f51b3f7c',ramdisk_id='',reservation_id='r-gwck4dy6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='29834554-3ec3-4459-bfde-932aa778e979',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-1322408077',owner_user_name='tempest-List
ServerFiltersTestJSON-1322408077-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:36:06Z,user_data=None,user_id='3746d13787f042a1bfad4de0c42015eb',uuid=fc7fafcc-8033-4e17-b63e-a79fced7ec46,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e0da584f-ea7a-4ac9-a63b-aa5731a67762", "address": "fa:16:3e:6c:d3:b3", "network": {"id": "20111f98-bf7f-4696-b726-3e06c68cfed2", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-2086275832-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "17bd9c2628a94a0b83c4cae3f51b3f7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0da584f-ea", "ovs_interfaceid": "e0da584f-ea7a-4ac9-a63b-aa5731a67762", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:36:11 compute-0 nova_compute[192810]: 2025-09-30 21:36:11.438 2 DEBUG nova.network.os_vif_util [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Converting VIF {"id": "e0da584f-ea7a-4ac9-a63b-aa5731a67762", "address": "fa:16:3e:6c:d3:b3", "network": {"id": "20111f98-bf7f-4696-b726-3e06c68cfed2", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-2086275832-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "17bd9c2628a94a0b83c4cae3f51b3f7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0da584f-ea", "ovs_interfaceid": "e0da584f-ea7a-4ac9-a63b-aa5731a67762", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:36:11 compute-0 nova_compute[192810]: 2025-09-30 21:36:11.439 2 DEBUG nova.network.os_vif_util [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6c:d3:b3,bridge_name='br-int',has_traffic_filtering=True,id=e0da584f-ea7a-4ac9-a63b-aa5731a67762,network=Network(20111f98-bf7f-4696-b726-3e06c68cfed2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape0da584f-ea') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:36:11 compute-0 nova_compute[192810]: 2025-09-30 21:36:11.439 2 DEBUG nova.objects.instance [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lazy-loading 'pci_devices' on Instance uuid fc7fafcc-8033-4e17-b63e-a79fced7ec46 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:36:11 compute-0 nova_compute[192810]: 2025-09-30 21:36:11.465 2 DEBUG nova.virt.libvirt.driver [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:36:11 compute-0 nova_compute[192810]:   <uuid>fc7fafcc-8033-4e17-b63e-a79fced7ec46</uuid>
Sep 30 21:36:11 compute-0 nova_compute[192810]:   <name>instance-0000006b</name>
Sep 30 21:36:11 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:36:11 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:36:11 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:36:11 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:       <nova:name>tempest-ListServerFiltersTestJSON-instance-1640893469</nova:name>
Sep 30 21:36:11 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:36:11</nova:creationTime>
Sep 30 21:36:11 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:36:11 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:36:11 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:36:11 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:36:11 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:36:11 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:36:11 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:36:11 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:36:11 compute-0 nova_compute[192810]:         <nova:user uuid="3746d13787f042a1bfad4de0c42015eb">tempest-ListServerFiltersTestJSON-1322408077-project-member</nova:user>
Sep 30 21:36:11 compute-0 nova_compute[192810]:         <nova:project uuid="17bd9c2628a94a0b83c4cae3f51b3f7c">tempest-ListServerFiltersTestJSON-1322408077</nova:project>
Sep 30 21:36:11 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:36:11 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="29834554-3ec3-4459-bfde-932aa778e979"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:36:11 compute-0 nova_compute[192810]:         <nova:port uuid="e0da584f-ea7a-4ac9-a63b-aa5731a67762">
Sep 30 21:36:11 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:36:11 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:36:11 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:36:11 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:36:11 compute-0 nova_compute[192810]:     <system>
Sep 30 21:36:11 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:36:11 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:36:11 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:36:11 compute-0 nova_compute[192810]:       <entry name="serial">fc7fafcc-8033-4e17-b63e-a79fced7ec46</entry>
Sep 30 21:36:11 compute-0 nova_compute[192810]:       <entry name="uuid">fc7fafcc-8033-4e17-b63e-a79fced7ec46</entry>
Sep 30 21:36:11 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     </system>
Sep 30 21:36:11 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:36:11 compute-0 nova_compute[192810]:   <os>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:   </os>
Sep 30 21:36:11 compute-0 nova_compute[192810]:   <features>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:   </features>
Sep 30 21:36:11 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:36:11 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:36:11 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:36:11 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:36:11 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:36:11 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/fc7fafcc-8033-4e17-b63e-a79fced7ec46/disk"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:36:11 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/fc7fafcc-8033-4e17-b63e-a79fced7ec46/disk.config"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:36:11 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:6c:d3:b3"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:       <target dev="tape0da584f-ea"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:36:11 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/fc7fafcc-8033-4e17-b63e-a79fced7ec46/console.log" append="off"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     <video>
Sep 30 21:36:11 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     </video>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:36:11 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:36:11 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:36:11 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:36:11 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:36:11 compute-0 nova_compute[192810]: </domain>
Sep 30 21:36:11 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:36:11 compute-0 nova_compute[192810]: 2025-09-30 21:36:11.467 2 DEBUG nova.compute.manager [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Preparing to wait for external event network-vif-plugged-e0da584f-ea7a-4ac9-a63b-aa5731a67762 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:36:11 compute-0 nova_compute[192810]: 2025-09-30 21:36:11.467 2 DEBUG oslo_concurrency.lockutils [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Acquiring lock "fc7fafcc-8033-4e17-b63e-a79fced7ec46-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:11 compute-0 nova_compute[192810]: 2025-09-30 21:36:11.467 2 DEBUG oslo_concurrency.lockutils [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lock "fc7fafcc-8033-4e17-b63e-a79fced7ec46-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:11 compute-0 nova_compute[192810]: 2025-09-30 21:36:11.468 2 DEBUG oslo_concurrency.lockutils [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lock "fc7fafcc-8033-4e17-b63e-a79fced7ec46-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:11 compute-0 nova_compute[192810]: 2025-09-30 21:36:11.468 2 DEBUG nova.virt.libvirt.vif [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:36:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1640893469',display_name='tempest-ListServerFiltersTestJSON-instance-1640893469',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1640893469',id=107,image_ref='29834554-3ec3-4459-bfde-932aa778e979',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='17bd9c2628a94a0b83c4cae3f51b3f7c',ramdisk_id='',reservation_id='r-gwck4dy6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='29834554-3ec3-4459-bfde-932aa778e979',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-1322408077',owner_user_name='tempest-ListServerFiltersTestJSON-1322408077-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:36:06Z,user_data=None,user_id='3746d13787f042a1bfad4de0c42015eb',uuid=fc7fafcc-8033-4e17-b63e-a79fced7ec46,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e0da584f-ea7a-4ac9-a63b-aa5731a67762", "address": "fa:16:3e:6c:d3:b3", "network": {"id": "20111f98-bf7f-4696-b726-3e06c68cfed2", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-2086275832-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "17bd9c2628a94a0b83c4cae3f51b3f7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0da584f-ea", "ovs_interfaceid": "e0da584f-ea7a-4ac9-a63b-aa5731a67762", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:36:11 compute-0 nova_compute[192810]: 2025-09-30 21:36:11.469 2 DEBUG nova.network.os_vif_util [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Converting VIF {"id": "e0da584f-ea7a-4ac9-a63b-aa5731a67762", "address": "fa:16:3e:6c:d3:b3", "network": {"id": "20111f98-bf7f-4696-b726-3e06c68cfed2", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-2086275832-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "17bd9c2628a94a0b83c4cae3f51b3f7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0da584f-ea", "ovs_interfaceid": "e0da584f-ea7a-4ac9-a63b-aa5731a67762", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:36:11 compute-0 nova_compute[192810]: 2025-09-30 21:36:11.469 2 DEBUG nova.network.os_vif_util [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6c:d3:b3,bridge_name='br-int',has_traffic_filtering=True,id=e0da584f-ea7a-4ac9-a63b-aa5731a67762,network=Network(20111f98-bf7f-4696-b726-3e06c68cfed2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape0da584f-ea') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:36:11 compute-0 nova_compute[192810]: 2025-09-30 21:36:11.470 2 DEBUG os_vif [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6c:d3:b3,bridge_name='br-int',has_traffic_filtering=True,id=e0da584f-ea7a-4ac9-a63b-aa5731a67762,network=Network(20111f98-bf7f-4696-b726-3e06c68cfed2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape0da584f-ea') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:36:11 compute-0 nova_compute[192810]: 2025-09-30 21:36:11.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:11 compute-0 nova_compute[192810]: 2025-09-30 21:36:11.471 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:11 compute-0 nova_compute[192810]: 2025-09-30 21:36:11.471 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:36:11 compute-0 nova_compute[192810]: 2025-09-30 21:36:11.474 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:11 compute-0 nova_compute[192810]: 2025-09-30 21:36:11.474 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape0da584f-ea, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:11 compute-0 nova_compute[192810]: 2025-09-30 21:36:11.474 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape0da584f-ea, col_values=(('external_ids', {'iface-id': 'e0da584f-ea7a-4ac9-a63b-aa5731a67762', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6c:d3:b3', 'vm-uuid': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:11 compute-0 nova_compute[192810]: 2025-09-30 21:36:11.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:11 compute-0 NetworkManager[51733]: <info>  [1759268171.4766] manager: (tape0da584f-ea): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/182)
Sep 30 21:36:11 compute-0 nova_compute[192810]: 2025-09-30 21:36:11.478 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:36:11 compute-0 nova_compute[192810]: 2025-09-30 21:36:11.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:11 compute-0 nova_compute[192810]: 2025-09-30 21:36:11.484 2 INFO os_vif [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6c:d3:b3,bridge_name='br-int',has_traffic_filtering=True,id=e0da584f-ea7a-4ac9-a63b-aa5731a67762,network=Network(20111f98-bf7f-4696-b726-3e06c68cfed2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape0da584f-ea')
Sep 30 21:36:11 compute-0 nova_compute[192810]: 2025-09-30 21:36:11.552 2 DEBUG nova.virt.libvirt.driver [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:36:11 compute-0 nova_compute[192810]: 2025-09-30 21:36:11.553 2 DEBUG nova.virt.libvirt.driver [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:36:11 compute-0 nova_compute[192810]: 2025-09-30 21:36:11.553 2 DEBUG nova.virt.libvirt.driver [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] No VIF found with MAC fa:16:3e:6c:d3:b3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:36:11 compute-0 nova_compute[192810]: 2025-09-30 21:36:11.553 2 INFO nova.virt.libvirt.driver [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Using config drive
Sep 30 21:36:12 compute-0 nova_compute[192810]: 2025-09-30 21:36:12.610 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:12 compute-0 nova_compute[192810]: 2025-09-30 21:36:12.692 2 INFO nova.virt.libvirt.driver [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Creating config drive at /var/lib/nova/instances/fc7fafcc-8033-4e17-b63e-a79fced7ec46/disk.config
Sep 30 21:36:12 compute-0 nova_compute[192810]: 2025-09-30 21:36:12.697 2 DEBUG oslo_concurrency.processutils [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/fc7fafcc-8033-4e17-b63e-a79fced7ec46/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpav9___do execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:36:12 compute-0 nova_compute[192810]: 2025-09-30 21:36:12.823 2 DEBUG oslo_concurrency.processutils [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/fc7fafcc-8033-4e17-b63e-a79fced7ec46/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpav9___do" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:36:12 compute-0 kernel: tape0da584f-ea: entered promiscuous mode
Sep 30 21:36:12 compute-0 NetworkManager[51733]: <info>  [1759268172.8750] manager: (tape0da584f-ea): new Tun device (/org/freedesktop/NetworkManager/Devices/183)
Sep 30 21:36:12 compute-0 ovn_controller[94912]: 2025-09-30T21:36:12Z|00404|binding|INFO|Claiming lport e0da584f-ea7a-4ac9-a63b-aa5731a67762 for this chassis.
Sep 30 21:36:12 compute-0 ovn_controller[94912]: 2025-09-30T21:36:12Z|00405|binding|INFO|e0da584f-ea7a-4ac9-a63b-aa5731a67762: Claiming fa:16:3e:6c:d3:b3 10.100.0.13
Sep 30 21:36:12 compute-0 nova_compute[192810]: 2025-09-30 21:36:12.876 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:12.886 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6c:d3:b3 10.100.0.13'], port_security=['fa:16:3e:6c:d3:b3 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-20111f98-bf7f-4696-b726-3e06c68cfed2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6a3ecea0-9346-40cc-9dbe-25cd68fe08ed', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f4af2c14-351e-4037-bacd-dca3cfee62e9, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=e0da584f-ea7a-4ac9-a63b-aa5731a67762) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:36:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:12.887 103867 INFO neutron.agent.ovn.metadata.agent [-] Port e0da584f-ea7a-4ac9-a63b-aa5731a67762 in datapath 20111f98-bf7f-4696-b726-3e06c68cfed2 bound to our chassis
Sep 30 21:36:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:12.888 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 20111f98-bf7f-4696-b726-3e06c68cfed2
Sep 30 21:36:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:12.901 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[9c543079-f8b4-4110-8a5b-74a76f2d9597]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:12.902 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap20111f98-b1 in ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:36:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:12.903 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap20111f98-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:36:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:12.903 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[f0ced24e-b552-4871-8b51-d9d1153cfe65]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:12.905 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e7127e7a-1557-4fd5-9b5f-3558f2b16a8a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:12 compute-0 systemd-udevd[236514]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:36:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:12.917 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[c687b6ae-ed1a-4e91-8fc3-86cbcf9574c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:12 compute-0 NetworkManager[51733]: <info>  [1759268172.9213] device (tape0da584f-ea): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:36:12 compute-0 NetworkManager[51733]: <info>  [1759268172.9219] device (tape0da584f-ea): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:36:12 compute-0 systemd-machined[152794]: New machine qemu-51-instance-0000006b.
Sep 30 21:36:12 compute-0 nova_compute[192810]: 2025-09-30 21:36:12.929 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:12 compute-0 ovn_controller[94912]: 2025-09-30T21:36:12Z|00406|binding|INFO|Setting lport e0da584f-ea7a-4ac9-a63b-aa5731a67762 ovn-installed in OVS
Sep 30 21:36:12 compute-0 ovn_controller[94912]: 2025-09-30T21:36:12Z|00407|binding|INFO|Setting lport e0da584f-ea7a-4ac9-a63b-aa5731a67762 up in Southbound
Sep 30 21:36:12 compute-0 nova_compute[192810]: 2025-09-30 21:36:12.932 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:12 compute-0 systemd[1]: Started Virtual Machine qemu-51-instance-0000006b.
Sep 30 21:36:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:12.945 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[b8cb5a05-e2f0-4162-9275-406405a2f7a5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:12.976 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[373a47b5-68e1-435e-9ea6-0ebade6c4eab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:12.981 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[8480213c-da5e-4cb2-857e-6430875926c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:12 compute-0 NetworkManager[51733]: <info>  [1759268172.9828] manager: (tap20111f98-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/184)
Sep 30 21:36:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:13.007 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[56c54cf9-711e-46b1-bbf5-609a2fd19543]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:13.012 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[4fc57938-9405-49e0-973b-fb45d14bd4d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:13 compute-0 NetworkManager[51733]: <info>  [1759268173.0337] device (tap20111f98-b0): carrier: link connected
Sep 30 21:36:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:13.040 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[7915986f-43e3-4200-9df0-b2fa498bdd42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:13.062 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[45a14a36-4595-474f-bb2f-801c6aad50d4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap20111f98-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:55:ef:d2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 121], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 486865, 'reachable_time': 23087, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236548, 'error': None, 'target': 'ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:13.076 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[c60633d9-ab0b-41c2-bb01-9e83a64700e3]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe55:efd2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 486865, 'tstamp': 486865}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236549, 'error': None, 'target': 'ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:13.096 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[54d64f20-751f-44a5-bbff-4cf7d98ba421]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap20111f98-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:55:ef:d2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 121], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 486865, 'reachable_time': 23087, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 236550, 'error': None, 'target': 'ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:13.131 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[96a4d51d-6782-4c8c-aa3f-826209a0ca23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:13.202 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[8f7098af-3600-47dc-b9ff-b3f3dbf8446f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:13.203 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap20111f98-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:13.203 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:36:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:13.203 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap20111f98-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:13 compute-0 nova_compute[192810]: 2025-09-30 21:36:13.249 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:13 compute-0 NetworkManager[51733]: <info>  [1759268173.2499] manager: (tap20111f98-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/185)
Sep 30 21:36:13 compute-0 kernel: tap20111f98-b0: entered promiscuous mode
Sep 30 21:36:13 compute-0 nova_compute[192810]: 2025-09-30 21:36:13.253 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:13.254 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap20111f98-b0, col_values=(('external_ids', {'iface-id': '36b3f8fd-6b0e-45c8-9c31-56cd0812c366'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:13 compute-0 nova_compute[192810]: 2025-09-30 21:36:13.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:13 compute-0 ovn_controller[94912]: 2025-09-30T21:36:13Z|00408|binding|INFO|Releasing lport 36b3f8fd-6b0e-45c8-9c31-56cd0812c366 from this chassis (sb_readonly=0)
Sep 30 21:36:13 compute-0 nova_compute[192810]: 2025-09-30 21:36:13.265 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:13.267 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/20111f98-bf7f-4696-b726-3e06c68cfed2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/20111f98-bf7f-4696-b726-3e06c68cfed2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:36:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:13.268 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[2de42074-769b-4df6-a442-3df40b8e45b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:13.269 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:36:13 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:36:13 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:36:13 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-20111f98-bf7f-4696-b726-3e06c68cfed2
Sep 30 21:36:13 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:36:13 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:36:13 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:36:13 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/20111f98-bf7f-4696-b726-3e06c68cfed2.pid.haproxy
Sep 30 21:36:13 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:36:13 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:36:13 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:36:13 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:36:13 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:36:13 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:36:13 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:36:13 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:36:13 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:36:13 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:36:13 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:36:13 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:36:13 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:36:13 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:36:13 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:36:13 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:36:13 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:36:13 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:36:13 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:36:13 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:36:13 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID 20111f98-bf7f-4696-b726-3e06c68cfed2
Sep 30 21:36:13 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:36:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:13.269 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2', 'env', 'PROCESS_TAG=haproxy-20111f98-bf7f-4696-b726-3e06c68cfed2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/20111f98-bf7f-4696-b726-3e06c68cfed2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:36:13 compute-0 podman[236589]: 2025-09-30 21:36:13.597109048 +0000 UTC m=+0.020190475 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:36:13 compute-0 podman[236589]: 2025-09-30 21:36:13.738471767 +0000 UTC m=+0.161553184 container create e7ba9ac7b207cd9454304e660e393f35206cc9378e8f7c83183e4ab12f662e42 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3)
Sep 30 21:36:13 compute-0 nova_compute[192810]: 2025-09-30 21:36:13.790 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268173.7897282, fc7fafcc-8033-4e17-b63e-a79fced7ec46 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:36:13 compute-0 nova_compute[192810]: 2025-09-30 21:36:13.791 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] VM Started (Lifecycle Event)
Sep 30 21:36:13 compute-0 systemd[1]: Started libpod-conmon-e7ba9ac7b207cd9454304e660e393f35206cc9378e8f7c83183e4ab12f662e42.scope.
Sep 30 21:36:13 compute-0 nova_compute[192810]: 2025-09-30 21:36:13.811 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:36:13 compute-0 nova_compute[192810]: 2025-09-30 21:36:13.817 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268173.7898426, fc7fafcc-8033-4e17-b63e-a79fced7ec46 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:36:13 compute-0 nova_compute[192810]: 2025-09-30 21:36:13.817 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] VM Paused (Lifecycle Event)
Sep 30 21:36:13 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:36:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c04da0de14cffc9c429da5afcbd8906e60f5543040f156bcda2aeb2203382da3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:36:13 compute-0 nova_compute[192810]: 2025-09-30 21:36:13.843 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:36:13 compute-0 nova_compute[192810]: 2025-09-30 21:36:13.848 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:36:13 compute-0 podman[236589]: 2025-09-30 21:36:13.852068926 +0000 UTC m=+0.275150363 container init e7ba9ac7b207cd9454304e660e393f35206cc9378e8f7c83183e4ab12f662e42 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Sep 30 21:36:13 compute-0 podman[236603]: 2025-09-30 21:36:13.852313222 +0000 UTC m=+0.064332495 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_managed=true, config_id=iscsid, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=iscsid)
Sep 30 21:36:13 compute-0 podman[236602]: 2025-09-30 21:36:13.852578369 +0000 UTC m=+0.067796150 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=multipathd)
Sep 30 21:36:13 compute-0 podman[236589]: 2025-09-30 21:36:13.857847828 +0000 UTC m=+0.280929235 container start e7ba9ac7b207cd9454304e660e393f35206cc9378e8f7c83183e4ab12f662e42 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:36:13 compute-0 podman[236604]: 2025-09-30 21:36:13.858060473 +0000 UTC m=+0.064874719 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 21:36:13 compute-0 nova_compute[192810]: 2025-09-30 21:36:13.866 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:36:13 compute-0 neutron-haproxy-ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2[236634]: [NOTICE]   (236670) : New worker (236673) forked
Sep 30 21:36:13 compute-0 neutron-haproxy-ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2[236634]: [NOTICE]   (236670) : Loading success.
Sep 30 21:36:13 compute-0 nova_compute[192810]: 2025-09-30 21:36:13.943 2 DEBUG nova.network.neutron [req-323fb957-328b-4c80-b4fc-3a35cc37b033 req-55ce33ba-7cd6-4d6a-ae8f-65800dd2adad dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Updated VIF entry in instance network info cache for port e0da584f-ea7a-4ac9-a63b-aa5731a67762. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:36:13 compute-0 nova_compute[192810]: 2025-09-30 21:36:13.944 2 DEBUG nova.network.neutron [req-323fb957-328b-4c80-b4fc-3a35cc37b033 req-55ce33ba-7cd6-4d6a-ae8f-65800dd2adad dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Updating instance_info_cache with network_info: [{"id": "e0da584f-ea7a-4ac9-a63b-aa5731a67762", "address": "fa:16:3e:6c:d3:b3", "network": {"id": "20111f98-bf7f-4696-b726-3e06c68cfed2", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-2086275832-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "17bd9c2628a94a0b83c4cae3f51b3f7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0da584f-ea", "ovs_interfaceid": "e0da584f-ea7a-4ac9-a63b-aa5731a67762", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:36:13 compute-0 nova_compute[192810]: 2025-09-30 21:36:13.965 2 DEBUG oslo_concurrency.lockutils [req-323fb957-328b-4c80-b4fc-3a35cc37b033 req-55ce33ba-7cd6-4d6a-ae8f-65800dd2adad dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-fc7fafcc-8033-4e17-b63e-a79fced7ec46" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:36:15 compute-0 nova_compute[192810]: 2025-09-30 21:36:15.474 2 DEBUG nova.compute.manager [req-f49b0ce8-af27-4dc8-b574-7b39294ed060 req-3e9393c2-eaa8-48f9-bbc9-b1a9edd7e0fc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Received event network-vif-plugged-e0da584f-ea7a-4ac9-a63b-aa5731a67762 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:36:15 compute-0 nova_compute[192810]: 2025-09-30 21:36:15.475 2 DEBUG oslo_concurrency.lockutils [req-f49b0ce8-af27-4dc8-b574-7b39294ed060 req-3e9393c2-eaa8-48f9-bbc9-b1a9edd7e0fc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "fc7fafcc-8033-4e17-b63e-a79fced7ec46-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:15 compute-0 nova_compute[192810]: 2025-09-30 21:36:15.475 2 DEBUG oslo_concurrency.lockutils [req-f49b0ce8-af27-4dc8-b574-7b39294ed060 req-3e9393c2-eaa8-48f9-bbc9-b1a9edd7e0fc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "fc7fafcc-8033-4e17-b63e-a79fced7ec46-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:15 compute-0 nova_compute[192810]: 2025-09-30 21:36:15.475 2 DEBUG oslo_concurrency.lockutils [req-f49b0ce8-af27-4dc8-b574-7b39294ed060 req-3e9393c2-eaa8-48f9-bbc9-b1a9edd7e0fc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "fc7fafcc-8033-4e17-b63e-a79fced7ec46-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:15 compute-0 nova_compute[192810]: 2025-09-30 21:36:15.475 2 DEBUG nova.compute.manager [req-f49b0ce8-af27-4dc8-b574-7b39294ed060 req-3e9393c2-eaa8-48f9-bbc9-b1a9edd7e0fc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Processing event network-vif-plugged-e0da584f-ea7a-4ac9-a63b-aa5731a67762 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:36:15 compute-0 nova_compute[192810]: 2025-09-30 21:36:15.476 2 DEBUG nova.compute.manager [req-f49b0ce8-af27-4dc8-b574-7b39294ed060 req-3e9393c2-eaa8-48f9-bbc9-b1a9edd7e0fc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Received event network-vif-plugged-e0da584f-ea7a-4ac9-a63b-aa5731a67762 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:36:15 compute-0 nova_compute[192810]: 2025-09-30 21:36:15.476 2 DEBUG oslo_concurrency.lockutils [req-f49b0ce8-af27-4dc8-b574-7b39294ed060 req-3e9393c2-eaa8-48f9-bbc9-b1a9edd7e0fc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "fc7fafcc-8033-4e17-b63e-a79fced7ec46-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:15 compute-0 nova_compute[192810]: 2025-09-30 21:36:15.476 2 DEBUG oslo_concurrency.lockutils [req-f49b0ce8-af27-4dc8-b574-7b39294ed060 req-3e9393c2-eaa8-48f9-bbc9-b1a9edd7e0fc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "fc7fafcc-8033-4e17-b63e-a79fced7ec46-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:15 compute-0 nova_compute[192810]: 2025-09-30 21:36:15.476 2 DEBUG oslo_concurrency.lockutils [req-f49b0ce8-af27-4dc8-b574-7b39294ed060 req-3e9393c2-eaa8-48f9-bbc9-b1a9edd7e0fc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "fc7fafcc-8033-4e17-b63e-a79fced7ec46-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:15 compute-0 nova_compute[192810]: 2025-09-30 21:36:15.476 2 DEBUG nova.compute.manager [req-f49b0ce8-af27-4dc8-b574-7b39294ed060 req-3e9393c2-eaa8-48f9-bbc9-b1a9edd7e0fc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] No waiting events found dispatching network-vif-plugged-e0da584f-ea7a-4ac9-a63b-aa5731a67762 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:36:15 compute-0 nova_compute[192810]: 2025-09-30 21:36:15.476 2 WARNING nova.compute.manager [req-f49b0ce8-af27-4dc8-b574-7b39294ed060 req-3e9393c2-eaa8-48f9-bbc9-b1a9edd7e0fc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Received unexpected event network-vif-plugged-e0da584f-ea7a-4ac9-a63b-aa5731a67762 for instance with vm_state building and task_state spawning.
Sep 30 21:36:15 compute-0 nova_compute[192810]: 2025-09-30 21:36:15.477 2 DEBUG nova.compute.manager [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:36:15 compute-0 nova_compute[192810]: 2025-09-30 21:36:15.480 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268175.479953, fc7fafcc-8033-4e17-b63e-a79fced7ec46 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:36:15 compute-0 nova_compute[192810]: 2025-09-30 21:36:15.480 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] VM Resumed (Lifecycle Event)
Sep 30 21:36:15 compute-0 nova_compute[192810]: 2025-09-30 21:36:15.482 2 DEBUG nova.virt.libvirt.driver [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:36:15 compute-0 nova_compute[192810]: 2025-09-30 21:36:15.485 2 INFO nova.virt.libvirt.driver [-] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Instance spawned successfully.
Sep 30 21:36:15 compute-0 nova_compute[192810]: 2025-09-30 21:36:15.485 2 DEBUG nova.virt.libvirt.driver [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:36:15 compute-0 nova_compute[192810]: 2025-09-30 21:36:15.511 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:36:15 compute-0 nova_compute[192810]: 2025-09-30 21:36:15.517 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:36:15 compute-0 nova_compute[192810]: 2025-09-30 21:36:15.519 2 DEBUG nova.virt.libvirt.driver [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:36:15 compute-0 nova_compute[192810]: 2025-09-30 21:36:15.520 2 DEBUG nova.virt.libvirt.driver [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:36:15 compute-0 nova_compute[192810]: 2025-09-30 21:36:15.520 2 DEBUG nova.virt.libvirt.driver [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:36:15 compute-0 nova_compute[192810]: 2025-09-30 21:36:15.521 2 DEBUG nova.virt.libvirt.driver [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:36:15 compute-0 nova_compute[192810]: 2025-09-30 21:36:15.521 2 DEBUG nova.virt.libvirt.driver [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:36:15 compute-0 nova_compute[192810]: 2025-09-30 21:36:15.521 2 DEBUG nova.virt.libvirt.driver [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:36:15 compute-0 nova_compute[192810]: 2025-09-30 21:36:15.552 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:36:15 compute-0 nova_compute[192810]: 2025-09-30 21:36:15.608 2 INFO nova.compute.manager [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Took 9.01 seconds to spawn the instance on the hypervisor.
Sep 30 21:36:15 compute-0 nova_compute[192810]: 2025-09-30 21:36:15.608 2 DEBUG nova.compute.manager [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:36:15 compute-0 nova_compute[192810]: 2025-09-30 21:36:15.696 2 INFO nova.compute.manager [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Took 10.02 seconds to build instance.
Sep 30 21:36:15 compute-0 nova_compute[192810]: 2025-09-30 21:36:15.714 2 DEBUG oslo_concurrency.lockutils [None req-30a91278-59dd-4528-98de-694746fa44fd 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lock "fc7fafcc-8033-4e17-b63e-a79fced7ec46" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:16 compute-0 nova_compute[192810]: 2025-09-30 21:36:16.478 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:17 compute-0 nova_compute[192810]: 2025-09-30 21:36:17.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:21 compute-0 nova_compute[192810]: 2025-09-30 21:36:21.481 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:22 compute-0 nova_compute[192810]: 2025-09-30 21:36:22.580 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:22.580 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:36:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:22.581 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:36:22 compute-0 nova_compute[192810]: 2025-09-30 21:36:22.613 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:23.583 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:26 compute-0 podman[236692]: 2025-09-30 21:36:26.333439103 +0000 UTC m=+0.055476298 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 21:36:26 compute-0 podman[236693]: 2025-09-30 21:36:26.374342984 +0000 UTC m=+0.092980556 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:36:26 compute-0 podman[236691]: 2025-09-30 21:36:26.392256842 +0000 UTC m=+0.128714880 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_controller, container_name=ovn_controller)
Sep 30 21:36:26 compute-0 nova_compute[192810]: 2025-09-30 21:36:26.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:27 compute-0 nova_compute[192810]: 2025-09-30 21:36:27.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:28 compute-0 ovn_controller[94912]: 2025-09-30T21:36:28Z|00046|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:6c:d3:b3 10.100.0.13
Sep 30 21:36:28 compute-0 ovn_controller[94912]: 2025-09-30T21:36:28Z|00047|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6c:d3:b3 10.100.0.13
Sep 30 21:36:31 compute-0 nova_compute[192810]: 2025-09-30 21:36:31.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:32 compute-0 nova_compute[192810]: 2025-09-30 21:36:32.619 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:36 compute-0 nova_compute[192810]: 2025-09-30 21:36:36.490 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:37 compute-0 podman[236775]: 2025-09-30 21:36:37.32230003 +0000 UTC m=+0.055610287 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:36:37 compute-0 podman[236776]: 2025-09-30 21:36:37.350640607 +0000 UTC m=+0.085099022 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., release=1755695350, architecture=x86_64, config_id=edpm, vcs-type=git, managed_by=edpm_ansible, io.openshift.expose-services=, io.buildah.version=1.33.7, vendor=Red Hat, Inc.)
Sep 30 21:36:37 compute-0 nova_compute[192810]: 2025-09-30 21:36:37.582 2 DEBUG oslo_concurrency.lockutils [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Acquiring lock "01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:37 compute-0 nova_compute[192810]: 2025-09-30 21:36:37.582 2 DEBUG oslo_concurrency.lockutils [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Lock "01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:37 compute-0 nova_compute[192810]: 2025-09-30 21:36:37.600 2 DEBUG nova.compute.manager [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:36:37 compute-0 nova_compute[192810]: 2025-09-30 21:36:37.622 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:37 compute-0 nova_compute[192810]: 2025-09-30 21:36:37.720 2 DEBUG oslo_concurrency.lockutils [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:37 compute-0 nova_compute[192810]: 2025-09-30 21:36:37.720 2 DEBUG oslo_concurrency.lockutils [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:37 compute-0 nova_compute[192810]: 2025-09-30 21:36:37.727 2 DEBUG nova.virt.hardware [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:36:37 compute-0 nova_compute[192810]: 2025-09-30 21:36:37.728 2 INFO nova.compute.claims [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:36:37 compute-0 nova_compute[192810]: 2025-09-30 21:36:37.906 2 DEBUG nova.compute.provider_tree [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:36:37 compute-0 nova_compute[192810]: 2025-09-30 21:36:37.920 2 DEBUG nova.scheduler.client.report [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:36:37 compute-0 nova_compute[192810]: 2025-09-30 21:36:37.960 2 DEBUG oslo_concurrency.lockutils [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.240s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:37 compute-0 nova_compute[192810]: 2025-09-30 21:36:37.961 2 DEBUG nova.compute.manager [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:36:38 compute-0 nova_compute[192810]: 2025-09-30 21:36:38.035 2 DEBUG nova.compute.manager [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:36:38 compute-0 nova_compute[192810]: 2025-09-30 21:36:38.036 2 DEBUG nova.network.neutron [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:36:38 compute-0 nova_compute[192810]: 2025-09-30 21:36:38.056 2 INFO nova.virt.libvirt.driver [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:36:38 compute-0 nova_compute[192810]: 2025-09-30 21:36:38.077 2 DEBUG nova.compute.manager [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:36:38 compute-0 nova_compute[192810]: 2025-09-30 21:36:38.200 2 DEBUG nova.compute.manager [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:36:38 compute-0 nova_compute[192810]: 2025-09-30 21:36:38.201 2 DEBUG nova.virt.libvirt.driver [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:36:38 compute-0 nova_compute[192810]: 2025-09-30 21:36:38.202 2 INFO nova.virt.libvirt.driver [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Creating image(s)
Sep 30 21:36:38 compute-0 nova_compute[192810]: 2025-09-30 21:36:38.203 2 DEBUG oslo_concurrency.lockutils [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Acquiring lock "/var/lib/nova/instances/01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:38 compute-0 nova_compute[192810]: 2025-09-30 21:36:38.203 2 DEBUG oslo_concurrency.lockutils [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Lock "/var/lib/nova/instances/01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:38 compute-0 nova_compute[192810]: 2025-09-30 21:36:38.204 2 DEBUG oslo_concurrency.lockutils [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Lock "/var/lib/nova/instances/01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:38 compute-0 nova_compute[192810]: 2025-09-30 21:36:38.217 2 DEBUG oslo_concurrency.processutils [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:36:38 compute-0 nova_compute[192810]: 2025-09-30 21:36:38.271 2 DEBUG oslo_concurrency.processutils [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:36:38 compute-0 nova_compute[192810]: 2025-09-30 21:36:38.272 2 DEBUG oslo_concurrency.lockutils [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:38 compute-0 nova_compute[192810]: 2025-09-30 21:36:38.272 2 DEBUG oslo_concurrency.lockutils [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:38 compute-0 nova_compute[192810]: 2025-09-30 21:36:38.283 2 DEBUG oslo_concurrency.processutils [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:36:38 compute-0 nova_compute[192810]: 2025-09-30 21:36:38.339 2 DEBUG oslo_concurrency.processutils [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:36:38 compute-0 nova_compute[192810]: 2025-09-30 21:36:38.340 2 DEBUG oslo_concurrency.processutils [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:36:38 compute-0 nova_compute[192810]: 2025-09-30 21:36:38.370 2 DEBUG oslo_concurrency.processutils [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6/disk 1073741824" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:36:38 compute-0 nova_compute[192810]: 2025-09-30 21:36:38.371 2 DEBUG oslo_concurrency.lockutils [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:38 compute-0 nova_compute[192810]: 2025-09-30 21:36:38.372 2 DEBUG oslo_concurrency.processutils [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:36:38 compute-0 nova_compute[192810]: 2025-09-30 21:36:38.427 2 DEBUG oslo_concurrency.processutils [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:36:38 compute-0 nova_compute[192810]: 2025-09-30 21:36:38.428 2 DEBUG nova.virt.disk.api [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Checking if we can resize image /var/lib/nova/instances/01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:36:38 compute-0 nova_compute[192810]: 2025-09-30 21:36:38.428 2 DEBUG oslo_concurrency.processutils [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:36:38 compute-0 nova_compute[192810]: 2025-09-30 21:36:38.482 2 DEBUG oslo_concurrency.processutils [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:36:38 compute-0 nova_compute[192810]: 2025-09-30 21:36:38.483 2 DEBUG nova.virt.disk.api [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Cannot resize image /var/lib/nova/instances/01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:36:38 compute-0 nova_compute[192810]: 2025-09-30 21:36:38.484 2 DEBUG nova.objects.instance [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Lazy-loading 'migration_context' on Instance uuid 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:36:38 compute-0 nova_compute[192810]: 2025-09-30 21:36:38.505 2 DEBUG nova.virt.libvirt.driver [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:36:38 compute-0 nova_compute[192810]: 2025-09-30 21:36:38.506 2 DEBUG nova.virt.libvirt.driver [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Ensure instance console log exists: /var/lib/nova/instances/01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:36:38 compute-0 nova_compute[192810]: 2025-09-30 21:36:38.506 2 DEBUG oslo_concurrency.lockutils [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:38 compute-0 nova_compute[192810]: 2025-09-30 21:36:38.506 2 DEBUG oslo_concurrency.lockutils [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:38 compute-0 nova_compute[192810]: 2025-09-30 21:36:38.507 2 DEBUG oslo_concurrency.lockutils [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:38.742 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:38.743 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:38.744 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:39 compute-0 nova_compute[192810]: 2025-09-30 21:36:39.191 2 DEBUG nova.policy [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '90078b78b6fc4222ab0254ae1eda2500', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '793bce1316e34dada7414560b74789f4', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:36:41 compute-0 nova_compute[192810]: 2025-09-30 21:36:41.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:41 compute-0 nova_compute[192810]: 2025-09-30 21:36:41.857 2 DEBUG nova.network.neutron [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Successfully created port: 7e8f7bb1-f155-40d6-8e85-f4222da3c027 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:36:42 compute-0 nova_compute[192810]: 2025-09-30 21:36:42.623 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:42 compute-0 nova_compute[192810]: 2025-09-30 21:36:42.758 2 DEBUG nova.network.neutron [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Successfully updated port: 7e8f7bb1-f155-40d6-8e85-f4222da3c027 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:36:42 compute-0 nova_compute[192810]: 2025-09-30 21:36:42.778 2 DEBUG oslo_concurrency.lockutils [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Acquiring lock "refresh_cache-01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:36:42 compute-0 nova_compute[192810]: 2025-09-30 21:36:42.779 2 DEBUG oslo_concurrency.lockutils [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Acquired lock "refresh_cache-01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:36:42 compute-0 nova_compute[192810]: 2025-09-30 21:36:42.779 2 DEBUG nova.network.neutron [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:36:42 compute-0 nova_compute[192810]: 2025-09-30 21:36:42.890 2 DEBUG nova.compute.manager [req-4059b6d7-152a-4376-8ee4-35d5b6e09519 req-4d6d0b12-ff55-4c15-8de2-462305d3ad6b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Received event network-changed-7e8f7bb1-f155-40d6-8e85-f4222da3c027 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:36:42 compute-0 nova_compute[192810]: 2025-09-30 21:36:42.890 2 DEBUG nova.compute.manager [req-4059b6d7-152a-4376-8ee4-35d5b6e09519 req-4d6d0b12-ff55-4c15-8de2-462305d3ad6b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Refreshing instance network info cache due to event network-changed-7e8f7bb1-f155-40d6-8e85-f4222da3c027. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:36:42 compute-0 nova_compute[192810]: 2025-09-30 21:36:42.890 2 DEBUG oslo_concurrency.lockutils [req-4059b6d7-152a-4376-8ee4-35d5b6e09519 req-4d6d0b12-ff55-4c15-8de2-462305d3ad6b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:36:42 compute-0 nova_compute[192810]: 2025-09-30 21:36:42.944 2 DEBUG nova.network.neutron [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:36:43 compute-0 nova_compute[192810]: 2025-09-30 21:36:43.761 2 DEBUG nova.network.neutron [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Updating instance_info_cache with network_info: [{"id": "7e8f7bb1-f155-40d6-8e85-f4222da3c027", "address": "fa:16:3e:86:ba:8b", "network": {"id": "6f659d19-7261-4309-81bc-b7bb9a9da219", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1930279008-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "793bce1316e34dada7414560b74789f4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e8f7bb1-f1", "ovs_interfaceid": "7e8f7bb1-f155-40d6-8e85-f4222da3c027", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:36:43 compute-0 nova_compute[192810]: 2025-09-30 21:36:43.791 2 DEBUG oslo_concurrency.lockutils [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Releasing lock "refresh_cache-01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:36:43 compute-0 nova_compute[192810]: 2025-09-30 21:36:43.792 2 DEBUG nova.compute.manager [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Instance network_info: |[{"id": "7e8f7bb1-f155-40d6-8e85-f4222da3c027", "address": "fa:16:3e:86:ba:8b", "network": {"id": "6f659d19-7261-4309-81bc-b7bb9a9da219", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1930279008-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "793bce1316e34dada7414560b74789f4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e8f7bb1-f1", "ovs_interfaceid": "7e8f7bb1-f155-40d6-8e85-f4222da3c027", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:36:43 compute-0 nova_compute[192810]: 2025-09-30 21:36:43.792 2 DEBUG oslo_concurrency.lockutils [req-4059b6d7-152a-4376-8ee4-35d5b6e09519 req-4d6d0b12-ff55-4c15-8de2-462305d3ad6b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:36:43 compute-0 nova_compute[192810]: 2025-09-30 21:36:43.792 2 DEBUG nova.network.neutron [req-4059b6d7-152a-4376-8ee4-35d5b6e09519 req-4d6d0b12-ff55-4c15-8de2-462305d3ad6b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Refreshing network info cache for port 7e8f7bb1-f155-40d6-8e85-f4222da3c027 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:36:43 compute-0 nova_compute[192810]: 2025-09-30 21:36:43.795 2 DEBUG nova.virt.libvirt.driver [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Start _get_guest_xml network_info=[{"id": "7e8f7bb1-f155-40d6-8e85-f4222da3c027", "address": "fa:16:3e:86:ba:8b", "network": {"id": "6f659d19-7261-4309-81bc-b7bb9a9da219", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1930279008-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "793bce1316e34dada7414560b74789f4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e8f7bb1-f1", "ovs_interfaceid": "7e8f7bb1-f155-40d6-8e85-f4222da3c027", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:36:43 compute-0 nova_compute[192810]: 2025-09-30 21:36:43.799 2 WARNING nova.virt.libvirt.driver [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:36:43 compute-0 nova_compute[192810]: 2025-09-30 21:36:43.804 2 DEBUG nova.virt.libvirt.host [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:36:43 compute-0 nova_compute[192810]: 2025-09-30 21:36:43.804 2 DEBUG nova.virt.libvirt.host [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:36:43 compute-0 nova_compute[192810]: 2025-09-30 21:36:43.806 2 DEBUG nova.virt.libvirt.host [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:36:43 compute-0 nova_compute[192810]: 2025-09-30 21:36:43.807 2 DEBUG nova.virt.libvirt.host [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:36:43 compute-0 nova_compute[192810]: 2025-09-30 21:36:43.808 2 DEBUG nova.virt.libvirt.driver [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:36:43 compute-0 nova_compute[192810]: 2025-09-30 21:36:43.808 2 DEBUG nova.virt.hardware [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:36:43 compute-0 nova_compute[192810]: 2025-09-30 21:36:43.808 2 DEBUG nova.virt.hardware [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:36:43 compute-0 nova_compute[192810]: 2025-09-30 21:36:43.809 2 DEBUG nova.virt.hardware [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:36:43 compute-0 nova_compute[192810]: 2025-09-30 21:36:43.809 2 DEBUG nova.virt.hardware [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:36:43 compute-0 nova_compute[192810]: 2025-09-30 21:36:43.809 2 DEBUG nova.virt.hardware [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:36:43 compute-0 nova_compute[192810]: 2025-09-30 21:36:43.809 2 DEBUG nova.virt.hardware [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:36:43 compute-0 nova_compute[192810]: 2025-09-30 21:36:43.810 2 DEBUG nova.virt.hardware [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:36:43 compute-0 nova_compute[192810]: 2025-09-30 21:36:43.810 2 DEBUG nova.virt.hardware [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:36:43 compute-0 nova_compute[192810]: 2025-09-30 21:36:43.810 2 DEBUG nova.virt.hardware [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:36:43 compute-0 nova_compute[192810]: 2025-09-30 21:36:43.810 2 DEBUG nova.virt.hardware [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:36:43 compute-0 nova_compute[192810]: 2025-09-30 21:36:43.811 2 DEBUG nova.virt.hardware [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:36:43 compute-0 nova_compute[192810]: 2025-09-30 21:36:43.814 2 DEBUG nova.virt.libvirt.vif [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:36:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSONUnderV235-server-76631792',display_name='tempest-ServerRescueTestJSONUnderV235-server-76631792',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverrescuetestjsonunderv235-server-76631792',id=110,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='793bce1316e34dada7414560b74789f4',ramdisk_id='',reservation_id='r-pjsi62w2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSONUnderV235-672740030',owner_user_name='tempest-ServerRescueTestJSONUnderV235-672740030-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:36:38Z,user_data=None,user_id='90078b78b6fc4222ab0254ae1eda2500',uuid=01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7e8f7bb1-f155-40d6-8e85-f4222da3c027", "address": "fa:16:3e:86:ba:8b", "network": {"id": "6f659d19-7261-4309-81bc-b7bb9a9da219", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1930279008-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "793bce1316e34dada7414560b74789f4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e8f7bb1-f1", "ovs_interfaceid": "7e8f7bb1-f155-40d6-8e85-f4222da3c027", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:36:43 compute-0 nova_compute[192810]: 2025-09-30 21:36:43.814 2 DEBUG nova.network.os_vif_util [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Converting VIF {"id": "7e8f7bb1-f155-40d6-8e85-f4222da3c027", "address": "fa:16:3e:86:ba:8b", "network": {"id": "6f659d19-7261-4309-81bc-b7bb9a9da219", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1930279008-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "793bce1316e34dada7414560b74789f4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e8f7bb1-f1", "ovs_interfaceid": "7e8f7bb1-f155-40d6-8e85-f4222da3c027", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:36:43 compute-0 nova_compute[192810]: 2025-09-30 21:36:43.815 2 DEBUG nova.network.os_vif_util [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:86:ba:8b,bridge_name='br-int',has_traffic_filtering=True,id=7e8f7bb1-f155-40d6-8e85-f4222da3c027,network=Network(6f659d19-7261-4309-81bc-b7bb9a9da219),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7e8f7bb1-f1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:36:43 compute-0 nova_compute[192810]: 2025-09-30 21:36:43.816 2 DEBUG nova.objects.instance [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:36:43 compute-0 nova_compute[192810]: 2025-09-30 21:36:43.832 2 DEBUG nova.virt.libvirt.driver [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:36:43 compute-0 nova_compute[192810]:   <uuid>01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6</uuid>
Sep 30 21:36:43 compute-0 nova_compute[192810]:   <name>instance-0000006e</name>
Sep 30 21:36:43 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:36:43 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:36:43 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:36:43 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:       <nova:name>tempest-ServerRescueTestJSONUnderV235-server-76631792</nova:name>
Sep 30 21:36:43 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:36:43</nova:creationTime>
Sep 30 21:36:43 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:36:43 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:36:43 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:36:43 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:36:43 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:36:43 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:36:43 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:36:43 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:36:43 compute-0 nova_compute[192810]:         <nova:user uuid="90078b78b6fc4222ab0254ae1eda2500">tempest-ServerRescueTestJSONUnderV235-672740030-project-member</nova:user>
Sep 30 21:36:43 compute-0 nova_compute[192810]:         <nova:project uuid="793bce1316e34dada7414560b74789f4">tempest-ServerRescueTestJSONUnderV235-672740030</nova:project>
Sep 30 21:36:43 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:36:43 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:36:43 compute-0 nova_compute[192810]:         <nova:port uuid="7e8f7bb1-f155-40d6-8e85-f4222da3c027">
Sep 30 21:36:43 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:36:43 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:36:43 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:36:43 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:36:43 compute-0 nova_compute[192810]:     <system>
Sep 30 21:36:43 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:36:43 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:36:43 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:36:43 compute-0 nova_compute[192810]:       <entry name="serial">01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6</entry>
Sep 30 21:36:43 compute-0 nova_compute[192810]:       <entry name="uuid">01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6</entry>
Sep 30 21:36:43 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     </system>
Sep 30 21:36:43 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:36:43 compute-0 nova_compute[192810]:   <os>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:   </os>
Sep 30 21:36:43 compute-0 nova_compute[192810]:   <features>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:   </features>
Sep 30 21:36:43 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:36:43 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:36:43 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:36:43 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:36:43 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:36:43 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6/disk"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:36:43 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6/disk.config"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:36:43 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:86:ba:8b"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:       <target dev="tap7e8f7bb1-f1"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:36:43 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6/console.log" append="off"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     <video>
Sep 30 21:36:43 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     </video>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:36:43 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:36:43 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:36:43 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:36:43 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:36:43 compute-0 nova_compute[192810]: </domain>
Sep 30 21:36:43 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:36:43 compute-0 nova_compute[192810]: 2025-09-30 21:36:43.834 2 DEBUG nova.compute.manager [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Preparing to wait for external event network-vif-plugged-7e8f7bb1-f155-40d6-8e85-f4222da3c027 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:36:43 compute-0 nova_compute[192810]: 2025-09-30 21:36:43.834 2 DEBUG oslo_concurrency.lockutils [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Acquiring lock "01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:43 compute-0 nova_compute[192810]: 2025-09-30 21:36:43.835 2 DEBUG oslo_concurrency.lockutils [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Lock "01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:43 compute-0 nova_compute[192810]: 2025-09-30 21:36:43.835 2 DEBUG oslo_concurrency.lockutils [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Lock "01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:43 compute-0 nova_compute[192810]: 2025-09-30 21:36:43.835 2 DEBUG nova.virt.libvirt.vif [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:36:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSONUnderV235-server-76631792',display_name='tempest-ServerRescueTestJSONUnderV235-server-76631792',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverrescuetestjsonunderv235-server-76631792',id=110,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='793bce1316e34dada7414560b74789f4',ramdisk_id='',reservation_id='r-pjsi62w2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSONUnderV235-672740030',owner_user_name='tempest-ServerRescueTestJSONUnderV235-672740030-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:36:38Z,user_data=None,user_id='90078b78b6fc4222ab0254ae1eda2500',uuid=01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7e8f7bb1-f155-40d6-8e85-f4222da3c027", "address": "fa:16:3e:86:ba:8b", "network": {"id": "6f659d19-7261-4309-81bc-b7bb9a9da219", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1930279008-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "793bce1316e34dada7414560b74789f4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e8f7bb1-f1", "ovs_interfaceid": "7e8f7bb1-f155-40d6-8e85-f4222da3c027", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:36:43 compute-0 nova_compute[192810]: 2025-09-30 21:36:43.836 2 DEBUG nova.network.os_vif_util [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Converting VIF {"id": "7e8f7bb1-f155-40d6-8e85-f4222da3c027", "address": "fa:16:3e:86:ba:8b", "network": {"id": "6f659d19-7261-4309-81bc-b7bb9a9da219", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1930279008-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "793bce1316e34dada7414560b74789f4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e8f7bb1-f1", "ovs_interfaceid": "7e8f7bb1-f155-40d6-8e85-f4222da3c027", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:36:43 compute-0 nova_compute[192810]: 2025-09-30 21:36:43.836 2 DEBUG nova.network.os_vif_util [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:86:ba:8b,bridge_name='br-int',has_traffic_filtering=True,id=7e8f7bb1-f155-40d6-8e85-f4222da3c027,network=Network(6f659d19-7261-4309-81bc-b7bb9a9da219),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7e8f7bb1-f1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:36:43 compute-0 nova_compute[192810]: 2025-09-30 21:36:43.837 2 DEBUG os_vif [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:86:ba:8b,bridge_name='br-int',has_traffic_filtering=True,id=7e8f7bb1-f155-40d6-8e85-f4222da3c027,network=Network(6f659d19-7261-4309-81bc-b7bb9a9da219),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7e8f7bb1-f1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:36:43 compute-0 nova_compute[192810]: 2025-09-30 21:36:43.837 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:43 compute-0 nova_compute[192810]: 2025-09-30 21:36:43.837 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:43 compute-0 nova_compute[192810]: 2025-09-30 21:36:43.838 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:36:43 compute-0 nova_compute[192810]: 2025-09-30 21:36:43.840 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:43 compute-0 nova_compute[192810]: 2025-09-30 21:36:43.840 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7e8f7bb1-f1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:43 compute-0 nova_compute[192810]: 2025-09-30 21:36:43.841 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7e8f7bb1-f1, col_values=(('external_ids', {'iface-id': '7e8f7bb1-f155-40d6-8e85-f4222da3c027', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:86:ba:8b', 'vm-uuid': '01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:43 compute-0 nova_compute[192810]: 2025-09-30 21:36:43.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:43 compute-0 NetworkManager[51733]: <info>  [1759268203.8437] manager: (tap7e8f7bb1-f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/186)
Sep 30 21:36:43 compute-0 nova_compute[192810]: 2025-09-30 21:36:43.845 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:36:43 compute-0 nova_compute[192810]: 2025-09-30 21:36:43.850 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:43 compute-0 nova_compute[192810]: 2025-09-30 21:36:43.851 2 INFO os_vif [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:86:ba:8b,bridge_name='br-int',has_traffic_filtering=True,id=7e8f7bb1-f155-40d6-8e85-f4222da3c027,network=Network(6f659d19-7261-4309-81bc-b7bb9a9da219),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7e8f7bb1-f1')
Sep 30 21:36:43 compute-0 nova_compute[192810]: 2025-09-30 21:36:43.910 2 DEBUG nova.virt.libvirt.driver [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.909 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619', 'name': 'tempest-ServerStableDeviceRescueTest-server-522911333', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000066', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '8978d2df88a5434c8794b659033cca5e', 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'hostId': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Sep 30 21:36:43 compute-0 nova_compute[192810]: 2025-09-30 21:36:43.910 2 DEBUG nova.virt.libvirt.driver [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:36:43 compute-0 nova_compute[192810]: 2025-09-30 21:36:43.910 2 DEBUG nova.virt.libvirt.driver [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] No VIF found with MAC fa:16:3e:86:ba:8b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:36:43 compute-0 nova_compute[192810]: 2025-09-30 21:36:43.911 2 INFO nova.virt.libvirt.driver [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Using config drive
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.914 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46', 'name': 'tempest-ListServerFiltersTestJSON-instance-1640893469', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '29834554-3ec3-4459-bfde-932aa778e979'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000006b', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'hostId': '824a0c93d5696b22c2af81ed178d68b0a97bf635bd13c66c760315cd', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.916 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6', 'name': 'tempest-ServerRescueTestJSONUnderV235-server-76631792', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000006e', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': '793bce1316e34dada7414560b74789f4', 'user_id': '90078b78b6fc4222ab0254ae1eda2500', 'hostId': 'b71ada171ffdfe0865a4713cb71ef2a13ace165cd7db5eae91b20a9e', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.918 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '4255e358-6db2-4947-a3d6-4e045f9235fb', 'name': 'tempest-ServerStableDeviceRescueTest-server-475541027', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000062', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '8978d2df88a5434c8794b659033cca5e', 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'hostId': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.919 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.919 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.919 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-522911333>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1640893469>, <NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-76631792>, <NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-475541027>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-522911333>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1640893469>, <NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-76631792>, <NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-475541027>]
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.919 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Sep 30 21:36:43 compute-0 podman[236839]: 2025-09-30 21:36:43.943413239 +0000 UTC m=+0.056443488 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 21:36:43 compute-0 podman[236838]: 2025-09-30 21:36:43.951649965 +0000 UTC m=+0.063202357 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.951 12 DEBUG ceilometer.compute.pollsters [-] 128d5729-9e60-43d9-b1a4-fa8fb6e0e619/cpu volume: 10910000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.966 12 DEBUG ceilometer.compute.pollsters [-] fc7fafcc-8033-4e17-b63e-a79fced7ec46/cpu volume: 11080000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.967 12 DEBUG ceilometer.compute.pollsters [-] Instance 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6 was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-0000006e, id=01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:36:43 compute-0 podman[236840]: 2025-09-30 21:36:43.977470208 +0000 UTC m=+0.085786899 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.982 12 DEBUG ceilometer.compute.pollsters [-] 4255e358-6db2-4947-a3d6-4e045f9235fb/cpu volume: 11580000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4a274314-c9e2-4fb0-8aee-87fa85ae4d6a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10910000000, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619', 'timestamp': '2025-09-30T21:36:43.919581', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-522911333', 'name': 'instance-00000066', 'instance_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '8f1df1cc-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.639215677, 'message_signature': 'cd31a841ed01c59b05053363af5cd06f5d21fc8d38425f4ecb80b4cf6c9ab26b'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11080000000, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': 
'fc7fafcc-8033-4e17-b63e-a79fced7ec46', 'timestamp': '2025-09-30T21:36:43.919581', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1640893469', 'name': 'instance-0000006b', 'instance_id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46', 'instance_type': 'm1.nano', 'host': '824a0c93d5696b22c2af81ed178d68b0a97bf635bd13c66c760315cd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '29834554-3ec3-4459-bfde-932aa778e979'}, 'image_ref': '29834554-3ec3-4459-bfde-932aa778e979', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '8f202758-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.653625397, 'message_signature': '93561eafa9db3bd2e8047d4c259d59b318c3993acc67e50ced3c95e4cdee4348'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11580000000, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': '4255e358-6db2-4947-a3d6-4e045f9235fb', 'timestamp': '2025-09-30T21:36:43.919581', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-475541027', 'name': 'instance-00000062', 'instance_id': '4255e358-6db2-4947-a3d6-4e045f9235fb', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '8f22a424-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.670055036, 'message_signature': '12848a4f0117e6eb55bf13a9756245ff559541c8992cece9e95fba73ef7b2380'}]}, 'timestamp': '2025-09-30 21:36:43.983156', '_unique_id': '57d0ca98ebe84eb5b732bcd0f2ec87a9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.984 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:43.985 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.004 12 DEBUG ceilometer.compute.pollsters [-] 128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk.device.write.bytes volume: 319488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.004 12 DEBUG ceilometer.compute.pollsters [-] 128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.020 12 DEBUG ceilometer.compute.pollsters [-] fc7fafcc-8033-4e17-b63e-a79fced7ec46/disk.device.write.bytes volume: 72880128 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.020 12 DEBUG ceilometer.compute.pollsters [-] fc7fafcc-8033-4e17-b63e-a79fced7ec46/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.021 12 DEBUG ceilometer.compute.pollsters [-] Instance 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6 was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-0000006e, id=01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.037 12 DEBUG ceilometer.compute.pollsters [-] 4255e358-6db2-4947-a3d6-4e045f9235fb/disk.device.write.bytes volume: 413696 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.037 12 DEBUG ceilometer.compute.pollsters [-] 4255e358-6db2-4947-a3d6-4e045f9235fb/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f34ead72-4540-4699-a6e3-839938c76050', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 319488, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619-vda', 'timestamp': '2025-09-30T21:36:43.985106', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-522911333', 'name': 'instance-00000066', 'instance_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8f25f8ae-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.67260131, 'message_signature': '3c031300015e843241558994ae09e00860e0d5e7e9548bef620c33b47bc55b20'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 
'resource_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619-sda', 'timestamp': '2025-09-30T21:36:43.985106', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-522911333', 'name': 'instance-00000066', 'instance_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8f2602c2-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.67260131, 'message_signature': '1acad7ea424b318330af12c0127b9783dab361476978ead09d0acf96044d8566'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72880128, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46-vda', 'timestamp': '2025-09-30T21:36:43.985106', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1640893469', 'name': 'instance-0000006b', 'instance_id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46', 'instance_type': 'm1.nano', 'host': '824a0c93d5696b22c2af81ed178d68b0a97bf635bd13c66c760315cd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '29834554-3ec3-4459-bfde-932aa778e979'}, 'image_ref': '29834554-3ec3-4459-bfde-932aa778e979', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8f285c2a-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.692618909, 'message_signature': '6e546b55b40bd8ee6a61e074964055e232654e1689114a8b686cc299833ced4c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46-sda', 'timestamp': '2025-09-30T21:36:43.985106', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1640893469', 'name': 'instance-0000006b', 'instance_id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46', 'instance_type': 'm1.nano', 'host': '824a0c93d5696b22c2af81ed178d68b0a97bf635bd13c66c760315cd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '29834554-3ec3-4459-bfde-932aa778e979'}, 'image_ref': '29834554-3ec3-4459-bfde-932aa778e979', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8f28659e-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.692618909, 'message_signature': 'e5865d79a9d883ab472c3f8b98af82f58a10d82dd2c49c6bbee6d6b5d123a241'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 413696, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 
'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': '4255e358-6db2-4947-a3d6-4e045f9235fb-vda', 'timestamp': '2025-09-30T21:36:43.985106', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-475541027', 'name': 'instance-00000062', 'instance_id': '4255e358-6db2-4947-a3d6-4e045f9235fb', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8f2afce6-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.709183361, 'message_signature': '04f28a2058bef83bcc6bac7d1c38a4cc6620cab1c851f5dfec3552cc93e0a238'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': '4255e358-6db2-4947-a3d6-4e045f9235fb-sda', 'timestamp': '2025-09-30T21:36:43.985106', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-475541027', 'name': 'instance-00000062', 'instance_id': '4255e358-6db2-4947-a3d6-4e045f9235fb', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 
0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 1
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 28, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8f2b07a4-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.709183361, 'message_signature': '2b6e8ee58c1624874dcb96ea50554d9aa0c9b1c35b104741030351ee3504bf46'}]}, 'timestamp': '2025-09-30 21:36:44.038039', '_unique_id': '7805500322af462f983e798784c05fda'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.039 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.041 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 128d5729-9e60-43d9-b1a4-fa8fb6e0e619 / tapb275e55b-7c inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.042 12 DEBUG ceilometer.compute.pollsters [-] 128d5729-9e60-43d9-b1a4-fa8fb6e0e619/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.043 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for fc7fafcc-8033-4e17-b63e-a79fced7ec46 / tape0da584f-ea inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.044 12 DEBUG ceilometer.compute.pollsters [-] fc7fafcc-8033-4e17-b63e-a79fced7ec46/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.044 12 DEBUG ceilometer.compute.pollsters [-] Instance 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6 was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance <name=instance-0000006e, id=01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.046 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 4255e358-6db2-4947-a3d6-4e045f9235fb / tapb0db3e99-94 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.046 12 DEBUG ceilometer.compute.pollsters [-] 4255e358-6db2-4947-a3d6-4e045f9235fb/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '630a20cb-0f36-441c-bfac-ba24dc3cb112', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': 'instance-00000066-128d5729-9e60-43d9-b1a4-fa8fb6e0e619-tapb275e55b-7c', 'timestamp': '2025-09-30T21:36:44.039764', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-522911333', 'name': 'tapb275e55b-7c', 'instance_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:86:d7:f5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb275e55b-7c'}, 'message_id': '8f2bae34-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.727277562, 'message_signature': 'f62f7a8fc867e426a9e52f27edded40efc8dec3836caa1e74cd731f5c2599f30'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': 'instance-0000006b-fc7fafcc-8033-4e17-b63e-a79fced7ec46-tape0da584f-ea', 'timestamp': '2025-09-30T21:36:44.039764', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1640893469', 'name': 'tape0da584f-ea', 'instance_id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46', 'instance_type': 'm1.nano', 'host': '824a0c93d5696b22c2af81ed178d68b0a97bf635bd13c66c760315cd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '29834554-3ec3-4459-bfde-932aa778e979'}, 'image_ref': '29834554-3ec3-4459-bfde-932aa778e979', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6c:d3:b3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape0da584f-ea'}, 'message_id': '8f2bfdf8-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.729794345, 'message_signature': 'cc7165eff6c6c16f18c8d5f3c485fde14ec48519bc30e48d622017feefce35c8'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': 'instance-00000062-4255e358-6db2-4947-a3d6-4e045f9235fb-tapb0db3e99-94', 'timestamp': '2025-09-30T21:36:44.039764', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-475541027', 'name': 'tapb0db3e99-94', 'instance_id': '4255e358-6db2-4947-a3d6-4e045f9235fb', 'instance_type': 'm1.nano', 'host': 
'6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:03:b5:6d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb0db3e99-94'}, 'message_id': '8f2c6338-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.732483762, 'message_signature': '8cd5932b91d124face623b7b5348dcbd7949e646e370a423c858067c318ae55f'}]}, 'timestamp': '2025-09-30 21:36:44.046947', '_unique_id': '6e9d85c1d9db4bdf87d0bf04f9b45b20'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.047 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.048 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.048 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.048 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-522911333>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1640893469>, <NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-76631792>, <NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-475541027>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-522911333>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1640893469>, <NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-76631792>, <NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-475541027>]
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.048 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.048 12 DEBUG ceilometer.compute.pollsters [-] 128d5729-9e60-43d9-b1a4-fa8fb6e0e619/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.049 12 DEBUG ceilometer.compute.pollsters [-] fc7fafcc-8033-4e17-b63e-a79fced7ec46/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.049 12 DEBUG ceilometer.compute.pollsters [-] Instance 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6 was shut off while getting sample of network.incoming.packets.drop: Failed to inspect data of instance <name=instance-0000006e, id=01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.049 12 DEBUG ceilometer.compute.pollsters [-] 4255e358-6db2-4947-a3d6-4e045f9235fb/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '62551792-bd38-4a6d-9073-e649b8dc2a81', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': 'instance-00000066-128d5729-9e60-43d9-b1a4-fa8fb6e0e619-tapb275e55b-7c', 'timestamp': '2025-09-30T21:36:44.048846', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-522911333', 'name': 'tapb275e55b-7c', 'instance_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:86:d7:f5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb275e55b-7c'}, 'message_id': '8f2cb6bc-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.727277562, 'message_signature': '559aa074dadb729506a41fa156f2aa5ceaa94b5eda88503cdfed183333fddd51'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': 'instance-0000006b-fc7fafcc-8033-4e17-b63e-a79fced7ec46-tape0da584f-ea', 'timestamp': '2025-09-30T21:36:44.048846', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1640893469', 'name': 'tape0da584f-ea', 'instance_id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46', 'instance_type': 'm1.nano', 'host': '824a0c93d5696b22c2af81ed178d68b0a97bf635bd13c66c760315cd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '29834554-3ec3-4459-bfde-932aa778e979'}, 'image_ref': '29834554-3ec3-4459-bfde-932aa778e979', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6c:d3:b3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape0da584f-ea'}, 'message_id': '8f2cbf40-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.729794345, 'message_signature': '8f873a06b5eb481aba07c822d261ce93cdb6166ce522db3e24fc382a0a2644d7'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': 'instance-00000062-4255e358-6db2-4947-a3d6-4e045f9235fb-tapb0db3e99-94', 'timestamp': '2025-09-30T21:36:44.048846', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-475541027', 'name': 'tapb0db3e99-94', 'instance_id': '4255e358-6db2-4947-a3d6-4e045f9235fb', 'instance_type': 'm1.nano', 'host': 
'6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:03:b5:6d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb0db3e99-94'}, 'message_id': '8f2ce1aa-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.732483762, 'message_signature': '774d19dd60304df87d0a65cc159c60a1f4cb88381e4e8a832f43628e9a3417aa'}]}, 'timestamp': '2025-09-30 21:36:44.050173', '_unique_id': '4612f619824d431d90e51960f2a92778'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:36:44 compute-0 rsyslogd[1007]: message too long (8192) with configured size 8096, begin of message is: 2025-09-30 21:36:44.038 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.050 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.051 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.051 12 DEBUG ceilometer.compute.pollsters [-] 128d5729-9e60-43d9-b1a4-fa8fb6e0e619/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.051 12 DEBUG ceilometer.compute.pollsters [-] fc7fafcc-8033-4e17-b63e-a79fced7ec46/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.052 12 DEBUG ceilometer.compute.pollsters [-] Instance 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6 was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-0000006e, id=01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.052 12 DEBUG ceilometer.compute.pollsters [-] 4255e358-6db2-4947-a3d6-4e045f9235fb/network.outgoing.bytes volume: 1620 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '41877f76-d054-40e5-b9ce-a29c19c6cacf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': 'instance-00000066-128d5729-9e60-43d9-b1a4-fa8fb6e0e619-tapb275e55b-7c', 'timestamp': '2025-09-30T21:36:44.051495', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-522911333', 'name': 'tapb275e55b-7c', 'instance_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:86:d7:f5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb275e55b-7c'}, 'message_id': '8f2d207a-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.727277562, 'message_signature': 'cc39e9449a2fd57e61014b034ecf0b579d67ba1c6443fa8b2f6ac8985d2cb77e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': 
'3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': 'instance-0000006b-fc7fafcc-8033-4e17-b63e-a79fced7ec46-tape0da584f-ea', 'timestamp': '2025-09-30T21:36:44.051495', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1640893469', 'name': 'tape0da584f-ea', 'instance_id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46', 'instance_type': 'm1.nano', 'host': '824a0c93d5696b22c2af81ed178d68b0a97bf635bd13c66c760315cd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '29834554-3ec3-4459-bfde-932aa778e979'}, 'image_ref': '29834554-3ec3-4459-bfde-932aa778e979', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6c:d3:b3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape0da584f-ea'}, 'message_id': '8f2d28f4-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.729794345, 'message_signature': '6eaa85d31790ea4919dfc67d96b088437ea974e0da289792126dbccf5180a080'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1620, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': 'instance-00000062-4255e358-6db2-4947-a3d6-4e045f9235fb-tapb0db3e99-94', 'timestamp': '2025-09-30T21:36:44.051495', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-475541027', 'name': 'tapb0db3e99-94', 'instance_id': '4255e358-6db2-4947-a3d6-4e045f9235fb', 'instance_type': 'm1.nano', 'host': 
'6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:03:b5:6d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb0db3e99-94'}, 'message_id': '8f2d4cf8-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.732483762, 'message_signature': '65714048a4e99d5cdb84206769a917925b608aeaad13068ec21b5a6cc5cb52cc'}]}, 'timestamp': '2025-09-30 21:36:44.052920', '_unique_id': 'd87f49852b8f45c0b9036ddc59c9284d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.053 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.054 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.054 12 DEBUG ceilometer.compute.pollsters [-] 128d5729-9e60-43d9-b1a4-fa8fb6e0e619/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.054 12 DEBUG ceilometer.compute.pollsters [-] fc7fafcc-8033-4e17-b63e-a79fced7ec46/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.055 12 DEBUG ceilometer.compute.pollsters [-] Instance 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6 was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance <name=instance-0000006e, id=01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.055 12 DEBUG ceilometer.compute.pollsters [-] 4255e358-6db2-4947-a3d6-4e045f9235fb/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f5b36688-3468-459a-914a-24455314b8bc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': 'instance-00000066-128d5729-9e60-43d9-b1a4-fa8fb6e0e619-tapb275e55b-7c', 'timestamp': '2025-09-30T21:36:44.054250', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-522911333', 'name': 'tapb275e55b-7c', 'instance_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:86:d7:f5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb275e55b-7c'}, 'message_id': '8f2d89ac-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.727277562, 'message_signature': 'ba1929d092a0472923d9523ec0a4f245dec5f9a247d06a9529dd70de16519cf7'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': 'instance-0000006b-fc7fafcc-8033-4e17-b63e-a79fced7ec46-tape0da584f-ea', 'timestamp': '2025-09-30T21:36:44.054250', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1640893469', 'name': 'tape0da584f-ea', 'instance_id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46', 'instance_type': 'm1.nano', 'host': '824a0c93d5696b22c2af81ed178d68b0a97bf635bd13c66c760315cd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '29834554-3ec3-4459-bfde-932aa778e979'}, 'image_ref': '29834554-3ec3-4459-bfde-932aa778e979', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6c:d3:b3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape0da584f-ea'}, 'message_id': '8f2d94ba-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.729794345, 'message_signature': 'aa03e974713087a79b60ef28ba395104c06b1ba7bd391f50f6d3ed3d9b773a38'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': 'instance-00000062-4255e358-6db2-4947-a3d6-4e045f9235fb-tapb0db3e99-94', 'timestamp': '2025-09-30T21:36:44.054250', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-475541027', 'name': 'tapb0db3e99-94', 'instance_id': '4255e358-6db2-4947-a3d6-4e045f9235fb', 'instance_type': 'm1.nano', 'host': 
'6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:03:b5:6d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb0db3e99-94'}, 'message_id': '8f2db418-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.732483762, 'message_signature': 'a3c6547164bdc50e888742bc9d7ce1ddf6efda2d80816c2cd920763ec7fcca05'}]}, 'timestamp': '2025-09-30 21:36:44.055618', '_unique_id': '315e5ec098fe4381a59d4e2442394d94'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.056 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-522911333>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1640893469>, <NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-76631792>, <NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-475541027>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-522911333>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1640893469>, <NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-76631792>, <NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-475541027>]
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.057 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.057 12 DEBUG ceilometer.compute.pollsters [-] 128d5729-9e60-43d9-b1a4-fa8fb6e0e619/memory.usage volume: 42.22265625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.057 12 DEBUG ceilometer.compute.pollsters [-] fc7fafcc-8033-4e17-b63e-a79fced7ec46/memory.usage volume: 46.73046875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 DEBUG ceilometer.compute.pollsters [-] Instance 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6 was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-0000006e, id=01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 DEBUG ceilometer.compute.pollsters [-] 4255e358-6db2-4947-a3d6-4e045f9235fb/memory.usage volume: 42.22265625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '34048e32-f02e-4f93-af40-31f6a7a44376', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.22265625, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619', 'timestamp': '2025-09-30T21:36:44.057087', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-522911333', 'name': 'instance-00000066', 'instance_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '8f2df874-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.639215677, 'message_signature': 'fe27bbfbf1fc8b4f0d6ea50d96a8902eb50fa101c52210e80fa9162f6c0c16a5'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 46.73046875, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': 
'fc7fafcc-8033-4e17-b63e-a79fced7ec46', 'timestamp': '2025-09-30T21:36:44.057087', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1640893469', 'name': 'instance-0000006b', 'instance_id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46', 'instance_type': 'm1.nano', 'host': '824a0c93d5696b22c2af81ed178d68b0a97bf635bd13c66c760315cd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '29834554-3ec3-4459-bfde-932aa778e979'}, 'image_ref': '29834554-3ec3-4459-bfde-932aa778e979', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '8f2e0198-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.653625397, 'message_signature': 'd72a5ed124439e10d497c6bc5d36f973248ca64e056af7043e68183d63f17e47'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.22265625, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': '4255e358-6db2-4947-a3d6-4e045f9235fb', 'timestamp': '2025-09-30T21:36:44.057087', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-475541027', 'name': 'instance-00000062', 'instance_id': '4255e358-6db2-4947-a3d6-4e045f9235fb', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 
'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '8f2e20d8-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.670055036, 'message_signature': '4debb13843d1a966c6f7add1c6c98e6734b145cebd4f87f9729e920ac3ebab37'}]}, 'timestamp': '2025-09-30 21:36:44.058335', '_unique_id': '01850a6f4c9b4d7fa2fe4abbf5dd9519'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.058 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.059 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.059 12 DEBUG ceilometer.compute.pollsters [-] 128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk.device.read.latency volume: 835807053 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.059 12 DEBUG ceilometer.compute.pollsters [-] 128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk.device.read.latency volume: 43429510 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.059 12 DEBUG ceilometer.compute.pollsters [-] fc7fafcc-8033-4e17-b63e-a79fced7ec46/disk.device.read.latency volume: 595740426 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.060 12 DEBUG ceilometer.compute.pollsters [-] fc7fafcc-8033-4e17-b63e-a79fced7ec46/disk.device.read.latency volume: 43278219 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.060 12 DEBUG ceilometer.compute.pollsters [-] Instance 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6 was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-0000006e, id=01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.061 12 DEBUG ceilometer.compute.pollsters [-] 4255e358-6db2-4947-a3d6-4e045f9235fb/disk.device.read.latency volume: 702356286 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.061 12 DEBUG ceilometer.compute.pollsters [-] 4255e358-6db2-4947-a3d6-4e045f9235fb/disk.device.read.latency volume: 38993262 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '869372e2-da3f-4e59-b7c0-276d97a98456', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 835807053, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619-vda', 'timestamp': '2025-09-30T21:36:44.059471', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-522911333', 'name': 'instance-00000066', 'instance_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8f2e568e-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.67260131, 'message_signature': '436c0e9c625300c046a5da6238bcdee73bcbbf9277bdbf1cc917f015bc746a58'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 43429510, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 
'project_name': None, 'resource_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619-sda', 'timestamp': '2025-09-30T21:36:44.059471', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-522911333', 'name': 'instance-00000066', 'instance_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8f2e60b6-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.67260131, 'message_signature': 'bb4575a318094468eac326bf524381c120e809a2c3db641809dd7ff4286b06de'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 595740426, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46-vda', 'timestamp': '2025-09-30T21:36:44.059471', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1640893469', 'name': 'instance-0000006b', 'instance_id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46', 'instance_type': 'm1.nano', 'host': '824a0c93d5696b22c2af81ed178d68b0a97bf635bd13c66c760315cd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': '29834554-3ec3-4459-bfde-932aa778e979'}, 'image_ref': '29834554-3ec3-4459-bfde-932aa778e979', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8f2e6836-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.692618909, 'message_signature': '746997fda27a4ac848411736334e766dddbfa644c776c62639e6f2277ab20aa3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 43278219, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46-sda', 'timestamp': '2025-09-30T21:36:44.059471', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1640893469', 'name': 'instance-0000006b', 'instance_id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46', 'instance_type': 'm1.nano', 'host': '824a0c93d5696b22c2af81ed178d68b0a97bf635bd13c66c760315cd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '29834554-3ec3-4459-bfde-932aa778e979'}, 'image_ref': '29834554-3ec3-4459-bfde-932aa778e979', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8f2e6f70-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.692618909, 'message_signature': '5e238d41cfcf0e49643c8eaae7584ecb7534b86c28d57013ab55e618f274338f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 702356286, 'user_id': 
'8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': '4255e358-6db2-4947-a3d6-4e045f9235fb-vda', 'timestamp': '2025-09-30T21:36:44.059471', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-475541027', 'name': 'instance-00000062', 'instance_id': '4255e358-6db2-4947-a3d6-4e045f9235fb', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8f2e936a-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.709183361, 'message_signature': 'e579f69ac6aca5863e40ead3977f7d7330ca4617556ed331582426dd53fcf94b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 38993262, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': '4255e358-6db2-4947-a3d6-4e045f9235fb-sda', 'timestamp': '2025-09-30T21:36:44.059471', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-475541027', 'name': 'instance-00000062', 'instance_id': '4255e358-6db2-4947-a3d6-4e045f9235fb', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 
'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8f2e9b08-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.709183361, 'message_signature': '66b97862cba647d3f9891c62283c59b470afa1a783d6a39eaabf7a1b8b4e0f8e'}]}, 'timestamp': '2025-09-30 21:36:44.061492', '_unique_id': '4ed984edf4a340bfae2a6eaff583196f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.062 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.073 12 DEBUG ceilometer.compute.pollsters [-] 128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk.device.usage volume: 30081024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.073 12 DEBUG ceilometer.compute.pollsters [-] 128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.082 12 DEBUG ceilometer.compute.pollsters [-] fc7fafcc-8033-4e17-b63e-a79fced7ec46/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.082 12 DEBUG ceilometer.compute.pollsters [-] fc7fafcc-8033-4e17-b63e-a79fced7ec46/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.083 12 DEBUG ceilometer.compute.pollsters [-] Instance 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6 was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-0000006e, id=01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.094 12 DEBUG ceilometer.compute.pollsters [-] 4255e358-6db2-4947-a3d6-4e045f9235fb/disk.device.usage volume: 30081024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.094 12 DEBUG ceilometer.compute.pollsters [-] 4255e358-6db2-4947-a3d6-4e045f9235fb/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '57a3de5c-d7c7-4ad4-808a-1e567d9bcf16', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30081024, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619-vda', 'timestamp': '2025-09-30T21:36:44.063020', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-522911333', 'name': 'instance-00000066', 'instance_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8f306960-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.750513652, 'message_signature': 'c1ce1791c03eadc85d681f469f0efaa08db00d444851d50f057859c1770d7b7b'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': 
'128d5729-9e60-43d9-b1a4-fa8fb6e0e619-sda', 'timestamp': '2025-09-30T21:36:44.063020', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-522911333', 'name': 'instance-00000066', 'instance_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8f307374-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.750513652, 'message_signature': 'ba70a7d8cc260a5c4a2c08b398db0e9d78cdc54e96d8c6ced6e42ba294e48487'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46-vda', 'timestamp': '2025-09-30T21:36:44.063020', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1640893469', 'name': 'instance-0000006b', 'instance_id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46', 'instance_type': 'm1.nano', 'host': '824a0c93d5696b22c2af81ed178d68b0a97bf635bd13c66c760315cd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'29834554-3ec3-4459-bfde-932aa778e979'}, 'image_ref': '29834554-3ec3-4459-bfde-932aa778e979', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8f31d21e-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.761097735, 'message_signature': 'dc9390bc1e5196a057348dcfcbf53f4a2c7437d1248fcf4d6c5bf245be28e162'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46-sda', 'timestamp': '2025-09-30T21:36:44.063020', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1640893469', 'name': 'instance-0000006b', 'instance_id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46', 'instance_type': 'm1.nano', 'host': '824a0c93d5696b22c2af81ed178d68b0a97bf635bd13c66c760315cd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '29834554-3ec3-4459-bfde-932aa778e979'}, 'image_ref': '29834554-3ec3-4459-bfde-932aa778e979', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8f31dc3c-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.761097735, 'message_signature': '34c21af1bc55375215e520c01799a5e77d2aa9263a43fa9be192a01a09de3b11'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30081024, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 
'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': '4255e358-6db2-4947-a3d6-4e045f9235fb-vda', 'timestamp': '2025-09-30T21:36:44.063020', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-475541027', 'name': 'instance-00000062', 'instance_id': '4255e358-6db2-4947-a3d6-4e045f9235fb', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8f33a3d2-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.770956401, 'message_signature': '21ca28d2cdfc42224e4f8658d34b0d632cc1d86a623faaf33b4adb11adfa8724'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': '4255e358-6db2-4947-a3d6-4e045f9235fb-sda', 'timestamp': '2025-09-30T21:36:44.063020', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-475541027', 'name': 'instance-00000062', 'instance_id': '4255e358-6db2-4947-a3d6-4e045f9235fb', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 
'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: ': 1, 'disk_name': 'sda'}, 'message_id': '8f33add2-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.770956401, 'message_signature': 'a61417681061817b0d31ffab460f3df3e054008846285e4f016b26f02dc1d4f5'}]}, 'timestamp': '2025-09-30 21:36:44.094732', '_unique_id': '0ece1793ded44d6f858e014d3277fe6c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:36:44 compute-0 rsyslogd[1007]: message too long (8192) with configured size 8096, begin of message is: 2025-09-30 21:36:44.062 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.096 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.096 12 DEBUG ceilometer.compute.pollsters [-] 128d5729-9e60-43d9-b1a4-fa8fb6e0e619/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.096 12 DEBUG ceilometer.compute.pollsters [-] fc7fafcc-8033-4e17-b63e-a79fced7ec46/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.097 12 DEBUG ceilometer.compute.pollsters [-] Instance 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6 was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance <name=instance-0000006e, id=01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.097 12 DEBUG ceilometer.compute.pollsters [-] 4255e358-6db2-4947-a3d6-4e045f9235fb/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '42bb1b56-a5a7-469d-b5a4-9e4c2f69884a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': 'instance-00000066-128d5729-9e60-43d9-b1a4-fa8fb6e0e619-tapb275e55b-7c', 'timestamp': '2025-09-30T21:36:44.096441', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-522911333', 'name': 'tapb275e55b-7c', 'instance_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:86:d7:f5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb275e55b-7c'}, 'message_id': '8f33fb98-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.727277562, 'message_signature': '235062fd74b1256391a4971005b4cb424a4894c0354c96129f85044d1a84c230'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': 'instance-0000006b-fc7fafcc-8033-4e17-b63e-a79fced7ec46-tape0da584f-ea', 'timestamp': '2025-09-30T21:36:44.096441', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1640893469', 'name': 'tape0da584f-ea', 'instance_id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46', 'instance_type': 'm1.nano', 'host': '824a0c93d5696b22c2af81ed178d68b0a97bf635bd13c66c760315cd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '29834554-3ec3-4459-bfde-932aa778e979'}, 'image_ref': '29834554-3ec3-4459-bfde-932aa778e979', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6c:d3:b3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape0da584f-ea'}, 'message_id': '8f340494-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.729794345, 'message_signature': '77607826eb949d02ac46ee27c0668a51162861a038645eda0910581a508acf32'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': 'instance-00000062-4255e358-6db2-4947-a3d6-4e045f9235fb-tapb0db3e99-94', 'timestamp': '2025-09-30T21:36:44.096441', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-475541027', 'name': 'tapb0db3e99-94', 'instance_id': '4255e358-6db2-4947-a3d6-4e045f9235fb', 'instance_type': 'm1.nano', 
'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:03:b5:6d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb0db3e99-94'}, 'message_id': '8f3426a4-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.732483762, 'message_signature': '7aac51ce32d45e4a99a72e7f537012754dfd6695a07eaf698c32abdcce30963e'}]}, 'timestamp': '2025-09-30 21:36:44.097826', '_unique_id': '3459811bc136426c9a204b2c0f7c7d6c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.098 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.099 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.099 12 DEBUG ceilometer.compute.pollsters [-] 128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk.device.read.bytes volume: 32057344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.099 12 DEBUG ceilometer.compute.pollsters [-] 128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.099 12 DEBUG ceilometer.compute.pollsters [-] fc7fafcc-8033-4e17-b63e-a79fced7ec46/disk.device.read.bytes volume: 30648832 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.099 12 DEBUG ceilometer.compute.pollsters [-] fc7fafcc-8033-4e17-b63e-a79fced7ec46/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.100 12 DEBUG ceilometer.compute.pollsters [-] Instance 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6 was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-0000006e, id=01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.100 12 DEBUG ceilometer.compute.pollsters [-] 4255e358-6db2-4947-a3d6-4e045f9235fb/disk.device.read.bytes volume: 32057344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.100 12 DEBUG ceilometer.compute.pollsters [-] 4255e358-6db2-4947-a3d6-4e045f9235fb/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8e2bb007-d3bd-4733-b04d-f3e95d8773c4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 32057344, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619-vda', 'timestamp': '2025-09-30T21:36:44.099092', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-522911333', 'name': 'instance-00000066', 'instance_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8f346146-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.67260131, 'message_signature': '4200242fc63de63816750706e314ccee196050f93ab6b78c7ed49a17238bb6d0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 
'resource_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619-sda', 'timestamp': '2025-09-30T21:36:44.099092', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-522911333', 'name': 'instance-00000066', 'instance_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8f3469de-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.67260131, 'message_signature': '07478214f46a6fad4b270d4f7bde6bd1115556e407f85fc562de434a70b67935'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30648832, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46-vda', 'timestamp': '2025-09-30T21:36:44.099092', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1640893469', 'name': 'instance-0000006b', 'instance_id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46', 'instance_type': 'm1.nano', 'host': '824a0c93d5696b22c2af81ed178d68b0a97bf635bd13c66c760315cd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '29834554-3ec3-4459-bfde-932aa778e979'}, 'image_ref': '29834554-3ec3-4459-bfde-932aa778e979', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8f34723a-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.692618909, 'message_signature': 'dd6467762210a5072f2951b765e3c94ed60da1ad79546ff5aa6b9aa596fd8333'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46-sda', 'timestamp': '2025-09-30T21:36:44.099092', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1640893469', 'name': 'instance-0000006b', 'instance_id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46', 'instance_type': 'm1.nano', 'host': '824a0c93d5696b22c2af81ed178d68b0a97bf635bd13c66c760315cd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '29834554-3ec3-4459-bfde-932aa778e979'}, 'image_ref': '29834554-3ec3-4459-bfde-932aa778e979', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8f347974-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.692618909, 'message_signature': '6efadd987e3f470a86338262e32211fe8a01612dd41940427bedac1ad0a275e8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 32057344, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 
'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': '4255e358-6db2-4947-a3d6-4e045f9235fb-vda', 'timestamp': '2025-09-30T21:36:44.099092', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-475541027', 'name': 'instance-00000062', 'instance_id': '4255e358-6db2-4947-a3d6-4e045f9235fb', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8f3498c8-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.709183361, 'message_signature': '8c45353154899d605b349dd50d765507a8e8bf918e1eb0318bbb0d99eee7ab01'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': '4255e358-6db2-4947-a3d6-4e045f9235fb-sda', 'timestamp': '2025-09-30T21:36:44.099092', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-475541027', 'name': 'instance-00000062', 'instance_id': '4255e358-6db2-4947-a3d6-4e045f9235fb', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 
'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, '
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8f34a0ca-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.709183361, 'message_signature': '9a9c517f16972746318f266313f76750c72f077e7367008bf05b43e09fe7a75e'}]}, 'timestamp': '2025-09-30 21:36:44.100938', '_unique_id': '6151326ce2df45a698233325205b76d7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:36:44 compute-0 rsyslogd[1007]: message too long (8192) with configured size 8096, begin of message is: 2025-09-30 21:36:44.095 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.102 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.102 12 DEBUG ceilometer.compute.pollsters [-] 128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk.device.allocation volume: 30744576 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.102 12 DEBUG ceilometer.compute.pollsters [-] 128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.102 12 DEBUG ceilometer.compute.pollsters [-] fc7fafcc-8033-4e17-b63e-a79fced7ec46/disk.device.allocation volume: 30154752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.103 12 DEBUG ceilometer.compute.pollsters [-] fc7fafcc-8033-4e17-b63e-a79fced7ec46/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.103 12 DEBUG ceilometer.compute.pollsters [-] Instance 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6 was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-0000006e, id=01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.103 12 DEBUG ceilometer.compute.pollsters [-] 4255e358-6db2-4947-a3d6-4e045f9235fb/disk.device.allocation volume: 30482432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 DEBUG ceilometer.compute.pollsters [-] 4255e358-6db2-4947-a3d6-4e045f9235fb/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '004f1a7f-81d2-4ea0-aa4a-cb01c327f1b5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30744576, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619-vda', 'timestamp': '2025-09-30T21:36:44.102347', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-522911333', 'name': 'instance-00000066', 'instance_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8f34e0a8-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.750513652, 'message_signature': '615514714b83da6592808e34efe7791e0fac7f0fefc2d15ec09b70dfd8aa7e7e'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 
'resource_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619-sda', 'timestamp': '2025-09-30T21:36:44.102347', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-522911333', 'name': 'instance-00000066', 'instance_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8f34eb2a-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.750513652, 'message_signature': '02bd0aab5fde54d1910a3d2bf7a9c421c7a2e2f3f7def7bfc9be2e9149038b56'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30154752, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46-vda', 'timestamp': '2025-09-30T21:36:44.102347', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1640893469', 'name': 'instance-0000006b', 'instance_id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46', 'instance_type': 'm1.nano', 'host': '824a0c93d5696b22c2af81ed178d68b0a97bf635bd13c66c760315cd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'29834554-3ec3-4459-bfde-932aa778e979'}, 'image_ref': '29834554-3ec3-4459-bfde-932aa778e979', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8f34f34a-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.761097735, 'message_signature': '30efd1f16bce71ad2a0788f30d568d900949a102d895acd0d17d62ab82246020'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46-sda', 'timestamp': '2025-09-30T21:36:44.102347', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1640893469', 'name': 'instance-0000006b', 'instance_id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46', 'instance_type': 'm1.nano', 'host': '824a0c93d5696b22c2af81ed178d68b0a97bf635bd13c66c760315cd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '29834554-3ec3-4459-bfde-932aa778e979'}, 'image_ref': '29834554-3ec3-4459-bfde-932aa778e979', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8f34fc32-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.761097735, 'message_signature': '9b2c97052c05979654f4eab9e6d37af30b50224de9134d502fa6adcb493523b3'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30482432, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 
'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': '4255e358-6db2-4947-a3d6-4e045f9235fb-vda', 'timestamp': '2025-09-30T21:36:44.102347', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-475541027', 'name': 'instance-00000062', 'instance_id': '4255e358-6db2-4947-a3d6-4e045f9235fb', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8f35192e-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.770956401, 'message_signature': '4dacfe2aaf2b5e2f4384434034c869a561a01d4d8cc65c32f36b9bbcfac10ede'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': '4255e358-6db2-4947-a3d6-4e045f9235fb-sda', 'timestamp': '2025-09-30T21:36:44.102347', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-475541027', 'name': 'instance-00000062', 'instance_id': '4255e358-6db2-4947-a3d6-4e045f9235fb', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 
'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8f352180-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.770956401, 'message_signature': '296a2712a1920d9acc573eed63228fa4d1738c6cc182abcc9b10d810ba3bdfcd'}]}, 'timestamp': '2025-09-30 21:36:44.104235', '_unique_id': '75a0d2e83967444f9a8e373f6745a816'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.105 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.105 12 DEBUG ceilometer.compute.pollsters [-] 128d5729-9e60-43d9-b1a4-fa8fb6e0e619/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.106 12 DEBUG ceilometer.compute.pollsters [-] fc7fafcc-8033-4e17-b63e-a79fced7ec46/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.106 12 DEBUG ceilometer.compute.pollsters [-] Instance 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6 was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-0000006e, id=01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 DEBUG ceilometer.compute.pollsters [-] 4255e358-6db2-4947-a3d6-4e045f9235fb/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '775614f1-77a6-45f2-952b-17992a60c27e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': 'instance-00000066-128d5729-9e60-43d9-b1a4-fa8fb6e0e619-tapb275e55b-7c', 'timestamp': '2025-09-30T21:36:44.105940', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-522911333', 'name': 'tapb275e55b-7c', 'instance_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:86:d7:f5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb275e55b-7c'}, 'message_id': '8f356dc0-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.727277562, 'message_signature': 'ebb330caa3a21996fad35aa8fc62b3c10bb648cae8c12f488e8643bff8859a9c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': 'instance-0000006b-fc7fafcc-8033-4e17-b63e-a79fced7ec46-tape0da584f-ea', 'timestamp': '2025-09-30T21:36:44.105940', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1640893469', 'name': 'tape0da584f-ea', 'instance_id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46', 'instance_type': 'm1.nano', 'host': '824a0c93d5696b22c2af81ed178d68b0a97bf635bd13c66c760315cd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '29834554-3ec3-4459-bfde-932aa778e979'}, 'image_ref': '29834554-3ec3-4459-bfde-932aa778e979', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6c:d3:b3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape0da584f-ea'}, 'message_id': '8f35769e-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.729794345, 'message_signature': 'e6fe0a13058e167b2db332cbea1967132d4ca52fe70580d91f44fe179fc7f9b7'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': 'instance-00000062-4255e358-6db2-4947-a3d6-4e045f9235fb-tapb0db3e99-94', 'timestamp': '2025-09-30T21:36:44.105940', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-475541027', 'name': 'tapb0db3e99-94', 'instance_id': '4255e358-6db2-4947-a3d6-4e045f9235fb', 'instance_type': 'm1.nano', 'host': 
'6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:03:b5:6d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb0db3e99-94'}, 'message_id': '8f35970a-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.732483762, 'message_signature': '79c3f649c120cc7e4a1122f2fd15fd698e6fd48a30f676bfd9d8f0f4c4fc8916'}]}, 'timestamp': '2025-09-30 21:36:44.107247', '_unique_id': '801f37c75185482da59dbe23238a33e2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.107 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.108 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.108 12 DEBUG ceilometer.compute.pollsters [-] 128d5729-9e60-43d9-b1a4-fa8fb6e0e619/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.108 12 DEBUG ceilometer.compute.pollsters [-] fc7fafcc-8033-4e17-b63e-a79fced7ec46/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.109 12 DEBUG ceilometer.compute.pollsters [-] Instance 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6 was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance <name=instance-0000006e, id=01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.109 12 DEBUG ceilometer.compute.pollsters [-] 4255e358-6db2-4947-a3d6-4e045f9235fb/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1ae44844-17f0-48e8-835d-e42cfc28ae16', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': 'instance-00000066-128d5729-9e60-43d9-b1a4-fa8fb6e0e619-tapb275e55b-7c', 'timestamp': '2025-09-30T21:36:44.108710', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-522911333', 'name': 'tapb275e55b-7c', 'instance_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:86:d7:f5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb275e55b-7c'}, 'message_id': '8f35d95e-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.727277562, 'message_signature': '9df7e69184cfcf2e3472677fde7b46a6f41c6cc9bd1b6f236b8272da54053b47'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': 'instance-0000006b-fc7fafcc-8033-4e17-b63e-a79fced7ec46-tape0da584f-ea', 'timestamp': '2025-09-30T21:36:44.108710', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1640893469', 'name': 'tape0da584f-ea', 'instance_id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46', 'instance_type': 'm1.nano', 'host': '824a0c93d5696b22c2af81ed178d68b0a97bf635bd13c66c760315cd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '29834554-3ec3-4459-bfde-932aa778e979'}, 'image_ref': '29834554-3ec3-4459-bfde-932aa778e979', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6c:d3:b3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape0da584f-ea'}, 'message_id': '8f35e250-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.729794345, 'message_signature': 'f1e5cf891aa34e5bbf04f02234e5d3d95c8461a6d114d3bd09234ff110023393'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': 'instance-00000062-4255e358-6db2-4947-a3d6-4e045f9235fb-tapb0db3e99-94', 'timestamp': '2025-09-30T21:36:44.108710', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-475541027', 'name': 'tapb0db3e99-94', 'instance_id': '4255e358-6db2-4947-a3d6-4e045f9235fb', 'instance_type': 'm1.nano', 
'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:03:b5:6d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb0db3e99-94'}, 'message_id': '8f35fe84-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.732483762, 'message_signature': '7999a86bdaf4a04ed9baa167c9f206653a54928d206242ef0f8078a9256424ac'}]}, 'timestamp': '2025-09-30 21:36:44.109899', '_unique_id': '8f75259d2c524ac89ec20c641ec202d6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:36:44 compute-0 rsyslogd[1007]: message too long (8192) with configured size 8096, begin of message is: 2025-09-30 21:36:44.101 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.110 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.111 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.111 12 DEBUG ceilometer.compute.pollsters [-] 128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk.device.write.requests volume: 37 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.111 12 DEBUG ceilometer.compute.pollsters [-] 128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.111 12 DEBUG ceilometer.compute.pollsters [-] fc7fafcc-8033-4e17-b63e-a79fced7ec46/disk.device.write.requests volume: 311 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.111 12 DEBUG ceilometer.compute.pollsters [-] fc7fafcc-8033-4e17-b63e-a79fced7ec46/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.112 12 DEBUG ceilometer.compute.pollsters [-] Instance 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6 was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-0000006e, id=01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.112 12 DEBUG ceilometer.compute.pollsters [-] 4255e358-6db2-4947-a3d6-4e045f9235fb/disk.device.write.requests volume: 50 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.112 12 DEBUG ceilometer.compute.pollsters [-] 4255e358-6db2-4947-a3d6-4e045f9235fb/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd68c4450-bb9b-460e-946e-25d0c3d672d8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 37, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619-vda', 'timestamp': '2025-09-30T21:36:44.111113', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-522911333', 'name': 'instance-00000066', 'instance_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8f3636ba-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.67260131, 'message_signature': 'b7a1b861706ca7e0ec9aa0deb5645439b28bd14ed8e2d5be29c941ea15929f44'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 
'project_name': None, 'resource_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619-sda', 'timestamp': '2025-09-30T21:36:44.111113', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-522911333', 'name': 'instance-00000066', 'instance_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8f363f34-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.67260131, 'message_signature': '9833684acabedca66fb905598f396c48ed397aca228e18cb493ae53a3e13806e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 311, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46-vda', 'timestamp': '2025-09-30T21:36:44.111113', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1640893469', 'name': 'instance-0000006b', 'instance_id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46', 'instance_type': 'm1.nano', 'host': '824a0c93d5696b22c2af81ed178d68b0a97bf635bd13c66c760315cd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': '29834554-3ec3-4459-bfde-932aa778e979'}, 'image_ref': '29834554-3ec3-4459-bfde-932aa778e979', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8f36495c-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.692618909, 'message_signature': 'b01644e82abc3963054569babafabdeadaffd6e15b46087e08097060f9f7bfdf'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46-sda', 'timestamp': '2025-09-30T21:36:44.111113', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1640893469', 'name': 'instance-0000006b', 'instance_id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46', 'instance_type': 'm1.nano', 'host': '824a0c93d5696b22c2af81ed178d68b0a97bf635bd13c66c760315cd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '29834554-3ec3-4459-bfde-932aa778e979'}, 'image_ref': '29834554-3ec3-4459-bfde-932aa778e979', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8f365230-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.692618909, 'message_signature': '3a2e652905b3ee50b639c59852764eb8110f798c95c05be51e55d8b2d3c3330c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 50, 'user_id': 
'8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': '4255e358-6db2-4947-a3d6-4e045f9235fb-vda', 'timestamp': '2025-09-30T21:36:44.111113', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-475541027', 'name': 'instance-00000062', 'instance_id': '4255e358-6db2-4947-a3d6-4e045f9235fb', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8f367512-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.709183361, 'message_signature': 'e1855aec15e42dfd1c6cca7b295559b47c6450a9aa424bb2ce30a75995b3ba0f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': '4255e358-6db2-4947-a3d6-4e045f9235fb-sda', 'timestamp': '2025-09-30T21:36:44.111113', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-475541027', 'name': 'instance-00000062', 'instance_id': '4255e358-6db2-4947-a3d6-4e045f9235fb', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 
'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: _type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8f367d32-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.709183361, 'message_signature': 'd36e6a5629e1b6bf31aca205cd03ea41853729a5392117487baef454baebc4f2'}]}, 'timestamp': '2025-09-30 21:36:44.113136', '_unique_id': 'a46677d6636f45979d865e2b52da0d9b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.114 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.114 12 DEBUG ceilometer.compute.pollsters [-] 128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk.device.write.latency volume: 75136353 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.114 12 DEBUG ceilometer.compute.pollsters [-] 128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.115 12 DEBUG ceilometer.compute.pollsters [-] fc7fafcc-8033-4e17-b63e-a79fced7ec46/disk.device.write.latency volume: 3445677288 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.115 12 DEBUG ceilometer.compute.pollsters [-] fc7fafcc-8033-4e17-b63e-a79fced7ec46/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.115 12 DEBUG ceilometer.compute.pollsters [-] Instance 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6 was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-0000006e, id=01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.115 12 DEBUG ceilometer.compute.pollsters [-] 4255e358-6db2-4947-a3d6-4e045f9235fb/disk.device.write.latency volume: 65350337 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 DEBUG ceilometer.compute.pollsters [-] 4255e358-6db2-4947-a3d6-4e045f9235fb/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 rsyslogd[1007]: message too long (8192) with configured size 8096, begin of message is: 2025-09-30 21:36:44.104 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a44991be-aa48-45f6-9078-525a1633c1fd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 75136353, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619-vda', 'timestamp': '2025-09-30T21:36:44.114473', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-522911333', 'name': 'instance-00000066', 'instance_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8f36bca2-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.67260131, 'message_signature': '08f4d70550b1b75a560cd890bda778da632fd06a4e3e6c340093ce8561b6beef'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': 
None, 'resource_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619-sda', 'timestamp': '2025-09-30T21:36:44.114473', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-522911333', 'name': 'instance-00000066', 'instance_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8f36c6de-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.67260131, 'message_signature': 'e7c4c7a54a8516f17778b487eaff1a43985d53c8feeb6c34fe09a7baa4f95b1c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3445677288, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46-vda', 'timestamp': '2025-09-30T21:36:44.114473', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1640893469', 'name': 'instance-0000006b', 'instance_id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46', 'instance_type': 'm1.nano', 'host': '824a0c93d5696b22c2af81ed178d68b0a97bf635bd13c66c760315cd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': '29834554-3ec3-4459-bfde-932aa778e979'}, 'image_ref': '29834554-3ec3-4459-bfde-932aa778e979', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8f36cf3a-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.692618909, 'message_signature': 'e4322f66a0f912dcb69cef9282e0197fe50d7484df350c69102166d2246325e4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46-sda', 'timestamp': '2025-09-30T21:36:44.114473', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1640893469', 'name': 'instance-0000006b', 'instance_id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46', 'instance_type': 'm1.nano', 'host': '824a0c93d5696b22c2af81ed178d68b0a97bf635bd13c66c760315cd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '29834554-3ec3-4459-bfde-932aa778e979'}, 'image_ref': '29834554-3ec3-4459-bfde-932aa778e979', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8f36d642-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.692618909, 'message_signature': 'c7880203879985335484aa8246bc4e75e9578fa78a5c1fc69461e7b4988b2224'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 65350337, 'user_id': 
'8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': '4255e358-6db2-4947-a3d6-4e045f9235fb-vda', 'timestamp': '2025-09-30T21:36:44.114473', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-475541027', 'name': 'instance-00000062', 'instance_id': '4255e358-6db2-4947-a3d6-4e045f9235fb', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8f36f226-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.709183361, 'message_signature': 'c590115795ce683a24df09a1ef7aabb565f7d2d1bf64d8976b4354aca9bfd5ad'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': '4255e358-6db2-4947-a3d6-4e045f9235fb-sda', 'timestamp': '2025-09-30T21:36:44.114473', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-475541027', 'name': 'instance-00000062', 'instance_id': '4255e358-6db2-4947-a3d6-4e045f9235fb', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 
1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'v
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: cpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8f36fa00-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.709183361, 'message_signature': 'd7bacfd63b07643e6ab07149bf7f7f62f550c8a11c49047e43f8f1059c6f9dbe'}]}, 'timestamp': '2025-09-30 21:36:44.116337', '_unique_id': 'a7adc8f4d4e74ff1bfb3369bb0cfef06'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.117 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.117 12 DEBUG ceilometer.compute.pollsters [-] 128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.117 12 DEBUG ceilometer.compute.pollsters [-] 128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.118 12 DEBUG ceilometer.compute.pollsters [-] fc7fafcc-8033-4e17-b63e-a79fced7ec46/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.118 12 DEBUG ceilometer.compute.pollsters [-] fc7fafcc-8033-4e17-b63e-a79fced7ec46/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.118 12 DEBUG ceilometer.compute.pollsters [-] Instance 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6 was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-0000006e, id=01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.119 12 DEBUG ceilometer.compute.pollsters [-] 4255e358-6db2-4947-a3d6-4e045f9235fb/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.119 12 DEBUG ceilometer.compute.pollsters [-] 4255e358-6db2-4947-a3d6-4e045f9235fb/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fb9ac17e-4011-48a8-8835-18ee6fe989b5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619-vda', 'timestamp': '2025-09-30T21:36:44.117656', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-522911333', 'name': 'instance-00000066', 'instance_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8f37374a-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.750513652, 'message_signature': '0c2d21eb529cfa99898bfcfba70af3617c7ab48dee6e4a21a4c4814f0b9ad986'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 
'resource_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619-sda', 'timestamp': '2025-09-30T21:36:44.117656', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-522911333', 'name': 'instance-00000066', 'instance_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8f3741f4-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.750513652, 'message_signature': '3d0513d8fc7c0ad0970da7fc86b9d292a77a1b2477366678e3f7e82f38c30d3f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46-vda', 'timestamp': '2025-09-30T21:36:44.117656', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1640893469', 'name': 'instance-0000006b', 'instance_id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46', 'instance_type': 'm1.nano', 'host': '824a0c93d5696b22c2af81ed178d68b0a97bf635bd13c66c760315cd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'29834554-3ec3-4459-bfde-932aa778e979'}, 'image_ref': '29834554-3ec3-4459-bfde-932aa778e979', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8f374a3c-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.761097735, 'message_signature': 'ca5414c70eaa3e6ad71d569e82adab3c4c143fac998ebda069ff9bf1a79096e1'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46-sda', 'timestamp': '2025-09-30T21:36:44.117656', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1640893469', 'name': 'instance-0000006b', 'instance_id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46', 'instance_type': 'm1.nano', 'host': '824a0c93d5696b22c2af81ed178d68b0a97bf635bd13c66c760315cd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '29834554-3ec3-4459-bfde-932aa778e979'}, 'image_ref': '29834554-3ec3-4459-bfde-932aa778e979', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8f3752b6-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.761097735, 'message_signature': '272e2659f6e018f6314f6a2e3ac81e6ee4a94f09c31342a2189617ec20031653'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 
'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': '4255e358-6db2-4947-a3d6-4e045f9235fb-vda', 'timestamp': '2025-09-30T21:36:44.117656', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-475541027', 'name': 'instance-00000062', 'instance_id': '4255e358-6db2-4947-a3d6-4e045f9235fb', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8f376ec2-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.770956401, 'message_signature': 'b39e05d2c531458e989adbb59137a331827651537421efc87b66ec71bceebca5'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': '4255e358-6db2-4947-a3d6-4e045f9235fb-sda', 'timestamp': '2025-09-30T21:36:44.117656', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-475541027', 'name': 'instance-00000062', 'instance_id': '4255e358-6db2-4947-a3d6-4e045f9235fb', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 
'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ep
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: hemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8f3777e6-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.770956401, 'message_signature': '029316b6b8263c5ee030de12639661865fb4555562d90a1071d0cda46dce0a30'}]}, 'timestamp': '2025-09-30 21:36:44.119557', '_unique_id': '9245bc91b0524bdd9055ccd889527ead'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.120 12 DEBUG ceilometer.compute.pollsters [-] 128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk.device.read.requests volume: 1213 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.121 12 DEBUG ceilometer.compute.pollsters [-] 128d5729-9e60-43d9-b1a4-fa8fb6e0e619/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.121 12 DEBUG ceilometer.compute.pollsters [-] fc7fafcc-8033-4e17-b63e-a79fced7ec46/disk.device.read.requests volume: 1101 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.121 12 DEBUG ceilometer.compute.pollsters [-] fc7fafcc-8033-4e17-b63e-a79fced7ec46/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.122 12 DEBUG ceilometer.compute.pollsters [-] Instance 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6 was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-0000006e, id=01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.122 12 DEBUG ceilometer.compute.pollsters [-] 4255e358-6db2-4947-a3d6-4e045f9235fb/disk.device.read.requests volume: 1213 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.122 12 DEBUG ceilometer.compute.pollsters [-] 4255e358-6db2-4947-a3d6-4e045f9235fb/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fb757922-3900-4a3c-9fe3-1b879112ad46', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1213, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619-vda', 'timestamp': '2025-09-30T21:36:44.120938', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-522911333', 'name': 'instance-00000066', 'instance_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8f37b6a2-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.67260131, 'message_signature': '27d05d1e19c226827cb86253b753fbdc72ddd9aab1d0b44b243394c214919f55'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 
'project_name': None, 'resource_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619-sda', 'timestamp': '2025-09-30T21:36:44.120938', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-522911333', 'name': 'instance-00000066', 'instance_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8f37be04-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.67260131, 'message_signature': 'f5d94625d1d0c16a3dc799b5e2223c3da158e72c4498ee593bb71ee99fa6c456'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1101, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46-vda', 'timestamp': '2025-09-30T21:36:44.120938', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1640893469', 'name': 'instance-0000006b', 'instance_id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46', 'instance_type': 'm1.nano', 'host': '824a0c93d5696b22c2af81ed178d68b0a97bf635bd13c66c760315cd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': '29834554-3ec3-4459-bfde-932aa778e979'}, 'image_ref': '29834554-3ec3-4459-bfde-932aa778e979', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8f37c552-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.692618909, 'message_signature': '9a013e9a28631e3a276ccf42f996d2b17e63657e3a890e3349089858ff43324c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46-sda', 'timestamp': '2025-09-30T21:36:44.120938', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1640893469', 'name': 'instance-0000006b', 'instance_id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46', 'instance_type': 'm1.nano', 'host': '824a0c93d5696b22c2af81ed178d68b0a97bf635bd13c66c760315cd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '29834554-3ec3-4459-bfde-932aa778e979'}, 'image_ref': '29834554-3ec3-4459-bfde-932aa778e979', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8f37ce08-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.692618909, 'message_signature': '6b2430572bf24867a85c27d2b0fdf42c8fc15a5f4e588329f0012ac46746e158'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1213, 'user_id': 
'8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': '4255e358-6db2-4947-a3d6-4e045f9235fb-vda', 'timestamp': '2025-09-30T21:36:44.120938', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-475541027', 'name': 'instance-00000062', 'instance_id': '4255e358-6db2-4947-a3d6-4e045f9235fb', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8f37ec94-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.709183361, 'message_signature': '4fb49736224ca3c762918fa76fc2eeff1c42c8ac95001940785cbced80b4d345'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': '4255e358-6db2-4947-a3d6-4e045f9235fb-sda', 'timestamp': '2025-09-30T21:36:44.120938', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-475541027', 'name': 'instance-00000062', 'instance_id': '4255e358-6db2-4947-a3d6-4e045f9235fb', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 
'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64'
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: , 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8f37f59a-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.709183361, 'message_signature': '4497b502a51d02d853d53d356e853e2a3082239202f024e886832286c585f374'}]}, 'timestamp': '2025-09-30 21:36:44.122765', '_unique_id': '1b4ece03943f4187b6f5bb7cf93850b9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.123 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.124 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.124 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-522911333>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1640893469>, <NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-76631792>, <NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-475541027>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-522911333>, <NovaLikeServer: tempest-ListServerFiltersTestJSON-instance-1640893469>, <NovaLikeServer: tempest-ServerRescueTestJSONUnderV235-server-76631792>, <NovaLikeServer: tempest-ServerStableDeviceRescueTest-server-475541027>]
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.124 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.124 12 DEBUG ceilometer.compute.pollsters [-] 128d5729-9e60-43d9-b1a4-fa8fb6e0e619/network.incoming.packets volume: 12 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.124 12 DEBUG ceilometer.compute.pollsters [-] fc7fafcc-8033-4e17-b63e-a79fced7ec46/network.incoming.packets volume: 12 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.125 12 DEBUG ceilometer.compute.pollsters [-] Instance 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6 was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance <name=instance-0000006e, id=01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.125 12 DEBUG ceilometer.compute.pollsters [-] 4255e358-6db2-4947-a3d6-4e045f9235fb/network.incoming.packets volume: 23 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ac9a917d-9eaa-479e-899a-1dbc06ab07c0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 12, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': 'instance-00000066-128d5729-9e60-43d9-b1a4-fa8fb6e0e619-tapb275e55b-7c', 'timestamp': '2025-09-30T21:36:44.124295', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-522911333', 'name': 'tapb275e55b-7c', 'instance_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:86:d7:f5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb275e55b-7c'}, 'message_id': '8f383a6e-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.727277562, 'message_signature': '78b654b50a97842a8ba556ab8bd76ebd690cda2285bc772266da23d3e7be275c'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 12, 
'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': 'instance-0000006b-fc7fafcc-8033-4e17-b63e-a79fced7ec46-tape0da584f-ea', 'timestamp': '2025-09-30T21:36:44.124295', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1640893469', 'name': 'tape0da584f-ea', 'instance_id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46', 'instance_type': 'm1.nano', 'host': '824a0c93d5696b22c2af81ed178d68b0a97bf635bd13c66c760315cd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '29834554-3ec3-4459-bfde-932aa778e979'}, 'image_ref': '29834554-3ec3-4459-bfde-932aa778e979', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6c:d3:b3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape0da584f-ea'}, 'message_id': '8f384428-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.729794345, 'message_signature': '28d00f4e14b3d522db10cc6bad3d85ffa457fce92d1422e84a7198cde93235ee'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 23, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': 'instance-00000062-4255e358-6db2-4947-a3d6-4e045f9235fb-tapb0db3e99-94', 'timestamp': '2025-09-30T21:36:44.124295', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-475541027', 'name': 'tapb0db3e99-94', 'instance_id': '4255e358-6db2-4947-a3d6-4e045f9235fb', 'instance_type': 'm1.nano', 'host': 
'6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:03:b5:6d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb0db3e99-94'}, 'message_id': '8f385f1c-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.732483762, 'message_signature': 'dae101849a459d10404f3b74a967b1aa1caba45f445164ad2bce8be2cca9f81b'}]}, 'timestamp': '2025-09-30 21:36:44.125473', '_unique_id': '660dfccdfc4942c3bec8399cc2d59856'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.126 12 DEBUG ceilometer.compute.pollsters [-] 128d5729-9e60-43d9-b1a4-fa8fb6e0e619/network.incoming.bytes volume: 1646 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.127 12 DEBUG ceilometer.compute.pollsters [-] fc7fafcc-8033-4e17-b63e-a79fced7ec46/network.incoming.bytes volume: 1646 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.127 12 DEBUG ceilometer.compute.pollsters [-] Instance 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6 was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance <name=instance-0000006e, id=01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.127 12 DEBUG ceilometer.compute.pollsters [-] 4255e358-6db2-4947-a3d6-4e045f9235fb/network.incoming.bytes volume: 2156 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0b24489e-d9c5-4b3c-9196-054a18a51171', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1646, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': 'instance-00000066-128d5729-9e60-43d9-b1a4-fa8fb6e0e619-tapb275e55b-7c', 'timestamp': '2025-09-30T21:36:44.126848', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-522911333', 'name': 'tapb275e55b-7c', 'instance_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:86:d7:f5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb275e55b-7c'}, 'message_id': '8f389df6-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.727277562, 'message_signature': 'fede1c8ce11b5115472ae410b813ae681a6b3775469a6d65998c062c23d00182'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1646, 'user_id': 
'3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': 'instance-0000006b-fc7fafcc-8033-4e17-b63e-a79fced7ec46-tape0da584f-ea', 'timestamp': '2025-09-30T21:36:44.126848', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1640893469', 'name': 'tape0da584f-ea', 'instance_id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46', 'instance_type': 'm1.nano', 'host': '824a0c93d5696b22c2af81ed178d68b0a97bf635bd13c66c760315cd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '29834554-3ec3-4459-bfde-932aa778e979'}, 'image_ref': '29834554-3ec3-4459-bfde-932aa778e979', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6c:d3:b3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape0da584f-ea'}, 'message_id': '8f38a79c-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.729794345, 'message_signature': '8fefc795b8964a99a4a45c9263dab2491ec6e53c866d220ed70f0d3761ef89f4'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2156, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': 'instance-00000062-4255e358-6db2-4947-a3d6-4e045f9235fb-tapb0db3e99-94', 'timestamp': '2025-09-30T21:36:44.126848', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-475541027', 'name': 'tapb0db3e99-94', 'instance_id': '4255e358-6db2-4947-a3d6-4e045f9235fb', 'instance_type': 'm1.nano', 'host': 
'6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:03:b5:6d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb0db3e99-94'}, 'message_id': '8f38c7b8-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.732483762, 'message_signature': '00b688206e382940bad5d77d748e5551808df5111998fd4cdbc849152015531c'}]}, 'timestamp': '2025-09-30 21:36:44.128168', '_unique_id': '38543565ade543a88fd6e784c13b1c88'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.128 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 rsyslogd[1007]: message too long (8192) with configured size 8096, begin of message is: 2025-09-30 21:36:44.113 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.129 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.129 12 DEBUG ceilometer.compute.pollsters [-] 128d5729-9e60-43d9-b1a4-fa8fb6e0e619/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.129 12 DEBUG ceilometer.compute.pollsters [-] fc7fafcc-8033-4e17-b63e-a79fced7ec46/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.130 12 DEBUG ceilometer.compute.pollsters [-] Instance 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6 was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance <name=instance-0000006e, id=01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.130 12 DEBUG ceilometer.compute.pollsters [-] 4255e358-6db2-4947-a3d6-4e045f9235fb/network.outgoing.packets volume: 16 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '106bc754-3f46-45fe-b3e8-a409be054315', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': 'instance-00000066-128d5729-9e60-43d9-b1a4-fa8fb6e0e619-tapb275e55b-7c', 'timestamp': '2025-09-30T21:36:44.129446', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-522911333', 'name': 'tapb275e55b-7c', 'instance_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:86:d7:f5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb275e55b-7c'}, 'message_id': '8f3903fe-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.727277562, 'message_signature': '189cf87bb85a1a99fdba23d6c20d076682dd900f883160d2a395c00ec1531b6d'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': '3746d13787f042a1bfad4de0c42015eb', 'user_name': None, 'project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'project_name': None, 'resource_id': 'instance-0000006b-fc7fafcc-8033-4e17-b63e-a79fced7ec46-tape0da584f-ea', 'timestamp': '2025-09-30T21:36:44.129446', 'resource_metadata': {'display_name': 'tempest-ListServerFiltersTestJSON-instance-1640893469', 'name': 'tape0da584f-ea', 'instance_id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46', 'instance_type': 'm1.nano', 'host': '824a0c93d5696b22c2af81ed178d68b0a97bf635bd13c66c760315cd', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '29834554-3ec3-4459-bfde-932aa778e979'}, 'image_ref': '29834554-3ec3-4459-bfde-932aa778e979', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6c:d3:b3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape0da584f-ea'}, 'message_id': '8f390bf6-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.729794345, 'message_signature': 'b56dc13e75dbffd45cc9e0550286aa366c9f87caad1b9d51a960f0889b21b467'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 16, 'user_id': '8b1ebef014c145cbbe1e367bfd2c2ba3', 'user_name': None, 'project_id': '8978d2df88a5434c8794b659033cca5e', 'project_name': None, 'resource_id': 'instance-00000062-4255e358-6db2-4947-a3d6-4e045f9235fb-tapb0db3e99-94', 'timestamp': '2025-09-30T21:36:44.129446', 'resource_metadata': {'display_name': 'tempest-ServerStableDeviceRescueTest-server-475541027', 'name': 'tapb0db3e99-94', 'instance_id': '4255e358-6db2-4947-a3d6-4e045f9235fb', 'instance_type': 'm1.nano', 'host': '6e1e2adb3fbd052d0d2dc058fdd214e1c9ba4141d9c27a6a95074a38', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:03:b5:6d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb0db3e99-94'}, 'message_id': '8f392910-9e45-11f0-a153-fa163e09b122', 'monotonic_time': 4899.732483762, 'message_signature': '8fb438e6365c049d42eb089cc2ab20a1394873b8d7f6d1d8176239165233cd67'}]}, 'timestamp': '2025-09-30 21:36:44.130659', '_unique_id': 'b24fa6b43753477eb8583ea9faa4ff5b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:36:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:36:44.131 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:36:44 compute-0 rsyslogd[1007]: message too long (8192) with configured size 8096, begin of message is: 2025-09-30 21:36:44.116 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Sep 30 21:36:44 compute-0 rsyslogd[1007]: message too long (8192) with configured size 8096, begin of message is: 2025-09-30 21:36:44.120 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Sep 30 21:36:44 compute-0 rsyslogd[1007]: message too long (8192) with configured size 8096, begin of message is: 2025-09-30 21:36:44.123 12 ERROR oslo_messaging.notify.messaging [-] Could not s [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2445 ]
Sep 30 21:36:45 compute-0 nova_compute[192810]: 2025-09-30 21:36:45.033 2 INFO nova.virt.libvirt.driver [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Creating config drive at /var/lib/nova/instances/01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6/disk.config
Sep 30 21:36:45 compute-0 nova_compute[192810]: 2025-09-30 21:36:45.038 2 DEBUG oslo_concurrency.processutils [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphu6ne7in execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:36:45 compute-0 nova_compute[192810]: 2025-09-30 21:36:45.164 2 DEBUG oslo_concurrency.processutils [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphu6ne7in" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:36:45 compute-0 kernel: tap7e8f7bb1-f1: entered promiscuous mode
Sep 30 21:36:45 compute-0 NetworkManager[51733]: <info>  [1759268205.2310] manager: (tap7e8f7bb1-f1): new Tun device (/org/freedesktop/NetworkManager/Devices/187)
Sep 30 21:36:45 compute-0 ovn_controller[94912]: 2025-09-30T21:36:45Z|00409|binding|INFO|Claiming lport 7e8f7bb1-f155-40d6-8e85-f4222da3c027 for this chassis.
Sep 30 21:36:45 compute-0 ovn_controller[94912]: 2025-09-30T21:36:45Z|00410|binding|INFO|7e8f7bb1-f155-40d6-8e85-f4222da3c027: Claiming fa:16:3e:86:ba:8b 10.100.0.7
Sep 30 21:36:45 compute-0 nova_compute[192810]: 2025-09-30 21:36:45.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:45.243 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:86:ba:8b 10.100.0.7'], port_security=['fa:16:3e:86:ba:8b 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6f659d19-7261-4309-81bc-b7bb9a9da219', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '793bce1316e34dada7414560b74789f4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5fd3b1da-f9d4-4128-be3a-89511bb35b84', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=63360a12-7489-4e6b-8985-1b0f556f3468, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=7e8f7bb1-f155-40d6-8e85-f4222da3c027) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:36:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:45.244 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 7e8f7bb1-f155-40d6-8e85-f4222da3c027 in datapath 6f659d19-7261-4309-81bc-b7bb9a9da219 bound to our chassis
Sep 30 21:36:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:45.245 103867 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6f659d19-7261-4309-81bc-b7bb9a9da219 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Sep 30 21:36:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:45.246 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[f1412695-726b-4779-bf90-f2443bfc1a90]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:45 compute-0 systemd-udevd[236914]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:36:45 compute-0 NetworkManager[51733]: <info>  [1759268205.2840] device (tap7e8f7bb1-f1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:36:45 compute-0 NetworkManager[51733]: <info>  [1759268205.2850] device (tap7e8f7bb1-f1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:36:45 compute-0 systemd-machined[152794]: New machine qemu-52-instance-0000006e.
Sep 30 21:36:45 compute-0 nova_compute[192810]: 2025-09-30 21:36:45.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:45 compute-0 ovn_controller[94912]: 2025-09-30T21:36:45Z|00411|binding|INFO|Setting lport 7e8f7bb1-f155-40d6-8e85-f4222da3c027 ovn-installed in OVS
Sep 30 21:36:45 compute-0 ovn_controller[94912]: 2025-09-30T21:36:45Z|00412|binding|INFO|Setting lport 7e8f7bb1-f155-40d6-8e85-f4222da3c027 up in Southbound
Sep 30 21:36:45 compute-0 nova_compute[192810]: 2025-09-30 21:36:45.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:45 compute-0 systemd[1]: Started Virtual Machine qemu-52-instance-0000006e.
Sep 30 21:36:45 compute-0 ovn_controller[94912]: 2025-09-30T21:36:45Z|00413|binding|INFO|Releasing lport 10034ee7-d74d-45f3-b835-201b62e1bcd6 from this chassis (sb_readonly=0)
Sep 30 21:36:45 compute-0 ovn_controller[94912]: 2025-09-30T21:36:45Z|00414|binding|INFO|Releasing lport 36b3f8fd-6b0e-45c8-9c31-56cd0812c366 from this chassis (sb_readonly=0)
Sep 30 21:36:45 compute-0 nova_compute[192810]: 2025-09-30 21:36:45.844 2 DEBUG nova.compute.manager [req-aa26e21e-805e-4abf-b533-87e1f38b4c19 req-a53e4831-d050-41bb-8d35-0335970840a2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Received event network-vif-plugged-7e8f7bb1-f155-40d6-8e85-f4222da3c027 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:36:45 compute-0 nova_compute[192810]: 2025-09-30 21:36:45.845 2 DEBUG oslo_concurrency.lockutils [req-aa26e21e-805e-4abf-b533-87e1f38b4c19 req-a53e4831-d050-41bb-8d35-0335970840a2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:45 compute-0 nova_compute[192810]: 2025-09-30 21:36:45.845 2 DEBUG oslo_concurrency.lockutils [req-aa26e21e-805e-4abf-b533-87e1f38b4c19 req-a53e4831-d050-41bb-8d35-0335970840a2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:45 compute-0 nova_compute[192810]: 2025-09-30 21:36:45.846 2 DEBUG oslo_concurrency.lockutils [req-aa26e21e-805e-4abf-b533-87e1f38b4c19 req-a53e4831-d050-41bb-8d35-0335970840a2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:45 compute-0 nova_compute[192810]: 2025-09-30 21:36:45.846 2 DEBUG nova.compute.manager [req-aa26e21e-805e-4abf-b533-87e1f38b4c19 req-a53e4831-d050-41bb-8d35-0335970840a2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Processing event network-vif-plugged-7e8f7bb1-f155-40d6-8e85-f4222da3c027 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:36:46 compute-0 nova_compute[192810]: 2025-09-30 21:36:46.019 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268206.0195358, 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:36:46 compute-0 nova_compute[192810]: 2025-09-30 21:36:46.020 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] VM Started (Lifecycle Event)
Sep 30 21:36:46 compute-0 nova_compute[192810]: 2025-09-30 21:36:46.022 2 DEBUG nova.compute.manager [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:36:46 compute-0 nova_compute[192810]: 2025-09-30 21:36:46.025 2 DEBUG nova.virt.libvirt.driver [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:36:46 compute-0 nova_compute[192810]: 2025-09-30 21:36:46.029 2 INFO nova.virt.libvirt.driver [-] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Instance spawned successfully.
Sep 30 21:36:46 compute-0 nova_compute[192810]: 2025-09-30 21:36:46.029 2 DEBUG nova.virt.libvirt.driver [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:36:46 compute-0 nova_compute[192810]: 2025-09-30 21:36:46.036 2 DEBUG nova.network.neutron [req-4059b6d7-152a-4376-8ee4-35d5b6e09519 req-4d6d0b12-ff55-4c15-8de2-462305d3ad6b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Updated VIF entry in instance network info cache for port 7e8f7bb1-f155-40d6-8e85-f4222da3c027. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:36:46 compute-0 nova_compute[192810]: 2025-09-30 21:36:46.037 2 DEBUG nova.network.neutron [req-4059b6d7-152a-4376-8ee4-35d5b6e09519 req-4d6d0b12-ff55-4c15-8de2-462305d3ad6b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Updating instance_info_cache with network_info: [{"id": "7e8f7bb1-f155-40d6-8e85-f4222da3c027", "address": "fa:16:3e:86:ba:8b", "network": {"id": "6f659d19-7261-4309-81bc-b7bb9a9da219", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1930279008-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "793bce1316e34dada7414560b74789f4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e8f7bb1-f1", "ovs_interfaceid": "7e8f7bb1-f155-40d6-8e85-f4222da3c027", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:36:46 compute-0 nova_compute[192810]: 2025-09-30 21:36:46.058 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:36:46 compute-0 nova_compute[192810]: 2025-09-30 21:36:46.060 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:36:46 compute-0 nova_compute[192810]: 2025-09-30 21:36:46.065 2 DEBUG oslo_concurrency.lockutils [req-4059b6d7-152a-4376-8ee4-35d5b6e09519 req-4d6d0b12-ff55-4c15-8de2-462305d3ad6b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:36:46 compute-0 nova_compute[192810]: 2025-09-30 21:36:46.068 2 DEBUG nova.virt.libvirt.driver [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:36:46 compute-0 nova_compute[192810]: 2025-09-30 21:36:46.069 2 DEBUG nova.virt.libvirt.driver [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:36:46 compute-0 nova_compute[192810]: 2025-09-30 21:36:46.069 2 DEBUG nova.virt.libvirt.driver [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:36:46 compute-0 nova_compute[192810]: 2025-09-30 21:36:46.069 2 DEBUG nova.virt.libvirt.driver [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:36:46 compute-0 nova_compute[192810]: 2025-09-30 21:36:46.070 2 DEBUG nova.virt.libvirt.driver [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:36:46 compute-0 nova_compute[192810]: 2025-09-30 21:36:46.070 2 DEBUG nova.virt.libvirt.driver [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:36:46 compute-0 nova_compute[192810]: 2025-09-30 21:36:46.099 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:36:46 compute-0 nova_compute[192810]: 2025-09-30 21:36:46.099 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268206.0214505, 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:36:46 compute-0 nova_compute[192810]: 2025-09-30 21:36:46.099 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] VM Paused (Lifecycle Event)
Sep 30 21:36:46 compute-0 nova_compute[192810]: 2025-09-30 21:36:46.152 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:36:46 compute-0 nova_compute[192810]: 2025-09-30 21:36:46.156 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268206.023864, 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:36:46 compute-0 nova_compute[192810]: 2025-09-30 21:36:46.156 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] VM Resumed (Lifecycle Event)
Sep 30 21:36:46 compute-0 nova_compute[192810]: 2025-09-30 21:36:46.192 2 INFO nova.compute.manager [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Took 7.99 seconds to spawn the instance on the hypervisor.
Sep 30 21:36:46 compute-0 nova_compute[192810]: 2025-09-30 21:36:46.192 2 DEBUG nova.compute.manager [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:36:46 compute-0 nova_compute[192810]: 2025-09-30 21:36:46.194 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:36:46 compute-0 nova_compute[192810]: 2025-09-30 21:36:46.199 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:36:46 compute-0 nova_compute[192810]: 2025-09-30 21:36:46.241 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:36:46 compute-0 nova_compute[192810]: 2025-09-30 21:36:46.324 2 INFO nova.compute.manager [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Took 8.64 seconds to build instance.
Sep 30 21:36:46 compute-0 nova_compute[192810]: 2025-09-30 21:36:46.400 2 DEBUG oslo_concurrency.lockutils [None req-4abc620f-c088-4b35-b8f8-d00fa2ea5a77 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Lock "01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.818s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:47 compute-0 nova_compute[192810]: 2025-09-30 21:36:47.627 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:47 compute-0 nova_compute[192810]: 2025-09-30 21:36:47.697 2 DEBUG oslo_concurrency.lockutils [None req-0a82b8f7-307e-47e8-836b-ffc97c92db26 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquiring lock "128d5729-9e60-43d9-b1a4-fa8fb6e0e619" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:47 compute-0 nova_compute[192810]: 2025-09-30 21:36:47.698 2 DEBUG oslo_concurrency.lockutils [None req-0a82b8f7-307e-47e8-836b-ffc97c92db26 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "128d5729-9e60-43d9-b1a4-fa8fb6e0e619" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:47 compute-0 nova_compute[192810]: 2025-09-30 21:36:47.698 2 DEBUG oslo_concurrency.lockutils [None req-0a82b8f7-307e-47e8-836b-ffc97c92db26 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquiring lock "128d5729-9e60-43d9-b1a4-fa8fb6e0e619-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:47 compute-0 nova_compute[192810]: 2025-09-30 21:36:47.698 2 DEBUG oslo_concurrency.lockutils [None req-0a82b8f7-307e-47e8-836b-ffc97c92db26 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "128d5729-9e60-43d9-b1a4-fa8fb6e0e619-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:47 compute-0 nova_compute[192810]: 2025-09-30 21:36:47.699 2 DEBUG oslo_concurrency.lockutils [None req-0a82b8f7-307e-47e8-836b-ffc97c92db26 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "128d5729-9e60-43d9-b1a4-fa8fb6e0e619-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:47 compute-0 nova_compute[192810]: 2025-09-30 21:36:47.712 2 INFO nova.compute.manager [None req-0a82b8f7-307e-47e8-836b-ffc97c92db26 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Terminating instance
Sep 30 21:36:47 compute-0 nova_compute[192810]: 2025-09-30 21:36:47.725 2 DEBUG nova.compute.manager [None req-0a82b8f7-307e-47e8-836b-ffc97c92db26 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:36:47 compute-0 kernel: tapb275e55b-7c (unregistering): left promiscuous mode
Sep 30 21:36:47 compute-0 NetworkManager[51733]: <info>  [1759268207.7542] device (tapb275e55b-7c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:36:47 compute-0 ovn_controller[94912]: 2025-09-30T21:36:47Z|00415|binding|INFO|Releasing lport b275e55b-7c62-4b31-a65b-129241b9139d from this chassis (sb_readonly=0)
Sep 30 21:36:47 compute-0 ovn_controller[94912]: 2025-09-30T21:36:47Z|00416|binding|INFO|Setting lport b275e55b-7c62-4b31-a65b-129241b9139d down in Southbound
Sep 30 21:36:47 compute-0 ovn_controller[94912]: 2025-09-30T21:36:47Z|00417|binding|INFO|Removing iface tapb275e55b-7c ovn-installed in OVS
Sep 30 21:36:47 compute-0 nova_compute[192810]: 2025-09-30 21:36:47.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:47.773 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:86:d7:f5 10.100.0.10'], port_security=['fa:16:3e:86:d7:f5 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '128d5729-9e60-43d9-b1a4-fa8fb6e0e619', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8978d2df88a5434c8794b659033cca5e', 'neutron:revision_number': '8', 'neutron:security_group_ids': '93b1b45c-82db-437e-88d0-4d5f76771b04', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e6b1a10b-a890-44de-9d6b-4b24b7ba0344, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=b275e55b-7c62-4b31-a65b-129241b9139d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:36:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:47.774 103867 INFO neutron.agent.ovn.metadata.agent [-] Port b275e55b-7c62-4b31-a65b-129241b9139d in datapath f5a6396a-b7b7-4ff1-a2af-27477fea2815 unbound from our chassis
Sep 30 21:36:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:47.776 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f5a6396a-b7b7-4ff1-a2af-27477fea2815
Sep 30 21:36:47 compute-0 nova_compute[192810]: 2025-09-30 21:36:47.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:47.794 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[eee4f966-cf3d-4cc4-bb4d-b7bd5fa9af9e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:47 compute-0 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d00000066.scope: Deactivated successfully.
Sep 30 21:36:47 compute-0 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d00000066.scope: Consumed 13.667s CPU time.
Sep 30 21:36:47 compute-0 systemd-machined[152794]: Machine qemu-50-instance-00000066 terminated.
Sep 30 21:36:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:47.830 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[3179d11d-d0c6-432f-8008-a09d888a0d3f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:47.834 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[550d6e29-8386-4726-94ae-87987ca28924]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:47.860 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[c72e1203-787d-4bbe-9245-866cc98b1257]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:47.878 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[1e3f58f3-4d24-432e-b566-ed34ada1d1fa]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf5a6396a-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:66:d5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 15, 'rx_bytes': 1000, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 15, 'rx_bytes': 1000, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 113], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 480957, 'reachable_time': 18506, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236940, 'error': None, 'target': 'ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:47.895 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a484be87-1181-41c5-a253-76ae1e07b1e9]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf5a6396a-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 480967, 'tstamp': 480967}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236941, 'error': None, 'target': 'ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf5a6396a-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 480970, 'tstamp': 480970}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236941, 'error': None, 'target': 'ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:47.897 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf5a6396a-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:47 compute-0 nova_compute[192810]: 2025-09-30 21:36:47.899 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:47 compute-0 nova_compute[192810]: 2025-09-30 21:36:47.903 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:47.904 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf5a6396a-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:47.904 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:36:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:47.904 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf5a6396a-b0, col_values=(('external_ids', {'iface-id': '10034ee7-d74d-45f3-b835-201b62e1bcd6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:47.905 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:36:47 compute-0 nova_compute[192810]: 2025-09-30 21:36:47.942 2 DEBUG nova.compute.manager [req-4ec55ae9-8030-4e14-90b7-94a63de21b6d req-306bc70f-a012-4f23-a753-fd0dde825cda dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Received event network-vif-plugged-7e8f7bb1-f155-40d6-8e85-f4222da3c027 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:36:47 compute-0 nova_compute[192810]: 2025-09-30 21:36:47.942 2 DEBUG oslo_concurrency.lockutils [req-4ec55ae9-8030-4e14-90b7-94a63de21b6d req-306bc70f-a012-4f23-a753-fd0dde825cda dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:47 compute-0 nova_compute[192810]: 2025-09-30 21:36:47.942 2 DEBUG oslo_concurrency.lockutils [req-4ec55ae9-8030-4e14-90b7-94a63de21b6d req-306bc70f-a012-4f23-a753-fd0dde825cda dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:47 compute-0 nova_compute[192810]: 2025-09-30 21:36:47.942 2 DEBUG oslo_concurrency.lockutils [req-4ec55ae9-8030-4e14-90b7-94a63de21b6d req-306bc70f-a012-4f23-a753-fd0dde825cda dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:47 compute-0 nova_compute[192810]: 2025-09-30 21:36:47.943 2 DEBUG nova.compute.manager [req-4ec55ae9-8030-4e14-90b7-94a63de21b6d req-306bc70f-a012-4f23-a753-fd0dde825cda dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] No waiting events found dispatching network-vif-plugged-7e8f7bb1-f155-40d6-8e85-f4222da3c027 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:36:47 compute-0 nova_compute[192810]: 2025-09-30 21:36:47.943 2 WARNING nova.compute.manager [req-4ec55ae9-8030-4e14-90b7-94a63de21b6d req-306bc70f-a012-4f23-a753-fd0dde825cda dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Received unexpected event network-vif-plugged-7e8f7bb1-f155-40d6-8e85-f4222da3c027 for instance with vm_state active and task_state None.
Sep 30 21:36:47 compute-0 nova_compute[192810]: 2025-09-30 21:36:47.989 2 INFO nova.virt.libvirt.driver [-] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Instance destroyed successfully.
Sep 30 21:36:47 compute-0 nova_compute[192810]: 2025-09-30 21:36:47.990 2 DEBUG nova.objects.instance [None req-0a82b8f7-307e-47e8-836b-ffc97c92db26 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lazy-loading 'resources' on Instance uuid 128d5729-9e60-43d9-b1a4-fa8fb6e0e619 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:36:48 compute-0 nova_compute[192810]: 2025-09-30 21:36:48.008 2 DEBUG nova.virt.libvirt.vif [None req-0a82b8f7-307e-47e8-836b-ffc97c92db26 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:35:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-522911333',display_name='tempest-ServerStableDeviceRescueTest-server-522911333',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-522911333',id=102,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:35:50Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8978d2df88a5434c8794b659033cca5e',ramdisk_id='',reservation_id='r-glkb8fbj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-1939201844',owner_user_name='tempest-ServerStableDeviceRescueTest-1939201844-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:35:57Z,user_data=None,user_id='8b1ebef014c145cbbe1e367bfd2c2ba3',uuid=128d5729-9e60-43d9-b1a4-fa8fb6e0e619,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b275e55b-7c62-4b31-a65b-129241b9139d", "address": "fa:16:3e:86:d7:f5", "network": {"id": "f5a6396a-b7b7-4ff1-a2af-27477fea2815", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8978d2df88a5434c8794b659033cca5e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb275e55b-7c", "ovs_interfaceid": "b275e55b-7c62-4b31-a65b-129241b9139d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:36:48 compute-0 nova_compute[192810]: 2025-09-30 21:36:48.008 2 DEBUG nova.network.os_vif_util [None req-0a82b8f7-307e-47e8-836b-ffc97c92db26 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Converting VIF {"id": "b275e55b-7c62-4b31-a65b-129241b9139d", "address": "fa:16:3e:86:d7:f5", "network": {"id": "f5a6396a-b7b7-4ff1-a2af-27477fea2815", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8978d2df88a5434c8794b659033cca5e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb275e55b-7c", "ovs_interfaceid": "b275e55b-7c62-4b31-a65b-129241b9139d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:36:48 compute-0 nova_compute[192810]: 2025-09-30 21:36:48.009 2 DEBUG nova.network.os_vif_util [None req-0a82b8f7-307e-47e8-836b-ffc97c92db26 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:86:d7:f5,bridge_name='br-int',has_traffic_filtering=True,id=b275e55b-7c62-4b31-a65b-129241b9139d,network=Network(f5a6396a-b7b7-4ff1-a2af-27477fea2815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb275e55b-7c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:36:48 compute-0 nova_compute[192810]: 2025-09-30 21:36:48.009 2 DEBUG os_vif [None req-0a82b8f7-307e-47e8-836b-ffc97c92db26 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:86:d7:f5,bridge_name='br-int',has_traffic_filtering=True,id=b275e55b-7c62-4b31-a65b-129241b9139d,network=Network(f5a6396a-b7b7-4ff1-a2af-27477fea2815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb275e55b-7c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:36:48 compute-0 nova_compute[192810]: 2025-09-30 21:36:48.011 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:48 compute-0 nova_compute[192810]: 2025-09-30 21:36:48.011 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb275e55b-7c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:48 compute-0 nova_compute[192810]: 2025-09-30 21:36:48.012 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:48 compute-0 nova_compute[192810]: 2025-09-30 21:36:48.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:48 compute-0 nova_compute[192810]: 2025-09-30 21:36:48.015 2 INFO os_vif [None req-0a82b8f7-307e-47e8-836b-ffc97c92db26 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:86:d7:f5,bridge_name='br-int',has_traffic_filtering=True,id=b275e55b-7c62-4b31-a65b-129241b9139d,network=Network(f5a6396a-b7b7-4ff1-a2af-27477fea2815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb275e55b-7c')
Sep 30 21:36:48 compute-0 nova_compute[192810]: 2025-09-30 21:36:48.015 2 INFO nova.virt.libvirt.driver [None req-0a82b8f7-307e-47e8-836b-ffc97c92db26 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Deleting instance files /var/lib/nova/instances/128d5729-9e60-43d9-b1a4-fa8fb6e0e619_del
Sep 30 21:36:48 compute-0 nova_compute[192810]: 2025-09-30 21:36:48.016 2 INFO nova.virt.libvirt.driver [None req-0a82b8f7-307e-47e8-836b-ffc97c92db26 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Deletion of /var/lib/nova/instances/128d5729-9e60-43d9-b1a4-fa8fb6e0e619_del complete
Sep 30 21:36:48 compute-0 nova_compute[192810]: 2025-09-30 21:36:48.143 2 INFO nova.compute.manager [None req-0a82b8f7-307e-47e8-836b-ffc97c92db26 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Took 0.42 seconds to destroy the instance on the hypervisor.
Sep 30 21:36:48 compute-0 nova_compute[192810]: 2025-09-30 21:36:48.144 2 DEBUG oslo.service.loopingcall [None req-0a82b8f7-307e-47e8-836b-ffc97c92db26 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:36:48 compute-0 nova_compute[192810]: 2025-09-30 21:36:48.144 2 DEBUG nova.compute.manager [-] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:36:48 compute-0 nova_compute[192810]: 2025-09-30 21:36:48.144 2 DEBUG nova.network.neutron [-] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:36:48 compute-0 nova_compute[192810]: 2025-09-30 21:36:48.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:36:49 compute-0 nova_compute[192810]: 2025-09-30 21:36:49.098 2 INFO nova.compute.manager [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Rescuing
Sep 30 21:36:49 compute-0 nova_compute[192810]: 2025-09-30 21:36:49.099 2 DEBUG oslo_concurrency.lockutils [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Acquiring lock "refresh_cache-01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:36:49 compute-0 nova_compute[192810]: 2025-09-30 21:36:49.099 2 DEBUG oslo_concurrency.lockutils [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Acquired lock "refresh_cache-01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:36:49 compute-0 nova_compute[192810]: 2025-09-30 21:36:49.099 2 DEBUG nova.network.neutron [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:36:49 compute-0 nova_compute[192810]: 2025-09-30 21:36:49.183 2 DEBUG nova.network.neutron [-] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:36:49 compute-0 nova_compute[192810]: 2025-09-30 21:36:49.202 2 INFO nova.compute.manager [-] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Took 1.06 seconds to deallocate network for instance.
Sep 30 21:36:49 compute-0 nova_compute[192810]: 2025-09-30 21:36:49.281 2 DEBUG oslo_concurrency.lockutils [None req-0a82b8f7-307e-47e8-836b-ffc97c92db26 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:49 compute-0 nova_compute[192810]: 2025-09-30 21:36:49.282 2 DEBUG oslo_concurrency.lockutils [None req-0a82b8f7-307e-47e8-836b-ffc97c92db26 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:49 compute-0 nova_compute[192810]: 2025-09-30 21:36:49.444 2 DEBUG nova.compute.provider_tree [None req-0a82b8f7-307e-47e8-836b-ffc97c92db26 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:36:49 compute-0 nova_compute[192810]: 2025-09-30 21:36:49.460 2 DEBUG nova.scheduler.client.report [None req-0a82b8f7-307e-47e8-836b-ffc97c92db26 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:36:49 compute-0 nova_compute[192810]: 2025-09-30 21:36:49.481 2 DEBUG oslo_concurrency.lockutils [None req-0a82b8f7-307e-47e8-836b-ffc97c92db26 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.199s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:49 compute-0 nova_compute[192810]: 2025-09-30 21:36:49.507 2 INFO nova.scheduler.client.report [None req-0a82b8f7-307e-47e8-836b-ffc97c92db26 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Deleted allocations for instance 128d5729-9e60-43d9-b1a4-fa8fb6e0e619
Sep 30 21:36:49 compute-0 nova_compute[192810]: 2025-09-30 21:36:49.577 2 DEBUG oslo_concurrency.lockutils [None req-0a82b8f7-307e-47e8-836b-ffc97c92db26 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "128d5729-9e60-43d9-b1a4-fa8fb6e0e619" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.880s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:49 compute-0 nova_compute[192810]: 2025-09-30 21:36:49.615 2 DEBUG nova.compute.manager [req-5b326be0-7031-41a5-bb29-800584a85085 req-b95d7e4d-0014-42d4-9d88-3bf2ec43ece2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Received event network-vif-unplugged-b275e55b-7c62-4b31-a65b-129241b9139d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:36:49 compute-0 nova_compute[192810]: 2025-09-30 21:36:49.616 2 DEBUG oslo_concurrency.lockutils [req-5b326be0-7031-41a5-bb29-800584a85085 req-b95d7e4d-0014-42d4-9d88-3bf2ec43ece2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "128d5729-9e60-43d9-b1a4-fa8fb6e0e619-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:49 compute-0 nova_compute[192810]: 2025-09-30 21:36:49.616 2 DEBUG oslo_concurrency.lockutils [req-5b326be0-7031-41a5-bb29-800584a85085 req-b95d7e4d-0014-42d4-9d88-3bf2ec43ece2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128d5729-9e60-43d9-b1a4-fa8fb6e0e619-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:49 compute-0 nova_compute[192810]: 2025-09-30 21:36:49.616 2 DEBUG oslo_concurrency.lockutils [req-5b326be0-7031-41a5-bb29-800584a85085 req-b95d7e4d-0014-42d4-9d88-3bf2ec43ece2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128d5729-9e60-43d9-b1a4-fa8fb6e0e619-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:49 compute-0 nova_compute[192810]: 2025-09-30 21:36:49.617 2 DEBUG nova.compute.manager [req-5b326be0-7031-41a5-bb29-800584a85085 req-b95d7e4d-0014-42d4-9d88-3bf2ec43ece2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] No waiting events found dispatching network-vif-unplugged-b275e55b-7c62-4b31-a65b-129241b9139d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:36:49 compute-0 nova_compute[192810]: 2025-09-30 21:36:49.617 2 WARNING nova.compute.manager [req-5b326be0-7031-41a5-bb29-800584a85085 req-b95d7e4d-0014-42d4-9d88-3bf2ec43ece2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Received unexpected event network-vif-unplugged-b275e55b-7c62-4b31-a65b-129241b9139d for instance with vm_state deleted and task_state None.
Sep 30 21:36:49 compute-0 nova_compute[192810]: 2025-09-30 21:36:49.617 2 DEBUG nova.compute.manager [req-5b326be0-7031-41a5-bb29-800584a85085 req-b95d7e4d-0014-42d4-9d88-3bf2ec43ece2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Received event network-vif-plugged-b275e55b-7c62-4b31-a65b-129241b9139d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:36:49 compute-0 nova_compute[192810]: 2025-09-30 21:36:49.617 2 DEBUG oslo_concurrency.lockutils [req-5b326be0-7031-41a5-bb29-800584a85085 req-b95d7e4d-0014-42d4-9d88-3bf2ec43ece2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "128d5729-9e60-43d9-b1a4-fa8fb6e0e619-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:49 compute-0 nova_compute[192810]: 2025-09-30 21:36:49.618 2 DEBUG oslo_concurrency.lockutils [req-5b326be0-7031-41a5-bb29-800584a85085 req-b95d7e4d-0014-42d4-9d88-3bf2ec43ece2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128d5729-9e60-43d9-b1a4-fa8fb6e0e619-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:49 compute-0 nova_compute[192810]: 2025-09-30 21:36:49.618 2 DEBUG oslo_concurrency.lockutils [req-5b326be0-7031-41a5-bb29-800584a85085 req-b95d7e4d-0014-42d4-9d88-3bf2ec43ece2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "128d5729-9e60-43d9-b1a4-fa8fb6e0e619-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:49 compute-0 nova_compute[192810]: 2025-09-30 21:36:49.618 2 DEBUG nova.compute.manager [req-5b326be0-7031-41a5-bb29-800584a85085 req-b95d7e4d-0014-42d4-9d88-3bf2ec43ece2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] No waiting events found dispatching network-vif-plugged-b275e55b-7c62-4b31-a65b-129241b9139d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:36:49 compute-0 nova_compute[192810]: 2025-09-30 21:36:49.619 2 WARNING nova.compute.manager [req-5b326be0-7031-41a5-bb29-800584a85085 req-b95d7e4d-0014-42d4-9d88-3bf2ec43ece2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Received unexpected event network-vif-plugged-b275e55b-7c62-4b31-a65b-129241b9139d for instance with vm_state deleted and task_state None.
Sep 30 21:36:49 compute-0 nova_compute[192810]: 2025-09-30 21:36:49.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:36:50 compute-0 nova_compute[192810]: 2025-09-30 21:36:50.026 2 DEBUG nova.compute.manager [req-6fc8a741-4949-435d-b101-79d0755f54a5 req-732c5347-0fa6-4f4a-8480-1f5d26bc0a2c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Received event network-vif-deleted-b275e55b-7c62-4b31-a65b-129241b9139d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:36:50 compute-0 nova_compute[192810]: 2025-09-30 21:36:50.480 2 DEBUG nova.network.neutron [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Updating instance_info_cache with network_info: [{"id": "7e8f7bb1-f155-40d6-8e85-f4222da3c027", "address": "fa:16:3e:86:ba:8b", "network": {"id": "6f659d19-7261-4309-81bc-b7bb9a9da219", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1930279008-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "793bce1316e34dada7414560b74789f4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e8f7bb1-f1", "ovs_interfaceid": "7e8f7bb1-f155-40d6-8e85-f4222da3c027", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:36:50 compute-0 nova_compute[192810]: 2025-09-30 21:36:50.503 2 DEBUG oslo_concurrency.lockutils [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Releasing lock "refresh_cache-01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:36:50 compute-0 nova_compute[192810]: 2025-09-30 21:36:50.807 2 DEBUG oslo_concurrency.lockutils [None req-6c4118af-2166-4c47-ab24-3b84136b89c2 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Acquiring lock "fc7fafcc-8033-4e17-b63e-a79fced7ec46" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:50 compute-0 nova_compute[192810]: 2025-09-30 21:36:50.808 2 DEBUG oslo_concurrency.lockutils [None req-6c4118af-2166-4c47-ab24-3b84136b89c2 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lock "fc7fafcc-8033-4e17-b63e-a79fced7ec46" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:50 compute-0 nova_compute[192810]: 2025-09-30 21:36:50.809 2 DEBUG oslo_concurrency.lockutils [None req-6c4118af-2166-4c47-ab24-3b84136b89c2 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Acquiring lock "fc7fafcc-8033-4e17-b63e-a79fced7ec46-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:50 compute-0 nova_compute[192810]: 2025-09-30 21:36:50.809 2 DEBUG oslo_concurrency.lockutils [None req-6c4118af-2166-4c47-ab24-3b84136b89c2 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lock "fc7fafcc-8033-4e17-b63e-a79fced7ec46-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:50 compute-0 nova_compute[192810]: 2025-09-30 21:36:50.809 2 DEBUG oslo_concurrency.lockutils [None req-6c4118af-2166-4c47-ab24-3b84136b89c2 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lock "fc7fafcc-8033-4e17-b63e-a79fced7ec46-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:50 compute-0 nova_compute[192810]: 2025-09-30 21:36:50.820 2 INFO nova.compute.manager [None req-6c4118af-2166-4c47-ab24-3b84136b89c2 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Terminating instance
Sep 30 21:36:50 compute-0 nova_compute[192810]: 2025-09-30 21:36:50.830 2 DEBUG nova.compute.manager [None req-6c4118af-2166-4c47-ab24-3b84136b89c2 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:36:50 compute-0 kernel: tape0da584f-ea (unregistering): left promiscuous mode
Sep 30 21:36:50 compute-0 NetworkManager[51733]: <info>  [1759268210.8608] device (tape0da584f-ea): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:36:50 compute-0 ovn_controller[94912]: 2025-09-30T21:36:50Z|00418|binding|INFO|Releasing lport e0da584f-ea7a-4ac9-a63b-aa5731a67762 from this chassis (sb_readonly=0)
Sep 30 21:36:50 compute-0 ovn_controller[94912]: 2025-09-30T21:36:50Z|00419|binding|INFO|Setting lport e0da584f-ea7a-4ac9-a63b-aa5731a67762 down in Southbound
Sep 30 21:36:50 compute-0 ovn_controller[94912]: 2025-09-30T21:36:50Z|00420|binding|INFO|Removing iface tape0da584f-ea ovn-installed in OVS
Sep 30 21:36:50 compute-0 nova_compute[192810]: 2025-09-30 21:36:50.884 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:50 compute-0 nova_compute[192810]: 2025-09-30 21:36:50.889 2 DEBUG nova.virt.libvirt.driver [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Sep 30 21:36:50 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:50.894 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6c:d3:b3 10.100.0.13'], port_security=['fa:16:3e:6c:d3:b3 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'fc7fafcc-8033-4e17-b63e-a79fced7ec46', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-20111f98-bf7f-4696-b726-3e06c68cfed2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '17bd9c2628a94a0b83c4cae3f51b3f7c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6a3ecea0-9346-40cc-9dbe-25cd68fe08ed', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f4af2c14-351e-4037-bacd-dca3cfee62e9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=e0da584f-ea7a-4ac9-a63b-aa5731a67762) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:36:50 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:50.896 103867 INFO neutron.agent.ovn.metadata.agent [-] Port e0da584f-ea7a-4ac9-a63b-aa5731a67762 in datapath 20111f98-bf7f-4696-b726-3e06c68cfed2 unbound from our chassis
Sep 30 21:36:50 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:50.897 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 20111f98-bf7f-4696-b726-3e06c68cfed2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:36:50 compute-0 nova_compute[192810]: 2025-09-30 21:36:50.901 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:50 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:50.898 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[0455213b-6fea-4d88-b1ce-9bb2403f9a47]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:50 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:50.902 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2 namespace which is not needed anymore
Sep 30 21:36:50 compute-0 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d0000006b.scope: Deactivated successfully.
Sep 30 21:36:50 compute-0 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d0000006b.scope: Consumed 13.188s CPU time.
Sep 30 21:36:50 compute-0 systemd-machined[152794]: Machine qemu-51-instance-0000006b terminated.
Sep 30 21:36:51 compute-0 neutron-haproxy-ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2[236634]: [NOTICE]   (236670) : haproxy version is 2.8.14-c23fe91
Sep 30 21:36:51 compute-0 neutron-haproxy-ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2[236634]: [NOTICE]   (236670) : path to executable is /usr/sbin/haproxy
Sep 30 21:36:51 compute-0 neutron-haproxy-ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2[236634]: [WARNING]  (236670) : Exiting Master process...
Sep 30 21:36:51 compute-0 neutron-haproxy-ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2[236634]: [ALERT]    (236670) : Current worker (236673) exited with code 143 (Terminated)
Sep 30 21:36:51 compute-0 neutron-haproxy-ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2[236634]: [WARNING]  (236670) : All workers exited. Exiting... (0)
Sep 30 21:36:51 compute-0 systemd[1]: libpod-e7ba9ac7b207cd9454304e660e393f35206cc9378e8f7c83183e4ab12f662e42.scope: Deactivated successfully.
Sep 30 21:36:51 compute-0 podman[236984]: 2025-09-30 21:36:51.058438839 +0000 UTC m=+0.059307619 container died e7ba9ac7b207cd9454304e660e393f35206cc9378e8f7c83183e4ab12f662e42 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:36:51 compute-0 nova_compute[192810]: 2025-09-30 21:36:51.081 2 INFO nova.virt.libvirt.driver [-] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Instance destroyed successfully.
Sep 30 21:36:51 compute-0 nova_compute[192810]: 2025-09-30 21:36:51.082 2 DEBUG nova.objects.instance [None req-6c4118af-2166-4c47-ab24-3b84136b89c2 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lazy-loading 'resources' on Instance uuid fc7fafcc-8033-4e17-b63e-a79fced7ec46 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:36:51 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e7ba9ac7b207cd9454304e660e393f35206cc9378e8f7c83183e4ab12f662e42-userdata-shm.mount: Deactivated successfully.
Sep 30 21:36:51 compute-0 systemd[1]: var-lib-containers-storage-overlay-c04da0de14cffc9c429da5afcbd8906e60f5543040f156bcda2aeb2203382da3-merged.mount: Deactivated successfully.
Sep 30 21:36:51 compute-0 nova_compute[192810]: 2025-09-30 21:36:51.108 2 DEBUG nova.virt.libvirt.vif [None req-6c4118af-2166-4c47-ab24-3b84136b89c2 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:36:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1640893469',display_name='tempest-ListServerFiltersTestJSON-instance-1640893469',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1640893469',id=107,image_ref='29834554-3ec3-4459-bfde-932aa778e979',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:36:15Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='17bd9c2628a94a0b83c4cae3f51b3f7c',ramdisk_id='',reservation_id='r-gwck4dy6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='29834554-3ec3-4459-bfde-932aa778e979',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-1322408077',owner_user_name='tempest-ListServerFiltersTestJSON-1322408077-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:36:15Z,user_data=None,user_id='3746d13787f042a1bfad4de0c42015eb',uuid=fc7fafcc-8033-4e17-b63e-a79fced7ec46,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e0da584f-ea7a-4ac9-a63b-aa5731a67762", "address": "fa:16:3e:6c:d3:b3", "network": {"id": "20111f98-bf7f-4696-b726-3e06c68cfed2", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-2086275832-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "17bd9c2628a94a0b83c4cae3f51b3f7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0da584f-ea", "ovs_interfaceid": "e0da584f-ea7a-4ac9-a63b-aa5731a67762", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:36:51 compute-0 nova_compute[192810]: 2025-09-30 21:36:51.109 2 DEBUG nova.network.os_vif_util [None req-6c4118af-2166-4c47-ab24-3b84136b89c2 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Converting VIF {"id": "e0da584f-ea7a-4ac9-a63b-aa5731a67762", "address": "fa:16:3e:6c:d3:b3", "network": {"id": "20111f98-bf7f-4696-b726-3e06c68cfed2", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-2086275832-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "17bd9c2628a94a0b83c4cae3f51b3f7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0da584f-ea", "ovs_interfaceid": "e0da584f-ea7a-4ac9-a63b-aa5731a67762", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:36:51 compute-0 nova_compute[192810]: 2025-09-30 21:36:51.110 2 DEBUG nova.network.os_vif_util [None req-6c4118af-2166-4c47-ab24-3b84136b89c2 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6c:d3:b3,bridge_name='br-int',has_traffic_filtering=True,id=e0da584f-ea7a-4ac9-a63b-aa5731a67762,network=Network(20111f98-bf7f-4696-b726-3e06c68cfed2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape0da584f-ea') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:36:51 compute-0 nova_compute[192810]: 2025-09-30 21:36:51.110 2 DEBUG os_vif [None req-6c4118af-2166-4c47-ab24-3b84136b89c2 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6c:d3:b3,bridge_name='br-int',has_traffic_filtering=True,id=e0da584f-ea7a-4ac9-a63b-aa5731a67762,network=Network(20111f98-bf7f-4696-b726-3e06c68cfed2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape0da584f-ea') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:36:51 compute-0 nova_compute[192810]: 2025-09-30 21:36:51.111 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:51 compute-0 nova_compute[192810]: 2025-09-30 21:36:51.111 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape0da584f-ea, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:51 compute-0 nova_compute[192810]: 2025-09-30 21:36:51.167 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:51 compute-0 nova_compute[192810]: 2025-09-30 21:36:51.170 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:36:51 compute-0 podman[236984]: 2025-09-30 21:36:51.17203934 +0000 UTC m=+0.172908110 container cleanup e7ba9ac7b207cd9454304e660e393f35206cc9378e8f7c83183e4ab12f662e42 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:36:51 compute-0 nova_compute[192810]: 2025-09-30 21:36:51.172 2 INFO os_vif [None req-6c4118af-2166-4c47-ab24-3b84136b89c2 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6c:d3:b3,bridge_name='br-int',has_traffic_filtering=True,id=e0da584f-ea7a-4ac9-a63b-aa5731a67762,network=Network(20111f98-bf7f-4696-b726-3e06c68cfed2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape0da584f-ea')
Sep 30 21:36:51 compute-0 nova_compute[192810]: 2025-09-30 21:36:51.173 2 INFO nova.virt.libvirt.driver [None req-6c4118af-2166-4c47-ab24-3b84136b89c2 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Deleting instance files /var/lib/nova/instances/fc7fafcc-8033-4e17-b63e-a79fced7ec46_del
Sep 30 21:36:51 compute-0 nova_compute[192810]: 2025-09-30 21:36:51.173 2 INFO nova.virt.libvirt.driver [None req-6c4118af-2166-4c47-ab24-3b84136b89c2 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Deletion of /var/lib/nova/instances/fc7fafcc-8033-4e17-b63e-a79fced7ec46_del complete
Sep 30 21:36:51 compute-0 systemd[1]: libpod-conmon-e7ba9ac7b207cd9454304e660e393f35206cc9378e8f7c83183e4ab12f662e42.scope: Deactivated successfully.
Sep 30 21:36:51 compute-0 nova_compute[192810]: 2025-09-30 21:36:51.213 2 DEBUG nova.compute.manager [req-08c26236-6e32-4d42-9ade-d775e25dc076 req-9edcf6bf-2e24-42eb-b9e4-4a4a4ab9ebd0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Received event network-vif-unplugged-e0da584f-ea7a-4ac9-a63b-aa5731a67762 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:36:51 compute-0 nova_compute[192810]: 2025-09-30 21:36:51.214 2 DEBUG oslo_concurrency.lockutils [req-08c26236-6e32-4d42-9ade-d775e25dc076 req-9edcf6bf-2e24-42eb-b9e4-4a4a4ab9ebd0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "fc7fafcc-8033-4e17-b63e-a79fced7ec46-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:51 compute-0 nova_compute[192810]: 2025-09-30 21:36:51.214 2 DEBUG oslo_concurrency.lockutils [req-08c26236-6e32-4d42-9ade-d775e25dc076 req-9edcf6bf-2e24-42eb-b9e4-4a4a4ab9ebd0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "fc7fafcc-8033-4e17-b63e-a79fced7ec46-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:51 compute-0 nova_compute[192810]: 2025-09-30 21:36:51.214 2 DEBUG oslo_concurrency.lockutils [req-08c26236-6e32-4d42-9ade-d775e25dc076 req-9edcf6bf-2e24-42eb-b9e4-4a4a4ab9ebd0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "fc7fafcc-8033-4e17-b63e-a79fced7ec46-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:51 compute-0 nova_compute[192810]: 2025-09-30 21:36:51.214 2 DEBUG nova.compute.manager [req-08c26236-6e32-4d42-9ade-d775e25dc076 req-9edcf6bf-2e24-42eb-b9e4-4a4a4ab9ebd0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] No waiting events found dispatching network-vif-unplugged-e0da584f-ea7a-4ac9-a63b-aa5731a67762 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:36:51 compute-0 nova_compute[192810]: 2025-09-30 21:36:51.215 2 DEBUG nova.compute.manager [req-08c26236-6e32-4d42-9ade-d775e25dc076 req-9edcf6bf-2e24-42eb-b9e4-4a4a4ab9ebd0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Received event network-vif-unplugged-e0da584f-ea7a-4ac9-a63b-aa5731a67762 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:36:51 compute-0 podman[237028]: 2025-09-30 21:36:51.233912883 +0000 UTC m=+0.035867925 container remove e7ba9ac7b207cd9454304e660e393f35206cc9378e8f7c83183e4ab12f662e42 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 21:36:51 compute-0 nova_compute[192810]: 2025-09-30 21:36:51.240 2 INFO nova.compute.manager [None req-6c4118af-2166-4c47-ab24-3b84136b89c2 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Took 0.41 seconds to destroy the instance on the hypervisor.
Sep 30 21:36:51 compute-0 nova_compute[192810]: 2025-09-30 21:36:51.240 2 DEBUG oslo.service.loopingcall [None req-6c4118af-2166-4c47-ab24-3b84136b89c2 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:36:51 compute-0 nova_compute[192810]: 2025-09-30 21:36:51.241 2 DEBUG nova.compute.manager [-] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:36:51 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:51.240 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[8a03b2dd-168c-42ae-a7a5-8fbbc1c891b2]: (4, ('Tue Sep 30 09:36:50 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2 (e7ba9ac7b207cd9454304e660e393f35206cc9378e8f7c83183e4ab12f662e42)\ne7ba9ac7b207cd9454304e660e393f35206cc9378e8f7c83183e4ab12f662e42\nTue Sep 30 09:36:51 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2 (e7ba9ac7b207cd9454304e660e393f35206cc9378e8f7c83183e4ab12f662e42)\ne7ba9ac7b207cd9454304e660e393f35206cc9378e8f7c83183e4ab12f662e42\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:51 compute-0 nova_compute[192810]: 2025-09-30 21:36:51.241 2 DEBUG nova.network.neutron [-] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:36:51 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:51.242 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[9e79f368-a76f-4489-aeaa-24b2d97d1cb8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:51 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:51.243 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap20111f98-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:51 compute-0 nova_compute[192810]: 2025-09-30 21:36:51.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:51 compute-0 kernel: tap20111f98-b0: left promiscuous mode
Sep 30 21:36:51 compute-0 nova_compute[192810]: 2025-09-30 21:36:51.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:51 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:51.260 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[6f71eec1-2b26-4958-9d18-a83bfcaa900f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:51 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:51.278 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[fecd13ff-c12e-4b3c-ad35-7f0765cf33d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:51 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:51.279 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[3597130d-08c6-456a-91f9-71e42619947b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:51 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:51.295 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[4b7999f0-48b4-4c40-8e6e-401c527f00cc]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 486859, 'reachable_time': 38156, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237043, 'error': None, 'target': 'ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:51 compute-0 systemd[1]: run-netns-ovnmeta\x2d20111f98\x2dbf7f\x2d4696\x2db726\x2d3e06c68cfed2.mount: Deactivated successfully.
Sep 30 21:36:51 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:51.299 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-20111f98-bf7f-4696-b726-3e06c68cfed2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:36:51 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:51.299 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[5d23e1cf-a551-4bec-9f16-f315d6328874]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:52 compute-0 nova_compute[192810]: 2025-09-30 21:36:52.256 2 DEBUG nova.network.neutron [-] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:36:52 compute-0 nova_compute[192810]: 2025-09-30 21:36:52.272 2 INFO nova.compute.manager [-] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Took 1.03 seconds to deallocate network for instance.
Sep 30 21:36:52 compute-0 nova_compute[192810]: 2025-09-30 21:36:52.321 2 DEBUG nova.compute.manager [req-cf760bb8-e810-4fc0-a067-008eee89a8e3 req-5971a53e-e272-48ab-9305-38516a090bfe dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Received event network-vif-deleted-e0da584f-ea7a-4ac9-a63b-aa5731a67762 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:36:52 compute-0 nova_compute[192810]: 2025-09-30 21:36:52.351 2 DEBUG oslo_concurrency.lockutils [None req-6c4118af-2166-4c47-ab24-3b84136b89c2 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:52 compute-0 nova_compute[192810]: 2025-09-30 21:36:52.351 2 DEBUG oslo_concurrency.lockutils [None req-6c4118af-2166-4c47-ab24-3b84136b89c2 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:52 compute-0 nova_compute[192810]: 2025-09-30 21:36:52.379 2 DEBUG nova.scheduler.client.report [None req-6c4118af-2166-4c47-ab24-3b84136b89c2 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Refreshing inventories for resource provider fe423b93-de5a-41f7-97d1-9622ea46af54 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Sep 30 21:36:52 compute-0 nova_compute[192810]: 2025-09-30 21:36:52.402 2 DEBUG nova.scheduler.client.report [None req-6c4118af-2166-4c47-ab24-3b84136b89c2 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Updating ProviderTree inventory for provider fe423b93-de5a-41f7-97d1-9622ea46af54 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Sep 30 21:36:52 compute-0 nova_compute[192810]: 2025-09-30 21:36:52.403 2 DEBUG nova.compute.provider_tree [None req-6c4118af-2166-4c47-ab24-3b84136b89c2 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Updating inventory in ProviderTree for provider fe423b93-de5a-41f7-97d1-9622ea46af54 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Sep 30 21:36:52 compute-0 nova_compute[192810]: 2025-09-30 21:36:52.421 2 DEBUG nova.scheduler.client.report [None req-6c4118af-2166-4c47-ab24-3b84136b89c2 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Refreshing aggregate associations for resource provider fe423b93-de5a-41f7-97d1-9622ea46af54, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Sep 30 21:36:52 compute-0 nova_compute[192810]: 2025-09-30 21:36:52.462 2 DEBUG oslo_concurrency.lockutils [None req-eee7ac45-6505-4b9f-85e7-972d5baa1023 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquiring lock "4255e358-6db2-4947-a3d6-4e045f9235fb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:52 compute-0 nova_compute[192810]: 2025-09-30 21:36:52.462 2 DEBUG oslo_concurrency.lockutils [None req-eee7ac45-6505-4b9f-85e7-972d5baa1023 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "4255e358-6db2-4947-a3d6-4e045f9235fb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:52 compute-0 nova_compute[192810]: 2025-09-30 21:36:52.463 2 DEBUG oslo_concurrency.lockutils [None req-eee7ac45-6505-4b9f-85e7-972d5baa1023 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquiring lock "4255e358-6db2-4947-a3d6-4e045f9235fb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:52 compute-0 nova_compute[192810]: 2025-09-30 21:36:52.463 2 DEBUG oslo_concurrency.lockutils [None req-eee7ac45-6505-4b9f-85e7-972d5baa1023 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "4255e358-6db2-4947-a3d6-4e045f9235fb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:52 compute-0 nova_compute[192810]: 2025-09-30 21:36:52.463 2 DEBUG oslo_concurrency.lockutils [None req-eee7ac45-6505-4b9f-85e7-972d5baa1023 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "4255e358-6db2-4947-a3d6-4e045f9235fb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:52 compute-0 nova_compute[192810]: 2025-09-30 21:36:52.472 2 INFO nova.compute.manager [None req-eee7ac45-6505-4b9f-85e7-972d5baa1023 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Terminating instance
Sep 30 21:36:52 compute-0 nova_compute[192810]: 2025-09-30 21:36:52.482 2 DEBUG nova.compute.manager [None req-eee7ac45-6505-4b9f-85e7-972d5baa1023 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:36:52 compute-0 kernel: tapb0db3e99-94 (unregistering): left promiscuous mode
Sep 30 21:36:52 compute-0 NetworkManager[51733]: <info>  [1759268212.5100] device (tapb0db3e99-94): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:36:52 compute-0 ovn_controller[94912]: 2025-09-30T21:36:52Z|00421|binding|INFO|Releasing lport b0db3e99-94f0-42a8-a410-081e33fd0dee from this chassis (sb_readonly=0)
Sep 30 21:36:52 compute-0 nova_compute[192810]: 2025-09-30 21:36:52.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:52 compute-0 ovn_controller[94912]: 2025-09-30T21:36:52Z|00422|binding|INFO|Setting lport b0db3e99-94f0-42a8-a410-081e33fd0dee down in Southbound
Sep 30 21:36:52 compute-0 ovn_controller[94912]: 2025-09-30T21:36:52Z|00423|binding|INFO|Removing iface tapb0db3e99-94 ovn-installed in OVS
Sep 30 21:36:52 compute-0 nova_compute[192810]: 2025-09-30 21:36:52.518 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:52.527 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:03:b5:6d 10.100.0.6'], port_security=['fa:16:3e:03:b5:6d 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '4255e358-6db2-4947-a3d6-4e045f9235fb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8978d2df88a5434c8794b659033cca5e', 'neutron:revision_number': '8', 'neutron:security_group_ids': '93b1b45c-82db-437e-88d0-4d5f76771b04', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e6b1a10b-a890-44de-9d6b-4b24b7ba0344, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=b0db3e99-94f0-42a8-a410-081e33fd0dee) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:36:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:52.528 103867 INFO neutron.agent.ovn.metadata.agent [-] Port b0db3e99-94f0-42a8-a410-081e33fd0dee in datapath f5a6396a-b7b7-4ff1-a2af-27477fea2815 unbound from our chassis
Sep 30 21:36:52 compute-0 nova_compute[192810]: 2025-09-30 21:36:52.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:52.530 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f5a6396a-b7b7-4ff1-a2af-27477fea2815, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:36:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:52.531 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[f8b8e921-94e0-46c1-8428-453d4f9c3312]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:52.531 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815 namespace which is not needed anymore
Sep 30 21:36:52 compute-0 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d00000062.scope: Deactivated successfully.
Sep 30 21:36:52 compute-0 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d00000062.scope: Consumed 15.590s CPU time.
Sep 30 21:36:52 compute-0 systemd-machined[152794]: Machine qemu-47-instance-00000062 terminated.
Sep 30 21:36:52 compute-0 nova_compute[192810]: 2025-09-30 21:36:52.601 2 DEBUG nova.scheduler.client.report [None req-6c4118af-2166-4c47-ab24-3b84136b89c2 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Refreshing trait associations for resource provider fe423b93-de5a-41f7-97d1-9622ea46af54, traits: COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_RESCUE_BFV,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_1_2,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE42,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Sep 30 21:36:52 compute-0 nova_compute[192810]: 2025-09-30 21:36:52.628 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:52 compute-0 neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815[235791]: [NOTICE]   (235795) : haproxy version is 2.8.14-c23fe91
Sep 30 21:36:52 compute-0 neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815[235791]: [NOTICE]   (235795) : path to executable is /usr/sbin/haproxy
Sep 30 21:36:52 compute-0 neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815[235791]: [WARNING]  (235795) : Exiting Master process...
Sep 30 21:36:52 compute-0 neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815[235791]: [WARNING]  (235795) : Exiting Master process...
Sep 30 21:36:52 compute-0 neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815[235791]: [ALERT]    (235795) : Current worker (235797) exited with code 143 (Terminated)
Sep 30 21:36:52 compute-0 neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815[235791]: [WARNING]  (235795) : All workers exited. Exiting... (0)
Sep 30 21:36:52 compute-0 systemd[1]: libpod-e6a1859221fb154e59c5d16b5f8786764eebf83d35b259ced80e5ce0e24bd5ad.scope: Deactivated successfully.
Sep 30 21:36:52 compute-0 podman[237065]: 2025-09-30 21:36:52.684270823 +0000 UTC m=+0.065075413 container died e6a1859221fb154e59c5d16b5f8786764eebf83d35b259ced80e5ce0e24bd5ad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 21:36:52 compute-0 nova_compute[192810]: 2025-09-30 21:36:52.706 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:52 compute-0 nova_compute[192810]: 2025-09-30 21:36:52.710 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:52 compute-0 nova_compute[192810]: 2025-09-30 21:36:52.720 2 DEBUG nova.compute.provider_tree [None req-6c4118af-2166-4c47-ab24-3b84136b89c2 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:36:52 compute-0 nova_compute[192810]: 2025-09-30 21:36:52.744 2 DEBUG nova.scheduler.client.report [None req-6c4118af-2166-4c47-ab24-3b84136b89c2 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:36:52 compute-0 nova_compute[192810]: 2025-09-30 21:36:52.751 2 INFO nova.virt.libvirt.driver [-] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Instance destroyed successfully.
Sep 30 21:36:52 compute-0 nova_compute[192810]: 2025-09-30 21:36:52.751 2 DEBUG nova.objects.instance [None req-eee7ac45-6505-4b9f-85e7-972d5baa1023 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lazy-loading 'resources' on Instance uuid 4255e358-6db2-4947-a3d6-4e045f9235fb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:36:52 compute-0 nova_compute[192810]: 2025-09-30 21:36:52.768 2 DEBUG nova.virt.libvirt.vif [None req-eee7ac45-6505-4b9f-85e7-972d5baa1023 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:34:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-475541027',display_name='tempest-ServerStableDeviceRescueTest-server-475541027',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-475541027',id=98,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:35:07Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8978d2df88a5434c8794b659033cca5e',ramdisk_id='',reservation_id='r-1jbnwm8v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-1939201844',owner_user_name='tempest-ServerStableDeviceRescueTest-1939201844-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:35:14Z,user_data=None,user_id='8b1ebef014c145cbbe1e367bfd2c2ba3',uuid=4255e358-6db2-4947-a3d6-4e045f9235fb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b0db3e99-94f0-42a8-a410-081e33fd0dee", "address": "fa:16:3e:03:b5:6d", "network": {"id": "f5a6396a-b7b7-4ff1-a2af-27477fea2815", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8978d2df88a5434c8794b659033cca5e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0db3e99-94", "ovs_interfaceid": "b0db3e99-94f0-42a8-a410-081e33fd0dee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:36:52 compute-0 nova_compute[192810]: 2025-09-30 21:36:52.769 2 DEBUG nova.network.os_vif_util [None req-eee7ac45-6505-4b9f-85e7-972d5baa1023 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Converting VIF {"id": "b0db3e99-94f0-42a8-a410-081e33fd0dee", "address": "fa:16:3e:03:b5:6d", "network": {"id": "f5a6396a-b7b7-4ff1-a2af-27477fea2815", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1495752671-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8978d2df88a5434c8794b659033cca5e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0db3e99-94", "ovs_interfaceid": "b0db3e99-94f0-42a8-a410-081e33fd0dee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:36:52 compute-0 nova_compute[192810]: 2025-09-30 21:36:52.770 2 DEBUG nova.network.os_vif_util [None req-eee7ac45-6505-4b9f-85e7-972d5baa1023 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:03:b5:6d,bridge_name='br-int',has_traffic_filtering=True,id=b0db3e99-94f0-42a8-a410-081e33fd0dee,network=Network(f5a6396a-b7b7-4ff1-a2af-27477fea2815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0db3e99-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:36:52 compute-0 nova_compute[192810]: 2025-09-30 21:36:52.770 2 DEBUG os_vif [None req-eee7ac45-6505-4b9f-85e7-972d5baa1023 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:03:b5:6d,bridge_name='br-int',has_traffic_filtering=True,id=b0db3e99-94f0-42a8-a410-081e33fd0dee,network=Network(f5a6396a-b7b7-4ff1-a2af-27477fea2815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0db3e99-94') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:36:52 compute-0 nova_compute[192810]: 2025-09-30 21:36:52.771 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:52 compute-0 nova_compute[192810]: 2025-09-30 21:36:52.772 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb0db3e99-94, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:52 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e6a1859221fb154e59c5d16b5f8786764eebf83d35b259ced80e5ce0e24bd5ad-userdata-shm.mount: Deactivated successfully.
Sep 30 21:36:52 compute-0 systemd[1]: var-lib-containers-storage-overlay-f30699ad1b94d270b7acb13d7d964de370b5ee1ea379ad52fff596e94239a7cc-merged.mount: Deactivated successfully.
Sep 30 21:36:52 compute-0 nova_compute[192810]: 2025-09-30 21:36:52.775 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:52 compute-0 nova_compute[192810]: 2025-09-30 21:36:52.777 2 DEBUG oslo_concurrency.lockutils [None req-6c4118af-2166-4c47-ab24-3b84136b89c2 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.426s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:52 compute-0 nova_compute[192810]: 2025-09-30 21:36:52.780 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:36:52 compute-0 nova_compute[192810]: 2025-09-30 21:36:52.782 2 INFO os_vif [None req-eee7ac45-6505-4b9f-85e7-972d5baa1023 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:03:b5:6d,bridge_name='br-int',has_traffic_filtering=True,id=b0db3e99-94f0-42a8-a410-081e33fd0dee,network=Network(f5a6396a-b7b7-4ff1-a2af-27477fea2815),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0db3e99-94')
Sep 30 21:36:52 compute-0 nova_compute[192810]: 2025-09-30 21:36:52.784 2 INFO nova.virt.libvirt.driver [None req-eee7ac45-6505-4b9f-85e7-972d5baa1023 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Deleting instance files /var/lib/nova/instances/4255e358-6db2-4947-a3d6-4e045f9235fb_del
Sep 30 21:36:52 compute-0 nova_compute[192810]: 2025-09-30 21:36:52.785 2 INFO nova.virt.libvirt.driver [None req-eee7ac45-6505-4b9f-85e7-972d5baa1023 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Deletion of /var/lib/nova/instances/4255e358-6db2-4947-a3d6-4e045f9235fb_del complete
Sep 30 21:36:52 compute-0 podman[237065]: 2025-09-30 21:36:52.786670195 +0000 UTC m=+0.167474765 container cleanup e6a1859221fb154e59c5d16b5f8786764eebf83d35b259ced80e5ce0e24bd5ad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true)
Sep 30 21:36:52 compute-0 nova_compute[192810]: 2025-09-30 21:36:52.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:36:52 compute-0 systemd[1]: libpod-conmon-e6a1859221fb154e59c5d16b5f8786764eebf83d35b259ced80e5ce0e24bd5ad.scope: Deactivated successfully.
Sep 30 21:36:52 compute-0 nova_compute[192810]: 2025-09-30 21:36:52.809 2 INFO nova.scheduler.client.report [None req-6c4118af-2166-4c47-ab24-3b84136b89c2 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Deleted allocations for instance fc7fafcc-8033-4e17-b63e-a79fced7ec46
Sep 30 21:36:52 compute-0 podman[237113]: 2025-09-30 21:36:52.844606579 +0000 UTC m=+0.035329571 container remove e6a1859221fb154e59c5d16b5f8786764eebf83d35b259ced80e5ce0e24bd5ad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS)
Sep 30 21:36:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:52.851 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[88b78be4-b662-4102-b065-1760f15c58aa]: (4, ('Tue Sep 30 09:36:52 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815 (e6a1859221fb154e59c5d16b5f8786764eebf83d35b259ced80e5ce0e24bd5ad)\ne6a1859221fb154e59c5d16b5f8786764eebf83d35b259ced80e5ce0e24bd5ad\nTue Sep 30 09:36:52 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815 (e6a1859221fb154e59c5d16b5f8786764eebf83d35b259ced80e5ce0e24bd5ad)\ne6a1859221fb154e59c5d16b5f8786764eebf83d35b259ced80e5ce0e24bd5ad\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:52.856 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[848cfe22-0737-4729-aeab-ba61e68df40d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:52.857 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf5a6396a-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:36:52 compute-0 nova_compute[192810]: 2025-09-30 21:36:52.859 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:52 compute-0 kernel: tapf5a6396a-b0: left promiscuous mode
Sep 30 21:36:52 compute-0 nova_compute[192810]: 2025-09-30 21:36:52.873 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:52.874 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a483e24b-2ef2-4414-9cad-bccd1586e0b6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:52 compute-0 nova_compute[192810]: 2025-09-30 21:36:52.885 2 INFO nova.compute.manager [None req-eee7ac45-6505-4b9f-85e7-972d5baa1023 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Took 0.40 seconds to destroy the instance on the hypervisor.
Sep 30 21:36:52 compute-0 nova_compute[192810]: 2025-09-30 21:36:52.886 2 DEBUG oslo.service.loopingcall [None req-eee7ac45-6505-4b9f-85e7-972d5baa1023 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:36:52 compute-0 nova_compute[192810]: 2025-09-30 21:36:52.886 2 DEBUG nova.compute.manager [-] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:36:52 compute-0 nova_compute[192810]: 2025-09-30 21:36:52.886 2 DEBUG nova.network.neutron [-] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:36:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:52.903 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[8b280acb-9e4d-43ed-885d-cf533ff3faab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:52.904 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[176d075b-1d1f-4015-b525-4f25bc6df8f8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:52 compute-0 nova_compute[192810]: 2025-09-30 21:36:52.908 2 DEBUG oslo_concurrency.lockutils [None req-6c4118af-2166-4c47-ab24-3b84136b89c2 3746d13787f042a1bfad4de0c42015eb 17bd9c2628a94a0b83c4cae3f51b3f7c - - default default] Lock "fc7fafcc-8033-4e17-b63e-a79fced7ec46" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:52.918 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[f67af111-4c17-4a4f-bba5-3ee89e04123f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 480949, 'reachable_time': 32755, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237128, 'error': None, 'target': 'ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:52 compute-0 systemd[1]: run-netns-ovnmeta\x2df5a6396a\x2db7b7\x2d4ff1\x2da2af\x2d27477fea2815.mount: Deactivated successfully.
Sep 30 21:36:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:52.921 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f5a6396a-b7b7-4ff1-a2af-27477fea2815 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:36:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:36:52.921 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[70b32e8d-c711-47f7-8308-eb477ff787dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:36:53 compute-0 nova_compute[192810]: 2025-09-30 21:36:53.312 2 DEBUG nova.compute.manager [req-8bc1554b-4d4f-461f-9e72-f194e03a852b req-e58d145f-64b4-4b1e-8b8a-640815d0fce0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Received event network-vif-plugged-e0da584f-ea7a-4ac9-a63b-aa5731a67762 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:36:53 compute-0 nova_compute[192810]: 2025-09-30 21:36:53.313 2 DEBUG oslo_concurrency.lockutils [req-8bc1554b-4d4f-461f-9e72-f194e03a852b req-e58d145f-64b4-4b1e-8b8a-640815d0fce0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "fc7fafcc-8033-4e17-b63e-a79fced7ec46-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:53 compute-0 nova_compute[192810]: 2025-09-30 21:36:53.313 2 DEBUG oslo_concurrency.lockutils [req-8bc1554b-4d4f-461f-9e72-f194e03a852b req-e58d145f-64b4-4b1e-8b8a-640815d0fce0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "fc7fafcc-8033-4e17-b63e-a79fced7ec46-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:53 compute-0 nova_compute[192810]: 2025-09-30 21:36:53.313 2 DEBUG oslo_concurrency.lockutils [req-8bc1554b-4d4f-461f-9e72-f194e03a852b req-e58d145f-64b4-4b1e-8b8a-640815d0fce0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "fc7fafcc-8033-4e17-b63e-a79fced7ec46-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:53 compute-0 nova_compute[192810]: 2025-09-30 21:36:53.313 2 DEBUG nova.compute.manager [req-8bc1554b-4d4f-461f-9e72-f194e03a852b req-e58d145f-64b4-4b1e-8b8a-640815d0fce0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] No waiting events found dispatching network-vif-plugged-e0da584f-ea7a-4ac9-a63b-aa5731a67762 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:36:53 compute-0 nova_compute[192810]: 2025-09-30 21:36:53.313 2 WARNING nova.compute.manager [req-8bc1554b-4d4f-461f-9e72-f194e03a852b req-e58d145f-64b4-4b1e-8b8a-640815d0fce0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Received unexpected event network-vif-plugged-e0da584f-ea7a-4ac9-a63b-aa5731a67762 for instance with vm_state deleted and task_state None.
Sep 30 21:36:53 compute-0 nova_compute[192810]: 2025-09-30 21:36:53.630 2 DEBUG nova.network.neutron [-] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:36:53 compute-0 nova_compute[192810]: 2025-09-30 21:36:53.644 2 INFO nova.compute.manager [-] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Took 0.76 seconds to deallocate network for instance.
Sep 30 21:36:53 compute-0 nova_compute[192810]: 2025-09-30 21:36:53.722 2 DEBUG oslo_concurrency.lockutils [None req-eee7ac45-6505-4b9f-85e7-972d5baa1023 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:53 compute-0 nova_compute[192810]: 2025-09-30 21:36:53.722 2 DEBUG oslo_concurrency.lockutils [None req-eee7ac45-6505-4b9f-85e7-972d5baa1023 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:53 compute-0 nova_compute[192810]: 2025-09-30 21:36:53.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:36:53 compute-0 nova_compute[192810]: 2025-09-30 21:36:53.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:36:53 compute-0 nova_compute[192810]: 2025-09-30 21:36:53.787 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:36:53 compute-0 nova_compute[192810]: 2025-09-30 21:36:53.794 2 DEBUG nova.compute.provider_tree [None req-eee7ac45-6505-4b9f-85e7-972d5baa1023 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:36:53 compute-0 nova_compute[192810]: 2025-09-30 21:36:53.816 2 DEBUG nova.scheduler.client.report [None req-eee7ac45-6505-4b9f-85e7-972d5baa1023 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:36:53 compute-0 nova_compute[192810]: 2025-09-30 21:36:53.840 2 DEBUG oslo_concurrency.lockutils [None req-eee7ac45-6505-4b9f-85e7-972d5baa1023 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:53 compute-0 nova_compute[192810]: 2025-09-30 21:36:53.890 2 INFO nova.scheduler.client.report [None req-eee7ac45-6505-4b9f-85e7-972d5baa1023 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Deleted allocations for instance 4255e358-6db2-4947-a3d6-4e045f9235fb
Sep 30 21:36:53 compute-0 nova_compute[192810]: 2025-09-30 21:36:53.978 2 DEBUG oslo_concurrency.lockutils [None req-eee7ac45-6505-4b9f-85e7-972d5baa1023 8b1ebef014c145cbbe1e367bfd2c2ba3 8978d2df88a5434c8794b659033cca5e - - default default] Lock "4255e358-6db2-4947-a3d6-4e045f9235fb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.516s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:54 compute-0 nova_compute[192810]: 2025-09-30 21:36:54.539 2 DEBUG nova.compute.manager [req-a33f81cd-dff0-4c57-8402-c84568ebe2f3 req-ababc11c-d531-4ce9-9fb3-ab9a067a13b6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Received event network-vif-unplugged-b0db3e99-94f0-42a8-a410-081e33fd0dee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:36:54 compute-0 nova_compute[192810]: 2025-09-30 21:36:54.539 2 DEBUG oslo_concurrency.lockutils [req-a33f81cd-dff0-4c57-8402-c84568ebe2f3 req-ababc11c-d531-4ce9-9fb3-ab9a067a13b6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "4255e358-6db2-4947-a3d6-4e045f9235fb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:54 compute-0 nova_compute[192810]: 2025-09-30 21:36:54.539 2 DEBUG oslo_concurrency.lockutils [req-a33f81cd-dff0-4c57-8402-c84568ebe2f3 req-ababc11c-d531-4ce9-9fb3-ab9a067a13b6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "4255e358-6db2-4947-a3d6-4e045f9235fb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:54 compute-0 nova_compute[192810]: 2025-09-30 21:36:54.540 2 DEBUG oslo_concurrency.lockutils [req-a33f81cd-dff0-4c57-8402-c84568ebe2f3 req-ababc11c-d531-4ce9-9fb3-ab9a067a13b6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "4255e358-6db2-4947-a3d6-4e045f9235fb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:54 compute-0 nova_compute[192810]: 2025-09-30 21:36:54.540 2 DEBUG nova.compute.manager [req-a33f81cd-dff0-4c57-8402-c84568ebe2f3 req-ababc11c-d531-4ce9-9fb3-ab9a067a13b6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] No waiting events found dispatching network-vif-unplugged-b0db3e99-94f0-42a8-a410-081e33fd0dee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:36:54 compute-0 nova_compute[192810]: 2025-09-30 21:36:54.540 2 WARNING nova.compute.manager [req-a33f81cd-dff0-4c57-8402-c84568ebe2f3 req-ababc11c-d531-4ce9-9fb3-ab9a067a13b6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Received unexpected event network-vif-unplugged-b0db3e99-94f0-42a8-a410-081e33fd0dee for instance with vm_state deleted and task_state None.
Sep 30 21:36:54 compute-0 nova_compute[192810]: 2025-09-30 21:36:54.540 2 DEBUG nova.compute.manager [req-a33f81cd-dff0-4c57-8402-c84568ebe2f3 req-ababc11c-d531-4ce9-9fb3-ab9a067a13b6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Received event network-vif-plugged-b0db3e99-94f0-42a8-a410-081e33fd0dee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:36:54 compute-0 nova_compute[192810]: 2025-09-30 21:36:54.541 2 DEBUG oslo_concurrency.lockutils [req-a33f81cd-dff0-4c57-8402-c84568ebe2f3 req-ababc11c-d531-4ce9-9fb3-ab9a067a13b6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "4255e358-6db2-4947-a3d6-4e045f9235fb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:54 compute-0 nova_compute[192810]: 2025-09-30 21:36:54.541 2 DEBUG oslo_concurrency.lockutils [req-a33f81cd-dff0-4c57-8402-c84568ebe2f3 req-ababc11c-d531-4ce9-9fb3-ab9a067a13b6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "4255e358-6db2-4947-a3d6-4e045f9235fb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:54 compute-0 nova_compute[192810]: 2025-09-30 21:36:54.541 2 DEBUG oslo_concurrency.lockutils [req-a33f81cd-dff0-4c57-8402-c84568ebe2f3 req-ababc11c-d531-4ce9-9fb3-ab9a067a13b6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "4255e358-6db2-4947-a3d6-4e045f9235fb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:54 compute-0 nova_compute[192810]: 2025-09-30 21:36:54.541 2 DEBUG nova.compute.manager [req-a33f81cd-dff0-4c57-8402-c84568ebe2f3 req-ababc11c-d531-4ce9-9fb3-ab9a067a13b6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] No waiting events found dispatching network-vif-plugged-b0db3e99-94f0-42a8-a410-081e33fd0dee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:36:54 compute-0 nova_compute[192810]: 2025-09-30 21:36:54.542 2 WARNING nova.compute.manager [req-a33f81cd-dff0-4c57-8402-c84568ebe2f3 req-ababc11c-d531-4ce9-9fb3-ab9a067a13b6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Received unexpected event network-vif-plugged-b0db3e99-94f0-42a8-a410-081e33fd0dee for instance with vm_state deleted and task_state None.
Sep 30 21:36:54 compute-0 nova_compute[192810]: 2025-09-30 21:36:54.542 2 DEBUG nova.compute.manager [req-a33f81cd-dff0-4c57-8402-c84568ebe2f3 req-ababc11c-d531-4ce9-9fb3-ab9a067a13b6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Received event network-vif-deleted-b0db3e99-94f0-42a8-a410-081e33fd0dee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:36:55 compute-0 nova_compute[192810]: 2025-09-30 21:36:55.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:36:55 compute-0 nova_compute[192810]: 2025-09-30 21:36:55.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:36:55 compute-0 nova_compute[192810]: 2025-09-30 21:36:55.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:36:55 compute-0 nova_compute[192810]: 2025-09-30 21:36:55.804 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "refresh_cache-01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:36:55 compute-0 nova_compute[192810]: 2025-09-30 21:36:55.804 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquired lock "refresh_cache-01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:36:55 compute-0 nova_compute[192810]: 2025-09-30 21:36:55.804 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Sep 30 21:36:55 compute-0 nova_compute[192810]: 2025-09-30 21:36:55.804 2 DEBUG nova.objects.instance [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lazy-loading 'info_cache' on Instance uuid 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:36:57 compute-0 podman[237141]: 2025-09-30 21:36:57.346276482 +0000 UTC m=+0.075511103 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:36:57 compute-0 podman[237140]: 2025-09-30 21:36:57.380359972 +0000 UTC m=+0.103110271 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:36:57 compute-0 podman[237139]: 2025-09-30 21:36:57.380621948 +0000 UTC m=+0.104220148 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:36:57 compute-0 nova_compute[192810]: 2025-09-30 21:36:57.631 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:57 compute-0 nova_compute[192810]: 2025-09-30 21:36:57.774 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:36:58 compute-0 nova_compute[192810]: 2025-09-30 21:36:58.851 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Updating instance_info_cache with network_info: [{"id": "7e8f7bb1-f155-40d6-8e85-f4222da3c027", "address": "fa:16:3e:86:ba:8b", "network": {"id": "6f659d19-7261-4309-81bc-b7bb9a9da219", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1930279008-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "793bce1316e34dada7414560b74789f4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e8f7bb1-f1", "ovs_interfaceid": "7e8f7bb1-f155-40d6-8e85-f4222da3c027", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:36:58 compute-0 nova_compute[192810]: 2025-09-30 21:36:58.875 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Releasing lock "refresh_cache-01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:36:58 compute-0 nova_compute[192810]: 2025-09-30 21:36:58.875 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Sep 30 21:36:58 compute-0 nova_compute[192810]: 2025-09-30 21:36:58.875 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:36:58 compute-0 nova_compute[192810]: 2025-09-30 21:36:58.876 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:36:58 compute-0 nova_compute[192810]: 2025-09-30 21:36:58.899 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:58 compute-0 nova_compute[192810]: 2025-09-30 21:36:58.900 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:58 compute-0 nova_compute[192810]: 2025-09-30 21:36:58.900 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:36:58 compute-0 nova_compute[192810]: 2025-09-30 21:36:58.900 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:36:58 compute-0 nova_compute[192810]: 2025-09-30 21:36:58.974 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:36:59 compute-0 nova_compute[192810]: 2025-09-30 21:36:59.059 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:36:59 compute-0 nova_compute[192810]: 2025-09-30 21:36:59.060 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:36:59 compute-0 nova_compute[192810]: 2025-09-30 21:36:59.116 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:36:59 compute-0 nova_compute[192810]: 2025-09-30 21:36:59.241 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:36:59 compute-0 nova_compute[192810]: 2025-09-30 21:36:59.243 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5550MB free_disk=73.22149658203125GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:36:59 compute-0 nova_compute[192810]: 2025-09-30 21:36:59.243 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:36:59 compute-0 nova_compute[192810]: 2025-09-30 21:36:59.243 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:36:59 compute-0 nova_compute[192810]: 2025-09-30 21:36:59.317 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Instance 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:36:59 compute-0 nova_compute[192810]: 2025-09-30 21:36:59.317 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:36:59 compute-0 nova_compute[192810]: 2025-09-30 21:36:59.318 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:36:59 compute-0 nova_compute[192810]: 2025-09-30 21:36:59.378 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:36:59 compute-0 nova_compute[192810]: 2025-09-30 21:36:59.479 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:36:59 compute-0 nova_compute[192810]: 2025-09-30 21:36:59.498 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:36:59 compute-0 nova_compute[192810]: 2025-09-30 21:36:59.499 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.255s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:00 compute-0 nova_compute[192810]: 2025-09-30 21:37:00.410 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:37:00 compute-0 nova_compute[192810]: 2025-09-30 21:37:00.778 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:00 compute-0 nova_compute[192810]: 2025-09-30 21:37:00.933 2 DEBUG nova.virt.libvirt.driver [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Sep 30 21:37:02 compute-0 sshd-session[237212]: Invalid user vpn from 45.81.23.80 port 47260
Sep 30 21:37:02 compute-0 sshd-session[237212]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:37:02 compute-0 sshd-session[237212]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=45.81.23.80
Sep 30 21:37:02 compute-0 nova_compute[192810]: 2025-09-30 21:37:02.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:02 compute-0 nova_compute[192810]: 2025-09-30 21:37:02.776 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:02 compute-0 nova_compute[192810]: 2025-09-30 21:37:02.988 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268207.9871042, 128d5729-9e60-43d9-b1a4-fa8fb6e0e619 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:37:02 compute-0 nova_compute[192810]: 2025-09-30 21:37:02.989 2 INFO nova.compute.manager [-] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] VM Stopped (Lifecycle Event)
Sep 30 21:37:03 compute-0 nova_compute[192810]: 2025-09-30 21:37:03.015 2 DEBUG nova.compute.manager [None req-8239e2b7-1fe9-46a2-997e-92d44f4a1670 - - - - - -] [instance: 128d5729-9e60-43d9-b1a4-fa8fb6e0e619] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:37:03 compute-0 kernel: tap7e8f7bb1-f1 (unregistering): left promiscuous mode
Sep 30 21:37:03 compute-0 NetworkManager[51733]: <info>  [1759268223.0692] device (tap7e8f7bb1-f1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:37:03 compute-0 ovn_controller[94912]: 2025-09-30T21:37:03Z|00424|binding|INFO|Releasing lport 7e8f7bb1-f155-40d6-8e85-f4222da3c027 from this chassis (sb_readonly=0)
Sep 30 21:37:03 compute-0 ovn_controller[94912]: 2025-09-30T21:37:03Z|00425|binding|INFO|Setting lport 7e8f7bb1-f155-40d6-8e85-f4222da3c027 down in Southbound
Sep 30 21:37:03 compute-0 nova_compute[192810]: 2025-09-30 21:37:03.075 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:03 compute-0 ovn_controller[94912]: 2025-09-30T21:37:03Z|00426|binding|INFO|Removing iface tap7e8f7bb1-f1 ovn-installed in OVS
Sep 30 21:37:03 compute-0 nova_compute[192810]: 2025-09-30 21:37:03.089 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:03 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:03.104 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:86:ba:8b 10.100.0.7'], port_security=['fa:16:3e:86:ba:8b 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6f659d19-7261-4309-81bc-b7bb9a9da219', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '793bce1316e34dada7414560b74789f4', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5fd3b1da-f9d4-4128-be3a-89511bb35b84', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=63360a12-7489-4e6b-8985-1b0f556f3468, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=7e8f7bb1-f155-40d6-8e85-f4222da3c027) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:37:03 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:03.105 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 7e8f7bb1-f155-40d6-8e85-f4222da3c027 in datapath 6f659d19-7261-4309-81bc-b7bb9a9da219 unbound from our chassis
Sep 30 21:37:03 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:03.106 103867 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6f659d19-7261-4309-81bc-b7bb9a9da219 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Sep 30 21:37:03 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:03.107 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[c3d4fdaf-0ee7-41a9-b06e-575cc73b5f27]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:03 compute-0 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d0000006e.scope: Deactivated successfully.
Sep 30 21:37:03 compute-0 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d0000006e.scope: Consumed 13.083s CPU time.
Sep 30 21:37:03 compute-0 systemd-machined[152794]: Machine qemu-52-instance-0000006e terminated.
Sep 30 21:37:03 compute-0 nova_compute[192810]: 2025-09-30 21:37:03.525 2 DEBUG nova.compute.manager [req-0b25598e-576a-4309-aa11-54ce767a6971 req-d422f7dc-567c-4deb-b4d8-11e305645c2e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Received event network-vif-unplugged-7e8f7bb1-f155-40d6-8e85-f4222da3c027 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:37:03 compute-0 nova_compute[192810]: 2025-09-30 21:37:03.526 2 DEBUG oslo_concurrency.lockutils [req-0b25598e-576a-4309-aa11-54ce767a6971 req-d422f7dc-567c-4deb-b4d8-11e305645c2e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:03 compute-0 nova_compute[192810]: 2025-09-30 21:37:03.526 2 DEBUG oslo_concurrency.lockutils [req-0b25598e-576a-4309-aa11-54ce767a6971 req-d422f7dc-567c-4deb-b4d8-11e305645c2e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:03 compute-0 nova_compute[192810]: 2025-09-30 21:37:03.527 2 DEBUG oslo_concurrency.lockutils [req-0b25598e-576a-4309-aa11-54ce767a6971 req-d422f7dc-567c-4deb-b4d8-11e305645c2e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:03 compute-0 nova_compute[192810]: 2025-09-30 21:37:03.527 2 DEBUG nova.compute.manager [req-0b25598e-576a-4309-aa11-54ce767a6971 req-d422f7dc-567c-4deb-b4d8-11e305645c2e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] No waiting events found dispatching network-vif-unplugged-7e8f7bb1-f155-40d6-8e85-f4222da3c027 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:37:03 compute-0 nova_compute[192810]: 2025-09-30 21:37:03.527 2 WARNING nova.compute.manager [req-0b25598e-576a-4309-aa11-54ce767a6971 req-d422f7dc-567c-4deb-b4d8-11e305645c2e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Received unexpected event network-vif-unplugged-7e8f7bb1-f155-40d6-8e85-f4222da3c027 for instance with vm_state active and task_state rescuing.
Sep 30 21:37:03 compute-0 nova_compute[192810]: 2025-09-30 21:37:03.948 2 INFO nova.virt.libvirt.driver [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Instance shutdown successfully after 13 seconds.
Sep 30 21:37:03 compute-0 nova_compute[192810]: 2025-09-30 21:37:03.956 2 INFO nova.virt.libvirt.driver [-] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Instance destroyed successfully.
Sep 30 21:37:03 compute-0 nova_compute[192810]: 2025-09-30 21:37:03.956 2 DEBUG nova.objects.instance [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Lazy-loading 'numa_topology' on Instance uuid 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:37:03 compute-0 nova_compute[192810]: 2025-09-30 21:37:03.979 2 INFO nova.virt.libvirt.driver [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Attempting rescue
Sep 30 21:37:03 compute-0 nova_compute[192810]: 2025-09-30 21:37:03.979 2 DEBUG nova.virt.libvirt.driver [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314
Sep 30 21:37:03 compute-0 nova_compute[192810]: 2025-09-30 21:37:03.983 2 DEBUG nova.virt.libvirt.driver [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Sep 30 21:37:03 compute-0 nova_compute[192810]: 2025-09-30 21:37:03.983 2 INFO nova.virt.libvirt.driver [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Creating image(s)
Sep 30 21:37:03 compute-0 nova_compute[192810]: 2025-09-30 21:37:03.984 2 DEBUG oslo_concurrency.lockutils [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Acquiring lock "/var/lib/nova/instances/01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:03 compute-0 nova_compute[192810]: 2025-09-30 21:37:03.984 2 DEBUG oslo_concurrency.lockutils [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Lock "/var/lib/nova/instances/01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:03 compute-0 nova_compute[192810]: 2025-09-30 21:37:03.984 2 DEBUG oslo_concurrency.lockutils [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Lock "/var/lib/nova/instances/01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:03 compute-0 nova_compute[192810]: 2025-09-30 21:37:03.985 2 DEBUG nova.objects.instance [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:37:04 compute-0 nova_compute[192810]: 2025-09-30 21:37:04.008 2 DEBUG oslo_concurrency.lockutils [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:04 compute-0 nova_compute[192810]: 2025-09-30 21:37:04.009 2 DEBUG oslo_concurrency.lockutils [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:04 compute-0 nova_compute[192810]: 2025-09-30 21:37:04.019 2 DEBUG oslo_concurrency.processutils [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:37:04 compute-0 sshd-session[237212]: Failed password for invalid user vpn from 45.81.23.80 port 47260 ssh2
Sep 30 21:37:04 compute-0 nova_compute[192810]: 2025-09-30 21:37:04.114 2 DEBUG oslo_concurrency.processutils [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
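Editor's note: worth pausing on the command line above. nova never runs `qemu-img info` bare; it wraps it in `oslo_concurrency.prlimit` so probing a malformed or hostile image cannot exhaust the host. A sketch of how that argv could be assembled (`build_qemu_img_info_cmd` is a hypothetical helper for illustration, not nova's API):

```python
def build_qemu_img_info_cmd(path, mem_limit=1 << 30, cpu_seconds=30):
    """Build the argv seen in the log: prlimit caps the probe's address
    space (--as, bytes; 1 GiB here) and CPU time (--cpu, seconds), and
    --force-share lets qemu-img read an image a running guest holds open."""
    return [
        "/usr/bin/python3", "-m", "oslo_concurrency.prlimit",
        f"--as={mem_limit}", f"--cpu={cpu_seconds}", "--",
        "env", "LC_ALL=C", "LANG=C",
        "qemu-img", "info", path, "--force-share", "--output=json",
    ]
```

`LC_ALL=C LANG=C` pins the locale so any output that does get parsed is byte-stable regardless of host configuration.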
Sep 30 21:37:04 compute-0 nova_compute[192810]: 2025-09-30 21:37:04.116 2 DEBUG oslo_concurrency.processutils [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6/disk.rescue execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:37:04 compute-0 nova_compute[192810]: 2025-09-30 21:37:04.228 2 DEBUG oslo_concurrency.processutils [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6/disk.rescue" returned: 0 in 0.112s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
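Editor's note: the `qemu-img create` above is the heart of the rescue flow: `disk.rescue` is a thin copy-on-write overlay whose backing file is the shared `_base` image, so the rescue boot writes only into the overlay and the cached base stays read-only. A hedged sketch of the same argv (hypothetical helper name):

```python
def build_rescue_overlay_cmd(base_image, overlay_path):
    """qcow2 copy-on-write overlay: reads fall through to backing_file,
    writes land only in overlay_path. backing_fmt is stated explicitly
    (the _base image is raw here) so qemu never has to probe the format."""
    return [
        "env", "LC_ALL=C", "LANG=C",
        "qemu-img", "create", "-f", "qcow2",
        "-o", f"backing_file={base_image},backing_fmt=raw",
        overlay_path,
    ]
```

Because only the overlay is new, creation takes ~0.1 s in the log regardless of how large the base image is.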
Sep 30 21:37:04 compute-0 nova_compute[192810]: 2025-09-30 21:37:04.229 2 DEBUG oslo_concurrency.lockutils [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.220s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:04 compute-0 nova_compute[192810]: 2025-09-30 21:37:04.229 2 DEBUG nova.objects.instance [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Lazy-loading 'migration_context' on Instance uuid 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:37:04 compute-0 nova_compute[192810]: 2025-09-30 21:37:04.245 2 DEBUG nova.virt.libvirt.driver [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:37:04 compute-0 nova_compute[192810]: 2025-09-30 21:37:04.246 2 DEBUG nova.virt.libvirt.driver [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Start _get_guest_xml network_info=[{"id": "7e8f7bb1-f155-40d6-8e85-f4222da3c027", "address": "fa:16:3e:86:ba:8b", "network": {"id": "6f659d19-7261-4309-81bc-b7bb9a9da219", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1930279008-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSONUnderV235-1930279008-network", "vif_mac": "fa:16:3e:86:ba:8b"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "793bce1316e34dada7414560b74789f4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e8f7bb1-f1", "ovs_interfaceid": "7e8f7bb1-f155-40d6-8e85-f4222da3c027", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': '86b6907c-d747-4e98-8897-42105915831d', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:37:04 compute-0 nova_compute[192810]: 2025-09-30 21:37:04.247 2 DEBUG nova.objects.instance [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Lazy-loading 'resources' on Instance uuid 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:37:04 compute-0 nova_compute[192810]: 2025-09-30 21:37:04.263 2 WARNING nova.virt.libvirt.driver [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:37:04 compute-0 nova_compute[192810]: 2025-09-30 21:37:04.268 2 DEBUG nova.virt.libvirt.host [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:37:04 compute-0 nova_compute[192810]: 2025-09-30 21:37:04.269 2 DEBUG nova.virt.libvirt.host [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:37:04 compute-0 nova_compute[192810]: 2025-09-30 21:37:04.272 2 DEBUG nova.virt.libvirt.host [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:37:04 compute-0 nova_compute[192810]: 2025-09-30 21:37:04.273 2 DEBUG nova.virt.libvirt.host [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:37:04 compute-0 nova_compute[192810]: 2025-09-30 21:37:04.275 2 DEBUG nova.virt.libvirt.driver [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:37:04 compute-0 nova_compute[192810]: 2025-09-30 21:37:04.276 2 DEBUG nova.virt.hardware [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:37:04 compute-0 nova_compute[192810]: 2025-09-30 21:37:04.277 2 DEBUG nova.virt.hardware [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:37:04 compute-0 nova_compute[192810]: 2025-09-30 21:37:04.277 2 DEBUG nova.virt.hardware [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:37:04 compute-0 nova_compute[192810]: 2025-09-30 21:37:04.278 2 DEBUG nova.virt.hardware [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:37:04 compute-0 nova_compute[192810]: 2025-09-30 21:37:04.278 2 DEBUG nova.virt.hardware [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:37:04 compute-0 nova_compute[192810]: 2025-09-30 21:37:04.278 2 DEBUG nova.virt.hardware [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:37:04 compute-0 nova_compute[192810]: 2025-09-30 21:37:04.279 2 DEBUG nova.virt.hardware [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:37:04 compute-0 nova_compute[192810]: 2025-09-30 21:37:04.279 2 DEBUG nova.virt.hardware [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:37:04 compute-0 nova_compute[192810]: 2025-09-30 21:37:04.280 2 DEBUG nova.virt.hardware [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:37:04 compute-0 nova_compute[192810]: 2025-09-30 21:37:04.280 2 DEBUG nova.virt.hardware [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:37:04 compute-0 nova_compute[192810]: 2025-09-30 21:37:04.281 2 DEBUG nova.virt.hardware [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
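Editor's note: the topology lines above show the selection collapsing to 1 socket x 1 core x 1 thread for a 1-vCPU flavor with no flavor or image constraints. A simplified model of that enumeration (this reproduces only the behaviour visible in the log; nova's real `_get_possible_cpu_topologies` also weighs flavor/image preferences when sorting candidates):

```python
def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
    """Enumerate (sockets, cores, threads) triples whose product is vcpus.
    Each factor must divide vcpus, so it can never exceed vcpus; bounding
    the loops keeps this cheap even with the huge default maxima."""
    topologies = []
    for s in range(1, min(max_sockets, vcpus) + 1):
        for c in range(1, min(max_cores, vcpus) + 1):
            for t in range(1, min(max_threads, vcpus) + 1):
                if s * c * t == vcpus:
                    topologies.append((s, c, t))
    return topologies
```

For `vcpus=1` the only factorization is `(1, 1, 1)`, matching the "Got 1 possible topologies" line.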
Sep 30 21:37:04 compute-0 nova_compute[192810]: 2025-09-30 21:37:04.281 2 DEBUG nova.objects.instance [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:37:04 compute-0 nova_compute[192810]: 2025-09-30 21:37:04.302 2 DEBUG nova.virt.libvirt.vif [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:36:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSONUnderV235-server-76631792',display_name='tempest-ServerRescueTestJSONUnderV235-server-76631792',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverrescuetestjsonunderv235-server-76631792',id=110,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:36:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='793bce1316e34dada7414560b74789f4',ramdisk_id='',reservation_id='r-pjsi62w2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSONUnderV235-672740030',owner_user_name='tempest-ServerRescueTestJSONUnderV235-672740030-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:36:46Z,user_data=None,user_id='90078b78b6fc4222ab0254ae1eda2500',uuid=01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7e8f7bb1-f155-40d6-8e85-f4222da3c027", "address": "fa:16:3e:86:ba:8b", "network": {"id": "6f659d19-7261-4309-81bc-b7bb9a9da219", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1930279008-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSONUnderV235-1930279008-network", "vif_mac": "fa:16:3e:86:ba:8b"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "793bce1316e34dada7414560b74789f4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e8f7bb1-f1", "ovs_interfaceid": "7e8f7bb1-f155-40d6-8e85-f4222da3c027", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:37:04 compute-0 nova_compute[192810]: 2025-09-30 21:37:04.303 2 DEBUG nova.network.os_vif_util [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Converting VIF {"id": "7e8f7bb1-f155-40d6-8e85-f4222da3c027", "address": "fa:16:3e:86:ba:8b", "network": {"id": "6f659d19-7261-4309-81bc-b7bb9a9da219", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1930279008-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSONUnderV235-1930279008-network", "vif_mac": "fa:16:3e:86:ba:8b"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "793bce1316e34dada7414560b74789f4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e8f7bb1-f1", "ovs_interfaceid": "7e8f7bb1-f155-40d6-8e85-f4222da3c027", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:37:04 compute-0 nova_compute[192810]: 2025-09-30 21:37:04.305 2 DEBUG nova.network.os_vif_util [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:86:ba:8b,bridge_name='br-int',has_traffic_filtering=True,id=7e8f7bb1-f155-40d6-8e85-f4222da3c027,network=Network(6f659d19-7261-4309-81bc-b7bb9a9da219),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7e8f7bb1-f1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:37:04 compute-0 nova_compute[192810]: 2025-09-30 21:37:04.307 2 DEBUG nova.objects.instance [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:37:04 compute-0 nova_compute[192810]: 2025-09-30 21:37:04.321 2 DEBUG nova.virt.libvirt.driver [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:37:04 compute-0 nova_compute[192810]:   <uuid>01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6</uuid>
Sep 30 21:37:04 compute-0 nova_compute[192810]:   <name>instance-0000006e</name>
Sep 30 21:37:04 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:37:04 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:37:04 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:37:04 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:       <nova:name>tempest-ServerRescueTestJSONUnderV235-server-76631792</nova:name>
Sep 30 21:37:04 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:37:04</nova:creationTime>
Sep 30 21:37:04 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:37:04 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:37:04 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:37:04 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:37:04 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:37:04 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:37:04 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:37:04 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:37:04 compute-0 nova_compute[192810]:         <nova:user uuid="90078b78b6fc4222ab0254ae1eda2500">tempest-ServerRescueTestJSONUnderV235-672740030-project-member</nova:user>
Sep 30 21:37:04 compute-0 nova_compute[192810]:         <nova:project uuid="793bce1316e34dada7414560b74789f4">tempest-ServerRescueTestJSONUnderV235-672740030</nova:project>
Sep 30 21:37:04 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:37:04 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:37:04 compute-0 nova_compute[192810]:         <nova:port uuid="7e8f7bb1-f155-40d6-8e85-f4222da3c027">
Sep 30 21:37:04 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:37:04 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:37:04 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:37:04 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:37:04 compute-0 nova_compute[192810]:     <system>
Sep 30 21:37:04 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:37:04 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:37:04 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:37:04 compute-0 nova_compute[192810]:       <entry name="serial">01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6</entry>
Sep 30 21:37:04 compute-0 nova_compute[192810]:       <entry name="uuid">01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6</entry>
Sep 30 21:37:04 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     </system>
Sep 30 21:37:04 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:37:04 compute-0 nova_compute[192810]:   <os>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:   </os>
Sep 30 21:37:04 compute-0 nova_compute[192810]:   <features>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:   </features>
Sep 30 21:37:04 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:37:04 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:37:04 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:37:04 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:37:04 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:37:04 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6/disk.rescue"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:37:04 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6/disk"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:       <target dev="vdb" bus="virtio"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:37:04 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6/disk.config.rescue"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:37:04 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:86:ba:8b"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:       <target dev="tap7e8f7bb1-f1"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:37:04 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6/console.log" append="off"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     <video>
Sep 30 21:37:04 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     </video>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:37:04 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:37:04 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:37:04 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:37:04 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:37:04 compute-0 nova_compute[192810]: </domain>
Sep 30 21:37:04 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:37:04 compute-0 nova_compute[192810]: 2025-09-30 21:37:04.333 2 INFO nova.virt.libvirt.driver [-] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Instance destroyed successfully.
Sep 30 21:37:04 compute-0 nova_compute[192810]: 2025-09-30 21:37:04.434 2 DEBUG nova.virt.libvirt.driver [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:37:04 compute-0 nova_compute[192810]: 2025-09-30 21:37:04.435 2 DEBUG nova.virt.libvirt.driver [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:37:04 compute-0 nova_compute[192810]: 2025-09-30 21:37:04.435 2 DEBUG nova.virt.libvirt.driver [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:37:04 compute-0 nova_compute[192810]: 2025-09-30 21:37:04.435 2 DEBUG nova.virt.libvirt.driver [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] No VIF found with MAC fa:16:3e:86:ba:8b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:37:04 compute-0 nova_compute[192810]: 2025-09-30 21:37:04.436 2 INFO nova.virt.libvirt.driver [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Using config drive
Sep 30 21:37:04 compute-0 nova_compute[192810]: 2025-09-30 21:37:04.461 2 DEBUG nova.objects.instance [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:37:04 compute-0 nova_compute[192810]: 2025-09-30 21:37:04.504 2 DEBUG nova.objects.instance [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Lazy-loading 'keypairs' on Instance uuid 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:37:05 compute-0 nova_compute[192810]: 2025-09-30 21:37:05.090 2 INFO nova.virt.libvirt.driver [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Creating config drive at /var/lib/nova/instances/01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6/disk.config.rescue
Sep 30 21:37:05 compute-0 nova_compute[192810]: 2025-09-30 21:37:05.103 2 DEBUG oslo_concurrency.processutils [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1sm__bbl execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:37:05 compute-0 nova_compute[192810]: 2025-09-30 21:37:05.243 2 DEBUG oslo_concurrency.processutils [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1sm__bbl" returned: 0 in 0.140s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:37:05 compute-0 kernel: tap7e8f7bb1-f1: entered promiscuous mode
Sep 30 21:37:05 compute-0 systemd-udevd[237218]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:37:05 compute-0 NetworkManager[51733]: <info>  [1759268225.3245] manager: (tap7e8f7bb1-f1): new Tun device (/org/freedesktop/NetworkManager/Devices/188)
Sep 30 21:37:05 compute-0 ovn_controller[94912]: 2025-09-30T21:37:05Z|00427|binding|INFO|Claiming lport 7e8f7bb1-f155-40d6-8e85-f4222da3c027 for this chassis.
Sep 30 21:37:05 compute-0 nova_compute[192810]: 2025-09-30 21:37:05.326 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:05 compute-0 ovn_controller[94912]: 2025-09-30T21:37:05Z|00428|binding|INFO|7e8f7bb1-f155-40d6-8e85-f4222da3c027: Claiming fa:16:3e:86:ba:8b 10.100.0.7
Sep 30 21:37:05 compute-0 NetworkManager[51733]: <info>  [1759268225.3390] device (tap7e8f7bb1-f1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:37:05 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:05.338 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:86:ba:8b 10.100.0.7'], port_security=['fa:16:3e:86:ba:8b 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6f659d19-7261-4309-81bc-b7bb9a9da219', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '793bce1316e34dada7414560b74789f4', 'neutron:revision_number': '5', 'neutron:security_group_ids': '5fd3b1da-f9d4-4128-be3a-89511bb35b84', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=63360a12-7489-4e6b-8985-1b0f556f3468, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=7e8f7bb1-f155-40d6-8e85-f4222da3c027) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:37:05 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:05.339 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 7e8f7bb1-f155-40d6-8e85-f4222da3c027 in datapath 6f659d19-7261-4309-81bc-b7bb9a9da219 bound to our chassis
Sep 30 21:37:05 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:05.340 103867 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6f659d19-7261-4309-81bc-b7bb9a9da219 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Sep 30 21:37:05 compute-0 NetworkManager[51733]: <info>  [1759268225.3412] device (tap7e8f7bb1-f1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:37:05 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:05.341 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[3e0c1ffd-8dc2-429a-bde2-379d10ac545e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:05 compute-0 ovn_controller[94912]: 2025-09-30T21:37:05Z|00429|binding|INFO|Setting lport 7e8f7bb1-f155-40d6-8e85-f4222da3c027 ovn-installed in OVS
Sep 30 21:37:05 compute-0 ovn_controller[94912]: 2025-09-30T21:37:05Z|00430|binding|INFO|Setting lport 7e8f7bb1-f155-40d6-8e85-f4222da3c027 up in Southbound
Sep 30 21:37:05 compute-0 nova_compute[192810]: 2025-09-30 21:37:05.353 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:05 compute-0 nova_compute[192810]: 2025-09-30 21:37:05.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:05 compute-0 systemd-machined[152794]: New machine qemu-53-instance-0000006e.
Sep 30 21:37:05 compute-0 systemd[1]: Started Virtual Machine qemu-53-instance-0000006e.
Sep 30 21:37:05 compute-0 sshd-session[237212]: Received disconnect from 45.81.23.80 port 47260:11: Bye Bye [preauth]
Sep 30 21:37:05 compute-0 sshd-session[237212]: Disconnected from invalid user vpn 45.81.23.80 port 47260 [preauth]
Sep 30 21:37:05 compute-0 nova_compute[192810]: 2025-09-30 21:37:05.712 2 DEBUG nova.compute.manager [req-e3f1ee46-258c-4342-a58c-4c1c83bb2a9e req-e8994357-3051-469d-ba17-65bcc619d8d4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Received event network-vif-plugged-7e8f7bb1-f155-40d6-8e85-f4222da3c027 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:37:05 compute-0 nova_compute[192810]: 2025-09-30 21:37:05.713 2 DEBUG oslo_concurrency.lockutils [req-e3f1ee46-258c-4342-a58c-4c1c83bb2a9e req-e8994357-3051-469d-ba17-65bcc619d8d4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:05 compute-0 nova_compute[192810]: 2025-09-30 21:37:05.713 2 DEBUG oslo_concurrency.lockutils [req-e3f1ee46-258c-4342-a58c-4c1c83bb2a9e req-e8994357-3051-469d-ba17-65bcc619d8d4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:05 compute-0 nova_compute[192810]: 2025-09-30 21:37:05.714 2 DEBUG oslo_concurrency.lockutils [req-e3f1ee46-258c-4342-a58c-4c1c83bb2a9e req-e8994357-3051-469d-ba17-65bcc619d8d4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:05 compute-0 nova_compute[192810]: 2025-09-30 21:37:05.714 2 DEBUG nova.compute.manager [req-e3f1ee46-258c-4342-a58c-4c1c83bb2a9e req-e8994357-3051-469d-ba17-65bcc619d8d4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] No waiting events found dispatching network-vif-plugged-7e8f7bb1-f155-40d6-8e85-f4222da3c027 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:37:05 compute-0 nova_compute[192810]: 2025-09-30 21:37:05.714 2 WARNING nova.compute.manager [req-e3f1ee46-258c-4342-a58c-4c1c83bb2a9e req-e8994357-3051-469d-ba17-65bcc619d8d4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Received unexpected event network-vif-plugged-7e8f7bb1-f155-40d6-8e85-f4222da3c027 for instance with vm_state active and task_state rescuing.
Sep 30 21:37:06 compute-0 nova_compute[192810]: 2025-09-30 21:37:06.079 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268211.0780828, fc7fafcc-8033-4e17-b63e-a79fced7ec46 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:37:06 compute-0 nova_compute[192810]: 2025-09-30 21:37:06.079 2 INFO nova.compute.manager [-] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] VM Stopped (Lifecycle Event)
Sep 30 21:37:06 compute-0 nova_compute[192810]: 2025-09-30 21:37:06.110 2 DEBUG nova.virt.libvirt.host [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Removed pending event for 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Sep 30 21:37:06 compute-0 nova_compute[192810]: 2025-09-30 21:37:06.111 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268226.1102386, 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:37:06 compute-0 nova_compute[192810]: 2025-09-30 21:37:06.112 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] VM Resumed (Lifecycle Event)
Sep 30 21:37:06 compute-0 nova_compute[192810]: 2025-09-30 21:37:06.133 2 DEBUG nova.compute.manager [None req-bb606845-dc29-4392-827c-2fd4eb55fbf5 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:37:06 compute-0 nova_compute[192810]: 2025-09-30 21:37:06.178 2 DEBUG nova.compute.manager [None req-66f4820a-c3f6-4faa-822a-9de0948af950 - - - - - -] [instance: fc7fafcc-8033-4e17-b63e-a79fced7ec46] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:37:06 compute-0 nova_compute[192810]: 2025-09-30 21:37:06.179 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:37:06 compute-0 nova_compute[192810]: 2025-09-30 21:37:06.182 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:37:06 compute-0 nova_compute[192810]: 2025-09-30 21:37:06.218 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] During sync_power_state the instance has a pending task (rescuing). Skip.
Sep 30 21:37:06 compute-0 nova_compute[192810]: 2025-09-30 21:37:06.219 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268226.1112623, 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:37:06 compute-0 nova_compute[192810]: 2025-09-30 21:37:06.219 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] VM Started (Lifecycle Event)
Sep 30 21:37:06 compute-0 nova_compute[192810]: 2025-09-30 21:37:06.244 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:37:06 compute-0 nova_compute[192810]: 2025-09-30 21:37:06.247 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:37:07 compute-0 nova_compute[192810]: 2025-09-30 21:37:07.636 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:07 compute-0 nova_compute[192810]: 2025-09-30 21:37:07.743 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268212.7418594, 4255e358-6db2-4947-a3d6-4e045f9235fb => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:37:07 compute-0 nova_compute[192810]: 2025-09-30 21:37:07.743 2 INFO nova.compute.manager [-] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] VM Stopped (Lifecycle Event)
Sep 30 21:37:07 compute-0 nova_compute[192810]: 2025-09-30 21:37:07.764 2 DEBUG nova.compute.manager [None req-6da323a4-4abc-45a3-a9bd-bf7ecdc4aa15 - - - - - -] [instance: 4255e358-6db2-4947-a3d6-4e045f9235fb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:37:07 compute-0 nova_compute[192810]: 2025-09-30 21:37:07.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:07 compute-0 nova_compute[192810]: 2025-09-30 21:37:07.826 2 DEBUG nova.compute.manager [req-93c53c8a-395e-4e70-94f8-cebc4ce3bbed req-02765fb0-819f-49cb-9255-9f48fe7dcb2c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Received event network-vif-plugged-7e8f7bb1-f155-40d6-8e85-f4222da3c027 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:37:07 compute-0 nova_compute[192810]: 2025-09-30 21:37:07.826 2 DEBUG oslo_concurrency.lockutils [req-93c53c8a-395e-4e70-94f8-cebc4ce3bbed req-02765fb0-819f-49cb-9255-9f48fe7dcb2c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:07 compute-0 nova_compute[192810]: 2025-09-30 21:37:07.827 2 DEBUG oslo_concurrency.lockutils [req-93c53c8a-395e-4e70-94f8-cebc4ce3bbed req-02765fb0-819f-49cb-9255-9f48fe7dcb2c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:07 compute-0 nova_compute[192810]: 2025-09-30 21:37:07.827 2 DEBUG oslo_concurrency.lockutils [req-93c53c8a-395e-4e70-94f8-cebc4ce3bbed req-02765fb0-819f-49cb-9255-9f48fe7dcb2c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:07 compute-0 nova_compute[192810]: 2025-09-30 21:37:07.827 2 DEBUG nova.compute.manager [req-93c53c8a-395e-4e70-94f8-cebc4ce3bbed req-02765fb0-819f-49cb-9255-9f48fe7dcb2c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] No waiting events found dispatching network-vif-plugged-7e8f7bb1-f155-40d6-8e85-f4222da3c027 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:37:07 compute-0 nova_compute[192810]: 2025-09-30 21:37:07.828 2 WARNING nova.compute.manager [req-93c53c8a-395e-4e70-94f8-cebc4ce3bbed req-02765fb0-819f-49cb-9255-9f48fe7dcb2c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Received unexpected event network-vif-plugged-7e8f7bb1-f155-40d6-8e85-f4222da3c027 for instance with vm_state rescued and task_state None.
Sep 30 21:37:07 compute-0 nova_compute[192810]: 2025-09-30 21:37:07.828 2 DEBUG nova.compute.manager [req-93c53c8a-395e-4e70-94f8-cebc4ce3bbed req-02765fb0-819f-49cb-9255-9f48fe7dcb2c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Received event network-vif-plugged-7e8f7bb1-f155-40d6-8e85-f4222da3c027 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:37:07 compute-0 nova_compute[192810]: 2025-09-30 21:37:07.828 2 DEBUG oslo_concurrency.lockutils [req-93c53c8a-395e-4e70-94f8-cebc4ce3bbed req-02765fb0-819f-49cb-9255-9f48fe7dcb2c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:07 compute-0 nova_compute[192810]: 2025-09-30 21:37:07.828 2 DEBUG oslo_concurrency.lockutils [req-93c53c8a-395e-4e70-94f8-cebc4ce3bbed req-02765fb0-819f-49cb-9255-9f48fe7dcb2c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:07 compute-0 nova_compute[192810]: 2025-09-30 21:37:07.829 2 DEBUG oslo_concurrency.lockutils [req-93c53c8a-395e-4e70-94f8-cebc4ce3bbed req-02765fb0-819f-49cb-9255-9f48fe7dcb2c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:07 compute-0 nova_compute[192810]: 2025-09-30 21:37:07.829 2 DEBUG nova.compute.manager [req-93c53c8a-395e-4e70-94f8-cebc4ce3bbed req-02765fb0-819f-49cb-9255-9f48fe7dcb2c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] No waiting events found dispatching network-vif-plugged-7e8f7bb1-f155-40d6-8e85-f4222da3c027 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:37:07 compute-0 nova_compute[192810]: 2025-09-30 21:37:07.829 2 WARNING nova.compute.manager [req-93c53c8a-395e-4e70-94f8-cebc4ce3bbed req-02765fb0-819f-49cb-9255-9f48fe7dcb2c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Received unexpected event network-vif-plugged-7e8f7bb1-f155-40d6-8e85-f4222da3c027 for instance with vm_state rescued and task_state None.
Sep 30 21:37:08 compute-0 podman[237279]: 2025-09-30 21:37:08.315297112 +0000 UTC m=+0.050680224 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:37:08 compute-0 podman[237280]: 2025-09-30 21:37:08.323054955 +0000 UTC m=+0.058046788 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, container_name=openstack_network_exporter, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-type=git, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Sep 30 21:37:09 compute-0 nova_compute[192810]: 2025-09-30 21:37:09.205 2 DEBUG nova.compute.manager [req-b8cb1cf2-2ed9-4d62-999f-b377ced749d3 req-8b12ed11-ff8b-4451-854a-bb47c0428bae dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Received event network-changed-7e8f7bb1-f155-40d6-8e85-f4222da3c027 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:37:09 compute-0 nova_compute[192810]: 2025-09-30 21:37:09.205 2 DEBUG nova.compute.manager [req-b8cb1cf2-2ed9-4d62-999f-b377ced749d3 req-8b12ed11-ff8b-4451-854a-bb47c0428bae dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Refreshing instance network info cache due to event network-changed-7e8f7bb1-f155-40d6-8e85-f4222da3c027. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:37:09 compute-0 nova_compute[192810]: 2025-09-30 21:37:09.206 2 DEBUG oslo_concurrency.lockutils [req-b8cb1cf2-2ed9-4d62-999f-b377ced749d3 req-8b12ed11-ff8b-4451-854a-bb47c0428bae dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:37:09 compute-0 nova_compute[192810]: 2025-09-30 21:37:09.206 2 DEBUG oslo_concurrency.lockutils [req-b8cb1cf2-2ed9-4d62-999f-b377ced749d3 req-8b12ed11-ff8b-4451-854a-bb47c0428bae dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:37:09 compute-0 nova_compute[192810]: 2025-09-30 21:37:09.206 2 DEBUG nova.network.neutron [req-b8cb1cf2-2ed9-4d62-999f-b377ced749d3 req-8b12ed11-ff8b-4451-854a-bb47c0428bae dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Refreshing network info cache for port 7e8f7bb1-f155-40d6-8e85-f4222da3c027 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:37:11 compute-0 nova_compute[192810]: 2025-09-30 21:37:11.082 2 DEBUG nova.network.neutron [req-b8cb1cf2-2ed9-4d62-999f-b377ced749d3 req-8b12ed11-ff8b-4451-854a-bb47c0428bae dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Updated VIF entry in instance network info cache for port 7e8f7bb1-f155-40d6-8e85-f4222da3c027. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:37:11 compute-0 nova_compute[192810]: 2025-09-30 21:37:11.083 2 DEBUG nova.network.neutron [req-b8cb1cf2-2ed9-4d62-999f-b377ced749d3 req-8b12ed11-ff8b-4451-854a-bb47c0428bae dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Updating instance_info_cache with network_info: [{"id": "7e8f7bb1-f155-40d6-8e85-f4222da3c027", "address": "fa:16:3e:86:ba:8b", "network": {"id": "6f659d19-7261-4309-81bc-b7bb9a9da219", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1930279008-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "793bce1316e34dada7414560b74789f4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e8f7bb1-f1", "ovs_interfaceid": "7e8f7bb1-f155-40d6-8e85-f4222da3c027", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:37:12 compute-0 nova_compute[192810]: 2025-09-30 21:37:12.638 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:12 compute-0 nova_compute[192810]: 2025-09-30 21:37:12.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:12 compute-0 nova_compute[192810]: 2025-09-30 21:37:12.787 2 DEBUG oslo_concurrency.lockutils [req-b8cb1cf2-2ed9-4d62-999f-b377ced749d3 req-8b12ed11-ff8b-4451-854a-bb47c0428bae dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:37:13 compute-0 nova_compute[192810]: 2025-09-30 21:37:13.195 2 DEBUG nova.compute.manager [req-fb559915-cba2-49c5-8153-30829318451a req-8cbe70cb-fb29-48c8-b58a-b01e51500fd2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Received event network-changed-7e8f7bb1-f155-40d6-8e85-f4222da3c027 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:37:13 compute-0 nova_compute[192810]: 2025-09-30 21:37:13.196 2 DEBUG nova.compute.manager [req-fb559915-cba2-49c5-8153-30829318451a req-8cbe70cb-fb29-48c8-b58a-b01e51500fd2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Refreshing instance network info cache due to event network-changed-7e8f7bb1-f155-40d6-8e85-f4222da3c027. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:37:13 compute-0 nova_compute[192810]: 2025-09-30 21:37:13.196 2 DEBUG oslo_concurrency.lockutils [req-fb559915-cba2-49c5-8153-30829318451a req-8cbe70cb-fb29-48c8-b58a-b01e51500fd2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:37:13 compute-0 nova_compute[192810]: 2025-09-30 21:37:13.196 2 DEBUG oslo_concurrency.lockutils [req-fb559915-cba2-49c5-8153-30829318451a req-8cbe70cb-fb29-48c8-b58a-b01e51500fd2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:37:13 compute-0 nova_compute[192810]: 2025-09-30 21:37:13.196 2 DEBUG nova.network.neutron [req-fb559915-cba2-49c5-8153-30829318451a req-8cbe70cb-fb29-48c8-b58a-b01e51500fd2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Refreshing network info cache for port 7e8f7bb1-f155-40d6-8e85-f4222da3c027 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:37:14 compute-0 podman[237326]: 2025-09-30 21:37:14.318972312 +0000 UTC m=+0.053105214 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 21:37:14 compute-0 podman[237325]: 2025-09-30 21:37:14.329517185 +0000 UTC m=+0.065211696 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:37:14 compute-0 podman[237324]: 2025-09-30 21:37:14.336354636 +0000 UTC m=+0.074662092 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:37:15 compute-0 nova_compute[192810]: 2025-09-30 21:37:15.022 2 DEBUG nova.network.neutron [req-fb559915-cba2-49c5-8153-30829318451a req-8cbe70cb-fb29-48c8-b58a-b01e51500fd2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Updated VIF entry in instance network info cache for port 7e8f7bb1-f155-40d6-8e85-f4222da3c027. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:37:15 compute-0 nova_compute[192810]: 2025-09-30 21:37:15.022 2 DEBUG nova.network.neutron [req-fb559915-cba2-49c5-8153-30829318451a req-8cbe70cb-fb29-48c8-b58a-b01e51500fd2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Updating instance_info_cache with network_info: [{"id": "7e8f7bb1-f155-40d6-8e85-f4222da3c027", "address": "fa:16:3e:86:ba:8b", "network": {"id": "6f659d19-7261-4309-81bc-b7bb9a9da219", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1930279008-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "793bce1316e34dada7414560b74789f4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e8f7bb1-f1", "ovs_interfaceid": "7e8f7bb1-f155-40d6-8e85-f4222da3c027", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:37:15 compute-0 nova_compute[192810]: 2025-09-30 21:37:15.048 2 DEBUG oslo_concurrency.lockutils [req-fb559915-cba2-49c5-8153-30829318451a req-8cbe70cb-fb29-48c8-b58a-b01e51500fd2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:37:17 compute-0 nova_compute[192810]: 2025-09-30 21:37:17.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:17 compute-0 NetworkManager[51733]: <info>  [1759268237.0511] manager: (patch-br-int-to-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/189)
Sep 30 21:37:17 compute-0 NetworkManager[51733]: <info>  [1759268237.0524] manager: (patch-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/190)
Sep 30 21:37:17 compute-0 nova_compute[192810]: 2025-09-30 21:37:17.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:17 compute-0 nova_compute[192810]: 2025-09-30 21:37:17.160 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:17 compute-0 nova_compute[192810]: 2025-09-30 21:37:17.407 2 DEBUG nova.compute.manager [req-0c797a27-a50a-42dd-a21d-a562e9e1e593 req-6cafad33-64fd-429e-aebb-5e785fee2276 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Received event network-changed-7e8f7bb1-f155-40d6-8e85-f4222da3c027 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:37:17 compute-0 nova_compute[192810]: 2025-09-30 21:37:17.408 2 DEBUG nova.compute.manager [req-0c797a27-a50a-42dd-a21d-a562e9e1e593 req-6cafad33-64fd-429e-aebb-5e785fee2276 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Refreshing instance network info cache due to event network-changed-7e8f7bb1-f155-40d6-8e85-f4222da3c027. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:37:17 compute-0 nova_compute[192810]: 2025-09-30 21:37:17.409 2 DEBUG oslo_concurrency.lockutils [req-0c797a27-a50a-42dd-a21d-a562e9e1e593 req-6cafad33-64fd-429e-aebb-5e785fee2276 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:37:17 compute-0 nova_compute[192810]: 2025-09-30 21:37:17.409 2 DEBUG oslo_concurrency.lockutils [req-0c797a27-a50a-42dd-a21d-a562e9e1e593 req-6cafad33-64fd-429e-aebb-5e785fee2276 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:37:17 compute-0 nova_compute[192810]: 2025-09-30 21:37:17.410 2 DEBUG nova.network.neutron [req-0c797a27-a50a-42dd-a21d-a562e9e1e593 req-6cafad33-64fd-429e-aebb-5e785fee2276 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Refreshing network info cache for port 7e8f7bb1-f155-40d6-8e85-f4222da3c027 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:37:17 compute-0 nova_compute[192810]: 2025-09-30 21:37:17.640 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:17 compute-0 nova_compute[192810]: 2025-09-30 21:37:17.780 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:18 compute-0 nova_compute[192810]: 2025-09-30 21:37:18.716 2 DEBUG nova.network.neutron [req-0c797a27-a50a-42dd-a21d-a562e9e1e593 req-6cafad33-64fd-429e-aebb-5e785fee2276 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Updated VIF entry in instance network info cache for port 7e8f7bb1-f155-40d6-8e85-f4222da3c027. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:37:18 compute-0 nova_compute[192810]: 2025-09-30 21:37:18.716 2 DEBUG nova.network.neutron [req-0c797a27-a50a-42dd-a21d-a562e9e1e593 req-6cafad33-64fd-429e-aebb-5e785fee2276 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Updating instance_info_cache with network_info: [{"id": "7e8f7bb1-f155-40d6-8e85-f4222da3c027", "address": "fa:16:3e:86:ba:8b", "network": {"id": "6f659d19-7261-4309-81bc-b7bb9a9da219", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1930279008-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "793bce1316e34dada7414560b74789f4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e8f7bb1-f1", "ovs_interfaceid": "7e8f7bb1-f155-40d6-8e85-f4222da3c027", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:37:18 compute-0 nova_compute[192810]: 2025-09-30 21:37:18.734 2 DEBUG oslo_concurrency.lockutils [req-0c797a27-a50a-42dd-a21d-a562e9e1e593 req-6cafad33-64fd-429e-aebb-5e785fee2276 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:37:19 compute-0 nova_compute[192810]: 2025-09-30 21:37:19.309 2 DEBUG oslo_concurrency.lockutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Acquiring lock "b5f1bc30-6de7-46a0-8a4b-6f32146f61c0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:19 compute-0 nova_compute[192810]: 2025-09-30 21:37:19.309 2 DEBUG oslo_concurrency.lockutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Lock "b5f1bc30-6de7-46a0-8a4b-6f32146f61c0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:19 compute-0 nova_compute[192810]: 2025-09-30 21:37:19.353 2 DEBUG nova.compute.manager [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:37:19 compute-0 nova_compute[192810]: 2025-09-30 21:37:19.474 2 DEBUG oslo_concurrency.lockutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:19 compute-0 nova_compute[192810]: 2025-09-30 21:37:19.476 2 DEBUG oslo_concurrency.lockutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:19 compute-0 nova_compute[192810]: 2025-09-30 21:37:19.482 2 DEBUG nova.virt.hardware [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:37:19 compute-0 nova_compute[192810]: 2025-09-30 21:37:19.482 2 INFO nova.compute.claims [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:37:19 compute-0 nova_compute[192810]: 2025-09-30 21:37:19.780 2 DEBUG nova.compute.provider_tree [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:37:19 compute-0 nova_compute[192810]: 2025-09-30 21:37:19.810 2 DEBUG nova.scheduler.client.report [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:37:19 compute-0 nova_compute[192810]: 2025-09-30 21:37:19.866 2 DEBUG oslo_concurrency.lockutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.390s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:19 compute-0 nova_compute[192810]: 2025-09-30 21:37:19.867 2 DEBUG nova.compute.manager [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:37:19 compute-0 nova_compute[192810]: 2025-09-30 21:37:19.946 2 DEBUG nova.compute.manager [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:37:19 compute-0 nova_compute[192810]: 2025-09-30 21:37:19.946 2 DEBUG nova.network.neutron [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:37:19 compute-0 nova_compute[192810]: 2025-09-30 21:37:19.975 2 INFO nova.virt.libvirt.driver [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:37:20 compute-0 nova_compute[192810]: 2025-09-30 21:37:20.005 2 DEBUG nova.compute.manager [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:37:20 compute-0 nova_compute[192810]: 2025-09-30 21:37:20.126 2 DEBUG nova.policy [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cbebeb1f78b64ee09e2da39a04f9f282', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '27ad5d27d7a44404987f6bf297897a45', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:37:20 compute-0 nova_compute[192810]: 2025-09-30 21:37:20.133 2 DEBUG nova.compute.manager [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:37:20 compute-0 nova_compute[192810]: 2025-09-30 21:37:20.135 2 DEBUG nova.virt.libvirt.driver [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:37:20 compute-0 nova_compute[192810]: 2025-09-30 21:37:20.135 2 INFO nova.virt.libvirt.driver [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Creating image(s)
Sep 30 21:37:20 compute-0 nova_compute[192810]: 2025-09-30 21:37:20.136 2 DEBUG oslo_concurrency.lockutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Acquiring lock "/var/lib/nova/instances/b5f1bc30-6de7-46a0-8a4b-6f32146f61c0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:20 compute-0 nova_compute[192810]: 2025-09-30 21:37:20.136 2 DEBUG oslo_concurrency.lockutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Lock "/var/lib/nova/instances/b5f1bc30-6de7-46a0-8a4b-6f32146f61c0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:20 compute-0 nova_compute[192810]: 2025-09-30 21:37:20.136 2 DEBUG oslo_concurrency.lockutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Lock "/var/lib/nova/instances/b5f1bc30-6de7-46a0-8a4b-6f32146f61c0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:20 compute-0 nova_compute[192810]: 2025-09-30 21:37:20.148 2 DEBUG oslo_concurrency.processutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:37:20 compute-0 nova_compute[192810]: 2025-09-30 21:37:20.208 2 DEBUG oslo_concurrency.processutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:37:20 compute-0 nova_compute[192810]: 2025-09-30 21:37:20.209 2 DEBUG oslo_concurrency.lockutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:20 compute-0 nova_compute[192810]: 2025-09-30 21:37:20.210 2 DEBUG oslo_concurrency.lockutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:20 compute-0 nova_compute[192810]: 2025-09-30 21:37:20.220 2 DEBUG oslo_concurrency.processutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:37:20 compute-0 nova_compute[192810]: 2025-09-30 21:37:20.272 2 DEBUG oslo_concurrency.processutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:37:20 compute-0 nova_compute[192810]: 2025-09-30 21:37:20.273 2 DEBUG oslo_concurrency.processutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/b5f1bc30-6de7-46a0-8a4b-6f32146f61c0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:37:20 compute-0 nova_compute[192810]: 2025-09-30 21:37:20.337 2 DEBUG oslo_concurrency.processutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/b5f1bc30-6de7-46a0-8a4b-6f32146f61c0/disk 1073741824" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:37:20 compute-0 nova_compute[192810]: 2025-09-30 21:37:20.338 2 DEBUG oslo_concurrency.lockutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:20 compute-0 nova_compute[192810]: 2025-09-30 21:37:20.339 2 DEBUG oslo_concurrency.processutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:37:20 compute-0 nova_compute[192810]: 2025-09-30 21:37:20.390 2 DEBUG oslo_concurrency.processutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:37:20 compute-0 nova_compute[192810]: 2025-09-30 21:37:20.391 2 DEBUG nova.virt.disk.api [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Checking if we can resize image /var/lib/nova/instances/b5f1bc30-6de7-46a0-8a4b-6f32146f61c0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:37:20 compute-0 nova_compute[192810]: 2025-09-30 21:37:20.391 2 DEBUG oslo_concurrency.processutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b5f1bc30-6de7-46a0-8a4b-6f32146f61c0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:37:20 compute-0 nova_compute[192810]: 2025-09-30 21:37:20.445 2 DEBUG oslo_concurrency.processutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b5f1bc30-6de7-46a0-8a4b-6f32146f61c0/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:37:20 compute-0 nova_compute[192810]: 2025-09-30 21:37:20.446 2 DEBUG nova.virt.disk.api [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Cannot resize image /var/lib/nova/instances/b5f1bc30-6de7-46a0-8a4b-6f32146f61c0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:37:20 compute-0 nova_compute[192810]: 2025-09-30 21:37:20.447 2 DEBUG nova.objects.instance [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Lazy-loading 'migration_context' on Instance uuid b5f1bc30-6de7-46a0-8a4b-6f32146f61c0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:37:20 compute-0 nova_compute[192810]: 2025-09-30 21:37:20.471 2 DEBUG nova.virt.libvirt.driver [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:37:20 compute-0 nova_compute[192810]: 2025-09-30 21:37:20.472 2 DEBUG nova.virt.libvirt.driver [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Ensure instance console log exists: /var/lib/nova/instances/b5f1bc30-6de7-46a0-8a4b-6f32146f61c0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:37:20 compute-0 nova_compute[192810]: 2025-09-30 21:37:20.472 2 DEBUG oslo_concurrency.lockutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:20 compute-0 nova_compute[192810]: 2025-09-30 21:37:20.473 2 DEBUG oslo_concurrency.lockutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:20 compute-0 nova_compute[192810]: 2025-09-30 21:37:20.473 2 DEBUG oslo_concurrency.lockutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:20 compute-0 nova_compute[192810]: 2025-09-30 21:37:20.688 2 DEBUG nova.network.neutron [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Successfully created port: 1208ea00-ed22-4f18-83a5-e09220676a9b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:37:21 compute-0 nova_compute[192810]: 2025-09-30 21:37:21.069 2 DEBUG nova.compute.manager [req-40c2fe4c-2c60-4b9f-87e1-89e59ca15143 req-fb130a59-32e9-4ac6-85ac-982292d2b173 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Received event network-changed-7e8f7bb1-f155-40d6-8e85-f4222da3c027 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:37:21 compute-0 nova_compute[192810]: 2025-09-30 21:37:21.069 2 DEBUG nova.compute.manager [req-40c2fe4c-2c60-4b9f-87e1-89e59ca15143 req-fb130a59-32e9-4ac6-85ac-982292d2b173 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Refreshing instance network info cache due to event network-changed-7e8f7bb1-f155-40d6-8e85-f4222da3c027. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:37:21 compute-0 nova_compute[192810]: 2025-09-30 21:37:21.069 2 DEBUG oslo_concurrency.lockutils [req-40c2fe4c-2c60-4b9f-87e1-89e59ca15143 req-fb130a59-32e9-4ac6-85ac-982292d2b173 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:37:21 compute-0 nova_compute[192810]: 2025-09-30 21:37:21.070 2 DEBUG oslo_concurrency.lockutils [req-40c2fe4c-2c60-4b9f-87e1-89e59ca15143 req-fb130a59-32e9-4ac6-85ac-982292d2b173 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:37:21 compute-0 nova_compute[192810]: 2025-09-30 21:37:21.070 2 DEBUG nova.network.neutron [req-40c2fe4c-2c60-4b9f-87e1-89e59ca15143 req-fb130a59-32e9-4ac6-85ac-982292d2b173 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Refreshing network info cache for port 7e8f7bb1-f155-40d6-8e85-f4222da3c027 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:37:22 compute-0 nova_compute[192810]: 2025-09-30 21:37:22.103 2 DEBUG nova.network.neutron [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Successfully updated port: 1208ea00-ed22-4f18-83a5-e09220676a9b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:37:22 compute-0 nova_compute[192810]: 2025-09-30 21:37:22.133 2 DEBUG oslo_concurrency.lockutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Acquiring lock "refresh_cache-b5f1bc30-6de7-46a0-8a4b-6f32146f61c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:37:22 compute-0 nova_compute[192810]: 2025-09-30 21:37:22.133 2 DEBUG oslo_concurrency.lockutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Acquired lock "refresh_cache-b5f1bc30-6de7-46a0-8a4b-6f32146f61c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:37:22 compute-0 nova_compute[192810]: 2025-09-30 21:37:22.134 2 DEBUG nova.network.neutron [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:37:22 compute-0 nova_compute[192810]: 2025-09-30 21:37:22.361 2 DEBUG nova.network.neutron [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:37:22 compute-0 nova_compute[192810]: 2025-09-30 21:37:22.641 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:22 compute-0 nova_compute[192810]: 2025-09-30 21:37:22.735 2 DEBUG nova.network.neutron [req-40c2fe4c-2c60-4b9f-87e1-89e59ca15143 req-fb130a59-32e9-4ac6-85ac-982292d2b173 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Updated VIF entry in instance network info cache for port 7e8f7bb1-f155-40d6-8e85-f4222da3c027. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:37:22 compute-0 nova_compute[192810]: 2025-09-30 21:37:22.736 2 DEBUG nova.network.neutron [req-40c2fe4c-2c60-4b9f-87e1-89e59ca15143 req-fb130a59-32e9-4ac6-85ac-982292d2b173 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Updating instance_info_cache with network_info: [{"id": "7e8f7bb1-f155-40d6-8e85-f4222da3c027", "address": "fa:16:3e:86:ba:8b", "network": {"id": "6f659d19-7261-4309-81bc-b7bb9a9da219", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1930279008-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "793bce1316e34dada7414560b74789f4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e8f7bb1-f1", "ovs_interfaceid": "7e8f7bb1-f155-40d6-8e85-f4222da3c027", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:37:22 compute-0 nova_compute[192810]: 2025-09-30 21:37:22.760 2 DEBUG oslo_concurrency.lockutils [req-40c2fe4c-2c60-4b9f-87e1-89e59ca15143 req-fb130a59-32e9-4ac6-85ac-982292d2b173 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:37:22 compute-0 nova_compute[192810]: 2025-09-30 21:37:22.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.077 2 DEBUG nova.compute.manager [req-c7aad8f9-aa21-4ef4-82fa-8a85f8da1bdf req-f5558a25-6488-4c3c-b088-19e6e538da39 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Received event network-changed-1208ea00-ed22-4f18-83a5-e09220676a9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.078 2 DEBUG nova.compute.manager [req-c7aad8f9-aa21-4ef4-82fa-8a85f8da1bdf req-f5558a25-6488-4c3c-b088-19e6e538da39 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Refreshing instance network info cache due to event network-changed-1208ea00-ed22-4f18-83a5-e09220676a9b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.079 2 DEBUG oslo_concurrency.lockutils [req-c7aad8f9-aa21-4ef4-82fa-8a85f8da1bdf req-f5558a25-6488-4c3c-b088-19e6e538da39 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-b5f1bc30-6de7-46a0-8a4b-6f32146f61c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.338 2 DEBUG nova.network.neutron [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Updating instance_info_cache with network_info: [{"id": "1208ea00-ed22-4f18-83a5-e09220676a9b", "address": "fa:16:3e:2b:2c:6a", "network": {"id": "6815a373-206c-4f26-aa16-cc2d72d0f14d", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1587191426-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27ad5d27d7a44404987f6bf297897a45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1208ea00-ed", "ovs_interfaceid": "1208ea00-ed22-4f18-83a5-e09220676a9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.394 2 DEBUG oslo_concurrency.lockutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Releasing lock "refresh_cache-b5f1bc30-6de7-46a0-8a4b-6f32146f61c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.395 2 DEBUG nova.compute.manager [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Instance network_info: |[{"id": "1208ea00-ed22-4f18-83a5-e09220676a9b", "address": "fa:16:3e:2b:2c:6a", "network": {"id": "6815a373-206c-4f26-aa16-cc2d72d0f14d", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1587191426-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27ad5d27d7a44404987f6bf297897a45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1208ea00-ed", "ovs_interfaceid": "1208ea00-ed22-4f18-83a5-e09220676a9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.396 2 DEBUG oslo_concurrency.lockutils [req-c7aad8f9-aa21-4ef4-82fa-8a85f8da1bdf req-f5558a25-6488-4c3c-b088-19e6e538da39 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-b5f1bc30-6de7-46a0-8a4b-6f32146f61c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.396 2 DEBUG nova.network.neutron [req-c7aad8f9-aa21-4ef4-82fa-8a85f8da1bdf req-f5558a25-6488-4c3c-b088-19e6e538da39 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Refreshing network info cache for port 1208ea00-ed22-4f18-83a5-e09220676a9b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.401 2 DEBUG nova.virt.libvirt.driver [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Start _get_guest_xml network_info=[{"id": "1208ea00-ed22-4f18-83a5-e09220676a9b", "address": "fa:16:3e:2b:2c:6a", "network": {"id": "6815a373-206c-4f26-aa16-cc2d72d0f14d", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1587191426-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27ad5d27d7a44404987f6bf297897a45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1208ea00-ed", "ovs_interfaceid": "1208ea00-ed22-4f18-83a5-e09220676a9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.406 2 WARNING nova.virt.libvirt.driver [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.413 2 DEBUG nova.virt.libvirt.host [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.414 2 DEBUG nova.virt.libvirt.host [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.418 2 DEBUG nova.virt.libvirt.host [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.419 2 DEBUG nova.virt.libvirt.host [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.422 2 DEBUG nova.virt.libvirt.driver [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.423 2 DEBUG nova.virt.hardware [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.424 2 DEBUG nova.virt.hardware [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.424 2 DEBUG nova.virt.hardware [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.424 2 DEBUG nova.virt.hardware [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.425 2 DEBUG nova.virt.hardware [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.425 2 DEBUG nova.virt.hardware [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.425 2 DEBUG nova.virt.hardware [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.425 2 DEBUG nova.virt.hardware [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.426 2 DEBUG nova.virt.hardware [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.426 2 DEBUG nova.virt.hardware [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.426 2 DEBUG nova.virt.hardware [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.430 2 DEBUG nova.virt.libvirt.vif [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:37:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-2130176929',display_name='tempest-ListServersNegativeTestJSON-server-2130176929-3',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-2130176929-3',id=115,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=2,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='27ad5d27d7a44404987f6bf297897a45',ramdisk_id='',reservation_id='r-msrdvef1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServersNegativeTestJSON-1844496645',owner_user_name='tempest-ListServersNegativeTestJSON-1844496645-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:37:20Z,user_data=None,user_id='cbebeb1f78b64ee09e2da39a04f9f282',uuid=b5f1bc30-6de7-46a0-8a4b-6f32146f61c0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1208ea00-ed22-4f18-83a5-e09220676a9b", "address": "fa:16:3e:2b:2c:6a", "network": {"id": "6815a373-206c-4f26-aa16-cc2d72d0f14d", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1587191426-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27ad5d27d7a44404987f6bf297897a45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1208ea00-ed", "ovs_interfaceid": "1208ea00-ed22-4f18-83a5-e09220676a9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.431 2 DEBUG nova.network.os_vif_util [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Converting VIF {"id": "1208ea00-ed22-4f18-83a5-e09220676a9b", "address": "fa:16:3e:2b:2c:6a", "network": {"id": "6815a373-206c-4f26-aa16-cc2d72d0f14d", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1587191426-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27ad5d27d7a44404987f6bf297897a45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1208ea00-ed", "ovs_interfaceid": "1208ea00-ed22-4f18-83a5-e09220676a9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.431 2 DEBUG nova.network.os_vif_util [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2b:2c:6a,bridge_name='br-int',has_traffic_filtering=True,id=1208ea00-ed22-4f18-83a5-e09220676a9b,network=Network(6815a373-206c-4f26-aa16-cc2d72d0f14d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1208ea00-ed') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.432 2 DEBUG nova.objects.instance [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Lazy-loading 'pci_devices' on Instance uuid b5f1bc30-6de7-46a0-8a4b-6f32146f61c0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.447 2 DEBUG nova.virt.libvirt.driver [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:37:23 compute-0 nova_compute[192810]:   <uuid>b5f1bc30-6de7-46a0-8a4b-6f32146f61c0</uuid>
Sep 30 21:37:23 compute-0 nova_compute[192810]:   <name>instance-00000073</name>
Sep 30 21:37:23 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:37:23 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:37:23 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:37:23 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:       <nova:name>tempest-ListServersNegativeTestJSON-server-2130176929-3</nova:name>
Sep 30 21:37:23 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:37:23</nova:creationTime>
Sep 30 21:37:23 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:37:23 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:37:23 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:37:23 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:37:23 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:37:23 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:37:23 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:37:23 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:37:23 compute-0 nova_compute[192810]:         <nova:user uuid="cbebeb1f78b64ee09e2da39a04f9f282">tempest-ListServersNegativeTestJSON-1844496645-project-member</nova:user>
Sep 30 21:37:23 compute-0 nova_compute[192810]:         <nova:project uuid="27ad5d27d7a44404987f6bf297897a45">tempest-ListServersNegativeTestJSON-1844496645</nova:project>
Sep 30 21:37:23 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:37:23 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:37:23 compute-0 nova_compute[192810]:         <nova:port uuid="1208ea00-ed22-4f18-83a5-e09220676a9b">
Sep 30 21:37:23 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:37:23 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:37:23 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:37:23 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:37:23 compute-0 nova_compute[192810]:     <system>
Sep 30 21:37:23 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:37:23 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:37:23 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:37:23 compute-0 nova_compute[192810]:       <entry name="serial">b5f1bc30-6de7-46a0-8a4b-6f32146f61c0</entry>
Sep 30 21:37:23 compute-0 nova_compute[192810]:       <entry name="uuid">b5f1bc30-6de7-46a0-8a4b-6f32146f61c0</entry>
Sep 30 21:37:23 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     </system>
Sep 30 21:37:23 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:37:23 compute-0 nova_compute[192810]:   <os>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:   </os>
Sep 30 21:37:23 compute-0 nova_compute[192810]:   <features>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:   </features>
Sep 30 21:37:23 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:37:23 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:37:23 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:37:23 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:37:23 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:37:23 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/b5f1bc30-6de7-46a0-8a4b-6f32146f61c0/disk"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:37:23 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/b5f1bc30-6de7-46a0-8a4b-6f32146f61c0/disk.config"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:37:23 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:2b:2c:6a"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:       <target dev="tap1208ea00-ed"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:37:23 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/b5f1bc30-6de7-46a0-8a4b-6f32146f61c0/console.log" append="off"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     <video>
Sep 30 21:37:23 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     </video>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:37:23 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:37:23 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:37:23 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:37:23 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:37:23 compute-0 nova_compute[192810]: </domain>
Sep 30 21:37:23 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.449 2 DEBUG nova.compute.manager [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Preparing to wait for external event network-vif-plugged-1208ea00-ed22-4f18-83a5-e09220676a9b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.449 2 DEBUG oslo_concurrency.lockutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Acquiring lock "b5f1bc30-6de7-46a0-8a4b-6f32146f61c0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.450 2 DEBUG oslo_concurrency.lockutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Lock "b5f1bc30-6de7-46a0-8a4b-6f32146f61c0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.450 2 DEBUG oslo_concurrency.lockutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Lock "b5f1bc30-6de7-46a0-8a4b-6f32146f61c0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.451 2 DEBUG nova.virt.libvirt.vif [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:37:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-2130176929',display_name='tempest-ListServersNegativeTestJSON-server-2130176929-3',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-2130176929-3',id=115,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=2,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='27ad5d27d7a44404987f6bf297897a45',ramdisk_id='',reservation_id='r-msrdvef1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServersNegativeTestJSON-1844496645',owner_user_name='tempest-ListServersNegativeTestJSON-1844496645-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:37:20Z,user_data=None,user_id='cbebeb1f78b64ee09e2da39a04f9f282',uuid=b5f1bc30-6de7-46a0-8a4b-6f32146f61c0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1208ea00-ed22-4f18-83a5-e09220676a9b", "address": "fa:16:3e:2b:2c:6a", "network": {"id": "6815a373-206c-4f26-aa16-cc2d72d0f14d", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1587191426-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27ad5d27d7a44404987f6bf297897a45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1208ea00-ed", "ovs_interfaceid": "1208ea00-ed22-4f18-83a5-e09220676a9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.451 2 DEBUG nova.network.os_vif_util [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Converting VIF {"id": "1208ea00-ed22-4f18-83a5-e09220676a9b", "address": "fa:16:3e:2b:2c:6a", "network": {"id": "6815a373-206c-4f26-aa16-cc2d72d0f14d", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1587191426-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27ad5d27d7a44404987f6bf297897a45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1208ea00-ed", "ovs_interfaceid": "1208ea00-ed22-4f18-83a5-e09220676a9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.452 2 DEBUG nova.network.os_vif_util [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2b:2c:6a,bridge_name='br-int',has_traffic_filtering=True,id=1208ea00-ed22-4f18-83a5-e09220676a9b,network=Network(6815a373-206c-4f26-aa16-cc2d72d0f14d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1208ea00-ed') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.452 2 DEBUG os_vif [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:2c:6a,bridge_name='br-int',has_traffic_filtering=True,id=1208ea00-ed22-4f18-83a5-e09220676a9b,network=Network(6815a373-206c-4f26-aa16-cc2d72d0f14d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1208ea00-ed') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.454 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.454 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.458 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.459 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1208ea00-ed, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.459 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1208ea00-ed, col_values=(('external_ids', {'iface-id': '1208ea00-ed22-4f18-83a5-e09220676a9b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2b:2c:6a', 'vm-uuid': 'b5f1bc30-6de7-46a0-8a4b-6f32146f61c0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:23 compute-0 NetworkManager[51733]: <info>  [1759268243.4621] manager: (tap1208ea00-ed): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/191)
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.468 2 INFO os_vif [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:2c:6a,bridge_name='br-int',has_traffic_filtering=True,id=1208ea00-ed22-4f18-83a5-e09220676a9b,network=Network(6815a373-206c-4f26-aa16-cc2d72d0f14d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1208ea00-ed')
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.660 2 DEBUG nova.virt.libvirt.driver [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.660 2 DEBUG nova.virt.libvirt.driver [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.660 2 DEBUG nova.virt.libvirt.driver [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] No VIF found with MAC fa:16:3e:2b:2c:6a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:37:23 compute-0 nova_compute[192810]: 2025-09-30 21:37:23.661 2 INFO nova.virt.libvirt.driver [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Using config drive
Sep 30 21:37:24 compute-0 nova_compute[192810]: 2025-09-30 21:37:24.160 2 INFO nova.virt.libvirt.driver [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Creating config drive at /var/lib/nova/instances/b5f1bc30-6de7-46a0-8a4b-6f32146f61c0/disk.config
Sep 30 21:37:24 compute-0 nova_compute[192810]: 2025-09-30 21:37:24.166 2 DEBUG oslo_concurrency.processutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b5f1bc30-6de7-46a0-8a4b-6f32146f61c0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5h26ld8n execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:37:24 compute-0 nova_compute[192810]: 2025-09-30 21:37:24.298 2 DEBUG oslo_concurrency.processutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b5f1bc30-6de7-46a0-8a4b-6f32146f61c0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5h26ld8n" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:37:24 compute-0 NetworkManager[51733]: <info>  [1759268244.3923] manager: (tap1208ea00-ed): new Tun device (/org/freedesktop/NetworkManager/Devices/192)
Sep 30 21:37:24 compute-0 kernel: tap1208ea00-ed: entered promiscuous mode
Sep 30 21:37:24 compute-0 nova_compute[192810]: 2025-09-30 21:37:24.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:24 compute-0 ovn_controller[94912]: 2025-09-30T21:37:24Z|00431|binding|INFO|Claiming lport 1208ea00-ed22-4f18-83a5-e09220676a9b for this chassis.
Sep 30 21:37:24 compute-0 ovn_controller[94912]: 2025-09-30T21:37:24Z|00432|binding|INFO|1208ea00-ed22-4f18-83a5-e09220676a9b: Claiming fa:16:3e:2b:2c:6a 10.100.0.7
Sep 30 21:37:24 compute-0 ovn_controller[94912]: 2025-09-30T21:37:24Z|00433|binding|INFO|Setting lport 1208ea00-ed22-4f18-83a5-e09220676a9b ovn-installed in OVS
Sep 30 21:37:24 compute-0 ovn_controller[94912]: 2025-09-30T21:37:24Z|00434|binding|INFO|Setting lport 1208ea00-ed22-4f18-83a5-e09220676a9b up in Southbound
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:24.427 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2b:2c:6a 10.100.0.7'], port_security=['fa:16:3e:2b:2c:6a 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'b5f1bc30-6de7-46a0-8a4b-6f32146f61c0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6815a373-206c-4f26-aa16-cc2d72d0f14d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '27ad5d27d7a44404987f6bf297897a45', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd191d10a-3559-4b1f-8832-a4135c99c39a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3a8f74c6-45db-4076-ad90-3fc15f718204, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=1208ea00-ed22-4f18-83a5-e09220676a9b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:24.428 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 1208ea00-ed22-4f18-83a5-e09220676a9b in datapath 6815a373-206c-4f26-aa16-cc2d72d0f14d bound to our chassis
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:24.429 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6815a373-206c-4f26-aa16-cc2d72d0f14d
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:24.443 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[783c6aad-4c28-42f3-863f-1328c45530f5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:24.444 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6815a373-21 in ovnmeta-6815a373-206c-4f26-aa16-cc2d72d0f14d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:24.472 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6815a373-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:24.472 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[c2479810-23fd-45f4-9076-93b0bde79cc7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:24 compute-0 nova_compute[192810]: 2025-09-30 21:37:24.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:24.473 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[9569dd74-1984-4fce-8b0b-9f0f62fe3492]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:24 compute-0 systemd-udevd[237430]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:24.486 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[4ad0f883-ead3-42db-907a-e36f14521aaa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:24 compute-0 systemd-machined[152794]: New machine qemu-54-instance-00000073.
Sep 30 21:37:24 compute-0 systemd[1]: Started Virtual Machine qemu-54-instance-00000073.
Sep 30 21:37:24 compute-0 NetworkManager[51733]: <info>  [1759268244.5005] device (tap1208ea00-ed): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:37:24 compute-0 NetworkManager[51733]: <info>  [1759268244.5014] device (tap1208ea00-ed): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:24.511 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[5481401e-9330-40b4-ac7a-79f38917b18a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:24.551 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[e13bc625-b357-45b2-a938-2e13dd52a26e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:24.555 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[ac28151a-36d4-4de6-b8e2-5471a8ff0ae6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:24 compute-0 systemd-udevd[237435]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:37:24 compute-0 NetworkManager[51733]: <info>  [1759268244.5588] manager: (tap6815a373-20): new Veth device (/org/freedesktop/NetworkManager/Devices/193)
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:24.590 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[4d5834f8-197b-496b-a54c-eeded31cdf65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:24.594 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[8124754b-cd76-4e4e-a438-d32972506f07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:24 compute-0 NetworkManager[51733]: <info>  [1759268244.6190] device (tap6815a373-20): carrier: link connected
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:24.625 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[48fe38d3-8011-4beb-b4e1-6e8a690a6381]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:24.642 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[dc1b3151-11e2-4c60-a7a6-8d22a2ab4f86]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6815a373-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:20:9d:fc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 129], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 494024, 'reachable_time': 33280, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237463, 'error': None, 'target': 'ovnmeta-6815a373-206c-4f26-aa16-cc2d72d0f14d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:24.656 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[07c6df96-2eec-4f5f-ad02-917d5f16a63e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe20:9dfc'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 494024, 'tstamp': 494024}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237464, 'error': None, 'target': 'ovnmeta-6815a373-206c-4f26-aa16-cc2d72d0f14d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:24.673 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[06900b43-ab37-4c7a-ac08-0f6c2961bf35]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6815a373-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:20:9d:fc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 129], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 494024, 'reachable_time': 33280, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 237465, 'error': None, 'target': 'ovnmeta-6815a373-206c-4f26-aa16-cc2d72d0f14d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:24.707 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e87037bd-d8de-4d1e-a048-9fbc177e967d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:24.795 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[0dd7edcc-4f32-4703-a64d-1149af2ed35d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:24.797 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6815a373-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:24.797 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:24.797 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6815a373-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:37:24 compute-0 nova_compute[192810]: 2025-09-30 21:37:24.800 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:24 compute-0 NetworkManager[51733]: <info>  [1759268244.8016] manager: (tap6815a373-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/194)
Sep 30 21:37:24 compute-0 kernel: tap6815a373-20: entered promiscuous mode
Sep 30 21:37:24 compute-0 nova_compute[192810]: 2025-09-30 21:37:24.804 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:24.805 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6815a373-20, col_values=(('external_ids', {'iface-id': '54338240-c8d5-4182-8bce-5e3e54cd794d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:37:24 compute-0 nova_compute[192810]: 2025-09-30 21:37:24.806 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:24 compute-0 ovn_controller[94912]: 2025-09-30T21:37:24Z|00435|binding|INFO|Releasing lport 54338240-c8d5-4182-8bce-5e3e54cd794d from this chassis (sb_readonly=0)
Sep 30 21:37:24 compute-0 nova_compute[192810]: 2025-09-30 21:37:24.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:24.833 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6815a373-206c-4f26-aa16-cc2d72d0f14d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6815a373-206c-4f26-aa16-cc2d72d0f14d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:24.834 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[993bb964-3bc3-48f8-a32e-5770730c6f43]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:24.834 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-6815a373-206c-4f26-aa16-cc2d72d0f14d
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/6815a373-206c-4f26-aa16-cc2d72d0f14d.pid.haproxy
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID 6815a373-206c-4f26-aa16-cc2d72d0f14d
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:37:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:24.836 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6815a373-206c-4f26-aa16-cc2d72d0f14d', 'env', 'PROCESS_TAG=haproxy-6815a373-206c-4f26-aa16-cc2d72d0f14d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6815a373-206c-4f26-aa16-cc2d72d0f14d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:37:25 compute-0 podman[237504]: 2025-09-30 21:37:25.260498297 +0000 UTC m=+0.055520935 container create a1b5b204c8a4c6dc481b2f0632d194b6d5ddbae7494c44e7bb0d94cd63db83a1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6815a373-206c-4f26-aa16-cc2d72d0f14d, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:37:25 compute-0 systemd[1]: Started libpod-conmon-a1b5b204c8a4c6dc481b2f0632d194b6d5ddbae7494c44e7bb0d94cd63db83a1.scope.
Sep 30 21:37:25 compute-0 podman[237504]: 2025-09-30 21:37:25.228569822 +0000 UTC m=+0.023592480 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:37:25 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:37:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/45ca80e6a370b6b001b70b43a90ec8218e0b878d7262bd223fb8e828739c7b56/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:37:25 compute-0 podman[237504]: 2025-09-30 21:37:25.368267993 +0000 UTC m=+0.163290641 container init a1b5b204c8a4c6dc481b2f0632d194b6d5ddbae7494c44e7bb0d94cd63db83a1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6815a373-206c-4f26-aa16-cc2d72d0f14d, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Sep 30 21:37:25 compute-0 podman[237504]: 2025-09-30 21:37:25.374924709 +0000 UTC m=+0.169947347 container start a1b5b204c8a4c6dc481b2f0632d194b6d5ddbae7494c44e7bb0d94cd63db83a1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6815a373-206c-4f26-aa16-cc2d72d0f14d, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Sep 30 21:37:25 compute-0 neutron-haproxy-ovnmeta-6815a373-206c-4f26-aa16-cc2d72d0f14d[237519]: [NOTICE]   (237523) : New worker (237525) forked
Sep 30 21:37:25 compute-0 neutron-haproxy-ovnmeta-6815a373-206c-4f26-aa16-cc2d72d0f14d[237519]: [NOTICE]   (237523) : Loading success.
Sep 30 21:37:25 compute-0 nova_compute[192810]: 2025-09-30 21:37:25.546 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268245.5457718, b5f1bc30-6de7-46a0-8a4b-6f32146f61c0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:37:25 compute-0 nova_compute[192810]: 2025-09-30 21:37:25.548 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] VM Started (Lifecycle Event)
Sep 30 21:37:25 compute-0 nova_compute[192810]: 2025-09-30 21:37:25.614 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:37:25 compute-0 nova_compute[192810]: 2025-09-30 21:37:25.621 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268245.5460527, b5f1bc30-6de7-46a0-8a4b-6f32146f61c0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:37:25 compute-0 nova_compute[192810]: 2025-09-30 21:37:25.622 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] VM Paused (Lifecycle Event)
Sep 30 21:37:25 compute-0 nova_compute[192810]: 2025-09-30 21:37:25.675 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:37:25 compute-0 nova_compute[192810]: 2025-09-30 21:37:25.678 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:37:25 compute-0 nova_compute[192810]: 2025-09-30 21:37:25.711 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:37:25 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:25.769 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:37:25 compute-0 nova_compute[192810]: 2025-09-30 21:37:25.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:25 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:25.771 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.236 2 DEBUG nova.compute.manager [req-c1290f9d-86ac-4a7d-8429-b7b5b2e0fad3 req-011d7f81-13b7-4cb0-a7bf-7ce6502c086c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Received event network-vif-plugged-1208ea00-ed22-4f18-83a5-e09220676a9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.237 2 DEBUG oslo_concurrency.lockutils [req-c1290f9d-86ac-4a7d-8429-b7b5b2e0fad3 req-011d7f81-13b7-4cb0-a7bf-7ce6502c086c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "b5f1bc30-6de7-46a0-8a4b-6f32146f61c0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.238 2 DEBUG oslo_concurrency.lockutils [req-c1290f9d-86ac-4a7d-8429-b7b5b2e0fad3 req-011d7f81-13b7-4cb0-a7bf-7ce6502c086c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b5f1bc30-6de7-46a0-8a4b-6f32146f61c0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.238 2 DEBUG oslo_concurrency.lockutils [req-c1290f9d-86ac-4a7d-8429-b7b5b2e0fad3 req-011d7f81-13b7-4cb0-a7bf-7ce6502c086c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b5f1bc30-6de7-46a0-8a4b-6f32146f61c0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.238 2 DEBUG nova.compute.manager [req-c1290f9d-86ac-4a7d-8429-b7b5b2e0fad3 req-011d7f81-13b7-4cb0-a7bf-7ce6502c086c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Processing event network-vif-plugged-1208ea00-ed22-4f18-83a5-e09220676a9b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.238 2 DEBUG nova.compute.manager [req-c1290f9d-86ac-4a7d-8429-b7b5b2e0fad3 req-011d7f81-13b7-4cb0-a7bf-7ce6502c086c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Received event network-vif-plugged-1208ea00-ed22-4f18-83a5-e09220676a9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.239 2 DEBUG oslo_concurrency.lockutils [req-c1290f9d-86ac-4a7d-8429-b7b5b2e0fad3 req-011d7f81-13b7-4cb0-a7bf-7ce6502c086c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "b5f1bc30-6de7-46a0-8a4b-6f32146f61c0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.239 2 DEBUG oslo_concurrency.lockutils [req-c1290f9d-86ac-4a7d-8429-b7b5b2e0fad3 req-011d7f81-13b7-4cb0-a7bf-7ce6502c086c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b5f1bc30-6de7-46a0-8a4b-6f32146f61c0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.239 2 DEBUG oslo_concurrency.lockutils [req-c1290f9d-86ac-4a7d-8429-b7b5b2e0fad3 req-011d7f81-13b7-4cb0-a7bf-7ce6502c086c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b5f1bc30-6de7-46a0-8a4b-6f32146f61c0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.239 2 DEBUG nova.compute.manager [req-c1290f9d-86ac-4a7d-8429-b7b5b2e0fad3 req-011d7f81-13b7-4cb0-a7bf-7ce6502c086c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] No waiting events found dispatching network-vif-plugged-1208ea00-ed22-4f18-83a5-e09220676a9b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.240 2 WARNING nova.compute.manager [req-c1290f9d-86ac-4a7d-8429-b7b5b2e0fad3 req-011d7f81-13b7-4cb0-a7bf-7ce6502c086c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Received unexpected event network-vif-plugged-1208ea00-ed22-4f18-83a5-e09220676a9b for instance with vm_state building and task_state spawning.
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.241 2 DEBUG nova.compute.manager [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.245 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268246.2452142, b5f1bc30-6de7-46a0-8a4b-6f32146f61c0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.245 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] VM Resumed (Lifecycle Event)
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.249 2 DEBUG nova.virt.libvirt.driver [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.252 2 INFO nova.virt.libvirt.driver [-] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Instance spawned successfully.
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.252 2 DEBUG nova.virt.libvirt.driver [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.318 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.325 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.395 2 DEBUG oslo_concurrency.lockutils [None req-ea10fd5a-fff1-4b9e-a54f-0a917170e930 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Acquiring lock "01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.397 2 DEBUG oslo_concurrency.lockutils [None req-ea10fd5a-fff1-4b9e-a54f-0a917170e930 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Lock "01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.398 2 DEBUG oslo_concurrency.lockutils [None req-ea10fd5a-fff1-4b9e-a54f-0a917170e930 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Acquiring lock "01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.398 2 DEBUG oslo_concurrency.lockutils [None req-ea10fd5a-fff1-4b9e-a54f-0a917170e930 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Lock "01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.399 2 DEBUG oslo_concurrency.lockutils [None req-ea10fd5a-fff1-4b9e-a54f-0a917170e930 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Lock "01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.405 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.412 2 DEBUG nova.virt.libvirt.driver [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.413 2 DEBUG nova.virt.libvirt.driver [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.414 2 DEBUG nova.virt.libvirt.driver [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.415 2 DEBUG nova.virt.libvirt.driver [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.416 2 DEBUG nova.virt.libvirt.driver [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.417 2 DEBUG nova.virt.libvirt.driver [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.425 2 DEBUG nova.network.neutron [req-c7aad8f9-aa21-4ef4-82fa-8a85f8da1bdf req-f5558a25-6488-4c3c-b088-19e6e538da39 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Updated VIF entry in instance network info cache for port 1208ea00-ed22-4f18-83a5-e09220676a9b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.426 2 DEBUG nova.network.neutron [req-c7aad8f9-aa21-4ef4-82fa-8a85f8da1bdf req-f5558a25-6488-4c3c-b088-19e6e538da39 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Updating instance_info_cache with network_info: [{"id": "1208ea00-ed22-4f18-83a5-e09220676a9b", "address": "fa:16:3e:2b:2c:6a", "network": {"id": "6815a373-206c-4f26-aa16-cc2d72d0f14d", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1587191426-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27ad5d27d7a44404987f6bf297897a45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1208ea00-ed", "ovs_interfaceid": "1208ea00-ed22-4f18-83a5-e09220676a9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.466 2 INFO nova.compute.manager [None req-ea10fd5a-fff1-4b9e-a54f-0a917170e930 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Terminating instance
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.481 2 DEBUG nova.compute.manager [None req-ea10fd5a-fff1-4b9e-a54f-0a917170e930 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.492 2 DEBUG oslo_concurrency.lockutils [req-c7aad8f9-aa21-4ef4-82fa-8a85f8da1bdf req-f5558a25-6488-4c3c-b088-19e6e538da39 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-b5f1bc30-6de7-46a0-8a4b-6f32146f61c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:37:26 compute-0 kernel: tap7e8f7bb1-f1 (unregistering): left promiscuous mode
Sep 30 21:37:26 compute-0 NetworkManager[51733]: <info>  [1759268246.5135] device (tap7e8f7bb1-f1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:37:26 compute-0 ovn_controller[94912]: 2025-09-30T21:37:26Z|00436|binding|INFO|Releasing lport 7e8f7bb1-f155-40d6-8e85-f4222da3c027 from this chassis (sb_readonly=0)
Sep 30 21:37:26 compute-0 ovn_controller[94912]: 2025-09-30T21:37:26Z|00437|binding|INFO|Setting lport 7e8f7bb1-f155-40d6-8e85-f4222da3c027 down in Southbound
Sep 30 21:37:26 compute-0 ovn_controller[94912]: 2025-09-30T21:37:26Z|00438|binding|INFO|Removing iface tap7e8f7bb1-f1 ovn-installed in OVS
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.540 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:26.565 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:86:ba:8b 10.100.0.7'], port_security=['fa:16:3e:86:ba:8b 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6f659d19-7261-4309-81bc-b7bb9a9da219', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '793bce1316e34dada7414560b74789f4', 'neutron:revision_number': '8', 'neutron:security_group_ids': '5fd3b1da-f9d4-4128-be3a-89511bb35b84', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=63360a12-7489-4e6b-8985-1b0f556f3468, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=7e8f7bb1-f155-40d6-8e85-f4222da3c027) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:37:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:26.566 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 7e8f7bb1-f155-40d6-8e85-f4222da3c027 in datapath 6f659d19-7261-4309-81bc-b7bb9a9da219 unbound from our chassis
Sep 30 21:37:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:26.567 103867 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6f659d19-7261-4309-81bc-b7bb9a9da219 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Sep 30 21:37:26 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:26.568 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[898fa813-df8a-48fe-88a1-973410353215]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:26 compute-0 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d0000006e.scope: Deactivated successfully.
Sep 30 21:37:26 compute-0 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d0000006e.scope: Consumed 12.705s CPU time.
Sep 30 21:37:26 compute-0 systemd-machined[152794]: Machine qemu-53-instance-0000006e terminated.
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.604 2 INFO nova.compute.manager [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Took 6.47 seconds to spawn the instance on the hypervisor.
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.605 2 DEBUG nova.compute.manager [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.784 2 INFO nova.compute.manager [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Took 7.35 seconds to build instance.
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.792 2 INFO nova.virt.libvirt.driver [-] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Instance destroyed successfully.
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.792 2 DEBUG nova.objects.instance [None req-ea10fd5a-fff1-4b9e-a54f-0a917170e930 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Lazy-loading 'resources' on Instance uuid 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.811 2 DEBUG nova.virt.libvirt.vif [None req-ea10fd5a-fff1-4b9e-a54f-0a917170e930 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:36:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSONUnderV235-server-76631792',display_name='tempest-ServerRescueTestJSONUnderV235-server-76631792',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverrescuetestjsonunderv235-server-76631792',id=110,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:37:06Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='793bce1316e34dada7414560b74789f4',ramdisk_id='',reservation_id='r-pjsi62w2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='vir
tio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSONUnderV235-672740030',owner_user_name='tempest-ServerRescueTestJSONUnderV235-672740030-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:37:06Z,user_data=None,user_id='90078b78b6fc4222ab0254ae1eda2500',uuid=01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='rescued') vif={"id": "7e8f7bb1-f155-40d6-8e85-f4222da3c027", "address": "fa:16:3e:86:ba:8b", "network": {"id": "6f659d19-7261-4309-81bc-b7bb9a9da219", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1930279008-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "793bce1316e34dada7414560b74789f4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e8f7bb1-f1", "ovs_interfaceid": "7e8f7bb1-f155-40d6-8e85-f4222da3c027", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.811 2 DEBUG nova.network.os_vif_util [None req-ea10fd5a-fff1-4b9e-a54f-0a917170e930 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Converting VIF {"id": "7e8f7bb1-f155-40d6-8e85-f4222da3c027", "address": "fa:16:3e:86:ba:8b", "network": {"id": "6f659d19-7261-4309-81bc-b7bb9a9da219", "bridge": "br-int", "label": "tempest-ServerRescueTestJSONUnderV235-1930279008-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "793bce1316e34dada7414560b74789f4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7e8f7bb1-f1", "ovs_interfaceid": "7e8f7bb1-f155-40d6-8e85-f4222da3c027", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.812 2 DEBUG nova.network.os_vif_util [None req-ea10fd5a-fff1-4b9e-a54f-0a917170e930 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:86:ba:8b,bridge_name='br-int',has_traffic_filtering=True,id=7e8f7bb1-f155-40d6-8e85-f4222da3c027,network=Network(6f659d19-7261-4309-81bc-b7bb9a9da219),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7e8f7bb1-f1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.813 2 DEBUG os_vif [None req-ea10fd5a-fff1-4b9e-a54f-0a917170e930 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:86:ba:8b,bridge_name='br-int',has_traffic_filtering=True,id=7e8f7bb1-f155-40d6-8e85-f4222da3c027,network=Network(6f659d19-7261-4309-81bc-b7bb9a9da219),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7e8f7bb1-f1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.815 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7e8f7bb1-f1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.817 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.819 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.822 2 INFO os_vif [None req-ea10fd5a-fff1-4b9e-a54f-0a917170e930 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:86:ba:8b,bridge_name='br-int',has_traffic_filtering=True,id=7e8f7bb1-f155-40d6-8e85-f4222da3c027,network=Network(6f659d19-7261-4309-81bc-b7bb9a9da219),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7e8f7bb1-f1')
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.823 2 INFO nova.virt.libvirt.driver [None req-ea10fd5a-fff1-4b9e-a54f-0a917170e930 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Deleting instance files /var/lib/nova/instances/01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6_del
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.825 2 INFO nova.virt.libvirt.driver [None req-ea10fd5a-fff1-4b9e-a54f-0a917170e930 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Deletion of /var/lib/nova/instances/01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6_del complete
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.832 2 DEBUG oslo_concurrency.lockutils [None req-9c41c29a-5ade-450c-be9d-6d47cb310407 cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Lock "b5f1bc30-6de7-46a0-8a4b-6f32146f61c0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.523s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.935 2 DEBUG nova.compute.manager [req-369b2e81-5e49-4779-9971-fd6c9ec1826b req-cc74e964-954c-43f5-b212-539329eadb46 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Received event network-vif-unplugged-7e8f7bb1-f155-40d6-8e85-f4222da3c027 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.936 2 DEBUG oslo_concurrency.lockutils [req-369b2e81-5e49-4779-9971-fd6c9ec1826b req-cc74e964-954c-43f5-b212-539329eadb46 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.936 2 DEBUG oslo_concurrency.lockutils [req-369b2e81-5e49-4779-9971-fd6c9ec1826b req-cc74e964-954c-43f5-b212-539329eadb46 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.936 2 DEBUG oslo_concurrency.lockutils [req-369b2e81-5e49-4779-9971-fd6c9ec1826b req-cc74e964-954c-43f5-b212-539329eadb46 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.937 2 DEBUG nova.compute.manager [req-369b2e81-5e49-4779-9971-fd6c9ec1826b req-cc74e964-954c-43f5-b212-539329eadb46 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] No waiting events found dispatching network-vif-unplugged-7e8f7bb1-f155-40d6-8e85-f4222da3c027 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:37:26 compute-0 nova_compute[192810]: 2025-09-30 21:37:26.937 2 DEBUG nova.compute.manager [req-369b2e81-5e49-4779-9971-fd6c9ec1826b req-cc74e964-954c-43f5-b212-539329eadb46 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Received event network-vif-unplugged-7e8f7bb1-f155-40d6-8e85-f4222da3c027 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:37:27 compute-0 nova_compute[192810]: 2025-09-30 21:37:27.068 2 INFO nova.compute.manager [None req-ea10fd5a-fff1-4b9e-a54f-0a917170e930 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Took 0.59 seconds to destroy the instance on the hypervisor.
Sep 30 21:37:27 compute-0 nova_compute[192810]: 2025-09-30 21:37:27.069 2 DEBUG oslo.service.loopingcall [None req-ea10fd5a-fff1-4b9e-a54f-0a917170e930 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:37:27 compute-0 nova_compute[192810]: 2025-09-30 21:37:27.069 2 DEBUG nova.compute.manager [-] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:37:27 compute-0 nova_compute[192810]: 2025-09-30 21:37:27.069 2 DEBUG nova.network.neutron [-] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:37:27 compute-0 nova_compute[192810]: 2025-09-30 21:37:27.643 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:28 compute-0 podman[237561]: 2025-09-30 21:37:28.344238219 +0000 UTC m=+0.066785206 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, 
container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:37:28 compute-0 podman[237560]: 2025-09-30 21:37:28.369083578 +0000 UTC m=+0.098400014 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250923)
Sep 30 21:37:28 compute-0 podman[237559]: 2025-09-30 21:37:28.371076538 +0000 UTC m=+0.097956993 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:37:29 compute-0 nova_compute[192810]: 2025-09-30 21:37:29.736 2 DEBUG nova.network.neutron [-] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:37:29 compute-0 nova_compute[192810]: 2025-09-30 21:37:29.834 2 DEBUG nova.compute.manager [req-0e741d3f-10df-41b9-9488-15d41e526a1d req-d421401f-4aa6-4185-8bd8-1bd5360d8691 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Received event network-vif-plugged-7e8f7bb1-f155-40d6-8e85-f4222da3c027 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:37:29 compute-0 nova_compute[192810]: 2025-09-30 21:37:29.834 2 DEBUG oslo_concurrency.lockutils [req-0e741d3f-10df-41b9-9488-15d41e526a1d req-d421401f-4aa6-4185-8bd8-1bd5360d8691 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:29 compute-0 nova_compute[192810]: 2025-09-30 21:37:29.834 2 DEBUG oslo_concurrency.lockutils [req-0e741d3f-10df-41b9-9488-15d41e526a1d req-d421401f-4aa6-4185-8bd8-1bd5360d8691 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:29 compute-0 nova_compute[192810]: 2025-09-30 21:37:29.835 2 DEBUG oslo_concurrency.lockutils [req-0e741d3f-10df-41b9-9488-15d41e526a1d req-d421401f-4aa6-4185-8bd8-1bd5360d8691 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:29 compute-0 nova_compute[192810]: 2025-09-30 21:37:29.835 2 DEBUG nova.compute.manager [req-0e741d3f-10df-41b9-9488-15d41e526a1d req-d421401f-4aa6-4185-8bd8-1bd5360d8691 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] No waiting events found dispatching network-vif-plugged-7e8f7bb1-f155-40d6-8e85-f4222da3c027 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:37:29 compute-0 nova_compute[192810]: 2025-09-30 21:37:29.835 2 WARNING nova.compute.manager [req-0e741d3f-10df-41b9-9488-15d41e526a1d req-d421401f-4aa6-4185-8bd8-1bd5360d8691 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Received unexpected event network-vif-plugged-7e8f7bb1-f155-40d6-8e85-f4222da3c027 for instance with vm_state rescued and task_state deleting.
Sep 30 21:37:29 compute-0 nova_compute[192810]: 2025-09-30 21:37:29.901 2 INFO nova.compute.manager [-] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Took 2.83 seconds to deallocate network for instance.
Sep 30 21:37:30 compute-0 nova_compute[192810]: 2025-09-30 21:37:30.360 2 DEBUG nova.compute.manager [req-627dfd03-4fb2-43ef-b56c-9103da5246e1 req-932ca5e4-4667-4ccd-860b-eb5a52df3945 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Received event network-vif-deleted-7e8f7bb1-f155-40d6-8e85-f4222da3c027 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:37:30 compute-0 nova_compute[192810]: 2025-09-30 21:37:30.597 2 DEBUG oslo_concurrency.lockutils [None req-ea10fd5a-fff1-4b9e-a54f-0a917170e930 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:30 compute-0 nova_compute[192810]: 2025-09-30 21:37:30.598 2 DEBUG oslo_concurrency.lockutils [None req-ea10fd5a-fff1-4b9e-a54f-0a917170e930 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:30 compute-0 nova_compute[192810]: 2025-09-30 21:37:30.703 2 DEBUG nova.compute.provider_tree [None req-ea10fd5a-fff1-4b9e-a54f-0a917170e930 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:37:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:30.773 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:37:30 compute-0 nova_compute[192810]: 2025-09-30 21:37:30.782 2 DEBUG nova.scheduler.client.report [None req-ea10fd5a-fff1-4b9e-a54f-0a917170e930 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:37:30 compute-0 nova_compute[192810]: 2025-09-30 21:37:30.910 2 DEBUG oslo_concurrency.lockutils [None req-ea10fd5a-fff1-4b9e-a54f-0a917170e930 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.312s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:30 compute-0 nova_compute[192810]: 2025-09-30 21:37:30.951 2 INFO nova.scheduler.client.report [None req-ea10fd5a-fff1-4b9e-a54f-0a917170e930 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Deleted allocations for instance 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6
Sep 30 21:37:31 compute-0 nova_compute[192810]: 2025-09-30 21:37:31.318 2 DEBUG oslo_concurrency.lockutils [None req-ea10fd5a-fff1-4b9e-a54f-0a917170e930 90078b78b6fc4222ab0254ae1eda2500 793bce1316e34dada7414560b74789f4 - - default default] Lock "01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.921s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:31 compute-0 nova_compute[192810]: 2025-09-30 21:37:31.819 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:32 compute-0 nova_compute[192810]: 2025-09-30 21:37:32.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:36 compute-0 nova_compute[192810]: 2025-09-30 21:37:36.822 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:37 compute-0 nova_compute[192810]: 2025-09-30 21:37:37.646 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:37 compute-0 ovn_controller[94912]: 2025-09-30T21:37:37Z|00439|binding|INFO|Releasing lport 54338240-c8d5-4182-8bce-5e3e54cd794d from this chassis (sb_readonly=0)
Sep 30 21:37:37 compute-0 nova_compute[192810]: 2025-09-30 21:37:37.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:37 compute-0 ovn_controller[94912]: 2025-09-30T21:37:37Z|00440|binding|INFO|Releasing lport 54338240-c8d5-4182-8bce-5e3e54cd794d from this chassis (sb_readonly=0)
Sep 30 21:37:37 compute-0 nova_compute[192810]: 2025-09-30 21:37:37.915 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:38.743 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:38.743 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:38.744 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:39 compute-0 podman[237635]: 2025-09-30 21:37:39.317436772 +0000 UTC m=+0.057698369 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:37:39 compute-0 podman[237636]: 2025-09-30 21:37:39.346267121 +0000 UTC m=+0.085073102 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=ubi9-minimal, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, release=1755695350, config_id=edpm, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., version=9.6)
Sep 30 21:37:41 compute-0 nova_compute[192810]: 2025-09-30 21:37:41.789 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268246.7857544, 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:37:41 compute-0 nova_compute[192810]: 2025-09-30 21:37:41.789 2 INFO nova.compute.manager [-] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] VM Stopped (Lifecycle Event)
Sep 30 21:37:41 compute-0 nova_compute[192810]: 2025-09-30 21:37:41.815 2 DEBUG nova.compute.manager [None req-b65534e4-98e3-4cf7-9214-c985c35961e5 - - - - - -] [instance: 01e91d9c-ae57-4e8e-8c2f-e13c0dc3cfc6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:37:41 compute-0 nova_compute[192810]: 2025-09-30 21:37:41.825 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:42 compute-0 nova_compute[192810]: 2025-09-30 21:37:42.648 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:44 compute-0 nova_compute[192810]: 2025-09-30 21:37:44.888 2 DEBUG oslo_concurrency.lockutils [None req-5fa85bfb-bc04-4dfd-8566-9d4e10c9f4fb cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Acquiring lock "b5f1bc30-6de7-46a0-8a4b-6f32146f61c0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:44 compute-0 nova_compute[192810]: 2025-09-30 21:37:44.889 2 DEBUG oslo_concurrency.lockutils [None req-5fa85bfb-bc04-4dfd-8566-9d4e10c9f4fb cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Lock "b5f1bc30-6de7-46a0-8a4b-6f32146f61c0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:44 compute-0 nova_compute[192810]: 2025-09-30 21:37:44.889 2 DEBUG oslo_concurrency.lockutils [None req-5fa85bfb-bc04-4dfd-8566-9d4e10c9f4fb cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Acquiring lock "b5f1bc30-6de7-46a0-8a4b-6f32146f61c0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:44 compute-0 nova_compute[192810]: 2025-09-30 21:37:44.889 2 DEBUG oslo_concurrency.lockutils [None req-5fa85bfb-bc04-4dfd-8566-9d4e10c9f4fb cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Lock "b5f1bc30-6de7-46a0-8a4b-6f32146f61c0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:44 compute-0 nova_compute[192810]: 2025-09-30 21:37:44.889 2 DEBUG oslo_concurrency.lockutils [None req-5fa85bfb-bc04-4dfd-8566-9d4e10c9f4fb cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Lock "b5f1bc30-6de7-46a0-8a4b-6f32146f61c0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:44 compute-0 nova_compute[192810]: 2025-09-30 21:37:44.901 2 INFO nova.compute.manager [None req-5fa85bfb-bc04-4dfd-8566-9d4e10c9f4fb cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Terminating instance
Sep 30 21:37:44 compute-0 nova_compute[192810]: 2025-09-30 21:37:44.911 2 DEBUG nova.compute.manager [None req-5fa85bfb-bc04-4dfd-8566-9d4e10c9f4fb cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:37:44 compute-0 kernel: tap1208ea00-ed (unregistering): left promiscuous mode
Sep 30 21:37:44 compute-0 NetworkManager[51733]: <info>  [1759268264.9463] device (tap1208ea00-ed): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:37:44 compute-0 ovn_controller[94912]: 2025-09-30T21:37:44Z|00441|binding|INFO|Releasing lport 1208ea00-ed22-4f18-83a5-e09220676a9b from this chassis (sb_readonly=0)
Sep 30 21:37:44 compute-0 ovn_controller[94912]: 2025-09-30T21:37:44Z|00442|binding|INFO|Setting lport 1208ea00-ed22-4f18-83a5-e09220676a9b down in Southbound
Sep 30 21:37:44 compute-0 ovn_controller[94912]: 2025-09-30T21:37:44Z|00443|binding|INFO|Removing iface tap1208ea00-ed ovn-installed in OVS
Sep 30 21:37:44 compute-0 nova_compute[192810]: 2025-09-30 21:37:44.955 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:44.966 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2b:2c:6a 10.100.0.7'], port_security=['fa:16:3e:2b:2c:6a 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'b5f1bc30-6de7-46a0-8a4b-6f32146f61c0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6815a373-206c-4f26-aa16-cc2d72d0f14d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '27ad5d27d7a44404987f6bf297897a45', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd191d10a-3559-4b1f-8832-a4135c99c39a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3a8f74c6-45db-4076-ad90-3fc15f718204, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=1208ea00-ed22-4f18-83a5-e09220676a9b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:37:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:44.968 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 1208ea00-ed22-4f18-83a5-e09220676a9b in datapath 6815a373-206c-4f26-aa16-cc2d72d0f14d unbound from our chassis
Sep 30 21:37:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:44.968 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6815a373-206c-4f26-aa16-cc2d72d0f14d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:37:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:44.970 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[cca6e447-3566-4c39-90b7-5b0f952c96fc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:44.971 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6815a373-206c-4f26-aa16-cc2d72d0f14d namespace which is not needed anymore
Sep 30 21:37:44 compute-0 nova_compute[192810]: 2025-09-30 21:37:44.971 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:45 compute-0 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d00000073.scope: Deactivated successfully.
Sep 30 21:37:45 compute-0 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d00000073.scope: Consumed 13.545s CPU time.
Sep 30 21:37:45 compute-0 systemd-machined[152794]: Machine qemu-54-instance-00000073 terminated.
Sep 30 21:37:45 compute-0 podman[237690]: 2025-09-30 21:37:45.036430867 +0000 UTC m=+0.054479199 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Sep 30 21:37:45 compute-0 podman[237686]: 2025-09-30 21:37:45.037239147 +0000 UTC m=+0.060773006 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923)
Sep 30 21:37:45 compute-0 podman[237691]: 2025-09-30 21:37:45.052489517 +0000 UTC m=+0.060587961 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:37:45 compute-0 nova_compute[192810]: 2025-09-30 21:37:45.170 2 INFO nova.virt.libvirt.driver [-] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Instance destroyed successfully.
Sep 30 21:37:45 compute-0 nova_compute[192810]: 2025-09-30 21:37:45.171 2 DEBUG nova.objects.instance [None req-5fa85bfb-bc04-4dfd-8566-9d4e10c9f4fb cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Lazy-loading 'resources' on Instance uuid b5f1bc30-6de7-46a0-8a4b-6f32146f61c0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:37:45 compute-0 neutron-haproxy-ovnmeta-6815a373-206c-4f26-aa16-cc2d72d0f14d[237519]: [NOTICE]   (237523) : haproxy version is 2.8.14-c23fe91
Sep 30 21:37:45 compute-0 neutron-haproxy-ovnmeta-6815a373-206c-4f26-aa16-cc2d72d0f14d[237519]: [NOTICE]   (237523) : path to executable is /usr/sbin/haproxy
Sep 30 21:37:45 compute-0 neutron-haproxy-ovnmeta-6815a373-206c-4f26-aa16-cc2d72d0f14d[237519]: [WARNING]  (237523) : Exiting Master process...
Sep 30 21:37:45 compute-0 neutron-haproxy-ovnmeta-6815a373-206c-4f26-aa16-cc2d72d0f14d[237519]: [WARNING]  (237523) : Exiting Master process...
Sep 30 21:37:45 compute-0 neutron-haproxy-ovnmeta-6815a373-206c-4f26-aa16-cc2d72d0f14d[237519]: [ALERT]    (237523) : Current worker (237525) exited with code 143 (Terminated)
Sep 30 21:37:45 compute-0 neutron-haproxy-ovnmeta-6815a373-206c-4f26-aa16-cc2d72d0f14d[237519]: [WARNING]  (237523) : All workers exited. Exiting... (0)
Sep 30 21:37:45 compute-0 systemd[1]: libpod-a1b5b204c8a4c6dc481b2f0632d194b6d5ddbae7494c44e7bb0d94cd63db83a1.scope: Deactivated successfully.
Sep 30 21:37:45 compute-0 podman[237768]: 2025-09-30 21:37:45.184118958 +0000 UTC m=+0.120402662 container died a1b5b204c8a4c6dc481b2f0632d194b6d5ddbae7494c44e7bb0d94cd63db83a1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6815a373-206c-4f26-aa16-cc2d72d0f14d, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 21:37:45 compute-0 nova_compute[192810]: 2025-09-30 21:37:45.185 2 DEBUG nova.virt.libvirt.vif [None req-5fa85bfb-bc04-4dfd-8566-9d4e10c9f4fb cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:37:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServersNegativeTestJSON-server-2130176929',display_name='tempest-ListServersNegativeTestJSON-server-2130176929-3',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-listserversnegativetestjson-server-2130176929-3',id=115,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=2,launched_at=2025-09-30T21:37:26Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='27ad5d27d7a44404987f6bf297897a45',ramdisk_id='',reservation_id='r-msrdvef1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model=
'virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServersNegativeTestJSON-1844496645',owner_user_name='tempest-ListServersNegativeTestJSON-1844496645-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:37:26Z,user_data=None,user_id='cbebeb1f78b64ee09e2da39a04f9f282',uuid=b5f1bc30-6de7-46a0-8a4b-6f32146f61c0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1208ea00-ed22-4f18-83a5-e09220676a9b", "address": "fa:16:3e:2b:2c:6a", "network": {"id": "6815a373-206c-4f26-aa16-cc2d72d0f14d", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1587191426-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27ad5d27d7a44404987f6bf297897a45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1208ea00-ed", "ovs_interfaceid": "1208ea00-ed22-4f18-83a5-e09220676a9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:37:45 compute-0 nova_compute[192810]: 2025-09-30 21:37:45.186 2 DEBUG nova.network.os_vif_util [None req-5fa85bfb-bc04-4dfd-8566-9d4e10c9f4fb cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Converting VIF {"id": "1208ea00-ed22-4f18-83a5-e09220676a9b", "address": "fa:16:3e:2b:2c:6a", "network": {"id": "6815a373-206c-4f26-aa16-cc2d72d0f14d", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-1587191426-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "27ad5d27d7a44404987f6bf297897a45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1208ea00-ed", "ovs_interfaceid": "1208ea00-ed22-4f18-83a5-e09220676a9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:37:45 compute-0 nova_compute[192810]: 2025-09-30 21:37:45.186 2 DEBUG nova.network.os_vif_util [None req-5fa85bfb-bc04-4dfd-8566-9d4e10c9f4fb cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2b:2c:6a,bridge_name='br-int',has_traffic_filtering=True,id=1208ea00-ed22-4f18-83a5-e09220676a9b,network=Network(6815a373-206c-4f26-aa16-cc2d72d0f14d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1208ea00-ed') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:37:45 compute-0 nova_compute[192810]: 2025-09-30 21:37:45.186 2 DEBUG os_vif [None req-5fa85bfb-bc04-4dfd-8566-9d4e10c9f4fb cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:2c:6a,bridge_name='br-int',has_traffic_filtering=True,id=1208ea00-ed22-4f18-83a5-e09220676a9b,network=Network(6815a373-206c-4f26-aa16-cc2d72d0f14d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1208ea00-ed') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:37:45 compute-0 nova_compute[192810]: 2025-09-30 21:37:45.187 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:45 compute-0 nova_compute[192810]: 2025-09-30 21:37:45.188 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1208ea00-ed, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:37:45 compute-0 nova_compute[192810]: 2025-09-30 21:37:45.191 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:37:45 compute-0 nova_compute[192810]: 2025-09-30 21:37:45.193 2 INFO os_vif [None req-5fa85bfb-bc04-4dfd-8566-9d4e10c9f4fb cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:2c:6a,bridge_name='br-int',has_traffic_filtering=True,id=1208ea00-ed22-4f18-83a5-e09220676a9b,network=Network(6815a373-206c-4f26-aa16-cc2d72d0f14d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1208ea00-ed')
Sep 30 21:37:45 compute-0 nova_compute[192810]: 2025-09-30 21:37:45.193 2 INFO nova.virt.libvirt.driver [None req-5fa85bfb-bc04-4dfd-8566-9d4e10c9f4fb cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Deleting instance files /var/lib/nova/instances/b5f1bc30-6de7-46a0-8a4b-6f32146f61c0_del
Sep 30 21:37:45 compute-0 nova_compute[192810]: 2025-09-30 21:37:45.194 2 INFO nova.virt.libvirt.driver [None req-5fa85bfb-bc04-4dfd-8566-9d4e10c9f4fb cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Deletion of /var/lib/nova/instances/b5f1bc30-6de7-46a0-8a4b-6f32146f61c0_del complete
Sep 30 21:37:45 compute-0 nova_compute[192810]: 2025-09-30 21:37:45.275 2 INFO nova.compute.manager [None req-5fa85bfb-bc04-4dfd-8566-9d4e10c9f4fb cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Took 0.36 seconds to destroy the instance on the hypervisor.
Sep 30 21:37:45 compute-0 nova_compute[192810]: 2025-09-30 21:37:45.275 2 DEBUG oslo.service.loopingcall [None req-5fa85bfb-bc04-4dfd-8566-9d4e10c9f4fb cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:37:45 compute-0 nova_compute[192810]: 2025-09-30 21:37:45.276 2 DEBUG nova.compute.manager [-] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:37:45 compute-0 nova_compute[192810]: 2025-09-30 21:37:45.276 2 DEBUG nova.network.neutron [-] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:37:45 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a1b5b204c8a4c6dc481b2f0632d194b6d5ddbae7494c44e7bb0d94cd63db83a1-userdata-shm.mount: Deactivated successfully.
Sep 30 21:37:45 compute-0 systemd[1]: var-lib-containers-storage-overlay-45ca80e6a370b6b001b70b43a90ec8218e0b878d7262bd223fb8e828739c7b56-merged.mount: Deactivated successfully.
Sep 30 21:37:45 compute-0 nova_compute[192810]: 2025-09-30 21:37:45.545 2 DEBUG nova.compute.manager [req-1ca19594-d9c7-4cc2-90f1-44851fd13a92 req-7e6d9d84-7b85-41c1-acda-c3739d2eeb8a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Received event network-vif-unplugged-1208ea00-ed22-4f18-83a5-e09220676a9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:37:45 compute-0 nova_compute[192810]: 2025-09-30 21:37:45.545 2 DEBUG oslo_concurrency.lockutils [req-1ca19594-d9c7-4cc2-90f1-44851fd13a92 req-7e6d9d84-7b85-41c1-acda-c3739d2eeb8a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "b5f1bc30-6de7-46a0-8a4b-6f32146f61c0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:45 compute-0 nova_compute[192810]: 2025-09-30 21:37:45.545 2 DEBUG oslo_concurrency.lockutils [req-1ca19594-d9c7-4cc2-90f1-44851fd13a92 req-7e6d9d84-7b85-41c1-acda-c3739d2eeb8a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b5f1bc30-6de7-46a0-8a4b-6f32146f61c0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:45 compute-0 nova_compute[192810]: 2025-09-30 21:37:45.546 2 DEBUG oslo_concurrency.lockutils [req-1ca19594-d9c7-4cc2-90f1-44851fd13a92 req-7e6d9d84-7b85-41c1-acda-c3739d2eeb8a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b5f1bc30-6de7-46a0-8a4b-6f32146f61c0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:45 compute-0 nova_compute[192810]: 2025-09-30 21:37:45.546 2 DEBUG nova.compute.manager [req-1ca19594-d9c7-4cc2-90f1-44851fd13a92 req-7e6d9d84-7b85-41c1-acda-c3739d2eeb8a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] No waiting events found dispatching network-vif-unplugged-1208ea00-ed22-4f18-83a5-e09220676a9b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:37:45 compute-0 nova_compute[192810]: 2025-09-30 21:37:45.546 2 DEBUG nova.compute.manager [req-1ca19594-d9c7-4cc2-90f1-44851fd13a92 req-7e6d9d84-7b85-41c1-acda-c3739d2eeb8a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Received event network-vif-unplugged-1208ea00-ed22-4f18-83a5-e09220676a9b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:37:45 compute-0 podman[237768]: 2025-09-30 21:37:45.55408937 +0000 UTC m=+0.490373074 container cleanup a1b5b204c8a4c6dc481b2f0632d194b6d5ddbae7494c44e7bb0d94cd63db83a1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6815a373-206c-4f26-aa16-cc2d72d0f14d, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2)
Sep 30 21:37:45 compute-0 systemd[1]: libpod-conmon-a1b5b204c8a4c6dc481b2f0632d194b6d5ddbae7494c44e7bb0d94cd63db83a1.scope: Deactivated successfully.
Sep 30 21:37:45 compute-0 podman[237815]: 2025-09-30 21:37:45.938309436 +0000 UTC m=+0.356007344 container remove a1b5b204c8a4c6dc481b2f0632d194b6d5ddbae7494c44e7bb0d94cd63db83a1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6815a373-206c-4f26-aa16-cc2d72d0f14d, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Sep 30 21:37:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:45.943 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[baa5561e-68d0-40cd-8a1e-33aa9dd0124e]: (4, ('Tue Sep 30 09:37:45 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-6815a373-206c-4f26-aa16-cc2d72d0f14d (a1b5b204c8a4c6dc481b2f0632d194b6d5ddbae7494c44e7bb0d94cd63db83a1)\na1b5b204c8a4c6dc481b2f0632d194b6d5ddbae7494c44e7bb0d94cd63db83a1\nTue Sep 30 09:37:45 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-6815a373-206c-4f26-aa16-cc2d72d0f14d (a1b5b204c8a4c6dc481b2f0632d194b6d5ddbae7494c44e7bb0d94cd63db83a1)\na1b5b204c8a4c6dc481b2f0632d194b6d5ddbae7494c44e7bb0d94cd63db83a1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:45.945 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[3396eea4-3146-45b1-a6b6-afe070527432]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:45.946 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6815a373-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:37:45 compute-0 nova_compute[192810]: 2025-09-30 21:37:45.948 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:45 compute-0 kernel: tap6815a373-20: left promiscuous mode
Sep 30 21:37:45 compute-0 nova_compute[192810]: 2025-09-30 21:37:45.965 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:45 compute-0 nova_compute[192810]: 2025-09-30 21:37:45.966 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:45.968 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[72088aac-b041-469c-ae52-d580b2f3df97]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:46.012 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[fb965bf0-54f1-4c08-9482-315a3a710da2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:46.014 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[f1bd550d-59ec-463c-85fc-8079c0cf8f5d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:46.029 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[c6d60878-0ace-4cf5-aadb-351399d8ec27]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 494016, 'reachable_time': 27240, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237831, 'error': None, 'target': 'ovnmeta-6815a373-206c-4f26-aa16-cc2d72d0f14d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:46.033 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6815a373-206c-4f26-aa16-cc2d72d0f14d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:37:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:37:46.034 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[23eae2d6-d057-4bc7-b85f-cb0dfed6f66c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:37:46 compute-0 systemd[1]: run-netns-ovnmeta\x2d6815a373\x2d206c\x2d4f26\x2daa16\x2dcc2d72d0f14d.mount: Deactivated successfully.
Sep 30 21:37:46 compute-0 nova_compute[192810]: 2025-09-30 21:37:46.381 2 DEBUG nova.network.neutron [-] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:37:46 compute-0 nova_compute[192810]: 2025-09-30 21:37:46.438 2 INFO nova.compute.manager [-] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Took 1.16 seconds to deallocate network for instance.
Sep 30 21:37:46 compute-0 nova_compute[192810]: 2025-09-30 21:37:46.539 2 DEBUG oslo_concurrency.lockutils [None req-5fa85bfb-bc04-4dfd-8566-9d4e10c9f4fb cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:46 compute-0 nova_compute[192810]: 2025-09-30 21:37:46.540 2 DEBUG oslo_concurrency.lockutils [None req-5fa85bfb-bc04-4dfd-8566-9d4e10c9f4fb cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:46 compute-0 nova_compute[192810]: 2025-09-30 21:37:46.646 2 DEBUG nova.compute.provider_tree [None req-5fa85bfb-bc04-4dfd-8566-9d4e10c9f4fb cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:37:46 compute-0 nova_compute[192810]: 2025-09-30 21:37:46.679 2 DEBUG nova.scheduler.client.report [None req-5fa85bfb-bc04-4dfd-8566-9d4e10c9f4fb cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:37:46 compute-0 nova_compute[192810]: 2025-09-30 21:37:46.714 2 DEBUG oslo_concurrency.lockutils [None req-5fa85bfb-bc04-4dfd-8566-9d4e10c9f4fb cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.175s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:46 compute-0 nova_compute[192810]: 2025-09-30 21:37:46.750 2 INFO nova.scheduler.client.report [None req-5fa85bfb-bc04-4dfd-8566-9d4e10c9f4fb cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Deleted allocations for instance b5f1bc30-6de7-46a0-8a4b-6f32146f61c0
Sep 30 21:37:46 compute-0 nova_compute[192810]: 2025-09-30 21:37:46.880 2 DEBUG oslo_concurrency.lockutils [None req-5fa85bfb-bc04-4dfd-8566-9d4e10c9f4fb cbebeb1f78b64ee09e2da39a04f9f282 27ad5d27d7a44404987f6bf297897a45 - - default default] Lock "b5f1bc30-6de7-46a0-8a4b-6f32146f61c0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.992s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:47 compute-0 nova_compute[192810]: 2025-09-30 21:37:47.514 2 DEBUG nova.compute.manager [req-b2ee3991-967a-4dac-9f3c-5848d7054db4 req-e886c6b8-be12-4e2e-9120-566ad0c96525 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Received event network-vif-deleted-1208ea00-ed22-4f18-83a5-e09220676a9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:37:47 compute-0 nova_compute[192810]: 2025-09-30 21:37:47.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:47 compute-0 nova_compute[192810]: 2025-09-30 21:37:47.774 2 DEBUG nova.compute.manager [req-587c7305-1a8c-407b-996d-7a197fffed8b req-090becf7-dd79-4383-9251-2509f5201511 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Received event network-vif-plugged-1208ea00-ed22-4f18-83a5-e09220676a9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:37:47 compute-0 nova_compute[192810]: 2025-09-30 21:37:47.775 2 DEBUG oslo_concurrency.lockutils [req-587c7305-1a8c-407b-996d-7a197fffed8b req-090becf7-dd79-4383-9251-2509f5201511 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "b5f1bc30-6de7-46a0-8a4b-6f32146f61c0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:47 compute-0 nova_compute[192810]: 2025-09-30 21:37:47.775 2 DEBUG oslo_concurrency.lockutils [req-587c7305-1a8c-407b-996d-7a197fffed8b req-090becf7-dd79-4383-9251-2509f5201511 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b5f1bc30-6de7-46a0-8a4b-6f32146f61c0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:47 compute-0 nova_compute[192810]: 2025-09-30 21:37:47.775 2 DEBUG oslo_concurrency.lockutils [req-587c7305-1a8c-407b-996d-7a197fffed8b req-090becf7-dd79-4383-9251-2509f5201511 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b5f1bc30-6de7-46a0-8a4b-6f32146f61c0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:47 compute-0 nova_compute[192810]: 2025-09-30 21:37:47.776 2 DEBUG nova.compute.manager [req-587c7305-1a8c-407b-996d-7a197fffed8b req-090becf7-dd79-4383-9251-2509f5201511 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] No waiting events found dispatching network-vif-plugged-1208ea00-ed22-4f18-83a5-e09220676a9b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:37:47 compute-0 nova_compute[192810]: 2025-09-30 21:37:47.776 2 WARNING nova.compute.manager [req-587c7305-1a8c-407b-996d-7a197fffed8b req-090becf7-dd79-4383-9251-2509f5201511 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Received unexpected event network-vif-plugged-1208ea00-ed22-4f18-83a5-e09220676a9b for instance with vm_state deleted and task_state None.
Sep 30 21:37:48 compute-0 nova_compute[192810]: 2025-09-30 21:37:48.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:37:48 compute-0 nova_compute[192810]: 2025-09-30 21:37:48.938 2 DEBUG oslo_concurrency.lockutils [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Acquiring lock "9aa3f231-a017-464a-94ed-54988ab74c72" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:48 compute-0 nova_compute[192810]: 2025-09-30 21:37:48.938 2 DEBUG oslo_concurrency.lockutils [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Lock "9aa3f231-a017-464a-94ed-54988ab74c72" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:48 compute-0 nova_compute[192810]: 2025-09-30 21:37:48.978 2 DEBUG nova.compute.manager [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:37:49 compute-0 nova_compute[192810]: 2025-09-30 21:37:49.088 2 DEBUG oslo_concurrency.lockutils [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:49 compute-0 nova_compute[192810]: 2025-09-30 21:37:49.089 2 DEBUG oslo_concurrency.lockutils [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:49 compute-0 nova_compute[192810]: 2025-09-30 21:37:49.093 2 DEBUG nova.virt.hardware [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:37:49 compute-0 nova_compute[192810]: 2025-09-30 21:37:49.094 2 INFO nova.compute.claims [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:37:49 compute-0 nova_compute[192810]: 2025-09-30 21:37:49.245 2 DEBUG nova.compute.provider_tree [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:37:49 compute-0 nova_compute[192810]: 2025-09-30 21:37:49.257 2 DEBUG nova.scheduler.client.report [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:37:49 compute-0 nova_compute[192810]: 2025-09-30 21:37:49.285 2 DEBUG oslo_concurrency.lockutils [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.196s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:49 compute-0 nova_compute[192810]: 2025-09-30 21:37:49.286 2 DEBUG nova.compute.manager [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:37:49 compute-0 nova_compute[192810]: 2025-09-30 21:37:49.375 2 DEBUG nova.compute.manager [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948
Sep 30 21:37:49 compute-0 nova_compute[192810]: 2025-09-30 21:37:49.413 2 INFO nova.virt.libvirt.driver [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:37:49 compute-0 nova_compute[192810]: 2025-09-30 21:37:49.444 2 DEBUG nova.compute.manager [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:37:49 compute-0 nova_compute[192810]: 2025-09-30 21:37:49.596 2 DEBUG nova.compute.manager [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:37:49 compute-0 nova_compute[192810]: 2025-09-30 21:37:49.597 2 DEBUG nova.virt.libvirt.driver [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:37:49 compute-0 nova_compute[192810]: 2025-09-30 21:37:49.597 2 INFO nova.virt.libvirt.driver [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Creating image(s)
Sep 30 21:37:49 compute-0 nova_compute[192810]: 2025-09-30 21:37:49.598 2 DEBUG oslo_concurrency.lockutils [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Acquiring lock "/var/lib/nova/instances/9aa3f231-a017-464a-94ed-54988ab74c72/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:49 compute-0 nova_compute[192810]: 2025-09-30 21:37:49.598 2 DEBUG oslo_concurrency.lockutils [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Lock "/var/lib/nova/instances/9aa3f231-a017-464a-94ed-54988ab74c72/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:49 compute-0 nova_compute[192810]: 2025-09-30 21:37:49.599 2 DEBUG oslo_concurrency.lockutils [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Lock "/var/lib/nova/instances/9aa3f231-a017-464a-94ed-54988ab74c72/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:49 compute-0 nova_compute[192810]: 2025-09-30 21:37:49.610 2 DEBUG oslo_concurrency.processutils [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:37:49 compute-0 nova_compute[192810]: 2025-09-30 21:37:49.660 2 DEBUG oslo_concurrency.processutils [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:37:49 compute-0 nova_compute[192810]: 2025-09-30 21:37:49.661 2 DEBUG oslo_concurrency.lockutils [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:49 compute-0 nova_compute[192810]: 2025-09-30 21:37:49.661 2 DEBUG oslo_concurrency.lockutils [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:49 compute-0 nova_compute[192810]: 2025-09-30 21:37:49.671 2 DEBUG oslo_concurrency.processutils [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:37:49 compute-0 nova_compute[192810]: 2025-09-30 21:37:49.721 2 DEBUG oslo_concurrency.processutils [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:37:49 compute-0 nova_compute[192810]: 2025-09-30 21:37:49.722 2 DEBUG oslo_concurrency.processutils [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/9aa3f231-a017-464a-94ed-54988ab74c72/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:37:50 compute-0 nova_compute[192810]: 2025-09-30 21:37:50.068 2 DEBUG oslo_concurrency.processutils [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/9aa3f231-a017-464a-94ed-54988ab74c72/disk 1073741824" returned: 0 in 0.345s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:37:50 compute-0 nova_compute[192810]: 2025-09-30 21:37:50.069 2 DEBUG oslo_concurrency.lockutils [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.408s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:50 compute-0 nova_compute[192810]: 2025-09-30 21:37:50.070 2 DEBUG oslo_concurrency.processutils [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:37:50 compute-0 nova_compute[192810]: 2025-09-30 21:37:50.158 2 DEBUG oslo_concurrency.processutils [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:37:50 compute-0 nova_compute[192810]: 2025-09-30 21:37:50.159 2 DEBUG nova.virt.disk.api [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Checking if we can resize image /var/lib/nova/instances/9aa3f231-a017-464a-94ed-54988ab74c72/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:37:50 compute-0 nova_compute[192810]: 2025-09-30 21:37:50.160 2 DEBUG oslo_concurrency.processutils [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9aa3f231-a017-464a-94ed-54988ab74c72/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:37:50 compute-0 nova_compute[192810]: 2025-09-30 21:37:50.191 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:50 compute-0 nova_compute[192810]: 2025-09-30 21:37:50.253 2 DEBUG oslo_concurrency.processutils [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9aa3f231-a017-464a-94ed-54988ab74c72/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:37:50 compute-0 nova_compute[192810]: 2025-09-30 21:37:50.254 2 DEBUG nova.virt.disk.api [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Cannot resize image /var/lib/nova/instances/9aa3f231-a017-464a-94ed-54988ab74c72/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:37:50 compute-0 nova_compute[192810]: 2025-09-30 21:37:50.254 2 DEBUG nova.objects.instance [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Lazy-loading 'migration_context' on Instance uuid 9aa3f231-a017-464a-94ed-54988ab74c72 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:37:50 compute-0 nova_compute[192810]: 2025-09-30 21:37:50.286 2 DEBUG nova.virt.libvirt.driver [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:37:50 compute-0 nova_compute[192810]: 2025-09-30 21:37:50.287 2 DEBUG nova.virt.libvirt.driver [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Ensure instance console log exists: /var/lib/nova/instances/9aa3f231-a017-464a-94ed-54988ab74c72/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:37:50 compute-0 nova_compute[192810]: 2025-09-30 21:37:50.287 2 DEBUG oslo_concurrency.lockutils [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:50 compute-0 nova_compute[192810]: 2025-09-30 21:37:50.288 2 DEBUG oslo_concurrency.lockutils [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:50 compute-0 nova_compute[192810]: 2025-09-30 21:37:50.288 2 DEBUG oslo_concurrency.lockutils [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:50 compute-0 nova_compute[192810]: 2025-09-30 21:37:50.290 2 DEBUG nova.virt.libvirt.driver [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:37:50 compute-0 nova_compute[192810]: 2025-09-30 21:37:50.294 2 WARNING nova.virt.libvirt.driver [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:37:50 compute-0 nova_compute[192810]: 2025-09-30 21:37:50.299 2 DEBUG nova.virt.libvirt.host [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:37:50 compute-0 nova_compute[192810]: 2025-09-30 21:37:50.300 2 DEBUG nova.virt.libvirt.host [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:37:50 compute-0 nova_compute[192810]: 2025-09-30 21:37:50.304 2 DEBUG nova.virt.libvirt.host [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:37:50 compute-0 nova_compute[192810]: 2025-09-30 21:37:50.304 2 DEBUG nova.virt.libvirt.host [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:37:50 compute-0 nova_compute[192810]: 2025-09-30 21:37:50.305 2 DEBUG nova.virt.libvirt.driver [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:37:50 compute-0 nova_compute[192810]: 2025-09-30 21:37:50.306 2 DEBUG nova.virt.hardware [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:37:50 compute-0 nova_compute[192810]: 2025-09-30 21:37:50.306 2 DEBUG nova.virt.hardware [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:37:50 compute-0 nova_compute[192810]: 2025-09-30 21:37:50.307 2 DEBUG nova.virt.hardware [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:37:50 compute-0 nova_compute[192810]: 2025-09-30 21:37:50.307 2 DEBUG nova.virt.hardware [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:37:50 compute-0 nova_compute[192810]: 2025-09-30 21:37:50.307 2 DEBUG nova.virt.hardware [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:37:50 compute-0 nova_compute[192810]: 2025-09-30 21:37:50.307 2 DEBUG nova.virt.hardware [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:37:50 compute-0 nova_compute[192810]: 2025-09-30 21:37:50.308 2 DEBUG nova.virt.hardware [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:37:50 compute-0 nova_compute[192810]: 2025-09-30 21:37:50.308 2 DEBUG nova.virt.hardware [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:37:50 compute-0 nova_compute[192810]: 2025-09-30 21:37:50.308 2 DEBUG nova.virt.hardware [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:37:50 compute-0 nova_compute[192810]: 2025-09-30 21:37:50.308 2 DEBUG nova.virt.hardware [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:37:50 compute-0 nova_compute[192810]: 2025-09-30 21:37:50.309 2 DEBUG nova.virt.hardware [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:37:50 compute-0 nova_compute[192810]: 2025-09-30 21:37:50.312 2 DEBUG nova.objects.instance [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9aa3f231-a017-464a-94ed-54988ab74c72 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:37:50 compute-0 nova_compute[192810]: 2025-09-30 21:37:50.325 2 DEBUG nova.virt.libvirt.driver [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:37:50 compute-0 nova_compute[192810]:   <uuid>9aa3f231-a017-464a-94ed-54988ab74c72</uuid>
Sep 30 21:37:50 compute-0 nova_compute[192810]:   <name>instance-00000075</name>
Sep 30 21:37:50 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:37:50 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:37:50 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:37:50 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:37:50 compute-0 nova_compute[192810]:       <nova:name>tempest-ServerShowV247Test-server-1339578104</nova:name>
Sep 30 21:37:50 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:37:50</nova:creationTime>
Sep 30 21:37:50 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:37:50 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:37:50 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:37:50 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:37:50 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:37:50 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:37:50 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:37:50 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:37:50 compute-0 nova_compute[192810]:         <nova:user uuid="412e69dc72344bedbec091339f7d6245">tempest-ServerShowV247Test-1509763863-project-member</nova:user>
Sep 30 21:37:50 compute-0 nova_compute[192810]:         <nova:project uuid="95a31e99a053405eafa577fcf2b95e55">tempest-ServerShowV247Test-1509763863</nova:project>
Sep 30 21:37:50 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:37:50 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:37:50 compute-0 nova_compute[192810]:       <nova:ports/>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:37:50 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:37:50 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:37:50 compute-0 nova_compute[192810]:     <system>
Sep 30 21:37:50 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:37:50 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:37:50 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:37:50 compute-0 nova_compute[192810]:       <entry name="serial">9aa3f231-a017-464a-94ed-54988ab74c72</entry>
Sep 30 21:37:50 compute-0 nova_compute[192810]:       <entry name="uuid">9aa3f231-a017-464a-94ed-54988ab74c72</entry>
Sep 30 21:37:50 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     </system>
Sep 30 21:37:50 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:37:50 compute-0 nova_compute[192810]:   <os>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:37:50 compute-0 nova_compute[192810]:   </os>
Sep 30 21:37:50 compute-0 nova_compute[192810]:   <features>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:37:50 compute-0 nova_compute[192810]:   </features>
Sep 30 21:37:50 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:37:50 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:37:50 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:37:50 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:37:50 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:37:50 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:37:50 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:37:50 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:37:50 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/9aa3f231-a017-464a-94ed-54988ab74c72/disk"/>
Sep 30 21:37:50 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:37:50 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:37:50 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/9aa3f231-a017-464a-94ed-54988ab74c72/disk.config"/>
Sep 30 21:37:50 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:37:50 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/9aa3f231-a017-464a-94ed-54988ab74c72/console.log" append="off"/>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     <video>
Sep 30 21:37:50 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     </video>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:37:50 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:37:50 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:37:50 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:37:50 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:37:50 compute-0 nova_compute[192810]: </domain>
Sep 30 21:37:50 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:37:50 compute-0 nova_compute[192810]: 2025-09-30 21:37:50.459 2 DEBUG nova.virt.libvirt.driver [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:37:50 compute-0 nova_compute[192810]: 2025-09-30 21:37:50.459 2 DEBUG nova.virt.libvirt.driver [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:37:50 compute-0 nova_compute[192810]: 2025-09-30 21:37:50.460 2 INFO nova.virt.libvirt.driver [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Using config drive
Sep 30 21:37:50 compute-0 nova_compute[192810]: 2025-09-30 21:37:50.726 2 INFO nova.virt.libvirt.driver [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Creating config drive at /var/lib/nova/instances/9aa3f231-a017-464a-94ed-54988ab74c72/disk.config
Sep 30 21:37:50 compute-0 nova_compute[192810]: 2025-09-30 21:37:50.730 2 DEBUG oslo_concurrency.processutils [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9aa3f231-a017-464a-94ed-54988ab74c72/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpth7xvoiz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:37:50 compute-0 nova_compute[192810]: 2025-09-30 21:37:50.852 2 DEBUG oslo_concurrency.processutils [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9aa3f231-a017-464a-94ed-54988ab74c72/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpth7xvoiz" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:37:50 compute-0 systemd-machined[152794]: New machine qemu-55-instance-00000075.
Sep 30 21:37:50 compute-0 systemd[1]: Started Virtual Machine qemu-55-instance-00000075.
Sep 30 21:37:51 compute-0 nova_compute[192810]: 2025-09-30 21:37:51.729 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268271.728658, 9aa3f231-a017-464a-94ed-54988ab74c72 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:37:51 compute-0 nova_compute[192810]: 2025-09-30 21:37:51.729 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] VM Resumed (Lifecycle Event)
Sep 30 21:37:51 compute-0 nova_compute[192810]: 2025-09-30 21:37:51.732 2 DEBUG nova.compute.manager [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:37:51 compute-0 nova_compute[192810]: 2025-09-30 21:37:51.732 2 DEBUG nova.virt.libvirt.driver [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:37:51 compute-0 nova_compute[192810]: 2025-09-30 21:37:51.735 2 INFO nova.virt.libvirt.driver [-] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Instance spawned successfully.
Sep 30 21:37:51 compute-0 nova_compute[192810]: 2025-09-30 21:37:51.736 2 DEBUG nova.virt.libvirt.driver [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:37:51 compute-0 nova_compute[192810]: 2025-09-30 21:37:51.773 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:37:51 compute-0 nova_compute[192810]: 2025-09-30 21:37:51.778 2 DEBUG nova.virt.libvirt.driver [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:37:51 compute-0 nova_compute[192810]: 2025-09-30 21:37:51.779 2 DEBUG nova.virt.libvirt.driver [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:37:51 compute-0 nova_compute[192810]: 2025-09-30 21:37:51.779 2 DEBUG nova.virt.libvirt.driver [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:37:51 compute-0 nova_compute[192810]: 2025-09-30 21:37:51.780 2 DEBUG nova.virt.libvirt.driver [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:37:51 compute-0 nova_compute[192810]: 2025-09-30 21:37:51.780 2 DEBUG nova.virt.libvirt.driver [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:37:51 compute-0 nova_compute[192810]: 2025-09-30 21:37:51.780 2 DEBUG nova.virt.libvirt.driver [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:37:51 compute-0 nova_compute[192810]: 2025-09-30 21:37:51.784 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:37:51 compute-0 nova_compute[192810]: 2025-09-30 21:37:51.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:37:51 compute-0 nova_compute[192810]: 2025-09-30 21:37:51.811 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:37:51 compute-0 nova_compute[192810]: 2025-09-30 21:37:51.812 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268271.7287908, 9aa3f231-a017-464a-94ed-54988ab74c72 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:37:51 compute-0 nova_compute[192810]: 2025-09-30 21:37:51.812 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] VM Started (Lifecycle Event)
Sep 30 21:37:51 compute-0 nova_compute[192810]: 2025-09-30 21:37:51.840 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:37:51 compute-0 nova_compute[192810]: 2025-09-30 21:37:51.842 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:37:51 compute-0 nova_compute[192810]: 2025-09-30 21:37:51.861 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:37:51 compute-0 nova_compute[192810]: 2025-09-30 21:37:51.871 2 INFO nova.compute.manager [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Took 2.27 seconds to spawn the instance on the hypervisor.
Sep 30 21:37:51 compute-0 nova_compute[192810]: 2025-09-30 21:37:51.871 2 DEBUG nova.compute.manager [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:37:51 compute-0 nova_compute[192810]: 2025-09-30 21:37:51.958 2 INFO nova.compute.manager [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Took 2.90 seconds to build instance.
Sep 30 21:37:51 compute-0 nova_compute[192810]: 2025-09-30 21:37:51.987 2 DEBUG oslo_concurrency.lockutils [None req-36960378-3d1a-4caa-993c-ad55e5e78321 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Lock "9aa3f231-a017-464a-94ed-54988ab74c72" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 3.049s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:52 compute-0 nova_compute[192810]: 2025-09-30 21:37:52.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:52 compute-0 nova_compute[192810]: 2025-09-30 21:37:52.651 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:53 compute-0 nova_compute[192810]: 2025-09-30 21:37:53.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:37:54 compute-0 nova_compute[192810]: 2025-09-30 21:37:54.782 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:37:54 compute-0 sshd-session[237875]: Invalid user xtest from 45.81.23.80 port 42106
Sep 30 21:37:54 compute-0 sshd-session[237875]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:37:54 compute-0 sshd-session[237875]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=45.81.23.80
Sep 30 21:37:55 compute-0 nova_compute[192810]: 2025-09-30 21:37:55.093 2 INFO nova.compute.manager [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Rebuilding instance
Sep 30 21:37:55 compute-0 nova_compute[192810]: 2025-09-30 21:37:55.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:55 compute-0 nova_compute[192810]: 2025-09-30 21:37:55.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:37:55 compute-0 nova_compute[192810]: 2025-09-30 21:37:55.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:37:55 compute-0 nova_compute[192810]: 2025-09-30 21:37:55.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:37:56 compute-0 nova_compute[192810]: 2025-09-30 21:37:56.401 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "refresh_cache-9aa3f231-a017-464a-94ed-54988ab74c72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:37:56 compute-0 nova_compute[192810]: 2025-09-30 21:37:56.402 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquired lock "refresh_cache-9aa3f231-a017-464a-94ed-54988ab74c72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:37:56 compute-0 nova_compute[192810]: 2025-09-30 21:37:56.402 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Sep 30 21:37:56 compute-0 nova_compute[192810]: 2025-09-30 21:37:56.402 2 DEBUG nova.objects.instance [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9aa3f231-a017-464a-94ed-54988ab74c72 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:37:56 compute-0 nova_compute[192810]: 2025-09-30 21:37:56.738 2 DEBUG nova.compute.manager [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:37:57 compute-0 sshd-session[237875]: Failed password for invalid user xtest from 45.81.23.80 port 42106 ssh2
Sep 30 21:37:57 compute-0 nova_compute[192810]: 2025-09-30 21:37:57.097 2 DEBUG nova.objects.instance [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Lazy-loading 'pci_requests' on Instance uuid 9aa3f231-a017-464a-94ed-54988ab74c72 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:37:57 compute-0 nova_compute[192810]: 2025-09-30 21:37:57.107 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:37:57 compute-0 nova_compute[192810]: 2025-09-30 21:37:57.206 2 DEBUG nova.objects.instance [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9aa3f231-a017-464a-94ed-54988ab74c72 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:37:57 compute-0 sshd-session[237875]: Received disconnect from 45.81.23.80 port 42106:11: Bye Bye [preauth]
Sep 30 21:37:57 compute-0 sshd-session[237875]: Disconnected from invalid user xtest 45.81.23.80 port 42106 [preauth]
Sep 30 21:37:57 compute-0 nova_compute[192810]: 2025-09-30 21:37:57.287 2 DEBUG nova.objects.instance [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Lazy-loading 'resources' on Instance uuid 9aa3f231-a017-464a-94ed-54988ab74c72 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:37:57 compute-0 nova_compute[192810]: 2025-09-30 21:37:57.349 2 DEBUG nova.objects.instance [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Lazy-loading 'migration_context' on Instance uuid 9aa3f231-a017-464a-94ed-54988ab74c72 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:37:57 compute-0 nova_compute[192810]: 2025-09-30 21:37:57.443 2 DEBUG nova.objects.instance [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Sep 30 21:37:57 compute-0 nova_compute[192810]: 2025-09-30 21:37:57.446 2 DEBUG nova.virt.libvirt.driver [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Sep 30 21:37:57 compute-0 nova_compute[192810]: 2025-09-30 21:37:57.653 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:37:57 compute-0 nova_compute[192810]: 2025-09-30 21:37:57.860 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:37:57 compute-0 nova_compute[192810]: 2025-09-30 21:37:57.892 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Releasing lock "refresh_cache-9aa3f231-a017-464a-94ed-54988ab74c72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:37:57 compute-0 nova_compute[192810]: 2025-09-30 21:37:57.893 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Sep 30 21:37:57 compute-0 nova_compute[192810]: 2025-09-30 21:37:57.893 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:37:57 compute-0 nova_compute[192810]: 2025-09-30 21:37:57.894 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:37:57 compute-0 nova_compute[192810]: 2025-09-30 21:37:57.894 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:37:57 compute-0 nova_compute[192810]: 2025-09-30 21:37:57.963 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:57 compute-0 nova_compute[192810]: 2025-09-30 21:37:57.964 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:57 compute-0 nova_compute[192810]: 2025-09-30 21:37:57.964 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:57 compute-0 nova_compute[192810]: 2025-09-30 21:37:57.965 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:37:58 compute-0 nova_compute[192810]: 2025-09-30 21:37:58.117 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9aa3f231-a017-464a-94ed-54988ab74c72/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:37:58 compute-0 nova_compute[192810]: 2025-09-30 21:37:58.174 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9aa3f231-a017-464a-94ed-54988ab74c72/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:37:58 compute-0 nova_compute[192810]: 2025-09-30 21:37:58.175 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9aa3f231-a017-464a-94ed-54988ab74c72/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:37:58 compute-0 nova_compute[192810]: 2025-09-30 21:37:58.229 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9aa3f231-a017-464a-94ed-54988ab74c72/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:37:58 compute-0 nova_compute[192810]: 2025-09-30 21:37:58.358 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:37:58 compute-0 nova_compute[192810]: 2025-09-30 21:37:58.359 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5621MB free_disk=73.24957656860352GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:37:58 compute-0 nova_compute[192810]: 2025-09-30 21:37:58.360 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:37:58 compute-0 nova_compute[192810]: 2025-09-30 21:37:58.360 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:37:58 compute-0 nova_compute[192810]: 2025-09-30 21:37:58.441 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Instance 9aa3f231-a017-464a-94ed-54988ab74c72 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:37:58 compute-0 nova_compute[192810]: 2025-09-30 21:37:58.442 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:37:58 compute-0 nova_compute[192810]: 2025-09-30 21:37:58.442 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:37:58 compute-0 nova_compute[192810]: 2025-09-30 21:37:58.495 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:37:58 compute-0 nova_compute[192810]: 2025-09-30 21:37:58.511 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:37:58 compute-0 nova_compute[192810]: 2025-09-30 21:37:58.533 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:37:58 compute-0 nova_compute[192810]: 2025-09-30 21:37:58.533 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:37:59 compute-0 podman[237885]: 2025-09-30 21:37:59.34284941 +0000 UTC m=+0.066685383 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:37:59 compute-0 podman[237886]: 2025-09-30 21:37:59.345115056 +0000 UTC m=+0.067696688 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3)
Sep 30 21:37:59 compute-0 podman[237884]: 2025-09-30 21:37:59.373979356 +0000 UTC m=+0.097226665 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 21:37:59 compute-0 nova_compute[192810]: 2025-09-30 21:37:59.427 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:37:59 compute-0 nova_compute[192810]: 2025-09-30 21:37:59.463 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:38:00 compute-0 nova_compute[192810]: 2025-09-30 21:38:00.169 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268265.1677437, b5f1bc30-6de7-46a0-8a4b-6f32146f61c0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:38:00 compute-0 nova_compute[192810]: 2025-09-30 21:38:00.169 2 INFO nova.compute.manager [-] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] VM Stopped (Lifecycle Event)
Sep 30 21:38:00 compute-0 nova_compute[192810]: 2025-09-30 21:38:00.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:00 compute-0 nova_compute[192810]: 2025-09-30 21:38:00.218 2 DEBUG nova.compute.manager [None req-a58fdcf6-5041-47cc-87a3-3a16648dcc60 - - - - - -] [instance: b5f1bc30-6de7-46a0-8a4b-6f32146f61c0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:38:01 compute-0 nova_compute[192810]: 2025-09-30 21:38:01.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:38:02 compute-0 nova_compute[192810]: 2025-09-30 21:38:02.654 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:05 compute-0 nova_compute[192810]: 2025-09-30 21:38:05.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:07 compute-0 nova_compute[192810]: 2025-09-30 21:38:07.488 2 DEBUG nova.virt.libvirt.driver [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Sep 30 21:38:07 compute-0 nova_compute[192810]: 2025-09-30 21:38:07.655 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:09 compute-0 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d00000075.scope: Deactivated successfully.
Sep 30 21:38:09 compute-0 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d00000075.scope: Consumed 14.053s CPU time.
Sep 30 21:38:09 compute-0 systemd-machined[152794]: Machine qemu-55-instance-00000075 terminated.
Sep 30 21:38:09 compute-0 podman[237962]: 2025-09-30 21:38:09.963132697 +0000 UTC m=+0.063655967 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 21:38:09 compute-0 podman[237963]: 2025-09-30 21:38:09.963917937 +0000 UTC m=+0.063116244 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, vcs-type=git, config_id=edpm, io.buildah.version=1.33.7, name=ubi9-minimal, release=1755695350, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 21:38:10 compute-0 nova_compute[192810]: 2025-09-30 21:38:10.253 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:10 compute-0 nova_compute[192810]: 2025-09-30 21:38:10.500 2 INFO nova.virt.libvirt.driver [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Instance shutdown successfully after 13 seconds.
Sep 30 21:38:10 compute-0 nova_compute[192810]: 2025-09-30 21:38:10.504 2 INFO nova.virt.libvirt.driver [-] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Instance destroyed successfully.
Sep 30 21:38:10 compute-0 nova_compute[192810]: 2025-09-30 21:38:10.510 2 INFO nova.virt.libvirt.driver [-] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Instance destroyed successfully.
Sep 30 21:38:10 compute-0 nova_compute[192810]: 2025-09-30 21:38:10.510 2 INFO nova.virt.libvirt.driver [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Deleting instance files /var/lib/nova/instances/9aa3f231-a017-464a-94ed-54988ab74c72_del
Sep 30 21:38:10 compute-0 nova_compute[192810]: 2025-09-30 21:38:10.511 2 INFO nova.virt.libvirt.driver [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Deletion of /var/lib/nova/instances/9aa3f231-a017-464a-94ed-54988ab74c72_del complete
Sep 30 21:38:10 compute-0 nova_compute[192810]: 2025-09-30 21:38:10.857 2 DEBUG nova.virt.libvirt.driver [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:38:10 compute-0 nova_compute[192810]: 2025-09-30 21:38:10.858 2 INFO nova.virt.libvirt.driver [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Creating image(s)
Sep 30 21:38:10 compute-0 nova_compute[192810]: 2025-09-30 21:38:10.859 2 DEBUG oslo_concurrency.lockutils [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Acquiring lock "/var/lib/nova/instances/9aa3f231-a017-464a-94ed-54988ab74c72/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:10 compute-0 nova_compute[192810]: 2025-09-30 21:38:10.859 2 DEBUG oslo_concurrency.lockutils [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Lock "/var/lib/nova/instances/9aa3f231-a017-464a-94ed-54988ab74c72/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:10 compute-0 nova_compute[192810]: 2025-09-30 21:38:10.860 2 DEBUG oslo_concurrency.lockutils [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Lock "/var/lib/nova/instances/9aa3f231-a017-464a-94ed-54988ab74c72/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:10 compute-0 nova_compute[192810]: 2025-09-30 21:38:10.872 2 DEBUG oslo_concurrency.processutils [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:38:10 compute-0 nova_compute[192810]: 2025-09-30 21:38:10.963 2 DEBUG oslo_concurrency.processutils [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:38:10 compute-0 nova_compute[192810]: 2025-09-30 21:38:10.964 2 DEBUG oslo_concurrency.lockutils [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Acquiring lock "d794a27f8e0bfa3eee9759fbfddd316a7671c61e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:10 compute-0 nova_compute[192810]: 2025-09-30 21:38:10.965 2 DEBUG oslo_concurrency.lockutils [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Lock "d794a27f8e0bfa3eee9759fbfddd316a7671c61e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:10 compute-0 nova_compute[192810]: 2025-09-30 21:38:10.975 2 DEBUG oslo_concurrency.processutils [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:38:11 compute-0 nova_compute[192810]: 2025-09-30 21:38:11.046 2 DEBUG oslo_concurrency.processutils [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:38:11 compute-0 nova_compute[192810]: 2025-09-30 21:38:11.047 2 DEBUG oslo_concurrency.processutils [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e,backing_fmt=raw /var/lib/nova/instances/9aa3f231-a017-464a-94ed-54988ab74c72/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:38:11 compute-0 nova_compute[192810]: 2025-09-30 21:38:11.580 2 DEBUG oslo_concurrency.processutils [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e,backing_fmt=raw /var/lib/nova/instances/9aa3f231-a017-464a-94ed-54988ab74c72/disk 1073741824" returned: 0 in 0.533s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:38:11 compute-0 nova_compute[192810]: 2025-09-30 21:38:11.583 2 DEBUG oslo_concurrency.lockutils [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Lock "d794a27f8e0bfa3eee9759fbfddd316a7671c61e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.618s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:11 compute-0 nova_compute[192810]: 2025-09-30 21:38:11.584 2 DEBUG oslo_concurrency.processutils [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:38:11 compute-0 nova_compute[192810]: 2025-09-30 21:38:11.671 2 DEBUG oslo_concurrency.processutils [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:38:11 compute-0 nova_compute[192810]: 2025-09-30 21:38:11.673 2 DEBUG nova.virt.disk.api [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Checking if we can resize image /var/lib/nova/instances/9aa3f231-a017-464a-94ed-54988ab74c72/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:38:11 compute-0 nova_compute[192810]: 2025-09-30 21:38:11.674 2 DEBUG oslo_concurrency.processutils [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9aa3f231-a017-464a-94ed-54988ab74c72/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:38:11 compute-0 nova_compute[192810]: 2025-09-30 21:38:11.741 2 DEBUG oslo_concurrency.processutils [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9aa3f231-a017-464a-94ed-54988ab74c72/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:38:11 compute-0 nova_compute[192810]: 2025-09-30 21:38:11.743 2 DEBUG nova.virt.disk.api [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Cannot resize image /var/lib/nova/instances/9aa3f231-a017-464a-94ed-54988ab74c72/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:38:11 compute-0 nova_compute[192810]: 2025-09-30 21:38:11.744 2 DEBUG nova.virt.libvirt.driver [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:38:11 compute-0 nova_compute[192810]: 2025-09-30 21:38:11.745 2 DEBUG nova.virt.libvirt.driver [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Ensure instance console log exists: /var/lib/nova/instances/9aa3f231-a017-464a-94ed-54988ab74c72/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:38:11 compute-0 nova_compute[192810]: 2025-09-30 21:38:11.745 2 DEBUG oslo_concurrency.lockutils [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:11 compute-0 nova_compute[192810]: 2025-09-30 21:38:11.746 2 DEBUG oslo_concurrency.lockutils [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:11 compute-0 nova_compute[192810]: 2025-09-30 21:38:11.747 2 DEBUG oslo_concurrency.lockutils [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:11 compute-0 nova_compute[192810]: 2025-09-30 21:38:11.750 2 DEBUG nova.virt.libvirt.driver [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:11Z,direct_url=<?>,disk_format='qcow2',id=29834554-3ec3-4459-bfde-932aa778e979,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:13Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:38:11 compute-0 nova_compute[192810]: 2025-09-30 21:38:11.758 2 WARNING nova.virt.libvirt.driver [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Sep 30 21:38:11 compute-0 nova_compute[192810]: 2025-09-30 21:38:11.767 2 DEBUG nova.virt.libvirt.host [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:38:11 compute-0 nova_compute[192810]: 2025-09-30 21:38:11.768 2 DEBUG nova.virt.libvirt.host [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:38:11 compute-0 nova_compute[192810]: 2025-09-30 21:38:11.773 2 DEBUG nova.virt.libvirt.host [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:38:11 compute-0 nova_compute[192810]: 2025-09-30 21:38:11.773 2 DEBUG nova.virt.libvirt.host [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:38:11 compute-0 nova_compute[192810]: 2025-09-30 21:38:11.775 2 DEBUG nova.virt.libvirt.driver [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:38:11 compute-0 nova_compute[192810]: 2025-09-30 21:38:11.776 2 DEBUG nova.virt.hardware [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:11Z,direct_url=<?>,disk_format='qcow2',id=29834554-3ec3-4459-bfde-932aa778e979,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:13Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:38:11 compute-0 nova_compute[192810]: 2025-09-30 21:38:11.777 2 DEBUG nova.virt.hardware [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:38:11 compute-0 nova_compute[192810]: 2025-09-30 21:38:11.777 2 DEBUG nova.virt.hardware [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:38:11 compute-0 nova_compute[192810]: 2025-09-30 21:38:11.778 2 DEBUG nova.virt.hardware [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:38:11 compute-0 nova_compute[192810]: 2025-09-30 21:38:11.778 2 DEBUG nova.virt.hardware [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:38:11 compute-0 nova_compute[192810]: 2025-09-30 21:38:11.778 2 DEBUG nova.virt.hardware [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:38:11 compute-0 nova_compute[192810]: 2025-09-30 21:38:11.779 2 DEBUG nova.virt.hardware [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:38:11 compute-0 nova_compute[192810]: 2025-09-30 21:38:11.779 2 DEBUG nova.virt.hardware [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:38:11 compute-0 nova_compute[192810]: 2025-09-30 21:38:11.780 2 DEBUG nova.virt.hardware [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:38:11 compute-0 nova_compute[192810]: 2025-09-30 21:38:11.780 2 DEBUG nova.virt.hardware [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:38:11 compute-0 nova_compute[192810]: 2025-09-30 21:38:11.781 2 DEBUG nova.virt.hardware [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:38:11 compute-0 nova_compute[192810]: 2025-09-30 21:38:11.781 2 DEBUG nova.objects.instance [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 9aa3f231-a017-464a-94ed-54988ab74c72 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:38:11 compute-0 nova_compute[192810]: 2025-09-30 21:38:11.817 2 DEBUG nova.virt.libvirt.driver [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:38:11 compute-0 nova_compute[192810]:   <uuid>9aa3f231-a017-464a-94ed-54988ab74c72</uuid>
Sep 30 21:38:11 compute-0 nova_compute[192810]:   <name>instance-00000075</name>
Sep 30 21:38:11 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:38:11 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:38:11 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:38:11 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:38:11 compute-0 nova_compute[192810]:       <nova:name>tempest-ServerShowV247Test-server-1339578104</nova:name>
Sep 30 21:38:11 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:38:11</nova:creationTime>
Sep 30 21:38:11 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:38:11 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:38:11 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:38:11 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:38:11 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:38:11 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:38:11 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:38:11 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:38:11 compute-0 nova_compute[192810]:         <nova:user uuid="412e69dc72344bedbec091339f7d6245">tempest-ServerShowV247Test-1509763863-project-member</nova:user>
Sep 30 21:38:11 compute-0 nova_compute[192810]:         <nova:project uuid="95a31e99a053405eafa577fcf2b95e55">tempest-ServerShowV247Test-1509763863</nova:project>
Sep 30 21:38:11 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:38:11 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="29834554-3ec3-4459-bfde-932aa778e979"/>
Sep 30 21:38:11 compute-0 nova_compute[192810]:       <nova:ports/>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:38:11 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:38:11 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:38:11 compute-0 nova_compute[192810]:     <system>
Sep 30 21:38:11 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:38:11 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:38:11 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:38:11 compute-0 nova_compute[192810]:       <entry name="serial">9aa3f231-a017-464a-94ed-54988ab74c72</entry>
Sep 30 21:38:11 compute-0 nova_compute[192810]:       <entry name="uuid">9aa3f231-a017-464a-94ed-54988ab74c72</entry>
Sep 30 21:38:11 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     </system>
Sep 30 21:38:11 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:38:11 compute-0 nova_compute[192810]:   <os>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:38:11 compute-0 nova_compute[192810]:   </os>
Sep 30 21:38:11 compute-0 nova_compute[192810]:   <features>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:38:11 compute-0 nova_compute[192810]:   </features>
Sep 30 21:38:11 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:38:11 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:38:11 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:38:11 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:38:11 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:38:11 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:38:11 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:38:11 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:38:11 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/9aa3f231-a017-464a-94ed-54988ab74c72/disk"/>
Sep 30 21:38:11 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:38:11 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:38:11 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/9aa3f231-a017-464a-94ed-54988ab74c72/disk.config"/>
Sep 30 21:38:11 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:38:11 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/9aa3f231-a017-464a-94ed-54988ab74c72/console.log" append="off"/>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     <video>
Sep 30 21:38:11 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     </video>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:38:11 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:38:11 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:38:11 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:38:11 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:38:11 compute-0 nova_compute[192810]: </domain>
Sep 30 21:38:11 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:38:12 compute-0 nova_compute[192810]: 2025-09-30 21:38:12.025 2 DEBUG nova.virt.libvirt.driver [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:38:12 compute-0 nova_compute[192810]: 2025-09-30 21:38:12.025 2 DEBUG nova.virt.libvirt.driver [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:38:12 compute-0 nova_compute[192810]: 2025-09-30 21:38:12.026 2 INFO nova.virt.libvirt.driver [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Using config drive
Sep 30 21:38:12 compute-0 nova_compute[192810]: 2025-09-30 21:38:12.050 2 DEBUG nova.objects.instance [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 9aa3f231-a017-464a-94ed-54988ab74c72 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:38:12 compute-0 nova_compute[192810]: 2025-09-30 21:38:12.110 2 DEBUG nova.objects.instance [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Lazy-loading 'keypairs' on Instance uuid 9aa3f231-a017-464a-94ed-54988ab74c72 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:38:12 compute-0 nova_compute[192810]: 2025-09-30 21:38:12.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:13 compute-0 nova_compute[192810]: 2025-09-30 21:38:13.012 2 INFO nova.virt.libvirt.driver [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Creating config drive at /var/lib/nova/instances/9aa3f231-a017-464a-94ed-54988ab74c72/disk.config
Sep 30 21:38:13 compute-0 nova_compute[192810]: 2025-09-30 21:38:13.016 2 DEBUG oslo_concurrency.processutils [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9aa3f231-a017-464a-94ed-54988ab74c72/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpff1sxdc3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:38:13 compute-0 nova_compute[192810]: 2025-09-30 21:38:13.138 2 DEBUG oslo_concurrency.processutils [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9aa3f231-a017-464a-94ed-54988ab74c72/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpff1sxdc3" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:38:13 compute-0 systemd-machined[152794]: New machine qemu-56-instance-00000075.
Sep 30 21:38:13 compute-0 systemd[1]: Started Virtual Machine qemu-56-instance-00000075.
Sep 30 21:38:14 compute-0 nova_compute[192810]: 2025-09-30 21:38:14.173 2 DEBUG nova.virt.libvirt.host [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Removed pending event for 9aa3f231-a017-464a-94ed-54988ab74c72 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Sep 30 21:38:14 compute-0 nova_compute[192810]: 2025-09-30 21:38:14.174 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268294.1733875, 9aa3f231-a017-464a-94ed-54988ab74c72 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:38:14 compute-0 nova_compute[192810]: 2025-09-30 21:38:14.174 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] VM Resumed (Lifecycle Event)
Sep 30 21:38:14 compute-0 nova_compute[192810]: 2025-09-30 21:38:14.177 2 DEBUG nova.compute.manager [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:38:14 compute-0 nova_compute[192810]: 2025-09-30 21:38:14.177 2 DEBUG nova.virt.libvirt.driver [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:38:14 compute-0 nova_compute[192810]: 2025-09-30 21:38:14.180 2 INFO nova.virt.libvirt.driver [-] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Instance spawned successfully.
Sep 30 21:38:14 compute-0 nova_compute[192810]: 2025-09-30 21:38:14.181 2 DEBUG nova.virt.libvirt.driver [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:38:14 compute-0 nova_compute[192810]: 2025-09-30 21:38:14.205 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:38:14 compute-0 nova_compute[192810]: 2025-09-30 21:38:14.208 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:38:14 compute-0 nova_compute[192810]: 2025-09-30 21:38:14.239 2 DEBUG nova.virt.libvirt.driver [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:38:14 compute-0 nova_compute[192810]: 2025-09-30 21:38:14.240 2 DEBUG nova.virt.libvirt.driver [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:38:14 compute-0 nova_compute[192810]: 2025-09-30 21:38:14.240 2 DEBUG nova.virt.libvirt.driver [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:38:14 compute-0 nova_compute[192810]: 2025-09-30 21:38:14.240 2 DEBUG nova.virt.libvirt.driver [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:38:14 compute-0 nova_compute[192810]: 2025-09-30 21:38:14.241 2 DEBUG nova.virt.libvirt.driver [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:38:14 compute-0 nova_compute[192810]: 2025-09-30 21:38:14.241 2 DEBUG nova.virt.libvirt.driver [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:38:14 compute-0 nova_compute[192810]: 2025-09-30 21:38:14.265 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Sep 30 21:38:14 compute-0 nova_compute[192810]: 2025-09-30 21:38:14.266 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268294.1742291, 9aa3f231-a017-464a-94ed-54988ab74c72 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:38:14 compute-0 nova_compute[192810]: 2025-09-30 21:38:14.266 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] VM Started (Lifecycle Event)
Sep 30 21:38:14 compute-0 nova_compute[192810]: 2025-09-30 21:38:14.303 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:38:14 compute-0 nova_compute[192810]: 2025-09-30 21:38:14.307 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:38:14 compute-0 nova_compute[192810]: 2025-09-30 21:38:14.337 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Sep 30 21:38:14 compute-0 nova_compute[192810]: 2025-09-30 21:38:14.355 2 DEBUG nova.compute.manager [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:38:14 compute-0 nova_compute[192810]: 2025-09-30 21:38:14.525 2 DEBUG oslo_concurrency.lockutils [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:14 compute-0 nova_compute[192810]: 2025-09-30 21:38:14.525 2 DEBUG oslo_concurrency.lockutils [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:14 compute-0 nova_compute[192810]: 2025-09-30 21:38:14.526 2 DEBUG nova.objects.instance [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Sep 30 21:38:14 compute-0 nova_compute[192810]: 2025-09-30 21:38:14.620 2 DEBUG oslo_concurrency.lockutils [None req-1cb1a8fd-8ea5-410a-a84f-104b11f2f00c 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.094s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:15 compute-0 nova_compute[192810]: 2025-09-30 21:38:15.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:15 compute-0 podman[238059]: 2025-09-30 21:38:15.316872898 +0000 UTC m=+0.050538731 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:38:15 compute-0 podman[238058]: 2025-09-30 21:38:15.324997621 +0000 UTC m=+0.062566581 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=iscsid, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS)
Sep 30 21:38:15 compute-0 podman[238057]: 2025-09-30 21:38:15.327437141 +0000 UTC m=+0.067191125 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_id=multipathd, container_name=multipathd)
Sep 30 21:38:17 compute-0 nova_compute[192810]: 2025-09-30 21:38:17.415 2 DEBUG oslo_concurrency.lockutils [None req-c8d8be89-0b4c-445b-9820-c7dc90215eca 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Acquiring lock "9aa3f231-a017-464a-94ed-54988ab74c72" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:17 compute-0 nova_compute[192810]: 2025-09-30 21:38:17.415 2 DEBUG oslo_concurrency.lockutils [None req-c8d8be89-0b4c-445b-9820-c7dc90215eca 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Lock "9aa3f231-a017-464a-94ed-54988ab74c72" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:17 compute-0 nova_compute[192810]: 2025-09-30 21:38:17.415 2 DEBUG oslo_concurrency.lockutils [None req-c8d8be89-0b4c-445b-9820-c7dc90215eca 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Acquiring lock "9aa3f231-a017-464a-94ed-54988ab74c72-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:17 compute-0 nova_compute[192810]: 2025-09-30 21:38:17.416 2 DEBUG oslo_concurrency.lockutils [None req-c8d8be89-0b4c-445b-9820-c7dc90215eca 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Lock "9aa3f231-a017-464a-94ed-54988ab74c72-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:17 compute-0 nova_compute[192810]: 2025-09-30 21:38:17.416 2 DEBUG oslo_concurrency.lockutils [None req-c8d8be89-0b4c-445b-9820-c7dc90215eca 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Lock "9aa3f231-a017-464a-94ed-54988ab74c72-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:17 compute-0 nova_compute[192810]: 2025-09-30 21:38:17.427 2 INFO nova.compute.manager [None req-c8d8be89-0b4c-445b-9820-c7dc90215eca 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Terminating instance
Sep 30 21:38:17 compute-0 nova_compute[192810]: 2025-09-30 21:38:17.439 2 DEBUG oslo_concurrency.lockutils [None req-c8d8be89-0b4c-445b-9820-c7dc90215eca 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Acquiring lock "refresh_cache-9aa3f231-a017-464a-94ed-54988ab74c72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:38:17 compute-0 nova_compute[192810]: 2025-09-30 21:38:17.439 2 DEBUG oslo_concurrency.lockutils [None req-c8d8be89-0b4c-445b-9820-c7dc90215eca 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Acquired lock "refresh_cache-9aa3f231-a017-464a-94ed-54988ab74c72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:38:17 compute-0 nova_compute[192810]: 2025-09-30 21:38:17.440 2 DEBUG nova.network.neutron [None req-c8d8be89-0b4c-445b-9820-c7dc90215eca 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:38:17 compute-0 nova_compute[192810]: 2025-09-30 21:38:17.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:17 compute-0 nova_compute[192810]: 2025-09-30 21:38:17.901 2 DEBUG nova.network.neutron [None req-c8d8be89-0b4c-445b-9820-c7dc90215eca 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:38:19 compute-0 nova_compute[192810]: 2025-09-30 21:38:19.016 2 DEBUG nova.network.neutron [None req-c8d8be89-0b4c-445b-9820-c7dc90215eca 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:38:19 compute-0 nova_compute[192810]: 2025-09-30 21:38:19.037 2 DEBUG oslo_concurrency.lockutils [None req-c8d8be89-0b4c-445b-9820-c7dc90215eca 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Releasing lock "refresh_cache-9aa3f231-a017-464a-94ed-54988ab74c72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:38:19 compute-0 nova_compute[192810]: 2025-09-30 21:38:19.038 2 DEBUG nova.compute.manager [None req-c8d8be89-0b4c-445b-9820-c7dc90215eca 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:38:19 compute-0 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d00000075.scope: Deactivated successfully.
Sep 30 21:38:19 compute-0 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d00000075.scope: Consumed 5.694s CPU time.
Sep 30 21:38:19 compute-0 systemd-machined[152794]: Machine qemu-56-instance-00000075 terminated.
Sep 30 21:38:19 compute-0 nova_compute[192810]: 2025-09-30 21:38:19.283 2 INFO nova.virt.libvirt.driver [-] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Instance destroyed successfully.
Sep 30 21:38:19 compute-0 nova_compute[192810]: 2025-09-30 21:38:19.284 2 DEBUG nova.objects.instance [None req-c8d8be89-0b4c-445b-9820-c7dc90215eca 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Lazy-loading 'resources' on Instance uuid 9aa3f231-a017-464a-94ed-54988ab74c72 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:38:19 compute-0 nova_compute[192810]: 2025-09-30 21:38:19.309 2 INFO nova.virt.libvirt.driver [None req-c8d8be89-0b4c-445b-9820-c7dc90215eca 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Deleting instance files /var/lib/nova/instances/9aa3f231-a017-464a-94ed-54988ab74c72_del
Sep 30 21:38:19 compute-0 nova_compute[192810]: 2025-09-30 21:38:19.311 2 INFO nova.virt.libvirt.driver [None req-c8d8be89-0b4c-445b-9820-c7dc90215eca 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Deletion of /var/lib/nova/instances/9aa3f231-a017-464a-94ed-54988ab74c72_del complete
Sep 30 21:38:19 compute-0 nova_compute[192810]: 2025-09-30 21:38:19.490 2 INFO nova.compute.manager [None req-c8d8be89-0b4c-445b-9820-c7dc90215eca 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Took 0.45 seconds to destroy the instance on the hypervisor.
Sep 30 21:38:19 compute-0 nova_compute[192810]: 2025-09-30 21:38:19.491 2 DEBUG oslo.service.loopingcall [None req-c8d8be89-0b4c-445b-9820-c7dc90215eca 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:38:19 compute-0 nova_compute[192810]: 2025-09-30 21:38:19.491 2 DEBUG nova.compute.manager [-] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:38:19 compute-0 nova_compute[192810]: 2025-09-30 21:38:19.491 2 DEBUG nova.network.neutron [-] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:38:20 compute-0 nova_compute[192810]: 2025-09-30 21:38:20.015 2 DEBUG nova.network.neutron [-] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:38:20 compute-0 nova_compute[192810]: 2025-09-30 21:38:20.030 2 DEBUG nova.network.neutron [-] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:38:20 compute-0 nova_compute[192810]: 2025-09-30 21:38:20.052 2 INFO nova.compute.manager [-] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Took 0.56 seconds to deallocate network for instance.
Sep 30 21:38:20 compute-0 nova_compute[192810]: 2025-09-30 21:38:20.179 2 DEBUG oslo_concurrency.lockutils [None req-c8d8be89-0b4c-445b-9820-c7dc90215eca 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:20 compute-0 nova_compute[192810]: 2025-09-30 21:38:20.179 2 DEBUG oslo_concurrency.lockutils [None req-c8d8be89-0b4c-445b-9820-c7dc90215eca 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:20 compute-0 nova_compute[192810]: 2025-09-30 21:38:20.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:20 compute-0 nova_compute[192810]: 2025-09-30 21:38:20.280 2 DEBUG nova.compute.provider_tree [None req-c8d8be89-0b4c-445b-9820-c7dc90215eca 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:38:20 compute-0 nova_compute[192810]: 2025-09-30 21:38:20.320 2 DEBUG nova.scheduler.client.report [None req-c8d8be89-0b4c-445b-9820-c7dc90215eca 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:38:20 compute-0 nova_compute[192810]: 2025-09-30 21:38:20.376 2 DEBUG oslo_concurrency.lockutils [None req-c8d8be89-0b4c-445b-9820-c7dc90215eca 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.197s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:20 compute-0 nova_compute[192810]: 2025-09-30 21:38:20.443 2 INFO nova.scheduler.client.report [None req-c8d8be89-0b4c-445b-9820-c7dc90215eca 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Deleted allocations for instance 9aa3f231-a017-464a-94ed-54988ab74c72
Sep 30 21:38:20 compute-0 nova_compute[192810]: 2025-09-30 21:38:20.534 2 DEBUG oslo_concurrency.lockutils [None req-c8d8be89-0b4c-445b-9820-c7dc90215eca 412e69dc72344bedbec091339f7d6245 95a31e99a053405eafa577fcf2b95e55 - - default default] Lock "9aa3f231-a017-464a-94ed-54988ab74c72" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:22 compute-0 nova_compute[192810]: 2025-09-30 21:38:22.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:25 compute-0 nova_compute[192810]: 2025-09-30 21:38:25.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:25 compute-0 nova_compute[192810]: 2025-09-30 21:38:25.455 2 DEBUG oslo_concurrency.lockutils [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "e9e8dea0-93b5-463c-92da-bdac362c3cd9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:25 compute-0 nova_compute[192810]: 2025-09-30 21:38:25.456 2 DEBUG oslo_concurrency.lockutils [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "e9e8dea0-93b5-463c-92da-bdac362c3cd9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:25 compute-0 nova_compute[192810]: 2025-09-30 21:38:25.481 2 DEBUG nova.compute.manager [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:38:25 compute-0 nova_compute[192810]: 2025-09-30 21:38:25.573 2 DEBUG oslo_concurrency.lockutils [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:25 compute-0 nova_compute[192810]: 2025-09-30 21:38:25.573 2 DEBUG oslo_concurrency.lockutils [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:25 compute-0 nova_compute[192810]: 2025-09-30 21:38:25.578 2 DEBUG nova.virt.hardware [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:38:25 compute-0 nova_compute[192810]: 2025-09-30 21:38:25.578 2 INFO nova.compute.claims [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:38:25 compute-0 nova_compute[192810]: 2025-09-30 21:38:25.771 2 DEBUG nova.compute.provider_tree [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:38:25 compute-0 nova_compute[192810]: 2025-09-30 21:38:25.783 2 DEBUG nova.scheduler.client.report [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:38:25 compute-0 nova_compute[192810]: 2025-09-30 21:38:25.841 2 DEBUG oslo_concurrency.lockutils [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.268s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:25 compute-0 nova_compute[192810]: 2025-09-30 21:38:25.843 2 DEBUG nova.compute.manager [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:38:25 compute-0 nova_compute[192810]: 2025-09-30 21:38:25.906 2 DEBUG nova.compute.manager [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:38:25 compute-0 nova_compute[192810]: 2025-09-30 21:38:25.907 2 DEBUG nova.network.neutron [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:38:25 compute-0 nova_compute[192810]: 2025-09-30 21:38:25.931 2 INFO nova.virt.libvirt.driver [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:38:25 compute-0 nova_compute[192810]: 2025-09-30 21:38:25.947 2 DEBUG nova.compute.manager [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:38:26 compute-0 nova_compute[192810]: 2025-09-30 21:38:26.137 2 DEBUG nova.compute.manager [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:38:26 compute-0 nova_compute[192810]: 2025-09-30 21:38:26.138 2 DEBUG nova.virt.libvirt.driver [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:38:26 compute-0 nova_compute[192810]: 2025-09-30 21:38:26.139 2 INFO nova.virt.libvirt.driver [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Creating image(s)
Sep 30 21:38:26 compute-0 nova_compute[192810]: 2025-09-30 21:38:26.139 2 DEBUG oslo_concurrency.lockutils [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "/var/lib/nova/instances/e9e8dea0-93b5-463c-92da-bdac362c3cd9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:26 compute-0 nova_compute[192810]: 2025-09-30 21:38:26.139 2 DEBUG oslo_concurrency.lockutils [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "/var/lib/nova/instances/e9e8dea0-93b5-463c-92da-bdac362c3cd9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:26 compute-0 nova_compute[192810]: 2025-09-30 21:38:26.140 2 DEBUG oslo_concurrency.lockutils [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "/var/lib/nova/instances/e9e8dea0-93b5-463c-92da-bdac362c3cd9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:26 compute-0 nova_compute[192810]: 2025-09-30 21:38:26.153 2 DEBUG oslo_concurrency.processutils [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:38:26 compute-0 nova_compute[192810]: 2025-09-30 21:38:26.206 2 DEBUG oslo_concurrency.processutils [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:38:26 compute-0 nova_compute[192810]: 2025-09-30 21:38:26.207 2 DEBUG oslo_concurrency.lockutils [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:26 compute-0 nova_compute[192810]: 2025-09-30 21:38:26.207 2 DEBUG oslo_concurrency.lockutils [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:26 compute-0 nova_compute[192810]: 2025-09-30 21:38:26.217 2 DEBUG oslo_concurrency.processutils [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:38:26 compute-0 nova_compute[192810]: 2025-09-30 21:38:26.267 2 DEBUG oslo_concurrency.processutils [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:38:26 compute-0 nova_compute[192810]: 2025-09-30 21:38:26.268 2 DEBUG oslo_concurrency.processutils [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/e9e8dea0-93b5-463c-92da-bdac362c3cd9/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:38:26 compute-0 nova_compute[192810]: 2025-09-30 21:38:26.351 2 DEBUG nova.policy [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '30d0a975d78c4d9a8e2201afdc040092', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8ad754242d964bb487a2174b2c21bcc5', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:38:26 compute-0 nova_compute[192810]: 2025-09-30 21:38:26.376 2 DEBUG oslo_concurrency.processutils [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/e9e8dea0-93b5-463c-92da-bdac362c3cd9/disk 1073741824" returned: 0 in 0.108s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:38:26 compute-0 nova_compute[192810]: 2025-09-30 21:38:26.377 2 DEBUG oslo_concurrency.lockutils [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.169s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:26 compute-0 nova_compute[192810]: 2025-09-30 21:38:26.377 2 DEBUG oslo_concurrency.processutils [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:38:26 compute-0 nova_compute[192810]: 2025-09-30 21:38:26.430 2 DEBUG oslo_concurrency.processutils [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:38:26 compute-0 nova_compute[192810]: 2025-09-30 21:38:26.431 2 DEBUG nova.virt.disk.api [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Checking if we can resize image /var/lib/nova/instances/e9e8dea0-93b5-463c-92da-bdac362c3cd9/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:38:26 compute-0 nova_compute[192810]: 2025-09-30 21:38:26.431 2 DEBUG oslo_concurrency.processutils [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e9e8dea0-93b5-463c-92da-bdac362c3cd9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:38:26 compute-0 nova_compute[192810]: 2025-09-30 21:38:26.487 2 DEBUG oslo_concurrency.processutils [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e9e8dea0-93b5-463c-92da-bdac362c3cd9/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:38:26 compute-0 nova_compute[192810]: 2025-09-30 21:38:26.488 2 DEBUG nova.virt.disk.api [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Cannot resize image /var/lib/nova/instances/e9e8dea0-93b5-463c-92da-bdac362c3cd9/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:38:26 compute-0 nova_compute[192810]: 2025-09-30 21:38:26.489 2 DEBUG nova.objects.instance [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lazy-loading 'migration_context' on Instance uuid e9e8dea0-93b5-463c-92da-bdac362c3cd9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:38:26 compute-0 nova_compute[192810]: 2025-09-30 21:38:26.515 2 DEBUG nova.virt.libvirt.driver [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:38:26 compute-0 nova_compute[192810]: 2025-09-30 21:38:26.516 2 DEBUG nova.virt.libvirt.driver [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Ensure instance console log exists: /var/lib/nova/instances/e9e8dea0-93b5-463c-92da-bdac362c3cd9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:38:26 compute-0 nova_compute[192810]: 2025-09-30 21:38:26.516 2 DEBUG oslo_concurrency.lockutils [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:26 compute-0 nova_compute[192810]: 2025-09-30 21:38:26.516 2 DEBUG oslo_concurrency.lockutils [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:26 compute-0 nova_compute[192810]: 2025-09-30 21:38:26.517 2 DEBUG oslo_concurrency.lockutils [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:27 compute-0 nova_compute[192810]: 2025-09-30 21:38:27.385 2 DEBUG nova.network.neutron [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Successfully created port: 148524e4-e92e-4e58-8b7c-ae7abeb61875 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:38:27 compute-0 nova_compute[192810]: 2025-09-30 21:38:27.663 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:29 compute-0 nova_compute[192810]: 2025-09-30 21:38:29.511 2 DEBUG nova.network.neutron [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Successfully updated port: 148524e4-e92e-4e58-8b7c-ae7abeb61875 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:38:29 compute-0 nova_compute[192810]: 2025-09-30 21:38:29.525 2 DEBUG oslo_concurrency.lockutils [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "refresh_cache-e9e8dea0-93b5-463c-92da-bdac362c3cd9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:38:29 compute-0 nova_compute[192810]: 2025-09-30 21:38:29.525 2 DEBUG oslo_concurrency.lockutils [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquired lock "refresh_cache-e9e8dea0-93b5-463c-92da-bdac362c3cd9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:38:29 compute-0 nova_compute[192810]: 2025-09-30 21:38:29.526 2 DEBUG nova.network.neutron [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:38:29 compute-0 nova_compute[192810]: 2025-09-30 21:38:29.658 2 DEBUG nova.compute.manager [req-a8c2e21d-2f92-4aa1-8f78-1d9e7e07ca2e req-ba3fc134-d22f-4664-8c52-0485ead8093b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Received event network-changed-148524e4-e92e-4e58-8b7c-ae7abeb61875 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:38:29 compute-0 nova_compute[192810]: 2025-09-30 21:38:29.658 2 DEBUG nova.compute.manager [req-a8c2e21d-2f92-4aa1-8f78-1d9e7e07ca2e req-ba3fc134-d22f-4664-8c52-0485ead8093b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Refreshing instance network info cache due to event network-changed-148524e4-e92e-4e58-8b7c-ae7abeb61875. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:38:29 compute-0 nova_compute[192810]: 2025-09-30 21:38:29.659 2 DEBUG oslo_concurrency.lockutils [req-a8c2e21d-2f92-4aa1-8f78-1d9e7e07ca2e req-ba3fc134-d22f-4664-8c52-0485ead8093b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-e9e8dea0-93b5-463c-92da-bdac362c3cd9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:38:29 compute-0 nova_compute[192810]: 2025-09-30 21:38:29.795 2 DEBUG nova.network.neutron [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:38:30 compute-0 nova_compute[192810]: 2025-09-30 21:38:30.241 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:30.241 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:38:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:30.242 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:38:30 compute-0 nova_compute[192810]: 2025-09-30 21:38:30.265 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:30 compute-0 podman[238146]: 2025-09-30 21:38:30.368572059 +0000 UTC m=+0.094813084 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=ovn_metadata_agent)
Sep 30 21:38:30 compute-0 podman[238145]: 2025-09-30 21:38:30.407572971 +0000 UTC m=+0.137064427 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:38:30 compute-0 podman[238147]: 2025-09-30 21:38:30.413502099 +0000 UTC m=+0.123684774 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20250923)
Sep 30 21:38:30 compute-0 ovn_controller[94912]: 2025-09-30T21:38:30Z|00444|memory_trim|INFO|Detected inactivity (last active 30000 ms ago): trimming memory
Sep 30 21:38:31 compute-0 nova_compute[192810]: 2025-09-30 21:38:31.700 2 DEBUG nova.network.neutron [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Updating instance_info_cache with network_info: [{"id": "148524e4-e92e-4e58-8b7c-ae7abeb61875", "address": "fa:16:3e:29:f1:48", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap148524e4-e9", "ovs_interfaceid": "148524e4-e92e-4e58-8b7c-ae7abeb61875", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:38:31 compute-0 nova_compute[192810]: 2025-09-30 21:38:31.778 2 DEBUG oslo_concurrency.lockutils [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Releasing lock "refresh_cache-e9e8dea0-93b5-463c-92da-bdac362c3cd9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:38:31 compute-0 nova_compute[192810]: 2025-09-30 21:38:31.778 2 DEBUG nova.compute.manager [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Instance network_info: |[{"id": "148524e4-e92e-4e58-8b7c-ae7abeb61875", "address": "fa:16:3e:29:f1:48", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap148524e4-e9", "ovs_interfaceid": "148524e4-e92e-4e58-8b7c-ae7abeb61875", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:38:31 compute-0 nova_compute[192810]: 2025-09-30 21:38:31.780 2 DEBUG oslo_concurrency.lockutils [req-a8c2e21d-2f92-4aa1-8f78-1d9e7e07ca2e req-ba3fc134-d22f-4664-8c52-0485ead8093b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-e9e8dea0-93b5-463c-92da-bdac362c3cd9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:38:31 compute-0 nova_compute[192810]: 2025-09-30 21:38:31.780 2 DEBUG nova.network.neutron [req-a8c2e21d-2f92-4aa1-8f78-1d9e7e07ca2e req-ba3fc134-d22f-4664-8c52-0485ead8093b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Refreshing network info cache for port 148524e4-e92e-4e58-8b7c-ae7abeb61875 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:38:31 compute-0 nova_compute[192810]: 2025-09-30 21:38:31.785 2 DEBUG nova.virt.libvirt.driver [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Start _get_guest_xml network_info=[{"id": "148524e4-e92e-4e58-8b7c-ae7abeb61875", "address": "fa:16:3e:29:f1:48", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap148524e4-e9", "ovs_interfaceid": "148524e4-e92e-4e58-8b7c-ae7abeb61875", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:38:31 compute-0 nova_compute[192810]: 2025-09-30 21:38:31.790 2 WARNING nova.virt.libvirt.driver [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:38:31 compute-0 nova_compute[192810]: 2025-09-30 21:38:31.796 2 DEBUG nova.virt.libvirt.host [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:38:31 compute-0 nova_compute[192810]: 2025-09-30 21:38:31.797 2 DEBUG nova.virt.libvirt.host [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:38:31 compute-0 nova_compute[192810]: 2025-09-30 21:38:31.800 2 DEBUG nova.virt.libvirt.host [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:38:31 compute-0 nova_compute[192810]: 2025-09-30 21:38:31.801 2 DEBUG nova.virt.libvirt.host [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:38:31 compute-0 nova_compute[192810]: 2025-09-30 21:38:31.801 2 DEBUG nova.virt.libvirt.driver [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:38:31 compute-0 nova_compute[192810]: 2025-09-30 21:38:31.802 2 DEBUG nova.virt.hardware [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:38:31 compute-0 nova_compute[192810]: 2025-09-30 21:38:31.802 2 DEBUG nova.virt.hardware [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:38:31 compute-0 nova_compute[192810]: 2025-09-30 21:38:31.802 2 DEBUG nova.virt.hardware [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:38:31 compute-0 nova_compute[192810]: 2025-09-30 21:38:31.802 2 DEBUG nova.virt.hardware [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:38:31 compute-0 nova_compute[192810]: 2025-09-30 21:38:31.803 2 DEBUG nova.virt.hardware [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:38:31 compute-0 nova_compute[192810]: 2025-09-30 21:38:31.803 2 DEBUG nova.virt.hardware [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:38:31 compute-0 nova_compute[192810]: 2025-09-30 21:38:31.803 2 DEBUG nova.virt.hardware [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:38:31 compute-0 nova_compute[192810]: 2025-09-30 21:38:31.803 2 DEBUG nova.virt.hardware [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:38:31 compute-0 nova_compute[192810]: 2025-09-30 21:38:31.803 2 DEBUG nova.virt.hardware [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:38:31 compute-0 nova_compute[192810]: 2025-09-30 21:38:31.803 2 DEBUG nova.virt.hardware [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:38:31 compute-0 nova_compute[192810]: 2025-09-30 21:38:31.804 2 DEBUG nova.virt.hardware [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
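[editor's note] The topology-selection lines above (preferred 0:0:0, limits of 65536 each, one vCPU) follow a simple enumeration: every sockets/cores/threads triple whose product equals the vCPU count and that stays within the limits is a candidate. The sketch below is a hypothetical simplification of that logic, not nova's actual `_get_possible_cpu_topologies` code.

```python
# Simplified, illustrative sketch of nova's CPU-topology enumeration:
# enumerate sockets/cores/threads triples whose product equals the
# flavor's vCPU count, bounded by the (very large) default limits.
from collections import namedtuple

VirtCPUTopology = namedtuple("VirtCPUTopology", "sockets cores threads")

def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
    found = []
    for s in range(1, min(vcpus, max_sockets) + 1):
        for c in range(1, min(vcpus, max_cores) + 1):
            for t in range(1, min(vcpus, max_threads) + 1):
                if s * c * t == vcpus:
                    found.append(VirtCPUTopology(s, c, t))
    return found

# A 1-vCPU m1.nano flavor yields exactly one topology, matching the log.
print(possible_topologies(1))  # [VirtCPUTopology(sockets=1, cores=1, threads=1)]
```

This is why the log reports "Got 1 possible topologies" for this flavor: with one vCPU, 1:1:1 is the only factorization.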
Sep 30 21:38:31 compute-0 nova_compute[192810]: 2025-09-30 21:38:31.807 2 DEBUG nova.virt.libvirt.vif [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=2001:2001::3,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:38:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-534496747',display_name='tempest-ServersTestJSON-server-534496747',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-534496747',id=121,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8ad754242d964bb487a2174b2c21bcc5',ramdisk_id='',reservation_id='r-2mmq2way',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-782690373',owner_user_name='tempest-ServersTestJSON-782690373-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:38:25Z,user_data=None,user_id='30d0a975d78c4d9a8e2201afdc040092',uuid=e9e8dea0-93b5-463c-92da-bdac362c3cd9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "148524e4-e92e-4e58-8b7c-ae7abeb61875", "address": "fa:16:3e:29:f1:48", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap148524e4-e9", "ovs_interfaceid": "148524e4-e92e-4e58-8b7c-ae7abeb61875", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:38:31 compute-0 nova_compute[192810]: 2025-09-30 21:38:31.807 2 DEBUG nova.network.os_vif_util [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Converting VIF {"id": "148524e4-e92e-4e58-8b7c-ae7abeb61875", "address": "fa:16:3e:29:f1:48", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap148524e4-e9", "ovs_interfaceid": "148524e4-e92e-4e58-8b7c-ae7abeb61875", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:38:31 compute-0 nova_compute[192810]: 2025-09-30 21:38:31.808 2 DEBUG nova.network.os_vif_util [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:29:f1:48,bridge_name='br-int',has_traffic_filtering=True,id=148524e4-e92e-4e58-8b7c-ae7abeb61875,network=Network(27086519-6f4c-45f9-8e5b-5b321cd6871c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap148524e4-e9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:38:31 compute-0 nova_compute[192810]: 2025-09-30 21:38:31.808 2 DEBUG nova.objects.instance [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lazy-loading 'pci_devices' on Instance uuid e9e8dea0-93b5-463c-92da-bdac362c3cd9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:38:31 compute-0 nova_compute[192810]: 2025-09-30 21:38:31.825 2 DEBUG nova.virt.libvirt.driver [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:38:31 compute-0 nova_compute[192810]:   <uuid>e9e8dea0-93b5-463c-92da-bdac362c3cd9</uuid>
Sep 30 21:38:31 compute-0 nova_compute[192810]:   <name>instance-00000079</name>
Sep 30 21:38:31 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:38:31 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:38:31 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:38:31 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:       <nova:name>tempest-ServersTestJSON-server-534496747</nova:name>
Sep 30 21:38:31 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:38:31</nova:creationTime>
Sep 30 21:38:31 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:38:31 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:38:31 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:38:31 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:38:31 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:38:31 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:38:31 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:38:31 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:38:31 compute-0 nova_compute[192810]:         <nova:user uuid="30d0a975d78c4d9a8e2201afdc040092">tempest-ServersTestJSON-782690373-project-member</nova:user>
Sep 30 21:38:31 compute-0 nova_compute[192810]:         <nova:project uuid="8ad754242d964bb487a2174b2c21bcc5">tempest-ServersTestJSON-782690373</nova:project>
Sep 30 21:38:31 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:38:31 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:38:31 compute-0 nova_compute[192810]:         <nova:port uuid="148524e4-e92e-4e58-8b7c-ae7abeb61875">
Sep 30 21:38:31 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:38:31 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:38:31 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:38:31 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:38:31 compute-0 nova_compute[192810]:     <system>
Sep 30 21:38:31 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:38:31 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:38:31 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:38:31 compute-0 nova_compute[192810]:       <entry name="serial">e9e8dea0-93b5-463c-92da-bdac362c3cd9</entry>
Sep 30 21:38:31 compute-0 nova_compute[192810]:       <entry name="uuid">e9e8dea0-93b5-463c-92da-bdac362c3cd9</entry>
Sep 30 21:38:31 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     </system>
Sep 30 21:38:31 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:38:31 compute-0 nova_compute[192810]:   <os>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:   </os>
Sep 30 21:38:31 compute-0 nova_compute[192810]:   <features>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:   </features>
Sep 30 21:38:31 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:38:31 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:38:31 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:38:31 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:38:31 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:38:31 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/e9e8dea0-93b5-463c-92da-bdac362c3cd9/disk"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:38:31 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/e9e8dea0-93b5-463c-92da-bdac362c3cd9/disk.config"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:38:31 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:29:f1:48"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:       <target dev="tap148524e4-e9"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:38:31 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/e9e8dea0-93b5-463c-92da-bdac362c3cd9/console.log" append="off"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     <video>
Sep 30 21:38:31 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     </video>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:38:31 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:38:31 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:38:31 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:38:31 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:38:31 compute-0 nova_compute[192810]: </domain>
Sep 30 21:38:31 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
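[editor's note] The `_get_guest_xml` dump above can be sanity-checked with the standard library's XML parser. The snippet below parses a trimmed excerpt of the logged domain XML (the excerpt and field choices are illustrative; only the values shown come from the log).

```python
import xml.etree.ElementTree as ET

# Trimmed excerpt of the <domain> XML emitted by _get_guest_xml above.
DOMAIN_XML = """
<domain type="kvm">
  <uuid>e9e8dea0-93b5-463c-92da-bdac362c3cd9</uuid>
  <name>instance-00000079</name>
  <memory>131072</memory>
  <vcpu>1</vcpu>
  <os>
    <type arch="x86_64" machine="q35">hvm</type>
    <boot dev="hd"/>
  </os>
  <cpu mode="custom" match="exact">
    <model>Nehalem</model>
    <topology sockets="1" cores="1" threads="1"/>
  </cpu>
</domain>
"""

root = ET.fromstring(DOMAIN_XML)
# The flavor's 128 MiB of memory appears as 131072 KiB in the domain XML.
assert root.findtext("memory") == "131072"
assert root.findtext("vcpu") == "1"
assert root.find("cpu/model").text == "Nehalem"
print(root.get("type"), root.findtext("name"))  # kvm instance-00000079
```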
Sep 30 21:38:31 compute-0 nova_compute[192810]: 2025-09-30 21:38:31.826 2 DEBUG nova.compute.manager [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Preparing to wait for external event network-vif-plugged-148524e4-e92e-4e58-8b7c-ae7abeb61875 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:38:31 compute-0 nova_compute[192810]: 2025-09-30 21:38:31.826 2 DEBUG oslo_concurrency.lockutils [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "e9e8dea0-93b5-463c-92da-bdac362c3cd9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:31 compute-0 nova_compute[192810]: 2025-09-30 21:38:31.826 2 DEBUG oslo_concurrency.lockutils [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "e9e8dea0-93b5-463c-92da-bdac362c3cd9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:31 compute-0 nova_compute[192810]: 2025-09-30 21:38:31.826 2 DEBUG oslo_concurrency.lockutils [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "e9e8dea0-93b5-463c-92da-bdac362c3cd9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:31 compute-0 nova_compute[192810]: 2025-09-30 21:38:31.827 2 DEBUG nova.virt.libvirt.vif [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=2001:2001::3,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:38:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-534496747',display_name='tempest-ServersTestJSON-server-534496747',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-534496747',id=121,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8ad754242d964bb487a2174b2c21bcc5',ramdisk_id='',reservation_id='r-2mmq2way',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-782690373',owner_user_name='tempest-ServersTestJSON-782690373-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:38:25Z,user_data=None,user_id='30d0a975d78c4d9a8e2201afdc040092',uuid=e9e8dea0-93b5-463c-92da-bdac362c3cd9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "148524e4-e92e-4e58-8b7c-ae7abeb61875", "address": "fa:16:3e:29:f1:48", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap148524e4-e9", "ovs_interfaceid": "148524e4-e92e-4e58-8b7c-ae7abeb61875", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:38:31 compute-0 nova_compute[192810]: 2025-09-30 21:38:31.827 2 DEBUG nova.network.os_vif_util [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Converting VIF {"id": "148524e4-e92e-4e58-8b7c-ae7abeb61875", "address": "fa:16:3e:29:f1:48", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap148524e4-e9", "ovs_interfaceid": "148524e4-e92e-4e58-8b7c-ae7abeb61875", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:38:31 compute-0 nova_compute[192810]: 2025-09-30 21:38:31.828 2 DEBUG nova.network.os_vif_util [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:29:f1:48,bridge_name='br-int',has_traffic_filtering=True,id=148524e4-e92e-4e58-8b7c-ae7abeb61875,network=Network(27086519-6f4c-45f9-8e5b-5b321cd6871c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap148524e4-e9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:38:31 compute-0 nova_compute[192810]: 2025-09-30 21:38:31.828 2 DEBUG os_vif [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:29:f1:48,bridge_name='br-int',has_traffic_filtering=True,id=148524e4-e92e-4e58-8b7c-ae7abeb61875,network=Network(27086519-6f4c-45f9-8e5b-5b321cd6871c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap148524e4-e9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:38:31 compute-0 nova_compute[192810]: 2025-09-30 21:38:31.828 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:31 compute-0 nova_compute[192810]: 2025-09-30 21:38:31.829 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:38:31 compute-0 nova_compute[192810]: 2025-09-30 21:38:31.829 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:38:31 compute-0 nova_compute[192810]: 2025-09-30 21:38:31.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:31 compute-0 nova_compute[192810]: 2025-09-30 21:38:31.833 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap148524e4-e9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:38:31 compute-0 nova_compute[192810]: 2025-09-30 21:38:31.833 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap148524e4-e9, col_values=(('external_ids', {'iface-id': '148524e4-e92e-4e58-8b7c-ae7abeb61875', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:29:f1:48', 'vm-uuid': 'e9e8dea0-93b5-463c-92da-bdac362c3cd9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:38:31 compute-0 nova_compute[192810]: 2025-09-30 21:38:31.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:31 compute-0 NetworkManager[51733]: <info>  [1759268311.8357] manager: (tap148524e4-e9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/195)
Sep 30 21:38:31 compute-0 nova_compute[192810]: 2025-09-30 21:38:31.837 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:38:31 compute-0 nova_compute[192810]: 2025-09-30 21:38:31.841 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:31 compute-0 nova_compute[192810]: 2025-09-30 21:38:31.843 2 INFO os_vif [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:29:f1:48,bridge_name='br-int',has_traffic_filtering=True,id=148524e4-e92e-4e58-8b7c-ae7abeb61875,network=Network(27086519-6f4c-45f9-8e5b-5b321cd6871c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap148524e4-e9')
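[editor's note] The two ovsdbapp transactions above (AddPortCommand, then DbSetCommand on the Interface row) attach the tap device to br-int and tag it with the external_ids OVN uses to bind the port. A rough `ovs-vsctl` equivalent can be sketched as below; the helper name and structure are illustrative, not os-vif's API, and the function only builds the argv list.

```python
# Illustrative helper mirroring the AddPortCommand + DbSetCommand
# transaction logged above as a single ovs-vsctl invocation. It only
# constructs the command line; it does not talk to OVS.
def build_plug_command(bridge, port, iface_id, mac, vm_uuid):
    external_ids = {
        "iface-id": iface_id,        # lets ovn-controller match the Neutron port
        "iface-status": "active",
        "attached-mac": mac,
        "vm-uuid": vm_uuid,
    }
    cmd = ["ovs-vsctl", "--may-exist", "add-port", bridge, port,
           "--", "set", "Interface", port]
    cmd += [f"external_ids:{k}={v}" for k, v in external_ids.items()]
    return cmd

argv = build_plug_command(
    "br-int", "tap148524e4-e9",
    "148524e4-e92e-4e58-8b7c-ae7abeb61875",
    "fa:16:3e:29:f1:48",
    "e9e8dea0-93b5-463c-92da-bdac362c3cd9",
)
print(" ".join(argv))
```

Once the Interface row carries `iface-id`, ovn-controller on the chassis claims the port, which is what later produces the network-vif-plugged event nova is waiting for.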
Sep 30 21:38:31 compute-0 nova_compute[192810]: 2025-09-30 21:38:31.924 2 DEBUG nova.virt.libvirt.driver [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:38:31 compute-0 nova_compute[192810]: 2025-09-30 21:38:31.924 2 DEBUG nova.virt.libvirt.driver [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:38:31 compute-0 nova_compute[192810]: 2025-09-30 21:38:31.924 2 DEBUG nova.virt.libvirt.driver [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] No VIF found with MAC fa:16:3e:29:f1:48, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:38:31 compute-0 nova_compute[192810]: 2025-09-30 21:38:31.925 2 INFO nova.virt.libvirt.driver [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Using config drive
Sep 30 21:38:32 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:32.244 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:38:32 compute-0 nova_compute[192810]: 2025-09-30 21:38:32.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:33 compute-0 nova_compute[192810]: 2025-09-30 21:38:33.065 2 INFO nova.virt.libvirt.driver [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Creating config drive at /var/lib/nova/instances/e9e8dea0-93b5-463c-92da-bdac362c3cd9/disk.config
Sep 30 21:38:33 compute-0 nova_compute[192810]: 2025-09-30 21:38:33.070 2 DEBUG oslo_concurrency.processutils [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e9e8dea0-93b5-463c-92da-bdac362c3cd9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4f8jdo7b execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:38:33 compute-0 nova_compute[192810]: 2025-09-30 21:38:33.193 2 DEBUG oslo_concurrency.processutils [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e9e8dea0-93b5-463c-92da-bdac362c3cd9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4f8jdo7b" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:38:33 compute-0 kernel: tap148524e4-e9: entered promiscuous mode
Sep 30 21:38:33 compute-0 NetworkManager[51733]: <info>  [1759268313.2443] manager: (tap148524e4-e9): new Tun device (/org/freedesktop/NetworkManager/Devices/196)
Sep 30 21:38:33 compute-0 ovn_controller[94912]: 2025-09-30T21:38:33Z|00445|binding|INFO|Claiming lport 148524e4-e92e-4e58-8b7c-ae7abeb61875 for this chassis.
Sep 30 21:38:33 compute-0 ovn_controller[94912]: 2025-09-30T21:38:33Z|00446|binding|INFO|148524e4-e92e-4e58-8b7c-ae7abeb61875: Claiming fa:16:3e:29:f1:48 10.100.0.11
Sep 30 21:38:33 compute-0 nova_compute[192810]: 2025-09-30 21:38:33.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:33 compute-0 nova_compute[192810]: 2025-09-30 21:38:33.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:33 compute-0 nova_compute[192810]: 2025-09-30 21:38:33.253 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:33.264 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:29:f1:48 10.100.0.11'], port_security=['fa:16:3e:29:f1:48 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'e9e8dea0-93b5-463c-92da-bdac362c3cd9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8ad754242d964bb487a2174b2c21bcc5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9c41899e-24c3-4632-81c5-100a69d8be81', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f4d6c701-a212-4977-9c52-b553d410c9c7, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=148524e4-e92e-4e58-8b7c-ae7abeb61875) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:33.265 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 148524e4-e92e-4e58-8b7c-ae7abeb61875 in datapath 27086519-6f4c-45f9-8e5b-5b321cd6871c bound to our chassis
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:33.266 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 27086519-6f4c-45f9-8e5b-5b321cd6871c
Sep 30 21:38:33 compute-0 systemd-udevd[238228]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:33.277 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[1c4608d1-7a65-4604-9cd7-4fcfa1019a46]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:33 compute-0 systemd-machined[152794]: New machine qemu-57-instance-00000079.
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:33.278 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap27086519-61 in ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:33.280 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap27086519-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:33.280 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[9b8888ba-441f-4d57-98eb-63fc7a57878f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:33.282 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[05f8b3a1-402d-4bcc-a6cd-bab49a80c41e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:33 compute-0 NetworkManager[51733]: <info>  [1759268313.2893] device (tap148524e4-e9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:38:33 compute-0 NetworkManager[51733]: <info>  [1759268313.2902] device (tap148524e4-e9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:33.291 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[2bf357cd-99d5-45d2-90b7-6da270266d18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:33.308 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e274f446-c0d4-4c2e-a816-9376ee9a2b86]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:33 compute-0 nova_compute[192810]: 2025-09-30 21:38:33.309 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:33 compute-0 ovn_controller[94912]: 2025-09-30T21:38:33Z|00447|binding|INFO|Setting lport 148524e4-e92e-4e58-8b7c-ae7abeb61875 ovn-installed in OVS
Sep 30 21:38:33 compute-0 ovn_controller[94912]: 2025-09-30T21:38:33Z|00448|binding|INFO|Setting lport 148524e4-e92e-4e58-8b7c-ae7abeb61875 up in Southbound
Sep 30 21:38:33 compute-0 nova_compute[192810]: 2025-09-30 21:38:33.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:33 compute-0 systemd[1]: Started Virtual Machine qemu-57-instance-00000079.
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:33.337 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[26f32e67-40d5-4d4e-8c24-a9a80e866846]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:33 compute-0 systemd-udevd[238231]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:38:33 compute-0 NetworkManager[51733]: <info>  [1759268313.3427] manager: (tap27086519-60): new Veth device (/org/freedesktop/NetworkManager/Devices/197)
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:33.341 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[282e5799-f60a-4065-9311-3027aea05a84]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:33.376 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[208e8184-9d3a-4529-af1c-b67edce62146]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:33.379 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[f75200f7-3a79-4ad5-882e-f105b26e91db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:33 compute-0 NetworkManager[51733]: <info>  [1759268313.3984] device (tap27086519-60): carrier: link connected
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:33.402 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[3ba0be1f-b5d6-429d-b337-5c03af5b45b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:33.419 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[5fafd482-0a12-4a8d-abea-af2c11ca6da7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap27086519-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:da:b9:e3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 133], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 500902, 'reachable_time': 26548, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238260, 'error': None, 'target': 'ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:33.433 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[7785361a-d6f5-4560-b692-ea2b93fe4f9b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feda:b9e3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 500902, 'tstamp': 500902}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238261, 'error': None, 'target': 'ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:33.449 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[d39a7991-1769-4c1e-b054-eb23286a2182]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap27086519-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:da:b9:e3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 133], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 500902, 'reachable_time': 26548, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 238262, 'error': None, 'target': 'ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:33.478 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[06d29dc4-9b21-4fc3-8fee-b9fa96ea1aa7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:33.534 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[9965de78-abc4-4cc0-8c9a-33b0f3a14282]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:33.536 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap27086519-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:33.536 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:33.537 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap27086519-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:38:33 compute-0 nova_compute[192810]: 2025-09-30 21:38:33.590 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:33 compute-0 kernel: tap27086519-60: entered promiscuous mode
Sep 30 21:38:33 compute-0 NetworkManager[51733]: <info>  [1759268313.5914] manager: (tap27086519-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/198)
Sep 30 21:38:33 compute-0 nova_compute[192810]: 2025-09-30 21:38:33.592 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:33.595 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap27086519-60, col_values=(('external_ids', {'iface-id': 'f2abb4ad-797b-4767-b8bc-377990516394'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:38:33 compute-0 nova_compute[192810]: 2025-09-30 21:38:33.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:33 compute-0 ovn_controller[94912]: 2025-09-30T21:38:33Z|00449|binding|INFO|Releasing lport f2abb4ad-797b-4767-b8bc-377990516394 from this chassis (sb_readonly=0)
Sep 30 21:38:33 compute-0 nova_compute[192810]: 2025-09-30 21:38:33.597 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:33.597 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/27086519-6f4c-45f9-8e5b-5b321cd6871c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/27086519-6f4c-45f9-8e5b-5b321cd6871c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:33.598 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[44c90445-40b6-4f72-9289-11d0b419aab3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:33.599 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-27086519-6f4c-45f9-8e5b-5b321cd6871c
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/27086519-6f4c-45f9-8e5b-5b321cd6871c.pid.haproxy
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID 27086519-6f4c-45f9-8e5b-5b321cd6871c
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:38:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:33.599 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'env', 'PROCESS_TAG=haproxy-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/27086519-6f4c-45f9-8e5b-5b321cd6871c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:38:33 compute-0 nova_compute[192810]: 2025-09-30 21:38:33.610 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:34 compute-0 podman[238301]: 2025-09-30 21:38:33.939086463 +0000 UTC m=+0.022585144 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:38:34 compute-0 podman[238301]: 2025-09-30 21:38:34.273808145 +0000 UTC m=+0.357306806 container create 4dd372c03fcc21551d62b9ca245ee285211d998bf3b44c8e40366ad4cc7a5a58 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:38:34 compute-0 nova_compute[192810]: 2025-09-30 21:38:34.278 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268314.277013, e9e8dea0-93b5-463c-92da-bdac362c3cd9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:38:34 compute-0 nova_compute[192810]: 2025-09-30 21:38:34.278 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] VM Started (Lifecycle Event)
Sep 30 21:38:34 compute-0 nova_compute[192810]: 2025-09-30 21:38:34.281 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268299.280184, 9aa3f231-a017-464a-94ed-54988ab74c72 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:38:34 compute-0 nova_compute[192810]: 2025-09-30 21:38:34.282 2 INFO nova.compute.manager [-] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] VM Stopped (Lifecycle Event)
Sep 30 21:38:34 compute-0 nova_compute[192810]: 2025-09-30 21:38:34.304 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:38:34 compute-0 nova_compute[192810]: 2025-09-30 21:38:34.308 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268314.2776623, e9e8dea0-93b5-463c-92da-bdac362c3cd9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:38:34 compute-0 nova_compute[192810]: 2025-09-30 21:38:34.308 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] VM Paused (Lifecycle Event)
Sep 30 21:38:34 compute-0 nova_compute[192810]: 2025-09-30 21:38:34.311 2 DEBUG nova.compute.manager [None req-7b7f7efe-390d-43c5-9715-044dda544993 - - - - - -] [instance: 9aa3f231-a017-464a-94ed-54988ab74c72] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:38:34 compute-0 nova_compute[192810]: 2025-09-30 21:38:34.341 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:38:34 compute-0 nova_compute[192810]: 2025-09-30 21:38:34.344 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:38:34 compute-0 nova_compute[192810]: 2025-09-30 21:38:34.367 2 DEBUG nova.compute.manager [req-1f4e23dd-0afe-4f71-ab83-a5eb9e11b43e req-de708f0c-cb96-4383-a774-44274ad3b036 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Received event network-vif-plugged-148524e4-e92e-4e58-8b7c-ae7abeb61875 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:38:34 compute-0 nova_compute[192810]: 2025-09-30 21:38:34.368 2 DEBUG oslo_concurrency.lockutils [req-1f4e23dd-0afe-4f71-ab83-a5eb9e11b43e req-de708f0c-cb96-4383-a774-44274ad3b036 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "e9e8dea0-93b5-463c-92da-bdac362c3cd9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:34 compute-0 nova_compute[192810]: 2025-09-30 21:38:34.368 2 DEBUG oslo_concurrency.lockutils [req-1f4e23dd-0afe-4f71-ab83-a5eb9e11b43e req-de708f0c-cb96-4383-a774-44274ad3b036 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "e9e8dea0-93b5-463c-92da-bdac362c3cd9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:34 compute-0 nova_compute[192810]: 2025-09-30 21:38:34.368 2 DEBUG oslo_concurrency.lockutils [req-1f4e23dd-0afe-4f71-ab83-a5eb9e11b43e req-de708f0c-cb96-4383-a774-44274ad3b036 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "e9e8dea0-93b5-463c-92da-bdac362c3cd9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:34 compute-0 nova_compute[192810]: 2025-09-30 21:38:34.368 2 DEBUG nova.compute.manager [req-1f4e23dd-0afe-4f71-ab83-a5eb9e11b43e req-de708f0c-cb96-4383-a774-44274ad3b036 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Processing event network-vif-plugged-148524e4-e92e-4e58-8b7c-ae7abeb61875 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:38:34 compute-0 nova_compute[192810]: 2025-09-30 21:38:34.369 2 DEBUG nova.compute.manager [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:38:34 compute-0 nova_compute[192810]: 2025-09-30 21:38:34.372 2 DEBUG nova.virt.libvirt.driver [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:38:34 compute-0 nova_compute[192810]: 2025-09-30 21:38:34.375 2 INFO nova.virt.libvirt.driver [-] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Instance spawned successfully.
Sep 30 21:38:34 compute-0 nova_compute[192810]: 2025-09-30 21:38:34.375 2 DEBUG nova.virt.libvirt.driver [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:38:34 compute-0 nova_compute[192810]: 2025-09-30 21:38:34.383 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:38:34 compute-0 nova_compute[192810]: 2025-09-30 21:38:34.384 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268314.3717506, e9e8dea0-93b5-463c-92da-bdac362c3cd9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:38:34 compute-0 nova_compute[192810]: 2025-09-30 21:38:34.384 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] VM Resumed (Lifecycle Event)
Sep 30 21:38:34 compute-0 nova_compute[192810]: 2025-09-30 21:38:34.405 2 DEBUG nova.virt.libvirt.driver [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:38:34 compute-0 nova_compute[192810]: 2025-09-30 21:38:34.406 2 DEBUG nova.virt.libvirt.driver [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:38:34 compute-0 nova_compute[192810]: 2025-09-30 21:38:34.407 2 DEBUG nova.virt.libvirt.driver [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:38:34 compute-0 nova_compute[192810]: 2025-09-30 21:38:34.407 2 DEBUG nova.virt.libvirt.driver [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:38:34 compute-0 nova_compute[192810]: 2025-09-30 21:38:34.408 2 DEBUG nova.virt.libvirt.driver [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:38:34 compute-0 nova_compute[192810]: 2025-09-30 21:38:34.408 2 DEBUG nova.virt.libvirt.driver [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:38:34 compute-0 nova_compute[192810]: 2025-09-30 21:38:34.417 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:38:34 compute-0 nova_compute[192810]: 2025-09-30 21:38:34.420 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:38:34 compute-0 nova_compute[192810]: 2025-09-30 21:38:34.450 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:38:34 compute-0 systemd[1]: Started libpod-conmon-4dd372c03fcc21551d62b9ca245ee285211d998bf3b44c8e40366ad4cc7a5a58.scope.
Sep 30 21:38:34 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:38:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/491c7a0480db6f542f63ffd9280ca629c0fdb0792d591beac7bf51d09d8b4056/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:38:34 compute-0 nova_compute[192810]: 2025-09-30 21:38:34.548 2 INFO nova.compute.manager [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Took 8.41 seconds to spawn the instance on the hypervisor.
Sep 30 21:38:34 compute-0 nova_compute[192810]: 2025-09-30 21:38:34.549 2 DEBUG nova.compute.manager [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:38:34 compute-0 nova_compute[192810]: 2025-09-30 21:38:34.643 2 INFO nova.compute.manager [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Took 9.11 seconds to build instance.
Sep 30 21:38:34 compute-0 nova_compute[192810]: 2025-09-30 21:38:34.673 2 DEBUG oslo_concurrency.lockutils [None req-b705a22f-df23-4d5d-9290-1e26f259b03c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "e9e8dea0-93b5-463c-92da-bdac362c3cd9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.218s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:34 compute-0 podman[238301]: 2025-09-30 21:38:34.736389665 +0000 UTC m=+0.819888346 container init 4dd372c03fcc21551d62b9ca245ee285211d998bf3b44c8e40366ad4cc7a5a58 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:38:34 compute-0 podman[238301]: 2025-09-30 21:38:34.741941663 +0000 UTC m=+0.825440324 container start 4dd372c03fcc21551d62b9ca245ee285211d998bf3b44c8e40366ad4cc7a5a58 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Sep 30 21:38:34 compute-0 neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c[238316]: [NOTICE]   (238320) : New worker (238322) forked
Sep 30 21:38:34 compute-0 neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c[238316]: [NOTICE]   (238320) : Loading success.
Sep 30 21:38:35 compute-0 nova_compute[192810]: 2025-09-30 21:38:35.517 2 DEBUG nova.network.neutron [req-a8c2e21d-2f92-4aa1-8f78-1d9e7e07ca2e req-ba3fc134-d22f-4664-8c52-0485ead8093b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Updated VIF entry in instance network info cache for port 148524e4-e92e-4e58-8b7c-ae7abeb61875. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:38:35 compute-0 nova_compute[192810]: 2025-09-30 21:38:35.518 2 DEBUG nova.network.neutron [req-a8c2e21d-2f92-4aa1-8f78-1d9e7e07ca2e req-ba3fc134-d22f-4664-8c52-0485ead8093b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Updating instance_info_cache with network_info: [{"id": "148524e4-e92e-4e58-8b7c-ae7abeb61875", "address": "fa:16:3e:29:f1:48", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap148524e4-e9", "ovs_interfaceid": "148524e4-e92e-4e58-8b7c-ae7abeb61875", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:38:35 compute-0 nova_compute[192810]: 2025-09-30 21:38:35.539 2 DEBUG oslo_concurrency.lockutils [req-a8c2e21d-2f92-4aa1-8f78-1d9e7e07ca2e req-ba3fc134-d22f-4664-8c52-0485ead8093b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-e9e8dea0-93b5-463c-92da-bdac362c3cd9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:38:36 compute-0 nova_compute[192810]: 2025-09-30 21:38:36.837 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:37 compute-0 nova_compute[192810]: 2025-09-30 21:38:37.420 2 DEBUG nova.compute.manager [req-4423ba1b-0d69-403d-993d-84600feb97c1 req-d47caa2f-b54d-4487-982c-9b247c1bb7f7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Received event network-vif-plugged-148524e4-e92e-4e58-8b7c-ae7abeb61875 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:38:37 compute-0 nova_compute[192810]: 2025-09-30 21:38:37.421 2 DEBUG oslo_concurrency.lockutils [req-4423ba1b-0d69-403d-993d-84600feb97c1 req-d47caa2f-b54d-4487-982c-9b247c1bb7f7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "e9e8dea0-93b5-463c-92da-bdac362c3cd9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:37 compute-0 nova_compute[192810]: 2025-09-30 21:38:37.421 2 DEBUG oslo_concurrency.lockutils [req-4423ba1b-0d69-403d-993d-84600feb97c1 req-d47caa2f-b54d-4487-982c-9b247c1bb7f7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "e9e8dea0-93b5-463c-92da-bdac362c3cd9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:37 compute-0 nova_compute[192810]: 2025-09-30 21:38:37.421 2 DEBUG oslo_concurrency.lockutils [req-4423ba1b-0d69-403d-993d-84600feb97c1 req-d47caa2f-b54d-4487-982c-9b247c1bb7f7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "e9e8dea0-93b5-463c-92da-bdac362c3cd9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:37 compute-0 nova_compute[192810]: 2025-09-30 21:38:37.422 2 DEBUG nova.compute.manager [req-4423ba1b-0d69-403d-993d-84600feb97c1 req-d47caa2f-b54d-4487-982c-9b247c1bb7f7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] No waiting events found dispatching network-vif-plugged-148524e4-e92e-4e58-8b7c-ae7abeb61875 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:38:37 compute-0 nova_compute[192810]: 2025-09-30 21:38:37.422 2 WARNING nova.compute.manager [req-4423ba1b-0d69-403d-993d-84600feb97c1 req-d47caa2f-b54d-4487-982c-9b247c1bb7f7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Received unexpected event network-vif-plugged-148524e4-e92e-4e58-8b7c-ae7abeb61875 for instance with vm_state active and task_state None.
Sep 30 21:38:37 compute-0 nova_compute[192810]: 2025-09-30 21:38:37.666 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:38 compute-0 nova_compute[192810]: 2025-09-30 21:38:38.323 2 DEBUG oslo_concurrency.lockutils [None req-2de73e4b-3dc6-48d9-9f1f-9dc064451709 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "e9e8dea0-93b5-463c-92da-bdac362c3cd9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:38 compute-0 nova_compute[192810]: 2025-09-30 21:38:38.324 2 DEBUG oslo_concurrency.lockutils [None req-2de73e4b-3dc6-48d9-9f1f-9dc064451709 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "e9e8dea0-93b5-463c-92da-bdac362c3cd9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:38 compute-0 nova_compute[192810]: 2025-09-30 21:38:38.324 2 DEBUG oslo_concurrency.lockutils [None req-2de73e4b-3dc6-48d9-9f1f-9dc064451709 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "e9e8dea0-93b5-463c-92da-bdac362c3cd9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:38 compute-0 nova_compute[192810]: 2025-09-30 21:38:38.324 2 DEBUG oslo_concurrency.lockutils [None req-2de73e4b-3dc6-48d9-9f1f-9dc064451709 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "e9e8dea0-93b5-463c-92da-bdac362c3cd9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:38 compute-0 nova_compute[192810]: 2025-09-30 21:38:38.325 2 DEBUG oslo_concurrency.lockutils [None req-2de73e4b-3dc6-48d9-9f1f-9dc064451709 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "e9e8dea0-93b5-463c-92da-bdac362c3cd9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:38 compute-0 nova_compute[192810]: 2025-09-30 21:38:38.335 2 INFO nova.compute.manager [None req-2de73e4b-3dc6-48d9-9f1f-9dc064451709 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Terminating instance
Sep 30 21:38:38 compute-0 nova_compute[192810]: 2025-09-30 21:38:38.345 2 DEBUG nova.compute.manager [None req-2de73e4b-3dc6-48d9-9f1f-9dc064451709 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:38:38 compute-0 kernel: tap148524e4-e9 (unregistering): left promiscuous mode
Sep 30 21:38:38 compute-0 NetworkManager[51733]: <info>  [1759268318.3629] device (tap148524e4-e9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:38:38 compute-0 ovn_controller[94912]: 2025-09-30T21:38:38Z|00450|binding|INFO|Releasing lport 148524e4-e92e-4e58-8b7c-ae7abeb61875 from this chassis (sb_readonly=0)
Sep 30 21:38:38 compute-0 nova_compute[192810]: 2025-09-30 21:38:38.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:38 compute-0 ovn_controller[94912]: 2025-09-30T21:38:38Z|00451|binding|INFO|Setting lport 148524e4-e92e-4e58-8b7c-ae7abeb61875 down in Southbound
Sep 30 21:38:38 compute-0 ovn_controller[94912]: 2025-09-30T21:38:38Z|00452|binding|INFO|Removing iface tap148524e4-e9 ovn-installed in OVS
Sep 30 21:38:38 compute-0 nova_compute[192810]: 2025-09-30 21:38:38.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:38.378 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:29:f1:48 10.100.0.11'], port_security=['fa:16:3e:29:f1:48 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'e9e8dea0-93b5-463c-92da-bdac362c3cd9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8ad754242d964bb487a2174b2c21bcc5', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9c41899e-24c3-4632-81c5-100a69d8be81', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f4d6c701-a212-4977-9c52-b553d410c9c7, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=148524e4-e92e-4e58-8b7c-ae7abeb61875) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:38:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:38.380 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 148524e4-e92e-4e58-8b7c-ae7abeb61875 in datapath 27086519-6f4c-45f9-8e5b-5b321cd6871c unbound from our chassis
Sep 30 21:38:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:38.381 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 27086519-6f4c-45f9-8e5b-5b321cd6871c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:38:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:38.382 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[89fe6610-687d-4c9a-91a3-6a16f53cb824]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:38.383 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c namespace which is not needed anymore
Sep 30 21:38:38 compute-0 nova_compute[192810]: 2025-09-30 21:38:38.391 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:38 compute-0 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d00000079.scope: Deactivated successfully.
Sep 30 21:38:38 compute-0 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d00000079.scope: Consumed 4.855s CPU time.
Sep 30 21:38:38 compute-0 systemd-machined[152794]: Machine qemu-57-instance-00000079 terminated.
Sep 30 21:38:38 compute-0 nova_compute[192810]: 2025-09-30 21:38:38.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:38 compute-0 nova_compute[192810]: 2025-09-30 21:38:38.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:38 compute-0 neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c[238316]: [NOTICE]   (238320) : haproxy version is 2.8.14-c23fe91
Sep 30 21:38:38 compute-0 neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c[238316]: [NOTICE]   (238320) : path to executable is /usr/sbin/haproxy
Sep 30 21:38:38 compute-0 neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c[238316]: [WARNING]  (238320) : Exiting Master process...
Sep 30 21:38:38 compute-0 neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c[238316]: [ALERT]    (238320) : Current worker (238322) exited with code 143 (Terminated)
Sep 30 21:38:38 compute-0 neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c[238316]: [WARNING]  (238320) : All workers exited. Exiting... (0)
Sep 30 21:38:38 compute-0 systemd[1]: libpod-4dd372c03fcc21551d62b9ca245ee285211d998bf3b44c8e40366ad4cc7a5a58.scope: Deactivated successfully.
Sep 30 21:38:38 compute-0 podman[238355]: 2025-09-30 21:38:38.585739228 +0000 UTC m=+0.110381872 container died 4dd372c03fcc21551d62b9ca245ee285211d998bf3b44c8e40366ad4cc7a5a58 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0)
Sep 30 21:38:38 compute-0 nova_compute[192810]: 2025-09-30 21:38:38.601 2 INFO nova.virt.libvirt.driver [-] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Instance destroyed successfully.
Sep 30 21:38:38 compute-0 nova_compute[192810]: 2025-09-30 21:38:38.602 2 DEBUG nova.objects.instance [None req-2de73e4b-3dc6-48d9-9f1f-9dc064451709 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lazy-loading 'resources' on Instance uuid e9e8dea0-93b5-463c-92da-bdac362c3cd9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:38:38 compute-0 nova_compute[192810]: 2025-09-30 21:38:38.620 2 DEBUG nova.virt.libvirt.vif [None req-2de73e4b-3dc6-48d9-9f1f-9dc064451709 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=2001:2001::3,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:38:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-534496747',display_name='tempest-ServersTestJSON-server-534496747',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-534496747',id=121,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:38:34Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8ad754242d964bb487a2174b2c21bcc5',ramdisk_id='',reservation_id='r-2mmq2way',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-782690373',owner_user_name='tempest-ServersTestJSON-782690373-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:38:34Z,user_data=None,user_id='30d0a975d78c4d9a8e2201afdc040092',uuid=e9e8dea0-93b5-463c-92da-bdac362c3cd9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "148524e4-e92e-4e58-8b7c-ae7abeb61875", "address": "fa:16:3e:29:f1:48", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap148524e4-e9", "ovs_interfaceid": "148524e4-e92e-4e58-8b7c-ae7abeb61875", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:38:38 compute-0 nova_compute[192810]: 2025-09-30 21:38:38.620 2 DEBUG nova.network.os_vif_util [None req-2de73e4b-3dc6-48d9-9f1f-9dc064451709 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Converting VIF {"id": "148524e4-e92e-4e58-8b7c-ae7abeb61875", "address": "fa:16:3e:29:f1:48", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap148524e4-e9", "ovs_interfaceid": "148524e4-e92e-4e58-8b7c-ae7abeb61875", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:38:38 compute-0 nova_compute[192810]: 2025-09-30 21:38:38.621 2 DEBUG nova.network.os_vif_util [None req-2de73e4b-3dc6-48d9-9f1f-9dc064451709 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:29:f1:48,bridge_name='br-int',has_traffic_filtering=True,id=148524e4-e92e-4e58-8b7c-ae7abeb61875,network=Network(27086519-6f4c-45f9-8e5b-5b321cd6871c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap148524e4-e9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:38:38 compute-0 nova_compute[192810]: 2025-09-30 21:38:38.622 2 DEBUG os_vif [None req-2de73e4b-3dc6-48d9-9f1f-9dc064451709 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:29:f1:48,bridge_name='br-int',has_traffic_filtering=True,id=148524e4-e92e-4e58-8b7c-ae7abeb61875,network=Network(27086519-6f4c-45f9-8e5b-5b321cd6871c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap148524e4-e9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:38:38 compute-0 nova_compute[192810]: 2025-09-30 21:38:38.624 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:38 compute-0 nova_compute[192810]: 2025-09-30 21:38:38.624 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap148524e4-e9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:38:38 compute-0 nova_compute[192810]: 2025-09-30 21:38:38.626 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:38 compute-0 nova_compute[192810]: 2025-09-30 21:38:38.627 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:38 compute-0 nova_compute[192810]: 2025-09-30 21:38:38.629 2 INFO os_vif [None req-2de73e4b-3dc6-48d9-9f1f-9dc064451709 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:29:f1:48,bridge_name='br-int',has_traffic_filtering=True,id=148524e4-e92e-4e58-8b7c-ae7abeb61875,network=Network(27086519-6f4c-45f9-8e5b-5b321cd6871c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap148524e4-e9')
Sep 30 21:38:38 compute-0 nova_compute[192810]: 2025-09-30 21:38:38.630 2 INFO nova.virt.libvirt.driver [None req-2de73e4b-3dc6-48d9-9f1f-9dc064451709 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Deleting instance files /var/lib/nova/instances/e9e8dea0-93b5-463c-92da-bdac362c3cd9_del
Sep 30 21:38:38 compute-0 nova_compute[192810]: 2025-09-30 21:38:38.631 2 INFO nova.virt.libvirt.driver [None req-2de73e4b-3dc6-48d9-9f1f-9dc064451709 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Deletion of /var/lib/nova/instances/e9e8dea0-93b5-463c-92da-bdac362c3cd9_del complete
Sep 30 21:38:38 compute-0 nova_compute[192810]: 2025-09-30 21:38:38.725 2 INFO nova.compute.manager [None req-2de73e4b-3dc6-48d9-9f1f-9dc064451709 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Took 0.38 seconds to destroy the instance on the hypervisor.
Sep 30 21:38:38 compute-0 nova_compute[192810]: 2025-09-30 21:38:38.726 2 DEBUG oslo.service.loopingcall [None req-2de73e4b-3dc6-48d9-9f1f-9dc064451709 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:38:38 compute-0 nova_compute[192810]: 2025-09-30 21:38:38.726 2 DEBUG nova.compute.manager [-] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:38:38 compute-0 nova_compute[192810]: 2025-09-30 21:38:38.727 2 DEBUG nova.network.neutron [-] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:38:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:38.743 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:38.744 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:38.744 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:38 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4dd372c03fcc21551d62b9ca245ee285211d998bf3b44c8e40366ad4cc7a5a58-userdata-shm.mount: Deactivated successfully.
Sep 30 21:38:38 compute-0 systemd[1]: var-lib-containers-storage-overlay-491c7a0480db6f542f63ffd9280ca629c0fdb0792d591beac7bf51d09d8b4056-merged.mount: Deactivated successfully.
Sep 30 21:38:38 compute-0 podman[238355]: 2025-09-30 21:38:38.956705765 +0000 UTC m=+0.481348399 container cleanup 4dd372c03fcc21551d62b9ca245ee285211d998bf3b44c8e40366ad4cc7a5a58 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:38:38 compute-0 systemd[1]: libpod-conmon-4dd372c03fcc21551d62b9ca245ee285211d998bf3b44c8e40366ad4cc7a5a58.scope: Deactivated successfully.
Sep 30 21:38:39 compute-0 podman[238402]: 2025-09-30 21:38:39.217839914 +0000 UTC m=+0.240339112 container remove 4dd372c03fcc21551d62b9ca245ee285211d998bf3b44c8e40366ad4cc7a5a58 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:38:39 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:39.223 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[b0f21213-3ae6-49a2-beb6-4861161fe44f]: (4, ('Tue Sep 30 09:38:38 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c (4dd372c03fcc21551d62b9ca245ee285211d998bf3b44c8e40366ad4cc7a5a58)\n4dd372c03fcc21551d62b9ca245ee285211d998bf3b44c8e40366ad4cc7a5a58\nTue Sep 30 09:38:38 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c (4dd372c03fcc21551d62b9ca245ee285211d998bf3b44c8e40366ad4cc7a5a58)\n4dd372c03fcc21551d62b9ca245ee285211d998bf3b44c8e40366ad4cc7a5a58\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:39 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:39.224 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[fb43c565-af07-4f1c-8205-c2ef90ca8dde]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:39 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:39.225 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap27086519-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:38:39 compute-0 nova_compute[192810]: 2025-09-30 21:38:39.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:39 compute-0 kernel: tap27086519-60: left promiscuous mode
Sep 30 21:38:39 compute-0 nova_compute[192810]: 2025-09-30 21:38:39.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:39 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:39.241 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[d2654eb4-a007-4474-ae37-cb09de5e11f7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:39 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:39.265 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[988eb83b-3fa2-4fdd-a2bb-5db53d378702]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:39 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:39.267 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[4e2fa926-b2dc-4c9e-ae2d-d5fe83d61f92]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:39 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:39.284 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[1007eaf9-03c6-41e5-b054-66dc2c750dd5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 500895, 'reachable_time': 16923, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238417, 'error': None, 'target': 'ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:39 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:39.287 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:38:39 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:39.287 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[f891fc47-6fc4-4392-8482-ae4bd6fe08ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:39 compute-0 systemd[1]: run-netns-ovnmeta\x2d27086519\x2d6f4c\x2d45f9\x2d8e5b\x2d5b321cd6871c.mount: Deactivated successfully.
Sep 30 21:38:39 compute-0 nova_compute[192810]: 2025-09-30 21:38:39.537 2 DEBUG nova.compute.manager [req-6ec418a1-8f09-4218-95c9-ad50513c247a req-12664656-2a32-4d1b-a834-e0cb285b9297 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Received event network-vif-unplugged-148524e4-e92e-4e58-8b7c-ae7abeb61875 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:38:39 compute-0 nova_compute[192810]: 2025-09-30 21:38:39.537 2 DEBUG oslo_concurrency.lockutils [req-6ec418a1-8f09-4218-95c9-ad50513c247a req-12664656-2a32-4d1b-a834-e0cb285b9297 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "e9e8dea0-93b5-463c-92da-bdac362c3cd9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:39 compute-0 nova_compute[192810]: 2025-09-30 21:38:39.537 2 DEBUG oslo_concurrency.lockutils [req-6ec418a1-8f09-4218-95c9-ad50513c247a req-12664656-2a32-4d1b-a834-e0cb285b9297 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "e9e8dea0-93b5-463c-92da-bdac362c3cd9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:39 compute-0 nova_compute[192810]: 2025-09-30 21:38:39.538 2 DEBUG oslo_concurrency.lockutils [req-6ec418a1-8f09-4218-95c9-ad50513c247a req-12664656-2a32-4d1b-a834-e0cb285b9297 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "e9e8dea0-93b5-463c-92da-bdac362c3cd9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:39 compute-0 nova_compute[192810]: 2025-09-30 21:38:39.538 2 DEBUG nova.compute.manager [req-6ec418a1-8f09-4218-95c9-ad50513c247a req-12664656-2a32-4d1b-a834-e0cb285b9297 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] No waiting events found dispatching network-vif-unplugged-148524e4-e92e-4e58-8b7c-ae7abeb61875 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:38:39 compute-0 nova_compute[192810]: 2025-09-30 21:38:39.538 2 DEBUG nova.compute.manager [req-6ec418a1-8f09-4218-95c9-ad50513c247a req-12664656-2a32-4d1b-a834-e0cb285b9297 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Received event network-vif-unplugged-148524e4-e92e-4e58-8b7c-ae7abeb61875 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:38:39 compute-0 nova_compute[192810]: 2025-09-30 21:38:39.538 2 DEBUG nova.compute.manager [req-6ec418a1-8f09-4218-95c9-ad50513c247a req-12664656-2a32-4d1b-a834-e0cb285b9297 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Received event network-vif-plugged-148524e4-e92e-4e58-8b7c-ae7abeb61875 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:38:39 compute-0 nova_compute[192810]: 2025-09-30 21:38:39.539 2 DEBUG oslo_concurrency.lockutils [req-6ec418a1-8f09-4218-95c9-ad50513c247a req-12664656-2a32-4d1b-a834-e0cb285b9297 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "e9e8dea0-93b5-463c-92da-bdac362c3cd9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:39 compute-0 nova_compute[192810]: 2025-09-30 21:38:39.539 2 DEBUG oslo_concurrency.lockutils [req-6ec418a1-8f09-4218-95c9-ad50513c247a req-12664656-2a32-4d1b-a834-e0cb285b9297 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "e9e8dea0-93b5-463c-92da-bdac362c3cd9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:39 compute-0 nova_compute[192810]: 2025-09-30 21:38:39.539 2 DEBUG oslo_concurrency.lockutils [req-6ec418a1-8f09-4218-95c9-ad50513c247a req-12664656-2a32-4d1b-a834-e0cb285b9297 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "e9e8dea0-93b5-463c-92da-bdac362c3cd9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:39 compute-0 nova_compute[192810]: 2025-09-30 21:38:39.539 2 DEBUG nova.compute.manager [req-6ec418a1-8f09-4218-95c9-ad50513c247a req-12664656-2a32-4d1b-a834-e0cb285b9297 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] No waiting events found dispatching network-vif-plugged-148524e4-e92e-4e58-8b7c-ae7abeb61875 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:38:39 compute-0 nova_compute[192810]: 2025-09-30 21:38:39.540 2 WARNING nova.compute.manager [req-6ec418a1-8f09-4218-95c9-ad50513c247a req-12664656-2a32-4d1b-a834-e0cb285b9297 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Received unexpected event network-vif-plugged-148524e4-e92e-4e58-8b7c-ae7abeb61875 for instance with vm_state active and task_state deleting.
Sep 30 21:38:39 compute-0 nova_compute[192810]: 2025-09-30 21:38:39.540 2 DEBUG nova.network.neutron [-] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:38:39 compute-0 nova_compute[192810]: 2025-09-30 21:38:39.559 2 INFO nova.compute.manager [-] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Took 0.83 seconds to deallocate network for instance.
Sep 30 21:38:39 compute-0 nova_compute[192810]: 2025-09-30 21:38:39.630 2 DEBUG oslo_concurrency.lockutils [None req-2de73e4b-3dc6-48d9-9f1f-9dc064451709 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:39 compute-0 nova_compute[192810]: 2025-09-30 21:38:39.630 2 DEBUG oslo_concurrency.lockutils [None req-2de73e4b-3dc6-48d9-9f1f-9dc064451709 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:39 compute-0 nova_compute[192810]: 2025-09-30 21:38:39.631 2 DEBUG nova.compute.manager [req-7731c8f0-8387-4dd6-8998-2b4a4636a624 req-cf9887bf-7b6b-4708-8724-cdfe2d2795da dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Received event network-vif-deleted-148524e4-e92e-4e58-8b7c-ae7abeb61875 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:38:39 compute-0 nova_compute[192810]: 2025-09-30 21:38:39.691 2 DEBUG nova.compute.provider_tree [None req-2de73e4b-3dc6-48d9-9f1f-9dc064451709 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:38:39 compute-0 nova_compute[192810]: 2025-09-30 21:38:39.708 2 DEBUG nova.scheduler.client.report [None req-2de73e4b-3dc6-48d9-9f1f-9dc064451709 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:38:39 compute-0 nova_compute[192810]: 2025-09-30 21:38:39.729 2 DEBUG oslo_concurrency.lockutils [None req-2de73e4b-3dc6-48d9-9f1f-9dc064451709 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:39 compute-0 nova_compute[192810]: 2025-09-30 21:38:39.762 2 INFO nova.scheduler.client.report [None req-2de73e4b-3dc6-48d9-9f1f-9dc064451709 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Deleted allocations for instance e9e8dea0-93b5-463c-92da-bdac362c3cd9
Sep 30 21:38:39 compute-0 nova_compute[192810]: 2025-09-30 21:38:39.836 2 DEBUG oslo_concurrency.lockutils [None req-2de73e4b-3dc6-48d9-9f1f-9dc064451709 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "e9e8dea0-93b5-463c-92da-bdac362c3cd9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.513s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:40 compute-0 podman[238418]: 2025-09-30 21:38:40.311860712 +0000 UTC m=+0.052324525 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Sep 30 21:38:40 compute-0 podman[238419]: 2025-09-30 21:38:40.317608135 +0000 UTC m=+0.058312964 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., name=ubi9-minimal, managed_by=edpm_ansible, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc.)
Sep 30 21:38:42 compute-0 nova_compute[192810]: 2025-09-30 21:38:42.667 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:43 compute-0 nova_compute[192810]: 2025-09-30 21:38:43.627 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:38:43.908 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:38:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:38:43.909 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:38:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:38:43.909 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:38:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:38:43.909 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:38:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:38:43.909 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:38:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:38:43.909 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:38:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:38:43.909 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:38:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:38:43.910 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:38:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:38:43.910 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:38:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:38:43.910 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:38:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:38:43.910 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:38:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:38:43.910 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:38:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:38:43.910 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:38:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:38:43.910 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:38:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:38:43.910 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:38:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:38:43.910 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:38:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:38:43.910 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:38:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:38:43.910 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:38:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:38:43.911 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:38:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:38:43.911 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:38:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:38:43.911 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:38:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:38:43.911 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:38:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:38:43.911 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:38:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:38:43.911 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:38:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:38:43.911 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:38:45 compute-0 nova_compute[192810]: 2025-09-30 21:38:45.501 2 DEBUG oslo_concurrency.lockutils [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "2db1c454-9505-4c8b-aeff-f0bba892690f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:45 compute-0 nova_compute[192810]: 2025-09-30 21:38:45.502 2 DEBUG oslo_concurrency.lockutils [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "2db1c454-9505-4c8b-aeff-f0bba892690f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:45 compute-0 nova_compute[192810]: 2025-09-30 21:38:45.552 2 DEBUG nova.compute.manager [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:38:45 compute-0 nova_compute[192810]: 2025-09-30 21:38:45.677 2 DEBUG oslo_concurrency.lockutils [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:45 compute-0 nova_compute[192810]: 2025-09-30 21:38:45.678 2 DEBUG oslo_concurrency.lockutils [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:45 compute-0 nova_compute[192810]: 2025-09-30 21:38:45.684 2 DEBUG nova.virt.hardware [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:38:45 compute-0 nova_compute[192810]: 2025-09-30 21:38:45.684 2 INFO nova.compute.claims [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:38:45 compute-0 nova_compute[192810]: 2025-09-30 21:38:45.805 2 DEBUG nova.compute.provider_tree [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:38:45 compute-0 nova_compute[192810]: 2025-09-30 21:38:45.829 2 DEBUG nova.scheduler.client.report [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:38:45 compute-0 nova_compute[192810]: 2025-09-30 21:38:45.852 2 DEBUG oslo_concurrency.lockutils [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:45 compute-0 nova_compute[192810]: 2025-09-30 21:38:45.853 2 DEBUG nova.compute.manager [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:38:45 compute-0 nova_compute[192810]: 2025-09-30 21:38:45.925 2 DEBUG nova.compute.manager [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:38:45 compute-0 nova_compute[192810]: 2025-09-30 21:38:45.926 2 DEBUG nova.network.neutron [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:38:45 compute-0 nova_compute[192810]: 2025-09-30 21:38:45.946 2 INFO nova.virt.libvirt.driver [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:38:45 compute-0 nova_compute[192810]: 2025-09-30 21:38:45.968 2 DEBUG nova.compute.manager [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:38:46 compute-0 nova_compute[192810]: 2025-09-30 21:38:46.176 2 DEBUG nova.compute.manager [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:38:46 compute-0 nova_compute[192810]: 2025-09-30 21:38:46.177 2 DEBUG nova.virt.libvirt.driver [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:38:46 compute-0 nova_compute[192810]: 2025-09-30 21:38:46.178 2 INFO nova.virt.libvirt.driver [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Creating image(s)
Sep 30 21:38:46 compute-0 nova_compute[192810]: 2025-09-30 21:38:46.178 2 DEBUG oslo_concurrency.lockutils [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "/var/lib/nova/instances/2db1c454-9505-4c8b-aeff-f0bba892690f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:46 compute-0 nova_compute[192810]: 2025-09-30 21:38:46.178 2 DEBUG oslo_concurrency.lockutils [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "/var/lib/nova/instances/2db1c454-9505-4c8b-aeff-f0bba892690f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:46 compute-0 nova_compute[192810]: 2025-09-30 21:38:46.179 2 DEBUG oslo_concurrency.lockutils [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "/var/lib/nova/instances/2db1c454-9505-4c8b-aeff-f0bba892690f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:46 compute-0 nova_compute[192810]: 2025-09-30 21:38:46.191 2 DEBUG oslo_concurrency.processutils [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:38:46 compute-0 nova_compute[192810]: 2025-09-30 21:38:46.258 2 DEBUG oslo_concurrency.processutils [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:38:46 compute-0 nova_compute[192810]: 2025-09-30 21:38:46.259 2 DEBUG oslo_concurrency.lockutils [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:46 compute-0 nova_compute[192810]: 2025-09-30 21:38:46.260 2 DEBUG oslo_concurrency.lockutils [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:46 compute-0 nova_compute[192810]: 2025-09-30 21:38:46.271 2 DEBUG oslo_concurrency.processutils [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:38:46 compute-0 nova_compute[192810]: 2025-09-30 21:38:46.296 2 DEBUG nova.policy [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '30d0a975d78c4d9a8e2201afdc040092', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8ad754242d964bb487a2174b2c21bcc5', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:38:46 compute-0 podman[238465]: 2025-09-30 21:38:46.330377691 +0000 UTC m=+0.060413586 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=iscsid, io.buildah.version=1.41.3)
Sep 30 21:38:46 compute-0 podman[238462]: 2025-09-30 21:38:46.330474314 +0000 UTC m=+0.064978971 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd)
Sep 30 21:38:46 compute-0 nova_compute[192810]: 2025-09-30 21:38:46.343 2 DEBUG oslo_concurrency.processutils [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:38:46 compute-0 nova_compute[192810]: 2025-09-30 21:38:46.344 2 DEBUG oslo_concurrency.processutils [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/2db1c454-9505-4c8b-aeff-f0bba892690f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:38:46 compute-0 podman[238466]: 2025-09-30 21:38:46.350570525 +0000 UTC m=+0.076329144 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 21:38:46 compute-0 nova_compute[192810]: 2025-09-30 21:38:46.526 2 DEBUG oslo_concurrency.processutils [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/2db1c454-9505-4c8b-aeff-f0bba892690f/disk 1073741824" returned: 0 in 0.183s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:38:46 compute-0 nova_compute[192810]: 2025-09-30 21:38:46.528 2 DEBUG oslo_concurrency.lockutils [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.268s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:46 compute-0 nova_compute[192810]: 2025-09-30 21:38:46.528 2 DEBUG oslo_concurrency.processutils [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:38:46 compute-0 nova_compute[192810]: 2025-09-30 21:38:46.598 2 DEBUG oslo_concurrency.processutils [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:38:46 compute-0 nova_compute[192810]: 2025-09-30 21:38:46.599 2 DEBUG nova.virt.disk.api [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Checking if we can resize image /var/lib/nova/instances/2db1c454-9505-4c8b-aeff-f0bba892690f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:38:46 compute-0 nova_compute[192810]: 2025-09-30 21:38:46.599 2 DEBUG oslo_concurrency.processutils [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2db1c454-9505-4c8b-aeff-f0bba892690f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:38:46 compute-0 nova_compute[192810]: 2025-09-30 21:38:46.656 2 DEBUG oslo_concurrency.processutils [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2db1c454-9505-4c8b-aeff-f0bba892690f/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:38:46 compute-0 nova_compute[192810]: 2025-09-30 21:38:46.657 2 DEBUG nova.virt.disk.api [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Cannot resize image /var/lib/nova/instances/2db1c454-9505-4c8b-aeff-f0bba892690f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:38:46 compute-0 nova_compute[192810]: 2025-09-30 21:38:46.658 2 DEBUG nova.objects.instance [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lazy-loading 'migration_context' on Instance uuid 2db1c454-9505-4c8b-aeff-f0bba892690f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:38:46 compute-0 nova_compute[192810]: 2025-09-30 21:38:46.687 2 DEBUG nova.virt.libvirt.driver [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:38:46 compute-0 nova_compute[192810]: 2025-09-30 21:38:46.687 2 DEBUG nova.virt.libvirt.driver [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Ensure instance console log exists: /var/lib/nova/instances/2db1c454-9505-4c8b-aeff-f0bba892690f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:38:46 compute-0 nova_compute[192810]: 2025-09-30 21:38:46.688 2 DEBUG oslo_concurrency.lockutils [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:46 compute-0 nova_compute[192810]: 2025-09-30 21:38:46.688 2 DEBUG oslo_concurrency.lockutils [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:46 compute-0 nova_compute[192810]: 2025-09-30 21:38:46.689 2 DEBUG oslo_concurrency.lockutils [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:47 compute-0 nova_compute[192810]: 2025-09-30 21:38:47.669 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:47 compute-0 nova_compute[192810]: 2025-09-30 21:38:47.906 2 DEBUG nova.network.neutron [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Successfully created port: a496b9c3-8b65-4657-95a2-9397df87ab4d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:38:48 compute-0 nova_compute[192810]: 2025-09-30 21:38:48.630 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:49 compute-0 nova_compute[192810]: 2025-09-30 21:38:49.070 2 DEBUG nova.network.neutron [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Successfully updated port: a496b9c3-8b65-4657-95a2-9397df87ab4d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:38:49 compute-0 nova_compute[192810]: 2025-09-30 21:38:49.098 2 DEBUG oslo_concurrency.lockutils [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "refresh_cache-2db1c454-9505-4c8b-aeff-f0bba892690f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:38:49 compute-0 nova_compute[192810]: 2025-09-30 21:38:49.098 2 DEBUG oslo_concurrency.lockutils [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquired lock "refresh_cache-2db1c454-9505-4c8b-aeff-f0bba892690f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:38:49 compute-0 nova_compute[192810]: 2025-09-30 21:38:49.099 2 DEBUG nova.network.neutron [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:38:49 compute-0 nova_compute[192810]: 2025-09-30 21:38:49.202 2 DEBUG nova.compute.manager [req-2bb31705-91f7-47bf-9cd7-659a969fbb9a req-f6d512b9-fdd6-44d7-8e23-f8636f17923b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Received event network-changed-a496b9c3-8b65-4657-95a2-9397df87ab4d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:38:49 compute-0 nova_compute[192810]: 2025-09-30 21:38:49.202 2 DEBUG nova.compute.manager [req-2bb31705-91f7-47bf-9cd7-659a969fbb9a req-f6d512b9-fdd6-44d7-8e23-f8636f17923b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Refreshing instance network info cache due to event network-changed-a496b9c3-8b65-4657-95a2-9397df87ab4d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:38:49 compute-0 nova_compute[192810]: 2025-09-30 21:38:49.203 2 DEBUG oslo_concurrency.lockutils [req-2bb31705-91f7-47bf-9cd7-659a969fbb9a req-f6d512b9-fdd6-44d7-8e23-f8636f17923b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-2db1c454-9505-4c8b-aeff-f0bba892690f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:38:49 compute-0 nova_compute[192810]: 2025-09-30 21:38:49.434 2 DEBUG nova.network.neutron [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:38:49 compute-0 unix_chkpwd[238535]: password check failed for user (root)
Sep 30 21:38:49 compute-0 sshd-session[238533]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=45.81.23.80  user=root
Sep 30 21:38:49 compute-0 nova_compute[192810]: 2025-09-30 21:38:49.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:38:51 compute-0 nova_compute[192810]: 2025-09-30 21:38:51.161 2 DEBUG nova.network.neutron [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Updating instance_info_cache with network_info: [{"id": "a496b9c3-8b65-4657-95a2-9397df87ab4d", "address": "fa:16:3e:3f:86:9e", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa496b9c3-8b", "ovs_interfaceid": "a496b9c3-8b65-4657-95a2-9397df87ab4d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:38:51 compute-0 nova_compute[192810]: 2025-09-30 21:38:51.191 2 DEBUG oslo_concurrency.lockutils [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Releasing lock "refresh_cache-2db1c454-9505-4c8b-aeff-f0bba892690f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:38:51 compute-0 nova_compute[192810]: 2025-09-30 21:38:51.191 2 DEBUG nova.compute.manager [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Instance network_info: |[{"id": "a496b9c3-8b65-4657-95a2-9397df87ab4d", "address": "fa:16:3e:3f:86:9e", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa496b9c3-8b", "ovs_interfaceid": "a496b9c3-8b65-4657-95a2-9397df87ab4d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:38:51 compute-0 nova_compute[192810]: 2025-09-30 21:38:51.192 2 DEBUG oslo_concurrency.lockutils [req-2bb31705-91f7-47bf-9cd7-659a969fbb9a req-f6d512b9-fdd6-44d7-8e23-f8636f17923b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-2db1c454-9505-4c8b-aeff-f0bba892690f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:38:51 compute-0 nova_compute[192810]: 2025-09-30 21:38:51.192 2 DEBUG nova.network.neutron [req-2bb31705-91f7-47bf-9cd7-659a969fbb9a req-f6d512b9-fdd6-44d7-8e23-f8636f17923b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Refreshing network info cache for port a496b9c3-8b65-4657-95a2-9397df87ab4d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:38:51 compute-0 nova_compute[192810]: 2025-09-30 21:38:51.194 2 DEBUG nova.virt.libvirt.driver [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Start _get_guest_xml network_info=[{"id": "a496b9c3-8b65-4657-95a2-9397df87ab4d", "address": "fa:16:3e:3f:86:9e", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa496b9c3-8b", "ovs_interfaceid": "a496b9c3-8b65-4657-95a2-9397df87ab4d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:38:51 compute-0 nova_compute[192810]: 2025-09-30 21:38:51.199 2 WARNING nova.virt.libvirt.driver [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:38:51 compute-0 nova_compute[192810]: 2025-09-30 21:38:51.206 2 DEBUG nova.virt.libvirt.host [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:38:51 compute-0 nova_compute[192810]: 2025-09-30 21:38:51.207 2 DEBUG nova.virt.libvirt.host [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:38:51 compute-0 nova_compute[192810]: 2025-09-30 21:38:51.209 2 DEBUG nova.virt.libvirt.host [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:38:51 compute-0 nova_compute[192810]: 2025-09-30 21:38:51.210 2 DEBUG nova.virt.libvirt.host [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:38:51 compute-0 nova_compute[192810]: 2025-09-30 21:38:51.210 2 DEBUG nova.virt.libvirt.driver [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:38:51 compute-0 nova_compute[192810]: 2025-09-30 21:38:51.211 2 DEBUG nova.virt.hardware [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:38:51 compute-0 nova_compute[192810]: 2025-09-30 21:38:51.211 2 DEBUG nova.virt.hardware [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:38:51 compute-0 nova_compute[192810]: 2025-09-30 21:38:51.211 2 DEBUG nova.virt.hardware [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:38:51 compute-0 nova_compute[192810]: 2025-09-30 21:38:51.211 2 DEBUG nova.virt.hardware [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:38:51 compute-0 nova_compute[192810]: 2025-09-30 21:38:51.212 2 DEBUG nova.virt.hardware [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:38:51 compute-0 nova_compute[192810]: 2025-09-30 21:38:51.212 2 DEBUG nova.virt.hardware [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:38:51 compute-0 nova_compute[192810]: 2025-09-30 21:38:51.212 2 DEBUG nova.virt.hardware [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:38:51 compute-0 nova_compute[192810]: 2025-09-30 21:38:51.212 2 DEBUG nova.virt.hardware [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:38:51 compute-0 nova_compute[192810]: 2025-09-30 21:38:51.212 2 DEBUG nova.virt.hardware [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:38:51 compute-0 nova_compute[192810]: 2025-09-30 21:38:51.213 2 DEBUG nova.virt.hardware [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:38:51 compute-0 nova_compute[192810]: 2025-09-30 21:38:51.213 2 DEBUG nova.virt.hardware [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:38:51 compute-0 nova_compute[192810]: 2025-09-30 21:38:51.216 2 DEBUG nova.virt.libvirt.vif [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:38:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1784817975',display_name='tempest-ServersTestJSON-server-1784817975',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-1784817975',id=122,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGvbtweNmFxts2/tslAymhodRAIe1w+LJ2OaWGMLWl5s/fQt03GjecTpwMUZqd5bDeiGJP7PxYtdxdbKI5U2h4I8kTyRzq5YQRRqGvLtlhlbtUh6kjFsgMZ4R39tnRXIPw==',key_name='tempest-key-846567156',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8ad754242d964bb487a2174b2c21bcc5',ramdisk_id='',reservation_id='r-6b319e6a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-782690373',owner_user_name='tempest-ServersTestJSON-782690373-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:38:46Z,user_data=None,user_id='30d0a975d78c4d9a8e2201afdc040092',uuid=2db1c454-9505-4c8b-aeff-f0bba892690f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a496b9c3-8b65-4657-95a2-9397df87ab4d", "address": "fa:16:3e:3f:86:9e", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, 
"tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa496b9c3-8b", "ovs_interfaceid": "a496b9c3-8b65-4657-95a2-9397df87ab4d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:38:51 compute-0 nova_compute[192810]: 2025-09-30 21:38:51.216 2 DEBUG nova.network.os_vif_util [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Converting VIF {"id": "a496b9c3-8b65-4657-95a2-9397df87ab4d", "address": "fa:16:3e:3f:86:9e", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa496b9c3-8b", "ovs_interfaceid": "a496b9c3-8b65-4657-95a2-9397df87ab4d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:38:51 compute-0 nova_compute[192810]: 2025-09-30 21:38:51.217 2 DEBUG nova.network.os_vif_util [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3f:86:9e,bridge_name='br-int',has_traffic_filtering=True,id=a496b9c3-8b65-4657-95a2-9397df87ab4d,network=Network(27086519-6f4c-45f9-8e5b-5b321cd6871c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa496b9c3-8b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:38:51 compute-0 nova_compute[192810]: 2025-09-30 21:38:51.218 2 DEBUG nova.objects.instance [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2db1c454-9505-4c8b-aeff-f0bba892690f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:38:51 compute-0 nova_compute[192810]: 2025-09-30 21:38:51.247 2 DEBUG nova.virt.libvirt.driver [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:38:51 compute-0 nova_compute[192810]:   <uuid>2db1c454-9505-4c8b-aeff-f0bba892690f</uuid>
Sep 30 21:38:51 compute-0 nova_compute[192810]:   <name>instance-0000007a</name>
Sep 30 21:38:51 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:38:51 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:38:51 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:38:51 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:       <nova:name>tempest-ServersTestJSON-server-1784817975</nova:name>
Sep 30 21:38:51 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:38:51</nova:creationTime>
Sep 30 21:38:51 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:38:51 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:38:51 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:38:51 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:38:51 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:38:51 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:38:51 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:38:51 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:38:51 compute-0 nova_compute[192810]:         <nova:user uuid="30d0a975d78c4d9a8e2201afdc040092">tempest-ServersTestJSON-782690373-project-member</nova:user>
Sep 30 21:38:51 compute-0 nova_compute[192810]:         <nova:project uuid="8ad754242d964bb487a2174b2c21bcc5">tempest-ServersTestJSON-782690373</nova:project>
Sep 30 21:38:51 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:38:51 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:38:51 compute-0 nova_compute[192810]:         <nova:port uuid="a496b9c3-8b65-4657-95a2-9397df87ab4d">
Sep 30 21:38:51 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:38:51 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:38:51 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:38:51 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:38:51 compute-0 nova_compute[192810]:     <system>
Sep 30 21:38:51 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:38:51 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:38:51 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:38:51 compute-0 nova_compute[192810]:       <entry name="serial">2db1c454-9505-4c8b-aeff-f0bba892690f</entry>
Sep 30 21:38:51 compute-0 nova_compute[192810]:       <entry name="uuid">2db1c454-9505-4c8b-aeff-f0bba892690f</entry>
Sep 30 21:38:51 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     </system>
Sep 30 21:38:51 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:38:51 compute-0 nova_compute[192810]:   <os>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:   </os>
Sep 30 21:38:51 compute-0 nova_compute[192810]:   <features>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:   </features>
Sep 30 21:38:51 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:38:51 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:38:51 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:38:51 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:38:51 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:38:51 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/2db1c454-9505-4c8b-aeff-f0bba892690f/disk"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:38:51 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/2db1c454-9505-4c8b-aeff-f0bba892690f/disk.config"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:38:51 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:3f:86:9e"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:       <target dev="tapa496b9c3-8b"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:38:51 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/2db1c454-9505-4c8b-aeff-f0bba892690f/console.log" append="off"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     <video>
Sep 30 21:38:51 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     </video>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:38:51 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:38:51 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:38:51 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:38:51 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:38:51 compute-0 nova_compute[192810]: </domain>
Sep 30 21:38:51 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:38:51 compute-0 nova_compute[192810]: 2025-09-30 21:38:51.248 2 DEBUG nova.compute.manager [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Preparing to wait for external event network-vif-plugged-a496b9c3-8b65-4657-95a2-9397df87ab4d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:38:51 compute-0 nova_compute[192810]: 2025-09-30 21:38:51.248 2 DEBUG oslo_concurrency.lockutils [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "2db1c454-9505-4c8b-aeff-f0bba892690f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:51 compute-0 nova_compute[192810]: 2025-09-30 21:38:51.249 2 DEBUG oslo_concurrency.lockutils [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "2db1c454-9505-4c8b-aeff-f0bba892690f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:51 compute-0 nova_compute[192810]: 2025-09-30 21:38:51.249 2 DEBUG oslo_concurrency.lockutils [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "2db1c454-9505-4c8b-aeff-f0bba892690f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:51 compute-0 nova_compute[192810]: 2025-09-30 21:38:51.250 2 DEBUG nova.virt.libvirt.vif [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:38:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1784817975',display_name='tempest-ServersTestJSON-server-1784817975',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-1784817975',id=122,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGvbtweNmFxts2/tslAymhodRAIe1w+LJ2OaWGMLWl5s/fQt03GjecTpwMUZqd5bDeiGJP7PxYtdxdbKI5U2h4I8kTyRzq5YQRRqGvLtlhlbtUh6kjFsgMZ4R39tnRXIPw==',key_name='tempest-key-846567156',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8ad754242d964bb487a2174b2c21bcc5',ramdisk_id='',reservation_id='r-6b319e6a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-782690373',owner_user_name='tempest-ServersTestJSON-782690373-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:38:46Z,user_data=None,user_id='30d0a975d78c4d9a8e2201afdc040092',uuid=2db1c454-9505-4c8b-aeff-f0bba892690f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a496b9c3-8b65-4657-95a2-9397df87ab4d", "address": "fa:16:3e:3f:86:9e", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa496b9c3-8b", "ovs_interfaceid": "a496b9c3-8b65-4657-95a2-9397df87ab4d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:38:51 compute-0 nova_compute[192810]: 2025-09-30 21:38:51.250 2 DEBUG nova.network.os_vif_util [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Converting VIF {"id": "a496b9c3-8b65-4657-95a2-9397df87ab4d", "address": "fa:16:3e:3f:86:9e", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa496b9c3-8b", "ovs_interfaceid": "a496b9c3-8b65-4657-95a2-9397df87ab4d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:38:51 compute-0 nova_compute[192810]: 2025-09-30 21:38:51.250 2 DEBUG nova.network.os_vif_util [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3f:86:9e,bridge_name='br-int',has_traffic_filtering=True,id=a496b9c3-8b65-4657-95a2-9397df87ab4d,network=Network(27086519-6f4c-45f9-8e5b-5b321cd6871c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa496b9c3-8b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:38:51 compute-0 nova_compute[192810]: 2025-09-30 21:38:51.251 2 DEBUG os_vif [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3f:86:9e,bridge_name='br-int',has_traffic_filtering=True,id=a496b9c3-8b65-4657-95a2-9397df87ab4d,network=Network(27086519-6f4c-45f9-8e5b-5b321cd6871c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa496b9c3-8b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:38:51 compute-0 nova_compute[192810]: 2025-09-30 21:38:51.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:51 compute-0 nova_compute[192810]: 2025-09-30 21:38:51.252 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:38:51 compute-0 nova_compute[192810]: 2025-09-30 21:38:51.252 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:38:51 compute-0 nova_compute[192810]: 2025-09-30 21:38:51.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:51 compute-0 nova_compute[192810]: 2025-09-30 21:38:51.254 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa496b9c3-8b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:38:51 compute-0 nova_compute[192810]: 2025-09-30 21:38:51.254 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa496b9c3-8b, col_values=(('external_ids', {'iface-id': 'a496b9c3-8b65-4657-95a2-9397df87ab4d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3f:86:9e', 'vm-uuid': '2db1c454-9505-4c8b-aeff-f0bba892690f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:38:51 compute-0 nova_compute[192810]: 2025-09-30 21:38:51.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:51 compute-0 NetworkManager[51733]: <info>  [1759268331.2567] manager: (tapa496b9c3-8b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/199)
Sep 30 21:38:51 compute-0 nova_compute[192810]: 2025-09-30 21:38:51.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:38:51 compute-0 nova_compute[192810]: 2025-09-30 21:38:51.261 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:51 compute-0 nova_compute[192810]: 2025-09-30 21:38:51.262 2 INFO os_vif [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3f:86:9e,bridge_name='br-int',has_traffic_filtering=True,id=a496b9c3-8b65-4657-95a2-9397df87ab4d,network=Network(27086519-6f4c-45f9-8e5b-5b321cd6871c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa496b9c3-8b')
Sep 30 21:38:51 compute-0 nova_compute[192810]: 2025-09-30 21:38:51.405 2 DEBUG nova.virt.libvirt.driver [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:38:51 compute-0 nova_compute[192810]: 2025-09-30 21:38:51.405 2 DEBUG nova.virt.libvirt.driver [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:38:51 compute-0 nova_compute[192810]: 2025-09-30 21:38:51.406 2 DEBUG nova.virt.libvirt.driver [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] No VIF found with MAC fa:16:3e:3f:86:9e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:38:51 compute-0 nova_compute[192810]: 2025-09-30 21:38:51.406 2 INFO nova.virt.libvirt.driver [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Using config drive
Sep 30 21:38:52 compute-0 sshd-session[238533]: Failed password for root from 45.81.23.80 port 36952 ssh2
Sep 30 21:38:52 compute-0 nova_compute[192810]: 2025-09-30 21:38:52.255 2 INFO nova.virt.libvirt.driver [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Creating config drive at /var/lib/nova/instances/2db1c454-9505-4c8b-aeff-f0bba892690f/disk.config
Sep 30 21:38:52 compute-0 nova_compute[192810]: 2025-09-30 21:38:52.260 2 DEBUG oslo_concurrency.processutils [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2db1c454-9505-4c8b-aeff-f0bba892690f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnixbrkvg execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:38:52 compute-0 nova_compute[192810]: 2025-09-30 21:38:52.397 2 DEBUG oslo_concurrency.processutils [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2db1c454-9505-4c8b-aeff-f0bba892690f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnixbrkvg" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:38:52 compute-0 kernel: tapa496b9c3-8b: entered promiscuous mode
Sep 30 21:38:52 compute-0 NetworkManager[51733]: <info>  [1759268332.4540] manager: (tapa496b9c3-8b): new Tun device (/org/freedesktop/NetworkManager/Devices/200)
Sep 30 21:38:52 compute-0 ovn_controller[94912]: 2025-09-30T21:38:52Z|00453|binding|INFO|Claiming lport a496b9c3-8b65-4657-95a2-9397df87ab4d for this chassis.
Sep 30 21:38:52 compute-0 ovn_controller[94912]: 2025-09-30T21:38:52Z|00454|binding|INFO|a496b9c3-8b65-4657-95a2-9397df87ab4d: Claiming fa:16:3e:3f:86:9e 10.100.0.10
Sep 30 21:38:52 compute-0 nova_compute[192810]: 2025-09-30 21:38:52.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:52.468 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3f:86:9e 10.100.0.10'], port_security=['fa:16:3e:3f:86:9e 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '2db1c454-9505-4c8b-aeff-f0bba892690f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8ad754242d964bb487a2174b2c21bcc5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9c41899e-24c3-4632-81c5-100a69d8be81', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f4d6c701-a212-4977-9c52-b553d410c9c7, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=a496b9c3-8b65-4657-95a2-9397df87ab4d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:52.469 103867 INFO neutron.agent.ovn.metadata.agent [-] Port a496b9c3-8b65-4657-95a2-9397df87ab4d in datapath 27086519-6f4c-45f9-8e5b-5b321cd6871c bound to our chassis
Sep 30 21:38:52 compute-0 ovn_controller[94912]: 2025-09-30T21:38:52Z|00455|binding|INFO|Setting lport a496b9c3-8b65-4657-95a2-9397df87ab4d ovn-installed in OVS
Sep 30 21:38:52 compute-0 ovn_controller[94912]: 2025-09-30T21:38:52Z|00456|binding|INFO|Setting lport a496b9c3-8b65-4657-95a2-9397df87ab4d up in Southbound
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:52.471 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 27086519-6f4c-45f9-8e5b-5b321cd6871c
Sep 30 21:38:52 compute-0 nova_compute[192810]: 2025-09-30 21:38:52.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:52 compute-0 nova_compute[192810]: 2025-09-30 21:38:52.477 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:52.482 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[ae03aba6-b5fc-4eca-9cd7-ac2455b90c11]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:52.482 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap27086519-61 in ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:52.484 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap27086519-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:52.484 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[308f0354-f351-4897-ac94-ff85f0b5a37f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:52.485 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[900ef4fa-d5f1-4f77-9ab6-8364ab00bf0c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:52 compute-0 systemd-udevd[238558]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:38:52 compute-0 systemd-machined[152794]: New machine qemu-58-instance-0000007a.
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:52.494 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[b4e12e64-be95-48c7-81af-ce7c92875e0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:52 compute-0 NetworkManager[51733]: <info>  [1759268332.5012] device (tapa496b9c3-8b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:38:52 compute-0 NetworkManager[51733]: <info>  [1759268332.5025] device (tapa496b9c3-8b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:38:52 compute-0 systemd[1]: Started Virtual Machine qemu-58-instance-0000007a.
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:52.520 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[0cb14677-2030-4817-87b8-97db1fc09777]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:52.548 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[962b8580-3328-4bd8-b829-7ef730bed3c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:52.552 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[1aa7f99f-5002-4072-97db-3dcfbde81125]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:52 compute-0 systemd-udevd[238562]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:38:52 compute-0 NetworkManager[51733]: <info>  [1759268332.5550] manager: (tap27086519-60): new Veth device (/org/freedesktop/NetworkManager/Devices/201)
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:52.582 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[1d47f9e3-89da-4f7b-8307-cba0d9553830]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:52.585 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[1f717b46-09e3-4e01-bed5-c0bdcfdddced]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:52 compute-0 nova_compute[192810]: 2025-09-30 21:38:52.592 2 DEBUG oslo_concurrency.lockutils [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Acquiring lock "99707edc-d882-4127-bc26-1fea0a52f9da" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:52 compute-0 nova_compute[192810]: 2025-09-30 21:38:52.593 2 DEBUG oslo_concurrency.lockutils [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Lock "99707edc-d882-4127-bc26-1fea0a52f9da" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:52 compute-0 NetworkManager[51733]: <info>  [1759268332.6103] device (tap27086519-60): carrier: link connected
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:52.616 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[0fe8b1d2-03f7-401f-bfc0-bcc85b2ddced]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:52 compute-0 nova_compute[192810]: 2025-09-30 21:38:52.623 2 DEBUG nova.compute.manager [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:52.640 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[fcc64b3c-a8d4-4a5c-8d4c-56635e7bc57c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap27086519-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:da:b9:e3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 136], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 502823, 'reachable_time': 34600, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238590, 'error': None, 'target': 'ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:52.663 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[7338076c-72f3-4b50-812d-49c1075e87ec]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feda:b9e3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 502823, 'tstamp': 502823}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238592, 'error': None, 'target': 'ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:52 compute-0 nova_compute[192810]: 2025-09-30 21:38:52.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:52.681 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[213afdae-81e5-4730-bb69-9a36390f1eeb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap27086519-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:da:b9:e3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 136], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 502823, 'reachable_time': 34600, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 238597, 'error': None, 'target': 'ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:52.718 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[ac67c146-5adf-43e6-97a9-ee8fcf61db6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:52 compute-0 nova_compute[192810]: 2025-09-30 21:38:52.740 2 DEBUG oslo_concurrency.lockutils [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:52 compute-0 nova_compute[192810]: 2025-09-30 21:38:52.741 2 DEBUG oslo_concurrency.lockutils [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:52 compute-0 nova_compute[192810]: 2025-09-30 21:38:52.750 2 DEBUG nova.virt.hardware [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:38:52 compute-0 nova_compute[192810]: 2025-09-30 21:38:52.750 2 INFO nova.compute.claims [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:38:52 compute-0 nova_compute[192810]: 2025-09-30 21:38:52.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:52.792 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[cdcf74b2-5524-4da4-a311-2c53b3f71acf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:52.793 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap27086519-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:52.793 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:52.794 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap27086519-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:38:52 compute-0 kernel: tap27086519-60: entered promiscuous mode
Sep 30 21:38:52 compute-0 nova_compute[192810]: 2025-09-30 21:38:52.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:52 compute-0 NetworkManager[51733]: <info>  [1759268332.7966] manager: (tap27086519-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/202)
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:52.800 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap27086519-60, col_values=(('external_ids', {'iface-id': 'f2abb4ad-797b-4767-b8bc-377990516394'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:38:52 compute-0 nova_compute[192810]: 2025-09-30 21:38:52.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:52 compute-0 ovn_controller[94912]: 2025-09-30T21:38:52Z|00457|binding|INFO|Releasing lport f2abb4ad-797b-4767-b8bc-377990516394 from this chassis (sb_readonly=0)
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:52.803 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/27086519-6f4c-45f9-8e5b-5b321cd6871c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/27086519-6f4c-45f9-8e5b-5b321cd6871c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:52.804 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[8aa3d195-4286-4039-b65f-741fa71e8c3a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:52.805 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-27086519-6f4c-45f9-8e5b-5b321cd6871c
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/27086519-6f4c-45f9-8e5b-5b321cd6871c.pid.haproxy
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID 27086519-6f4c-45f9-8e5b-5b321cd6871c
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:38:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:52.806 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'env', 'PROCESS_TAG=haproxy-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/27086519-6f4c-45f9-8e5b-5b321cd6871c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:38:52 compute-0 nova_compute[192810]: 2025-09-30 21:38:52.813 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:52 compute-0 nova_compute[192810]: 2025-09-30 21:38:52.899 2 DEBUG nova.compute.provider_tree [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:38:52 compute-0 nova_compute[192810]: 2025-09-30 21:38:52.918 2 DEBUG nova.scheduler.client.report [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:38:52 compute-0 nova_compute[192810]: 2025-09-30 21:38:52.969 2 DEBUG oslo_concurrency.lockutils [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.228s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:52 compute-0 nova_compute[192810]: 2025-09-30 21:38:52.970 2 DEBUG nova.compute.manager [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.088 2 DEBUG nova.compute.manager [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.089 2 DEBUG nova.network.neutron [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.120 2 INFO nova.virt.libvirt.driver [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.149 2 DEBUG nova.compute.manager [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.171 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268333.1711876, 2db1c454-9505-4c8b-aeff-f0bba892690f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.172 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] VM Started (Lifecycle Event)
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.182 2 DEBUG nova.compute.manager [req-4e4e3c13-508e-4b9d-8f25-96835d63d2cd req-fad3dfa3-84e7-4f51-ae27-5cd6c7851fa9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Received event network-vif-plugged-a496b9c3-8b65-4657-95a2-9397df87ab4d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.183 2 DEBUG oslo_concurrency.lockutils [req-4e4e3c13-508e-4b9d-8f25-96835d63d2cd req-fad3dfa3-84e7-4f51-ae27-5cd6c7851fa9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "2db1c454-9505-4c8b-aeff-f0bba892690f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.183 2 DEBUG oslo_concurrency.lockutils [req-4e4e3c13-508e-4b9d-8f25-96835d63d2cd req-fad3dfa3-84e7-4f51-ae27-5cd6c7851fa9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2db1c454-9505-4c8b-aeff-f0bba892690f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.183 2 DEBUG oslo_concurrency.lockutils [req-4e4e3c13-508e-4b9d-8f25-96835d63d2cd req-fad3dfa3-84e7-4f51-ae27-5cd6c7851fa9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2db1c454-9505-4c8b-aeff-f0bba892690f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.184 2 DEBUG nova.compute.manager [req-4e4e3c13-508e-4b9d-8f25-96835d63d2cd req-fad3dfa3-84e7-4f51-ae27-5cd6c7851fa9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Processing event network-vif-plugged-a496b9c3-8b65-4657-95a2-9397df87ab4d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.186 2 DEBUG nova.compute.manager [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.190 2 DEBUG nova.virt.libvirt.driver [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.193 2 INFO nova.virt.libvirt.driver [-] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Instance spawned successfully.
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.193 2 DEBUG nova.virt.libvirt.driver [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:38:53 compute-0 podman[238631]: 2025-09-30 21:38:53.209541823 +0000 UTC m=+0.059060043 container create 37c543a4220b9ced917511fdd65363bbe1728f5252f81d0ddd83af5299c41638 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.225 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.229 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.239 2 DEBUG nova.virt.libvirt.driver [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.239 2 DEBUG nova.virt.libvirt.driver [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.240 2 DEBUG nova.virt.libvirt.driver [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.240 2 DEBUG nova.virt.libvirt.driver [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.241 2 DEBUG nova.virt.libvirt.driver [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.241 2 DEBUG nova.virt.libvirt.driver [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:38:53 compute-0 systemd[1]: Started libpod-conmon-37c543a4220b9ced917511fdd65363bbe1728f5252f81d0ddd83af5299c41638.scope.
Sep 30 21:38:53 compute-0 podman[238631]: 2025-09-30 21:38:53.174202222 +0000 UTC m=+0.023720472 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:38:53 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:38:53 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/31a5798e966bcfb4e031fa73c03938d22f09c879907afe19bc8f3095075b84c5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.281 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.282 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268333.171315, 2db1c454-9505-4c8b-aeff-f0bba892690f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.283 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] VM Paused (Lifecycle Event)
Sep 30 21:38:53 compute-0 podman[238631]: 2025-09-30 21:38:53.285739292 +0000 UTC m=+0.135257542 container init 37c543a4220b9ced917511fdd65363bbe1728f5252f81d0ddd83af5299c41638 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 21:38:53 compute-0 podman[238631]: 2025-09-30 21:38:53.290953762 +0000 UTC m=+0.140471982 container start 37c543a4220b9ced917511fdd65363bbe1728f5252f81d0ddd83af5299c41638 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Sep 30 21:38:53 compute-0 neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c[238646]: [NOTICE]   (238650) : New worker (238652) forked
Sep 30 21:38:53 compute-0 neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c[238646]: [NOTICE]   (238650) : Loading success.
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.399 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.403 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268333.1891074, 2db1c454-9505-4c8b-aeff-f0bba892690f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.403 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] VM Resumed (Lifecycle Event)
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.427 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.431 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.448 2 DEBUG nova.compute.manager [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.450 2 DEBUG nova.virt.libvirt.driver [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.450 2 INFO nova.virt.libvirt.driver [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Creating image(s)
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.450 2 DEBUG oslo_concurrency.lockutils [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Acquiring lock "/var/lib/nova/instances/99707edc-d882-4127-bc26-1fea0a52f9da/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.451 2 DEBUG oslo_concurrency.lockutils [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Lock "/var/lib/nova/instances/99707edc-d882-4127-bc26-1fea0a52f9da/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.451 2 DEBUG oslo_concurrency.lockutils [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Lock "/var/lib/nova/instances/99707edc-d882-4127-bc26-1fea0a52f9da/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.463 2 INFO nova.compute.manager [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Took 7.29 seconds to spawn the instance on the hypervisor.
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.464 2 DEBUG nova.compute.manager [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.464 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.465 2 DEBUG oslo_concurrency.processutils [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.483 2 DEBUG nova.network.neutron [req-2bb31705-91f7-47bf-9cd7-659a969fbb9a req-f6d512b9-fdd6-44d7-8e23-f8636f17923b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Updated VIF entry in instance network info cache for port a496b9c3-8b65-4657-95a2-9397df87ab4d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.483 2 DEBUG nova.network.neutron [req-2bb31705-91f7-47bf-9cd7-659a969fbb9a req-f6d512b9-fdd6-44d7-8e23-f8636f17923b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Updating instance_info_cache with network_info: [{"id": "a496b9c3-8b65-4657-95a2-9397df87ab4d", "address": "fa:16:3e:3f:86:9e", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa496b9c3-8b", "ovs_interfaceid": "a496b9c3-8b65-4657-95a2-9397df87ab4d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.486 2 DEBUG nova.policy [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0b16febbab784cc7b40a988decaa1db8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9c5899f27a1e496da636c4e31f6867c6', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.511 2 DEBUG oslo_concurrency.lockutils [req-2bb31705-91f7-47bf-9cd7-659a969fbb9a req-f6d512b9-fdd6-44d7-8e23-f8636f17923b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-2db1c454-9505-4c8b-aeff-f0bba892690f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.518 2 DEBUG oslo_concurrency.processutils [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.519 2 DEBUG oslo_concurrency.lockutils [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.519 2 DEBUG oslo_concurrency.lockutils [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.529 2 DEBUG oslo_concurrency.processutils [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:38:53 compute-0 sshd-session[238533]: Received disconnect from 45.81.23.80 port 36952:11: Bye Bye [preauth]
Sep 30 21:38:53 compute-0 sshd-session[238533]: Disconnected from authenticating user root 45.81.23.80 port 36952 [preauth]
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.568 2 INFO nova.compute.manager [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Took 7.94 seconds to build instance.
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.582 2 DEBUG oslo_concurrency.processutils [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.583 2 DEBUG oslo_concurrency.processutils [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/99707edc-d882-4127-bc26-1fea0a52f9da/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.602 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268318.5984588, e9e8dea0-93b5-463c-92da-bdac362c3cd9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.603 2 INFO nova.compute.manager [-] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] VM Stopped (Lifecycle Event)
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.611 2 DEBUG oslo_concurrency.lockutils [None req-709fd4cb-1c63-49e9-b37d-04454c4fb7ad 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "2db1c454-9505-4c8b-aeff-f0bba892690f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.617 2 DEBUG oslo_concurrency.processutils [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/99707edc-d882-4127-bc26-1fea0a52f9da/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.617 2 DEBUG oslo_concurrency.lockutils [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.618 2 DEBUG oslo_concurrency.processutils [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.637 2 DEBUG nova.compute.manager [None req-e5f08286-ed41-4b4c-b624-e25ea592f06d - - - - - -] [instance: e9e8dea0-93b5-463c-92da-bdac362c3cd9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.675 2 DEBUG oslo_concurrency.processutils [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.676 2 DEBUG nova.virt.disk.api [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Checking if we can resize image /var/lib/nova/instances/99707edc-d882-4127-bc26-1fea0a52f9da/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.676 2 DEBUG oslo_concurrency.processutils [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/99707edc-d882-4127-bc26-1fea0a52f9da/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.731 2 DEBUG oslo_concurrency.processutils [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/99707edc-d882-4127-bc26-1fea0a52f9da/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.732 2 DEBUG nova.virt.disk.api [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Cannot resize image /var/lib/nova/instances/99707edc-d882-4127-bc26-1fea0a52f9da/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.732 2 DEBUG nova.objects.instance [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Lazy-loading 'migration_context' on Instance uuid 99707edc-d882-4127-bc26-1fea0a52f9da obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.759 2 DEBUG nova.virt.libvirt.driver [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.759 2 DEBUG nova.virt.libvirt.driver [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Ensure instance console log exists: /var/lib/nova/instances/99707edc-d882-4127-bc26-1fea0a52f9da/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.760 2 DEBUG oslo_concurrency.lockutils [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.761 2 DEBUG oslo_concurrency.lockutils [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.761 2 DEBUG oslo_concurrency.lockutils [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:53 compute-0 nova_compute[192810]: 2025-09-30 21:38:53.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:38:54 compute-0 nova_compute[192810]: 2025-09-30 21:38:54.630 2 DEBUG nova.network.neutron [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Successfully created port: e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:38:55 compute-0 nova_compute[192810]: 2025-09-30 21:38:55.783 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:38:55 compute-0 nova_compute[192810]: 2025-09-30 21:38:55.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:38:55 compute-0 nova_compute[192810]: 2025-09-30 21:38:55.787 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:38:55 compute-0 nova_compute[192810]: 2025-09-30 21:38:55.787 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:38:55 compute-0 nova_compute[192810]: 2025-09-30 21:38:55.807 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Sep 30 21:38:56 compute-0 nova_compute[192810]: 2025-09-30 21:38:56.109 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "refresh_cache-2db1c454-9505-4c8b-aeff-f0bba892690f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:38:56 compute-0 nova_compute[192810]: 2025-09-30 21:38:56.109 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquired lock "refresh_cache-2db1c454-9505-4c8b-aeff-f0bba892690f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:38:56 compute-0 nova_compute[192810]: 2025-09-30 21:38:56.109 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Sep 30 21:38:56 compute-0 nova_compute[192810]: 2025-09-30 21:38:56.110 2 DEBUG nova.objects.instance [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lazy-loading 'info_cache' on Instance uuid 2db1c454-9505-4c8b-aeff-f0bba892690f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:38:56 compute-0 nova_compute[192810]: 2025-09-30 21:38:56.137 2 DEBUG nova.network.neutron [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Successfully updated port: e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:38:56 compute-0 nova_compute[192810]: 2025-09-30 21:38:56.155 2 DEBUG oslo_concurrency.lockutils [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Acquiring lock "refresh_cache-99707edc-d882-4127-bc26-1fea0a52f9da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:38:56 compute-0 nova_compute[192810]: 2025-09-30 21:38:56.155 2 DEBUG oslo_concurrency.lockutils [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Acquired lock "refresh_cache-99707edc-d882-4127-bc26-1fea0a52f9da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:38:56 compute-0 nova_compute[192810]: 2025-09-30 21:38:56.155 2 DEBUG nova.network.neutron [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:38:56 compute-0 nova_compute[192810]: 2025-09-30 21:38:56.247 2 DEBUG nova.compute.manager [req-ebd927fb-4f4d-49ba-8ad2-0ae30490a7e3 req-c619f439-d2cb-48c1-8c85-779a10e17ba0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Received event network-vif-plugged-a496b9c3-8b65-4657-95a2-9397df87ab4d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:38:56 compute-0 nova_compute[192810]: 2025-09-30 21:38:56.248 2 DEBUG oslo_concurrency.lockutils [req-ebd927fb-4f4d-49ba-8ad2-0ae30490a7e3 req-c619f439-d2cb-48c1-8c85-779a10e17ba0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "2db1c454-9505-4c8b-aeff-f0bba892690f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:56 compute-0 nova_compute[192810]: 2025-09-30 21:38:56.248 2 DEBUG oslo_concurrency.lockutils [req-ebd927fb-4f4d-49ba-8ad2-0ae30490a7e3 req-c619f439-d2cb-48c1-8c85-779a10e17ba0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2db1c454-9505-4c8b-aeff-f0bba892690f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:56 compute-0 nova_compute[192810]: 2025-09-30 21:38:56.248 2 DEBUG oslo_concurrency.lockutils [req-ebd927fb-4f4d-49ba-8ad2-0ae30490a7e3 req-c619f439-d2cb-48c1-8c85-779a10e17ba0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2db1c454-9505-4c8b-aeff-f0bba892690f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:56 compute-0 nova_compute[192810]: 2025-09-30 21:38:56.248 2 DEBUG nova.compute.manager [req-ebd927fb-4f4d-49ba-8ad2-0ae30490a7e3 req-c619f439-d2cb-48c1-8c85-779a10e17ba0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] No waiting events found dispatching network-vif-plugged-a496b9c3-8b65-4657-95a2-9397df87ab4d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:38:56 compute-0 nova_compute[192810]: 2025-09-30 21:38:56.249 2 WARNING nova.compute.manager [req-ebd927fb-4f4d-49ba-8ad2-0ae30490a7e3 req-c619f439-d2cb-48c1-8c85-779a10e17ba0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Received unexpected event network-vif-plugged-a496b9c3-8b65-4657-95a2-9397df87ab4d for instance with vm_state active and task_state None.
Sep 30 21:38:56 compute-0 nova_compute[192810]: 2025-09-30 21:38:56.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:56 compute-0 nova_compute[192810]: 2025-09-30 21:38:56.326 2 DEBUG nova.network.neutron [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.378 2 DEBUG oslo_concurrency.lockutils [None req-98bf349c-c113-4449-b8a2-04ba951ec782 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "2db1c454-9505-4c8b-aeff-f0bba892690f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.378 2 DEBUG oslo_concurrency.lockutils [None req-98bf349c-c113-4449-b8a2-04ba951ec782 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "2db1c454-9505-4c8b-aeff-f0bba892690f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.379 2 DEBUG oslo_concurrency.lockutils [None req-98bf349c-c113-4449-b8a2-04ba951ec782 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "2db1c454-9505-4c8b-aeff-f0bba892690f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.379 2 DEBUG oslo_concurrency.lockutils [None req-98bf349c-c113-4449-b8a2-04ba951ec782 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "2db1c454-9505-4c8b-aeff-f0bba892690f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.379 2 DEBUG oslo_concurrency.lockutils [None req-98bf349c-c113-4449-b8a2-04ba951ec782 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "2db1c454-9505-4c8b-aeff-f0bba892690f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.392 2 INFO nova.compute.manager [None req-98bf349c-c113-4449-b8a2-04ba951ec782 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Terminating instance
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.402 2 DEBUG nova.compute.manager [None req-98bf349c-c113-4449-b8a2-04ba951ec782 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:38:57 compute-0 kernel: tapa496b9c3-8b (unregistering): left promiscuous mode
Sep 30 21:38:57 compute-0 NetworkManager[51733]: <info>  [1759268337.4193] device (tapa496b9c3-8b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:38:57 compute-0 ovn_controller[94912]: 2025-09-30T21:38:57Z|00458|binding|INFO|Releasing lport a496b9c3-8b65-4657-95a2-9397df87ab4d from this chassis (sb_readonly=0)
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.432 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:57 compute-0 ovn_controller[94912]: 2025-09-30T21:38:57Z|00459|binding|INFO|Setting lport a496b9c3-8b65-4657-95a2-9397df87ab4d down in Southbound
Sep 30 21:38:57 compute-0 ovn_controller[94912]: 2025-09-30T21:38:57Z|00460|binding|INFO|Removing iface tapa496b9c3-8b ovn-installed in OVS
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:57.442 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3f:86:9e 10.100.0.10'], port_security=['fa:16:3e:3f:86:9e 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '2db1c454-9505-4c8b-aeff-f0bba892690f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8ad754242d964bb487a2174b2c21bcc5', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9c41899e-24c3-4632-81c5-100a69d8be81', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f4d6c701-a212-4977-9c52-b553d410c9c7, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=a496b9c3-8b65-4657-95a2-9397df87ab4d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:38:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:57.444 103867 INFO neutron.agent.ovn.metadata.agent [-] Port a496b9c3-8b65-4657-95a2-9397df87ab4d in datapath 27086519-6f4c-45f9-8e5b-5b321cd6871c unbound from our chassis
Sep 30 21:38:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:57.446 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 27086519-6f4c-45f9-8e5b-5b321cd6871c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:38:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:57.448 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[89d0fd1e-cb0c-47d9-b8d4-73bfd52f3358]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:57.449 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c namespace which is not needed anymore
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:57 compute-0 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d0000007a.scope: Deactivated successfully.
Sep 30 21:38:57 compute-0 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d0000007a.scope: Consumed 4.860s CPU time.
Sep 30 21:38:57 compute-0 systemd-machined[152794]: Machine qemu-58-instance-0000007a terminated.
Sep 30 21:38:57 compute-0 neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c[238646]: [NOTICE]   (238650) : haproxy version is 2.8.14-c23fe91
Sep 30 21:38:57 compute-0 neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c[238646]: [NOTICE]   (238650) : path to executable is /usr/sbin/haproxy
Sep 30 21:38:57 compute-0 neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c[238646]: [WARNING]  (238650) : Exiting Master process...
Sep 30 21:38:57 compute-0 neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c[238646]: [WARNING]  (238650) : Exiting Master process...
Sep 30 21:38:57 compute-0 neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c[238646]: [ALERT]    (238650) : Current worker (238652) exited with code 143 (Terminated)
Sep 30 21:38:57 compute-0 neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c[238646]: [WARNING]  (238650) : All workers exited. Exiting... (0)
Sep 30 21:38:57 compute-0 systemd[1]: libpod-37c543a4220b9ced917511fdd65363bbe1728f5252f81d0ddd83af5299c41638.scope: Deactivated successfully.
Sep 30 21:38:57 compute-0 podman[238700]: 2025-09-30 21:38:57.570786256 +0000 UTC m=+0.042999663 container died 37c543a4220b9ced917511fdd65363bbe1728f5252f81d0ddd83af5299c41638 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.581 2 DEBUG nova.network.neutron [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Updating instance_info_cache with network_info: [{"id": "e9ffeb79-64b3-4a68-93a9-3d612ba40eb5", "address": "fa:16:3e:e9:08:4f", "network": {"id": "ba09ae5f-6031-4fbb-acc3-d581f9322784", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1364438883-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "9c5899f27a1e496da636c4e31f6867c6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9ffeb79-64", "ovs_interfaceid": "e9ffeb79-64b3-4a68-93a9-3d612ba40eb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:38:57 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-37c543a4220b9ced917511fdd65363bbe1728f5252f81d0ddd83af5299c41638-userdata-shm.mount: Deactivated successfully.
Sep 30 21:38:57 compute-0 systemd[1]: var-lib-containers-storage-overlay-31a5798e966bcfb4e031fa73c03938d22f09c879907afe19bc8f3095075b84c5-merged.mount: Deactivated successfully.
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.619 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.624 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.634 2 DEBUG oslo_concurrency.lockutils [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Releasing lock "refresh_cache-99707edc-d882-4127-bc26-1fea0a52f9da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.635 2 DEBUG nova.compute.manager [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Instance network_info: |[{"id": "e9ffeb79-64b3-4a68-93a9-3d612ba40eb5", "address": "fa:16:3e:e9:08:4f", "network": {"id": "ba09ae5f-6031-4fbb-acc3-d581f9322784", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1364438883-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "9c5899f27a1e496da636c4e31f6867c6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9ffeb79-64", "ovs_interfaceid": "e9ffeb79-64b3-4a68-93a9-3d612ba40eb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.636 2 DEBUG nova.virt.libvirt.driver [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Start _get_guest_xml network_info=[{"id": "e9ffeb79-64b3-4a68-93a9-3d612ba40eb5", "address": "fa:16:3e:e9:08:4f", "network": {"id": "ba09ae5f-6031-4fbb-acc3-d581f9322784", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1364438883-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "9c5899f27a1e496da636c4e31f6867c6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9ffeb79-64", "ovs_interfaceid": "e9ffeb79-64b3-4a68-93a9-3d612ba40eb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.645 2 WARNING nova.virt.libvirt.driver [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.654 2 DEBUG nova.virt.libvirt.host [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.655 2 DEBUG nova.virt.libvirt.host [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.658 2 DEBUG nova.virt.libvirt.host [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.659 2 DEBUG nova.virt.libvirt.host [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.660 2 DEBUG nova.virt.libvirt.driver [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.660 2 DEBUG nova.virt.hardware [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.660 2 DEBUG nova.virt.hardware [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.661 2 DEBUG nova.virt.hardware [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.661 2 DEBUG nova.virt.hardware [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.661 2 DEBUG nova.virt.hardware [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.661 2 DEBUG nova.virt.hardware [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.661 2 DEBUG nova.virt.hardware [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.662 2 DEBUG nova.virt.hardware [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.662 2 DEBUG nova.virt.hardware [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.662 2 DEBUG nova.virt.hardware [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.662 2 DEBUG nova.virt.hardware [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.665 2 DEBUG nova.virt.libvirt.vif [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:38:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-1575903998',display_name='tempest-ServerRescueTestJSON-server-1575903998',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-1575903998',id=124,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9c5899f27a1e496da636c4e31f6867c6',ramdisk_id='',reservation_id='r-605fltlb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSON-1530357589',owner_user_name='tempest-ServerRescueTestJSON-1530357589-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:38:53Z,user_data=None,user_id='0b16febbab784cc7b40a988decaa1db8',uuid=99707edc-d882-4127-bc26-1fea0a52f9da,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e9ffeb79-64b3-4a68-93a9-3d612ba40eb5", "address": "fa:16:3e:e9:08:4f", "network": {"id": "ba09ae5f-6031-4fbb-acc3-d581f9322784", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1364438883-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "9c5899f27a1e496da636c4e31f6867c6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9ffeb79-64", "ovs_interfaceid": "e9ffeb79-64b3-4a68-93a9-3d612ba40eb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.666 2 DEBUG nova.network.os_vif_util [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Converting VIF {"id": "e9ffeb79-64b3-4a68-93a9-3d612ba40eb5", "address": "fa:16:3e:e9:08:4f", "network": {"id": "ba09ae5f-6031-4fbb-acc3-d581f9322784", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1364438883-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "9c5899f27a1e496da636c4e31f6867c6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9ffeb79-64", "ovs_interfaceid": "e9ffeb79-64b3-4a68-93a9-3d612ba40eb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.666 2 DEBUG nova.network.os_vif_util [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e9:08:4f,bridge_name='br-int',has_traffic_filtering=True,id=e9ffeb79-64b3-4a68-93a9-3d612ba40eb5,network=Network(ba09ae5f-6031-4fbb-acc3-d581f9322784),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape9ffeb79-64') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.667 2 DEBUG nova.objects.instance [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Lazy-loading 'pci_devices' on Instance uuid 99707edc-d882-4127-bc26-1fea0a52f9da obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.668 2 INFO nova.virt.libvirt.driver [-] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Instance destroyed successfully.
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.669 2 DEBUG nova.objects.instance [None req-98bf349c-c113-4449-b8a2-04ba951ec782 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lazy-loading 'resources' on Instance uuid 2db1c454-9505-4c8b-aeff-f0bba892690f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.671 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:57 compute-0 podman[238700]: 2025-09-30 21:38:57.681657439 +0000 UTC m=+0.153870846 container cleanup 37c543a4220b9ced917511fdd65363bbe1728f5252f81d0ddd83af5299c41638 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.686 2 DEBUG nova.virt.libvirt.driver [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:38:57 compute-0 nova_compute[192810]:   <uuid>99707edc-d882-4127-bc26-1fea0a52f9da</uuid>
Sep 30 21:38:57 compute-0 nova_compute[192810]:   <name>instance-0000007c</name>
Sep 30 21:38:57 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:38:57 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:38:57 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:38:57 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:       <nova:name>tempest-ServerRescueTestJSON-server-1575903998</nova:name>
Sep 30 21:38:57 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:38:57</nova:creationTime>
Sep 30 21:38:57 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:38:57 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:38:57 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:38:57 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:38:57 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:38:57 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:38:57 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:38:57 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:38:57 compute-0 nova_compute[192810]:         <nova:user uuid="0b16febbab784cc7b40a988decaa1db8">tempest-ServerRescueTestJSON-1530357589-project-member</nova:user>
Sep 30 21:38:57 compute-0 nova_compute[192810]:         <nova:project uuid="9c5899f27a1e496da636c4e31f6867c6">tempest-ServerRescueTestJSON-1530357589</nova:project>
Sep 30 21:38:57 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:38:57 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:38:57 compute-0 nova_compute[192810]:         <nova:port uuid="e9ffeb79-64b3-4a68-93a9-3d612ba40eb5">
Sep 30 21:38:57 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:38:57 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:38:57 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:38:57 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:38:57 compute-0 nova_compute[192810]:     <system>
Sep 30 21:38:57 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:38:57 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:38:57 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:38:57 compute-0 nova_compute[192810]:       <entry name="serial">99707edc-d882-4127-bc26-1fea0a52f9da</entry>
Sep 30 21:38:57 compute-0 nova_compute[192810]:       <entry name="uuid">99707edc-d882-4127-bc26-1fea0a52f9da</entry>
Sep 30 21:38:57 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     </system>
Sep 30 21:38:57 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:38:57 compute-0 nova_compute[192810]:   <os>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:   </os>
Sep 30 21:38:57 compute-0 nova_compute[192810]:   <features>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:   </features>
Sep 30 21:38:57 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:38:57 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:38:57 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:38:57 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:38:57 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:38:57 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/99707edc-d882-4127-bc26-1fea0a52f9da/disk"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:38:57 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/99707edc-d882-4127-bc26-1fea0a52f9da/disk.config"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:38:57 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:e9:08:4f"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:       <target dev="tape9ffeb79-64"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:38:57 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/99707edc-d882-4127-bc26-1fea0a52f9da/console.log" append="off"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     <video>
Sep 30 21:38:57 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     </video>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:38:57 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:38:57 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:38:57 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:38:57 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:38:57 compute-0 nova_compute[192810]: </domain>
Sep 30 21:38:57 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.687 2 DEBUG nova.compute.manager [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Preparing to wait for external event network-vif-plugged-e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.687 2 DEBUG oslo_concurrency.lockutils [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Acquiring lock "99707edc-d882-4127-bc26-1fea0a52f9da-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.687 2 DEBUG oslo_concurrency.lockutils [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Lock "99707edc-d882-4127-bc26-1fea0a52f9da-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.688 2 DEBUG oslo_concurrency.lockutils [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Lock "99707edc-d882-4127-bc26-1fea0a52f9da-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:57 compute-0 systemd[1]: libpod-conmon-37c543a4220b9ced917511fdd65363bbe1728f5252f81d0ddd83af5299c41638.scope: Deactivated successfully.
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.689 2 DEBUG nova.virt.libvirt.vif [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:38:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-1575903998',display_name='tempest-ServerRescueTestJSON-server-1575903998',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-1575903998',id=124,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9c5899f27a1e496da636c4e31f6867c6',ramdisk_id='',reservation_id='r-605fltlb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSON-1530357589',owner_user_name='tempest-ServerRescueTestJSON-1530357589-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:38:53Z,user_data=None,user_id='0b16febbab784cc7b40a988decaa1db8',uuid=99707edc-d882-4127-bc26-1fea0a52f9da,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e9ffeb79-64b3-4a68-93a9-3d612ba40eb5", "address": "fa:16:3e:e9:08:4f", "network": {"id": "ba09ae5f-6031-4fbb-acc3-d581f9322784", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1364438883-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "9c5899f27a1e496da636c4e31f6867c6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9ffeb79-64", "ovs_interfaceid": "e9ffeb79-64b3-4a68-93a9-3d612ba40eb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.689 2 DEBUG nova.network.os_vif_util [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Converting VIF {"id": "e9ffeb79-64b3-4a68-93a9-3d612ba40eb5", "address": "fa:16:3e:e9:08:4f", "network": {"id": "ba09ae5f-6031-4fbb-acc3-d581f9322784", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1364438883-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "9c5899f27a1e496da636c4e31f6867c6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9ffeb79-64", "ovs_interfaceid": "e9ffeb79-64b3-4a68-93a9-3d612ba40eb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.690 2 DEBUG nova.network.os_vif_util [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e9:08:4f,bridge_name='br-int',has_traffic_filtering=True,id=e9ffeb79-64b3-4a68-93a9-3d612ba40eb5,network=Network(ba09ae5f-6031-4fbb-acc3-d581f9322784),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape9ffeb79-64') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.690 2 DEBUG os_vif [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e9:08:4f,bridge_name='br-int',has_traffic_filtering=True,id=e9ffeb79-64b3-4a68-93a9-3d612ba40eb5,network=Network(ba09ae5f-6031-4fbb-acc3-d581f9322784),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape9ffeb79-64') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.690 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.691 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.691 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.693 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.693 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape9ffeb79-64, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.693 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape9ffeb79-64, col_values=(('external_ids', {'iface-id': 'e9ffeb79-64b3-4a68-93a9-3d612ba40eb5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e9:08:4f', 'vm-uuid': '99707edc-d882-4127-bc26-1fea0a52f9da'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:57 compute-0 NetworkManager[51733]: <info>  [1759268337.6953] manager: (tape9ffeb79-64): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/203)
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.699 2 DEBUG nova.virt.libvirt.vif [None req-98bf349c-c113-4449-b8a2-04ba951ec782 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:38:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1784817975',display_name='tempest-ServersTestJSON-server-1784817975',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-1784817975',id=122,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGvbtweNmFxts2/tslAymhodRAIe1w+LJ2OaWGMLWl5s/fQt03GjecTpwMUZqd5bDeiGJP7PxYtdxdbKI5U2h4I8kTyRzq5YQRRqGvLtlhlbtUh6kjFsgMZ4R39tnRXIPw==',key_name='tempest-key-846567156',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:38:53Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8ad754242d964bb487a2174b2c21bcc5',ramdisk_id='',reservation_id='r-6b319e6a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-782690373',owner_user_name='tempest-ServersTestJSON-782690373-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:38:53Z,user_data=None,user_id='30d0a975d78c4d9a8e2201afdc040092',uuid=2db1c454-9505-4c8b-aeff-f0bba892690f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a496b9c3-8b65-4657-95a2-9397df87ab4d", "address": "fa:16:3e:3f:86:9e", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa496b9c3-8b", "ovs_interfaceid": "a496b9c3-8b65-4657-95a2-9397df87ab4d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.700 2 DEBUG nova.network.os_vif_util [None req-98bf349c-c113-4449-b8a2-04ba951ec782 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Converting VIF {"id": "a496b9c3-8b65-4657-95a2-9397df87ab4d", "address": "fa:16:3e:3f:86:9e", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa496b9c3-8b", "ovs_interfaceid": "a496b9c3-8b65-4657-95a2-9397df87ab4d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.701 2 DEBUG nova.network.os_vif_util [None req-98bf349c-c113-4449-b8a2-04ba951ec782 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3f:86:9e,bridge_name='br-int',has_traffic_filtering=True,id=a496b9c3-8b65-4657-95a2-9397df87ab4d,network=Network(27086519-6f4c-45f9-8e5b-5b321cd6871c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa496b9c3-8b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.701 2 DEBUG os_vif [None req-98bf349c-c113-4449-b8a2-04ba951ec782 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3f:86:9e,bridge_name='br-int',has_traffic_filtering=True,id=a496b9c3-8b65-4657-95a2-9397df87ab4d,network=Network(27086519-6f4c-45f9-8e5b-5b321cd6871c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa496b9c3-8b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.703 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.704 2 INFO os_vif [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e9:08:4f,bridge_name='br-int',has_traffic_filtering=True,id=e9ffeb79-64b3-4a68-93a9-3d612ba40eb5,network=Network(ba09ae5f-6031-4fbb-acc3-d581f9322784),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape9ffeb79-64')
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.705 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.705 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa496b9c3-8b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.706 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.708 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.711 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.713 2 INFO os_vif [None req-98bf349c-c113-4449-b8a2-04ba951ec782 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3f:86:9e,bridge_name='br-int',has_traffic_filtering=True,id=a496b9c3-8b65-4657-95a2-9397df87ab4d,network=Network(27086519-6f4c-45f9-8e5b-5b321cd6871c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa496b9c3-8b')
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.714 2 INFO nova.virt.libvirt.driver [None req-98bf349c-c113-4449-b8a2-04ba951ec782 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Deleting instance files /var/lib/nova/instances/2db1c454-9505-4c8b-aeff-f0bba892690f_del
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.714 2 INFO nova.virt.libvirt.driver [None req-98bf349c-c113-4449-b8a2-04ba951ec782 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Deletion of /var/lib/nova/instances/2db1c454-9505-4c8b-aeff-f0bba892690f_del complete
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.776 2 DEBUG nova.virt.libvirt.driver [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.776 2 DEBUG nova.virt.libvirt.driver [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.776 2 DEBUG nova.virt.libvirt.driver [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] No VIF found with MAC fa:16:3e:e9:08:4f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.777 2 INFO nova.virt.libvirt.driver [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Using config drive
Sep 30 21:38:57 compute-0 podman[238747]: 2025-09-30 21:38:57.792991484 +0000 UTC m=+0.092233200 container remove 37c543a4220b9ced917511fdd65363bbe1728f5252f81d0ddd83af5299c41638 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_managed=true, org.label-schema.vendor=CentOS)
Sep 30 21:38:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:57.798 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[ec002372-43c8-4378-9d91-2957f1a8b72a]: (4, ('Tue Sep 30 09:38:57 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c (37c543a4220b9ced917511fdd65363bbe1728f5252f81d0ddd83af5299c41638)\n37c543a4220b9ced917511fdd65363bbe1728f5252f81d0ddd83af5299c41638\nTue Sep 30 09:38:57 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c (37c543a4220b9ced917511fdd65363bbe1728f5252f81d0ddd83af5299c41638)\n37c543a4220b9ced917511fdd65363bbe1728f5252f81d0ddd83af5299c41638\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:57.799 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[7ff0d7ae-55f3-4d01-a433-04c51b85c2c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:57.800 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap27086519-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:38:57 compute-0 kernel: tap27086519-60: left promiscuous mode
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.814 2 INFO nova.compute.manager [None req-98bf349c-c113-4449-b8a2-04ba951ec782 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Took 0.41 seconds to destroy the instance on the hypervisor.
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.815 2 DEBUG oslo.service.loopingcall [None req-98bf349c-c113-4449-b8a2-04ba951ec782 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.815 2 DEBUG nova.compute.manager [-] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.815 2 DEBUG nova.network.neutron [-] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.818 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:57.820 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[6a42de18-915a-4cfb-a50f-567717702739]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:57.851 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[c56e31e4-eb60-432d-bd36-f028d7bc2350]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:57.852 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[7beedef8-7629-4fc0-95e8-b1d3d4f7a7ba]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:57.866 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e012bb0a-f8d3-4e21-857d-f0e7e9d6b1ac]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 502816, 'reachable_time': 43310, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238770, 'error': None, 'target': 'ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:57 compute-0 systemd[1]: run-netns-ovnmeta\x2d27086519\x2d6f4c\x2d45f9\x2d8e5b\x2d5b321cd6871c.mount: Deactivated successfully.
Sep 30 21:38:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:57.868 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:38:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:57.869 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[a1d4412e-d408-4653-bbf8-8981cb29e844]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.910 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Updating instance_info_cache with network_info: [{"id": "a496b9c3-8b65-4657-95a2-9397df87ab4d", "address": "fa:16:3e:3f:86:9e", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa496b9c3-8b", "ovs_interfaceid": "a496b9c3-8b65-4657-95a2-9397df87ab4d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.924 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Releasing lock "refresh_cache-2db1c454-9505-4c8b-aeff-f0bba892690f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.925 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.925 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.925 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.925 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.947 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.948 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.948 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:57 compute-0 nova_compute[192810]: 2025-09-30 21:38:57.948 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:38:58 compute-0 nova_compute[192810]: 2025-09-30 21:38:58.020 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/99707edc-d882-4127-bc26-1fea0a52f9da/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:38:58 compute-0 nova_compute[192810]: 2025-09-30 21:38:58.094 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/99707edc-d882-4127-bc26-1fea0a52f9da/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:38:58 compute-0 nova_compute[192810]: 2025-09-30 21:38:58.095 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/99707edc-d882-4127-bc26-1fea0a52f9da/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:38:58 compute-0 nova_compute[192810]: 2025-09-30 21:38:58.155 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/99707edc-d882-4127-bc26-1fea0a52f9da/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:38:58 compute-0 nova_compute[192810]: 2025-09-30 21:38:58.157 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Periodic task is updating the host stat, it is trying to get disk instance-0000007c, but disk file was removed by concurrent operations such as resize.: FileNotFoundError: [Errno 2] No such file or directory: '/var/lib/nova/instances/99707edc-d882-4127-bc26-1fea0a52f9da/disk.config'
Sep 30 21:38:58 compute-0 nova_compute[192810]: 2025-09-30 21:38:58.279 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:38:58 compute-0 nova_compute[192810]: 2025-09-30 21:38:58.281 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5689MB free_disk=73.25006484985352GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:38:58 compute-0 nova_compute[192810]: 2025-09-30 21:38:58.281 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:58 compute-0 nova_compute[192810]: 2025-09-30 21:38:58.281 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:58 compute-0 nova_compute[192810]: 2025-09-30 21:38:58.362 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Instance 2db1c454-9505-4c8b-aeff-f0bba892690f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:38:58 compute-0 nova_compute[192810]: 2025-09-30 21:38:58.363 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Instance 99707edc-d882-4127-bc26-1fea0a52f9da actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:38:58 compute-0 nova_compute[192810]: 2025-09-30 21:38:58.363 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:38:58 compute-0 nova_compute[192810]: 2025-09-30 21:38:58.363 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:38:58 compute-0 nova_compute[192810]: 2025-09-30 21:38:58.397 2 DEBUG nova.compute.manager [req-c617a8c3-a23b-459c-a2d6-1a02f6064ceb req-5605f78d-3781-4526-a552-d794b17d3b8e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Received event network-changed-e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:38:58 compute-0 nova_compute[192810]: 2025-09-30 21:38:58.397 2 DEBUG nova.compute.manager [req-c617a8c3-a23b-459c-a2d6-1a02f6064ceb req-5605f78d-3781-4526-a552-d794b17d3b8e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Refreshing instance network info cache due to event network-changed-e9ffeb79-64b3-4a68-93a9-3d612ba40eb5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:38:58 compute-0 nova_compute[192810]: 2025-09-30 21:38:58.397 2 DEBUG oslo_concurrency.lockutils [req-c617a8c3-a23b-459c-a2d6-1a02f6064ceb req-5605f78d-3781-4526-a552-d794b17d3b8e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-99707edc-d882-4127-bc26-1fea0a52f9da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:38:58 compute-0 nova_compute[192810]: 2025-09-30 21:38:58.398 2 DEBUG oslo_concurrency.lockutils [req-c617a8c3-a23b-459c-a2d6-1a02f6064ceb req-5605f78d-3781-4526-a552-d794b17d3b8e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-99707edc-d882-4127-bc26-1fea0a52f9da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:38:58 compute-0 nova_compute[192810]: 2025-09-30 21:38:58.398 2 DEBUG nova.network.neutron [req-c617a8c3-a23b-459c-a2d6-1a02f6064ceb req-5605f78d-3781-4526-a552-d794b17d3b8e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Refreshing network info cache for port e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:38:58 compute-0 nova_compute[192810]: 2025-09-30 21:38:58.409 2 INFO nova.virt.libvirt.driver [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Creating config drive at /var/lib/nova/instances/99707edc-d882-4127-bc26-1fea0a52f9da/disk.config
Sep 30 21:38:58 compute-0 nova_compute[192810]: 2025-09-30 21:38:58.413 2 DEBUG oslo_concurrency.processutils [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/99707edc-d882-4127-bc26-1fea0a52f9da/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpw9rti7ft execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:38:58 compute-0 nova_compute[192810]: 2025-09-30 21:38:58.484 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:38:58 compute-0 nova_compute[192810]: 2025-09-30 21:38:58.511 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:38:58 compute-0 nova_compute[192810]: 2025-09-30 21:38:58.536 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:38:58 compute-0 nova_compute[192810]: 2025-09-30 21:38:58.536 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.255s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:58 compute-0 nova_compute[192810]: 2025-09-30 21:38:58.542 2 DEBUG oslo_concurrency.processutils [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/99707edc-d882-4127-bc26-1fea0a52f9da/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpw9rti7ft" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:38:58 compute-0 kernel: tape9ffeb79-64: entered promiscuous mode
Sep 30 21:38:58 compute-0 NetworkManager[51733]: <info>  [1759268338.6001] manager: (tape9ffeb79-64): new Tun device (/org/freedesktop/NetworkManager/Devices/204)
Sep 30 21:38:58 compute-0 systemd-udevd[238680]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:38:58 compute-0 ovn_controller[94912]: 2025-09-30T21:38:58Z|00461|binding|INFO|Claiming lport e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 for this chassis.
Sep 30 21:38:58 compute-0 ovn_controller[94912]: 2025-09-30T21:38:58Z|00462|binding|INFO|e9ffeb79-64b3-4a68-93a9-3d612ba40eb5: Claiming fa:16:3e:e9:08:4f 10.100.0.8
Sep 30 21:38:58 compute-0 nova_compute[192810]: 2025-09-30 21:38:58.601 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:58 compute-0 nova_compute[192810]: 2025-09-30 21:38:58.603 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:58 compute-0 NetworkManager[51733]: <info>  [1759268338.6128] device (tape9ffeb79-64): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:38:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:58.612 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e9:08:4f 10.100.0.8'], port_security=['fa:16:3e:e9:08:4f 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '99707edc-d882-4127-bc26-1fea0a52f9da', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ba09ae5f-6031-4fbb-acc3-d581f9322784', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9c5899f27a1e496da636c4e31f6867c6', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f74cc213-8625-4465-9150-d91cf93d6153', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=319a5a77-493a-41ce-8d81-8ffce7598f19, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=e9ffeb79-64b3-4a68-93a9-3d612ba40eb5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:38:58 compute-0 NetworkManager[51733]: <info>  [1759268338.6135] device (tape9ffeb79-64): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:38:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:58.614 103867 INFO neutron.agent.ovn.metadata.agent [-] Port e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 in datapath ba09ae5f-6031-4fbb-acc3-d581f9322784 bound to our chassis
Sep 30 21:38:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:58.614 103867 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network ba09ae5f-6031-4fbb-acc3-d581f9322784 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Sep 30 21:38:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:38:58.615 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[8ad281e1-4e0e-4ab5-b453-27d598a07a6c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:38:58 compute-0 nova_compute[192810]: 2025-09-30 21:38:58.619 2 DEBUG nova.network.neutron [-] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:38:58 compute-0 systemd-machined[152794]: New machine qemu-59-instance-0000007c.
Sep 30 21:38:58 compute-0 ovn_controller[94912]: 2025-09-30T21:38:58Z|00463|binding|INFO|Setting lport e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 ovn-installed in OVS
Sep 30 21:38:58 compute-0 ovn_controller[94912]: 2025-09-30T21:38:58Z|00464|binding|INFO|Setting lport e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 up in Southbound
Sep 30 21:38:58 compute-0 systemd[1]: Started Virtual Machine qemu-59-instance-0000007c.
Sep 30 21:38:58 compute-0 nova_compute[192810]: 2025-09-30 21:38:58.778 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:38:58 compute-0 nova_compute[192810]: 2025-09-30 21:38:58.779 2 INFO nova.compute.manager [-] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Took 0.96 seconds to deallocate network for instance.
Sep 30 21:38:58 compute-0 nova_compute[192810]: 2025-09-30 21:38:58.859 2 DEBUG oslo_concurrency.lockutils [None req-98bf349c-c113-4449-b8a2-04ba951ec782 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:58 compute-0 nova_compute[192810]: 2025-09-30 21:38:58.860 2 DEBUG oslo_concurrency.lockutils [None req-98bf349c-c113-4449-b8a2-04ba951ec782 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:58 compute-0 nova_compute[192810]: 2025-09-30 21:38:58.914 2 DEBUG nova.compute.provider_tree [None req-98bf349c-c113-4449-b8a2-04ba951ec782 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:38:58 compute-0 nova_compute[192810]: 2025-09-30 21:38:58.930 2 DEBUG nova.scheduler.client.report [None req-98bf349c-c113-4449-b8a2-04ba951ec782 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:38:58 compute-0 nova_compute[192810]: 2025-09-30 21:38:58.949 2 DEBUG oslo_concurrency.lockutils [None req-98bf349c-c113-4449-b8a2-04ba951ec782 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.090s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:58 compute-0 nova_compute[192810]: 2025-09-30 21:38:58.980 2 INFO nova.scheduler.client.report [None req-98bf349c-c113-4449-b8a2-04ba951ec782 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Deleted allocations for instance 2db1c454-9505-4c8b-aeff-f0bba892690f
Sep 30 21:38:59 compute-0 nova_compute[192810]: 2025-09-30 21:38:59.081 2 DEBUG oslo_concurrency.lockutils [None req-98bf349c-c113-4449-b8a2-04ba951ec782 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "2db1c454-9505-4c8b-aeff-f0bba892690f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.702s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:59 compute-0 nova_compute[192810]: 2025-09-30 21:38:59.469 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268339.4687564, 99707edc-d882-4127-bc26-1fea0a52f9da => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:38:59 compute-0 nova_compute[192810]: 2025-09-30 21:38:59.469 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] VM Started (Lifecycle Event)
Sep 30 21:38:59 compute-0 nova_compute[192810]: 2025-09-30 21:38:59.492 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:38:59 compute-0 nova_compute[192810]: 2025-09-30 21:38:59.496 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268339.4697654, 99707edc-d882-4127-bc26-1fea0a52f9da => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:38:59 compute-0 nova_compute[192810]: 2025-09-30 21:38:59.496 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] VM Paused (Lifecycle Event)
Sep 30 21:38:59 compute-0 nova_compute[192810]: 2025-09-30 21:38:59.511 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:38:59 compute-0 nova_compute[192810]: 2025-09-30 21:38:59.514 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:38:59 compute-0 nova_compute[192810]: 2025-09-30 21:38:59.532 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:38:59 compute-0 nova_compute[192810]: 2025-09-30 21:38:59.816 2 DEBUG nova.network.neutron [req-c617a8c3-a23b-459c-a2d6-1a02f6064ceb req-5605f78d-3781-4526-a552-d794b17d3b8e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Updated VIF entry in instance network info cache for port e9ffeb79-64b3-4a68-93a9-3d612ba40eb5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:38:59 compute-0 nova_compute[192810]: 2025-09-30 21:38:59.817 2 DEBUG nova.network.neutron [req-c617a8c3-a23b-459c-a2d6-1a02f6064ceb req-5605f78d-3781-4526-a552-d794b17d3b8e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Updating instance_info_cache with network_info: [{"id": "e9ffeb79-64b3-4a68-93a9-3d612ba40eb5", "address": "fa:16:3e:e9:08:4f", "network": {"id": "ba09ae5f-6031-4fbb-acc3-d581f9322784", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1364438883-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "9c5899f27a1e496da636c4e31f6867c6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9ffeb79-64", "ovs_interfaceid": "e9ffeb79-64b3-4a68-93a9-3d612ba40eb5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:38:59 compute-0 nova_compute[192810]: 2025-09-30 21:38:59.843 2 DEBUG oslo_concurrency.lockutils [req-c617a8c3-a23b-459c-a2d6-1a02f6064ceb req-5605f78d-3781-4526-a552-d794b17d3b8e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-99707edc-d882-4127-bc26-1fea0a52f9da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:38:59 compute-0 nova_compute[192810]: 2025-09-30 21:38:59.843 2 DEBUG nova.compute.manager [req-c617a8c3-a23b-459c-a2d6-1a02f6064ceb req-5605f78d-3781-4526-a552-d794b17d3b8e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Received event network-vif-unplugged-a496b9c3-8b65-4657-95a2-9397df87ab4d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:38:59 compute-0 nova_compute[192810]: 2025-09-30 21:38:59.844 2 DEBUG oslo_concurrency.lockutils [req-c617a8c3-a23b-459c-a2d6-1a02f6064ceb req-5605f78d-3781-4526-a552-d794b17d3b8e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "2db1c454-9505-4c8b-aeff-f0bba892690f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:38:59 compute-0 nova_compute[192810]: 2025-09-30 21:38:59.844 2 DEBUG oslo_concurrency.lockutils [req-c617a8c3-a23b-459c-a2d6-1a02f6064ceb req-5605f78d-3781-4526-a552-d794b17d3b8e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2db1c454-9505-4c8b-aeff-f0bba892690f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:38:59 compute-0 nova_compute[192810]: 2025-09-30 21:38:59.844 2 DEBUG oslo_concurrency.lockutils [req-c617a8c3-a23b-459c-a2d6-1a02f6064ceb req-5605f78d-3781-4526-a552-d794b17d3b8e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2db1c454-9505-4c8b-aeff-f0bba892690f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:38:59 compute-0 nova_compute[192810]: 2025-09-30 21:38:59.844 2 DEBUG nova.compute.manager [req-c617a8c3-a23b-459c-a2d6-1a02f6064ceb req-5605f78d-3781-4526-a552-d794b17d3b8e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] No waiting events found dispatching network-vif-unplugged-a496b9c3-8b65-4657-95a2-9397df87ab4d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:38:59 compute-0 nova_compute[192810]: 2025-09-30 21:38:59.845 2 DEBUG nova.compute.manager [req-c617a8c3-a23b-459c-a2d6-1a02f6064ceb req-5605f78d-3781-4526-a552-d794b17d3b8e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Received event network-vif-unplugged-a496b9c3-8b65-4657-95a2-9397df87ab4d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:39:00 compute-0 nova_compute[192810]: 2025-09-30 21:39:00.398 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:39:00 compute-0 nova_compute[192810]: 2025-09-30 21:39:00.542 2 DEBUG nova.compute.manager [req-7c4dd0a1-c053-47f5-83b3-39442a994234 req-7321c1c7-b090-4e63-88c8-bdd522d3e1e5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Received event network-vif-plugged-a496b9c3-8b65-4657-95a2-9397df87ab4d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:39:00 compute-0 nova_compute[192810]: 2025-09-30 21:39:00.542 2 DEBUG oslo_concurrency.lockutils [req-7c4dd0a1-c053-47f5-83b3-39442a994234 req-7321c1c7-b090-4e63-88c8-bdd522d3e1e5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "2db1c454-9505-4c8b-aeff-f0bba892690f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:00 compute-0 nova_compute[192810]: 2025-09-30 21:39:00.542 2 DEBUG oslo_concurrency.lockutils [req-7c4dd0a1-c053-47f5-83b3-39442a994234 req-7321c1c7-b090-4e63-88c8-bdd522d3e1e5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2db1c454-9505-4c8b-aeff-f0bba892690f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:00 compute-0 nova_compute[192810]: 2025-09-30 21:39:00.542 2 DEBUG oslo_concurrency.lockutils [req-7c4dd0a1-c053-47f5-83b3-39442a994234 req-7321c1c7-b090-4e63-88c8-bdd522d3e1e5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2db1c454-9505-4c8b-aeff-f0bba892690f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:00 compute-0 nova_compute[192810]: 2025-09-30 21:39:00.543 2 DEBUG nova.compute.manager [req-7c4dd0a1-c053-47f5-83b3-39442a994234 req-7321c1c7-b090-4e63-88c8-bdd522d3e1e5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] No waiting events found dispatching network-vif-plugged-a496b9c3-8b65-4657-95a2-9397df87ab4d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:39:00 compute-0 nova_compute[192810]: 2025-09-30 21:39:00.543 2 WARNING nova.compute.manager [req-7c4dd0a1-c053-47f5-83b3-39442a994234 req-7321c1c7-b090-4e63-88c8-bdd522d3e1e5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Received unexpected event network-vif-plugged-a496b9c3-8b65-4657-95a2-9397df87ab4d for instance with vm_state deleted and task_state None.
Sep 30 21:39:00 compute-0 nova_compute[192810]: 2025-09-30 21:39:00.543 2 DEBUG nova.compute.manager [req-7c4dd0a1-c053-47f5-83b3-39442a994234 req-7321c1c7-b090-4e63-88c8-bdd522d3e1e5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Received event network-vif-deleted-a496b9c3-8b65-4657-95a2-9397df87ab4d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:39:00 compute-0 nova_compute[192810]: 2025-09-30 21:39:00.543 2 DEBUG nova.compute.manager [req-7c4dd0a1-c053-47f5-83b3-39442a994234 req-7321c1c7-b090-4e63-88c8-bdd522d3e1e5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Received event network-vif-plugged-e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:39:00 compute-0 nova_compute[192810]: 2025-09-30 21:39:00.544 2 DEBUG oslo_concurrency.lockutils [req-7c4dd0a1-c053-47f5-83b3-39442a994234 req-7321c1c7-b090-4e63-88c8-bdd522d3e1e5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "99707edc-d882-4127-bc26-1fea0a52f9da-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:00 compute-0 nova_compute[192810]: 2025-09-30 21:39:00.544 2 DEBUG oslo_concurrency.lockutils [req-7c4dd0a1-c053-47f5-83b3-39442a994234 req-7321c1c7-b090-4e63-88c8-bdd522d3e1e5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "99707edc-d882-4127-bc26-1fea0a52f9da-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:00 compute-0 nova_compute[192810]: 2025-09-30 21:39:00.544 2 DEBUG oslo_concurrency.lockutils [req-7c4dd0a1-c053-47f5-83b3-39442a994234 req-7321c1c7-b090-4e63-88c8-bdd522d3e1e5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "99707edc-d882-4127-bc26-1fea0a52f9da-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:00 compute-0 nova_compute[192810]: 2025-09-30 21:39:00.544 2 DEBUG nova.compute.manager [req-7c4dd0a1-c053-47f5-83b3-39442a994234 req-7321c1c7-b090-4e63-88c8-bdd522d3e1e5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Processing event network-vif-plugged-e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:39:00 compute-0 nova_compute[192810]: 2025-09-30 21:39:00.544 2 DEBUG nova.compute.manager [req-7c4dd0a1-c053-47f5-83b3-39442a994234 req-7321c1c7-b090-4e63-88c8-bdd522d3e1e5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Received event network-vif-plugged-e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:39:00 compute-0 nova_compute[192810]: 2025-09-30 21:39:00.545 2 DEBUG oslo_concurrency.lockutils [req-7c4dd0a1-c053-47f5-83b3-39442a994234 req-7321c1c7-b090-4e63-88c8-bdd522d3e1e5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "99707edc-d882-4127-bc26-1fea0a52f9da-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:00 compute-0 nova_compute[192810]: 2025-09-30 21:39:00.545 2 DEBUG oslo_concurrency.lockutils [req-7c4dd0a1-c053-47f5-83b3-39442a994234 req-7321c1c7-b090-4e63-88c8-bdd522d3e1e5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "99707edc-d882-4127-bc26-1fea0a52f9da-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:00 compute-0 nova_compute[192810]: 2025-09-30 21:39:00.545 2 DEBUG oslo_concurrency.lockutils [req-7c4dd0a1-c053-47f5-83b3-39442a994234 req-7321c1c7-b090-4e63-88c8-bdd522d3e1e5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "99707edc-d882-4127-bc26-1fea0a52f9da-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:00 compute-0 nova_compute[192810]: 2025-09-30 21:39:00.545 2 DEBUG nova.compute.manager [req-7c4dd0a1-c053-47f5-83b3-39442a994234 req-7321c1c7-b090-4e63-88c8-bdd522d3e1e5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] No waiting events found dispatching network-vif-plugged-e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:39:00 compute-0 nova_compute[192810]: 2025-09-30 21:39:00.545 2 WARNING nova.compute.manager [req-7c4dd0a1-c053-47f5-83b3-39442a994234 req-7321c1c7-b090-4e63-88c8-bdd522d3e1e5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Received unexpected event network-vif-plugged-e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 for instance with vm_state building and task_state spawning.
Sep 30 21:39:00 compute-0 nova_compute[192810]: 2025-09-30 21:39:00.546 2 DEBUG nova.compute.manager [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:39:00 compute-0 nova_compute[192810]: 2025-09-30 21:39:00.549 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268340.5490375, 99707edc-d882-4127-bc26-1fea0a52f9da => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:39:00 compute-0 nova_compute[192810]: 2025-09-30 21:39:00.549 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] VM Resumed (Lifecycle Event)
Sep 30 21:39:00 compute-0 nova_compute[192810]: 2025-09-30 21:39:00.550 2 DEBUG nova.virt.libvirt.driver [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:39:00 compute-0 nova_compute[192810]: 2025-09-30 21:39:00.553 2 INFO nova.virt.libvirt.driver [-] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Instance spawned successfully.
Sep 30 21:39:00 compute-0 nova_compute[192810]: 2025-09-30 21:39:00.553 2 DEBUG nova.virt.libvirt.driver [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:39:00 compute-0 nova_compute[192810]: 2025-09-30 21:39:00.568 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:39:00 compute-0 nova_compute[192810]: 2025-09-30 21:39:00.572 2 DEBUG nova.virt.libvirt.driver [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:39:00 compute-0 nova_compute[192810]: 2025-09-30 21:39:00.572 2 DEBUG nova.virt.libvirt.driver [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:39:00 compute-0 nova_compute[192810]: 2025-09-30 21:39:00.573 2 DEBUG nova.virt.libvirt.driver [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:39:00 compute-0 nova_compute[192810]: 2025-09-30 21:39:00.573 2 DEBUG nova.virt.libvirt.driver [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:39:00 compute-0 nova_compute[192810]: 2025-09-30 21:39:00.574 2 DEBUG nova.virt.libvirt.driver [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:39:00 compute-0 nova_compute[192810]: 2025-09-30 21:39:00.574 2 DEBUG nova.virt.libvirt.driver [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:39:00 compute-0 nova_compute[192810]: 2025-09-30 21:39:00.577 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:39:00 compute-0 nova_compute[192810]: 2025-09-30 21:39:00.609 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:39:00 compute-0 nova_compute[192810]: 2025-09-30 21:39:00.678 2 INFO nova.compute.manager [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Took 7.23 seconds to spawn the instance on the hypervisor.
Sep 30 21:39:00 compute-0 nova_compute[192810]: 2025-09-30 21:39:00.679 2 DEBUG nova.compute.manager [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:39:00 compute-0 nova_compute[192810]: 2025-09-30 21:39:00.777 2 INFO nova.compute.manager [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Took 8.07 seconds to build instance.
Sep 30 21:39:00 compute-0 nova_compute[192810]: 2025-09-30 21:39:00.795 2 DEBUG oslo_concurrency.lockutils [None req-969e265d-3d48-4d2b-876a-ba7d3c572a05 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Lock "99707edc-d882-4127-bc26-1fea0a52f9da" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.202s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:01 compute-0 podman[238809]: 2025-09-30 21:39:01.327587083 +0000 UTC m=+0.054238253 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:39:01 compute-0 podman[238810]: 2025-09-30 21:39:01.330128856 +0000 UTC m=+0.055559425 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923)
Sep 30 21:39:01 compute-0 podman[238808]: 2025-09-30 21:39:01.360435722 +0000 UTC m=+0.089849941 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Sep 30 21:39:01 compute-0 nova_compute[192810]: 2025-09-30 21:39:01.477 2 INFO nova.compute.manager [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Rescuing
Sep 30 21:39:01 compute-0 nova_compute[192810]: 2025-09-30 21:39:01.477 2 DEBUG oslo_concurrency.lockutils [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Acquiring lock "refresh_cache-99707edc-d882-4127-bc26-1fea0a52f9da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:39:01 compute-0 nova_compute[192810]: 2025-09-30 21:39:01.478 2 DEBUG oslo_concurrency.lockutils [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Acquired lock "refresh_cache-99707edc-d882-4127-bc26-1fea0a52f9da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:39:01 compute-0 nova_compute[192810]: 2025-09-30 21:39:01.478 2 DEBUG nova.network.neutron [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:39:01 compute-0 nova_compute[192810]: 2025-09-30 21:39:01.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:39:02 compute-0 nova_compute[192810]: 2025-09-30 21:39:02.536 2 DEBUG oslo_concurrency.lockutils [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Acquiring lock "95188da8-afd4-4fbc-9231-7547201f81ed" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:02 compute-0 nova_compute[192810]: 2025-09-30 21:39:02.537 2 DEBUG oslo_concurrency.lockutils [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Lock "95188da8-afd4-4fbc-9231-7547201f81ed" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:02 compute-0 nova_compute[192810]: 2025-09-30 21:39:02.560 2 DEBUG nova.compute.manager [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:39:02 compute-0 nova_compute[192810]: 2025-09-30 21:39:02.635 2 DEBUG oslo_concurrency.lockutils [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "0b1805eb-abb7-44a6-acd9-45548efbe439" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:02 compute-0 nova_compute[192810]: 2025-09-30 21:39:02.636 2 DEBUG oslo_concurrency.lockutils [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "0b1805eb-abb7-44a6-acd9-45548efbe439" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:02 compute-0 nova_compute[192810]: 2025-09-30 21:39:02.653 2 DEBUG oslo_concurrency.lockutils [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:02 compute-0 nova_compute[192810]: 2025-09-30 21:39:02.654 2 DEBUG oslo_concurrency.lockutils [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:02 compute-0 nova_compute[192810]: 2025-09-30 21:39:02.655 2 DEBUG nova.compute.manager [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:39:02 compute-0 nova_compute[192810]: 2025-09-30 21:39:02.663 2 DEBUG nova.virt.hardware [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:39:02 compute-0 nova_compute[192810]: 2025-09-30 21:39:02.664 2 INFO nova.compute.claims [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:39:02 compute-0 nova_compute[192810]: 2025-09-30 21:39:02.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:02 compute-0 nova_compute[192810]: 2025-09-30 21:39:02.706 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:02 compute-0 nova_compute[192810]: 2025-09-30 21:39:02.751 2 DEBUG oslo_concurrency.lockutils [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:02 compute-0 nova_compute[192810]: 2025-09-30 21:39:02.755 2 DEBUG nova.network.neutron [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Updating instance_info_cache with network_info: [{"id": "e9ffeb79-64b3-4a68-93a9-3d612ba40eb5", "address": "fa:16:3e:e9:08:4f", "network": {"id": "ba09ae5f-6031-4fbb-acc3-d581f9322784", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1364438883-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "9c5899f27a1e496da636c4e31f6867c6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9ffeb79-64", "ovs_interfaceid": "e9ffeb79-64b3-4a68-93a9-3d612ba40eb5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:39:02 compute-0 nova_compute[192810]: 2025-09-30 21:39:02.770 2 DEBUG oslo_concurrency.lockutils [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Releasing lock "refresh_cache-99707edc-d882-4127-bc26-1fea0a52f9da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:39:02 compute-0 nova_compute[192810]: 2025-09-30 21:39:02.833 2 DEBUG nova.compute.provider_tree [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:39:02 compute-0 nova_compute[192810]: 2025-09-30 21:39:02.859 2 DEBUG nova.scheduler.client.report [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:39:02 compute-0 nova_compute[192810]: 2025-09-30 21:39:02.880 2 DEBUG oslo_concurrency.lockutils [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.226s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:02 compute-0 nova_compute[192810]: 2025-09-30 21:39:02.880 2 DEBUG nova.compute.manager [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:39:02 compute-0 nova_compute[192810]: 2025-09-30 21:39:02.882 2 DEBUG oslo_concurrency.lockutils [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:02 compute-0 nova_compute[192810]: 2025-09-30 21:39:02.888 2 DEBUG nova.virt.hardware [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:39:02 compute-0 nova_compute[192810]: 2025-09-30 21:39:02.889 2 INFO nova.compute.claims [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:39:02 compute-0 nova_compute[192810]: 2025-09-30 21:39:02.950 2 DEBUG nova.compute.manager [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:39:02 compute-0 nova_compute[192810]: 2025-09-30 21:39:02.951 2 DEBUG nova.network.neutron [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:39:02 compute-0 nova_compute[192810]: 2025-09-30 21:39:02.971 2 INFO nova.virt.libvirt.driver [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:39:02 compute-0 nova_compute[192810]: 2025-09-30 21:39:02.991 2 DEBUG nova.compute.manager [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.038 2 DEBUG nova.virt.libvirt.driver [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.076 2 DEBUG nova.compute.provider_tree [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.106 2 DEBUG nova.scheduler.client.report [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.158 2 DEBUG nova.policy [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '972edc166c9442b1a83983d15a64e8b6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '51c02ace4fff44cca028986381d7c407', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.473 2 DEBUG oslo_concurrency.lockutils [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.591s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.474 2 DEBUG nova.compute.manager [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.478 2 DEBUG nova.compute.manager [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.479 2 DEBUG nova.virt.libvirt.driver [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.480 2 INFO nova.virt.libvirt.driver [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Creating image(s)
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.480 2 DEBUG oslo_concurrency.lockutils [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Acquiring lock "/var/lib/nova/instances/95188da8-afd4-4fbc-9231-7547201f81ed/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.480 2 DEBUG oslo_concurrency.lockutils [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Lock "/var/lib/nova/instances/95188da8-afd4-4fbc-9231-7547201f81ed/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.481 2 DEBUG oslo_concurrency.lockutils [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Lock "/var/lib/nova/instances/95188da8-afd4-4fbc-9231-7547201f81ed/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.493 2 DEBUG oslo_concurrency.processutils [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.546 2 DEBUG nova.compute.manager [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.546 2 DEBUG nova.network.neutron [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.556 2 DEBUG oslo_concurrency.processutils [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.557 2 DEBUG oslo_concurrency.lockutils [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.557 2 DEBUG oslo_concurrency.lockutils [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.569 2 DEBUG oslo_concurrency.processutils [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.590 2 INFO nova.virt.libvirt.driver [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.609 2 DEBUG nova.compute.manager [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.627 2 DEBUG oslo_concurrency.processutils [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.628 2 DEBUG oslo_concurrency.processutils [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/95188da8-afd4-4fbc-9231-7547201f81ed/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.661 2 DEBUG oslo_concurrency.processutils [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/95188da8-afd4-4fbc-9231-7547201f81ed/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.662 2 DEBUG oslo_concurrency.lockutils [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.663 2 DEBUG oslo_concurrency.processutils [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.745 2 DEBUG oslo_concurrency.processutils [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.745 2 DEBUG nova.virt.disk.api [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Checking if we can resize image /var/lib/nova/instances/95188da8-afd4-4fbc-9231-7547201f81ed/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.746 2 DEBUG oslo_concurrency.processutils [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/95188da8-afd4-4fbc-9231-7547201f81ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.767 2 DEBUG nova.compute.manager [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.769 2 DEBUG nova.virt.libvirt.driver [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.769 2 INFO nova.virt.libvirt.driver [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Creating image(s)
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.770 2 DEBUG oslo_concurrency.lockutils [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "/var/lib/nova/instances/0b1805eb-abb7-44a6-acd9-45548efbe439/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.770 2 DEBUG oslo_concurrency.lockutils [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "/var/lib/nova/instances/0b1805eb-abb7-44a6-acd9-45548efbe439/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.771 2 DEBUG oslo_concurrency.lockutils [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "/var/lib/nova/instances/0b1805eb-abb7-44a6-acd9-45548efbe439/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.783 2 DEBUG oslo_concurrency.processutils [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.806 2 DEBUG oslo_concurrency.processutils [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/95188da8-afd4-4fbc-9231-7547201f81ed/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.806 2 DEBUG nova.virt.disk.api [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Cannot resize image /var/lib/nova/instances/95188da8-afd4-4fbc-9231-7547201f81ed/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.807 2 DEBUG nova.objects.instance [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Lazy-loading 'migration_context' on Instance uuid 95188da8-afd4-4fbc-9231-7547201f81ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.821 2 DEBUG nova.virt.libvirt.driver [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.822 2 DEBUG nova.virt.libvirt.driver [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Ensure instance console log exists: /var/lib/nova/instances/95188da8-afd4-4fbc-9231-7547201f81ed/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.822 2 DEBUG oslo_concurrency.lockutils [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.822 2 DEBUG oslo_concurrency.lockutils [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.823 2 DEBUG oslo_concurrency.lockutils [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.837 2 DEBUG oslo_concurrency.processutils [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.837 2 DEBUG oslo_concurrency.lockutils [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.838 2 DEBUG oslo_concurrency.lockutils [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.849 2 DEBUG oslo_concurrency.processutils [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.869 2 DEBUG nova.policy [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '30d0a975d78c4d9a8e2201afdc040092', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8ad754242d964bb487a2174b2c21bcc5', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.914 2 DEBUG oslo_concurrency.processutils [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:39:03 compute-0 nova_compute[192810]: 2025-09-30 21:39:03.915 2 DEBUG oslo_concurrency.processutils [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/0b1805eb-abb7-44a6-acd9-45548efbe439/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:39:04 compute-0 nova_compute[192810]: 2025-09-30 21:39:04.125 2 DEBUG oslo_concurrency.processutils [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/0b1805eb-abb7-44a6-acd9-45548efbe439/disk 1073741824" returned: 0 in 0.210s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:39:04 compute-0 nova_compute[192810]: 2025-09-30 21:39:04.129 2 DEBUG oslo_concurrency.lockutils [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.291s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:04 compute-0 nova_compute[192810]: 2025-09-30 21:39:04.129 2 DEBUG oslo_concurrency.processutils [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:39:04 compute-0 nova_compute[192810]: 2025-09-30 21:39:04.162 2 DEBUG nova.network.neutron [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Successfully created port: 87924781-e008-4aac-b813-a3d0f3d55f51 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:39:04 compute-0 nova_compute[192810]: 2025-09-30 21:39:04.180 2 DEBUG oslo_concurrency.processutils [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:39:04 compute-0 nova_compute[192810]: 2025-09-30 21:39:04.180 2 DEBUG nova.virt.disk.api [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Checking if we can resize image /var/lib/nova/instances/0b1805eb-abb7-44a6-acd9-45548efbe439/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:39:04 compute-0 nova_compute[192810]: 2025-09-30 21:39:04.181 2 DEBUG oslo_concurrency.processutils [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0b1805eb-abb7-44a6-acd9-45548efbe439/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:39:04 compute-0 nova_compute[192810]: 2025-09-30 21:39:04.244 2 DEBUG oslo_concurrency.processutils [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0b1805eb-abb7-44a6-acd9-45548efbe439/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:39:04 compute-0 nova_compute[192810]: 2025-09-30 21:39:04.245 2 DEBUG nova.virt.disk.api [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Cannot resize image /var/lib/nova/instances/0b1805eb-abb7-44a6-acd9-45548efbe439/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:39:04 compute-0 nova_compute[192810]: 2025-09-30 21:39:04.245 2 DEBUG nova.objects.instance [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lazy-loading 'migration_context' on Instance uuid 0b1805eb-abb7-44a6-acd9-45548efbe439 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:39:04 compute-0 nova_compute[192810]: 2025-09-30 21:39:04.260 2 DEBUG nova.virt.libvirt.driver [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:39:04 compute-0 nova_compute[192810]: 2025-09-30 21:39:04.261 2 DEBUG nova.virt.libvirt.driver [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Ensure instance console log exists: /var/lib/nova/instances/0b1805eb-abb7-44a6-acd9-45548efbe439/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:39:04 compute-0 nova_compute[192810]: 2025-09-30 21:39:04.261 2 DEBUG oslo_concurrency.lockutils [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:04 compute-0 nova_compute[192810]: 2025-09-30 21:39:04.262 2 DEBUG oslo_concurrency.lockutils [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:04 compute-0 nova_compute[192810]: 2025-09-30 21:39:04.262 2 DEBUG oslo_concurrency.lockutils [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:04 compute-0 nova_compute[192810]: 2025-09-30 21:39:04.477 2 DEBUG nova.network.neutron [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Successfully created port: 77592b52-fef4-4ec1-9e04-78fe87759f03 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:39:04 compute-0 nova_compute[192810]: 2025-09-30 21:39:04.841 2 DEBUG nova.network.neutron [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Successfully updated port: 87924781-e008-4aac-b813-a3d0f3d55f51 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:39:04 compute-0 nova_compute[192810]: 2025-09-30 21:39:04.871 2 DEBUG oslo_concurrency.lockutils [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Acquiring lock "refresh_cache-95188da8-afd4-4fbc-9231-7547201f81ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:39:04 compute-0 nova_compute[192810]: 2025-09-30 21:39:04.871 2 DEBUG oslo_concurrency.lockutils [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Acquired lock "refresh_cache-95188da8-afd4-4fbc-9231-7547201f81ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:39:04 compute-0 nova_compute[192810]: 2025-09-30 21:39:04.872 2 DEBUG nova.network.neutron [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.076 2 DEBUG nova.network.neutron [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.642 2 DEBUG nova.network.neutron [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Successfully updated port: 77592b52-fef4-4ec1-9e04-78fe87759f03 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.659 2 DEBUG oslo_concurrency.lockutils [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "refresh_cache-0b1805eb-abb7-44a6-acd9-45548efbe439" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.661 2 DEBUG oslo_concurrency.lockutils [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquired lock "refresh_cache-0b1805eb-abb7-44a6-acd9-45548efbe439" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.662 2 DEBUG nova.network.neutron [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.749 2 DEBUG nova.compute.manager [req-9a750193-ad29-4adf-9ecf-970a0e8e23f0 req-f554a764-2068-4c8a-8de3-2b2bca098203 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Received event network-changed-77592b52-fef4-4ec1-9e04-78fe87759f03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.749 2 DEBUG nova.compute.manager [req-9a750193-ad29-4adf-9ecf-970a0e8e23f0 req-f554a764-2068-4c8a-8de3-2b2bca098203 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Refreshing instance network info cache due to event network-changed-77592b52-fef4-4ec1-9e04-78fe87759f03. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.750 2 DEBUG oslo_concurrency.lockutils [req-9a750193-ad29-4adf-9ecf-970a0e8e23f0 req-f554a764-2068-4c8a-8de3-2b2bca098203 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-0b1805eb-abb7-44a6-acd9-45548efbe439" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.815 2 DEBUG nova.network.neutron [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.834 2 DEBUG nova.network.neutron [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Updating instance_info_cache with network_info: [{"id": "87924781-e008-4aac-b813-a3d0f3d55f51", "address": "fa:16:3e:79:e6:e9", "network": {"id": "ea80450e-b8f2-4af5-a00d-9221e5dd4d97", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1808637609-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "51c02ace4fff44cca028986381d7c407", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87924781-e0", "ovs_interfaceid": "87924781-e008-4aac-b813-a3d0f3d55f51", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.852 2 DEBUG oslo_concurrency.lockutils [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Releasing lock "refresh_cache-95188da8-afd4-4fbc-9231-7547201f81ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.853 2 DEBUG nova.compute.manager [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Instance network_info: |[{"id": "87924781-e008-4aac-b813-a3d0f3d55f51", "address": "fa:16:3e:79:e6:e9", "network": {"id": "ea80450e-b8f2-4af5-a00d-9221e5dd4d97", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1808637609-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "51c02ace4fff44cca028986381d7c407", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87924781-e0", "ovs_interfaceid": "87924781-e008-4aac-b813-a3d0f3d55f51", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.855 2 DEBUG nova.virt.libvirt.driver [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Start _get_guest_xml network_info=[{"id": "87924781-e008-4aac-b813-a3d0f3d55f51", "address": "fa:16:3e:79:e6:e9", "network": {"id": "ea80450e-b8f2-4af5-a00d-9221e5dd4d97", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1808637609-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "51c02ace4fff44cca028986381d7c407", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87924781-e0", "ovs_interfaceid": "87924781-e008-4aac-b813-a3d0f3d55f51", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.858 2 WARNING nova.virt.libvirt.driver [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.862 2 DEBUG nova.virt.libvirt.host [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.863 2 DEBUG nova.virt.libvirt.host [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.865 2 DEBUG nova.virt.libvirt.host [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.866 2 DEBUG nova.virt.libvirt.host [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.867 2 DEBUG nova.virt.libvirt.driver [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.867 2 DEBUG nova.virt.hardware [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.868 2 DEBUG nova.virt.hardware [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.868 2 DEBUG nova.virt.hardware [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.868 2 DEBUG nova.virt.hardware [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.869 2 DEBUG nova.virt.hardware [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.869 2 DEBUG nova.virt.hardware [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.869 2 DEBUG nova.virt.hardware [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.869 2 DEBUG nova.virt.hardware [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.870 2 DEBUG nova.virt.hardware [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.870 2 DEBUG nova.virt.hardware [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.870 2 DEBUG nova.virt.hardware [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.873 2 DEBUG nova.virt.libvirt.vif [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:39:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1112694235',display_name='tempest-ServersNegativeTestJSON-server-1112694235',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-1112694235',id=126,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='51c02ace4fff44cca028986381d7c407',ramdisk_id='',reservation_id='r-7810uyz9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-2038911265',owner_user_name='tempest-ServersNegativeTes
tJSON-2038911265-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:39:03Z,user_data=None,user_id='972edc166c9442b1a83983d15a64e8b6',uuid=95188da8-afd4-4fbc-9231-7547201f81ed,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "87924781-e008-4aac-b813-a3d0f3d55f51", "address": "fa:16:3e:79:e6:e9", "network": {"id": "ea80450e-b8f2-4af5-a00d-9221e5dd4d97", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1808637609-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "51c02ace4fff44cca028986381d7c407", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87924781-e0", "ovs_interfaceid": "87924781-e008-4aac-b813-a3d0f3d55f51", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.874 2 DEBUG nova.network.os_vif_util [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Converting VIF {"id": "87924781-e008-4aac-b813-a3d0f3d55f51", "address": "fa:16:3e:79:e6:e9", "network": {"id": "ea80450e-b8f2-4af5-a00d-9221e5dd4d97", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1808637609-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "51c02ace4fff44cca028986381d7c407", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87924781-e0", "ovs_interfaceid": "87924781-e008-4aac-b813-a3d0f3d55f51", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.875 2 DEBUG nova.network.os_vif_util [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:79:e6:e9,bridge_name='br-int',has_traffic_filtering=True,id=87924781-e008-4aac-b813-a3d0f3d55f51,network=Network(ea80450e-b8f2-4af5-a00d-9221e5dd4d97),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap87924781-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.875 2 DEBUG nova.objects.instance [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Lazy-loading 'pci_devices' on Instance uuid 95188da8-afd4-4fbc-9231-7547201f81ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.878 2 DEBUG nova.compute.manager [req-0ae818c7-4ab7-4d12-b1fd-d81ad5a12162 req-cd45bd70-aa16-4471-b9c6-cc59105d5e49 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Received event network-changed-87924781-e008-4aac-b813-a3d0f3d55f51 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.879 2 DEBUG nova.compute.manager [req-0ae818c7-4ab7-4d12-b1fd-d81ad5a12162 req-cd45bd70-aa16-4471-b9c6-cc59105d5e49 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Refreshing instance network info cache due to event network-changed-87924781-e008-4aac-b813-a3d0f3d55f51. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.879 2 DEBUG oslo_concurrency.lockutils [req-0ae818c7-4ab7-4d12-b1fd-d81ad5a12162 req-cd45bd70-aa16-4471-b9c6-cc59105d5e49 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-95188da8-afd4-4fbc-9231-7547201f81ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.879 2 DEBUG oslo_concurrency.lockutils [req-0ae818c7-4ab7-4d12-b1fd-d81ad5a12162 req-cd45bd70-aa16-4471-b9c6-cc59105d5e49 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-95188da8-afd4-4fbc-9231-7547201f81ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.879 2 DEBUG nova.network.neutron [req-0ae818c7-4ab7-4d12-b1fd-d81ad5a12162 req-cd45bd70-aa16-4471-b9c6-cc59105d5e49 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Refreshing network info cache for port 87924781-e008-4aac-b813-a3d0f3d55f51 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.901 2 DEBUG nova.virt.libvirt.driver [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:39:05 compute-0 nova_compute[192810]:   <uuid>95188da8-afd4-4fbc-9231-7547201f81ed</uuid>
Sep 30 21:39:05 compute-0 nova_compute[192810]:   <name>instance-0000007e</name>
Sep 30 21:39:05 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:39:05 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:39:05 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:39:05 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:       <nova:name>tempest-ServersNegativeTestJSON-server-1112694235</nova:name>
Sep 30 21:39:05 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:39:05</nova:creationTime>
Sep 30 21:39:05 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:39:05 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:39:05 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:39:05 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:39:05 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:39:05 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:39:05 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:39:05 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:39:05 compute-0 nova_compute[192810]:         <nova:user uuid="972edc166c9442b1a83983d15a64e8b6">tempest-ServersNegativeTestJSON-2038911265-project-member</nova:user>
Sep 30 21:39:05 compute-0 nova_compute[192810]:         <nova:project uuid="51c02ace4fff44cca028986381d7c407">tempest-ServersNegativeTestJSON-2038911265</nova:project>
Sep 30 21:39:05 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:39:05 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:39:05 compute-0 nova_compute[192810]:         <nova:port uuid="87924781-e008-4aac-b813-a3d0f3d55f51">
Sep 30 21:39:05 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:39:05 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:39:05 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:39:05 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:39:05 compute-0 nova_compute[192810]:     <system>
Sep 30 21:39:05 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:39:05 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:39:05 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:39:05 compute-0 nova_compute[192810]:       <entry name="serial">95188da8-afd4-4fbc-9231-7547201f81ed</entry>
Sep 30 21:39:05 compute-0 nova_compute[192810]:       <entry name="uuid">95188da8-afd4-4fbc-9231-7547201f81ed</entry>
Sep 30 21:39:05 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     </system>
Sep 30 21:39:05 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:39:05 compute-0 nova_compute[192810]:   <os>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:   </os>
Sep 30 21:39:05 compute-0 nova_compute[192810]:   <features>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:   </features>
Sep 30 21:39:05 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:39:05 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:39:05 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:39:05 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:39:05 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:39:05 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/95188da8-afd4-4fbc-9231-7547201f81ed/disk"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:39:05 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/95188da8-afd4-4fbc-9231-7547201f81ed/disk.config"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:39:05 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:79:e6:e9"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:       <target dev="tap87924781-e0"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:39:05 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/95188da8-afd4-4fbc-9231-7547201f81ed/console.log" append="off"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     <video>
Sep 30 21:39:05 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     </video>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:39:05 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:39:05 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:39:05 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:39:05 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:39:05 compute-0 nova_compute[192810]: </domain>
Sep 30 21:39:05 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.913 2 DEBUG nova.compute.manager [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Preparing to wait for external event network-vif-plugged-87924781-e008-4aac-b813-a3d0f3d55f51 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.914 2 DEBUG oslo_concurrency.lockutils [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Acquiring lock "95188da8-afd4-4fbc-9231-7547201f81ed-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.914 2 DEBUG oslo_concurrency.lockutils [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Lock "95188da8-afd4-4fbc-9231-7547201f81ed-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.919 2 DEBUG oslo_concurrency.lockutils [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Lock "95188da8-afd4-4fbc-9231-7547201f81ed-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.920 2 DEBUG nova.virt.libvirt.vif [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:39:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1112694235',display_name='tempest-ServersNegativeTestJSON-server-1112694235',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-1112694235',id=126,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='51c02ace4fff44cca028986381d7c407',ramdisk_id='',reservation_id='r-7810uyz9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-2038911265',owner_user_name='tempest-ServersNegativeTestJSON-2038911265-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:39:03Z,user_data=None,user_id='972edc166c9442b1a83983d15a64e8b6',uuid=95188da8-afd4-4fbc-9231-7547201f81ed,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "87924781-e008-4aac-b813-a3d0f3d55f51", "address": "fa:16:3e:79:e6:e9", "network": {"id": "ea80450e-b8f2-4af5-a00d-9221e5dd4d97", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1808637609-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "51c02ace4fff44cca028986381d7c407", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87924781-e0", "ovs_interfaceid": "87924781-e008-4aac-b813-a3d0f3d55f51", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.921 2 DEBUG nova.network.os_vif_util [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Converting VIF {"id": "87924781-e008-4aac-b813-a3d0f3d55f51", "address": "fa:16:3e:79:e6:e9", "network": {"id": "ea80450e-b8f2-4af5-a00d-9221e5dd4d97", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1808637609-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "51c02ace4fff44cca028986381d7c407", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87924781-e0", "ovs_interfaceid": "87924781-e008-4aac-b813-a3d0f3d55f51", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.921 2 DEBUG nova.network.os_vif_util [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:79:e6:e9,bridge_name='br-int',has_traffic_filtering=True,id=87924781-e008-4aac-b813-a3d0f3d55f51,network=Network(ea80450e-b8f2-4af5-a00d-9221e5dd4d97),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap87924781-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.922 2 DEBUG os_vif [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:79:e6:e9,bridge_name='br-int',has_traffic_filtering=True,id=87924781-e008-4aac-b813-a3d0f3d55f51,network=Network(ea80450e-b8f2-4af5-a00d-9221e5dd4d97),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap87924781-e0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.922 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.923 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.923 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.927 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.928 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap87924781-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.928 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap87924781-e0, col_values=(('external_ids', {'iface-id': '87924781-e008-4aac-b813-a3d0f3d55f51', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:79:e6:e9', 'vm-uuid': '95188da8-afd4-4fbc-9231-7547201f81ed'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:39:05 compute-0 NetworkManager[51733]: <info>  [1759268345.9312] manager: (tap87924781-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/205)
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.932 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.936 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.938 2 INFO os_vif [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:79:e6:e9,bridge_name='br-int',has_traffic_filtering=True,id=87924781-e008-4aac-b813-a3d0f3d55f51,network=Network(ea80450e-b8f2-4af5-a00d-9221e5dd4d97),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap87924781-e0')
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.995 2 DEBUG nova.virt.libvirt.driver [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.995 2 DEBUG nova.virt.libvirt.driver [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.996 2 DEBUG nova.virt.libvirt.driver [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] No VIF found with MAC fa:16:3e:79:e6:e9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:39:05 compute-0 nova_compute[192810]: 2025-09-30 21:39:05.996 2 INFO nova.virt.libvirt.driver [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Using config drive
Sep 30 21:39:06 compute-0 nova_compute[192810]: 2025-09-30 21:39:06.389 2 INFO nova.virt.libvirt.driver [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Creating config drive at /var/lib/nova/instances/95188da8-afd4-4fbc-9231-7547201f81ed/disk.config
Sep 30 21:39:06 compute-0 nova_compute[192810]: 2025-09-30 21:39:06.393 2 DEBUG oslo_concurrency.processutils [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/95188da8-afd4-4fbc-9231-7547201f81ed/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpektgx_53 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:39:06 compute-0 nova_compute[192810]: 2025-09-30 21:39:06.516 2 DEBUG oslo_concurrency.processutils [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/95188da8-afd4-4fbc-9231-7547201f81ed/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpektgx_53" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:39:06 compute-0 kernel: tap87924781-e0: entered promiscuous mode
Sep 30 21:39:06 compute-0 nova_compute[192810]: 2025-09-30 21:39:06.582 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:06 compute-0 NetworkManager[51733]: <info>  [1759268346.5842] manager: (tap87924781-e0): new Tun device (/org/freedesktop/NetworkManager/Devices/206)
Sep 30 21:39:06 compute-0 ovn_controller[94912]: 2025-09-30T21:39:06Z|00465|binding|INFO|Claiming lport 87924781-e008-4aac-b813-a3d0f3d55f51 for this chassis.
Sep 30 21:39:06 compute-0 ovn_controller[94912]: 2025-09-30T21:39:06Z|00466|binding|INFO|87924781-e008-4aac-b813-a3d0f3d55f51: Claiming fa:16:3e:79:e6:e9 10.100.0.6
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:06.601 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:79:e6:e9 10.100.0.6'], port_security=['fa:16:3e:79:e6:e9 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '95188da8-afd4-4fbc-9231-7547201f81ed', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ea80450e-b8f2-4af5-a00d-9221e5dd4d97', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '51c02ace4fff44cca028986381d7c407', 'neutron:revision_number': '2', 'neutron:security_group_ids': '62973e02-a61c-4061-8a33-46cf1b8a21f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d6f1e456-6208-4d0c-8c96-5e3a3d932af6, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=87924781-e008-4aac-b813-a3d0f3d55f51) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:06.603 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 87924781-e008-4aac-b813-a3d0f3d55f51 in datapath ea80450e-b8f2-4af5-a00d-9221e5dd4d97 bound to our chassis
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:06.605 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ea80450e-b8f2-4af5-a00d-9221e5dd4d97
Sep 30 21:39:06 compute-0 systemd-udevd[238915]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:06.618 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[9539bba3-362e-4a43-b872-0beb9377ae0a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:06.619 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapea80450e-b1 in ovnmeta-ea80450e-b8f2-4af5-a00d-9221e5dd4d97 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:06.622 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapea80450e-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:06.622 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[9de64e2a-abdf-4e9b-8999-25fbe7d6db55]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:06.625 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[4348973e-4e81-4bf2-959c-ae212c7950f7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:06 compute-0 NetworkManager[51733]: <info>  [1759268346.6296] device (tap87924781-e0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:39:06 compute-0 NetworkManager[51733]: <info>  [1759268346.6302] device (tap87924781-e0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:39:06 compute-0 systemd-machined[152794]: New machine qemu-60-instance-0000007e.
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:06.641 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[f8c75b0e-017b-40fa-8217-39ab4e897533]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:06 compute-0 nova_compute[192810]: 2025-09-30 21:39:06.646 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:06 compute-0 ovn_controller[94912]: 2025-09-30T21:39:06Z|00467|binding|INFO|Setting lport 87924781-e008-4aac-b813-a3d0f3d55f51 ovn-installed in OVS
Sep 30 21:39:06 compute-0 ovn_controller[94912]: 2025-09-30T21:39:06Z|00468|binding|INFO|Setting lport 87924781-e008-4aac-b813-a3d0f3d55f51 up in Southbound
Sep 30 21:39:06 compute-0 nova_compute[192810]: 2025-09-30 21:39:06.651 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:06 compute-0 systemd[1]: Started Virtual Machine qemu-60-instance-0000007e.
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:06.671 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[44aa3abb-3ee5-4a96-982a-4f3627c4af7b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:06.700 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[35288c8a-f217-4fed-8a60-fa1b968acb83]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:06.707 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[888a3dfb-9aab-4572-9f3c-7ba1a7dfe540]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:06 compute-0 NetworkManager[51733]: <info>  [1759268346.7077] manager: (tapea80450e-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/207)
Sep 30 21:39:06 compute-0 systemd-udevd[238920]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:06.746 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[d1aa9fda-d69a-4a72-a517-ed1cbbdcdaf4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:06.749 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[fc4836e6-a961-4da7-8826-fa8e9a14f92c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:06 compute-0 NetworkManager[51733]: <info>  [1759268346.7727] device (tapea80450e-b0): carrier: link connected
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:06.778 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[7b7db450-bf00-45f0-8643-b7398a962564]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:06.795 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[9210e72e-7c4c-4151-8f41-9f0fb9f7fd2b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapea80450e-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1a:34:0d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 140], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 504239, 'reachable_time': 35062, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238951, 'error': None, 'target': 'ovnmeta-ea80450e-b8f2-4af5-a00d-9221e5dd4d97', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:06.808 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[c4e03dfa-4c7d-4823-b5c8-a182a023baed]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1a:340d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 504239, 'tstamp': 504239}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238952, 'error': None, 'target': 'ovnmeta-ea80450e-b8f2-4af5-a00d-9221e5dd4d97', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:06.823 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[8d2d6f57-d25e-49af-b13f-b92497b80763]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapea80450e-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1a:34:0d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 140], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 504239, 'reachable_time': 35062, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 238953, 'error': None, 'target': 'ovnmeta-ea80450e-b8f2-4af5-a00d-9221e5dd4d97', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:06.851 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[ca7cd476-3373-49ca-9b8e-98faa64f7b4a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:06.910 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[b0896b70-3e6d-4b80-b7c2-a2567fe8e55d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:06.912 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapea80450e-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:06.912 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:06.912 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapea80450e-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:39:06 compute-0 nova_compute[192810]: 2025-09-30 21:39:06.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:06 compute-0 NetworkManager[51733]: <info>  [1759268346.9149] manager: (tapea80450e-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/208)
Sep 30 21:39:06 compute-0 kernel: tapea80450e-b0: entered promiscuous mode
Sep 30 21:39:06 compute-0 nova_compute[192810]: 2025-09-30 21:39:06.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:06.918 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapea80450e-b0, col_values=(('external_ids', {'iface-id': '392087bd-c954-490d-9360-3e29d00c1de8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:39:06 compute-0 nova_compute[192810]: 2025-09-30 21:39:06.919 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:06 compute-0 ovn_controller[94912]: 2025-09-30T21:39:06Z|00469|binding|INFO|Releasing lport 392087bd-c954-490d-9360-3e29d00c1de8 from this chassis (sb_readonly=0)
Sep 30 21:39:06 compute-0 nova_compute[192810]: 2025-09-30 21:39:06.920 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:06.920 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ea80450e-b8f2-4af5-a00d-9221e5dd4d97.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ea80450e-b8f2-4af5-a00d-9221e5dd4d97.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:06.921 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[8b4f49f3-7a9a-4433-bcee-15e2be2c26db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:06.922 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-ea80450e-b8f2-4af5-a00d-9221e5dd4d97
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/ea80450e-b8f2-4af5-a00d-9221e5dd4d97.pid.haproxy
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID ea80450e-b8f2-4af5-a00d-9221e5dd4d97
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:39:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:06.923 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ea80450e-b8f2-4af5-a00d-9221e5dd4d97', 'env', 'PROCESS_TAG=haproxy-ea80450e-b8f2-4af5-a00d-9221e5dd4d97', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ea80450e-b8f2-4af5-a00d-9221e5dd4d97.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:39:06 compute-0 nova_compute[192810]: 2025-09-30 21:39:06.932 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:06 compute-0 nova_compute[192810]: 2025-09-30 21:39:06.960 2 DEBUG nova.network.neutron [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Updating instance_info_cache with network_info: [{"id": "77592b52-fef4-4ec1-9e04-78fe87759f03", "address": "fa:16:3e:74:27:e0", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77592b52-fe", "ovs_interfaceid": "77592b52-fef4-4ec1-9e04-78fe87759f03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:39:06 compute-0 nova_compute[192810]: 2025-09-30 21:39:06.984 2 DEBUG oslo_concurrency.lockutils [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Releasing lock "refresh_cache-0b1805eb-abb7-44a6-acd9-45548efbe439" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:39:06 compute-0 nova_compute[192810]: 2025-09-30 21:39:06.985 2 DEBUG nova.compute.manager [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Instance network_info: |[{"id": "77592b52-fef4-4ec1-9e04-78fe87759f03", "address": "fa:16:3e:74:27:e0", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77592b52-fe", "ovs_interfaceid": "77592b52-fef4-4ec1-9e04-78fe87759f03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:39:06 compute-0 nova_compute[192810]: 2025-09-30 21:39:06.985 2 DEBUG oslo_concurrency.lockutils [req-9a750193-ad29-4adf-9ecf-970a0e8e23f0 req-f554a764-2068-4c8a-8de3-2b2bca098203 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-0b1805eb-abb7-44a6-acd9-45548efbe439" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:39:06 compute-0 nova_compute[192810]: 2025-09-30 21:39:06.986 2 DEBUG nova.network.neutron [req-9a750193-ad29-4adf-9ecf-970a0e8e23f0 req-f554a764-2068-4c8a-8de3-2b2bca098203 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Refreshing network info cache for port 77592b52-fef4-4ec1-9e04-78fe87759f03 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:39:06 compute-0 nova_compute[192810]: 2025-09-30 21:39:06.989 2 DEBUG nova.virt.libvirt.driver [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Start _get_guest_xml network_info=[{"id": "77592b52-fef4-4ec1-9e04-78fe87759f03", "address": "fa:16:3e:74:27:e0", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77592b52-fe", "ovs_interfaceid": "77592b52-fef4-4ec1-9e04-78fe87759f03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:39:06 compute-0 nova_compute[192810]: 2025-09-30 21:39:06.992 2 WARNING nova.virt.libvirt.driver [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.003 2 DEBUG nova.virt.libvirt.host [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.003 2 DEBUG nova.virt.libvirt.host [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.006 2 DEBUG nova.virt.libvirt.host [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.006 2 DEBUG nova.virt.libvirt.host [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.007 2 DEBUG nova.virt.libvirt.driver [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.008 2 DEBUG nova.virt.hardware [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.008 2 DEBUG nova.virt.hardware [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.008 2 DEBUG nova.virt.hardware [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.009 2 DEBUG nova.virt.hardware [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.009 2 DEBUG nova.virt.hardware [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.012 2 DEBUG nova.virt.hardware [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.012 2 DEBUG nova.virt.hardware [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.012 2 DEBUG nova.virt.hardware [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.012 2 DEBUG nova.virt.hardware [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.013 2 DEBUG nova.virt.hardware [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.013 2 DEBUG nova.virt.hardware [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.017 2 DEBUG nova.virt.libvirt.vif [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:39:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-262784555',display_name='tempest-ServersTestJSON-server-262784555',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-262784555',id=127,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8ad754242d964bb487a2174b2c21bcc5',ramdisk_id='',reservation_id='r-6rbv3nnr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-782690373',owner_user_name='tempest-ServersTestJSON-782690373-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:39:03Z,user_data=None,user_id='30d0a975d78c4d9a8e2201afdc040092',uuid=0b1805eb-abb7-44a6-acd9-45548efbe439,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "77592b52-fef4-4ec1-9e04-78fe87759f03", "address": "fa:16:3e:74:27:e0", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77592b52-fe", "ovs_interfaceid": "77592b52-fef4-4ec1-9e04-78fe87759f03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.017 2 DEBUG nova.network.os_vif_util [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Converting VIF {"id": "77592b52-fef4-4ec1-9e04-78fe87759f03", "address": "fa:16:3e:74:27:e0", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77592b52-fe", "ovs_interfaceid": "77592b52-fef4-4ec1-9e04-78fe87759f03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.018 2 DEBUG nova.network.os_vif_util [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:74:27:e0,bridge_name='br-int',has_traffic_filtering=True,id=77592b52-fef4-4ec1-9e04-78fe87759f03,network=Network(27086519-6f4c-45f9-8e5b-5b321cd6871c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77592b52-fe') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.019 2 DEBUG nova.objects.instance [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0b1805eb-abb7-44a6-acd9-45548efbe439 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.036 2 DEBUG nova.virt.libvirt.driver [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:39:07 compute-0 nova_compute[192810]:   <uuid>0b1805eb-abb7-44a6-acd9-45548efbe439</uuid>
Sep 30 21:39:07 compute-0 nova_compute[192810]:   <name>instance-0000007f</name>
Sep 30 21:39:07 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:39:07 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:39:07 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:39:07 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:       <nova:name>tempest-ServersTestJSON-server-262784555</nova:name>
Sep 30 21:39:07 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:39:06</nova:creationTime>
Sep 30 21:39:07 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:39:07 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:39:07 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:39:07 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:39:07 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:39:07 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:39:07 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:39:07 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:39:07 compute-0 nova_compute[192810]:         <nova:user uuid="30d0a975d78c4d9a8e2201afdc040092">tempest-ServersTestJSON-782690373-project-member</nova:user>
Sep 30 21:39:07 compute-0 nova_compute[192810]:         <nova:project uuid="8ad754242d964bb487a2174b2c21bcc5">tempest-ServersTestJSON-782690373</nova:project>
Sep 30 21:39:07 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:39:07 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:39:07 compute-0 nova_compute[192810]:         <nova:port uuid="77592b52-fef4-4ec1-9e04-78fe87759f03">
Sep 30 21:39:07 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:39:07 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:39:07 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:39:07 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:39:07 compute-0 nova_compute[192810]:     <system>
Sep 30 21:39:07 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:39:07 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:39:07 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:39:07 compute-0 nova_compute[192810]:       <entry name="serial">0b1805eb-abb7-44a6-acd9-45548efbe439</entry>
Sep 30 21:39:07 compute-0 nova_compute[192810]:       <entry name="uuid">0b1805eb-abb7-44a6-acd9-45548efbe439</entry>
Sep 30 21:39:07 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     </system>
Sep 30 21:39:07 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:39:07 compute-0 nova_compute[192810]:   <os>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:   </os>
Sep 30 21:39:07 compute-0 nova_compute[192810]:   <features>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:   </features>
Sep 30 21:39:07 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:39:07 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:39:07 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:39:07 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:39:07 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:39:07 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/0b1805eb-abb7-44a6-acd9-45548efbe439/disk"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:39:07 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/0b1805eb-abb7-44a6-acd9-45548efbe439/disk.config"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:39:07 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:74:27:e0"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:       <target dev="tap77592b52-fe"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:39:07 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/0b1805eb-abb7-44a6-acd9-45548efbe439/console.log" append="off"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     <video>
Sep 30 21:39:07 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     </video>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:39:07 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:39:07 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:39:07 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:39:07 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:39:07 compute-0 nova_compute[192810]: </domain>
Sep 30 21:39:07 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.037 2 DEBUG nova.compute.manager [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Preparing to wait for external event network-vif-plugged-77592b52-fef4-4ec1-9e04-78fe87759f03 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.037 2 DEBUG oslo_concurrency.lockutils [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "0b1805eb-abb7-44a6-acd9-45548efbe439-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.038 2 DEBUG oslo_concurrency.lockutils [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "0b1805eb-abb7-44a6-acd9-45548efbe439-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.038 2 DEBUG oslo_concurrency.lockutils [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "0b1805eb-abb7-44a6-acd9-45548efbe439-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.038 2 DEBUG nova.virt.libvirt.vif [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:39:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-262784555',display_name='tempest-ServersTestJSON-server-262784555',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-262784555',id=127,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8ad754242d964bb487a2174b2c21bcc5',ramdisk_id='',reservation_id='r-6rbv3nnr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-782690373',owner_user_name='tempest-ServersTestJSON-782690373-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:39:03Z,user_data=None,user_id='30d0a975d78c4d9a8e2201afdc040092',uuid=0b1805eb-abb7-44a6-acd9-45548efbe439,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "77592b52-fef4-4ec1-9e04-78fe87759f03", "address": "fa:16:3e:74:27:e0", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77592b52-fe", "ovs_interfaceid": "77592b52-fef4-4ec1-9e04-78fe87759f03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.039 2 DEBUG nova.network.os_vif_util [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Converting VIF {"id": "77592b52-fef4-4ec1-9e04-78fe87759f03", "address": "fa:16:3e:74:27:e0", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77592b52-fe", "ovs_interfaceid": "77592b52-fef4-4ec1-9e04-78fe87759f03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.039 2 DEBUG nova.network.os_vif_util [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:74:27:e0,bridge_name='br-int',has_traffic_filtering=True,id=77592b52-fef4-4ec1-9e04-78fe87759f03,network=Network(27086519-6f4c-45f9-8e5b-5b321cd6871c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77592b52-fe') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.040 2 DEBUG os_vif [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:74:27:e0,bridge_name='br-int',has_traffic_filtering=True,id=77592b52-fef4-4ec1-9e04-78fe87759f03,network=Network(27086519-6f4c-45f9-8e5b-5b321cd6871c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77592b52-fe') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.040 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.041 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.041 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.045 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap77592b52-fe, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.045 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap77592b52-fe, col_values=(('external_ids', {'iface-id': '77592b52-fef4-4ec1-9e04-78fe87759f03', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:74:27:e0', 'vm-uuid': '0b1805eb-abb7-44a6-acd9-45548efbe439'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.046 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:07 compute-0 NetworkManager[51733]: <info>  [1759268347.0474] manager: (tap77592b52-fe): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/209)
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.049 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.056 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.058 2 INFO os_vif [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:74:27:e0,bridge_name='br-int',has_traffic_filtering=True,id=77592b52-fef4-4ec1-9e04-78fe87759f03,network=Network(27086519-6f4c-45f9-8e5b-5b321cd6871c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77592b52-fe')
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.221 2 DEBUG nova.network.neutron [req-0ae818c7-4ab7-4d12-b1fd-d81ad5a12162 req-cd45bd70-aa16-4471-b9c6-cc59105d5e49 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Updated VIF entry in instance network info cache for port 87924781-e008-4aac-b813-a3d0f3d55f51. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.221 2 DEBUG nova.network.neutron [req-0ae818c7-4ab7-4d12-b1fd-d81ad5a12162 req-cd45bd70-aa16-4471-b9c6-cc59105d5e49 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Updating instance_info_cache with network_info: [{"id": "87924781-e008-4aac-b813-a3d0f3d55f51", "address": "fa:16:3e:79:e6:e9", "network": {"id": "ea80450e-b8f2-4af5-a00d-9221e5dd4d97", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1808637609-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "51c02ace4fff44cca028986381d7c407", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87924781-e0", "ovs_interfaceid": "87924781-e008-4aac-b813-a3d0f3d55f51", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.236 2 DEBUG oslo_concurrency.lockutils [req-0ae818c7-4ab7-4d12-b1fd-d81ad5a12162 req-cd45bd70-aa16-4471-b9c6-cc59105d5e49 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-95188da8-afd4-4fbc-9231-7547201f81ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.263 2 DEBUG nova.virt.libvirt.driver [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.263 2 DEBUG nova.virt.libvirt.driver [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.263 2 DEBUG nova.virt.libvirt.driver [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] No VIF found with MAC fa:16:3e:74:27:e0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.264 2 INFO nova.virt.libvirt.driver [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Using config drive
Sep 30 21:39:07 compute-0 podman[238993]: 2025-09-30 21:39:07.32915538 +0000 UTC m=+0.082224660 container create d28cb23c9171f773c689c1ebdac7f3d71a3c9ff5524b8a1e0b669b38e70386b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ea80450e-b8f2-4af5-a00d-9221e5dd4d97, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923)
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.360 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268347.3600984, 95188da8-afd4-4fbc-9231-7547201f81ed => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.361 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] VM Started (Lifecycle Event)
Sep 30 21:39:07 compute-0 podman[238993]: 2025-09-30 21:39:07.277473052 +0000 UTC m=+0.030542362 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.397 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.401 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268347.3602777, 95188da8-afd4-4fbc-9231-7547201f81ed => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.401 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] VM Paused (Lifecycle Event)
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.436 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.438 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.470 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:39:07 compute-0 systemd[1]: Started libpod-conmon-d28cb23c9171f773c689c1ebdac7f3d71a3c9ff5524b8a1e0b669b38e70386b4.scope.
Sep 30 21:39:07 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:39:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a01141d6350a7b119794871e5dd5ab1c9d37afc68b06aa0d796886a81235f17/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.575 2 INFO nova.virt.libvirt.driver [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Creating config drive at /var/lib/nova/instances/0b1805eb-abb7-44a6-acd9-45548efbe439/disk.config
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.581 2 DEBUG oslo_concurrency.processutils [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0b1805eb-abb7-44a6-acd9-45548efbe439/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1f3uoeam execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:39:07 compute-0 podman[238993]: 2025-09-30 21:39:07.642676785 +0000 UTC m=+0.395746095 container init d28cb23c9171f773c689c1ebdac7f3d71a3c9ff5524b8a1e0b669b38e70386b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ea80450e-b8f2-4af5-a00d-9221e5dd4d97, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Sep 30 21:39:07 compute-0 podman[238993]: 2025-09-30 21:39:07.647929566 +0000 UTC m=+0.400998846 container start d28cb23c9171f773c689c1ebdac7f3d71a3c9ff5524b8a1e0b669b38e70386b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ea80450e-b8f2-4af5-a00d-9221e5dd4d97, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0)
Sep 30 21:39:07 compute-0 neutron-haproxy-ovnmeta-ea80450e-b8f2-4af5-a00d-9221e5dd4d97[239008]: [NOTICE]   (239015) : New worker (239017) forked
Sep 30 21:39:07 compute-0 neutron-haproxy-ovnmeta-ea80450e-b8f2-4af5-a00d-9221e5dd4d97[239008]: [NOTICE]   (239015) : Loading success.
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.675 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.708 2 DEBUG oslo_concurrency.processutils [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0b1805eb-abb7-44a6-acd9-45548efbe439/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1f3uoeam" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:39:07 compute-0 NetworkManager[51733]: <info>  [1759268347.7670] manager: (tap77592b52-fe): new Tun device (/org/freedesktop/NetworkManager/Devices/210)
Sep 30 21:39:07 compute-0 kernel: tap77592b52-fe: entered promiscuous mode
Sep 30 21:39:07 compute-0 systemd-udevd[238942]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:39:07 compute-0 ovn_controller[94912]: 2025-09-30T21:39:07Z|00470|binding|INFO|Claiming lport 77592b52-fef4-4ec1-9e04-78fe87759f03 for this chassis.
Sep 30 21:39:07 compute-0 ovn_controller[94912]: 2025-09-30T21:39:07Z|00471|binding|INFO|77592b52-fef4-4ec1-9e04-78fe87759f03: Claiming fa:16:3e:74:27:e0 10.100.0.7
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:07 compute-0 NetworkManager[51733]: <info>  [1759268347.7840] device (tap77592b52-fe): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:39:07 compute-0 ovn_controller[94912]: 2025-09-30T21:39:07Z|00472|binding|INFO|Setting lport 77592b52-fef4-4ec1-9e04-78fe87759f03 ovn-installed in OVS
Sep 30 21:39:07 compute-0 NetworkManager[51733]: <info>  [1759268347.7854] device (tap77592b52-fe): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.786 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:07 compute-0 ovn_controller[94912]: 2025-09-30T21:39:07Z|00473|binding|INFO|Setting lport 77592b52-fef4-4ec1-9e04-78fe87759f03 up in Southbound
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:07.788 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:74:27:e0 10.100.0.7'], port_security=['fa:16:3e:74:27:e0 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '0b1805eb-abb7-44a6-acd9-45548efbe439', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8ad754242d964bb487a2174b2c21bcc5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9c41899e-24c3-4632-81c5-100a69d8be81', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f4d6c701-a212-4977-9c52-b553d410c9c7, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=77592b52-fef4-4ec1-9e04-78fe87759f03) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:39:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:07.790 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 77592b52-fef4-4ec1-9e04-78fe87759f03 in datapath 27086519-6f4c-45f9-8e5b-5b321cd6871c bound to our chassis
Sep 30 21:39:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:07.792 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 27086519-6f4c-45f9-8e5b-5b321cd6871c
Sep 30 21:39:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:07.803 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[2ba8d162-53c2-4acd-b06d-f2a8d170e987]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:07.804 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap27086519-61 in ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:39:07 compute-0 systemd-machined[152794]: New machine qemu-61-instance-0000007f.
Sep 30 21:39:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:07.806 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap27086519-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:39:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:07.807 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[64156528-8010-486e-ab07-033e93fc31df]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:07.808 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e1afda15-b483-4b69-ac55-f2098df3a886]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:07.819 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[843bdbe0-1b64-4779-8b6d-a0191125c57b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:07 compute-0 systemd[1]: Started Virtual Machine qemu-61-instance-0000007f.
Sep 30 21:39:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:07.834 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[6036cde6-14e2-4e2f-b870-66118ecadc6f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.840 2 DEBUG nova.compute.manager [req-41da71a2-14ec-48c6-a317-3cd8104e8d72 req-f35be22e-8806-4a59-abcc-ed855ad0b740 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Received event network-vif-plugged-87924781-e008-4aac-b813-a3d0f3d55f51 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.840 2 DEBUG oslo_concurrency.lockutils [req-41da71a2-14ec-48c6-a317-3cd8104e8d72 req-f35be22e-8806-4a59-abcc-ed855ad0b740 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "95188da8-afd4-4fbc-9231-7547201f81ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.840 2 DEBUG oslo_concurrency.lockutils [req-41da71a2-14ec-48c6-a317-3cd8104e8d72 req-f35be22e-8806-4a59-abcc-ed855ad0b740 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "95188da8-afd4-4fbc-9231-7547201f81ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.841 2 DEBUG oslo_concurrency.lockutils [req-41da71a2-14ec-48c6-a317-3cd8104e8d72 req-f35be22e-8806-4a59-abcc-ed855ad0b740 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "95188da8-afd4-4fbc-9231-7547201f81ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.841 2 DEBUG nova.compute.manager [req-41da71a2-14ec-48c6-a317-3cd8104e8d72 req-f35be22e-8806-4a59-abcc-ed855ad0b740 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Processing event network-vif-plugged-87924781-e008-4aac-b813-a3d0f3d55f51 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.841 2 DEBUG nova.compute.manager [req-41da71a2-14ec-48c6-a317-3cd8104e8d72 req-f35be22e-8806-4a59-abcc-ed855ad0b740 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Received event network-vif-plugged-87924781-e008-4aac-b813-a3d0f3d55f51 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.842 2 DEBUG oslo_concurrency.lockutils [req-41da71a2-14ec-48c6-a317-3cd8104e8d72 req-f35be22e-8806-4a59-abcc-ed855ad0b740 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "95188da8-afd4-4fbc-9231-7547201f81ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.842 2 DEBUG oslo_concurrency.lockutils [req-41da71a2-14ec-48c6-a317-3cd8104e8d72 req-f35be22e-8806-4a59-abcc-ed855ad0b740 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "95188da8-afd4-4fbc-9231-7547201f81ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.842 2 DEBUG oslo_concurrency.lockutils [req-41da71a2-14ec-48c6-a317-3cd8104e8d72 req-f35be22e-8806-4a59-abcc-ed855ad0b740 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "95188da8-afd4-4fbc-9231-7547201f81ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.842 2 DEBUG nova.compute.manager [req-41da71a2-14ec-48c6-a317-3cd8104e8d72 req-f35be22e-8806-4a59-abcc-ed855ad0b740 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] No waiting events found dispatching network-vif-plugged-87924781-e008-4aac-b813-a3d0f3d55f51 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.843 2 WARNING nova.compute.manager [req-41da71a2-14ec-48c6-a317-3cd8104e8d72 req-f35be22e-8806-4a59-abcc-ed855ad0b740 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Received unexpected event network-vif-plugged-87924781-e008-4aac-b813-a3d0f3d55f51 for instance with vm_state building and task_state spawning.
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.843 2 DEBUG nova.compute.manager [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:39:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:07.864 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[c2fb9b21-fc3d-4301-8100-bbed60638acc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.871 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268347.8469696, 95188da8-afd4-4fbc-9231-7547201f81ed => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:39:07 compute-0 NetworkManager[51733]: <info>  [1759268347.8731] manager: (tap27086519-60): new Veth device (/org/freedesktop/NetworkManager/Devices/211)
Sep 30 21:39:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:07.875 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[1ea4b7be-8fb8-4df3-8bab-333fca570bb2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.876 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] VM Resumed (Lifecycle Event)
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.878 2 DEBUG nova.virt.libvirt.driver [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.895 2 INFO nova.virt.libvirt.driver [-] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Instance spawned successfully.
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.896 2 DEBUG nova.virt.libvirt.driver [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.904 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.908 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:39:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:07.907 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[ec8f6393-4a19-422d-82d8-c50e5c3c3868]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:07.911 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[3cf11cfd-e2b0-4a99-bd85-7cbf466118e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:07 compute-0 NetworkManager[51733]: <info>  [1759268347.9348] device (tap27086519-60): carrier: link connected
Sep 30 21:39:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:07.940 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[28b34596-d4d5-490e-8378-4bea3b9658d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.941 2 DEBUG nova.virt.libvirt.driver [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.942 2 DEBUG nova.virt.libvirt.driver [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.942 2 DEBUG nova.virt.libvirt.driver [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.943 2 DEBUG nova.virt.libvirt.driver [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.943 2 DEBUG nova.virt.libvirt.driver [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.944 2 DEBUG nova.virt.libvirt.driver [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:39:07 compute-0 nova_compute[192810]: 2025-09-30 21:39:07.947 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:39:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:07.958 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[95bfda6d-8f19-4b4f-a886-3c178c10cd33]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap27086519-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:da:b9:e3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 142], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 504355, 'reachable_time': 23254, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239057, 'error': None, 'target': 'ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:07.971 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[2ba74a8c-c2aa-4982-94c7-2f9e63a0f29d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feda:b9e3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 504355, 'tstamp': 504355}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 239058, 'error': None, 'target': 'ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:07.985 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a3c9484e-717a-4497-818a-9907838f0839]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap27086519-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:da:b9:e3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 142], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 504355, 'reachable_time': 23254, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 239059, 'error': None, 'target': 'ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:08.012 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e242deee-6a8d-4582-8feb-7f8e7c126af5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:08.071 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[bb3f9656-f7ae-44dc-86c1-1a5a3f3c1e37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:08.072 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap27086519-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:39:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:08.073 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:39:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:08.073 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap27086519-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:39:08 compute-0 NetworkManager[51733]: <info>  [1759268348.0754] manager: (tap27086519-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/212)
Sep 30 21:39:08 compute-0 kernel: tap27086519-60: entered promiscuous mode
Sep 30 21:39:08 compute-0 nova_compute[192810]: 2025-09-30 21:39:08.077 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:08.078 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap27086519-60, col_values=(('external_ids', {'iface-id': 'f2abb4ad-797b-4767-b8bc-377990516394'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:39:08 compute-0 ovn_controller[94912]: 2025-09-30T21:39:08Z|00474|binding|INFO|Releasing lport f2abb4ad-797b-4767-b8bc-377990516394 from this chassis (sb_readonly=0)
Sep 30 21:39:08 compute-0 nova_compute[192810]: 2025-09-30 21:39:08.079 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:08 compute-0 nova_compute[192810]: 2025-09-30 21:39:08.090 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:08.091 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/27086519-6f4c-45f9-8e5b-5b321cd6871c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/27086519-6f4c-45f9-8e5b-5b321cd6871c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:39:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:08.093 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[13e3325e-ba62-4ba7-96c2-8a6bc4e4aaa6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:08.094 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:39:08 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:39:08 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:39:08 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-27086519-6f4c-45f9-8e5b-5b321cd6871c
Sep 30 21:39:08 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:39:08 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:39:08 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:39:08 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/27086519-6f4c-45f9-8e5b-5b321cd6871c.pid.haproxy
Sep 30 21:39:08 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:39:08 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:39:08 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:39:08 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:39:08 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:39:08 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:39:08 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:39:08 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:39:08 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:39:08 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:39:08 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:39:08 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:39:08 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:39:08 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:39:08 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:39:08 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:39:08 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:39:08 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:39:08 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:39:08 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:39:08 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID 27086519-6f4c-45f9-8e5b-5b321cd6871c
Sep 30 21:39:08 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:39:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:08.096 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'env', 'PROCESS_TAG=haproxy-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/27086519-6f4c-45f9-8e5b-5b321cd6871c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:39:08 compute-0 nova_compute[192810]: 2025-09-30 21:39:08.100 2 INFO nova.compute.manager [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Took 4.62 seconds to spawn the instance on the hypervisor.
Sep 30 21:39:08 compute-0 nova_compute[192810]: 2025-09-30 21:39:08.100 2 DEBUG nova.compute.manager [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:39:08 compute-0 nova_compute[192810]: 2025-09-30 21:39:08.205 2 DEBUG nova.compute.manager [req-e2abe273-b9c7-4d0c-812b-a7405a8f8664 req-18d51a3d-f46f-4c8f-8373-b830d2accfaa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Received event network-vif-plugged-77592b52-fef4-4ec1-9e04-78fe87759f03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:39:08 compute-0 nova_compute[192810]: 2025-09-30 21:39:08.205 2 DEBUG oslo_concurrency.lockutils [req-e2abe273-b9c7-4d0c-812b-a7405a8f8664 req-18d51a3d-f46f-4c8f-8373-b830d2accfaa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "0b1805eb-abb7-44a6-acd9-45548efbe439-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:08 compute-0 nova_compute[192810]: 2025-09-30 21:39:08.206 2 DEBUG oslo_concurrency.lockutils [req-e2abe273-b9c7-4d0c-812b-a7405a8f8664 req-18d51a3d-f46f-4c8f-8373-b830d2accfaa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "0b1805eb-abb7-44a6-acd9-45548efbe439-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:08 compute-0 nova_compute[192810]: 2025-09-30 21:39:08.206 2 DEBUG oslo_concurrency.lockutils [req-e2abe273-b9c7-4d0c-812b-a7405a8f8664 req-18d51a3d-f46f-4c8f-8373-b830d2accfaa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "0b1805eb-abb7-44a6-acd9-45548efbe439-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:08 compute-0 nova_compute[192810]: 2025-09-30 21:39:08.206 2 DEBUG nova.compute.manager [req-e2abe273-b9c7-4d0c-812b-a7405a8f8664 req-18d51a3d-f46f-4c8f-8373-b830d2accfaa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Processing event network-vif-plugged-77592b52-fef4-4ec1-9e04-78fe87759f03 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:39:08 compute-0 nova_compute[192810]: 2025-09-30 21:39:08.287 2 INFO nova.compute.manager [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Took 5.66 seconds to build instance.
Sep 30 21:39:08 compute-0 nova_compute[192810]: 2025-09-30 21:39:08.337 2 DEBUG oslo_concurrency.lockutils [None req-2fe20f70-2c50-4fa5-a985-8457f3e9e63e 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Lock "95188da8-afd4-4fbc-9231-7547201f81ed" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.800s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:08 compute-0 nova_compute[192810]: 2025-09-30 21:39:08.441 2 DEBUG nova.network.neutron [req-9a750193-ad29-4adf-9ecf-970a0e8e23f0 req-f554a764-2068-4c8a-8de3-2b2bca098203 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Updated VIF entry in instance network info cache for port 77592b52-fef4-4ec1-9e04-78fe87759f03. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:39:08 compute-0 nova_compute[192810]: 2025-09-30 21:39:08.441 2 DEBUG nova.network.neutron [req-9a750193-ad29-4adf-9ecf-970a0e8e23f0 req-f554a764-2068-4c8a-8de3-2b2bca098203 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Updating instance_info_cache with network_info: [{"id": "77592b52-fef4-4ec1-9e04-78fe87759f03", "address": "fa:16:3e:74:27:e0", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77592b52-fe", "ovs_interfaceid": "77592b52-fef4-4ec1-9e04-78fe87759f03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:39:08 compute-0 podman[239091]: 2025-09-30 21:39:08.444304105 +0000 UTC m=+0.056971991 container create 542c547b430b323e804643f3af2ed5a7c370d30d7a4ec34338c851ba2f2125ff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:39:08 compute-0 nova_compute[192810]: 2025-09-30 21:39:08.464 2 DEBUG oslo_concurrency.lockutils [req-9a750193-ad29-4adf-9ecf-970a0e8e23f0 req-f554a764-2068-4c8a-8de3-2b2bca098203 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-0b1805eb-abb7-44a6-acd9-45548efbe439" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:39:08 compute-0 systemd[1]: Started libpod-conmon-542c547b430b323e804643f3af2ed5a7c370d30d7a4ec34338c851ba2f2125ff.scope.
Sep 30 21:39:08 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:39:08 compute-0 podman[239091]: 2025-09-30 21:39:08.414276037 +0000 UTC m=+0.026943963 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:39:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/48f3024be7acf2c55479f63a2025b37b9e2269af184d5f01102e72bf3a6953f9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:39:08 compute-0 podman[239091]: 2025-09-30 21:39:08.530895644 +0000 UTC m=+0.143563530 container init 542c547b430b323e804643f3af2ed5a7c370d30d7a4ec34338c851ba2f2125ff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:39:08 compute-0 podman[239091]: 2025-09-30 21:39:08.538319639 +0000 UTC m=+0.150987535 container start 542c547b430b323e804643f3af2ed5a7c370d30d7a4ec34338c851ba2f2125ff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:39:08 compute-0 neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c[239106]: [NOTICE]   (239116) : New worker (239119) forked
Sep 30 21:39:08 compute-0 neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c[239106]: [NOTICE]   (239116) : Loading success.
Sep 30 21:39:09 compute-0 nova_compute[192810]: 2025-09-30 21:39:09.021 2 DEBUG nova.compute.manager [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:39:09 compute-0 nova_compute[192810]: 2025-09-30 21:39:09.022 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268349.02087, 0b1805eb-abb7-44a6-acd9-45548efbe439 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:39:09 compute-0 nova_compute[192810]: 2025-09-30 21:39:09.022 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] VM Started (Lifecycle Event)
Sep 30 21:39:09 compute-0 nova_compute[192810]: 2025-09-30 21:39:09.026 2 DEBUG nova.virt.libvirt.driver [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:39:09 compute-0 nova_compute[192810]: 2025-09-30 21:39:09.030 2 INFO nova.virt.libvirt.driver [-] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Instance spawned successfully.
Sep 30 21:39:09 compute-0 nova_compute[192810]: 2025-09-30 21:39:09.030 2 DEBUG nova.virt.libvirt.driver [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:39:09 compute-0 nova_compute[192810]: 2025-09-30 21:39:09.067 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:39:09 compute-0 nova_compute[192810]: 2025-09-30 21:39:09.072 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:39:09 compute-0 nova_compute[192810]: 2025-09-30 21:39:09.081 2 DEBUG nova.virt.libvirt.driver [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:39:09 compute-0 nova_compute[192810]: 2025-09-30 21:39:09.081 2 DEBUG nova.virt.libvirt.driver [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:39:09 compute-0 nova_compute[192810]: 2025-09-30 21:39:09.081 2 DEBUG nova.virt.libvirt.driver [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:39:09 compute-0 nova_compute[192810]: 2025-09-30 21:39:09.082 2 DEBUG nova.virt.libvirt.driver [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:39:09 compute-0 nova_compute[192810]: 2025-09-30 21:39:09.082 2 DEBUG nova.virt.libvirt.driver [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:39:09 compute-0 nova_compute[192810]: 2025-09-30 21:39:09.082 2 DEBUG nova.virt.libvirt.driver [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:39:09 compute-0 nova_compute[192810]: 2025-09-30 21:39:09.104 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:39:09 compute-0 nova_compute[192810]: 2025-09-30 21:39:09.104 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268349.021026, 0b1805eb-abb7-44a6-acd9-45548efbe439 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:39:09 compute-0 nova_compute[192810]: 2025-09-30 21:39:09.104 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] VM Paused (Lifecycle Event)
Sep 30 21:39:09 compute-0 nova_compute[192810]: 2025-09-30 21:39:09.164 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:39:09 compute-0 nova_compute[192810]: 2025-09-30 21:39:09.167 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268349.0253356, 0b1805eb-abb7-44a6-acd9-45548efbe439 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:39:09 compute-0 nova_compute[192810]: 2025-09-30 21:39:09.168 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] VM Resumed (Lifecycle Event)
Sep 30 21:39:09 compute-0 nova_compute[192810]: 2025-09-30 21:39:09.201 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:39:09 compute-0 nova_compute[192810]: 2025-09-30 21:39:09.205 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:39:09 compute-0 nova_compute[192810]: 2025-09-30 21:39:09.254 2 INFO nova.compute.manager [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Took 5.49 seconds to spawn the instance on the hypervisor.
Sep 30 21:39:09 compute-0 nova_compute[192810]: 2025-09-30 21:39:09.255 2 DEBUG nova.compute.manager [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:39:09 compute-0 nova_compute[192810]: 2025-09-30 21:39:09.266 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:39:09 compute-0 nova_compute[192810]: 2025-09-30 21:39:09.430 2 INFO nova.compute.manager [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Took 6.71 seconds to build instance.
Sep 30 21:39:09 compute-0 nova_compute[192810]: 2025-09-30 21:39:09.487 2 DEBUG oslo_concurrency.lockutils [None req-3635a714-31de-4fae-b32e-9ec83012871c 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "0b1805eb-abb7-44a6-acd9-45548efbe439" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.852s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:10 compute-0 nova_compute[192810]: 2025-09-30 21:39:10.366 2 DEBUG nova.compute.manager [req-3045862e-2491-42e1-b21c-5d3366d47390 req-6e27fba5-99e5-49eb-8621-4e98cf69ff05 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Received event network-vif-plugged-77592b52-fef4-4ec1-9e04-78fe87759f03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:39:10 compute-0 nova_compute[192810]: 2025-09-30 21:39:10.367 2 DEBUG oslo_concurrency.lockutils [req-3045862e-2491-42e1-b21c-5d3366d47390 req-6e27fba5-99e5-49eb-8621-4e98cf69ff05 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "0b1805eb-abb7-44a6-acd9-45548efbe439-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:10 compute-0 nova_compute[192810]: 2025-09-30 21:39:10.367 2 DEBUG oslo_concurrency.lockutils [req-3045862e-2491-42e1-b21c-5d3366d47390 req-6e27fba5-99e5-49eb-8621-4e98cf69ff05 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "0b1805eb-abb7-44a6-acd9-45548efbe439-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:10 compute-0 nova_compute[192810]: 2025-09-30 21:39:10.367 2 DEBUG oslo_concurrency.lockutils [req-3045862e-2491-42e1-b21c-5d3366d47390 req-6e27fba5-99e5-49eb-8621-4e98cf69ff05 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "0b1805eb-abb7-44a6-acd9-45548efbe439-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:10 compute-0 nova_compute[192810]: 2025-09-30 21:39:10.368 2 DEBUG nova.compute.manager [req-3045862e-2491-42e1-b21c-5d3366d47390 req-6e27fba5-99e5-49eb-8621-4e98cf69ff05 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] No waiting events found dispatching network-vif-plugged-77592b52-fef4-4ec1-9e04-78fe87759f03 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:39:10 compute-0 nova_compute[192810]: 2025-09-30 21:39:10.369 2 WARNING nova.compute.manager [req-3045862e-2491-42e1-b21c-5d3366d47390 req-6e27fba5-99e5-49eb-8621-4e98cf69ff05 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Received unexpected event network-vif-plugged-77592b52-fef4-4ec1-9e04-78fe87759f03 for instance with vm_state active and task_state None.
Sep 30 21:39:10 compute-0 nova_compute[192810]: 2025-09-30 21:39:10.453 2 DEBUG oslo_concurrency.lockutils [None req-d695622f-bcdf-485e-99ca-55bf2be21843 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Acquiring lock "95188da8-afd4-4fbc-9231-7547201f81ed" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:10 compute-0 nova_compute[192810]: 2025-09-30 21:39:10.454 2 DEBUG oslo_concurrency.lockutils [None req-d695622f-bcdf-485e-99ca-55bf2be21843 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Lock "95188da8-afd4-4fbc-9231-7547201f81ed" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:10 compute-0 nova_compute[192810]: 2025-09-30 21:39:10.455 2 DEBUG oslo_concurrency.lockutils [None req-d695622f-bcdf-485e-99ca-55bf2be21843 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Acquiring lock "95188da8-afd4-4fbc-9231-7547201f81ed-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:10 compute-0 nova_compute[192810]: 2025-09-30 21:39:10.455 2 DEBUG oslo_concurrency.lockutils [None req-d695622f-bcdf-485e-99ca-55bf2be21843 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Lock "95188da8-afd4-4fbc-9231-7547201f81ed-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:10 compute-0 nova_compute[192810]: 2025-09-30 21:39:10.456 2 DEBUG oslo_concurrency.lockutils [None req-d695622f-bcdf-485e-99ca-55bf2be21843 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Lock "95188da8-afd4-4fbc-9231-7547201f81ed-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:10 compute-0 podman[239128]: 2025-09-30 21:39:10.467261586 +0000 UTC m=+0.071824251 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:39:10 compute-0 podman[239129]: 2025-09-30 21:39:10.474818535 +0000 UTC m=+0.074780705 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, distribution-scope=public, maintainer=Red Hat, Inc., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, config_id=edpm, container_name=openstack_network_exporter, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down 
image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Sep 30 21:39:10 compute-0 nova_compute[192810]: 2025-09-30 21:39:10.528 2 INFO nova.compute.manager [None req-d695622f-bcdf-485e-99ca-55bf2be21843 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Terminating instance
Sep 30 21:39:10 compute-0 nova_compute[192810]: 2025-09-30 21:39:10.613 2 DEBUG nova.compute.manager [None req-d695622f-bcdf-485e-99ca-55bf2be21843 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:39:10 compute-0 kernel: tap87924781-e0 (unregistering): left promiscuous mode
Sep 30 21:39:10 compute-0 NetworkManager[51733]: <info>  [1759268350.6460] device (tap87924781-e0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:39:10 compute-0 ovn_controller[94912]: 2025-09-30T21:39:10Z|00475|binding|INFO|Releasing lport 87924781-e008-4aac-b813-a3d0f3d55f51 from this chassis (sb_readonly=0)
Sep 30 21:39:10 compute-0 ovn_controller[94912]: 2025-09-30T21:39:10Z|00476|binding|INFO|Setting lport 87924781-e008-4aac-b813-a3d0f3d55f51 down in Southbound
Sep 30 21:39:10 compute-0 nova_compute[192810]: 2025-09-30 21:39:10.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:10 compute-0 ovn_controller[94912]: 2025-09-30T21:39:10Z|00477|binding|INFO|Removing iface tap87924781-e0 ovn-installed in OVS
Sep 30 21:39:10 compute-0 nova_compute[192810]: 2025-09-30 21:39:10.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:10 compute-0 nova_compute[192810]: 2025-09-30 21:39:10.677 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:10.683 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:79:e6:e9 10.100.0.6'], port_security=['fa:16:3e:79:e6:e9 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '95188da8-afd4-4fbc-9231-7547201f81ed', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ea80450e-b8f2-4af5-a00d-9221e5dd4d97', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '51c02ace4fff44cca028986381d7c407', 'neutron:revision_number': '4', 'neutron:security_group_ids': '62973e02-a61c-4061-8a33-46cf1b8a21f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d6f1e456-6208-4d0c-8c96-5e3a3d932af6, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=87924781-e008-4aac-b813-a3d0f3d55f51) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:39:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:10.684 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 87924781-e008-4aac-b813-a3d0f3d55f51 in datapath ea80450e-b8f2-4af5-a00d-9221e5dd4d97 unbound from our chassis
Sep 30 21:39:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:10.686 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ea80450e-b8f2-4af5-a00d-9221e5dd4d97, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:39:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:10.687 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[ef95d4b4-b04f-40d9-a67d-ce28f4ff6a69]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:10.688 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ea80450e-b8f2-4af5-a00d-9221e5dd4d97 namespace which is not needed anymore
Sep 30 21:39:10 compute-0 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d0000007e.scope: Deactivated successfully.
Sep 30 21:39:10 compute-0 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d0000007e.scope: Consumed 3.351s CPU time.
Sep 30 21:39:10 compute-0 systemd-machined[152794]: Machine qemu-60-instance-0000007e terminated.
Sep 30 21:39:10 compute-0 neutron-haproxy-ovnmeta-ea80450e-b8f2-4af5-a00d-9221e5dd4d97[239008]: [NOTICE]   (239015) : haproxy version is 2.8.14-c23fe91
Sep 30 21:39:10 compute-0 neutron-haproxy-ovnmeta-ea80450e-b8f2-4af5-a00d-9221e5dd4d97[239008]: [NOTICE]   (239015) : path to executable is /usr/sbin/haproxy
Sep 30 21:39:10 compute-0 neutron-haproxy-ovnmeta-ea80450e-b8f2-4af5-a00d-9221e5dd4d97[239008]: [WARNING]  (239015) : Exiting Master process...
Sep 30 21:39:10 compute-0 neutron-haproxy-ovnmeta-ea80450e-b8f2-4af5-a00d-9221e5dd4d97[239008]: [ALERT]    (239015) : Current worker (239017) exited with code 143 (Terminated)
Sep 30 21:39:10 compute-0 neutron-haproxy-ovnmeta-ea80450e-b8f2-4af5-a00d-9221e5dd4d97[239008]: [WARNING]  (239015) : All workers exited. Exiting... (0)
Sep 30 21:39:10 compute-0 systemd[1]: libpod-d28cb23c9171f773c689c1ebdac7f3d71a3c9ff5524b8a1e0b669b38e70386b4.scope: Deactivated successfully.
Sep 30 21:39:10 compute-0 podman[239194]: 2025-09-30 21:39:10.816545722 +0000 UTC m=+0.042495180 container died d28cb23c9171f773c689c1ebdac7f3d71a3c9ff5524b8a1e0b669b38e70386b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ea80450e-b8f2-4af5-a00d-9221e5dd4d97, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:39:10 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d28cb23c9171f773c689c1ebdac7f3d71a3c9ff5524b8a1e0b669b38e70386b4-userdata-shm.mount: Deactivated successfully.
Sep 30 21:39:10 compute-0 systemd[1]: var-lib-containers-storage-overlay-3a01141d6350a7b119794871e5dd5ab1c9d37afc68b06aa0d796886a81235f17-merged.mount: Deactivated successfully.
Sep 30 21:39:10 compute-0 nova_compute[192810]: 2025-09-30 21:39:10.874 2 INFO nova.virt.libvirt.driver [-] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Instance destroyed successfully.
Sep 30 21:39:10 compute-0 nova_compute[192810]: 2025-09-30 21:39:10.874 2 DEBUG nova.objects.instance [None req-d695622f-bcdf-485e-99ca-55bf2be21843 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Lazy-loading 'resources' on Instance uuid 95188da8-afd4-4fbc-9231-7547201f81ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:39:10 compute-0 podman[239194]: 2025-09-30 21:39:10.888047445 +0000 UTC m=+0.113996903 container cleanup d28cb23c9171f773c689c1ebdac7f3d71a3c9ff5524b8a1e0b669b38e70386b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ea80450e-b8f2-4af5-a00d-9221e5dd4d97, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 21:39:10 compute-0 systemd[1]: libpod-conmon-d28cb23c9171f773c689c1ebdac7f3d71a3c9ff5524b8a1e0b669b38e70386b4.scope: Deactivated successfully.
Sep 30 21:39:10 compute-0 nova_compute[192810]: 2025-09-30 21:39:10.902 2 DEBUG nova.virt.libvirt.vif [None req-d695622f-bcdf-485e-99ca-55bf2be21843 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:39:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1112694235',display_name='tempest-ServersNegativeTestJSON-server-1112694235',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serversnegativetestjson-server-1112694235',id=126,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:39:08Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='51c02ace4fff44cca028986381d7c407',ramdisk_id='',reservation_id='r-7810uyz9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_m
in_disk='1',image_min_ram='0',owner_project_name='tempest-ServersNegativeTestJSON-2038911265',owner_user_name='tempest-ServersNegativeTestJSON-2038911265-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:39:08Z,user_data=None,user_id='972edc166c9442b1a83983d15a64e8b6',uuid=95188da8-afd4-4fbc-9231-7547201f81ed,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "87924781-e008-4aac-b813-a3d0f3d55f51", "address": "fa:16:3e:79:e6:e9", "network": {"id": "ea80450e-b8f2-4af5-a00d-9221e5dd4d97", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1808637609-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "51c02ace4fff44cca028986381d7c407", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87924781-e0", "ovs_interfaceid": "87924781-e008-4aac-b813-a3d0f3d55f51", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:39:10 compute-0 nova_compute[192810]: 2025-09-30 21:39:10.903 2 DEBUG nova.network.os_vif_util [None req-d695622f-bcdf-485e-99ca-55bf2be21843 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Converting VIF {"id": "87924781-e008-4aac-b813-a3d0f3d55f51", "address": "fa:16:3e:79:e6:e9", "network": {"id": "ea80450e-b8f2-4af5-a00d-9221e5dd4d97", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-1808637609-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "51c02ace4fff44cca028986381d7c407", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87924781-e0", "ovs_interfaceid": "87924781-e008-4aac-b813-a3d0f3d55f51", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:39:10 compute-0 nova_compute[192810]: 2025-09-30 21:39:10.906 2 DEBUG nova.network.os_vif_util [None req-d695622f-bcdf-485e-99ca-55bf2be21843 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:79:e6:e9,bridge_name='br-int',has_traffic_filtering=True,id=87924781-e008-4aac-b813-a3d0f3d55f51,network=Network(ea80450e-b8f2-4af5-a00d-9221e5dd4d97),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap87924781-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:39:10 compute-0 nova_compute[192810]: 2025-09-30 21:39:10.907 2 DEBUG os_vif [None req-d695622f-bcdf-485e-99ca-55bf2be21843 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:79:e6:e9,bridge_name='br-int',has_traffic_filtering=True,id=87924781-e008-4aac-b813-a3d0f3d55f51,network=Network(ea80450e-b8f2-4af5-a00d-9221e5dd4d97),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap87924781-e0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:39:10 compute-0 nova_compute[192810]: 2025-09-30 21:39:10.912 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:10 compute-0 nova_compute[192810]: 2025-09-30 21:39:10.914 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap87924781-e0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:39:10 compute-0 nova_compute[192810]: 2025-09-30 21:39:10.915 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:10 compute-0 nova_compute[192810]: 2025-09-30 21:39:10.921 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:39:10 compute-0 nova_compute[192810]: 2025-09-30 21:39:10.923 2 INFO os_vif [None req-d695622f-bcdf-485e-99ca-55bf2be21843 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:79:e6:e9,bridge_name='br-int',has_traffic_filtering=True,id=87924781-e008-4aac-b813-a3d0f3d55f51,network=Network(ea80450e-b8f2-4af5-a00d-9221e5dd4d97),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap87924781-e0')
Sep 30 21:39:10 compute-0 nova_compute[192810]: 2025-09-30 21:39:10.925 2 INFO nova.virt.libvirt.driver [None req-d695622f-bcdf-485e-99ca-55bf2be21843 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Deleting instance files /var/lib/nova/instances/95188da8-afd4-4fbc-9231-7547201f81ed_del
Sep 30 21:39:10 compute-0 nova_compute[192810]: 2025-09-30 21:39:10.928 2 INFO nova.virt.libvirt.driver [None req-d695622f-bcdf-485e-99ca-55bf2be21843 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Deletion of /var/lib/nova/instances/95188da8-afd4-4fbc-9231-7547201f81ed_del complete
Sep 30 21:39:10 compute-0 nova_compute[192810]: 2025-09-30 21:39:10.935 2 DEBUG nova.compute.manager [req-58e03d03-6131-42ea-bedd-dc2b68f74170 req-83b7b9b7-34bb-4b25-b122-8f297676d77a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Received event network-vif-unplugged-87924781-e008-4aac-b813-a3d0f3d55f51 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:39:10 compute-0 nova_compute[192810]: 2025-09-30 21:39:10.936 2 DEBUG oslo_concurrency.lockutils [req-58e03d03-6131-42ea-bedd-dc2b68f74170 req-83b7b9b7-34bb-4b25-b122-8f297676d77a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "95188da8-afd4-4fbc-9231-7547201f81ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:10 compute-0 nova_compute[192810]: 2025-09-30 21:39:10.937 2 DEBUG oslo_concurrency.lockutils [req-58e03d03-6131-42ea-bedd-dc2b68f74170 req-83b7b9b7-34bb-4b25-b122-8f297676d77a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "95188da8-afd4-4fbc-9231-7547201f81ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:10 compute-0 nova_compute[192810]: 2025-09-30 21:39:10.938 2 DEBUG oslo_concurrency.lockutils [req-58e03d03-6131-42ea-bedd-dc2b68f74170 req-83b7b9b7-34bb-4b25-b122-8f297676d77a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "95188da8-afd4-4fbc-9231-7547201f81ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:10 compute-0 nova_compute[192810]: 2025-09-30 21:39:10.938 2 DEBUG nova.compute.manager [req-58e03d03-6131-42ea-bedd-dc2b68f74170 req-83b7b9b7-34bb-4b25-b122-8f297676d77a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] No waiting events found dispatching network-vif-unplugged-87924781-e008-4aac-b813-a3d0f3d55f51 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:39:10 compute-0 nova_compute[192810]: 2025-09-30 21:39:10.938 2 DEBUG nova.compute.manager [req-58e03d03-6131-42ea-bedd-dc2b68f74170 req-83b7b9b7-34bb-4b25-b122-8f297676d77a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Received event network-vif-unplugged-87924781-e008-4aac-b813-a3d0f3d55f51 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:39:10 compute-0 podman[239238]: 2025-09-30 21:39:10.945951168 +0000 UTC m=+0.037285541 container remove d28cb23c9171f773c689c1ebdac7f3d71a3c9ff5524b8a1e0b669b38e70386b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ea80450e-b8f2-4af5-a00d-9221e5dd4d97, tcib_managed=true, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:39:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:10.950 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[eb0f3276-dacd-45fd-b8e1-13a5ab20450f]: (4, ('Tue Sep 30 09:39:10 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ea80450e-b8f2-4af5-a00d-9221e5dd4d97 (d28cb23c9171f773c689c1ebdac7f3d71a3c9ff5524b8a1e0b669b38e70386b4)\nd28cb23c9171f773c689c1ebdac7f3d71a3c9ff5524b8a1e0b669b38e70386b4\nTue Sep 30 09:39:10 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ea80450e-b8f2-4af5-a00d-9221e5dd4d97 (d28cb23c9171f773c689c1ebdac7f3d71a3c9ff5524b8a1e0b669b38e70386b4)\nd28cb23c9171f773c689c1ebdac7f3d71a3c9ff5524b8a1e0b669b38e70386b4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:10.952 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[c7351d01-15b9-4080-a713-2cfbb8f2f876]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:10.953 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapea80450e-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:39:10 compute-0 kernel: tapea80450e-b0: left promiscuous mode
Sep 30 21:39:10 compute-0 nova_compute[192810]: 2025-09-30 21:39:10.956 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:10 compute-0 nova_compute[192810]: 2025-09-30 21:39:10.971 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:10.974 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[dac71476-9dc7-47d2-a77a-7ffdbce63980]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:11 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:11.014 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[cf99b9a8-7466-40cb-bfa3-e9400854e579]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:11 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:11.016 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[f1e4cfdd-e4a5-49f1-b219-208d405ef8c2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:11 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:11.032 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[39ec09e4-9515-450b-bcba-9a7c868e85c8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 504231, 'reachable_time': 32663, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239252, 'error': None, 'target': 'ovnmeta-ea80450e-b8f2-4af5-a00d-9221e5dd4d97', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:11 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:11.035 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ea80450e-b8f2-4af5-a00d-9221e5dd4d97 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:39:11 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:11.035 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[73032c3c-d422-4e20-b8ac-146d121d6818]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:11 compute-0 systemd[1]: run-netns-ovnmeta\x2dea80450e\x2db8f2\x2d4af5\x2da00d\x2d9221e5dd4d97.mount: Deactivated successfully.
Sep 30 21:39:11 compute-0 nova_compute[192810]: 2025-09-30 21:39:11.075 2 INFO nova.compute.manager [None req-d695622f-bcdf-485e-99ca-55bf2be21843 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Took 0.46 seconds to destroy the instance on the hypervisor.
Sep 30 21:39:11 compute-0 nova_compute[192810]: 2025-09-30 21:39:11.075 2 DEBUG oslo.service.loopingcall [None req-d695622f-bcdf-485e-99ca-55bf2be21843 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:39:11 compute-0 nova_compute[192810]: 2025-09-30 21:39:11.076 2 DEBUG nova.compute.manager [-] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:39:11 compute-0 nova_compute[192810]: 2025-09-30 21:39:11.076 2 DEBUG nova.network.neutron [-] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:39:12 compute-0 nova_compute[192810]: 2025-09-30 21:39:12.658 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268337.657474, 2db1c454-9505-4c8b-aeff-f0bba892690f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:39:12 compute-0 nova_compute[192810]: 2025-09-30 21:39:12.659 2 INFO nova.compute.manager [-] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] VM Stopped (Lifecycle Event)
Sep 30 21:39:12 compute-0 nova_compute[192810]: 2025-09-30 21:39:12.676 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:12 compute-0 nova_compute[192810]: 2025-09-30 21:39:12.754 2 DEBUG nova.network.neutron [-] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:39:12 compute-0 nova_compute[192810]: 2025-09-30 21:39:12.974 2 DEBUG nova.compute.manager [None req-b2cd78c6-b632-4cb9-91fc-c188d0c90662 - - - - - -] [instance: 2db1c454-9505-4c8b-aeff-f0bba892690f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:39:13 compute-0 nova_compute[192810]: 2025-09-30 21:39:13.088 2 DEBUG nova.virt.libvirt.driver [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Sep 30 21:39:13 compute-0 nova_compute[192810]: 2025-09-30 21:39:13.106 2 INFO nova.compute.manager [-] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Took 2.03 seconds to deallocate network for instance.
Sep 30 21:39:13 compute-0 nova_compute[192810]: 2025-09-30 21:39:13.615 2 DEBUG nova.compute.manager [req-d95e6946-f5e6-4a26-aa4a-44753fc52811 req-b678d7ac-8158-4604-9403-752cecc1d3fe dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Received event network-vif-plugged-87924781-e008-4aac-b813-a3d0f3d55f51 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:39:13 compute-0 nova_compute[192810]: 2025-09-30 21:39:13.615 2 DEBUG oslo_concurrency.lockutils [req-d95e6946-f5e6-4a26-aa4a-44753fc52811 req-b678d7ac-8158-4604-9403-752cecc1d3fe dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "95188da8-afd4-4fbc-9231-7547201f81ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:13 compute-0 nova_compute[192810]: 2025-09-30 21:39:13.615 2 DEBUG oslo_concurrency.lockutils [req-d95e6946-f5e6-4a26-aa4a-44753fc52811 req-b678d7ac-8158-4604-9403-752cecc1d3fe dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "95188da8-afd4-4fbc-9231-7547201f81ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:13 compute-0 nova_compute[192810]: 2025-09-30 21:39:13.615 2 DEBUG oslo_concurrency.lockutils [req-d95e6946-f5e6-4a26-aa4a-44753fc52811 req-b678d7ac-8158-4604-9403-752cecc1d3fe dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "95188da8-afd4-4fbc-9231-7547201f81ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:13 compute-0 nova_compute[192810]: 2025-09-30 21:39:13.615 2 DEBUG nova.compute.manager [req-d95e6946-f5e6-4a26-aa4a-44753fc52811 req-b678d7ac-8158-4604-9403-752cecc1d3fe dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] No waiting events found dispatching network-vif-plugged-87924781-e008-4aac-b813-a3d0f3d55f51 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:39:13 compute-0 nova_compute[192810]: 2025-09-30 21:39:13.616 2 WARNING nova.compute.manager [req-d95e6946-f5e6-4a26-aa4a-44753fc52811 req-b678d7ac-8158-4604-9403-752cecc1d3fe dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Received unexpected event network-vif-plugged-87924781-e008-4aac-b813-a3d0f3d55f51 for instance with vm_state active and task_state deleting.
Sep 30 21:39:13 compute-0 nova_compute[192810]: 2025-09-30 21:39:13.616 2 DEBUG nova.compute.manager [req-d95e6946-f5e6-4a26-aa4a-44753fc52811 req-b678d7ac-8158-4604-9403-752cecc1d3fe dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Received event network-vif-deleted-87924781-e008-4aac-b813-a3d0f3d55f51 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:39:14 compute-0 nova_compute[192810]: 2025-09-30 21:39:14.974 2 DEBUG oslo_concurrency.lockutils [None req-d695622f-bcdf-485e-99ca-55bf2be21843 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:14 compute-0 nova_compute[192810]: 2025-09-30 21:39:14.975 2 DEBUG oslo_concurrency.lockutils [None req-d695622f-bcdf-485e-99ca-55bf2be21843 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:15 compute-0 nova_compute[192810]: 2025-09-30 21:39:15.069 2 DEBUG nova.compute.provider_tree [None req-d695622f-bcdf-485e-99ca-55bf2be21843 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:39:15 compute-0 nova_compute[192810]: 2025-09-30 21:39:15.459 2 DEBUG nova.scheduler.client.report [None req-d695622f-bcdf-485e-99ca-55bf2be21843 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:39:15 compute-0 kernel: tape9ffeb79-64 (unregistering): left promiscuous mode
Sep 30 21:39:15 compute-0 NetworkManager[51733]: <info>  [1759268355.4940] device (tape9ffeb79-64): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:39:15 compute-0 ovn_controller[94912]: 2025-09-30T21:39:15Z|00478|binding|INFO|Releasing lport e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 from this chassis (sb_readonly=0)
Sep 30 21:39:15 compute-0 ovn_controller[94912]: 2025-09-30T21:39:15Z|00479|binding|INFO|Setting lport e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 down in Southbound
Sep 30 21:39:15 compute-0 ovn_controller[94912]: 2025-09-30T21:39:15Z|00480|binding|INFO|Removing iface tape9ffeb79-64 ovn-installed in OVS
Sep 30 21:39:15 compute-0 nova_compute[192810]: 2025-09-30 21:39:15.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:15 compute-0 nova_compute[192810]: 2025-09-30 21:39:15.535 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:15 compute-0 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d0000007c.scope: Deactivated successfully.
Sep 30 21:39:15 compute-0 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d0000007c.scope: Consumed 12.883s CPU time.
Sep 30 21:39:15 compute-0 systemd-machined[152794]: Machine qemu-59-instance-0000007c terminated.
Sep 30 21:39:15 compute-0 nova_compute[192810]: 2025-09-30 21:39:15.915 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:16 compute-0 nova_compute[192810]: 2025-09-30 21:39:16.099 2 INFO nova.virt.libvirt.driver [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Instance shutdown successfully after 13 seconds.
Sep 30 21:39:16 compute-0 nova_compute[192810]: 2025-09-30 21:39:16.104 2 INFO nova.virt.libvirt.driver [-] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Instance destroyed successfully.
Sep 30 21:39:16 compute-0 nova_compute[192810]: 2025-09-30 21:39:16.104 2 DEBUG nova.objects.instance [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Lazy-loading 'numa_topology' on Instance uuid 99707edc-d882-4127-bc26-1fea0a52f9da obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:39:16 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:16.228 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e9:08:4f 10.100.0.8'], port_security=['fa:16:3e:e9:08:4f 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '99707edc-d882-4127-bc26-1fea0a52f9da', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ba09ae5f-6031-4fbb-acc3-d581f9322784', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9c5899f27a1e496da636c4e31f6867c6', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f74cc213-8625-4465-9150-d91cf93d6153', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=319a5a77-493a-41ce-8d81-8ffce7598f19, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=e9ffeb79-64b3-4a68-93a9-3d612ba40eb5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:39:16 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:16.229 103867 INFO neutron.agent.ovn.metadata.agent [-] Port e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 in datapath ba09ae5f-6031-4fbb-acc3-d581f9322784 unbound from our chassis
Sep 30 21:39:16 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:16.229 103867 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network ba09ae5f-6031-4fbb-acc3-d581f9322784 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Sep 30 21:39:16 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:16.231 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[79814ab2-4640-4912-b290-d6efee1fefc6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:17 compute-0 nova_compute[192810]: 2025-09-30 21:39:17.320 2 DEBUG oslo_concurrency.lockutils [None req-d695622f-bcdf-485e-99ca-55bf2be21843 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 2.346s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:17 compute-0 podman[239291]: 2025-09-30 21:39:17.330365947 +0000 UTC m=+0.060980201 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:39:17 compute-0 podman[239292]: 2025-09-30 21:39:17.347699559 +0000 UTC m=+0.063786951 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:39:17 compute-0 podman[239293]: 2025-09-30 21:39:17.363464862 +0000 UTC m=+0.077347079 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:39:17 compute-0 nova_compute[192810]: 2025-09-30 21:39:17.561 2 INFO nova.virt.libvirt.driver [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Attempting rescue
Sep 30 21:39:17 compute-0 nova_compute[192810]: 2025-09-30 21:39:17.561 2 DEBUG nova.virt.libvirt.driver [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314
Sep 30 21:39:17 compute-0 nova_compute[192810]: 2025-09-30 21:39:17.565 2 DEBUG nova.virt.libvirt.driver [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Sep 30 21:39:17 compute-0 nova_compute[192810]: 2025-09-30 21:39:17.565 2 INFO nova.virt.libvirt.driver [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Creating image(s)
Sep 30 21:39:17 compute-0 nova_compute[192810]: 2025-09-30 21:39:17.566 2 DEBUG oslo_concurrency.lockutils [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Acquiring lock "/var/lib/nova/instances/99707edc-d882-4127-bc26-1fea0a52f9da/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:17 compute-0 nova_compute[192810]: 2025-09-30 21:39:17.566 2 DEBUG oslo_concurrency.lockutils [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Lock "/var/lib/nova/instances/99707edc-d882-4127-bc26-1fea0a52f9da/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:17 compute-0 nova_compute[192810]: 2025-09-30 21:39:17.566 2 DEBUG oslo_concurrency.lockutils [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Lock "/var/lib/nova/instances/99707edc-d882-4127-bc26-1fea0a52f9da/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:17 compute-0 nova_compute[192810]: 2025-09-30 21:39:17.567 2 DEBUG nova.objects.instance [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 99707edc-d882-4127-bc26-1fea0a52f9da obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:39:17 compute-0 nova_compute[192810]: 2025-09-30 21:39:17.651 2 INFO nova.scheduler.client.report [None req-d695622f-bcdf-485e-99ca-55bf2be21843 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Deleted allocations for instance 95188da8-afd4-4fbc-9231-7547201f81ed
Sep 30 21:39:17 compute-0 nova_compute[192810]: 2025-09-30 21:39:17.679 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:17 compute-0 nova_compute[192810]: 2025-09-30 21:39:17.865 2 DEBUG oslo_concurrency.lockutils [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:17 compute-0 nova_compute[192810]: 2025-09-30 21:39:17.866 2 DEBUG oslo_concurrency.lockutils [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:17 compute-0 nova_compute[192810]: 2025-09-30 21:39:17.876 2 DEBUG oslo_concurrency.processutils [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:39:17 compute-0 nova_compute[192810]: 2025-09-30 21:39:17.941 2 DEBUG oslo_concurrency.processutils [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:39:17 compute-0 nova_compute[192810]: 2025-09-30 21:39:17.942 2 DEBUG oslo_concurrency.processutils [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/99707edc-d882-4127-bc26-1fea0a52f9da/disk.rescue execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:39:18 compute-0 nova_compute[192810]: 2025-09-30 21:39:18.089 2 DEBUG oslo_concurrency.processutils [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/99707edc-d882-4127-bc26-1fea0a52f9da/disk.rescue" returned: 0 in 0.147s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:39:18 compute-0 nova_compute[192810]: 2025-09-30 21:39:18.090 2 DEBUG oslo_concurrency.lockutils [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.224s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:18 compute-0 nova_compute[192810]: 2025-09-30 21:39:18.090 2 DEBUG nova.objects.instance [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Lazy-loading 'migration_context' on Instance uuid 99707edc-d882-4127-bc26-1fea0a52f9da obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:39:18 compute-0 nova_compute[192810]: 2025-09-30 21:39:18.482 2 DEBUG nova.virt.libvirt.driver [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:39:18 compute-0 nova_compute[192810]: 2025-09-30 21:39:18.483 2 DEBUG nova.virt.libvirt.driver [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Start _get_guest_xml network_info=[{"id": "e9ffeb79-64b3-4a68-93a9-3d612ba40eb5", "address": "fa:16:3e:e9:08:4f", "network": {"id": "ba09ae5f-6031-4fbb-acc3-d581f9322784", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1364438883-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-1364438883-network", "vif_mac": "fa:16:3e:e9:08:4f"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "9c5899f27a1e496da636c4e31f6867c6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9ffeb79-64", "ovs_interfaceid": "e9ffeb79-64b3-4a68-93a9-3d612ba40eb5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': '86b6907c-d747-4e98-8897-42105915831d', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:39:18 compute-0 nova_compute[192810]: 2025-09-30 21:39:18.483 2 DEBUG nova.objects.instance [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Lazy-loading 'resources' on Instance uuid 99707edc-d882-4127-bc26-1fea0a52f9da obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:39:18 compute-0 nova_compute[192810]: 2025-09-30 21:39:18.880 2 WARNING nova.virt.libvirt.driver [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:39:18 compute-0 nova_compute[192810]: 2025-09-30 21:39:18.885 2 DEBUG nova.virt.libvirt.host [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:39:18 compute-0 nova_compute[192810]: 2025-09-30 21:39:18.886 2 DEBUG nova.virt.libvirt.host [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:39:18 compute-0 nova_compute[192810]: 2025-09-30 21:39:18.890 2 DEBUG nova.virt.libvirt.host [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:39:18 compute-0 nova_compute[192810]: 2025-09-30 21:39:18.891 2 DEBUG nova.virt.libvirt.host [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:39:18 compute-0 nova_compute[192810]: 2025-09-30 21:39:18.892 2 DEBUG nova.virt.libvirt.driver [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:39:18 compute-0 nova_compute[192810]: 2025-09-30 21:39:18.893 2 DEBUG nova.virt.hardware [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:39:18 compute-0 nova_compute[192810]: 2025-09-30 21:39:18.893 2 DEBUG nova.virt.hardware [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:39:18 compute-0 nova_compute[192810]: 2025-09-30 21:39:18.893 2 DEBUG nova.virt.hardware [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:39:18 compute-0 nova_compute[192810]: 2025-09-30 21:39:18.894 2 DEBUG nova.virt.hardware [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:39:18 compute-0 nova_compute[192810]: 2025-09-30 21:39:18.894 2 DEBUG nova.virt.hardware [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:39:18 compute-0 nova_compute[192810]: 2025-09-30 21:39:18.894 2 DEBUG nova.virt.hardware [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:39:18 compute-0 nova_compute[192810]: 2025-09-30 21:39:18.894 2 DEBUG nova.virt.hardware [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:39:18 compute-0 nova_compute[192810]: 2025-09-30 21:39:18.894 2 DEBUG nova.virt.hardware [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:39:18 compute-0 nova_compute[192810]: 2025-09-30 21:39:18.895 2 DEBUG nova.virt.hardware [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:39:18 compute-0 nova_compute[192810]: 2025-09-30 21:39:18.895 2 DEBUG nova.virt.hardware [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:39:18 compute-0 nova_compute[192810]: 2025-09-30 21:39:18.895 2 DEBUG nova.virt.hardware [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:39:18 compute-0 nova_compute[192810]: 2025-09-30 21:39:18.895 2 DEBUG nova.objects.instance [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 99707edc-d882-4127-bc26-1fea0a52f9da obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:39:20 compute-0 nova_compute[192810]: 2025-09-30 21:39:20.097 2 DEBUG nova.virt.libvirt.vif [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:38:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-1575903998',display_name='tempest-ServerRescueTestJSON-server-1575903998',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-1575903998',id=124,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:39:00Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9c5899f27a1e496da636c4e31f6867c6',ramdisk_id='',reservation_id='r-605fltlb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk=
'1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSON-1530357589',owner_user_name='tempest-ServerRescueTestJSON-1530357589-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:39:00Z,user_data=None,user_id='0b16febbab784cc7b40a988decaa1db8',uuid=99707edc-d882-4127-bc26-1fea0a52f9da,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e9ffeb79-64b3-4a68-93a9-3d612ba40eb5", "address": "fa:16:3e:e9:08:4f", "network": {"id": "ba09ae5f-6031-4fbb-acc3-d581f9322784", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1364438883-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-1364438883-network", "vif_mac": "fa:16:3e:e9:08:4f"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "9c5899f27a1e496da636c4e31f6867c6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9ffeb79-64", "ovs_interfaceid": "e9ffeb79-64b3-4a68-93a9-3d612ba40eb5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:39:20 compute-0 nova_compute[192810]: 2025-09-30 21:39:20.098 2 DEBUG nova.network.os_vif_util [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Converting VIF {"id": "e9ffeb79-64b3-4a68-93a9-3d612ba40eb5", "address": "fa:16:3e:e9:08:4f", "network": {"id": "ba09ae5f-6031-4fbb-acc3-d581f9322784", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1364438883-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-1364438883-network", "vif_mac": "fa:16:3e:e9:08:4f"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "9c5899f27a1e496da636c4e31f6867c6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9ffeb79-64", "ovs_interfaceid": "e9ffeb79-64b3-4a68-93a9-3d612ba40eb5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:39:20 compute-0 nova_compute[192810]: 2025-09-30 21:39:20.098 2 DEBUG nova.network.os_vif_util [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e9:08:4f,bridge_name='br-int',has_traffic_filtering=True,id=e9ffeb79-64b3-4a68-93a9-3d612ba40eb5,network=Network(ba09ae5f-6031-4fbb-acc3-d581f9322784),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape9ffeb79-64') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:39:20 compute-0 nova_compute[192810]: 2025-09-30 21:39:20.099 2 DEBUG nova.objects.instance [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Lazy-loading 'pci_devices' on Instance uuid 99707edc-d882-4127-bc26-1fea0a52f9da obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:39:20 compute-0 nova_compute[192810]: 2025-09-30 21:39:20.256 2 DEBUG oslo_concurrency.lockutils [None req-d695622f-bcdf-485e-99ca-55bf2be21843 972edc166c9442b1a83983d15a64e8b6 51c02ace4fff44cca028986381d7c407 - - default default] Lock "95188da8-afd4-4fbc-9231-7547201f81ed" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 9.802s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:20 compute-0 nova_compute[192810]: 2025-09-30 21:39:20.734 2 DEBUG nova.compute.manager [req-8522cbd3-b390-4752-a990-5ccc564e8ca2 req-2bfeb109-7241-4f1e-8856-3b973029e828 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Received event network-vif-unplugged-e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:39:20 compute-0 nova_compute[192810]: 2025-09-30 21:39:20.735 2 DEBUG oslo_concurrency.lockutils [req-8522cbd3-b390-4752-a990-5ccc564e8ca2 req-2bfeb109-7241-4f1e-8856-3b973029e828 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "99707edc-d882-4127-bc26-1fea0a52f9da-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:20 compute-0 nova_compute[192810]: 2025-09-30 21:39:20.735 2 DEBUG oslo_concurrency.lockutils [req-8522cbd3-b390-4752-a990-5ccc564e8ca2 req-2bfeb109-7241-4f1e-8856-3b973029e828 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "99707edc-d882-4127-bc26-1fea0a52f9da-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:20 compute-0 nova_compute[192810]: 2025-09-30 21:39:20.736 2 DEBUG oslo_concurrency.lockutils [req-8522cbd3-b390-4752-a990-5ccc564e8ca2 req-2bfeb109-7241-4f1e-8856-3b973029e828 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "99707edc-d882-4127-bc26-1fea0a52f9da-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:20 compute-0 nova_compute[192810]: 2025-09-30 21:39:20.736 2 DEBUG nova.compute.manager [req-8522cbd3-b390-4752-a990-5ccc564e8ca2 req-2bfeb109-7241-4f1e-8856-3b973029e828 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] No waiting events found dispatching network-vif-unplugged-e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:39:20 compute-0 nova_compute[192810]: 2025-09-30 21:39:20.736 2 WARNING nova.compute.manager [req-8522cbd3-b390-4752-a990-5ccc564e8ca2 req-2bfeb109-7241-4f1e-8856-3b973029e828 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Received unexpected event network-vif-unplugged-e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 for instance with vm_state active and task_state rescuing.
Sep 30 21:39:20 compute-0 nova_compute[192810]: 2025-09-30 21:39:20.754 2 DEBUG nova.virt.libvirt.driver [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:39:20 compute-0 nova_compute[192810]:   <uuid>99707edc-d882-4127-bc26-1fea0a52f9da</uuid>
Sep 30 21:39:20 compute-0 nova_compute[192810]:   <name>instance-0000007c</name>
Sep 30 21:39:20 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:39:20 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:39:20 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:39:20 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:       <nova:name>tempest-ServerRescueTestJSON-server-1575903998</nova:name>
Sep 30 21:39:20 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:39:18</nova:creationTime>
Sep 30 21:39:20 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:39:20 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:39:20 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:39:20 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:39:20 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:39:20 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:39:20 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:39:20 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:39:20 compute-0 nova_compute[192810]:         <nova:user uuid="0b16febbab784cc7b40a988decaa1db8">tempest-ServerRescueTestJSON-1530357589-project-member</nova:user>
Sep 30 21:39:20 compute-0 nova_compute[192810]:         <nova:project uuid="9c5899f27a1e496da636c4e31f6867c6">tempest-ServerRescueTestJSON-1530357589</nova:project>
Sep 30 21:39:20 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:39:20 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:39:20 compute-0 nova_compute[192810]:         <nova:port uuid="e9ffeb79-64b3-4a68-93a9-3d612ba40eb5">
Sep 30 21:39:20 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:39:20 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:39:20 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:39:20 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:39:20 compute-0 nova_compute[192810]:     <system>
Sep 30 21:39:20 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:39:20 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:39:20 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:39:20 compute-0 nova_compute[192810]:       <entry name="serial">99707edc-d882-4127-bc26-1fea0a52f9da</entry>
Sep 30 21:39:20 compute-0 nova_compute[192810]:       <entry name="uuid">99707edc-d882-4127-bc26-1fea0a52f9da</entry>
Sep 30 21:39:20 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     </system>
Sep 30 21:39:20 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:39:20 compute-0 nova_compute[192810]:   <os>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:   </os>
Sep 30 21:39:20 compute-0 nova_compute[192810]:   <features>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:   </features>
Sep 30 21:39:20 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:39:20 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:39:20 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:39:20 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:39:20 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:39:20 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/99707edc-d882-4127-bc26-1fea0a52f9da/disk.rescue"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:39:20 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/99707edc-d882-4127-bc26-1fea0a52f9da/disk"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:       <target dev="vdb" bus="virtio"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:39:20 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/99707edc-d882-4127-bc26-1fea0a52f9da/disk.config.rescue"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:39:20 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:e9:08:4f"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:       <target dev="tape9ffeb79-64"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:39:20 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/99707edc-d882-4127-bc26-1fea0a52f9da/console.log" append="off"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     <video>
Sep 30 21:39:20 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     </video>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:39:20 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:39:20 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:39:20 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:39:20 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:39:20 compute-0 nova_compute[192810]: </domain>
Sep 30 21:39:20 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:39:20 compute-0 nova_compute[192810]: 2025-09-30 21:39:20.760 2 INFO nova.virt.libvirt.driver [-] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Instance destroyed successfully.
Sep 30 21:39:20 compute-0 nova_compute[192810]: 2025-09-30 21:39:20.917 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:22 compute-0 nova_compute[192810]: 2025-09-30 21:39:22.070 2 DEBUG nova.virt.libvirt.driver [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:39:22 compute-0 nova_compute[192810]: 2025-09-30 21:39:22.070 2 DEBUG nova.virt.libvirt.driver [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:39:22 compute-0 nova_compute[192810]: 2025-09-30 21:39:22.071 2 DEBUG nova.virt.libvirt.driver [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:39:22 compute-0 nova_compute[192810]: 2025-09-30 21:39:22.071 2 DEBUG nova.virt.libvirt.driver [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] No VIF found with MAC fa:16:3e:e9:08:4f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:39:22 compute-0 nova_compute[192810]: 2025-09-30 21:39:22.072 2 INFO nova.virt.libvirt.driver [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Using config drive
Sep 30 21:39:22 compute-0 nova_compute[192810]: 2025-09-30 21:39:22.344 2 DEBUG nova.objects.instance [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 99707edc-d882-4127-bc26-1fea0a52f9da obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:39:22 compute-0 nova_compute[192810]: 2025-09-30 21:39:22.681 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:23 compute-0 nova_compute[192810]: 2025-09-30 21:39:23.009 2 DEBUG nova.objects.instance [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Lazy-loading 'keypairs' on Instance uuid 99707edc-d882-4127-bc26-1fea0a52f9da obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:39:23 compute-0 nova_compute[192810]: 2025-09-30 21:39:23.232 2 DEBUG nova.compute.manager [req-26c9207c-a75a-42bc-83a3-5b2ed964519c req-40b68b85-5f26-4c7f-b532-143c51d40086 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Received event network-vif-plugged-e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:39:23 compute-0 nova_compute[192810]: 2025-09-30 21:39:23.232 2 DEBUG oslo_concurrency.lockutils [req-26c9207c-a75a-42bc-83a3-5b2ed964519c req-40b68b85-5f26-4c7f-b532-143c51d40086 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "99707edc-d882-4127-bc26-1fea0a52f9da-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:23 compute-0 nova_compute[192810]: 2025-09-30 21:39:23.232 2 DEBUG oslo_concurrency.lockutils [req-26c9207c-a75a-42bc-83a3-5b2ed964519c req-40b68b85-5f26-4c7f-b532-143c51d40086 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "99707edc-d882-4127-bc26-1fea0a52f9da-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:23 compute-0 nova_compute[192810]: 2025-09-30 21:39:23.233 2 DEBUG oslo_concurrency.lockutils [req-26c9207c-a75a-42bc-83a3-5b2ed964519c req-40b68b85-5f26-4c7f-b532-143c51d40086 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "99707edc-d882-4127-bc26-1fea0a52f9da-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:23 compute-0 nova_compute[192810]: 2025-09-30 21:39:23.233 2 DEBUG nova.compute.manager [req-26c9207c-a75a-42bc-83a3-5b2ed964519c req-40b68b85-5f26-4c7f-b532-143c51d40086 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] No waiting events found dispatching network-vif-plugged-e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:39:23 compute-0 nova_compute[192810]: 2025-09-30 21:39:23.233 2 WARNING nova.compute.manager [req-26c9207c-a75a-42bc-83a3-5b2ed964519c req-40b68b85-5f26-4c7f-b532-143c51d40086 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Received unexpected event network-vif-plugged-e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 for instance with vm_state active and task_state rescuing.
Sep 30 21:39:23 compute-0 nova_compute[192810]: 2025-09-30 21:39:23.498 2 INFO nova.virt.libvirt.driver [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Creating config drive at /var/lib/nova/instances/99707edc-d882-4127-bc26-1fea0a52f9da/disk.config.rescue
Sep 30 21:39:23 compute-0 nova_compute[192810]: 2025-09-30 21:39:23.503 2 DEBUG oslo_concurrency.processutils [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/99707edc-d882-4127-bc26-1fea0a52f9da/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0swos8i4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:39:23 compute-0 nova_compute[192810]: 2025-09-30 21:39:23.629 2 DEBUG oslo_concurrency.processutils [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/99707edc-d882-4127-bc26-1fea0a52f9da/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0swos8i4" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:39:23 compute-0 kernel: tape9ffeb79-64: entered promiscuous mode
Sep 30 21:39:23 compute-0 NetworkManager[51733]: <info>  [1759268363.6998] manager: (tape9ffeb79-64): new Tun device (/org/freedesktop/NetworkManager/Devices/213)
Sep 30 21:39:23 compute-0 systemd-udevd[239399]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:39:23 compute-0 ovn_controller[94912]: 2025-09-30T21:39:23Z|00481|binding|INFO|Claiming lport e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 for this chassis.
Sep 30 21:39:23 compute-0 nova_compute[192810]: 2025-09-30 21:39:23.739 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:23 compute-0 ovn_controller[94912]: 2025-09-30T21:39:23Z|00482|binding|INFO|e9ffeb79-64b3-4a68-93a9-3d612ba40eb5: Claiming fa:16:3e:e9:08:4f 10.100.0.8
Sep 30 21:39:23 compute-0 NetworkManager[51733]: <info>  [1759268363.7535] device (tape9ffeb79-64): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:39:23 compute-0 NetworkManager[51733]: <info>  [1759268363.7548] device (tape9ffeb79-64): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:39:23 compute-0 ovn_controller[94912]: 2025-09-30T21:39:23Z|00483|binding|INFO|Setting lport e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 ovn-installed in OVS
Sep 30 21:39:23 compute-0 nova_compute[192810]: 2025-09-30 21:39:23.755 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:23 compute-0 systemd-machined[152794]: New machine qemu-62-instance-0000007c.
Sep 30 21:39:23 compute-0 systemd[1]: Started Virtual Machine qemu-62-instance-0000007c.
Sep 30 21:39:23 compute-0 ovn_controller[94912]: 2025-09-30T21:39:23Z|00484|binding|INFO|Setting lport e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 up in Southbound
Sep 30 21:39:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:23.824 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e9:08:4f 10.100.0.8'], port_security=['fa:16:3e:e9:08:4f 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '99707edc-d882-4127-bc26-1fea0a52f9da', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ba09ae5f-6031-4fbb-acc3-d581f9322784', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9c5899f27a1e496da636c4e31f6867c6', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'f74cc213-8625-4465-9150-d91cf93d6153', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=319a5a77-493a-41ce-8d81-8ffce7598f19, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=e9ffeb79-64b3-4a68-93a9-3d612ba40eb5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:39:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:23.825 103867 INFO neutron.agent.ovn.metadata.agent [-] Port e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 in datapath ba09ae5f-6031-4fbb-acc3-d581f9322784 bound to our chassis
Sep 30 21:39:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:23.826 103867 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network ba09ae5f-6031-4fbb-acc3-d581f9322784 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Sep 30 21:39:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:23.827 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[29db9225-222b-42f0-89e7-a079901f3b7e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:24 compute-0 ovn_controller[94912]: 2025-09-30T21:39:24Z|00048|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:74:27:e0 10.100.0.7
Sep 30 21:39:24 compute-0 ovn_controller[94912]: 2025-09-30T21:39:24Z|00049|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:74:27:e0 10.100.0.7
Sep 30 21:39:25 compute-0 nova_compute[192810]: 2025-09-30 21:39:25.137 2 DEBUG nova.virt.libvirt.host [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Removed pending event for 99707edc-d882-4127-bc26-1fea0a52f9da due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Sep 30 21:39:25 compute-0 nova_compute[192810]: 2025-09-30 21:39:25.138 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268365.1363077, 99707edc-d882-4127-bc26-1fea0a52f9da => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:39:25 compute-0 nova_compute[192810]: 2025-09-30 21:39:25.139 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] VM Resumed (Lifecycle Event)
Sep 30 21:39:25 compute-0 nova_compute[192810]: 2025-09-30 21:39:25.172 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:39:25 compute-0 nova_compute[192810]: 2025-09-30 21:39:25.180 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:39:25 compute-0 nova_compute[192810]: 2025-09-30 21:39:25.189 2 DEBUG nova.compute.manager [None req-c88fbc45-c8c8-48b4-afbd-0239371de102 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:39:25 compute-0 nova_compute[192810]: 2025-09-30 21:39:25.222 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] During sync_power_state the instance has a pending task (rescuing). Skip.
Sep 30 21:39:25 compute-0 nova_compute[192810]: 2025-09-30 21:39:25.223 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268365.1383655, 99707edc-d882-4127-bc26-1fea0a52f9da => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:39:25 compute-0 nova_compute[192810]: 2025-09-30 21:39:25.223 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] VM Started (Lifecycle Event)
Sep 30 21:39:25 compute-0 nova_compute[192810]: 2025-09-30 21:39:25.321 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:39:25 compute-0 nova_compute[192810]: 2025-09-30 21:39:25.329 2 DEBUG nova.compute.manager [req-a71262bf-aadc-4bc8-8e66-3be0279d75ad req-000c9a35-471a-4b6f-b0bf-29791afa15bf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Received event network-vif-plugged-e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:39:25 compute-0 nova_compute[192810]: 2025-09-30 21:39:25.329 2 DEBUG oslo_concurrency.lockutils [req-a71262bf-aadc-4bc8-8e66-3be0279d75ad req-000c9a35-471a-4b6f-b0bf-29791afa15bf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "99707edc-d882-4127-bc26-1fea0a52f9da-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:25 compute-0 nova_compute[192810]: 2025-09-30 21:39:25.331 2 DEBUG oslo_concurrency.lockutils [req-a71262bf-aadc-4bc8-8e66-3be0279d75ad req-000c9a35-471a-4b6f-b0bf-29791afa15bf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "99707edc-d882-4127-bc26-1fea0a52f9da-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:25 compute-0 nova_compute[192810]: 2025-09-30 21:39:25.332 2 DEBUG oslo_concurrency.lockutils [req-a71262bf-aadc-4bc8-8e66-3be0279d75ad req-000c9a35-471a-4b6f-b0bf-29791afa15bf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "99707edc-d882-4127-bc26-1fea0a52f9da-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:25 compute-0 nova_compute[192810]: 2025-09-30 21:39:25.332 2 DEBUG nova.compute.manager [req-a71262bf-aadc-4bc8-8e66-3be0279d75ad req-000c9a35-471a-4b6f-b0bf-29791afa15bf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] No waiting events found dispatching network-vif-plugged-e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:39:25 compute-0 nova_compute[192810]: 2025-09-30 21:39:25.332 2 WARNING nova.compute.manager [req-a71262bf-aadc-4bc8-8e66-3be0279d75ad req-000c9a35-471a-4b6f-b0bf-29791afa15bf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Received unexpected event network-vif-plugged-e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 for instance with vm_state active and task_state rescuing.
Sep 30 21:39:25 compute-0 nova_compute[192810]: 2025-09-30 21:39:25.333 2 DEBUG nova.compute.manager [req-a71262bf-aadc-4bc8-8e66-3be0279d75ad req-000c9a35-471a-4b6f-b0bf-29791afa15bf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Received event network-vif-plugged-e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:39:25 compute-0 nova_compute[192810]: 2025-09-30 21:39:25.333 2 DEBUG oslo_concurrency.lockutils [req-a71262bf-aadc-4bc8-8e66-3be0279d75ad req-000c9a35-471a-4b6f-b0bf-29791afa15bf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "99707edc-d882-4127-bc26-1fea0a52f9da-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:25 compute-0 nova_compute[192810]: 2025-09-30 21:39:25.334 2 DEBUG oslo_concurrency.lockutils [req-a71262bf-aadc-4bc8-8e66-3be0279d75ad req-000c9a35-471a-4b6f-b0bf-29791afa15bf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "99707edc-d882-4127-bc26-1fea0a52f9da-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:25 compute-0 nova_compute[192810]: 2025-09-30 21:39:25.334 2 DEBUG oslo_concurrency.lockutils [req-a71262bf-aadc-4bc8-8e66-3be0279d75ad req-000c9a35-471a-4b6f-b0bf-29791afa15bf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "99707edc-d882-4127-bc26-1fea0a52f9da-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:25 compute-0 nova_compute[192810]: 2025-09-30 21:39:25.335 2 DEBUG nova.compute.manager [req-a71262bf-aadc-4bc8-8e66-3be0279d75ad req-000c9a35-471a-4b6f-b0bf-29791afa15bf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] No waiting events found dispatching network-vif-plugged-e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:39:25 compute-0 nova_compute[192810]: 2025-09-30 21:39:25.335 2 WARNING nova.compute.manager [req-a71262bf-aadc-4bc8-8e66-3be0279d75ad req-000c9a35-471a-4b6f-b0bf-29791afa15bf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Received unexpected event network-vif-plugged-e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 for instance with vm_state active and task_state rescuing.
Sep 30 21:39:25 compute-0 nova_compute[192810]: 2025-09-30 21:39:25.337 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:39:25 compute-0 nova_compute[192810]: 2025-09-30 21:39:25.872 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268350.8713484, 95188da8-afd4-4fbc-9231-7547201f81ed => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:39:25 compute-0 nova_compute[192810]: 2025-09-30 21:39:25.872 2 INFO nova.compute.manager [-] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] VM Stopped (Lifecycle Event)
Sep 30 21:39:25 compute-0 nova_compute[192810]: 2025-09-30 21:39:25.921 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:25 compute-0 nova_compute[192810]: 2025-09-30 21:39:25.960 2 DEBUG nova.compute.manager [None req-2b55ab2c-00cc-4c54-ae1d-f09e4a158ef7 - - - - - -] [instance: 95188da8-afd4-4fbc-9231-7547201f81ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:39:26 compute-0 nova_compute[192810]: 2025-09-30 21:39:26.855 2 INFO nova.compute.manager [None req-80f66f10-c701-4665-be89-95a1b904face 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Unrescuing
Sep 30 21:39:26 compute-0 nova_compute[192810]: 2025-09-30 21:39:26.856 2 DEBUG oslo_concurrency.lockutils [None req-80f66f10-c701-4665-be89-95a1b904face 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Acquiring lock "refresh_cache-99707edc-d882-4127-bc26-1fea0a52f9da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:39:26 compute-0 nova_compute[192810]: 2025-09-30 21:39:26.856 2 DEBUG oslo_concurrency.lockutils [None req-80f66f10-c701-4665-be89-95a1b904face 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Acquired lock "refresh_cache-99707edc-d882-4127-bc26-1fea0a52f9da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:39:26 compute-0 nova_compute[192810]: 2025-09-30 21:39:26.856 2 DEBUG nova.network.neutron [None req-80f66f10-c701-4665-be89-95a1b904face 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:39:27 compute-0 nova_compute[192810]: 2025-09-30 21:39:27.685 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:28 compute-0 nova_compute[192810]: 2025-09-30 21:39:28.300 2 DEBUG nova.network.neutron [None req-80f66f10-c701-4665-be89-95a1b904face 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Updating instance_info_cache with network_info: [{"id": "e9ffeb79-64b3-4a68-93a9-3d612ba40eb5", "address": "fa:16:3e:e9:08:4f", "network": {"id": "ba09ae5f-6031-4fbb-acc3-d581f9322784", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1364438883-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "9c5899f27a1e496da636c4e31f6867c6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9ffeb79-64", "ovs_interfaceid": "e9ffeb79-64b3-4a68-93a9-3d612ba40eb5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:39:28 compute-0 nova_compute[192810]: 2025-09-30 21:39:28.315 2 DEBUG oslo_concurrency.lockutils [None req-80f66f10-c701-4665-be89-95a1b904face 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Releasing lock "refresh_cache-99707edc-d882-4127-bc26-1fea0a52f9da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:39:28 compute-0 nova_compute[192810]: 2025-09-30 21:39:28.316 2 DEBUG nova.objects.instance [None req-80f66f10-c701-4665-be89-95a1b904face 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Lazy-loading 'flavor' on Instance uuid 99707edc-d882-4127-bc26-1fea0a52f9da obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:39:28 compute-0 kernel: tape9ffeb79-64 (unregistering): left promiscuous mode
Sep 30 21:39:28 compute-0 NetworkManager[51733]: <info>  [1759268368.4310] device (tape9ffeb79-64): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:39:28 compute-0 ovn_controller[94912]: 2025-09-30T21:39:28Z|00485|binding|INFO|Releasing lport e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 from this chassis (sb_readonly=0)
Sep 30 21:39:28 compute-0 ovn_controller[94912]: 2025-09-30T21:39:28Z|00486|binding|INFO|Setting lport e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 down in Southbound
Sep 30 21:39:28 compute-0 nova_compute[192810]: 2025-09-30 21:39:28.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:28 compute-0 ovn_controller[94912]: 2025-09-30T21:39:28Z|00487|binding|INFO|Removing iface tape9ffeb79-64 ovn-installed in OVS
Sep 30 21:39:28 compute-0 nova_compute[192810]: 2025-09-30 21:39:28.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:28 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:28.447 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e9:08:4f 10.100.0.8'], port_security=['fa:16:3e:e9:08:4f 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '99707edc-d882-4127-bc26-1fea0a52f9da', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ba09ae5f-6031-4fbb-acc3-d581f9322784', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9c5899f27a1e496da636c4e31f6867c6', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'f74cc213-8625-4465-9150-d91cf93d6153', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=319a5a77-493a-41ce-8d81-8ffce7598f19, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=e9ffeb79-64b3-4a68-93a9-3d612ba40eb5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:39:28 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:28.448 103867 INFO neutron.agent.ovn.metadata.agent [-] Port e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 in datapath ba09ae5f-6031-4fbb-acc3-d581f9322784 unbound from our chassis
Sep 30 21:39:28 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:28.449 103867 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network ba09ae5f-6031-4fbb-acc3-d581f9322784 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Sep 30 21:39:28 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:28.450 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[2a15c18c-9fe6-4e85-8093-99b8370dd69d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:28 compute-0 nova_compute[192810]: 2025-09-30 21:39:28.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:28 compute-0 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d0000007c.scope: Deactivated successfully.
Sep 30 21:39:28 compute-0 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d0000007c.scope: Consumed 4.451s CPU time.
Sep 30 21:39:28 compute-0 systemd-machined[152794]: Machine qemu-62-instance-0000007c terminated.
Sep 30 21:39:28 compute-0 kernel: tape9ffeb79-64: entered promiscuous mode
Sep 30 21:39:28 compute-0 kernel: tape9ffeb79-64 (unregistering): left promiscuous mode
Sep 30 21:39:28 compute-0 NetworkManager[51733]: <info>  [1759268368.5682] manager: (tape9ffeb79-64): new Tun device (/org/freedesktop/NetworkManager/Devices/214)
Sep 30 21:39:28 compute-0 nova_compute[192810]: 2025-09-30 21:39:28.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:28 compute-0 nova_compute[192810]: 2025-09-30 21:39:28.628 2 INFO nova.virt.libvirt.driver [-] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Instance destroyed successfully.
Sep 30 21:39:28 compute-0 nova_compute[192810]: 2025-09-30 21:39:28.629 2 DEBUG nova.objects.instance [None req-80f66f10-c701-4665-be89-95a1b904face 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Lazy-loading 'numa_topology' on Instance uuid 99707edc-d882-4127-bc26-1fea0a52f9da obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:39:28 compute-0 NetworkManager[51733]: <info>  [1759268368.7965] manager: (tape9ffeb79-64): new Tun device (/org/freedesktop/NetworkManager/Devices/215)
Sep 30 21:39:28 compute-0 systemd-udevd[239420]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:39:28 compute-0 kernel: tape9ffeb79-64: entered promiscuous mode
Sep 30 21:39:28 compute-0 ovn_controller[94912]: 2025-09-30T21:39:28Z|00488|binding|INFO|Claiming lport e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 for this chassis.
Sep 30 21:39:28 compute-0 ovn_controller[94912]: 2025-09-30T21:39:28Z|00489|binding|INFO|e9ffeb79-64b3-4a68-93a9-3d612ba40eb5: Claiming fa:16:3e:e9:08:4f 10.100.0.8
Sep 30 21:39:28 compute-0 nova_compute[192810]: 2025-09-30 21:39:28.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:28 compute-0 nova_compute[192810]: 2025-09-30 21:39:28.812 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:28 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:28.817 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e9:08:4f 10.100.0.8'], port_security=['fa:16:3e:e9:08:4f 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '99707edc-d882-4127-bc26-1fea0a52f9da', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ba09ae5f-6031-4fbb-acc3-d581f9322784', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9c5899f27a1e496da636c4e31f6867c6', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'f74cc213-8625-4465-9150-d91cf93d6153', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=319a5a77-493a-41ce-8d81-8ffce7598f19, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=e9ffeb79-64b3-4a68-93a9-3d612ba40eb5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:39:28 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:28.818 103867 INFO neutron.agent.ovn.metadata.agent [-] Port e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 in datapath ba09ae5f-6031-4fbb-acc3-d581f9322784 bound to our chassis
Sep 30 21:39:28 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:28.818 103867 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network ba09ae5f-6031-4fbb-acc3-d581f9322784 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Sep 30 21:39:28 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:28.819 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[4f69bdc3-cb7f-44f3-bcbc-982de618cd65]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:28 compute-0 NetworkManager[51733]: <info>  [1759268368.8335] device (tape9ffeb79-64): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:39:28 compute-0 NetworkManager[51733]: <info>  [1759268368.8341] device (tape9ffeb79-64): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:39:28 compute-0 ovn_controller[94912]: 2025-09-30T21:39:28Z|00490|binding|INFO|Setting lport e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 up in Southbound
Sep 30 21:39:28 compute-0 nova_compute[192810]: 2025-09-30 21:39:28.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:28 compute-0 ovn_controller[94912]: 2025-09-30T21:39:28Z|00491|binding|INFO|Setting lport e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 ovn-installed in OVS
Sep 30 21:39:28 compute-0 nova_compute[192810]: 2025-09-30 21:39:28.838 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:28 compute-0 systemd-machined[152794]: New machine qemu-63-instance-0000007c.
Sep 30 21:39:28 compute-0 systemd[1]: Started Virtual Machine qemu-63-instance-0000007c.
Sep 30 21:39:29 compute-0 nova_compute[192810]: 2025-09-30 21:39:29.437 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:39:29 compute-0 nova_compute[192810]: 2025-09-30 21:39:29.809 2 DEBUG nova.compute.manager [req-38585820-1365-4410-b210-1702d78f2e0b req-b076970e-5b55-4433-8736-716ecfdf5310 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Received event network-vif-unplugged-e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:39:29 compute-0 nova_compute[192810]: 2025-09-30 21:39:29.810 2 DEBUG oslo_concurrency.lockutils [req-38585820-1365-4410-b210-1702d78f2e0b req-b076970e-5b55-4433-8736-716ecfdf5310 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "99707edc-d882-4127-bc26-1fea0a52f9da-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:29 compute-0 nova_compute[192810]: 2025-09-30 21:39:29.810 2 DEBUG oslo_concurrency.lockutils [req-38585820-1365-4410-b210-1702d78f2e0b req-b076970e-5b55-4433-8736-716ecfdf5310 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "99707edc-d882-4127-bc26-1fea0a52f9da-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:29 compute-0 nova_compute[192810]: 2025-09-30 21:39:29.810 2 DEBUG oslo_concurrency.lockutils [req-38585820-1365-4410-b210-1702d78f2e0b req-b076970e-5b55-4433-8736-716ecfdf5310 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "99707edc-d882-4127-bc26-1fea0a52f9da-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:29 compute-0 nova_compute[192810]: 2025-09-30 21:39:29.810 2 DEBUG nova.compute.manager [req-38585820-1365-4410-b210-1702d78f2e0b req-b076970e-5b55-4433-8736-716ecfdf5310 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] No waiting events found dispatching network-vif-unplugged-e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:39:29 compute-0 nova_compute[192810]: 2025-09-30 21:39:29.810 2 WARNING nova.compute.manager [req-38585820-1365-4410-b210-1702d78f2e0b req-b076970e-5b55-4433-8736-716ecfdf5310 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Received unexpected event network-vif-unplugged-e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 for instance with vm_state rescued and task_state unrescuing.
Sep 30 21:39:29 compute-0 nova_compute[192810]: 2025-09-30 21:39:29.810 2 DEBUG nova.compute.manager [req-38585820-1365-4410-b210-1702d78f2e0b req-b076970e-5b55-4433-8736-716ecfdf5310 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Received event network-vif-plugged-e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:39:29 compute-0 nova_compute[192810]: 2025-09-30 21:39:29.811 2 DEBUG oslo_concurrency.lockutils [req-38585820-1365-4410-b210-1702d78f2e0b req-b076970e-5b55-4433-8736-716ecfdf5310 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "99707edc-d882-4127-bc26-1fea0a52f9da-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:29 compute-0 nova_compute[192810]: 2025-09-30 21:39:29.811 2 DEBUG oslo_concurrency.lockutils [req-38585820-1365-4410-b210-1702d78f2e0b req-b076970e-5b55-4433-8736-716ecfdf5310 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "99707edc-d882-4127-bc26-1fea0a52f9da-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:29 compute-0 nova_compute[192810]: 2025-09-30 21:39:29.811 2 DEBUG oslo_concurrency.lockutils [req-38585820-1365-4410-b210-1702d78f2e0b req-b076970e-5b55-4433-8736-716ecfdf5310 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "99707edc-d882-4127-bc26-1fea0a52f9da-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:29 compute-0 nova_compute[192810]: 2025-09-30 21:39:29.811 2 DEBUG nova.compute.manager [req-38585820-1365-4410-b210-1702d78f2e0b req-b076970e-5b55-4433-8736-716ecfdf5310 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] No waiting events found dispatching network-vif-plugged-e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:39:29 compute-0 nova_compute[192810]: 2025-09-30 21:39:29.811 2 WARNING nova.compute.manager [req-38585820-1365-4410-b210-1702d78f2e0b req-b076970e-5b55-4433-8736-716ecfdf5310 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Received unexpected event network-vif-plugged-e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 for instance with vm_state rescued and task_state unrescuing.
Sep 30 21:39:29 compute-0 nova_compute[192810]: 2025-09-30 21:39:29.811 2 DEBUG nova.compute.manager [req-38585820-1365-4410-b210-1702d78f2e0b req-b076970e-5b55-4433-8736-716ecfdf5310 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Received event network-vif-plugged-e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:39:29 compute-0 nova_compute[192810]: 2025-09-30 21:39:29.812 2 DEBUG oslo_concurrency.lockutils [req-38585820-1365-4410-b210-1702d78f2e0b req-b076970e-5b55-4433-8736-716ecfdf5310 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "99707edc-d882-4127-bc26-1fea0a52f9da-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:29 compute-0 nova_compute[192810]: 2025-09-30 21:39:29.812 2 DEBUG oslo_concurrency.lockutils [req-38585820-1365-4410-b210-1702d78f2e0b req-b076970e-5b55-4433-8736-716ecfdf5310 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "99707edc-d882-4127-bc26-1fea0a52f9da-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:29 compute-0 nova_compute[192810]: 2025-09-30 21:39:29.812 2 DEBUG oslo_concurrency.lockutils [req-38585820-1365-4410-b210-1702d78f2e0b req-b076970e-5b55-4433-8736-716ecfdf5310 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "99707edc-d882-4127-bc26-1fea0a52f9da-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:29 compute-0 nova_compute[192810]: 2025-09-30 21:39:29.812 2 DEBUG nova.compute.manager [req-38585820-1365-4410-b210-1702d78f2e0b req-b076970e-5b55-4433-8736-716ecfdf5310 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] No waiting events found dispatching network-vif-plugged-e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:39:29 compute-0 nova_compute[192810]: 2025-09-30 21:39:29.812 2 WARNING nova.compute.manager [req-38585820-1365-4410-b210-1702d78f2e0b req-b076970e-5b55-4433-8736-716ecfdf5310 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Received unexpected event network-vif-plugged-e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 for instance with vm_state rescued and task_state unrescuing.
Sep 30 21:39:30 compute-0 nova_compute[192810]: 2025-09-30 21:39:30.090 2 DEBUG nova.virt.libvirt.host [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Removed pending event for 99707edc-d882-4127-bc26-1fea0a52f9da due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Sep 30 21:39:30 compute-0 nova_compute[192810]: 2025-09-30 21:39:30.090 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268370.0901296, 99707edc-d882-4127-bc26-1fea0a52f9da => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:39:30 compute-0 nova_compute[192810]: 2025-09-30 21:39:30.091 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] VM Resumed (Lifecycle Event)
Sep 30 21:39:30 compute-0 nova_compute[192810]: 2025-09-30 21:39:30.095 2 DEBUG nova.compute.manager [None req-80f66f10-c701-4665-be89-95a1b904face 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:39:30 compute-0 nova_compute[192810]: 2025-09-30 21:39:30.119 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:39:30 compute-0 nova_compute[192810]: 2025-09-30 21:39:30.122 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:39:30 compute-0 nova_compute[192810]: 2025-09-30 21:39:30.151 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] During sync_power_state the instance has a pending task (unrescuing). Skip.
Sep 30 21:39:30 compute-0 nova_compute[192810]: 2025-09-30 21:39:30.152 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268370.0933113, 99707edc-d882-4127-bc26-1fea0a52f9da => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:39:30 compute-0 nova_compute[192810]: 2025-09-30 21:39:30.152 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] VM Started (Lifecycle Event)
Sep 30 21:39:30 compute-0 nova_compute[192810]: 2025-09-30 21:39:30.169 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:39:30 compute-0 nova_compute[192810]: 2025-09-30 21:39:30.174 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:39:30 compute-0 nova_compute[192810]: 2025-09-30 21:39:30.771 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:30.772 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:39:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:30.773 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:39:30 compute-0 nova_compute[192810]: 2025-09-30 21:39:30.923 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:31 compute-0 nova_compute[192810]: 2025-09-30 21:39:31.908 2 DEBUG nova.compute.manager [req-9a39c053-9049-46d0-9e79-4d5022929a96 req-1d071df7-8556-4fd2-9fd5-6f9204156037 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Received event network-vif-plugged-e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:39:31 compute-0 nova_compute[192810]: 2025-09-30 21:39:31.909 2 DEBUG oslo_concurrency.lockutils [req-9a39c053-9049-46d0-9e79-4d5022929a96 req-1d071df7-8556-4fd2-9fd5-6f9204156037 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "99707edc-d882-4127-bc26-1fea0a52f9da-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:31 compute-0 nova_compute[192810]: 2025-09-30 21:39:31.909 2 DEBUG oslo_concurrency.lockutils [req-9a39c053-9049-46d0-9e79-4d5022929a96 req-1d071df7-8556-4fd2-9fd5-6f9204156037 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "99707edc-d882-4127-bc26-1fea0a52f9da-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:31 compute-0 nova_compute[192810]: 2025-09-30 21:39:31.909 2 DEBUG oslo_concurrency.lockutils [req-9a39c053-9049-46d0-9e79-4d5022929a96 req-1d071df7-8556-4fd2-9fd5-6f9204156037 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "99707edc-d882-4127-bc26-1fea0a52f9da-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:31 compute-0 nova_compute[192810]: 2025-09-30 21:39:31.909 2 DEBUG nova.compute.manager [req-9a39c053-9049-46d0-9e79-4d5022929a96 req-1d071df7-8556-4fd2-9fd5-6f9204156037 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] No waiting events found dispatching network-vif-plugged-e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:39:31 compute-0 nova_compute[192810]: 2025-09-30 21:39:31.910 2 WARNING nova.compute.manager [req-9a39c053-9049-46d0-9e79-4d5022929a96 req-1d071df7-8556-4fd2-9fd5-6f9204156037 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Received unexpected event network-vif-plugged-e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 for instance with vm_state active and task_state None.
Sep 30 21:39:32 compute-0 nova_compute[192810]: 2025-09-30 21:39:32.026 2 DEBUG oslo_concurrency.lockutils [None req-a4720564-a673-4335-a8bb-4f574343649e 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Acquiring lock "99707edc-d882-4127-bc26-1fea0a52f9da" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:32 compute-0 nova_compute[192810]: 2025-09-30 21:39:32.027 2 DEBUG oslo_concurrency.lockutils [None req-a4720564-a673-4335-a8bb-4f574343649e 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Lock "99707edc-d882-4127-bc26-1fea0a52f9da" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:32 compute-0 nova_compute[192810]: 2025-09-30 21:39:32.027 2 DEBUG oslo_concurrency.lockutils [None req-a4720564-a673-4335-a8bb-4f574343649e 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Acquiring lock "99707edc-d882-4127-bc26-1fea0a52f9da-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:32 compute-0 nova_compute[192810]: 2025-09-30 21:39:32.028 2 DEBUG oslo_concurrency.lockutils [None req-a4720564-a673-4335-a8bb-4f574343649e 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Lock "99707edc-d882-4127-bc26-1fea0a52f9da-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:32 compute-0 nova_compute[192810]: 2025-09-30 21:39:32.028 2 DEBUG oslo_concurrency.lockutils [None req-a4720564-a673-4335-a8bb-4f574343649e 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Lock "99707edc-d882-4127-bc26-1fea0a52f9da-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:32 compute-0 nova_compute[192810]: 2025-09-30 21:39:32.043 2 INFO nova.compute.manager [None req-a4720564-a673-4335-a8bb-4f574343649e 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Terminating instance
Sep 30 21:39:32 compute-0 nova_compute[192810]: 2025-09-30 21:39:32.057 2 DEBUG nova.compute.manager [None req-a4720564-a673-4335-a8bb-4f574343649e 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:39:32 compute-0 kernel: tape9ffeb79-64 (unregistering): left promiscuous mode
Sep 30 21:39:32 compute-0 NetworkManager[51733]: <info>  [1759268372.0985] device (tape9ffeb79-64): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:39:32 compute-0 nova_compute[192810]: 2025-09-30 21:39:32.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:32 compute-0 ovn_controller[94912]: 2025-09-30T21:39:32Z|00492|binding|INFO|Releasing lport e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 from this chassis (sb_readonly=0)
Sep 30 21:39:32 compute-0 ovn_controller[94912]: 2025-09-30T21:39:32Z|00493|binding|INFO|Setting lport e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 down in Southbound
Sep 30 21:39:32 compute-0 ovn_controller[94912]: 2025-09-30T21:39:32Z|00494|binding|INFO|Removing iface tape9ffeb79-64 ovn-installed in OVS
Sep 30 21:39:32 compute-0 nova_compute[192810]: 2025-09-30 21:39:32.110 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:32 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:32.119 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e9:08:4f 10.100.0.8'], port_security=['fa:16:3e:e9:08:4f 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '99707edc-d882-4127-bc26-1fea0a52f9da', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ba09ae5f-6031-4fbb-acc3-d581f9322784', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9c5899f27a1e496da636c4e31f6867c6', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'f74cc213-8625-4465-9150-d91cf93d6153', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=319a5a77-493a-41ce-8d81-8ffce7598f19, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=e9ffeb79-64b3-4a68-93a9-3d612ba40eb5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:39:32 compute-0 nova_compute[192810]: 2025-09-30 21:39:32.121 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:32 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:32.120 103867 INFO neutron.agent.ovn.metadata.agent [-] Port e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 in datapath ba09ae5f-6031-4fbb-acc3-d581f9322784 unbound from our chassis
Sep 30 21:39:32 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:32.121 103867 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network ba09ae5f-6031-4fbb-acc3-d581f9322784 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Sep 30 21:39:32 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:32.122 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[7f3fa680-9087-46af-ae57-dbbf34550990]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:32 compute-0 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d0000007c.scope: Deactivated successfully.
Sep 30 21:39:32 compute-0 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d0000007c.scope: Consumed 3.068s CPU time.
Sep 30 21:39:32 compute-0 systemd-machined[152794]: Machine qemu-63-instance-0000007c terminated.
Sep 30 21:39:32 compute-0 podman[239476]: 2025-09-30 21:39:32.201578868 +0000 UTC m=+0.065429232 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20250923, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:39:32 compute-0 podman[239477]: 2025-09-30 21:39:32.208720346 +0000 UTC m=+0.075286768 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Sep 30 21:39:32 compute-0 podman[239473]: 2025-09-30 21:39:32.223067113 +0000 UTC m=+0.096197358 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS)
Sep 30 21:39:32 compute-0 nova_compute[192810]: 2025-09-30 21:39:32.275 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:32 compute-0 nova_compute[192810]: 2025-09-30 21:39:32.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:32 compute-0 nova_compute[192810]: 2025-09-30 21:39:32.314 2 INFO nova.virt.libvirt.driver [-] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Instance destroyed successfully.
Sep 30 21:39:32 compute-0 nova_compute[192810]: 2025-09-30 21:39:32.314 2 DEBUG nova.objects.instance [None req-a4720564-a673-4335-a8bb-4f574343649e 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Lazy-loading 'resources' on Instance uuid 99707edc-d882-4127-bc26-1fea0a52f9da obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:39:32 compute-0 nova_compute[192810]: 2025-09-30 21:39:32.332 2 DEBUG nova.virt.libvirt.vif [None req-a4720564-a673-4335-a8bb-4f574343649e 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:38:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-1575903998',display_name='tempest-ServerRescueTestJSON-server-1575903998',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-1575903998',id=124,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:39:25Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9c5899f27a1e496da636c4e31f6867c6',ramdisk_id='',reservation_id='r-605fltlb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSON-1530357589',owner_user_name='tempest-ServerRescueTestJSON-1530357589-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:39:30Z,user_data=None,user_id='0b16febbab784cc7b40a988decaa1db8',uuid=99707edc-d882-4127-bc26-1fea0a52f9da,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e9ffeb79-64b3-4a68-93a9-3d612ba40eb5", "address": "fa:16:3e:e9:08:4f", "network": {"id": "ba09ae5f-6031-4fbb-acc3-d581f9322784", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1364438883-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "9c5899f27a1e496da636c4e31f6867c6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9ffeb79-64", "ovs_interfaceid": "e9ffeb79-64b3-4a68-93a9-3d612ba40eb5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:39:32 compute-0 nova_compute[192810]: 2025-09-30 21:39:32.332 2 DEBUG nova.network.os_vif_util [None req-a4720564-a673-4335-a8bb-4f574343649e 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Converting VIF {"id": "e9ffeb79-64b3-4a68-93a9-3d612ba40eb5", "address": "fa:16:3e:e9:08:4f", "network": {"id": "ba09ae5f-6031-4fbb-acc3-d581f9322784", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1364438883-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "9c5899f27a1e496da636c4e31f6867c6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9ffeb79-64", "ovs_interfaceid": "e9ffeb79-64b3-4a68-93a9-3d612ba40eb5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:39:32 compute-0 nova_compute[192810]: 2025-09-30 21:39:32.333 2 DEBUG nova.network.os_vif_util [None req-a4720564-a673-4335-a8bb-4f574343649e 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e9:08:4f,bridge_name='br-int',has_traffic_filtering=True,id=e9ffeb79-64b3-4a68-93a9-3d612ba40eb5,network=Network(ba09ae5f-6031-4fbb-acc3-d581f9322784),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape9ffeb79-64') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:39:32 compute-0 nova_compute[192810]: 2025-09-30 21:39:32.334 2 DEBUG os_vif [None req-a4720564-a673-4335-a8bb-4f574343649e 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e9:08:4f,bridge_name='br-int',has_traffic_filtering=True,id=e9ffeb79-64b3-4a68-93a9-3d612ba40eb5,network=Network(ba09ae5f-6031-4fbb-acc3-d581f9322784),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape9ffeb79-64') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:39:32 compute-0 nova_compute[192810]: 2025-09-30 21:39:32.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:32 compute-0 nova_compute[192810]: 2025-09-30 21:39:32.337 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape9ffeb79-64, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:39:32 compute-0 nova_compute[192810]: 2025-09-30 21:39:32.339 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:32 compute-0 nova_compute[192810]: 2025-09-30 21:39:32.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:32 compute-0 nova_compute[192810]: 2025-09-30 21:39:32.343 2 INFO os_vif [None req-a4720564-a673-4335-a8bb-4f574343649e 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e9:08:4f,bridge_name='br-int',has_traffic_filtering=True,id=e9ffeb79-64b3-4a68-93a9-3d612ba40eb5,network=Network(ba09ae5f-6031-4fbb-acc3-d581f9322784),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape9ffeb79-64')
Sep 30 21:39:32 compute-0 nova_compute[192810]: 2025-09-30 21:39:32.344 2 INFO nova.virt.libvirt.driver [None req-a4720564-a673-4335-a8bb-4f574343649e 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Deleting instance files /var/lib/nova/instances/99707edc-d882-4127-bc26-1fea0a52f9da_del
Sep 30 21:39:32 compute-0 nova_compute[192810]: 2025-09-30 21:39:32.345 2 INFO nova.virt.libvirt.driver [None req-a4720564-a673-4335-a8bb-4f574343649e 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Deletion of /var/lib/nova/instances/99707edc-d882-4127-bc26-1fea0a52f9da_del complete
Sep 30 21:39:32 compute-0 nova_compute[192810]: 2025-09-30 21:39:32.455 2 INFO nova.compute.manager [None req-a4720564-a673-4335-a8bb-4f574343649e 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Took 0.40 seconds to destroy the instance on the hypervisor.
Sep 30 21:39:32 compute-0 nova_compute[192810]: 2025-09-30 21:39:32.456 2 DEBUG oslo.service.loopingcall [None req-a4720564-a673-4335-a8bb-4f574343649e 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:39:32 compute-0 nova_compute[192810]: 2025-09-30 21:39:32.456 2 DEBUG nova.compute.manager [-] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:39:32 compute-0 nova_compute[192810]: 2025-09-30 21:39:32.456 2 DEBUG nova.network.neutron [-] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:39:32 compute-0 nova_compute[192810]: 2025-09-30 21:39:32.685 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:33 compute-0 nova_compute[192810]: 2025-09-30 21:39:33.174 2 DEBUG nova.network.neutron [-] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:39:33 compute-0 nova_compute[192810]: 2025-09-30 21:39:33.193 2 INFO nova.compute.manager [-] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Took 0.74 seconds to deallocate network for instance.
Sep 30 21:39:33 compute-0 nova_compute[192810]: 2025-09-30 21:39:33.241 2 DEBUG nova.compute.manager [req-3a1fb7f1-20e6-4498-9e7b-534ea220baf3 req-f0bfae32-5656-4306-b56f-d60f236f44f3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Received event network-vif-deleted-e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:39:33 compute-0 nova_compute[192810]: 2025-09-30 21:39:33.318 2 DEBUG oslo_concurrency.lockutils [None req-a4720564-a673-4335-a8bb-4f574343649e 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:33 compute-0 nova_compute[192810]: 2025-09-30 21:39:33.319 2 DEBUG oslo_concurrency.lockutils [None req-a4720564-a673-4335-a8bb-4f574343649e 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:33 compute-0 nova_compute[192810]: 2025-09-30 21:39:33.395 2 DEBUG nova.compute.provider_tree [None req-a4720564-a673-4335-a8bb-4f574343649e 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:39:33 compute-0 nova_compute[192810]: 2025-09-30 21:39:33.412 2 DEBUG nova.scheduler.client.report [None req-a4720564-a673-4335-a8bb-4f574343649e 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:39:33 compute-0 nova_compute[192810]: 2025-09-30 21:39:33.436 2 DEBUG oslo_concurrency.lockutils [None req-a4720564-a673-4335-a8bb-4f574343649e 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:33 compute-0 nova_compute[192810]: 2025-09-30 21:39:33.468 2 INFO nova.scheduler.client.report [None req-a4720564-a673-4335-a8bb-4f574343649e 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Deleted allocations for instance 99707edc-d882-4127-bc26-1fea0a52f9da
Sep 30 21:39:33 compute-0 nova_compute[192810]: 2025-09-30 21:39:33.565 2 DEBUG oslo_concurrency.lockutils [None req-a4720564-a673-4335-a8bb-4f574343649e 0b16febbab784cc7b40a988decaa1db8 9c5899f27a1e496da636c4e31f6867c6 - - default default] Lock "99707edc-d882-4127-bc26-1fea0a52f9da" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.538s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:34 compute-0 nova_compute[192810]: 2025-09-30 21:39:34.313 2 DEBUG nova.compute.manager [req-d0a7e4b0-278d-4fc3-a750-ff40ad3f1bcb req-3faf27fd-c38d-4866-a28d-910b671a60d2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Received event network-vif-unplugged-e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:39:34 compute-0 nova_compute[192810]: 2025-09-30 21:39:34.314 2 DEBUG oslo_concurrency.lockutils [req-d0a7e4b0-278d-4fc3-a750-ff40ad3f1bcb req-3faf27fd-c38d-4866-a28d-910b671a60d2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "99707edc-d882-4127-bc26-1fea0a52f9da-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:34 compute-0 nova_compute[192810]: 2025-09-30 21:39:34.314 2 DEBUG oslo_concurrency.lockutils [req-d0a7e4b0-278d-4fc3-a750-ff40ad3f1bcb req-3faf27fd-c38d-4866-a28d-910b671a60d2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "99707edc-d882-4127-bc26-1fea0a52f9da-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:34 compute-0 nova_compute[192810]: 2025-09-30 21:39:34.315 2 DEBUG oslo_concurrency.lockutils [req-d0a7e4b0-278d-4fc3-a750-ff40ad3f1bcb req-3faf27fd-c38d-4866-a28d-910b671a60d2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "99707edc-d882-4127-bc26-1fea0a52f9da-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:34 compute-0 nova_compute[192810]: 2025-09-30 21:39:34.315 2 DEBUG nova.compute.manager [req-d0a7e4b0-278d-4fc3-a750-ff40ad3f1bcb req-3faf27fd-c38d-4866-a28d-910b671a60d2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] No waiting events found dispatching network-vif-unplugged-e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:39:34 compute-0 nova_compute[192810]: 2025-09-30 21:39:34.315 2 WARNING nova.compute.manager [req-d0a7e4b0-278d-4fc3-a750-ff40ad3f1bcb req-3faf27fd-c38d-4866-a28d-910b671a60d2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Received unexpected event network-vif-unplugged-e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 for instance with vm_state deleted and task_state None.
Sep 30 21:39:34 compute-0 nova_compute[192810]: 2025-09-30 21:39:34.316 2 DEBUG nova.compute.manager [req-d0a7e4b0-278d-4fc3-a750-ff40ad3f1bcb req-3faf27fd-c38d-4866-a28d-910b671a60d2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Received event network-vif-plugged-e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:39:34 compute-0 nova_compute[192810]: 2025-09-30 21:39:34.316 2 DEBUG oslo_concurrency.lockutils [req-d0a7e4b0-278d-4fc3-a750-ff40ad3f1bcb req-3faf27fd-c38d-4866-a28d-910b671a60d2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "99707edc-d882-4127-bc26-1fea0a52f9da-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:34 compute-0 nova_compute[192810]: 2025-09-30 21:39:34.316 2 DEBUG oslo_concurrency.lockutils [req-d0a7e4b0-278d-4fc3-a750-ff40ad3f1bcb req-3faf27fd-c38d-4866-a28d-910b671a60d2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "99707edc-d882-4127-bc26-1fea0a52f9da-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:34 compute-0 nova_compute[192810]: 2025-09-30 21:39:34.316 2 DEBUG oslo_concurrency.lockutils [req-d0a7e4b0-278d-4fc3-a750-ff40ad3f1bcb req-3faf27fd-c38d-4866-a28d-910b671a60d2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "99707edc-d882-4127-bc26-1fea0a52f9da-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:34 compute-0 nova_compute[192810]: 2025-09-30 21:39:34.317 2 DEBUG nova.compute.manager [req-d0a7e4b0-278d-4fc3-a750-ff40ad3f1bcb req-3faf27fd-c38d-4866-a28d-910b671a60d2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] No waiting events found dispatching network-vif-plugged-e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:39:34 compute-0 nova_compute[192810]: 2025-09-30 21:39:34.317 2 WARNING nova.compute.manager [req-d0a7e4b0-278d-4fc3-a750-ff40ad3f1bcb req-3faf27fd-c38d-4866-a28d-910b671a60d2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Received unexpected event network-vif-plugged-e9ffeb79-64b3-4a68-93a9-3d612ba40eb5 for instance with vm_state deleted and task_state None.
Sep 30 21:39:34 compute-0 nova_compute[192810]: 2025-09-30 21:39:34.757 2 DEBUG oslo_concurrency.lockutils [None req-02afdb94-a0dd-4dc2-9485-60e78ce3619b 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "0b1805eb-abb7-44a6-acd9-45548efbe439" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:34 compute-0 nova_compute[192810]: 2025-09-30 21:39:34.758 2 DEBUG oslo_concurrency.lockutils [None req-02afdb94-a0dd-4dc2-9485-60e78ce3619b 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "0b1805eb-abb7-44a6-acd9-45548efbe439" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:34 compute-0 nova_compute[192810]: 2025-09-30 21:39:34.758 2 DEBUG oslo_concurrency.lockutils [None req-02afdb94-a0dd-4dc2-9485-60e78ce3619b 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "0b1805eb-abb7-44a6-acd9-45548efbe439-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:34 compute-0 nova_compute[192810]: 2025-09-30 21:39:34.758 2 DEBUG oslo_concurrency.lockutils [None req-02afdb94-a0dd-4dc2-9485-60e78ce3619b 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "0b1805eb-abb7-44a6-acd9-45548efbe439-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:34 compute-0 nova_compute[192810]: 2025-09-30 21:39:34.758 2 DEBUG oslo_concurrency.lockutils [None req-02afdb94-a0dd-4dc2-9485-60e78ce3619b 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "0b1805eb-abb7-44a6-acd9-45548efbe439-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:34 compute-0 nova_compute[192810]: 2025-09-30 21:39:34.768 2 INFO nova.compute.manager [None req-02afdb94-a0dd-4dc2-9485-60e78ce3619b 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Terminating instance
Sep 30 21:39:34 compute-0 nova_compute[192810]: 2025-09-30 21:39:34.781 2 DEBUG nova.compute.manager [None req-02afdb94-a0dd-4dc2-9485-60e78ce3619b 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:39:34 compute-0 kernel: tap77592b52-fe (unregistering): left promiscuous mode
Sep 30 21:39:34 compute-0 NetworkManager[51733]: <info>  [1759268374.8921] device (tap77592b52-fe): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:39:34 compute-0 ovn_controller[94912]: 2025-09-30T21:39:34Z|00495|binding|INFO|Releasing lport 77592b52-fef4-4ec1-9e04-78fe87759f03 from this chassis (sb_readonly=0)
Sep 30 21:39:34 compute-0 nova_compute[192810]: 2025-09-30 21:39:34.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:34 compute-0 ovn_controller[94912]: 2025-09-30T21:39:34Z|00496|binding|INFO|Setting lport 77592b52-fef4-4ec1-9e04-78fe87759f03 down in Southbound
Sep 30 21:39:34 compute-0 ovn_controller[94912]: 2025-09-30T21:39:34Z|00497|binding|INFO|Removing iface tap77592b52-fe ovn-installed in OVS
Sep 30 21:39:34 compute-0 nova_compute[192810]: 2025-09-30 21:39:34.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:34 compute-0 nova_compute[192810]: 2025-09-30 21:39:34.912 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:34 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:34.920 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:74:27:e0 10.100.0.7'], port_security=['fa:16:3e:74:27:e0 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '0b1805eb-abb7-44a6-acd9-45548efbe439', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8ad754242d964bb487a2174b2c21bcc5', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9c41899e-24c3-4632-81c5-100a69d8be81', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f4d6c701-a212-4977-9c52-b553d410c9c7, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=77592b52-fef4-4ec1-9e04-78fe87759f03) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:39:34 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:34.921 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 77592b52-fef4-4ec1-9e04-78fe87759f03 in datapath 27086519-6f4c-45f9-8e5b-5b321cd6871c unbound from our chassis
Sep 30 21:39:34 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:34.922 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 27086519-6f4c-45f9-8e5b-5b321cd6871c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:39:34 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:34.923 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[174f0331-b156-44a3-8c32-f55591f53334]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:34 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:34.924 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c namespace which is not needed anymore
Sep 30 21:39:34 compute-0 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d0000007f.scope: Deactivated successfully.
Sep 30 21:39:34 compute-0 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d0000007f.scope: Consumed 14.224s CPU time.
Sep 30 21:39:34 compute-0 systemd-machined[152794]: Machine qemu-61-instance-0000007f terminated.
Sep 30 21:39:34 compute-0 NetworkManager[51733]: <info>  [1759268374.9980] manager: (tap77592b52-fe): new Tun device (/org/freedesktop/NetworkManager/Devices/216)
Sep 30 21:39:35 compute-0 nova_compute[192810]: 2025-09-30 21:39:35.043 2 INFO nova.virt.libvirt.driver [-] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Instance destroyed successfully.
Sep 30 21:39:35 compute-0 nova_compute[192810]: 2025-09-30 21:39:35.045 2 DEBUG nova.objects.instance [None req-02afdb94-a0dd-4dc2-9485-60e78ce3619b 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lazy-loading 'resources' on Instance uuid 0b1805eb-abb7-44a6-acd9-45548efbe439 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:39:35 compute-0 nova_compute[192810]: 2025-09-30 21:39:35.061 2 DEBUG nova.virt.libvirt.vif [None req-02afdb94-a0dd-4dc2-9485-60e78ce3619b 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:39:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-262784555',display_name='tempest-ServersTestJSON-server-262784555',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-262784555',id=127,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:39:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8ad754242d964bb487a2174b2c21bcc5',ramdisk_id='',reservation_id='r-6rbv3nnr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-782690373',owner_user_name='tempest-ServersTestJSON-782690373-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:39:09Z,user_data=None,user_id='30d0a975d78c4d9a8e2201afdc040092',uuid=0b1805eb-abb7-44a6-acd9-45548efbe439,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "77592b52-fef4-4ec1-9e04-78fe87759f03", "address": "fa:16:3e:74:27:e0", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77592b52-fe", "ovs_interfaceid": "77592b52-fef4-4ec1-9e04-78fe87759f03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:39:35 compute-0 nova_compute[192810]: 2025-09-30 21:39:35.062 2 DEBUG nova.network.os_vif_util [None req-02afdb94-a0dd-4dc2-9485-60e78ce3619b 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Converting VIF {"id": "77592b52-fef4-4ec1-9e04-78fe87759f03", "address": "fa:16:3e:74:27:e0", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap77592b52-fe", "ovs_interfaceid": "77592b52-fef4-4ec1-9e04-78fe87759f03", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:39:35 compute-0 nova_compute[192810]: 2025-09-30 21:39:35.062 2 DEBUG nova.network.os_vif_util [None req-02afdb94-a0dd-4dc2-9485-60e78ce3619b 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:74:27:e0,bridge_name='br-int',has_traffic_filtering=True,id=77592b52-fef4-4ec1-9e04-78fe87759f03,network=Network(27086519-6f4c-45f9-8e5b-5b321cd6871c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77592b52-fe') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:39:35 compute-0 nova_compute[192810]: 2025-09-30 21:39:35.063 2 DEBUG os_vif [None req-02afdb94-a0dd-4dc2-9485-60e78ce3619b 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:74:27:e0,bridge_name='br-int',has_traffic_filtering=True,id=77592b52-fef4-4ec1-9e04-78fe87759f03,network=Network(27086519-6f4c-45f9-8e5b-5b321cd6871c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77592b52-fe') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:39:35 compute-0 nova_compute[192810]: 2025-09-30 21:39:35.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:35 compute-0 nova_compute[192810]: 2025-09-30 21:39:35.065 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap77592b52-fe, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:39:35 compute-0 nova_compute[192810]: 2025-09-30 21:39:35.068 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:39:35 compute-0 nova_compute[192810]: 2025-09-30 21:39:35.071 2 INFO os_vif [None req-02afdb94-a0dd-4dc2-9485-60e78ce3619b 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:74:27:e0,bridge_name='br-int',has_traffic_filtering=True,id=77592b52-fef4-4ec1-9e04-78fe87759f03,network=Network(27086519-6f4c-45f9-8e5b-5b321cd6871c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap77592b52-fe')
Sep 30 21:39:35 compute-0 nova_compute[192810]: 2025-09-30 21:39:35.072 2 INFO nova.virt.libvirt.driver [None req-02afdb94-a0dd-4dc2-9485-60e78ce3619b 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Deleting instance files /var/lib/nova/instances/0b1805eb-abb7-44a6-acd9-45548efbe439_del
Sep 30 21:39:35 compute-0 nova_compute[192810]: 2025-09-30 21:39:35.072 2 INFO nova.virt.libvirt.driver [None req-02afdb94-a0dd-4dc2-9485-60e78ce3619b 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Deletion of /var/lib/nova/instances/0b1805eb-abb7-44a6-acd9-45548efbe439_del complete
Sep 30 21:39:35 compute-0 nova_compute[192810]: 2025-09-30 21:39:35.178 2 INFO nova.compute.manager [None req-02afdb94-a0dd-4dc2-9485-60e78ce3619b 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Took 0.40 seconds to destroy the instance on the hypervisor.
Sep 30 21:39:35 compute-0 nova_compute[192810]: 2025-09-30 21:39:35.178 2 DEBUG oslo.service.loopingcall [None req-02afdb94-a0dd-4dc2-9485-60e78ce3619b 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:39:35 compute-0 nova_compute[192810]: 2025-09-30 21:39:35.178 2 DEBUG nova.compute.manager [-] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:39:35 compute-0 nova_compute[192810]: 2025-09-30 21:39:35.178 2 DEBUG nova.network.neutron [-] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:39:35 compute-0 neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c[239106]: [NOTICE]   (239116) : haproxy version is 2.8.14-c23fe91
Sep 30 21:39:35 compute-0 neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c[239106]: [NOTICE]   (239116) : path to executable is /usr/sbin/haproxy
Sep 30 21:39:35 compute-0 neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c[239106]: [WARNING]  (239116) : Exiting Master process...
Sep 30 21:39:35 compute-0 neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c[239106]: [WARNING]  (239116) : Exiting Master process...
Sep 30 21:39:35 compute-0 neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c[239106]: [ALERT]    (239116) : Current worker (239119) exited with code 143 (Terminated)
Sep 30 21:39:35 compute-0 neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c[239106]: [WARNING]  (239116) : All workers exited. Exiting... (0)
Sep 30 21:39:35 compute-0 systemd[1]: libpod-542c547b430b323e804643f3af2ed5a7c370d30d7a4ec34338c851ba2f2125ff.scope: Deactivated successfully.
Sep 30 21:39:35 compute-0 podman[239586]: 2025-09-30 21:39:35.357469857 +0000 UTC m=+0.337830281 container died 542c547b430b323e804643f3af2ed5a7c370d30d7a4ec34338c851ba2f2125ff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:39:35 compute-0 nova_compute[192810]: 2025-09-30 21:39:35.361 2 DEBUG nova.compute.manager [req-992d03e1-54b9-4734-a0ee-bbefe2147af0 req-3257366a-c8e2-4406-a6b8-90cb5406da52 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Received event network-vif-unplugged-77592b52-fef4-4ec1-9e04-78fe87759f03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:39:35 compute-0 nova_compute[192810]: 2025-09-30 21:39:35.361 2 DEBUG oslo_concurrency.lockutils [req-992d03e1-54b9-4734-a0ee-bbefe2147af0 req-3257366a-c8e2-4406-a6b8-90cb5406da52 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "0b1805eb-abb7-44a6-acd9-45548efbe439-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:35 compute-0 nova_compute[192810]: 2025-09-30 21:39:35.362 2 DEBUG oslo_concurrency.lockutils [req-992d03e1-54b9-4734-a0ee-bbefe2147af0 req-3257366a-c8e2-4406-a6b8-90cb5406da52 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "0b1805eb-abb7-44a6-acd9-45548efbe439-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:35 compute-0 nova_compute[192810]: 2025-09-30 21:39:35.362 2 DEBUG oslo_concurrency.lockutils [req-992d03e1-54b9-4734-a0ee-bbefe2147af0 req-3257366a-c8e2-4406-a6b8-90cb5406da52 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "0b1805eb-abb7-44a6-acd9-45548efbe439-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:35 compute-0 nova_compute[192810]: 2025-09-30 21:39:35.362 2 DEBUG nova.compute.manager [req-992d03e1-54b9-4734-a0ee-bbefe2147af0 req-3257366a-c8e2-4406-a6b8-90cb5406da52 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] No waiting events found dispatching network-vif-unplugged-77592b52-fef4-4ec1-9e04-78fe87759f03 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:39:35 compute-0 nova_compute[192810]: 2025-09-30 21:39:35.362 2 DEBUG nova.compute.manager [req-992d03e1-54b9-4734-a0ee-bbefe2147af0 req-3257366a-c8e2-4406-a6b8-90cb5406da52 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Received event network-vif-unplugged-77592b52-fef4-4ec1-9e04-78fe87759f03 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:39:35 compute-0 nova_compute[192810]: 2025-09-30 21:39:35.362 2 DEBUG nova.compute.manager [req-992d03e1-54b9-4734-a0ee-bbefe2147af0 req-3257366a-c8e2-4406-a6b8-90cb5406da52 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Received event network-vif-plugged-77592b52-fef4-4ec1-9e04-78fe87759f03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:39:35 compute-0 nova_compute[192810]: 2025-09-30 21:39:35.363 2 DEBUG oslo_concurrency.lockutils [req-992d03e1-54b9-4734-a0ee-bbefe2147af0 req-3257366a-c8e2-4406-a6b8-90cb5406da52 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "0b1805eb-abb7-44a6-acd9-45548efbe439-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:35 compute-0 nova_compute[192810]: 2025-09-30 21:39:35.363 2 DEBUG oslo_concurrency.lockutils [req-992d03e1-54b9-4734-a0ee-bbefe2147af0 req-3257366a-c8e2-4406-a6b8-90cb5406da52 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "0b1805eb-abb7-44a6-acd9-45548efbe439-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:35 compute-0 nova_compute[192810]: 2025-09-30 21:39:35.363 2 DEBUG oslo_concurrency.lockutils [req-992d03e1-54b9-4734-a0ee-bbefe2147af0 req-3257366a-c8e2-4406-a6b8-90cb5406da52 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "0b1805eb-abb7-44a6-acd9-45548efbe439-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:35 compute-0 nova_compute[192810]: 2025-09-30 21:39:35.364 2 DEBUG nova.compute.manager [req-992d03e1-54b9-4734-a0ee-bbefe2147af0 req-3257366a-c8e2-4406-a6b8-90cb5406da52 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] No waiting events found dispatching network-vif-plugged-77592b52-fef4-4ec1-9e04-78fe87759f03 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:39:35 compute-0 nova_compute[192810]: 2025-09-30 21:39:35.364 2 WARNING nova.compute.manager [req-992d03e1-54b9-4734-a0ee-bbefe2147af0 req-3257366a-c8e2-4406-a6b8-90cb5406da52 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Received unexpected event network-vif-plugged-77592b52-fef4-4ec1-9e04-78fe87759f03 for instance with vm_state active and task_state deleting.
Sep 30 21:39:35 compute-0 systemd[1]: var-lib-containers-storage-overlay-48f3024be7acf2c55479f63a2025b37b9e2269af184d5f01102e72bf3a6953f9-merged.mount: Deactivated successfully.
Sep 30 21:39:35 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-542c547b430b323e804643f3af2ed5a7c370d30d7a4ec34338c851ba2f2125ff-userdata-shm.mount: Deactivated successfully.
Sep 30 21:39:35 compute-0 podman[239586]: 2025-09-30 21:39:35.902041471 +0000 UTC m=+0.882401895 container cleanup 542c547b430b323e804643f3af2ed5a7c370d30d7a4ec34338c851ba2f2125ff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:39:35 compute-0 systemd[1]: libpod-conmon-542c547b430b323e804643f3af2ed5a7c370d30d7a4ec34338c851ba2f2125ff.scope: Deactivated successfully.
Sep 30 21:39:36 compute-0 nova_compute[192810]: 2025-09-30 21:39:36.033 2 DEBUG nova.network.neutron [-] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:39:36 compute-0 nova_compute[192810]: 2025-09-30 21:39:36.071 2 INFO nova.compute.manager [-] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Took 0.89 seconds to deallocate network for instance.
Sep 30 21:39:36 compute-0 nova_compute[192810]: 2025-09-30 21:39:36.162 2 DEBUG oslo_concurrency.lockutils [None req-02afdb94-a0dd-4dc2-9485-60e78ce3619b 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:36 compute-0 nova_compute[192810]: 2025-09-30 21:39:36.163 2 DEBUG oslo_concurrency.lockutils [None req-02afdb94-a0dd-4dc2-9485-60e78ce3619b 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:36 compute-0 podman[239630]: 2025-09-30 21:39:36.180095612 +0000 UTC m=+0.255372887 container remove 542c547b430b323e804643f3af2ed5a7c370d30d7a4ec34338c851ba2f2125ff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3)
Sep 30 21:39:36 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:36.185 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[89f98508-831a-4957-9bff-74dce2df037d]: (4, ('Tue Sep 30 09:39:34 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c (542c547b430b323e804643f3af2ed5a7c370d30d7a4ec34338c851ba2f2125ff)\n542c547b430b323e804643f3af2ed5a7c370d30d7a4ec34338c851ba2f2125ff\nTue Sep 30 09:39:35 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c (542c547b430b323e804643f3af2ed5a7c370d30d7a4ec34338c851ba2f2125ff)\n542c547b430b323e804643f3af2ed5a7c370d30d7a4ec34338c851ba2f2125ff\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:36 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:36.186 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[3ab88e28-ccfb-46e7-87e0-4fb23a38952e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:36 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:36.187 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap27086519-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:39:36 compute-0 nova_compute[192810]: 2025-09-30 21:39:36.189 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:36 compute-0 kernel: tap27086519-60: left promiscuous mode
Sep 30 21:39:36 compute-0 nova_compute[192810]: 2025-09-30 21:39:36.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:36 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:36.203 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[37260b4e-8f8f-42f2-81e2-7b13c0e56617]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:36 compute-0 nova_compute[192810]: 2025-09-30 21:39:36.215 2 DEBUG nova.compute.provider_tree [None req-02afdb94-a0dd-4dc2-9485-60e78ce3619b 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:39:36 compute-0 nova_compute[192810]: 2025-09-30 21:39:36.227 2 DEBUG nova.scheduler.client.report [None req-02afdb94-a0dd-4dc2-9485-60e78ce3619b 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:39:36 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:36.227 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[cb4c08e3-8698-4a5c-bb9a-b399f77d6bab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:36 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:36.230 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[408aed61-15e5-47a2-9d26-43bd14202377]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:36 compute-0 nova_compute[192810]: 2025-09-30 21:39:36.247 2 DEBUG oslo_concurrency.lockutils [None req-02afdb94-a0dd-4dc2-9485-60e78ce3619b 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.084s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:36 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:36.258 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[c1b55ac7-8be7-4215-b2d7-e297375d444d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 504348, 'reachable_time': 25053, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239646, 'error': None, 'target': 'ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:36 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:36.262 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:39:36 compute-0 systemd[1]: run-netns-ovnmeta\x2d27086519\x2d6f4c\x2d45f9\x2d8e5b\x2d5b321cd6871c.mount: Deactivated successfully.
Sep 30 21:39:36 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:36.262 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[7932e40d-cb5a-4dfd-8d70-f344cfc2e353]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:36 compute-0 nova_compute[192810]: 2025-09-30 21:39:36.270 2 INFO nova.scheduler.client.report [None req-02afdb94-a0dd-4dc2-9485-60e78ce3619b 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Deleted allocations for instance 0b1805eb-abb7-44a6-acd9-45548efbe439
Sep 30 21:39:36 compute-0 nova_compute[192810]: 2025-09-30 21:39:36.359 2 DEBUG oslo_concurrency.lockutils [None req-02afdb94-a0dd-4dc2-9485-60e78ce3619b 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "0b1805eb-abb7-44a6-acd9-45548efbe439" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.602s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:36 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:36.776 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:39:37 compute-0 sshd-session[239649]: Invalid user XexdQfE9C2 from 121.15.157.194 port 46644
Sep 30 21:39:37 compute-0 sshd[128205]: error: beginning MaxStartups throttling
Sep 30 21:39:37 compute-0 sshd[128205]: drop connection #19 from [121.15.157.194]:46676 on [38.102.83.69]:22 past Maxstartups
Sep 30 21:39:37 compute-0 nova_compute[192810]: 2025-09-30 21:39:37.520 2 DEBUG nova.compute.manager [req-1644a1c8-24c8-4d59-9973-5d23e059d101 req-9ea442e1-eab6-44ff-b990-4ac9d9860293 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Received event network-vif-deleted-77592b52-fef4-4ec1-9e04-78fe87759f03 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:39:37 compute-0 sshd-session[239649]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:39:37 compute-0 sshd-session[239649]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194
Sep 30 21:39:37 compute-0 nova_compute[192810]: 2025-09-30 21:39:37.687 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:38 compute-0 unix_chkpwd[239691]: password check failed for user (root)
Sep 30 21:39:38 compute-0 sshd-session[239651]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:38 compute-0 unix_chkpwd[239704]: password check failed for user (root)
Sep 30 21:39:38 compute-0 sshd-session[239654]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:38 compute-0 unix_chkpwd[239705]: password check failed for user (root)
Sep 30 21:39:38 compute-0 sshd-session[239655]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:38 compute-0 unix_chkpwd[239712]: password check failed for user (root)
Sep 30 21:39:38 compute-0 sshd-session[239664]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:38 compute-0 unix_chkpwd[239713]: password check failed for user (root)
Sep 30 21:39:38 compute-0 sshd-session[239659]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:38 compute-0 unix_chkpwd[239714]: password check failed for user (root)
Sep 30 21:39:38 compute-0 sshd-session[239660]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:38 compute-0 unix_chkpwd[239715]: password check failed for user (root)
Sep 30 21:39:38 compute-0 sshd-session[239653]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:38 compute-0 unix_chkpwd[239716]: password check failed for user (root)
Sep 30 21:39:38 compute-0 sshd-session[239663]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:38 compute-0 unix_chkpwd[239719]: password check failed for user (root)
Sep 30 21:39:38 compute-0 sshd-session[239666]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:38 compute-0 unix_chkpwd[239720]: password check failed for user (root)
Sep 30 21:39:38 compute-0 sshd-session[239671]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:38 compute-0 unix_chkpwd[239722]: password check failed for user (root)
Sep 30 21:39:38 compute-0 sshd-session[239673]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:38 compute-0 unix_chkpwd[239724]: password check failed for user (root)
Sep 30 21:39:38 compute-0 sshd-session[239674]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:38 compute-0 unix_chkpwd[239725]: password check failed for user (root)
Sep 30 21:39:38 compute-0 sshd-session[239678]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:38 compute-0 unix_chkpwd[239726]: password check failed for user (root)
Sep 30 21:39:38 compute-0 sshd-session[239677]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:38 compute-0 unix_chkpwd[239727]: password check failed for user (root)
Sep 30 21:39:38 compute-0 sshd-session[239667]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:38 compute-0 unix_chkpwd[239728]: password check failed for user (root)
Sep 30 21:39:38 compute-0 sshd-session[239679]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:38 compute-0 unix_chkpwd[239729]: password check failed for user (root)
Sep 30 21:39:38 compute-0 sshd-session[239682]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:38 compute-0 unix_chkpwd[239730]: password check failed for user (root)
Sep 30 21:39:38 compute-0 sshd-session[239680]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:38.745 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:38.745 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:38.745 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:38 compute-0 unix_chkpwd[239741]: password check failed for user (root)
Sep 30 21:39:38 compute-0 sshd-session[239687]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:39 compute-0 unix_chkpwd[239742]: password check failed for user (root)
Sep 30 21:39:39 compute-0 sshd-session[239689]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:39 compute-0 unix_chkpwd[239751]: password check failed for user (root)
Sep 30 21:39:39 compute-0 sshd-session[239693]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:39 compute-0 unix_chkpwd[239752]: password check failed for user (root)
Sep 30 21:39:39 compute-0 sshd-session[239698]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:39 compute-0 unix_chkpwd[239753]: password check failed for user (root)
Sep 30 21:39:39 compute-0 sshd-session[239692]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:39 compute-0 unix_chkpwd[239754]: password check failed for user (root)
Sep 30 21:39:39 compute-0 sshd-session[239701]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:39 compute-0 unix_chkpwd[239755]: password check failed for user (root)
Sep 30 21:39:39 compute-0 sshd-session[239696]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:39 compute-0 unix_chkpwd[239756]: password check failed for user (root)
Sep 30 21:39:39 compute-0 sshd-session[239706]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:39 compute-0 unix_chkpwd[239757]: password check failed for user (root)
Sep 30 21:39:39 compute-0 sshd-session[239707]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:39 compute-0 unix_chkpwd[239758]: password check failed for user (root)
Sep 30 21:39:39 compute-0 sshd-session[239700]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:39 compute-0 unix_chkpwd[239759]: password check failed for user (root)
Sep 30 21:39:39 compute-0 sshd-session[239717]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:39 compute-0 unix_chkpwd[239760]: password check failed for user (root)
Sep 30 21:39:39 compute-0 sshd-session[239708]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:39 compute-0 nova_compute[192810]: 2025-09-30 21:39:39.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:39 compute-0 unix_chkpwd[239766]: password check failed for user (root)
Sep 30 21:39:39 compute-0 sshd-session[239721]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:39 compute-0 sshd-session[239649]: Failed password for invalid user XexdQfE9C2 from 121.15.157.194 port 46644 ssh2
Sep 30 21:39:39 compute-0 unix_chkpwd[239778]: password check failed for user (root)
Sep 30 21:39:39 compute-0 sshd-session[239737]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:39 compute-0 unix_chkpwd[239779]: password check failed for user (root)
Sep 30 21:39:39 compute-0 sshd-session[239733]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:39 compute-0 unix_chkpwd[239780]: password check failed for user (root)
Sep 30 21:39:39 compute-0 unix_chkpwd[239781]: password check failed for user (root)
Sep 30 21:39:39 compute-0 sshd-session[239736]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:39 compute-0 sshd-session[239735]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:40 compute-0 nova_compute[192810]: 2025-09-30 21:39:40.021 2 DEBUG oslo_concurrency.lockutils [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "0e051d40-6705-4ff4-b581-ea9dbbdf4a26" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:40 compute-0 nova_compute[192810]: 2025-09-30 21:39:40.021 2 DEBUG oslo_concurrency.lockutils [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "0e051d40-6705-4ff4-b581-ea9dbbdf4a26" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:40 compute-0 unix_chkpwd[239782]: password check failed for user (root)
Sep 30 21:39:40 compute-0 sshd-session[239731]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:40 compute-0 nova_compute[192810]: 2025-09-30 21:39:40.046 2 DEBUG nova.compute.manager [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:39:40 compute-0 sshd-session[239651]: Failed password for root from 121.15.157.194 port 46646 ssh2
Sep 30 21:39:40 compute-0 nova_compute[192810]: 2025-09-30 21:39:40.067 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:40 compute-0 nova_compute[192810]: 2025-09-30 21:39:40.148 2 DEBUG oslo_concurrency.lockutils [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:40 compute-0 nova_compute[192810]: 2025-09-30 21:39:40.148 2 DEBUG oslo_concurrency.lockutils [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:40 compute-0 nova_compute[192810]: 2025-09-30 21:39:40.156 2 DEBUG nova.virt.hardware [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:39:40 compute-0 nova_compute[192810]: 2025-09-30 21:39:40.156 2 INFO nova.compute.claims [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:39:40 compute-0 sshd-session[239689]: Failed password for root from 121.15.157.194 port 46696 ssh2
Sep 30 21:39:40 compute-0 nova_compute[192810]: 2025-09-30 21:39:40.287 2 DEBUG nova.compute.provider_tree [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:39:40 compute-0 nova_compute[192810]: 2025-09-30 21:39:40.310 2 DEBUG nova.scheduler.client.report [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:39:40 compute-0 nova_compute[192810]: 2025-09-30 21:39:40.333 2 DEBUG oslo_concurrency.lockutils [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.185s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:40 compute-0 nova_compute[192810]: 2025-09-30 21:39:40.337 2 DEBUG nova.compute.manager [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:39:40 compute-0 sshd-session[239654]: Failed password for root from 121.15.157.194 port 46652 ssh2
Sep 30 21:39:40 compute-0 nova_compute[192810]: 2025-09-30 21:39:40.403 2 DEBUG nova.compute.manager [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:39:40 compute-0 nova_compute[192810]: 2025-09-30 21:39:40.404 2 DEBUG nova.network.neutron [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:39:40 compute-0 sshd-session[239655]: Failed password for root from 121.15.157.194 port 46648 ssh2
Sep 30 21:39:40 compute-0 unix_chkpwd[239799]: password check failed for user (root)
Sep 30 21:39:40 compute-0 sshd-session[239744]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:40 compute-0 sshd-session[239664]: Failed password for root from 121.15.157.194 port 46664 ssh2
Sep 30 21:39:40 compute-0 sshd-session[239659]: Failed password for root from 121.15.157.194 port 46654 ssh2
Sep 30 21:39:40 compute-0 nova_compute[192810]: 2025-09-30 21:39:40.427 2 INFO nova.virt.libvirt.driver [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:39:40 compute-0 sshd-session[239660]: Failed password for root from 121.15.157.194 port 46656 ssh2
Sep 30 21:39:40 compute-0 sshd-session[239653]: Failed password for root from 121.15.157.194 port 46650 ssh2
Sep 30 21:39:40 compute-0 nova_compute[192810]: 2025-09-30 21:39:40.442 2 DEBUG nova.compute.manager [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:39:40 compute-0 sshd-session[239663]: Failed password for root from 121.15.157.194 port 46660 ssh2
Sep 30 21:39:40 compute-0 sshd-session[239666]: Failed password for root from 121.15.157.194 port 46658 ssh2
Sep 30 21:39:40 compute-0 unix_chkpwd[239800]: password check failed for user (root)
Sep 30 21:39:40 compute-0 sshd-session[239743]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:40 compute-0 sshd-session[239671]: Failed password for root from 121.15.157.194 port 46666 ssh2
Sep 30 21:39:40 compute-0 sshd-session[239673]: Failed password for root from 121.15.157.194 port 46670 ssh2
Sep 30 21:39:40 compute-0 sshd-session[239674]: Failed password for root from 121.15.157.194 port 46668 ssh2
Sep 30 21:39:40 compute-0 sshd-session[239678]: Failed password for root from 121.15.157.194 port 46684 ssh2
Sep 30 21:39:40 compute-0 sshd-session[239693]: Failed password for root from 121.15.157.194 port 46702 ssh2
Sep 30 21:39:40 compute-0 sshd-session[239677]: Failed password for root from 121.15.157.194 port 46680 ssh2
Sep 30 21:39:40 compute-0 sshd-session[239667]: Failed password for root from 121.15.157.194 port 46662 ssh2
Sep 30 21:39:40 compute-0 nova_compute[192810]: 2025-09-30 21:39:40.561 2 DEBUG nova.compute.manager [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:39:40 compute-0 unix_chkpwd[239801]: password check failed for user (root)
Sep 30 21:39:40 compute-0 sshd-session[239747]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:40 compute-0 nova_compute[192810]: 2025-09-30 21:39:40.563 2 DEBUG nova.virt.libvirt.driver [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:39:40 compute-0 nova_compute[192810]: 2025-09-30 21:39:40.563 2 INFO nova.virt.libvirt.driver [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Creating image(s)
Sep 30 21:39:40 compute-0 nova_compute[192810]: 2025-09-30 21:39:40.563 2 DEBUG oslo_concurrency.lockutils [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "/var/lib/nova/instances/0e051d40-6705-4ff4-b581-ea9dbbdf4a26/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:40 compute-0 nova_compute[192810]: 2025-09-30 21:39:40.564 2 DEBUG oslo_concurrency.lockutils [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "/var/lib/nova/instances/0e051d40-6705-4ff4-b581-ea9dbbdf4a26/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:40 compute-0 sshd-session[239679]: Failed password for root from 121.15.157.194 port 46678 ssh2
Sep 30 21:39:40 compute-0 nova_compute[192810]: 2025-09-30 21:39:40.564 2 DEBUG oslo_concurrency.lockutils [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "/var/lib/nova/instances/0e051d40-6705-4ff4-b581-ea9dbbdf4a26/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:40 compute-0 nova_compute[192810]: 2025-09-30 21:39:40.577 2 DEBUG oslo_concurrency.processutils [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:39:40 compute-0 sshd-session[239698]: Failed password for root from 121.15.157.194 port 46708 ssh2
Sep 30 21:39:40 compute-0 unix_chkpwd[239803]: password check failed for user (root)
Sep 30 21:39:40 compute-0 sshd-session[239749]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:40 compute-0 sshd-session[239692]: Failed password for root from 121.15.157.194 port 46704 ssh2
Sep 30 21:39:40 compute-0 sshd-session[239701]: Failed password for root from 121.15.157.194 port 46712 ssh2
Sep 30 21:39:40 compute-0 sshd-session[239682]: Failed password for root from 121.15.157.194 port 46682 ssh2
Sep 30 21:39:40 compute-0 nova_compute[192810]: 2025-09-30 21:39:40.634 2 DEBUG oslo_concurrency.processutils [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:39:40 compute-0 nova_compute[192810]: 2025-09-30 21:39:40.635 2 DEBUG oslo_concurrency.lockutils [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:40 compute-0 nova_compute[192810]: 2025-09-30 21:39:40.635 2 DEBUG oslo_concurrency.lockutils [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:40 compute-0 sshd-session[239649]: Connection closed by invalid user XexdQfE9C2 121.15.157.194 port 46644 [preauth]
Sep 30 21:39:40 compute-0 nova_compute[192810]: 2025-09-30 21:39:40.650 2 DEBUG oslo_concurrency.processutils [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:39:40 compute-0 sshd-session[239696]: Failed password for root from 121.15.157.194 port 46698 ssh2
Sep 30 21:39:40 compute-0 sshd-session[239680]: Failed password for root from 121.15.157.194 port 46674 ssh2
Sep 30 21:39:40 compute-0 sshd-session[239706]: Failed password for root from 121.15.157.194 port 46724 ssh2
Sep 30 21:39:40 compute-0 sshd-session[239707]: Failed password for root from 121.15.157.194 port 46730 ssh2
Sep 30 21:39:40 compute-0 sshd-session[239700]: Failed password for root from 121.15.157.194 port 46710 ssh2
Sep 30 21:39:40 compute-0 nova_compute[192810]: 2025-09-30 21:39:40.706 2 DEBUG oslo_concurrency.processutils [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:39:40 compute-0 nova_compute[192810]: 2025-09-30 21:39:40.707 2 DEBUG oslo_concurrency.processutils [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/0e051d40-6705-4ff4-b581-ea9dbbdf4a26/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:39:40 compute-0 sshd-session[239717]: Failed password for root from 121.15.157.194 port 46738 ssh2
Sep 30 21:39:40 compute-0 sshd-session[239708]: Failed password for root from 121.15.157.194 port 46722 ssh2
Sep 30 21:39:40 compute-0 unix_chkpwd[239816]: password check failed for user (root)
Sep 30 21:39:40 compute-0 sshd-session[239761]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:40 compute-0 nova_compute[192810]: 2025-09-30 21:39:40.772 2 DEBUG nova.policy [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '30d0a975d78c4d9a8e2201afdc040092', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8ad754242d964bb487a2174b2c21bcc5', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:39:40 compute-0 nova_compute[192810]: 2025-09-30 21:39:40.783 2 DEBUG oslo_concurrency.processutils [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/0e051d40-6705-4ff4-b581-ea9dbbdf4a26/disk 1073741824" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:39:40 compute-0 nova_compute[192810]: 2025-09-30 21:39:40.783 2 DEBUG oslo_concurrency.lockutils [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:40 compute-0 nova_compute[192810]: 2025-09-30 21:39:40.784 2 DEBUG oslo_concurrency.processutils [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:39:40 compute-0 unix_chkpwd[239825]: password check failed for user (root)
Sep 30 21:39:40 compute-0 sshd-session[239763]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:40 compute-0 sshd-session[239721]: Failed password for root from 121.15.157.194 port 46736 ssh2
Sep 30 21:39:40 compute-0 nova_compute[192810]: 2025-09-30 21:39:40.855 2 DEBUG oslo_concurrency.processutils [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:39:40 compute-0 nova_compute[192810]: 2025-09-30 21:39:40.856 2 DEBUG nova.virt.disk.api [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Checking if we can resize image /var/lib/nova/instances/0e051d40-6705-4ff4-b581-ea9dbbdf4a26/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:39:40 compute-0 nova_compute[192810]: 2025-09-30 21:39:40.856 2 DEBUG oslo_concurrency.processutils [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0e051d40-6705-4ff4-b581-ea9dbbdf4a26/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:39:40 compute-0 nova_compute[192810]: 2025-09-30 21:39:40.914 2 DEBUG oslo_concurrency.processutils [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0e051d40-6705-4ff4-b581-ea9dbbdf4a26/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:39:40 compute-0 nova_compute[192810]: 2025-09-30 21:39:40.915 2 DEBUG nova.virt.disk.api [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Cannot resize image /var/lib/nova/instances/0e051d40-6705-4ff4-b581-ea9dbbdf4a26/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:39:40 compute-0 nova_compute[192810]: 2025-09-30 21:39:40.916 2 DEBUG nova.objects.instance [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lazy-loading 'migration_context' on Instance uuid 0e051d40-6705-4ff4-b581-ea9dbbdf4a26 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:39:40 compute-0 unix_chkpwd[239832]: password check failed for user (root)
Sep 30 21:39:40 compute-0 sshd-session[239769]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:40 compute-0 unix_chkpwd[239834]: password check failed for user (root)
Sep 30 21:39:40 compute-0 sshd-session[239768]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:40 compute-0 unix_chkpwd[239835]: password check failed for user (root)
Sep 30 21:39:40 compute-0 unix_chkpwd[239836]: password check failed for user (root)
Sep 30 21:39:40 compute-0 sshd-session[239773]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:40 compute-0 unix_chkpwd[239837]: password check failed for user (root)
Sep 30 21:39:40 compute-0 sshd-session[239765]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:40 compute-0 sshd-session[239770]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:40 compute-0 unix_chkpwd[239838]: password check failed for user (root)
Sep 30 21:39:40 compute-0 sshd-session[239776]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:40 compute-0 nova_compute[192810]: 2025-09-30 21:39:40.933 2 DEBUG nova.virt.libvirt.driver [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:39:40 compute-0 nova_compute[192810]: 2025-09-30 21:39:40.933 2 DEBUG nova.virt.libvirt.driver [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Ensure instance console log exists: /var/lib/nova/instances/0e051d40-6705-4ff4-b581-ea9dbbdf4a26/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:39:40 compute-0 nova_compute[192810]: 2025-09-30 21:39:40.933 2 DEBUG oslo_concurrency.lockutils [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:40 compute-0 nova_compute[192810]: 2025-09-30 21:39:40.934 2 DEBUG oslo_concurrency.lockutils [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:40 compute-0 nova_compute[192810]: 2025-09-30 21:39:40.934 2 DEBUG oslo_concurrency.lockutils [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:40 compute-0 sshd-session[239687]: Failed password for root from 121.15.157.194 port 46690 ssh2
Sep 30 21:39:41 compute-0 sshd-session[239689]: Connection closed by authenticating user root 121.15.157.194 port 46696 [preauth]
Sep 30 21:39:41 compute-0 sshd-session[239737]: Failed password for root from 121.15.157.194 port 46758 ssh2
Sep 30 21:39:41 compute-0 sshd-session[239733]: Failed password for root from 121.15.157.194 port 46748 ssh2
Sep 30 21:39:41 compute-0 sshd-session[239736]: Failed password for root from 121.15.157.194 port 46752 ssh2
Sep 30 21:39:41 compute-0 sshd-session[239735]: Failed password for root from 121.15.157.194 port 46750 ssh2
Sep 30 21:39:41 compute-0 sshd-session[239731]: Failed password for root from 121.15.157.194 port 46742 ssh2
Sep 30 21:39:41 compute-0 podman[239845]: 2025-09-30 21:39:41.346667057 +0000 UTC m=+0.071837882 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Sep 30 21:39:41 compute-0 podman[239848]: 2025-09-30 21:39:41.348856251 +0000 UTC m=+0.071977685 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, config_id=edpm, name=ubi9-minimal, distribution-scope=public, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41)
Sep 30 21:39:41 compute-0 unix_chkpwd[239904]: password check failed for user (root)
Sep 30 21:39:41 compute-0 sshd-session[239783]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:41 compute-0 unix_chkpwd[239905]: password check failed for user (root)
Sep 30 21:39:41 compute-0 unix_chkpwd[239906]: password check failed for user (root)
Sep 30 21:39:41 compute-0 sshd-session[239787]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:41 compute-0 sshd-session[239788]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:41 compute-0 unix_chkpwd[239907]: password check failed for user (root)
Sep 30 21:39:41 compute-0 sshd-session[239785]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:41 compute-0 unix_chkpwd[239908]: password check failed for user (root)
Sep 30 21:39:41 compute-0 sshd-session[239789]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:41 compute-0 sshd-session[239693]: Connection closed by authenticating user root 121.15.157.194 port 46702 [preauth]
Sep 30 21:39:41 compute-0 sshd-session[239698]: Connection closed by authenticating user root 121.15.157.194 port 46708 [preauth]
Sep 30 21:39:41 compute-0 unix_chkpwd[239911]: password check failed for user (root)
Sep 30 21:39:41 compute-0 sshd-session[239790]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:41 compute-0 sshd-session[239692]: Connection closed by authenticating user root 121.15.157.194 port 46704 [preauth]
Sep 30 21:39:41 compute-0 nova_compute[192810]: 2025-09-30 21:39:41.524 2 DEBUG nova.network.neutron [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Successfully created port: 090e6953-4071-4105-98e9-7794c08cf2f5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:39:41 compute-0 unix_chkpwd[239912]: password check failed for user (root)
Sep 30 21:39:41 compute-0 sshd-session[239791]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:41 compute-0 sshd-session[239701]: Connection closed by authenticating user root 121.15.157.194 port 46712 [preauth]
Sep 30 21:39:41 compute-0 sshd[128205]: drop connection #69 from [121.15.157.194]:47042 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:41 compute-0 sshd[128205]: drop connection #69 from [121.15.157.194]:47048 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:41 compute-0 sshd[128205]: drop connection #69 from [121.15.157.194]:47052 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:41 compute-0 sshd[128205]: drop connection #69 from [121.15.157.194]:47046 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:41 compute-0 sshd-session[239707]: Connection closed by authenticating user root 121.15.157.194 port 46730 [preauth]
Sep 30 21:39:41 compute-0 sshd-session[239706]: Connection closed by authenticating user root 121.15.157.194 port 46724 [preauth]
Sep 30 21:39:41 compute-0 sshd[128205]: drop connection #69 from [121.15.157.194]:47050 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:41 compute-0 unix_chkpwd[239913]: password check failed for user (root)
Sep 30 21:39:41 compute-0 sshd-session[239797]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:41 compute-0 sshd[128205]: drop connection #67 from [121.15.157.194]:47044 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:41 compute-0 sshd-session[239700]: Connection closed by authenticating user root 121.15.157.194 port 46710 [preauth]
Sep 30 21:39:41 compute-0 sshd-session[239696]: Connection closed by authenticating user root 121.15.157.194 port 46698 [preauth]
Sep 30 21:39:41 compute-0 sshd[128205]: drop connection #65 from [121.15.157.194]:47054 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:41 compute-0 sshd-session[239717]: Connection closed by authenticating user root 121.15.157.194 port 46738 [preauth]
Sep 30 21:39:41 compute-0 sshd-session[239708]: Connection closed by authenticating user root 121.15.157.194 port 46722 [preauth]
Sep 30 21:39:41 compute-0 sshd[128205]: drop connection #63 from [121.15.157.194]:47058 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:41 compute-0 sshd[128205]: drop connection #63 from [121.15.157.194]:47056 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:41 compute-0 sshd[128205]: drop connection #63 from [121.15.157.194]:47062 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:41 compute-0 sshd[128205]: drop connection #63 from [121.15.157.194]:47064 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:41 compute-0 sshd[128205]: drop connection #63 from [121.15.157.194]:47060 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:41 compute-0 sshd[128205]: drop connection #63 from [121.15.157.194]:47070 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:41 compute-0 sshd[128205]: drop connection #63 from [121.15.157.194]:47072 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:41 compute-0 sshd-session[239744]: Failed password for root from 121.15.157.194 port 46776 ssh2
Sep 30 21:39:41 compute-0 sshd[128205]: drop connection #63 from [121.15.157.194]:47066 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:41 compute-0 sshd[128205]: drop connection #63 from [121.15.157.194]:47074 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:41 compute-0 sshd[128205]: drop connection #63 from [121.15.157.194]:47068 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:41 compute-0 sshd[128205]: drop connection #63 from [121.15.157.194]:47076 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:41 compute-0 sshd[128205]: drop connection #63 from [121.15.157.194]:47078 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:41 compute-0 sshd-session[239743]: Failed password for root from 121.15.157.194 port 46764 ssh2
Sep 30 21:39:41 compute-0 sshd[128205]: drop connection #63 from [121.15.157.194]:47080 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:41 compute-0 sshd[128205]: drop connection #63 from [121.15.157.194]:47084 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:41 compute-0 sshd[128205]: drop connection #63 from [121.15.157.194]:47082 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:41 compute-0 sshd-session[239721]: Connection closed by authenticating user root 121.15.157.194 port 46736 [preauth]
Sep 30 21:39:41 compute-0 sshd[128205]: drop connection #62 from [121.15.157.194]:47088 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:41 compute-0 sshd[128205]: drop connection #62 from [121.15.157.194]:47092 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:41 compute-0 sshd[128205]: drop connection #62 from [121.15.157.194]:47090 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:41 compute-0 sshd[128205]: drop connection #62 from [121.15.157.194]:47094 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:41 compute-0 sshd[128205]: drop connection #62 from [121.15.157.194]:47086 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:41 compute-0 sshd[128205]: drop connection #62 from [121.15.157.194]:47096 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:41 compute-0 sshd-session[239747]: Failed password for root from 121.15.157.194 port 46790 ssh2
Sep 30 21:39:41 compute-0 sshd[128205]: drop connection #62 from [121.15.157.194]:47098 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:41 compute-0 sshd[128205]: drop connection #62 from [121.15.157.194]:47100 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:41 compute-0 sshd[128205]: drop connection #62 from [121.15.157.194]:47104 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:41 compute-0 sshd[128205]: drop connection #62 from [121.15.157.194]:47102 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:41 compute-0 unix_chkpwd[239914]: password check failed for user (root)
Sep 30 21:39:41 compute-0 sshd-session[239809]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:41 compute-0 sshd-session[239749]: Failed password for root from 121.15.157.194 port 46796 ssh2
Sep 30 21:39:41 compute-0 unix_chkpwd[239915]: password check failed for user (root)
Sep 30 21:39:41 compute-0 sshd-session[239811]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:41 compute-0 unix_chkpwd[239916]: password check failed for user (root)
Sep 30 21:39:41 compute-0 sshd-session[239813]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:41 compute-0 unix_chkpwd[239917]: password check failed for user (root)
Sep 30 21:39:41 compute-0 sshd-session[239812]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:41 compute-0 sshd[128205]: drop connection #62 from [121.15.157.194]:47106 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:41 compute-0 sshd[128205]: drop connection #62 from [121.15.157.194]:47108 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:41 compute-0 sshd-session[239651]: Connection closed by authenticating user root 121.15.157.194 port 46646 [preauth]
Sep 30 21:39:42 compute-0 unix_chkpwd[239918]: password check failed for user (root)
Sep 30 21:39:42 compute-0 sshd-session[239829]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #61 from [121.15.157.194]:47110 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #61 from [121.15.157.194]:47112 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #61 from [121.15.157.194]:47114 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #61 from [121.15.157.194]:47116 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd-session[239761]: Failed password for root from 121.15.157.194 port 46804 ssh2
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #61 from [121.15.157.194]:47120 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #61 from [121.15.157.194]:47118 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #61 from [121.15.157.194]:47122 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd-session[239733]: Connection closed by authenticating user root 121.15.157.194 port 46748 [preauth]
Sep 30 21:39:42 compute-0 sshd-session[239737]: Connection closed by authenticating user root 121.15.157.194 port 46758 [preauth]
Sep 30 21:39:42 compute-0 sshd-session[239735]: Connection closed by authenticating user root 121.15.157.194 port 46750 [preauth]
Sep 30 21:39:42 compute-0 sshd-session[239736]: Connection closed by authenticating user root 121.15.157.194 port 46752 [preauth]
Sep 30 21:39:42 compute-0 unix_chkpwd[239919]: password check failed for user (root)
Sep 30 21:39:42 compute-0 sshd-session[239822]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:42 compute-0 sshd-session[239763]: Failed password for root from 121.15.157.194 port 46806 ssh2
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #57 from [121.15.157.194]:47126 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #57 from [121.15.157.194]:47124 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd-session[239731]: Connection closed by authenticating user root 121.15.157.194 port 46742 [preauth]
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #57 from [121.15.157.194]:47128 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #56 from [121.15.157.194]:47142 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #56 from [121.15.157.194]:47132 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #56 from [121.15.157.194]:47138 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #56 from [121.15.157.194]:47136 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #56 from [121.15.157.194]:47140 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #56 from [121.15.157.194]:47134 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #56 from [121.15.157.194]:47146 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #56 from [121.15.157.194]:47148 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #56 from [121.15.157.194]:47152 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #56 from [121.15.157.194]:47150 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd-session[239769]: Failed password for root from 121.15.157.194 port 46818 ssh2
Sep 30 21:39:42 compute-0 sshd-session[239768]: Failed password for root from 121.15.157.194 port 46812 ssh2
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #56 from [121.15.157.194]:47156 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd-session[239773]: Failed password for root from 121.15.157.194 port 46816 ssh2
Sep 30 21:39:42 compute-0 sshd-session[239770]: Failed password for root from 121.15.157.194 port 46810 ssh2
Sep 30 21:39:42 compute-0 sshd-session[239765]: Failed password for root from 121.15.157.194 port 46808 ssh2
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #56 from [121.15.157.194]:47158 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #56 from [121.15.157.194]:47160 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd-session[239776]: Failed password for root from 121.15.157.194 port 46828 ssh2
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #56 from [121.15.157.194]:47154 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #56 from [121.15.157.194]:47164 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #56 from [121.15.157.194]:47162 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 unix_chkpwd[239920]: password check failed for user (root)
Sep 30 21:39:42 compute-0 sshd-session[239839]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #56 from [121.15.157.194]:47166 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #56 from [121.15.157.194]:47170 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #56 from [121.15.157.194]:47178 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #56 from [121.15.157.194]:47172 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #56 from [121.15.157.194]:47168 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #56 from [121.15.157.194]:47182 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #56 from [121.15.157.194]:47188 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #56 from [121.15.157.194]:47186 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #56 from [121.15.157.194]:47176 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #56 from [121.15.157.194]:47184 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #56 from [121.15.157.194]:47180 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #56 from [121.15.157.194]:47196 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #56 from [121.15.157.194]:47190 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #56 from [121.15.157.194]:47198 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #56 from [121.15.157.194]:47192 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #56 from [121.15.157.194]:47194 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #56 from [121.15.157.194]:47202 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #56 from [121.15.157.194]:47204 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #56 from [121.15.157.194]:47174 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #56 from [121.15.157.194]:47200 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #56 from [121.15.157.194]:47206 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd-session[239654]: Connection closed by authenticating user root 121.15.157.194 port 46652 [preauth]
Sep 30 21:39:42 compute-0 sshd-session[239655]: Connection closed by authenticating user root 121.15.157.194 port 46648 [preauth]
Sep 30 21:39:42 compute-0 unix_chkpwd[239921]: password check failed for user (root)
Sep 30 21:39:42 compute-0 sshd-session[239841]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #55 from [121.15.157.194]:47208 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd-session[239664]: Connection closed by authenticating user root 121.15.157.194 port 46664 [preauth]
Sep 30 21:39:42 compute-0 unix_chkpwd[239922]: password check failed for user (root)
Sep 30 21:39:42 compute-0 sshd-session[239843]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #53 from [121.15.157.194]:47210 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd-session[239660]: Connection closed by authenticating user root 121.15.157.194 port 46656 [preauth]
Sep 30 21:39:42 compute-0 sshd-session[239659]: Connection closed by authenticating user root 121.15.157.194 port 46654 [preauth]
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #51 from [121.15.157.194]:47218 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #51 from [121.15.157.194]:47214 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #51 from [121.15.157.194]:47212 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #51 from [121.15.157.194]:47216 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd-session[239663]: Connection closed by authenticating user root 121.15.157.194 port 46660 [preauth]
Sep 30 21:39:42 compute-0 sshd-session[239653]: Connection closed by authenticating user root 121.15.157.194 port 46650 [preauth]
Sep 30 21:39:42 compute-0 sshd-session[239666]: Connection closed by authenticating user root 121.15.157.194 port 46658 [preauth]
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #49 from [121.15.157.194]:47220 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 unix_chkpwd[239923]: password check failed for user (root)
Sep 30 21:39:42 compute-0 sshd-session[239850]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:42 compute-0 unix_chkpwd[239924]: password check failed for user (root)
Sep 30 21:39:42 compute-0 sshd-session[239849]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:42 compute-0 sshd-session[239671]: Connection closed by authenticating user root 121.15.157.194 port 46666 [preauth]
Sep 30 21:39:42 compute-0 unix_chkpwd[239925]: password check failed for user (root)
Sep 30 21:39:42 compute-0 sshd-session[239851]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #47 from [121.15.157.194]:47222 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd-session[239678]: Connection closed by authenticating user root 121.15.157.194 port 46684 [preauth]
Sep 30 21:39:42 compute-0 sshd-session[239673]: Connection closed by authenticating user root 121.15.157.194 port 46670 [preauth]
Sep 30 21:39:42 compute-0 unix_chkpwd[239926]: password check failed for user (root)
Sep 30 21:39:42 compute-0 sshd-session[239846]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:42 compute-0 sshd-session[239674]: Connection closed by authenticating user root 121.15.157.194 port 46668 [preauth]
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #44 from [121.15.157.194]:47224 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #44 from [121.15.157.194]:47232 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #44 from [121.15.157.194]:47234 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #44 from [121.15.157.194]:47228 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 nova_compute[192810]: 2025-09-30 21:39:42.481 2 DEBUG nova.network.neutron [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Successfully updated port: 090e6953-4071-4105-98e9-7794c08cf2f5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #44 from [121.15.157.194]:47230 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 unix_chkpwd[239927]: password check failed for user (root)
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #44 from [121.15.157.194]:47226 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 unix_chkpwd[239928]: password check failed for user (root)
Sep 30 21:39:42 compute-0 sshd-session[239847]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:42 compute-0 unix_chkpwd[239929]: password check failed for user (root)
Sep 30 21:39:42 compute-0 sshd-session[239901]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:42 compute-0 sshd-session[239881]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:42 compute-0 sshd-session[239677]: Connection closed by authenticating user root 121.15.157.194 port 46680 [preauth]
Sep 30 21:39:42 compute-0 sshd-session[239667]: Connection closed by authenticating user root 121.15.157.194 port 46662 [preauth]
Sep 30 21:39:42 compute-0 sshd-session[239679]: Connection closed by authenticating user root 121.15.157.194 port 46678 [preauth]
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #41 from [121.15.157.194]:47236 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd-session[239744]: Connection closed by authenticating user root 121.15.157.194 port 46776 [preauth]
Sep 30 21:39:42 compute-0 nova_compute[192810]: 2025-09-30 21:39:42.532 2 DEBUG oslo_concurrency.lockutils [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "refresh_cache-0e051d40-6705-4ff4-b581-ea9dbbdf4a26" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:39:42 compute-0 nova_compute[192810]: 2025-09-30 21:39:42.532 2 DEBUG oslo_concurrency.lockutils [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquired lock "refresh_cache-0e051d40-6705-4ff4-b581-ea9dbbdf4a26" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:39:42 compute-0 nova_compute[192810]: 2025-09-30 21:39:42.532 2 DEBUG nova.network.neutron [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #40 from [121.15.157.194]:47238 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd-session[239682]: Connection closed by authenticating user root 121.15.157.194 port 46682 [preauth]
Sep 30 21:39:42 compute-0 sshd-session[239743]: Connection closed by authenticating user root 121.15.157.194 port 46764 [preauth]
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #38 from [121.15.157.194]:47240 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 unix_chkpwd[239930]: password check failed for user (root)
Sep 30 21:39:42 compute-0 sshd-session[239889]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #38 from [121.15.157.194]:47242 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #38 from [121.15.157.194]:47246 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #38 from [121.15.157.194]:47250 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd-session[239680]: Connection closed by authenticating user root 121.15.157.194 port 46674 [preauth]
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #38 from [121.15.157.194]:47248 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 nova_compute[192810]: 2025-09-30 21:39:42.614 2 DEBUG nova.compute.manager [req-a772ff59-5966-4c09-8185-151b5613594d req-143d7ced-0528-479a-9996-85c728725eba dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Received event network-changed-090e6953-4071-4105-98e9-7794c08cf2f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:39:42 compute-0 nova_compute[192810]: 2025-09-30 21:39:42.615 2 DEBUG nova.compute.manager [req-a772ff59-5966-4c09-8185-151b5613594d req-143d7ced-0528-479a-9996-85c728725eba dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Refreshing instance network info cache due to event network-changed-090e6953-4071-4105-98e9-7794c08cf2f5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:39:42 compute-0 nova_compute[192810]: 2025-09-30 21:39:42.615 2 DEBUG oslo_concurrency.lockutils [req-a772ff59-5966-4c09-8185-151b5613594d req-143d7ced-0528-479a-9996-85c728725eba dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-0e051d40-6705-4ff4-b581-ea9dbbdf4a26" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #37 from [121.15.157.194]:47244 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #37 from [121.15.157.194]:47254 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #37 from [121.15.157.194]:47252 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #37 from [121.15.157.194]:47262 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 unix_chkpwd[239931]: password check failed for user (root)
Sep 30 21:39:42 compute-0 sshd-session[239909]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=121.15.157.194  user=root
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #37 from [121.15.157.194]:47260 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #37 from [121.15.157.194]:47258 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #37 from [121.15.157.194]:47256 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #37 from [121.15.157.194]:47266 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #37 from [121.15.157.194]:47264 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd-session[239747]: Connection closed by authenticating user root 121.15.157.194 port 46790 [preauth]
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #36 from [121.15.157.194]:47268 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 nova_compute[192810]: 2025-09-30 21:39:42.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #36 from [121.15.157.194]:47272 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #36 from [121.15.157.194]:47270 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #36 from [121.15.157.194]:47276 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 nova_compute[192810]: 2025-09-30 21:39:42.728 2 DEBUG nova.network.neutron [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #36 from [121.15.157.194]:47280 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #36 from [121.15.157.194]:47278 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #36 from [121.15.157.194]:47274 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #36 from [121.15.157.194]:47282 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd-session[239749]: Connection closed by authenticating user root 121.15.157.194 port 46796 [preauth]
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #35 from [121.15.157.194]:47286 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #35 from [121.15.157.194]:47298 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #35 from [121.15.157.194]:47304 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #35 from [121.15.157.194]:47294 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #35 from [121.15.157.194]:47302 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #35 from [121.15.157.194]:47296 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #35 from [121.15.157.194]:47288 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #35 from [121.15.157.194]:47306 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #35 from [121.15.157.194]:47316 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #35 from [121.15.157.194]:47310 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #35 from [121.15.157.194]:47318 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #35 from [121.15.157.194]:47312 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #35 from [121.15.157.194]:47308 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #35 from [121.15.157.194]:47300 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #35 from [121.15.157.194]:47292 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #35 from [121.15.157.194]:47284 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #35 from [121.15.157.194]:47314 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #35 from [121.15.157.194]:47290 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #35 from [121.15.157.194]:47320 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #35 from [121.15.157.194]:47322 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #35 from [121.15.157.194]:47326 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #35 from [121.15.157.194]:47324 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #35 from [121.15.157.194]:47332 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #35 from [121.15.157.194]:47330 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #35 from [121.15.157.194]:47334 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd-session[239761]: Connection closed by authenticating user root 121.15.157.194 port 46804 [preauth]
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #35 from [121.15.157.194]:47328 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #34 from [121.15.157.194]:47336 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #34 from [121.15.157.194]:47340 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #34 from [121.15.157.194]:47338 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd-session[239763]: Connection closed by authenticating user root 121.15.157.194 port 46806 [preauth]
Sep 30 21:39:42 compute-0 sshd-session[239687]: Connection closed by authenticating user root 121.15.157.194 port 46690 [preauth]
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #32 from [121.15.157.194]:47350 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #32 from [121.15.157.194]:47346 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #32 from [121.15.157.194]:47348 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #32 from [121.15.157.194]:47342 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #32 from [121.15.157.194]:47352 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #32 from [121.15.157.194]:47344 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:42 compute-0 sshd[128205]: drop connection #32 from [121.15.157.194]:47354 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #32 from [121.15.157.194]:47358 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #32 from [121.15.157.194]:47356 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd-session[239768]: Connection closed by authenticating user root 121.15.157.194 port 46812 [preauth]
Sep 30 21:39:43 compute-0 sshd-session[239769]: Connection closed by authenticating user root 121.15.157.194 port 46818 [preauth]
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #32 from [121.15.157.194]:47360 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd-session[239776]: Connection closed by authenticating user root 121.15.157.194 port 46828 [preauth]
Sep 30 21:39:43 compute-0 sshd-session[239773]: Connection closed by authenticating user root 121.15.157.194 port 46816 [preauth]
Sep 30 21:39:43 compute-0 sshd-session[239770]: Connection closed by authenticating user root 121.15.157.194 port 46810 [preauth]
Sep 30 21:39:43 compute-0 sshd-session[239765]: Connection closed by authenticating user root 121.15.157.194 port 46808 [preauth]
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47362 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47364 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47366 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47368 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47374 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47370 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47372 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47376 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47380 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47382 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47386 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47388 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47378 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47384 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47390 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47394 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47396 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47398 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47406 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47404 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47400 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47402 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47392 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47416 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47418 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47420 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47422 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47426 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47412 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47414 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47424 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47430 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47428 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47446 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47442 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47444 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47452 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47434 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47440 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47448 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47438 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47450 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47408 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47410 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47436 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47432 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47454 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47456 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47460 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47462 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47458 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47466 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47468 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47464 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47470 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47472 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47476 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47474 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47478 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47490 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47484 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47480 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47488 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47482 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47500 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47492 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47494 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47502 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47498 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47504 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47508 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47506 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47496 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47510 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47512 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47518 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47516 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47522 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47520 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47524 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47514 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47528 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47532 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47530 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47526 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47536 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47534 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47542 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47540 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47548 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47544 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47538 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47546 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47550 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47558 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47554 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47552 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47556 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47562 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47560 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47566 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47584 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47582 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47578 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47570 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47574 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47572 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47588 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47594 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47592 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47596 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47580 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47568 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47590 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47576 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47586 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47598 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47604 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47602 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47600 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47610 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47608 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47612 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47606 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd-session[239783]: Failed password for root from 121.15.157.194 port 46844 ssh2
Sep 30 21:39:43 compute-0 nova_compute[192810]: 2025-09-30 21:39:43.831 2 DEBUG nova.network.neutron [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Updating instance_info_cache with network_info: [{"id": "090e6953-4071-4105-98e9-7794c08cf2f5", "address": "fa:16:3e:d7:60:2b", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap090e6953-40", "ovs_interfaceid": "090e6953-4071-4105-98e9-7794c08cf2f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47618 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47614 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 nova_compute[192810]: 2025-09-30 21:39:43.855 2 DEBUG oslo_concurrency.lockutils [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Releasing lock "refresh_cache-0e051d40-6705-4ff4-b581-ea9dbbdf4a26" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:39:43 compute-0 nova_compute[192810]: 2025-09-30 21:39:43.856 2 DEBUG nova.compute.manager [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Instance network_info: |[{"id": "090e6953-4071-4105-98e9-7794c08cf2f5", "address": "fa:16:3e:d7:60:2b", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap090e6953-40", "ovs_interfaceid": "090e6953-4071-4105-98e9-7794c08cf2f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:39:43 compute-0 nova_compute[192810]: 2025-09-30 21:39:43.856 2 DEBUG oslo_concurrency.lockutils [req-a772ff59-5966-4c09-8185-151b5613594d req-143d7ced-0528-479a-9996-85c728725eba dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-0e051d40-6705-4ff4-b581-ea9dbbdf4a26" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:39:43 compute-0 nova_compute[192810]: 2025-09-30 21:39:43.856 2 DEBUG nova.network.neutron [req-a772ff59-5966-4c09-8185-151b5613594d req-143d7ced-0528-479a-9996-85c728725eba dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Refreshing network info cache for port 090e6953-4071-4105-98e9-7794c08cf2f5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:39:43 compute-0 nova_compute[192810]: 2025-09-30 21:39:43.858 2 DEBUG nova.virt.libvirt.driver [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Start _get_guest_xml network_info=[{"id": "090e6953-4071-4105-98e9-7794c08cf2f5", "address": "fa:16:3e:d7:60:2b", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap090e6953-40", "ovs_interfaceid": "090e6953-4071-4105-98e9-7794c08cf2f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47620 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47622 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd-session[239787]: Failed password for root from 121.15.157.194 port 46864 ssh2
Sep 30 21:39:43 compute-0 sshd-session[239788]: Failed password for root from 121.15.157.194 port 46874 ssh2
Sep 30 21:39:43 compute-0 nova_compute[192810]: 2025-09-30 21:39:43.863 2 WARNING nova.virt.libvirt.driver [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47616 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 nova_compute[192810]: 2025-09-30 21:39:43.868 2 DEBUG nova.virt.libvirt.host [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:39:43 compute-0 nova_compute[192810]: 2025-09-30 21:39:43.869 2 DEBUG nova.virt.libvirt.host [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:39:43 compute-0 nova_compute[192810]: 2025-09-30 21:39:43.872 2 DEBUG nova.virt.libvirt.host [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:39:43 compute-0 nova_compute[192810]: 2025-09-30 21:39:43.873 2 DEBUG nova.virt.libvirt.host [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:39:43 compute-0 nova_compute[192810]: 2025-09-30 21:39:43.874 2 DEBUG nova.virt.libvirt.driver [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:39:43 compute-0 nova_compute[192810]: 2025-09-30 21:39:43.874 2 DEBUG nova.virt.hardware [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:39:43 compute-0 nova_compute[192810]: 2025-09-30 21:39:43.874 2 DEBUG nova.virt.hardware [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:39:43 compute-0 nova_compute[192810]: 2025-09-30 21:39:43.875 2 DEBUG nova.virt.hardware [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:39:43 compute-0 nova_compute[192810]: 2025-09-30 21:39:43.875 2 DEBUG nova.virt.hardware [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:39:43 compute-0 nova_compute[192810]: 2025-09-30 21:39:43.875 2 DEBUG nova.virt.hardware [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:39:43 compute-0 nova_compute[192810]: 2025-09-30 21:39:43.875 2 DEBUG nova.virt.hardware [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:39:43 compute-0 nova_compute[192810]: 2025-09-30 21:39:43.875 2 DEBUG nova.virt.hardware [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:39:43 compute-0 nova_compute[192810]: 2025-09-30 21:39:43.875 2 DEBUG nova.virt.hardware [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:39:43 compute-0 nova_compute[192810]: 2025-09-30 21:39:43.875 2 DEBUG nova.virt.hardware [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:39:43 compute-0 nova_compute[192810]: 2025-09-30 21:39:43.876 2 DEBUG nova.virt.hardware [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:39:43 compute-0 nova_compute[192810]: 2025-09-30 21:39:43.876 2 DEBUG nova.virt.hardware [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:39:43 compute-0 nova_compute[192810]: 2025-09-30 21:39:43.879 2 DEBUG nova.virt.libvirt.vif [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:39:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1559695799',display_name='tempest-ServersTestJSON-server-1559695799',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-1559695799',id=129,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8ad754242d964bb487a2174b2c21bcc5',ramdisk_id='',reservation_id='r-bpe7tv98',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-782690373',owner_user_name='tempest-ServersTestJSON-782690373-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:39:40Z,user_data=None,user_id='30d0a975d78c4d9a8e2201afdc040092',uuid=0e051d40-6705-4ff4-b581-ea9dbbdf4a26,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "090e6953-4071-4105-98e9-7794c08cf2f5", "address": "fa:16:3e:d7:60:2b", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap090e6953-40", "ovs_interfaceid": "090e6953-4071-4105-98e9-7794c08cf2f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:39:43 compute-0 nova_compute[192810]: 2025-09-30 21:39:43.879 2 DEBUG nova.network.os_vif_util [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Converting VIF {"id": "090e6953-4071-4105-98e9-7794c08cf2f5", "address": "fa:16:3e:d7:60:2b", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap090e6953-40", "ovs_interfaceid": "090e6953-4071-4105-98e9-7794c08cf2f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:39:43 compute-0 nova_compute[192810]: 2025-09-30 21:39:43.880 2 DEBUG nova.network.os_vif_util [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d7:60:2b,bridge_name='br-int',has_traffic_filtering=True,id=090e6953-4071-4105-98e9-7794c08cf2f5,network=Network(27086519-6f4c-45f9-8e5b-5b321cd6871c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap090e6953-40') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:39:43 compute-0 nova_compute[192810]: 2025-09-30 21:39:43.880 2 DEBUG nova.objects.instance [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0e051d40-6705-4ff4-b581-ea9dbbdf4a26 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:39:43 compute-0 sshd-session[239785]: Failed password for root from 121.15.157.194 port 46852 ssh2
Sep 30 21:39:43 compute-0 sshd-session[239789]: Failed password for root from 121.15.157.194 port 46868 ssh2
Sep 30 21:39:43 compute-0 nova_compute[192810]: 2025-09-30 21:39:43.896 2 DEBUG nova.virt.libvirt.driver [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:39:43 compute-0 nova_compute[192810]:   <uuid>0e051d40-6705-4ff4-b581-ea9dbbdf4a26</uuid>
Sep 30 21:39:43 compute-0 nova_compute[192810]:   <name>instance-00000081</name>
Sep 30 21:39:43 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:39:43 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:39:43 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:39:43 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:       <nova:name>tempest-ServersTestJSON-server-1559695799</nova:name>
Sep 30 21:39:43 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:39:43</nova:creationTime>
Sep 30 21:39:43 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:39:43 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:39:43 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:39:43 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:39:43 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:39:43 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:39:43 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:39:43 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:39:43 compute-0 nova_compute[192810]:         <nova:user uuid="30d0a975d78c4d9a8e2201afdc040092">tempest-ServersTestJSON-782690373-project-member</nova:user>
Sep 30 21:39:43 compute-0 nova_compute[192810]:         <nova:project uuid="8ad754242d964bb487a2174b2c21bcc5">tempest-ServersTestJSON-782690373</nova:project>
Sep 30 21:39:43 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:39:43 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:39:43 compute-0 nova_compute[192810]:         <nova:port uuid="090e6953-4071-4105-98e9-7794c08cf2f5">
Sep 30 21:39:43 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:39:43 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:39:43 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:39:43 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:39:43 compute-0 nova_compute[192810]:     <system>
Sep 30 21:39:43 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:39:43 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:39:43 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:39:43 compute-0 nova_compute[192810]:       <entry name="serial">0e051d40-6705-4ff4-b581-ea9dbbdf4a26</entry>
Sep 30 21:39:43 compute-0 nova_compute[192810]:       <entry name="uuid">0e051d40-6705-4ff4-b581-ea9dbbdf4a26</entry>
Sep 30 21:39:43 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     </system>
Sep 30 21:39:43 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:39:43 compute-0 nova_compute[192810]:   <os>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:   </os>
Sep 30 21:39:43 compute-0 nova_compute[192810]:   <features>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:   </features>
Sep 30 21:39:43 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:39:43 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:39:43 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:39:43 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:39:43 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:39:43 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/0e051d40-6705-4ff4-b581-ea9dbbdf4a26/disk"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:39:43 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/0e051d40-6705-4ff4-b581-ea9dbbdf4a26/disk.config"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:39:43 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:d7:60:2b"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:       <target dev="tap090e6953-40"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:39:43 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/0e051d40-6705-4ff4-b581-ea9dbbdf4a26/console.log" append="off"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     <video>
Sep 30 21:39:43 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     </video>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:39:43 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:39:43 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:39:43 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:39:43 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:39:43 compute-0 nova_compute[192810]: </domain>
Sep 30 21:39:43 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:39:43 compute-0 nova_compute[192810]: 2025-09-30 21:39:43.898 2 DEBUG nova.compute.manager [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Preparing to wait for external event network-vif-plugged-090e6953-4071-4105-98e9-7794c08cf2f5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:39:43 compute-0 nova_compute[192810]: 2025-09-30 21:39:43.898 2 DEBUG oslo_concurrency.lockutils [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "0e051d40-6705-4ff4-b581-ea9dbbdf4a26-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:43 compute-0 nova_compute[192810]: 2025-09-30 21:39:43.898 2 DEBUG oslo_concurrency.lockutils [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "0e051d40-6705-4ff4-b581-ea9dbbdf4a26-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:43 compute-0 nova_compute[192810]: 2025-09-30 21:39:43.899 2 DEBUG oslo_concurrency.lockutils [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "0e051d40-6705-4ff4-b581-ea9dbbdf4a26-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:43 compute-0 nova_compute[192810]: 2025-09-30 21:39:43.899 2 DEBUG nova.virt.libvirt.vif [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:39:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1559695799',display_name='tempest-ServersTestJSON-server-1559695799',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-1559695799',id=129,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8ad754242d964bb487a2174b2c21bcc5',ramdisk_id='',reservation_id='r-bpe7tv98',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-782690373',owner_user_name='tempest-ServersTestJSON-782690373-project-member'
},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:39:40Z,user_data=None,user_id='30d0a975d78c4d9a8e2201afdc040092',uuid=0e051d40-6705-4ff4-b581-ea9dbbdf4a26,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "090e6953-4071-4105-98e9-7794c08cf2f5", "address": "fa:16:3e:d7:60:2b", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap090e6953-40", "ovs_interfaceid": "090e6953-4071-4105-98e9-7794c08cf2f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:39:43 compute-0 nova_compute[192810]: 2025-09-30 21:39:43.900 2 DEBUG nova.network.os_vif_util [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Converting VIF {"id": "090e6953-4071-4105-98e9-7794c08cf2f5", "address": "fa:16:3e:d7:60:2b", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap090e6953-40", "ovs_interfaceid": "090e6953-4071-4105-98e9-7794c08cf2f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:39:43 compute-0 nova_compute[192810]: 2025-09-30 21:39:43.900 2 DEBUG nova.network.os_vif_util [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d7:60:2b,bridge_name='br-int',has_traffic_filtering=True,id=090e6953-4071-4105-98e9-7794c08cf2f5,network=Network(27086519-6f4c-45f9-8e5b-5b321cd6871c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap090e6953-40') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:39:43 compute-0 nova_compute[192810]: 2025-09-30 21:39:43.901 2 DEBUG os_vif [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d7:60:2b,bridge_name='br-int',has_traffic_filtering=True,id=090e6953-4071-4105-98e9-7794c08cf2f5,network=Network(27086519-6f4c-45f9-8e5b-5b321cd6871c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap090e6953-40') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:39:43 compute-0 nova_compute[192810]: 2025-09-30 21:39:43.901 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:43 compute-0 nova_compute[192810]: 2025-09-30 21:39:43.902 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:39:43 compute-0 nova_compute[192810]: 2025-09-30 21:39:43.902 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:39:43 compute-0 nova_compute[192810]: 2025-09-30 21:39:43.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:43 compute-0 nova_compute[192810]: 2025-09-30 21:39:43.905 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap090e6953-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:39:43 compute-0 nova_compute[192810]: 2025-09-30 21:39:43.906 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap090e6953-40, col_values=(('external_ids', {'iface-id': '090e6953-4071-4105-98e9-7794c08cf2f5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d7:60:2b', 'vm-uuid': '0e051d40-6705-4ff4-b581-ea9dbbdf4a26'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:39:43 compute-0 nova_compute[192810]: 2025-09-30 21:39:43.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:43 compute-0 NetworkManager[51733]: <info>  [1759268383.9081] manager: (tap090e6953-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/217)
Sep 30 21:39:43 compute-0 nova_compute[192810]: 2025-09-30 21:39:43.909 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:39:43 compute-0 nova_compute[192810]: 2025-09-30 21:39:43.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:43 compute-0 nova_compute[192810]: 2025-09-30 21:39:43.914 2 INFO os_vif [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d7:60:2b,bridge_name='br-int',has_traffic_filtering=True,id=090e6953-4071-4105-98e9-7794c08cf2f5,network=Network(27086519-6f4c-45f9-8e5b-5b321cd6871c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap090e6953-40')
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47624 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47626 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47628 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47630 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd-session[239790]: Failed password for root from 121.15.157.194 port 46886 ssh2
Sep 30 21:39:43 compute-0 sshd-session[239791]: Failed password for root from 121.15.157.194 port 46888 ssh2
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47632 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47646 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47638 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:43 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47636 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47648 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47640 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47644 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47634 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47642 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47650 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47652 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47656 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47654 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47658 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47662 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47664 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd-session[239797]: Failed password for root from 121.15.157.194 port 46902 ssh2
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47660 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47668 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47666 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47670 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47688 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47686 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47676 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47684 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47680 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47674 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47682 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47692 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47696 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47678 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 nova_compute[192810]: 2025-09-30 21:39:44.124 2 DEBUG nova.virt.libvirt.driver [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:39:44 compute-0 nova_compute[192810]: 2025-09-30 21:39:44.124 2 DEBUG nova.virt.libvirt.driver [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:39:44 compute-0 nova_compute[192810]: 2025-09-30 21:39:44.124 2 DEBUG nova.virt.libvirt.driver [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] No VIF found with MAC fa:16:3e:d7:60:2b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47694 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 nova_compute[192810]: 2025-09-30 21:39:44.125 2 INFO nova.virt.libvirt.driver [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Using config drive
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47672 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47708 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47704 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47700 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47706 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47718 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47722 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47724 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47702 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47714 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47716 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47710 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47698 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47726 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47720 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47690 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47712 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47734 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47730 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47732 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47728 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47738 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47736 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd-session[239829]: Failed password for root from 121.15.157.194 port 46964 ssh2
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47742 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47740 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd-session[239809]: Failed password for root from 121.15.157.194 port 46936 ssh2
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47746 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd-session[239811]: Failed password for root from 121.15.157.194 port 46934 ssh2
Sep 30 21:39:44 compute-0 sshd-session[239813]: Failed password for root from 121.15.157.194 port 46938 ssh2
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47750 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47744 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47752 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47748 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd-session[239822]: Failed password for root from 121.15.157.194 port 46944 ssh2
Sep 30 21:39:44 compute-0 sshd-session[239812]: Failed password for root from 121.15.157.194 port 46932 ssh2
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47754 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47756 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47758 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47760 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47764 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47762 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47766 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47768 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47770 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47782 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47778 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47776 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47786 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47784 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47780 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47772 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47774 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47792 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47788 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47790 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47794 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47796 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 nova_compute[192810]: 2025-09-30 21:39:44.508 2 INFO nova.virt.libvirt.driver [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Creating config drive at /var/lib/nova/instances/0e051d40-6705-4ff4-b581-ea9dbbdf4a26/disk.config
Sep 30 21:39:44 compute-0 nova_compute[192810]: 2025-09-30 21:39:44.512 2 DEBUG oslo_concurrency.processutils [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0e051d40-6705-4ff4-b581-ea9dbbdf4a26/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfdrwdbs5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:39:44 compute-0 sshd-session[239839]: Failed password for root from 121.15.157.194 port 46972 ssh2
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47798 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47800 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47804 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47802 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47808 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47806 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47810 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47812 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47814 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47818 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47822 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47820 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47816 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47824 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47828 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47830 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47834 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47826 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd-session[239841]: Failed password for root from 121.15.157.194 port 46986 ssh2
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47832 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47836 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47840 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47838 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd-session[239843]: Failed password for root from 121.15.157.194 port 46988 ssh2
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47842 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 nova_compute[192810]: 2025-09-30 21:39:44.636 2 DEBUG oslo_concurrency.processutils [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0e051d40-6705-4ff4-b581-ea9dbbdf4a26/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfdrwdbs5" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47850 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47860 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47844 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47862 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47866 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47864 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47858 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47846 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47852 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47848 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47856 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47854 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd-session[239850]: Failed password for root from 121.15.157.194 port 47010 ssh2
Sep 30 21:39:44 compute-0 sshd-session[239849]: Failed password for root from 121.15.157.194 port 47002 ssh2
Sep 30 21:39:44 compute-0 sshd-session[239851]: Failed password for root from 121.15.157.194 port 47006 ssh2
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47872 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 kernel: tap090e6953-40: entered promiscuous mode
Sep 30 21:39:44 compute-0 NetworkManager[51733]: <info>  [1759268384.6874] manager: (tap090e6953-40): new Tun device (/org/freedesktop/NetworkManager/Devices/218)
Sep 30 21:39:44 compute-0 nova_compute[192810]: 2025-09-30 21:39:44.687 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:44 compute-0 ovn_controller[94912]: 2025-09-30T21:39:44Z|00498|binding|INFO|Claiming lport 090e6953-4071-4105-98e9-7794c08cf2f5 for this chassis.
Sep 30 21:39:44 compute-0 ovn_controller[94912]: 2025-09-30T21:39:44Z|00499|binding|INFO|090e6953-4071-4105-98e9-7794c08cf2f5: Claiming fa:16:3e:d7:60:2b 10.100.0.8
Sep 30 21:39:44 compute-0 nova_compute[192810]: 2025-09-30 21:39:44.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47868 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47870 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:44.699 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d7:60:2b 10.100.0.8'], port_security=['fa:16:3e:d7:60:2b 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '0e051d40-6705-4ff4-b581-ea9dbbdf4a26', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8ad754242d964bb487a2174b2c21bcc5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9c41899e-24c3-4632-81c5-100a69d8be81', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f4d6c701-a212-4977-9c52-b553d410c9c7, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=090e6953-4071-4105-98e9-7794c08cf2f5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:44.700 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 090e6953-4071-4105-98e9-7794c08cf2f5 in datapath 27086519-6f4c-45f9-8e5b-5b321cd6871c bound to our chassis
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:44.701 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 27086519-6f4c-45f9-8e5b-5b321cd6871c
Sep 30 21:39:44 compute-0 sshd-session[239846]: Failed password for root from 121.15.157.194 port 47000 ssh2
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:44.711 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[8a6b5d90-19ba-4ae7-87ee-f7335f58aff6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:44.711 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap27086519-61 in ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:44.713 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap27086519-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:44.713 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[9c60ed18-5cb0-47fd-8464-0640925187ed]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:44 compute-0 systemd-udevd[239953]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:44.714 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[08140f11-6788-4df0-a9de-550b1b9bae6e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:44 compute-0 systemd-machined[152794]: New machine qemu-64-instance-00000081.
Sep 30 21:39:44 compute-0 NetworkManager[51733]: <info>  [1759268384.7247] device (tap090e6953-40): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:39:44 compute-0 NetworkManager[51733]: <info>  [1759268384.7254] device (tap090e6953-40): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:44.724 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[008358c3-8507-46a3-a656-0b932d30ee54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:44 compute-0 systemd[1]: Started Virtual Machine qemu-64-instance-00000081.
Sep 30 21:39:44 compute-0 sshd-session[239847]: Failed password for root from 121.15.157.194 port 47004 ssh2
Sep 30 21:39:44 compute-0 sshd-session[239881]: Failed password for root from 121.15.157.194 port 47018 ssh2
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47874 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd-session[239901]: Failed password for root from 121.15.157.194 port 47030 ssh2
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:44.752 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e257e011-c293-4161-98af-e0beab0b3b50]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:44 compute-0 nova_compute[192810]: 2025-09-30 21:39:44.759 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:44 compute-0 ovn_controller[94912]: 2025-09-30T21:39:44Z|00500|binding|INFO|Setting lport 090e6953-4071-4105-98e9-7794c08cf2f5 ovn-installed in OVS
Sep 30 21:39:44 compute-0 ovn_controller[94912]: 2025-09-30T21:39:44Z|00501|binding|INFO|Setting lport 090e6953-4071-4105-98e9-7794c08cf2f5 up in Southbound
Sep 30 21:39:44 compute-0 nova_compute[192810]: 2025-09-30 21:39:44.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47880 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47882 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47878 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:44.780 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[3b0967e2-10ac-4f5a-a00a-4c94d9096c99]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47884 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 systemd-udevd[239956]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:39:44 compute-0 NetworkManager[51733]: <info>  [1759268384.7862] manager: (tap27086519-60): new Veth device (/org/freedesktop/NetworkManager/Devices/219)
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:44.785 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[50816c34-224d-421c-97d4-5fa6cf3aabf7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:44 compute-0 nova_compute[192810]: 2025-09-30 21:39:44.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:39:44 compute-0 nova_compute[192810]: 2025-09-30 21:39:44.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Sep 30 21:39:44 compute-0 nova_compute[192810]: 2025-09-30 21:39:44.804 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Sep 30 21:39:44 compute-0 sshd-session[239889]: Failed password for root from 121.15.157.194 port 47020 ssh2
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:44.815 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[d1e9d75e-ac90-451e-bc81-97b96d0fb806]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:44.818 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[c423c3bd-b9ab-4b79-877d-49ab89387620]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:44 compute-0 NetworkManager[51733]: <info>  [1759268384.8347] device (tap27086519-60): carrier: link connected
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:44.838 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[d8d10957-be5f-4061-9139-2986c4089dcd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47886 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47888 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:44.853 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[fa0bc96d-73a9-42c2-8fed-9bb71bb1bf9a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap27086519-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:da:b9:e3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 151], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 508045, 'reachable_time': 36649, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 239985, 'error': None, 'target': 'ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47890 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47892 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47894 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47896 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:44.868 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[240d4b20-1655-46e9-b8eb-fe38f06fda6a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feda:b9e3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 508045, 'tstamp': 508045}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 239986, 'error': None, 'target': 'ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:44.884 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[20ffc172-2f5d-49d9-a053-7105a644b289]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap27086519-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:da:b9:e3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 151], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 508045, 'reachable_time': 36649, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 239987, 'error': None, 'target': 'ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47900 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47898 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd-session[239909]: Failed password for root from 121.15.157.194 port 47040 ssh2
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47904 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47910 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:44.909 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[7b21feaa-9a3f-4ae8-84b6-7928bd5dbbd8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47916 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47908 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47912 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47906 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47902 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47918 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47914 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47922 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47924 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47920 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47926 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47928 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:44.959 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[ca48687e-eec3-4f0d-820b-895e62d58cd2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:44.960 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap27086519-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:44.960 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:44.960 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap27086519-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:39:44 compute-0 nova_compute[192810]: 2025-09-30 21:39:44.962 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:44 compute-0 NetworkManager[51733]: <info>  [1759268384.9631] manager: (tap27086519-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/220)
Sep 30 21:39:44 compute-0 kernel: tap27086519-60: entered promiscuous mode
Sep 30 21:39:44 compute-0 nova_compute[192810]: 2025-09-30 21:39:44.964 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:44.965 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap27086519-60, col_values=(('external_ids', {'iface-id': 'f2abb4ad-797b-4767-b8bc-377990516394'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:39:44 compute-0 nova_compute[192810]: 2025-09-30 21:39:44.966 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:44 compute-0 ovn_controller[94912]: 2025-09-30T21:39:44Z|00502|binding|INFO|Releasing lport f2abb4ad-797b-4767-b8bc-377990516394 from this chassis (sb_readonly=0)
Sep 30 21:39:44 compute-0 nova_compute[192810]: 2025-09-30 21:39:44.967 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:44.967 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/27086519-6f4c-45f9-8e5b-5b321cd6871c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/27086519-6f4c-45f9-8e5b-5b321cd6871c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:44.968 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[372acb17-d28d-4f12-9005-47ba1d49fb9f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:44.969 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-27086519-6f4c-45f9-8e5b-5b321cd6871c
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/27086519-6f4c-45f9-8e5b-5b321cd6871c.pid.haproxy
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID 27086519-6f4c-45f9-8e5b-5b321cd6871c
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:39:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:44.970 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'env', 'PROCESS_TAG=haproxy-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/27086519-6f4c-45f9-8e5b-5b321cd6871c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:39:44 compute-0 nova_compute[192810]: 2025-09-30 21:39:44.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47930 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47936 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47932 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47944 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47948 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47938 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:44 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47950 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47940 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47946 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47942 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47934 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47952 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47956 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47954 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47960 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47962 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47964 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47966 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47958 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47974 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47968 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47970 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47972 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 nova_compute[192810]: 2025-09-30 21:39:45.092 2 DEBUG nova.compute.manager [req-2ae65a3d-f655-4645-baf5-9a2830519deb req-e433c3b7-34bf-47a3-a0b3-1c2461665ecc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Received event network-vif-plugged-090e6953-4071-4105-98e9-7794c08cf2f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:39:45 compute-0 nova_compute[192810]: 2025-09-30 21:39:45.093 2 DEBUG oslo_concurrency.lockutils [req-2ae65a3d-f655-4645-baf5-9a2830519deb req-e433c3b7-34bf-47a3-a0b3-1c2461665ecc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "0e051d40-6705-4ff4-b581-ea9dbbdf4a26-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:45 compute-0 nova_compute[192810]: 2025-09-30 21:39:45.093 2 DEBUG oslo_concurrency.lockutils [req-2ae65a3d-f655-4645-baf5-9a2830519deb req-e433c3b7-34bf-47a3-a0b3-1c2461665ecc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "0e051d40-6705-4ff4-b581-ea9dbbdf4a26-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:45 compute-0 nova_compute[192810]: 2025-09-30 21:39:45.093 2 DEBUG oslo_concurrency.lockutils [req-2ae65a3d-f655-4645-baf5-9a2830519deb req-e433c3b7-34bf-47a3-a0b3-1c2461665ecc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "0e051d40-6705-4ff4-b581-ea9dbbdf4a26-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:45 compute-0 nova_compute[192810]: 2025-09-30 21:39:45.094 2 DEBUG nova.compute.manager [req-2ae65a3d-f655-4645-baf5-9a2830519deb req-e433c3b7-34bf-47a3-a0b3-1c2461665ecc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Processing event network-vif-plugged-090e6953-4071-4105-98e9-7794c08cf2f5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47978 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47980 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47994 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47986 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47990 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47984 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47976 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47988 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47998 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47982 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47992 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:47996 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:48000 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:48004 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:48002 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:48006 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:48008 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:48012 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 nova_compute[192810]: 2025-09-30 21:39:45.229 2 DEBUG nova.network.neutron [req-a772ff59-5966-4c09-8185-151b5613594d req-143d7ced-0528-479a-9996-85c728725eba dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Updated VIF entry in instance network info cache for port 090e6953-4071-4105-98e9-7794c08cf2f5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:39:45 compute-0 nova_compute[192810]: 2025-09-30 21:39:45.230 2 DEBUG nova.network.neutron [req-a772ff59-5966-4c09-8185-151b5613594d req-143d7ced-0528-479a-9996-85c728725eba dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Updating instance_info_cache with network_info: [{"id": "090e6953-4071-4105-98e9-7794c08cf2f5", "address": "fa:16:3e:d7:60:2b", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap090e6953-40", "ovs_interfaceid": "090e6953-4071-4105-98e9-7794c08cf2f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:48014 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:48010 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 nova_compute[192810]: 2025-09-30 21:39:45.244 2 DEBUG oslo_concurrency.lockutils [req-a772ff59-5966-4c09-8185-151b5613594d req-143d7ced-0528-479a-9996-85c728725eba dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-0e051d40-6705-4ff4-b581-ea9dbbdf4a26" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:48016 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:48018 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:48022 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:48020 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:48026 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:48024 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:48034 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:48032 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:48038 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:48036 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:48030 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:48044 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:48028 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:48040 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:48046 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:48042 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:48048 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd-session[239783]: Connection closed by authenticating user root 121.15.157.194 port 46844 [preauth]
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #26 from [121.15.157.194]:48056 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #25 from [121.15.157.194]:48060 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #25 from [121.15.157.194]:48068 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #25 from [121.15.157.194]:48050 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #25 from [121.15.157.194]:48054 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd-session[239787]: Connection closed by authenticating user root 121.15.157.194 port 46864 [preauth]
Sep 30 21:39:45 compute-0 podman[240026]: 2025-09-30 21:39:45.297226064 +0000 UTC m=+0.024069701 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #25 from [121.15.157.194]:48052 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd-session[239788]: Connection closed by authenticating user root 121.15.157.194 port 46874 [preauth]
Sep 30 21:39:45 compute-0 sshd-session[239789]: Connection closed by authenticating user root 121.15.157.194 port 46868 [preauth]
Sep 30 21:39:45 compute-0 sshd-session[239785]: Connection closed by authenticating user root 121.15.157.194 port 46852 [preauth]
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #25 from [121.15.157.194]:48058 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #22 from [121.15.157.194]:48064 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #22 from [121.15.157.194]:48062 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #22 from [121.15.157.194]:48066 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #21 from [121.15.157.194]:48070 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #21 from [121.15.157.194]:48072 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #21 from [121.15.157.194]:48074 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #21 from [121.15.157.194]:48076 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #21 from [121.15.157.194]:48078 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #21 from [121.15.157.194]:48086 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #21 from [121.15.157.194]:48096 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #21 from [121.15.157.194]:48084 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #21 from [121.15.157.194]:48094 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #21 from [121.15.157.194]:48104 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #21 from [121.15.157.194]:48090 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #21 from [121.15.157.194]:48100 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #21 from [121.15.157.194]:48106 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #21 from [121.15.157.194]:48092 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #21 from [121.15.157.194]:48108 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #21 from [121.15.157.194]:48098 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #21 from [121.15.157.194]:48088 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #21 from [121.15.157.194]:48102 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #21 from [121.15.157.194]:48082 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd-session[239790]: Connection closed by authenticating user root 121.15.157.194 port 46886 [preauth]
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #21 from [121.15.157.194]:48080 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #21 from [121.15.157.194]:48110 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #21 from [121.15.157.194]:48116 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #21 from [121.15.157.194]:48112 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #20 from [121.15.157.194]:48120 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #20 from [121.15.157.194]:48126 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #20 from [121.15.157.194]:48118 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #20 from [121.15.157.194]:48122 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd-session[239791]: Connection closed by authenticating user root 121.15.157.194 port 46888 [preauth]
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #20 from [121.15.157.194]:48124 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #20 from [121.15.157.194]:48114 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #19 from [121.15.157.194]:48132 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #19 from [121.15.157.194]:48134 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #19 from [121.15.157.194]:48136 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #19 from [121.15.157.194]:48128 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #19 from [121.15.157.194]:48130 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #19 from [121.15.157.194]:48138 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #19 from [121.15.157.194]:48140 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #19 from [121.15.157.194]:48154 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #19 from [121.15.157.194]:48148 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #19 from [121.15.157.194]:48142 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #19 from [121.15.157.194]:48150 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #19 from [121.15.157.194]:48144 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd-session[239797]: Connection closed by authenticating user root 121.15.157.194 port 46902 [preauth]
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #19 from [121.15.157.194]:48146 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #18 from [121.15.157.194]:48156 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #18 from [121.15.157.194]:48158 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #18 from [121.15.157.194]:48152 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #18 from [121.15.157.194]:48164 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #18 from [121.15.157.194]:48162 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #18 from [121.15.157.194]:48166 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #18 from [121.15.157.194]:48160 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 nova_compute[192810]: 2025-09-30 21:39:45.703 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268385.7026951, 0e051d40-6705-4ff4-b581-ea9dbbdf4a26 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:39:45 compute-0 nova_compute[192810]: 2025-09-30 21:39:45.703 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] VM Started (Lifecycle Event)
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #18 from [121.15.157.194]:48170 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 nova_compute[192810]: 2025-09-30 21:39:45.705 2 DEBUG nova.compute.manager [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:39:45 compute-0 nova_compute[192810]: 2025-09-30 21:39:45.708 2 DEBUG nova.virt.libvirt.driver [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:39:45 compute-0 nova_compute[192810]: 2025-09-30 21:39:45.711 2 INFO nova.virt.libvirt.driver [-] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Instance spawned successfully.
Sep 30 21:39:45 compute-0 nova_compute[192810]: 2025-09-30 21:39:45.711 2 DEBUG nova.virt.libvirt.driver [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #18 from [121.15.157.194]:48178 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #18 from [121.15.157.194]:48168 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #18 from [121.15.157.194]:48172 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #18 from [121.15.157.194]:48174 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #18 from [121.15.157.194]:48176 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 nova_compute[192810]: 2025-09-30 21:39:45.730 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:39:45 compute-0 nova_compute[192810]: 2025-09-30 21:39:45.734 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:39:45 compute-0 nova_compute[192810]: 2025-09-30 21:39:45.737 2 DEBUG nova.virt.libvirt.driver [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:39:45 compute-0 nova_compute[192810]: 2025-09-30 21:39:45.737 2 DEBUG nova.virt.libvirt.driver [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:39:45 compute-0 nova_compute[192810]: 2025-09-30 21:39:45.737 2 DEBUG nova.virt.libvirt.driver [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:39:45 compute-0 nova_compute[192810]: 2025-09-30 21:39:45.738 2 DEBUG nova.virt.libvirt.driver [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:39:45 compute-0 nova_compute[192810]: 2025-09-30 21:39:45.738 2 DEBUG nova.virt.libvirt.driver [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:39:45 compute-0 nova_compute[192810]: 2025-09-30 21:39:45.739 2 DEBUG nova.virt.libvirt.driver [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #18 from [121.15.157.194]:48182 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 nova_compute[192810]: 2025-09-30 21:39:45.759 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:39:45 compute-0 nova_compute[192810]: 2025-09-30 21:39:45.760 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268385.7028506, 0e051d40-6705-4ff4-b581-ea9dbbdf4a26 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:39:45 compute-0 nova_compute[192810]: 2025-09-30 21:39:45.760 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] VM Paused (Lifecycle Event)
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #18 from [121.15.157.194]:48192 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #18 from [121.15.157.194]:48188 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #18 from [121.15.157.194]:48190 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #18 from [121.15.157.194]:48180 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #18 from [121.15.157.194]:48186 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #18 from [121.15.157.194]:48194 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #18 from [121.15.157.194]:48196 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 nova_compute[192810]: 2025-09-30 21:39:45.781 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #18 from [121.15.157.194]:48198 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 nova_compute[192810]: 2025-09-30 21:39:45.784 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268385.7071283, 0e051d40-6705-4ff4-b581-ea9dbbdf4a26 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:39:45 compute-0 nova_compute[192810]: 2025-09-30 21:39:45.784 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] VM Resumed (Lifecycle Event)
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #18 from [121.15.157.194]:48200 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #18 from [121.15.157.194]:48202 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #18 from [121.15.157.194]:48208 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 nova_compute[192810]: 2025-09-30 21:39:45.801 2 INFO nova.compute.manager [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Took 5.24 seconds to spawn the instance on the hypervisor.
Sep 30 21:39:45 compute-0 nova_compute[192810]: 2025-09-30 21:39:45.801 2 DEBUG nova.compute.manager [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:39:45 compute-0 nova_compute[192810]: 2025-09-30 21:39:45.807 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #18 from [121.15.157.194]:48206 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 nova_compute[192810]: 2025-09-30 21:39:45.809 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #18 from [121.15.157.194]:48212 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #18 from [121.15.157.194]:48214 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #18 from [121.15.157.194]:48204 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #18 from [121.15.157.194]:48210 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 nova_compute[192810]: 2025-09-30 21:39:45.842 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:39:45 compute-0 sshd-session[239809]: Connection closed by authenticating user root 121.15.157.194 port 46936 [preauth]
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #17 from [121.15.157.194]:48218 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #17 from [121.15.157.194]:48216 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd-session[239811]: Connection closed by authenticating user root 121.15.157.194 port 46934 [preauth]
Sep 30 21:39:45 compute-0 sshd-session[239813]: Connection closed by authenticating user root 121.15.157.194 port 46938 [preauth]
Sep 30 21:39:45 compute-0 nova_compute[192810]: 2025-09-30 21:39:45.885 2 INFO nova.compute.manager [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Took 5.77 seconds to build instance.
Sep 30 21:39:45 compute-0 sshd-session[239812]: Connection closed by authenticating user root 121.15.157.194 port 46932 [preauth]
Sep 30 21:39:45 compute-0 nova_compute[192810]: 2025-09-30 21:39:45.911 2 DEBUG oslo_concurrency.lockutils [None req-fecd1251-418d-4e91-824a-74eedf9b0d8f 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "0e051d40-6705-4ff4-b581-ea9dbbdf4a26" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.890s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #14 from [121.15.157.194]:48222 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #14 from [121.15.157.194]:48224 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #14 from [121.15.157.194]:48260 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #14 from [121.15.157.194]:48246 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #14 from [121.15.157.194]:48242 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #14 from [121.15.157.194]:48226 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #14 from [121.15.157.194]:48252 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #14 from [121.15.157.194]:48238 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #14 from [121.15.157.194]:48244 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #14 from [121.15.157.194]:48254 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #14 from [121.15.157.194]:48230 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #14 from [121.15.157.194]:48228 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #14 from [121.15.157.194]:48220 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #14 from [121.15.157.194]:48258 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #14 from [121.15.157.194]:48250 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #14 from [121.15.157.194]:48234 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #14 from [121.15.157.194]:48248 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #14 from [121.15.157.194]:48264 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #14 from [121.15.157.194]:48240 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #14 from [121.15.157.194]:48236 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #14 from [121.15.157.194]:48256 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #14 from [121.15.157.194]:48270 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #14 from [121.15.157.194]:48266 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #14 from [121.15.157.194]:48262 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #14 from [121.15.157.194]:48272 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #14 from [121.15.157.194]:48232 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #14 from [121.15.157.194]:48268 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #14 from [121.15.157.194]:48274 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #14 from [121.15.157.194]:48276 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #14 from [121.15.157.194]:48280 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #14 from [121.15.157.194]:48282 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:45 compute-0 sshd[128205]: drop connection #14 from [121.15.157.194]:48278 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #14 from [121.15.157.194]:48284 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #14 from [121.15.157.194]:48286 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #14 from [121.15.157.194]:48296 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #14 from [121.15.157.194]:48298 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #14 from [121.15.157.194]:48290 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #14 from [121.15.157.194]:48292 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #14 from [121.15.157.194]:48302 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #14 from [121.15.157.194]:48300 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #14 from [121.15.157.194]:48288 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #14 from [121.15.157.194]:48294 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #14 from [121.15.157.194]:48304 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #14 from [121.15.157.194]:48306 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #14 from [121.15.157.194]:48308 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #14 from [121.15.157.194]:48310 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #14 from [121.15.157.194]:48312 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd-session[239829]: Connection closed by authenticating user root 121.15.157.194 port 46964 [preauth]
Sep 30 21:39:46 compute-0 sshd-session[239822]: Connection closed by authenticating user root 121.15.157.194 port 46944 [preauth]
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #12 from [121.15.157.194]:48314 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #12 from [121.15.157.194]:48316 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #12 from [121.15.157.194]:48320 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #12 from [121.15.157.194]:48322 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #12 from [121.15.157.194]:48338 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #12 from [121.15.157.194]:48324 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #12 from [121.15.157.194]:48326 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #12 from [121.15.157.194]:48330 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #12 from [121.15.157.194]:48328 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #12 from [121.15.157.194]:48334 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #12 from [121.15.157.194]:48336 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #12 from [121.15.157.194]:48318 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #12 from [121.15.157.194]:48332 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #12 from [121.15.157.194]:48344 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #12 from [121.15.157.194]:48346 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #12 from [121.15.157.194]:48342 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #12 from [121.15.157.194]:48340 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #12 from [121.15.157.194]:48354 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #12 from [121.15.157.194]:48358 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #12 from [121.15.157.194]:48350 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #12 from [121.15.157.194]:48348 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #12 from [121.15.157.194]:48360 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #12 from [121.15.157.194]:48352 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #12 from [121.15.157.194]:48356 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd-session[239839]: Connection closed by authenticating user root 121.15.157.194 port 46972 [preauth]
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #11 from [121.15.157.194]:48372 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #11 from [121.15.157.194]:48378 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #11 from [121.15.157.194]:48382 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #11 from [121.15.157.194]:48386 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #11 from [121.15.157.194]:48380 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #11 from [121.15.157.194]:48388 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #11 from [121.15.157.194]:48384 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #11 from [121.15.157.194]:48364 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #11 from [121.15.157.194]:48394 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #11 from [121.15.157.194]:48390 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #11 from [121.15.157.194]:48374 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #11 from [121.15.157.194]:48376 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #11 from [121.15.157.194]:48366 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #11 from [121.15.157.194]:48368 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #11 from [121.15.157.194]:48362 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #11 from [121.15.157.194]:48370 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #11 from [121.15.157.194]:48392 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #11 from [121.15.157.194]:48396 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #11 from [121.15.157.194]:48398 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd-session[239841]: Connection closed by authenticating user root 121.15.157.194 port 46986 [preauth]
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #11 from [121.15.157.194]:48400 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd-session[239843]: Connection closed by authenticating user root 121.15.157.194 port 46988 [preauth]
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #9 from [121.15.157.194]:48402 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #9 from [121.15.157.194]:48420 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #9 from [121.15.157.194]:48412 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #9 from [121.15.157.194]:48414 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #9 from [121.15.157.194]:48416 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #9 from [121.15.157.194]:48410 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #9 from [121.15.157.194]:48408 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #9 from [121.15.157.194]:48418 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #9 from [121.15.157.194]:48404 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #9 from [121.15.157.194]:48426 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #9 from [121.15.157.194]:48424 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #9 from [121.15.157.194]:48432 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #9 from [121.15.157.194]:48428 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #9 from [121.15.157.194]:48430 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #9 from [121.15.157.194]:48438 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #9 from [121.15.157.194]:48434 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #9 from [121.15.157.194]:48452 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #9 from [121.15.157.194]:48448 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #9 from [121.15.157.194]:48450 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #9 from [121.15.157.194]:48406 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #9 from [121.15.157.194]:48440 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #9 from [121.15.157.194]:48422 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #9 from [121.15.157.194]:48436 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #9 from [121.15.157.194]:48442 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd-session[239850]: Connection closed by authenticating user root 121.15.157.194 port 47010 [preauth]
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #9 from [121.15.157.194]:48446 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd-session[239849]: Connection closed by authenticating user root 121.15.157.194 port 47002 [preauth]
Sep 30 21:39:46 compute-0 sshd-session[239851]: Connection closed by authenticating user root 121.15.157.194 port 47006 [preauth]
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #7 from [121.15.157.194]:48454 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #7 from [121.15.157.194]:48456 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #6 from [121.15.157.194]:48460 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #6 from [121.15.157.194]:48458 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #6 from [121.15.157.194]:48444 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #6 from [121.15.157.194]:48466 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #6 from [121.15.157.194]:48464 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #6 from [121.15.157.194]:48474 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #6 from [121.15.157.194]:48470 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #6 from [121.15.157.194]:48476 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #6 from [121.15.157.194]:48472 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #6 from [121.15.157.194]:48462 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #6 from [121.15.157.194]:48480 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #6 from [121.15.157.194]:48478 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #6 from [121.15.157.194]:48484 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #6 from [121.15.157.194]:48490 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #6 from [121.15.157.194]:48482 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #6 from [121.15.157.194]:48488 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd-session[239881]: Connection closed by authenticating user root 121.15.157.194 port 47018 [preauth]
Sep 30 21:39:46 compute-0 sshd-session[239847]: Connection closed by authenticating user root 121.15.157.194 port 47004 [preauth]
Sep 30 21:39:46 compute-0 sshd-session[239901]: Connection closed by authenticating user root 121.15.157.194 port 47030 [preauth]
Sep 30 21:39:46 compute-0 sshd-session[239846]: Connection closed by authenticating user root 121.15.157.194 port 47000 [preauth]
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #6 from [121.15.157.194]:48468 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #6 from [121.15.157.194]:48492 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #2 from [121.15.157.194]:48494 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #2 from [121.15.157.194]:48486 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #2 from [121.15.157.194]:48496 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #2 from [121.15.157.194]:48498 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #2 from [121.15.157.194]:48500 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd-session[239889]: Connection closed by authenticating user root 121.15.157.194 port 47020 [preauth]
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #1 from [121.15.157.194]:48502 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #1 from [121.15.157.194]:48504 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #1 from [121.15.157.194]:48520 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #1 from [121.15.157.194]:48516 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #1 from [121.15.157.194]:48526 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #1 from [121.15.157.194]:48524 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #1 from [121.15.157.194]:48528 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #1 from [121.15.157.194]:48510 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #1 from [121.15.157.194]:48514 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #1 from [121.15.157.194]:48534 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #1 from [121.15.157.194]:48532 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #1 from [121.15.157.194]:48508 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #1 from [121.15.157.194]:48506 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #1 from [121.15.157.194]:48530 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #1 from [121.15.157.194]:48518 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #1 from [121.15.157.194]:48522 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd-session[239909]: Connection closed by authenticating user root 121.15.157.194 port 47040 [preauth]
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48536 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48546 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48538 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48542 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48540 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48544 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48548 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48552 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48560 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48556 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48558 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48554 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48572 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48576 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48550 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48578 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48570 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48566 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48562 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48568 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48580 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48564 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48584 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48574 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48582 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48586 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48588 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 podman[240026]: 2025-09-30 21:39:46.766091574 +0000 UTC m=+1.492935191 container create df4461843485bc24727bbaa615c94b9ddb9453997ad4d9872e9f086fa5d93544 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48590 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48592 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48602 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48596 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48604 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48606 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48598 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48594 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48610 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48600 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48608 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48612 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48618 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48620 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48614 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48616 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48628 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48634 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48630 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48626 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48638 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48648 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48642 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48640 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48624 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48632 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48644 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48636 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48646 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48650 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48654 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48666 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48656 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48664 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48660 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48652 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48670 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48658 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48672 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48668 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48674 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48678 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48662 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48676 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48680 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48686 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48684 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:46 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48682 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48688 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48690 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48692 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48698 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48696 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48694 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48704 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48700 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48712 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48714 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48708 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48718 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48706 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48702 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48720 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48710 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48722 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48716 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48724 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48728 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48730 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48726 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48734 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48744 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48736 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48742 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48740 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48738 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48752 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48750 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48746 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48732 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48748 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 systemd[1]: Started libpod-conmon-df4461843485bc24727bbaa615c94b9ddb9453997ad4d9872e9f086fa5d93544.scope.
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48754 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48764 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48762 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48756 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48766 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48768 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48760 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48774 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48770 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48776 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48758 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48780 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48778 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48782 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48772 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0779ed567dfb4d9c42c279bc32cfa2be6cb3988b61b8e5eaab72425460a5b863/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48786 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48790 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48794 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 nova_compute[192810]: 2025-09-30 21:39:47.237 2 DEBUG nova.compute.manager [req-519a4e9a-85ed-4b0d-b600-710e94935746 req-d916313f-ba0f-4fb4-bdbb-9802ea2f4071 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Received event network-vif-plugged-090e6953-4071-4105-98e9-7794c08cf2f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:39:47 compute-0 nova_compute[192810]: 2025-09-30 21:39:47.238 2 DEBUG oslo_concurrency.lockutils [req-519a4e9a-85ed-4b0d-b600-710e94935746 req-d916313f-ba0f-4fb4-bdbb-9802ea2f4071 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "0e051d40-6705-4ff4-b581-ea9dbbdf4a26-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:47 compute-0 nova_compute[192810]: 2025-09-30 21:39:47.238 2 DEBUG oslo_concurrency.lockutils [req-519a4e9a-85ed-4b0d-b600-710e94935746 req-d916313f-ba0f-4fb4-bdbb-9802ea2f4071 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "0e051d40-6705-4ff4-b581-ea9dbbdf4a26-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:47 compute-0 nova_compute[192810]: 2025-09-30 21:39:47.238 2 DEBUG oslo_concurrency.lockutils [req-519a4e9a-85ed-4b0d-b600-710e94935746 req-d916313f-ba0f-4fb4-bdbb-9802ea2f4071 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "0e051d40-6705-4ff4-b581-ea9dbbdf4a26-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:47 compute-0 nova_compute[192810]: 2025-09-30 21:39:47.238 2 DEBUG nova.compute.manager [req-519a4e9a-85ed-4b0d-b600-710e94935746 req-d916313f-ba0f-4fb4-bdbb-9802ea2f4071 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] No waiting events found dispatching network-vif-plugged-090e6953-4071-4105-98e9-7794c08cf2f5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:39:47 compute-0 nova_compute[192810]: 2025-09-30 21:39:47.238 2 WARNING nova.compute.manager [req-519a4e9a-85ed-4b0d-b600-710e94935746 req-d916313f-ba0f-4fb4-bdbb-9802ea2f4071 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Received unexpected event network-vif-plugged-090e6953-4071-4105-98e9-7794c08cf2f5 for instance with vm_state active and task_state None.
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48788 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48784 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48796 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48798 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48792 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48800 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48804 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48802 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48806 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48812 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48826 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48820 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48810 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48832 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48828 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48830 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48836 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 nova_compute[192810]: 2025-09-30 21:39:47.312 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268372.311576, 99707edc-d882-4127-bc26-1fea0a52f9da => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:39:47 compute-0 nova_compute[192810]: 2025-09-30 21:39:47.313 2 INFO nova.compute.manager [-] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] VM Stopped (Lifecycle Event)
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48822 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48816 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48808 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48814 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48834 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48838 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48840 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48842 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48844 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48846 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48848 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48818 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 nova_compute[192810]: 2025-09-30 21:39:47.344 2 DEBUG nova.compute.manager [None req-f55be15f-6b8b-455f-91fc-f8fc2a566bce - - - - - -] [instance: 99707edc-d882-4127-bc26-1fea0a52f9da] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48824 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48850 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48854 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48856 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48852 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48860 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48862 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48864 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48858 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48866 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48868 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48870 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48872 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48876 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48874 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 podman[240026]: 2025-09-30 21:39:47.456654166 +0000 UTC m=+2.183497873 container init df4461843485bc24727bbaa615c94b9ddb9453997ad4d9872e9f086fa5d93544 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:39:47 compute-0 podman[240026]: 2025-09-30 21:39:47.462224445 +0000 UTC m=+2.189068052 container start df4461843485bc24727bbaa615c94b9ddb9453997ad4d9872e9f086fa5d93544 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Sep 30 21:39:47 compute-0 neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c[240041]: [NOTICE]   (240045) : New worker (240047) forked
Sep 30 21:39:47 compute-0 neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c[240041]: [NOTICE]   (240045) : Loading success.
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48878 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48890 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48900 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48894 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48904 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48902 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48888 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48898 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48892 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48880 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48884 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48896 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48886 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48906 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48908 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48910 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48882 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48912 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48918 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48920 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48916 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48930 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48924 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48940 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48934 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48926 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48952 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48946 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48942 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48950 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48948 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48954 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48928 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48938 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48936 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48944 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48932 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48958 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48956 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48960 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48966 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48964 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48970 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48962 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48968 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48922 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48972 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 nova_compute[192810]: 2025-09-30 21:39:47.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48974 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48986 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48982 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48980 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48978 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48992 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48988 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48984 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48990 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48976 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48994 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49000 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48996 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:48998 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49010 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49008 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49014 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49040 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49022 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49016 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49020 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49034 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49028 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49026 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49030 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49036 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49044 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49048 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49046 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49012 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49050 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49024 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49002 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49032 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49006 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49018 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49042 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49004 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49038 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49052 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49054 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49058 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49056 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49060 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49064 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49062 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49066 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49068 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49088 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49080 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49076 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49070 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49074 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49072 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49090 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49082 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49078 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49084 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49086 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49094 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49096 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49092 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49098 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49100 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:47 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49102 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49104 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49106 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49108 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49110 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49130 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49126 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49116 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49132 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49142 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49118 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49120 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49122 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49136 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49134 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49128 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49112 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49146 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49144 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49138 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49150 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49114 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49124 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49152 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49140 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49148 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49156 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49154 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49162 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49158 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49160 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49170 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49176 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49168 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49172 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49174 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49166 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49178 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49164 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49182 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49186 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49180 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49184 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49188 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49194 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49206 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49196 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49198 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49192 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49212 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49208 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49200 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49226 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49214 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49230 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49210 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49204 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49202 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49216 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49218 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49228 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49224 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49220 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49222 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 podman[240057]: 2025-09-30 21:39:48.318845526 +0000 UTC m=+0.053965176 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49234 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49238 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49232 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49236 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 podman[240056]: 2025-09-30 21:39:48.326358864 +0000 UTC m=+0.060562631 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20250923, tcib_managed=true)
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49246 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49242 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49240 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49244 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 podman[240058]: 2025-09-30 21:39:48.349550182 +0000 UTC m=+0.069523754 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49248 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49250 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49254 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49252 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49256 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49260 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49262 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49268 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49266 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49270 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49272 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49258 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49276 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49280 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49264 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49278 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49274 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49282 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49290 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49288 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49292 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49286 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49284 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49294 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49298 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49296 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49300 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49308 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49320 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49306 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49310 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49322 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49314 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49324 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49302 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49328 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49326 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49330 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49318 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49312 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49316 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49332 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49334 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49304 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49342 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49340 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49338 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49336 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49346 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49344 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49348 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49354 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49350 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49356 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49358 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49352 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49360 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49362 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49364 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49366 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49368 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49370 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49374 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49372 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49378 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49380 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49376 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49382 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49386 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49390 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49392 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49398 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49384 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49394 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49388 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49404 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49400 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49402 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49396 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49420 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49414 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49406 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49410 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49418 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49416 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49412 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49408 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49422 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49426 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49432 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49424 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49430 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49438 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49436 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49442 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49428 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49444 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49446 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49448 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49440 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49434 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49458 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49456 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49452 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49464 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49462 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49454 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49482 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49460 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49474 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49480 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49478 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49466 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49468 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49470 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49472 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49476 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49486 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49484 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 nova_compute[192810]: 2025-09-30 21:39:48.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49488 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49450 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49492 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49496 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49502 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49494 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49500 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49512 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49498 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49508 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49490 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49504 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49518 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49510 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49506 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49522 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49516 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:48 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49520 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49514 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49528 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49536 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49542 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49538 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49532 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49544 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49526 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49548 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49546 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49534 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49540 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49530 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49550 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49556 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49552 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49554 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49558 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49560 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49562 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49566 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49568 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49564 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49570 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49572 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49574 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49586 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49584 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49580 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49588 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49578 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49598 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49576 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49582 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49606 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49600 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49590 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49592 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49602 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49594 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49612 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49596 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49610 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49608 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49604 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49616 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49620 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49614 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49618 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49630 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49624 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49626 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49638 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49636 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49632 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49634 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49628 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49622 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49640 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49646 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49648 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49654 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49650 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49644 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49642 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49652 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49672 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49664 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49668 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49674 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49670 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49658 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49662 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49676 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49656 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49660 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49678 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49666 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49682 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49686 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49688 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49694 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49700 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49696 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49698 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49684 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49704 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49692 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49706 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49702 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49680 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49708 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49712 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49690 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49710 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49716 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49718 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49714 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49726 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49736 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49734 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49732 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49730 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49722 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49720 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49724 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49738 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49728 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49740 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49748 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49742 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49746 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49744 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49750 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49756 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49754 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49774 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49778 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49782 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49772 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49764 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49762 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49792 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49780 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49790 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49794 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49786 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49766 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49768 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49770 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49798 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49800 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49784 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49758 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49776 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49788 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49760 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49796 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49804 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49802 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49806 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49808 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49814 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49810 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49824 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49812 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49822 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49818 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49816 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49826 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49820 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49838 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49854 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49844 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49852 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49834 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49836 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49830 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49862 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49860 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49850 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49840 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49848 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49846 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49842 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49832 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49828 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49864 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49858 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49856 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49868 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49866 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49870 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49872 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49876 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49880 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49874 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49886 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49888 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49878 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49890 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49884 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49882 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49892 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49900 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49906 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49904 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49908 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49896 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49902 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49898 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49894 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49910 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49912 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49914 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49920 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49922 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49924 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49918 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49916 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49926 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49930 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49928 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49938 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49932 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49940 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49934 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:49 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49936 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49942 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49946 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49944 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 nova_compute[192810]: 2025-09-30 21:39:50.042 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268375.0416217, 0b1805eb-abb7-44a6-acd9-45548efbe439 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:39:50 compute-0 nova_compute[192810]: 2025-09-30 21:39:50.043 2 INFO nova.compute.manager [-] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] VM Stopped (Lifecycle Event)
Sep 30 21:39:50 compute-0 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 21:39:50 compute-0 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49954 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49958 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49950 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49966 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49962 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49970 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49968 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49952 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49948 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49972 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49964 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49960 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49974 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 nova_compute[192810]: 2025-09-30 21:39:50.098 2 DEBUG nova.compute.manager [None req-dbc623ef-aced-4cc4-bd0d-beae239cd06b - - - - - -] [instance: 0b1805eb-abb7-44a6-acd9-45548efbe439] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49976 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49956 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49980 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49986 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49978 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49988 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49984 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49982 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49990 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49994 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49996 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49992 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50002 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50000 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:49998 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50006 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50008 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50010 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50012 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50014 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50004 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50018 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50016 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50022 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50032 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50028 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50040 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50024 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50030 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50036 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50046 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50026 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50038 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50042 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50050 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50034 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50048 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50056 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50044 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50052 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50060 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50058 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50054 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50020 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50066 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50068 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50064 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50070 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50072 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50074 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50084 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50078 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50086 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50088 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50080 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50090 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50094 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50096 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50098 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50076 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50082 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50092 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50100 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50108 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50102 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50110 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50114 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50112 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50106 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50104 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50122 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50124 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50118 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50116 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50120 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50126 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50130 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50128 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50132 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50136 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50134 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50138 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50140 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50148 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50164 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50152 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50156 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50166 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50158 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50142 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50144 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50150 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50160 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50154 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50146 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50176 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50172 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50178 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50184 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50170 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50180 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50174 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50182 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50168 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50186 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50190 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50188 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50200 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50194 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50202 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50196 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50192 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50198 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50204 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50208 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50210 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50218 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50230 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50220 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50224 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50212 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50206 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50226 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50234 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50222 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50214 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50228 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50216 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50236 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50242 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50238 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50244 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50240 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50248 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50246 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50254 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50232 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50256 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50250 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50260 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50258 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50264 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50268 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50270 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50296 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50280 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50286 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50276 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50298 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50272 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50266 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50300 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50282 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50292 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50284 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50274 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50278 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50306 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50308 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50302 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50304 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50288 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50290 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50312 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50310 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50320 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50316 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50318 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50314 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50322 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50324 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50330 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50328 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50334 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50332 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50326 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50336 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:50 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50338 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50346 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50356 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50352 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50362 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50366 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50348 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50360 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50354 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50344 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50340 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50342 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50350 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50358 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50368 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50370 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50364 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50382 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50378 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50374 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50380 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50376 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50384 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50372 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50386 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50390 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50388 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50394 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50396 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50404 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50418 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50392 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50406 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50408 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50424 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50410 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50400 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50412 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50416 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50428 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50402 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50414 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50398 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50430 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50432 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50426 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50436 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50434 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50446 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50420 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50442 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50422 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50440 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50452 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50448 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50438 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50444 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50450 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50454 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50466 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50478 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50472 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50474 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50464 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50458 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50470 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50482 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50484 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50480 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50456 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50468 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50460 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50462 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50492 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50490 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50488 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50496 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50476 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50486 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50494 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50498 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50500 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50504 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50502 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50506 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50508 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50512 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50510 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50516 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50514 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50518 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50522 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50520 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50526 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50524 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 nova_compute[192810]: 2025-09-30 21:39:51.435 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50530 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50528 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50538 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50534 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50548 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50554 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50532 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50546 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50544 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50540 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50550 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50536 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50552 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50542 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50556 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50572 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50560 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50562 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50566 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50576 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50564 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50568 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50570 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50574 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50578 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50558 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50590 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50586 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50580 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50584 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50600 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50602 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50594 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 nova_compute[192810]: 2025-09-30 21:39:51.575 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Triggering sync for uuid 0e051d40-6705-4ff4-b581-ea9dbbdf4a26 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Sep 30 21:39:51 compute-0 nova_compute[192810]: 2025-09-30 21:39:51.576 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "0e051d40-6705-4ff4-b581-ea9dbbdf4a26" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:51 compute-0 nova_compute[192810]: 2025-09-30 21:39:51.576 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "0e051d40-6705-4ff4-b581-ea9dbbdf4a26" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50588 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50592 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50598 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50604 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50610 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50606 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50608 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50618 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50616 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50622 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50582 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50628 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50620 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50596 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50612 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50614 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50624 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50626 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50630 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50634 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50638 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50636 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50632 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50640 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50648 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50642 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 nova_compute[192810]: 2025-09-30 21:39:51.679 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "0e051d40-6705-4ff4-b581-ea9dbbdf4a26" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50652 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50644 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50664 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50656 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50662 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50666 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50654 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50658 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50650 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50660 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50676 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50682 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50674 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50684 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50670 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50680 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50668 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50672 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50678 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50690 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50692 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50694 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50700 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50696 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50686 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50702 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50698 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50688 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50704 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50708 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50714 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50712 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50710 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50718 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50720 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50716 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 nova_compute[192810]: 2025-09-30 21:39:51.929 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50732 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50726 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50724 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50722 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50740 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50736 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50734 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50738 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50746 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50730 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50744 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50728 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50750 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50752 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50748 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50754 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50758 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50742 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50762 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50766 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50760 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50768 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50756 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50764 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50770 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50772 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:51 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50778 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50776 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50774 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50784 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50786 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50782 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50792 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50790 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50788 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50800 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50794 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50802 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50798 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 nova_compute[192810]: 2025-09-30 21:39:52.044 2 DEBUG oslo_concurrency.lockutils [None req-a401db17-592a-4615-9c65-aa481bd7565e 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "0e051d40-6705-4ff4-b581-ea9dbbdf4a26" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:52 compute-0 nova_compute[192810]: 2025-09-30 21:39:52.044 2 DEBUG oslo_concurrency.lockutils [None req-a401db17-592a-4615-9c65-aa481bd7565e 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "0e051d40-6705-4ff4-b581-ea9dbbdf4a26" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:52 compute-0 nova_compute[192810]: 2025-09-30 21:39:52.045 2 DEBUG oslo_concurrency.lockutils [None req-a401db17-592a-4615-9c65-aa481bd7565e 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "0e051d40-6705-4ff4-b581-ea9dbbdf4a26-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:52 compute-0 nova_compute[192810]: 2025-09-30 21:39:52.045 2 DEBUG oslo_concurrency.lockutils [None req-a401db17-592a-4615-9c65-aa481bd7565e 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "0e051d40-6705-4ff4-b581-ea9dbbdf4a26-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:52 compute-0 nova_compute[192810]: 2025-09-30 21:39:52.045 2 DEBUG oslo_concurrency.lockutils [None req-a401db17-592a-4615-9c65-aa481bd7565e 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "0e051d40-6705-4ff4-b581-ea9dbbdf4a26-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50780 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 nova_compute[192810]: 2025-09-30 21:39:52.057 2 INFO nova.compute.manager [None req-a401db17-592a-4615-9c65-aa481bd7565e 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Terminating instance
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50796 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 nova_compute[192810]: 2025-09-30 21:39:52.068 2 DEBUG nova.compute.manager [None req-a401db17-592a-4615-9c65-aa481bd7565e 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50806 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50816 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50822 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50826 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50828 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50824 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50832 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50810 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50804 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50820 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50808 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50830 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50814 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50812 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50834 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 kernel: tap090e6953-40 (unregistering): left promiscuous mode
Sep 30 21:39:52 compute-0 NetworkManager[51733]: <info>  [1759268392.1014] device (tap090e6953-40): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50818 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50838 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50842 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 ovn_controller[94912]: 2025-09-30T21:39:52Z|00503|binding|INFO|Releasing lport 090e6953-4071-4105-98e9-7794c08cf2f5 from this chassis (sb_readonly=0)
Sep 30 21:39:52 compute-0 ovn_controller[94912]: 2025-09-30T21:39:52Z|00504|binding|INFO|Setting lport 090e6953-4071-4105-98e9-7794c08cf2f5 down in Southbound
Sep 30 21:39:52 compute-0 ovn_controller[94912]: 2025-09-30T21:39:52Z|00505|binding|INFO|Removing iface tap090e6953-40 ovn-installed in OVS
Sep 30 21:39:52 compute-0 nova_compute[192810]: 2025-09-30 21:39:52.114 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:52 compute-0 nova_compute[192810]: 2025-09-30 21:39:52.117 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50836 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50844 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50850 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50848 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50858 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50852 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50846 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:52.130 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d7:60:2b 10.100.0.8'], port_security=['fa:16:3e:d7:60:2b 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '0e051d40-6705-4ff4-b581-ea9dbbdf4a26', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8ad754242d964bb487a2174b2c21bcc5', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9c41899e-24c3-4632-81c5-100a69d8be81', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f4d6c701-a212-4977-9c52-b553d410c9c7, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=090e6953-4071-4105-98e9-7794c08cf2f5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:39:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:52.133 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 090e6953-4071-4105-98e9-7794c08cf2f5 in datapath 27086519-6f4c-45f9-8e5b-5b321cd6871c unbound from our chassis
Sep 30 21:39:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:52.135 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 27086519-6f4c-45f9-8e5b-5b321cd6871c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:39:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:52.136 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[54a74432-955f-4ead-a427-8f09c341230d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:52.136 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c namespace which is not needed anymore
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50854 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 nova_compute[192810]: 2025-09-30 21:39:52.141 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50840 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d00000081.scope: Deactivated successfully.
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50856 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d00000081.scope: Consumed 7.027s CPU time.
Sep 30 21:39:52 compute-0 systemd-machined[152794]: Machine qemu-64-instance-00000081 terminated.
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50870 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50862 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50868 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50860 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50872 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50864 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50866 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50874 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50876 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50878 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50880 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50888 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50886 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50892 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50890 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50894 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50884 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50896 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50898 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50902 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50882 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50900 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50904 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 nova_compute[192810]: 2025-09-30 21:39:52.284 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:52 compute-0 nova_compute[192810]: 2025-09-30 21:39:52.289 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:52 compute-0 nova_compute[192810]: 2025-09-30 21:39:52.320 2 INFO nova.virt.libvirt.driver [-] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Instance destroyed successfully.
Sep 30 21:39:52 compute-0 nova_compute[192810]: 2025-09-30 21:39:52.320 2 DEBUG nova.objects.instance [None req-a401db17-592a-4615-9c65-aa481bd7565e 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lazy-loading 'resources' on Instance uuid 0e051d40-6705-4ff4-b581-ea9dbbdf4a26 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50908 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50910 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50906 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c[240041]: [NOTICE]   (240045) : haproxy version is 2.8.14-c23fe91
Sep 30 21:39:52 compute-0 neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c[240041]: [NOTICE]   (240045) : path to executable is /usr/sbin/haproxy
Sep 30 21:39:52 compute-0 neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c[240041]: [WARNING]  (240045) : Exiting Master process...
Sep 30 21:39:52 compute-0 neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c[240041]: [ALERT]    (240045) : Current worker (240047) exited with code 143 (Terminated)
Sep 30 21:39:52 compute-0 neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c[240041]: [WARNING]  (240045) : All workers exited. Exiting... (0)
Sep 30 21:39:52 compute-0 systemd[1]: libpod-df4461843485bc24727bbaa615c94b9ddb9453997ad4d9872e9f086fa5d93544.scope: Deactivated successfully.
Sep 30 21:39:52 compute-0 nova_compute[192810]: 2025-09-30 21:39:52.348 2 DEBUG nova.virt.libvirt.vif [None req-a401db17-592a-4615-9c65-aa481bd7565e 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:202:202,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:39:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-1559695799',display_name='tempest-ServersTestJSON-server-1559695799',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-1559695799',id=129,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:39:45Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8ad754242d964bb487a2174b2c21bcc5',ramdisk_id='',reservation_id='r-bpe7tv98',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-782690373',owner_user_name='tempest-ServersTestJSON-782690373-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:39:49Z,user_data=None,user_id='30d0a975d78c4d9a8e2201afdc040092',uuid=0e051d40-6705-4ff4-b581-ea9dbbdf4a26,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "090e6953-4071-4105-98e9-7794c08cf2f5", "address": "fa:16:3e:d7:60:2b", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap090e6953-40", "ovs_interfaceid": "090e6953-4071-4105-98e9-7794c08cf2f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:39:52 compute-0 nova_compute[192810]: 2025-09-30 21:39:52.348 2 DEBUG nova.network.os_vif_util [None req-a401db17-592a-4615-9c65-aa481bd7565e 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Converting VIF {"id": "090e6953-4071-4105-98e9-7794c08cf2f5", "address": "fa:16:3e:d7:60:2b", "network": {"id": "27086519-6f4c-45f9-8e5b-5b321cd6871c", "bridge": "br-int", "label": "tempest-ServersTestJSON-937918271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8ad754242d964bb487a2174b2c21bcc5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap090e6953-40", "ovs_interfaceid": "090e6953-4071-4105-98e9-7794c08cf2f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:39:52 compute-0 podman[240144]: 2025-09-30 21:39:52.349160501 +0000 UTC m=+0.133961510 container died df4461843485bc24727bbaa615c94b9ddb9453997ad4d9872e9f086fa5d93544 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Sep 30 21:39:52 compute-0 nova_compute[192810]: 2025-09-30 21:39:52.349 2 DEBUG nova.network.os_vif_util [None req-a401db17-592a-4615-9c65-aa481bd7565e 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d7:60:2b,bridge_name='br-int',has_traffic_filtering=True,id=090e6953-4071-4105-98e9-7794c08cf2f5,network=Network(27086519-6f4c-45f9-8e5b-5b321cd6871c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap090e6953-40') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:39:52 compute-0 nova_compute[192810]: 2025-09-30 21:39:52.350 2 DEBUG os_vif [None req-a401db17-592a-4615-9c65-aa481bd7565e 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d7:60:2b,bridge_name='br-int',has_traffic_filtering=True,id=090e6953-4071-4105-98e9-7794c08cf2f5,network=Network(27086519-6f4c-45f9-8e5b-5b321cd6871c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap090e6953-40') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:39:52 compute-0 nova_compute[192810]: 2025-09-30 21:39:52.351 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:52 compute-0 nova_compute[192810]: 2025-09-30 21:39:52.351 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap090e6953-40, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:39:52 compute-0 nova_compute[192810]: 2025-09-30 21:39:52.353 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:52 compute-0 nova_compute[192810]: 2025-09-30 21:39:52.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:52 compute-0 nova_compute[192810]: 2025-09-30 21:39:52.357 2 INFO os_vif [None req-a401db17-592a-4615-9c65-aa481bd7565e 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d7:60:2b,bridge_name='br-int',has_traffic_filtering=True,id=090e6953-4071-4105-98e9-7794c08cf2f5,network=Network(27086519-6f4c-45f9-8e5b-5b321cd6871c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap090e6953-40')
Sep 30 21:39:52 compute-0 nova_compute[192810]: 2025-09-30 21:39:52.357 2 INFO nova.virt.libvirt.driver [None req-a401db17-592a-4615-9c65-aa481bd7565e 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Deleting instance files /var/lib/nova/instances/0e051d40-6705-4ff4-b581-ea9dbbdf4a26_del
Sep 30 21:39:52 compute-0 nova_compute[192810]: 2025-09-30 21:39:52.358 2 INFO nova.virt.libvirt.driver [None req-a401db17-592a-4615-9c65-aa481bd7565e 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Deletion of /var/lib/nova/instances/0e051d40-6705-4ff4-b581-ea9dbbdf4a26_del complete
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50918 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50922 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50924 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50912 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50914 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50920 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50930 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50932 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50916 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50928 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50926 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50936 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50938 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50940 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50944 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50942 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50946 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50948 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50952 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50950 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50954 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50964 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50956 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50968 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50958 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50976 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50970 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50980 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50960 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50972 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50978 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50966 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50974 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50986 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50996 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50994 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50984 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51002 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51012 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51014 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51006 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50988 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50962 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51004 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50992 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50982 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50998 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:50990 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51000 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51008 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51010 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51016 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51018 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51020 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51022 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51024 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51026 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51028 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51034 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51030 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51032 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51036 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51042 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51044 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51038 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51040 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51046 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-df4461843485bc24727bbaa615c94b9ddb9453997ad4d9872e9f086fa5d93544-userdata-shm.mount: Deactivated successfully.
Sep 30 21:39:52 compute-0 systemd[1]: var-lib-containers-storage-overlay-0779ed567dfb4d9c42c279bc32cfa2be6cb3988b61b8e5eaab72425460a5b863-merged.mount: Deactivated successfully.
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51048 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51066 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51052 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51056 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51068 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51058 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51054 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51060 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51050 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51064 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51062 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51070 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51072 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51074 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51076 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51078 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 nova_compute[192810]: 2025-09-30 21:39:52.669 2 INFO nova.compute.manager [None req-a401db17-592a-4615-9c65-aa481bd7565e 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Took 0.60 seconds to destroy the instance on the hypervisor.
Sep 30 21:39:52 compute-0 nova_compute[192810]: 2025-09-30 21:39:52.669 2 DEBUG oslo.service.loopingcall [None req-a401db17-592a-4615-9c65-aa481bd7565e 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:39:52 compute-0 nova_compute[192810]: 2025-09-30 21:39:52.670 2 DEBUG nova.compute.manager [-] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:39:52 compute-0 nova_compute[192810]: 2025-09-30 21:39:52.670 2 DEBUG nova.network.neutron [-] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:39:52 compute-0 nova_compute[192810]: 2025-09-30 21:39:52.693 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:52 compute-0 podman[240144]: 2025-09-30 21:39:52.694463958 +0000 UTC m=+0.479264957 container cleanup df4461843485bc24727bbaa615c94b9ddb9453997ad4d9872e9f086fa5d93544 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:39:52 compute-0 systemd[1]: libpod-conmon-df4461843485bc24727bbaa615c94b9ddb9453997ad4d9872e9f086fa5d93544.scope: Deactivated successfully.
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51082 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 nova_compute[192810]: 2025-09-30 21:39:52.713 2 DEBUG nova.compute.manager [req-ffc4d0b6-d066-470c-ac2c-c121e74738eb req-489337db-4830-4bf3-a67c-f6eba69ff197 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Received event network-vif-unplugged-090e6953-4071-4105-98e9-7794c08cf2f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:39:52 compute-0 nova_compute[192810]: 2025-09-30 21:39:52.714 2 DEBUG oslo_concurrency.lockutils [req-ffc4d0b6-d066-470c-ac2c-c121e74738eb req-489337db-4830-4bf3-a67c-f6eba69ff197 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "0e051d40-6705-4ff4-b581-ea9dbbdf4a26-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:52 compute-0 nova_compute[192810]: 2025-09-30 21:39:52.714 2 DEBUG oslo_concurrency.lockutils [req-ffc4d0b6-d066-470c-ac2c-c121e74738eb req-489337db-4830-4bf3-a67c-f6eba69ff197 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "0e051d40-6705-4ff4-b581-ea9dbbdf4a26-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51096 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51090 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 nova_compute[192810]: 2025-09-30 21:39:52.714 2 DEBUG oslo_concurrency.lockutils [req-ffc4d0b6-d066-470c-ac2c-c121e74738eb req-489337db-4830-4bf3-a67c-f6eba69ff197 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "0e051d40-6705-4ff4-b581-ea9dbbdf4a26-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:52 compute-0 nova_compute[192810]: 2025-09-30 21:39:52.714 2 DEBUG nova.compute.manager [req-ffc4d0b6-d066-470c-ac2c-c121e74738eb req-489337db-4830-4bf3-a67c-f6eba69ff197 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] No waiting events found dispatching network-vif-unplugged-090e6953-4071-4105-98e9-7794c08cf2f5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:39:52 compute-0 nova_compute[192810]: 2025-09-30 21:39:52.715 2 DEBUG nova.compute.manager [req-ffc4d0b6-d066-470c-ac2c-c121e74738eb req-489337db-4830-4bf3-a67c-f6eba69ff197 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Received event network-vif-unplugged-090e6953-4071-4105-98e9-7794c08cf2f5 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51092 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51088 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51094 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51086 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51080 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51084 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51098 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51100 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51110 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51114 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51106 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51112 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51104 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51116 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51120 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51118 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51108 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51102 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51124 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51126 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51122 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51130 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51128 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51136 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51140 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51150 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51142 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51144 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51146 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51138 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51148 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51134 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51152 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51132 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51352 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51354 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51364 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51360 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51362 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51358 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51356 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51366 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51380 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51382 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51404 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51374 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51406 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51394 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51402 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51384 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51390 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51418 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51422 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51372 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51408 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51400 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51416 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51370 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51386 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51388 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51398 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51396 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:52 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51378 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51424 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51414 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51430 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51420 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51392 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51432 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51428 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51426 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51410 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51434 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51436 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51438 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51440 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 podman[240191]: 2025-09-30 21:39:53.049729792 +0000 UTC m=+0.336723963 container remove df4461843485bc24727bbaa615c94b9ddb9453997ad4d9872e9f086fa5d93544 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:39:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:53.055 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[88917f6b-c6aa-4a41-84e1-b157d3a0dc9e]: (4, ('Tue Sep 30 09:39:52 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c (df4461843485bc24727bbaa615c94b9ddb9453997ad4d9872e9f086fa5d93544)\ndf4461843485bc24727bbaa615c94b9ddb9453997ad4d9872e9f086fa5d93544\nTue Sep 30 09:39:52 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c (df4461843485bc24727bbaa615c94b9ddb9453997ad4d9872e9f086fa5d93544)\ndf4461843485bc24727bbaa615c94b9ddb9453997ad4d9872e9f086fa5d93544\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:53.057 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[9876381f-e7fc-4760-8440-2a798e6e4237]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:53.057 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap27086519-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:39:53 compute-0 kernel: tap27086519-60: left promiscuous mode
Sep 30 21:39:53 compute-0 nova_compute[192810]: 2025-09-30 21:39:53.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:53 compute-0 nova_compute[192810]: 2025-09-30 21:39:53.077 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51444 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51442 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51452 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:53.078 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[92a15250-9e7e-4aaf-aebf-4297524031e4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51454 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51446 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51448 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51456 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51462 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51458 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51450 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51466 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51464 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:53.102 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[7424a3c6-1d39-4f89-8b65-73daacaf5966]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:53.104 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[684ff8c6-5377-4cf2-a1f7-72d2abc5ac8a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51468 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51470 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51472 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:53.117 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[2a433f88-7cd6-49b7-9561-6d326b04cf1b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 508039, 'reachable_time': 30362, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240207, 'error': None, 'target': 'ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:53.119 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-27086519-6f4c-45f9-8e5b-5b321cd6871c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:39:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:39:53.119 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[82297003-32d7-4e97-9a53-901ce591ed86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:39:53 compute-0 systemd[1]: run-netns-ovnmeta\x2d27086519\x2d6f4c\x2d45f9\x2d8e5b\x2d5b321cd6871c.mount: Deactivated successfully.
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51478 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51476 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51474 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51482 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51486 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51480 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51484 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51488 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51492 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51494 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51498 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51502 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51506 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51508 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51504 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51516 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51512 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51500 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51510 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51520 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51518 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51522 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51524 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51526 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51528 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51514 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51540 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51532 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51548 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51552 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51534 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51542 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51544 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51546 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51558 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51530 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51538 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51550 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51536 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51554 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51556 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51560 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51566 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51562 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51564 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51570 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51572 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51580 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51574 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51578 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51568 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51576 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51598 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51606 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51608 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51594 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51612 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51610 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51614 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51618 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51622 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51620 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51596 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51604 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51624 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51592 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51590 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51584 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51626 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51582 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51628 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51588 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51616 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51600 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51630 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51632 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51586 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51634 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51640 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51638 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51650 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51636 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51654 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51660 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51648 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51642 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51662 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51652 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51664 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51646 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51658 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51644 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51656 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51668 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51670 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51678 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51680 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51682 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51666 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51676 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51674 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51672 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51684 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51686 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51690 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51700 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51698 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51692 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51694 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51702 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51704 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51706 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51688 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51696 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51710 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51708 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51712 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51718 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51716 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51714 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 nova_compute[192810]: 2025-09-30 21:39:53.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51726 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51724 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51728 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51734 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51732 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51730 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51720 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51722 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 nova_compute[192810]: 2025-09-30 21:39:53.842 2 DEBUG nova.network.neutron [-] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51754 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51748 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51758 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51750 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51764 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51760 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51762 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51742 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51746 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51736 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51752 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51766 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51756 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51740 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51744 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51770 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51738 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51772 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51776 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51774 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51788 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51784 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51792 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51794 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51778 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51780 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51790 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51768 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51782 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51798 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51796 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51802 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51804 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51806 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51812 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51810 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51800 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51808 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51814 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51816 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51820 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51818 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51822 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51824 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51826 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51832 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51830 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51834 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51828 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51842 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51848 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51844 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51836 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51840 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51852 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51854 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51846 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51838 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:53 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51850 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 nova_compute[192810]: 2025-09-30 21:39:54.009 2 INFO nova.compute.manager [-] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Took 1.34 seconds to deallocate network for instance.
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51856 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51870 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51858 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51872 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51860 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51864 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51862 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51866 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51868 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51874 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51878 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51876 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51884 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51882 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51888 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51880 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51886 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51894 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51890 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51898 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51896 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51900 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51892 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51904 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51908 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51906 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51912 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51910 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51914 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51902 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51920 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51916 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51918 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51924 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51922 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51926 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51928 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51934 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51954 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51932 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51950 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51952 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51946 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51930 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51956 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51940 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51938 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51936 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51942 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51948 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51944 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51958 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51960 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51962 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51964 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51966 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51972 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51980 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51968 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51978 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51984 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51990 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51982 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51994 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51992 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51974 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51970 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51976 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51988 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51986 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52000 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51996 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:51998 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52002 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52008 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52010 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52018 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52024 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52028 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52038 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52004 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52006 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52030 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52026 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52036 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52020 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52016 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52034 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52032 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52022 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52040 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52012 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52042 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52044 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52058 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52052 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52056 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52048 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52050 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52054 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52046 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52064 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52068 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52066 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52074 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52072 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52070 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52060 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52062 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52078 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52082 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52080 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52084 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52076 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52086 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52088 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52092 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52090 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52094 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52096 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52098 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52102 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52104 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52100 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52106 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52108 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52110 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52114 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52130 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52136 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52134 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52128 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52126 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52138 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52122 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52132 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52144 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52140 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52150 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 nova_compute[192810]: 2025-09-30 21:39:54.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:39:54 compute-0 nova_compute[192810]: 2025-09-30 21:39:54.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52168 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52148 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52172 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52170 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52154 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52160 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52156 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52166 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52162 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52120 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52112 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52164 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52116 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52118 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52124 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 nova_compute[192810]: 2025-09-30 21:39:54.798 2 DEBUG oslo_concurrency.lockutils [None req-a401db17-592a-4615-9c65-aa481bd7565e 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:54 compute-0 nova_compute[192810]: 2025-09-30 21:39:54.798 2 DEBUG oslo_concurrency.lockutils [None req-a401db17-592a-4615-9c65-aa481bd7565e 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52158 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52142 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52152 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52178 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52176 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52174 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52184 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52188 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52182 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52146 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52180 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52190 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52186 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52194 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52200 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52222 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52216 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52198 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52206 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52224 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52210 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52202 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52214 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52218 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52226 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52228 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52192 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52220 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52204 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52212 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52208 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52230 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52232 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52234 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52236 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 nova_compute[192810]: 2025-09-30 21:39:54.980 2 DEBUG nova.compute.manager [req-71c2030b-fc33-4619-9a42-bdf0a50fcd67 req-af984b74-6ec8-4c86-ab29-5e11b483ee6a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Received event network-vif-plugged-090e6953-4071-4105-98e9-7794c08cf2f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:39:54 compute-0 nova_compute[192810]: 2025-09-30 21:39:54.981 2 DEBUG oslo_concurrency.lockutils [req-71c2030b-fc33-4619-9a42-bdf0a50fcd67 req-af984b74-6ec8-4c86-ab29-5e11b483ee6a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "0e051d40-6705-4ff4-b581-ea9dbbdf4a26-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:54 compute-0 nova_compute[192810]: 2025-09-30 21:39:54.981 2 DEBUG oslo_concurrency.lockutils [req-71c2030b-fc33-4619-9a42-bdf0a50fcd67 req-af984b74-6ec8-4c86-ab29-5e11b483ee6a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "0e051d40-6705-4ff4-b581-ea9dbbdf4a26-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:54 compute-0 nova_compute[192810]: 2025-09-30 21:39:54.982 2 DEBUG oslo_concurrency.lockutils [req-71c2030b-fc33-4619-9a42-bdf0a50fcd67 req-af984b74-6ec8-4c86-ab29-5e11b483ee6a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "0e051d40-6705-4ff4-b581-ea9dbbdf4a26-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:54 compute-0 nova_compute[192810]: 2025-09-30 21:39:54.982 2 DEBUG nova.compute.manager [req-71c2030b-fc33-4619-9a42-bdf0a50fcd67 req-af984b74-6ec8-4c86-ab29-5e11b483ee6a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] No waiting events found dispatching network-vif-plugged-090e6953-4071-4105-98e9-7794c08cf2f5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:39:54 compute-0 nova_compute[192810]: 2025-09-30 21:39:54.982 2 WARNING nova.compute.manager [req-71c2030b-fc33-4619-9a42-bdf0a50fcd67 req-af984b74-6ec8-4c86-ab29-5e11b483ee6a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Received unexpected event network-vif-plugged-090e6953-4071-4105-98e9-7794c08cf2f5 for instance with vm_state deleted and task_state None.
Sep 30 21:39:54 compute-0 nova_compute[192810]: 2025-09-30 21:39:54.983 2 DEBUG nova.compute.manager [req-71c2030b-fc33-4619-9a42-bdf0a50fcd67 req-af984b74-6ec8-4c86-ab29-5e11b483ee6a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Received event network-vif-deleted-090e6953-4071-4105-98e9-7794c08cf2f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52240 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52242 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:54 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52246 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52244 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52248 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52238 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52250 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52252 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52262 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52258 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52260 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52254 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52256 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52268 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52264 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52266 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 nova_compute[192810]: 2025-09-30 21:39:55.071 2 DEBUG nova.compute.provider_tree [None req-a401db17-592a-4615-9c65-aa481bd7565e 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52278 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52270 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52274 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52272 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52276 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52286 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52288 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52280 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52284 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52282 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52292 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52290 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52294 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52298 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52296 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52300 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 nova_compute[192810]: 2025-09-30 21:39:55.215 2 DEBUG nova.scheduler.client.report [None req-a401db17-592a-4615-9c65-aa481bd7565e 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52308 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52302 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52304 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52320 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52316 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52306 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52346 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52326 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52352 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52344 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52334 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52350 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52348 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52324 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52314 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52310 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52336 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52332 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52312 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52318 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52340 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52338 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52328 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52322 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52342 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52356 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52354 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52330 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52358 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52360 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52376 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52384 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52372 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52382 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52378 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52362 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52368 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52374 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52364 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52370 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52366 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52380 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52388 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52404 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52386 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52400 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52396 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52402 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52392 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52406 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52398 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52394 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52390 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52408 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52412 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52410 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52416 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52418 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52414 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52420 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52422 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52424 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52426 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52430 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52428 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52434 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52432 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52436 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52450 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52444 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52454 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52448 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52438 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52446 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52442 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52440 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52452 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52456 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 nova_compute[192810]: 2025-09-30 21:39:55.504 2 DEBUG oslo_concurrency.lockutils [None req-a401db17-592a-4615-9c65-aa481bd7565e 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.706s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52462 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52458 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52466 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52464 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52468 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52460 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52470 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52472 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52474 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52476 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52480 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52482 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52478 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 nova_compute[192810]: 2025-09-30 21:39:55.632 2 INFO nova.scheduler.client.report [None req-a401db17-592a-4615-9c65-aa481bd7565e 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Deleted allocations for instance 0e051d40-6705-4ff4-b581-ea9dbbdf4a26
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52490 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52488 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52484 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52494 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52496 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52498 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52504 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52500 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52486 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52512 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52518 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52514 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52492 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52520 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52522 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52506 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52524 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52532 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52530 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52516 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52508 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52510 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52502 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52526 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52540 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52534 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52536 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52538 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52528 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52544 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52546 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52542 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52548 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52550 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52564 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52558 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52568 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52566 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52554 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52556 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52562 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52552 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52572 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52574 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52570 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52560 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52576 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52578 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52584 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52580 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52586 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52582 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52594 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52596 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52590 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52600 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52592 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52588 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52602 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52598 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52608 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52604 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52606 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52610 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52614 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52612 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52616 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52618 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52620 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52622 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52624 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52626 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52628 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52630 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52632 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52634 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52636 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52646 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52650 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52638 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52648 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52640 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52642 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52652 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52654 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52656 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52644 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:55 compute-0 nova_compute[192810]: 2025-09-30 21:39:55.987 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:39:55 compute-0 nova_compute[192810]: 2025-09-30 21:39:55.987 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:39:55 compute-0 nova_compute[192810]: 2025-09-30 21:39:55.988 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:39:55 compute-0 nova_compute[192810]: 2025-09-30 21:39:55.988 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:39:55 compute-0 nova_compute[192810]: 2025-09-30 21:39:55.988 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52660 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52658 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52662 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52666 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52668 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52664 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52670 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52672 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52680 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52674 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52692 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52686 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52694 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52682 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52678 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52698 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52688 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52684 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52696 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52690 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52706 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52712 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52716 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52718 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52710 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52702 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52704 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52724 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52700 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52708 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52722 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52676 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 nova_compute[192810]: 2025-09-30 21:39:56.127 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52730 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 nova_compute[192810]: 2025-09-30 21:39:56.127 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:56 compute-0 nova_compute[192810]: 2025-09-30 21:39:56.128 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:56 compute-0 nova_compute[192810]: 2025-09-30 21:39:56.128 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52726 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52714 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52720 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52728 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52732 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52734 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52736 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52740 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52744 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52748 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52752 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52754 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52742 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52762 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52750 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52760 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52768 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52758 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52738 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52764 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52770 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52746 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52756 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52766 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52778 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52776 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52780 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52782 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52786 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52774 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52772 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52788 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52784 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52790 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52792 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52794 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 nova_compute[192810]: 2025-09-30 21:39:56.263 2 DEBUG oslo_concurrency.lockutils [None req-a401db17-592a-4615-9c65-aa481bd7565e 30d0a975d78c4d9a8e2201afdc040092 8ad754242d964bb487a2174b2c21bcc5 - - default default] Lock "0e051d40-6705-4ff4-b581-ea9dbbdf4a26" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.218s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52800 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52802 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52796 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52798 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52804 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 nova_compute[192810]: 2025-09-30 21:39:56.304 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:39:56 compute-0 nova_compute[192810]: 2025-09-30 21:39:56.305 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5649MB free_disk=73.2497444152832GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:39:56 compute-0 nova_compute[192810]: 2025-09-30 21:39:56.305 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:56 compute-0 nova_compute[192810]: 2025-09-30 21:39:56.305 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52806 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52808 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52812 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52810 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52814 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52816 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52834 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52842 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52820 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52826 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52824 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52836 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52844 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52828 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52840 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52830 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52822 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52818 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52832 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52846 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52848 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52838 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52850 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52852 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52856 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52854 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52858 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52860 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52872 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52876 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52878 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52862 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52886 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52884 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52880 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52888 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52874 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52866 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52882 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52870 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52914 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52910 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52904 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52900 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52902 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52896 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52864 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52892 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52898 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52894 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52890 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52868 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52906 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52912 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52908 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52924 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52918 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52920 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52916 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52922 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52926 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52928 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52948 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52936 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52940 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52934 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52930 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52950 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52938 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52932 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52944 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52946 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52942 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52968 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52952 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52964 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52962 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52966 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52960 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52958 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52976 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52970 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52956 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52972 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52954 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52974 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52982 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52978 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52980 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52984 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52986 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52988 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52990 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52992 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52994 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52996 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:52998 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53002 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53006 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53004 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53000 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53010 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53008 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53018 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53014 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53022 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53020 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53012 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53028 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53026 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53030 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53034 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53036 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53038 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53016 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53032 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53024 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53040 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53044 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53042 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53048 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53046 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53050 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53054 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53066 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53062 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53060 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53058 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53068 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53070 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53072 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53064 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53078 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53076 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:56 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53074 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53052 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53056 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53080 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53082 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53096 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53086 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53084 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53102 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53106 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53094 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53092 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53088 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53090 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53100 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53104 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53110 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53098 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53114 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53108 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53112 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53118 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53116 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53120 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53130 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53122 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53138 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53134 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53124 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53136 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53126 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53128 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53132 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53142 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53144 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53146 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53140 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 nova_compute[192810]: 2025-09-30 21:39:57.166 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:39:57 compute-0 nova_compute[192810]: 2025-09-30 21:39:57.167 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53158 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53166 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53170 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53162 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53154 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53174 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53156 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53178 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53152 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53148 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53176 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53160 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53164 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53168 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53172 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53150 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53180 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53184 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 nova_compute[192810]: 2025-09-30 21:39:57.200 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53188 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53192 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53190 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53196 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53198 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 nova_compute[192810]: 2025-09-30 21:39:57.260 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53194 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53202 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53206 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53204 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53210 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53200 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53208 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53214 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53216 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53222 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53224 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53218 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53220 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 nova_compute[192810]: 2025-09-30 21:39:57.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53212 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53228 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53226 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53232 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53234 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53230 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53236 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53240 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53238 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53250 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53246 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53244 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53258 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53260 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53262 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53264 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53252 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53248 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53254 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53256 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53268 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53270 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53266 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53242 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53272 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53274 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53280 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53278 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53294 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53296 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53288 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53298 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53300 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53302 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53304 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53276 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53290 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53292 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53308 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53286 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53284 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53310 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53312 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53306 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53282 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53326 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53320 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53314 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53324 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53328 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53316 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53322 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53318 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53336 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53332 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53334 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53330 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53338 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53346 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53342 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53348 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53354 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53360 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53340 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53370 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53350 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53344 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53358 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53356 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53352 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53368 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53364 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53366 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53372 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 nova_compute[192810]: 2025-09-30 21:39:57.638 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:39:57 compute-0 nova_compute[192810]: 2025-09-30 21:39:57.639 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.334s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53362 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53374 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53380 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53376 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53386 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 nova_compute[192810]: 2025-09-30 21:39:57.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53378 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53384 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53382 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53388 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53394 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53390 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53398 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53400 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53392 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53402 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53396 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53408 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53406 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53404 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53412 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53410 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53416 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53418 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53414 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53420 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53424 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53422 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53428 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53426 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53434 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53438 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53436 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53442 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53430 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53448 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53440 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53444 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53432 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53446 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53452 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53450 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53456 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53454 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53460 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53458 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53462 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53468 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53466 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53484 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53482 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53486 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53472 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53476 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53464 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53474 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53492 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53470 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53496 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53488 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53498 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53480 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53490 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53478 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53494 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53500 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53506 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53508 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53504 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53502 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:57 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53510 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53524 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53518 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53522 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53520 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53516 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53512 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53514 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53550 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53546 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53530 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53526 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53540 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53528 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53532 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53542 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53552 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53554 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53534 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53536 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53556 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53548 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53544 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53538 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53558 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53564 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53562 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53560 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53566 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53568 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53578 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53576 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53572 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53570 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53574 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53580 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53582 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53586 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53594 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53600 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53592 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53602 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53588 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53584 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53590 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53604 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53606 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53608 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53610 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53596 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53598 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53612 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53616 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53620 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53618 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53622 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53614 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53624 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53650 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53642 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53626 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53636 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53630 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53638 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53646 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53648 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53632 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53628 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53658 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53666 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53640 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53652 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53644 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53634 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53662 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53664 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53670 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53656 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53660 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53676 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53674 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53672 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53668 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53682 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53680 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53684 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53686 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53654 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53678 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53692 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53696 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53688 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53690 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53694 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53700 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53698 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53710 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53702 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 nova_compute[192810]: 2025-09-30 21:39:58.451 2 DEBUG oslo_concurrency.lockutils [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Acquiring lock "fedc9554-3e96-41d4-a454-562f44c2612c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:58 compute-0 nova_compute[192810]: 2025-09-30 21:39:58.452 2 DEBUG oslo_concurrency.lockutils [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lock "fedc9554-3e96-41d4-a454-562f44c2612c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53708 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53712 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53714 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53706 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53704 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53716 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 nova_compute[192810]: 2025-09-30 21:39:58.490 2 DEBUG nova.compute.manager [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53718 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53726 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53720 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53736 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53722 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53724 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53730 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53728 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53738 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53744 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53746 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53742 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53752 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53748 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53754 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53756 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53732 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53734 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53740 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53764 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53760 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53770 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53750 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53768 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53762 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53766 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53758 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53772 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53774 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53776 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 nova_compute[192810]: 2025-09-30 21:39:58.681 2 DEBUG oslo_concurrency.lockutils [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:39:58 compute-0 nova_compute[192810]: 2025-09-30 21:39:58.681 2 DEBUG oslo_concurrency.lockutils [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53778 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 nova_compute[192810]: 2025-09-30 21:39:58.689 2 DEBUG nova.virt.hardware [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:39:58 compute-0 nova_compute[192810]: 2025-09-30 21:39:58.690 2 INFO nova.compute.claims [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53786 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53782 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53792 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53780 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53794 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53800 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53790 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53796 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53798 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53788 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53784 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53802 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53808 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53806 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53804 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53810 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53814 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53812 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53816 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53820 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53818 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53832 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53828 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53830 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53824 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53836 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53822 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53834 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53844 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53846 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53842 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53840 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53838 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53858 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53850 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53826 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53848 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53866 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53864 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53854 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53856 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53870 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53868 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53860 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53874 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53876 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53872 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53862 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53882 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53878 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53880 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53852 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53886 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53888 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53892 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53890 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53894 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 nova_compute[192810]: 2025-09-30 21:39:58.945 2 DEBUG nova.compute.provider_tree [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53896 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53898 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53904 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53900 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53910 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53902 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53908 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53912 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53914 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53916 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53906 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53928 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53926 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53920 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53940 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53938 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:58 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53936 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53942 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53924 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53922 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53918 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53932 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53934 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53944 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53946 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53950 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53952 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53930 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53948 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53960 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53958 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53956 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53954 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53962 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 nova_compute[192810]: 2025-09-30 21:39:59.053 2 DEBUG nova.scheduler.client.report [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53964 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53966 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53970 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53972 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53968 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53978 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53980 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53990 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53988 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53982 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 nova_compute[192810]: 2025-09-30 21:39:59.155 2 DEBUG oslo_concurrency.lockutils [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.474s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:39:59 compute-0 nova_compute[192810]: 2025-09-30 21:39:59.156 2 DEBUG nova.compute.manager [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53974 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53984 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53986 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53992 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53976 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53994 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53998 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:53996 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54004 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54000 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54002 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54006 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54008 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54010 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54022 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54016 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54028 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54024 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54018 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54012 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54020 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54014 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54030 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54032 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54026 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54038 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54050 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54046 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54042 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54054 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54058 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54060 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54056 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54048 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54044 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54052 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54064 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54036 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54062 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54066 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54040 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54070 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54068 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54034 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54072 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54074 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54076 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54078 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54084 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54080 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54082 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54086 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54088 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54094 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54092 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54102 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54104 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54090 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54120 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54108 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54100 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54106 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54122 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54112 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54116 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54110 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54126 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 nova_compute[192810]: 2025-09-30 21:39:59.439 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54096 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54118 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54130 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54114 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54098 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54128 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54124 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54134 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54138 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54142 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54136 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54140 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54132 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54144 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54146 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54156 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54152 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54150 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54154 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 nova_compute[192810]: 2025-09-30 21:39:59.525 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:39:59 compute-0 nova_compute[192810]: 2025-09-30 21:39:59.525 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:39:59 compute-0 nova_compute[192810]: 2025-09-30 21:39:59.526 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54148 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54160 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54158 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54162 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54164 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54166 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 nova_compute[192810]: 2025-09-30 21:39:59.605 2 DEBUG nova.compute.manager [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:39:59 compute-0 nova_compute[192810]: 2025-09-30 21:39:59.605 2 DEBUG nova.network.neutron [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:39:59 compute-0 nova_compute[192810]: 2025-09-30 21:39:59.612 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Sep 30 21:39:59 compute-0 nova_compute[192810]: 2025-09-30 21:39:59.612 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54184 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54172 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54186 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54176 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54194 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54192 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54174 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54190 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54170 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54180 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54196 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54178 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54182 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54188 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54200 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54198 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54202 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54204 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54208 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54206 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54218 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54214 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54222 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54212 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54210 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54220 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54216 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54224 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54228 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54226 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54234 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54232 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 nova_compute[192810]: 2025-09-30 21:39:59.717 2 INFO nova.virt.libvirt.driver [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54230 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54242 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54254 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54250 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54236 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54258 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54248 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54252 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54238 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54262 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54264 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54240 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54244 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54246 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54260 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54256 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54266 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54268 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54272 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54270 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54276 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54278 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54274 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54280 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54290 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54292 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54306 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54282 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54310 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54316 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54304 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54284 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54296 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54288 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54300 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 nova_compute[192810]: 2025-09-30 21:39:59.886 2 DEBUG nova.compute.manager [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54322 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54308 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54314 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54312 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54302 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54298 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54318 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54320 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54326 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54328 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54324 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54286 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54330 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54336 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54332 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54334 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54344 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54338 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54346 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54342 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54340 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54350 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:39:59 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54352 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54348 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54354 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54356 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54358 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54362 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54366 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54360 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54368 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54370 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54364 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54386 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54378 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54390 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54384 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54382 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54388 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54376 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54380 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54374 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54372 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54392 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54396 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54406 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54402 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54394 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54408 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54410 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54414 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54398 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54404 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54412 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54400 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54426 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54428 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54418 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54422 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54436 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54438 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54424 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54432 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54420 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54416 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54434 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54430 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54450 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54454 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54456 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54444 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54448 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54446 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54442 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54452 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54460 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54440 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54458 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 nova_compute[192810]: 2025-09-30 21:40:00.224 2 DEBUG nova.policy [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a8e4a8454b4d4d049dde1e287a040dfb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c29435f306af4eebb7d6cb5bb416037d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54462 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54464 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54466 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54468 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54470 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54482 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54480 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54472 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54486 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54484 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54504 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54498 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54492 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54494 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54478 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54510 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54488 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54508 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54500 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54502 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54512 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54474 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54520 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54476 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54514 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54490 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54506 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54496 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54518 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54516 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54522 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54524 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54534 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54526 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54528 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54530 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54532 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54536 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54538 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 nova_compute[192810]: 2025-09-30 21:40:00.458 2 DEBUG nova.compute.manager [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:40:00 compute-0 nova_compute[192810]: 2025-09-30 21:40:00.459 2 DEBUG nova.virt.libvirt.driver [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:40:00 compute-0 nova_compute[192810]: 2025-09-30 21:40:00.459 2 INFO nova.virt.libvirt.driver [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Creating image(s)
Sep 30 21:40:00 compute-0 nova_compute[192810]: 2025-09-30 21:40:00.460 2 DEBUG oslo_concurrency.lockutils [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Acquiring lock "/var/lib/nova/instances/fedc9554-3e96-41d4-a454-562f44c2612c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:00 compute-0 nova_compute[192810]: 2025-09-30 21:40:00.460 2 DEBUG oslo_concurrency.lockutils [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lock "/var/lib/nova/instances/fedc9554-3e96-41d4-a454-562f44c2612c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:00 compute-0 nova_compute[192810]: 2025-09-30 21:40:00.461 2 DEBUG oslo_concurrency.lockutils [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lock "/var/lib/nova/instances/fedc9554-3e96-41d4-a454-562f44c2612c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54540 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54542 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 nova_compute[192810]: 2025-09-30 21:40:00.473 2 DEBUG oslo_concurrency.processutils [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54546 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54548 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54544 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54550 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54552 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54556 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54558 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54560 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54566 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54562 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54570 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54568 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54564 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 nova_compute[192810]: 2025-09-30 21:40:00.537 2 DEBUG oslo_concurrency.processutils [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:40:00 compute-0 nova_compute[192810]: 2025-09-30 21:40:00.538 2 DEBUG oslo_concurrency.lockutils [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:00 compute-0 nova_compute[192810]: 2025-09-30 21:40:00.539 2 DEBUG oslo_concurrency.lockutils [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54554 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54580 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54572 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54574 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54578 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54576 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 nova_compute[192810]: 2025-09-30 21:40:00.554 2 DEBUG oslo_concurrency.processutils [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54584 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54600 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54598 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54588 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54594 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54606 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54608 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54604 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 nova_compute[192810]: 2025-09-30 21:40:00.609 2 DEBUG oslo_concurrency.processutils [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54590 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54616 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54620 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 nova_compute[192810]: 2025-09-30 21:40:00.610 2 DEBUG oslo_concurrency.processutils [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/fedc9554-3e96-41d4-a454-562f44c2612c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54602 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54622 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54626 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54596 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54586 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54614 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54610 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54624 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54618 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54612 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54642 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54634 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54638 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54630 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54636 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54644 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54640 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54632 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 nova_compute[192810]: 2025-09-30 21:40:00.646 2 DEBUG oslo_concurrency.processutils [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/fedc9554-3e96-41d4-a454-562f44c2612c/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:40:00 compute-0 nova_compute[192810]: 2025-09-30 21:40:00.647 2 DEBUG oslo_concurrency.lockutils [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:00 compute-0 nova_compute[192810]: 2025-09-30 21:40:00.648 2 DEBUG oslo_concurrency.processutils [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54628 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54646 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54648 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54652 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54650 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 nova_compute[192810]: 2025-09-30 21:40:00.700 2 DEBUG oslo_concurrency.processutils [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:40:00 compute-0 nova_compute[192810]: 2025-09-30 21:40:00.701 2 DEBUG nova.virt.disk.api [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Checking if we can resize image /var/lib/nova/instances/fedc9554-3e96-41d4-a454-562f44c2612c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:40:00 compute-0 nova_compute[192810]: 2025-09-30 21:40:00.701 2 DEBUG oslo_concurrency.processutils [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fedc9554-3e96-41d4-a454-562f44c2612c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54658 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54660 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 nova_compute[192810]: 2025-09-30 21:40:00.755 2 DEBUG oslo_concurrency.processutils [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fedc9554-3e96-41d4-a454-562f44c2612c/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:40:00 compute-0 nova_compute[192810]: 2025-09-30 21:40:00.756 2 DEBUG nova.virt.disk.api [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Cannot resize image /var/lib/nova/instances/fedc9554-3e96-41d4-a454-562f44c2612c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:40:00 compute-0 nova_compute[192810]: 2025-09-30 21:40:00.757 2 DEBUG nova.objects.instance [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lazy-loading 'migration_context' on Instance uuid fedc9554-3e96-41d4-a454-562f44c2612c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:40:00 compute-0 nova_compute[192810]: 2025-09-30 21:40:00.780 2 DEBUG nova.virt.libvirt.driver [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:40:00 compute-0 nova_compute[192810]: 2025-09-30 21:40:00.781 2 DEBUG nova.virt.libvirt.driver [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Ensure instance console log exists: /var/lib/nova/instances/fedc9554-3e96-41d4-a454-562f44c2612c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:40:00 compute-0 nova_compute[192810]: 2025-09-30 21:40:00.781 2 DEBUG oslo_concurrency.lockutils [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:00 compute-0 nova_compute[192810]: 2025-09-30 21:40:00.782 2 DEBUG oslo_concurrency.lockutils [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:00 compute-0 nova_compute[192810]: 2025-09-30 21:40:00.782 2 DEBUG oslo_concurrency.lockutils [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54662 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54666 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54682 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54696 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54698 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54670 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54678 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54686 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54694 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54708 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54702 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54706 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54664 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54710 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54674 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54668 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54680 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54684 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54688 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54672 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54690 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54700 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54676 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54692 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54704 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54712 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54716 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54722 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54720 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54714 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54718 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54732 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54728 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54726 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54736 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54738 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54724 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54734 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54740 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54742 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54730 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54744 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54746 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54748 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54750 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54754 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:00 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54752 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54758 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54772 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54770 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54774 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54760 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54764 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54762 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54768 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54756 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54766 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54776 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54782 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54786 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54790 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54784 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54780 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54778 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54796 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54802 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54804 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54800 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54792 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54788 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54794 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54798 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54814 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54808 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54810 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54806 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54818 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54816 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54812 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54820 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54838 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54830 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54834 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54828 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54822 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54832 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54826 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54836 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54824 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54842 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54840 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54844 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54846 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 nova_compute[192810]: 2025-09-30 21:40:01.198 2 DEBUG nova.network.neutron [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Successfully created port: 54883783-36e2-4cce-a972-2657a0e6a1cd _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54848 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54850 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54854 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54858 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54856 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54866 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54862 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54860 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54864 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54878 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54876 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54874 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54880 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54868 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54852 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54870 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54872 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54898 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54900 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54892 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54882 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54884 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54886 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54910 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54888 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54904 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54906 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54894 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54908 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54896 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54902 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54890 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54912 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54914 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54918 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54920 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54924 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54922 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54926 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54916 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54928 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54936 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54934 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54938 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54932 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54930 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54940 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54944 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54946 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54948 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54942 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54964 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54970 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54954 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54952 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54966 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54962 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54956 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54978 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54960 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54968 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54996 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54980 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54998 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54992 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54984 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54988 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54982 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54994 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54950 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54972 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54958 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54986 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54990 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54976 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55000 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:54974 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55002 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55004 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55006 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55008 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55010 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55024 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55016 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55022 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55014 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55026 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55012 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55018 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55020 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55030 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55028 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55032 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55034 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55036 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55050 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55040 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55044 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55060 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55048 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55058 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55042 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55046 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55052 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55068 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55064 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55054 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55066 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55038 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55056 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55062 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55070 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55078 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55082 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55074 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55090 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55088 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55092 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55094 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55072 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55080 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55096 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55098 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55086 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55084 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55100 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55076 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55104 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55102 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55108 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 nova_compute[192810]: 2025-09-30 21:40:01.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55110 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55106 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55112 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55114 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55122 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55116 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55126 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55120 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55124 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55118 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55128 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55134 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55132 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55130 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55146 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55140 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55136 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55138 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55152 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55158 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55154 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55148 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55142 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55144 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55176 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55164 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55162 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55170 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55160 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55172 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55166 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55168 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55178 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55174 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55150 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55180 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55184 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:01 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55182 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55200 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55192 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55204 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55194 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55190 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55186 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55206 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55198 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55196 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55188 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55202 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55210 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55208 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55218 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55214 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55212 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55220 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55216 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55224 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55222 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55228 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55232 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55238 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55240 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55236 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55234 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55244 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55230 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55242 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55252 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55250 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55268 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55266 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55258 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55260 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55256 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55276 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55280 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55262 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55248 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55246 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55254 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55274 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55272 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55270 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55278 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55264 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55282 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55286 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55284 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55292 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55296 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55290 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55298 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55294 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55288 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55300 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55302 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 podman[240225]: 2025-09-30 21:40:02.314776021 +0000 UTC m=+0.050936890 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:40:02 compute-0 podman[240226]: 2025-09-30 21:40:02.322499304 +0000 UTC m=+0.053126525 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Sep 30 21:40:02 compute-0 podman[240224]: 2025-09-30 21:40:02.336863002 +0000 UTC m=+0.074447097 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55304 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 nova_compute[192810]: 2025-09-30 21:40:02.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55312 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55318 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55324 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55322 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55306 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55314 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55326 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55308 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55310 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55332 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55348 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55336 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55342 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55356 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55346 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55354 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55340 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55330 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55352 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55328 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55344 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55350 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55362 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55358 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55360 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55366 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55364 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55338 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55368 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55370 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55376 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55382 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55372 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55388 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55380 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55384 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55374 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55378 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55390 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55386 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55320 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55392 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55398 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55394 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55396 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55400 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55402 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55404 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55406 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55408 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55410 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55414 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55418 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55426 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55422 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55424 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55420 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55416 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55412 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55430 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55432 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55436 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55438 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55440 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55434 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55444 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55428 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55442 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55446 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55448 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55454 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55458 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55462 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55460 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55466 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55468 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55472 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55474 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55456 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55452 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55450 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55464 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 nova_compute[192810]: 2025-09-30 21:40:02.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55470 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55480 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55482 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55486 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55484 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55478 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55488 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55476 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55490 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55492 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55496 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55498 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55494 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55500 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55508 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55504 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55502 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55506 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55510 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55520 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55522 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55528 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55526 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55514 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55550 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55538 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55532 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55536 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55542 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55552 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55530 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55534 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55512 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55554 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55516 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55548 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55546 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55544 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55524 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55540 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55560 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55564 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55568 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55572 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55570 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55558 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55562 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55556 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55566 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55518 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55574 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:02 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55576 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55578 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55580 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55592 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55582 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55586 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55588 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55602 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55598 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55596 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55600 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55610 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55604 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55590 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55584 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55624 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55594 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55606 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55608 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55618 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55612 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55626 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55622 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55620 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55628 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55634 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55630 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55614 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55642 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55632 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55638 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55640 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55616 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55644 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55636 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55652 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55646 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55650 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55648 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55654 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55656 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55662 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55658 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55660 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55672 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55670 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55666 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55664 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55674 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55668 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55676 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55688 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55682 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55678 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55680 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55684 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55686 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55690 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55692 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 nova_compute[192810]: 2025-09-30 21:40:03.342 2 DEBUG nova.network.neutron [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Successfully updated port: 54883783-36e2-4cce-a972-2657a0e6a1cd _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55694 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 nova_compute[192810]: 2025-09-30 21:40:03.397 2 DEBUG oslo_concurrency.lockutils [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Acquiring lock "refresh_cache-fedc9554-3e96-41d4-a454-562f44c2612c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:40:03 compute-0 nova_compute[192810]: 2025-09-30 21:40:03.397 2 DEBUG oslo_concurrency.lockutils [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Acquired lock "refresh_cache-fedc9554-3e96-41d4-a454-562f44c2612c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:40:03 compute-0 nova_compute[192810]: 2025-09-30 21:40:03.397 2 DEBUG nova.network.neutron [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55702 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55714 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55710 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55724 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55720 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55722 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55728 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55718 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55746 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55732 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55726 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55734 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55738 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55748 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55744 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55712 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55698 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55716 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55704 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55706 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55736 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55756 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55700 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55696 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55730 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55708 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55742 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55740 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55750 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55752 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55754 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55758 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55760 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55762 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55764 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55782 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55768 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55784 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55798 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55790 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55774 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55766 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55776 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55770 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55796 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55792 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55780 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55778 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55786 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55788 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55794 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55800 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55772 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55804 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55806 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55810 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55812 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55802 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55820 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55818 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55822 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55816 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55824 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55814 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55828 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55830 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55808 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55832 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55826 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55836 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55834 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55838 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55840 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55842 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55850 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55852 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55844 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55846 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55848 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55866 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55860 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55854 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55858 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55862 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55856 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55864 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 nova_compute[192810]: 2025-09-30 21:40:03.703 2 DEBUG nova.compute.manager [req-e949e220-7339-4d13-8862-ed3f21a24d2f req-99a081fd-23f3-4f05-9568-7504ed9ef552 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Received event network-changed-54883783-36e2-4cce-a972-2657a0e6a1cd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:40:03 compute-0 nova_compute[192810]: 2025-09-30 21:40:03.703 2 DEBUG nova.compute.manager [req-e949e220-7339-4d13-8862-ed3f21a24d2f req-99a081fd-23f3-4f05-9568-7504ed9ef552 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Refreshing instance network info cache due to event network-changed-54883783-36e2-4cce-a972-2657a0e6a1cd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:40:03 compute-0 nova_compute[192810]: 2025-09-30 21:40:03.703 2 DEBUG oslo_concurrency.lockutils [req-e949e220-7339-4d13-8862-ed3f21a24d2f req-99a081fd-23f3-4f05-9568-7504ed9ef552 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-fedc9554-3e96-41d4-a454-562f44c2612c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55868 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55870 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55872 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55874 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55876 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 nova_compute[192810]: 2025-09-30 21:40:03.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55878 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 nova_compute[192810]: 2025-09-30 21:40:03.819 2 DEBUG nova.network.neutron [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55882 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55896 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55888 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55898 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55894 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55908 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55906 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55914 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55890 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55880 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55926 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55922 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55912 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55924 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55902 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55886 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55884 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55900 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55928 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55904 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55918 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55916 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55932 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55934 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55920 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55910 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55892 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55940 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55938 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55930 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55936 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55944 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55946 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55942 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55948 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55962 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55970 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55958 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55966 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55968 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55972 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55978 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55980 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55956 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55964 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55952 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55976 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55950 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55954 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55974 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55960 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55996 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55982 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55986 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55990 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55998 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56002 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:03 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55994 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56004 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56008 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56012 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56006 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56014 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56000 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55992 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55988 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:55984 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56010 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56018 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56020 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56022 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56016 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56024 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56026 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56030 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56028 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56032 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56038 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56034 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56036 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56048 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56050 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56040 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56046 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56042 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56044 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56054 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56052 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56056 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56060 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56062 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56058 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56072 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56066 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56070 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56064 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56068 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56074 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56078 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56076 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56104 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56080 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56082 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56086 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56090 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56102 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56126 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56100 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56108 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56106 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56114 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56084 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56096 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56128 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56124 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56098 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56116 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56110 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56088 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56112 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56122 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56092 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56118 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56094 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56140 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56142 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56144 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56146 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56134 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56136 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56150 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56148 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56154 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56132 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56138 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56130 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56162 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56164 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56174 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56168 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56172 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56184 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56158 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56156 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56180 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56160 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56166 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56182 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56176 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56186 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56152 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56178 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56170 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56198 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56200 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56196 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56192 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56190 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56188 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56194 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56204 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56208 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56210 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56206 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56212 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56218 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56220 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56214 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56216 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56222 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56236 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56234 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56230 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56224 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56226 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56238 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56228 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56240 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56232 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56242 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56244 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56250 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56248 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56246 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56252 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56254 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56258 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56256 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56260 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56266 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56262 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56264 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56276 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56268 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56274 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56290 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56272 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56286 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56282 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56294 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56296 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56270 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56298 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56308 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56300 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56304 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56280 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56302 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56292 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56314 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56288 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56306 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56318 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56284 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56326 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56328 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56324 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56322 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56320 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56312 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56316 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56278 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56330 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56332 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56310 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56336 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56340 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56338 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56346 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56352 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56342 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56360 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56344 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56362 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56356 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56364 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56348 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56354 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56334 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56358 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56350 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56368 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56370 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56366 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56372 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56376 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56378 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56384 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56382 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56388 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56390 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56374 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56380 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56386 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56392 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56394 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56398 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56396 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56400 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56402 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:04 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56404 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56414 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56418 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56416 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56406 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56422 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56410 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56420 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56408 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56424 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56412 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56426 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56428 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56432 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56430 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56434 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56436 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56440 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56438 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 nova_compute[192810]: 2025-09-30 21:40:05.207 2 DEBUG nova.network.neutron [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Updating instance_info_cache with network_info: [{"id": "54883783-36e2-4cce-a972-2657a0e6a1cd", "address": "fa:16:3e:2b:8a:39", "network": {"id": "9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2062261091-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c29435f306af4eebb7d6cb5bb416037d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54883783-36", "ovs_interfaceid": "54883783-36e2-4cce-a972-2657a0e6a1cd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56446 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56450 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56442 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56444 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56448 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56452 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56466 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56460 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56464 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56456 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56454 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56458 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56462 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56472 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56478 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56474 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56470 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56484 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56480 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 nova_compute[192810]: 2025-09-30 21:40:05.280 2 DEBUG oslo_concurrency.lockutils [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Releasing lock "refresh_cache-fedc9554-3e96-41d4-a454-562f44c2612c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:40:05 compute-0 nova_compute[192810]: 2025-09-30 21:40:05.280 2 DEBUG nova.compute.manager [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Instance network_info: |[{"id": "54883783-36e2-4cce-a972-2657a0e6a1cd", "address": "fa:16:3e:2b:8a:39", "network": {"id": "9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2062261091-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c29435f306af4eebb7d6cb5bb416037d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54883783-36", "ovs_interfaceid": "54883783-36e2-4cce-a972-2657a0e6a1cd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:40:05 compute-0 nova_compute[192810]: 2025-09-30 21:40:05.280 2 DEBUG oslo_concurrency.lockutils [req-e949e220-7339-4d13-8862-ed3f21a24d2f req-99a081fd-23f3-4f05-9568-7504ed9ef552 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-fedc9554-3e96-41d4-a454-562f44c2612c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:40:05 compute-0 nova_compute[192810]: 2025-09-30 21:40:05.281 2 DEBUG nova.network.neutron [req-e949e220-7339-4d13-8862-ed3f21a24d2f req-99a081fd-23f3-4f05-9568-7504ed9ef552 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Refreshing network info cache for port 54883783-36e2-4cce-a972-2657a0e6a1cd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:40:05 compute-0 nova_compute[192810]: 2025-09-30 21:40:05.283 2 DEBUG nova.virt.libvirt.driver [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Start _get_guest_xml network_info=[{"id": "54883783-36e2-4cce-a972-2657a0e6a1cd", "address": "fa:16:3e:2b:8a:39", "network": {"id": "9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2062261091-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c29435f306af4eebb7d6cb5bb416037d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54883783-36", "ovs_interfaceid": "54883783-36e2-4cce-a972-2657a0e6a1cd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56468 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56496 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 nova_compute[192810]: 2025-09-30 21:40:05.287 2 WARNING nova.virt.libvirt.driver [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56490 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56488 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56498 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56476 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56482 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 nova_compute[192810]: 2025-09-30 21:40:05.290 2 DEBUG nova.virt.libvirt.host [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56486 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56506 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 nova_compute[192810]: 2025-09-30 21:40:05.291 2 DEBUG nova.virt.libvirt.host [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:40:05 compute-0 nova_compute[192810]: 2025-09-30 21:40:05.294 2 DEBUG nova.virt.libvirt.host [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56494 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 nova_compute[192810]: 2025-09-30 21:40:05.295 2 DEBUG nova.virt.libvirt.host [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:40:05 compute-0 nova_compute[192810]: 2025-09-30 21:40:05.296 2 DEBUG nova.virt.libvirt.driver [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:40:05 compute-0 nova_compute[192810]: 2025-09-30 21:40:05.296 2 DEBUG nova.virt.hardware [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:40:05 compute-0 nova_compute[192810]: 2025-09-30 21:40:05.297 2 DEBUG nova.virt.hardware [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:40:05 compute-0 nova_compute[192810]: 2025-09-30 21:40:05.297 2 DEBUG nova.virt.hardware [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:40:05 compute-0 nova_compute[192810]: 2025-09-30 21:40:05.297 2 DEBUG nova.virt.hardware [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:40:05 compute-0 nova_compute[192810]: 2025-09-30 21:40:05.297 2 DEBUG nova.virt.hardware [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:40:05 compute-0 nova_compute[192810]: 2025-09-30 21:40:05.298 2 DEBUG nova.virt.hardware [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56510 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 nova_compute[192810]: 2025-09-30 21:40:05.298 2 DEBUG nova.virt.hardware [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:40:05 compute-0 nova_compute[192810]: 2025-09-30 21:40:05.298 2 DEBUG nova.virt.hardware [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:40:05 compute-0 nova_compute[192810]: 2025-09-30 21:40:05.298 2 DEBUG nova.virt.hardware [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:40:05 compute-0 nova_compute[192810]: 2025-09-30 21:40:05.299 2 DEBUG nova.virt.hardware [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:40:05 compute-0 nova_compute[192810]: 2025-09-30 21:40:05.299 2 DEBUG nova.virt.hardware [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56492 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56500 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 nova_compute[192810]: 2025-09-30 21:40:05.303 2 DEBUG nova.virt.libvirt.vif [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:39:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-379171970',display_name='tempest-ServerRescueNegativeTestJSON-server-379171970',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-379171970',id=131,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c29435f306af4eebb7d6cb5bb416037d',ramdisk_id='',reservation_id='r-khh1squp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-493519679',owner_user_name='tempest-ServerRescueNegativeTestJSON-493519679-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:39:59Z,user_data=None,user_id='a8e4a8454b4d4d049dde1e287a040dfb',uuid=fedc9554-3e96-41d4-a454-562f44c2612c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "54883783-36e2-4cce-a972-2657a0e6a1cd", "address": "fa:16:3e:2b:8a:39", "network": {"id": "9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2062261091-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c29435f306af4eebb7d6cb5bb416037d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54883783-36", "ovs_interfaceid": "54883783-36e2-4cce-a972-2657a0e6a1cd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:40:05 compute-0 nova_compute[192810]: 2025-09-30 21:40:05.303 2 DEBUG nova.network.os_vif_util [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Converting VIF {"id": "54883783-36e2-4cce-a972-2657a0e6a1cd", "address": "fa:16:3e:2b:8a:39", "network": {"id": "9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2062261091-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c29435f306af4eebb7d6cb5bb416037d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54883783-36", "ovs_interfaceid": "54883783-36e2-4cce-a972-2657a0e6a1cd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56504 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 nova_compute[192810]: 2025-09-30 21:40:05.304 2 DEBUG nova.network.os_vif_util [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2b:8a:39,bridge_name='br-int',has_traffic_filtering=True,id=54883783-36e2-4cce-a972-2657a0e6a1cd,network=Network(9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap54883783-36') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:40:05 compute-0 nova_compute[192810]: 2025-09-30 21:40:05.305 2 DEBUG nova.objects.instance [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lazy-loading 'pci_devices' on Instance uuid fedc9554-3e96-41d4-a454-562f44c2612c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56508 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56512 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56514 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56516 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 sshd[128205]: drop connection #0 from [121.15.157.194]:56518 on [38.102.83.69]:22 penalty: failed authentication
Sep 30 21:40:05 compute-0 nova_compute[192810]: 2025-09-30 21:40:05.347 2 DEBUG nova.virt.libvirt.driver [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:40:05 compute-0 nova_compute[192810]:   <uuid>fedc9554-3e96-41d4-a454-562f44c2612c</uuid>
Sep 30 21:40:05 compute-0 nova_compute[192810]:   <name>instance-00000083</name>
Sep 30 21:40:05 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:40:05 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:40:05 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:40:05 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:       <nova:name>tempest-ServerRescueNegativeTestJSON-server-379171970</nova:name>
Sep 30 21:40:05 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:40:05</nova:creationTime>
Sep 30 21:40:05 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:40:05 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:40:05 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:40:05 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:40:05 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:40:05 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:40:05 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:40:05 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:40:05 compute-0 nova_compute[192810]:         <nova:user uuid="a8e4a8454b4d4d049dde1e287a040dfb">tempest-ServerRescueNegativeTestJSON-493519679-project-member</nova:user>
Sep 30 21:40:05 compute-0 nova_compute[192810]:         <nova:project uuid="c29435f306af4eebb7d6cb5bb416037d">tempest-ServerRescueNegativeTestJSON-493519679</nova:project>
Sep 30 21:40:05 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:40:05 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:40:05 compute-0 nova_compute[192810]:         <nova:port uuid="54883783-36e2-4cce-a972-2657a0e6a1cd">
Sep 30 21:40:05 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:40:05 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:40:05 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:40:05 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:40:05 compute-0 nova_compute[192810]:     <system>
Sep 30 21:40:05 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:40:05 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:40:05 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:40:05 compute-0 nova_compute[192810]:       <entry name="serial">fedc9554-3e96-41d4-a454-562f44c2612c</entry>
Sep 30 21:40:05 compute-0 nova_compute[192810]:       <entry name="uuid">fedc9554-3e96-41d4-a454-562f44c2612c</entry>
Sep 30 21:40:05 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     </system>
Sep 30 21:40:05 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:40:05 compute-0 nova_compute[192810]:   <os>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:   </os>
Sep 30 21:40:05 compute-0 nova_compute[192810]:   <features>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:   </features>
Sep 30 21:40:05 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:40:05 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:40:05 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:40:05 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:40:05 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:40:05 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/fedc9554-3e96-41d4-a454-562f44c2612c/disk"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:40:05 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/fedc9554-3e96-41d4-a454-562f44c2612c/disk.config"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:40:05 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:2b:8a:39"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:       <target dev="tap54883783-36"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:40:05 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/fedc9554-3e96-41d4-a454-562f44c2612c/console.log" append="off"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     <video>
Sep 30 21:40:05 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     </video>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:40:05 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:40:05 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:40:05 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:40:05 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:40:05 compute-0 nova_compute[192810]: </domain>
Sep 30 21:40:05 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:40:05 compute-0 nova_compute[192810]: 2025-09-30 21:40:05.349 2 DEBUG nova.compute.manager [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Preparing to wait for external event network-vif-plugged-54883783-36e2-4cce-a972-2657a0e6a1cd prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:40:05 compute-0 nova_compute[192810]: 2025-09-30 21:40:05.349 2 DEBUG oslo_concurrency.lockutils [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Acquiring lock "fedc9554-3e96-41d4-a454-562f44c2612c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:05 compute-0 nova_compute[192810]: 2025-09-30 21:40:05.349 2 DEBUG oslo_concurrency.lockutils [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lock "fedc9554-3e96-41d4-a454-562f44c2612c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:05 compute-0 nova_compute[192810]: 2025-09-30 21:40:05.349 2 DEBUG oslo_concurrency.lockutils [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lock "fedc9554-3e96-41d4-a454-562f44c2612c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:05 compute-0 nova_compute[192810]: 2025-09-30 21:40:05.350 2 DEBUG nova.virt.libvirt.vif [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:39:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-379171970',display_name='tempest-ServerRescueNegativeTestJSON-server-379171970',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-379171970',id=131,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c29435f306af4eebb7d6cb5bb416037d',ramdisk_id='',reservation_id='r-khh1squp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-493519679',owner_user_name='tempest-ServerRescueNegativeTestJSON-493519679-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:39:59Z,user_data=None,user_id='a8e4a8454b4d4d049dde1e287a040dfb',uuid=fedc9554-3e96-41d4-a454-562f44c2612c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "54883783-36e2-4cce-a972-2657a0e6a1cd", "address": "fa:16:3e:2b:8a:39", "network": {"id": "9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2062261091-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c29435f306af4eebb7d6cb5bb416037d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54883783-36", "ovs_interfaceid": "54883783-36e2-4cce-a972-2657a0e6a1cd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:40:05 compute-0 nova_compute[192810]: 2025-09-30 21:40:05.350 2 DEBUG nova.network.os_vif_util [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Converting VIF {"id": "54883783-36e2-4cce-a972-2657a0e6a1cd", "address": "fa:16:3e:2b:8a:39", "network": {"id": "9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2062261091-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c29435f306af4eebb7d6cb5bb416037d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54883783-36", "ovs_interfaceid": "54883783-36e2-4cce-a972-2657a0e6a1cd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:40:05 compute-0 nova_compute[192810]: 2025-09-30 21:40:05.351 2 DEBUG nova.network.os_vif_util [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2b:8a:39,bridge_name='br-int',has_traffic_filtering=True,id=54883783-36e2-4cce-a972-2657a0e6a1cd,network=Network(9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap54883783-36') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:40:05 compute-0 nova_compute[192810]: 2025-09-30 21:40:05.351 2 DEBUG os_vif [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:8a:39,bridge_name='br-int',has_traffic_filtering=True,id=54883783-36e2-4cce-a972-2657a0e6a1cd,network=Network(9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap54883783-36') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:40:05 compute-0 nova_compute[192810]: 2025-09-30 21:40:05.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:05 compute-0 nova_compute[192810]: 2025-09-30 21:40:05.352 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:40:05 compute-0 nova_compute[192810]: 2025-09-30 21:40:05.352 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:40:05 compute-0 nova_compute[192810]: 2025-09-30 21:40:05.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:05 compute-0 nova_compute[192810]: 2025-09-30 21:40:05.354 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap54883783-36, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:40:05 compute-0 nova_compute[192810]: 2025-09-30 21:40:05.355 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap54883783-36, col_values=(('external_ids', {'iface-id': '54883783-36e2-4cce-a972-2657a0e6a1cd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2b:8a:39', 'vm-uuid': 'fedc9554-3e96-41d4-a454-562f44c2612c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:40:05 compute-0 nova_compute[192810]: 2025-09-30 21:40:05.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:05 compute-0 NetworkManager[51733]: <info>  [1759268405.3574] manager: (tap54883783-36): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/221)
Sep 30 21:40:05 compute-0 nova_compute[192810]: 2025-09-30 21:40:05.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:40:05 compute-0 nova_compute[192810]: 2025-09-30 21:40:05.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:05 compute-0 nova_compute[192810]: 2025-09-30 21:40:05.363 2 INFO os_vif [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:8a:39,bridge_name='br-int',has_traffic_filtering=True,id=54883783-36e2-4cce-a972-2657a0e6a1cd,network=Network(9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap54883783-36')
Sep 30 21:40:05 compute-0 nova_compute[192810]: 2025-09-30 21:40:05.494 2 DEBUG nova.virt.libvirt.driver [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:40:05 compute-0 nova_compute[192810]: 2025-09-30 21:40:05.494 2 DEBUG nova.virt.libvirt.driver [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:40:05 compute-0 nova_compute[192810]: 2025-09-30 21:40:05.495 2 DEBUG nova.virt.libvirt.driver [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] No VIF found with MAC fa:16:3e:2b:8a:39, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:40:05 compute-0 nova_compute[192810]: 2025-09-30 21:40:05.495 2 INFO nova.virt.libvirt.driver [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Using config drive
Sep 30 21:40:06 compute-0 nova_compute[192810]: 2025-09-30 21:40:06.471 2 INFO nova.virt.libvirt.driver [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Creating config drive at /var/lib/nova/instances/fedc9554-3e96-41d4-a454-562f44c2612c/disk.config
Sep 30 21:40:06 compute-0 nova_compute[192810]: 2025-09-30 21:40:06.480 2 DEBUG oslo_concurrency.processutils [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/fedc9554-3e96-41d4-a454-562f44c2612c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0d3abdeq execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:40:06 compute-0 nova_compute[192810]: 2025-09-30 21:40:06.611 2 DEBUG oslo_concurrency.processutils [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/fedc9554-3e96-41d4-a454-562f44c2612c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0d3abdeq" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:40:06 compute-0 kernel: tap54883783-36: entered promiscuous mode
Sep 30 21:40:06 compute-0 NetworkManager[51733]: <info>  [1759268406.6786] manager: (tap54883783-36): new Tun device (/org/freedesktop/NetworkManager/Devices/222)
Sep 30 21:40:06 compute-0 nova_compute[192810]: 2025-09-30 21:40:06.679 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:06 compute-0 ovn_controller[94912]: 2025-09-30T21:40:06Z|00506|binding|INFO|Claiming lport 54883783-36e2-4cce-a972-2657a0e6a1cd for this chassis.
Sep 30 21:40:06 compute-0 ovn_controller[94912]: 2025-09-30T21:40:06Z|00507|binding|INFO|54883783-36e2-4cce-a972-2657a0e6a1cd: Claiming fa:16:3e:2b:8a:39 10.100.0.13
Sep 30 21:40:06 compute-0 nova_compute[192810]: 2025-09-30 21:40:06.683 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:40:06.690 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2b:8a:39 10.100.0.13'], port_security=['fa:16:3e:2b:8a:39 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'fedc9554-3e96-41d4-a454-562f44c2612c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c29435f306af4eebb7d6cb5bb416037d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7a32e3a5-ee38-4fae-9fbe-f0444b488d34', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c3ff2921-853e-4756-b8d5-05a55aa79dbf, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=54883783-36e2-4cce-a972-2657a0e6a1cd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:40:06.691 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 54883783-36e2-4cce-a972-2657a0e6a1cd in datapath 9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a bound to our chassis
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:40:06.692 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:40:06.704 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[288f2149-2292-46ad-b36b-1ab4bedcf38a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:40:06.705 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9f7a3c1e-01 in ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:40:06.707 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9f7a3c1e-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:40:06.707 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[ed9145f6-8855-4ad0-acec-0888337f8a6e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:40:06.708 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[00651383-8910-4278-9bde-d70a6dacc6e0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:06 compute-0 systemd-udevd[240305]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:40:06 compute-0 systemd-machined[152794]: New machine qemu-65-instance-00000083.
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:40:06.720 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[8dcd9401-1233-4090-b6ee-a7dae4228d9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:06 compute-0 NetworkManager[51733]: <info>  [1759268406.7259] device (tap54883783-36): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:40:06 compute-0 NetworkManager[51733]: <info>  [1759268406.7270] device (tap54883783-36): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:40:06 compute-0 systemd[1]: Started Virtual Machine qemu-65-instance-00000083.
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:40:06.746 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[4f522ff5-5b42-4183-852a-517086aa93f5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:40:06.773 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[c87b54f7-90ca-49ef-8257-054dea98bd03]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:40:06.777 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a8cbd33f-84a5-4d8e-a8c4-6de1ffe94a4b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:06 compute-0 systemd-udevd[240309]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:40:06 compute-0 ovn_controller[94912]: 2025-09-30T21:40:06Z|00508|binding|INFO|Setting lport 54883783-36e2-4cce-a972-2657a0e6a1cd ovn-installed in OVS
Sep 30 21:40:06 compute-0 ovn_controller[94912]: 2025-09-30T21:40:06Z|00509|binding|INFO|Setting lport 54883783-36e2-4cce-a972-2657a0e6a1cd up in Southbound
Sep 30 21:40:06 compute-0 NetworkManager[51733]: <info>  [1759268406.7869] manager: (tap9f7a3c1e-00): new Veth device (/org/freedesktop/NetworkManager/Devices/223)
Sep 30 21:40:06 compute-0 nova_compute[192810]: 2025-09-30 21:40:06.786 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:40:06.806 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[12cd6447-74e0-4e3e-be96-1e345f20b9c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:40:06.809 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[0a95b091-b55f-4643-b0c2-287b74bc8854]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:06 compute-0 NetworkManager[51733]: <info>  [1759268406.8304] device (tap9f7a3c1e-00): carrier: link connected
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:40:06.837 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[d698138d-ae2a-4b3b-89c4-295edf65c29e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:40:06.853 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[8a50d44b-e563-47e5-b8df-04ab0d84ce8d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9f7a3c1e-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:58:db:a1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 154], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 510245, 'reachable_time': 44803, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240338, 'error': None, 'target': 'ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:40:06.867 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[ad026720-8b0d-4ecc-a015-48e0b3b0ae5a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe58:dba1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 510245, 'tstamp': 510245}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 240339, 'error': None, 'target': 'ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:40:06.880 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[47d1f4af-c1b1-4732-a9e3-35d8481a06b7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9f7a3c1e-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:58:db:a1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 154], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 510245, 'reachable_time': 44803, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 240340, 'error': None, 'target': 'ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:40:06.903 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e59effb5-9814-414f-b431-1d4ee149ec9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:40:06.956 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[1cfca145-cd03-4b98-af95-1288765c6096]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:40:06.958 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9f7a3c1e-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:40:06.958 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:40:06.958 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9f7a3c1e-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:40:06 compute-0 NetworkManager[51733]: <info>  [1759268406.9607] manager: (tap9f7a3c1e-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/224)
Sep 30 21:40:06 compute-0 nova_compute[192810]: 2025-09-30 21:40:06.960 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:06 compute-0 kernel: tap9f7a3c1e-00: entered promiscuous mode
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:40:06.964 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9f7a3c1e-00, col_values=(('external_ids', {'iface-id': 'b7e0b2bc-8210-4854-9253-4d8208499194'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:40:06 compute-0 ovn_controller[94912]: 2025-09-30T21:40:06Z|00510|binding|INFO|Releasing lport b7e0b2bc-8210-4854-9253-4d8208499194 from this chassis (sb_readonly=0)
Sep 30 21:40:06 compute-0 nova_compute[192810]: 2025-09-30 21:40:06.966 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:40:06.968 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:40:06.969 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[ba1c35c8-d02f-4e17-8dd3-b4bc29d595f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:40:06.969 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a.pid.haproxy
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID 9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:40:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:40:06.970 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a', 'env', 'PROCESS_TAG=haproxy-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:40:06 compute-0 nova_compute[192810]: 2025-09-30 21:40:06.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:07 compute-0 podman[240379]: 2025-09-30 21:40:07.289327381 +0000 UTC m=+0.046898090 container create a746e39c0b13cb7276e5fb21dab6bf193731bb29218251aa017a4bce94fe21f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0)
Sep 30 21:40:07 compute-0 nova_compute[192810]: 2025-09-30 21:40:07.320 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268392.3190134, 0e051d40-6705-4ff4-b581-ea9dbbdf4a26 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:40:07 compute-0 nova_compute[192810]: 2025-09-30 21:40:07.321 2 INFO nova.compute.manager [-] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] VM Stopped (Lifecycle Event)
Sep 30 21:40:07 compute-0 systemd[1]: Started libpod-conmon-a746e39c0b13cb7276e5fb21dab6bf193731bb29218251aa017a4bce94fe21f6.scope.
Sep 30 21:40:07 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:40:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6f8d467bda04dd6e2266647fc9a5ea3b4dcce68dea1cd9c80cc42ccc93d90804/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:40:07 compute-0 podman[240379]: 2025-09-30 21:40:07.357209783 +0000 UTC m=+0.114780512 container init a746e39c0b13cb7276e5fb21dab6bf193731bb29218251aa017a4bce94fe21f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:40:07 compute-0 podman[240379]: 2025-09-30 21:40:07.264090932 +0000 UTC m=+0.021661671 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:40:07 compute-0 podman[240379]: 2025-09-30 21:40:07.363009637 +0000 UTC m=+0.120580346 container start a746e39c0b13cb7276e5fb21dab6bf193731bb29218251aa017a4bce94fe21f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923)
Sep 30 21:40:07 compute-0 neutron-haproxy-ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a[240394]: [NOTICE]   (240398) : New worker (240400) forked
Sep 30 21:40:07 compute-0 neutron-haproxy-ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a[240394]: [NOTICE]   (240398) : Loading success.
Sep 30 21:40:07 compute-0 nova_compute[192810]: 2025-09-30 21:40:07.396 2 DEBUG nova.compute.manager [None req-01747330-4893-49c8-9d9e-afc677f52c12 - - - - - -] [instance: 0e051d40-6705-4ff4-b581-ea9dbbdf4a26] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:40:07 compute-0 nova_compute[192810]: 2025-09-30 21:40:07.489 2 DEBUG nova.network.neutron [req-e949e220-7339-4d13-8862-ed3f21a24d2f req-99a081fd-23f3-4f05-9568-7504ed9ef552 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Updated VIF entry in instance network info cache for port 54883783-36e2-4cce-a972-2657a0e6a1cd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:40:07 compute-0 nova_compute[192810]: 2025-09-30 21:40:07.489 2 DEBUG nova.network.neutron [req-e949e220-7339-4d13-8862-ed3f21a24d2f req-99a081fd-23f3-4f05-9568-7504ed9ef552 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Updating instance_info_cache with network_info: [{"id": "54883783-36e2-4cce-a972-2657a0e6a1cd", "address": "fa:16:3e:2b:8a:39", "network": {"id": "9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2062261091-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c29435f306af4eebb7d6cb5bb416037d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54883783-36", "ovs_interfaceid": "54883783-36e2-4cce-a972-2657a0e6a1cd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:40:07 compute-0 nova_compute[192810]: 2025-09-30 21:40:07.578 2 DEBUG oslo_concurrency.lockutils [req-e949e220-7339-4d13-8862-ed3f21a24d2f req-99a081fd-23f3-4f05-9568-7504ed9ef552 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-fedc9554-3e96-41d4-a454-562f44c2612c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:40:07 compute-0 nova_compute[192810]: 2025-09-30 21:40:07.579 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268407.5769641, fedc9554-3e96-41d4-a454-562f44c2612c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:40:07 compute-0 nova_compute[192810]: 2025-09-30 21:40:07.579 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] VM Started (Lifecycle Event)
Sep 30 21:40:07 compute-0 nova_compute[192810]: 2025-09-30 21:40:07.628 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:40:07 compute-0 nova_compute[192810]: 2025-09-30 21:40:07.632 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268407.5771785, fedc9554-3e96-41d4-a454-562f44c2612c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:40:07 compute-0 nova_compute[192810]: 2025-09-30 21:40:07.633 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] VM Paused (Lifecycle Event)
Sep 30 21:40:07 compute-0 nova_compute[192810]: 2025-09-30 21:40:07.655 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:40:07 compute-0 nova_compute[192810]: 2025-09-30 21:40:07.658 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:40:07 compute-0 nova_compute[192810]: 2025-09-30 21:40:07.689 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:40:07 compute-0 nova_compute[192810]: 2025-09-30 21:40:07.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:07 compute-0 nova_compute[192810]: 2025-09-30 21:40:07.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:40:08 compute-0 nova_compute[192810]: 2025-09-30 21:40:08.778 2 DEBUG nova.compute.manager [req-6cc232a4-49a1-4a7b-b1b6-d56561d83c15 req-4398469d-0ef4-4884-a387-2366ae5ec327 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Received event network-vif-plugged-54883783-36e2-4cce-a972-2657a0e6a1cd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:40:08 compute-0 nova_compute[192810]: 2025-09-30 21:40:08.778 2 DEBUG oslo_concurrency.lockutils [req-6cc232a4-49a1-4a7b-b1b6-d56561d83c15 req-4398469d-0ef4-4884-a387-2366ae5ec327 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "fedc9554-3e96-41d4-a454-562f44c2612c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:08 compute-0 nova_compute[192810]: 2025-09-30 21:40:08.779 2 DEBUG oslo_concurrency.lockutils [req-6cc232a4-49a1-4a7b-b1b6-d56561d83c15 req-4398469d-0ef4-4884-a387-2366ae5ec327 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "fedc9554-3e96-41d4-a454-562f44c2612c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:08 compute-0 nova_compute[192810]: 2025-09-30 21:40:08.779 2 DEBUG oslo_concurrency.lockutils [req-6cc232a4-49a1-4a7b-b1b6-d56561d83c15 req-4398469d-0ef4-4884-a387-2366ae5ec327 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "fedc9554-3e96-41d4-a454-562f44c2612c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:08 compute-0 nova_compute[192810]: 2025-09-30 21:40:08.779 2 DEBUG nova.compute.manager [req-6cc232a4-49a1-4a7b-b1b6-d56561d83c15 req-4398469d-0ef4-4884-a387-2366ae5ec327 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Processing event network-vif-plugged-54883783-36e2-4cce-a972-2657a0e6a1cd _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:40:08 compute-0 nova_compute[192810]: 2025-09-30 21:40:08.779 2 DEBUG nova.compute.manager [req-6cc232a4-49a1-4a7b-b1b6-d56561d83c15 req-4398469d-0ef4-4884-a387-2366ae5ec327 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Received event network-vif-plugged-54883783-36e2-4cce-a972-2657a0e6a1cd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:40:08 compute-0 nova_compute[192810]: 2025-09-30 21:40:08.780 2 DEBUG oslo_concurrency.lockutils [req-6cc232a4-49a1-4a7b-b1b6-d56561d83c15 req-4398469d-0ef4-4884-a387-2366ae5ec327 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "fedc9554-3e96-41d4-a454-562f44c2612c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:08 compute-0 nova_compute[192810]: 2025-09-30 21:40:08.780 2 DEBUG oslo_concurrency.lockutils [req-6cc232a4-49a1-4a7b-b1b6-d56561d83c15 req-4398469d-0ef4-4884-a387-2366ae5ec327 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "fedc9554-3e96-41d4-a454-562f44c2612c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:08 compute-0 nova_compute[192810]: 2025-09-30 21:40:08.780 2 DEBUG oslo_concurrency.lockutils [req-6cc232a4-49a1-4a7b-b1b6-d56561d83c15 req-4398469d-0ef4-4884-a387-2366ae5ec327 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "fedc9554-3e96-41d4-a454-562f44c2612c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:08 compute-0 nova_compute[192810]: 2025-09-30 21:40:08.780 2 DEBUG nova.compute.manager [req-6cc232a4-49a1-4a7b-b1b6-d56561d83c15 req-4398469d-0ef4-4884-a387-2366ae5ec327 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] No waiting events found dispatching network-vif-plugged-54883783-36e2-4cce-a972-2657a0e6a1cd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:40:08 compute-0 nova_compute[192810]: 2025-09-30 21:40:08.781 2 WARNING nova.compute.manager [req-6cc232a4-49a1-4a7b-b1b6-d56561d83c15 req-4398469d-0ef4-4884-a387-2366ae5ec327 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Received unexpected event network-vif-plugged-54883783-36e2-4cce-a972-2657a0e6a1cd for instance with vm_state building and task_state spawning.
Sep 30 21:40:08 compute-0 nova_compute[192810]: 2025-09-30 21:40:08.781 2 DEBUG nova.compute.manager [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:40:08 compute-0 nova_compute[192810]: 2025-09-30 21:40:08.783 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268408.7837088, fedc9554-3e96-41d4-a454-562f44c2612c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:40:08 compute-0 nova_compute[192810]: 2025-09-30 21:40:08.784 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] VM Resumed (Lifecycle Event)
Sep 30 21:40:08 compute-0 nova_compute[192810]: 2025-09-30 21:40:08.785 2 DEBUG nova.virt.libvirt.driver [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:40:08 compute-0 nova_compute[192810]: 2025-09-30 21:40:08.787 2 INFO nova.virt.libvirt.driver [-] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Instance spawned successfully.
Sep 30 21:40:08 compute-0 nova_compute[192810]: 2025-09-30 21:40:08.788 2 DEBUG nova.virt.libvirt.driver [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:40:08 compute-0 nova_compute[192810]: 2025-09-30 21:40:08.818 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:40:08 compute-0 nova_compute[192810]: 2025-09-30 21:40:08.822 2 DEBUG nova.virt.libvirt.driver [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:40:08 compute-0 nova_compute[192810]: 2025-09-30 21:40:08.823 2 DEBUG nova.virt.libvirt.driver [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:40:08 compute-0 nova_compute[192810]: 2025-09-30 21:40:08.823 2 DEBUG nova.virt.libvirt.driver [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:40:08 compute-0 nova_compute[192810]: 2025-09-30 21:40:08.823 2 DEBUG nova.virt.libvirt.driver [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:40:08 compute-0 nova_compute[192810]: 2025-09-30 21:40:08.824 2 DEBUG nova.virt.libvirt.driver [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:40:08 compute-0 nova_compute[192810]: 2025-09-30 21:40:08.824 2 DEBUG nova.virt.libvirt.driver [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:40:08 compute-0 nova_compute[192810]: 2025-09-30 21:40:08.829 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:40:08 compute-0 nova_compute[192810]: 2025-09-30 21:40:08.903 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:40:08 compute-0 nova_compute[192810]: 2025-09-30 21:40:08.961 2 INFO nova.compute.manager [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Took 8.50 seconds to spawn the instance on the hypervisor.
Sep 30 21:40:08 compute-0 nova_compute[192810]: 2025-09-30 21:40:08.962 2 DEBUG nova.compute.manager [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:40:09 compute-0 nova_compute[192810]: 2025-09-30 21:40:09.217 2 INFO nova.compute.manager [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Took 10.60 seconds to build instance.
Sep 30 21:40:09 compute-0 nova_compute[192810]: 2025-09-30 21:40:09.268 2 DEBUG oslo_concurrency.lockutils [None req-ca5a9d51-5586-4ca6-952b-8fb3f27bd1bb a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lock "fedc9554-3e96-41d4-a454-562f44c2612c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.816s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:10 compute-0 nova_compute[192810]: 2025-09-30 21:40:10.357 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:40:10.542 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:40:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:40:10.543 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:40:10 compute-0 nova_compute[192810]: 2025-09-30 21:40:10.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:12 compute-0 podman[240409]: 2025-09-30 21:40:12.318193754 +0000 UTC m=+0.052706495 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 21:40:12 compute-0 podman[240410]: 2025-09-30 21:40:12.327134287 +0000 UTC m=+0.060203412 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, managed_by=edpm_ansible, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7, release=1755695350, distribution-scope=public, build-date=2025-08-20T13:12:41, config_id=edpm, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Sep 30 21:40:12 compute-0 nova_compute[192810]: 2025-09-30 21:40:12.717 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:15 compute-0 nova_compute[192810]: 2025-09-30 21:40:15.360 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:40:15.545 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:40:17 compute-0 nova_compute[192810]: 2025-09-30 21:40:17.721 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:19 compute-0 podman[240455]: 2025-09-30 21:40:19.318656588 +0000 UTC m=+0.050925840 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 21:40:19 compute-0 podman[240453]: 2025-09-30 21:40:19.324498494 +0000 UTC m=+0.061517804 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 21:40:19 compute-0 podman[240454]: 2025-09-30 21:40:19.343281382 +0000 UTC m=+0.080869516 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Sep 30 21:40:20 compute-0 nova_compute[192810]: 2025-09-30 21:40:20.365 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:20 compute-0 ovn_controller[94912]: 2025-09-30T21:40:20Z|00050|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:2b:8a:39 10.100.0.13
Sep 30 21:40:20 compute-0 ovn_controller[94912]: 2025-09-30T21:40:20Z|00051|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2b:8a:39 10.100.0.13
Sep 30 21:40:22 compute-0 nova_compute[192810]: 2025-09-30 21:40:22.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:25 compute-0 nova_compute[192810]: 2025-09-30 21:40:25.370 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:27 compute-0 nova_compute[192810]: 2025-09-30 21:40:27.724 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:30 compute-0 nova_compute[192810]: 2025-09-30 21:40:30.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:32 compute-0 nova_compute[192810]: 2025-09-30 21:40:32.727 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:33 compute-0 podman[240527]: 2025-09-30 21:40:33.326250505 +0000 UTC m=+0.060217792 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Sep 30 21:40:33 compute-0 podman[240528]: 2025-09-30 21:40:33.336322536 +0000 UTC m=+0.065648367 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:40:33 compute-0 podman[240526]: 2025-09-30 21:40:33.356395697 +0000 UTC m=+0.091300127 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 21:40:34 compute-0 nova_compute[192810]: 2025-09-30 21:40:34.508 2 INFO nova.compute.manager [None req-93df4739-fbe7-45e8-81c0-b2f44843214a a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Pausing
Sep 30 21:40:34 compute-0 nova_compute[192810]: 2025-09-30 21:40:34.510 2 DEBUG nova.objects.instance [None req-93df4739-fbe7-45e8-81c0-b2f44843214a a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lazy-loading 'flavor' on Instance uuid fedc9554-3e96-41d4-a454-562f44c2612c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:40:34 compute-0 nova_compute[192810]: 2025-09-30 21:40:34.576 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268434.5760815, fedc9554-3e96-41d4-a454-562f44c2612c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:40:34 compute-0 nova_compute[192810]: 2025-09-30 21:40:34.576 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] VM Paused (Lifecycle Event)
Sep 30 21:40:34 compute-0 nova_compute[192810]: 2025-09-30 21:40:34.579 2 DEBUG nova.compute.manager [None req-93df4739-fbe7-45e8-81c0-b2f44843214a a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:40:34 compute-0 nova_compute[192810]: 2025-09-30 21:40:34.619 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:40:34 compute-0 nova_compute[192810]: 2025-09-30 21:40:34.621 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:40:34 compute-0 nova_compute[192810]: 2025-09-30 21:40:34.873 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] During sync_power_state the instance has a pending task (pausing). Skip.
Sep 30 21:40:35 compute-0 nova_compute[192810]: 2025-09-30 21:40:35.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:35 compute-0 sshd[128205]: exited MaxStartups throttling after 00:00:58, 117 connections dropped
Sep 30 21:40:36 compute-0 sshd-session[240590]: Invalid user oracle from 45.81.23.80 port 54876
Sep 30 21:40:36 compute-0 sshd-session[240590]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:40:36 compute-0 sshd-session[240590]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=45.81.23.80
Sep 30 21:40:37 compute-0 nova_compute[192810]: 2025-09-30 21:40:37.634 2 INFO nova.compute.manager [None req-58e0ef6f-50e2-46b8-8608-6661242e3cf0 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Unpausing
Sep 30 21:40:37 compute-0 nova_compute[192810]: 2025-09-30 21:40:37.635 2 DEBUG nova.objects.instance [None req-58e0ef6f-50e2-46b8-8608-6661242e3cf0 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lazy-loading 'flavor' on Instance uuid fedc9554-3e96-41d4-a454-562f44c2612c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:40:37 compute-0 nova_compute[192810]: 2025-09-30 21:40:37.706 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268437.7066758, fedc9554-3e96-41d4-a454-562f44c2612c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:40:37 compute-0 nova_compute[192810]: 2025-09-30 21:40:37.707 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] VM Resumed (Lifecycle Event)
Sep 30 21:40:37 compute-0 virtqemud[192233]: argument unsupported: QEMU guest agent is not configured
Sep 30 21:40:37 compute-0 nova_compute[192810]: 2025-09-30 21:40:37.711 2 DEBUG nova.virt.libvirt.guest [None req-58e0ef6f-50e2-46b8-8608-6661242e3cf0 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Sep 30 21:40:37 compute-0 nova_compute[192810]: 2025-09-30 21:40:37.711 2 DEBUG nova.compute.manager [None req-58e0ef6f-50e2-46b8-8608-6661242e3cf0 a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:40:37 compute-0 nova_compute[192810]: 2025-09-30 21:40:37.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:37 compute-0 nova_compute[192810]: 2025-09-30 21:40:37.751 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:40:37 compute-0 nova_compute[192810]: 2025-09-30 21:40:37.754 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: paused, current task_state: unpausing, current DB power_state: 3, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:40:37 compute-0 nova_compute[192810]: 2025-09-30 21:40:37.805 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] During sync_power_state the instance has a pending task (unpausing). Skip.
Sep 30 21:40:37 compute-0 sshd-session[240590]: Failed password for invalid user oracle from 45.81.23.80 port 54876 ssh2
Sep 30 21:40:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:40:38.746 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:40:38.746 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:40:38.747 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:39 compute-0 sshd-session[240590]: Received disconnect from 45.81.23.80 port 54876:11: Bye Bye [preauth]
Sep 30 21:40:39 compute-0 sshd-session[240590]: Disconnected from invalid user oracle 45.81.23.80 port 54876 [preauth]
Sep 30 21:40:40 compute-0 nova_compute[192810]: 2025-09-30 21:40:40.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:42 compute-0 nova_compute[192810]: 2025-09-30 21:40:42.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:42 compute-0 nova_compute[192810]: 2025-09-30 21:40:42.918 2 DEBUG oslo_concurrency.lockutils [None req-ebaedcbd-753b-49e6-a7fc-dc3f48a3818e a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Acquiring lock "fedc9554-3e96-41d4-a454-562f44c2612c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:42 compute-0 nova_compute[192810]: 2025-09-30 21:40:42.919 2 DEBUG oslo_concurrency.lockutils [None req-ebaedcbd-753b-49e6-a7fc-dc3f48a3818e a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lock "fedc9554-3e96-41d4-a454-562f44c2612c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:42 compute-0 nova_compute[192810]: 2025-09-30 21:40:42.919 2 DEBUG oslo_concurrency.lockutils [None req-ebaedcbd-753b-49e6-a7fc-dc3f48a3818e a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Acquiring lock "fedc9554-3e96-41d4-a454-562f44c2612c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:42 compute-0 nova_compute[192810]: 2025-09-30 21:40:42.919 2 DEBUG oslo_concurrency.lockutils [None req-ebaedcbd-753b-49e6-a7fc-dc3f48a3818e a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lock "fedc9554-3e96-41d4-a454-562f44c2612c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:42 compute-0 nova_compute[192810]: 2025-09-30 21:40:42.919 2 DEBUG oslo_concurrency.lockutils [None req-ebaedcbd-753b-49e6-a7fc-dc3f48a3818e a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lock "fedc9554-3e96-41d4-a454-562f44c2612c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:42 compute-0 nova_compute[192810]: 2025-09-30 21:40:42.930 2 INFO nova.compute.manager [None req-ebaedcbd-753b-49e6-a7fc-dc3f48a3818e a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Terminating instance
Sep 30 21:40:42 compute-0 nova_compute[192810]: 2025-09-30 21:40:42.942 2 DEBUG nova.compute.manager [None req-ebaedcbd-753b-49e6-a7fc-dc3f48a3818e a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:40:42 compute-0 kernel: tap54883783-36 (unregistering): left promiscuous mode
Sep 30 21:40:42 compute-0 NetworkManager[51733]: <info>  [1759268442.9681] device (tap54883783-36): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:40:42 compute-0 nova_compute[192810]: 2025-09-30 21:40:42.976 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:42 compute-0 ovn_controller[94912]: 2025-09-30T21:40:42Z|00511|binding|INFO|Releasing lport 54883783-36e2-4cce-a972-2657a0e6a1cd from this chassis (sb_readonly=0)
Sep 30 21:40:42 compute-0 ovn_controller[94912]: 2025-09-30T21:40:42Z|00512|binding|INFO|Setting lport 54883783-36e2-4cce-a972-2657a0e6a1cd down in Southbound
Sep 30 21:40:42 compute-0 ovn_controller[94912]: 2025-09-30T21:40:42Z|00513|binding|INFO|Removing iface tap54883783-36 ovn-installed in OVS
Sep 30 21:40:42 compute-0 nova_compute[192810]: 2025-09-30 21:40:42.979 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:40:42.984 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2b:8a:39 10.100.0.13'], port_security=['fa:16:3e:2b:8a:39 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'fedc9554-3e96-41d4-a454-562f44c2612c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c29435f306af4eebb7d6cb5bb416037d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7a32e3a5-ee38-4fae-9fbe-f0444b488d34', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c3ff2921-853e-4756-b8d5-05a55aa79dbf, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=54883783-36e2-4cce-a972-2657a0e6a1cd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:40:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:40:42.985 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 54883783-36e2-4cce-a972-2657a0e6a1cd in datapath 9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a unbound from our chassis
Sep 30 21:40:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:40:42.986 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:40:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:40:42.988 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[fad0d2d3-f50d-4984-905a-df7e54084a2d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:40:42.988 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a namespace which is not needed anymore
Sep 30 21:40:42 compute-0 nova_compute[192810]: 2025-09-30 21:40:42.997 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:43 compute-0 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d00000083.scope: Deactivated successfully.
Sep 30 21:40:43 compute-0 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d00000083.scope: Consumed 13.507s CPU time.
Sep 30 21:40:43 compute-0 systemd-machined[152794]: Machine qemu-65-instance-00000083 terminated.
Sep 30 21:40:43 compute-0 podman[240597]: 2025-09-30 21:40:43.076675451 +0000 UTC m=+0.066541818 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.expose-services=, name=ubi9-minimal, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, release=1755695350, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Sep 30 21:40:43 compute-0 podman[240594]: 2025-09-30 21:40:43.109271564 +0000 UTC m=+0.093159943 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:40:43 compute-0 neutron-haproxy-ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a[240394]: [NOTICE]   (240398) : haproxy version is 2.8.14-c23fe91
Sep 30 21:40:43 compute-0 neutron-haproxy-ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a[240394]: [NOTICE]   (240398) : path to executable is /usr/sbin/haproxy
Sep 30 21:40:43 compute-0 neutron-haproxy-ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a[240394]: [WARNING]  (240398) : Exiting Master process...
Sep 30 21:40:43 compute-0 neutron-haproxy-ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a[240394]: [ALERT]    (240398) : Current worker (240400) exited with code 143 (Terminated)
Sep 30 21:40:43 compute-0 neutron-haproxy-ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a[240394]: [WARNING]  (240398) : All workers exited. Exiting... (0)
Sep 30 21:40:43 compute-0 systemd[1]: libpod-a746e39c0b13cb7276e5fb21dab6bf193731bb29218251aa017a4bce94fe21f6.scope: Deactivated successfully.
Sep 30 21:40:43 compute-0 podman[240650]: 2025-09-30 21:40:43.123068248 +0000 UTC m=+0.046552592 container died a746e39c0b13cb7276e5fb21dab6bf193731bb29218251aa017a4bce94fe21f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3)
Sep 30 21:40:43 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a746e39c0b13cb7276e5fb21dab6bf193731bb29218251aa017a4bce94fe21f6-userdata-shm.mount: Deactivated successfully.
Sep 30 21:40:43 compute-0 systemd[1]: var-lib-containers-storage-overlay-6f8d467bda04dd6e2266647fc9a5ea3b4dcce68dea1cd9c80cc42ccc93d90804-merged.mount: Deactivated successfully.
Sep 30 21:40:43 compute-0 podman[240650]: 2025-09-30 21:40:43.158216874 +0000 UTC m=+0.081701208 container cleanup a746e39c0b13cb7276e5fb21dab6bf193731bb29218251aa017a4bce94fe21f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:40:43 compute-0 systemd[1]: libpod-conmon-a746e39c0b13cb7276e5fb21dab6bf193731bb29218251aa017a4bce94fe21f6.scope: Deactivated successfully.
Sep 30 21:40:43 compute-0 nova_compute[192810]: 2025-09-30 21:40:43.206 2 INFO nova.virt.libvirt.driver [-] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Instance destroyed successfully.
Sep 30 21:40:43 compute-0 nova_compute[192810]: 2025-09-30 21:40:43.207 2 DEBUG nova.objects.instance [None req-ebaedcbd-753b-49e6-a7fc-dc3f48a3818e a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lazy-loading 'resources' on Instance uuid fedc9554-3e96-41d4-a454-562f44c2612c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:40:43 compute-0 nova_compute[192810]: 2025-09-30 21:40:43.220 2 DEBUG nova.virt.libvirt.vif [None req-ebaedcbd-753b-49e6-a7fc-dc3f48a3818e a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:39:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-379171970',display_name='tempest-ServerRescueNegativeTestJSON-server-379171970',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-379171970',id=131,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:40:08Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c29435f306af4eebb7d6cb5bb416037d',ramdisk_id='',reservation_id='r-khh1squp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueNegativeTestJSON-493519679',owner_user_name='tempest-ServerRescueNegativeTestJSON-493519679-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:40:37Z,user_data=None,user_id='a8e4a8454b4d4d049dde1e287a040dfb',uuid=fedc9554-3e96-41d4-a454-562f44c2612c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "54883783-36e2-4cce-a972-2657a0e6a1cd", "address": "fa:16:3e:2b:8a:39", "network": {"id": "9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2062261091-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c29435f306af4eebb7d6cb5bb416037d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54883783-36", "ovs_interfaceid": "54883783-36e2-4cce-a972-2657a0e6a1cd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:40:43 compute-0 nova_compute[192810]: 2025-09-30 21:40:43.221 2 DEBUG nova.network.os_vif_util [None req-ebaedcbd-753b-49e6-a7fc-dc3f48a3818e a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Converting VIF {"id": "54883783-36e2-4cce-a972-2657a0e6a1cd", "address": "fa:16:3e:2b:8a:39", "network": {"id": "9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2062261091-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c29435f306af4eebb7d6cb5bb416037d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54883783-36", "ovs_interfaceid": "54883783-36e2-4cce-a972-2657a0e6a1cd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:40:43 compute-0 nova_compute[192810]: 2025-09-30 21:40:43.221 2 DEBUG nova.network.os_vif_util [None req-ebaedcbd-753b-49e6-a7fc-dc3f48a3818e a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2b:8a:39,bridge_name='br-int',has_traffic_filtering=True,id=54883783-36e2-4cce-a972-2657a0e6a1cd,network=Network(9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap54883783-36') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:40:43 compute-0 nova_compute[192810]: 2025-09-30 21:40:43.222 2 DEBUG os_vif [None req-ebaedcbd-753b-49e6-a7fc-dc3f48a3818e a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:8a:39,bridge_name='br-int',has_traffic_filtering=True,id=54883783-36e2-4cce-a972-2657a0e6a1cd,network=Network(9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap54883783-36') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:40:43 compute-0 nova_compute[192810]: 2025-09-30 21:40:43.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:43 compute-0 nova_compute[192810]: 2025-09-30 21:40:43.224 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap54883783-36, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:40:43 compute-0 nova_compute[192810]: 2025-09-30 21:40:43.225 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:43 compute-0 nova_compute[192810]: 2025-09-30 21:40:43.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:43 compute-0 nova_compute[192810]: 2025-09-30 21:40:43.230 2 INFO os_vif [None req-ebaedcbd-753b-49e6-a7fc-dc3f48a3818e a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2b:8a:39,bridge_name='br-int',has_traffic_filtering=True,id=54883783-36e2-4cce-a972-2657a0e6a1cd,network=Network(9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap54883783-36')
Sep 30 21:40:43 compute-0 nova_compute[192810]: 2025-09-30 21:40:43.230 2 INFO nova.virt.libvirt.driver [None req-ebaedcbd-753b-49e6-a7fc-dc3f48a3818e a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Deleting instance files /var/lib/nova/instances/fedc9554-3e96-41d4-a454-562f44c2612c_del
Sep 30 21:40:43 compute-0 nova_compute[192810]: 2025-09-30 21:40:43.231 2 INFO nova.virt.libvirt.driver [None req-ebaedcbd-753b-49e6-a7fc-dc3f48a3818e a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Deletion of /var/lib/nova/instances/fedc9554-3e96-41d4-a454-562f44c2612c_del complete
Sep 30 21:40:43 compute-0 podman[240696]: 2025-09-30 21:40:43.240939976 +0000 UTC m=+0.055082534 container remove a746e39c0b13cb7276e5fb21dab6bf193731bb29218251aa017a4bce94fe21f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20250923)
Sep 30 21:40:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:40:43.246 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[0464bcb5-ab39-48e9-86de-9a300912a09e]: (4, ('Tue Sep 30 09:40:43 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a (a746e39c0b13cb7276e5fb21dab6bf193731bb29218251aa017a4bce94fe21f6)\na746e39c0b13cb7276e5fb21dab6bf193731bb29218251aa017a4bce94fe21f6\nTue Sep 30 09:40:43 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a (a746e39c0b13cb7276e5fb21dab6bf193731bb29218251aa017a4bce94fe21f6)\na746e39c0b13cb7276e5fb21dab6bf193731bb29218251aa017a4bce94fe21f6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:40:43.247 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[2a6262d0-9830-46e4-bc3e-814aa960ca1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:40:43.248 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9f7a3c1e-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:40:43 compute-0 nova_compute[192810]: 2025-09-30 21:40:43.249 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:43 compute-0 kernel: tap9f7a3c1e-00: left promiscuous mode
Sep 30 21:40:43 compute-0 nova_compute[192810]: 2025-09-30 21:40:43.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:40:43.264 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[13a12e84-4877-49b5-95cc-cbfd8f8fe49e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:40:43.290 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[70bd23a1-170f-4d04-aabe-cf6b800c9eb5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:40:43.292 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[277fd345-c350-4511-8c7a-e9e57d33b4bf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:40:43.317 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[acdebd26-128a-4a77-a5f2-cadfc265edd2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 510239, 'reachable_time': 25454, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 240721, 'error': None, 'target': 'ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:40:43.319 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9f7a3c1e-01ae-4ec3-a5e2-23ef8435e53a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:40:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:40:43.319 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[63432d30-486d-42e2-9554-3d19ffee6cfd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:40:43 compute-0 systemd[1]: run-netns-ovnmeta\x2d9f7a3c1e\x2d01ae\x2d4ec3\x2da5e2\x2d23ef8435e53a.mount: Deactivated successfully.
Sep 30 21:40:43 compute-0 nova_compute[192810]: 2025-09-30 21:40:43.442 2 INFO nova.compute.manager [None req-ebaedcbd-753b-49e6-a7fc-dc3f48a3818e a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Took 0.50 seconds to destroy the instance on the hypervisor.
Sep 30 21:40:43 compute-0 nova_compute[192810]: 2025-09-30 21:40:43.443 2 DEBUG oslo.service.loopingcall [None req-ebaedcbd-753b-49e6-a7fc-dc3f48a3818e a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:40:43 compute-0 nova_compute[192810]: 2025-09-30 21:40:43.443 2 DEBUG nova.compute.manager [-] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:40:43 compute-0 nova_compute[192810]: 2025-09-30 21:40:43.443 2 DEBUG nova.network.neutron [-] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:40:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:40:43.908 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:40:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:40:43.909 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:40:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:40:43.909 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:40:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:40:43.909 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:40:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:40:43.909 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:40:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:40:43.909 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:40:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:40:43.909 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:40:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:40:43.909 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:40:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:40:43.910 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:40:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:40:43.910 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:40:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:40:43.910 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:40:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:40:43.910 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:40:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:40:43.910 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:40:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:40:43.910 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:40:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:40:43.910 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:40:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:40:43.910 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:40:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:40:43.910 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:40:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:40:43.910 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:40:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:40:43.910 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:40:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:40:43.910 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:40:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:40:43.910 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:40:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:40:43.911 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:40:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:40:43.911 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:40:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:40:43.911 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:40:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:40:43.911 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:40:44 compute-0 nova_compute[192810]: 2025-09-30 21:40:44.082 2 DEBUG nova.compute.manager [req-60cdd996-ca4b-444e-9cd9-ea298dc35246 req-284f556d-6452-4293-8c85-cc8904c34165 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Received event network-vif-unplugged-54883783-36e2-4cce-a972-2657a0e6a1cd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:40:44 compute-0 nova_compute[192810]: 2025-09-30 21:40:44.083 2 DEBUG oslo_concurrency.lockutils [req-60cdd996-ca4b-444e-9cd9-ea298dc35246 req-284f556d-6452-4293-8c85-cc8904c34165 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "fedc9554-3e96-41d4-a454-562f44c2612c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:44 compute-0 nova_compute[192810]: 2025-09-30 21:40:44.083 2 DEBUG oslo_concurrency.lockutils [req-60cdd996-ca4b-444e-9cd9-ea298dc35246 req-284f556d-6452-4293-8c85-cc8904c34165 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "fedc9554-3e96-41d4-a454-562f44c2612c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:44 compute-0 nova_compute[192810]: 2025-09-30 21:40:44.083 2 DEBUG oslo_concurrency.lockutils [req-60cdd996-ca4b-444e-9cd9-ea298dc35246 req-284f556d-6452-4293-8c85-cc8904c34165 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "fedc9554-3e96-41d4-a454-562f44c2612c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:44 compute-0 nova_compute[192810]: 2025-09-30 21:40:44.084 2 DEBUG nova.compute.manager [req-60cdd996-ca4b-444e-9cd9-ea298dc35246 req-284f556d-6452-4293-8c85-cc8904c34165 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] No waiting events found dispatching network-vif-unplugged-54883783-36e2-4cce-a972-2657a0e6a1cd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:40:44 compute-0 nova_compute[192810]: 2025-09-30 21:40:44.084 2 DEBUG nova.compute.manager [req-60cdd996-ca4b-444e-9cd9-ea298dc35246 req-284f556d-6452-4293-8c85-cc8904c34165 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Received event network-vif-unplugged-54883783-36e2-4cce-a972-2657a0e6a1cd for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:40:44 compute-0 nova_compute[192810]: 2025-09-30 21:40:44.084 2 DEBUG nova.compute.manager [req-60cdd996-ca4b-444e-9cd9-ea298dc35246 req-284f556d-6452-4293-8c85-cc8904c34165 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Received event network-vif-plugged-54883783-36e2-4cce-a972-2657a0e6a1cd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:40:44 compute-0 nova_compute[192810]: 2025-09-30 21:40:44.084 2 DEBUG oslo_concurrency.lockutils [req-60cdd996-ca4b-444e-9cd9-ea298dc35246 req-284f556d-6452-4293-8c85-cc8904c34165 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "fedc9554-3e96-41d4-a454-562f44c2612c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:44 compute-0 nova_compute[192810]: 2025-09-30 21:40:44.084 2 DEBUG oslo_concurrency.lockutils [req-60cdd996-ca4b-444e-9cd9-ea298dc35246 req-284f556d-6452-4293-8c85-cc8904c34165 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "fedc9554-3e96-41d4-a454-562f44c2612c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:44 compute-0 nova_compute[192810]: 2025-09-30 21:40:44.085 2 DEBUG oslo_concurrency.lockutils [req-60cdd996-ca4b-444e-9cd9-ea298dc35246 req-284f556d-6452-4293-8c85-cc8904c34165 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "fedc9554-3e96-41d4-a454-562f44c2612c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:44 compute-0 nova_compute[192810]: 2025-09-30 21:40:44.085 2 DEBUG nova.compute.manager [req-60cdd996-ca4b-444e-9cd9-ea298dc35246 req-284f556d-6452-4293-8c85-cc8904c34165 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] No waiting events found dispatching network-vif-plugged-54883783-36e2-4cce-a972-2657a0e6a1cd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:40:44 compute-0 nova_compute[192810]: 2025-09-30 21:40:44.085 2 WARNING nova.compute.manager [req-60cdd996-ca4b-444e-9cd9-ea298dc35246 req-284f556d-6452-4293-8c85-cc8904c34165 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Received unexpected event network-vif-plugged-54883783-36e2-4cce-a972-2657a0e6a1cd for instance with vm_state active and task_state deleting.
Sep 30 21:40:44 compute-0 nova_compute[192810]: 2025-09-30 21:40:44.451 2 DEBUG nova.network.neutron [-] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:40:44 compute-0 nova_compute[192810]: 2025-09-30 21:40:44.480 2 INFO nova.compute.manager [-] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Took 1.04 seconds to deallocate network for instance.
Sep 30 21:40:44 compute-0 nova_compute[192810]: 2025-09-30 21:40:44.923 2 DEBUG oslo_concurrency.lockutils [None req-ebaedcbd-753b-49e6-a7fc-dc3f48a3818e a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:44 compute-0 nova_compute[192810]: 2025-09-30 21:40:44.923 2 DEBUG oslo_concurrency.lockutils [None req-ebaedcbd-753b-49e6-a7fc-dc3f48a3818e a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:45 compute-0 nova_compute[192810]: 2025-09-30 21:40:45.250 2 DEBUG nova.compute.provider_tree [None req-ebaedcbd-753b-49e6-a7fc-dc3f48a3818e a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:40:45 compute-0 nova_compute[192810]: 2025-09-30 21:40:45.335 2 DEBUG nova.scheduler.client.report [None req-ebaedcbd-753b-49e6-a7fc-dc3f48a3818e a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:40:45 compute-0 nova_compute[192810]: 2025-09-30 21:40:45.337 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:45 compute-0 nova_compute[192810]: 2025-09-30 21:40:45.404 2 DEBUG oslo_concurrency.lockutils [None req-ebaedcbd-753b-49e6-a7fc-dc3f48a3818e a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.481s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:45 compute-0 nova_compute[192810]: 2025-09-30 21:40:45.436 2 INFO nova.scheduler.client.report [None req-ebaedcbd-753b-49e6-a7fc-dc3f48a3818e a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Deleted allocations for instance fedc9554-3e96-41d4-a454-562f44c2612c
Sep 30 21:40:45 compute-0 nova_compute[192810]: 2025-09-30 21:40:45.567 2 DEBUG oslo_concurrency.lockutils [None req-ebaedcbd-753b-49e6-a7fc-dc3f48a3818e a8e4a8454b4d4d049dde1e287a040dfb c29435f306af4eebb7d6cb5bb416037d - - default default] Lock "fedc9554-3e96-41d4-a454-562f44c2612c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.649s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:46 compute-0 nova_compute[192810]: 2025-09-30 21:40:46.286 2 DEBUG nova.compute.manager [req-83a1e63c-875a-48c6-a597-f668da795147 req-8c5f83e4-3d18-42a3-9e81-652541dc5091 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Received event network-vif-deleted-54883783-36e2-4cce-a972-2657a0e6a1cd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:40:47 compute-0 nova_compute[192810]: 2025-09-30 21:40:47.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:48 compute-0 nova_compute[192810]: 2025-09-30 21:40:48.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:50 compute-0 podman[240724]: 2025-09-30 21:40:50.335049575 +0000 UTC m=+0.064423907 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd)
Sep 30 21:40:50 compute-0 podman[240726]: 2025-09-30 21:40:50.358792807 +0000 UTC m=+0.081727238 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 21:40:50 compute-0 podman[240725]: 2025-09-30 21:40:50.359811612 +0000 UTC m=+0.084866916 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=iscsid, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2)
Sep 30 21:40:52 compute-0 nova_compute[192810]: 2025-09-30 21:40:52.734 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:53 compute-0 nova_compute[192810]: 2025-09-30 21:40:53.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:53 compute-0 nova_compute[192810]: 2025-09-30 21:40:53.873 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:40:54 compute-0 nova_compute[192810]: 2025-09-30 21:40:54.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:40:55 compute-0 nova_compute[192810]: 2025-09-30 21:40:55.786 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:40:55 compute-0 nova_compute[192810]: 2025-09-30 21:40:55.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:40:55 compute-0 nova_compute[192810]: 2025-09-30 21:40:55.787 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:40:56 compute-0 nova_compute[192810]: 2025-09-30 21:40:56.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:40:57 compute-0 nova_compute[192810]: 2025-09-30 21:40:57.217 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:57 compute-0 nova_compute[192810]: 2025-09-30 21:40:57.218 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:57 compute-0 nova_compute[192810]: 2025-09-30 21:40:57.218 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:57 compute-0 nova_compute[192810]: 2025-09-30 21:40:57.218 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:40:57 compute-0 nova_compute[192810]: 2025-09-30 21:40:57.429 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:40:57 compute-0 nova_compute[192810]: 2025-09-30 21:40:57.430 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5720MB free_disk=73.249267578125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:40:57 compute-0 nova_compute[192810]: 2025-09-30 21:40:57.431 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:57 compute-0 nova_compute[192810]: 2025-09-30 21:40:57.431 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:57 compute-0 nova_compute[192810]: 2025-09-30 21:40:57.715 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:40:57 compute-0 nova_compute[192810]: 2025-09-30 21:40:57.716 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:40:57 compute-0 nova_compute[192810]: 2025-09-30 21:40:57.736 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:57 compute-0 nova_compute[192810]: 2025-09-30 21:40:57.749 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:40:57 compute-0 nova_compute[192810]: 2025-09-30 21:40:57.772 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:40:57 compute-0 nova_compute[192810]: 2025-09-30 21:40:57.798 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:40:57 compute-0 nova_compute[192810]: 2025-09-30 21:40:57.799 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.368s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:58 compute-0 nova_compute[192810]: 2025-09-30 21:40:58.205 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268443.2034302, fedc9554-3e96-41d4-a454-562f44c2612c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:40:58 compute-0 nova_compute[192810]: 2025-09-30 21:40:58.205 2 INFO nova.compute.manager [-] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] VM Stopped (Lifecycle Event)
Sep 30 21:40:58 compute-0 nova_compute[192810]: 2025-09-30 21:40:58.228 2 DEBUG nova.compute.manager [None req-866c6c8f-9f2d-4f5e-8cf8-4a35f0a11f25 - - - - - -] [instance: fedc9554-3e96-41d4-a454-562f44c2612c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:40:58 compute-0 nova_compute[192810]: 2025-09-30 21:40:58.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:40:58 compute-0 nova_compute[192810]: 2025-09-30 21:40:58.689 2 DEBUG oslo_concurrency.lockutils [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Acquiring lock "bd84e045-cedd-485d-b4fd-a9a4f71f4def" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:58 compute-0 nova_compute[192810]: 2025-09-30 21:40:58.690 2 DEBUG oslo_concurrency.lockutils [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Lock "bd84e045-cedd-485d-b4fd-a9a4f71f4def" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:58 compute-0 nova_compute[192810]: 2025-09-30 21:40:58.711 2 DEBUG nova.compute.manager [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:40:58 compute-0 nova_compute[192810]: 2025-09-30 21:40:58.795 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:40:58 compute-0 nova_compute[192810]: 2025-09-30 21:40:58.796 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:40:58 compute-0 nova_compute[192810]: 2025-09-30 21:40:58.796 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:40:58 compute-0 nova_compute[192810]: 2025-09-30 21:40:58.796 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:40:58 compute-0 nova_compute[192810]: 2025-09-30 21:40:58.812 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 21:40:58 compute-0 nova_compute[192810]: 2025-09-30 21:40:58.820 2 DEBUG oslo_concurrency.lockutils [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:58 compute-0 nova_compute[192810]: 2025-09-30 21:40:58.820 2 DEBUG oslo_concurrency.lockutils [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:58 compute-0 nova_compute[192810]: 2025-09-30 21:40:58.831 2 DEBUG nova.virt.hardware [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:40:58 compute-0 nova_compute[192810]: 2025-09-30 21:40:58.832 2 INFO nova.compute.claims [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:40:58 compute-0 nova_compute[192810]: 2025-09-30 21:40:58.956 2 DEBUG nova.compute.provider_tree [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:40:58 compute-0 nova_compute[192810]: 2025-09-30 21:40:58.971 2 DEBUG nova.scheduler.client.report [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:58.999 2 DEBUG oslo_concurrency.lockutils [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.178s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.000 2 DEBUG nova.compute.manager [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.074 2 DEBUG nova.compute.manager [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.089 2 INFO nova.virt.libvirt.driver [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.120 2 DEBUG nova.compute.manager [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.292 2 DEBUG nova.compute.manager [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.293 2 DEBUG nova.virt.libvirt.driver [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.293 2 INFO nova.virt.libvirt.driver [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Creating image(s)
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.294 2 DEBUG oslo_concurrency.lockutils [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Acquiring lock "/var/lib/nova/instances/bd84e045-cedd-485d-b4fd-a9a4f71f4def/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.294 2 DEBUG oslo_concurrency.lockutils [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Lock "/var/lib/nova/instances/bd84e045-cedd-485d-b4fd-a9a4f71f4def/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.295 2 DEBUG oslo_concurrency.lockutils [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Lock "/var/lib/nova/instances/bd84e045-cedd-485d-b4fd-a9a4f71f4def/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.309 2 DEBUG oslo_concurrency.processutils [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.398 2 DEBUG oslo_concurrency.processutils [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.400 2 DEBUG oslo_concurrency.lockutils [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.401 2 DEBUG oslo_concurrency.lockutils [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.414 2 DEBUG oslo_concurrency.processutils [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.483 2 DEBUG oslo_concurrency.processutils [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.484 2 DEBUG oslo_concurrency.processutils [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/bd84e045-cedd-485d-b4fd-a9a4f71f4def/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.537 2 DEBUG oslo_concurrency.processutils [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/bd84e045-cedd-485d-b4fd-a9a4f71f4def/disk 1073741824" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.538 2 DEBUG oslo_concurrency.lockutils [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.539 2 DEBUG oslo_concurrency.processutils [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.593 2 DEBUG oslo_concurrency.processutils [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.595 2 DEBUG nova.virt.disk.api [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Checking if we can resize image /var/lib/nova/instances/bd84e045-cedd-485d-b4fd-a9a4f71f4def/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.595 2 DEBUG oslo_concurrency.processutils [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bd84e045-cedd-485d-b4fd-a9a4f71f4def/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.651 2 DEBUG oslo_concurrency.processutils [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bd84e045-cedd-485d-b4fd-a9a4f71f4def/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.653 2 DEBUG nova.virt.disk.api [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Cannot resize image /var/lib/nova/instances/bd84e045-cedd-485d-b4fd-a9a4f71f4def/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.653 2 DEBUG nova.objects.instance [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Lazy-loading 'migration_context' on Instance uuid bd84e045-cedd-485d-b4fd-a9a4f71f4def obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:40:59 compute-0 sshd-session[240723]: error: kex_exchange_identification: read: Connection timed out
Sep 30 21:40:59 compute-0 sshd-session[240723]: banner exchange: Connection from 113.240.110.90 port 43908: Connection timed out
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.675 2 DEBUG nova.virt.libvirt.driver [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.675 2 DEBUG nova.virt.libvirt.driver [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Ensure instance console log exists: /var/lib/nova/instances/bd84e045-cedd-485d-b4fd-a9a4f71f4def/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.676 2 DEBUG oslo_concurrency.lockutils [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.677 2 DEBUG oslo_concurrency.lockutils [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.678 2 DEBUG oslo_concurrency.lockutils [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.681 2 DEBUG nova.virt.libvirt.driver [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.688 2 WARNING nova.virt.libvirt.driver [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.694 2 DEBUG nova.virt.libvirt.host [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.695 2 DEBUG nova.virt.libvirt.host [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.698 2 DEBUG nova.virt.libvirt.host [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.698 2 DEBUG nova.virt.libvirt.host [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.700 2 DEBUG nova.virt.libvirt.driver [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.700 2 DEBUG nova.virt.hardware [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.700 2 DEBUG nova.virt.hardware [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.701 2 DEBUG nova.virt.hardware [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.701 2 DEBUG nova.virt.hardware [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.701 2 DEBUG nova.virt.hardware [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.701 2 DEBUG nova.virt.hardware [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.701 2 DEBUG nova.virt.hardware [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.702 2 DEBUG nova.virt.hardware [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.702 2 DEBUG nova.virt.hardware [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.702 2 DEBUG nova.virt.hardware [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.702 2 DEBUG nova.virt.hardware [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.706 2 DEBUG nova.objects.instance [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Lazy-loading 'pci_devices' on Instance uuid bd84e045-cedd-485d-b4fd-a9a4f71f4def obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.724 2 DEBUG nova.virt.libvirt.driver [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:40:59 compute-0 nova_compute[192810]:   <uuid>bd84e045-cedd-485d-b4fd-a9a4f71f4def</uuid>
Sep 30 21:40:59 compute-0 nova_compute[192810]:   <name>instance-00000086</name>
Sep 30 21:40:59 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:40:59 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:40:59 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:40:59 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:40:59 compute-0 nova_compute[192810]:       <nova:name>tempest-ServerShowV254Test-server-1608700653</nova:name>
Sep 30 21:40:59 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:40:59</nova:creationTime>
Sep 30 21:40:59 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:40:59 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:40:59 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:40:59 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:40:59 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:40:59 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:40:59 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:40:59 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:40:59 compute-0 nova_compute[192810]:         <nova:user uuid="2d63212fa5984ade95ed9114af14d4b6">tempest-ServerShowV254Test-942414671-project-member</nova:user>
Sep 30 21:40:59 compute-0 nova_compute[192810]:         <nova:project uuid="3625408e99424e90a46aead08babcc11">tempest-ServerShowV254Test-942414671</nova:project>
Sep 30 21:40:59 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:40:59 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:40:59 compute-0 nova_compute[192810]:       <nova:ports/>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:40:59 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:40:59 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:40:59 compute-0 nova_compute[192810]:     <system>
Sep 30 21:40:59 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:40:59 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:40:59 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:40:59 compute-0 nova_compute[192810]:       <entry name="serial">bd84e045-cedd-485d-b4fd-a9a4f71f4def</entry>
Sep 30 21:40:59 compute-0 nova_compute[192810]:       <entry name="uuid">bd84e045-cedd-485d-b4fd-a9a4f71f4def</entry>
Sep 30 21:40:59 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     </system>
Sep 30 21:40:59 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:40:59 compute-0 nova_compute[192810]:   <os>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:40:59 compute-0 nova_compute[192810]:   </os>
Sep 30 21:40:59 compute-0 nova_compute[192810]:   <features>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:40:59 compute-0 nova_compute[192810]:   </features>
Sep 30 21:40:59 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:40:59 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:40:59 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:40:59 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:40:59 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:40:59 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:40:59 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:40:59 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:40:59 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/bd84e045-cedd-485d-b4fd-a9a4f71f4def/disk"/>
Sep 30 21:40:59 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:40:59 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:40:59 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/bd84e045-cedd-485d-b4fd-a9a4f71f4def/disk.config"/>
Sep 30 21:40:59 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:40:59 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/bd84e045-cedd-485d-b4fd-a9a4f71f4def/console.log" append="off"/>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     <video>
Sep 30 21:40:59 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     </video>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:40:59 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:40:59 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:40:59 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:40:59 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:40:59 compute-0 nova_compute[192810]: </domain>
Sep 30 21:40:59 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.783 2 DEBUG nova.virt.libvirt.driver [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.784 2 DEBUG nova.virt.libvirt.driver [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:40:59 compute-0 nova_compute[192810]: 2025-09-30 21:40:59.785 2 INFO nova.virt.libvirt.driver [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Using config drive
Sep 30 21:41:00 compute-0 nova_compute[192810]: 2025-09-30 21:41:00.198 2 INFO nova.virt.libvirt.driver [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Creating config drive at /var/lib/nova/instances/bd84e045-cedd-485d-b4fd-a9a4f71f4def/disk.config
Sep 30 21:41:00 compute-0 nova_compute[192810]: 2025-09-30 21:41:00.202 2 DEBUG oslo_concurrency.processutils [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bd84e045-cedd-485d-b4fd-a9a4f71f4def/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3r07cnbb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:41:00 compute-0 nova_compute[192810]: 2025-09-30 21:41:00.328 2 DEBUG oslo_concurrency.processutils [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bd84e045-cedd-485d-b4fd-a9a4f71f4def/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3r07cnbb" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:41:00 compute-0 systemd-machined[152794]: New machine qemu-66-instance-00000086.
Sep 30 21:41:00 compute-0 systemd[1]: Started Virtual Machine qemu-66-instance-00000086.
Sep 30 21:41:01 compute-0 nova_compute[192810]: 2025-09-30 21:41:01.142 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268461.1414557, bd84e045-cedd-485d-b4fd-a9a4f71f4def => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:41:01 compute-0 nova_compute[192810]: 2025-09-30 21:41:01.143 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] VM Resumed (Lifecycle Event)
Sep 30 21:41:01 compute-0 nova_compute[192810]: 2025-09-30 21:41:01.145 2 DEBUG nova.compute.manager [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:41:01 compute-0 nova_compute[192810]: 2025-09-30 21:41:01.145 2 DEBUG nova.virt.libvirt.driver [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:41:01 compute-0 nova_compute[192810]: 2025-09-30 21:41:01.149 2 INFO nova.virt.libvirt.driver [-] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Instance spawned successfully.
Sep 30 21:41:01 compute-0 nova_compute[192810]: 2025-09-30 21:41:01.149 2 DEBUG nova.virt.libvirt.driver [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:41:01 compute-0 nova_compute[192810]: 2025-09-30 21:41:01.181 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:41:01 compute-0 nova_compute[192810]: 2025-09-30 21:41:01.183 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:41:01 compute-0 nova_compute[192810]: 2025-09-30 21:41:01.191 2 DEBUG nova.virt.libvirt.driver [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:41:01 compute-0 nova_compute[192810]: 2025-09-30 21:41:01.192 2 DEBUG nova.virt.libvirt.driver [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:41:01 compute-0 nova_compute[192810]: 2025-09-30 21:41:01.192 2 DEBUG nova.virt.libvirt.driver [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:41:01 compute-0 nova_compute[192810]: 2025-09-30 21:41:01.192 2 DEBUG nova.virt.libvirt.driver [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:41:01 compute-0 nova_compute[192810]: 2025-09-30 21:41:01.193 2 DEBUG nova.virt.libvirt.driver [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:41:01 compute-0 nova_compute[192810]: 2025-09-30 21:41:01.193 2 DEBUG nova.virt.libvirt.driver [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:41:01 compute-0 nova_compute[192810]: 2025-09-30 21:41:01.224 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:41:01 compute-0 nova_compute[192810]: 2025-09-30 21:41:01.224 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268461.1421485, bd84e045-cedd-485d-b4fd-a9a4f71f4def => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:41:01 compute-0 nova_compute[192810]: 2025-09-30 21:41:01.225 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] VM Started (Lifecycle Event)
Sep 30 21:41:01 compute-0 nova_compute[192810]: 2025-09-30 21:41:01.256 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:41:01 compute-0 nova_compute[192810]: 2025-09-30 21:41:01.260 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:41:01 compute-0 nova_compute[192810]: 2025-09-30 21:41:01.288 2 INFO nova.compute.manager [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Took 2.00 seconds to spawn the instance on the hypervisor.
Sep 30 21:41:01 compute-0 nova_compute[192810]: 2025-09-30 21:41:01.289 2 DEBUG nova.compute.manager [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:41:01 compute-0 nova_compute[192810]: 2025-09-30 21:41:01.293 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:41:01 compute-0 nova_compute[192810]: 2025-09-30 21:41:01.414 2 INFO nova.compute.manager [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Took 2.62 seconds to build instance.
Sep 30 21:41:01 compute-0 nova_compute[192810]: 2025-09-30 21:41:01.438 2 DEBUG oslo_concurrency.lockutils [None req-1bfa8088-3293-4d73-904c-c4c9a1c6b661 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Lock "bd84e045-cedd-485d-b4fd-a9a4f71f4def" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 2.748s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:41:01 compute-0 nova_compute[192810]: 2025-09-30 21:41:01.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:41:01 compute-0 unix_chkpwd[240830]: password check failed for user (root)
Sep 30 21:41:01 compute-0 sshd-session[240804]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.95.116  user=root
Sep 30 21:41:02 compute-0 nova_compute[192810]: 2025-09-30 21:41:02.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:03 compute-0 nova_compute[192810]: 2025-09-30 21:41:03.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:03 compute-0 nova_compute[192810]: 2025-09-30 21:41:03.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:41:04 compute-0 sshd-session[240804]: Failed password for root from 80.94.95.116 port 43806 ssh2
Sep 30 21:41:04 compute-0 podman[240832]: 2025-09-30 21:41:04.328389837 +0000 UTC m=+0.055202395 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 21:41:04 compute-0 podman[240833]: 2025-09-30 21:41:04.335728137 +0000 UTC m=+0.062635298 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Sep 30 21:41:04 compute-0 podman[240831]: 2025-09-30 21:41:04.355254037 +0000 UTC m=+0.086033203 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:41:04 compute-0 nova_compute[192810]: 2025-09-30 21:41:04.428 2 INFO nova.compute.manager [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Rebuilding instance
Sep 30 21:41:04 compute-0 nova_compute[192810]: 2025-09-30 21:41:04.925 2 DEBUG nova.compute.manager [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:41:04 compute-0 nova_compute[192810]: 2025-09-30 21:41:04.996 2 DEBUG nova.objects.instance [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Lazy-loading 'pci_requests' on Instance uuid bd84e045-cedd-485d-b4fd-a9a4f71f4def obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:41:05 compute-0 nova_compute[192810]: 2025-09-30 21:41:05.009 2 DEBUG nova.objects.instance [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Lazy-loading 'pci_devices' on Instance uuid bd84e045-cedd-485d-b4fd-a9a4f71f4def obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:41:05 compute-0 nova_compute[192810]: 2025-09-30 21:41:05.022 2 DEBUG nova.objects.instance [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Lazy-loading 'resources' on Instance uuid bd84e045-cedd-485d-b4fd-a9a4f71f4def obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:41:05 compute-0 nova_compute[192810]: 2025-09-30 21:41:05.033 2 DEBUG nova.objects.instance [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Lazy-loading 'migration_context' on Instance uuid bd84e045-cedd-485d-b4fd-a9a4f71f4def obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:41:05 compute-0 nova_compute[192810]: 2025-09-30 21:41:05.045 2 DEBUG nova.objects.instance [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Sep 30 21:41:05 compute-0 nova_compute[192810]: 2025-09-30 21:41:05.048 2 DEBUG nova.virt.libvirt.driver [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Sep 30 21:41:05 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:41:05.141 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:41:05 compute-0 nova_compute[192810]: 2025-09-30 21:41:05.142 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:05 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:41:05.143 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:41:05 compute-0 sshd-session[240804]: Connection closed by authenticating user root 80.94.95.116 port 43806 [preauth]
Sep 30 21:41:07 compute-0 nova_compute[192810]: 2025-09-30 21:41:07.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:08 compute-0 nova_compute[192810]: 2025-09-30 21:41:08.233 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:12 compute-0 nova_compute[192810]: 2025-09-30 21:41:12.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:13 compute-0 nova_compute[192810]: 2025-09-30 21:41:13.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:13 compute-0 podman[240913]: 2025-09-30 21:41:13.315249353 +0000 UTC m=+0.051332060 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:41:13 compute-0 podman[240914]: 2025-09-30 21:41:13.354049216 +0000 UTC m=+0.073904785 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, name=ubi9-minimal, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, version=9.6, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Sep 30 21:41:15 compute-0 nova_compute[192810]: 2025-09-30 21:41:15.095 2 DEBUG nova.virt.libvirt.driver [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Sep 30 21:41:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:41:15.145 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:41:17 compute-0 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d00000086.scope: Deactivated successfully.
Sep 30 21:41:17 compute-0 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d00000086.scope: Consumed 12.576s CPU time.
Sep 30 21:41:17 compute-0 systemd-machined[152794]: Machine qemu-66-instance-00000086 terminated.
Sep 30 21:41:17 compute-0 nova_compute[192810]: 2025-09-30 21:41:17.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.112 2 INFO nova.virt.libvirt.driver [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Instance shutdown successfully after 13 seconds.
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.116 2 INFO nova.virt.libvirt.driver [-] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Instance destroyed successfully.
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.119 2 INFO nova.virt.libvirt.driver [-] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Instance destroyed successfully.
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.120 2 INFO nova.virt.libvirt.driver [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Deleting instance files /var/lib/nova/instances/bd84e045-cedd-485d-b4fd-a9a4f71f4def_del
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.121 2 INFO nova.virt.libvirt.driver [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Deletion of /var/lib/nova/instances/bd84e045-cedd-485d-b4fd-a9a4f71f4def_del complete
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.338 2 DEBUG nova.virt.libvirt.driver [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.338 2 INFO nova.virt.libvirt.driver [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Creating image(s)
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.339 2 DEBUG oslo_concurrency.lockutils [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Acquiring lock "/var/lib/nova/instances/bd84e045-cedd-485d-b4fd-a9a4f71f4def/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.339 2 DEBUG oslo_concurrency.lockutils [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Lock "/var/lib/nova/instances/bd84e045-cedd-485d-b4fd-a9a4f71f4def/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.340 2 DEBUG oslo_concurrency.lockutils [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Lock "/var/lib/nova/instances/bd84e045-cedd-485d-b4fd-a9a4f71f4def/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.352 2 DEBUG oslo_concurrency.processutils [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.405 2 DEBUG oslo_concurrency.processutils [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.406 2 DEBUG oslo_concurrency.lockutils [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Acquiring lock "d794a27f8e0bfa3eee9759fbfddd316a7671c61e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.407 2 DEBUG oslo_concurrency.lockutils [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Lock "d794a27f8e0bfa3eee9759fbfddd316a7671c61e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.418 2 DEBUG oslo_concurrency.processutils [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.512 2 DEBUG oslo_concurrency.processutils [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.513 2 DEBUG oslo_concurrency.processutils [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e,backing_fmt=raw /var/lib/nova/instances/bd84e045-cedd-485d-b4fd-a9a4f71f4def/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.545 2 DEBUG oslo_concurrency.processutils [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e,backing_fmt=raw /var/lib/nova/instances/bd84e045-cedd-485d-b4fd-a9a4f71f4def/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.546 2 DEBUG oslo_concurrency.lockutils [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Lock "d794a27f8e0bfa3eee9759fbfddd316a7671c61e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.547 2 DEBUG oslo_concurrency.processutils [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.598 2 DEBUG oslo_concurrency.processutils [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.599 2 DEBUG nova.virt.disk.api [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Checking if we can resize image /var/lib/nova/instances/bd84e045-cedd-485d-b4fd-a9a4f71f4def/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.600 2 DEBUG oslo_concurrency.processutils [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bd84e045-cedd-485d-b4fd-a9a4f71f4def/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.655 2 DEBUG oslo_concurrency.processutils [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bd84e045-cedd-485d-b4fd-a9a4f71f4def/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.656 2 DEBUG nova.virt.disk.api [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Cannot resize image /var/lib/nova/instances/bd84e045-cedd-485d-b4fd-a9a4f71f4def/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.656 2 DEBUG nova.virt.libvirt.driver [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.657 2 DEBUG nova.virt.libvirt.driver [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Ensure instance console log exists: /var/lib/nova/instances/bd84e045-cedd-485d-b4fd-a9a4f71f4def/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.657 2 DEBUG oslo_concurrency.lockutils [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.658 2 DEBUG oslo_concurrency.lockutils [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.658 2 DEBUG oslo_concurrency.lockutils [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.660 2 DEBUG nova.virt.libvirt.driver [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:11Z,direct_url=<?>,disk_format='qcow2',id=29834554-3ec3-4459-bfde-932aa778e979,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:13Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.665 2 WARNING nova.virt.libvirt.driver [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.673 2 DEBUG nova.virt.libvirt.host [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.673 2 DEBUG nova.virt.libvirt.host [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.676 2 DEBUG nova.virt.libvirt.host [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.677 2 DEBUG nova.virt.libvirt.host [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.678 2 DEBUG nova.virt.libvirt.driver [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.678 2 DEBUG nova.virt.hardware [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:11Z,direct_url=<?>,disk_format='qcow2',id=29834554-3ec3-4459-bfde-932aa778e979,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:13Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.679 2 DEBUG nova.virt.hardware [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.679 2 DEBUG nova.virt.hardware [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.679 2 DEBUG nova.virt.hardware [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.679 2 DEBUG nova.virt.hardware [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.680 2 DEBUG nova.virt.hardware [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.680 2 DEBUG nova.virt.hardware [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.680 2 DEBUG nova.virt.hardware [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.680 2 DEBUG nova.virt.hardware [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.680 2 DEBUG nova.virt.hardware [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.681 2 DEBUG nova.virt.hardware [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.681 2 DEBUG nova.objects.instance [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Lazy-loading 'vcpu_model' on Instance uuid bd84e045-cedd-485d-b4fd-a9a4f71f4def obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:41:18 compute-0 nova_compute[192810]: 2025-09-30 21:41:18.906 2 DEBUG nova.virt.libvirt.driver [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:41:18 compute-0 nova_compute[192810]:   <uuid>bd84e045-cedd-485d-b4fd-a9a4f71f4def</uuid>
Sep 30 21:41:18 compute-0 nova_compute[192810]:   <name>instance-00000086</name>
Sep 30 21:41:18 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:41:18 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:41:18 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:41:18 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:41:18 compute-0 nova_compute[192810]:       <nova:name>tempest-ServerShowV254Test-server-1608700653</nova:name>
Sep 30 21:41:18 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:41:18</nova:creationTime>
Sep 30 21:41:18 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:41:18 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:41:18 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:41:18 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:41:18 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:41:18 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:41:18 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:41:18 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:41:18 compute-0 nova_compute[192810]:         <nova:user uuid="2d63212fa5984ade95ed9114af14d4b6">tempest-ServerShowV254Test-942414671-project-member</nova:user>
Sep 30 21:41:18 compute-0 nova_compute[192810]:         <nova:project uuid="3625408e99424e90a46aead08babcc11">tempest-ServerShowV254Test-942414671</nova:project>
Sep 30 21:41:18 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:41:18 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="29834554-3ec3-4459-bfde-932aa778e979"/>
Sep 30 21:41:18 compute-0 nova_compute[192810]:       <nova:ports/>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:41:18 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:41:18 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:41:18 compute-0 nova_compute[192810]:     <system>
Sep 30 21:41:18 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:41:18 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:41:18 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:41:18 compute-0 nova_compute[192810]:       <entry name="serial">bd84e045-cedd-485d-b4fd-a9a4f71f4def</entry>
Sep 30 21:41:18 compute-0 nova_compute[192810]:       <entry name="uuid">bd84e045-cedd-485d-b4fd-a9a4f71f4def</entry>
Sep 30 21:41:18 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     </system>
Sep 30 21:41:18 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:41:18 compute-0 nova_compute[192810]:   <os>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:41:18 compute-0 nova_compute[192810]:   </os>
Sep 30 21:41:18 compute-0 nova_compute[192810]:   <features>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:41:18 compute-0 nova_compute[192810]:   </features>
Sep 30 21:41:18 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:41:18 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:41:18 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:41:18 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:41:18 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:41:18 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:41:18 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:41:18 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:41:18 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/bd84e045-cedd-485d-b4fd-a9a4f71f4def/disk"/>
Sep 30 21:41:18 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:41:18 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:41:18 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/bd84e045-cedd-485d-b4fd-a9a4f71f4def/disk.config"/>
Sep 30 21:41:18 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:41:18 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/bd84e045-cedd-485d-b4fd-a9a4f71f4def/console.log" append="off"/>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     <video>
Sep 30 21:41:18 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     </video>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:41:18 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:41:18 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:41:18 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:41:18 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:41:18 compute-0 nova_compute[192810]: </domain>
Sep 30 21:41:18 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:41:19 compute-0 nova_compute[192810]: 2025-09-30 21:41:19.014 2 DEBUG nova.virt.libvirt.driver [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:41:19 compute-0 nova_compute[192810]: 2025-09-30 21:41:19.015 2 DEBUG nova.virt.libvirt.driver [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:41:19 compute-0 nova_compute[192810]: 2025-09-30 21:41:19.015 2 INFO nova.virt.libvirt.driver [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Using config drive
Sep 30 21:41:19 compute-0 nova_compute[192810]: 2025-09-30 21:41:19.028 2 DEBUG nova.objects.instance [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Lazy-loading 'ec2_ids' on Instance uuid bd84e045-cedd-485d-b4fd-a9a4f71f4def obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:41:19 compute-0 nova_compute[192810]: 2025-09-30 21:41:19.363 2 INFO nova.virt.libvirt.driver [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Creating config drive at /var/lib/nova/instances/bd84e045-cedd-485d-b4fd-a9a4f71f4def/disk.config
Sep 30 21:41:19 compute-0 nova_compute[192810]: 2025-09-30 21:41:19.367 2 DEBUG oslo_concurrency.processutils [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bd84e045-cedd-485d-b4fd-a9a4f71f4def/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9t11tetf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:41:19 compute-0 nova_compute[192810]: 2025-09-30 21:41:19.489 2 DEBUG oslo_concurrency.processutils [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bd84e045-cedd-485d-b4fd-a9a4f71f4def/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9t11tetf" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:41:19 compute-0 systemd-machined[152794]: New machine qemu-67-instance-00000086.
Sep 30 21:41:19 compute-0 systemd[1]: Started Virtual Machine qemu-67-instance-00000086.
Sep 30 21:41:20 compute-0 nova_compute[192810]: 2025-09-30 21:41:20.179 2 DEBUG nova.virt.libvirt.host [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Removed pending event for bd84e045-cedd-485d-b4fd-a9a4f71f4def due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Sep 30 21:41:20 compute-0 nova_compute[192810]: 2025-09-30 21:41:20.180 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268480.179137, bd84e045-cedd-485d-b4fd-a9a4f71f4def => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:41:20 compute-0 nova_compute[192810]: 2025-09-30 21:41:20.181 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] VM Resumed (Lifecycle Event)
Sep 30 21:41:20 compute-0 nova_compute[192810]: 2025-09-30 21:41:20.184 2 DEBUG nova.compute.manager [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:41:20 compute-0 nova_compute[192810]: 2025-09-30 21:41:20.184 2 DEBUG nova.virt.libvirt.driver [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:41:20 compute-0 nova_compute[192810]: 2025-09-30 21:41:20.187 2 INFO nova.virt.libvirt.driver [-] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Instance spawned successfully.
Sep 30 21:41:20 compute-0 nova_compute[192810]: 2025-09-30 21:41:20.188 2 DEBUG nova.virt.libvirt.driver [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:41:20 compute-0 nova_compute[192810]: 2025-09-30 21:41:20.232 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:41:20 compute-0 nova_compute[192810]: 2025-09-30 21:41:20.235 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:41:20 compute-0 nova_compute[192810]: 2025-09-30 21:41:20.247 2 DEBUG nova.virt.libvirt.driver [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:41:20 compute-0 nova_compute[192810]: 2025-09-30 21:41:20.247 2 DEBUG nova.virt.libvirt.driver [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:41:20 compute-0 nova_compute[192810]: 2025-09-30 21:41:20.247 2 DEBUG nova.virt.libvirt.driver [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:41:20 compute-0 nova_compute[192810]: 2025-09-30 21:41:20.248 2 DEBUG nova.virt.libvirt.driver [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:41:20 compute-0 nova_compute[192810]: 2025-09-30 21:41:20.248 2 DEBUG nova.virt.libvirt.driver [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:41:20 compute-0 nova_compute[192810]: 2025-09-30 21:41:20.249 2 DEBUG nova.virt.libvirt.driver [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:41:20 compute-0 nova_compute[192810]: 2025-09-30 21:41:20.259 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Sep 30 21:41:20 compute-0 nova_compute[192810]: 2025-09-30 21:41:20.259 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268480.1804597, bd84e045-cedd-485d-b4fd-a9a4f71f4def => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:41:20 compute-0 nova_compute[192810]: 2025-09-30 21:41:20.260 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] VM Started (Lifecycle Event)
Sep 30 21:41:20 compute-0 nova_compute[192810]: 2025-09-30 21:41:20.288 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:41:20 compute-0 nova_compute[192810]: 2025-09-30 21:41:20.292 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:41:20 compute-0 nova_compute[192810]: 2025-09-30 21:41:20.330 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Sep 30 21:41:20 compute-0 nova_compute[192810]: 2025-09-30 21:41:20.357 2 DEBUG nova.compute.manager [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:41:20 compute-0 nova_compute[192810]: 2025-09-30 21:41:20.443 2 DEBUG oslo_concurrency.lockutils [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:41:20 compute-0 nova_compute[192810]: 2025-09-30 21:41:20.444 2 DEBUG oslo_concurrency.lockutils [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:41:20 compute-0 nova_compute[192810]: 2025-09-30 21:41:20.444 2 DEBUG nova.objects.instance [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Sep 30 21:41:20 compute-0 nova_compute[192810]: 2025-09-30 21:41:20.536 2 DEBUG oslo_concurrency.lockutils [None req-f33615cd-a221-4924-bcff-96658eca5163 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.092s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:41:21 compute-0 podman[241012]: 2025-09-30 21:41:21.321615138 +0000 UTC m=+0.053727060 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:41:21 compute-0 podman[241011]: 2025-09-30 21:41:21.326906377 +0000 UTC m=+0.059874030 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Sep 30 21:41:21 compute-0 podman[241010]: 2025-09-30 21:41:21.350040695 +0000 UTC m=+0.083684114 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250923, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 21:41:21 compute-0 nova_compute[192810]: 2025-09-30 21:41:21.651 2 DEBUG oslo_concurrency.lockutils [None req-5d58ee57-dd41-4a31-b602-bd1625e46c07 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Acquiring lock "bd84e045-cedd-485d-b4fd-a9a4f71f4def" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:41:21 compute-0 nova_compute[192810]: 2025-09-30 21:41:21.652 2 DEBUG oslo_concurrency.lockutils [None req-5d58ee57-dd41-4a31-b602-bd1625e46c07 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Lock "bd84e045-cedd-485d-b4fd-a9a4f71f4def" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:41:21 compute-0 nova_compute[192810]: 2025-09-30 21:41:21.653 2 DEBUG oslo_concurrency.lockutils [None req-5d58ee57-dd41-4a31-b602-bd1625e46c07 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Acquiring lock "bd84e045-cedd-485d-b4fd-a9a4f71f4def-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:41:21 compute-0 nova_compute[192810]: 2025-09-30 21:41:21.653 2 DEBUG oslo_concurrency.lockutils [None req-5d58ee57-dd41-4a31-b602-bd1625e46c07 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Lock "bd84e045-cedd-485d-b4fd-a9a4f71f4def-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:41:21 compute-0 nova_compute[192810]: 2025-09-30 21:41:21.653 2 DEBUG oslo_concurrency.lockutils [None req-5d58ee57-dd41-4a31-b602-bd1625e46c07 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Lock "bd84e045-cedd-485d-b4fd-a9a4f71f4def-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:41:21 compute-0 nova_compute[192810]: 2025-09-30 21:41:21.761 2 INFO nova.compute.manager [None req-5d58ee57-dd41-4a31-b602-bd1625e46c07 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Terminating instance
Sep 30 21:41:21 compute-0 nova_compute[192810]: 2025-09-30 21:41:21.869 2 DEBUG oslo_concurrency.lockutils [None req-5d58ee57-dd41-4a31-b602-bd1625e46c07 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Acquiring lock "refresh_cache-bd84e045-cedd-485d-b4fd-a9a4f71f4def" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:41:21 compute-0 nova_compute[192810]: 2025-09-30 21:41:21.869 2 DEBUG oslo_concurrency.lockutils [None req-5d58ee57-dd41-4a31-b602-bd1625e46c07 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Acquired lock "refresh_cache-bd84e045-cedd-485d-b4fd-a9a4f71f4def" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:41:21 compute-0 nova_compute[192810]: 2025-09-30 21:41:21.870 2 DEBUG nova.network.neutron [None req-5d58ee57-dd41-4a31-b602-bd1625e46c07 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:41:22 compute-0 nova_compute[192810]: 2025-09-30 21:41:22.001 2 DEBUG nova.network.neutron [None req-5d58ee57-dd41-4a31-b602-bd1625e46c07 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:41:22 compute-0 nova_compute[192810]: 2025-09-30 21:41:22.454 2 DEBUG nova.network.neutron [None req-5d58ee57-dd41-4a31-b602-bd1625e46c07 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:41:22 compute-0 nova_compute[192810]: 2025-09-30 21:41:22.471 2 DEBUG oslo_concurrency.lockutils [None req-5d58ee57-dd41-4a31-b602-bd1625e46c07 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Releasing lock "refresh_cache-bd84e045-cedd-485d-b4fd-a9a4f71f4def" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:41:22 compute-0 nova_compute[192810]: 2025-09-30 21:41:22.471 2 DEBUG nova.compute.manager [None req-5d58ee57-dd41-4a31-b602-bd1625e46c07 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:41:22 compute-0 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d00000086.scope: Deactivated successfully.
Sep 30 21:41:22 compute-0 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d00000086.scope: Consumed 2.899s CPU time.
Sep 30 21:41:22 compute-0 systemd-machined[152794]: Machine qemu-67-instance-00000086 terminated.
Sep 30 21:41:22 compute-0 nova_compute[192810]: 2025-09-30 21:41:22.715 2 INFO nova.virt.libvirt.driver [-] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Instance destroyed successfully.
Sep 30 21:41:22 compute-0 nova_compute[192810]: 2025-09-30 21:41:22.715 2 DEBUG nova.objects.instance [None req-5d58ee57-dd41-4a31-b602-bd1625e46c07 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Lazy-loading 'resources' on Instance uuid bd84e045-cedd-485d-b4fd-a9a4f71f4def obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:41:22 compute-0 nova_compute[192810]: 2025-09-30 21:41:22.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:22 compute-0 nova_compute[192810]: 2025-09-30 21:41:22.762 2 INFO nova.virt.libvirt.driver [None req-5d58ee57-dd41-4a31-b602-bd1625e46c07 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Deleting instance files /var/lib/nova/instances/bd84e045-cedd-485d-b4fd-a9a4f71f4def_del
Sep 30 21:41:22 compute-0 nova_compute[192810]: 2025-09-30 21:41:22.763 2 INFO nova.virt.libvirt.driver [None req-5d58ee57-dd41-4a31-b602-bd1625e46c07 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Deletion of /var/lib/nova/instances/bd84e045-cedd-485d-b4fd-a9a4f71f4def_del complete
Sep 30 21:41:22 compute-0 nova_compute[192810]: 2025-09-30 21:41:22.933 2 INFO nova.compute.manager [None req-5d58ee57-dd41-4a31-b602-bd1625e46c07 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Took 0.46 seconds to destroy the instance on the hypervisor.
Sep 30 21:41:22 compute-0 nova_compute[192810]: 2025-09-30 21:41:22.934 2 DEBUG oslo.service.loopingcall [None req-5d58ee57-dd41-4a31-b602-bd1625e46c07 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:41:22 compute-0 nova_compute[192810]: 2025-09-30 21:41:22.934 2 DEBUG nova.compute.manager [-] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:41:22 compute-0 nova_compute[192810]: 2025-09-30 21:41:22.935 2 DEBUG nova.network.neutron [-] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:41:23 compute-0 nova_compute[192810]: 2025-09-30 21:41:23.165 2 DEBUG nova.network.neutron [-] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:41:23 compute-0 nova_compute[192810]: 2025-09-30 21:41:23.176 2 DEBUG nova.network.neutron [-] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:41:23 compute-0 nova_compute[192810]: 2025-09-30 21:41:23.193 2 INFO nova.compute.manager [-] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Took 0.26 seconds to deallocate network for instance.
Sep 30 21:41:23 compute-0 nova_compute[192810]: 2025-09-30 21:41:23.236 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:23 compute-0 nova_compute[192810]: 2025-09-30 21:41:23.277 2 DEBUG oslo_concurrency.lockutils [None req-5d58ee57-dd41-4a31-b602-bd1625e46c07 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:41:23 compute-0 nova_compute[192810]: 2025-09-30 21:41:23.278 2 DEBUG oslo_concurrency.lockutils [None req-5d58ee57-dd41-4a31-b602-bd1625e46c07 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:41:23 compute-0 nova_compute[192810]: 2025-09-30 21:41:23.338 2 DEBUG nova.compute.provider_tree [None req-5d58ee57-dd41-4a31-b602-bd1625e46c07 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:41:23 compute-0 nova_compute[192810]: 2025-09-30 21:41:23.350 2 DEBUG nova.scheduler.client.report [None req-5d58ee57-dd41-4a31-b602-bd1625e46c07 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:41:23 compute-0 nova_compute[192810]: 2025-09-30 21:41:23.376 2 DEBUG oslo_concurrency.lockutils [None req-5d58ee57-dd41-4a31-b602-bd1625e46c07 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:41:23 compute-0 nova_compute[192810]: 2025-09-30 21:41:23.420 2 INFO nova.scheduler.client.report [None req-5d58ee57-dd41-4a31-b602-bd1625e46c07 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Deleted allocations for instance bd84e045-cedd-485d-b4fd-a9a4f71f4def
Sep 30 21:41:23 compute-0 nova_compute[192810]: 2025-09-30 21:41:23.588 2 DEBUG oslo_concurrency.lockutils [None req-5d58ee57-dd41-4a31-b602-bd1625e46c07 2d63212fa5984ade95ed9114af14d4b6 3625408e99424e90a46aead08babcc11 - - default default] Lock "bd84e045-cedd-485d-b4fd-a9a4f71f4def" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.936s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:41:27 compute-0 nova_compute[192810]: 2025-09-30 21:41:27.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:28 compute-0 unix_chkpwd[241079]: password check failed for user (root)
Sep 30 21:41:28 compute-0 sshd-session[241077]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=45.81.23.80  user=root
Sep 30 21:41:28 compute-0 nova_compute[192810]: 2025-09-30 21:41:28.238 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:29 compute-0 sshd-session[241077]: Failed password for root from 45.81.23.80 port 49722 ssh2
Sep 30 21:41:30 compute-0 sshd-session[241077]: Received disconnect from 45.81.23.80 port 49722:11: Bye Bye [preauth]
Sep 30 21:41:30 compute-0 sshd-session[241077]: Disconnected from authenticating user root 45.81.23.80 port 49722 [preauth]
Sep 30 21:41:32 compute-0 nova_compute[192810]: 2025-09-30 21:41:32.784 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:33 compute-0 nova_compute[192810]: 2025-09-30 21:41:33.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:35 compute-0 podman[241081]: 2025-09-30 21:41:35.317039641 +0000 UTC m=+0.054105569 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 21:41:35 compute-0 podman[241082]: 2025-09-30 21:41:35.341420599 +0000 UTC m=+0.064814412 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250923, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:41:35 compute-0 podman[241080]: 2025-09-30 21:41:35.34880382 +0000 UTC m=+0.088532643 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20250923)
Sep 30 21:41:37 compute-0 nova_compute[192810]: 2025-09-30 21:41:37.713 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268482.7123206, bd84e045-cedd-485d-b4fd-a9a4f71f4def => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:41:37 compute-0 nova_compute[192810]: 2025-09-30 21:41:37.713 2 INFO nova.compute.manager [-] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] VM Stopped (Lifecycle Event)
Sep 30 21:41:37 compute-0 nova_compute[192810]: 2025-09-30 21:41:37.740 2 DEBUG nova.compute.manager [None req-fc0de6bb-4b60-4120-938c-c0c8e01a18cc - - - - - -] [instance: bd84e045-cedd-485d-b4fd-a9a4f71f4def] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:41:37 compute-0 nova_compute[192810]: 2025-09-30 21:41:37.786 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:38 compute-0 nova_compute[192810]: 2025-09-30 21:41:38.241 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:41:38.746 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:41:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:41:38.747 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:41:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:41:38.747 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:41:42 compute-0 nova_compute[192810]: 2025-09-30 21:41:42.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:43 compute-0 nova_compute[192810]: 2025-09-30 21:41:43.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:44 compute-0 podman[241141]: 2025-09-30 21:41:44.328542482 +0000 UTC m=+0.062711870 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Sep 30 21:41:44 compute-0 podman[241142]: 2025-09-30 21:41:44.329376013 +0000 UTC m=+0.059863510 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, vcs-type=git, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc.)
Sep 30 21:41:47 compute-0 nova_compute[192810]: 2025-09-30 21:41:47.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:48 compute-0 nova_compute[192810]: 2025-09-30 21:41:48.243 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:52 compute-0 podman[241186]: 2025-09-30 21:41:52.326340116 +0000 UTC m=+0.062123296 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 21:41:52 compute-0 podman[241184]: 2025-09-30 21:41:52.341437866 +0000 UTC m=+0.073373071 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 21:41:52 compute-0 podman[241185]: 2025-09-30 21:41:52.349492824 +0000 UTC m=+0.087130059 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid)
Sep 30 21:41:52 compute-0 nova_compute[192810]: 2025-09-30 21:41:52.791 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:53 compute-0 nova_compute[192810]: 2025-09-30 21:41:53.244 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:53 compute-0 nova_compute[192810]: 2025-09-30 21:41:53.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:41:56 compute-0 nova_compute[192810]: 2025-09-30 21:41:56.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:41:56 compute-0 nova_compute[192810]: 2025-09-30 21:41:56.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:41:56 compute-0 nova_compute[192810]: 2025-09-30 21:41:56.787 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:41:56 compute-0 nova_compute[192810]: 2025-09-30 21:41:56.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:41:56 compute-0 nova_compute[192810]: 2025-09-30 21:41:56.818 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:41:56 compute-0 nova_compute[192810]: 2025-09-30 21:41:56.818 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:41:56 compute-0 nova_compute[192810]: 2025-09-30 21:41:56.818 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:41:56 compute-0 nova_compute[192810]: 2025-09-30 21:41:56.819 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:41:56 compute-0 nova_compute[192810]: 2025-09-30 21:41:56.957 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:41:56 compute-0 nova_compute[192810]: 2025-09-30 21:41:56.958 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5726MB free_disk=73.24951171875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:41:56 compute-0 nova_compute[192810]: 2025-09-30 21:41:56.958 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:41:56 compute-0 nova_compute[192810]: 2025-09-30 21:41:56.958 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:41:57 compute-0 nova_compute[192810]: 2025-09-30 21:41:57.025 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:41:57 compute-0 nova_compute[192810]: 2025-09-30 21:41:57.025 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:41:57 compute-0 nova_compute[192810]: 2025-09-30 21:41:57.041 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Refreshing inventories for resource provider fe423b93-de5a-41f7-97d1-9622ea46af54 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Sep 30 21:41:57 compute-0 nova_compute[192810]: 2025-09-30 21:41:57.311 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Updating ProviderTree inventory for provider fe423b93-de5a-41f7-97d1-9622ea46af54 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Sep 30 21:41:57 compute-0 nova_compute[192810]: 2025-09-30 21:41:57.311 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Updating inventory in ProviderTree for provider fe423b93-de5a-41f7-97d1-9622ea46af54 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Sep 30 21:41:57 compute-0 nova_compute[192810]: 2025-09-30 21:41:57.329 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Refreshing aggregate associations for resource provider fe423b93-de5a-41f7-97d1-9622ea46af54, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Sep 30 21:41:57 compute-0 nova_compute[192810]: 2025-09-30 21:41:57.354 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Refreshing trait associations for resource provider fe423b93-de5a-41f7-97d1-9622ea46af54, traits: COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_RESCUE_BFV,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_1_2,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE42,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Sep 30 21:41:57 compute-0 nova_compute[192810]: 2025-09-30 21:41:57.375 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:41:57 compute-0 nova_compute[192810]: 2025-09-30 21:41:57.399 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:41:57 compute-0 nova_compute[192810]: 2025-09-30 21:41:57.423 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:41:57 compute-0 nova_compute[192810]: 2025-09-30 21:41:57.424 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.466s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:41:57 compute-0 nova_compute[192810]: 2025-09-30 21:41:57.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:58 compute-0 nova_compute[192810]: 2025-09-30 21:41:58.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:41:58 compute-0 nova_compute[192810]: 2025-09-30 21:41:58.424 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:41:58 compute-0 nova_compute[192810]: 2025-09-30 21:41:58.783 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:41:58 compute-0 nova_compute[192810]: 2025-09-30 21:41:58.783 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:41:59 compute-0 nova_compute[192810]: 2025-09-30 21:41:59.786 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:41:59 compute-0 nova_compute[192810]: 2025-09-30 21:41:59.787 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:41:59 compute-0 nova_compute[192810]: 2025-09-30 21:41:59.787 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:41:59 compute-0 nova_compute[192810]: 2025-09-30 21:41:59.814 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 21:42:01 compute-0 nova_compute[192810]: 2025-09-30 21:42:01.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:42:02 compute-0 ovn_controller[94912]: 2025-09-30T21:42:02Z|00514|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Sep 30 21:42:02 compute-0 nova_compute[192810]: 2025-09-30 21:42:02.794 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:03 compute-0 nova_compute[192810]: 2025-09-30 21:42:03.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:03 compute-0 nova_compute[192810]: 2025-09-30 21:42:03.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:42:05 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:42:05.294 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=31, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=30) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:42:05 compute-0 nova_compute[192810]: 2025-09-30 21:42:05.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:05 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:42:05.295 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:42:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:42:06.297 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '31'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:42:06 compute-0 podman[241247]: 2025-09-30 21:42:06.314273514 +0000 UTC m=+0.048833069 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent)
Sep 30 21:42:06 compute-0 podman[241248]: 2025-09-30 21:42:06.321338468 +0000 UTC m=+0.050308256 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:42:06 compute-0 podman[241246]: 2025-09-30 21:42:06.339270648 +0000 UTC m=+0.076615522 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250923)
Sep 30 21:42:07 compute-0 nova_compute[192810]: 2025-09-30 21:42:07.796 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:08 compute-0 nova_compute[192810]: 2025-09-30 21:42:08.249 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:12 compute-0 nova_compute[192810]: 2025-09-30 21:42:12.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:13 compute-0 nova_compute[192810]: 2025-09-30 21:42:13.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:42:15.083 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ed:2b:af 10.100.0.2 2001:db8::f816:3eff:feed:2baf'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:feed:2baf/64', 'neutron:device_id': 'ovnmeta-0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=59e9be54-e45c-44fd-b8d8-090d788a8eee, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=5f2fed01-3e8a-4c9d-b394-f15b7f60366f) old=Port_Binding(mac=['fa:16:3e:ed:2b:af 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:42:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:42:15.084 103867 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 5f2fed01-3e8a-4c9d-b394-f15b7f60366f in datapath 0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb updated
Sep 30 21:42:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:42:15.086 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:42:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:42:15.087 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[678a7e9e-dae8-4fae-bb31-49e92dacad68]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:42:15 compute-0 podman[241304]: 2025-09-30 21:42:15.304659128 +0000 UTC m=+0.045675122 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 21:42:15 compute-0 podman[241305]: 2025-09-30 21:42:15.31535716 +0000 UTC m=+0.052114460 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, distribution-scope=public, vcs-type=git, version=9.6, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9)
Sep 30 21:42:17 compute-0 nova_compute[192810]: 2025-09-30 21:42:17.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:18 compute-0 nova_compute[192810]: 2025-09-30 21:42:18.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:22 compute-0 nova_compute[192810]: 2025-09-30 21:42:22.758 2 DEBUG oslo_concurrency.lockutils [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "3fab2833-fa74-4eb1-bef8-aa51780aa0f4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:42:22 compute-0 nova_compute[192810]: 2025-09-30 21:42:22.758 2 DEBUG oslo_concurrency.lockutils [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "3fab2833-fa74-4eb1-bef8-aa51780aa0f4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:42:22 compute-0 nova_compute[192810]: 2025-09-30 21:42:22.783 2 DEBUG nova.compute.manager [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:42:22 compute-0 nova_compute[192810]: 2025-09-30 21:42:22.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:22 compute-0 nova_compute[192810]: 2025-09-30 21:42:22.896 2 DEBUG oslo_concurrency.lockutils [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:42:22 compute-0 nova_compute[192810]: 2025-09-30 21:42:22.896 2 DEBUG oslo_concurrency.lockutils [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:42:22 compute-0 nova_compute[192810]: 2025-09-30 21:42:22.903 2 DEBUG nova.virt.hardware [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:42:22 compute-0 nova_compute[192810]: 2025-09-30 21:42:22.903 2 INFO nova.compute.claims [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:42:23 compute-0 nova_compute[192810]: 2025-09-30 21:42:23.047 2 DEBUG nova.compute.provider_tree [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:42:23 compute-0 nova_compute[192810]: 2025-09-30 21:42:23.064 2 DEBUG nova.scheduler.client.report [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:42:23 compute-0 nova_compute[192810]: 2025-09-30 21:42:23.124 2 DEBUG oslo_concurrency.lockutils [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.228s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:42:23 compute-0 nova_compute[192810]: 2025-09-30 21:42:23.125 2 DEBUG nova.compute.manager [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:42:23 compute-0 nova_compute[192810]: 2025-09-30 21:42:23.204 2 DEBUG nova.compute.manager [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:42:23 compute-0 nova_compute[192810]: 2025-09-30 21:42:23.204 2 DEBUG nova.network.neutron [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:42:23 compute-0 nova_compute[192810]: 2025-09-30 21:42:23.230 2 INFO nova.virt.libvirt.driver [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:42:23 compute-0 nova_compute[192810]: 2025-09-30 21:42:23.261 2 DEBUG nova.compute.manager [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:42:23 compute-0 nova_compute[192810]: 2025-09-30 21:42:23.275 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:23 compute-0 podman[241348]: 2025-09-30 21:42:23.312679833 +0000 UTC m=+0.049191999 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=iscsid, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Sep 30 21:42:23 compute-0 podman[241349]: 2025-09-30 21:42:23.313019301 +0000 UTC m=+0.047669441 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 21:42:23 compute-0 podman[241347]: 2025-09-30 21:42:23.340424633 +0000 UTC m=+0.080413465 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0)
Sep 30 21:42:23 compute-0 nova_compute[192810]: 2025-09-30 21:42:23.394 2 DEBUG nova.compute.manager [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:42:23 compute-0 nova_compute[192810]: 2025-09-30 21:42:23.395 2 DEBUG nova.virt.libvirt.driver [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:42:23 compute-0 nova_compute[192810]: 2025-09-30 21:42:23.395 2 INFO nova.virt.libvirt.driver [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Creating image(s)
Sep 30 21:42:23 compute-0 nova_compute[192810]: 2025-09-30 21:42:23.395 2 DEBUG oslo_concurrency.lockutils [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "/var/lib/nova/instances/3fab2833-fa74-4eb1-bef8-aa51780aa0f4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:42:23 compute-0 nova_compute[192810]: 2025-09-30 21:42:23.396 2 DEBUG oslo_concurrency.lockutils [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "/var/lib/nova/instances/3fab2833-fa74-4eb1-bef8-aa51780aa0f4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:42:23 compute-0 nova_compute[192810]: 2025-09-30 21:42:23.396 2 DEBUG oslo_concurrency.lockutils [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "/var/lib/nova/instances/3fab2833-fa74-4eb1-bef8-aa51780aa0f4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:42:23 compute-0 nova_compute[192810]: 2025-09-30 21:42:23.408 2 DEBUG oslo_concurrency.processutils [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:42:23 compute-0 nova_compute[192810]: 2025-09-30 21:42:23.460 2 DEBUG oslo_concurrency.processutils [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:42:23 compute-0 nova_compute[192810]: 2025-09-30 21:42:23.461 2 DEBUG oslo_concurrency.lockutils [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:42:23 compute-0 nova_compute[192810]: 2025-09-30 21:42:23.461 2 DEBUG oslo_concurrency.lockutils [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:42:23 compute-0 nova_compute[192810]: 2025-09-30 21:42:23.472 2 DEBUG oslo_concurrency.processutils [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:42:23 compute-0 nova_compute[192810]: 2025-09-30 21:42:23.499 2 DEBUG nova.policy [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:42:23 compute-0 nova_compute[192810]: 2025-09-30 21:42:23.523 2 DEBUG oslo_concurrency.processutils [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:42:23 compute-0 nova_compute[192810]: 2025-09-30 21:42:23.524 2 DEBUG oslo_concurrency.processutils [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/3fab2833-fa74-4eb1-bef8-aa51780aa0f4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:42:23 compute-0 nova_compute[192810]: 2025-09-30 21:42:23.567 2 DEBUG oslo_concurrency.processutils [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/3fab2833-fa74-4eb1-bef8-aa51780aa0f4/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:42:23 compute-0 nova_compute[192810]: 2025-09-30 21:42:23.568 2 DEBUG oslo_concurrency.lockutils [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:42:23 compute-0 nova_compute[192810]: 2025-09-30 21:42:23.569 2 DEBUG oslo_concurrency.processutils [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:42:23 compute-0 nova_compute[192810]: 2025-09-30 21:42:23.630 2 DEBUG oslo_concurrency.processutils [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:42:23 compute-0 nova_compute[192810]: 2025-09-30 21:42:23.631 2 DEBUG nova.virt.disk.api [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Checking if we can resize image /var/lib/nova/instances/3fab2833-fa74-4eb1-bef8-aa51780aa0f4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:42:23 compute-0 nova_compute[192810]: 2025-09-30 21:42:23.631 2 DEBUG oslo_concurrency.processutils [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3fab2833-fa74-4eb1-bef8-aa51780aa0f4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:42:23 compute-0 nova_compute[192810]: 2025-09-30 21:42:23.687 2 DEBUG oslo_concurrency.processutils [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3fab2833-fa74-4eb1-bef8-aa51780aa0f4/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:42:23 compute-0 nova_compute[192810]: 2025-09-30 21:42:23.688 2 DEBUG nova.virt.disk.api [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Cannot resize image /var/lib/nova/instances/3fab2833-fa74-4eb1-bef8-aa51780aa0f4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:42:23 compute-0 nova_compute[192810]: 2025-09-30 21:42:23.689 2 DEBUG nova.objects.instance [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lazy-loading 'migration_context' on Instance uuid 3fab2833-fa74-4eb1-bef8-aa51780aa0f4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:42:23 compute-0 nova_compute[192810]: 2025-09-30 21:42:23.709 2 DEBUG nova.virt.libvirt.driver [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:42:23 compute-0 nova_compute[192810]: 2025-09-30 21:42:23.710 2 DEBUG nova.virt.libvirt.driver [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Ensure instance console log exists: /var/lib/nova/instances/3fab2833-fa74-4eb1-bef8-aa51780aa0f4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:42:23 compute-0 nova_compute[192810]: 2025-09-30 21:42:23.711 2 DEBUG oslo_concurrency.lockutils [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:42:23 compute-0 nova_compute[192810]: 2025-09-30 21:42:23.711 2 DEBUG oslo_concurrency.lockutils [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:42:23 compute-0 nova_compute[192810]: 2025-09-30 21:42:23.711 2 DEBUG oslo_concurrency.lockutils [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:42:24 compute-0 nova_compute[192810]: 2025-09-30 21:42:24.927 2 DEBUG nova.network.neutron [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Successfully created port: 8e89083c-856f-40a4-b47f-444adcedd301 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:42:26 compute-0 nova_compute[192810]: 2025-09-30 21:42:26.338 2 DEBUG nova.network.neutron [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Successfully updated port: 8e89083c-856f-40a4-b47f-444adcedd301 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:42:26 compute-0 nova_compute[192810]: 2025-09-30 21:42:26.355 2 DEBUG oslo_concurrency.lockutils [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "refresh_cache-3fab2833-fa74-4eb1-bef8-aa51780aa0f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:42:26 compute-0 nova_compute[192810]: 2025-09-30 21:42:26.355 2 DEBUG oslo_concurrency.lockutils [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquired lock "refresh_cache-3fab2833-fa74-4eb1-bef8-aa51780aa0f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:42:26 compute-0 nova_compute[192810]: 2025-09-30 21:42:26.355 2 DEBUG nova.network.neutron [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:42:26 compute-0 nova_compute[192810]: 2025-09-30 21:42:26.473 2 DEBUG nova.compute.manager [req-a0007794-c4c7-43b7-aa56-86c14c0a293b req-bcad85ca-8079-48a5-a7ce-6d3919d8cdc5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Received event network-changed-8e89083c-856f-40a4-b47f-444adcedd301 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:42:26 compute-0 nova_compute[192810]: 2025-09-30 21:42:26.473 2 DEBUG nova.compute.manager [req-a0007794-c4c7-43b7-aa56-86c14c0a293b req-bcad85ca-8079-48a5-a7ce-6d3919d8cdc5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Refreshing instance network info cache due to event network-changed-8e89083c-856f-40a4-b47f-444adcedd301. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:42:26 compute-0 nova_compute[192810]: 2025-09-30 21:42:26.473 2 DEBUG oslo_concurrency.lockutils [req-a0007794-c4c7-43b7-aa56-86c14c0a293b req-bcad85ca-8079-48a5-a7ce-6d3919d8cdc5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-3fab2833-fa74-4eb1-bef8-aa51780aa0f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:42:26 compute-0 nova_compute[192810]: 2025-09-30 21:42:26.541 2 DEBUG nova.network.neutron [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:42:27 compute-0 nova_compute[192810]: 2025-09-30 21:42:27.803 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:28 compute-0 nova_compute[192810]: 2025-09-30 21:42:28.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:29 compute-0 nova_compute[192810]: 2025-09-30 21:42:29.883 2 DEBUG nova.network.neutron [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Updating instance_info_cache with network_info: [{"id": "8e89083c-856f-40a4-b47f-444adcedd301", "address": "fa:16:3e:33:32:85", "network": {"id": "0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb", "bridge": "br-int", "label": "tempest-network-smoke--1321635209", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe33:3285", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8e89083c-85", "ovs_interfaceid": "8e89083c-856f-40a4-b47f-444adcedd301", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:42:29 compute-0 nova_compute[192810]: 2025-09-30 21:42:29.916 2 DEBUG oslo_concurrency.lockutils [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Releasing lock "refresh_cache-3fab2833-fa74-4eb1-bef8-aa51780aa0f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:42:29 compute-0 nova_compute[192810]: 2025-09-30 21:42:29.916 2 DEBUG nova.compute.manager [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Instance network_info: |[{"id": "8e89083c-856f-40a4-b47f-444adcedd301", "address": "fa:16:3e:33:32:85", "network": {"id": "0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb", "bridge": "br-int", "label": "tempest-network-smoke--1321635209", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe33:3285", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8e89083c-85", "ovs_interfaceid": "8e89083c-856f-40a4-b47f-444adcedd301", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:42:29 compute-0 nova_compute[192810]: 2025-09-30 21:42:29.917 2 DEBUG oslo_concurrency.lockutils [req-a0007794-c4c7-43b7-aa56-86c14c0a293b req-bcad85ca-8079-48a5-a7ce-6d3919d8cdc5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-3fab2833-fa74-4eb1-bef8-aa51780aa0f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:42:29 compute-0 nova_compute[192810]: 2025-09-30 21:42:29.917 2 DEBUG nova.network.neutron [req-a0007794-c4c7-43b7-aa56-86c14c0a293b req-bcad85ca-8079-48a5-a7ce-6d3919d8cdc5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Refreshing network info cache for port 8e89083c-856f-40a4-b47f-444adcedd301 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:42:29 compute-0 nova_compute[192810]: 2025-09-30 21:42:29.919 2 DEBUG nova.virt.libvirt.driver [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Start _get_guest_xml network_info=[{"id": "8e89083c-856f-40a4-b47f-444adcedd301", "address": "fa:16:3e:33:32:85", "network": {"id": "0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb", "bridge": "br-int", "label": "tempest-network-smoke--1321635209", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe33:3285", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8e89083c-85", "ovs_interfaceid": "8e89083c-856f-40a4-b47f-444adcedd301", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:42:29 compute-0 nova_compute[192810]: 2025-09-30 21:42:29.924 2 WARNING nova.virt.libvirt.driver [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:42:29 compute-0 nova_compute[192810]: 2025-09-30 21:42:29.930 2 DEBUG nova.virt.libvirt.host [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:42:29 compute-0 nova_compute[192810]: 2025-09-30 21:42:29.930 2 DEBUG nova.virt.libvirt.host [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:42:29 compute-0 nova_compute[192810]: 2025-09-30 21:42:29.933 2 DEBUG nova.virt.libvirt.host [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:42:29 compute-0 nova_compute[192810]: 2025-09-30 21:42:29.934 2 DEBUG nova.virt.libvirt.host [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:42:29 compute-0 nova_compute[192810]: 2025-09-30 21:42:29.935 2 DEBUG nova.virt.libvirt.driver [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:42:29 compute-0 nova_compute[192810]: 2025-09-30 21:42:29.935 2 DEBUG nova.virt.hardware [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:42:29 compute-0 nova_compute[192810]: 2025-09-30 21:42:29.935 2 DEBUG nova.virt.hardware [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:42:29 compute-0 nova_compute[192810]: 2025-09-30 21:42:29.935 2 DEBUG nova.virt.hardware [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:42:29 compute-0 nova_compute[192810]: 2025-09-30 21:42:29.936 2 DEBUG nova.virt.hardware [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:42:29 compute-0 nova_compute[192810]: 2025-09-30 21:42:29.936 2 DEBUG nova.virt.hardware [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:42:29 compute-0 nova_compute[192810]: 2025-09-30 21:42:29.936 2 DEBUG nova.virt.hardware [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:42:29 compute-0 nova_compute[192810]: 2025-09-30 21:42:29.936 2 DEBUG nova.virt.hardware [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:42:29 compute-0 nova_compute[192810]: 2025-09-30 21:42:29.936 2 DEBUG nova.virt.hardware [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:42:29 compute-0 nova_compute[192810]: 2025-09-30 21:42:29.937 2 DEBUG nova.virt.hardware [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:42:29 compute-0 nova_compute[192810]: 2025-09-30 21:42:29.937 2 DEBUG nova.virt.hardware [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:42:29 compute-0 nova_compute[192810]: 2025-09-30 21:42:29.937 2 DEBUG nova.virt.hardware [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:42:29 compute-0 nova_compute[192810]: 2025-09-30 21:42:29.940 2 DEBUG nova.virt.libvirt.vif [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:42:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-570822479',display_name='tempest-TestGettingAddress-server-570822479',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-570822479',id=139,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEbsRSYvS8JJesfheyHbOoMyGCfPyJ5GpWwmJctOd2uZhE42aOz5OjKoIYzt5JXLihLuVhzcSbvOK7e8JNJviPXasiSc9g8x5CPrMMEKuf701EleKp6zJuNvQR4CPNWu/A==',key_name='tempest-TestGettingAddress-696430415',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-hlf9dxg4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:42:23Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=3fab2833-fa74-4eb1-bef8-aa51780aa0f4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8e89083c-856f-40a4-b47f-444adcedd301", "address": "fa:16:3e:33:32:85", "network": {"id": "0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb", "bridge": "br-int", "label": "tempest-network-smoke--1321635209", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe33:3285", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8e89083c-85", "ovs_interfaceid": "8e89083c-856f-40a4-b47f-444adcedd301", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:42:29 compute-0 nova_compute[192810]: 2025-09-30 21:42:29.941 2 DEBUG nova.network.os_vif_util [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "8e89083c-856f-40a4-b47f-444adcedd301", "address": "fa:16:3e:33:32:85", "network": {"id": "0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb", "bridge": "br-int", "label": "tempest-network-smoke--1321635209", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe33:3285", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8e89083c-85", "ovs_interfaceid": "8e89083c-856f-40a4-b47f-444adcedd301", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:42:29 compute-0 nova_compute[192810]: 2025-09-30 21:42:29.941 2 DEBUG nova.network.os_vif_util [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:33:32:85,bridge_name='br-int',has_traffic_filtering=True,id=8e89083c-856f-40a4-b47f-444adcedd301,network=Network(0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8e89083c-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:42:29 compute-0 nova_compute[192810]: 2025-09-30 21:42:29.942 2 DEBUG nova.objects.instance [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lazy-loading 'pci_devices' on Instance uuid 3fab2833-fa74-4eb1-bef8-aa51780aa0f4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:42:29 compute-0 nova_compute[192810]: 2025-09-30 21:42:29.963 2 DEBUG nova.virt.libvirt.driver [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:42:29 compute-0 nova_compute[192810]:   <uuid>3fab2833-fa74-4eb1-bef8-aa51780aa0f4</uuid>
Sep 30 21:42:29 compute-0 nova_compute[192810]:   <name>instance-0000008b</name>
Sep 30 21:42:29 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:42:29 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:42:29 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:42:29 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:       <nova:name>tempest-TestGettingAddress-server-570822479</nova:name>
Sep 30 21:42:29 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:42:29</nova:creationTime>
Sep 30 21:42:29 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:42:29 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:42:29 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:42:29 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:42:29 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:42:29 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:42:29 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:42:29 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:42:29 compute-0 nova_compute[192810]:         <nova:user uuid="5ffd1d7824fe413499994bd48b9f820f">tempest-TestGettingAddress-2056138166-project-member</nova:user>
Sep 30 21:42:29 compute-0 nova_compute[192810]:         <nova:project uuid="71b1e8c3c45e4ff8bc99e66bd1bfef7c">tempest-TestGettingAddress-2056138166</nova:project>
Sep 30 21:42:29 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:42:29 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:42:29 compute-0 nova_compute[192810]:         <nova:port uuid="8e89083c-856f-40a4-b47f-444adcedd301">
Sep 30 21:42:29 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fe33:3285" ipVersion="6"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:42:29 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:42:29 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:42:29 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:42:29 compute-0 nova_compute[192810]:     <system>
Sep 30 21:42:29 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:42:29 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:42:29 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:42:29 compute-0 nova_compute[192810]:       <entry name="serial">3fab2833-fa74-4eb1-bef8-aa51780aa0f4</entry>
Sep 30 21:42:29 compute-0 nova_compute[192810]:       <entry name="uuid">3fab2833-fa74-4eb1-bef8-aa51780aa0f4</entry>
Sep 30 21:42:29 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     </system>
Sep 30 21:42:29 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:42:29 compute-0 nova_compute[192810]:   <os>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:   </os>
Sep 30 21:42:29 compute-0 nova_compute[192810]:   <features>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:   </features>
Sep 30 21:42:29 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:42:29 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:42:29 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:42:29 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:42:29 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:42:29 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/3fab2833-fa74-4eb1-bef8-aa51780aa0f4/disk"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:42:29 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/3fab2833-fa74-4eb1-bef8-aa51780aa0f4/disk.config"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:42:29 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:33:32:85"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:       <target dev="tap8e89083c-85"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:42:29 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/3fab2833-fa74-4eb1-bef8-aa51780aa0f4/console.log" append="off"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     <video>
Sep 30 21:42:29 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     </video>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:42:29 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:42:29 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:42:29 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:42:29 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:42:29 compute-0 nova_compute[192810]: </domain>
Sep 30 21:42:29 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:42:29 compute-0 nova_compute[192810]: 2025-09-30 21:42:29.965 2 DEBUG nova.compute.manager [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Preparing to wait for external event network-vif-plugged-8e89083c-856f-40a4-b47f-444adcedd301 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:42:29 compute-0 nova_compute[192810]: 2025-09-30 21:42:29.965 2 DEBUG oslo_concurrency.lockutils [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "3fab2833-fa74-4eb1-bef8-aa51780aa0f4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:42:29 compute-0 nova_compute[192810]: 2025-09-30 21:42:29.965 2 DEBUG oslo_concurrency.lockutils [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "3fab2833-fa74-4eb1-bef8-aa51780aa0f4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:42:29 compute-0 nova_compute[192810]: 2025-09-30 21:42:29.965 2 DEBUG oslo_concurrency.lockutils [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "3fab2833-fa74-4eb1-bef8-aa51780aa0f4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:42:29 compute-0 nova_compute[192810]: 2025-09-30 21:42:29.966 2 DEBUG nova.virt.libvirt.vif [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:42:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-570822479',display_name='tempest-TestGettingAddress-server-570822479',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-570822479',id=139,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEbsRSYvS8JJesfheyHbOoMyGCfPyJ5GpWwmJctOd2uZhE42aOz5OjKoIYzt5JXLihLuVhzcSbvOK7e8JNJviPXasiSc9g8x5CPrMMEKuf701EleKp6zJuNvQR4CPNWu/A==',key_name='tempest-TestGettingAddress-696430415',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-hlf9dxg4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:42:23Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=3fab2833-fa74-4eb1-bef8-aa51780aa0f4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8e89083c-856f-40a4-b47f-444adcedd301", "address": "fa:16:3e:33:32:85", "network": {"id": "0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb", "bridge": "br-int", "label": "tempest-network-smoke--1321635209", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe33:3285", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8e89083c-85", "ovs_interfaceid": "8e89083c-856f-40a4-b47f-444adcedd301", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:42:29 compute-0 nova_compute[192810]: 2025-09-30 21:42:29.966 2 DEBUG nova.network.os_vif_util [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "8e89083c-856f-40a4-b47f-444adcedd301", "address": "fa:16:3e:33:32:85", "network": {"id": "0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb", "bridge": "br-int", "label": "tempest-network-smoke--1321635209", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe33:3285", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8e89083c-85", "ovs_interfaceid": "8e89083c-856f-40a4-b47f-444adcedd301", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:42:29 compute-0 nova_compute[192810]: 2025-09-30 21:42:29.967 2 DEBUG nova.network.os_vif_util [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:33:32:85,bridge_name='br-int',has_traffic_filtering=True,id=8e89083c-856f-40a4-b47f-444adcedd301,network=Network(0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8e89083c-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:42:29 compute-0 nova_compute[192810]: 2025-09-30 21:42:29.967 2 DEBUG os_vif [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:33:32:85,bridge_name='br-int',has_traffic_filtering=True,id=8e89083c-856f-40a4-b47f-444adcedd301,network=Network(0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8e89083c-85') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:42:29 compute-0 nova_compute[192810]: 2025-09-30 21:42:29.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:29 compute-0 nova_compute[192810]: 2025-09-30 21:42:29.968 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:42:29 compute-0 nova_compute[192810]: 2025-09-30 21:42:29.968 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:42:29 compute-0 nova_compute[192810]: 2025-09-30 21:42:29.971 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:29 compute-0 nova_compute[192810]: 2025-09-30 21:42:29.971 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8e89083c-85, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:42:29 compute-0 nova_compute[192810]: 2025-09-30 21:42:29.971 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8e89083c-85, col_values=(('external_ids', {'iface-id': '8e89083c-856f-40a4-b47f-444adcedd301', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:33:32:85', 'vm-uuid': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:42:29 compute-0 nova_compute[192810]: 2025-09-30 21:42:29.973 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:29 compute-0 NetworkManager[51733]: <info>  [1759268549.9751] manager: (tap8e89083c-85): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/225)
Sep 30 21:42:29 compute-0 nova_compute[192810]: 2025-09-30 21:42:29.977 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:42:29 compute-0 nova_compute[192810]: 2025-09-30 21:42:29.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:29 compute-0 nova_compute[192810]: 2025-09-30 21:42:29.981 2 INFO os_vif [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:33:32:85,bridge_name='br-int',has_traffic_filtering=True,id=8e89083c-856f-40a4-b47f-444adcedd301,network=Network(0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8e89083c-85')
Sep 30 21:42:30 compute-0 nova_compute[192810]: 2025-09-30 21:42:30.063 2 DEBUG nova.virt.libvirt.driver [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:42:30 compute-0 nova_compute[192810]: 2025-09-30 21:42:30.063 2 DEBUG nova.virt.libvirt.driver [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:42:30 compute-0 nova_compute[192810]: 2025-09-30 21:42:30.063 2 DEBUG nova.virt.libvirt.driver [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] No VIF found with MAC fa:16:3e:33:32:85, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:42:30 compute-0 nova_compute[192810]: 2025-09-30 21:42:30.064 2 INFO nova.virt.libvirt.driver [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Using config drive
Sep 30 21:42:30 compute-0 nova_compute[192810]: 2025-09-30 21:42:30.704 2 INFO nova.virt.libvirt.driver [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Creating config drive at /var/lib/nova/instances/3fab2833-fa74-4eb1-bef8-aa51780aa0f4/disk.config
Sep 30 21:42:30 compute-0 nova_compute[192810]: 2025-09-30 21:42:30.713 2 DEBUG oslo_concurrency.processutils [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3fab2833-fa74-4eb1-bef8-aa51780aa0f4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcuohg1g1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:42:30 compute-0 nova_compute[192810]: 2025-09-30 21:42:30.857 2 DEBUG oslo_concurrency.processutils [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3fab2833-fa74-4eb1-bef8-aa51780aa0f4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcuohg1g1" returned: 0 in 0.144s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:42:30 compute-0 kernel: tap8e89083c-85: entered promiscuous mode
Sep 30 21:42:30 compute-0 NetworkManager[51733]: <info>  [1759268550.9305] manager: (tap8e89083c-85): new Tun device (/org/freedesktop/NetworkManager/Devices/226)
Sep 30 21:42:30 compute-0 nova_compute[192810]: 2025-09-30 21:42:30.929 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:30 compute-0 ovn_controller[94912]: 2025-09-30T21:42:30Z|00515|binding|INFO|Claiming lport 8e89083c-856f-40a4-b47f-444adcedd301 for this chassis.
Sep 30 21:42:30 compute-0 ovn_controller[94912]: 2025-09-30T21:42:30Z|00516|binding|INFO|8e89083c-856f-40a4-b47f-444adcedd301: Claiming fa:16:3e:33:32:85 10.100.0.13 2001:db8::f816:3eff:fe33:3285
Sep 30 21:42:30 compute-0 nova_compute[192810]: 2025-09-30 21:42:30.932 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:30 compute-0 nova_compute[192810]: 2025-09-30 21:42:30.938 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:30 compute-0 nova_compute[192810]: 2025-09-30 21:42:30.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:30 compute-0 systemd-machined[152794]: New machine qemu-68-instance-0000008b.
Sep 30 21:42:30 compute-0 systemd[1]: Started Virtual Machine qemu-68-instance-0000008b.
Sep 30 21:42:30 compute-0 ovn_controller[94912]: 2025-09-30T21:42:30Z|00517|binding|INFO|Setting lport 8e89083c-856f-40a4-b47f-444adcedd301 ovn-installed in OVS
Sep 30 21:42:31 compute-0 nova_compute[192810]: 2025-09-30 21:42:30.999 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:31 compute-0 systemd-udevd[241441]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:42:31 compute-0 NetworkManager[51733]: <info>  [1759268551.0229] device (tap8e89083c-85): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:42:31 compute-0 NetworkManager[51733]: <info>  [1759268551.0248] device (tap8e89083c-85): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:42:31.199 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:33:32:85 10.100.0.13 2001:db8::f816:3eff:fe33:3285'], port_security=['fa:16:3e:33:32:85 10.100.0.13 2001:db8::f816:3eff:fe33:3285'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28 2001:db8::f816:3eff:fe33:3285/64', 'neutron:device_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8133a1ec-543c-40bd-a8db-7ecfb4b9f0ac', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=59e9be54-e45c-44fd-b8d8-090d788a8eee, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=8e89083c-856f-40a4-b47f-444adcedd301) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:42:31.201 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 8e89083c-856f-40a4-b47f-444adcedd301 in datapath 0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb bound to our chassis
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:42:31.203 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb
Sep 30 21:42:31 compute-0 ovn_controller[94912]: 2025-09-30T21:42:31Z|00518|binding|INFO|Setting lport 8e89083c-856f-40a4-b47f-444adcedd301 up in Southbound
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:42:31.223 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[dff4e804-2b8f-4f9d-b0ce-ce88fb8dc3c0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:42:31.224 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0670a52f-e1 in ovnmeta-0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:42:31.226 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0670a52f-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:42:31.226 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e8c1e872-d736-482d-ab74-e1be22a02681]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:42:31.228 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[867aa2ed-cd74-4545-86b5-46064523133f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:42:31.240 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[66e709ec-853b-40b7-9f39-08f7735dfed7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:42:31.270 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[c5e282aa-0175-4e1f-8311-3b3930426dc0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:42:31.310 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[6f86a225-5bd3-49af-9439-83a9a6203220]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:42:31.317 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a6afec8e-c638-4204-a595-c6d4c88b1454]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:42:31 compute-0 NetworkManager[51733]: <info>  [1759268551.3186] manager: (tap0670a52f-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/227)
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:42:31.363 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[68cac09a-9598-4a17-bf6b-e937bd9efa2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:42:31.369 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[cae76653-9d51-45a5-ba7e-0fc349440fe8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:42:31 compute-0 NetworkManager[51733]: <info>  [1759268551.3989] device (tap0670a52f-e0): carrier: link connected
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:42:31.404 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[1f62a712-eff5-4118-9633-6773a6aa9aca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:42:31.427 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[4894e3d5-657d-483c-86d5-7176be290683]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0670a52f-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ed:2b:af'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 157], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 524702, 'reachable_time': 22070, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241474, 'error': None, 'target': 'ovnmeta-0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:42:31.448 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[73fedfc6-ae3d-4926-872e-95a53979d22c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feed:2baf'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 524702, 'tstamp': 524702}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241475, 'error': None, 'target': 'ovnmeta-0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:42:31.469 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[d3124d13-52e1-49e4-ad6d-7332cd90b7ed]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0670a52f-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ed:2b:af'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 157], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 524702, 'reachable_time': 22070, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 241476, 'error': None, 'target': 'ovnmeta-0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:42:31.503 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[8726e2c1-51a5-4dba-85da-f84601e4020d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:42:31.565 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[9145ed51-137f-4825-9b33-ad35e1a4f20d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:42:31.567 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0670a52f-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:42:31.567 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:42:31.567 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0670a52f-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:42:31 compute-0 nova_compute[192810]: 2025-09-30 21:42:31.569 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:31 compute-0 NetworkManager[51733]: <info>  [1759268551.5700] manager: (tap0670a52f-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/228)
Sep 30 21:42:31 compute-0 kernel: tap0670a52f-e0: entered promiscuous mode
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:42:31.571 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0670a52f-e0, col_values=(('external_ids', {'iface-id': '5f2fed01-3e8a-4c9d-b394-f15b7f60366f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:42:31 compute-0 ovn_controller[94912]: 2025-09-30T21:42:31Z|00519|binding|INFO|Releasing lport 5f2fed01-3e8a-4c9d-b394-f15b7f60366f from this chassis (sb_readonly=0)
Sep 30 21:42:31 compute-0 nova_compute[192810]: 2025-09-30 21:42:31.588 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:42:31.589 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:42:31.590 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[17f675c1-fb6a-454d-98b0-72efab0c422e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:42:31.590 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb.pid.haproxy
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID 0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:42:31 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:42:31.591 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb', 'env', 'PROCESS_TAG=haproxy-0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:42:32 compute-0 podman[241515]: 2025-09-30 21:42:31.963233536 +0000 UTC m=+0.034886507 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:42:32 compute-0 nova_compute[192810]: 2025-09-30 21:42:32.082 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268552.081724, 3fab2833-fa74-4eb1-bef8-aa51780aa0f4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:42:32 compute-0 nova_compute[192810]: 2025-09-30 21:42:32.082 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] VM Started (Lifecycle Event)
Sep 30 21:42:32 compute-0 nova_compute[192810]: 2025-09-30 21:42:32.262 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:42:32 compute-0 nova_compute[192810]: 2025-09-30 21:42:32.266 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268552.0819697, 3fab2833-fa74-4eb1-bef8-aa51780aa0f4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:42:32 compute-0 nova_compute[192810]: 2025-09-30 21:42:32.267 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] VM Paused (Lifecycle Event)
Sep 30 21:42:32 compute-0 nova_compute[192810]: 2025-09-30 21:42:32.307 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:42:32 compute-0 nova_compute[192810]: 2025-09-30 21:42:32.313 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:42:32 compute-0 podman[241515]: 2025-09-30 21:42:32.322102712 +0000 UTC m=+0.393755663 container create cc8fe1f958a6714e6a8558d21fdf59faadaa661fae1d47f504392467ba5f72de (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0)
Sep 30 21:42:32 compute-0 nova_compute[192810]: 2025-09-30 21:42:32.334 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:42:32 compute-0 systemd[1]: Started libpod-conmon-cc8fe1f958a6714e6a8558d21fdf59faadaa661fae1d47f504392467ba5f72de.scope.
Sep 30 21:42:32 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:42:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7d711ab402cdd4de752ae9ec0c1a0970e6d1421acffa82cd737adbcdab9abbdb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:42:32 compute-0 podman[241515]: 2025-09-30 21:42:32.415847974 +0000 UTC m=+0.487500945 container init cc8fe1f958a6714e6a8558d21fdf59faadaa661fae1d47f504392467ba5f72de (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:42:32 compute-0 podman[241515]: 2025-09-30 21:42:32.421353489 +0000 UTC m=+0.493006480 container start cc8fe1f958a6714e6a8558d21fdf59faadaa661fae1d47f504392467ba5f72de (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:42:32 compute-0 neutron-haproxy-ovnmeta-0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb[241530]: [NOTICE]   (241534) : New worker (241536) forked
Sep 30 21:42:32 compute-0 neutron-haproxy-ovnmeta-0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb[241530]: [NOTICE]   (241534) : Loading success.
Sep 30 21:42:32 compute-0 nova_compute[192810]: 2025-09-30 21:42:32.622 2 DEBUG nova.compute.manager [req-7c64d58f-f74a-4c34-a24b-e0650593a819 req-85c25876-151b-4b90-98e2-da10df1ea876 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Received event network-vif-plugged-8e89083c-856f-40a4-b47f-444adcedd301 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:42:32 compute-0 nova_compute[192810]: 2025-09-30 21:42:32.622 2 DEBUG oslo_concurrency.lockutils [req-7c64d58f-f74a-4c34-a24b-e0650593a819 req-85c25876-151b-4b90-98e2-da10df1ea876 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3fab2833-fa74-4eb1-bef8-aa51780aa0f4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:42:32 compute-0 nova_compute[192810]: 2025-09-30 21:42:32.623 2 DEBUG oslo_concurrency.lockutils [req-7c64d58f-f74a-4c34-a24b-e0650593a819 req-85c25876-151b-4b90-98e2-da10df1ea876 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3fab2833-fa74-4eb1-bef8-aa51780aa0f4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:42:32 compute-0 nova_compute[192810]: 2025-09-30 21:42:32.623 2 DEBUG oslo_concurrency.lockutils [req-7c64d58f-f74a-4c34-a24b-e0650593a819 req-85c25876-151b-4b90-98e2-da10df1ea876 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3fab2833-fa74-4eb1-bef8-aa51780aa0f4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:42:32 compute-0 nova_compute[192810]: 2025-09-30 21:42:32.623 2 DEBUG nova.compute.manager [req-7c64d58f-f74a-4c34-a24b-e0650593a819 req-85c25876-151b-4b90-98e2-da10df1ea876 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Processing event network-vif-plugged-8e89083c-856f-40a4-b47f-444adcedd301 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:42:32 compute-0 nova_compute[192810]: 2025-09-30 21:42:32.624 2 DEBUG nova.compute.manager [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:42:32 compute-0 nova_compute[192810]: 2025-09-30 21:42:32.627 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268552.6275268, 3fab2833-fa74-4eb1-bef8-aa51780aa0f4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:42:32 compute-0 nova_compute[192810]: 2025-09-30 21:42:32.627 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] VM Resumed (Lifecycle Event)
Sep 30 21:42:32 compute-0 nova_compute[192810]: 2025-09-30 21:42:32.629 2 DEBUG nova.virt.libvirt.driver [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:42:32 compute-0 nova_compute[192810]: 2025-09-30 21:42:32.631 2 INFO nova.virt.libvirt.driver [-] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Instance spawned successfully.
Sep 30 21:42:32 compute-0 nova_compute[192810]: 2025-09-30 21:42:32.632 2 DEBUG nova.virt.libvirt.driver [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:42:32 compute-0 nova_compute[192810]: 2025-09-30 21:42:32.663 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:42:32 compute-0 nova_compute[192810]: 2025-09-30 21:42:32.668 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:42:32 compute-0 nova_compute[192810]: 2025-09-30 21:42:32.672 2 DEBUG nova.virt.libvirt.driver [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:42:32 compute-0 nova_compute[192810]: 2025-09-30 21:42:32.672 2 DEBUG nova.virt.libvirt.driver [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:42:32 compute-0 nova_compute[192810]: 2025-09-30 21:42:32.673 2 DEBUG nova.virt.libvirt.driver [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:42:32 compute-0 nova_compute[192810]: 2025-09-30 21:42:32.673 2 DEBUG nova.virt.libvirt.driver [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:42:32 compute-0 nova_compute[192810]: 2025-09-30 21:42:32.673 2 DEBUG nova.virt.libvirt.driver [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:42:32 compute-0 nova_compute[192810]: 2025-09-30 21:42:32.674 2 DEBUG nova.virt.libvirt.driver [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:42:32 compute-0 nova_compute[192810]: 2025-09-30 21:42:32.716 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:42:32 compute-0 nova_compute[192810]: 2025-09-30 21:42:32.813 2 INFO nova.compute.manager [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Took 9.42 seconds to spawn the instance on the hypervisor.
Sep 30 21:42:32 compute-0 nova_compute[192810]: 2025-09-30 21:42:32.814 2 DEBUG nova.compute.manager [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:42:32 compute-0 nova_compute[192810]: 2025-09-30 21:42:32.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:32 compute-0 nova_compute[192810]: 2025-09-30 21:42:32.981 2 INFO nova.compute.manager [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Took 10.13 seconds to build instance.
Sep 30 21:42:33 compute-0 nova_compute[192810]: 2025-09-30 21:42:33.012 2 DEBUG oslo_concurrency.lockutils [None req-a3d9c110-22d5-4a23-8bcd-f633962f14b5 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "3fab2833-fa74-4eb1-bef8-aa51780aa0f4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.254s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:42:33 compute-0 nova_compute[192810]: 2025-09-30 21:42:33.412 2 DEBUG nova.network.neutron [req-a0007794-c4c7-43b7-aa56-86c14c0a293b req-bcad85ca-8079-48a5-a7ce-6d3919d8cdc5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Updated VIF entry in instance network info cache for port 8e89083c-856f-40a4-b47f-444adcedd301. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:42:33 compute-0 nova_compute[192810]: 2025-09-30 21:42:33.413 2 DEBUG nova.network.neutron [req-a0007794-c4c7-43b7-aa56-86c14c0a293b req-bcad85ca-8079-48a5-a7ce-6d3919d8cdc5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Updating instance_info_cache with network_info: [{"id": "8e89083c-856f-40a4-b47f-444adcedd301", "address": "fa:16:3e:33:32:85", "network": {"id": "0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb", "bridge": "br-int", "label": "tempest-network-smoke--1321635209", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe33:3285", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8e89083c-85", "ovs_interfaceid": "8e89083c-856f-40a4-b47f-444adcedd301", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:42:33 compute-0 nova_compute[192810]: 2025-09-30 21:42:33.466 2 DEBUG oslo_concurrency.lockutils [req-a0007794-c4c7-43b7-aa56-86c14c0a293b req-bcad85ca-8079-48a5-a7ce-6d3919d8cdc5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-3fab2833-fa74-4eb1-bef8-aa51780aa0f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:42:34 compute-0 nova_compute[192810]: 2025-09-30 21:42:34.918 2 DEBUG nova.compute.manager [req-3d200a11-4466-46b3-8702-1da340a33983 req-987db059-32be-42e8-b394-56b9760c6684 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Received event network-vif-plugged-8e89083c-856f-40a4-b47f-444adcedd301 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:42:34 compute-0 nova_compute[192810]: 2025-09-30 21:42:34.919 2 DEBUG oslo_concurrency.lockutils [req-3d200a11-4466-46b3-8702-1da340a33983 req-987db059-32be-42e8-b394-56b9760c6684 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3fab2833-fa74-4eb1-bef8-aa51780aa0f4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:42:34 compute-0 nova_compute[192810]: 2025-09-30 21:42:34.919 2 DEBUG oslo_concurrency.lockutils [req-3d200a11-4466-46b3-8702-1da340a33983 req-987db059-32be-42e8-b394-56b9760c6684 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3fab2833-fa74-4eb1-bef8-aa51780aa0f4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:42:34 compute-0 nova_compute[192810]: 2025-09-30 21:42:34.919 2 DEBUG oslo_concurrency.lockutils [req-3d200a11-4466-46b3-8702-1da340a33983 req-987db059-32be-42e8-b394-56b9760c6684 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3fab2833-fa74-4eb1-bef8-aa51780aa0f4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:42:34 compute-0 nova_compute[192810]: 2025-09-30 21:42:34.920 2 DEBUG nova.compute.manager [req-3d200a11-4466-46b3-8702-1da340a33983 req-987db059-32be-42e8-b394-56b9760c6684 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] No waiting events found dispatching network-vif-plugged-8e89083c-856f-40a4-b47f-444adcedd301 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:42:34 compute-0 nova_compute[192810]: 2025-09-30 21:42:34.920 2 WARNING nova.compute.manager [req-3d200a11-4466-46b3-8702-1da340a33983 req-987db059-32be-42e8-b394-56b9760c6684 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Received unexpected event network-vif-plugged-8e89083c-856f-40a4-b47f-444adcedd301 for instance with vm_state active and task_state None.
Sep 30 21:42:34 compute-0 nova_compute[192810]: 2025-09-30 21:42:34.975 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:37 compute-0 podman[241546]: 2025-09-30 21:42:37.317421504 +0000 UTC m=+0.054956930 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Sep 30 21:42:37 compute-0 podman[241547]: 2025-09-30 21:42:37.326711712 +0000 UTC m=+0.060003874 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm)
Sep 30 21:42:37 compute-0 podman[241545]: 2025-09-30 21:42:37.354434882 +0000 UTC m=+0.093175397 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Sep 30 21:42:37 compute-0 nova_compute[192810]: 2025-09-30 21:42:37.844 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:42:38.747 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:42:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:42:38.748 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:42:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:42:38.748 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:42:39 compute-0 nova_compute[192810]: 2025-09-30 21:42:39.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:39 compute-0 NetworkManager[51733]: <info>  [1759268559.7701] manager: (patch-br-int-to-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/229)
Sep 30 21:42:39 compute-0 NetworkManager[51733]: <info>  [1759268559.7711] manager: (patch-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/230)
Sep 30 21:42:39 compute-0 nova_compute[192810]: 2025-09-30 21:42:39.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:39 compute-0 ovn_controller[94912]: 2025-09-30T21:42:39Z|00520|binding|INFO|Releasing lport 5f2fed01-3e8a-4c9d-b394-f15b7f60366f from this chassis (sb_readonly=0)
Sep 30 21:42:39 compute-0 nova_compute[192810]: 2025-09-30 21:42:39.923 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:39 compute-0 nova_compute[192810]: 2025-09-30 21:42:39.977 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:41 compute-0 nova_compute[192810]: 2025-09-30 21:42:41.987 2 DEBUG nova.compute.manager [req-c61c41d7-172e-41e2-9f6d-3ed457ad8d0e req-43fdcbca-6f5d-48d2-a2d6-94a4b111062c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Received event network-changed-8e89083c-856f-40a4-b47f-444adcedd301 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:42:41 compute-0 nova_compute[192810]: 2025-09-30 21:42:41.987 2 DEBUG nova.compute.manager [req-c61c41d7-172e-41e2-9f6d-3ed457ad8d0e req-43fdcbca-6f5d-48d2-a2d6-94a4b111062c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Refreshing instance network info cache due to event network-changed-8e89083c-856f-40a4-b47f-444adcedd301. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:42:41 compute-0 nova_compute[192810]: 2025-09-30 21:42:41.988 2 DEBUG oslo_concurrency.lockutils [req-c61c41d7-172e-41e2-9f6d-3ed457ad8d0e req-43fdcbca-6f5d-48d2-a2d6-94a4b111062c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-3fab2833-fa74-4eb1-bef8-aa51780aa0f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:42:41 compute-0 nova_compute[192810]: 2025-09-30 21:42:41.988 2 DEBUG oslo_concurrency.lockutils [req-c61c41d7-172e-41e2-9f6d-3ed457ad8d0e req-43fdcbca-6f5d-48d2-a2d6-94a4b111062c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-3fab2833-fa74-4eb1-bef8-aa51780aa0f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:42:41 compute-0 nova_compute[192810]: 2025-09-30 21:42:41.988 2 DEBUG nova.network.neutron [req-c61c41d7-172e-41e2-9f6d-3ed457ad8d0e req-43fdcbca-6f5d-48d2-a2d6-94a4b111062c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Refreshing network info cache for port 8e89083c-856f-40a4-b47f-444adcedd301 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:42:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:42:42.690 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=32, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=31) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:42:42 compute-0 nova_compute[192810]: 2025-09-30 21:42:42.690 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:42:42.692 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:42:42 compute-0 nova_compute[192810]: 2025-09-30 21:42:42.845 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:43 compute-0 nova_compute[192810]: 2025-09-30 21:42:43.539 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.910 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4', 'name': 'tempest-TestGettingAddress-server-570822479', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000008b', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'hostId': '4e27c1640aee900167d620f76b94d293e6d0bfe9605518123fccc2d8', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.911 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.914 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 3fab2833-fa74-4eb1-bef8-aa51780aa0f4 / tap8e89083c-85 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.914 12 DEBUG ceilometer.compute.pollsters [-] 3fab2833-fa74-4eb1-bef8-aa51780aa0f4/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2dc1c8cd-5657-4c12-a142-e06858d9efcb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_name': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_name': None, 'resource_id': 'instance-0000008b-3fab2833-fa74-4eb1-bef8-aa51780aa0f4-tap8e89083c-85', 'timestamp': '2025-09-30T21:42:43.911490', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-570822479', 'name': 'tap8e89083c-85', 'instance_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4', 'instance_type': 'm1.nano', 'host': '4e27c1640aee900167d620f76b94d293e6d0bfe9605518123fccc2d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:33:32:85', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8e89083c-85'}, 'message_id': '65abe532-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5259.59897822, 'message_signature': 'c4be70ddcc60220dfdc79c0ae546accdaf8f79d8e8e71310e035d9c4c2edfecf'}]}, 'timestamp': '2025-09-30 21:42:43.915250', '_unique_id': '8e8724726f304b63bd771a2c15f31636'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.916 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.917 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.932 12 DEBUG ceilometer.compute.pollsters [-] 3fab2833-fa74-4eb1-bef8-aa51780aa0f4/disk.device.read.bytes volume: 25349632 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.932 12 DEBUG ceilometer.compute.pollsters [-] 3fab2833-fa74-4eb1-bef8-aa51780aa0f4/disk.device.read.bytes volume: 55474 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7211def7-bb5c-4947-a3cd-863baa0fba75', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 25349632, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_name': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_name': None, 'resource_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4-vda', 'timestamp': '2025-09-30T21:42:43.917204', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-570822479', 'name': 'instance-0000008b', 'instance_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4', 'instance_type': 'm1.nano', 'host': '4e27c1640aee900167d620f76b94d293e6d0bfe9605518123fccc2d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '65aea240-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5259.604716421, 'message_signature': 'b63eb83e9a8f74f25bfbbf9d528382a72879a1ad39acc08847236ca55c55e457'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 55474, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_name': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_name': None, 'resource_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4-sda', 'timestamp': '2025-09-30T21:42:43.917204', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-570822479', 'name': 'instance-0000008b', 'instance_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4', 'instance_type': 'm1.nano', 'host': '4e27c1640aee900167d620f76b94d293e6d0bfe9605518123fccc2d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '65aeac86-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5259.604716421, 'message_signature': '6f3455b70ce32a51f5875a88768dc8f81b6fe76329288ebea2b1e7697f36bbfb'}]}, 'timestamp': '2025-09-30 21:42:43.933202', '_unique_id': 'd4ae1674c2144dd1845bd6220745f2cd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.934 12 DEBUG ceilometer.compute.pollsters [-] 3fab2833-fa74-4eb1-bef8-aa51780aa0f4/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd4662319-6ea0-45e3-89b0-8542cf8b74df', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 110, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_name': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_name': None, 'resource_id': 'instance-0000008b-3fab2833-fa74-4eb1-bef8-aa51780aa0f4-tap8e89083c-85', 'timestamp': '2025-09-30T21:42:43.934766', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-570822479', 'name': 'tap8e89083c-85', 'instance_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4', 'instance_type': 'm1.nano', 'host': '4e27c1640aee900167d620f76b94d293e6d0bfe9605518123fccc2d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:33:32:85', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8e89083c-85'}, 'message_id': '65aef2ae-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5259.59897822, 'message_signature': 'b6d91daf4afbd95e1b4426742d387993147c20c9249131d40542d2994f353083'}]}, 'timestamp': '2025-09-30 21:42:43.935001', '_unique_id': '747b06abfdd140edbbd2b16d93a78560'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.935 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.936 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.936 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.936 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-570822479>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-570822479>]
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.936 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.936 12 DEBUG ceilometer.compute.pollsters [-] 3fab2833-fa74-4eb1-bef8-aa51780aa0f4/disk.device.write.latency volume: 3673050352 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.936 12 DEBUG ceilometer.compute.pollsters [-] 3fab2833-fa74-4eb1-bef8-aa51780aa0f4/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0a6ce646-9746-4a37-a93c-5d6c5745b69d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3673050352, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_name': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_name': None, 'resource_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4-vda', 'timestamp': '2025-09-30T21:42:43.936526', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-570822479', 'name': 'instance-0000008b', 'instance_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4', 'instance_type': 'm1.nano', 'host': '4e27c1640aee900167d620f76b94d293e6d0bfe9605518123fccc2d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '65af37fa-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5259.604716421, 'message_signature': 'abfd75a1c813394730c7a8ffa8e762c1634764b7309f6b141de597756070daa7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_name': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_name': None, 
'resource_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4-sda', 'timestamp': '2025-09-30T21:42:43.936526', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-570822479', 'name': 'instance-0000008b', 'instance_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4', 'instance_type': 'm1.nano', 'host': '4e27c1640aee900167d620f76b94d293e6d0bfe9605518123fccc2d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '65af3fe8-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5259.604716421, 'message_signature': 'a6e2d87b063a43b05f27456c80109ddb097baeab120022183472e6b971116e35'}]}, 'timestamp': '2025-09-30 21:42:43.936984', '_unique_id': '13d629f5cfc0409aac868afd93b65c1e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.937 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.938 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.947 12 DEBUG ceilometer.compute.pollsters [-] 3fab2833-fa74-4eb1-bef8-aa51780aa0f4/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.947 12 DEBUG ceilometer.compute.pollsters [-] 3fab2833-fa74-4eb1-bef8-aa51780aa0f4/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2b816915-6eb7-4ede-9167-b48e16bf5037', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_name': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_name': None, 'resource_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4-vda', 'timestamp': '2025-09-30T21:42:43.938100', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-570822479', 'name': 'instance-0000008b', 'instance_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4', 'instance_type': 'm1.nano', 'host': '4e27c1640aee900167d620f76b94d293e6d0bfe9605518123fccc2d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '65b0eaf0-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5259.625605104, 'message_signature': '8c22a90d70e7d22b9bda536f7c5f05f4c37618ab6a46ae920a439e529b1f8a91'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_name': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_name': None, 'resource_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4-sda', 'timestamp': '2025-09-30T21:42:43.938100', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-570822479', 'name': 'instance-0000008b', 'instance_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4', 'instance_type': 'm1.nano', 'host': '4e27c1640aee900167d620f76b94d293e6d0bfe9605518123fccc2d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '65b0f464-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5259.625605104, 'message_signature': '6e6ea7d9091d00c795ed462174353edd6de8e796ddc726b474388785e2ac6054'}]}, 'timestamp': '2025-09-30 21:42:43.948335', '_unique_id': '30589363df6f4efa92c0c76f55cb3ce2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.949 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 DEBUG ceilometer.compute.pollsters [-] 3fab2833-fa74-4eb1-bef8-aa51780aa0f4/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e0edd59a-989d-49cc-998e-7c779abae51f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_name': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_name': None, 'resource_id': 'instance-0000008b-3fab2833-fa74-4eb1-bef8-aa51780aa0f4-tap8e89083c-85', 'timestamp': '2025-09-30T21:42:43.950155', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-570822479', 'name': 'tap8e89083c-85', 'instance_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4', 'instance_type': 'm1.nano', 'host': '4e27c1640aee900167d620f76b94d293e6d0bfe9605518123fccc2d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:33:32:85', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8e89083c-85'}, 'message_id': '65b14be4-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5259.59897822, 'message_signature': '9f39a29a6f9f5d99a3c141d33c0582c422f7104c1ed396cdf780795ac6993f86'}]}, 'timestamp': '2025-09-30 21:42:43.950394', '_unique_id': '034a434c67074bee8327d33700ed7690'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.950 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.951 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.951 12 DEBUG ceilometer.compute.pollsters [-] 3fab2833-fa74-4eb1-bef8-aa51780aa0f4/disk.device.allocation volume: 3936256 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.951 12 DEBUG ceilometer.compute.pollsters [-] 3fab2833-fa74-4eb1-bef8-aa51780aa0f4/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6e86520a-1892-48a6-81d0-66ffe108cd8c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 3936256, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_name': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_name': None, 'resource_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4-vda', 'timestamp': '2025-09-30T21:42:43.951652', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-570822479', 'name': 'instance-0000008b', 'instance_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4', 'instance_type': 'm1.nano', 'host': '4e27c1640aee900167d620f76b94d293e6d0bfe9605518123fccc2d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '65b185c8-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5259.625605104, 'message_signature': 'f8574ba6a837ba2abec1e96a2419b7994cf37c8fb8d0ccf2a6e562fb491906d3'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_name': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_name': None, 'resource_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4-sda', 'timestamp': '2025-09-30T21:42:43.951652', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-570822479', 'name': 'instance-0000008b', 'instance_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4', 'instance_type': 'm1.nano', 'host': '4e27c1640aee900167d620f76b94d293e6d0bfe9605518123fccc2d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '65b18dac-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5259.625605104, 'message_signature': '7318ba6160cc285f07a8c25491e29c16a51d7f24f703b676d5c30e703ee011d8'}]}, 'timestamp': '2025-09-30 21:42:43.952061', '_unique_id': 'b48b1489c3d144a79bd55ff8f86c0afb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.952 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.953 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.953 12 DEBUG ceilometer.compute.pollsters [-] 3fab2833-fa74-4eb1-bef8-aa51780aa0f4/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9286f90b-7bb6-472b-aaee-ea38a616f80d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_name': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_name': None, 'resource_id': 'instance-0000008b-3fab2833-fa74-4eb1-bef8-aa51780aa0f4-tap8e89083c-85', 'timestamp': '2025-09-30T21:42:43.953286', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-570822479', 'name': 'tap8e89083c-85', 'instance_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4', 'instance_type': 'm1.nano', 'host': '4e27c1640aee900167d620f76b94d293e6d0bfe9605518123fccc2d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:33:32:85', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8e89083c-85'}, 'message_id': '65b1c862-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5259.59897822, 'message_signature': 'c628846fc385cdbdc953218c5c4a65323338a0d4026a150ea36386a47463e12a'}]}, 'timestamp': '2025-09-30 21:42:43.953657', '_unique_id': '94a22975f8f541c581432b10c0c39300'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.954 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.955 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.969 12 DEBUG ceilometer.compute.pollsters [-] 3fab2833-fa74-4eb1-bef8-aa51780aa0f4/memory.usage volume: 40.421875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c579fb7d-ee75-4090-93cf-556c604560c1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.421875, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_name': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_name': None, 'resource_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4', 'timestamp': '2025-09-30T21:42:43.955162', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-570822479', 'name': 'instance-0000008b', 'instance_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4', 'instance_type': 'm1.nano', 'host': '4e27c1640aee900167d620f76b94d293e6d0bfe9605518123fccc2d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '65b438d6-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5259.656594974, 'message_signature': '04af0fb0a2c6bebc4b313e0d7c89ba61f8dd1acdc74b4ddbb98b78474eb07ab9'}]}, 'timestamp': '2025-09-30 21:42:43.969683', '_unique_id': 'c1de2d084cfd40cc97ce7670e702fe77'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.970 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.971 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.971 12 DEBUG ceilometer.compute.pollsters [-] 3fab2833-fa74-4eb1-bef8-aa51780aa0f4/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '83c7fe94-2740-426d-94c9-6edd16042ab7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_name': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_name': None, 'resource_id': 'instance-0000008b-3fab2833-fa74-4eb1-bef8-aa51780aa0f4-tap8e89083c-85', 'timestamp': '2025-09-30T21:42:43.971246', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-570822479', 'name': 'tap8e89083c-85', 'instance_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4', 'instance_type': 'm1.nano', 'host': '4e27c1640aee900167d620f76b94d293e6d0bfe9605518123fccc2d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:33:32:85', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8e89083c-85'}, 'message_id': '65b485fc-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5259.59897822, 'message_signature': 'a2f91db1686af0a60b088d06c7bf22c892b70fa2ad019fded2d12c4146e897b9'}]}, 'timestamp': '2025-09-30 21:42:43.971543', '_unique_id': 'e6e2ad3bed9341e692a7de8efb561cba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.972 12 DEBUG ceilometer.compute.pollsters [-] 3fab2833-fa74-4eb1-bef8-aa51780aa0f4/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8390762e-b4e2-4fd8-a789-292f5436c548', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_name': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_name': None, 'resource_id': 'instance-0000008b-3fab2833-fa74-4eb1-bef8-aa51780aa0f4-tap8e89083c-85', 'timestamp': '2025-09-30T21:42:43.972764', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-570822479', 'name': 'tap8e89083c-85', 'instance_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4', 'instance_type': 'm1.nano', 'host': '4e27c1640aee900167d620f76b94d293e6d0bfe9605518123fccc2d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:33:32:85', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8e89083c-85'}, 'message_id': '65b4c0a8-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5259.59897822, 'message_signature': '52f002e3225429e8cbdde4de7d9df2c02ad0eefc6f595c7ddacae1a8ce4c0053'}]}, 'timestamp': '2025-09-30 21:42:43.973069', '_unique_id': 'b07c099d4b8f40b98b926bd14c8953d0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.973 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.974 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.974 12 DEBUG ceilometer.compute.pollsters [-] 3fab2833-fa74-4eb1-bef8-aa51780aa0f4/disk.device.write.bytes volume: 2424832 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.974 12 DEBUG ceilometer.compute.pollsters [-] 3fab2833-fa74-4eb1-bef8-aa51780aa0f4/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ba745023-bb0d-49c6-982c-25204dde6e18', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2424832, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_name': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_name': None, 'resource_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4-vda', 'timestamp': '2025-09-30T21:42:43.974336', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-570822479', 'name': 'instance-0000008b', 'instance_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4', 'instance_type': 'm1.nano', 'host': '4e27c1640aee900167d620f76b94d293e6d0bfe9605518123fccc2d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '65b4fe7e-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5259.604716421, 'message_signature': '10177d56e0517a3d80ea9958cb181b91130c181107c228ba875cb04b6cd89273'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_name': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_name': None, 'resource_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4-sda', 'timestamp': '2025-09-30T21:42:43.974336', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-570822479', 'name': 'instance-0000008b', 'instance_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4', 'instance_type': 'm1.nano', 'host': '4e27c1640aee900167d620f76b94d293e6d0bfe9605518123fccc2d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '65b50950-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5259.604716421, 'message_signature': '919ab584833654f43dfa7ae7a10d9537bd9d4c8681adbb52b23d6c7bf77d39e7'}]}, 'timestamp': '2025-09-30 21:42:43.974922', '_unique_id': 'af0fe5ce9c414a9cad08d62b99718b7b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.975 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.976 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.976 12 DEBUG ceilometer.compute.pollsters [-] 3fab2833-fa74-4eb1-bef8-aa51780aa0f4/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '194b8346-d2dd-4cf5-a2c5-988dfdb767c8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_name': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_name': None, 'resource_id': 'instance-0000008b-3fab2833-fa74-4eb1-bef8-aa51780aa0f4-tap8e89083c-85', 'timestamp': '2025-09-30T21:42:43.976508', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-570822479', 'name': 'tap8e89083c-85', 'instance_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4', 'instance_type': 'm1.nano', 'host': '4e27c1640aee900167d620f76b94d293e6d0bfe9605518123fccc2d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:33:32:85', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8e89083c-85'}, 'message_id': '65b55432-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5259.59897822, 'message_signature': 'f4f8a75a8a5aadb734335c4e71ebaa165d4b134b683664980aa1d45e79927ec1'}]}, 'timestamp': '2025-09-30 21:42:43.976937', '_unique_id': '23b80b505086412890ef1662e8ca39b6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.977 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.978 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.978 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.978 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-570822479>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-570822479>]
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.978 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.978 12 DEBUG ceilometer.compute.pollsters [-] 3fab2833-fa74-4eb1-bef8-aa51780aa0f4/disk.device.write.requests volume: 20 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.978 12 DEBUG ceilometer.compute.pollsters [-] 3fab2833-fa74-4eb1-bef8-aa51780aa0f4/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '44d136bc-8385-4914-b089-c423ca8b784a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 20, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_name': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_name': None, 'resource_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4-vda', 'timestamp': '2025-09-30T21:42:43.978631', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-570822479', 'name': 'instance-0000008b', 'instance_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4', 'instance_type': 'm1.nano', 'host': '4e27c1640aee900167d620f76b94d293e6d0bfe9605518123fccc2d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '65b5a3ec-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5259.604716421, 'message_signature': '40d2ca1728a0ae401efe13ba78f55534eef3c9662b6932291c71a444f75075d7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_name': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_name': None, 'resource_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4-sda', 'timestamp': '2025-09-30T21:42:43.978631', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-570822479', 'name': 'instance-0000008b', 'instance_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4', 'instance_type': 'm1.nano', 'host': '4e27c1640aee900167d620f76b94d293e6d0bfe9605518123fccc2d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '65b5aba8-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5259.604716421, 'message_signature': 'a4a8203747e985269ae96c31579b6b501f8ba341f0d2f50509e0fb50afe4bbdb'}]}, 'timestamp': '2025-09-30 21:42:43.979075', '_unique_id': '1dceb9650259478b99e08b2b17efcd55'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.979 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.980 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.980 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.980 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-570822479>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-570822479>]
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.980 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.980 12 DEBUG ceilometer.compute.pollsters [-] 3fab2833-fa74-4eb1-bef8-aa51780aa0f4/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dc6a2876-996d-4d5e-a875-b5a43d46e7d5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_name': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_name': None, 'resource_id': 'instance-0000008b-3fab2833-fa74-4eb1-bef8-aa51780aa0f4-tap8e89083c-85', 'timestamp': '2025-09-30T21:42:43.980831', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-570822479', 'name': 'tap8e89083c-85', 'instance_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4', 'instance_type': 'm1.nano', 'host': '4e27c1640aee900167d620f76b94d293e6d0bfe9605518123fccc2d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:33:32:85', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8e89083c-85'}, 'message_id': '65b5fbbc-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5259.59897822, 'message_signature': 'eaf41fcf0753d31f6c52a9d1246e954495c80b246887a29ce156517651adfa92'}]}, 'timestamp': '2025-09-30 21:42:43.981107', '_unique_id': '04f10eec439a46e8a20d75b6709e6650'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.981 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.982 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.982 12 DEBUG ceilometer.compute.pollsters [-] 3fab2833-fa74-4eb1-bef8-aa51780aa0f4/disk.device.usage volume: 3473408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.982 12 DEBUG ceilometer.compute.pollsters [-] 3fab2833-fa74-4eb1-bef8-aa51780aa0f4/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '77ebccc7-789f-4c41-9fe5-8e9e4ab1505e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 3473408, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_name': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_name': None, 'resource_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4-vda', 'timestamp': '2025-09-30T21:42:43.982233', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-570822479', 'name': 'instance-0000008b', 'instance_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4', 'instance_type': 'm1.nano', 'host': '4e27c1640aee900167d620f76b94d293e6d0bfe9605518123fccc2d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '65b63046-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5259.625605104, 'message_signature': '5a25b6e8bb4f62360d6b47f84d5e756f6f36e3a340155bd00dbe90a7a2316866'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_name': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_name': None, 'resource_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4-sda', 'timestamp': '2025-09-30T21:42:43.982233', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-570822479', 'name': 'instance-0000008b', 'instance_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4', 'instance_type': 'm1.nano', 'host': '4e27c1640aee900167d620f76b94d293e6d0bfe9605518123fccc2d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '65b637b2-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5259.625605104, 'message_signature': 'd82545b50ce2f533a9e9a56ee2d81fadca02a8f94f7052cce2ccc690ef3cdde8'}]}, 'timestamp': '2025-09-30 21:42:43.982643', '_unique_id': '0b348af9c5d74b31938dca71eefef01c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.983 12 DEBUG ceilometer.compute.pollsters [-] 3fab2833-fa74-4eb1-bef8-aa51780aa0f4/cpu volume: 10040000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8fb1287d-891b-47c7-8594-c0b78d2de56e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10040000000, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_name': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_name': None, 'resource_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4', 'timestamp': '2025-09-30T21:42:43.983849', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-570822479', 'name': 'instance-0000008b', 'instance_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4', 'instance_type': 'm1.nano', 'host': '4e27c1640aee900167d620f76b94d293e6d0bfe9605518123fccc2d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '65b66f84-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5259.656594974, 'message_signature': '26a2fe0293fe6f65dd6037e0b44e2e3a95c986667dd758979553f5c91d1d3ff8'}]}, 'timestamp': '2025-09-30 21:42:43.984062', '_unique_id': '8a9a1ab1205840bfb7344522ebb33534'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.984 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.985 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.985 12 DEBUG ceilometer.compute.pollsters [-] 3fab2833-fa74-4eb1-bef8-aa51780aa0f4/disk.device.read.latency volume: 553737664 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.985 12 DEBUG ceilometer.compute.pollsters [-] 3fab2833-fa74-4eb1-bef8-aa51780aa0f4/disk.device.read.latency volume: 34591308 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd34ae24a-a9c6-4e54-b011-805d93d385f2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 553737664, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_name': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_name': None, 'resource_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4-vda', 'timestamp': '2025-09-30T21:42:43.985261', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-570822479', 'name': 'instance-0000008b', 'instance_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4', 'instance_type': 'm1.nano', 'host': '4e27c1640aee900167d620f76b94d293e6d0bfe9605518123fccc2d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '65b6a6a2-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5259.604716421, 'message_signature': 'e78dd79497d8675637272189c942c56d18977658196ef0b81f26b0d389781628'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 34591308, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_name': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_name': None, 'resource_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4-sda', 'timestamp': '2025-09-30T21:42:43.985261', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-570822479', 'name': 'instance-0000008b', 'instance_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4', 'instance_type': 'm1.nano', 'host': '4e27c1640aee900167d620f76b94d293e6d0bfe9605518123fccc2d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '65b6b034-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5259.604716421, 'message_signature': '45a4177aa7782ef71de11cdbfba4bd66c01799e1606c9c10b0c734376381b1fb'}]}, 'timestamp': '2025-09-30 21:42:43.985716', '_unique_id': 'ac0d388b8ebb4da69765ec880eda2b54'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.986 12 DEBUG ceilometer.compute.pollsters [-] 3fab2833-fa74-4eb1-bef8-aa51780aa0f4/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '22d599a6-aa32-4cf1-a08d-59f1d58f88e2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_name': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_name': None, 'resource_id': 'instance-0000008b-3fab2833-fa74-4eb1-bef8-aa51780aa0f4-tap8e89083c-85', 'timestamp': '2025-09-30T21:42:43.986866', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-570822479', 'name': 'tap8e89083c-85', 'instance_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4', 'instance_type': 'm1.nano', 'host': '4e27c1640aee900167d620f76b94d293e6d0bfe9605518123fccc2d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:33:32:85', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8e89083c-85'}, 'message_id': '65b6e55e-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5259.59897822, 'message_signature': '1b61ddea2877f9adb7c7efdbf33ded369d011da696462d9da7e4089d82a897d9'}]}, 'timestamp': '2025-09-30 21:42:43.987096', '_unique_id': 'bdbb16ef9a744b0da0fe4fbe4113df22'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.987 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.988 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.988 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.988 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-570822479>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-570822479>]
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.988 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.988 12 DEBUG ceilometer.compute.pollsters [-] 3fab2833-fa74-4eb1-bef8-aa51780aa0f4/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6907f9af-4f5b-4f6d-9d9c-816a88b02792', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_name': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_name': None, 'resource_id': 'instance-0000008b-3fab2833-fa74-4eb1-bef8-aa51780aa0f4-tap8e89083c-85', 'timestamp': '2025-09-30T21:42:43.988612', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-570822479', 'name': 'tap8e89083c-85', 'instance_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4', 'instance_type': 'm1.nano', 'host': '4e27c1640aee900167d620f76b94d293e6d0bfe9605518123fccc2d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:33:32:85', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap8e89083c-85'}, 'message_id': '65b729ba-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5259.59897822, 'message_signature': 'dcde655abe89a69f17306fa4c1ab79092c4a83e64e91a120a2bf7821da9c4090'}]}, 'timestamp': '2025-09-30 21:42:43.988837', '_unique_id': '7a98c6b839a94b0ba8f141765ad740e0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.989 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 DEBUG ceilometer.compute.pollsters [-] 3fab2833-fa74-4eb1-bef8-aa51780aa0f4/disk.device.read.requests volume: 838 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 DEBUG ceilometer.compute.pollsters [-] 3fab2833-fa74-4eb1-bef8-aa51780aa0f4/disk.device.read.requests volume: 20 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3f692927-64b2-4fa4-980c-a0802d9f3951', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 838, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_name': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_name': None, 'resource_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4-vda', 'timestamp': '2025-09-30T21:42:43.990123', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-570822479', 'name': 'instance-0000008b', 'instance_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4', 'instance_type': 'm1.nano', 'host': '4e27c1640aee900167d620f76b94d293e6d0bfe9605518123fccc2d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '65b7647a-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5259.604716421, 'message_signature': 'a1cb92769d7a0aed253e7d8eae1b049a709139bcb9a559994c2c3d2c0f3bc563'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 20, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_name': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_name': None, 'resource_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4-sda', 'timestamp': '2025-09-30T21:42:43.990123', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-570822479', 'name': 'instance-0000008b', 'instance_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4', 'instance_type': 'm1.nano', 'host': '4e27c1640aee900167d620f76b94d293e6d0bfe9605518123fccc2d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '65b76bfa-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5259.604716421, 'message_signature': 'e8bc53ba812ecf5071d35f04e2c58f9fe43b248f7f3b6a6a40893d0bf5c7c6bf'}]}, 'timestamp': '2025-09-30 21:42:43.990518', '_unique_id': '396d7dbd7cc14f47b3aeb70710211682'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:42:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:42:43.990 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:42:44 compute-0 nova_compute[192810]: 2025-09-30 21:42:44.979 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:45 compute-0 nova_compute[192810]: 2025-09-30 21:42:45.811 2 DEBUG nova.network.neutron [req-c61c41d7-172e-41e2-9f6d-3ed457ad8d0e req-43fdcbca-6f5d-48d2-a2d6-94a4b111062c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Updated VIF entry in instance network info cache for port 8e89083c-856f-40a4-b47f-444adcedd301. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:42:45 compute-0 nova_compute[192810]: 2025-09-30 21:42:45.812 2 DEBUG nova.network.neutron [req-c61c41d7-172e-41e2-9f6d-3ed457ad8d0e req-43fdcbca-6f5d-48d2-a2d6-94a4b111062c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Updating instance_info_cache with network_info: [{"id": "8e89083c-856f-40a4-b47f-444adcedd301", "address": "fa:16:3e:33:32:85", "network": {"id": "0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb", "bridge": "br-int", "label": "tempest-network-smoke--1321635209", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe33:3285", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.227", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8e89083c-85", "ovs_interfaceid": "8e89083c-856f-40a4-b47f-444adcedd301", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:42:45 compute-0 nova_compute[192810]: 2025-09-30 21:42:45.862 2 DEBUG oslo_concurrency.lockutils [req-c61c41d7-172e-41e2-9f6d-3ed457ad8d0e req-43fdcbca-6f5d-48d2-a2d6-94a4b111062c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-3fab2833-fa74-4eb1-bef8-aa51780aa0f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:42:46 compute-0 podman[241625]: 2025-09-30 21:42:46.352586905 +0000 UTC m=+0.086693848 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Sep 30 21:42:46 compute-0 podman[241626]: 2025-09-30 21:42:46.377526658 +0000 UTC m=+0.111182260 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, name=ubi9-minimal, architecture=x86_64, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc.)
Sep 30 21:42:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:42:47.693 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '32'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:42:47 compute-0 nova_compute[192810]: 2025-09-30 21:42:47.847 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:49 compute-0 nova_compute[192810]: 2025-09-30 21:42:49.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:50 compute-0 ovn_controller[94912]: 2025-09-30T21:42:50Z|00052|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:33:32:85 10.100.0.13
Sep 30 21:42:50 compute-0 ovn_controller[94912]: 2025-09-30T21:42:50Z|00053|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:33:32:85 10.100.0.13
Sep 30 21:42:52 compute-0 nova_compute[192810]: 2025-09-30 21:42:52.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:54 compute-0 nova_compute[192810]: 2025-09-30 21:42:54.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:54 compute-0 podman[241675]: 2025-09-30 21:42:54.328254916 +0000 UTC m=+0.058555238 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=iscsid, org.label-schema.build-date=20250923)
Sep 30 21:42:54 compute-0 podman[241676]: 2025-09-30 21:42:54.342624239 +0000 UTC m=+0.061969612 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 21:42:54 compute-0 podman[241674]: 2025-09-30 21:42:54.343169802 +0000 UTC m=+0.075777590 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:42:54 compute-0 nova_compute[192810]: 2025-09-30 21:42:54.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:42:54 compute-0 nova_compute[192810]: 2025-09-30 21:42:54.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:56 compute-0 nova_compute[192810]: 2025-09-30 21:42:56.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:42:56 compute-0 nova_compute[192810]: 2025-09-30 21:42:56.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:42:57 compute-0 nova_compute[192810]: 2025-09-30 21:42:57.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:42:57 compute-0 nova_compute[192810]: 2025-09-30 21:42:57.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:42:57 compute-0 nova_compute[192810]: 2025-09-30 21:42:57.823 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:42:57 compute-0 nova_compute[192810]: 2025-09-30 21:42:57.823 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:42:57 compute-0 nova_compute[192810]: 2025-09-30 21:42:57.823 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:42:57 compute-0 nova_compute[192810]: 2025-09-30 21:42:57.823 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:42:57 compute-0 nova_compute[192810]: 2025-09-30 21:42:57.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:42:57 compute-0 nova_compute[192810]: 2025-09-30 21:42:57.907 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3fab2833-fa74-4eb1-bef8-aa51780aa0f4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:42:57 compute-0 nova_compute[192810]: 2025-09-30 21:42:57.977 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3fab2833-fa74-4eb1-bef8-aa51780aa0f4/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:42:57 compute-0 nova_compute[192810]: 2025-09-30 21:42:57.979 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3fab2833-fa74-4eb1-bef8-aa51780aa0f4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:42:58 compute-0 nova_compute[192810]: 2025-09-30 21:42:58.041 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3fab2833-fa74-4eb1-bef8-aa51780aa0f4/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:42:58 compute-0 nova_compute[192810]: 2025-09-30 21:42:58.211 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:42:58 compute-0 nova_compute[192810]: 2025-09-30 21:42:58.213 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5557MB free_disk=73.22082901000977GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:42:58 compute-0 nova_compute[192810]: 2025-09-30 21:42:58.213 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:42:58 compute-0 nova_compute[192810]: 2025-09-30 21:42:58.214 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:42:58 compute-0 nova_compute[192810]: 2025-09-30 21:42:58.315 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Instance 3fab2833-fa74-4eb1-bef8-aa51780aa0f4 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:42:58 compute-0 nova_compute[192810]: 2025-09-30 21:42:58.315 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:42:58 compute-0 nova_compute[192810]: 2025-09-30 21:42:58.315 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:42:58 compute-0 nova_compute[192810]: 2025-09-30 21:42:58.353 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:42:58 compute-0 nova_compute[192810]: 2025-09-30 21:42:58.370 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:42:58 compute-0 nova_compute[192810]: 2025-09-30 21:42:58.398 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:42:58 compute-0 nova_compute[192810]: 2025-09-30 21:42:58.399 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.185s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:42:59 compute-0 nova_compute[192810]: 2025-09-30 21:42:59.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:00 compute-0 nova_compute[192810]: 2025-09-30 21:43:00.394 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:43:00 compute-0 nova_compute[192810]: 2025-09-30 21:43:00.395 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:43:01 compute-0 nova_compute[192810]: 2025-09-30 21:43:01.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:43:01 compute-0 nova_compute[192810]: 2025-09-30 21:43:01.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:43:01 compute-0 nova_compute[192810]: 2025-09-30 21:43:01.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:43:02 compute-0 nova_compute[192810]: 2025-09-30 21:43:02.066 2 DEBUG oslo_concurrency.lockutils [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Acquiring lock "14871acd-831a-4e81-b1b0-8d1d415e1e62" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:43:02 compute-0 nova_compute[192810]: 2025-09-30 21:43:02.067 2 DEBUG oslo_concurrency.lockutils [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Lock "14871acd-831a-4e81-b1b0-8d1d415e1e62" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:43:02 compute-0 nova_compute[192810]: 2025-09-30 21:43:02.083 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "refresh_cache-3fab2833-fa74-4eb1-bef8-aa51780aa0f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:43:02 compute-0 nova_compute[192810]: 2025-09-30 21:43:02.083 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquired lock "refresh_cache-3fab2833-fa74-4eb1-bef8-aa51780aa0f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:43:02 compute-0 nova_compute[192810]: 2025-09-30 21:43:02.084 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Sep 30 21:43:02 compute-0 nova_compute[192810]: 2025-09-30 21:43:02.084 2 DEBUG nova.objects.instance [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lazy-loading 'info_cache' on Instance uuid 3fab2833-fa74-4eb1-bef8-aa51780aa0f4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:43:02 compute-0 nova_compute[192810]: 2025-09-30 21:43:02.101 2 DEBUG nova.compute.manager [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:43:02 compute-0 nova_compute[192810]: 2025-09-30 21:43:02.237 2 DEBUG oslo_concurrency.lockutils [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:43:02 compute-0 nova_compute[192810]: 2025-09-30 21:43:02.238 2 DEBUG oslo_concurrency.lockutils [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:43:02 compute-0 nova_compute[192810]: 2025-09-30 21:43:02.244 2 DEBUG nova.virt.hardware [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:43:02 compute-0 nova_compute[192810]: 2025-09-30 21:43:02.244 2 INFO nova.compute.claims [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:43:02 compute-0 nova_compute[192810]: 2025-09-30 21:43:02.464 2 DEBUG nova.compute.provider_tree [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:43:02 compute-0 nova_compute[192810]: 2025-09-30 21:43:02.485 2 DEBUG nova.scheduler.client.report [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:43:02 compute-0 nova_compute[192810]: 2025-09-30 21:43:02.522 2 DEBUG oslo_concurrency.lockutils [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.284s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:43:02 compute-0 nova_compute[192810]: 2025-09-30 21:43:02.523 2 DEBUG nova.compute.manager [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:43:02 compute-0 nova_compute[192810]: 2025-09-30 21:43:02.855 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:04 compute-0 nova_compute[192810]: 2025-09-30 21:43:04.378 2 DEBUG nova.compute.manager [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:43:04 compute-0 nova_compute[192810]: 2025-09-30 21:43:04.379 2 DEBUG nova.network.neutron [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:43:04 compute-0 nova_compute[192810]: 2025-09-30 21:43:04.439 2 INFO nova.virt.libvirt.driver [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:43:04 compute-0 nova_compute[192810]: 2025-09-30 21:43:04.518 2 DEBUG nova.compute.manager [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:43:04 compute-0 nova_compute[192810]: 2025-09-30 21:43:04.558 2 DEBUG oslo_concurrency.lockutils [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "8b0c810b-62f3-42b5-802e-cff3319a31be" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:43:04 compute-0 nova_compute[192810]: 2025-09-30 21:43:04.559 2 DEBUG oslo_concurrency.lockutils [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "8b0c810b-62f3-42b5-802e-cff3319a31be" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:43:04 compute-0 nova_compute[192810]: 2025-09-30 21:43:04.625 2 DEBUG nova.compute.manager [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:43:04 compute-0 nova_compute[192810]: 2025-09-30 21:43:04.872 2 DEBUG nova.compute.manager [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:43:04 compute-0 nova_compute[192810]: 2025-09-30 21:43:04.874 2 DEBUG nova.virt.libvirt.driver [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:43:04 compute-0 nova_compute[192810]: 2025-09-30 21:43:04.874 2 INFO nova.virt.libvirt.driver [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Creating image(s)
Sep 30 21:43:04 compute-0 nova_compute[192810]: 2025-09-30 21:43:04.875 2 DEBUG oslo_concurrency.lockutils [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Acquiring lock "/var/lib/nova/instances/14871acd-831a-4e81-b1b0-8d1d415e1e62/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:43:04 compute-0 nova_compute[192810]: 2025-09-30 21:43:04.875 2 DEBUG oslo_concurrency.lockutils [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Lock "/var/lib/nova/instances/14871acd-831a-4e81-b1b0-8d1d415e1e62/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:43:04 compute-0 nova_compute[192810]: 2025-09-30 21:43:04.876 2 DEBUG oslo_concurrency.lockutils [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Lock "/var/lib/nova/instances/14871acd-831a-4e81-b1b0-8d1d415e1e62/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:43:04 compute-0 nova_compute[192810]: 2025-09-30 21:43:04.889 2 DEBUG oslo_concurrency.processutils [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:43:04 compute-0 nova_compute[192810]: 2025-09-30 21:43:04.912 2 DEBUG nova.policy [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bedde839dd1d47ea82605be90cfad439', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cb00322366384a558666220d5bb1ea75', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:43:04 compute-0 nova_compute[192810]: 2025-09-30 21:43:04.944 2 DEBUG oslo_concurrency.processutils [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:43:04 compute-0 nova_compute[192810]: 2025-09-30 21:43:04.945 2 DEBUG oslo_concurrency.lockutils [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:43:04 compute-0 nova_compute[192810]: 2025-09-30 21:43:04.945 2 DEBUG oslo_concurrency.lockutils [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:43:04 compute-0 nova_compute[192810]: 2025-09-30 21:43:04.955 2 DEBUG oslo_concurrency.processutils [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:43:04 compute-0 nova_compute[192810]: 2025-09-30 21:43:04.972 2 DEBUG oslo_concurrency.lockutils [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:43:04 compute-0 nova_compute[192810]: 2025-09-30 21:43:04.973 2 DEBUG oslo_concurrency.lockutils [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:43:04 compute-0 nova_compute[192810]: 2025-09-30 21:43:04.980 2 DEBUG nova.virt.hardware [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:43:04 compute-0 nova_compute[192810]: 2025-09-30 21:43:04.981 2 INFO nova.compute.claims [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:43:04 compute-0 nova_compute[192810]: 2025-09-30 21:43:04.992 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:05 compute-0 nova_compute[192810]: 2025-09-30 21:43:05.016 2 DEBUG oslo_concurrency.processutils [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:43:05 compute-0 nova_compute[192810]: 2025-09-30 21:43:05.017 2 DEBUG oslo_concurrency.processutils [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/14871acd-831a-4e81-b1b0-8d1d415e1e62/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:43:05 compute-0 nova_compute[192810]: 2025-09-30 21:43:05.258 2 DEBUG nova.compute.provider_tree [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:43:05 compute-0 nova_compute[192810]: 2025-09-30 21:43:05.289 2 DEBUG nova.scheduler.client.report [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:43:05 compute-0 nova_compute[192810]: 2025-09-30 21:43:05.310 2 DEBUG oslo_concurrency.lockutils [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.338s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:43:05 compute-0 nova_compute[192810]: 2025-09-30 21:43:05.311 2 DEBUG nova.compute.manager [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:43:05 compute-0 nova_compute[192810]: 2025-09-30 21:43:05.388 2 DEBUG oslo_concurrency.processutils [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/14871acd-831a-4e81-b1b0-8d1d415e1e62/disk 1073741824" returned: 0 in 0.371s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:43:05 compute-0 nova_compute[192810]: 2025-09-30 21:43:05.389 2 DEBUG oslo_concurrency.lockutils [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.443s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:43:05 compute-0 nova_compute[192810]: 2025-09-30 21:43:05.389 2 DEBUG oslo_concurrency.processutils [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:43:05 compute-0 nova_compute[192810]: 2025-09-30 21:43:05.408 2 DEBUG nova.compute.manager [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:43:05 compute-0 nova_compute[192810]: 2025-09-30 21:43:05.408 2 DEBUG nova.network.neutron [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:43:05 compute-0 nova_compute[192810]: 2025-09-30 21:43:05.440 2 DEBUG oslo_concurrency.processutils [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:43:05 compute-0 nova_compute[192810]: 2025-09-30 21:43:05.441 2 DEBUG nova.virt.disk.api [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Checking if we can resize image /var/lib/nova/instances/14871acd-831a-4e81-b1b0-8d1d415e1e62/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:43:05 compute-0 nova_compute[192810]: 2025-09-30 21:43:05.441 2 DEBUG oslo_concurrency.processutils [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/14871acd-831a-4e81-b1b0-8d1d415e1e62/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:43:05 compute-0 nova_compute[192810]: 2025-09-30 21:43:05.457 2 INFO nova.virt.libvirt.driver [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:43:05 compute-0 nova_compute[192810]: 2025-09-30 21:43:05.481 2 DEBUG nova.compute.manager [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:43:05 compute-0 nova_compute[192810]: 2025-09-30 21:43:05.502 2 DEBUG oslo_concurrency.processutils [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/14871acd-831a-4e81-b1b0-8d1d415e1e62/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:43:05 compute-0 nova_compute[192810]: 2025-09-30 21:43:05.503 2 DEBUG nova.virt.disk.api [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Cannot resize image /var/lib/nova/instances/14871acd-831a-4e81-b1b0-8d1d415e1e62/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:43:05 compute-0 nova_compute[192810]: 2025-09-30 21:43:05.503 2 DEBUG nova.objects.instance [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Lazy-loading 'migration_context' on Instance uuid 14871acd-831a-4e81-b1b0-8d1d415e1e62 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:43:05 compute-0 nova_compute[192810]: 2025-09-30 21:43:05.525 2 DEBUG nova.virt.libvirt.driver [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:43:05 compute-0 nova_compute[192810]: 2025-09-30 21:43:05.526 2 DEBUG nova.virt.libvirt.driver [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Ensure instance console log exists: /var/lib/nova/instances/14871acd-831a-4e81-b1b0-8d1d415e1e62/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:43:05 compute-0 nova_compute[192810]: 2025-09-30 21:43:05.526 2 DEBUG oslo_concurrency.lockutils [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:43:05 compute-0 nova_compute[192810]: 2025-09-30 21:43:05.527 2 DEBUG oslo_concurrency.lockutils [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:43:05 compute-0 nova_compute[192810]: 2025-09-30 21:43:05.527 2 DEBUG oslo_concurrency.lockutils [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:43:05 compute-0 nova_compute[192810]: 2025-09-30 21:43:05.548 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Updating instance_info_cache with network_info: [{"id": "8e89083c-856f-40a4-b47f-444adcedd301", "address": "fa:16:3e:33:32:85", "network": {"id": "0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb", "bridge": "br-int", "label": "tempest-network-smoke--1321635209", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe33:3285", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.227", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8e89083c-85", "ovs_interfaceid": "8e89083c-856f-40a4-b47f-444adcedd301", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:43:05 compute-0 nova_compute[192810]: 2025-09-30 21:43:05.584 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Releasing lock "refresh_cache-3fab2833-fa74-4eb1-bef8-aa51780aa0f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:43:05 compute-0 nova_compute[192810]: 2025-09-30 21:43:05.585 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Sep 30 21:43:05 compute-0 nova_compute[192810]: 2025-09-30 21:43:05.586 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:43:05 compute-0 nova_compute[192810]: 2025-09-30 21:43:05.586 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:43:05 compute-0 nova_compute[192810]: 2025-09-30 21:43:05.686 2 DEBUG nova.policy [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:43:05 compute-0 nova_compute[192810]: 2025-09-30 21:43:05.700 2 DEBUG nova.compute.manager [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:43:05 compute-0 nova_compute[192810]: 2025-09-30 21:43:05.702 2 DEBUG nova.virt.libvirt.driver [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:43:05 compute-0 nova_compute[192810]: 2025-09-30 21:43:05.702 2 INFO nova.virt.libvirt.driver [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Creating image(s)
Sep 30 21:43:05 compute-0 nova_compute[192810]: 2025-09-30 21:43:05.703 2 DEBUG oslo_concurrency.lockutils [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "/var/lib/nova/instances/8b0c810b-62f3-42b5-802e-cff3319a31be/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:43:05 compute-0 nova_compute[192810]: 2025-09-30 21:43:05.703 2 DEBUG oslo_concurrency.lockutils [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "/var/lib/nova/instances/8b0c810b-62f3-42b5-802e-cff3319a31be/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:43:05 compute-0 nova_compute[192810]: 2025-09-30 21:43:05.704 2 DEBUG oslo_concurrency.lockutils [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "/var/lib/nova/instances/8b0c810b-62f3-42b5-802e-cff3319a31be/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:43:05 compute-0 nova_compute[192810]: 2025-09-30 21:43:05.721 2 DEBUG oslo_concurrency.processutils [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:43:05 compute-0 nova_compute[192810]: 2025-09-30 21:43:05.777 2 DEBUG oslo_concurrency.processutils [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:43:05 compute-0 nova_compute[192810]: 2025-09-30 21:43:05.778 2 DEBUG oslo_concurrency.lockutils [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:43:05 compute-0 nova_compute[192810]: 2025-09-30 21:43:05.779 2 DEBUG oslo_concurrency.lockutils [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:43:05 compute-0 nova_compute[192810]: 2025-09-30 21:43:05.790 2 DEBUG oslo_concurrency.processutils [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:43:05 compute-0 nova_compute[192810]: 2025-09-30 21:43:05.845 2 DEBUG oslo_concurrency.processutils [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:43:05 compute-0 nova_compute[192810]: 2025-09-30 21:43:05.847 2 DEBUG oslo_concurrency.processutils [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/8b0c810b-62f3-42b5-802e-cff3319a31be/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:43:06 compute-0 nova_compute[192810]: 2025-09-30 21:43:06.123 2 DEBUG oslo_concurrency.processutils [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/8b0c810b-62f3-42b5-802e-cff3319a31be/disk 1073741824" returned: 0 in 0.276s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:43:06 compute-0 nova_compute[192810]: 2025-09-30 21:43:06.124 2 DEBUG oslo_concurrency.lockutils [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.345s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:43:06 compute-0 nova_compute[192810]: 2025-09-30 21:43:06.124 2 DEBUG oslo_concurrency.processutils [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:43:06 compute-0 nova_compute[192810]: 2025-09-30 21:43:06.176 2 DEBUG oslo_concurrency.processutils [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:43:06 compute-0 nova_compute[192810]: 2025-09-30 21:43:06.178 2 DEBUG nova.virt.disk.api [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Checking if we can resize image /var/lib/nova/instances/8b0c810b-62f3-42b5-802e-cff3319a31be/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:43:06 compute-0 nova_compute[192810]: 2025-09-30 21:43:06.178 2 DEBUG oslo_concurrency.processutils [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8b0c810b-62f3-42b5-802e-cff3319a31be/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:43:06 compute-0 nova_compute[192810]: 2025-09-30 21:43:06.239 2 DEBUG oslo_concurrency.processutils [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8b0c810b-62f3-42b5-802e-cff3319a31be/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:43:06 compute-0 nova_compute[192810]: 2025-09-30 21:43:06.240 2 DEBUG nova.virt.disk.api [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Cannot resize image /var/lib/nova/instances/8b0c810b-62f3-42b5-802e-cff3319a31be/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:43:06 compute-0 nova_compute[192810]: 2025-09-30 21:43:06.241 2 DEBUG nova.objects.instance [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lazy-loading 'migration_context' on Instance uuid 8b0c810b-62f3-42b5-802e-cff3319a31be obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:43:07 compute-0 nova_compute[192810]: 2025-09-30 21:43:07.249 2 DEBUG nova.virt.libvirt.driver [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:43:07 compute-0 nova_compute[192810]: 2025-09-30 21:43:07.250 2 DEBUG nova.virt.libvirt.driver [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Ensure instance console log exists: /var/lib/nova/instances/8b0c810b-62f3-42b5-802e-cff3319a31be/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:43:07 compute-0 nova_compute[192810]: 2025-09-30 21:43:07.250 2 DEBUG oslo_concurrency.lockutils [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:43:07 compute-0 nova_compute[192810]: 2025-09-30 21:43:07.250 2 DEBUG oslo_concurrency.lockutils [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:43:07 compute-0 nova_compute[192810]: 2025-09-30 21:43:07.251 2 DEBUG oslo_concurrency.lockutils [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:43:07 compute-0 nova_compute[192810]: 2025-09-30 21:43:07.364 2 DEBUG nova.network.neutron [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Successfully created port: ceed5eec-1f8b-4201-9a2e-d23353d1ff54 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:43:07 compute-0 nova_compute[192810]: 2025-09-30 21:43:07.455 2 DEBUG nova.network.neutron [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Successfully created port: f310064e-5720-442a-ac22-71c932096d05 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:43:07 compute-0 nova_compute[192810]: 2025-09-30 21:43:07.856 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:08 compute-0 podman[241776]: 2025-09-30 21:43:08.322254303 +0000 UTC m=+0.045943869 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 21:43:08 compute-0 podman[241775]: 2025-09-30 21:43:08.348943598 +0000 UTC m=+0.077128074 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:43:08 compute-0 podman[241777]: 2025-09-30 21:43:08.351832269 +0000 UTC m=+0.074538380 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Sep 30 21:43:08 compute-0 nova_compute[192810]: 2025-09-30 21:43:08.975 2 DEBUG nova.network.neutron [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Successfully updated port: ceed5eec-1f8b-4201-9a2e-d23353d1ff54 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:43:09 compute-0 nova_compute[192810]: 2025-09-30 21:43:09.491 2 DEBUG oslo_concurrency.lockutils [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "refresh_cache-8b0c810b-62f3-42b5-802e-cff3319a31be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:43:09 compute-0 nova_compute[192810]: 2025-09-30 21:43:09.491 2 DEBUG oslo_concurrency.lockutils [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquired lock "refresh_cache-8b0c810b-62f3-42b5-802e-cff3319a31be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:43:09 compute-0 nova_compute[192810]: 2025-09-30 21:43:09.491 2 DEBUG nova.network.neutron [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:43:09 compute-0 nova_compute[192810]: 2025-09-30 21:43:09.568 2 DEBUG nova.compute.manager [req-274e07f6-508d-4c82-a45b-72b65a3cf743 req-aa7316bb-6428-433e-95d4-1389b1db9575 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Received event network-changed-ceed5eec-1f8b-4201-9a2e-d23353d1ff54 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:43:09 compute-0 nova_compute[192810]: 2025-09-30 21:43:09.569 2 DEBUG nova.compute.manager [req-274e07f6-508d-4c82-a45b-72b65a3cf743 req-aa7316bb-6428-433e-95d4-1389b1db9575 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Refreshing instance network info cache due to event network-changed-ceed5eec-1f8b-4201-9a2e-d23353d1ff54. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:43:09 compute-0 nova_compute[192810]: 2025-09-30 21:43:09.569 2 DEBUG oslo_concurrency.lockutils [req-274e07f6-508d-4c82-a45b-72b65a3cf743 req-aa7316bb-6428-433e-95d4-1389b1db9575 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-8b0c810b-62f3-42b5-802e-cff3319a31be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:43:09 compute-0 nova_compute[192810]: 2025-09-30 21:43:09.994 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:10 compute-0 nova_compute[192810]: 2025-09-30 21:43:10.389 2 DEBUG nova.network.neutron [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:43:11 compute-0 nova_compute[192810]: 2025-09-30 21:43:11.122 2 DEBUG nova.network.neutron [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Successfully updated port: f310064e-5720-442a-ac22-71c932096d05 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:43:11 compute-0 nova_compute[192810]: 2025-09-30 21:43:11.218 2 DEBUG oslo_concurrency.lockutils [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Acquiring lock "refresh_cache-14871acd-831a-4e81-b1b0-8d1d415e1e62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:43:11 compute-0 nova_compute[192810]: 2025-09-30 21:43:11.218 2 DEBUG oslo_concurrency.lockutils [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Acquired lock "refresh_cache-14871acd-831a-4e81-b1b0-8d1d415e1e62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:43:11 compute-0 nova_compute[192810]: 2025-09-30 21:43:11.218 2 DEBUG nova.network.neutron [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:43:11 compute-0 nova_compute[192810]: 2025-09-30 21:43:11.532 2 DEBUG nova.network.neutron [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:43:11 compute-0 nova_compute[192810]: 2025-09-30 21:43:11.748 2 DEBUG nova.network.neutron [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Updating instance_info_cache with network_info: [{"id": "ceed5eec-1f8b-4201-9a2e-d23353d1ff54", "address": "fa:16:3e:88:a9:60", "network": {"id": "0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb", "bridge": "br-int", "label": "tempest-network-smoke--1321635209", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe88:a960", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapceed5eec-1f", "ovs_interfaceid": "ceed5eec-1f8b-4201-9a2e-d23353d1ff54", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:43:11 compute-0 nova_compute[192810]: 2025-09-30 21:43:11.940 2 DEBUG oslo_concurrency.lockutils [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Releasing lock "refresh_cache-8b0c810b-62f3-42b5-802e-cff3319a31be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:43:11 compute-0 nova_compute[192810]: 2025-09-30 21:43:11.940 2 DEBUG nova.compute.manager [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Instance network_info: |[{"id": "ceed5eec-1f8b-4201-9a2e-d23353d1ff54", "address": "fa:16:3e:88:a9:60", "network": {"id": "0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb", "bridge": "br-int", "label": "tempest-network-smoke--1321635209", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe88:a960", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapceed5eec-1f", "ovs_interfaceid": "ceed5eec-1f8b-4201-9a2e-d23353d1ff54", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:43:11 compute-0 nova_compute[192810]: 2025-09-30 21:43:11.941 2 DEBUG oslo_concurrency.lockutils [req-274e07f6-508d-4c82-a45b-72b65a3cf743 req-aa7316bb-6428-433e-95d4-1389b1db9575 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-8b0c810b-62f3-42b5-802e-cff3319a31be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:43:11 compute-0 nova_compute[192810]: 2025-09-30 21:43:11.941 2 DEBUG nova.network.neutron [req-274e07f6-508d-4c82-a45b-72b65a3cf743 req-aa7316bb-6428-433e-95d4-1389b1db9575 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Refreshing network info cache for port ceed5eec-1f8b-4201-9a2e-d23353d1ff54 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:43:11 compute-0 nova_compute[192810]: 2025-09-30 21:43:11.944 2 DEBUG nova.virt.libvirt.driver [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Start _get_guest_xml network_info=[{"id": "ceed5eec-1f8b-4201-9a2e-d23353d1ff54", "address": "fa:16:3e:88:a9:60", "network": {"id": "0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb", "bridge": "br-int", "label": "tempest-network-smoke--1321635209", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe88:a960", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapceed5eec-1f", "ovs_interfaceid": "ceed5eec-1f8b-4201-9a2e-d23353d1ff54", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:43:11 compute-0 nova_compute[192810]: 2025-09-30 21:43:11.948 2 WARNING nova.virt.libvirt.driver [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:43:11 compute-0 nova_compute[192810]: 2025-09-30 21:43:11.952 2 DEBUG nova.virt.libvirt.host [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:43:11 compute-0 nova_compute[192810]: 2025-09-30 21:43:11.953 2 DEBUG nova.virt.libvirt.host [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:43:11 compute-0 nova_compute[192810]: 2025-09-30 21:43:11.958 2 DEBUG nova.virt.libvirt.host [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:43:11 compute-0 nova_compute[192810]: 2025-09-30 21:43:11.959 2 DEBUG nova.virt.libvirt.host [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:43:11 compute-0 nova_compute[192810]: 2025-09-30 21:43:11.960 2 DEBUG nova.virt.libvirt.driver [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:43:11 compute-0 nova_compute[192810]: 2025-09-30 21:43:11.960 2 DEBUG nova.virt.hardware [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:43:11 compute-0 nova_compute[192810]: 2025-09-30 21:43:11.961 2 DEBUG nova.virt.hardware [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:43:11 compute-0 nova_compute[192810]: 2025-09-30 21:43:11.961 2 DEBUG nova.virt.hardware [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:43:11 compute-0 nova_compute[192810]: 2025-09-30 21:43:11.961 2 DEBUG nova.virt.hardware [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:43:11 compute-0 nova_compute[192810]: 2025-09-30 21:43:11.961 2 DEBUG nova.virt.hardware [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:43:11 compute-0 nova_compute[192810]: 2025-09-30 21:43:11.962 2 DEBUG nova.virt.hardware [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:43:11 compute-0 nova_compute[192810]: 2025-09-30 21:43:11.962 2 DEBUG nova.virt.hardware [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:43:11 compute-0 nova_compute[192810]: 2025-09-30 21:43:11.962 2 DEBUG nova.virt.hardware [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:43:11 compute-0 nova_compute[192810]: 2025-09-30 21:43:11.963 2 DEBUG nova.virt.hardware [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:43:11 compute-0 nova_compute[192810]: 2025-09-30 21:43:11.963 2 DEBUG nova.virt.hardware [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:43:11 compute-0 nova_compute[192810]: 2025-09-30 21:43:11.963 2 DEBUG nova.virt.hardware [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:43:11 compute-0 nova_compute[192810]: 2025-09-30 21:43:11.968 2 DEBUG nova.virt.libvirt.vif [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:43:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1671875229',display_name='tempest-TestGettingAddress-server-1671875229',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1671875229',id=143,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEbsRSYvS8JJesfheyHbOoMyGCfPyJ5GpWwmJctOd2uZhE42aOz5OjKoIYzt5JXLihLuVhzcSbvOK7e8JNJviPXasiSc9g8x5CPrMMEKuf701EleKp6zJuNvQR4CPNWu/A==',key_name='tempest-TestGettingAddress-696430415',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-7sg94hy7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:43:05Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=8b0c810b-62f3-42b5-802e-cff3319a31be,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ceed5eec-1f8b-4201-9a2e-d23353d1ff54", "address": "fa:16:3e:88:a9:60", "network": {"id": "0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb", "bridge": "br-int", "label": "tempest-network-smoke--1321635209", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe88:a960", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapceed5eec-1f", "ovs_interfaceid": "ceed5eec-1f8b-4201-9a2e-d23353d1ff54", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:43:11 compute-0 nova_compute[192810]: 2025-09-30 21:43:11.968 2 DEBUG nova.network.os_vif_util [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "ceed5eec-1f8b-4201-9a2e-d23353d1ff54", "address": "fa:16:3e:88:a9:60", "network": {"id": "0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb", "bridge": "br-int", "label": "tempest-network-smoke--1321635209", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe88:a960", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapceed5eec-1f", "ovs_interfaceid": "ceed5eec-1f8b-4201-9a2e-d23353d1ff54", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:43:11 compute-0 nova_compute[192810]: 2025-09-30 21:43:11.970 2 DEBUG nova.network.os_vif_util [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:88:a9:60,bridge_name='br-int',has_traffic_filtering=True,id=ceed5eec-1f8b-4201-9a2e-d23353d1ff54,network=Network(0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapceed5eec-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:43:11 compute-0 nova_compute[192810]: 2025-09-30 21:43:11.971 2 DEBUG nova.objects.instance [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lazy-loading 'pci_devices' on Instance uuid 8b0c810b-62f3-42b5-802e-cff3319a31be obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:43:12 compute-0 nova_compute[192810]: 2025-09-30 21:43:12.030 2 DEBUG nova.virt.libvirt.driver [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:43:12 compute-0 nova_compute[192810]:   <uuid>8b0c810b-62f3-42b5-802e-cff3319a31be</uuid>
Sep 30 21:43:12 compute-0 nova_compute[192810]:   <name>instance-0000008f</name>
Sep 30 21:43:12 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:43:12 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:43:12 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:43:12 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:       <nova:name>tempest-TestGettingAddress-server-1671875229</nova:name>
Sep 30 21:43:12 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:43:11</nova:creationTime>
Sep 30 21:43:12 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:43:12 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:43:12 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:43:12 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:43:12 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:43:12 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:43:12 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:43:12 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:43:12 compute-0 nova_compute[192810]:         <nova:user uuid="5ffd1d7824fe413499994bd48b9f820f">tempest-TestGettingAddress-2056138166-project-member</nova:user>
Sep 30 21:43:12 compute-0 nova_compute[192810]:         <nova:project uuid="71b1e8c3c45e4ff8bc99e66bd1bfef7c">tempest-TestGettingAddress-2056138166</nova:project>
Sep 30 21:43:12 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:43:12 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:43:12 compute-0 nova_compute[192810]:         <nova:port uuid="ceed5eec-1f8b-4201-9a2e-d23353d1ff54">
Sep 30 21:43:12 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fe88:a960" ipVersion="6"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:43:12 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:43:12 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:43:12 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:43:12 compute-0 nova_compute[192810]:     <system>
Sep 30 21:43:12 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:43:12 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:43:12 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:43:12 compute-0 nova_compute[192810]:       <entry name="serial">8b0c810b-62f3-42b5-802e-cff3319a31be</entry>
Sep 30 21:43:12 compute-0 nova_compute[192810]:       <entry name="uuid">8b0c810b-62f3-42b5-802e-cff3319a31be</entry>
Sep 30 21:43:12 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     </system>
Sep 30 21:43:12 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:43:12 compute-0 nova_compute[192810]:   <os>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:   </os>
Sep 30 21:43:12 compute-0 nova_compute[192810]:   <features>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:   </features>
Sep 30 21:43:12 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:43:12 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:43:12 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:43:12 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:43:12 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:43:12 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/8b0c810b-62f3-42b5-802e-cff3319a31be/disk"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:43:12 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/8b0c810b-62f3-42b5-802e-cff3319a31be/disk.config"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:43:12 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:88:a9:60"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:       <target dev="tapceed5eec-1f"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:43:12 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/8b0c810b-62f3-42b5-802e-cff3319a31be/console.log" append="off"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     <video>
Sep 30 21:43:12 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     </video>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:43:12 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:43:12 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:43:12 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:43:12 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:43:12 compute-0 nova_compute[192810]: </domain>
Sep 30 21:43:12 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:43:12 compute-0 nova_compute[192810]: 2025-09-30 21:43:12.031 2 DEBUG nova.compute.manager [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Preparing to wait for external event network-vif-plugged-ceed5eec-1f8b-4201-9a2e-d23353d1ff54 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:43:12 compute-0 nova_compute[192810]: 2025-09-30 21:43:12.031 2 DEBUG oslo_concurrency.lockutils [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "8b0c810b-62f3-42b5-802e-cff3319a31be-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:43:12 compute-0 nova_compute[192810]: 2025-09-30 21:43:12.031 2 DEBUG oslo_concurrency.lockutils [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "8b0c810b-62f3-42b5-802e-cff3319a31be-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:43:12 compute-0 nova_compute[192810]: 2025-09-30 21:43:12.032 2 DEBUG oslo_concurrency.lockutils [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "8b0c810b-62f3-42b5-802e-cff3319a31be-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:43:12 compute-0 nova_compute[192810]: 2025-09-30 21:43:12.032 2 DEBUG nova.virt.libvirt.vif [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:43:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1671875229',display_name='tempest-TestGettingAddress-server-1671875229',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1671875229',id=143,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEbsRSYvS8JJesfheyHbOoMyGCfPyJ5GpWwmJctOd2uZhE42aOz5OjKoIYzt5JXLihLuVhzcSbvOK7e8JNJviPXasiSc9g8x5CPrMMEKuf701EleKp6zJuNvQR4CPNWu/A==',key_name='tempest-TestGettingAddress-696430415',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-7sg94hy7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:43:05Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=8b0c810b-62f3-42b5-802e-cff3319a31be,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ceed5eec-1f8b-4201-9a2e-d23353d1ff54", "address": "fa:16:3e:88:a9:60", "network": {"id": "0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb", "bridge": "br-int", "label": "tempest-network-smoke--1321635209", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe88:a960", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapceed5eec-1f", "ovs_interfaceid": "ceed5eec-1f8b-4201-9a2e-d23353d1ff54", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:43:12 compute-0 nova_compute[192810]: 2025-09-30 21:43:12.032 2 DEBUG nova.network.os_vif_util [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "ceed5eec-1f8b-4201-9a2e-d23353d1ff54", "address": "fa:16:3e:88:a9:60", "network": {"id": "0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb", "bridge": "br-int", "label": "tempest-network-smoke--1321635209", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe88:a960", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapceed5eec-1f", "ovs_interfaceid": "ceed5eec-1f8b-4201-9a2e-d23353d1ff54", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:43:12 compute-0 nova_compute[192810]: 2025-09-30 21:43:12.033 2 DEBUG nova.network.os_vif_util [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:88:a9:60,bridge_name='br-int',has_traffic_filtering=True,id=ceed5eec-1f8b-4201-9a2e-d23353d1ff54,network=Network(0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapceed5eec-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:43:12 compute-0 nova_compute[192810]: 2025-09-30 21:43:12.033 2 DEBUG os_vif [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:88:a9:60,bridge_name='br-int',has_traffic_filtering=True,id=ceed5eec-1f8b-4201-9a2e-d23353d1ff54,network=Network(0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapceed5eec-1f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:43:12 compute-0 nova_compute[192810]: 2025-09-30 21:43:12.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:12 compute-0 nova_compute[192810]: 2025-09-30 21:43:12.034 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:43:12 compute-0 nova_compute[192810]: 2025-09-30 21:43:12.034 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:43:12 compute-0 nova_compute[192810]: 2025-09-30 21:43:12.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:12 compute-0 nova_compute[192810]: 2025-09-30 21:43:12.037 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapceed5eec-1f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:43:12 compute-0 nova_compute[192810]: 2025-09-30 21:43:12.038 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapceed5eec-1f, col_values=(('external_ids', {'iface-id': 'ceed5eec-1f8b-4201-9a2e-d23353d1ff54', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:88:a9:60', 'vm-uuid': '8b0c810b-62f3-42b5-802e-cff3319a31be'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:43:12 compute-0 nova_compute[192810]: 2025-09-30 21:43:12.039 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:12 compute-0 NetworkManager[51733]: <info>  [1759268592.0401] manager: (tapceed5eec-1f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/231)
Sep 30 21:43:12 compute-0 nova_compute[192810]: 2025-09-30 21:43:12.041 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:43:12 compute-0 nova_compute[192810]: 2025-09-30 21:43:12.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:12 compute-0 nova_compute[192810]: 2025-09-30 21:43:12.050 2 INFO os_vif [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:88:a9:60,bridge_name='br-int',has_traffic_filtering=True,id=ceed5eec-1f8b-4201-9a2e-d23353d1ff54,network=Network(0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapceed5eec-1f')
Sep 30 21:43:12 compute-0 nova_compute[192810]: 2025-09-30 21:43:12.623 2 DEBUG nova.virt.libvirt.driver [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:43:12 compute-0 nova_compute[192810]: 2025-09-30 21:43:12.624 2 DEBUG nova.virt.libvirt.driver [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:43:12 compute-0 nova_compute[192810]: 2025-09-30 21:43:12.624 2 DEBUG nova.virt.libvirt.driver [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] No VIF found with MAC fa:16:3e:88:a9:60, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:43:12 compute-0 nova_compute[192810]: 2025-09-30 21:43:12.624 2 INFO nova.virt.libvirt.driver [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Using config drive
Sep 30 21:43:12 compute-0 nova_compute[192810]: 2025-09-30 21:43:12.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:13 compute-0 nova_compute[192810]: 2025-09-30 21:43:13.529 2 INFO nova.virt.libvirt.driver [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Creating config drive at /var/lib/nova/instances/8b0c810b-62f3-42b5-802e-cff3319a31be/disk.config
Sep 30 21:43:13 compute-0 nova_compute[192810]: 2025-09-30 21:43:13.539 2 DEBUG oslo_concurrency.processutils [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8b0c810b-62f3-42b5-802e-cff3319a31be/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp63cioly5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:43:13 compute-0 nova_compute[192810]: 2025-09-30 21:43:13.572 2 DEBUG nova.compute.manager [req-24b4079b-89c1-4cd8-9ab3-3982e1117185 req-6fd62001-2ce5-4218-8106-f0840d7e1f6a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Received event network-changed-f310064e-5720-442a-ac22-71c932096d05 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:43:13 compute-0 nova_compute[192810]: 2025-09-30 21:43:13.573 2 DEBUG nova.compute.manager [req-24b4079b-89c1-4cd8-9ab3-3982e1117185 req-6fd62001-2ce5-4218-8106-f0840d7e1f6a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Refreshing instance network info cache due to event network-changed-f310064e-5720-442a-ac22-71c932096d05. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:43:13 compute-0 nova_compute[192810]: 2025-09-30 21:43:13.574 2 DEBUG oslo_concurrency.lockutils [req-24b4079b-89c1-4cd8-9ab3-3982e1117185 req-6fd62001-2ce5-4218-8106-f0840d7e1f6a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-14871acd-831a-4e81-b1b0-8d1d415e1e62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:43:13 compute-0 nova_compute[192810]: 2025-09-30 21:43:13.667 2 DEBUG oslo_concurrency.processutils [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8b0c810b-62f3-42b5-802e-cff3319a31be/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp63cioly5" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:43:13 compute-0 kernel: tapceed5eec-1f: entered promiscuous mode
Sep 30 21:43:13 compute-0 NetworkManager[51733]: <info>  [1759268593.7221] manager: (tapceed5eec-1f): new Tun device (/org/freedesktop/NetworkManager/Devices/232)
Sep 30 21:43:13 compute-0 ovn_controller[94912]: 2025-09-30T21:43:13Z|00521|binding|INFO|Claiming lport ceed5eec-1f8b-4201-9a2e-d23353d1ff54 for this chassis.
Sep 30 21:43:13 compute-0 ovn_controller[94912]: 2025-09-30T21:43:13Z|00522|binding|INFO|ceed5eec-1f8b-4201-9a2e-d23353d1ff54: Claiming fa:16:3e:88:a9:60 10.100.0.11 2001:db8::f816:3eff:fe88:a960
Sep 30 21:43:13 compute-0 nova_compute[192810]: 2025-09-30 21:43:13.723 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:13 compute-0 ovn_controller[94912]: 2025-09-30T21:43:13Z|00523|binding|INFO|Setting lport ceed5eec-1f8b-4201-9a2e-d23353d1ff54 ovn-installed in OVS
Sep 30 21:43:13 compute-0 nova_compute[192810]: 2025-09-30 21:43:13.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:13 compute-0 nova_compute[192810]: 2025-09-30 21:43:13.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:13.743 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:88:a9:60 10.100.0.11 2001:db8::f816:3eff:fe88:a960'], port_security=['fa:16:3e:88:a9:60 10.100.0.11 2001:db8::f816:3eff:fe88:a960'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28 2001:db8::f816:3eff:fe88:a960/64', 'neutron:device_id': '8b0c810b-62f3-42b5-802e-cff3319a31be', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8133a1ec-543c-40bd-a8db-7ecfb4b9f0ac', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=59e9be54-e45c-44fd-b8d8-090d788a8eee, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=ceed5eec-1f8b-4201-9a2e-d23353d1ff54) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:43:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:13.744 103867 INFO neutron.agent.ovn.metadata.agent [-] Port ceed5eec-1f8b-4201-9a2e-d23353d1ff54 in datapath 0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb bound to our chassis
Sep 30 21:43:13 compute-0 ovn_controller[94912]: 2025-09-30T21:43:13Z|00524|binding|INFO|Setting lport ceed5eec-1f8b-4201-9a2e-d23353d1ff54 up in Southbound
Sep 30 21:43:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:13.746 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb
Sep 30 21:43:13 compute-0 systemd-udevd[241853]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:43:13 compute-0 systemd-machined[152794]: New machine qemu-69-instance-0000008f.
Sep 30 21:43:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:13.761 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[03ac976c-3e5a-4e3f-9315-93fe051bde04]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:13 compute-0 NetworkManager[51733]: <info>  [1759268593.7672] device (tapceed5eec-1f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:43:13 compute-0 NetworkManager[51733]: <info>  [1759268593.7680] device (tapceed5eec-1f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:43:13 compute-0 systemd[1]: Started Virtual Machine qemu-69-instance-0000008f.
Sep 30 21:43:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:13.790 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[bb6cd995-da75-415a-9712-9afddeb65554]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:13.794 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[4bf7d90d-68c5-47b5-a3c3-36fd6a5eb6fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:13.821 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[6b7f0b1e-3ed8-4393-8ffc-6c2b2b7712b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:13.838 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[95b0fe4b-7f19-403a-aded-46baa866c783]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0670a52f-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ed:2b:af'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 22, 'tx_packets': 5, 'rx_bytes': 1956, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 22, 'tx_packets': 5, 'rx_bytes': 1956, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 157], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 524702, 'reachable_time': 22070, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 20, 'inoctets': 1592, 'indelivers': 4, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 20, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 1592, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 20, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 4, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241866, 'error': None, 'target': 'ovnmeta-0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:13.856 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[310d1b91-ac1b-4146-93da-0b1d9c5c6ba7]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap0670a52f-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 524715, 'tstamp': 524715}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241868, 'error': None, 'target': 'ovnmeta-0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap0670a52f-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 524718, 'tstamp': 524718}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241868, 'error': None, 'target': 'ovnmeta-0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:13.859 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0670a52f-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:43:13 compute-0 nova_compute[192810]: 2025-09-30 21:43:13.860 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:13 compute-0 nova_compute[192810]: 2025-09-30 21:43:13.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:13.862 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0670a52f-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:43:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:13.862 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:43:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:13.862 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0670a52f-e0, col_values=(('external_ids', {'iface-id': '5f2fed01-3e8a-4c9d-b394-f15b7f60366f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:43:13 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:13.863 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:43:14 compute-0 nova_compute[192810]: 2025-09-30 21:43:14.535 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268594.5348241, 8b0c810b-62f3-42b5-802e-cff3319a31be => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:43:14 compute-0 nova_compute[192810]: 2025-09-30 21:43:14.536 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] VM Started (Lifecycle Event)
Sep 30 21:43:14 compute-0 nova_compute[192810]: 2025-09-30 21:43:14.616 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:43:14 compute-0 nova_compute[192810]: 2025-09-30 21:43:14.620 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268594.5354266, 8b0c810b-62f3-42b5-802e-cff3319a31be => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:43:14 compute-0 nova_compute[192810]: 2025-09-30 21:43:14.620 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] VM Paused (Lifecycle Event)
Sep 30 21:43:14 compute-0 nova_compute[192810]: 2025-09-30 21:43:14.693 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:43:14 compute-0 nova_compute[192810]: 2025-09-30 21:43:14.697 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:43:14 compute-0 nova_compute[192810]: 2025-09-30 21:43:14.812 2 DEBUG nova.compute.manager [req-aeb41d25-d803-4b7f-b847-8787d233b830 req-4ec7c691-7187-4464-88b3-5dcfd9b19645 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Received event network-vif-plugged-ceed5eec-1f8b-4201-9a2e-d23353d1ff54 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:43:14 compute-0 nova_compute[192810]: 2025-09-30 21:43:14.812 2 DEBUG oslo_concurrency.lockutils [req-aeb41d25-d803-4b7f-b847-8787d233b830 req-4ec7c691-7187-4464-88b3-5dcfd9b19645 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "8b0c810b-62f3-42b5-802e-cff3319a31be-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:43:14 compute-0 nova_compute[192810]: 2025-09-30 21:43:14.813 2 DEBUG oslo_concurrency.lockutils [req-aeb41d25-d803-4b7f-b847-8787d233b830 req-4ec7c691-7187-4464-88b3-5dcfd9b19645 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "8b0c810b-62f3-42b5-802e-cff3319a31be-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:43:14 compute-0 nova_compute[192810]: 2025-09-30 21:43:14.814 2 DEBUG oslo_concurrency.lockutils [req-aeb41d25-d803-4b7f-b847-8787d233b830 req-4ec7c691-7187-4464-88b3-5dcfd9b19645 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "8b0c810b-62f3-42b5-802e-cff3319a31be-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:43:14 compute-0 nova_compute[192810]: 2025-09-30 21:43:14.814 2 DEBUG nova.compute.manager [req-aeb41d25-d803-4b7f-b847-8787d233b830 req-4ec7c691-7187-4464-88b3-5dcfd9b19645 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Processing event network-vif-plugged-ceed5eec-1f8b-4201-9a2e-d23353d1ff54 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:43:14 compute-0 nova_compute[192810]: 2025-09-30 21:43:14.815 2 DEBUG nova.compute.manager [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:43:14 compute-0 nova_compute[192810]: 2025-09-30 21:43:14.818 2 DEBUG nova.virt.libvirt.driver [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:43:14 compute-0 nova_compute[192810]: 2025-09-30 21:43:14.821 2 INFO nova.virt.libvirt.driver [-] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Instance spawned successfully.
Sep 30 21:43:14 compute-0 nova_compute[192810]: 2025-09-30 21:43:14.821 2 DEBUG nova.virt.libvirt.driver [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:43:14 compute-0 nova_compute[192810]: 2025-09-30 21:43:14.848 2 DEBUG nova.network.neutron [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Updating instance_info_cache with network_info: [{"id": "f310064e-5720-442a-ac22-71c932096d05", "address": "fa:16:3e:7a:5a:ab", "network": {"id": "9e174820-c17e-4913-80eb-3b337af24ff8", "bridge": "br-int", "label": "tempest-network-smoke--1903319746", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cb00322366384a558666220d5bb1ea75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf310064e-57", "ovs_interfaceid": "f310064e-5720-442a-ac22-71c932096d05", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:43:14 compute-0 nova_compute[192810]: 2025-09-30 21:43:14.957 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:43:14 compute-0 nova_compute[192810]: 2025-09-30 21:43:14.957 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268594.8175964, 8b0c810b-62f3-42b5-802e-cff3319a31be => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:43:14 compute-0 nova_compute[192810]: 2025-09-30 21:43:14.957 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] VM Resumed (Lifecycle Event)
Sep 30 21:43:14 compute-0 nova_compute[192810]: 2025-09-30 21:43:14.962 2 DEBUG nova.virt.libvirt.driver [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:43:14 compute-0 nova_compute[192810]: 2025-09-30 21:43:14.962 2 DEBUG nova.virt.libvirt.driver [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:43:14 compute-0 nova_compute[192810]: 2025-09-30 21:43:14.963 2 DEBUG nova.virt.libvirt.driver [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:43:14 compute-0 nova_compute[192810]: 2025-09-30 21:43:14.963 2 DEBUG nova.virt.libvirt.driver [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:43:14 compute-0 nova_compute[192810]: 2025-09-30 21:43:14.964 2 DEBUG nova.virt.libvirt.driver [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:43:14 compute-0 nova_compute[192810]: 2025-09-30 21:43:14.964 2 DEBUG nova.virt.libvirt.driver [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.067 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.071 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.074 2 DEBUG oslo_concurrency.lockutils [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Releasing lock "refresh_cache-14871acd-831a-4e81-b1b0-8d1d415e1e62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.075 2 DEBUG nova.compute.manager [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Instance network_info: |[{"id": "f310064e-5720-442a-ac22-71c932096d05", "address": "fa:16:3e:7a:5a:ab", "network": {"id": "9e174820-c17e-4913-80eb-3b337af24ff8", "bridge": "br-int", "label": "tempest-network-smoke--1903319746", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cb00322366384a558666220d5bb1ea75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf310064e-57", "ovs_interfaceid": "f310064e-5720-442a-ac22-71c932096d05", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.075 2 DEBUG oslo_concurrency.lockutils [req-24b4079b-89c1-4cd8-9ab3-3982e1117185 req-6fd62001-2ce5-4218-8106-f0840d7e1f6a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-14871acd-831a-4e81-b1b0-8d1d415e1e62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.075 2 DEBUG nova.network.neutron [req-24b4079b-89c1-4cd8-9ab3-3982e1117185 req-6fd62001-2ce5-4218-8106-f0840d7e1f6a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Refreshing network info cache for port f310064e-5720-442a-ac22-71c932096d05 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.078 2 DEBUG nova.virt.libvirt.driver [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Start _get_guest_xml network_info=[{"id": "f310064e-5720-442a-ac22-71c932096d05", "address": "fa:16:3e:7a:5a:ab", "network": {"id": "9e174820-c17e-4913-80eb-3b337af24ff8", "bridge": "br-int", "label": "tempest-network-smoke--1903319746", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cb00322366384a558666220d5bb1ea75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf310064e-57", "ovs_interfaceid": "f310064e-5720-442a-ac22-71c932096d05", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.081 2 WARNING nova.virt.libvirt.driver [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.086 2 DEBUG nova.virt.libvirt.host [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.087 2 DEBUG nova.virt.libvirt.host [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.090 2 DEBUG nova.virt.libvirt.host [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.091 2 DEBUG nova.virt.libvirt.host [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.092 2 DEBUG nova.virt.libvirt.driver [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.092 2 DEBUG nova.virt.hardware [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.092 2 DEBUG nova.virt.hardware [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.092 2 DEBUG nova.virt.hardware [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.093 2 DEBUG nova.virt.hardware [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.093 2 DEBUG nova.virt.hardware [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.093 2 DEBUG nova.virt.hardware [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.093 2 DEBUG nova.virt.hardware [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.093 2 DEBUG nova.virt.hardware [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.094 2 DEBUG nova.virt.hardware [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.094 2 DEBUG nova.virt.hardware [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.094 2 DEBUG nova.virt.hardware [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.097 2 DEBUG nova.virt.libvirt.vif [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:43:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-576292732-access_point-28086764',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-576292732-access_point-28086764',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-576292732-acc',id=142,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBa6RzlrZwzSQnYLMN5Uo66teCmEJk7j4fZqyIMs+H533SRijA8neeKUXKPPZd+DpplmTv8NjCrX68vJTURnSF9evOhuIsCkh3sge9M7z96jFRjg9S/vVrSmi9YklDKQHA==',key_name='tempest-TestSecurityGroupsBasicOps-1887830207',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cb00322366384a558666220d5bb1ea75',ramdisk_id='',reservation_id='r-nd4s85ub',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-576292732',owner_user_name='tempest-TestSecurityGroupsBasicOps-576292732-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:43:04Z,user_data=None,user_id='bedde839dd1d47ea82605be90cfad439',uuid=14871acd-831a-4e81-b1b0-8d1d415e1e62,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f310064e-5720-442a-ac22-71c932096d05", "address": "fa:16:3e:7a:5a:ab", "network": {"id": "9e174820-c17e-4913-80eb-3b337af24ff8", "bridge": "br-int", "label": "tempest-network-smoke--1903319746", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "cb00322366384a558666220d5bb1ea75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf310064e-57", "ovs_interfaceid": "f310064e-5720-442a-ac22-71c932096d05", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.097 2 DEBUG nova.network.os_vif_util [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Converting VIF {"id": "f310064e-5720-442a-ac22-71c932096d05", "address": "fa:16:3e:7a:5a:ab", "network": {"id": "9e174820-c17e-4913-80eb-3b337af24ff8", "bridge": "br-int", "label": "tempest-network-smoke--1903319746", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cb00322366384a558666220d5bb1ea75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf310064e-57", "ovs_interfaceid": "f310064e-5720-442a-ac22-71c932096d05", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.098 2 DEBUG nova.network.os_vif_util [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7a:5a:ab,bridge_name='br-int',has_traffic_filtering=True,id=f310064e-5720-442a-ac22-71c932096d05,network=Network(9e174820-c17e-4913-80eb-3b337af24ff8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf310064e-57') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.099 2 DEBUG nova.objects.instance [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Lazy-loading 'pci_devices' on Instance uuid 14871acd-831a-4e81-b1b0-8d1d415e1e62 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.146 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.148 2 DEBUG nova.virt.libvirt.driver [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:43:15 compute-0 nova_compute[192810]:   <uuid>14871acd-831a-4e81-b1b0-8d1d415e1e62</uuid>
Sep 30 21:43:15 compute-0 nova_compute[192810]:   <name>instance-0000008e</name>
Sep 30 21:43:15 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:43:15 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:43:15 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:43:15 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:       <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-576292732-access_point-28086764</nova:name>
Sep 30 21:43:15 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:43:15</nova:creationTime>
Sep 30 21:43:15 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:43:15 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:43:15 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:43:15 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:43:15 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:43:15 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:43:15 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:43:15 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:43:15 compute-0 nova_compute[192810]:         <nova:user uuid="bedde839dd1d47ea82605be90cfad439">tempest-TestSecurityGroupsBasicOps-576292732-project-member</nova:user>
Sep 30 21:43:15 compute-0 nova_compute[192810]:         <nova:project uuid="cb00322366384a558666220d5bb1ea75">tempest-TestSecurityGroupsBasicOps-576292732</nova:project>
Sep 30 21:43:15 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:43:15 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:43:15 compute-0 nova_compute[192810]:         <nova:port uuid="f310064e-5720-442a-ac22-71c932096d05">
Sep 30 21:43:15 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:43:15 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:43:15 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:43:15 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:43:15 compute-0 nova_compute[192810]:     <system>
Sep 30 21:43:15 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:43:15 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:43:15 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:43:15 compute-0 nova_compute[192810]:       <entry name="serial">14871acd-831a-4e81-b1b0-8d1d415e1e62</entry>
Sep 30 21:43:15 compute-0 nova_compute[192810]:       <entry name="uuid">14871acd-831a-4e81-b1b0-8d1d415e1e62</entry>
Sep 30 21:43:15 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     </system>
Sep 30 21:43:15 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:43:15 compute-0 nova_compute[192810]:   <os>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:   </os>
Sep 30 21:43:15 compute-0 nova_compute[192810]:   <features>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:   </features>
Sep 30 21:43:15 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:43:15 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:43:15 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:43:15 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:43:15 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:43:15 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/14871acd-831a-4e81-b1b0-8d1d415e1e62/disk"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:43:15 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/14871acd-831a-4e81-b1b0-8d1d415e1e62/disk.config"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:43:15 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:7a:5a:ab"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:       <target dev="tapf310064e-57"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:43:15 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/14871acd-831a-4e81-b1b0-8d1d415e1e62/console.log" append="off"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     <video>
Sep 30 21:43:15 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     </video>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:43:15 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:43:15 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:43:15 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:43:15 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:43:15 compute-0 nova_compute[192810]: </domain>
Sep 30 21:43:15 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.149 2 DEBUG nova.compute.manager [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Preparing to wait for external event network-vif-plugged-f310064e-5720-442a-ac22-71c932096d05 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.149 2 DEBUG oslo_concurrency.lockutils [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Acquiring lock "14871acd-831a-4e81-b1b0-8d1d415e1e62-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.149 2 DEBUG oslo_concurrency.lockutils [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Lock "14871acd-831a-4e81-b1b0-8d1d415e1e62-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.149 2 DEBUG oslo_concurrency.lockutils [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Lock "14871acd-831a-4e81-b1b0-8d1d415e1e62-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.150 2 DEBUG nova.virt.libvirt.vif [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:43:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-576292732-access_point-28086764',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-576292732-access_point-28086764',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-576292732-acc',id=142,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBa6RzlrZwzSQnYLMN5Uo66teCmEJk7j4fZqyIMs+H533SRijA8neeKUXKPPZd+DpplmTv8NjCrX68vJTURnSF9evOhuIsCkh3sge9M7z96jFRjg9S/vVrSmi9YklDKQHA==',key_name='tempest-TestSecurityGroupsBasicOps-1887830207',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cb00322366384a558666220d5bb1ea75',ramdisk_id='',reservation_id='r-nd4s85ub',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-576292732',owner_user_name='tempest-TestSecurityGroupsBasicOps-576292732-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:43:04Z,user_data=None,user_id='bedde839dd1d47ea82605be90cfad439',uuid=14871acd-831a-4e81-b1b0-8d1d415e1e62,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f310064e-5720-442a-ac22-71c932096d05", "address": "fa:16:3e:7a:5a:ab", "network": {"id": "9e174820-c17e-4913-80eb-3b337af24ff8", "bridge": "br-int", "label": "tempest-network-smoke--1903319746", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cb00322366384a558666220d5bb1ea75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf310064e-57", "ovs_interfaceid": "f310064e-5720-442a-ac22-71c932096d05", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.150 2 DEBUG nova.network.os_vif_util [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Converting VIF {"id": "f310064e-5720-442a-ac22-71c932096d05", "address": "fa:16:3e:7a:5a:ab", "network": {"id": "9e174820-c17e-4913-80eb-3b337af24ff8", "bridge": "br-int", "label": "tempest-network-smoke--1903319746", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cb00322366384a558666220d5bb1ea75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf310064e-57", "ovs_interfaceid": "f310064e-5720-442a-ac22-71c932096d05", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.151 2 DEBUG nova.network.os_vif_util [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7a:5a:ab,bridge_name='br-int',has_traffic_filtering=True,id=f310064e-5720-442a-ac22-71c932096d05,network=Network(9e174820-c17e-4913-80eb-3b337af24ff8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf310064e-57') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.151 2 DEBUG os_vif [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7a:5a:ab,bridge_name='br-int',has_traffic_filtering=True,id=f310064e-5720-442a-ac22-71c932096d05,network=Network(9e174820-c17e-4913-80eb-3b337af24ff8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf310064e-57') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.152 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.152 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.153 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.157 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.157 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf310064e-57, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.158 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf310064e-57, col_values=(('external_ids', {'iface-id': 'f310064e-5720-442a-ac22-71c932096d05', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7a:5a:ab', 'vm-uuid': '14871acd-831a-4e81-b1b0-8d1d415e1e62'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:15 compute-0 NetworkManager[51733]: <info>  [1759268595.1604] manager: (tapf310064e-57): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/233)
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.163 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.166 2 INFO os_vif [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7a:5a:ab,bridge_name='br-int',has_traffic_filtering=True,id=f310064e-5720-442a-ac22-71c932096d05,network=Network(9e174820-c17e-4913-80eb-3b337af24ff8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf310064e-57')
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.280 2 INFO nova.compute.manager [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Took 9.58 seconds to spawn the instance on the hypervisor.
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.281 2 DEBUG nova.compute.manager [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.620 2 DEBUG nova.virt.libvirt.driver [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.621 2 DEBUG nova.virt.libvirt.driver [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.621 2 DEBUG nova.virt.libvirt.driver [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] No VIF found with MAC fa:16:3e:7a:5a:ab, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.622 2 INFO nova.virt.libvirt.driver [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Using config drive
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.654 2 DEBUG nova.network.neutron [req-274e07f6-508d-4c82-a45b-72b65a3cf743 req-aa7316bb-6428-433e-95d4-1389b1db9575 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Updated VIF entry in instance network info cache for port ceed5eec-1f8b-4201-9a2e-d23353d1ff54. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.655 2 DEBUG nova.network.neutron [req-274e07f6-508d-4c82-a45b-72b65a3cf743 req-aa7316bb-6428-433e-95d4-1389b1db9575 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Updating instance_info_cache with network_info: [{"id": "ceed5eec-1f8b-4201-9a2e-d23353d1ff54", "address": "fa:16:3e:88:a9:60", "network": {"id": "0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb", "bridge": "br-int", "label": "tempest-network-smoke--1321635209", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe88:a960", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapceed5eec-1f", "ovs_interfaceid": "ceed5eec-1f8b-4201-9a2e-d23353d1ff54", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.756 2 DEBUG oslo_concurrency.lockutils [req-274e07f6-508d-4c82-a45b-72b65a3cf743 req-aa7316bb-6428-433e-95d4-1389b1db9575 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-8b0c810b-62f3-42b5-802e-cff3319a31be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.841 2 INFO nova.compute.manager [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Took 10.98 seconds to build instance.
Sep 30 21:43:15 compute-0 nova_compute[192810]: 2025-09-30 21:43:15.962 2 DEBUG oslo_concurrency.lockutils [None req-1c6fedfd-5a51-4692-917f-69dd22b40520 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "8b0c810b-62f3-42b5-802e-cff3319a31be" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.403s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:43:16 compute-0 nova_compute[192810]: 2025-09-30 21:43:16.613 2 INFO nova.virt.libvirt.driver [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Creating config drive at /var/lib/nova/instances/14871acd-831a-4e81-b1b0-8d1d415e1e62/disk.config
Sep 30 21:43:16 compute-0 nova_compute[192810]: 2025-09-30 21:43:16.618 2 DEBUG oslo_concurrency.processutils [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/14871acd-831a-4e81-b1b0-8d1d415e1e62/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgwmkt0tg execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:43:16 compute-0 nova_compute[192810]: 2025-09-30 21:43:16.744 2 DEBUG oslo_concurrency.processutils [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/14871acd-831a-4e81-b1b0-8d1d415e1e62/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgwmkt0tg" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:43:16 compute-0 kernel: tapf310064e-57: entered promiscuous mode
Sep 30 21:43:16 compute-0 NetworkManager[51733]: <info>  [1759268596.8133] manager: (tapf310064e-57): new Tun device (/org/freedesktop/NetworkManager/Devices/234)
Sep 30 21:43:16 compute-0 ovn_controller[94912]: 2025-09-30T21:43:16Z|00525|binding|INFO|Claiming lport f310064e-5720-442a-ac22-71c932096d05 for this chassis.
Sep 30 21:43:16 compute-0 ovn_controller[94912]: 2025-09-30T21:43:16Z|00526|binding|INFO|f310064e-5720-442a-ac22-71c932096d05: Claiming fa:16:3e:7a:5a:ab 10.100.0.3
Sep 30 21:43:16 compute-0 nova_compute[192810]: 2025-09-30 21:43:16.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:16 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:16.828 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7a:5a:ab 10.100.0.3'], port_security=['fa:16:3e:7a:5a:ab 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '14871acd-831a-4e81-b1b0-8d1d415e1e62', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9e174820-c17e-4913-80eb-3b337af24ff8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cb00322366384a558666220d5bb1ea75', 'neutron:revision_number': '2', 'neutron:security_group_ids': '15edc11d-fe1e-4ba6-bc6e-3ba35452f3a6 4e9360f7-9bdc-455b-b393-ad2ca9806660', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6550d43c-4039-45a9-a6b8-639d2fdf2b9e, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=f310064e-5720-442a-ac22-71c932096d05) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:43:16 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:16.829 103867 INFO neutron.agent.ovn.metadata.agent [-] Port f310064e-5720-442a-ac22-71c932096d05 in datapath 9e174820-c17e-4913-80eb-3b337af24ff8 bound to our chassis
Sep 30 21:43:16 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:16.830 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9e174820-c17e-4913-80eb-3b337af24ff8
Sep 30 21:43:16 compute-0 ovn_controller[94912]: 2025-09-30T21:43:16Z|00527|binding|INFO|Setting lport f310064e-5720-442a-ac22-71c932096d05 ovn-installed in OVS
Sep 30 21:43:16 compute-0 ovn_controller[94912]: 2025-09-30T21:43:16Z|00528|binding|INFO|Setting lport f310064e-5720-442a-ac22-71c932096d05 up in Southbound
Sep 30 21:43:16 compute-0 nova_compute[192810]: 2025-09-30 21:43:16.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:16 compute-0 nova_compute[192810]: 2025-09-30 21:43:16.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:16 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:16.842 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[bb2b268b-3e07-4811-a16c-b2642420a486]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:16 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:16.844 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9e174820-c1 in ovnmeta-9e174820-c17e-4913-80eb-3b337af24ff8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:43:16 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:16.846 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9e174820-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:43:16 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:16.846 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a70b2dcd-8c53-4afc-ba4d-cc650a8f5c9d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:16 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:16.847 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[61a0844f-2004-4597-887e-e80fcb82a5e8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:16 compute-0 systemd-udevd[241924]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:43:16 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:16.857 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[e54e9a7b-9bfc-42a7-bbb3-90879c7c44e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:16 compute-0 NetworkManager[51733]: <info>  [1759268596.8688] device (tapf310064e-57): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:43:16 compute-0 systemd-machined[152794]: New machine qemu-70-instance-0000008e.
Sep 30 21:43:16 compute-0 NetworkManager[51733]: <info>  [1759268596.8699] device (tapf310064e-57): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:43:16 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:16.873 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[39cadb46-392e-4477-b284-6ab417ffa4be]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:16 compute-0 systemd[1]: Started Virtual Machine qemu-70-instance-0000008e.
Sep 30 21:43:16 compute-0 podman[241889]: 2025-09-30 21:43:16.881513677 +0000 UTC m=+0.072773607 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 21:43:16 compute-0 podman[241890]: 2025-09-30 21:43:16.890983179 +0000 UTC m=+0.082173467 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, architecture=x86_64, maintainer=Red Hat, Inc., config_id=edpm, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 21:43:16 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:16.903 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[dabe2739-19ba-470e-90f8-fa3f53b63969]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:16 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:16.907 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[8c0d9960-7e6f-4744-a93f-51a87777821d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:16 compute-0 NetworkManager[51733]: <info>  [1759268596.9090] manager: (tap9e174820-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/235)
Sep 30 21:43:16 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:16.937 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[d168bcc9-c8a6-4719-888e-5364423fd7a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:16 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:16.941 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[d2d2d08b-2eb4-4109-a6fb-25adaa3a3f3a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:16 compute-0 nova_compute[192810]: 2025-09-30 21:43:16.947 2 DEBUG nova.compute.manager [req-241e5577-cfc7-4548-bf98-5a116307df5d req-daddf496-3b96-46db-b2b9-68a764fced6b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Received event network-vif-plugged-ceed5eec-1f8b-4201-9a2e-d23353d1ff54 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:43:16 compute-0 nova_compute[192810]: 2025-09-30 21:43:16.947 2 DEBUG oslo_concurrency.lockutils [req-241e5577-cfc7-4548-bf98-5a116307df5d req-daddf496-3b96-46db-b2b9-68a764fced6b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "8b0c810b-62f3-42b5-802e-cff3319a31be-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:43:16 compute-0 nova_compute[192810]: 2025-09-30 21:43:16.947 2 DEBUG oslo_concurrency.lockutils [req-241e5577-cfc7-4548-bf98-5a116307df5d req-daddf496-3b96-46db-b2b9-68a764fced6b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "8b0c810b-62f3-42b5-802e-cff3319a31be-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:43:16 compute-0 nova_compute[192810]: 2025-09-30 21:43:16.948 2 DEBUG oslo_concurrency.lockutils [req-241e5577-cfc7-4548-bf98-5a116307df5d req-daddf496-3b96-46db-b2b9-68a764fced6b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "8b0c810b-62f3-42b5-802e-cff3319a31be-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:43:16 compute-0 nova_compute[192810]: 2025-09-30 21:43:16.948 2 DEBUG nova.compute.manager [req-241e5577-cfc7-4548-bf98-5a116307df5d req-daddf496-3b96-46db-b2b9-68a764fced6b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] No waiting events found dispatching network-vif-plugged-ceed5eec-1f8b-4201-9a2e-d23353d1ff54 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:43:16 compute-0 nova_compute[192810]: 2025-09-30 21:43:16.948 2 WARNING nova.compute.manager [req-241e5577-cfc7-4548-bf98-5a116307df5d req-daddf496-3b96-46db-b2b9-68a764fced6b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Received unexpected event network-vif-plugged-ceed5eec-1f8b-4201-9a2e-d23353d1ff54 for instance with vm_state active and task_state None.
Sep 30 21:43:16 compute-0 NetworkManager[51733]: <info>  [1759268596.9689] device (tap9e174820-c0): carrier: link connected
Sep 30 21:43:16 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:16.973 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[fd0cf4f5-4118-4ab4-b3e0-4dcf17966100]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:16 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:16.989 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[5ba32c14-0721-410a-9119-2bb947ab99d9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9e174820-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3b:af:3f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 160], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 529259, 'reachable_time': 37202, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241972, 'error': None, 'target': 'ovnmeta-9e174820-c17e-4913-80eb-3b337af24ff8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:17 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:17.004 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[140bbc61-04ab-4a59-ba8e-5e8fb2a5eaf9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe3b:af3f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 529259, 'tstamp': 529259}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241973, 'error': None, 'target': 'ovnmeta-9e174820-c17e-4913-80eb-3b337af24ff8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:17 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:17.020 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[6fba9b13-93a2-46c5-8389-e79fb35fa4a0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9e174820-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3b:af:3f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 160], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 529259, 'reachable_time': 37202, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 241974, 'error': None, 'target': 'ovnmeta-9e174820-c17e-4913-80eb-3b337af24ff8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:17 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:17.048 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[7ef4fed0-51d3-4693-914d-2eac54b75403]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:17 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:17.109 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[c2fe8919-220f-4bf0-9893-d8207ba6848d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:17 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:17.110 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9e174820-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:43:17 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:17.111 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:43:17 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:17.111 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9e174820-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:43:17 compute-0 NetworkManager[51733]: <info>  [1759268597.1137] manager: (tap9e174820-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/236)
Sep 30 21:43:17 compute-0 kernel: tap9e174820-c0: entered promiscuous mode
Sep 30 21:43:17 compute-0 nova_compute[192810]: 2025-09-30 21:43:17.114 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:17 compute-0 nova_compute[192810]: 2025-09-30 21:43:17.115 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:17 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:17.116 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9e174820-c0, col_values=(('external_ids', {'iface-id': 'f88051d0-b158-4143-b8df-208af04344db'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:43:17 compute-0 ovn_controller[94912]: 2025-09-30T21:43:17Z|00529|binding|INFO|Releasing lport f88051d0-b158-4143-b8df-208af04344db from this chassis (sb_readonly=0)
Sep 30 21:43:17 compute-0 nova_compute[192810]: 2025-09-30 21:43:17.117 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:17 compute-0 nova_compute[192810]: 2025-09-30 21:43:17.118 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:17 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:17.118 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9e174820-c17e-4913-80eb-3b337af24ff8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9e174820-c17e-4913-80eb-3b337af24ff8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:43:17 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:17.130 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[20978a82-8536-4ee8-9834-49f77695046e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:17 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:17.131 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:43:17 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:43:17 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:43:17 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-9e174820-c17e-4913-80eb-3b337af24ff8
Sep 30 21:43:17 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:43:17 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:43:17 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:43:17 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/9e174820-c17e-4913-80eb-3b337af24ff8.pid.haproxy
Sep 30 21:43:17 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:43:17 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:43:17 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:43:17 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:43:17 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:43:17 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:43:17 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:43:17 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:43:17 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:43:17 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:43:17 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:43:17 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:43:17 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:43:17 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:43:17 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:43:17 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:43:17 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:43:17 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:43:17 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:43:17 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:43:17 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID 9e174820-c17e-4913-80eb-3b337af24ff8
Sep 30 21:43:17 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:43:17 compute-0 nova_compute[192810]: 2025-09-30 21:43:17.133 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:17 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:17.133 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9e174820-c17e-4913-80eb-3b337af24ff8', 'env', 'PROCESS_TAG=haproxy-9e174820-c17e-4913-80eb-3b337af24ff8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9e174820-c17e-4913-80eb-3b337af24ff8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:43:17 compute-0 podman[242013]: 2025-09-30 21:43:17.454741034 +0000 UTC m=+0.024422431 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:43:17 compute-0 nova_compute[192810]: 2025-09-30 21:43:17.785 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268597.783941, 14871acd-831a-4e81-b1b0-8d1d415e1e62 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:43:17 compute-0 nova_compute[192810]: 2025-09-30 21:43:17.785 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] VM Started (Lifecycle Event)
Sep 30 21:43:17 compute-0 nova_compute[192810]: 2025-09-30 21:43:17.813 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:43:17 compute-0 nova_compute[192810]: 2025-09-30 21:43:17.819 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268597.7886267, 14871acd-831a-4e81-b1b0-8d1d415e1e62 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:43:17 compute-0 nova_compute[192810]: 2025-09-30 21:43:17.819 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] VM Paused (Lifecycle Event)
Sep 30 21:43:17 compute-0 nova_compute[192810]: 2025-09-30 21:43:17.863 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:17 compute-0 nova_compute[192810]: 2025-09-30 21:43:17.866 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:43:17 compute-0 nova_compute[192810]: 2025-09-30 21:43:17.870 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:43:17 compute-0 nova_compute[192810]: 2025-09-30 21:43:17.900 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:43:17 compute-0 nova_compute[192810]: 2025-09-30 21:43:17.979 2 DEBUG nova.network.neutron [req-24b4079b-89c1-4cd8-9ab3-3982e1117185 req-6fd62001-2ce5-4218-8106-f0840d7e1f6a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Updated VIF entry in instance network info cache for port f310064e-5720-442a-ac22-71c932096d05. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:43:17 compute-0 nova_compute[192810]: 2025-09-30 21:43:17.979 2 DEBUG nova.network.neutron [req-24b4079b-89c1-4cd8-9ab3-3982e1117185 req-6fd62001-2ce5-4218-8106-f0840d7e1f6a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Updating instance_info_cache with network_info: [{"id": "f310064e-5720-442a-ac22-71c932096d05", "address": "fa:16:3e:7a:5a:ab", "network": {"id": "9e174820-c17e-4913-80eb-3b337af24ff8", "bridge": "br-int", "label": "tempest-network-smoke--1903319746", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cb00322366384a558666220d5bb1ea75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf310064e-57", "ovs_interfaceid": "f310064e-5720-442a-ac22-71c932096d05", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:43:18 compute-0 podman[242013]: 2025-09-30 21:43:18.017255019 +0000 UTC m=+0.586936416 container create 3c19e91211afc626c72d9ba18ea711bcba245d1d1eae98b35ce3301fb77a711c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9e174820-c17e-4913-80eb-3b337af24ff8, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2)
Sep 30 21:43:18 compute-0 nova_compute[192810]: 2025-09-30 21:43:18.018 2 DEBUG oslo_concurrency.lockutils [req-24b4079b-89c1-4cd8-9ab3-3982e1117185 req-6fd62001-2ce5-4218-8106-f0840d7e1f6a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-14871acd-831a-4e81-b1b0-8d1d415e1e62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:43:18 compute-0 nova_compute[192810]: 2025-09-30 21:43:18.085 2 DEBUG nova.compute.manager [req-10b6b89d-eec2-4e98-aaf1-42fad8913c72 req-33b9ec56-7f6f-4fa5-83b0-b1b6e65f3a23 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Received event network-vif-plugged-f310064e-5720-442a-ac22-71c932096d05 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:43:18 compute-0 nova_compute[192810]: 2025-09-30 21:43:18.085 2 DEBUG oslo_concurrency.lockutils [req-10b6b89d-eec2-4e98-aaf1-42fad8913c72 req-33b9ec56-7f6f-4fa5-83b0-b1b6e65f3a23 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "14871acd-831a-4e81-b1b0-8d1d415e1e62-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:43:18 compute-0 nova_compute[192810]: 2025-09-30 21:43:18.086 2 DEBUG oslo_concurrency.lockutils [req-10b6b89d-eec2-4e98-aaf1-42fad8913c72 req-33b9ec56-7f6f-4fa5-83b0-b1b6e65f3a23 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "14871acd-831a-4e81-b1b0-8d1d415e1e62-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:43:18 compute-0 nova_compute[192810]: 2025-09-30 21:43:18.086 2 DEBUG oslo_concurrency.lockutils [req-10b6b89d-eec2-4e98-aaf1-42fad8913c72 req-33b9ec56-7f6f-4fa5-83b0-b1b6e65f3a23 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "14871acd-831a-4e81-b1b0-8d1d415e1e62-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:43:18 compute-0 nova_compute[192810]: 2025-09-30 21:43:18.087 2 DEBUG nova.compute.manager [req-10b6b89d-eec2-4e98-aaf1-42fad8913c72 req-33b9ec56-7f6f-4fa5-83b0-b1b6e65f3a23 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Processing event network-vif-plugged-f310064e-5720-442a-ac22-71c932096d05 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:43:18 compute-0 nova_compute[192810]: 2025-09-30 21:43:18.087 2 DEBUG nova.compute.manager [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:43:18 compute-0 nova_compute[192810]: 2025-09-30 21:43:18.091 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268598.0909266, 14871acd-831a-4e81-b1b0-8d1d415e1e62 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:43:18 compute-0 nova_compute[192810]: 2025-09-30 21:43:18.091 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] VM Resumed (Lifecycle Event)
Sep 30 21:43:18 compute-0 nova_compute[192810]: 2025-09-30 21:43:18.094 2 DEBUG nova.virt.libvirt.driver [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:43:18 compute-0 nova_compute[192810]: 2025-09-30 21:43:18.098 2 INFO nova.virt.libvirt.driver [-] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Instance spawned successfully.
Sep 30 21:43:18 compute-0 nova_compute[192810]: 2025-09-30 21:43:18.098 2 DEBUG nova.virt.libvirt.driver [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:43:18 compute-0 nova_compute[192810]: 2025-09-30 21:43:18.127 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:43:18 compute-0 nova_compute[192810]: 2025-09-30 21:43:18.131 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:43:18 compute-0 nova_compute[192810]: 2025-09-30 21:43:18.142 2 DEBUG nova.virt.libvirt.driver [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:43:18 compute-0 nova_compute[192810]: 2025-09-30 21:43:18.143 2 DEBUG nova.virt.libvirt.driver [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:43:18 compute-0 nova_compute[192810]: 2025-09-30 21:43:18.143 2 DEBUG nova.virt.libvirt.driver [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:43:18 compute-0 nova_compute[192810]: 2025-09-30 21:43:18.144 2 DEBUG nova.virt.libvirt.driver [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:43:18 compute-0 nova_compute[192810]: 2025-09-30 21:43:18.144 2 DEBUG nova.virt.libvirt.driver [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:43:18 compute-0 nova_compute[192810]: 2025-09-30 21:43:18.145 2 DEBUG nova.virt.libvirt.driver [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:43:18 compute-0 systemd[1]: Started libpod-conmon-3c19e91211afc626c72d9ba18ea711bcba245d1d1eae98b35ce3301fb77a711c.scope.
Sep 30 21:43:18 compute-0 nova_compute[192810]: 2025-09-30 21:43:18.181 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:43:18 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:43:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a8bdaf780ceeffbcc48c2de4ba0d5340414652cc692cbd080ba638a01f17a8b7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:43:18 compute-0 nova_compute[192810]: 2025-09-30 21:43:18.312 2 INFO nova.compute.manager [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Took 13.44 seconds to spawn the instance on the hypervisor.
Sep 30 21:43:18 compute-0 nova_compute[192810]: 2025-09-30 21:43:18.313 2 DEBUG nova.compute.manager [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:43:18 compute-0 podman[242013]: 2025-09-30 21:43:18.339442215 +0000 UTC m=+0.909123612 container init 3c19e91211afc626c72d9ba18ea711bcba245d1d1eae98b35ce3301fb77a711c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9e174820-c17e-4913-80eb-3b337af24ff8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:43:18 compute-0 podman[242013]: 2025-09-30 21:43:18.346205861 +0000 UTC m=+0.915887258 container start 3c19e91211afc626c72d9ba18ea711bcba245d1d1eae98b35ce3301fb77a711c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9e174820-c17e-4913-80eb-3b337af24ff8, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:43:18 compute-0 neutron-haproxy-ovnmeta-9e174820-c17e-4913-80eb-3b337af24ff8[242028]: [NOTICE]   (242032) : New worker (242034) forked
Sep 30 21:43:18 compute-0 neutron-haproxy-ovnmeta-9e174820-c17e-4913-80eb-3b337af24ff8[242028]: [NOTICE]   (242032) : Loading success.
Sep 30 21:43:18 compute-0 nova_compute[192810]: 2025-09-30 21:43:18.443 2 INFO nova.compute.manager [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Took 16.25 seconds to build instance.
Sep 30 21:43:18 compute-0 nova_compute[192810]: 2025-09-30 21:43:18.488 2 DEBUG oslo_concurrency.lockutils [None req-b98a96ee-7521-4f2e-8962-b8908762683e bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Lock "14871acd-831a-4e81-b1b0-8d1d415e1e62" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.421s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:43:20 compute-0 nova_compute[192810]: 2025-09-30 21:43:20.161 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:20 compute-0 nova_compute[192810]: 2025-09-30 21:43:20.297 2 DEBUG nova.compute.manager [req-032b80eb-1764-49cb-a508-bbc0076dba2e req-e186a857-a688-4f75-9c62-92a5556765d6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Received event network-vif-plugged-f310064e-5720-442a-ac22-71c932096d05 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:43:20 compute-0 nova_compute[192810]: 2025-09-30 21:43:20.298 2 DEBUG oslo_concurrency.lockutils [req-032b80eb-1764-49cb-a508-bbc0076dba2e req-e186a857-a688-4f75-9c62-92a5556765d6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "14871acd-831a-4e81-b1b0-8d1d415e1e62-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:43:20 compute-0 nova_compute[192810]: 2025-09-30 21:43:20.298 2 DEBUG oslo_concurrency.lockutils [req-032b80eb-1764-49cb-a508-bbc0076dba2e req-e186a857-a688-4f75-9c62-92a5556765d6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "14871acd-831a-4e81-b1b0-8d1d415e1e62-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:43:20 compute-0 nova_compute[192810]: 2025-09-30 21:43:20.299 2 DEBUG oslo_concurrency.lockutils [req-032b80eb-1764-49cb-a508-bbc0076dba2e req-e186a857-a688-4f75-9c62-92a5556765d6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "14871acd-831a-4e81-b1b0-8d1d415e1e62-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:43:20 compute-0 nova_compute[192810]: 2025-09-30 21:43:20.299 2 DEBUG nova.compute.manager [req-032b80eb-1764-49cb-a508-bbc0076dba2e req-e186a857-a688-4f75-9c62-92a5556765d6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] No waiting events found dispatching network-vif-plugged-f310064e-5720-442a-ac22-71c932096d05 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:43:20 compute-0 nova_compute[192810]: 2025-09-30 21:43:20.299 2 WARNING nova.compute.manager [req-032b80eb-1764-49cb-a508-bbc0076dba2e req-e186a857-a688-4f75-9c62-92a5556765d6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Received unexpected event network-vif-plugged-f310064e-5720-442a-ac22-71c932096d05 for instance with vm_state active and task_state None.
Sep 30 21:43:21 compute-0 nova_compute[192810]: 2025-09-30 21:43:21.041 2 DEBUG nova.compute.manager [req-7b25b9ad-689f-49dd-87b7-54ef82978be2 req-889b9c87-a191-4955-86d9-b8817e508e56 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Received event network-changed-ceed5eec-1f8b-4201-9a2e-d23353d1ff54 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:43:21 compute-0 nova_compute[192810]: 2025-09-30 21:43:21.042 2 DEBUG nova.compute.manager [req-7b25b9ad-689f-49dd-87b7-54ef82978be2 req-889b9c87-a191-4955-86d9-b8817e508e56 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Refreshing instance network info cache due to event network-changed-ceed5eec-1f8b-4201-9a2e-d23353d1ff54. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:43:21 compute-0 nova_compute[192810]: 2025-09-30 21:43:21.042 2 DEBUG oslo_concurrency.lockutils [req-7b25b9ad-689f-49dd-87b7-54ef82978be2 req-889b9c87-a191-4955-86d9-b8817e508e56 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-8b0c810b-62f3-42b5-802e-cff3319a31be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:43:21 compute-0 nova_compute[192810]: 2025-09-30 21:43:21.043 2 DEBUG oslo_concurrency.lockutils [req-7b25b9ad-689f-49dd-87b7-54ef82978be2 req-889b9c87-a191-4955-86d9-b8817e508e56 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-8b0c810b-62f3-42b5-802e-cff3319a31be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:43:21 compute-0 nova_compute[192810]: 2025-09-30 21:43:21.043 2 DEBUG nova.network.neutron [req-7b25b9ad-689f-49dd-87b7-54ef82978be2 req-889b9c87-a191-4955-86d9-b8817e508e56 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Refreshing network info cache for port ceed5eec-1f8b-4201-9a2e-d23353d1ff54 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:43:22 compute-0 nova_compute[192810]: 2025-09-30 21:43:22.866 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:24 compute-0 nova_compute[192810]: 2025-09-30 21:43:24.440 2 DEBUG nova.network.neutron [req-7b25b9ad-689f-49dd-87b7-54ef82978be2 req-889b9c87-a191-4955-86d9-b8817e508e56 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Updated VIF entry in instance network info cache for port ceed5eec-1f8b-4201-9a2e-d23353d1ff54. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:43:24 compute-0 nova_compute[192810]: 2025-09-30 21:43:24.441 2 DEBUG nova.network.neutron [req-7b25b9ad-689f-49dd-87b7-54ef82978be2 req-889b9c87-a191-4955-86d9-b8817e508e56 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Updating instance_info_cache with network_info: [{"id": "ceed5eec-1f8b-4201-9a2e-d23353d1ff54", "address": "fa:16:3e:88:a9:60", "network": {"id": "0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb", "bridge": "br-int", "label": "tempest-network-smoke--1321635209", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe88:a960", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapceed5eec-1f", "ovs_interfaceid": "ceed5eec-1f8b-4201-9a2e-d23353d1ff54", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:43:24 compute-0 nova_compute[192810]: 2025-09-30 21:43:24.517 2 DEBUG oslo_concurrency.lockutils [req-7b25b9ad-689f-49dd-87b7-54ef82978be2 req-889b9c87-a191-4955-86d9-b8817e508e56 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-8b0c810b-62f3-42b5-802e-cff3319a31be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:43:25 compute-0 nova_compute[192810]: 2025-09-30 21:43:25.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:25 compute-0 podman[242044]: 2025-09-30 21:43:25.363154494 +0000 UTC m=+0.089037906 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=iscsid, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:43:25 compute-0 podman[242043]: 2025-09-30 21:43:25.365511942 +0000 UTC m=+0.091150678 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS)
Sep 30 21:43:25 compute-0 podman[242045]: 2025-09-30 21:43:25.374217886 +0000 UTC m=+0.084056094 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:43:27 compute-0 nova_compute[192810]: 2025-09-30 21:43:27.543 2 DEBUG nova.compute.manager [req-60d70b93-7884-474b-bd25-fd6fe746f820 req-2980cdb4-4928-46bf-8ac2-f2cd8cf92854 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Received event network-changed-f310064e-5720-442a-ac22-71c932096d05 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:43:27 compute-0 nova_compute[192810]: 2025-09-30 21:43:27.544 2 DEBUG nova.compute.manager [req-60d70b93-7884-474b-bd25-fd6fe746f820 req-2980cdb4-4928-46bf-8ac2-f2cd8cf92854 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Refreshing instance network info cache due to event network-changed-f310064e-5720-442a-ac22-71c932096d05. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:43:27 compute-0 nova_compute[192810]: 2025-09-30 21:43:27.544 2 DEBUG oslo_concurrency.lockutils [req-60d70b93-7884-474b-bd25-fd6fe746f820 req-2980cdb4-4928-46bf-8ac2-f2cd8cf92854 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-14871acd-831a-4e81-b1b0-8d1d415e1e62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:43:27 compute-0 nova_compute[192810]: 2025-09-30 21:43:27.544 2 DEBUG oslo_concurrency.lockutils [req-60d70b93-7884-474b-bd25-fd6fe746f820 req-2980cdb4-4928-46bf-8ac2-f2cd8cf92854 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-14871acd-831a-4e81-b1b0-8d1d415e1e62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:43:27 compute-0 nova_compute[192810]: 2025-09-30 21:43:27.545 2 DEBUG nova.network.neutron [req-60d70b93-7884-474b-bd25-fd6fe746f820 req-2980cdb4-4928-46bf-8ac2-f2cd8cf92854 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Refreshing network info cache for port f310064e-5720-442a-ac22-71c932096d05 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:43:27 compute-0 nova_compute[192810]: 2025-09-30 21:43:27.867 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:30 compute-0 nova_compute[192810]: 2025-09-30 21:43:30.168 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:30 compute-0 nova_compute[192810]: 2025-09-30 21:43:30.475 2 DEBUG nova.network.neutron [req-60d70b93-7884-474b-bd25-fd6fe746f820 req-2980cdb4-4928-46bf-8ac2-f2cd8cf92854 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Updated VIF entry in instance network info cache for port f310064e-5720-442a-ac22-71c932096d05. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:43:30 compute-0 nova_compute[192810]: 2025-09-30 21:43:30.476 2 DEBUG nova.network.neutron [req-60d70b93-7884-474b-bd25-fd6fe746f820 req-2980cdb4-4928-46bf-8ac2-f2cd8cf92854 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Updating instance_info_cache with network_info: [{"id": "f310064e-5720-442a-ac22-71c932096d05", "address": "fa:16:3e:7a:5a:ab", "network": {"id": "9e174820-c17e-4913-80eb-3b337af24ff8", "bridge": "br-int", "label": "tempest-network-smoke--1903319746", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cb00322366384a558666220d5bb1ea75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf310064e-57", "ovs_interfaceid": "f310064e-5720-442a-ac22-71c932096d05", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:43:30 compute-0 nova_compute[192810]: 2025-09-30 21:43:30.515 2 DEBUG oslo_concurrency.lockutils [req-60d70b93-7884-474b-bd25-fd6fe746f820 req-2980cdb4-4928-46bf-8ac2-f2cd8cf92854 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-14871acd-831a-4e81-b1b0-8d1d415e1e62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:43:32 compute-0 nova_compute[192810]: 2025-09-30 21:43:32.869 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:34 compute-0 ovn_controller[94912]: 2025-09-30T21:43:34Z|00530|binding|INFO|Releasing lport 5f2fed01-3e8a-4c9d-b394-f15b7f60366f from this chassis (sb_readonly=0)
Sep 30 21:43:34 compute-0 ovn_controller[94912]: 2025-09-30T21:43:34Z|00531|binding|INFO|Releasing lport f88051d0-b158-4143-b8df-208af04344db from this chassis (sb_readonly=0)
Sep 30 21:43:34 compute-0 nova_compute[192810]: 2025-09-30 21:43:34.757 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:35 compute-0 nova_compute[192810]: 2025-09-30 21:43:35.169 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:35.700 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=33, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=32) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:43:35 compute-0 nova_compute[192810]: 2025-09-30 21:43:35.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:35.701 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:43:36 compute-0 ovn_controller[94912]: 2025-09-30T21:43:36Z|00054|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:88:a9:60 10.100.0.11
Sep 30 21:43:36 compute-0 ovn_controller[94912]: 2025-09-30T21:43:36Z|00055|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:88:a9:60 10.100.0.11
Sep 30 21:43:37 compute-0 nova_compute[192810]: 2025-09-30 21:43:37.871 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:37 compute-0 ovn_controller[94912]: 2025-09-30T21:43:37Z|00056|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7a:5a:ab 10.100.0.3
Sep 30 21:43:37 compute-0 ovn_controller[94912]: 2025-09-30T21:43:37Z|00057|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7a:5a:ab 10.100.0.3
Sep 30 21:43:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:38.748 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:43:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:38.748 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:43:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:38.749 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:43:39 compute-0 podman[242141]: 2025-09-30 21:43:39.333747006 +0000 UTC m=+0.056996190 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:43:39 compute-0 podman[242142]: 2025-09-30 21:43:39.335176941 +0000 UTC m=+0.056587559 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:43:39 compute-0 podman[242140]: 2025-09-30 21:43:39.361362284 +0000 UTC m=+0.087699023 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.vendor=CentOS)
Sep 30 21:43:39 compute-0 nova_compute[192810]: 2025-09-30 21:43:39.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:40 compute-0 nova_compute[192810]: 2025-09-30 21:43:40.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:42 compute-0 nova_compute[192810]: 2025-09-30 21:43:42.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:44.703 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '33'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:43:45 compute-0 nova_compute[192810]: 2025-09-30 21:43:45.176 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:45 compute-0 nova_compute[192810]: 2025-09-30 21:43:45.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:47 compute-0 podman[242204]: 2025-09-30 21:43:47.311431818 +0000 UTC m=+0.051751781 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:43:47 compute-0 podman[242205]: 2025-09-30 21:43:47.338468881 +0000 UTC m=+0.066823751 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Sep 30 21:43:47 compute-0 nova_compute[192810]: 2025-09-30 21:43:47.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:50 compute-0 nova_compute[192810]: 2025-09-30 21:43:50.178 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:52 compute-0 nova_compute[192810]: 2025-09-30 21:43:52.876 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:54 compute-0 nova_compute[192810]: 2025-09-30 21:43:54.837 2 DEBUG nova.compute.manager [req-3b356259-1331-4071-ade1-0ef696d50eb8 req-935c5f07-bb46-420d-b04f-89e40a7d7bac dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Received event network-changed-ceed5eec-1f8b-4201-9a2e-d23353d1ff54 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:43:54 compute-0 nova_compute[192810]: 2025-09-30 21:43:54.838 2 DEBUG nova.compute.manager [req-3b356259-1331-4071-ade1-0ef696d50eb8 req-935c5f07-bb46-420d-b04f-89e40a7d7bac dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Refreshing instance network info cache due to event network-changed-ceed5eec-1f8b-4201-9a2e-d23353d1ff54. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:43:54 compute-0 nova_compute[192810]: 2025-09-30 21:43:54.838 2 DEBUG oslo_concurrency.lockutils [req-3b356259-1331-4071-ade1-0ef696d50eb8 req-935c5f07-bb46-420d-b04f-89e40a7d7bac dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-8b0c810b-62f3-42b5-802e-cff3319a31be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:43:54 compute-0 nova_compute[192810]: 2025-09-30 21:43:54.839 2 DEBUG oslo_concurrency.lockutils [req-3b356259-1331-4071-ade1-0ef696d50eb8 req-935c5f07-bb46-420d-b04f-89e40a7d7bac dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-8b0c810b-62f3-42b5-802e-cff3319a31be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:43:54 compute-0 nova_compute[192810]: 2025-09-30 21:43:54.839 2 DEBUG nova.network.neutron [req-3b356259-1331-4071-ade1-0ef696d50eb8 req-935c5f07-bb46-420d-b04f-89e40a7d7bac dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Refreshing network info cache for port ceed5eec-1f8b-4201-9a2e-d23353d1ff54 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:43:55 compute-0 nova_compute[192810]: 2025-09-30 21:43:55.181 2 DEBUG oslo_concurrency.lockutils [None req-e0de11a8-b422-41dc-ba7b-3a73de16be82 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "8b0c810b-62f3-42b5-802e-cff3319a31be" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:43:55 compute-0 nova_compute[192810]: 2025-09-30 21:43:55.182 2 DEBUG oslo_concurrency.lockutils [None req-e0de11a8-b422-41dc-ba7b-3a73de16be82 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "8b0c810b-62f3-42b5-802e-cff3319a31be" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:43:55 compute-0 nova_compute[192810]: 2025-09-30 21:43:55.182 2 DEBUG oslo_concurrency.lockutils [None req-e0de11a8-b422-41dc-ba7b-3a73de16be82 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "8b0c810b-62f3-42b5-802e-cff3319a31be-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:43:55 compute-0 nova_compute[192810]: 2025-09-30 21:43:55.183 2 DEBUG oslo_concurrency.lockutils [None req-e0de11a8-b422-41dc-ba7b-3a73de16be82 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "8b0c810b-62f3-42b5-802e-cff3319a31be-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:43:55 compute-0 nova_compute[192810]: 2025-09-30 21:43:55.183 2 DEBUG oslo_concurrency.lockutils [None req-e0de11a8-b422-41dc-ba7b-3a73de16be82 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "8b0c810b-62f3-42b5-802e-cff3319a31be-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:43:55 compute-0 nova_compute[192810]: 2025-09-30 21:43:55.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:55 compute-0 nova_compute[192810]: 2025-09-30 21:43:55.371 2 INFO nova.compute.manager [None req-e0de11a8-b422-41dc-ba7b-3a73de16be82 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Terminating instance
Sep 30 21:43:55 compute-0 nova_compute[192810]: 2025-09-30 21:43:55.630 2 DEBUG nova.compute.manager [None req-e0de11a8-b422-41dc-ba7b-3a73de16be82 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:43:55 compute-0 kernel: tapceed5eec-1f (unregistering): left promiscuous mode
Sep 30 21:43:55 compute-0 NetworkManager[51733]: <info>  [1759268635.6572] device (tapceed5eec-1f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:43:55 compute-0 nova_compute[192810]: 2025-09-30 21:43:55.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:55 compute-0 ovn_controller[94912]: 2025-09-30T21:43:55Z|00532|binding|INFO|Releasing lport ceed5eec-1f8b-4201-9a2e-d23353d1ff54 from this chassis (sb_readonly=0)
Sep 30 21:43:55 compute-0 ovn_controller[94912]: 2025-09-30T21:43:55Z|00533|binding|INFO|Setting lport ceed5eec-1f8b-4201-9a2e-d23353d1ff54 down in Southbound
Sep 30 21:43:55 compute-0 ovn_controller[94912]: 2025-09-30T21:43:55Z|00534|binding|INFO|Removing iface tapceed5eec-1f ovn-installed in OVS
Sep 30 21:43:55 compute-0 nova_compute[192810]: 2025-09-30 21:43:55.669 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:55 compute-0 nova_compute[192810]: 2025-09-30 21:43:55.681 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:55 compute-0 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d0000008f.scope: Deactivated successfully.
Sep 30 21:43:55 compute-0 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d0000008f.scope: Consumed 15.193s CPU time.
Sep 30 21:43:55 compute-0 systemd-machined[152794]: Machine qemu-69-instance-0000008f terminated.
Sep 30 21:43:55 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:55.730 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:88:a9:60 10.100.0.11 2001:db8::f816:3eff:fe88:a960'], port_security=['fa:16:3e:88:a9:60 10.100.0.11 2001:db8::f816:3eff:fe88:a960'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28 2001:db8::f816:3eff:fe88:a960/64', 'neutron:device_id': '8b0c810b-62f3-42b5-802e-cff3319a31be', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8133a1ec-543c-40bd-a8db-7ecfb4b9f0ac', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=59e9be54-e45c-44fd-b8d8-090d788a8eee, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=ceed5eec-1f8b-4201-9a2e-d23353d1ff54) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:43:55 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:55.732 103867 INFO neutron.agent.ovn.metadata.agent [-] Port ceed5eec-1f8b-4201-9a2e-d23353d1ff54 in datapath 0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb unbound from our chassis
Sep 30 21:43:55 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:55.733 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb
Sep 30 21:43:55 compute-0 podman[242250]: 2025-09-30 21:43:55.735953554 +0000 UTC m=+0.057422980 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_id=multipathd)
Sep 30 21:43:55 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:55.746 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[86338d77-9984-465b-b90b-f1b69a0e0526]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:55 compute-0 podman[242253]: 2025-09-30 21:43:55.756306293 +0000 UTC m=+0.055173775 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 21:43:55 compute-0 podman[242252]: 2025-09-30 21:43:55.759740728 +0000 UTC m=+0.076376826 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=iscsid, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:43:55 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:55.772 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[30c42021-4fc0-4ddb-833d-8a1dcb69c80a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:55 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:55.774 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[e3d36788-d1e4-4d62-a6e5-3770305d9abe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:55 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:55.799 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[5c0bde5c-af87-4cea-a9f8-d60869f17244]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:55 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:55.814 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e2984479-3fad-41d5-95d8-d2910ac10875]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0670a52f-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ed:2b:af'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 36, 'tx_packets': 7, 'rx_bytes': 3080, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 36, 'tx_packets': 7, 'rx_bytes': 3080, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 157], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 524702, 'reachable_time': 22070, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 32, 'inoctets': 2464, 'indelivers': 7, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 32, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 2464, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 32, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 7, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242322, 'error': None, 'target': 'ovnmeta-0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:55 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:55.830 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[c3e51e6d-5890-4f0c-895b-bbd11cab7af4]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap0670a52f-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 524715, 'tstamp': 524715}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 242323, 'error': None, 'target': 'ovnmeta-0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap0670a52f-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 524718, 'tstamp': 524718}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 242323, 'error': None, 'target': 'ovnmeta-0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:55 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:55.831 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0670a52f-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:43:55 compute-0 nova_compute[192810]: 2025-09-30 21:43:55.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:55 compute-0 nova_compute[192810]: 2025-09-30 21:43:55.837 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:55 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:55.838 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0670a52f-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:43:55 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:55.838 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:43:55 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:55.839 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0670a52f-e0, col_values=(('external_ids', {'iface-id': '5f2fed01-3e8a-4c9d-b394-f15b7f60366f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:43:55 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:55.839 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:43:55 compute-0 nova_compute[192810]: 2025-09-30 21:43:55.894 2 INFO nova.virt.libvirt.driver [-] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Instance destroyed successfully.
Sep 30 21:43:55 compute-0 nova_compute[192810]: 2025-09-30 21:43:55.895 2 DEBUG nova.objects.instance [None req-e0de11a8-b422-41dc-ba7b-3a73de16be82 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lazy-loading 'resources' on Instance uuid 8b0c810b-62f3-42b5-802e-cff3319a31be obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:43:55 compute-0 nova_compute[192810]: 2025-09-30 21:43:55.990 2 DEBUG nova.virt.libvirt.vif [None req-e0de11a8-b422-41dc-ba7b-3a73de16be82 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:43:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1671875229',display_name='tempest-TestGettingAddress-server-1671875229',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1671875229',id=143,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEbsRSYvS8JJesfheyHbOoMyGCfPyJ5GpWwmJctOd2uZhE42aOz5OjKoIYzt5JXLihLuVhzcSbvOK7e8JNJviPXasiSc9g8x5CPrMMEKuf701EleKp6zJuNvQR4CPNWu/A==',key_name='tempest-TestGettingAddress-696430415',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:43:15Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-7sg94hy7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:43:15Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=8b0c810b-62f3-42b5-802e-cff3319a31be,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ceed5eec-1f8b-4201-9a2e-d23353d1ff54", "address": "fa:16:3e:88:a9:60", "network": {"id": "0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb", "bridge": "br-int", "label": "tempest-network-smoke--1321635209", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe88:a960", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapceed5eec-1f", "ovs_interfaceid": "ceed5eec-1f8b-4201-9a2e-d23353d1ff54", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:43:55 compute-0 nova_compute[192810]: 2025-09-30 21:43:55.991 2 DEBUG nova.network.os_vif_util [None req-e0de11a8-b422-41dc-ba7b-3a73de16be82 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "ceed5eec-1f8b-4201-9a2e-d23353d1ff54", "address": "fa:16:3e:88:a9:60", "network": {"id": "0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb", "bridge": "br-int", "label": "tempest-network-smoke--1321635209", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe88:a960", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapceed5eec-1f", "ovs_interfaceid": "ceed5eec-1f8b-4201-9a2e-d23353d1ff54", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:43:55 compute-0 nova_compute[192810]: 2025-09-30 21:43:55.991 2 DEBUG nova.network.os_vif_util [None req-e0de11a8-b422-41dc-ba7b-3a73de16be82 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:88:a9:60,bridge_name='br-int',has_traffic_filtering=True,id=ceed5eec-1f8b-4201-9a2e-d23353d1ff54,network=Network(0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapceed5eec-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:43:55 compute-0 nova_compute[192810]: 2025-09-30 21:43:55.992 2 DEBUG os_vif [None req-e0de11a8-b422-41dc-ba7b-3a73de16be82 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:88:a9:60,bridge_name='br-int',has_traffic_filtering=True,id=ceed5eec-1f8b-4201-9a2e-d23353d1ff54,network=Network(0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapceed5eec-1f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:43:55 compute-0 nova_compute[192810]: 2025-09-30 21:43:55.993 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:55 compute-0 nova_compute[192810]: 2025-09-30 21:43:55.994 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapceed5eec-1f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:43:55 compute-0 nova_compute[192810]: 2025-09-30 21:43:55.995 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:55 compute-0 nova_compute[192810]: 2025-09-30 21:43:55.997 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:55 compute-0 nova_compute[192810]: 2025-09-30 21:43:55.999 2 INFO os_vif [None req-e0de11a8-b422-41dc-ba7b-3a73de16be82 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:88:a9:60,bridge_name='br-int',has_traffic_filtering=True,id=ceed5eec-1f8b-4201-9a2e-d23353d1ff54,network=Network(0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapceed5eec-1f')
Sep 30 21:43:55 compute-0 nova_compute[192810]: 2025-09-30 21:43:55.999 2 INFO nova.virt.libvirt.driver [None req-e0de11a8-b422-41dc-ba7b-3a73de16be82 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Deleting instance files /var/lib/nova/instances/8b0c810b-62f3-42b5-802e-cff3319a31be_del
Sep 30 21:43:56 compute-0 nova_compute[192810]: 2025-09-30 21:43:56.000 2 INFO nova.virt.libvirt.driver [None req-e0de11a8-b422-41dc-ba7b-3a73de16be82 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Deletion of /var/lib/nova/instances/8b0c810b-62f3-42b5-802e-cff3319a31be_del complete
Sep 30 21:43:56 compute-0 nova_compute[192810]: 2025-09-30 21:43:56.089 2 DEBUG nova.compute.manager [req-08e1c8c1-de1a-4476-afa4-6c1b511b4943 req-81fc9b40-08a9-4e86-a7f5-6176f003db59 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Received event network-vif-unplugged-ceed5eec-1f8b-4201-9a2e-d23353d1ff54 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:43:56 compute-0 nova_compute[192810]: 2025-09-30 21:43:56.089 2 DEBUG oslo_concurrency.lockutils [req-08e1c8c1-de1a-4476-afa4-6c1b511b4943 req-81fc9b40-08a9-4e86-a7f5-6176f003db59 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "8b0c810b-62f3-42b5-802e-cff3319a31be-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:43:56 compute-0 nova_compute[192810]: 2025-09-30 21:43:56.089 2 DEBUG oslo_concurrency.lockutils [req-08e1c8c1-de1a-4476-afa4-6c1b511b4943 req-81fc9b40-08a9-4e86-a7f5-6176f003db59 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "8b0c810b-62f3-42b5-802e-cff3319a31be-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:43:56 compute-0 nova_compute[192810]: 2025-09-30 21:43:56.090 2 DEBUG oslo_concurrency.lockutils [req-08e1c8c1-de1a-4476-afa4-6c1b511b4943 req-81fc9b40-08a9-4e86-a7f5-6176f003db59 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "8b0c810b-62f3-42b5-802e-cff3319a31be-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:43:56 compute-0 nova_compute[192810]: 2025-09-30 21:43:56.090 2 DEBUG nova.compute.manager [req-08e1c8c1-de1a-4476-afa4-6c1b511b4943 req-81fc9b40-08a9-4e86-a7f5-6176f003db59 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] No waiting events found dispatching network-vif-unplugged-ceed5eec-1f8b-4201-9a2e-d23353d1ff54 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:43:56 compute-0 nova_compute[192810]: 2025-09-30 21:43:56.090 2 DEBUG nova.compute.manager [req-08e1c8c1-de1a-4476-afa4-6c1b511b4943 req-81fc9b40-08a9-4e86-a7f5-6176f003db59 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Received event network-vif-unplugged-ceed5eec-1f8b-4201-9a2e-d23353d1ff54 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:43:56 compute-0 nova_compute[192810]: 2025-09-30 21:43:56.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:43:56 compute-0 nova_compute[192810]: 2025-09-30 21:43:56.893 2 INFO nova.compute.manager [None req-e0de11a8-b422-41dc-ba7b-3a73de16be82 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Took 1.26 seconds to destroy the instance on the hypervisor.
Sep 30 21:43:56 compute-0 nova_compute[192810]: 2025-09-30 21:43:56.894 2 DEBUG oslo.service.loopingcall [None req-e0de11a8-b422-41dc-ba7b-3a73de16be82 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:43:56 compute-0 nova_compute[192810]: 2025-09-30 21:43:56.894 2 DEBUG nova.compute.manager [-] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:43:56 compute-0 nova_compute[192810]: 2025-09-30 21:43:56.894 2 DEBUG nova.network.neutron [-] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:43:57 compute-0 nova_compute[192810]: 2025-09-30 21:43:57.047 2 DEBUG nova.network.neutron [req-3b356259-1331-4071-ade1-0ef696d50eb8 req-935c5f07-bb46-420d-b04f-89e40a7d7bac dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Updated VIF entry in instance network info cache for port ceed5eec-1f8b-4201-9a2e-d23353d1ff54. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:43:57 compute-0 nova_compute[192810]: 2025-09-30 21:43:57.048 2 DEBUG nova.network.neutron [req-3b356259-1331-4071-ade1-0ef696d50eb8 req-935c5f07-bb46-420d-b04f-89e40a7d7bac dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Updating instance_info_cache with network_info: [{"id": "ceed5eec-1f8b-4201-9a2e-d23353d1ff54", "address": "fa:16:3e:88:a9:60", "network": {"id": "0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb", "bridge": "br-int", "label": "tempest-network-smoke--1321635209", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe88:a960", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapceed5eec-1f", "ovs_interfaceid": "ceed5eec-1f8b-4201-9a2e-d23353d1ff54", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:43:57 compute-0 nova_compute[192810]: 2025-09-30 21:43:57.121 2 DEBUG oslo_concurrency.lockutils [req-3b356259-1331-4071-ade1-0ef696d50eb8 req-935c5f07-bb46-420d-b04f-89e40a7d7bac dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-8b0c810b-62f3-42b5-802e-cff3319a31be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:43:57 compute-0 nova_compute[192810]: 2025-09-30 21:43:57.786 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:43:57 compute-0 nova_compute[192810]: 2025-09-30 21:43:57.787 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:43:57 compute-0 nova_compute[192810]: 2025-09-30 21:43:57.877 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:57 compute-0 nova_compute[192810]: 2025-09-30 21:43:57.903 2 DEBUG nova.network.neutron [-] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:43:57 compute-0 nova_compute[192810]: 2025-09-30 21:43:57.945 2 DEBUG nova.compute.manager [req-8b29cd05-e4ab-4243-a840-a5fcdba097de req-27e207fb-a120-45b6-a17d-7a76f5cccc31 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Received event network-vif-deleted-ceed5eec-1f8b-4201-9a2e-d23353d1ff54 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:43:57 compute-0 nova_compute[192810]: 2025-09-30 21:43:57.946 2 INFO nova.compute.manager [req-8b29cd05-e4ab-4243-a840-a5fcdba097de req-27e207fb-a120-45b6-a17d-7a76f5cccc31 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Neutron deleted interface ceed5eec-1f8b-4201-9a2e-d23353d1ff54; detaching it from the instance and deleting it from the info cache
Sep 30 21:43:57 compute-0 nova_compute[192810]: 2025-09-30 21:43:57.946 2 DEBUG nova.network.neutron [req-8b29cd05-e4ab-4243-a840-a5fcdba097de req-27e207fb-a120-45b6-a17d-7a76f5cccc31 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:43:57 compute-0 nova_compute[192810]: 2025-09-30 21:43:57.970 2 INFO nova.compute.manager [-] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Took 1.08 seconds to deallocate network for instance.
Sep 30 21:43:57 compute-0 nova_compute[192810]: 2025-09-30 21:43:57.973 2 DEBUG nova.compute.manager [req-8b29cd05-e4ab-4243-a840-a5fcdba097de req-27e207fb-a120-45b6-a17d-7a76f5cccc31 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Detach interface failed, port_id=ceed5eec-1f8b-4201-9a2e-d23353d1ff54, reason: Instance 8b0c810b-62f3-42b5-802e-cff3319a31be could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Sep 30 21:43:58 compute-0 nova_compute[192810]: 2025-09-30 21:43:58.196 2 DEBUG nova.compute.manager [req-d1a93781-a5d1-49e6-9649-55eafb94e319 req-ea51b9cb-420f-4ca1-80df-870d893e53de dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Received event network-vif-plugged-ceed5eec-1f8b-4201-9a2e-d23353d1ff54 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:43:58 compute-0 nova_compute[192810]: 2025-09-30 21:43:58.198 2 DEBUG oslo_concurrency.lockutils [req-d1a93781-a5d1-49e6-9649-55eafb94e319 req-ea51b9cb-420f-4ca1-80df-870d893e53de dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "8b0c810b-62f3-42b5-802e-cff3319a31be-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:43:58 compute-0 nova_compute[192810]: 2025-09-30 21:43:58.198 2 DEBUG oslo_concurrency.lockutils [req-d1a93781-a5d1-49e6-9649-55eafb94e319 req-ea51b9cb-420f-4ca1-80df-870d893e53de dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "8b0c810b-62f3-42b5-802e-cff3319a31be-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:43:58 compute-0 nova_compute[192810]: 2025-09-30 21:43:58.199 2 DEBUG oslo_concurrency.lockutils [req-d1a93781-a5d1-49e6-9649-55eafb94e319 req-ea51b9cb-420f-4ca1-80df-870d893e53de dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "8b0c810b-62f3-42b5-802e-cff3319a31be-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:43:58 compute-0 nova_compute[192810]: 2025-09-30 21:43:58.199 2 DEBUG nova.compute.manager [req-d1a93781-a5d1-49e6-9649-55eafb94e319 req-ea51b9cb-420f-4ca1-80df-870d893e53de dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] No waiting events found dispatching network-vif-plugged-ceed5eec-1f8b-4201-9a2e-d23353d1ff54 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:43:58 compute-0 nova_compute[192810]: 2025-09-30 21:43:58.199 2 WARNING nova.compute.manager [req-d1a93781-a5d1-49e6-9649-55eafb94e319 req-ea51b9cb-420f-4ca1-80df-870d893e53de dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Received unexpected event network-vif-plugged-ceed5eec-1f8b-4201-9a2e-d23353d1ff54 for instance with vm_state deleted and task_state None.
Sep 30 21:43:58 compute-0 nova_compute[192810]: 2025-09-30 21:43:58.230 2 DEBUG oslo_concurrency.lockutils [None req-e0de11a8-b422-41dc-ba7b-3a73de16be82 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:43:58 compute-0 nova_compute[192810]: 2025-09-30 21:43:58.231 2 DEBUG oslo_concurrency.lockutils [None req-e0de11a8-b422-41dc-ba7b-3a73de16be82 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:43:58 compute-0 nova_compute[192810]: 2025-09-30 21:43:58.308 2 DEBUG nova.compute.provider_tree [None req-e0de11a8-b422-41dc-ba7b-3a73de16be82 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:43:58 compute-0 nova_compute[192810]: 2025-09-30 21:43:58.322 2 DEBUG nova.scheduler.client.report [None req-e0de11a8-b422-41dc-ba7b-3a73de16be82 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:43:58 compute-0 nova_compute[192810]: 2025-09-30 21:43:58.341 2 DEBUG oslo_concurrency.lockutils [None req-e0de11a8-b422-41dc-ba7b-3a73de16be82 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:43:58 compute-0 nova_compute[192810]: 2025-09-30 21:43:58.366 2 INFO nova.scheduler.client.report [None req-e0de11a8-b422-41dc-ba7b-3a73de16be82 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Deleted allocations for instance 8b0c810b-62f3-42b5-802e-cff3319a31be
Sep 30 21:43:58 compute-0 nova_compute[192810]: 2025-09-30 21:43:58.455 2 DEBUG oslo_concurrency.lockutils [None req-e0de11a8-b422-41dc-ba7b-3a73de16be82 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "8b0c810b-62f3-42b5-802e-cff3319a31be" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.273s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:43:58 compute-0 nova_compute[192810]: 2025-09-30 21:43:58.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:43:59 compute-0 nova_compute[192810]: 2025-09-30 21:43:59.558 2 DEBUG oslo_concurrency.lockutils [None req-e031119a-0f7c-42ca-ba69-6f82a6208e1b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "3fab2833-fa74-4eb1-bef8-aa51780aa0f4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:43:59 compute-0 nova_compute[192810]: 2025-09-30 21:43:59.558 2 DEBUG oslo_concurrency.lockutils [None req-e031119a-0f7c-42ca-ba69-6f82a6208e1b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "3fab2833-fa74-4eb1-bef8-aa51780aa0f4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:43:59 compute-0 nova_compute[192810]: 2025-09-30 21:43:59.558 2 DEBUG oslo_concurrency.lockutils [None req-e031119a-0f7c-42ca-ba69-6f82a6208e1b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "3fab2833-fa74-4eb1-bef8-aa51780aa0f4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:43:59 compute-0 nova_compute[192810]: 2025-09-30 21:43:59.558 2 DEBUG oslo_concurrency.lockutils [None req-e031119a-0f7c-42ca-ba69-6f82a6208e1b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "3fab2833-fa74-4eb1-bef8-aa51780aa0f4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:43:59 compute-0 nova_compute[192810]: 2025-09-30 21:43:59.559 2 DEBUG oslo_concurrency.lockutils [None req-e031119a-0f7c-42ca-ba69-6f82a6208e1b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "3fab2833-fa74-4eb1-bef8-aa51780aa0f4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:43:59 compute-0 nova_compute[192810]: 2025-09-30 21:43:59.568 2 INFO nova.compute.manager [None req-e031119a-0f7c-42ca-ba69-6f82a6208e1b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Terminating instance
Sep 30 21:43:59 compute-0 nova_compute[192810]: 2025-09-30 21:43:59.579 2 DEBUG nova.compute.manager [None req-e031119a-0f7c-42ca-ba69-6f82a6208e1b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:43:59 compute-0 kernel: tap8e89083c-85 (unregistering): left promiscuous mode
Sep 30 21:43:59 compute-0 NetworkManager[51733]: <info>  [1759268639.6079] device (tap8e89083c-85): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:43:59 compute-0 nova_compute[192810]: 2025-09-30 21:43:59.612 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:59 compute-0 ovn_controller[94912]: 2025-09-30T21:43:59Z|00535|binding|INFO|Releasing lport 8e89083c-856f-40a4-b47f-444adcedd301 from this chassis (sb_readonly=0)
Sep 30 21:43:59 compute-0 ovn_controller[94912]: 2025-09-30T21:43:59Z|00536|binding|INFO|Setting lport 8e89083c-856f-40a4-b47f-444adcedd301 down in Southbound
Sep 30 21:43:59 compute-0 ovn_controller[94912]: 2025-09-30T21:43:59Z|00537|binding|INFO|Removing iface tap8e89083c-85 ovn-installed in OVS
Sep 30 21:43:59 compute-0 nova_compute[192810]: 2025-09-30 21:43:59.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:59 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:59.624 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:33:32:85 10.100.0.13 2001:db8::f816:3eff:fe33:3285'], port_security=['fa:16:3e:33:32:85 10.100.0.13 2001:db8::f816:3eff:fe33:3285'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28 2001:db8::f816:3eff:fe33:3285/64', 'neutron:device_id': '3fab2833-fa74-4eb1-bef8-aa51780aa0f4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8133a1ec-543c-40bd-a8db-7ecfb4b9f0ac', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=59e9be54-e45c-44fd-b8d8-090d788a8eee, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=8e89083c-856f-40a4-b47f-444adcedd301) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:43:59 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:59.625 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 8e89083c-856f-40a4-b47f-444adcedd301 in datapath 0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb unbound from our chassis
Sep 30 21:43:59 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:59.627 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:43:59 compute-0 nova_compute[192810]: 2025-09-30 21:43:59.626 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:59 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:59.628 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[cd4b6ea6-09d2-4cdb-99cf-d1973060c18e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:43:59 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:43:59.629 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb namespace which is not needed anymore
Sep 30 21:43:59 compute-0 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d0000008b.scope: Deactivated successfully.
Sep 30 21:43:59 compute-0 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d0000008b.scope: Consumed 15.864s CPU time.
Sep 30 21:43:59 compute-0 systemd-machined[152794]: Machine qemu-68-instance-0000008b terminated.
Sep 30 21:43:59 compute-0 nova_compute[192810]: 2025-09-30 21:43:59.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:43:59 compute-0 nova_compute[192810]: 2025-09-30 21:43:59.797 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:59 compute-0 nova_compute[192810]: 2025-09-30 21:43:59.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:59 compute-0 nova_compute[192810]: 2025-09-30 21:43:59.813 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:43:59 compute-0 nova_compute[192810]: 2025-09-30 21:43:59.814 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:43:59 compute-0 nova_compute[192810]: 2025-09-30 21:43:59.814 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:43:59 compute-0 nova_compute[192810]: 2025-09-30 21:43:59.814 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:43:59 compute-0 nova_compute[192810]: 2025-09-30 21:43:59.837 2 INFO nova.virt.libvirt.driver [-] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Instance destroyed successfully.
Sep 30 21:43:59 compute-0 nova_compute[192810]: 2025-09-30 21:43:59.838 2 DEBUG nova.objects.instance [None req-e031119a-0f7c-42ca-ba69-6f82a6208e1b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lazy-loading 'resources' on Instance uuid 3fab2833-fa74-4eb1-bef8-aa51780aa0f4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:43:59 compute-0 nova_compute[192810]: 2025-09-30 21:43:59.853 2 DEBUG nova.virt.libvirt.vif [None req-e031119a-0f7c-42ca-ba69-6f82a6208e1b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:42:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-570822479',display_name='tempest-TestGettingAddress-server-570822479',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-570822479',id=139,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEbsRSYvS8JJesfheyHbOoMyGCfPyJ5GpWwmJctOd2uZhE42aOz5OjKoIYzt5JXLihLuVhzcSbvOK7e8JNJviPXasiSc9g8x5CPrMMEKuf701EleKp6zJuNvQR4CPNWu/A==',key_name='tempest-TestGettingAddress-696430415',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:42:32Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-hlf9dxg4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:42:32Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=3fab2833-fa74-4eb1-bef8-aa51780aa0f4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8e89083c-856f-40a4-b47f-444adcedd301", "address": "fa:16:3e:33:32:85", "network": {"id": "0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb", "bridge": "br-int", "label": "tempest-network-smoke--1321635209", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe33:3285", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.227", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8e89083c-85", "ovs_interfaceid": "8e89083c-856f-40a4-b47f-444adcedd301", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:43:59 compute-0 nova_compute[192810]: 2025-09-30 21:43:59.854 2 DEBUG nova.network.os_vif_util [None req-e031119a-0f7c-42ca-ba69-6f82a6208e1b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "8e89083c-856f-40a4-b47f-444adcedd301", "address": "fa:16:3e:33:32:85", "network": {"id": "0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb", "bridge": "br-int", "label": "tempest-network-smoke--1321635209", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe33:3285", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.227", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8e89083c-85", "ovs_interfaceid": "8e89083c-856f-40a4-b47f-444adcedd301", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:43:59 compute-0 nova_compute[192810]: 2025-09-30 21:43:59.854 2 DEBUG nova.network.os_vif_util [None req-e031119a-0f7c-42ca-ba69-6f82a6208e1b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:33:32:85,bridge_name='br-int',has_traffic_filtering=True,id=8e89083c-856f-40a4-b47f-444adcedd301,network=Network(0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8e89083c-85') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:43:59 compute-0 nova_compute[192810]: 2025-09-30 21:43:59.855 2 DEBUG os_vif [None req-e031119a-0f7c-42ca-ba69-6f82a6208e1b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:33:32:85,bridge_name='br-int',has_traffic_filtering=True,id=8e89083c-856f-40a4-b47f-444adcedd301,network=Network(0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8e89083c-85') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:43:59 compute-0 nova_compute[192810]: 2025-09-30 21:43:59.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:59 compute-0 nova_compute[192810]: 2025-09-30 21:43:59.859 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8e89083c-85, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:43:59 compute-0 nova_compute[192810]: 2025-09-30 21:43:59.860 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:43:59 compute-0 nova_compute[192810]: 2025-09-30 21:43:59.862 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:43:59 compute-0 nova_compute[192810]: 2025-09-30 21:43:59.865 2 INFO os_vif [None req-e031119a-0f7c-42ca-ba69-6f82a6208e1b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:33:32:85,bridge_name='br-int',has_traffic_filtering=True,id=8e89083c-856f-40a4-b47f-444adcedd301,network=Network(0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8e89083c-85')
Sep 30 21:43:59 compute-0 nova_compute[192810]: 2025-09-30 21:43:59.866 2 INFO nova.virt.libvirt.driver [None req-e031119a-0f7c-42ca-ba69-6f82a6208e1b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Deleting instance files /var/lib/nova/instances/3fab2833-fa74-4eb1-bef8-aa51780aa0f4_del
Sep 30 21:43:59 compute-0 nova_compute[192810]: 2025-09-30 21:43:59.866 2 INFO nova.virt.libvirt.driver [None req-e031119a-0f7c-42ca-ba69-6f82a6208e1b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Deletion of /var/lib/nova/instances/3fab2833-fa74-4eb1-bef8-aa51780aa0f4_del complete
Sep 30 21:43:59 compute-0 nova_compute[192810]: 2025-09-30 21:43:59.913 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/14871acd-831a-4e81-b1b0-8d1d415e1e62/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:43:59 compute-0 nova_compute[192810]: 2025-09-30 21:43:59.940 2 INFO nova.compute.manager [None req-e031119a-0f7c-42ca-ba69-6f82a6208e1b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Took 0.36 seconds to destroy the instance on the hypervisor.
Sep 30 21:43:59 compute-0 nova_compute[192810]: 2025-09-30 21:43:59.941 2 DEBUG oslo.service.loopingcall [None req-e031119a-0f7c-42ca-ba69-6f82a6208e1b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:43:59 compute-0 nova_compute[192810]: 2025-09-30 21:43:59.941 2 DEBUG nova.compute.manager [-] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:43:59 compute-0 nova_compute[192810]: 2025-09-30 21:43:59.942 2 DEBUG nova.network.neutron [-] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:44:00 compute-0 nova_compute[192810]: 2025-09-30 21:44:00.027 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/14871acd-831a-4e81-b1b0-8d1d415e1e62/disk --force-share --output=json" returned: 0 in 0.114s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:44:00 compute-0 nova_compute[192810]: 2025-09-30 21:44:00.028 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/14871acd-831a-4e81-b1b0-8d1d415e1e62/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:44:00 compute-0 nova_compute[192810]: 2025-09-30 21:44:00.054 2 DEBUG nova.compute.manager [req-5b404d48-ae62-4d05-bfe6-3c2d3629b19f req-028b60dd-cb2d-496a-8f4b-3f822beabee2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Received event network-vif-unplugged-8e89083c-856f-40a4-b47f-444adcedd301 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:44:00 compute-0 nova_compute[192810]: 2025-09-30 21:44:00.055 2 DEBUG oslo_concurrency.lockutils [req-5b404d48-ae62-4d05-bfe6-3c2d3629b19f req-028b60dd-cb2d-496a-8f4b-3f822beabee2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3fab2833-fa74-4eb1-bef8-aa51780aa0f4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:44:00 compute-0 nova_compute[192810]: 2025-09-30 21:44:00.055 2 DEBUG oslo_concurrency.lockutils [req-5b404d48-ae62-4d05-bfe6-3c2d3629b19f req-028b60dd-cb2d-496a-8f4b-3f822beabee2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3fab2833-fa74-4eb1-bef8-aa51780aa0f4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:44:00 compute-0 nova_compute[192810]: 2025-09-30 21:44:00.056 2 DEBUG oslo_concurrency.lockutils [req-5b404d48-ae62-4d05-bfe6-3c2d3629b19f req-028b60dd-cb2d-496a-8f4b-3f822beabee2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3fab2833-fa74-4eb1-bef8-aa51780aa0f4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:44:00 compute-0 nova_compute[192810]: 2025-09-30 21:44:00.056 2 DEBUG nova.compute.manager [req-5b404d48-ae62-4d05-bfe6-3c2d3629b19f req-028b60dd-cb2d-496a-8f4b-3f822beabee2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] No waiting events found dispatching network-vif-unplugged-8e89083c-856f-40a4-b47f-444adcedd301 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:44:00 compute-0 nova_compute[192810]: 2025-09-30 21:44:00.056 2 DEBUG nova.compute.manager [req-5b404d48-ae62-4d05-bfe6-3c2d3629b19f req-028b60dd-cb2d-496a-8f4b-3f822beabee2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Received event network-vif-unplugged-8e89083c-856f-40a4-b47f-444adcedd301 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:44:00 compute-0 nova_compute[192810]: 2025-09-30 21:44:00.087 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/14871acd-831a-4e81-b1b0-8d1d415e1e62/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:44:00 compute-0 nova_compute[192810]: 2025-09-30 21:44:00.090 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Error from libvirt while getting description of instance-0000008b: [Error Code 42] Domain not found: no domain with matching uuid '3fab2833-fa74-4eb1-bef8-aa51780aa0f4' (instance-0000008b): libvirt.libvirtError: Domain not found: no domain with matching uuid '3fab2833-fa74-4eb1-bef8-aa51780aa0f4' (instance-0000008b)
Sep 30 21:44:00 compute-0 nova_compute[192810]: 2025-09-30 21:44:00.252 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:44:00 compute-0 nova_compute[192810]: 2025-09-30 21:44:00.253 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5506MB free_disk=73.1921501159668GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:44:00 compute-0 nova_compute[192810]: 2025-09-30 21:44:00.254 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:44:00 compute-0 nova_compute[192810]: 2025-09-30 21:44:00.254 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:44:00 compute-0 neutron-haproxy-ovnmeta-0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb[241530]: [NOTICE]   (241534) : haproxy version is 2.8.14-c23fe91
Sep 30 21:44:00 compute-0 neutron-haproxy-ovnmeta-0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb[241530]: [NOTICE]   (241534) : path to executable is /usr/sbin/haproxy
Sep 30 21:44:00 compute-0 neutron-haproxy-ovnmeta-0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb[241530]: [WARNING]  (241534) : Exiting Master process...
Sep 30 21:44:00 compute-0 neutron-haproxy-ovnmeta-0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb[241530]: [ALERT]    (241534) : Current worker (241536) exited with code 143 (Terminated)
Sep 30 21:44:00 compute-0 neutron-haproxy-ovnmeta-0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb[241530]: [WARNING]  (241534) : All workers exited. Exiting... (0)
Sep 30 21:44:00 compute-0 systemd[1]: libpod-cc8fe1f958a6714e6a8558d21fdf59faadaa661fae1d47f504392467ba5f72de.scope: Deactivated successfully.
Sep 30 21:44:00 compute-0 nova_compute[192810]: 2025-09-30 21:44:00.340 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Instance 3fab2833-fa74-4eb1-bef8-aa51780aa0f4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:44:00 compute-0 nova_compute[192810]: 2025-09-30 21:44:00.341 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Instance 14871acd-831a-4e81-b1b0-8d1d415e1e62 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:44:00 compute-0 nova_compute[192810]: 2025-09-30 21:44:00.341 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:44:00 compute-0 nova_compute[192810]: 2025-09-30 21:44:00.342 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:44:00 compute-0 podman[242366]: 2025-09-30 21:44:00.342998845 +0000 UTC m=+0.630996225 container died cc8fe1f958a6714e6a8558d21fdf59faadaa661fae1d47f504392467ba5f72de (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:44:00 compute-0 nova_compute[192810]: 2025-09-30 21:44:00.399 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:44:00 compute-0 nova_compute[192810]: 2025-09-30 21:44:00.412 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:44:00 compute-0 nova_compute[192810]: 2025-09-30 21:44:00.436 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:44:00 compute-0 nova_compute[192810]: 2025-09-30 21:44:00.436 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.183s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:44:00 compute-0 nova_compute[192810]: 2025-09-30 21:44:00.998 2 DEBUG nova.network.neutron [-] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:44:01 compute-0 nova_compute[192810]: 2025-09-30 21:44:01.003 2 DEBUG nova.compute.manager [req-1ea57bc8-88ab-4fbe-b56a-3ff9ea238e5b req-3084dedb-09f6-4646-9c45-3bc6e87c4b11 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Received event network-changed-8e89083c-856f-40a4-b47f-444adcedd301 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:44:01 compute-0 nova_compute[192810]: 2025-09-30 21:44:01.004 2 DEBUG nova.compute.manager [req-1ea57bc8-88ab-4fbe-b56a-3ff9ea238e5b req-3084dedb-09f6-4646-9c45-3bc6e87c4b11 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Refreshing instance network info cache due to event network-changed-8e89083c-856f-40a4-b47f-444adcedd301. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:44:01 compute-0 nova_compute[192810]: 2025-09-30 21:44:01.004 2 DEBUG oslo_concurrency.lockutils [req-1ea57bc8-88ab-4fbe-b56a-3ff9ea238e5b req-3084dedb-09f6-4646-9c45-3bc6e87c4b11 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-3fab2833-fa74-4eb1-bef8-aa51780aa0f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:44:01 compute-0 nova_compute[192810]: 2025-09-30 21:44:01.004 2 DEBUG oslo_concurrency.lockutils [req-1ea57bc8-88ab-4fbe-b56a-3ff9ea238e5b req-3084dedb-09f6-4646-9c45-3bc6e87c4b11 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-3fab2833-fa74-4eb1-bef8-aa51780aa0f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:44:01 compute-0 nova_compute[192810]: 2025-09-30 21:44:01.005 2 DEBUG nova.network.neutron [req-1ea57bc8-88ab-4fbe-b56a-3ff9ea238e5b req-3084dedb-09f6-4646-9c45-3bc6e87c4b11 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Refreshing network info cache for port 8e89083c-856f-40a4-b47f-444adcedd301 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:44:01 compute-0 nova_compute[192810]: 2025-09-30 21:44:01.029 2 INFO nova.compute.manager [-] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Took 1.09 seconds to deallocate network for instance.
Sep 30 21:44:01 compute-0 nova_compute[192810]: 2025-09-30 21:44:01.099 2 DEBUG oslo_concurrency.lockutils [None req-e031119a-0f7c-42ca-ba69-6f82a6208e1b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:44:01 compute-0 nova_compute[192810]: 2025-09-30 21:44:01.100 2 DEBUG oslo_concurrency.lockutils [None req-e031119a-0f7c-42ca-ba69-6f82a6208e1b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:44:01 compute-0 systemd[1]: var-lib-containers-storage-overlay-7d711ab402cdd4de752ae9ec0c1a0970e6d1421acffa82cd737adbcdab9abbdb-merged.mount: Deactivated successfully.
Sep 30 21:44:01 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cc8fe1f958a6714e6a8558d21fdf59faadaa661fae1d47f504392467ba5f72de-userdata-shm.mount: Deactivated successfully.
Sep 30 21:44:01 compute-0 nova_compute[192810]: 2025-09-30 21:44:01.181 2 DEBUG nova.compute.provider_tree [None req-e031119a-0f7c-42ca-ba69-6f82a6208e1b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:44:01 compute-0 nova_compute[192810]: 2025-09-30 21:44:01.195 2 DEBUG nova.scheduler.client.report [None req-e031119a-0f7c-42ca-ba69-6f82a6208e1b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:44:01 compute-0 nova_compute[192810]: 2025-09-30 21:44:01.231 2 DEBUG oslo_concurrency.lockutils [None req-e031119a-0f7c-42ca-ba69-6f82a6208e1b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:44:01 compute-0 nova_compute[192810]: 2025-09-30 21:44:01.256 2 INFO nova.scheduler.client.report [None req-e031119a-0f7c-42ca-ba69-6f82a6208e1b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Deleted allocations for instance 3fab2833-fa74-4eb1-bef8-aa51780aa0f4
Sep 30 21:44:01 compute-0 nova_compute[192810]: 2025-09-30 21:44:01.268 2 INFO nova.network.neutron [req-1ea57bc8-88ab-4fbe-b56a-3ff9ea238e5b req-3084dedb-09f6-4646-9c45-3bc6e87c4b11 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Port 8e89083c-856f-40a4-b47f-444adcedd301 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Sep 30 21:44:01 compute-0 nova_compute[192810]: 2025-09-30 21:44:01.268 2 DEBUG nova.network.neutron [req-1ea57bc8-88ab-4fbe-b56a-3ff9ea238e5b req-3084dedb-09f6-4646-9c45-3bc6e87c4b11 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:44:01 compute-0 nova_compute[192810]: 2025-09-30 21:44:01.304 2 DEBUG oslo_concurrency.lockutils [req-1ea57bc8-88ab-4fbe-b56a-3ff9ea238e5b req-3084dedb-09f6-4646-9c45-3bc6e87c4b11 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-3fab2833-fa74-4eb1-bef8-aa51780aa0f4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:44:01 compute-0 nova_compute[192810]: 2025-09-30 21:44:01.343 2 DEBUG oslo_concurrency.lockutils [None req-e031119a-0f7c-42ca-ba69-6f82a6208e1b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "3fab2833-fa74-4eb1-bef8-aa51780aa0f4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.785s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:44:01 compute-0 nova_compute[192810]: 2025-09-30 21:44:01.432 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:44:01 compute-0 nova_compute[192810]: 2025-09-30 21:44:01.433 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:44:01 compute-0 nova_compute[192810]: 2025-09-30 21:44:01.451 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:44:01 compute-0 podman[242366]: 2025-09-30 21:44:01.661947183 +0000 UTC m=+1.949944543 container cleanup cc8fe1f958a6714e6a8558d21fdf59faadaa661fae1d47f504392467ba5f72de (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2)
Sep 30 21:44:01 compute-0 systemd[1]: libpod-conmon-cc8fe1f958a6714e6a8558d21fdf59faadaa661fae1d47f504392467ba5f72de.scope: Deactivated successfully.
Sep 30 21:44:01 compute-0 nova_compute[192810]: 2025-09-30 21:44:01.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:44:01 compute-0 nova_compute[192810]: 2025-09-30 21:44:01.789 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:44:01 compute-0 nova_compute[192810]: 2025-09-30 21:44:01.789 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:44:01 compute-0 nova_compute[192810]: 2025-09-30 21:44:01.975 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "refresh_cache-14871acd-831a-4e81-b1b0-8d1d415e1e62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:44:01 compute-0 nova_compute[192810]: 2025-09-30 21:44:01.976 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquired lock "refresh_cache-14871acd-831a-4e81-b1b0-8d1d415e1e62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:44:01 compute-0 nova_compute[192810]: 2025-09-30 21:44:01.976 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Sep 30 21:44:01 compute-0 nova_compute[192810]: 2025-09-30 21:44:01.976 2 DEBUG nova.objects.instance [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lazy-loading 'info_cache' on Instance uuid 14871acd-831a-4e81-b1b0-8d1d415e1e62 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:44:02 compute-0 podman[242420]: 2025-09-30 21:44:02.00329398 +0000 UTC m=+0.318614470 container remove cc8fe1f958a6714e6a8558d21fdf59faadaa661fae1d47f504392467ba5f72de (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3)
Sep 30 21:44:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:02.008 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[9cc16047-3663-4b40-a852-43ce80fb5560]: (4, ('Tue Sep 30 09:43:59 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb (cc8fe1f958a6714e6a8558d21fdf59faadaa661fae1d47f504392467ba5f72de)\ncc8fe1f958a6714e6a8558d21fdf59faadaa661fae1d47f504392467ba5f72de\nTue Sep 30 09:44:01 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb (cc8fe1f958a6714e6a8558d21fdf59faadaa661fae1d47f504392467ba5f72de)\ncc8fe1f958a6714e6a8558d21fdf59faadaa661fae1d47f504392467ba5f72de\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:02.010 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[12f536a0-4e6b-4f12-a3d0-3c90f32d69a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:02.011 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0670a52f-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:44:02 compute-0 kernel: tap0670a52f-e0: left promiscuous mode
Sep 30 21:44:02 compute-0 nova_compute[192810]: 2025-09-30 21:44:02.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:02 compute-0 nova_compute[192810]: 2025-09-30 21:44:02.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:02.026 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[42b9a8de-0c24-4d95-bd18-e4b86da949f2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:02.054 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[9d829a1a-93ad-4488-9dce-84062a2090ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:02.055 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a8841b00-06cc-45e2-9caa-ad0bddbe333b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:02.071 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[08aa1444-6358-45cd-9996-aa8f633adf32]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 524692, 'reachable_time': 34554, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242436, 'error': None, 'target': 'ovnmeta-0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:02.073 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0670a52f-ebb8-4bcb-bd32-ef8e3f891bdb deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:44:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:02.073 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[6ee1ae1c-ebaf-4f66-a381-ccfd8d482cfe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:02 compute-0 systemd[1]: run-netns-ovnmeta\x2d0670a52f\x2debb8\x2d4bcb\x2dbd32\x2def8e3f891bdb.mount: Deactivated successfully.
Sep 30 21:44:02 compute-0 nova_compute[192810]: 2025-09-30 21:44:02.218 2 DEBUG nova.compute.manager [req-0195cdad-5063-4dcc-87ef-f9c943f795f5 req-3d8dde29-fea8-4dde-94ee-b191fea4d7bf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Received event network-vif-plugged-8e89083c-856f-40a4-b47f-444adcedd301 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:44:02 compute-0 nova_compute[192810]: 2025-09-30 21:44:02.219 2 DEBUG oslo_concurrency.lockutils [req-0195cdad-5063-4dcc-87ef-f9c943f795f5 req-3d8dde29-fea8-4dde-94ee-b191fea4d7bf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "3fab2833-fa74-4eb1-bef8-aa51780aa0f4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:44:02 compute-0 nova_compute[192810]: 2025-09-30 21:44:02.220 2 DEBUG oslo_concurrency.lockutils [req-0195cdad-5063-4dcc-87ef-f9c943f795f5 req-3d8dde29-fea8-4dde-94ee-b191fea4d7bf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3fab2833-fa74-4eb1-bef8-aa51780aa0f4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:44:02 compute-0 nova_compute[192810]: 2025-09-30 21:44:02.220 2 DEBUG oslo_concurrency.lockutils [req-0195cdad-5063-4dcc-87ef-f9c943f795f5 req-3d8dde29-fea8-4dde-94ee-b191fea4d7bf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "3fab2833-fa74-4eb1-bef8-aa51780aa0f4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:44:02 compute-0 nova_compute[192810]: 2025-09-30 21:44:02.221 2 DEBUG nova.compute.manager [req-0195cdad-5063-4dcc-87ef-f9c943f795f5 req-3d8dde29-fea8-4dde-94ee-b191fea4d7bf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] No waiting events found dispatching network-vif-plugged-8e89083c-856f-40a4-b47f-444adcedd301 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:44:02 compute-0 nova_compute[192810]: 2025-09-30 21:44:02.221 2 WARNING nova.compute.manager [req-0195cdad-5063-4dcc-87ef-f9c943f795f5 req-3d8dde29-fea8-4dde-94ee-b191fea4d7bf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Received unexpected event network-vif-plugged-8e89083c-856f-40a4-b47f-444adcedd301 for instance with vm_state deleted and task_state None.
Sep 30 21:44:02 compute-0 nova_compute[192810]: 2025-09-30 21:44:02.222 2 DEBUG nova.compute.manager [req-0195cdad-5063-4dcc-87ef-f9c943f795f5 req-3d8dde29-fea8-4dde-94ee-b191fea4d7bf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Received event network-vif-deleted-8e89083c-856f-40a4-b47f-444adcedd301 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:44:02 compute-0 nova_compute[192810]: 2025-09-30 21:44:02.480 2 DEBUG nova.compute.manager [req-e61208c9-c869-4d71-9a2d-acb630328896 req-22c53a58-9258-47a4-ab27-804ad875b82a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Received event network-changed-f310064e-5720-442a-ac22-71c932096d05 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:44:02 compute-0 nova_compute[192810]: 2025-09-30 21:44:02.480 2 DEBUG nova.compute.manager [req-e61208c9-c869-4d71-9a2d-acb630328896 req-22c53a58-9258-47a4-ab27-804ad875b82a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Refreshing instance network info cache due to event network-changed-f310064e-5720-442a-ac22-71c932096d05. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:44:02 compute-0 nova_compute[192810]: 2025-09-30 21:44:02.481 2 DEBUG oslo_concurrency.lockutils [req-e61208c9-c869-4d71-9a2d-acb630328896 req-22c53a58-9258-47a4-ab27-804ad875b82a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-14871acd-831a-4e81-b1b0-8d1d415e1e62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:44:02 compute-0 nova_compute[192810]: 2025-09-30 21:44:02.570 2 DEBUG oslo_concurrency.lockutils [None req-7e8097cc-59b2-43bb-b417-3ba8fc69a7b0 bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Acquiring lock "14871acd-831a-4e81-b1b0-8d1d415e1e62" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:44:02 compute-0 nova_compute[192810]: 2025-09-30 21:44:02.571 2 DEBUG oslo_concurrency.lockutils [None req-7e8097cc-59b2-43bb-b417-3ba8fc69a7b0 bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Lock "14871acd-831a-4e81-b1b0-8d1d415e1e62" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:44:02 compute-0 nova_compute[192810]: 2025-09-30 21:44:02.571 2 DEBUG oslo_concurrency.lockutils [None req-7e8097cc-59b2-43bb-b417-3ba8fc69a7b0 bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Acquiring lock "14871acd-831a-4e81-b1b0-8d1d415e1e62-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:44:02 compute-0 nova_compute[192810]: 2025-09-30 21:44:02.571 2 DEBUG oslo_concurrency.lockutils [None req-7e8097cc-59b2-43bb-b417-3ba8fc69a7b0 bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Lock "14871acd-831a-4e81-b1b0-8d1d415e1e62-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:44:02 compute-0 nova_compute[192810]: 2025-09-30 21:44:02.572 2 DEBUG oslo_concurrency.lockutils [None req-7e8097cc-59b2-43bb-b417-3ba8fc69a7b0 bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Lock "14871acd-831a-4e81-b1b0-8d1d415e1e62-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:44:02 compute-0 nova_compute[192810]: 2025-09-30 21:44:02.584 2 INFO nova.compute.manager [None req-7e8097cc-59b2-43bb-b417-3ba8fc69a7b0 bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Terminating instance
Sep 30 21:44:02 compute-0 nova_compute[192810]: 2025-09-30 21:44:02.611 2 DEBUG nova.compute.manager [None req-7e8097cc-59b2-43bb-b417-3ba8fc69a7b0 bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:44:02 compute-0 kernel: tapf310064e-57 (unregistering): left promiscuous mode
Sep 30 21:44:02 compute-0 NetworkManager[51733]: <info>  [1759268642.6355] device (tapf310064e-57): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:44:02 compute-0 nova_compute[192810]: 2025-09-30 21:44:02.643 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:02 compute-0 ovn_controller[94912]: 2025-09-30T21:44:02Z|00538|binding|INFO|Releasing lport f310064e-5720-442a-ac22-71c932096d05 from this chassis (sb_readonly=0)
Sep 30 21:44:02 compute-0 ovn_controller[94912]: 2025-09-30T21:44:02Z|00539|binding|INFO|Setting lport f310064e-5720-442a-ac22-71c932096d05 down in Southbound
Sep 30 21:44:02 compute-0 ovn_controller[94912]: 2025-09-30T21:44:02Z|00540|binding|INFO|Removing iface tapf310064e-57 ovn-installed in OVS
Sep 30 21:44:02 compute-0 nova_compute[192810]: 2025-09-30 21:44:02.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:02.655 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7a:5a:ab 10.100.0.3'], port_security=['fa:16:3e:7a:5a:ab 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '14871acd-831a-4e81-b1b0-8d1d415e1e62', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9e174820-c17e-4913-80eb-3b337af24ff8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cb00322366384a558666220d5bb1ea75', 'neutron:revision_number': '4', 'neutron:security_group_ids': '15edc11d-fe1e-4ba6-bc6e-3ba35452f3a6 4e9360f7-9bdc-455b-b393-ad2ca9806660', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6550d43c-4039-45a9-a6b8-639d2fdf2b9e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=f310064e-5720-442a-ac22-71c932096d05) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:44:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:02.656 103867 INFO neutron.agent.ovn.metadata.agent [-] Port f310064e-5720-442a-ac22-71c932096d05 in datapath 9e174820-c17e-4913-80eb-3b337af24ff8 unbound from our chassis
Sep 30 21:44:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:02.657 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9e174820-c17e-4913-80eb-3b337af24ff8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:44:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:02.659 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[d324c55f-581e-4b14-8c16-9b364c7540d5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:02.659 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9e174820-c17e-4913-80eb-3b337af24ff8 namespace which is not needed anymore
Sep 30 21:44:02 compute-0 nova_compute[192810]: 2025-09-30 21:44:02.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:02 compute-0 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d0000008e.scope: Deactivated successfully.
Sep 30 21:44:02 compute-0 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d0000008e.scope: Consumed 15.528s CPU time.
Sep 30 21:44:02 compute-0 systemd-machined[152794]: Machine qemu-70-instance-0000008e terminated.
Sep 30 21:44:02 compute-0 neutron-haproxy-ovnmeta-9e174820-c17e-4913-80eb-3b337af24ff8[242028]: [NOTICE]   (242032) : haproxy version is 2.8.14-c23fe91
Sep 30 21:44:02 compute-0 neutron-haproxy-ovnmeta-9e174820-c17e-4913-80eb-3b337af24ff8[242028]: [NOTICE]   (242032) : path to executable is /usr/sbin/haproxy
Sep 30 21:44:02 compute-0 neutron-haproxy-ovnmeta-9e174820-c17e-4913-80eb-3b337af24ff8[242028]: [WARNING]  (242032) : Exiting Master process...
Sep 30 21:44:02 compute-0 neutron-haproxy-ovnmeta-9e174820-c17e-4913-80eb-3b337af24ff8[242028]: [ALERT]    (242032) : Current worker (242034) exited with code 143 (Terminated)
Sep 30 21:44:02 compute-0 neutron-haproxy-ovnmeta-9e174820-c17e-4913-80eb-3b337af24ff8[242028]: [WARNING]  (242032) : All workers exited. Exiting... (0)
Sep 30 21:44:02 compute-0 systemd[1]: libpod-3c19e91211afc626c72d9ba18ea711bcba245d1d1eae98b35ce3301fb77a711c.scope: Deactivated successfully.
Sep 30 21:44:02 compute-0 podman[242459]: 2025-09-30 21:44:02.790917569 +0000 UTC m=+0.048137832 container died 3c19e91211afc626c72d9ba18ea711bcba245d1d1eae98b35ce3301fb77a711c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9e174820-c17e-4913-80eb-3b337af24ff8, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 21:44:02 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3c19e91211afc626c72d9ba18ea711bcba245d1d1eae98b35ce3301fb77a711c-userdata-shm.mount: Deactivated successfully.
Sep 30 21:44:02 compute-0 systemd[1]: var-lib-containers-storage-overlay-a8bdaf780ceeffbcc48c2de4ba0d5340414652cc692cbd080ba638a01f17a8b7-merged.mount: Deactivated successfully.
Sep 30 21:44:02 compute-0 nova_compute[192810]: 2025-09-30 21:44:02.830 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:02 compute-0 podman[242459]: 2025-09-30 21:44:02.83211127 +0000 UTC m=+0.089331533 container cleanup 3c19e91211afc626c72d9ba18ea711bcba245d1d1eae98b35ce3301fb77a711c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9e174820-c17e-4913-80eb-3b337af24ff8, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:44:02 compute-0 nova_compute[192810]: 2025-09-30 21:44:02.837 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:02 compute-0 systemd[1]: libpod-conmon-3c19e91211afc626c72d9ba18ea711bcba245d1d1eae98b35ce3301fb77a711c.scope: Deactivated successfully.
Sep 30 21:44:02 compute-0 nova_compute[192810]: 2025-09-30 21:44:02.865 2 INFO nova.virt.libvirt.driver [-] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Instance destroyed successfully.
Sep 30 21:44:02 compute-0 nova_compute[192810]: 2025-09-30 21:44:02.865 2 DEBUG nova.objects.instance [None req-7e8097cc-59b2-43bb-b417-3ba8fc69a7b0 bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Lazy-loading 'resources' on Instance uuid 14871acd-831a-4e81-b1b0-8d1d415e1e62 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:44:02 compute-0 nova_compute[192810]: 2025-09-30 21:44:02.878 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:02 compute-0 nova_compute[192810]: 2025-09-30 21:44:02.883 2 DEBUG nova.virt.libvirt.vif [None req-7e8097cc-59b2-43bb-b417-3ba8fc69a7b0 bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:43:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-576292732-access_point-28086764',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-576292732-access_point-28086764',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-576292732-acc',id=142,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBa6RzlrZwzSQnYLMN5Uo66teCmEJk7j4fZqyIMs+H533SRijA8neeKUXKPPZd+DpplmTv8NjCrX68vJTURnSF9evOhuIsCkh3sge9M7z96jFRjg9S/vVrSmi9YklDKQHA==',key_name='tempest-TestSecurityGroupsBasicOps-1887830207',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:43:18Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='cb00322366384a558666220d5bb1ea75',ramdisk_id='',reservation_id='r-nd4s85ub',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-576292732',owner_user_name='tempest-TestSecurityGroupsBasicOps-576292732-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:43:18Z,user_data=None,user_id='bedde839dd1d47ea82605be90cfad439',uuid=14871acd-831a-4e81-b1b0-8d1d415e1e62,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f310064e-5720-442a-ac22-71c932096d05", "address": "fa:16:3e:7a:5a:ab", "network": {"id": "9e174820-c17e-4913-80eb-3b337af24ff8", "bridge": "br-int", "label": "tempest-network-smoke--1903319746", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cb00322366384a558666220d5bb1ea75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf310064e-57", "ovs_interfaceid": "f310064e-5720-442a-ac22-71c932096d05", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:44:02 compute-0 nova_compute[192810]: 2025-09-30 21:44:02.884 2 DEBUG nova.network.os_vif_util [None req-7e8097cc-59b2-43bb-b417-3ba8fc69a7b0 bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Converting VIF {"id": "f310064e-5720-442a-ac22-71c932096d05", "address": "fa:16:3e:7a:5a:ab", "network": {"id": "9e174820-c17e-4913-80eb-3b337af24ff8", "bridge": "br-int", "label": "tempest-network-smoke--1903319746", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cb00322366384a558666220d5bb1ea75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf310064e-57", "ovs_interfaceid": "f310064e-5720-442a-ac22-71c932096d05", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:44:02 compute-0 nova_compute[192810]: 2025-09-30 21:44:02.884 2 DEBUG nova.network.os_vif_util [None req-7e8097cc-59b2-43bb-b417-3ba8fc69a7b0 bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7a:5a:ab,bridge_name='br-int',has_traffic_filtering=True,id=f310064e-5720-442a-ac22-71c932096d05,network=Network(9e174820-c17e-4913-80eb-3b337af24ff8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf310064e-57') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:44:02 compute-0 nova_compute[192810]: 2025-09-30 21:44:02.885 2 DEBUG os_vif [None req-7e8097cc-59b2-43bb-b417-3ba8fc69a7b0 bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7a:5a:ab,bridge_name='br-int',has_traffic_filtering=True,id=f310064e-5720-442a-ac22-71c932096d05,network=Network(9e174820-c17e-4913-80eb-3b337af24ff8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf310064e-57') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:44:02 compute-0 nova_compute[192810]: 2025-09-30 21:44:02.887 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:02 compute-0 nova_compute[192810]: 2025-09-30 21:44:02.887 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf310064e-57, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:44:02 compute-0 nova_compute[192810]: 2025-09-30 21:44:02.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:44:02 compute-0 nova_compute[192810]: 2025-09-30 21:44:02.892 2 INFO os_vif [None req-7e8097cc-59b2-43bb-b417-3ba8fc69a7b0 bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7a:5a:ab,bridge_name='br-int',has_traffic_filtering=True,id=f310064e-5720-442a-ac22-71c932096d05,network=Network(9e174820-c17e-4913-80eb-3b337af24ff8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf310064e-57')
Sep 30 21:44:02 compute-0 nova_compute[192810]: 2025-09-30 21:44:02.892 2 INFO nova.virt.libvirt.driver [None req-7e8097cc-59b2-43bb-b417-3ba8fc69a7b0 bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Deleting instance files /var/lib/nova/instances/14871acd-831a-4e81-b1b0-8d1d415e1e62_del
Sep 30 21:44:02 compute-0 nova_compute[192810]: 2025-09-30 21:44:02.893 2 INFO nova.virt.libvirt.driver [None req-7e8097cc-59b2-43bb-b417-3ba8fc69a7b0 bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Deletion of /var/lib/nova/instances/14871acd-831a-4e81-b1b0-8d1d415e1e62_del complete
Sep 30 21:44:02 compute-0 podman[242496]: 2025-09-30 21:44:02.894886521 +0000 UTC m=+0.042012332 container remove 3c19e91211afc626c72d9ba18ea711bcba245d1d1eae98b35ce3301fb77a711c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9e174820-c17e-4913-80eb-3b337af24ff8, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:44:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:02.900 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[b9d01a78-f50f-4edc-8502-bde0d5511a7b]: (4, ('Tue Sep 30 09:44:02 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9e174820-c17e-4913-80eb-3b337af24ff8 (3c19e91211afc626c72d9ba18ea711bcba245d1d1eae98b35ce3301fb77a711c)\n3c19e91211afc626c72d9ba18ea711bcba245d1d1eae98b35ce3301fb77a711c\nTue Sep 30 09:44:02 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9e174820-c17e-4913-80eb-3b337af24ff8 (3c19e91211afc626c72d9ba18ea711bcba245d1d1eae98b35ce3301fb77a711c)\n3c19e91211afc626c72d9ba18ea711bcba245d1d1eae98b35ce3301fb77a711c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:02.901 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[77eaacca-5587-4367-8992-411cf2d59699]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:02.902 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9e174820-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:44:02 compute-0 nova_compute[192810]: 2025-09-30 21:44:02.903 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:02 compute-0 kernel: tap9e174820-c0: left promiscuous mode
Sep 30 21:44:02 compute-0 nova_compute[192810]: 2025-09-30 21:44:02.920 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:02.922 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[7a7ed378-84b5-47bb-99dc-84fbb26a43a6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:02.953 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[60b3775c-3921-492b-a326-1798214f88d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:02.954 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[f02cd2f5-1c45-40d6-956a-d122ca5cbd5a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:02 compute-0 nova_compute[192810]: 2025-09-30 21:44:02.969 2 INFO nova.compute.manager [None req-7e8097cc-59b2-43bb-b417-3ba8fc69a7b0 bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Took 0.36 seconds to destroy the instance on the hypervisor.
Sep 30 21:44:02 compute-0 nova_compute[192810]: 2025-09-30 21:44:02.970 2 DEBUG oslo.service.loopingcall [None req-7e8097cc-59b2-43bb-b417-3ba8fc69a7b0 bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:44:02 compute-0 nova_compute[192810]: 2025-09-30 21:44:02.970 2 DEBUG nova.compute.manager [-] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:44:02 compute-0 nova_compute[192810]: 2025-09-30 21:44:02.970 2 DEBUG nova.network.neutron [-] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:44:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:02.971 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[28345391-a375-4ee7-ab20-8a1054522029]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 529252, 'reachable_time': 32426, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242516, 'error': None, 'target': 'ovnmeta-9e174820-c17e-4913-80eb-3b337af24ff8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:02.974 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9e174820-c17e-4913-80eb-3b337af24ff8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:44:02 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:02.974 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[4e99debb-b51f-4f9f-ab06-5023113721ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:02 compute-0 systemd[1]: run-netns-ovnmeta\x2d9e174820\x2dc17e\x2d4913\x2d80eb\x2d3b337af24ff8.mount: Deactivated successfully.
Sep 30 21:44:02 compute-0 nova_compute[192810]: 2025-09-30 21:44:02.996 2 DEBUG nova.compute.manager [req-accc1fa7-5d6f-455a-9033-0c1f27070d36 req-7bfc121c-ef6e-430a-86a0-c24d3d3f860e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Received event network-vif-unplugged-f310064e-5720-442a-ac22-71c932096d05 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:44:02 compute-0 nova_compute[192810]: 2025-09-30 21:44:02.996 2 DEBUG oslo_concurrency.lockutils [req-accc1fa7-5d6f-455a-9033-0c1f27070d36 req-7bfc121c-ef6e-430a-86a0-c24d3d3f860e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "14871acd-831a-4e81-b1b0-8d1d415e1e62-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:44:02 compute-0 nova_compute[192810]: 2025-09-30 21:44:02.996 2 DEBUG oslo_concurrency.lockutils [req-accc1fa7-5d6f-455a-9033-0c1f27070d36 req-7bfc121c-ef6e-430a-86a0-c24d3d3f860e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "14871acd-831a-4e81-b1b0-8d1d415e1e62-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:44:02 compute-0 nova_compute[192810]: 2025-09-30 21:44:02.996 2 DEBUG oslo_concurrency.lockutils [req-accc1fa7-5d6f-455a-9033-0c1f27070d36 req-7bfc121c-ef6e-430a-86a0-c24d3d3f860e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "14871acd-831a-4e81-b1b0-8d1d415e1e62-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:44:02 compute-0 nova_compute[192810]: 2025-09-30 21:44:02.997 2 DEBUG nova.compute.manager [req-accc1fa7-5d6f-455a-9033-0c1f27070d36 req-7bfc121c-ef6e-430a-86a0-c24d3d3f860e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] No waiting events found dispatching network-vif-unplugged-f310064e-5720-442a-ac22-71c932096d05 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:44:02 compute-0 nova_compute[192810]: 2025-09-30 21:44:02.997 2 DEBUG nova.compute.manager [req-accc1fa7-5d6f-455a-9033-0c1f27070d36 req-7bfc121c-ef6e-430a-86a0-c24d3d3f860e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Received event network-vif-unplugged-f310064e-5720-442a-ac22-71c932096d05 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:44:04 compute-0 nova_compute[192810]: 2025-09-30 21:44:04.068 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Updating instance_info_cache with network_info: [{"id": "f310064e-5720-442a-ac22-71c932096d05", "address": "fa:16:3e:7a:5a:ab", "network": {"id": "9e174820-c17e-4913-80eb-3b337af24ff8", "bridge": "br-int", "label": "tempest-network-smoke--1903319746", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cb00322366384a558666220d5bb1ea75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf310064e-57", "ovs_interfaceid": "f310064e-5720-442a-ac22-71c932096d05", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:44:04 compute-0 nova_compute[192810]: 2025-09-30 21:44:04.181 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Releasing lock "refresh_cache-14871acd-831a-4e81-b1b0-8d1d415e1e62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:44:04 compute-0 nova_compute[192810]: 2025-09-30 21:44:04.182 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Sep 30 21:44:04 compute-0 nova_compute[192810]: 2025-09-30 21:44:04.182 2 DEBUG oslo_concurrency.lockutils [req-e61208c9-c869-4d71-9a2d-acb630328896 req-22c53a58-9258-47a4-ab27-804ad875b82a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-14871acd-831a-4e81-b1b0-8d1d415e1e62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:44:04 compute-0 nova_compute[192810]: 2025-09-30 21:44:04.182 2 DEBUG nova.network.neutron [req-e61208c9-c869-4d71-9a2d-acb630328896 req-22c53a58-9258-47a4-ab27-804ad875b82a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Refreshing network info cache for port f310064e-5720-442a-ac22-71c932096d05 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:44:04 compute-0 nova_compute[192810]: 2025-09-30 21:44:04.303 2 DEBUG nova.compute.manager [req-44777ba8-10b3-4dad-a936-d4d4a8fe71a9 req-3a93f2b8-7ac2-4c2a-89b9-38f932325933 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Received event network-vif-deleted-f310064e-5720-442a-ac22-71c932096d05 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:44:04 compute-0 nova_compute[192810]: 2025-09-30 21:44:04.304 2 INFO nova.compute.manager [req-44777ba8-10b3-4dad-a936-d4d4a8fe71a9 req-3a93f2b8-7ac2-4c2a-89b9-38f932325933 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Neutron deleted interface f310064e-5720-442a-ac22-71c932096d05; detaching it from the instance and deleting it from the info cache
Sep 30 21:44:04 compute-0 nova_compute[192810]: 2025-09-30 21:44:04.304 2 DEBUG nova.network.neutron [req-44777ba8-10b3-4dad-a936-d4d4a8fe71a9 req-3a93f2b8-7ac2-4c2a-89b9-38f932325933 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:44:04 compute-0 nova_compute[192810]: 2025-09-30 21:44:04.306 2 DEBUG nova.network.neutron [-] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:44:04 compute-0 nova_compute[192810]: 2025-09-30 21:44:04.330 2 INFO nova.compute.manager [-] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Took 1.36 seconds to deallocate network for instance.
Sep 30 21:44:04 compute-0 nova_compute[192810]: 2025-09-30 21:44:04.334 2 DEBUG nova.compute.manager [req-44777ba8-10b3-4dad-a936-d4d4a8fe71a9 req-3a93f2b8-7ac2-4c2a-89b9-38f932325933 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Detach interface failed, port_id=f310064e-5720-442a-ac22-71c932096d05, reason: Instance 14871acd-831a-4e81-b1b0-8d1d415e1e62 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Sep 30 21:44:04 compute-0 nova_compute[192810]: 2025-09-30 21:44:04.421 2 DEBUG oslo_concurrency.lockutils [None req-7e8097cc-59b2-43bb-b417-3ba8fc69a7b0 bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:44:04 compute-0 nova_compute[192810]: 2025-09-30 21:44:04.421 2 DEBUG oslo_concurrency.lockutils [None req-7e8097cc-59b2-43bb-b417-3ba8fc69a7b0 bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:44:04 compute-0 nova_compute[192810]: 2025-09-30 21:44:04.422 2 INFO nova.network.neutron [req-e61208c9-c869-4d71-9a2d-acb630328896 req-22c53a58-9258-47a4-ab27-804ad875b82a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Port f310064e-5720-442a-ac22-71c932096d05 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Sep 30 21:44:04 compute-0 nova_compute[192810]: 2025-09-30 21:44:04.423 2 DEBUG nova.network.neutron [req-e61208c9-c869-4d71-9a2d-acb630328896 req-22c53a58-9258-47a4-ab27-804ad875b82a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:44:04 compute-0 nova_compute[192810]: 2025-09-30 21:44:04.434 2 DEBUG oslo_concurrency.lockutils [req-e61208c9-c869-4d71-9a2d-acb630328896 req-22c53a58-9258-47a4-ab27-804ad875b82a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-14871acd-831a-4e81-b1b0-8d1d415e1e62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:44:04 compute-0 nova_compute[192810]: 2025-09-30 21:44:04.489 2 DEBUG nova.compute.provider_tree [None req-7e8097cc-59b2-43bb-b417-3ba8fc69a7b0 bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:44:04 compute-0 nova_compute[192810]: 2025-09-30 21:44:04.531 2 DEBUG nova.scheduler.client.report [None req-7e8097cc-59b2-43bb-b417-3ba8fc69a7b0 bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:44:04 compute-0 nova_compute[192810]: 2025-09-30 21:44:04.550 2 DEBUG oslo_concurrency.lockutils [None req-7e8097cc-59b2-43bb-b417-3ba8fc69a7b0 bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:44:04 compute-0 nova_compute[192810]: 2025-09-30 21:44:04.580 2 INFO nova.scheduler.client.report [None req-7e8097cc-59b2-43bb-b417-3ba8fc69a7b0 bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Deleted allocations for instance 14871acd-831a-4e81-b1b0-8d1d415e1e62
Sep 30 21:44:04 compute-0 nova_compute[192810]: 2025-09-30 21:44:04.652 2 DEBUG oslo_concurrency.lockutils [None req-7e8097cc-59b2-43bb-b417-3ba8fc69a7b0 bedde839dd1d47ea82605be90cfad439 cb00322366384a558666220d5bb1ea75 - - default default] Lock "14871acd-831a-4e81-b1b0-8d1d415e1e62" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.082s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:44:04 compute-0 nova_compute[192810]: 2025-09-30 21:44:04.786 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:44:04 compute-0 nova_compute[192810]: 2025-09-30 21:44:04.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:44:05 compute-0 nova_compute[192810]: 2025-09-30 21:44:05.110 2 DEBUG nova.compute.manager [req-41b7d857-933c-4241-a696-480464f6a923 req-f5ccaafb-9a89-406d-9cff-997214b5c768 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Received event network-vif-plugged-f310064e-5720-442a-ac22-71c932096d05 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:44:05 compute-0 nova_compute[192810]: 2025-09-30 21:44:05.110 2 DEBUG oslo_concurrency.lockutils [req-41b7d857-933c-4241-a696-480464f6a923 req-f5ccaafb-9a89-406d-9cff-997214b5c768 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "14871acd-831a-4e81-b1b0-8d1d415e1e62-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:44:05 compute-0 nova_compute[192810]: 2025-09-30 21:44:05.110 2 DEBUG oslo_concurrency.lockutils [req-41b7d857-933c-4241-a696-480464f6a923 req-f5ccaafb-9a89-406d-9cff-997214b5c768 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "14871acd-831a-4e81-b1b0-8d1d415e1e62-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:44:05 compute-0 nova_compute[192810]: 2025-09-30 21:44:05.111 2 DEBUG oslo_concurrency.lockutils [req-41b7d857-933c-4241-a696-480464f6a923 req-f5ccaafb-9a89-406d-9cff-997214b5c768 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "14871acd-831a-4e81-b1b0-8d1d415e1e62-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:44:05 compute-0 nova_compute[192810]: 2025-09-30 21:44:05.111 2 DEBUG nova.compute.manager [req-41b7d857-933c-4241-a696-480464f6a923 req-f5ccaafb-9a89-406d-9cff-997214b5c768 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] No waiting events found dispatching network-vif-plugged-f310064e-5720-442a-ac22-71c932096d05 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:44:05 compute-0 nova_compute[192810]: 2025-09-30 21:44:05.111 2 WARNING nova.compute.manager [req-41b7d857-933c-4241-a696-480464f6a923 req-f5ccaafb-9a89-406d-9cff-997214b5c768 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Received unexpected event network-vif-plugged-f310064e-5720-442a-ac22-71c932096d05 for instance with vm_state deleted and task_state None.
Sep 30 21:44:05 compute-0 nova_compute[192810]: 2025-09-30 21:44:05.529 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:05 compute-0 nova_compute[192810]: 2025-09-30 21:44:05.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:07 compute-0 nova_compute[192810]: 2025-09-30 21:44:07.880 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:07 compute-0 nova_compute[192810]: 2025-09-30 21:44:07.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:10 compute-0 podman[242519]: 2025-09-30 21:44:10.320629117 +0000 UTC m=+0.054500639 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Sep 30 21:44:10 compute-0 podman[242520]: 2025-09-30 21:44:10.334570359 +0000 UTC m=+0.064759910 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Sep 30 21:44:10 compute-0 podman[242518]: 2025-09-30 21:44:10.353583065 +0000 UTC m=+0.089119508 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Sep 30 21:44:10 compute-0 nova_compute[192810]: 2025-09-30 21:44:10.893 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268635.8930023, 8b0c810b-62f3-42b5-802e-cff3319a31be => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:44:10 compute-0 nova_compute[192810]: 2025-09-30 21:44:10.894 2 INFO nova.compute.manager [-] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] VM Stopped (Lifecycle Event)
Sep 30 21:44:10 compute-0 nova_compute[192810]: 2025-09-30 21:44:10.913 2 DEBUG nova.compute.manager [None req-cd967485-3ff1-4b0c-a3d2-353c2b3c6aee - - - - - -] [instance: 8b0c810b-62f3-42b5-802e-cff3319a31be] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:44:12 compute-0 nova_compute[192810]: 2025-09-30 21:44:12.881 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:12 compute-0 nova_compute[192810]: 2025-09-30 21:44:12.921 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:14 compute-0 nova_compute[192810]: 2025-09-30 21:44:14.836 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268639.8346803, 3fab2833-fa74-4eb1-bef8-aa51780aa0f4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:44:14 compute-0 nova_compute[192810]: 2025-09-30 21:44:14.837 2 INFO nova.compute.manager [-] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] VM Stopped (Lifecycle Event)
Sep 30 21:44:14 compute-0 nova_compute[192810]: 2025-09-30 21:44:14.872 2 DEBUG nova.compute.manager [None req-ef31ccf4-584e-4a7b-b215-c86f8b857a9b - - - - - -] [instance: 3fab2833-fa74-4eb1-bef8-aa51780aa0f4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:44:17 compute-0 nova_compute[192810]: 2025-09-30 21:44:17.864 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268642.8629937, 14871acd-831a-4e81-b1b0-8d1d415e1e62 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:44:17 compute-0 nova_compute[192810]: 2025-09-30 21:44:17.864 2 INFO nova.compute.manager [-] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] VM Stopped (Lifecycle Event)
Sep 30 21:44:17 compute-0 nova_compute[192810]: 2025-09-30 21:44:17.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:17 compute-0 nova_compute[192810]: 2025-09-30 21:44:17.886 2 DEBUG nova.compute.manager [None req-057aa0cf-9b25-4612-9304-20bae030af43 - - - - - -] [instance: 14871acd-831a-4e81-b1b0-8d1d415e1e62] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:44:17 compute-0 nova_compute[192810]: 2025-09-30 21:44:17.922 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:18 compute-0 podman[242583]: 2025-09-30 21:44:18.315834876 +0000 UTC m=+0.056692553 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., release=1755695350, io.buildah.version=1.33.7, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, name=ubi9-minimal, vcs-type=git, config_id=edpm, version=9.6, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.openshift.expose-services=)
Sep 30 21:44:18 compute-0 podman[242582]: 2025-09-30 21:44:18.334392311 +0000 UTC m=+0.077071043 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Sep 30 21:44:22 compute-0 nova_compute[192810]: 2025-09-30 21:44:22.884 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:22 compute-0 nova_compute[192810]: 2025-09-30 21:44:22.923 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:26 compute-0 podman[242625]: 2025-09-30 21:44:26.307251754 +0000 UTC m=+0.042567536 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 21:44:26 compute-0 podman[242623]: 2025-09-30 21:44:26.307263224 +0000 UTC m=+0.049126437 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:44:26 compute-0 podman[242624]: 2025-09-30 21:44:26.311219441 +0000 UTC m=+0.049928176 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=iscsid, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=iscsid)
Sep 30 21:44:27 compute-0 nova_compute[192810]: 2025-09-30 21:44:27.885 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:27 compute-0 nova_compute[192810]: 2025-09-30 21:44:27.925 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:31 compute-0 nova_compute[192810]: 2025-09-30 21:44:31.895 2 DEBUG oslo_concurrency.lockutils [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "b95d1bc0-5bee-4787-b4ff-a0908361ae3e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:44:31 compute-0 nova_compute[192810]: 2025-09-30 21:44:31.895 2 DEBUG oslo_concurrency.lockutils [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "b95d1bc0-5bee-4787-b4ff-a0908361ae3e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:44:31 compute-0 nova_compute[192810]: 2025-09-30 21:44:31.910 2 DEBUG nova.compute.manager [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:44:32 compute-0 nova_compute[192810]: 2025-09-30 21:44:32.002 2 DEBUG oslo_concurrency.lockutils [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:44:32 compute-0 nova_compute[192810]: 2025-09-30 21:44:32.003 2 DEBUG oslo_concurrency.lockutils [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:44:32 compute-0 nova_compute[192810]: 2025-09-30 21:44:32.009 2 DEBUG nova.virt.hardware [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:44:32 compute-0 nova_compute[192810]: 2025-09-30 21:44:32.010 2 INFO nova.compute.claims [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:44:32 compute-0 nova_compute[192810]: 2025-09-30 21:44:32.123 2 DEBUG nova.compute.provider_tree [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:44:32 compute-0 nova_compute[192810]: 2025-09-30 21:44:32.150 2 DEBUG nova.scheduler.client.report [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:44:32 compute-0 nova_compute[192810]: 2025-09-30 21:44:32.176 2 DEBUG oslo_concurrency.lockutils [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:44:32 compute-0 nova_compute[192810]: 2025-09-30 21:44:32.177 2 DEBUG nova.compute.manager [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:44:32 compute-0 nova_compute[192810]: 2025-09-30 21:44:32.276 2 DEBUG nova.compute.manager [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:44:32 compute-0 nova_compute[192810]: 2025-09-30 21:44:32.276 2 DEBUG nova.network.neutron [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:44:32 compute-0 nova_compute[192810]: 2025-09-30 21:44:32.296 2 INFO nova.virt.libvirt.driver [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:44:32 compute-0 nova_compute[192810]: 2025-09-30 21:44:32.316 2 DEBUG nova.compute.manager [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:44:32 compute-0 nova_compute[192810]: 2025-09-30 21:44:32.426 2 DEBUG nova.compute.manager [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:44:32 compute-0 nova_compute[192810]: 2025-09-30 21:44:32.427 2 DEBUG nova.virt.libvirt.driver [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:44:32 compute-0 nova_compute[192810]: 2025-09-30 21:44:32.428 2 INFO nova.virt.libvirt.driver [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Creating image(s)
Sep 30 21:44:32 compute-0 nova_compute[192810]: 2025-09-30 21:44:32.428 2 DEBUG oslo_concurrency.lockutils [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "/var/lib/nova/instances/b95d1bc0-5bee-4787-b4ff-a0908361ae3e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:44:32 compute-0 nova_compute[192810]: 2025-09-30 21:44:32.429 2 DEBUG oslo_concurrency.lockutils [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "/var/lib/nova/instances/b95d1bc0-5bee-4787-b4ff-a0908361ae3e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:44:32 compute-0 nova_compute[192810]: 2025-09-30 21:44:32.429 2 DEBUG oslo_concurrency.lockutils [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "/var/lib/nova/instances/b95d1bc0-5bee-4787-b4ff-a0908361ae3e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:44:32 compute-0 nova_compute[192810]: 2025-09-30 21:44:32.441 2 DEBUG oslo_concurrency.processutils [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:44:32 compute-0 nova_compute[192810]: 2025-09-30 21:44:32.496 2 DEBUG oslo_concurrency.processutils [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:44:32 compute-0 nova_compute[192810]: 2025-09-30 21:44:32.497 2 DEBUG oslo_concurrency.lockutils [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:44:32 compute-0 nova_compute[192810]: 2025-09-30 21:44:32.498 2 DEBUG oslo_concurrency.lockutils [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:44:32 compute-0 nova_compute[192810]: 2025-09-30 21:44:32.508 2 DEBUG oslo_concurrency.processutils [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:44:32 compute-0 nova_compute[192810]: 2025-09-30 21:44:32.525 2 DEBUG nova.policy [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:44:32 compute-0 nova_compute[192810]: 2025-09-30 21:44:32.560 2 DEBUG oslo_concurrency.processutils [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:44:32 compute-0 nova_compute[192810]: 2025-09-30 21:44:32.561 2 DEBUG oslo_concurrency.processutils [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/b95d1bc0-5bee-4787-b4ff-a0908361ae3e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:44:32 compute-0 nova_compute[192810]: 2025-09-30 21:44:32.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:32 compute-0 nova_compute[192810]: 2025-09-30 21:44:32.926 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:33 compute-0 nova_compute[192810]: 2025-09-30 21:44:33.063 2 DEBUG oslo_concurrency.processutils [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/b95d1bc0-5bee-4787-b4ff-a0908361ae3e/disk 1073741824" returned: 0 in 0.502s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:44:33 compute-0 nova_compute[192810]: 2025-09-30 21:44:33.064 2 DEBUG oslo_concurrency.lockutils [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.566s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:44:33 compute-0 nova_compute[192810]: 2025-09-30 21:44:33.064 2 DEBUG oslo_concurrency.processutils [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:44:33 compute-0 nova_compute[192810]: 2025-09-30 21:44:33.117 2 DEBUG oslo_concurrency.processutils [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:44:33 compute-0 nova_compute[192810]: 2025-09-30 21:44:33.119 2 DEBUG nova.virt.disk.api [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Checking if we can resize image /var/lib/nova/instances/b95d1bc0-5bee-4787-b4ff-a0908361ae3e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:44:33 compute-0 nova_compute[192810]: 2025-09-30 21:44:33.119 2 DEBUG oslo_concurrency.processutils [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b95d1bc0-5bee-4787-b4ff-a0908361ae3e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:44:33 compute-0 nova_compute[192810]: 2025-09-30 21:44:33.174 2 DEBUG oslo_concurrency.processutils [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b95d1bc0-5bee-4787-b4ff-a0908361ae3e/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:44:33 compute-0 nova_compute[192810]: 2025-09-30 21:44:33.175 2 DEBUG nova.virt.disk.api [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Cannot resize image /var/lib/nova/instances/b95d1bc0-5bee-4787-b4ff-a0908361ae3e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:44:33 compute-0 nova_compute[192810]: 2025-09-30 21:44:33.178 2 DEBUG nova.objects.instance [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lazy-loading 'migration_context' on Instance uuid b95d1bc0-5bee-4787-b4ff-a0908361ae3e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:44:33 compute-0 nova_compute[192810]: 2025-09-30 21:44:33.198 2 DEBUG nova.virt.libvirt.driver [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:44:33 compute-0 nova_compute[192810]: 2025-09-30 21:44:33.199 2 DEBUG nova.virt.libvirt.driver [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Ensure instance console log exists: /var/lib/nova/instances/b95d1bc0-5bee-4787-b4ff-a0908361ae3e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:44:33 compute-0 nova_compute[192810]: 2025-09-30 21:44:33.199 2 DEBUG oslo_concurrency.lockutils [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:44:33 compute-0 nova_compute[192810]: 2025-09-30 21:44:33.200 2 DEBUG oslo_concurrency.lockutils [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:44:33 compute-0 nova_compute[192810]: 2025-09-30 21:44:33.200 2 DEBUG oslo_concurrency.lockutils [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:44:34 compute-0 nova_compute[192810]: 2025-09-30 21:44:34.095 2 DEBUG nova.network.neutron [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Successfully created port: d768580b-cf31-4877-8394-7501915ecf87 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:44:35 compute-0 nova_compute[192810]: 2025-09-30 21:44:35.576 2 DEBUG nova.network.neutron [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Successfully created port: 1ce81fa7-f0cd-41b6-bf59-835593359c9f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:44:37 compute-0 nova_compute[192810]: 2025-09-30 21:44:37.892 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:37 compute-0 nova_compute[192810]: 2025-09-30 21:44:37.928 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:38 compute-0 nova_compute[192810]: 2025-09-30 21:44:38.572 2 DEBUG nova.network.neutron [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Successfully updated port: d768580b-cf31-4877-8394-7501915ecf87 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:44:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:38.749 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:44:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:38.750 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:44:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:38.750 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:44:38 compute-0 nova_compute[192810]: 2025-09-30 21:44:38.940 2 DEBUG nova.compute.manager [req-bc2b6099-9099-456f-ad1e-366947bb424c req-676ea725-752a-4cfc-84b0-6845c411e90c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Received event network-changed-d768580b-cf31-4877-8394-7501915ecf87 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:44:38 compute-0 nova_compute[192810]: 2025-09-30 21:44:38.941 2 DEBUG nova.compute.manager [req-bc2b6099-9099-456f-ad1e-366947bb424c req-676ea725-752a-4cfc-84b0-6845c411e90c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Refreshing instance network info cache due to event network-changed-d768580b-cf31-4877-8394-7501915ecf87. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:44:38 compute-0 nova_compute[192810]: 2025-09-30 21:44:38.941 2 DEBUG oslo_concurrency.lockutils [req-bc2b6099-9099-456f-ad1e-366947bb424c req-676ea725-752a-4cfc-84b0-6845c411e90c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-b95d1bc0-5bee-4787-b4ff-a0908361ae3e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:44:38 compute-0 nova_compute[192810]: 2025-09-30 21:44:38.941 2 DEBUG oslo_concurrency.lockutils [req-bc2b6099-9099-456f-ad1e-366947bb424c req-676ea725-752a-4cfc-84b0-6845c411e90c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-b95d1bc0-5bee-4787-b4ff-a0908361ae3e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:44:38 compute-0 nova_compute[192810]: 2025-09-30 21:44:38.942 2 DEBUG nova.network.neutron [req-bc2b6099-9099-456f-ad1e-366947bb424c req-676ea725-752a-4cfc-84b0-6845c411e90c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Refreshing network info cache for port d768580b-cf31-4877-8394-7501915ecf87 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:44:39 compute-0 nova_compute[192810]: 2025-09-30 21:44:39.486 2 DEBUG nova.network.neutron [req-bc2b6099-9099-456f-ad1e-366947bb424c req-676ea725-752a-4cfc-84b0-6845c411e90c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:44:39 compute-0 nova_compute[192810]: 2025-09-30 21:44:39.818 2 DEBUG nova.network.neutron [req-bc2b6099-9099-456f-ad1e-366947bb424c req-676ea725-752a-4cfc-84b0-6845c411e90c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:44:39 compute-0 nova_compute[192810]: 2025-09-30 21:44:39.832 2 DEBUG oslo_concurrency.lockutils [req-bc2b6099-9099-456f-ad1e-366947bb424c req-676ea725-752a-4cfc-84b0-6845c411e90c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-b95d1bc0-5bee-4787-b4ff-a0908361ae3e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:44:40 compute-0 nova_compute[192810]: 2025-09-30 21:44:40.773 2 DEBUG nova.network.neutron [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Successfully updated port: 1ce81fa7-f0cd-41b6-bf59-835593359c9f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:44:40 compute-0 nova_compute[192810]: 2025-09-30 21:44:40.793 2 DEBUG oslo_concurrency.lockutils [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "refresh_cache-b95d1bc0-5bee-4787-b4ff-a0908361ae3e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:44:40 compute-0 nova_compute[192810]: 2025-09-30 21:44:40.794 2 DEBUG oslo_concurrency.lockutils [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquired lock "refresh_cache-b95d1bc0-5bee-4787-b4ff-a0908361ae3e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:44:40 compute-0 nova_compute[192810]: 2025-09-30 21:44:40.794 2 DEBUG nova.network.neutron [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:44:40 compute-0 nova_compute[192810]: 2025-09-30 21:44:40.984 2 DEBUG nova.network.neutron [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:44:41 compute-0 podman[242698]: 2025-09-30 21:44:41.322275497 +0000 UTC m=+0.051304430 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2)
Sep 30 21:44:41 compute-0 podman[242699]: 2025-09-30 21:44:41.358129207 +0000 UTC m=+0.084862613 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0)
Sep 30 21:44:41 compute-0 podman[242697]: 2025-09-30 21:44:41.360043854 +0000 UTC m=+0.092009939 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Sep 30 21:44:41 compute-0 nova_compute[192810]: 2025-09-30 21:44:41.900 2 DEBUG nova.compute.manager [req-759c416a-be37-41fd-907c-babad949c25c req-630b7e70-92ac-425b-9778-178cc2fa0b7e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Received event network-changed-1ce81fa7-f0cd-41b6-bf59-835593359c9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:44:41 compute-0 nova_compute[192810]: 2025-09-30 21:44:41.901 2 DEBUG nova.compute.manager [req-759c416a-be37-41fd-907c-babad949c25c req-630b7e70-92ac-425b-9778-178cc2fa0b7e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Refreshing instance network info cache due to event network-changed-1ce81fa7-f0cd-41b6-bf59-835593359c9f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:44:41 compute-0 nova_compute[192810]: 2025-09-30 21:44:41.901 2 DEBUG oslo_concurrency.lockutils [req-759c416a-be37-41fd-907c-babad949c25c req-630b7e70-92ac-425b-9778-178cc2fa0b7e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-b95d1bc0-5bee-4787-b4ff-a0908361ae3e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:44:42 compute-0 nova_compute[192810]: 2025-09-30 21:44:42.894 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:42 compute-0 nova_compute[192810]: 2025-09-30 21:44:42.930 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.725 2 DEBUG nova.network.neutron [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Updating instance_info_cache with network_info: [{"id": "d768580b-cf31-4877-8394-7501915ecf87", "address": "fa:16:3e:b1:eb:aa", "network": {"id": "42bc7c38-1ece-4458-bbe7-6a2159483355", "bridge": "br-int", "label": "tempest-network-smoke--145108959", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd768580b-cf", "ovs_interfaceid": "d768580b-cf31-4877-8394-7501915ecf87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "1ce81fa7-f0cd-41b6-bf59-835593359c9f", "address": "fa:16:3e:f0:5b:98", "network": {"id": "2d27a62d-95fd-4cbb-bc4d-20b0757663cc", "bridge": "br-int", "label": "tempest-network-smoke--589443991", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fef0:5b98", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ce81fa7-f0", "ovs_interfaceid": "1ce81fa7-f0cd-41b6-bf59-835593359c9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.763 2 DEBUG oslo_concurrency.lockutils [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Releasing lock "refresh_cache-b95d1bc0-5bee-4787-b4ff-a0908361ae3e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.763 2 DEBUG nova.compute.manager [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Instance network_info: |[{"id": "d768580b-cf31-4877-8394-7501915ecf87", "address": "fa:16:3e:b1:eb:aa", "network": {"id": "42bc7c38-1ece-4458-bbe7-6a2159483355", "bridge": "br-int", "label": "tempest-network-smoke--145108959", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd768580b-cf", "ovs_interfaceid": "d768580b-cf31-4877-8394-7501915ecf87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "1ce81fa7-f0cd-41b6-bf59-835593359c9f", "address": "fa:16:3e:f0:5b:98", "network": {"id": "2d27a62d-95fd-4cbb-bc4d-20b0757663cc", "bridge": "br-int", "label": "tempest-network-smoke--589443991", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fef0:5b98", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": 
null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ce81fa7-f0", "ovs_interfaceid": "1ce81fa7-f0cd-41b6-bf59-835593359c9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.764 2 DEBUG oslo_concurrency.lockutils [req-759c416a-be37-41fd-907c-babad949c25c req-630b7e70-92ac-425b-9778-178cc2fa0b7e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-b95d1bc0-5bee-4787-b4ff-a0908361ae3e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.764 2 DEBUG nova.network.neutron [req-759c416a-be37-41fd-907c-babad949c25c req-630b7e70-92ac-425b-9778-178cc2fa0b7e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Refreshing network info cache for port 1ce81fa7-f0cd-41b6-bf59-835593359c9f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.767 2 DEBUG nova.virt.libvirt.driver [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Start _get_guest_xml network_info=[{"id": "d768580b-cf31-4877-8394-7501915ecf87", "address": "fa:16:3e:b1:eb:aa", "network": {"id": "42bc7c38-1ece-4458-bbe7-6a2159483355", "bridge": "br-int", "label": "tempest-network-smoke--145108959", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd768580b-cf", "ovs_interfaceid": "d768580b-cf31-4877-8394-7501915ecf87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "1ce81fa7-f0cd-41b6-bf59-835593359c9f", "address": "fa:16:3e:f0:5b:98", "network": {"id": "2d27a62d-95fd-4cbb-bc4d-20b0757663cc", "bridge": "br-int", "label": "tempest-network-smoke--589443991", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fef0:5b98", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ce81fa7-f0", "ovs_interfaceid": "1ce81fa7-f0cd-41b6-bf59-835593359c9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.771 2 WARNING nova.virt.libvirt.driver [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.775 2 DEBUG nova.virt.libvirt.host [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.776 2 DEBUG nova.virt.libvirt.host [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.779 2 DEBUG nova.virt.libvirt.host [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.780 2 DEBUG nova.virt.libvirt.host [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.781 2 DEBUG nova.virt.libvirt.driver [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.781 2 DEBUG nova.virt.hardware [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.782 2 DEBUG nova.virt.hardware [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.782 2 DEBUG nova.virt.hardware [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.782 2 DEBUG nova.virt.hardware [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.783 2 DEBUG nova.virt.hardware [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.783 2 DEBUG nova.virt.hardware [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.783 2 DEBUG nova.virt.hardware [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.784 2 DEBUG nova.virt.hardware [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.784 2 DEBUG nova.virt.hardware [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.784 2 DEBUG nova.virt.hardware [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.784 2 DEBUG nova.virt.hardware [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.789 2 DEBUG nova.virt.libvirt.vif [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:44:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1273749312',display_name='tempest-TestGettingAddress-server-1273749312',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1273749312',id=146,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCTmN/pNaqmaOiFycAcQSYACgRKuAneZoZAKklPa3C16A/2q/HkgtY0P/DZxd1e1qV4ltimB5etVGHV7nuq0tFOyUUUwlQywxyklzBGPgbQap3YQgrb4ac6Zf8Ctmbv6xQ==',key_name='tempest-TestGettingAddress-2018950633',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-0sr8r1f4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:44:32Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=b95d1bc0-5bee-4787-b4ff-a0908361ae3e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d768580b-cf31-4877-8394-7501915ecf87", "address": "fa:16:3e:b1:eb:aa", "network": {"id": "42bc7c38-1ece-4458-bbe7-6a2159483355", "bridge": "br-int", "label": "tempest-network-smoke--145108959", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd768580b-cf", "ovs_interfaceid": "d768580b-cf31-4877-8394-7501915ecf87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.790 2 DEBUG nova.network.os_vif_util [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "d768580b-cf31-4877-8394-7501915ecf87", "address": "fa:16:3e:b1:eb:aa", "network": {"id": "42bc7c38-1ece-4458-bbe7-6a2159483355", "bridge": "br-int", "label": "tempest-network-smoke--145108959", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd768580b-cf", "ovs_interfaceid": "d768580b-cf31-4877-8394-7501915ecf87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.790 2 DEBUG nova.network.os_vif_util [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b1:eb:aa,bridge_name='br-int',has_traffic_filtering=True,id=d768580b-cf31-4877-8394-7501915ecf87,network=Network(42bc7c38-1ece-4458-bbe7-6a2159483355),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd768580b-cf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.792 2 DEBUG nova.virt.libvirt.vif [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:44:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1273749312',display_name='tempest-TestGettingAddress-server-1273749312',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1273749312',id=146,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCTmN/pNaqmaOiFycAcQSYACgRKuAneZoZAKklPa3C16A/2q/HkgtY0P/DZxd1e1qV4ltimB5etVGHV7nuq0tFOyUUUwlQywxyklzBGPgbQap3YQgrb4ac6Zf8Ctmbv6xQ==',key_name='tempest-TestGettingAddress-2018950633',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-0sr8r1f4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:44:32Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=b95d1bc0-5bee-4787-b4ff-a0908361ae3e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1ce81fa7-f0cd-41b6-bf59-835593359c9f", "address": "fa:16:3e:f0:5b:98", "network": {"id": "2d27a62d-95fd-4cbb-bc4d-20b0757663cc", "bridge": "br-int", "label": "tempest-network-smoke--589443991", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fef0:5b98", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ce81fa7-f0", "ovs_interfaceid": "1ce81fa7-f0cd-41b6-bf59-835593359c9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.792 2 DEBUG nova.network.os_vif_util [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "1ce81fa7-f0cd-41b6-bf59-835593359c9f", "address": "fa:16:3e:f0:5b:98", "network": {"id": "2d27a62d-95fd-4cbb-bc4d-20b0757663cc", "bridge": "br-int", "label": "tempest-network-smoke--589443991", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fef0:5b98", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ce81fa7-f0", "ovs_interfaceid": "1ce81fa7-f0cd-41b6-bf59-835593359c9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.793 2 DEBUG nova.network.os_vif_util [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f0:5b:98,bridge_name='br-int',has_traffic_filtering=True,id=1ce81fa7-f0cd-41b6-bf59-835593359c9f,network=Network(2d27a62d-95fd-4cbb-bc4d-20b0757663cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1ce81fa7-f0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.794 2 DEBUG nova.objects.instance [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lazy-loading 'pci_devices' on Instance uuid b95d1bc0-5bee-4787-b4ff-a0908361ae3e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.810 2 DEBUG nova.virt.libvirt.driver [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:44:43 compute-0 nova_compute[192810]:   <uuid>b95d1bc0-5bee-4787-b4ff-a0908361ae3e</uuid>
Sep 30 21:44:43 compute-0 nova_compute[192810]:   <name>instance-00000092</name>
Sep 30 21:44:43 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:44:43 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:44:43 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:44:43 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:       <nova:name>tempest-TestGettingAddress-server-1273749312</nova:name>
Sep 30 21:44:43 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:44:43</nova:creationTime>
Sep 30 21:44:43 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:44:43 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:44:43 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:44:43 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:44:43 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:44:43 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:44:43 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:44:43 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:44:43 compute-0 nova_compute[192810]:         <nova:user uuid="5ffd1d7824fe413499994bd48b9f820f">tempest-TestGettingAddress-2056138166-project-member</nova:user>
Sep 30 21:44:43 compute-0 nova_compute[192810]:         <nova:project uuid="71b1e8c3c45e4ff8bc99e66bd1bfef7c">tempest-TestGettingAddress-2056138166</nova:project>
Sep 30 21:44:43 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:44:43 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:44:43 compute-0 nova_compute[192810]:         <nova:port uuid="d768580b-cf31-4877-8394-7501915ecf87">
Sep 30 21:44:43 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:44:43 compute-0 nova_compute[192810]:         <nova:port uuid="1ce81fa7-f0cd-41b6-bf59-835593359c9f">
Sep 30 21:44:43 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fef0:5b98" ipVersion="6"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:44:43 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:44:43 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:44:43 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:44:43 compute-0 nova_compute[192810]:     <system>
Sep 30 21:44:43 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:44:43 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:44:43 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:44:43 compute-0 nova_compute[192810]:       <entry name="serial">b95d1bc0-5bee-4787-b4ff-a0908361ae3e</entry>
Sep 30 21:44:43 compute-0 nova_compute[192810]:       <entry name="uuid">b95d1bc0-5bee-4787-b4ff-a0908361ae3e</entry>
Sep 30 21:44:43 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     </system>
Sep 30 21:44:43 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:44:43 compute-0 nova_compute[192810]:   <os>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:   </os>
Sep 30 21:44:43 compute-0 nova_compute[192810]:   <features>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:   </features>
Sep 30 21:44:43 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:44:43 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:44:43 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:44:43 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:44:43 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:44:43 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/b95d1bc0-5bee-4787-b4ff-a0908361ae3e/disk"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:44:43 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/b95d1bc0-5bee-4787-b4ff-a0908361ae3e/disk.config"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:44:43 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:b1:eb:aa"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:       <target dev="tapd768580b-cf"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:44:43 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:f0:5b:98"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:       <target dev="tap1ce81fa7-f0"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:44:43 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/b95d1bc0-5bee-4787-b4ff-a0908361ae3e/console.log" append="off"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     <video>
Sep 30 21:44:43 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     </video>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:44:43 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:44:43 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:44:43 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:44:43 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:44:43 compute-0 nova_compute[192810]: </domain>
Sep 30 21:44:43 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.811 2 DEBUG nova.compute.manager [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Preparing to wait for external event network-vif-plugged-d768580b-cf31-4877-8394-7501915ecf87 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.812 2 DEBUG oslo_concurrency.lockutils [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "b95d1bc0-5bee-4787-b4ff-a0908361ae3e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.812 2 DEBUG oslo_concurrency.lockutils [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "b95d1bc0-5bee-4787-b4ff-a0908361ae3e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.812 2 DEBUG oslo_concurrency.lockutils [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "b95d1bc0-5bee-4787-b4ff-a0908361ae3e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.813 2 DEBUG nova.compute.manager [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Preparing to wait for external event network-vif-plugged-1ce81fa7-f0cd-41b6-bf59-835593359c9f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.813 2 DEBUG oslo_concurrency.lockutils [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "b95d1bc0-5bee-4787-b4ff-a0908361ae3e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.813 2 DEBUG oslo_concurrency.lockutils [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "b95d1bc0-5bee-4787-b4ff-a0908361ae3e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.813 2 DEBUG oslo_concurrency.lockutils [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "b95d1bc0-5bee-4787-b4ff-a0908361ae3e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.814 2 DEBUG nova.virt.libvirt.vif [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:44:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1273749312',display_name='tempest-TestGettingAddress-server-1273749312',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1273749312',id=146,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCTmN/pNaqmaOiFycAcQSYACgRKuAneZoZAKklPa3C16A/2q/HkgtY0P/DZxd1e1qV4ltimB5etVGHV7nuq0tFOyUUUwlQywxyklzBGPgbQap3YQgrb4ac6Zf8Ctmbv6xQ==',key_name='tempest-TestGettingAddress-2018950633',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-0sr8r1f4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:44:32Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=b95d1bc0-5bee-4787-b4ff-a0908361ae3e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d768580b-cf31-4877-8394-7501915ecf87", "address": "fa:16:3e:b1:eb:aa", "network": {"id": "42bc7c38-1ece-4458-bbe7-6a2159483355", "bridge": "br-int", "label": "tempest-network-smoke--145108959", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd768580b-cf", "ovs_interfaceid": "d768580b-cf31-4877-8394-7501915ecf87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.814 2 DEBUG nova.network.os_vif_util [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "d768580b-cf31-4877-8394-7501915ecf87", "address": "fa:16:3e:b1:eb:aa", "network": {"id": "42bc7c38-1ece-4458-bbe7-6a2159483355", "bridge": "br-int", "label": "tempest-network-smoke--145108959", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd768580b-cf", "ovs_interfaceid": "d768580b-cf31-4877-8394-7501915ecf87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.814 2 DEBUG nova.network.os_vif_util [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b1:eb:aa,bridge_name='br-int',has_traffic_filtering=True,id=d768580b-cf31-4877-8394-7501915ecf87,network=Network(42bc7c38-1ece-4458-bbe7-6a2159483355),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd768580b-cf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.815 2 DEBUG os_vif [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:eb:aa,bridge_name='br-int',has_traffic_filtering=True,id=d768580b-cf31-4877-8394-7501915ecf87,network=Network(42bc7c38-1ece-4458-bbe7-6a2159483355),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd768580b-cf') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.815 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.816 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.819 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.819 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd768580b-cf, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.819 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd768580b-cf, col_values=(('external_ids', {'iface-id': 'd768580b-cf31-4877-8394-7501915ecf87', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b1:eb:aa', 'vm-uuid': 'b95d1bc0-5bee-4787-b4ff-a0908361ae3e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.820 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:43 compute-0 NetworkManager[51733]: <info>  [1759268683.8222] manager: (tapd768580b-cf): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/237)
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.822 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.826 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.827 2 INFO os_vif [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:eb:aa,bridge_name='br-int',has_traffic_filtering=True,id=d768580b-cf31-4877-8394-7501915ecf87,network=Network(42bc7c38-1ece-4458-bbe7-6a2159483355),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd768580b-cf')
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.827 2 DEBUG nova.virt.libvirt.vif [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:44:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1273749312',display_name='tempest-TestGettingAddress-server-1273749312',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1273749312',id=146,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCTmN/pNaqmaOiFycAcQSYACgRKuAneZoZAKklPa3C16A/2q/HkgtY0P/DZxd1e1qV4ltimB5etVGHV7nuq0tFOyUUUwlQywxyklzBGPgbQap3YQgrb4ac6Zf8Ctmbv6xQ==',key_name='tempest-TestGettingAddress-2018950633',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-0sr8r1f4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:44:32Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=b95d1bc0-5bee-4787-b4ff-a0908361ae3e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1ce81fa7-f0cd-41b6-bf59-835593359c9f", "address": "fa:16:3e:f0:5b:98", "network": {"id": "2d27a62d-95fd-4cbb-bc4d-20b0757663cc", "bridge": "br-int", "label": "tempest-network-smoke--589443991", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fef0:5b98", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ce81fa7-f0", "ovs_interfaceid": "1ce81fa7-f0cd-41b6-bf59-835593359c9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.828 2 DEBUG nova.network.os_vif_util [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "1ce81fa7-f0cd-41b6-bf59-835593359c9f", "address": "fa:16:3e:f0:5b:98", "network": {"id": "2d27a62d-95fd-4cbb-bc4d-20b0757663cc", "bridge": "br-int", "label": "tempest-network-smoke--589443991", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fef0:5b98", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ce81fa7-f0", "ovs_interfaceid": "1ce81fa7-f0cd-41b6-bf59-835593359c9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.828 2 DEBUG nova.network.os_vif_util [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f0:5b:98,bridge_name='br-int',has_traffic_filtering=True,id=1ce81fa7-f0cd-41b6-bf59-835593359c9f,network=Network(2d27a62d-95fd-4cbb-bc4d-20b0757663cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1ce81fa7-f0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.828 2 DEBUG os_vif [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f0:5b:98,bridge_name='br-int',has_traffic_filtering=True,id=1ce81fa7-f0cd-41b6-bf59-835593359c9f,network=Network(2d27a62d-95fd-4cbb-bc4d-20b0757663cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1ce81fa7-f0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.829 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.829 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.829 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.830 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.831 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1ce81fa7-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.831 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1ce81fa7-f0, col_values=(('external_ids', {'iface-id': '1ce81fa7-f0cd-41b6-bf59-835593359c9f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f0:5b:98', 'vm-uuid': 'b95d1bc0-5bee-4787-b4ff-a0908361ae3e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:43 compute-0 NetworkManager[51733]: <info>  [1759268683.8332] manager: (tap1ce81fa7-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/238)
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.840 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.840 2 INFO os_vif [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f0:5b:98,bridge_name='br-int',has_traffic_filtering=True,id=1ce81fa7-f0cd-41b6-bf59-835593359c9f,network=Network(2d27a62d-95fd-4cbb-bc4d-20b0757663cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1ce81fa7-f0')
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.911 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b95d1bc0-5bee-4787-b4ff-a0908361ae3e', 'name': 'tempest-TestGettingAddress-server-1273749312', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000092', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'hostId': '4e27c1640aee900167d620f76b94d293e6d0bfe9605518123fccc2d8', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.911 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.912 12 DEBUG ceilometer.compute.pollsters [-] Instance b95d1bc0-5bee-4787-b4ff-a0908361ae3e was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance <name=instance-00000092, id=b95d1bc0-5bee-4787-b4ff-a0908361ae3e>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.912 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.913 12 DEBUG ceilometer.compute.pollsters [-] Instance b95d1bc0-5bee-4787-b4ff-a0908361ae3e was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance <name=instance-00000092, id=b95d1bc0-5bee-4787-b4ff-a0908361ae3e>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.913 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.913 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.913 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1273749312>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1273749312>]
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.913 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.914 12 DEBUG ceilometer.compute.pollsters [-] Instance b95d1bc0-5bee-4787-b4ff-a0908361ae3e was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-00000092, id=b95d1bc0-5bee-4787-b4ff-a0908361ae3e>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.914 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.915 12 DEBUG ceilometer.compute.pollsters [-] Instance b95d1bc0-5bee-4787-b4ff-a0908361ae3e was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-00000092, id=b95d1bc0-5bee-4787-b4ff-a0908361ae3e>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.915 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.915 12 DEBUG ceilometer.compute.pollsters [-] Instance b95d1bc0-5bee-4787-b4ff-a0908361ae3e was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-00000092, id=b95d1bc0-5bee-4787-b4ff-a0908361ae3e>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.916 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.916 12 DEBUG ceilometer.compute.pollsters [-] Instance b95d1bc0-5bee-4787-b4ff-a0908361ae3e was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-00000092, id=b95d1bc0-5bee-4787-b4ff-a0908361ae3e>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.917 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.917 12 DEBUG ceilometer.compute.pollsters [-] Instance b95d1bc0-5bee-4787-b4ff-a0908361ae3e was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-00000092, id=b95d1bc0-5bee-4787-b4ff-a0908361ae3e>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.917 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.918 12 DEBUG ceilometer.compute.pollsters [-] Instance b95d1bc0-5bee-4787-b4ff-a0908361ae3e was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance <name=instance-00000092, id=b95d1bc0-5bee-4787-b4ff-a0908361ae3e>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.918 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.918 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.918 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1273749312>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1273749312>]
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.918 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.919 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.919 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1273749312>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1273749312>]
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.919 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.920 12 DEBUG ceilometer.compute.pollsters [-] Instance b95d1bc0-5bee-4787-b4ff-a0908361ae3e was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-00000092, id=b95d1bc0-5bee-4787-b4ff-a0908361ae3e>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.920 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.920 12 DEBUG ceilometer.compute.pollsters [-] Instance b95d1bc0-5bee-4787-b4ff-a0908361ae3e was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-00000092, id=b95d1bc0-5bee-4787-b4ff-a0908361ae3e>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.921 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.921 12 DEBUG ceilometer.compute.pollsters [-] Instance b95d1bc0-5bee-4787-b4ff-a0908361ae3e was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-00000092, id=b95d1bc0-5bee-4787-b4ff-a0908361ae3e>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.921 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.922 12 DEBUG ceilometer.compute.pollsters [-] Instance b95d1bc0-5bee-4787-b4ff-a0908361ae3e was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-00000092, id=b95d1bc0-5bee-4787-b4ff-a0908361ae3e>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.922 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.923 12 DEBUG ceilometer.compute.pollsters [-] Instance b95d1bc0-5bee-4787-b4ff-a0908361ae3e was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-00000092, id=b95d1bc0-5bee-4787-b4ff-a0908361ae3e>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.923 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.924 12 DEBUG ceilometer.compute.pollsters [-] Instance b95d1bc0-5bee-4787-b4ff-a0908361ae3e was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance <name=instance-00000092, id=b95d1bc0-5bee-4787-b4ff-a0908361ae3e>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.924 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.924 12 DEBUG ceilometer.compute.pollsters [-] Instance b95d1bc0-5bee-4787-b4ff-a0908361ae3e was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-00000092, id=b95d1bc0-5bee-4787-b4ff-a0908361ae3e>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.924 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.925 12 DEBUG ceilometer.compute.pollsters [-] Instance b95d1bc0-5bee-4787-b4ff-a0908361ae3e was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance <name=instance-00000092, id=b95d1bc0-5bee-4787-b4ff-a0908361ae3e>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.925 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.926 12 DEBUG ceilometer.compute.pollsters [-] Instance b95d1bc0-5bee-4787-b4ff-a0908361ae3e was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance <name=instance-00000092, id=b95d1bc0-5bee-4787-b4ff-a0908361ae3e>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.926 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.927 12 DEBUG ceilometer.compute.pollsters [-] Instance b95d1bc0-5bee-4787-b4ff-a0908361ae3e was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-00000092, id=b95d1bc0-5bee-4787-b4ff-a0908361ae3e>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.927 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.927 12 DEBUG ceilometer.compute.pollsters [-] Instance b95d1bc0-5bee-4787-b4ff-a0908361ae3e was shut off while getting sample of network.incoming.packets.drop: Failed to inspect data of instance <name=instance-00000092, id=b95d1bc0-5bee-4787-b4ff-a0908361ae3e>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.928 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.928 12 DEBUG ceilometer.compute.pollsters [-] Instance b95d1bc0-5bee-4787-b4ff-a0908361ae3e was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance <name=instance-00000092, id=b95d1bc0-5bee-4787-b4ff-a0908361ae3e>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.928 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.929 12 DEBUG ceilometer.compute.pollsters [-] Instance b95d1bc0-5bee-4787-b4ff-a0908361ae3e was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-00000092, id=b95d1bc0-5bee-4787-b4ff-a0908361ae3e>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.929 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.929 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:44:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:44:43.929 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1273749312>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1273749312>]
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.936 2 DEBUG nova.virt.libvirt.driver [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.936 2 DEBUG nova.virt.libvirt.driver [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.936 2 DEBUG nova.virt.libvirt.driver [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] No VIF found with MAC fa:16:3e:b1:eb:aa, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.937 2 DEBUG nova.virt.libvirt.driver [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] No VIF found with MAC fa:16:3e:f0:5b:98, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:44:43 compute-0 nova_compute[192810]: 2025-09-30 21:44:43.937 2 INFO nova.virt.libvirt.driver [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Using config drive
Sep 30 21:44:44 compute-0 nova_compute[192810]: 2025-09-30 21:44:44.778 2 INFO nova.virt.libvirt.driver [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Creating config drive at /var/lib/nova/instances/b95d1bc0-5bee-4787-b4ff-a0908361ae3e/disk.config
Sep 30 21:44:44 compute-0 nova_compute[192810]: 2025-09-30 21:44:44.782 2 DEBUG oslo_concurrency.processutils [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b95d1bc0-5bee-4787-b4ff-a0908361ae3e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmfkc284v execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:44:44 compute-0 nova_compute[192810]: 2025-09-30 21:44:44.908 2 DEBUG oslo_concurrency.processutils [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b95d1bc0-5bee-4787-b4ff-a0908361ae3e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmfkc284v" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:44:44 compute-0 NetworkManager[51733]: <info>  [1759268684.9665] manager: (tapd768580b-cf): new Tun device (/org/freedesktop/NetworkManager/Devices/239)
Sep 30 21:44:44 compute-0 kernel: tapd768580b-cf: entered promiscuous mode
Sep 30 21:44:44 compute-0 ovn_controller[94912]: 2025-09-30T21:44:44Z|00541|binding|INFO|Claiming lport d768580b-cf31-4877-8394-7501915ecf87 for this chassis.
Sep 30 21:44:44 compute-0 ovn_controller[94912]: 2025-09-30T21:44:44Z|00542|binding|INFO|d768580b-cf31-4877-8394-7501915ecf87: Claiming fa:16:3e:b1:eb:aa 10.100.0.3
Sep 30 21:44:44 compute-0 nova_compute[192810]: 2025-09-30 21:44:44.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:44 compute-0 NetworkManager[51733]: <info>  [1759268684.9831] manager: (tap1ce81fa7-f0): new Tun device (/org/freedesktop/NetworkManager/Devices/240)
Sep 30 21:44:44 compute-0 kernel: tap1ce81fa7-f0: entered promiscuous mode
Sep 30 21:44:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:44.990 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b1:eb:aa 10.100.0.3'], port_security=['fa:16:3e:b1:eb:aa 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'b95d1bc0-5bee-4787-b4ff-a0908361ae3e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-42bc7c38-1ece-4458-bbe7-6a2159483355', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '07285f81-9b17-4ece-beed-7a792b1d2f3a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e4dea503-319e-441b-ba9c-fea1c678426f, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=d768580b-cf31-4877-8394-7501915ecf87) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:44:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:44.991 103867 INFO neutron.agent.ovn.metadata.agent [-] Port d768580b-cf31-4877-8394-7501915ecf87 in datapath 42bc7c38-1ece-4458-bbe7-6a2159483355 bound to our chassis
Sep 30 21:44:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:44.993 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 42bc7c38-1ece-4458-bbe7-6a2159483355
Sep 30 21:44:45 compute-0 systemd-udevd[242787]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:44:45 compute-0 systemd-udevd[242785]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:45.005 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e0d979f6-070d-40ae-bccd-9662980831c7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:45.006 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap42bc7c38-11 in ovnmeta-42bc7c38-1ece-4458-bbe7-6a2159483355 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:45.008 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap42bc7c38-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:45.008 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[f0c5d95c-c521-4ffb-b0b4-ba1bec14a096]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:45.009 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[008b6839-bfbb-4ecd-a711-1e2723ed1a56]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:45 compute-0 NetworkManager[51733]: <info>  [1759268685.0144] device (tap1ce81fa7-f0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:44:45 compute-0 NetworkManager[51733]: <info>  [1759268685.0150] device (tapd768580b-cf): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:44:45 compute-0 NetworkManager[51733]: <info>  [1759268685.0162] device (tap1ce81fa7-f0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:44:45 compute-0 NetworkManager[51733]: <info>  [1759268685.0165] device (tapd768580b-cf): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:45.022 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[5789ac27-6da1-459b-9c0f-7b386e2ae6a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:45 compute-0 systemd-machined[152794]: New machine qemu-71-instance-00000092.
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:45.047 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[d946a3ab-564e-4940-a566-319b7dbabdaa]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:45 compute-0 ovn_controller[94912]: 2025-09-30T21:44:45Z|00543|binding|INFO|Claiming lport 1ce81fa7-f0cd-41b6-bf59-835593359c9f for this chassis.
Sep 30 21:44:45 compute-0 nova_compute[192810]: 2025-09-30 21:44:45.054 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:45 compute-0 ovn_controller[94912]: 2025-09-30T21:44:45Z|00544|binding|INFO|1ce81fa7-f0cd-41b6-bf59-835593359c9f: Claiming fa:16:3e:f0:5b:98 2001:db8::f816:3eff:fef0:5b98
Sep 30 21:44:45 compute-0 nova_compute[192810]: 2025-09-30 21:44:45.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:45 compute-0 systemd[1]: Started Virtual Machine qemu-71-instance-00000092.
Sep 30 21:44:45 compute-0 ovn_controller[94912]: 2025-09-30T21:44:45Z|00545|binding|INFO|Setting lport d768580b-cf31-4877-8394-7501915ecf87 ovn-installed in OVS
Sep 30 21:44:45 compute-0 nova_compute[192810]: 2025-09-30 21:44:45.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:45 compute-0 ovn_controller[94912]: 2025-09-30T21:44:45Z|00546|binding|INFO|Setting lport 1ce81fa7-f0cd-41b6-bf59-835593359c9f ovn-installed in OVS
Sep 30 21:44:45 compute-0 nova_compute[192810]: 2025-09-30 21:44:45.074 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:45.082 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[d9996e95-9af7-4150-bf62-b113d18e8eb4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:45 compute-0 NetworkManager[51733]: <info>  [1759268685.0893] manager: (tap42bc7c38-10): new Veth device (/org/freedesktop/NetworkManager/Devices/241)
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:45.089 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[59c82dae-a016-4948-862d-eb634d199157]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:45.120 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f0:5b:98 2001:db8::f816:3eff:fef0:5b98'], port_security=['fa:16:3e:f0:5b:98 2001:db8::f816:3eff:fef0:5b98'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fef0:5b98/64', 'neutron:device_id': 'b95d1bc0-5bee-4787-b4ff-a0908361ae3e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2d27a62d-95fd-4cbb-bc4d-20b0757663cc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '07285f81-9b17-4ece-beed-7a792b1d2f3a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=91467be9-be1a-467e-bb76-f834d9445de1, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=1ce81fa7-f0cd-41b6-bf59-835593359c9f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:44:45 compute-0 ovn_controller[94912]: 2025-09-30T21:44:45Z|00547|binding|INFO|Setting lport 1ce81fa7-f0cd-41b6-bf59-835593359c9f up in Southbound
Sep 30 21:44:45 compute-0 ovn_controller[94912]: 2025-09-30T21:44:45Z|00548|binding|INFO|Setting lport d768580b-cf31-4877-8394-7501915ecf87 up in Southbound
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:45.122 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[1b99c86e-5ffd-48cf-bd8d-4a8f21c9e29d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:45.125 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[f7f62cc7-fe56-49d8-acaf-d290ad8a3450]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:45 compute-0 NetworkManager[51733]: <info>  [1759268685.1484] device (tap42bc7c38-10): carrier: link connected
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:45.153 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[6ab7023b-6207-4eec-9e2f-dc40ee4a6951]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:45.169 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[fa23a09f-d2a5-411e-aaa2-b6a474c96a27]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap42bc7c38-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:17:fd:3e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 166], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 538077, 'reachable_time': 41133, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242822, 'error': None, 'target': 'ovnmeta-42bc7c38-1ece-4458-bbe7-6a2159483355', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:45.184 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[07eea375-4900-463c-9302-91da6b8d73e0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe17:fd3e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 538077, 'tstamp': 538077}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 242823, 'error': None, 'target': 'ovnmeta-42bc7c38-1ece-4458-bbe7-6a2159483355', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:45.201 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[2acce270-113f-47d5-9b52-c4002708edf0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap42bc7c38-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:17:fd:3e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 166], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 538077, 'reachable_time': 41133, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 242824, 'error': None, 'target': 'ovnmeta-42bc7c38-1ece-4458-bbe7-6a2159483355', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:45.231 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[69101ead-bebf-4530-a8a9-34e7c7e1498c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:45.283 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[da159ce4-6212-413f-9159-593384257f62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:45.285 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap42bc7c38-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:45.285 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:45.285 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap42bc7c38-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:44:45 compute-0 kernel: tap42bc7c38-10: entered promiscuous mode
Sep 30 21:44:45 compute-0 nova_compute[192810]: 2025-09-30 21:44:45.287 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:45 compute-0 NetworkManager[51733]: <info>  [1759268685.2879] manager: (tap42bc7c38-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/242)
Sep 30 21:44:45 compute-0 nova_compute[192810]: 2025-09-30 21:44:45.289 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:45.290 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap42bc7c38-10, col_values=(('external_ids', {'iface-id': '227142f6-4b16-4f5b-a481-3e42b58e84ee'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:44:45 compute-0 nova_compute[192810]: 2025-09-30 21:44:45.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:45 compute-0 ovn_controller[94912]: 2025-09-30T21:44:45Z|00549|binding|INFO|Releasing lport 227142f6-4b16-4f5b-a481-3e42b58e84ee from this chassis (sb_readonly=0)
Sep 30 21:44:45 compute-0 nova_compute[192810]: 2025-09-30 21:44:45.292 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:45.292 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/42bc7c38-1ece-4458-bbe7-6a2159483355.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/42bc7c38-1ece-4458-bbe7-6a2159483355.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:45.293 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[18e05952-87fb-4ba2-aa18-2a4d241fe5f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:45.294 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-42bc7c38-1ece-4458-bbe7-6a2159483355
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/42bc7c38-1ece-4458-bbe7-6a2159483355.pid.haproxy
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID 42bc7c38-1ece-4458-bbe7-6a2159483355
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:44:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:45.294 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-42bc7c38-1ece-4458-bbe7-6a2159483355', 'env', 'PROCESS_TAG=haproxy-42bc7c38-1ece-4458-bbe7-6a2159483355', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/42bc7c38-1ece-4458-bbe7-6a2159483355.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:44:45 compute-0 nova_compute[192810]: 2025-09-30 21:44:45.303 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:45 compute-0 podman[242864]: 2025-09-30 21:44:45.625317188 +0000 UTC m=+0.033658917 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:44:45 compute-0 nova_compute[192810]: 2025-09-30 21:44:45.885 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268685.885002, b95d1bc0-5bee-4787-b4ff-a0908361ae3e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:44:45 compute-0 nova_compute[192810]: 2025-09-30 21:44:45.886 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] VM Started (Lifecycle Event)
Sep 30 21:44:45 compute-0 podman[242864]: 2025-09-30 21:44:45.915743285 +0000 UTC m=+0.324084994 container create bffc8528a68fc6b4f4e4a96793e45abc9826d8a21a38465df7107de226194e61 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42bc7c38-1ece-4458-bbe7-6a2159483355, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:44:45 compute-0 nova_compute[192810]: 2025-09-30 21:44:45.943 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:44:45 compute-0 nova_compute[192810]: 2025-09-30 21:44:45.948 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268685.8862705, b95d1bc0-5bee-4787-b4ff-a0908361ae3e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:44:45 compute-0 nova_compute[192810]: 2025-09-30 21:44:45.949 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] VM Paused (Lifecycle Event)
Sep 30 21:44:46 compute-0 systemd[1]: Started libpod-conmon-bffc8528a68fc6b4f4e4a96793e45abc9826d8a21a38465df7107de226194e61.scope.
Sep 30 21:44:46 compute-0 nova_compute[192810]: 2025-09-30 21:44:46.077 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:44:46 compute-0 nova_compute[192810]: 2025-09-30 21:44:46.081 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:44:46 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:44:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b25ad471f863014ab41b298be63725593b153ed2def4d4612540a50bddb1fb10/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:44:46 compute-0 nova_compute[192810]: 2025-09-30 21:44:46.208 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:44:46 compute-0 podman[242864]: 2025-09-30 21:44:46.273607898 +0000 UTC m=+0.681949777 container init bffc8528a68fc6b4f4e4a96793e45abc9826d8a21a38465df7107de226194e61 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42bc7c38-1ece-4458-bbe7-6a2159483355, tcib_managed=true, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 21:44:46 compute-0 podman[242864]: 2025-09-30 21:44:46.280255201 +0000 UTC m=+0.688596940 container start bffc8528a68fc6b4f4e4a96793e45abc9826d8a21a38465df7107de226194e61 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42bc7c38-1ece-4458-bbe7-6a2159483355, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0)
Sep 30 21:44:46 compute-0 neutron-haproxy-ovnmeta-42bc7c38-1ece-4458-bbe7-6a2159483355[242879]: [NOTICE]   (242883) : New worker (242885) forked
Sep 30 21:44:46 compute-0 neutron-haproxy-ovnmeta-42bc7c38-1ece-4458-bbe7-6a2159483355[242879]: [NOTICE]   (242883) : Loading success.
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:46.357 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 1ce81fa7-f0cd-41b6-bf59-835593359c9f in datapath 2d27a62d-95fd-4cbb-bc4d-20b0757663cc unbound from our chassis
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:46.359 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2d27a62d-95fd-4cbb-bc4d-20b0757663cc
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:46.370 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e974c94a-5973-40fd-b3a6-dabeb60b9261]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:46.372 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2d27a62d-91 in ovnmeta-2d27a62d-95fd-4cbb-bc4d-20b0757663cc namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:46.373 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2d27a62d-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:46.373 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[bdef9929-bf8f-4450-81d1-7dfc6cdbeadd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:46.374 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[2ed46704-c50d-4295-b002-4860aee7202d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:46.385 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[b227c2fb-a406-41f2-8a30-d2bd22fe683d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:46.409 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e93605a9-1bf6-4702-bf16-cd3b7bfe7c5b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:46.452 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[14833761-9f86-46fe-859d-c6e60cab0b63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:46.457 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[9a9769cd-6a66-4278-ac9a-87924451f7e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:46 compute-0 systemd-udevd[242810]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:44:46 compute-0 NetworkManager[51733]: <info>  [1759268686.4598] manager: (tap2d27a62d-90): new Veth device (/org/freedesktop/NetworkManager/Devices/243)
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:46.488 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[f40e8027-c1a0-4e5a-84c1-e9e36652abdc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:46.492 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[7b6c449d-f710-48e2-ba28-4bcb55790e43]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:46 compute-0 NetworkManager[51733]: <info>  [1759268686.5276] device (tap2d27a62d-90): carrier: link connected
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:46.537 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[41516e50-13bb-4ad7-b0bc-0a08496bc370]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:46.555 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[3c8dbe37-0989-46c4-90e9-e4494dacbe7d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2d27a62d-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b2:76:26'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 167], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 538215, 'reachable_time': 44005, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242904, 'error': None, 'target': 'ovnmeta-2d27a62d-95fd-4cbb-bc4d-20b0757663cc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:46.571 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[abd20826-4b97-4a05-bcee-51043db46d61]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb2:7626'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 538215, 'tstamp': 538215}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 242905, 'error': None, 'target': 'ovnmeta-2d27a62d-95fd-4cbb-bc4d-20b0757663cc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:46.587 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[fa9b70cf-8617-49f7-8a67-abdf1d9c1d93]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2d27a62d-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b2:76:26'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 167], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 538215, 'reachable_time': 44005, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 242906, 'error': None, 'target': 'ovnmeta-2d27a62d-95fd-4cbb-bc4d-20b0757663cc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:46.624 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[0cd12835-be03-4e6d-ad92-e72d9d7763ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:46.672 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[ba9c9844-e604-49cd-afb5-275a8607e3ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:46.674 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2d27a62d-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:46.674 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:46.675 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2d27a62d-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:44:46 compute-0 nova_compute[192810]: 2025-09-30 21:44:46.708 2 DEBUG nova.compute.manager [req-db22ad33-a1f5-4294-a96c-cf7ac9a6fa3d req-3418da47-5b50-4c14-b97d-24eaad0b24f5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Received event network-vif-plugged-1ce81fa7-f0cd-41b6-bf59-835593359c9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:44:46 compute-0 nova_compute[192810]: 2025-09-30 21:44:46.709 2 DEBUG oslo_concurrency.lockutils [req-db22ad33-a1f5-4294-a96c-cf7ac9a6fa3d req-3418da47-5b50-4c14-b97d-24eaad0b24f5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "b95d1bc0-5bee-4787-b4ff-a0908361ae3e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:44:46 compute-0 nova_compute[192810]: 2025-09-30 21:44:46.709 2 DEBUG oslo_concurrency.lockutils [req-db22ad33-a1f5-4294-a96c-cf7ac9a6fa3d req-3418da47-5b50-4c14-b97d-24eaad0b24f5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b95d1bc0-5bee-4787-b4ff-a0908361ae3e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:44:46 compute-0 nova_compute[192810]: 2025-09-30 21:44:46.709 2 DEBUG oslo_concurrency.lockutils [req-db22ad33-a1f5-4294-a96c-cf7ac9a6fa3d req-3418da47-5b50-4c14-b97d-24eaad0b24f5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b95d1bc0-5bee-4787-b4ff-a0908361ae3e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:44:46 compute-0 nova_compute[192810]: 2025-09-30 21:44:46.710 2 DEBUG nova.compute.manager [req-db22ad33-a1f5-4294-a96c-cf7ac9a6fa3d req-3418da47-5b50-4c14-b97d-24eaad0b24f5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Processing event network-vif-plugged-1ce81fa7-f0cd-41b6-bf59-835593359c9f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:44:46 compute-0 nova_compute[192810]: 2025-09-30 21:44:46.713 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:46 compute-0 NetworkManager[51733]: <info>  [1759268686.7145] manager: (tap2d27a62d-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/244)
Sep 30 21:44:46 compute-0 kernel: tap2d27a62d-90: entered promiscuous mode
Sep 30 21:44:46 compute-0 nova_compute[192810]: 2025-09-30 21:44:46.717 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:46.718 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2d27a62d-90, col_values=(('external_ids', {'iface-id': 'dd6bfcb5-ae09-4e02-b34a-7b6921b33bb7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:44:46 compute-0 ovn_controller[94912]: 2025-09-30T21:44:46Z|00550|binding|INFO|Releasing lport dd6bfcb5-ae09-4e02-b34a-7b6921b33bb7 from this chassis (sb_readonly=0)
Sep 30 21:44:46 compute-0 nova_compute[192810]: 2025-09-30 21:44:46.719 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:46 compute-0 nova_compute[192810]: 2025-09-30 21:44:46.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:46.720 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2d27a62d-95fd-4cbb-bc4d-20b0757663cc.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2d27a62d-95fd-4cbb-bc4d-20b0757663cc.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:46.721 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[16a1e91d-e2c0-4a42-9c22-737acf967a41]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:46.721 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-2d27a62d-95fd-4cbb-bc4d-20b0757663cc
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/2d27a62d-95fd-4cbb-bc4d-20b0757663cc.pid.haproxy
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID 2d27a62d-95fd-4cbb-bc4d-20b0757663cc
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:46.722 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2d27a62d-95fd-4cbb-bc4d-20b0757663cc', 'env', 'PROCESS_TAG=haproxy-2d27a62d-95fd-4cbb-bc4d-20b0757663cc', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2d27a62d-95fd-4cbb-bc4d-20b0757663cc.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:44:46 compute-0 nova_compute[192810]: 2025-09-30 21:44:46.731 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:46.860 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=34, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=33) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:44:46 compute-0 nova_compute[192810]: 2025-09-30 21:44:46.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:46 compute-0 nova_compute[192810]: 2025-09-30 21:44:46.965 2 DEBUG nova.compute.manager [req-010da8d6-5ace-47a4-ab3a-ea0ea1af6cae req-e3873953-36f6-402e-af58-ff9b0a946d36 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Received event network-vif-plugged-d768580b-cf31-4877-8394-7501915ecf87 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:44:46 compute-0 nova_compute[192810]: 2025-09-30 21:44:46.965 2 DEBUG oslo_concurrency.lockutils [req-010da8d6-5ace-47a4-ab3a-ea0ea1af6cae req-e3873953-36f6-402e-af58-ff9b0a946d36 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "b95d1bc0-5bee-4787-b4ff-a0908361ae3e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:44:46 compute-0 nova_compute[192810]: 2025-09-30 21:44:46.965 2 DEBUG oslo_concurrency.lockutils [req-010da8d6-5ace-47a4-ab3a-ea0ea1af6cae req-e3873953-36f6-402e-af58-ff9b0a946d36 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b95d1bc0-5bee-4787-b4ff-a0908361ae3e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:44:46 compute-0 nova_compute[192810]: 2025-09-30 21:44:46.965 2 DEBUG oslo_concurrency.lockutils [req-010da8d6-5ace-47a4-ab3a-ea0ea1af6cae req-e3873953-36f6-402e-af58-ff9b0a946d36 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b95d1bc0-5bee-4787-b4ff-a0908361ae3e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:44:46 compute-0 nova_compute[192810]: 2025-09-30 21:44:46.966 2 DEBUG nova.compute.manager [req-010da8d6-5ace-47a4-ab3a-ea0ea1af6cae req-e3873953-36f6-402e-af58-ff9b0a946d36 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Processing event network-vif-plugged-d768580b-cf31-4877-8394-7501915ecf87 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:44:46 compute-0 nova_compute[192810]: 2025-09-30 21:44:46.966 2 DEBUG nova.compute.manager [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Instance event wait completed in 1 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:44:46 compute-0 nova_compute[192810]: 2025-09-30 21:44:46.969 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268686.9696891, b95d1bc0-5bee-4787-b4ff-a0908361ae3e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:44:46 compute-0 nova_compute[192810]: 2025-09-30 21:44:46.970 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] VM Resumed (Lifecycle Event)
Sep 30 21:44:46 compute-0 nova_compute[192810]: 2025-09-30 21:44:46.971 2 DEBUG nova.virt.libvirt.driver [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:44:46 compute-0 nova_compute[192810]: 2025-09-30 21:44:46.975 2 INFO nova.virt.libvirt.driver [-] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Instance spawned successfully.
Sep 30 21:44:46 compute-0 nova_compute[192810]: 2025-09-30 21:44:46.975 2 DEBUG nova.virt.libvirt.driver [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:44:47 compute-0 nova_compute[192810]: 2025-09-30 21:44:47.038 2 DEBUG nova.network.neutron [req-759c416a-be37-41fd-907c-babad949c25c req-630b7e70-92ac-425b-9778-178cc2fa0b7e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Updated VIF entry in instance network info cache for port 1ce81fa7-f0cd-41b6-bf59-835593359c9f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:44:47 compute-0 nova_compute[192810]: 2025-09-30 21:44:47.039 2 DEBUG nova.network.neutron [req-759c416a-be37-41fd-907c-babad949c25c req-630b7e70-92ac-425b-9778-178cc2fa0b7e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Updating instance_info_cache with network_info: [{"id": "d768580b-cf31-4877-8394-7501915ecf87", "address": "fa:16:3e:b1:eb:aa", "network": {"id": "42bc7c38-1ece-4458-bbe7-6a2159483355", "bridge": "br-int", "label": "tempest-network-smoke--145108959", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd768580b-cf", "ovs_interfaceid": "d768580b-cf31-4877-8394-7501915ecf87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "1ce81fa7-f0cd-41b6-bf59-835593359c9f", "address": "fa:16:3e:f0:5b:98", "network": {"id": "2d27a62d-95fd-4cbb-bc4d-20b0757663cc", "bridge": "br-int", "label": "tempest-network-smoke--589443991", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fef0:5b98", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ce81fa7-f0", "ovs_interfaceid": "1ce81fa7-f0cd-41b6-bf59-835593359c9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:44:47 compute-0 nova_compute[192810]: 2025-09-30 21:44:47.064 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:44:47 compute-0 nova_compute[192810]: 2025-09-30 21:44:47.070 2 DEBUG nova.virt.libvirt.driver [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:44:47 compute-0 nova_compute[192810]: 2025-09-30 21:44:47.070 2 DEBUG nova.virt.libvirt.driver [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:44:47 compute-0 nova_compute[192810]: 2025-09-30 21:44:47.070 2 DEBUG nova.virt.libvirt.driver [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:44:47 compute-0 nova_compute[192810]: 2025-09-30 21:44:47.071 2 DEBUG nova.virt.libvirt.driver [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:44:47 compute-0 nova_compute[192810]: 2025-09-30 21:44:47.071 2 DEBUG nova.virt.libvirt.driver [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:44:47 compute-0 nova_compute[192810]: 2025-09-30 21:44:47.072 2 DEBUG nova.virt.libvirt.driver [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:44:47 compute-0 nova_compute[192810]: 2025-09-30 21:44:47.075 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:44:47 compute-0 podman[242935]: 2025-09-30 21:44:47.127083542 +0000 UTC m=+0.071957657 container create ac8139dd8a88808f21eb7d021cb4f693f42d63842c7ae996f08449bbb982fec8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2d27a62d-95fd-4cbb-bc4d-20b0757663cc, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 21:44:47 compute-0 systemd[1]: Started libpod-conmon-ac8139dd8a88808f21eb7d021cb4f693f42d63842c7ae996f08449bbb982fec8.scope.
Sep 30 21:44:47 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:44:47 compute-0 podman[242935]: 2025-09-30 21:44:47.095672032 +0000 UTC m=+0.040546167 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:44:47 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c3e0f4c9ae5bf5c1b577e34ebc9b216ee4352009d4746614076d29deb1bbb6bd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:44:47 compute-0 podman[242935]: 2025-09-30 21:44:47.306588708 +0000 UTC m=+0.251462823 container init ac8139dd8a88808f21eb7d021cb4f693f42d63842c7ae996f08449bbb982fec8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2d27a62d-95fd-4cbb-bc4d-20b0757663cc, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, tcib_managed=true)
Sep 30 21:44:47 compute-0 podman[242935]: 2025-09-30 21:44:47.312345389 +0000 UTC m=+0.257219504 container start ac8139dd8a88808f21eb7d021cb4f693f42d63842c7ae996f08449bbb982fec8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2d27a62d-95fd-4cbb-bc4d-20b0757663cc, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Sep 30 21:44:47 compute-0 neutron-haproxy-ovnmeta-2d27a62d-95fd-4cbb-bc4d-20b0757663cc[242950]: [NOTICE]   (242954) : New worker (242956) forked
Sep 30 21:44:47 compute-0 neutron-haproxy-ovnmeta-2d27a62d-95fd-4cbb-bc4d-20b0757663cc[242950]: [NOTICE]   (242954) : Loading success.
Sep 30 21:44:47 compute-0 nova_compute[192810]: 2025-09-30 21:44:47.374 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:44:47 compute-0 nova_compute[192810]: 2025-09-30 21:44:47.376 2 DEBUG oslo_concurrency.lockutils [req-759c416a-be37-41fd-907c-babad949c25c req-630b7e70-92ac-425b-9778-178cc2fa0b7e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-b95d1bc0-5bee-4787-b4ff-a0908361ae3e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:44:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:47.446 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:44:47 compute-0 nova_compute[192810]: 2025-09-30 21:44:47.646 2 INFO nova.compute.manager [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Took 15.22 seconds to spawn the instance on the hypervisor.
Sep 30 21:44:47 compute-0 nova_compute[192810]: 2025-09-30 21:44:47.646 2 DEBUG nova.compute.manager [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:44:47 compute-0 nova_compute[192810]: 2025-09-30 21:44:47.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:48 compute-0 nova_compute[192810]: 2025-09-30 21:44:48.065 2 INFO nova.compute.manager [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Took 16.11 seconds to build instance.
Sep 30 21:44:48 compute-0 nova_compute[192810]: 2025-09-30 21:44:48.185 2 DEBUG oslo_concurrency.lockutils [None req-b79e318a-67e9-451e-949c-4aa71f9bd51b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "b95d1bc0-5bee-4787-b4ff-a0908361ae3e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.290s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:44:48 compute-0 nova_compute[192810]: 2025-09-30 21:44:48.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:48 compute-0 nova_compute[192810]: 2025-09-30 21:44:48.847 2 DEBUG nova.compute.manager [req-5315206d-c4c1-4c44-be69-1b3c0070392f req-b34c4ef1-ff31-42f6-9146-a700934230c5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Received event network-vif-plugged-1ce81fa7-f0cd-41b6-bf59-835593359c9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:44:48 compute-0 nova_compute[192810]: 2025-09-30 21:44:48.847 2 DEBUG oslo_concurrency.lockutils [req-5315206d-c4c1-4c44-be69-1b3c0070392f req-b34c4ef1-ff31-42f6-9146-a700934230c5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "b95d1bc0-5bee-4787-b4ff-a0908361ae3e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:44:48 compute-0 nova_compute[192810]: 2025-09-30 21:44:48.848 2 DEBUG oslo_concurrency.lockutils [req-5315206d-c4c1-4c44-be69-1b3c0070392f req-b34c4ef1-ff31-42f6-9146-a700934230c5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b95d1bc0-5bee-4787-b4ff-a0908361ae3e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:44:48 compute-0 nova_compute[192810]: 2025-09-30 21:44:48.848 2 DEBUG oslo_concurrency.lockutils [req-5315206d-c4c1-4c44-be69-1b3c0070392f req-b34c4ef1-ff31-42f6-9146-a700934230c5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b95d1bc0-5bee-4787-b4ff-a0908361ae3e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:44:48 compute-0 nova_compute[192810]: 2025-09-30 21:44:48.848 2 DEBUG nova.compute.manager [req-5315206d-c4c1-4c44-be69-1b3c0070392f req-b34c4ef1-ff31-42f6-9146-a700934230c5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] No waiting events found dispatching network-vif-plugged-1ce81fa7-f0cd-41b6-bf59-835593359c9f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:44:48 compute-0 nova_compute[192810]: 2025-09-30 21:44:48.848 2 WARNING nova.compute.manager [req-5315206d-c4c1-4c44-be69-1b3c0070392f req-b34c4ef1-ff31-42f6-9146-a700934230c5 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Received unexpected event network-vif-plugged-1ce81fa7-f0cd-41b6-bf59-835593359c9f for instance with vm_state active and task_state None.
Sep 30 21:44:49 compute-0 nova_compute[192810]: 2025-09-30 21:44:49.117 2 DEBUG nova.compute.manager [req-61391d05-da00-4268-8b3b-8ee842b985d7 req-79bf7b3e-aa44-4a72-a69c-4e159996e93f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Received event network-vif-plugged-d768580b-cf31-4877-8394-7501915ecf87 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:44:49 compute-0 nova_compute[192810]: 2025-09-30 21:44:49.118 2 DEBUG oslo_concurrency.lockutils [req-61391d05-da00-4268-8b3b-8ee842b985d7 req-79bf7b3e-aa44-4a72-a69c-4e159996e93f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "b95d1bc0-5bee-4787-b4ff-a0908361ae3e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:44:49 compute-0 nova_compute[192810]: 2025-09-30 21:44:49.118 2 DEBUG oslo_concurrency.lockutils [req-61391d05-da00-4268-8b3b-8ee842b985d7 req-79bf7b3e-aa44-4a72-a69c-4e159996e93f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b95d1bc0-5bee-4787-b4ff-a0908361ae3e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:44:49 compute-0 nova_compute[192810]: 2025-09-30 21:44:49.118 2 DEBUG oslo_concurrency.lockutils [req-61391d05-da00-4268-8b3b-8ee842b985d7 req-79bf7b3e-aa44-4a72-a69c-4e159996e93f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b95d1bc0-5bee-4787-b4ff-a0908361ae3e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:44:49 compute-0 nova_compute[192810]: 2025-09-30 21:44:49.118 2 DEBUG nova.compute.manager [req-61391d05-da00-4268-8b3b-8ee842b985d7 req-79bf7b3e-aa44-4a72-a69c-4e159996e93f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] No waiting events found dispatching network-vif-plugged-d768580b-cf31-4877-8394-7501915ecf87 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:44:49 compute-0 nova_compute[192810]: 2025-09-30 21:44:49.118 2 WARNING nova.compute.manager [req-61391d05-da00-4268-8b3b-8ee842b985d7 req-79bf7b3e-aa44-4a72-a69c-4e159996e93f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Received unexpected event network-vif-plugged-d768580b-cf31-4877-8394-7501915ecf87 for instance with vm_state active and task_state None.
Sep 30 21:44:49 compute-0 podman[242965]: 2025-09-30 21:44:49.307045222 +0000 UTC m=+0.045911248 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:44:49 compute-0 podman[242966]: 2025-09-30 21:44:49.326344695 +0000 UTC m=+0.059649845 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, config_id=edpm, io.buildah.version=1.33.7, managed_by=edpm_ansible, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Sep 30 21:44:50 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:44:50.450 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '34'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:44:52 compute-0 nova_compute[192810]: 2025-09-30 21:44:52.326 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:52 compute-0 NetworkManager[51733]: <info>  [1759268692.3266] manager: (patch-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/245)
Sep 30 21:44:52 compute-0 NetworkManager[51733]: <info>  [1759268692.3275] manager: (patch-br-int-to-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/246)
Sep 30 21:44:52 compute-0 nova_compute[192810]: 2025-09-30 21:44:52.418 2 DEBUG nova.compute.manager [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560
Sep 30 21:44:52 compute-0 nova_compute[192810]: 2025-09-30 21:44:52.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:52 compute-0 ovn_controller[94912]: 2025-09-30T21:44:52Z|00551|binding|INFO|Releasing lport 227142f6-4b16-4f5b-a481-3e42b58e84ee from this chassis (sb_readonly=0)
Sep 30 21:44:52 compute-0 ovn_controller[94912]: 2025-09-30T21:44:52Z|00552|binding|INFO|Releasing lport dd6bfcb5-ae09-4e02-b34a-7b6921b33bb7 from this chassis (sb_readonly=0)
Sep 30 21:44:52 compute-0 nova_compute[192810]: 2025-09-30 21:44:52.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:52 compute-0 nova_compute[192810]: 2025-09-30 21:44:52.532 2 DEBUG oslo_concurrency.lockutils [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:44:52 compute-0 nova_compute[192810]: 2025-09-30 21:44:52.533 2 DEBUG oslo_concurrency.lockutils [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:44:52 compute-0 nova_compute[192810]: 2025-09-30 21:44:52.553 2 DEBUG nova.objects.instance [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Lazy-loading 'pci_requests' on Instance uuid 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:44:52 compute-0 nova_compute[192810]: 2025-09-30 21:44:52.569 2 DEBUG nova.virt.hardware [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:44:52 compute-0 nova_compute[192810]: 2025-09-30 21:44:52.570 2 INFO nova.compute.claims [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:44:52 compute-0 nova_compute[192810]: 2025-09-30 21:44:52.570 2 DEBUG nova.objects.instance [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Lazy-loading 'resources' on Instance uuid 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:44:52 compute-0 nova_compute[192810]: 2025-09-30 21:44:52.586 2 DEBUG nova.objects.instance [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Lazy-loading 'numa_topology' on Instance uuid 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:44:52 compute-0 nova_compute[192810]: 2025-09-30 21:44:52.604 2 DEBUG nova.objects.instance [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:44:52 compute-0 nova_compute[192810]: 2025-09-30 21:44:52.637 2 DEBUG nova.compute.manager [req-658b0f0a-aa88-49aa-af1b-1b010dd8a9e3 req-e0906dc2-47de-4782-9cf8-0ac7df361eea dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Received event network-changed-d768580b-cf31-4877-8394-7501915ecf87 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:44:52 compute-0 nova_compute[192810]: 2025-09-30 21:44:52.637 2 DEBUG nova.compute.manager [req-658b0f0a-aa88-49aa-af1b-1b010dd8a9e3 req-e0906dc2-47de-4782-9cf8-0ac7df361eea dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Refreshing instance network info cache due to event network-changed-d768580b-cf31-4877-8394-7501915ecf87. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:44:52 compute-0 nova_compute[192810]: 2025-09-30 21:44:52.638 2 DEBUG oslo_concurrency.lockutils [req-658b0f0a-aa88-49aa-af1b-1b010dd8a9e3 req-e0906dc2-47de-4782-9cf8-0ac7df361eea dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-b95d1bc0-5bee-4787-b4ff-a0908361ae3e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:44:52 compute-0 nova_compute[192810]: 2025-09-30 21:44:52.638 2 DEBUG oslo_concurrency.lockutils [req-658b0f0a-aa88-49aa-af1b-1b010dd8a9e3 req-e0906dc2-47de-4782-9cf8-0ac7df361eea dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-b95d1bc0-5bee-4787-b4ff-a0908361ae3e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:44:52 compute-0 nova_compute[192810]: 2025-09-30 21:44:52.638 2 DEBUG nova.network.neutron [req-658b0f0a-aa88-49aa-af1b-1b010dd8a9e3 req-e0906dc2-47de-4782-9cf8-0ac7df361eea dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Refreshing network info cache for port d768580b-cf31-4877-8394-7501915ecf87 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:44:52 compute-0 nova_compute[192810]: 2025-09-30 21:44:52.665 2 INFO nova.compute.resource_tracker [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Updating resource usage from migration db92c3ea-df74-490b-9ee5-b72fdac10a6a
Sep 30 21:44:52 compute-0 nova_compute[192810]: 2025-09-30 21:44:52.665 2 DEBUG nova.compute.resource_tracker [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Starting to track incoming migration db92c3ea-df74-490b-9ee5-b72fdac10a6a with flavor afe5c12d-500a-499b-9438-9e9c37698acc _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Sep 30 21:44:52 compute-0 nova_compute[192810]: 2025-09-30 21:44:52.757 2 DEBUG nova.compute.provider_tree [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:44:52 compute-0 nova_compute[192810]: 2025-09-30 21:44:52.776 2 DEBUG nova.scheduler.client.report [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:44:52 compute-0 nova_compute[192810]: 2025-09-30 21:44:52.814 2 DEBUG oslo_concurrency.lockutils [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.281s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:44:52 compute-0 nova_compute[192810]: 2025-09-30 21:44:52.815 2 INFO nova.compute.manager [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Migrating
Sep 30 21:44:52 compute-0 nova_compute[192810]: 2025-09-30 21:44:52.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:53 compute-0 nova_compute[192810]: 2025-09-30 21:44:53.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:54 compute-0 sshd-session[243011]: Accepted publickey for nova from 192.168.122.101 port 39834 ssh2: ECDSA SHA256:MZb8WjUIxCo1ZPhM/oSWWpmJKsqmELiNET2dwGEt9P4
Sep 30 21:44:54 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Sep 30 21:44:54 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Sep 30 21:44:54 compute-0 systemd-logind[792]: New session 34 of user nova.
Sep 30 21:44:54 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Sep 30 21:44:54 compute-0 nova_compute[192810]: 2025-09-30 21:44:54.524 2 DEBUG nova.network.neutron [req-658b0f0a-aa88-49aa-af1b-1b010dd8a9e3 req-e0906dc2-47de-4782-9cf8-0ac7df361eea dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Updated VIF entry in instance network info cache for port d768580b-cf31-4877-8394-7501915ecf87. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:44:54 compute-0 nova_compute[192810]: 2025-09-30 21:44:54.525 2 DEBUG nova.network.neutron [req-658b0f0a-aa88-49aa-af1b-1b010dd8a9e3 req-e0906dc2-47de-4782-9cf8-0ac7df361eea dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Updating instance_info_cache with network_info: [{"id": "d768580b-cf31-4877-8394-7501915ecf87", "address": "fa:16:3e:b1:eb:aa", "network": {"id": "42bc7c38-1ece-4458-bbe7-6a2159483355", "bridge": "br-int", "label": "tempest-network-smoke--145108959", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd768580b-cf", "ovs_interfaceid": "d768580b-cf31-4877-8394-7501915ecf87", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "1ce81fa7-f0cd-41b6-bf59-835593359c9f", "address": "fa:16:3e:f0:5b:98", "network": {"id": "2d27a62d-95fd-4cbb-bc4d-20b0757663cc", "bridge": "br-int", "label": "tempest-network-smoke--589443991", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fef0:5b98", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ce81fa7-f0", "ovs_interfaceid": "1ce81fa7-f0cd-41b6-bf59-835593359c9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:44:54 compute-0 systemd[1]: Starting User Manager for UID 42436...
Sep 30 21:44:54 compute-0 nova_compute[192810]: 2025-09-30 21:44:54.552 2 DEBUG oslo_concurrency.lockutils [req-658b0f0a-aa88-49aa-af1b-1b010dd8a9e3 req-e0906dc2-47de-4782-9cf8-0ac7df361eea dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-b95d1bc0-5bee-4787-b4ff-a0908361ae3e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:44:54 compute-0 systemd[243015]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:44:54 compute-0 systemd[243015]: Queued start job for default target Main User Target.
Sep 30 21:44:54 compute-0 systemd[243015]: Created slice User Application Slice.
Sep 30 21:44:54 compute-0 systemd[243015]: Started Mark boot as successful after the user session has run 2 minutes.
Sep 30 21:44:54 compute-0 systemd[243015]: Started Daily Cleanup of User's Temporary Directories.
Sep 30 21:44:54 compute-0 systemd[243015]: Reached target Paths.
Sep 30 21:44:54 compute-0 systemd[243015]: Reached target Timers.
Sep 30 21:44:54 compute-0 systemd[243015]: Starting D-Bus User Message Bus Socket...
Sep 30 21:44:54 compute-0 systemd[243015]: Starting Create User's Volatile Files and Directories...
Sep 30 21:44:54 compute-0 systemd[243015]: Listening on D-Bus User Message Bus Socket.
Sep 30 21:44:54 compute-0 systemd[243015]: Reached target Sockets.
Sep 30 21:44:54 compute-0 systemd[243015]: Finished Create User's Volatile Files and Directories.
Sep 30 21:44:54 compute-0 systemd[243015]: Reached target Basic System.
Sep 30 21:44:54 compute-0 systemd[243015]: Reached target Main User Target.
Sep 30 21:44:54 compute-0 systemd[243015]: Startup finished in 164ms.
Sep 30 21:44:54 compute-0 systemd[1]: Started User Manager for UID 42436.
Sep 30 21:44:54 compute-0 systemd[1]: Started Session 34 of User nova.
Sep 30 21:44:54 compute-0 sshd-session[243011]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:44:54 compute-0 nova_compute[192810]: 2025-09-30 21:44:54.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:44:54 compute-0 nova_compute[192810]: 2025-09-30 21:44:54.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Sep 30 21:44:54 compute-0 sshd-session[243031]: Received disconnect from 192.168.122.101 port 39834:11: disconnected by user
Sep 30 21:44:54 compute-0 sshd-session[243031]: Disconnected from user nova 192.168.122.101 port 39834
Sep 30 21:44:54 compute-0 sshd-session[243011]: pam_unix(sshd:session): session closed for user nova
Sep 30 21:44:54 compute-0 systemd[1]: session-34.scope: Deactivated successfully.
Sep 30 21:44:54 compute-0 systemd-logind[792]: Session 34 logged out. Waiting for processes to exit.
Sep 30 21:44:54 compute-0 systemd-logind[792]: Removed session 34.
Sep 30 21:44:54 compute-0 sshd-session[243033]: Accepted publickey for nova from 192.168.122.101 port 39850 ssh2: ECDSA SHA256:MZb8WjUIxCo1ZPhM/oSWWpmJKsqmELiNET2dwGEt9P4
Sep 30 21:44:54 compute-0 systemd-logind[792]: New session 36 of user nova.
Sep 30 21:44:54 compute-0 systemd[1]: Started Session 36 of User nova.
Sep 30 21:44:54 compute-0 sshd-session[243033]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:44:55 compute-0 sshd-session[243036]: Received disconnect from 192.168.122.101 port 39850:11: disconnected by user
Sep 30 21:44:55 compute-0 sshd-session[243036]: Disconnected from user nova 192.168.122.101 port 39850
Sep 30 21:44:55 compute-0 sshd-session[243033]: pam_unix(sshd:session): session closed for user nova
Sep 30 21:44:55 compute-0 systemd[1]: session-36.scope: Deactivated successfully.
Sep 30 21:44:55 compute-0 systemd-logind[792]: Session 36 logged out. Waiting for processes to exit.
Sep 30 21:44:55 compute-0 systemd-logind[792]: Removed session 36.
Sep 30 21:44:56 compute-0 nova_compute[192810]: 2025-09-30 21:44:56.799 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:44:56 compute-0 nova_compute[192810]: 2025-09-30 21:44:56.800 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:44:56 compute-0 nova_compute[192810]: 2025-09-30 21:44:56.800 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Sep 30 21:44:56 compute-0 nova_compute[192810]: 2025-09-30 21:44:56.879 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Sep 30 21:44:57 compute-0 podman[243038]: 2025-09-30 21:44:57.343303839 +0000 UTC m=+0.072907800 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:44:57 compute-0 podman[243039]: 2025-09-30 21:44:57.356365859 +0000 UTC m=+0.085725764 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, managed_by=edpm_ansible)
Sep 30 21:44:57 compute-0 podman[243040]: 2025-09-30 21:44:57.363880264 +0000 UTC m=+0.092167063 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:44:57 compute-0 nova_compute[192810]: 2025-09-30 21:44:57.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:58 compute-0 nova_compute[192810]: 2025-09-30 21:44:58.142 2 DEBUG nova.compute.manager [req-6d5545aa-4df4-475b-96b2-8fcd357be1d3 req-c5025909-7aa5-4a82-9b77-041dad7d1405 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received event network-vif-unplugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:44:58 compute-0 nova_compute[192810]: 2025-09-30 21:44:58.142 2 DEBUG oslo_concurrency.lockutils [req-6d5545aa-4df4-475b-96b2-8fcd357be1d3 req-c5025909-7aa5-4a82-9b77-041dad7d1405 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:44:58 compute-0 nova_compute[192810]: 2025-09-30 21:44:58.142 2 DEBUG oslo_concurrency.lockutils [req-6d5545aa-4df4-475b-96b2-8fcd357be1d3 req-c5025909-7aa5-4a82-9b77-041dad7d1405 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:44:58 compute-0 nova_compute[192810]: 2025-09-30 21:44:58.143 2 DEBUG oslo_concurrency.lockutils [req-6d5545aa-4df4-475b-96b2-8fcd357be1d3 req-c5025909-7aa5-4a82-9b77-041dad7d1405 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:44:58 compute-0 nova_compute[192810]: 2025-09-30 21:44:58.143 2 DEBUG nova.compute.manager [req-6d5545aa-4df4-475b-96b2-8fcd357be1d3 req-c5025909-7aa5-4a82-9b77-041dad7d1405 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] No waiting events found dispatching network-vif-unplugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:44:58 compute-0 nova_compute[192810]: 2025-09-30 21:44:58.143 2 WARNING nova.compute.manager [req-6d5545aa-4df4-475b-96b2-8fcd357be1d3 req-c5025909-7aa5-4a82-9b77-041dad7d1405 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received unexpected event network-vif-unplugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 for instance with vm_state active and task_state resize_migrating.
Sep 30 21:44:58 compute-0 sshd-session[243100]: Accepted publickey for nova from 192.168.122.101 port 41580 ssh2: ECDSA SHA256:MZb8WjUIxCo1ZPhM/oSWWpmJKsqmELiNET2dwGEt9P4
Sep 30 21:44:58 compute-0 systemd-logind[792]: New session 37 of user nova.
Sep 30 21:44:58 compute-0 systemd[1]: Started Session 37 of User nova.
Sep 30 21:44:58 compute-0 sshd-session[243100]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:44:58 compute-0 nova_compute[192810]: 2025-09-30 21:44:58.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:44:58 compute-0 sshd-session[243103]: Received disconnect from 192.168.122.101 port 41580:11: disconnected by user
Sep 30 21:44:58 compute-0 sshd-session[243103]: Disconnected from user nova 192.168.122.101 port 41580
Sep 30 21:44:58 compute-0 sshd-session[243100]: pam_unix(sshd:session): session closed for user nova
Sep 30 21:44:58 compute-0 systemd[1]: session-37.scope: Deactivated successfully.
Sep 30 21:44:58 compute-0 systemd-logind[792]: Session 37 logged out. Waiting for processes to exit.
Sep 30 21:44:58 compute-0 systemd-logind[792]: Removed session 37.
Sep 30 21:44:59 compute-0 sshd-session[243106]: Accepted publickey for nova from 192.168.122.101 port 41590 ssh2: ECDSA SHA256:MZb8WjUIxCo1ZPhM/oSWWpmJKsqmELiNET2dwGEt9P4
Sep 30 21:44:59 compute-0 systemd-logind[792]: New session 38 of user nova.
Sep 30 21:44:59 compute-0 systemd[1]: Started Session 38 of User nova.
Sep 30 21:44:59 compute-0 sshd-session[243106]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:44:59 compute-0 sshd-session[243123]: Received disconnect from 192.168.122.101 port 41590:11: disconnected by user
Sep 30 21:44:59 compute-0 sshd-session[243123]: Disconnected from user nova 192.168.122.101 port 41590
Sep 30 21:44:59 compute-0 sshd-session[243106]: pam_unix(sshd:session): session closed for user nova
Sep 30 21:44:59 compute-0 systemd[1]: session-38.scope: Deactivated successfully.
Sep 30 21:44:59 compute-0 systemd-logind[792]: Session 38 logged out. Waiting for processes to exit.
Sep 30 21:44:59 compute-0 systemd-logind[792]: Removed session 38.
Sep 30 21:44:59 compute-0 sshd-session[243125]: Accepted publickey for nova from 192.168.122.101 port 41600 ssh2: ECDSA SHA256:MZb8WjUIxCo1ZPhM/oSWWpmJKsqmELiNET2dwGEt9P4
Sep 30 21:44:59 compute-0 systemd-logind[792]: New session 39 of user nova.
Sep 30 21:44:59 compute-0 systemd[1]: Started Session 39 of User nova.
Sep 30 21:44:59 compute-0 sshd-session[243125]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:44:59 compute-0 sshd-session[243134]: Received disconnect from 192.168.122.101 port 41600:11: disconnected by user
Sep 30 21:44:59 compute-0 sshd-session[243134]: Disconnected from user nova 192.168.122.101 port 41600
Sep 30 21:44:59 compute-0 sshd-session[243125]: pam_unix(sshd:session): session closed for user nova
Sep 30 21:44:59 compute-0 systemd[1]: session-39.scope: Deactivated successfully.
Sep 30 21:44:59 compute-0 systemd-logind[792]: Session 39 logged out. Waiting for processes to exit.
Sep 30 21:44:59 compute-0 systemd-logind[792]: Removed session 39.
Sep 30 21:44:59 compute-0 nova_compute[192810]: 2025-09-30 21:44:59.867 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:44:59 compute-0 nova_compute[192810]: 2025-09-30 21:44:59.868 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:45:00 compute-0 nova_compute[192810]: 2025-09-30 21:45:00.308 2 DEBUG nova.compute.manager [req-996e3cfc-31a1-46ff-b039-e7f0078da40b req-a1900914-cb0a-40bd-b0d5-eb52928b6f30 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received event network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:45:00 compute-0 nova_compute[192810]: 2025-09-30 21:45:00.309 2 DEBUG oslo_concurrency.lockutils [req-996e3cfc-31a1-46ff-b039-e7f0078da40b req-a1900914-cb0a-40bd-b0d5-eb52928b6f30 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:00 compute-0 nova_compute[192810]: 2025-09-30 21:45:00.309 2 DEBUG oslo_concurrency.lockutils [req-996e3cfc-31a1-46ff-b039-e7f0078da40b req-a1900914-cb0a-40bd-b0d5-eb52928b6f30 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:00 compute-0 nova_compute[192810]: 2025-09-30 21:45:00.309 2 DEBUG oslo_concurrency.lockutils [req-996e3cfc-31a1-46ff-b039-e7f0078da40b req-a1900914-cb0a-40bd-b0d5-eb52928b6f30 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:00 compute-0 nova_compute[192810]: 2025-09-30 21:45:00.309 2 DEBUG nova.compute.manager [req-996e3cfc-31a1-46ff-b039-e7f0078da40b req-a1900914-cb0a-40bd-b0d5-eb52928b6f30 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] No waiting events found dispatching network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:45:00 compute-0 nova_compute[192810]: 2025-09-30 21:45:00.310 2 WARNING nova.compute.manager [req-996e3cfc-31a1-46ff-b039-e7f0078da40b req-a1900914-cb0a-40bd-b0d5-eb52928b6f30 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received unexpected event network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 for instance with vm_state active and task_state resize_migrating.
Sep 30 21:45:00 compute-0 nova_compute[192810]: 2025-09-30 21:45:00.310 2 DEBUG nova.compute.manager [req-996e3cfc-31a1-46ff-b039-e7f0078da40b req-a1900914-cb0a-40bd-b0d5-eb52928b6f30 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received event network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:45:00 compute-0 nova_compute[192810]: 2025-09-30 21:45:00.310 2 DEBUG oslo_concurrency.lockutils [req-996e3cfc-31a1-46ff-b039-e7f0078da40b req-a1900914-cb0a-40bd-b0d5-eb52928b6f30 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:00 compute-0 nova_compute[192810]: 2025-09-30 21:45:00.310 2 DEBUG oslo_concurrency.lockutils [req-996e3cfc-31a1-46ff-b039-e7f0078da40b req-a1900914-cb0a-40bd-b0d5-eb52928b6f30 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:00 compute-0 nova_compute[192810]: 2025-09-30 21:45:00.311 2 DEBUG oslo_concurrency.lockutils [req-996e3cfc-31a1-46ff-b039-e7f0078da40b req-a1900914-cb0a-40bd-b0d5-eb52928b6f30 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:00 compute-0 nova_compute[192810]: 2025-09-30 21:45:00.311 2 DEBUG nova.compute.manager [req-996e3cfc-31a1-46ff-b039-e7f0078da40b req-a1900914-cb0a-40bd-b0d5-eb52928b6f30 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] No waiting events found dispatching network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:45:00 compute-0 nova_compute[192810]: 2025-09-30 21:45:00.311 2 WARNING nova.compute.manager [req-996e3cfc-31a1-46ff-b039-e7f0078da40b req-a1900914-cb0a-40bd-b0d5-eb52928b6f30 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received unexpected event network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 for instance with vm_state active and task_state resize_migrating.
Sep 30 21:45:00 compute-0 ovn_controller[94912]: 2025-09-30T21:45:00Z|00058|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b1:eb:aa 10.100.0.3
Sep 30 21:45:00 compute-0 ovn_controller[94912]: 2025-09-30T21:45:00Z|00059|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b1:eb:aa 10.100.0.3
Sep 30 21:45:00 compute-0 nova_compute[192810]: 2025-09-30 21:45:00.783 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:45:00 compute-0 nova_compute[192810]: 2025-09-30 21:45:00.786 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:45:01 compute-0 nova_compute[192810]: 2025-09-30 21:45:01.527 2 INFO nova.network.neutron [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Updating port c00d1cce-5707-41ba-9ca0-2aeecbf662d8 with attributes {'binding:host_id': 'compute-0.ctlplane.example.com', 'device_owner': 'compute:nova'}
Sep 30 21:45:01 compute-0 nova_compute[192810]: 2025-09-30 21:45:01.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:45:01 compute-0 nova_compute[192810]: 2025-09-30 21:45:01.822 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:01 compute-0 nova_compute[192810]: 2025-09-30 21:45:01.823 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:01 compute-0 nova_compute[192810]: 2025-09-30 21:45:01.823 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:01 compute-0 nova_compute[192810]: 2025-09-30 21:45:01.824 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:45:01 compute-0 nova_compute[192810]: 2025-09-30 21:45:01.906 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b95d1bc0-5bee-4787-b4ff-a0908361ae3e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:45:01 compute-0 nova_compute[192810]: 2025-09-30 21:45:01.963 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b95d1bc0-5bee-4787-b4ff-a0908361ae3e/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:45:01 compute-0 nova_compute[192810]: 2025-09-30 21:45:01.964 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b95d1bc0-5bee-4787-b4ff-a0908361ae3e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:45:02 compute-0 nova_compute[192810]: 2025-09-30 21:45:02.020 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b95d1bc0-5bee-4787-b4ff-a0908361ae3e/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:45:02 compute-0 nova_compute[192810]: 2025-09-30 21:45:02.198 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:45:02 compute-0 nova_compute[192810]: 2025-09-30 21:45:02.199 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5508MB free_disk=73.17616653442383GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:45:02 compute-0 nova_compute[192810]: 2025-09-30 21:45:02.200 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:02 compute-0 nova_compute[192810]: 2025-09-30 21:45:02.200 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:02 compute-0 nova_compute[192810]: 2025-09-30 21:45:02.275 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Applying migration context for instance 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6 as it has an incoming, in-progress migration db92c3ea-df74-490b-9ee5-b72fdac10a6a. Migration status is post-migrating _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:950
Sep 30 21:45:02 compute-0 nova_compute[192810]: 2025-09-30 21:45:02.277 2 INFO nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Updating resource usage from migration db92c3ea-df74-490b-9ee5-b72fdac10a6a
Sep 30 21:45:02 compute-0 nova_compute[192810]: 2025-09-30 21:45:02.342 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Instance b95d1bc0-5bee-4787-b4ff-a0908361ae3e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:45:02 compute-0 nova_compute[192810]: 2025-09-30 21:45:02.343 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Instance 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:45:02 compute-0 nova_compute[192810]: 2025-09-30 21:45:02.343 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:45:02 compute-0 nova_compute[192810]: 2025-09-30 21:45:02.343 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:45:02 compute-0 nova_compute[192810]: 2025-09-30 21:45:02.420 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:45:02 compute-0 nova_compute[192810]: 2025-09-30 21:45:02.436 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:45:02 compute-0 nova_compute[192810]: 2025-09-30 21:45:02.454 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:45:02 compute-0 nova_compute[192810]: 2025-09-30 21:45:02.454 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.254s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:02 compute-0 nova_compute[192810]: 2025-09-30 21:45:02.460 2 DEBUG nova.compute.manager [req-1d0d700f-1c7b-491c-9c71-d4e4f10493bf req-4cae4fd9-02a8-4542-928e-f06ebe1f147f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received event network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:45:02 compute-0 nova_compute[192810]: 2025-09-30 21:45:02.461 2 DEBUG oslo_concurrency.lockutils [req-1d0d700f-1c7b-491c-9c71-d4e4f10493bf req-4cae4fd9-02a8-4542-928e-f06ebe1f147f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:02 compute-0 nova_compute[192810]: 2025-09-30 21:45:02.461 2 DEBUG oslo_concurrency.lockutils [req-1d0d700f-1c7b-491c-9c71-d4e4f10493bf req-4cae4fd9-02a8-4542-928e-f06ebe1f147f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:02 compute-0 nova_compute[192810]: 2025-09-30 21:45:02.462 2 DEBUG oslo_concurrency.lockutils [req-1d0d700f-1c7b-491c-9c71-d4e4f10493bf req-4cae4fd9-02a8-4542-928e-f06ebe1f147f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:02 compute-0 nova_compute[192810]: 2025-09-30 21:45:02.462 2 DEBUG nova.compute.manager [req-1d0d700f-1c7b-491c-9c71-d4e4f10493bf req-4cae4fd9-02a8-4542-928e-f06ebe1f147f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] No waiting events found dispatching network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:45:02 compute-0 nova_compute[192810]: 2025-09-30 21:45:02.462 2 WARNING nova.compute.manager [req-1d0d700f-1c7b-491c-9c71-d4e4f10493bf req-4cae4fd9-02a8-4542-928e-f06ebe1f147f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received unexpected event network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 for instance with vm_state active and task_state resize_migrated.
Sep 30 21:45:02 compute-0 nova_compute[192810]: 2025-09-30 21:45:02.462 2 DEBUG nova.compute.manager [req-1d0d700f-1c7b-491c-9c71-d4e4f10493bf req-4cae4fd9-02a8-4542-928e-f06ebe1f147f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received event network-vif-unplugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:45:02 compute-0 nova_compute[192810]: 2025-09-30 21:45:02.462 2 DEBUG oslo_concurrency.lockutils [req-1d0d700f-1c7b-491c-9c71-d4e4f10493bf req-4cae4fd9-02a8-4542-928e-f06ebe1f147f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:02 compute-0 nova_compute[192810]: 2025-09-30 21:45:02.463 2 DEBUG oslo_concurrency.lockutils [req-1d0d700f-1c7b-491c-9c71-d4e4f10493bf req-4cae4fd9-02a8-4542-928e-f06ebe1f147f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:02 compute-0 nova_compute[192810]: 2025-09-30 21:45:02.463 2 DEBUG oslo_concurrency.lockutils [req-1d0d700f-1c7b-491c-9c71-d4e4f10493bf req-4cae4fd9-02a8-4542-928e-f06ebe1f147f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:02 compute-0 nova_compute[192810]: 2025-09-30 21:45:02.463 2 DEBUG nova.compute.manager [req-1d0d700f-1c7b-491c-9c71-d4e4f10493bf req-4cae4fd9-02a8-4542-928e-f06ebe1f147f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] No waiting events found dispatching network-vif-unplugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:45:02 compute-0 nova_compute[192810]: 2025-09-30 21:45:02.463 2 WARNING nova.compute.manager [req-1d0d700f-1c7b-491c-9c71-d4e4f10493bf req-4cae4fd9-02a8-4542-928e-f06ebe1f147f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received unexpected event network-vif-unplugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 for instance with vm_state active and task_state resize_migrated.
Sep 30 21:45:02 compute-0 nova_compute[192810]: 2025-09-30 21:45:02.463 2 DEBUG nova.compute.manager [req-1d0d700f-1c7b-491c-9c71-d4e4f10493bf req-4cae4fd9-02a8-4542-928e-f06ebe1f147f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received event network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:45:02 compute-0 nova_compute[192810]: 2025-09-30 21:45:02.464 2 DEBUG oslo_concurrency.lockutils [req-1d0d700f-1c7b-491c-9c71-d4e4f10493bf req-4cae4fd9-02a8-4542-928e-f06ebe1f147f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:02 compute-0 nova_compute[192810]: 2025-09-30 21:45:02.464 2 DEBUG oslo_concurrency.lockutils [req-1d0d700f-1c7b-491c-9c71-d4e4f10493bf req-4cae4fd9-02a8-4542-928e-f06ebe1f147f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:02 compute-0 nova_compute[192810]: 2025-09-30 21:45:02.464 2 DEBUG oslo_concurrency.lockutils [req-1d0d700f-1c7b-491c-9c71-d4e4f10493bf req-4cae4fd9-02a8-4542-928e-f06ebe1f147f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:02 compute-0 nova_compute[192810]: 2025-09-30 21:45:02.464 2 DEBUG nova.compute.manager [req-1d0d700f-1c7b-491c-9c71-d4e4f10493bf req-4cae4fd9-02a8-4542-928e-f06ebe1f147f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] No waiting events found dispatching network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:45:02 compute-0 nova_compute[192810]: 2025-09-30 21:45:02.464 2 WARNING nova.compute.manager [req-1d0d700f-1c7b-491c-9c71-d4e4f10493bf req-4cae4fd9-02a8-4542-928e-f06ebe1f147f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received unexpected event network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 for instance with vm_state active and task_state resize_migrated.
Sep 30 21:45:02 compute-0 nova_compute[192810]: 2025-09-30 21:45:02.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:03 compute-0 nova_compute[192810]: 2025-09-30 21:45:03.455 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:45:03 compute-0 nova_compute[192810]: 2025-09-30 21:45:03.456 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:45:03 compute-0 nova_compute[192810]: 2025-09-30 21:45:03.456 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:45:03 compute-0 nova_compute[192810]: 2025-09-30 21:45:03.479 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "refresh_cache-7dbd9807-9843-4455-8bf1-3bd7f4fa37c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:45:03 compute-0 nova_compute[192810]: 2025-09-30 21:45:03.479 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquired lock "refresh_cache-7dbd9807-9843-4455-8bf1-3bd7f4fa37c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:45:03 compute-0 nova_compute[192810]: 2025-09-30 21:45:03.479 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Sep 30 21:45:03 compute-0 nova_compute[192810]: 2025-09-30 21:45:03.480 2 DEBUG nova.objects.instance [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lazy-loading 'info_cache' on Instance uuid 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:45:03 compute-0 nova_compute[192810]: 2025-09-30 21:45:03.668 2 DEBUG oslo_concurrency.lockutils [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Acquiring lock "refresh_cache-7dbd9807-9843-4455-8bf1-3bd7f4fa37c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:45:03 compute-0 nova_compute[192810]: 2025-09-30 21:45:03.838 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:04 compute-0 nova_compute[192810]: 2025-09-30 21:45:04.546 2 DEBUG nova.compute.manager [req-5e92efef-ca55-46e6-a45e-8ddf9ad8b3a5 req-a36a00b4-cf38-4241-ae3c-582e15ada7f9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received event network-changed-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:45:04 compute-0 nova_compute[192810]: 2025-09-30 21:45:04.547 2 DEBUG nova.compute.manager [req-5e92efef-ca55-46e6-a45e-8ddf9ad8b3a5 req-a36a00b4-cf38-4241-ae3c-582e15ada7f9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Refreshing instance network info cache due to event network-changed-c00d1cce-5707-41ba-9ca0-2aeecbf662d8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:45:04 compute-0 nova_compute[192810]: 2025-09-30 21:45:04.547 2 DEBUG oslo_concurrency.lockutils [req-5e92efef-ca55-46e6-a45e-8ddf9ad8b3a5 req-a36a00b4-cf38-4241-ae3c-582e15ada7f9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-7dbd9807-9843-4455-8bf1-3bd7f4fa37c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:45:05 compute-0 nova_compute[192810]: 2025-09-30 21:45:05.584 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Updating instance_info_cache with network_info: [{"id": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "address": "fa:16:3e:e7:65:6e", "network": {"id": "033e9c33-7065-4faf-8a4b-e2705c450c67", "bridge": "br-int", "label": "tempest-network-smoke--2107888420", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc00d1cce-57", "ovs_interfaceid": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:45:05 compute-0 nova_compute[192810]: 2025-09-30 21:45:05.605 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Releasing lock "refresh_cache-7dbd9807-9843-4455-8bf1-3bd7f4fa37c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:45:05 compute-0 nova_compute[192810]: 2025-09-30 21:45:05.605 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Sep 30 21:45:05 compute-0 nova_compute[192810]: 2025-09-30 21:45:05.606 2 DEBUG oslo_concurrency.lockutils [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Acquired lock "refresh_cache-7dbd9807-9843-4455-8bf1-3bd7f4fa37c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:45:05 compute-0 nova_compute[192810]: 2025-09-30 21:45:05.606 2 DEBUG nova.network.neutron [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:45:05 compute-0 nova_compute[192810]: 2025-09-30 21:45:05.608 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:45:05 compute-0 nova_compute[192810]: 2025-09-30 21:45:05.609 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:45:05 compute-0 nova_compute[192810]: 2025-09-30 21:45:05.610 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.521 2 DEBUG nova.network.neutron [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Updating instance_info_cache with network_info: [{"id": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "address": "fa:16:3e:e7:65:6e", "network": {"id": "033e9c33-7065-4faf-8a4b-e2705c450c67", "bridge": "br-int", "label": "tempest-network-smoke--2107888420", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc00d1cce-57", "ovs_interfaceid": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.537 2 DEBUG oslo_concurrency.lockutils [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Releasing lock "refresh_cache-7dbd9807-9843-4455-8bf1-3bd7f4fa37c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.542 2 DEBUG oslo_concurrency.lockutils [req-5e92efef-ca55-46e6-a45e-8ddf9ad8b3a5 req-a36a00b4-cf38-4241-ae3c-582e15ada7f9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-7dbd9807-9843-4455-8bf1-3bd7f4fa37c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.542 2 DEBUG nova.network.neutron [req-5e92efef-ca55-46e6-a45e-8ddf9ad8b3a5 req-a36a00b4-cf38-4241-ae3c-582e15ada7f9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Refreshing network info cache for port c00d1cce-5707-41ba-9ca0-2aeecbf662d8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.654 2 DEBUG nova.virt.libvirt.driver [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.656 2 DEBUG nova.virt.libvirt.driver [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.656 2 INFO nova.virt.libvirt.driver [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Creating image(s)
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.657 2 DEBUG nova.objects.instance [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.670 2 DEBUG oslo_concurrency.processutils [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.731 2 DEBUG oslo_concurrency.processutils [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.732 2 DEBUG nova.virt.disk.api [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Checking if we can resize image /var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.733 2 DEBUG oslo_concurrency.processutils [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.789 2 DEBUG oslo_concurrency.processutils [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.790 2 DEBUG nova.virt.disk.api [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Cannot resize image /var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.805 2 DEBUG nova.virt.libvirt.driver [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.805 2 DEBUG nova.virt.libvirt.driver [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Ensure instance console log exists: /var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.806 2 DEBUG oslo_concurrency.lockutils [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.806 2 DEBUG oslo_concurrency.lockutils [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.806 2 DEBUG oslo_concurrency.lockutils [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.808 2 DEBUG nova.virt.libvirt.driver [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Start _get_guest_xml network_info=[{"id": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "address": "fa:16:3e:e7:65:6e", "network": {"id": "033e9c33-7065-4faf-8a4b-e2705c450c67", "bridge": "br-int", "label": "tempest-network-smoke--2107888420", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--2107888420", "vif_mac": "fa:16:3e:e7:65:6e"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc00d1cce-57", "ovs_interfaceid": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.812 2 WARNING nova.virt.libvirt.driver [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.816 2 DEBUG nova.virt.libvirt.host [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.817 2 DEBUG nova.virt.libvirt.host [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.823 2 DEBUG nova.virt.libvirt.host [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.823 2 DEBUG nova.virt.libvirt.host [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.824 2 DEBUG nova.virt.libvirt.driver [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.825 2 DEBUG nova.virt.hardware [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.825 2 DEBUG nova.virt.hardware [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.825 2 DEBUG nova.virt.hardware [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.826 2 DEBUG nova.virt.hardware [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.826 2 DEBUG nova.virt.hardware [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.826 2 DEBUG nova.virt.hardware [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.826 2 DEBUG nova.virt.hardware [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.826 2 DEBUG nova.virt.hardware [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.827 2 DEBUG nova.virt.hardware [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.827 2 DEBUG nova.virt.hardware [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.827 2 DEBUG nova.virt.hardware [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.827 2 DEBUG nova.objects.instance [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.842 2 DEBUG oslo_concurrency.processutils [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.896 2 DEBUG oslo_concurrency.processutils [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk.config --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.897 2 DEBUG oslo_concurrency.lockutils [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Acquiring lock "/var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.898 2 DEBUG oslo_concurrency.lockutils [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Lock "/var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.898 2 DEBUG oslo_concurrency.lockutils [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Lock "/var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.899 2 DEBUG nova.virt.libvirt.vif [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:44:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1387484208',display_name='tempest-TestNetworkAdvancedServerOps-server-1387484208',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1387484208',id=145,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGR5X4mrP2QSdtYc3f8JfOkFpP2VM6oi0AFFxhCennHtpd7iMpKCrxhgpbTxKW+iQFQ9j7tfzEdahfdLg9KKVDLd5BB9lwZMhpRJKZ+A5gEfm5/7y6uIYpwFPAd9COVo+g==',key_name='tempest-TestNetworkAdvancedServerOps-302836404',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:44:29Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='075b1efc4c8e4cb1b28d61b042c451e9',ramdisk_id='',reservation_id='r-be0b95e6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-374190229',owner_user_name='tempest-TestNetworkAdvancedServerOps-374190229-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:45:00Z,user_data=None,user_id='185cc8ad7e1445d2ab5006153ab19700',uuid=7dbd9807-9843-4455-8bf1-3bd7f4fa37c6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "address": "fa:16:3e:e7:65:6e", "network": {"id": "033e9c33-7065-4faf-8a4b-e2705c450c67", "bridge": "br-int", "label": "tempest-network-smoke--2107888420", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--2107888420", "vif_mac": "fa:16:3e:e7:65:6e"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc00d1cce-57", "ovs_interfaceid": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.900 2 DEBUG nova.network.os_vif_util [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Converting VIF {"id": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "address": "fa:16:3e:e7:65:6e", "network": {"id": "033e9c33-7065-4faf-8a4b-e2705c450c67", "bridge": "br-int", "label": "tempest-network-smoke--2107888420", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--2107888420", "vif_mac": "fa:16:3e:e7:65:6e"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc00d1cce-57", "ovs_interfaceid": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.901 2 DEBUG nova.network.os_vif_util [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e7:65:6e,bridge_name='br-int',has_traffic_filtering=True,id=c00d1cce-5707-41ba-9ca0-2aeecbf662d8,network=Network(033e9c33-7065-4faf-8a4b-e2705c450c67),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc00d1cce-57') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.903 2 DEBUG nova.virt.libvirt.driver [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:45:07 compute-0 nova_compute[192810]:   <uuid>7dbd9807-9843-4455-8bf1-3bd7f4fa37c6</uuid>
Sep 30 21:45:07 compute-0 nova_compute[192810]:   <name>instance-00000091</name>
Sep 30 21:45:07 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:45:07 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:45:07 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:45:07 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-1387484208</nova:name>
Sep 30 21:45:07 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:45:07</nova:creationTime>
Sep 30 21:45:07 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:45:07 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:45:07 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:45:07 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:45:07 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:45:07 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:45:07 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:45:07 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:45:07 compute-0 nova_compute[192810]:         <nova:user uuid="185cc8ad7e1445d2ab5006153ab19700">tempest-TestNetworkAdvancedServerOps-374190229-project-member</nova:user>
Sep 30 21:45:07 compute-0 nova_compute[192810]:         <nova:project uuid="075b1efc4c8e4cb1b28d61b042c451e9">tempest-TestNetworkAdvancedServerOps-374190229</nova:project>
Sep 30 21:45:07 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:45:07 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:45:07 compute-0 nova_compute[192810]:         <nova:port uuid="c00d1cce-5707-41ba-9ca0-2aeecbf662d8">
Sep 30 21:45:07 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:45:07 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:45:07 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:45:07 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:45:07 compute-0 nova_compute[192810]:     <system>
Sep 30 21:45:07 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:45:07 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:45:07 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:45:07 compute-0 nova_compute[192810]:       <entry name="serial">7dbd9807-9843-4455-8bf1-3bd7f4fa37c6</entry>
Sep 30 21:45:07 compute-0 nova_compute[192810]:       <entry name="uuid">7dbd9807-9843-4455-8bf1-3bd7f4fa37c6</entry>
Sep 30 21:45:07 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     </system>
Sep 30 21:45:07 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:45:07 compute-0 nova_compute[192810]:   <os>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:   </os>
Sep 30 21:45:07 compute-0 nova_compute[192810]:   <features>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:   </features>
Sep 30 21:45:07 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:45:07 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:45:07 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:45:07 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:45:07 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:45:07 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:45:07 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/disk.config"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:45:07 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:e7:65:6e"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:       <target dev="tapc00d1cce-57"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:45:07 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/console.log" append="off"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     <video>
Sep 30 21:45:07 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     </video>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:45:07 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:45:07 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:45:07 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:45:07 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:45:07 compute-0 nova_compute[192810]: </domain>
Sep 30 21:45:07 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.904 2 DEBUG nova.virt.libvirt.vif [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:44:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1387484208',display_name='tempest-TestNetworkAdvancedServerOps-server-1387484208',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1387484208',id=145,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGR5X4mrP2QSdtYc3f8JfOkFpP2VM6oi0AFFxhCennHtpd7iMpKCrxhgpbTxKW+iQFQ9j7tfzEdahfdLg9KKVDLd5BB9lwZMhpRJKZ+A5gEfm5/7y6uIYpwFPAd9COVo+g==',key_name='tempest-TestNetworkAdvancedServerOps-302836404',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:44:29Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='075b1efc4c8e4cb1b28d61b042c451e9',ramdisk_id='',reservation_id='r-be0b95e6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-374190229',owner_user_name='tempest-TestNetworkAdvancedServerOps-374190229-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:45:00Z,user_data=None,user_id='185cc8ad7e1445d2ab5006153ab19700',uuid=7dbd9807-9843-4455-8bf1-3bd7f4fa37c6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "address": "fa:16:3e:e7:65:6e", "network": {"id": "033e9c33-7065-4faf-8a4b-e2705c450c67", "bridge": "br-int", "label": "tempest-network-smoke--2107888420", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--2107888420", "vif_mac": "fa:16:3e:e7:65:6e"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc00d1cce-57", "ovs_interfaceid": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.904 2 DEBUG nova.network.os_vif_util [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Converting VIF {"id": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "address": "fa:16:3e:e7:65:6e", "network": {"id": "033e9c33-7065-4faf-8a4b-e2705c450c67", "bridge": "br-int", "label": "tempest-network-smoke--2107888420", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--2107888420", "vif_mac": "fa:16:3e:e7:65:6e"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc00d1cce-57", "ovs_interfaceid": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.905 2 DEBUG nova.network.os_vif_util [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e7:65:6e,bridge_name='br-int',has_traffic_filtering=True,id=c00d1cce-5707-41ba-9ca0-2aeecbf662d8,network=Network(033e9c33-7065-4faf-8a4b-e2705c450c67),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc00d1cce-57') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.905 2 DEBUG os_vif [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e7:65:6e,bridge_name='br-int',has_traffic_filtering=True,id=c00d1cce-5707-41ba-9ca0-2aeecbf662d8,network=Network(033e9c33-7065-4faf-8a4b-e2705c450c67),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc00d1cce-57') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.908 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.908 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.909 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.912 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.912 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc00d1cce-57, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.913 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc00d1cce-57, col_values=(('external_ids', {'iface-id': 'c00d1cce-5707-41ba-9ca0-2aeecbf662d8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e7:65:6e', 'vm-uuid': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:07 compute-0 NetworkManager[51733]: <info>  [1759268707.9149] manager: (tapc00d1cce-57): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/247)
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.920 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.921 2 INFO os_vif [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e7:65:6e,bridge_name='br-int',has_traffic_filtering=True,id=c00d1cce-5707-41ba-9ca0-2aeecbf662d8,network=Network(033e9c33-7065-4faf-8a4b-e2705c450c67),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc00d1cce-57')
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.988 2 DEBUG nova.virt.libvirt.driver [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.989 2 DEBUG nova.virt.libvirt.driver [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.989 2 DEBUG nova.virt.libvirt.driver [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] No VIF found with MAC fa:16:3e:e7:65:6e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:45:07 compute-0 nova_compute[192810]: 2025-09-30 21:45:07.989 2 INFO nova.virt.libvirt.driver [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Using config drive
Sep 30 21:45:08 compute-0 kernel: tapc00d1cce-57: entered promiscuous mode
Sep 30 21:45:08 compute-0 NetworkManager[51733]: <info>  [1759268708.0466] manager: (tapc00d1cce-57): new Tun device (/org/freedesktop/NetworkManager/Devices/248)
Sep 30 21:45:08 compute-0 nova_compute[192810]: 2025-09-30 21:45:08.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:08 compute-0 ovn_controller[94912]: 2025-09-30T21:45:08Z|00553|binding|INFO|Claiming lport c00d1cce-5707-41ba-9ca0-2aeecbf662d8 for this chassis.
Sep 30 21:45:08 compute-0 ovn_controller[94912]: 2025-09-30T21:45:08Z|00554|binding|INFO|c00d1cce-5707-41ba-9ca0-2aeecbf662d8: Claiming fa:16:3e:e7:65:6e 10.100.0.14
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:08.061 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e7:65:6e 10.100.0.14'], port_security=['fa:16:3e:e7:65:6e 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-033e9c33-7065-4faf-8a4b-e2705c450c67', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'neutron:revision_number': '8', 'neutron:security_group_ids': '01fa266f-cdeb-485b-8be3-7c739973b0cb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.184'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7d7aa739-6d0b-4488-b89a-1cdd190f3109, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=c00d1cce-5707-41ba-9ca0-2aeecbf662d8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:45:08 compute-0 ovn_controller[94912]: 2025-09-30T21:45:08Z|00555|binding|INFO|Setting lport c00d1cce-5707-41ba-9ca0-2aeecbf662d8 ovn-installed in OVS
Sep 30 21:45:08 compute-0 ovn_controller[94912]: 2025-09-30T21:45:08Z|00556|binding|INFO|Setting lport c00d1cce-5707-41ba-9ca0-2aeecbf662d8 up in Southbound
Sep 30 21:45:08 compute-0 nova_compute[192810]: 2025-09-30 21:45:08.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:08 compute-0 nova_compute[192810]: 2025-09-30 21:45:08.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:08.063 103867 INFO neutron.agent.ovn.metadata.agent [-] Port c00d1cce-5707-41ba-9ca0-2aeecbf662d8 in datapath 033e9c33-7065-4faf-8a4b-e2705c450c67 bound to our chassis
Sep 30 21:45:08 compute-0 nova_compute[192810]: 2025-09-30 21:45:08.065 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:08.065 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 033e9c33-7065-4faf-8a4b-e2705c450c67
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:08.077 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[bff9fd19-9404-4b48-b45c-ea5373d1bd1e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:08.078 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap033e9c33-71 in ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:08.080 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap033e9c33-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:08.080 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[106f66f4-e9ed-4ec5-9119-a9d084c395c5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:08.081 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[cd487f10-9f31-40ba-9370-772c60244189]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:08 compute-0 systemd-udevd[243168]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:45:08 compute-0 systemd-machined[152794]: New machine qemu-72-instance-00000091.
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:08.093 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[47636c27-02f9-46a7-b42f-02d1473566a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:08 compute-0 NetworkManager[51733]: <info>  [1759268708.1051] device (tapc00d1cce-57): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:45:08 compute-0 NetworkManager[51733]: <info>  [1759268708.1063] device (tapc00d1cce-57): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:45:08 compute-0 systemd[1]: Started Virtual Machine qemu-72-instance-00000091.
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:08.117 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[f7827bd1-e2b6-435e-899c-960a44f46c41]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:08.150 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[44817a94-4741-4994-871e-1c901c54421b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:08 compute-0 NetworkManager[51733]: <info>  [1759268708.1569] manager: (tap033e9c33-70): new Veth device (/org/freedesktop/NetworkManager/Devices/249)
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:08.156 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[7eed3ef5-0294-4d8e-926c-8ed597cdb78b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:08.186 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[42f1ed94-75ae-41c3-a722-1e0cdcc2430e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:08.189 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[3c2fcbb6-2df0-4992-b163-bcfa99ffac77]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:08 compute-0 NetworkManager[51733]: <info>  [1759268708.2119] device (tap033e9c33-70): carrier: link connected
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:08.215 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[214de8da-dbdc-443d-9dae-b65a3b5ab7d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:08.230 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[fd821fc9-1647-49ff-a716-d58faa0ad3aa]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap033e9c33-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6e:16:ae'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 169], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 540383, 'reachable_time': 27287, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243201, 'error': None, 'target': 'ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:08.244 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[74b18132-35e6-419a-9812-7bda1edf5e33]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6e:16ae'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 540383, 'tstamp': 540383}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 243202, 'error': None, 'target': 'ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:08.260 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[b0d5c463-c1a1-463d-a366-f6fc1a0bb6c7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap033e9c33-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6e:16:ae'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 169], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 540383, 'reachable_time': 27287, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 192, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 192, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 243203, 'error': None, 'target': 'ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:08.291 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[340225a8-8f91-42e3-997f-39110e1c7047]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:08.347 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[6f5ae919-540e-45df-9a3d-08da1407bed9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:08.348 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap033e9c33-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:08.348 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:08.348 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap033e9c33-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:45:08 compute-0 NetworkManager[51733]: <info>  [1759268708.3509] manager: (tap033e9c33-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/250)
Sep 30 21:45:08 compute-0 nova_compute[192810]: 2025-09-30 21:45:08.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:08 compute-0 kernel: tap033e9c33-70: entered promiscuous mode
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:08.354 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap033e9c33-70, col_values=(('external_ids', {'iface-id': 'c20141c7-d465-400d-879d-d68081c3646d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:45:08 compute-0 nova_compute[192810]: 2025-09-30 21:45:08.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:08 compute-0 nova_compute[192810]: 2025-09-30 21:45:08.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:08 compute-0 ovn_controller[94912]: 2025-09-30T21:45:08Z|00557|binding|INFO|Releasing lport c20141c7-d465-400d-879d-d68081c3646d from this chassis (sb_readonly=0)
Sep 30 21:45:08 compute-0 nova_compute[192810]: 2025-09-30 21:45:08.366 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:08.368 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/033e9c33-7065-4faf-8a4b-e2705c450c67.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/033e9c33-7065-4faf-8a4b-e2705c450c67.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:08.368 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[6b2d6f0a-31f9-461e-94d1-ec82be6a9823]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:08.369 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-033e9c33-7065-4faf-8a4b-e2705c450c67
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/033e9c33-7065-4faf-8a4b-e2705c450c67.pid.haproxy
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID 033e9c33-7065-4faf-8a4b-e2705c450c67
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:45:08 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:08.369 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67', 'env', 'PROCESS_TAG=haproxy-033e9c33-7065-4faf-8a4b-e2705c450c67', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/033e9c33-7065-4faf-8a4b-e2705c450c67.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:45:08 compute-0 nova_compute[192810]: 2025-09-30 21:45:08.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:45:08 compute-0 podman[243235]: 2025-09-30 21:45:08.719047 +0000 UTC m=+0.024178094 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:45:08 compute-0 nova_compute[192810]: 2025-09-30 21:45:08.932 2 DEBUG nova.compute.manager [req-6fc3ff4d-bec3-45b3-a916-805b1c5d50c9 req-637fe9b0-150c-403f-901d-1a0d324a0b63 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received event network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:45:08 compute-0 nova_compute[192810]: 2025-09-30 21:45:08.933 2 DEBUG oslo_concurrency.lockutils [req-6fc3ff4d-bec3-45b3-a916-805b1c5d50c9 req-637fe9b0-150c-403f-901d-1a0d324a0b63 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:08 compute-0 nova_compute[192810]: 2025-09-30 21:45:08.933 2 DEBUG oslo_concurrency.lockutils [req-6fc3ff4d-bec3-45b3-a916-805b1c5d50c9 req-637fe9b0-150c-403f-901d-1a0d324a0b63 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:08 compute-0 nova_compute[192810]: 2025-09-30 21:45:08.933 2 DEBUG oslo_concurrency.lockutils [req-6fc3ff4d-bec3-45b3-a916-805b1c5d50c9 req-637fe9b0-150c-403f-901d-1a0d324a0b63 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:08 compute-0 nova_compute[192810]: 2025-09-30 21:45:08.934 2 DEBUG nova.compute.manager [req-6fc3ff4d-bec3-45b3-a916-805b1c5d50c9 req-637fe9b0-150c-403f-901d-1a0d324a0b63 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] No waiting events found dispatching network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:45:08 compute-0 nova_compute[192810]: 2025-09-30 21:45:08.934 2 WARNING nova.compute.manager [req-6fc3ff4d-bec3-45b3-a916-805b1c5d50c9 req-637fe9b0-150c-403f-901d-1a0d324a0b63 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received unexpected event network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 for instance with vm_state active and task_state resize_finish.
Sep 30 21:45:09 compute-0 nova_compute[192810]: 2025-09-30 21:45:09.094 2 DEBUG nova.network.neutron [req-5e92efef-ca55-46e6-a45e-8ddf9ad8b3a5 req-a36a00b4-cf38-4241-ae3c-582e15ada7f9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Updated VIF entry in instance network info cache for port c00d1cce-5707-41ba-9ca0-2aeecbf662d8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:45:09 compute-0 nova_compute[192810]: 2025-09-30 21:45:09.095 2 DEBUG nova.network.neutron [req-5e92efef-ca55-46e6-a45e-8ddf9ad8b3a5 req-a36a00b4-cf38-4241-ae3c-582e15ada7f9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Updating instance_info_cache with network_info: [{"id": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "address": "fa:16:3e:e7:65:6e", "network": {"id": "033e9c33-7065-4faf-8a4b-e2705c450c67", "bridge": "br-int", "label": "tempest-network-smoke--2107888420", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc00d1cce-57", "ovs_interfaceid": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:45:09 compute-0 nova_compute[192810]: 2025-09-30 21:45:09.124 2 DEBUG oslo_concurrency.lockutils [req-5e92efef-ca55-46e6-a45e-8ddf9ad8b3a5 req-a36a00b4-cf38-4241-ae3c-582e15ada7f9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-7dbd9807-9843-4455-8bf1-3bd7f4fa37c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:45:09 compute-0 podman[243235]: 2025-09-30 21:45:09.318407589 +0000 UTC m=+0.623538673 container create b0d8e2195cbb58fa674329a44abb99241500457ab0bffb7bb5d3b4d390c8f788 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:45:09 compute-0 systemd[1]: Started libpod-conmon-b0d8e2195cbb58fa674329a44abb99241500457ab0bffb7bb5d3b4d390c8f788.scope.
Sep 30 21:45:09 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Sep 30 21:45:09 compute-0 systemd[243015]: Activating special unit Exit the Session...
Sep 30 21:45:09 compute-0 systemd[243015]: Stopped target Main User Target.
Sep 30 21:45:09 compute-0 systemd[243015]: Stopped target Basic System.
Sep 30 21:45:09 compute-0 systemd[243015]: Stopped target Paths.
Sep 30 21:45:09 compute-0 systemd[243015]: Stopped target Sockets.
Sep 30 21:45:09 compute-0 systemd[243015]: Stopped target Timers.
Sep 30 21:45:09 compute-0 systemd[243015]: Stopped Mark boot as successful after the user session has run 2 minutes.
Sep 30 21:45:09 compute-0 systemd[243015]: Stopped Daily Cleanup of User's Temporary Directories.
Sep 30 21:45:09 compute-0 systemd[243015]: Closed D-Bus User Message Bus Socket.
Sep 30 21:45:09 compute-0 systemd[243015]: Stopped Create User's Volatile Files and Directories.
Sep 30 21:45:09 compute-0 systemd[243015]: Removed slice User Application Slice.
Sep 30 21:45:09 compute-0 systemd[243015]: Reached target Shutdown.
Sep 30 21:45:09 compute-0 systemd[243015]: Finished Exit the Session.
Sep 30 21:45:09 compute-0 systemd[243015]: Reached target Exit the Session.
Sep 30 21:45:09 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Sep 30 21:45:09 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Sep 30 21:45:09 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:45:09 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Sep 30 21:45:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/913f81cd0d98aa9892510c6423ae89b030a4c6416a4cd63316d15804e542175f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:45:09 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Sep 30 21:45:09 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Sep 30 21:45:09 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Sep 30 21:45:09 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Sep 30 21:45:09 compute-0 podman[243235]: 2025-09-30 21:45:09.51525376 +0000 UTC m=+0.820384854 container init b0d8e2195cbb58fa674329a44abb99241500457ab0bffb7bb5d3b4d390c8f788 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Sep 30 21:45:09 compute-0 podman[243235]: 2025-09-30 21:45:09.520692133 +0000 UTC m=+0.825823207 container start b0d8e2195cbb58fa674329a44abb99241500457ab0bffb7bb5d3b4d390c8f788 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Sep 30 21:45:09 compute-0 nova_compute[192810]: 2025-09-30 21:45:09.529 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268709.528896, 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:45:09 compute-0 nova_compute[192810]: 2025-09-30 21:45:09.530 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] VM Resumed (Lifecycle Event)
Sep 30 21:45:09 compute-0 nova_compute[192810]: 2025-09-30 21:45:09.531 2 DEBUG nova.compute.manager [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:45:09 compute-0 nova_compute[192810]: 2025-09-30 21:45:09.534 2 INFO nova.virt.libvirt.driver [-] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Instance running successfully.
Sep 30 21:45:09 compute-0 virtqemud[192233]: argument unsupported: QEMU guest agent is not configured
Sep 30 21:45:09 compute-0 nova_compute[192810]: 2025-09-30 21:45:09.536 2 DEBUG nova.virt.libvirt.guest [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Sep 30 21:45:09 compute-0 nova_compute[192810]: 2025-09-30 21:45:09.536 2 DEBUG nova.virt.libvirt.driver [None req-bea3a0df-6833-4377-a25b-fbdf3785883f 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Sep 30 21:45:09 compute-0 neutron-haproxy-ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67[243257]: [NOTICE]   (243262) : New worker (243264) forked
Sep 30 21:45:09 compute-0 neutron-haproxy-ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67[243257]: [NOTICE]   (243262) : Loading success.
Sep 30 21:45:09 compute-0 nova_compute[192810]: 2025-09-30 21:45:09.550 2 DEBUG oslo_concurrency.lockutils [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "07d2953c-4702-44dc-9c91-410f1654ccec" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:09 compute-0 nova_compute[192810]: 2025-09-30 21:45:09.550 2 DEBUG oslo_concurrency.lockutils [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "07d2953c-4702-44dc-9c91-410f1654ccec" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:09 compute-0 nova_compute[192810]: 2025-09-30 21:45:09.560 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:45:09 compute-0 nova_compute[192810]: 2025-09-30 21:45:09.563 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:45:09 compute-0 nova_compute[192810]: 2025-09-30 21:45:09.571 2 DEBUG nova.compute.manager [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:45:09 compute-0 nova_compute[192810]: 2025-09-30 21:45:09.610 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] During sync_power_state the instance has a pending task (resize_finish). Skip.
Sep 30 21:45:09 compute-0 nova_compute[192810]: 2025-09-30 21:45:09.611 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268709.5312564, 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:45:09 compute-0 nova_compute[192810]: 2025-09-30 21:45:09.611 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] VM Started (Lifecycle Event)
Sep 30 21:45:09 compute-0 nova_compute[192810]: 2025-09-30 21:45:09.650 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:45:09 compute-0 nova_compute[192810]: 2025-09-30 21:45:09.653 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:45:09 compute-0 nova_compute[192810]: 2025-09-30 21:45:09.733 2 DEBUG oslo_concurrency.lockutils [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:09 compute-0 nova_compute[192810]: 2025-09-30 21:45:09.734 2 DEBUG oslo_concurrency.lockutils [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:09 compute-0 nova_compute[192810]: 2025-09-30 21:45:09.742 2 DEBUG nova.virt.hardware [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:45:09 compute-0 nova_compute[192810]: 2025-09-30 21:45:09.742 2 INFO nova.compute.claims [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:45:09 compute-0 nova_compute[192810]: 2025-09-30 21:45:09.887 2 DEBUG nova.compute.provider_tree [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:45:09 compute-0 nova_compute[192810]: 2025-09-30 21:45:09.905 2 DEBUG nova.scheduler.client.report [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:45:09 compute-0 nova_compute[192810]: 2025-09-30 21:45:09.937 2 DEBUG oslo_concurrency.lockutils [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.204s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:09 compute-0 nova_compute[192810]: 2025-09-30 21:45:09.938 2 DEBUG nova.compute.manager [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:45:10 compute-0 nova_compute[192810]: 2025-09-30 21:45:10.029 2 DEBUG nova.compute.manager [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:45:10 compute-0 nova_compute[192810]: 2025-09-30 21:45:10.030 2 DEBUG nova.network.neutron [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:45:10 compute-0 nova_compute[192810]: 2025-09-30 21:45:10.051 2 INFO nova.virt.libvirt.driver [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:45:10 compute-0 nova_compute[192810]: 2025-09-30 21:45:10.069 2 DEBUG nova.compute.manager [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:45:10 compute-0 nova_compute[192810]: 2025-09-30 21:45:10.190 2 DEBUG nova.compute.manager [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:45:10 compute-0 nova_compute[192810]: 2025-09-30 21:45:10.191 2 DEBUG nova.virt.libvirt.driver [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:45:10 compute-0 nova_compute[192810]: 2025-09-30 21:45:10.191 2 INFO nova.virt.libvirt.driver [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Creating image(s)
Sep 30 21:45:10 compute-0 nova_compute[192810]: 2025-09-30 21:45:10.192 2 DEBUG oslo_concurrency.lockutils [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "/var/lib/nova/instances/07d2953c-4702-44dc-9c91-410f1654ccec/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:10 compute-0 nova_compute[192810]: 2025-09-30 21:45:10.192 2 DEBUG oslo_concurrency.lockutils [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "/var/lib/nova/instances/07d2953c-4702-44dc-9c91-410f1654ccec/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:10 compute-0 nova_compute[192810]: 2025-09-30 21:45:10.193 2 DEBUG oslo_concurrency.lockutils [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "/var/lib/nova/instances/07d2953c-4702-44dc-9c91-410f1654ccec/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:10 compute-0 nova_compute[192810]: 2025-09-30 21:45:10.206 2 DEBUG nova.policy [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:45:10 compute-0 nova_compute[192810]: 2025-09-30 21:45:10.208 2 DEBUG oslo_concurrency.processutils [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:45:10 compute-0 nova_compute[192810]: 2025-09-30 21:45:10.279 2 DEBUG oslo_concurrency.processutils [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:45:10 compute-0 nova_compute[192810]: 2025-09-30 21:45:10.281 2 DEBUG oslo_concurrency.lockutils [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:10 compute-0 nova_compute[192810]: 2025-09-30 21:45:10.282 2 DEBUG oslo_concurrency.lockutils [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:10 compute-0 nova_compute[192810]: 2025-09-30 21:45:10.296 2 DEBUG oslo_concurrency.processutils [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:45:10 compute-0 nova_compute[192810]: 2025-09-30 21:45:10.366 2 DEBUG oslo_concurrency.processutils [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:45:10 compute-0 nova_compute[192810]: 2025-09-30 21:45:10.367 2 DEBUG oslo_concurrency.processutils [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/07d2953c-4702-44dc-9c91-410f1654ccec/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:45:10 compute-0 nova_compute[192810]: 2025-09-30 21:45:10.751 2 DEBUG oslo_concurrency.processutils [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/07d2953c-4702-44dc-9c91-410f1654ccec/disk 1073741824" returned: 0 in 0.384s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:45:10 compute-0 nova_compute[192810]: 2025-09-30 21:45:10.752 2 DEBUG oslo_concurrency.lockutils [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.470s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:10 compute-0 nova_compute[192810]: 2025-09-30 21:45:10.752 2 DEBUG oslo_concurrency.processutils [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:45:10 compute-0 nova_compute[192810]: 2025-09-30 21:45:10.821 2 DEBUG oslo_concurrency.processutils [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:45:10 compute-0 nova_compute[192810]: 2025-09-30 21:45:10.822 2 DEBUG nova.virt.disk.api [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Checking if we can resize image /var/lib/nova/instances/07d2953c-4702-44dc-9c91-410f1654ccec/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:45:10 compute-0 nova_compute[192810]: 2025-09-30 21:45:10.822 2 DEBUG oslo_concurrency.processutils [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/07d2953c-4702-44dc-9c91-410f1654ccec/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:45:10 compute-0 nova_compute[192810]: 2025-09-30 21:45:10.890 2 DEBUG oslo_concurrency.processutils [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/07d2953c-4702-44dc-9c91-410f1654ccec/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:45:10 compute-0 nova_compute[192810]: 2025-09-30 21:45:10.891 2 DEBUG nova.virt.disk.api [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Cannot resize image /var/lib/nova/instances/07d2953c-4702-44dc-9c91-410f1654ccec/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:45:10 compute-0 nova_compute[192810]: 2025-09-30 21:45:10.892 2 DEBUG nova.objects.instance [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lazy-loading 'migration_context' on Instance uuid 07d2953c-4702-44dc-9c91-410f1654ccec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:45:10 compute-0 nova_compute[192810]: 2025-09-30 21:45:10.906 2 DEBUG nova.virt.libvirt.driver [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:45:10 compute-0 nova_compute[192810]: 2025-09-30 21:45:10.907 2 DEBUG nova.virt.libvirt.driver [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Ensure instance console log exists: /var/lib/nova/instances/07d2953c-4702-44dc-9c91-410f1654ccec/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:45:10 compute-0 nova_compute[192810]: 2025-09-30 21:45:10.908 2 DEBUG oslo_concurrency.lockutils [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:10 compute-0 nova_compute[192810]: 2025-09-30 21:45:10.908 2 DEBUG oslo_concurrency.lockutils [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:10 compute-0 nova_compute[192810]: 2025-09-30 21:45:10.908 2 DEBUG oslo_concurrency.lockutils [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:10 compute-0 nova_compute[192810]: 2025-09-30 21:45:10.997 2 DEBUG nova.network.neutron [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Port c00d1cce-5707-41ba-9ca0-2aeecbf662d8 binding to destination host compute-0.ctlplane.example.com is already ACTIVE migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3171
Sep 30 21:45:10 compute-0 nova_compute[192810]: 2025-09-30 21:45:10.997 2 DEBUG oslo_concurrency.lockutils [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "refresh_cache-7dbd9807-9843-4455-8bf1-3bd7f4fa37c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:45:10 compute-0 nova_compute[192810]: 2025-09-30 21:45:10.998 2 DEBUG oslo_concurrency.lockutils [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquired lock "refresh_cache-7dbd9807-9843-4455-8bf1-3bd7f4fa37c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:45:10 compute-0 nova_compute[192810]: 2025-09-30 21:45:10.998 2 DEBUG nova.network.neutron [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:45:11 compute-0 nova_compute[192810]: 2025-09-30 21:45:11.061 2 DEBUG nova.network.neutron [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Successfully created port: 8c4f44ee-8f2b-4451-a1be-90dad356d880 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:45:11 compute-0 nova_compute[192810]: 2025-09-30 21:45:11.075 2 DEBUG nova.compute.manager [req-7ef0532d-0248-434f-8e84-e47125a1dcf0 req-1bf01e60-2de2-49d4-98c8-938a19ce9c49 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received event network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:45:11 compute-0 nova_compute[192810]: 2025-09-30 21:45:11.075 2 DEBUG oslo_concurrency.lockutils [req-7ef0532d-0248-434f-8e84-e47125a1dcf0 req-1bf01e60-2de2-49d4-98c8-938a19ce9c49 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:11 compute-0 nova_compute[192810]: 2025-09-30 21:45:11.075 2 DEBUG oslo_concurrency.lockutils [req-7ef0532d-0248-434f-8e84-e47125a1dcf0 req-1bf01e60-2de2-49d4-98c8-938a19ce9c49 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:11 compute-0 nova_compute[192810]: 2025-09-30 21:45:11.076 2 DEBUG oslo_concurrency.lockutils [req-7ef0532d-0248-434f-8e84-e47125a1dcf0 req-1bf01e60-2de2-49d4-98c8-938a19ce9c49 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:11 compute-0 nova_compute[192810]: 2025-09-30 21:45:11.076 2 DEBUG nova.compute.manager [req-7ef0532d-0248-434f-8e84-e47125a1dcf0 req-1bf01e60-2de2-49d4-98c8-938a19ce9c49 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] No waiting events found dispatching network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:45:11 compute-0 nova_compute[192810]: 2025-09-30 21:45:11.076 2 WARNING nova.compute.manager [req-7ef0532d-0248-434f-8e84-e47125a1dcf0 req-1bf01e60-2de2-49d4-98c8-938a19ce9c49 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received unexpected event network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 for instance with vm_state resized and task_state resize_reverting.
Sep 30 21:45:12 compute-0 nova_compute[192810]: 2025-09-30 21:45:12.348 2 DEBUG nova.network.neutron [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Successfully updated port: 8c4f44ee-8f2b-4451-a1be-90dad356d880 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:45:12 compute-0 podman[243291]: 2025-09-30 21:45:12.358523854 +0000 UTC m=+0.072313115 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Sep 30 21:45:12 compute-0 nova_compute[192810]: 2025-09-30 21:45:12.363 2 DEBUG oslo_concurrency.lockutils [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "refresh_cache-07d2953c-4702-44dc-9c91-410f1654ccec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:45:12 compute-0 nova_compute[192810]: 2025-09-30 21:45:12.364 2 DEBUG oslo_concurrency.lockutils [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquired lock "refresh_cache-07d2953c-4702-44dc-9c91-410f1654ccec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:45:12 compute-0 nova_compute[192810]: 2025-09-30 21:45:12.364 2 DEBUG nova.network.neutron [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:45:12 compute-0 podman[243289]: 2025-09-30 21:45:12.368316123 +0000 UTC m=+0.091543534 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:45:12 compute-0 podman[243290]: 2025-09-30 21:45:12.368709962 +0000 UTC m=+0.083547019 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Sep 30 21:45:12 compute-0 nova_compute[192810]: 2025-09-30 21:45:12.467 2 DEBUG nova.compute.manager [req-7481cfb1-3308-4f6e-833d-558f7bd8d473 req-2f692acc-62d3-4264-aa1b-244a82abcec9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Received event network-changed-8c4f44ee-8f2b-4451-a1be-90dad356d880 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:45:12 compute-0 nova_compute[192810]: 2025-09-30 21:45:12.468 2 DEBUG nova.compute.manager [req-7481cfb1-3308-4f6e-833d-558f7bd8d473 req-2f692acc-62d3-4264-aa1b-244a82abcec9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Refreshing instance network info cache due to event network-changed-8c4f44ee-8f2b-4451-a1be-90dad356d880. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:45:12 compute-0 nova_compute[192810]: 2025-09-30 21:45:12.468 2 DEBUG oslo_concurrency.lockutils [req-7481cfb1-3308-4f6e-833d-558f7bd8d473 req-2f692acc-62d3-4264-aa1b-244a82abcec9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-07d2953c-4702-44dc-9c91-410f1654ccec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:45:12 compute-0 nova_compute[192810]: 2025-09-30 21:45:12.542 2 DEBUG nova.network.neutron [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Updating instance_info_cache with network_info: [{"id": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "address": "fa:16:3e:e7:65:6e", "network": {"id": "033e9c33-7065-4faf-8a4b-e2705c450c67", "bridge": "br-int", "label": "tempest-network-smoke--2107888420", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc00d1cce-57", "ovs_interfaceid": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:45:12 compute-0 nova_compute[192810]: 2025-09-30 21:45:12.566 2 DEBUG nova.network.neutron [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:45:12 compute-0 nova_compute[192810]: 2025-09-30 21:45:12.571 2 DEBUG oslo_concurrency.lockutils [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Releasing lock "refresh_cache-7dbd9807-9843-4455-8bf1-3bd7f4fa37c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:45:12 compute-0 nova_compute[192810]: 2025-09-30 21:45:12.595 2 DEBUG nova.virt.libvirt.driver [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Creating tmpfile /var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6/tmp0k43l_f0 to verify with other compute node that the instance is on the same shared storage. check_instance_shared_storage_local /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:9618
Sep 30 21:45:12 compute-0 kernel: tapc00d1cce-57 (unregistering): left promiscuous mode
Sep 30 21:45:12 compute-0 NetworkManager[51733]: <info>  [1759268712.6332] device (tapc00d1cce-57): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:45:12 compute-0 nova_compute[192810]: 2025-09-30 21:45:12.646 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:12 compute-0 ovn_controller[94912]: 2025-09-30T21:45:12Z|00558|binding|INFO|Releasing lport c00d1cce-5707-41ba-9ca0-2aeecbf662d8 from this chassis (sb_readonly=0)
Sep 30 21:45:12 compute-0 ovn_controller[94912]: 2025-09-30T21:45:12Z|00559|binding|INFO|Setting lport c00d1cce-5707-41ba-9ca0-2aeecbf662d8 down in Southbound
Sep 30 21:45:12 compute-0 ovn_controller[94912]: 2025-09-30T21:45:12Z|00560|binding|INFO|Removing iface tapc00d1cce-57 ovn-installed in OVS
Sep 30 21:45:12 compute-0 nova_compute[192810]: 2025-09-30 21:45:12.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:12.658 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e7:65:6e 10.100.0.14'], port_security=['fa:16:3e:e7:65:6e 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-033e9c33-7065-4faf-8a4b-e2705c450c67', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'neutron:revision_number': '10', 'neutron:security_group_ids': '01fa266f-cdeb-485b-8be3-7c739973b0cb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.184', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7d7aa739-6d0b-4488-b89a-1cdd190f3109, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=c00d1cce-5707-41ba-9ca0-2aeecbf662d8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:45:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:12.659 103867 INFO neutron.agent.ovn.metadata.agent [-] Port c00d1cce-5707-41ba-9ca0-2aeecbf662d8 in datapath 033e9c33-7065-4faf-8a4b-e2705c450c67 unbound from our chassis
Sep 30 21:45:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:12.661 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 033e9c33-7065-4faf-8a4b-e2705c450c67, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:45:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:12.662 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[5aaf87d1-dff3-4845-a523-2d3d8100353f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:12.662 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67 namespace which is not needed anymore
Sep 30 21:45:12 compute-0 nova_compute[192810]: 2025-09-30 21:45:12.667 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:12 compute-0 systemd[1]: machine-qemu\x2d72\x2dinstance\x2d00000091.scope: Deactivated successfully.
Sep 30 21:45:12 compute-0 systemd[1]: machine-qemu\x2d72\x2dinstance\x2d00000091.scope: Consumed 4.515s CPU time.
Sep 30 21:45:12 compute-0 systemd-machined[152794]: Machine qemu-72-instance-00000091 terminated.
Sep 30 21:45:12 compute-0 kernel: tapc00d1cce-57: entered promiscuous mode
Sep 30 21:45:12 compute-0 kernel: tapc00d1cce-57 (unregistering): left promiscuous mode
Sep 30 21:45:12 compute-0 ovn_controller[94912]: 2025-09-30T21:45:12Z|00561|binding|INFO|Claiming lport c00d1cce-5707-41ba-9ca0-2aeecbf662d8 for this chassis.
Sep 30 21:45:12 compute-0 nova_compute[192810]: 2025-09-30 21:45:12.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:12 compute-0 ovn_controller[94912]: 2025-09-30T21:45:12Z|00562|binding|INFO|c00d1cce-5707-41ba-9ca0-2aeecbf662d8: Claiming fa:16:3e:e7:65:6e 10.100.0.14
Sep 30 21:45:12 compute-0 ovn_controller[94912]: 2025-09-30T21:45:12Z|00563|if_status|INFO|Dropped 2 log messages in last 853 seconds (most recently, 853 seconds ago) due to excessive rate
Sep 30 21:45:12 compute-0 ovn_controller[94912]: 2025-09-30T21:45:12Z|00564|if_status|INFO|Not setting lport c00d1cce-5707-41ba-9ca0-2aeecbf662d8 down as sb is readonly
Sep 30 21:45:12 compute-0 nova_compute[192810]: 2025-09-30 21:45:12.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:12 compute-0 nova_compute[192810]: 2025-09-30 21:45:12.857 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:12 compute-0 nova_compute[192810]: 2025-09-30 21:45:12.876 2 INFO nova.virt.libvirt.driver [-] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Instance destroyed successfully.
Sep 30 21:45:12 compute-0 nova_compute[192810]: 2025-09-30 21:45:12.876 2 DEBUG nova.objects.instance [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'resources' on Instance uuid 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:45:12 compute-0 ovn_controller[94912]: 2025-09-30T21:45:12Z|00565|binding|INFO|Releasing lport c00d1cce-5707-41ba-9ca0-2aeecbf662d8 from this chassis (sb_readonly=0)
Sep 30 21:45:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:12.881 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e7:65:6e 10.100.0.14'], port_security=['fa:16:3e:e7:65:6e 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-033e9c33-7065-4faf-8a4b-e2705c450c67', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'neutron:revision_number': '10', 'neutron:security_group_ids': '01fa266f-cdeb-485b-8be3-7c739973b0cb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.184', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7d7aa739-6d0b-4488-b89a-1cdd190f3109, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=c00d1cce-5707-41ba-9ca0-2aeecbf662d8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:45:12 compute-0 nova_compute[192810]: 2025-09-30 21:45:12.893 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:12 compute-0 nova_compute[192810]: 2025-09-30 21:45:12.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:12.911 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e7:65:6e 10.100.0.14'], port_security=['fa:16:3e:e7:65:6e 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '7dbd9807-9843-4455-8bf1-3bd7f4fa37c6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-033e9c33-7065-4faf-8a4b-e2705c450c67', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'neutron:revision_number': '11', 'neutron:security_group_ids': '01fa266f-cdeb-485b-8be3-7c739973b0cb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.184', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7d7aa739-6d0b-4488-b89a-1cdd190f3109, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=c00d1cce-5707-41ba-9ca0-2aeecbf662d8) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:45:12 compute-0 nova_compute[192810]: 2025-09-30 21:45:12.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:12 compute-0 nova_compute[192810]: 2025-09-30 21:45:12.918 2 DEBUG nova.virt.libvirt.vif [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:44:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1387484208',display_name='tempest-TestNetworkAdvancedServerOps-server-1387484208',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1387484208',id=145,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGR5X4mrP2QSdtYc3f8JfOkFpP2VM6oi0AFFxhCennHtpd7iMpKCrxhgpbTxKW+iQFQ9j7tfzEdahfdLg9KKVDLd5BB9lwZMhpRJKZ+A5gEfm5/7y6uIYpwFPAd9COVo+g==',key_name='tempest-TestNetworkAdvancedServerOps-302836404',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:45:09Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=Flavor(1),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='075b1efc4c8e4cb1b28d61b042c451e9',ramdisk_id='',reservation_id='r-be0b95e6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-374190229',owner_user_name='tempest-TestNetworkAdvancedServerOps-374190229-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:45:09Z,user_data=None,user_id='185cc8ad7e1445d2ab5006153ab19700',uuid=7dbd9807-9843-4455-8bf1-3bd7f4fa37c6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "address": "fa:16:3e:e7:65:6e", "network": {"id": "033e9c33-7065-4faf-8a4b-e2705c450c67", "bridge": "br-int", "label": "tempest-network-smoke--2107888420", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc00d1cce-57", "ovs_interfaceid": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:45:12 compute-0 nova_compute[192810]: 2025-09-30 21:45:12.919 2 DEBUG nova.network.os_vif_util [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converting VIF {"id": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "address": "fa:16:3e:e7:65:6e", "network": {"id": "033e9c33-7065-4faf-8a4b-e2705c450c67", "bridge": "br-int", "label": "tempest-network-smoke--2107888420", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc00d1cce-57", "ovs_interfaceid": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:45:12 compute-0 nova_compute[192810]: 2025-09-30 21:45:12.919 2 DEBUG nova.network.os_vif_util [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e7:65:6e,bridge_name='br-int',has_traffic_filtering=True,id=c00d1cce-5707-41ba-9ca0-2aeecbf662d8,network=Network(033e9c33-7065-4faf-8a4b-e2705c450c67),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc00d1cce-57') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:45:12 compute-0 nova_compute[192810]: 2025-09-30 21:45:12.920 2 DEBUG os_vif [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e7:65:6e,bridge_name='br-int',has_traffic_filtering=True,id=c00d1cce-5707-41ba-9ca0-2aeecbf662d8,network=Network(033e9c33-7065-4faf-8a4b-e2705c450c67),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc00d1cce-57') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:45:12 compute-0 nova_compute[192810]: 2025-09-30 21:45:12.922 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:12 compute-0 nova_compute[192810]: 2025-09-30 21:45:12.922 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc00d1cce-57, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:45:12 compute-0 nova_compute[192810]: 2025-09-30 21:45:12.924 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:12 compute-0 nova_compute[192810]: 2025-09-30 21:45:12.925 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:12 compute-0 nova_compute[192810]: 2025-09-30 21:45:12.928 2 INFO os_vif [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e7:65:6e,bridge_name='br-int',has_traffic_filtering=True,id=c00d1cce-5707-41ba-9ca0-2aeecbf662d8,network=Network(033e9c33-7065-4faf-8a4b-e2705c450c67),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc00d1cce-57')
Sep 30 21:45:12 compute-0 nova_compute[192810]: 2025-09-30 21:45:12.929 2 INFO nova.virt.libvirt.driver [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Deleting instance files /var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6_del
Sep 30 21:45:12 compute-0 nova_compute[192810]: 2025-09-30 21:45:12.934 2 INFO nova.virt.libvirt.driver [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Deletion of /var/lib/nova/instances/7dbd9807-9843-4455-8bf1-3bd7f4fa37c6_del complete
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.037 2 DEBUG oslo_concurrency.lockutils [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.037 2 DEBUG oslo_concurrency.lockutils [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.058 2 DEBUG nova.objects.instance [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'migration_context' on Instance uuid 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.185 2 DEBUG nova.compute.provider_tree [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.199 2 DEBUG nova.compute.manager [req-a56de23d-997c-4d54-a2c3-47e390b104f6 req-8a59e7a1-d0c3-4d22-ad7f-000b81eceb07 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received event network-vif-unplugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.200 2 DEBUG oslo_concurrency.lockutils [req-a56de23d-997c-4d54-a2c3-47e390b104f6 req-8a59e7a1-d0c3-4d22-ad7f-000b81eceb07 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.200 2 DEBUG oslo_concurrency.lockutils [req-a56de23d-997c-4d54-a2c3-47e390b104f6 req-8a59e7a1-d0c3-4d22-ad7f-000b81eceb07 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.200 2 DEBUG oslo_concurrency.lockutils [req-a56de23d-997c-4d54-a2c3-47e390b104f6 req-8a59e7a1-d0c3-4d22-ad7f-000b81eceb07 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.201 2 DEBUG nova.compute.manager [req-a56de23d-997c-4d54-a2c3-47e390b104f6 req-8a59e7a1-d0c3-4d22-ad7f-000b81eceb07 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] No waiting events found dispatching network-vif-unplugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.201 2 WARNING nova.compute.manager [req-a56de23d-997c-4d54-a2c3-47e390b104f6 req-8a59e7a1-d0c3-4d22-ad7f-000b81eceb07 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received unexpected event network-vif-unplugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 for instance with vm_state resized and task_state resize_reverting.
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.201 2 DEBUG nova.compute.manager [req-a56de23d-997c-4d54-a2c3-47e390b104f6 req-8a59e7a1-d0c3-4d22-ad7f-000b81eceb07 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received event network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.201 2 DEBUG oslo_concurrency.lockutils [req-a56de23d-997c-4d54-a2c3-47e390b104f6 req-8a59e7a1-d0c3-4d22-ad7f-000b81eceb07 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.201 2 DEBUG oslo_concurrency.lockutils [req-a56de23d-997c-4d54-a2c3-47e390b104f6 req-8a59e7a1-d0c3-4d22-ad7f-000b81eceb07 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.202 2 DEBUG oslo_concurrency.lockutils [req-a56de23d-997c-4d54-a2c3-47e390b104f6 req-8a59e7a1-d0c3-4d22-ad7f-000b81eceb07 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.202 2 DEBUG nova.compute.manager [req-a56de23d-997c-4d54-a2c3-47e390b104f6 req-8a59e7a1-d0c3-4d22-ad7f-000b81eceb07 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] No waiting events found dispatching network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.202 2 WARNING nova.compute.manager [req-a56de23d-997c-4d54-a2c3-47e390b104f6 req-8a59e7a1-d0c3-4d22-ad7f-000b81eceb07 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received unexpected event network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 for instance with vm_state resized and task_state resize_reverting.
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.212 2 DEBUG nova.scheduler.client.report [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:45:13 compute-0 neutron-haproxy-ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67[243257]: [NOTICE]   (243262) : haproxy version is 2.8.14-c23fe91
Sep 30 21:45:13 compute-0 neutron-haproxy-ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67[243257]: [NOTICE]   (243262) : path to executable is /usr/sbin/haproxy
Sep 30 21:45:13 compute-0 neutron-haproxy-ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67[243257]: [WARNING]  (243262) : Exiting Master process...
Sep 30 21:45:13 compute-0 neutron-haproxy-ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67[243257]: [WARNING]  (243262) : Exiting Master process...
Sep 30 21:45:13 compute-0 neutron-haproxy-ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67[243257]: [ALERT]    (243262) : Current worker (243264) exited with code 143 (Terminated)
Sep 30 21:45:13 compute-0 neutron-haproxy-ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67[243257]: [WARNING]  (243262) : All workers exited. Exiting... (0)
Sep 30 21:45:13 compute-0 systemd[1]: libpod-b0d8e2195cbb58fa674329a44abb99241500457ab0bffb7bb5d3b4d390c8f788.scope: Deactivated successfully.
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.293 2 DEBUG oslo_concurrency.lockutils [None req-739f786e-8f50-4d77-8f90-2c26e9defcee 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: held 0.255s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:13 compute-0 podman[243373]: 2025-09-30 21:45:13.300408438 +0000 UTC m=+0.543080518 container died b0d8e2195cbb58fa674329a44abb99241500457ab0bffb7bb5d3b4d390c8f788 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Sep 30 21:45:13 compute-0 systemd[1]: var-lib-containers-storage-overlay-913f81cd0d98aa9892510c6423ae89b030a4c6416a4cd63316d15804e542175f-merged.mount: Deactivated successfully.
Sep 30 21:45:13 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b0d8e2195cbb58fa674329a44abb99241500457ab0bffb7bb5d3b4d390c8f788-userdata-shm.mount: Deactivated successfully.
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.780 2 DEBUG nova.network.neutron [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Updating instance_info_cache with network_info: [{"id": "8c4f44ee-8f2b-4451-a1be-90dad356d880", "address": "fa:16:3e:2d:fc:5e", "network": {"id": "cad5d4b4-0147-4d5b-8e82-ad8835d4110a", "bridge": "br-int", "label": "tempest-network-smoke--54130567", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c4f44ee-8f", "ovs_interfaceid": "8c4f44ee-8f2b-4451-a1be-90dad356d880", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.798 2 DEBUG oslo_concurrency.lockutils [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Releasing lock "refresh_cache-07d2953c-4702-44dc-9c91-410f1654ccec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.798 2 DEBUG nova.compute.manager [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Instance network_info: |[{"id": "8c4f44ee-8f2b-4451-a1be-90dad356d880", "address": "fa:16:3e:2d:fc:5e", "network": {"id": "cad5d4b4-0147-4d5b-8e82-ad8835d4110a", "bridge": "br-int", "label": "tempest-network-smoke--54130567", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c4f44ee-8f", "ovs_interfaceid": "8c4f44ee-8f2b-4451-a1be-90dad356d880", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.799 2 DEBUG oslo_concurrency.lockutils [req-7481cfb1-3308-4f6e-833d-558f7bd8d473 req-2f692acc-62d3-4264-aa1b-244a82abcec9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-07d2953c-4702-44dc-9c91-410f1654ccec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.799 2 DEBUG nova.network.neutron [req-7481cfb1-3308-4f6e-833d-558f7bd8d473 req-2f692acc-62d3-4264-aa1b-244a82abcec9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Refreshing network info cache for port 8c4f44ee-8f2b-4451-a1be-90dad356d880 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.801 2 DEBUG nova.virt.libvirt.driver [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Start _get_guest_xml network_info=[{"id": "8c4f44ee-8f2b-4451-a1be-90dad356d880", "address": "fa:16:3e:2d:fc:5e", "network": {"id": "cad5d4b4-0147-4d5b-8e82-ad8835d4110a", "bridge": "br-int", "label": "tempest-network-smoke--54130567", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c4f44ee-8f", "ovs_interfaceid": "8c4f44ee-8f2b-4451-a1be-90dad356d880", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.805 2 WARNING nova.virt.libvirt.driver [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.932 2 DEBUG nova.virt.libvirt.host [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.933 2 DEBUG nova.virt.libvirt.host [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.938 2 DEBUG nova.virt.libvirt.host [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.938 2 DEBUG nova.virt.libvirt.host [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.939 2 DEBUG nova.virt.libvirt.driver [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.939 2 DEBUG nova.virt.hardware [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.940 2 DEBUG nova.virt.hardware [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.940 2 DEBUG nova.virt.hardware [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.940 2 DEBUG nova.virt.hardware [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.940 2 DEBUG nova.virt.hardware [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.940 2 DEBUG nova.virt.hardware [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.941 2 DEBUG nova.virt.hardware [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.941 2 DEBUG nova.virt.hardware [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.941 2 DEBUG nova.virt.hardware [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.941 2 DEBUG nova.virt.hardware [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.941 2 DEBUG nova.virt.hardware [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.944 2 DEBUG nova.virt.libvirt.vif [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:45:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-gen-0-1136351186',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-gen-0-1136351186',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2108116341-ge',id=149,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL1/fVTmGYuOOwxSGbH9N8TvZbqR3p/LXcDidDQZkj1x/r1cncivgRIPmN5OLuNtDCDFO3TM2NYeeoJUUNE27RLr42R4vAhyNzhABLK9jtr/4dNw2wyPqxeNtjp2/KxusQ==',key_name='tempest-TestSecurityGroupsBasicOps-1980363703',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1ff42902541948f7a6df344fac87c2b7',ramdisk_id='',reservation_id='r-810si326',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2108116341',owner_user_name='tempest-TestSecurityGroupsBasicOps-2108116341-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:45:10Z,user_data=None,user_id='c33a752ef8234bba917ace1e73763490',uuid=07d2953c-4702-44dc-9c91-410f1654ccec,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8c4f44ee-8f2b-4451-a1be-90dad356d880", "address": "fa:16:3e:2d:fc:5e", "network": {"id": "cad5d4b4-0147-4d5b-8e82-ad8835d4110a", "bridge": "br-int", "label": "tempest-network-smoke--54130567", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c4f44ee-8f", "ovs_interfaceid": "8c4f44ee-8f2b-4451-a1be-90dad356d880", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.944 2 DEBUG nova.network.os_vif_util [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Converting VIF {"id": "8c4f44ee-8f2b-4451-a1be-90dad356d880", "address": "fa:16:3e:2d:fc:5e", "network": {"id": "cad5d4b4-0147-4d5b-8e82-ad8835d4110a", "bridge": "br-int", "label": "tempest-network-smoke--54130567", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c4f44ee-8f", "ovs_interfaceid": "8c4f44ee-8f2b-4451-a1be-90dad356d880", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.945 2 DEBUG nova.network.os_vif_util [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:fc:5e,bridge_name='br-int',has_traffic_filtering=True,id=8c4f44ee-8f2b-4451-a1be-90dad356d880,network=Network(cad5d4b4-0147-4d5b-8e82-ad8835d4110a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c4f44ee-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.946 2 DEBUG nova.objects.instance [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lazy-loading 'pci_devices' on Instance uuid 07d2953c-4702-44dc-9c91-410f1654ccec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.957 2 DEBUG nova.virt.libvirt.driver [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:45:13 compute-0 nova_compute[192810]:   <uuid>07d2953c-4702-44dc-9c91-410f1654ccec</uuid>
Sep 30 21:45:13 compute-0 nova_compute[192810]:   <name>instance-00000095</name>
Sep 30 21:45:13 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:45:13 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:45:13 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:45:13 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:       <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-gen-0-1136351186</nova:name>
Sep 30 21:45:13 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:45:13</nova:creationTime>
Sep 30 21:45:13 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:45:13 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:45:13 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:45:13 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:45:13 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:45:13 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:45:13 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:45:13 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:45:13 compute-0 nova_compute[192810]:         <nova:user uuid="c33a752ef8234bba917ace1e73763490">tempest-TestSecurityGroupsBasicOps-2108116341-project-member</nova:user>
Sep 30 21:45:13 compute-0 nova_compute[192810]:         <nova:project uuid="1ff42902541948f7a6df344fac87c2b7">tempest-TestSecurityGroupsBasicOps-2108116341</nova:project>
Sep 30 21:45:13 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:45:13 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:45:13 compute-0 nova_compute[192810]:         <nova:port uuid="8c4f44ee-8f2b-4451-a1be-90dad356d880">
Sep 30 21:45:13 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:45:13 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:45:13 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:45:13 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:45:13 compute-0 nova_compute[192810]:     <system>
Sep 30 21:45:13 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:45:13 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:45:13 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:45:13 compute-0 nova_compute[192810]:       <entry name="serial">07d2953c-4702-44dc-9c91-410f1654ccec</entry>
Sep 30 21:45:13 compute-0 nova_compute[192810]:       <entry name="uuid">07d2953c-4702-44dc-9c91-410f1654ccec</entry>
Sep 30 21:45:13 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     </system>
Sep 30 21:45:13 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:45:13 compute-0 nova_compute[192810]:   <os>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:   </os>
Sep 30 21:45:13 compute-0 nova_compute[192810]:   <features>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:   </features>
Sep 30 21:45:13 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:45:13 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:45:13 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:45:13 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:45:13 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:45:13 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/07d2953c-4702-44dc-9c91-410f1654ccec/disk"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:45:13 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/07d2953c-4702-44dc-9c91-410f1654ccec/disk.config"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:45:13 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:2d:fc:5e"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:       <target dev="tap8c4f44ee-8f"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:45:13 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/07d2953c-4702-44dc-9c91-410f1654ccec/console.log" append="off"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     <video>
Sep 30 21:45:13 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     </video>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:45:13 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:45:13 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:45:13 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:45:13 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:45:13 compute-0 nova_compute[192810]: </domain>
Sep 30 21:45:13 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.958 2 DEBUG nova.compute.manager [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Preparing to wait for external event network-vif-plugged-8c4f44ee-8f2b-4451-a1be-90dad356d880 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.958 2 DEBUG oslo_concurrency.lockutils [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "07d2953c-4702-44dc-9c91-410f1654ccec-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.958 2 DEBUG oslo_concurrency.lockutils [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "07d2953c-4702-44dc-9c91-410f1654ccec-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.958 2 DEBUG oslo_concurrency.lockutils [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "07d2953c-4702-44dc-9c91-410f1654ccec-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.959 2 DEBUG nova.virt.libvirt.vif [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:45:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-gen-0-1136351186',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-gen-0-1136351186',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2108116341-ge',id=149,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL1/fVTmGYuOOwxSGbH9N8TvZbqR3p/LXcDidDQZkj1x/r1cncivgRIPmN5OLuNtDCDFO3TM2NYeeoJUUNE27RLr42R4vAhyNzhABLK9jtr/4dNw2wyPqxeNtjp2/KxusQ==',key_name='tempest-TestSecurityGroupsBasicOps-1980363703',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1ff42902541948f7a6df344fac87c2b7',ramdisk_id='',reservation_id='r-810si326',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2108116341',owner_user_name='tempest-TestSecurityGroupsBasicOps-2108116341-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:45:10Z,user_data=None,user_id='c33a752ef8234bba917ace1e73763490',uuid=07d2953c-4702-44dc-9c91-410f1654ccec,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8c4f44ee-8f2b-4451-a1be-90dad356d880", "address": "fa:16:3e:2d:fc:5e", "network": {"id": "cad5d4b4-0147-4d5b-8e82-ad8835d4110a", "bridge": "br-int", "label": "tempest-network-smoke--54130567", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c4f44ee-8f", "ovs_interfaceid": "8c4f44ee-8f2b-4451-a1be-90dad356d880", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.959 2 DEBUG nova.network.os_vif_util [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Converting VIF {"id": "8c4f44ee-8f2b-4451-a1be-90dad356d880", "address": "fa:16:3e:2d:fc:5e", "network": {"id": "cad5d4b4-0147-4d5b-8e82-ad8835d4110a", "bridge": "br-int", "label": "tempest-network-smoke--54130567", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c4f44ee-8f", "ovs_interfaceid": "8c4f44ee-8f2b-4451-a1be-90dad356d880", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.960 2 DEBUG nova.network.os_vif_util [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:fc:5e,bridge_name='br-int',has_traffic_filtering=True,id=8c4f44ee-8f2b-4451-a1be-90dad356d880,network=Network(cad5d4b4-0147-4d5b-8e82-ad8835d4110a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c4f44ee-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.960 2 DEBUG os_vif [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:fc:5e,bridge_name='br-int',has_traffic_filtering=True,id=8c4f44ee-8f2b-4451-a1be-90dad356d880,network=Network(cad5d4b4-0147-4d5b-8e82-ad8835d4110a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c4f44ee-8f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.960 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.961 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.961 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.963 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.963 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8c4f44ee-8f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.963 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8c4f44ee-8f, col_values=(('external_ids', {'iface-id': '8c4f44ee-8f2b-4451-a1be-90dad356d880', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2d:fc:5e', 'vm-uuid': '07d2953c-4702-44dc-9c91-410f1654ccec'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.964 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:13 compute-0 NetworkManager[51733]: <info>  [1759268713.9656] manager: (tap8c4f44ee-8f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/251)
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.966 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.969 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:13 compute-0 nova_compute[192810]: 2025-09-30 21:45:13.970 2 INFO os_vif [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:fc:5e,bridge_name='br-int',has_traffic_filtering=True,id=8c4f44ee-8f2b-4451-a1be-90dad356d880,network=Network(cad5d4b4-0147-4d5b-8e82-ad8835d4110a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c4f44ee-8f')
Sep 30 21:45:14 compute-0 nova_compute[192810]: 2025-09-30 21:45:14.095 2 DEBUG nova.virt.libvirt.driver [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:45:14 compute-0 nova_compute[192810]: 2025-09-30 21:45:14.096 2 DEBUG nova.virt.libvirt.driver [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:45:14 compute-0 nova_compute[192810]: 2025-09-30 21:45:14.097 2 DEBUG nova.virt.libvirt.driver [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] No VIF found with MAC fa:16:3e:2d:fc:5e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:45:14 compute-0 nova_compute[192810]: 2025-09-30 21:45:14.098 2 INFO nova.virt.libvirt.driver [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Using config drive
Sep 30 21:45:14 compute-0 podman[243373]: 2025-09-30 21:45:14.340524028 +0000 UTC m=+1.583196078 container cleanup b0d8e2195cbb58fa674329a44abb99241500457ab0bffb7bb5d3b4d390c8f788 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:45:14 compute-0 systemd[1]: libpod-conmon-b0d8e2195cbb58fa674329a44abb99241500457ab0bffb7bb5d3b4d390c8f788.scope: Deactivated successfully.
Sep 30 21:45:14 compute-0 nova_compute[192810]: 2025-09-30 21:45:14.506 2 INFO nova.virt.libvirt.driver [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Creating config drive at /var/lib/nova/instances/07d2953c-4702-44dc-9c91-410f1654ccec/disk.config
Sep 30 21:45:14 compute-0 nova_compute[192810]: 2025-09-30 21:45:14.518 2 DEBUG oslo_concurrency.processutils [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/07d2953c-4702-44dc-9c91-410f1654ccec/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpm2ff28ao execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:45:14 compute-0 nova_compute[192810]: 2025-09-30 21:45:14.658 2 DEBUG oslo_concurrency.processutils [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/07d2953c-4702-44dc-9c91-410f1654ccec/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpm2ff28ao" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:45:14 compute-0 kernel: tap8c4f44ee-8f: entered promiscuous mode
Sep 30 21:45:14 compute-0 NetworkManager[51733]: <info>  [1759268714.7502] manager: (tap8c4f44ee-8f): new Tun device (/org/freedesktop/NetworkManager/Devices/252)
Sep 30 21:45:14 compute-0 ovn_controller[94912]: 2025-09-30T21:45:14Z|00566|binding|INFO|Claiming lport 8c4f44ee-8f2b-4451-a1be-90dad356d880 for this chassis.
Sep 30 21:45:14 compute-0 systemd-udevd[243351]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:45:14 compute-0 nova_compute[192810]: 2025-09-30 21:45:14.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:14 compute-0 ovn_controller[94912]: 2025-09-30T21:45:14Z|00567|binding|INFO|8c4f44ee-8f2b-4451-a1be-90dad356d880: Claiming fa:16:3e:2d:fc:5e 10.100.0.9
Sep 30 21:45:14 compute-0 NetworkManager[51733]: <info>  [1759268714.7736] device (tap8c4f44ee-8f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:45:14 compute-0 ovn_controller[94912]: 2025-09-30T21:45:14Z|00568|binding|INFO|Setting lport 8c4f44ee-8f2b-4451-a1be-90dad356d880 ovn-installed in OVS
Sep 30 21:45:14 compute-0 NetworkManager[51733]: <info>  [1759268714.7751] device (tap8c4f44ee-8f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:45:14 compute-0 nova_compute[192810]: 2025-09-30 21:45:14.776 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:14 compute-0 nova_compute[192810]: 2025-09-30 21:45:14.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:14 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:14.790 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:fc:5e 10.100.0.9'], port_security=['fa:16:3e:2d:fc:5e 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '07d2953c-4702-44dc-9c91-410f1654ccec', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cad5d4b4-0147-4d5b-8e82-ad8835d4110a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ff42902541948f7a6df344fac87c2b7', 'neutron:revision_number': '2', 'neutron:security_group_ids': '02e17e31-bee3-4214-8c0d-d336d8499304', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ea4dfefc-3776-4359-bdd5-ef1ed99a61d3, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=8c4f44ee-8f2b-4451-a1be-90dad356d880) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:45:14 compute-0 ovn_controller[94912]: 2025-09-30T21:45:14Z|00569|binding|INFO|Setting lport 8c4f44ee-8f2b-4451-a1be-90dad356d880 up in Southbound
Sep 30 21:45:14 compute-0 systemd-machined[152794]: New machine qemu-73-instance-00000095.
Sep 30 21:45:14 compute-0 systemd[1]: Started Virtual Machine qemu-73-instance-00000095.
Sep 30 21:45:14 compute-0 podman[243423]: 2025-09-30 21:45:14.927153446 +0000 UTC m=+0.567790239 container remove b0d8e2195cbb58fa674329a44abb99241500457ab0bffb7bb5d3b4d390c8f788 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:45:14 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:14.934 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[fffebd13-d836-4374-b1a2-35108b3d8bcf]: (4, ('Tue Sep 30 09:45:12 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67 (b0d8e2195cbb58fa674329a44abb99241500457ab0bffb7bb5d3b4d390c8f788)\nb0d8e2195cbb58fa674329a44abb99241500457ab0bffb7bb5d3b4d390c8f788\nTue Sep 30 09:45:14 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67 (b0d8e2195cbb58fa674329a44abb99241500457ab0bffb7bb5d3b4d390c8f788)\nb0d8e2195cbb58fa674329a44abb99241500457ab0bffb7bb5d3b4d390c8f788\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:14 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:14.937 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[ccf054c9-4d0f-48ba-846f-61938b668b1e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:14 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:14.938 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap033e9c33-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:45:14 compute-0 nova_compute[192810]: 2025-09-30 21:45:14.940 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:14 compute-0 kernel: tap033e9c33-70: left promiscuous mode
Sep 30 21:45:14 compute-0 nova_compute[192810]: 2025-09-30 21:45:14.954 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:14 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:14.958 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a96f825c-4379-4eaf-8232-7862af23aa87]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:15.023 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[39922d30-fea6-466f-8d7f-e2d006d98cef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:15.026 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[9cc25421-12ab-4ea9-b0c5-bb5bf070c650]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:15.046 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[0e5c6aca-54b6-418f-9656-2de4318a2c5d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 540376, 'reachable_time': 27695, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243463, 'error': None, 'target': 'ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:15 compute-0 systemd[1]: run-netns-ovnmeta\x2d033e9c33\x2d7065\x2d4faf\x2d8a4b\x2de2705c450c67.mount: Deactivated successfully.
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:15.049 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-033e9c33-7065-4faf-8a4b-e2705c450c67 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:15.049 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[4850de63-9e78-4164-ada9-7f077fc4c5a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:15.054 103867 INFO neutron.agent.ovn.metadata.agent [-] Port c00d1cce-5707-41ba-9ca0-2aeecbf662d8 in datapath 033e9c33-7065-4faf-8a4b-e2705c450c67 unbound from our chassis
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:15.055 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 033e9c33-7065-4faf-8a4b-e2705c450c67, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:15.056 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[519323a8-a740-4a67-9fc9-23a32a88a997]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:15.057 103867 INFO neutron.agent.ovn.metadata.agent [-] Port c00d1cce-5707-41ba-9ca0-2aeecbf662d8 in datapath 033e9c33-7065-4faf-8a4b-e2705c450c67 unbound from our chassis
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:15.059 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 033e9c33-7065-4faf-8a4b-e2705c450c67, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:15.059 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[52e39a54-813e-4328-af7c-26514543487a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:15.060 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 8c4f44ee-8f2b-4451-a1be-90dad356d880 in datapath cad5d4b4-0147-4d5b-8e82-ad8835d4110a unbound from our chassis
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:15.062 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network cad5d4b4-0147-4d5b-8e82-ad8835d4110a
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:15.077 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a3fbda82-86c8-4fb2-bc33-bc1e788ed590]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:15.078 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapcad5d4b4-01 in ovnmeta-cad5d4b4-0147-4d5b-8e82-ad8835d4110a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:15.081 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapcad5d4b4-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:15.081 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a0d0741a-99c1-4d69-b836-ee5e3bf5bf2e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:15.085 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[7ab40668-f13b-4187-b07c-8b9fcbba4be7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:15.100 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[123e1e22-e017-4f30-9319-8510e56842ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:15.129 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[799d7bd9-cb33-42dd-86be-066256c5bc87]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:15.170 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[34cffa77-a99d-41b2-95f1-122ce23fdc90]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:15 compute-0 NetworkManager[51733]: <info>  [1759268715.1766] manager: (tapcad5d4b4-00): new Veth device (/org/freedesktop/NetworkManager/Devices/253)
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:15.178 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a13e15d7-28ff-4931-9e45-62dd7ca7c0e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:15.213 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[c385f065-7017-4224-ae02-15db625c979f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:15.215 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[d00300bc-d7aa-4926-8255-336f5bebd968]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:15 compute-0 NetworkManager[51733]: <info>  [1759268715.2517] device (tapcad5d4b4-00): carrier: link connected
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:15.264 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[59da15ae-b5da-4b35-ac2d-dcf531546487]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:15.279 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[90261774-56c9-453d-be14-7038565115ea]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcad5d4b4-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:41:d6:29'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 172], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 541087, 'reachable_time': 33872, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243488, 'error': None, 'target': 'ovnmeta-cad5d4b4-0147-4d5b-8e82-ad8835d4110a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:15.292 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a0355d06-1ece-47b0-a6e9-4a5232342ea8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe41:d629'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 541087, 'tstamp': 541087}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 243489, 'error': None, 'target': 'ovnmeta-cad5d4b4-0147-4d5b-8e82-ad8835d4110a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:15.307 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[f6c727c8-f999-4a0d-9663-f1d2046500dc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcad5d4b4-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:41:d6:29'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 172], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 541087, 'reachable_time': 33872, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 243490, 'error': None, 'target': 'ovnmeta-cad5d4b4-0147-4d5b-8e82-ad8835d4110a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:15 compute-0 nova_compute[192810]: 2025-09-30 21:45:15.332 2 DEBUG nova.compute.manager [req-4847dd71-5604-4a05-a9de-22281a5e9ff7 req-4e5ea7a9-4a98-490e-b703-74903b8e2121 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Received event network-vif-plugged-8c4f44ee-8f2b-4451-a1be-90dad356d880 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:45:15 compute-0 nova_compute[192810]: 2025-09-30 21:45:15.332 2 DEBUG oslo_concurrency.lockutils [req-4847dd71-5604-4a05-a9de-22281a5e9ff7 req-4e5ea7a9-4a98-490e-b703-74903b8e2121 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "07d2953c-4702-44dc-9c91-410f1654ccec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:15 compute-0 nova_compute[192810]: 2025-09-30 21:45:15.333 2 DEBUG oslo_concurrency.lockutils [req-4847dd71-5604-4a05-a9de-22281a5e9ff7 req-4e5ea7a9-4a98-490e-b703-74903b8e2121 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "07d2953c-4702-44dc-9c91-410f1654ccec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:15 compute-0 nova_compute[192810]: 2025-09-30 21:45:15.333 2 DEBUG oslo_concurrency.lockutils [req-4847dd71-5604-4a05-a9de-22281a5e9ff7 req-4e5ea7a9-4a98-490e-b703-74903b8e2121 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "07d2953c-4702-44dc-9c91-410f1654ccec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:15 compute-0 nova_compute[192810]: 2025-09-30 21:45:15.333 2 DEBUG nova.compute.manager [req-4847dd71-5604-4a05-a9de-22281a5e9ff7 req-4e5ea7a9-4a98-490e-b703-74903b8e2121 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Processing event network-vif-plugged-8c4f44ee-8f2b-4451-a1be-90dad356d880 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:15.336 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[55e1fabe-0e02-42f5-bd36-a917081cedb4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:15.389 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[63126f64-79db-4890-bbeb-d46e90390bfd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:15.391 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcad5d4b4-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:15.391 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:15.391 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcad5d4b4-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:45:15 compute-0 NetworkManager[51733]: <info>  [1759268715.3940] manager: (tapcad5d4b4-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/254)
Sep 30 21:45:15 compute-0 kernel: tapcad5d4b4-00: entered promiscuous mode
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:15.396 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapcad5d4b4-00, col_values=(('external_ids', {'iface-id': '775a2c33-e2fa-46d8-9fd2-dfbcafa64142'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:45:15 compute-0 nova_compute[192810]: 2025-09-30 21:45:15.393 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:15 compute-0 ovn_controller[94912]: 2025-09-30T21:45:15Z|00570|binding|INFO|Releasing lport 775a2c33-e2fa-46d8-9fd2-dfbcafa64142 from this chassis (sb_readonly=0)
Sep 30 21:45:15 compute-0 nova_compute[192810]: 2025-09-30 21:45:15.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:15.399 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cad5d4b4-0147-4d5b-8e82-ad8835d4110a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cad5d4b4-0147-4d5b-8e82-ad8835d4110a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:15.400 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[082fd6ca-4cad-4960-a913-c1e2c5e2ae09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:15.401 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-cad5d4b4-0147-4d5b-8e82-ad8835d4110a
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/cad5d4b4-0147-4d5b-8e82-ad8835d4110a.pid.haproxy
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID cad5d4b4-0147-4d5b-8e82-ad8835d4110a
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:45:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:15.402 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-cad5d4b4-0147-4d5b-8e82-ad8835d4110a', 'env', 'PROCESS_TAG=haproxy-cad5d4b4-0147-4d5b-8e82-ad8835d4110a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/cad5d4b4-0147-4d5b-8e82-ad8835d4110a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:45:15 compute-0 nova_compute[192810]: 2025-09-30 21:45:15.409 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:15 compute-0 nova_compute[192810]: 2025-09-30 21:45:15.426 2 DEBUG nova.compute.manager [req-6a230978-0cde-47fa-b71e-cbecef90afe1 req-8312ad85-f1ee-46e2-8762-5f4e20a21489 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received event network-changed-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:45:15 compute-0 nova_compute[192810]: 2025-09-30 21:45:15.426 2 DEBUG nova.compute.manager [req-6a230978-0cde-47fa-b71e-cbecef90afe1 req-8312ad85-f1ee-46e2-8762-5f4e20a21489 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Refreshing instance network info cache due to event network-changed-c00d1cce-5707-41ba-9ca0-2aeecbf662d8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:45:15 compute-0 nova_compute[192810]: 2025-09-30 21:45:15.426 2 DEBUG oslo_concurrency.lockutils [req-6a230978-0cde-47fa-b71e-cbecef90afe1 req-8312ad85-f1ee-46e2-8762-5f4e20a21489 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-7dbd9807-9843-4455-8bf1-3bd7f4fa37c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:45:15 compute-0 nova_compute[192810]: 2025-09-30 21:45:15.427 2 DEBUG oslo_concurrency.lockutils [req-6a230978-0cde-47fa-b71e-cbecef90afe1 req-8312ad85-f1ee-46e2-8762-5f4e20a21489 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-7dbd9807-9843-4455-8bf1-3bd7f4fa37c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:45:15 compute-0 nova_compute[192810]: 2025-09-30 21:45:15.427 2 DEBUG nova.network.neutron [req-6a230978-0cde-47fa-b71e-cbecef90afe1 req-8312ad85-f1ee-46e2-8762-5f4e20a21489 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Refreshing network info cache for port c00d1cce-5707-41ba-9ca0-2aeecbf662d8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:45:15 compute-0 nova_compute[192810]: 2025-09-30 21:45:15.785 2 DEBUG nova.network.neutron [req-7481cfb1-3308-4f6e-833d-558f7bd8d473 req-2f692acc-62d3-4264-aa1b-244a82abcec9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Updated VIF entry in instance network info cache for port 8c4f44ee-8f2b-4451-a1be-90dad356d880. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:45:15 compute-0 nova_compute[192810]: 2025-09-30 21:45:15.785 2 DEBUG nova.network.neutron [req-7481cfb1-3308-4f6e-833d-558f7bd8d473 req-2f692acc-62d3-4264-aa1b-244a82abcec9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Updating instance_info_cache with network_info: [{"id": "8c4f44ee-8f2b-4451-a1be-90dad356d880", "address": "fa:16:3e:2d:fc:5e", "network": {"id": "cad5d4b4-0147-4d5b-8e82-ad8835d4110a", "bridge": "br-int", "label": "tempest-network-smoke--54130567", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c4f44ee-8f", "ovs_interfaceid": "8c4f44ee-8f2b-4451-a1be-90dad356d880", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:45:15 compute-0 nova_compute[192810]: 2025-09-30 21:45:15.802 2 DEBUG oslo_concurrency.lockutils [req-7481cfb1-3308-4f6e-833d-558f7bd8d473 req-2f692acc-62d3-4264-aa1b-244a82abcec9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-07d2953c-4702-44dc-9c91-410f1654ccec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:45:15 compute-0 podman[243529]: 2025-09-30 21:45:15.741421977 +0000 UTC m=+0.025512524 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:45:15 compute-0 nova_compute[192810]: 2025-09-30 21:45:15.885 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268715.8845952, 07d2953c-4702-44dc-9c91-410f1654ccec => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:45:15 compute-0 nova_compute[192810]: 2025-09-30 21:45:15.885 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] VM Started (Lifecycle Event)
Sep 30 21:45:15 compute-0 nova_compute[192810]: 2025-09-30 21:45:15.887 2 DEBUG nova.compute.manager [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:45:15 compute-0 nova_compute[192810]: 2025-09-30 21:45:15.890 2 DEBUG nova.virt.libvirt.driver [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:45:15 compute-0 nova_compute[192810]: 2025-09-30 21:45:15.893 2 INFO nova.virt.libvirt.driver [-] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Instance spawned successfully.
Sep 30 21:45:15 compute-0 nova_compute[192810]: 2025-09-30 21:45:15.893 2 DEBUG nova.virt.libvirt.driver [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:45:15 compute-0 nova_compute[192810]: 2025-09-30 21:45:15.915 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:45:15 compute-0 nova_compute[192810]: 2025-09-30 21:45:15.919 2 DEBUG nova.virt.libvirt.driver [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:45:15 compute-0 nova_compute[192810]: 2025-09-30 21:45:15.919 2 DEBUG nova.virt.libvirt.driver [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:45:15 compute-0 nova_compute[192810]: 2025-09-30 21:45:15.919 2 DEBUG nova.virt.libvirt.driver [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:45:15 compute-0 nova_compute[192810]: 2025-09-30 21:45:15.919 2 DEBUG nova.virt.libvirt.driver [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:45:15 compute-0 nova_compute[192810]: 2025-09-30 21:45:15.920 2 DEBUG nova.virt.libvirt.driver [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:45:15 compute-0 nova_compute[192810]: 2025-09-30 21:45:15.920 2 DEBUG nova.virt.libvirt.driver [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:45:15 compute-0 nova_compute[192810]: 2025-09-30 21:45:15.923 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:45:15 compute-0 nova_compute[192810]: 2025-09-30 21:45:15.964 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:45:15 compute-0 nova_compute[192810]: 2025-09-30 21:45:15.964 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268715.8846989, 07d2953c-4702-44dc-9c91-410f1654ccec => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:45:15 compute-0 nova_compute[192810]: 2025-09-30 21:45:15.965 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] VM Paused (Lifecycle Event)
Sep 30 21:45:15 compute-0 nova_compute[192810]: 2025-09-30 21:45:15.996 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:45:16 compute-0 nova_compute[192810]: 2025-09-30 21:45:16.002 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268715.889547, 07d2953c-4702-44dc-9c91-410f1654ccec => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:45:16 compute-0 nova_compute[192810]: 2025-09-30 21:45:16.002 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] VM Resumed (Lifecycle Event)
Sep 30 21:45:16 compute-0 nova_compute[192810]: 2025-09-30 21:45:16.028 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:45:16 compute-0 nova_compute[192810]: 2025-09-30 21:45:16.032 2 INFO nova.compute.manager [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Took 5.84 seconds to spawn the instance on the hypervisor.
Sep 30 21:45:16 compute-0 nova_compute[192810]: 2025-09-30 21:45:16.032 2 DEBUG nova.compute.manager [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:45:16 compute-0 nova_compute[192810]: 2025-09-30 21:45:16.034 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:45:16 compute-0 nova_compute[192810]: 2025-09-30 21:45:16.062 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:45:16 compute-0 nova_compute[192810]: 2025-09-30 21:45:16.114 2 INFO nova.compute.manager [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Took 6.42 seconds to build instance.
Sep 30 21:45:16 compute-0 nova_compute[192810]: 2025-09-30 21:45:16.134 2 DEBUG oslo_concurrency.lockutils [None req-41bce394-ca7f-40d9-94ae-48aa073bb658 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "07d2953c-4702-44dc-9c91-410f1654ccec" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.584s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:16 compute-0 podman[243529]: 2025-09-30 21:45:16.330820853 +0000 UTC m=+0.614911410 container create b91e4d4c9514cf01093af4629367a676f76b84916c78e2f19ce2f3bafa274fd8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cad5d4b4-0147-4d5b-8e82-ad8835d4110a, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923)
Sep 30 21:45:16 compute-0 systemd[1]: Started libpod-conmon-b91e4d4c9514cf01093af4629367a676f76b84916c78e2f19ce2f3bafa274fd8.scope.
Sep 30 21:45:16 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:45:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/077bafdbe378c25084ff7ba7c651d39d2177d89afdaa9f70bd734e15ebb29718/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:45:16 compute-0 nova_compute[192810]: 2025-09-30 21:45:16.742 2 DEBUG nova.network.neutron [req-6a230978-0cde-47fa-b71e-cbecef90afe1 req-8312ad85-f1ee-46e2-8762-5f4e20a21489 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Updated VIF entry in instance network info cache for port c00d1cce-5707-41ba-9ca0-2aeecbf662d8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:45:16 compute-0 nova_compute[192810]: 2025-09-30 21:45:16.744 2 DEBUG nova.network.neutron [req-6a230978-0cde-47fa-b71e-cbecef90afe1 req-8312ad85-f1ee-46e2-8762-5f4e20a21489 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Updating instance_info_cache with network_info: [{"id": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "address": "fa:16:3e:e7:65:6e", "network": {"id": "033e9c33-7065-4faf-8a4b-e2705c450c67", "bridge": "br-int", "label": "tempest-network-smoke--2107888420", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc00d1cce-57", "ovs_interfaceid": "c00d1cce-5707-41ba-9ca0-2aeecbf662d8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:45:16 compute-0 nova_compute[192810]: 2025-09-30 21:45:16.757 2 DEBUG oslo_concurrency.lockutils [req-6a230978-0cde-47fa-b71e-cbecef90afe1 req-8312ad85-f1ee-46e2-8762-5f4e20a21489 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-7dbd9807-9843-4455-8bf1-3bd7f4fa37c6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:45:16 compute-0 podman[243529]: 2025-09-30 21:45:16.767609277 +0000 UTC m=+1.051699844 container init b91e4d4c9514cf01093af4629367a676f76b84916c78e2f19ce2f3bafa274fd8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cad5d4b4-0147-4d5b-8e82-ad8835d4110a, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, io.buildah.version=1.41.3)
Sep 30 21:45:16 compute-0 podman[243529]: 2025-09-30 21:45:16.774510445 +0000 UTC m=+1.058601352 container start b91e4d4c9514cf01093af4629367a676f76b84916c78e2f19ce2f3bafa274fd8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cad5d4b4-0147-4d5b-8e82-ad8835d4110a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:45:16 compute-0 neutron-haproxy-ovnmeta-cad5d4b4-0147-4d5b-8e82-ad8835d4110a[243544]: [NOTICE]   (243548) : New worker (243550) forked
Sep 30 21:45:16 compute-0 neutron-haproxy-ovnmeta-cad5d4b4-0147-4d5b-8e82-ad8835d4110a[243544]: [NOTICE]   (243548) : Loading success.
Sep 30 21:45:17 compute-0 nova_compute[192810]: 2025-09-30 21:45:17.319 2 DEBUG nova.compute.manager [req-3c65a9bc-9fb1-4040-9ed5-2346232eb5c1 req-d4fdd397-bf29-4898-91e6-2658e9e8640e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received event network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:45:17 compute-0 nova_compute[192810]: 2025-09-30 21:45:17.319 2 DEBUG oslo_concurrency.lockutils [req-3c65a9bc-9fb1-4040-9ed5-2346232eb5c1 req-d4fdd397-bf29-4898-91e6-2658e9e8640e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:17 compute-0 nova_compute[192810]: 2025-09-30 21:45:17.319 2 DEBUG oslo_concurrency.lockutils [req-3c65a9bc-9fb1-4040-9ed5-2346232eb5c1 req-d4fdd397-bf29-4898-91e6-2658e9e8640e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:17 compute-0 nova_compute[192810]: 2025-09-30 21:45:17.319 2 DEBUG oslo_concurrency.lockutils [req-3c65a9bc-9fb1-4040-9ed5-2346232eb5c1 req-d4fdd397-bf29-4898-91e6-2658e9e8640e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "7dbd9807-9843-4455-8bf1-3bd7f4fa37c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:17 compute-0 nova_compute[192810]: 2025-09-30 21:45:17.320 2 DEBUG nova.compute.manager [req-3c65a9bc-9fb1-4040-9ed5-2346232eb5c1 req-d4fdd397-bf29-4898-91e6-2658e9e8640e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] No waiting events found dispatching network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:45:17 compute-0 nova_compute[192810]: 2025-09-30 21:45:17.320 2 WARNING nova.compute.manager [req-3c65a9bc-9fb1-4040-9ed5-2346232eb5c1 req-d4fdd397-bf29-4898-91e6-2658e9e8640e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Received unexpected event network-vif-plugged-c00d1cce-5707-41ba-9ca0-2aeecbf662d8 for instance with vm_state resized and task_state resize_reverting.
Sep 30 21:45:17 compute-0 nova_compute[192810]: 2025-09-30 21:45:17.401 2 DEBUG nova.compute.manager [req-df05a2cf-ad4e-47e7-878a-ee56acb684e1 req-3b3313b2-0db7-4603-bff6-7a9b1c8f3301 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Received event network-vif-plugged-8c4f44ee-8f2b-4451-a1be-90dad356d880 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:45:17 compute-0 nova_compute[192810]: 2025-09-30 21:45:17.402 2 DEBUG oslo_concurrency.lockutils [req-df05a2cf-ad4e-47e7-878a-ee56acb684e1 req-3b3313b2-0db7-4603-bff6-7a9b1c8f3301 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "07d2953c-4702-44dc-9c91-410f1654ccec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:17 compute-0 nova_compute[192810]: 2025-09-30 21:45:17.402 2 DEBUG oslo_concurrency.lockutils [req-df05a2cf-ad4e-47e7-878a-ee56acb684e1 req-3b3313b2-0db7-4603-bff6-7a9b1c8f3301 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "07d2953c-4702-44dc-9c91-410f1654ccec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:17 compute-0 nova_compute[192810]: 2025-09-30 21:45:17.402 2 DEBUG oslo_concurrency.lockutils [req-df05a2cf-ad4e-47e7-878a-ee56acb684e1 req-3b3313b2-0db7-4603-bff6-7a9b1c8f3301 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "07d2953c-4702-44dc-9c91-410f1654ccec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:17 compute-0 nova_compute[192810]: 2025-09-30 21:45:17.402 2 DEBUG nova.compute.manager [req-df05a2cf-ad4e-47e7-878a-ee56acb684e1 req-3b3313b2-0db7-4603-bff6-7a9b1c8f3301 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] No waiting events found dispatching network-vif-plugged-8c4f44ee-8f2b-4451-a1be-90dad356d880 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:45:17 compute-0 nova_compute[192810]: 2025-09-30 21:45:17.402 2 WARNING nova.compute.manager [req-df05a2cf-ad4e-47e7-878a-ee56acb684e1 req-3b3313b2-0db7-4603-bff6-7a9b1c8f3301 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Received unexpected event network-vif-plugged-8c4f44ee-8f2b-4451-a1be-90dad356d880 for instance with vm_state active and task_state None.
Sep 30 21:45:17 compute-0 nova_compute[192810]: 2025-09-30 21:45:17.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:18 compute-0 nova_compute[192810]: 2025-09-30 21:45:18.965 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:20 compute-0 podman[243560]: 2025-09-30 21:45:20.333500443 +0000 UTC m=+0.060543758 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 21:45:20 compute-0 podman[243561]: 2025-09-30 21:45:20.34240002 +0000 UTC m=+0.065797466 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, managed_by=edpm_ansible, architecture=x86_64, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6)
Sep 30 21:45:22 compute-0 nova_compute[192810]: 2025-09-30 21:45:22.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:23 compute-0 nova_compute[192810]: 2025-09-30 21:45:23.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:27 compute-0 nova_compute[192810]: 2025-09-30 21:45:27.873 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268712.8727186, 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:45:27 compute-0 nova_compute[192810]: 2025-09-30 21:45:27.873 2 INFO nova.compute.manager [-] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] VM Stopped (Lifecycle Event)
Sep 30 21:45:27 compute-0 nova_compute[192810]: 2025-09-30 21:45:27.892 2 DEBUG nova.compute.manager [None req-27ddd8bb-b604-444f-9e51-759ea13e9bfe - - - - - -] [instance: 7dbd9807-9843-4455-8bf1-3bd7f4fa37c6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:45:27 compute-0 nova_compute[192810]: 2025-09-30 21:45:27.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:28 compute-0 podman[243605]: 2025-09-30 21:45:28.344400339 +0000 UTC m=+0.075296388 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd)
Sep 30 21:45:28 compute-0 podman[243607]: 2025-09-30 21:45:28.352825235 +0000 UTC m=+0.076607770 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:45:28 compute-0 podman[243606]: 2025-09-30 21:45:28.363150386 +0000 UTC m=+0.089796851 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, io.buildah.version=1.41.3)
Sep 30 21:45:28 compute-0 nova_compute[192810]: 2025-09-30 21:45:28.970 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:30 compute-0 sshd-session[243559]: error: kex_exchange_identification: read: Connection timed out
Sep 30 21:45:30 compute-0 sshd-session[243559]: banner exchange: Connection from 113.240.110.90 port 41396: Connection timed out
Sep 30 21:45:32 compute-0 ovn_controller[94912]: 2025-09-30T21:45:32Z|00060|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:2d:fc:5e 10.100.0.9
Sep 30 21:45:32 compute-0 ovn_controller[94912]: 2025-09-30T21:45:32Z|00061|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2d:fc:5e 10.100.0.9
Sep 30 21:45:32 compute-0 nova_compute[192810]: 2025-09-30 21:45:32.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:33 compute-0 nova_compute[192810]: 2025-09-30 21:45:33.973 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:37 compute-0 nova_compute[192810]: 2025-09-30 21:45:37.921 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:38.752 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:38.753 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:38.754 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:38 compute-0 nova_compute[192810]: 2025-09-30 21:45:38.976 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:39 compute-0 nova_compute[192810]: 2025-09-30 21:45:39.119 2 DEBUG oslo_concurrency.lockutils [None req-1e6a1ee2-7a45-48ba-aad6-b4b676be0233 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "07d2953c-4702-44dc-9c91-410f1654ccec" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:39 compute-0 nova_compute[192810]: 2025-09-30 21:45:39.119 2 DEBUG oslo_concurrency.lockutils [None req-1e6a1ee2-7a45-48ba-aad6-b4b676be0233 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "07d2953c-4702-44dc-9c91-410f1654ccec" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:39 compute-0 nova_compute[192810]: 2025-09-30 21:45:39.120 2 DEBUG oslo_concurrency.lockutils [None req-1e6a1ee2-7a45-48ba-aad6-b4b676be0233 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "07d2953c-4702-44dc-9c91-410f1654ccec-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:39 compute-0 nova_compute[192810]: 2025-09-30 21:45:39.120 2 DEBUG oslo_concurrency.lockutils [None req-1e6a1ee2-7a45-48ba-aad6-b4b676be0233 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "07d2953c-4702-44dc-9c91-410f1654ccec-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:39 compute-0 nova_compute[192810]: 2025-09-30 21:45:39.120 2 DEBUG oslo_concurrency.lockutils [None req-1e6a1ee2-7a45-48ba-aad6-b4b676be0233 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "07d2953c-4702-44dc-9c91-410f1654ccec-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:39 compute-0 nova_compute[192810]: 2025-09-30 21:45:39.131 2 INFO nova.compute.manager [None req-1e6a1ee2-7a45-48ba-aad6-b4b676be0233 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Terminating instance
Sep 30 21:45:39 compute-0 nova_compute[192810]: 2025-09-30 21:45:39.144 2 DEBUG nova.compute.manager [None req-1e6a1ee2-7a45-48ba-aad6-b4b676be0233 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:45:39 compute-0 kernel: tap8c4f44ee-8f (unregistering): left promiscuous mode
Sep 30 21:45:39 compute-0 NetworkManager[51733]: <info>  [1759268739.1873] device (tap8c4f44ee-8f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:45:39 compute-0 nova_compute[192810]: 2025-09-30 21:45:39.201 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:39 compute-0 ovn_controller[94912]: 2025-09-30T21:45:39Z|00571|binding|INFO|Releasing lport 8c4f44ee-8f2b-4451-a1be-90dad356d880 from this chassis (sb_readonly=0)
Sep 30 21:45:39 compute-0 ovn_controller[94912]: 2025-09-30T21:45:39Z|00572|binding|INFO|Setting lport 8c4f44ee-8f2b-4451-a1be-90dad356d880 down in Southbound
Sep 30 21:45:39 compute-0 ovn_controller[94912]: 2025-09-30T21:45:39Z|00573|binding|INFO|Removing iface tap8c4f44ee-8f ovn-installed in OVS
Sep 30 21:45:39 compute-0 nova_compute[192810]: 2025-09-30 21:45:39.203 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:39 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:39.217 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:fc:5e 10.100.0.9'], port_security=['fa:16:3e:2d:fc:5e 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '07d2953c-4702-44dc-9c91-410f1654ccec', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cad5d4b4-0147-4d5b-8e82-ad8835d4110a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ff42902541948f7a6df344fac87c2b7', 'neutron:revision_number': '4', 'neutron:security_group_ids': '02e17e31-bee3-4214-8c0d-d336d8499304', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ea4dfefc-3776-4359-bdd5-ef1ed99a61d3, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=8c4f44ee-8f2b-4451-a1be-90dad356d880) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:45:39 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:39.218 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 8c4f44ee-8f2b-4451-a1be-90dad356d880 in datapath cad5d4b4-0147-4d5b-8e82-ad8835d4110a unbound from our chassis
Sep 30 21:45:39 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:39.219 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cad5d4b4-0147-4d5b-8e82-ad8835d4110a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:45:39 compute-0 nova_compute[192810]: 2025-09-30 21:45:39.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:39 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:39.221 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[391a3d43-bab4-484f-9f1a-114b85128d7b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:39 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:39.222 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-cad5d4b4-0147-4d5b-8e82-ad8835d4110a namespace which is not needed anymore
Sep 30 21:45:39 compute-0 systemd[1]: machine-qemu\x2d73\x2dinstance\x2d00000095.scope: Deactivated successfully.
Sep 30 21:45:39 compute-0 systemd[1]: machine-qemu\x2d73\x2dinstance\x2d00000095.scope: Consumed 14.736s CPU time.
Sep 30 21:45:39 compute-0 systemd-machined[152794]: Machine qemu-73-instance-00000095 terminated.
Sep 30 21:45:39 compute-0 nova_compute[192810]: 2025-09-30 21:45:39.411 2 INFO nova.virt.libvirt.driver [-] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Instance destroyed successfully.
Sep 30 21:45:39 compute-0 nova_compute[192810]: 2025-09-30 21:45:39.412 2 DEBUG nova.objects.instance [None req-1e6a1ee2-7a45-48ba-aad6-b4b676be0233 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lazy-loading 'resources' on Instance uuid 07d2953c-4702-44dc-9c91-410f1654ccec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:45:39 compute-0 neutron-haproxy-ovnmeta-cad5d4b4-0147-4d5b-8e82-ad8835d4110a[243544]: [NOTICE]   (243548) : haproxy version is 2.8.14-c23fe91
Sep 30 21:45:39 compute-0 neutron-haproxy-ovnmeta-cad5d4b4-0147-4d5b-8e82-ad8835d4110a[243544]: [NOTICE]   (243548) : path to executable is /usr/sbin/haproxy
Sep 30 21:45:39 compute-0 neutron-haproxy-ovnmeta-cad5d4b4-0147-4d5b-8e82-ad8835d4110a[243544]: [WARNING]  (243548) : Exiting Master process...
Sep 30 21:45:39 compute-0 neutron-haproxy-ovnmeta-cad5d4b4-0147-4d5b-8e82-ad8835d4110a[243544]: [ALERT]    (243548) : Current worker (243550) exited with code 143 (Terminated)
Sep 30 21:45:39 compute-0 neutron-haproxy-ovnmeta-cad5d4b4-0147-4d5b-8e82-ad8835d4110a[243544]: [WARNING]  (243548) : All workers exited. Exiting... (0)
Sep 30 21:45:39 compute-0 systemd[1]: libpod-b91e4d4c9514cf01093af4629367a676f76b84916c78e2f19ce2f3bafa274fd8.scope: Deactivated successfully.
Sep 30 21:45:39 compute-0 podman[243709]: 2025-09-30 21:45:39.454386685 +0000 UTC m=+0.155215487 container died b91e4d4c9514cf01093af4629367a676f76b84916c78e2f19ce2f3bafa274fd8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cad5d4b4-0147-4d5b-8e82-ad8835d4110a, org.label-schema.build-date=20250923, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 21:45:39 compute-0 nova_compute[192810]: 2025-09-30 21:45:39.457 2 DEBUG nova.virt.libvirt.vif [None req-1e6a1ee2-7a45-48ba-aad6-b4b676be0233 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:45:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-gen-0-1136351186',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-gen-0-1136351186',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2108116341-ge',id=149,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL1/fVTmGYuOOwxSGbH9N8TvZbqR3p/LXcDidDQZkj1x/r1cncivgRIPmN5OLuNtDCDFO3TM2NYeeoJUUNE27RLr42R4vAhyNzhABLK9jtr/4dNw2wyPqxeNtjp2/KxusQ==',key_name='tempest-TestSecurityGroupsBasicOps-1980363703',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:45:16Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1ff42902541948f7a6df344fac87c2b7',ramdisk_id='',reservation_id='r-810si326',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-2108116341',owner_user_name='tempest-TestSecurityGroupsBasicOps-2108116341-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:45:16Z,user_data=None,user_id='c33a752ef8234bba917ace1e73763490',uuid=07d2953c-4702-44dc-9c91-410f1654ccec,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8c4f44ee-8f2b-4451-a1be-90dad356d880", "address": "fa:16:3e:2d:fc:5e", "network": {"id": "cad5d4b4-0147-4d5b-8e82-ad8835d4110a", "bridge": "br-int", "label": "tempest-network-smoke--54130567", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c4f44ee-8f", "ovs_interfaceid": "8c4f44ee-8f2b-4451-a1be-90dad356d880", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:45:39 compute-0 nova_compute[192810]: 2025-09-30 21:45:39.458 2 DEBUG nova.network.os_vif_util [None req-1e6a1ee2-7a45-48ba-aad6-b4b676be0233 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Converting VIF {"id": "8c4f44ee-8f2b-4451-a1be-90dad356d880", "address": "fa:16:3e:2d:fc:5e", "network": {"id": "cad5d4b4-0147-4d5b-8e82-ad8835d4110a", "bridge": "br-int", "label": "tempest-network-smoke--54130567", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8c4f44ee-8f", "ovs_interfaceid": "8c4f44ee-8f2b-4451-a1be-90dad356d880", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:45:39 compute-0 nova_compute[192810]: 2025-09-30 21:45:39.460 2 DEBUG nova.network.os_vif_util [None req-1e6a1ee2-7a45-48ba-aad6-b4b676be0233 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:fc:5e,bridge_name='br-int',has_traffic_filtering=True,id=8c4f44ee-8f2b-4451-a1be-90dad356d880,network=Network(cad5d4b4-0147-4d5b-8e82-ad8835d4110a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c4f44ee-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:45:39 compute-0 nova_compute[192810]: 2025-09-30 21:45:39.461 2 DEBUG os_vif [None req-1e6a1ee2-7a45-48ba-aad6-b4b676be0233 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:fc:5e,bridge_name='br-int',has_traffic_filtering=True,id=8c4f44ee-8f2b-4451-a1be-90dad356d880,network=Network(cad5d4b4-0147-4d5b-8e82-ad8835d4110a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c4f44ee-8f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:45:39 compute-0 nova_compute[192810]: 2025-09-30 21:45:39.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:39 compute-0 nova_compute[192810]: 2025-09-30 21:45:39.466 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8c4f44ee-8f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:45:39 compute-0 nova_compute[192810]: 2025-09-30 21:45:39.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:39 compute-0 nova_compute[192810]: 2025-09-30 21:45:39.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:45:39 compute-0 nova_compute[192810]: 2025-09-30 21:45:39.473 2 INFO os_vif [None req-1e6a1ee2-7a45-48ba-aad6-b4b676be0233 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:fc:5e,bridge_name='br-int',has_traffic_filtering=True,id=8c4f44ee-8f2b-4451-a1be-90dad356d880,network=Network(cad5d4b4-0147-4d5b-8e82-ad8835d4110a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8c4f44ee-8f')
Sep 30 21:45:39 compute-0 nova_compute[192810]: 2025-09-30 21:45:39.473 2 INFO nova.virt.libvirt.driver [None req-1e6a1ee2-7a45-48ba-aad6-b4b676be0233 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Deleting instance files /var/lib/nova/instances/07d2953c-4702-44dc-9c91-410f1654ccec_del
Sep 30 21:45:39 compute-0 nova_compute[192810]: 2025-09-30 21:45:39.474 2 INFO nova.virt.libvirt.driver [None req-1e6a1ee2-7a45-48ba-aad6-b4b676be0233 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Deletion of /var/lib/nova/instances/07d2953c-4702-44dc-9c91-410f1654ccec_del complete
Sep 30 21:45:39 compute-0 nova_compute[192810]: 2025-09-30 21:45:39.544 2 INFO nova.compute.manager [None req-1e6a1ee2-7a45-48ba-aad6-b4b676be0233 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Took 0.40 seconds to destroy the instance on the hypervisor.
Sep 30 21:45:39 compute-0 nova_compute[192810]: 2025-09-30 21:45:39.545 2 DEBUG oslo.service.loopingcall [None req-1e6a1ee2-7a45-48ba-aad6-b4b676be0233 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:45:39 compute-0 nova_compute[192810]: 2025-09-30 21:45:39.545 2 DEBUG nova.compute.manager [-] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:45:39 compute-0 nova_compute[192810]: 2025-09-30 21:45:39.545 2 DEBUG nova.network.neutron [-] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:45:39 compute-0 nova_compute[192810]: 2025-09-30 21:45:39.642 2 DEBUG nova.compute.manager [req-6fc3b926-5821-49e0-8eb2-12ecab652b38 req-336a9a8c-468e-4bab-80dc-783ee7a25c9b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Received event network-vif-unplugged-8c4f44ee-8f2b-4451-a1be-90dad356d880 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:45:39 compute-0 nova_compute[192810]: 2025-09-30 21:45:39.642 2 DEBUG oslo_concurrency.lockutils [req-6fc3b926-5821-49e0-8eb2-12ecab652b38 req-336a9a8c-468e-4bab-80dc-783ee7a25c9b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "07d2953c-4702-44dc-9c91-410f1654ccec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:39 compute-0 nova_compute[192810]: 2025-09-30 21:45:39.642 2 DEBUG oslo_concurrency.lockutils [req-6fc3b926-5821-49e0-8eb2-12ecab652b38 req-336a9a8c-468e-4bab-80dc-783ee7a25c9b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "07d2953c-4702-44dc-9c91-410f1654ccec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:39 compute-0 nova_compute[192810]: 2025-09-30 21:45:39.643 2 DEBUG oslo_concurrency.lockutils [req-6fc3b926-5821-49e0-8eb2-12ecab652b38 req-336a9a8c-468e-4bab-80dc-783ee7a25c9b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "07d2953c-4702-44dc-9c91-410f1654ccec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:39 compute-0 nova_compute[192810]: 2025-09-30 21:45:39.643 2 DEBUG nova.compute.manager [req-6fc3b926-5821-49e0-8eb2-12ecab652b38 req-336a9a8c-468e-4bab-80dc-783ee7a25c9b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] No waiting events found dispatching network-vif-unplugged-8c4f44ee-8f2b-4451-a1be-90dad356d880 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:45:39 compute-0 nova_compute[192810]: 2025-09-30 21:45:39.643 2 DEBUG nova.compute.manager [req-6fc3b926-5821-49e0-8eb2-12ecab652b38 req-336a9a8c-468e-4bab-80dc-783ee7a25c9b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Received event network-vif-unplugged-8c4f44ee-8f2b-4451-a1be-90dad356d880 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:45:39 compute-0 systemd[1]: var-lib-containers-storage-overlay-077bafdbe378c25084ff7ba7c651d39d2177d89afdaa9f70bd734e15ebb29718-merged.mount: Deactivated successfully.
Sep 30 21:45:39 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b91e4d4c9514cf01093af4629367a676f76b84916c78e2f19ce2f3bafa274fd8-userdata-shm.mount: Deactivated successfully.
Sep 30 21:45:39 compute-0 podman[243709]: 2025-09-30 21:45:39.992270335 +0000 UTC m=+0.693099137 container cleanup b91e4d4c9514cf01093af4629367a676f76b84916c78e2f19ce2f3bafa274fd8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cad5d4b4-0147-4d5b-8e82-ad8835d4110a, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 21:45:39 compute-0 systemd[1]: libpod-conmon-b91e4d4c9514cf01093af4629367a676f76b84916c78e2f19ce2f3bafa274fd8.scope: Deactivated successfully.
Sep 30 21:45:40 compute-0 podman[243757]: 2025-09-30 21:45:40.184661956 +0000 UTC m=+0.171681047 container remove b91e4d4c9514cf01093af4629367a676f76b84916c78e2f19ce2f3bafa274fd8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cad5d4b4-0147-4d5b-8e82-ad8835d4110a, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3)
Sep 30 21:45:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:40.195 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[ef4753e7-bb39-481e-ad6b-8835cd435975]: (4, ('Tue Sep 30 09:45:39 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-cad5d4b4-0147-4d5b-8e82-ad8835d4110a (b91e4d4c9514cf01093af4629367a676f76b84916c78e2f19ce2f3bafa274fd8)\nb91e4d4c9514cf01093af4629367a676f76b84916c78e2f19ce2f3bafa274fd8\nTue Sep 30 09:45:40 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-cad5d4b4-0147-4d5b-8e82-ad8835d4110a (b91e4d4c9514cf01093af4629367a676f76b84916c78e2f19ce2f3bafa274fd8)\nb91e4d4c9514cf01093af4629367a676f76b84916c78e2f19ce2f3bafa274fd8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:40.197 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[235545b2-2471-44e5-b518-2a3b8295b3aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:40.198 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcad5d4b4-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:45:40 compute-0 kernel: tapcad5d4b4-00: left promiscuous mode
Sep 30 21:45:40 compute-0 nova_compute[192810]: 2025-09-30 21:45:40.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:40 compute-0 nova_compute[192810]: 2025-09-30 21:45:40.225 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:40.230 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[337e8a0b-70e3-402a-a92f-e096e8d3f956]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:40.255 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[4cb3e347-3d3d-4c82-a660-3dfa7de3a578]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:40.256 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a74b217c-8169-43d6-aacd-d1bd1a054fb7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:40.270 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[3fc450be-01f5-41ea-a947-dd53cecc731e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 541078, 'reachable_time': 34750, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243770, 'error': None, 'target': 'ovnmeta-cad5d4b4-0147-4d5b-8e82-ad8835d4110a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:40.273 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-cad5d4b4-0147-4d5b-8e82-ad8835d4110a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:45:40 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:40.273 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[a1a8486c-1ae1-47d2-a4cf-edad1ea01f6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:40 compute-0 systemd[1]: run-netns-ovnmeta\x2dcad5d4b4\x2d0147\x2d4d5b\x2d8e82\x2dad8835d4110a.mount: Deactivated successfully.
Sep 30 21:45:40 compute-0 nova_compute[192810]: 2025-09-30 21:45:40.736 2 DEBUG nova.network.neutron [-] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:45:40 compute-0 nova_compute[192810]: 2025-09-30 21:45:40.773 2 INFO nova.compute.manager [-] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Took 1.23 seconds to deallocate network for instance.
Sep 30 21:45:40 compute-0 nova_compute[192810]: 2025-09-30 21:45:40.852 2 DEBUG oslo_concurrency.lockutils [None req-1e6a1ee2-7a45-48ba-aad6-b4b676be0233 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:40 compute-0 nova_compute[192810]: 2025-09-30 21:45:40.853 2 DEBUG oslo_concurrency.lockutils [None req-1e6a1ee2-7a45-48ba-aad6-b4b676be0233 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:40 compute-0 nova_compute[192810]: 2025-09-30 21:45:40.858 2 DEBUG nova.compute.manager [req-15a217d6-e93c-4555-b0ea-067f3168a887 req-4f31d36e-224b-4d88-a9e4-6341ba5bb49e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Received event network-vif-deleted-8c4f44ee-8f2b-4451-a1be-90dad356d880 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:45:40 compute-0 nova_compute[192810]: 2025-09-30 21:45:40.916 2 DEBUG nova.compute.provider_tree [None req-1e6a1ee2-7a45-48ba-aad6-b4b676be0233 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:45:40 compute-0 nova_compute[192810]: 2025-09-30 21:45:40.928 2 DEBUG nova.scheduler.client.report [None req-1e6a1ee2-7a45-48ba-aad6-b4b676be0233 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:45:40 compute-0 nova_compute[192810]: 2025-09-30 21:45:40.947 2 DEBUG oslo_concurrency.lockutils [None req-1e6a1ee2-7a45-48ba-aad6-b4b676be0233 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.094s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:40 compute-0 nova_compute[192810]: 2025-09-30 21:45:40.971 2 INFO nova.scheduler.client.report [None req-1e6a1ee2-7a45-48ba-aad6-b4b676be0233 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Deleted allocations for instance 07d2953c-4702-44dc-9c91-410f1654ccec
Sep 30 21:45:41 compute-0 nova_compute[192810]: 2025-09-30 21:45:41.087 2 DEBUG oslo_concurrency.lockutils [None req-1e6a1ee2-7a45-48ba-aad6-b4b676be0233 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "07d2953c-4702-44dc-9c91-410f1654ccec" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.968s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:41 compute-0 nova_compute[192810]: 2025-09-30 21:45:41.824 2 DEBUG nova.compute.manager [req-96d61777-1b1e-49d1-8d42-33a1a02081c7 req-44a8ce32-000c-4a9f-8728-af49d523e741 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Received event network-vif-plugged-8c4f44ee-8f2b-4451-a1be-90dad356d880 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:45:41 compute-0 nova_compute[192810]: 2025-09-30 21:45:41.824 2 DEBUG oslo_concurrency.lockutils [req-96d61777-1b1e-49d1-8d42-33a1a02081c7 req-44a8ce32-000c-4a9f-8728-af49d523e741 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "07d2953c-4702-44dc-9c91-410f1654ccec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:41 compute-0 nova_compute[192810]: 2025-09-30 21:45:41.824 2 DEBUG oslo_concurrency.lockutils [req-96d61777-1b1e-49d1-8d42-33a1a02081c7 req-44a8ce32-000c-4a9f-8728-af49d523e741 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "07d2953c-4702-44dc-9c91-410f1654ccec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:41 compute-0 nova_compute[192810]: 2025-09-30 21:45:41.825 2 DEBUG oslo_concurrency.lockutils [req-96d61777-1b1e-49d1-8d42-33a1a02081c7 req-44a8ce32-000c-4a9f-8728-af49d523e741 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "07d2953c-4702-44dc-9c91-410f1654ccec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:41 compute-0 nova_compute[192810]: 2025-09-30 21:45:41.825 2 DEBUG nova.compute.manager [req-96d61777-1b1e-49d1-8d42-33a1a02081c7 req-44a8ce32-000c-4a9f-8728-af49d523e741 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] No waiting events found dispatching network-vif-plugged-8c4f44ee-8f2b-4451-a1be-90dad356d880 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:45:41 compute-0 nova_compute[192810]: 2025-09-30 21:45:41.825 2 WARNING nova.compute.manager [req-96d61777-1b1e-49d1-8d42-33a1a02081c7 req-44a8ce32-000c-4a9f-8728-af49d523e741 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Received unexpected event network-vif-plugged-8c4f44ee-8f2b-4451-a1be-90dad356d880 for instance with vm_state deleted and task_state None.
Sep 30 21:45:42 compute-0 nova_compute[192810]: 2025-09-30 21:45:42.924 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:43 compute-0 podman[243772]: 2025-09-30 21:45:43.326622653 +0000 UTC m=+0.055515605 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:45:43 compute-0 podman[243773]: 2025-09-30 21:45:43.335117551 +0000 UTC m=+0.061328117 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923)
Sep 30 21:45:43 compute-0 podman[243771]: 2025-09-30 21:45:43.352027393 +0000 UTC m=+0.081492209 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:45:44 compute-0 nova_compute[192810]: 2025-09-30 21:45:44.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:46 compute-0 nova_compute[192810]: 2025-09-30 21:45:46.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:46.946 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=35, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=34) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:45:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:46.947 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:45:47 compute-0 ovn_controller[94912]: 2025-09-30T21:45:47Z|00574|binding|INFO|Releasing lport 227142f6-4b16-4f5b-a481-3e42b58e84ee from this chassis (sb_readonly=0)
Sep 30 21:45:47 compute-0 ovn_controller[94912]: 2025-09-30T21:45:47Z|00575|binding|INFO|Releasing lport dd6bfcb5-ae09-4e02-b34a-7b6921b33bb7 from this chassis (sb_readonly=0)
Sep 30 21:45:47 compute-0 nova_compute[192810]: 2025-09-30 21:45:47.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:47 compute-0 nova_compute[192810]: 2025-09-30 21:45:47.926 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:49 compute-0 ovn_controller[94912]: 2025-09-30T21:45:49Z|00576|binding|INFO|Releasing lport 227142f6-4b16-4f5b-a481-3e42b58e84ee from this chassis (sb_readonly=0)
Sep 30 21:45:49 compute-0 ovn_controller[94912]: 2025-09-30T21:45:49Z|00577|binding|INFO|Releasing lport dd6bfcb5-ae09-4e02-b34a-7b6921b33bb7 from this chassis (sb_readonly=0)
Sep 30 21:45:49 compute-0 nova_compute[192810]: 2025-09-30 21:45:49.129 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:49 compute-0 nova_compute[192810]: 2025-09-30 21:45:49.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:51 compute-0 podman[243833]: 2025-09-30 21:45:51.332680778 +0000 UTC m=+0.069547417 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, build-date=2025-08-20T13:12:41, distribution-scope=public, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, architecture=x86_64, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 21:45:51 compute-0 podman[243832]: 2025-09-30 21:45:51.343350679 +0000 UTC m=+0.074843537 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Sep 30 21:45:51 compute-0 nova_compute[192810]: 2025-09-30 21:45:51.948 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.208 2 DEBUG oslo_concurrency.lockutils [None req-bf442e01-f84b-4d5a-b932-6857d9571e7b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "b95d1bc0-5bee-4787-b4ff-a0908361ae3e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.209 2 DEBUG oslo_concurrency.lockutils [None req-bf442e01-f84b-4d5a-b932-6857d9571e7b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "b95d1bc0-5bee-4787-b4ff-a0908361ae3e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.209 2 DEBUG oslo_concurrency.lockutils [None req-bf442e01-f84b-4d5a-b932-6857d9571e7b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "b95d1bc0-5bee-4787-b4ff-a0908361ae3e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.209 2 DEBUG oslo_concurrency.lockutils [None req-bf442e01-f84b-4d5a-b932-6857d9571e7b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "b95d1bc0-5bee-4787-b4ff-a0908361ae3e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.209 2 DEBUG oslo_concurrency.lockutils [None req-bf442e01-f84b-4d5a-b932-6857d9571e7b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "b95d1bc0-5bee-4787-b4ff-a0908361ae3e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.223 2 INFO nova.compute.manager [None req-bf442e01-f84b-4d5a-b932-6857d9571e7b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Terminating instance
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.233 2 DEBUG nova.compute.manager [None req-bf442e01-f84b-4d5a-b932-6857d9571e7b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:45:52 compute-0 kernel: tapd768580b-cf (unregistering): left promiscuous mode
Sep 30 21:45:52 compute-0 NetworkManager[51733]: <info>  [1759268752.2839] device (tapd768580b-cf): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:52 compute-0 ovn_controller[94912]: 2025-09-30T21:45:52Z|00578|binding|INFO|Releasing lport d768580b-cf31-4877-8394-7501915ecf87 from this chassis (sb_readonly=0)
Sep 30 21:45:52 compute-0 ovn_controller[94912]: 2025-09-30T21:45:52Z|00579|binding|INFO|Setting lport d768580b-cf31-4877-8394-7501915ecf87 down in Southbound
Sep 30 21:45:52 compute-0 ovn_controller[94912]: 2025-09-30T21:45:52Z|00580|binding|INFO|Removing iface tapd768580b-cf ovn-installed in OVS
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:52.306 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b1:eb:aa 10.100.0.3'], port_security=['fa:16:3e:b1:eb:aa 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'b95d1bc0-5bee-4787-b4ff-a0908361ae3e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-42bc7c38-1ece-4458-bbe7-6a2159483355', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '07285f81-9b17-4ece-beed-7a792b1d2f3a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e4dea503-319e-441b-ba9c-fea1c678426f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=d768580b-cf31-4877-8394-7501915ecf87) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:45:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:52.307 103867 INFO neutron.agent.ovn.metadata.agent [-] Port d768580b-cf31-4877-8394-7501915ecf87 in datapath 42bc7c38-1ece-4458-bbe7-6a2159483355 unbound from our chassis
Sep 30 21:45:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:52.308 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 42bc7c38-1ece-4458-bbe7-6a2159483355, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:45:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:52.309 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[25838e58-7632-4b27-9e7e-331de2a224ed]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:52.310 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-42bc7c38-1ece-4458-bbe7-6a2159483355 namespace which is not needed anymore
Sep 30 21:45:52 compute-0 kernel: tap1ce81fa7-f0 (unregistering): left promiscuous mode
Sep 30 21:45:52 compute-0 NetworkManager[51733]: <info>  [1759268752.3265] device (tap1ce81fa7-f0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:45:52 compute-0 ovn_controller[94912]: 2025-09-30T21:45:52Z|00581|binding|INFO|Releasing lport 1ce81fa7-f0cd-41b6-bf59-835593359c9f from this chassis (sb_readonly=0)
Sep 30 21:45:52 compute-0 ovn_controller[94912]: 2025-09-30T21:45:52Z|00582|binding|INFO|Setting lport 1ce81fa7-f0cd-41b6-bf59-835593359c9f down in Southbound
Sep 30 21:45:52 compute-0 ovn_controller[94912]: 2025-09-30T21:45:52Z|00583|binding|INFO|Removing iface tap1ce81fa7-f0 ovn-installed in OVS
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.338 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:52.344 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f0:5b:98 2001:db8::f816:3eff:fef0:5b98'], port_security=['fa:16:3e:f0:5b:98 2001:db8::f816:3eff:fef0:5b98'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fef0:5b98/64', 'neutron:device_id': 'b95d1bc0-5bee-4787-b4ff-a0908361ae3e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2d27a62d-95fd-4cbb-bc4d-20b0757663cc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '07285f81-9b17-4ece-beed-7a792b1d2f3a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=91467be9-be1a-467e-bb76-f834d9445de1, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=1ce81fa7-f0cd-41b6-bf59-835593359c9f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:52 compute-0 systemd[1]: machine-qemu\x2d71\x2dinstance\x2d00000092.scope: Deactivated successfully.
Sep 30 21:45:52 compute-0 systemd[1]: machine-qemu\x2d71\x2dinstance\x2d00000092.scope: Consumed 15.511s CPU time.
Sep 30 21:45:52 compute-0 systemd-machined[152794]: Machine qemu-71-instance-00000092 terminated.
Sep 30 21:45:52 compute-0 NetworkManager[51733]: <info>  [1759268752.4526] manager: (tapd768580b-cf): new Tun device (/org/freedesktop/NetworkManager/Devices/255)
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:52 compute-0 NetworkManager[51733]: <info>  [1759268752.4703] manager: (tap1ce81fa7-f0): new Tun device (/org/freedesktop/NetworkManager/Devices/256)
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.511 2 INFO nova.virt.libvirt.driver [-] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Instance destroyed successfully.
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.512 2 DEBUG nova.objects.instance [None req-bf442e01-f84b-4d5a-b932-6857d9571e7b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lazy-loading 'resources' on Instance uuid b95d1bc0-5bee-4787-b4ff-a0908361ae3e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.532 2 DEBUG nova.virt.libvirt.vif [None req-bf442e01-f84b-4d5a-b932-6857d9571e7b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:44:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1273749312',display_name='tempest-TestGettingAddress-server-1273749312',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1273749312',id=146,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCTmN/pNaqmaOiFycAcQSYACgRKuAneZoZAKklPa3C16A/2q/HkgtY0P/DZxd1e1qV4ltimB5etVGHV7nuq0tFOyUUUwlQywxyklzBGPgbQap3YQgrb4ac6Zf8Ctmbv6xQ==',key_name='tempest-TestGettingAddress-2018950633',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:44:47Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-0sr8r1f4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:44:47Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=b95d1bc0-5bee-4787-b4ff-a0908361ae3e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d768580b-cf31-4877-8394-7501915ecf87", "address": "fa:16:3e:b1:eb:aa", "network": {"id": "42bc7c38-1ece-4458-bbe7-6a2159483355", "bridge": "br-int", "label": "tempest-network-smoke--145108959", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd768580b-cf", "ovs_interfaceid": "d768580b-cf31-4877-8394-7501915ecf87", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.533 2 DEBUG nova.network.os_vif_util [None req-bf442e01-f84b-4d5a-b932-6857d9571e7b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "d768580b-cf31-4877-8394-7501915ecf87", "address": "fa:16:3e:b1:eb:aa", "network": {"id": "42bc7c38-1ece-4458-bbe7-6a2159483355", "bridge": "br-int", "label": "tempest-network-smoke--145108959", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd768580b-cf", "ovs_interfaceid": "d768580b-cf31-4877-8394-7501915ecf87", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.533 2 DEBUG nova.network.os_vif_util [None req-bf442e01-f84b-4d5a-b932-6857d9571e7b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b1:eb:aa,bridge_name='br-int',has_traffic_filtering=True,id=d768580b-cf31-4877-8394-7501915ecf87,network=Network(42bc7c38-1ece-4458-bbe7-6a2159483355),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd768580b-cf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.533 2 DEBUG os_vif [None req-bf442e01-f84b-4d5a-b932-6857d9571e7b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b1:eb:aa,bridge_name='br-int',has_traffic_filtering=True,id=d768580b-cf31-4877-8394-7501915ecf87,network=Network(42bc7c38-1ece-4458-bbe7-6a2159483355),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd768580b-cf') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.535 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.535 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd768580b-cf, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.536 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.538 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.542 2 INFO os_vif [None req-bf442e01-f84b-4d5a-b932-6857d9571e7b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b1:eb:aa,bridge_name='br-int',has_traffic_filtering=True,id=d768580b-cf31-4877-8394-7501915ecf87,network=Network(42bc7c38-1ece-4458-bbe7-6a2159483355),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd768580b-cf')
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.543 2 DEBUG nova.virt.libvirt.vif [None req-bf442e01-f84b-4d5a-b932-6857d9571e7b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:44:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1273749312',display_name='tempest-TestGettingAddress-server-1273749312',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1273749312',id=146,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCTmN/pNaqmaOiFycAcQSYACgRKuAneZoZAKklPa3C16A/2q/HkgtY0P/DZxd1e1qV4ltimB5etVGHV7nuq0tFOyUUUwlQywxyklzBGPgbQap3YQgrb4ac6Zf8Ctmbv6xQ==',key_name='tempest-TestGettingAddress-2018950633',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:44:47Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-0sr8r1f4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:44:47Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=b95d1bc0-5bee-4787-b4ff-a0908361ae3e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1ce81fa7-f0cd-41b6-bf59-835593359c9f", "address": "fa:16:3e:f0:5b:98", "network": {"id": "2d27a62d-95fd-4cbb-bc4d-20b0757663cc", "bridge": "br-int", "label": "tempest-network-smoke--589443991", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fef0:5b98", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ce81fa7-f0", "ovs_interfaceid": "1ce81fa7-f0cd-41b6-bf59-835593359c9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.544 2 DEBUG nova.network.os_vif_util [None req-bf442e01-f84b-4d5a-b932-6857d9571e7b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "1ce81fa7-f0cd-41b6-bf59-835593359c9f", "address": "fa:16:3e:f0:5b:98", "network": {"id": "2d27a62d-95fd-4cbb-bc4d-20b0757663cc", "bridge": "br-int", "label": "tempest-network-smoke--589443991", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fef0:5b98", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ce81fa7-f0", "ovs_interfaceid": "1ce81fa7-f0cd-41b6-bf59-835593359c9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.544 2 DEBUG nova.network.os_vif_util [None req-bf442e01-f84b-4d5a-b932-6857d9571e7b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f0:5b:98,bridge_name='br-int',has_traffic_filtering=True,id=1ce81fa7-f0cd-41b6-bf59-835593359c9f,network=Network(2d27a62d-95fd-4cbb-bc4d-20b0757663cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1ce81fa7-f0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.545 2 DEBUG os_vif [None req-bf442e01-f84b-4d5a-b932-6857d9571e7b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f0:5b:98,bridge_name='br-int',has_traffic_filtering=True,id=1ce81fa7-f0cd-41b6-bf59-835593359c9f,network=Network(2d27a62d-95fd-4cbb-bc4d-20b0757663cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1ce81fa7-f0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.546 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1ce81fa7-f0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.551 2 INFO os_vif [None req-bf442e01-f84b-4d5a-b932-6857d9571e7b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f0:5b:98,bridge_name='br-int',has_traffic_filtering=True,id=1ce81fa7-f0cd-41b6-bf59-835593359c9f,network=Network(2d27a62d-95fd-4cbb-bc4d-20b0757663cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1ce81fa7-f0')
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.552 2 INFO nova.virt.libvirt.driver [None req-bf442e01-f84b-4d5a-b932-6857d9571e7b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Deleting instance files /var/lib/nova/instances/b95d1bc0-5bee-4787-b4ff-a0908361ae3e_del
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.552 2 INFO nova.virt.libvirt.driver [None req-bf442e01-f84b-4d5a-b932-6857d9571e7b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Deletion of /var/lib/nova/instances/b95d1bc0-5bee-4787-b4ff-a0908361ae3e_del complete
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.616 2 DEBUG nova.compute.manager [req-93683f58-9428-4973-ad15-02478c8427b8 req-f4a84894-d8e5-4a05-bdd8-a41886d3f61c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Received event network-vif-unplugged-1ce81fa7-f0cd-41b6-bf59-835593359c9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.617 2 DEBUG oslo_concurrency.lockutils [req-93683f58-9428-4973-ad15-02478c8427b8 req-f4a84894-d8e5-4a05-bdd8-a41886d3f61c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "b95d1bc0-5bee-4787-b4ff-a0908361ae3e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.617 2 DEBUG oslo_concurrency.lockutils [req-93683f58-9428-4973-ad15-02478c8427b8 req-f4a84894-d8e5-4a05-bdd8-a41886d3f61c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b95d1bc0-5bee-4787-b4ff-a0908361ae3e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.617 2 DEBUG oslo_concurrency.lockutils [req-93683f58-9428-4973-ad15-02478c8427b8 req-f4a84894-d8e5-4a05-bdd8-a41886d3f61c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b95d1bc0-5bee-4787-b4ff-a0908361ae3e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.617 2 DEBUG nova.compute.manager [req-93683f58-9428-4973-ad15-02478c8427b8 req-f4a84894-d8e5-4a05-bdd8-a41886d3f61c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] No waiting events found dispatching network-vif-unplugged-1ce81fa7-f0cd-41b6-bf59-835593359c9f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.617 2 DEBUG nova.compute.manager [req-93683f58-9428-4973-ad15-02478c8427b8 req-f4a84894-d8e5-4a05-bdd8-a41886d3f61c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Received event network-vif-unplugged-1ce81fa7-f0cd-41b6-bf59-835593359c9f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.654 2 DEBUG nova.compute.manager [req-a2052364-a8a2-41bf-a76c-3f0b20f290a8 req-13f585c8-b0ec-423e-a2f2-ffb36ec06ada dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Received event network-changed-d768580b-cf31-4877-8394-7501915ecf87 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.655 2 DEBUG nova.compute.manager [req-a2052364-a8a2-41bf-a76c-3f0b20f290a8 req-13f585c8-b0ec-423e-a2f2-ffb36ec06ada dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Refreshing instance network info cache due to event network-changed-d768580b-cf31-4877-8394-7501915ecf87. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.655 2 DEBUG oslo_concurrency.lockutils [req-a2052364-a8a2-41bf-a76c-3f0b20f290a8 req-13f585c8-b0ec-423e-a2f2-ffb36ec06ada dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-b95d1bc0-5bee-4787-b4ff-a0908361ae3e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.655 2 DEBUG oslo_concurrency.lockutils [req-a2052364-a8a2-41bf-a76c-3f0b20f290a8 req-13f585c8-b0ec-423e-a2f2-ffb36ec06ada dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-b95d1bc0-5bee-4787-b4ff-a0908361ae3e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.656 2 DEBUG nova.network.neutron [req-a2052364-a8a2-41bf-a76c-3f0b20f290a8 req-13f585c8-b0ec-423e-a2f2-ffb36ec06ada dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Refreshing network info cache for port d768580b-cf31-4877-8394-7501915ecf87 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.659 2 INFO nova.compute.manager [None req-bf442e01-f84b-4d5a-b932-6857d9571e7b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Took 0.42 seconds to destroy the instance on the hypervisor.
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.659 2 DEBUG oslo.service.loopingcall [None req-bf442e01-f84b-4d5a-b932-6857d9571e7b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.660 2 DEBUG nova.compute.manager [-] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.660 2 DEBUG nova.network.neutron [-] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:45:52 compute-0 neutron-haproxy-ovnmeta-42bc7c38-1ece-4458-bbe7-6a2159483355[242879]: [NOTICE]   (242883) : haproxy version is 2.8.14-c23fe91
Sep 30 21:45:52 compute-0 neutron-haproxy-ovnmeta-42bc7c38-1ece-4458-bbe7-6a2159483355[242879]: [NOTICE]   (242883) : path to executable is /usr/sbin/haproxy
Sep 30 21:45:52 compute-0 neutron-haproxy-ovnmeta-42bc7c38-1ece-4458-bbe7-6a2159483355[242879]: [WARNING]  (242883) : Exiting Master process...
Sep 30 21:45:52 compute-0 neutron-haproxy-ovnmeta-42bc7c38-1ece-4458-bbe7-6a2159483355[242879]: [WARNING]  (242883) : Exiting Master process...
Sep 30 21:45:52 compute-0 neutron-haproxy-ovnmeta-42bc7c38-1ece-4458-bbe7-6a2159483355[242879]: [ALERT]    (242883) : Current worker (242885) exited with code 143 (Terminated)
Sep 30 21:45:52 compute-0 neutron-haproxy-ovnmeta-42bc7c38-1ece-4458-bbe7-6a2159483355[242879]: [WARNING]  (242883) : All workers exited. Exiting... (0)
Sep 30 21:45:52 compute-0 systemd[1]: libpod-bffc8528a68fc6b4f4e4a96793e45abc9826d8a21a38465df7107de226194e61.scope: Deactivated successfully.
Sep 30 21:45:52 compute-0 podman[243903]: 2025-09-30 21:45:52.684899821 +0000 UTC m=+0.278708269 container died bffc8528a68fc6b4f4e4a96793e45abc9826d8a21a38465df7107de226194e61 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42bc7c38-1ece-4458-bbe7-6a2159483355, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:45:52 compute-0 nova_compute[192810]: 2025-09-30 21:45:52.927 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:53 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bffc8528a68fc6b4f4e4a96793e45abc9826d8a21a38465df7107de226194e61-userdata-shm.mount: Deactivated successfully.
Sep 30 21:45:53 compute-0 systemd[1]: var-lib-containers-storage-overlay-b25ad471f863014ab41b298be63725593b153ed2def4d4612540a50bddb1fb10-merged.mount: Deactivated successfully.
Sep 30 21:45:53 compute-0 podman[243903]: 2025-09-30 21:45:53.256902983 +0000 UTC m=+0.850711431 container cleanup bffc8528a68fc6b4f4e4a96793e45abc9826d8a21a38465df7107de226194e61 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42bc7c38-1ece-4458-bbe7-6a2159483355, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Sep 30 21:45:53 compute-0 podman[243963]: 2025-09-30 21:45:53.521909117 +0000 UTC m=+0.242243400 container remove bffc8528a68fc6b4f4e4a96793e45abc9826d8a21a38465df7107de226194e61 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42bc7c38-1ece-4458-bbe7-6a2159483355, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2)
Sep 30 21:45:53 compute-0 systemd[1]: libpod-conmon-bffc8528a68fc6b4f4e4a96793e45abc9826d8a21a38465df7107de226194e61.scope: Deactivated successfully.
Sep 30 21:45:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:53.528 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[702d6697-79ff-477a-9200-6ebe07e14845]: (4, ('Tue Sep 30 09:45:52 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-42bc7c38-1ece-4458-bbe7-6a2159483355 (bffc8528a68fc6b4f4e4a96793e45abc9826d8a21a38465df7107de226194e61)\nbffc8528a68fc6b4f4e4a96793e45abc9826d8a21a38465df7107de226194e61\nTue Sep 30 09:45:53 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-42bc7c38-1ece-4458-bbe7-6a2159483355 (bffc8528a68fc6b4f4e4a96793e45abc9826d8a21a38465df7107de226194e61)\nbffc8528a68fc6b4f4e4a96793e45abc9826d8a21a38465df7107de226194e61\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:53.530 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[fd61f036-ad71-4e56-9b44-7d268836289a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:53.531 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap42bc7c38-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:45:53 compute-0 kernel: tap42bc7c38-10: left promiscuous mode
Sep 30 21:45:53 compute-0 nova_compute[192810]: 2025-09-30 21:45:53.535 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:53 compute-0 nova_compute[192810]: 2025-09-30 21:45:53.545 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:53 compute-0 nova_compute[192810]: 2025-09-30 21:45:53.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:53.548 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[c6dada79-8180-433e-8d6f-7c2ac3e3cd20]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:53.575 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[97c6c821-24da-41d3-9465-20798794283c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:53.576 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[3d6ce84c-a0ae-42bf-9416-5bfce297a676]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:53.591 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[52b45cac-62c7-4583-bba1-8938fbc859e2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 538070, 'reachable_time': 35260, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243980, 'error': None, 'target': 'ovnmeta-42bc7c38-1ece-4458-bbe7-6a2159483355', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:53.593 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-42bc7c38-1ece-4458-bbe7-6a2159483355 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:45:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:53.593 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[0af9a43d-c40e-41ec-aedd-fd92cd3e9b02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:53.594 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 1ce81fa7-f0cd-41b6-bf59-835593359c9f in datapath 2d27a62d-95fd-4cbb-bc4d-20b0757663cc unbound from our chassis
Sep 30 21:45:53 compute-0 systemd[1]: run-netns-ovnmeta\x2d42bc7c38\x2d1ece\x2d4458\x2dbbe7\x2d6a2159483355.mount: Deactivated successfully.
Sep 30 21:45:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:53.595 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2d27a62d-95fd-4cbb-bc4d-20b0757663cc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:45:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:53.596 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[eaf6dd04-0267-49a8-b20a-aa11db24399b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:53 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:53.597 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2d27a62d-95fd-4cbb-bc4d-20b0757663cc namespace which is not needed anymore
Sep 30 21:45:53 compute-0 nova_compute[192810]: 2025-09-30 21:45:53.737 2 DEBUG nova.compute.manager [req-b16e76b2-6e29-4293-ac7c-8976b54391a1 req-3a5498fd-0a1e-451f-98d9-df6c02a592b4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Received event network-vif-unplugged-d768580b-cf31-4877-8394-7501915ecf87 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:45:53 compute-0 nova_compute[192810]: 2025-09-30 21:45:53.738 2 DEBUG oslo_concurrency.lockutils [req-b16e76b2-6e29-4293-ac7c-8976b54391a1 req-3a5498fd-0a1e-451f-98d9-df6c02a592b4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "b95d1bc0-5bee-4787-b4ff-a0908361ae3e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:53 compute-0 nova_compute[192810]: 2025-09-30 21:45:53.738 2 DEBUG oslo_concurrency.lockutils [req-b16e76b2-6e29-4293-ac7c-8976b54391a1 req-3a5498fd-0a1e-451f-98d9-df6c02a592b4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b95d1bc0-5bee-4787-b4ff-a0908361ae3e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:53 compute-0 nova_compute[192810]: 2025-09-30 21:45:53.738 2 DEBUG oslo_concurrency.lockutils [req-b16e76b2-6e29-4293-ac7c-8976b54391a1 req-3a5498fd-0a1e-451f-98d9-df6c02a592b4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b95d1bc0-5bee-4787-b4ff-a0908361ae3e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:53 compute-0 nova_compute[192810]: 2025-09-30 21:45:53.738 2 DEBUG nova.compute.manager [req-b16e76b2-6e29-4293-ac7c-8976b54391a1 req-3a5498fd-0a1e-451f-98d9-df6c02a592b4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] No waiting events found dispatching network-vif-unplugged-d768580b-cf31-4877-8394-7501915ecf87 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:45:53 compute-0 nova_compute[192810]: 2025-09-30 21:45:53.739 2 DEBUG nova.compute.manager [req-b16e76b2-6e29-4293-ac7c-8976b54391a1 req-3a5498fd-0a1e-451f-98d9-df6c02a592b4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Received event network-vif-unplugged-d768580b-cf31-4877-8394-7501915ecf87 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:45:53 compute-0 nova_compute[192810]: 2025-09-30 21:45:53.739 2 DEBUG nova.compute.manager [req-b16e76b2-6e29-4293-ac7c-8976b54391a1 req-3a5498fd-0a1e-451f-98d9-df6c02a592b4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Received event network-vif-plugged-d768580b-cf31-4877-8394-7501915ecf87 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:45:53 compute-0 nova_compute[192810]: 2025-09-30 21:45:53.739 2 DEBUG oslo_concurrency.lockutils [req-b16e76b2-6e29-4293-ac7c-8976b54391a1 req-3a5498fd-0a1e-451f-98d9-df6c02a592b4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "b95d1bc0-5bee-4787-b4ff-a0908361ae3e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:53 compute-0 nova_compute[192810]: 2025-09-30 21:45:53.739 2 DEBUG oslo_concurrency.lockutils [req-b16e76b2-6e29-4293-ac7c-8976b54391a1 req-3a5498fd-0a1e-451f-98d9-df6c02a592b4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b95d1bc0-5bee-4787-b4ff-a0908361ae3e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:53 compute-0 nova_compute[192810]: 2025-09-30 21:45:53.739 2 DEBUG oslo_concurrency.lockutils [req-b16e76b2-6e29-4293-ac7c-8976b54391a1 req-3a5498fd-0a1e-451f-98d9-df6c02a592b4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b95d1bc0-5bee-4787-b4ff-a0908361ae3e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:53 compute-0 nova_compute[192810]: 2025-09-30 21:45:53.740 2 DEBUG nova.compute.manager [req-b16e76b2-6e29-4293-ac7c-8976b54391a1 req-3a5498fd-0a1e-451f-98d9-df6c02a592b4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] No waiting events found dispatching network-vif-plugged-d768580b-cf31-4877-8394-7501915ecf87 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:45:53 compute-0 nova_compute[192810]: 2025-09-30 21:45:53.740 2 WARNING nova.compute.manager [req-b16e76b2-6e29-4293-ac7c-8976b54391a1 req-3a5498fd-0a1e-451f-98d9-df6c02a592b4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Received unexpected event network-vif-plugged-d768580b-cf31-4877-8394-7501915ecf87 for instance with vm_state active and task_state deleting.
Sep 30 21:45:53 compute-0 neutron-haproxy-ovnmeta-2d27a62d-95fd-4cbb-bc4d-20b0757663cc[242950]: [NOTICE]   (242954) : haproxy version is 2.8.14-c23fe91
Sep 30 21:45:53 compute-0 neutron-haproxy-ovnmeta-2d27a62d-95fd-4cbb-bc4d-20b0757663cc[242950]: [NOTICE]   (242954) : path to executable is /usr/sbin/haproxy
Sep 30 21:45:53 compute-0 neutron-haproxy-ovnmeta-2d27a62d-95fd-4cbb-bc4d-20b0757663cc[242950]: [WARNING]  (242954) : Exiting Master process...
Sep 30 21:45:53 compute-0 neutron-haproxy-ovnmeta-2d27a62d-95fd-4cbb-bc4d-20b0757663cc[242950]: [WARNING]  (242954) : Exiting Master process...
Sep 30 21:45:53 compute-0 neutron-haproxy-ovnmeta-2d27a62d-95fd-4cbb-bc4d-20b0757663cc[242950]: [ALERT]    (242954) : Current worker (242956) exited with code 143 (Terminated)
Sep 30 21:45:53 compute-0 neutron-haproxy-ovnmeta-2d27a62d-95fd-4cbb-bc4d-20b0757663cc[242950]: [WARNING]  (242954) : All workers exited. Exiting... (0)
Sep 30 21:45:53 compute-0 systemd[1]: libpod-ac8139dd8a88808f21eb7d021cb4f693f42d63842c7ae996f08449bbb982fec8.scope: Deactivated successfully.
Sep 30 21:45:53 compute-0 conmon[242950]: conmon ac8139dd8a88808f21eb <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-ac8139dd8a88808f21eb7d021cb4f693f42d63842c7ae996f08449bbb982fec8.scope/container/memory.events
Sep 30 21:45:53 compute-0 podman[243996]: 2025-09-30 21:45:53.8676553 +0000 UTC m=+0.198117963 container died ac8139dd8a88808f21eb7d021cb4f693f42d63842c7ae996f08449bbb982fec8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2d27a62d-95fd-4cbb-bc4d-20b0757663cc, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 21:45:54 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ac8139dd8a88808f21eb7d021cb4f693f42d63842c7ae996f08449bbb982fec8-userdata-shm.mount: Deactivated successfully.
Sep 30 21:45:54 compute-0 systemd[1]: var-lib-containers-storage-overlay-c3e0f4c9ae5bf5c1b577e34ebc9b216ee4352009d4746614076d29deb1bbb6bd-merged.mount: Deactivated successfully.
Sep 30 21:45:54 compute-0 podman[243996]: 2025-09-30 21:45:54.316850106 +0000 UTC m=+0.647313149 container cleanup ac8139dd8a88808f21eb7d021cb4f693f42d63842c7ae996f08449bbb982fec8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2d27a62d-95fd-4cbb-bc4d-20b0757663cc, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923)
Sep 30 21:45:54 compute-0 systemd[1]: libpod-conmon-ac8139dd8a88808f21eb7d021cb4f693f42d63842c7ae996f08449bbb982fec8.scope: Deactivated successfully.
Sep 30 21:45:54 compute-0 nova_compute[192810]: 2025-09-30 21:45:54.411 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268739.4099793, 07d2953c-4702-44dc-9c91-410f1654ccec => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:45:54 compute-0 nova_compute[192810]: 2025-09-30 21:45:54.412 2 INFO nova.compute.manager [-] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] VM Stopped (Lifecycle Event)
Sep 30 21:45:54 compute-0 nova_compute[192810]: 2025-09-30 21:45:54.436 2 DEBUG nova.compute.manager [None req-951f9c1b-8959-434a-aa00-4058d6382c36 - - - - - -] [instance: 07d2953c-4702-44dc-9c91-410f1654ccec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:45:54 compute-0 podman[244025]: 2025-09-30 21:45:54.459882344 +0000 UTC m=+0.123432671 container remove ac8139dd8a88808f21eb7d021cb4f693f42d63842c7ae996f08449bbb982fec8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2d27a62d-95fd-4cbb-bc4d-20b0757663cc, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2)
Sep 30 21:45:54 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:54.466 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[6592714d-5996-481c-8698-4d1556d292c9]: (4, ('Tue Sep 30 09:45:53 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-2d27a62d-95fd-4cbb-bc4d-20b0757663cc (ac8139dd8a88808f21eb7d021cb4f693f42d63842c7ae996f08449bbb982fec8)\nac8139dd8a88808f21eb7d021cb4f693f42d63842c7ae996f08449bbb982fec8\nTue Sep 30 09:45:54 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-2d27a62d-95fd-4cbb-bc4d-20b0757663cc (ac8139dd8a88808f21eb7d021cb4f693f42d63842c7ae996f08449bbb982fec8)\nac8139dd8a88808f21eb7d021cb4f693f42d63842c7ae996f08449bbb982fec8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:54 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:54.467 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[0385c889-becf-450c-b00a-c5e0a5a36ea0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:54 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:54.468 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2d27a62d-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:45:54 compute-0 nova_compute[192810]: 2025-09-30 21:45:54.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:54 compute-0 kernel: tap2d27a62d-90: left promiscuous mode
Sep 30 21:45:54 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:54.484 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[0baf952e-45cf-49f9-a79b-9ebcc21f4192]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:54 compute-0 nova_compute[192810]: 2025-09-30 21:45:54.481 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:54 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:54.522 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[23df6586-04cc-4f02-9f10-7cc4e4876d3a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:54 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:54.524 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[9e45e651-8e2c-4200-be49-1535b774755b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:54 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:54.543 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[6ff34a21-46fc-4f29-81a1-89be034426ad]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 538206, 'reachable_time': 39485, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244040, 'error': None, 'target': 'ovnmeta-2d27a62d-95fd-4cbb-bc4d-20b0757663cc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:54 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:54.545 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2d27a62d-95fd-4cbb-bc4d-20b0757663cc deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:45:54 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:54.545 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[df21b479-857b-4876-b33d-d97b31cd6652]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:45:54 compute-0 systemd[1]: run-netns-ovnmeta\x2d2d27a62d\x2d95fd\x2d4cbb\x2dbc4d\x2d20b0757663cc.mount: Deactivated successfully.
Sep 30 21:45:54 compute-0 nova_compute[192810]: 2025-09-30 21:45:54.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:54 compute-0 nova_compute[192810]: 2025-09-30 21:45:54.721 2 DEBUG nova.compute.manager [req-3a76221a-c3b4-413a-a1ad-28b6f0f7662f req-203757df-aeb8-4b62-b7f5-2a48d39e5d78 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Received event network-vif-plugged-1ce81fa7-f0cd-41b6-bf59-835593359c9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:45:54 compute-0 nova_compute[192810]: 2025-09-30 21:45:54.721 2 DEBUG oslo_concurrency.lockutils [req-3a76221a-c3b4-413a-a1ad-28b6f0f7662f req-203757df-aeb8-4b62-b7f5-2a48d39e5d78 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "b95d1bc0-5bee-4787-b4ff-a0908361ae3e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:54 compute-0 nova_compute[192810]: 2025-09-30 21:45:54.721 2 DEBUG oslo_concurrency.lockutils [req-3a76221a-c3b4-413a-a1ad-28b6f0f7662f req-203757df-aeb8-4b62-b7f5-2a48d39e5d78 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b95d1bc0-5bee-4787-b4ff-a0908361ae3e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:54 compute-0 nova_compute[192810]: 2025-09-30 21:45:54.722 2 DEBUG oslo_concurrency.lockutils [req-3a76221a-c3b4-413a-a1ad-28b6f0f7662f req-203757df-aeb8-4b62-b7f5-2a48d39e5d78 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "b95d1bc0-5bee-4787-b4ff-a0908361ae3e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:54 compute-0 nova_compute[192810]: 2025-09-30 21:45:54.722 2 DEBUG nova.compute.manager [req-3a76221a-c3b4-413a-a1ad-28b6f0f7662f req-203757df-aeb8-4b62-b7f5-2a48d39e5d78 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] No waiting events found dispatching network-vif-plugged-1ce81fa7-f0cd-41b6-bf59-835593359c9f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:45:54 compute-0 nova_compute[192810]: 2025-09-30 21:45:54.722 2 WARNING nova.compute.manager [req-3a76221a-c3b4-413a-a1ad-28b6f0f7662f req-203757df-aeb8-4b62-b7f5-2a48d39e5d78 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Received unexpected event network-vif-plugged-1ce81fa7-f0cd-41b6-bf59-835593359c9f for instance with vm_state active and task_state deleting.
Sep 30 21:45:55 compute-0 nova_compute[192810]: 2025-09-30 21:45:55.648 2 DEBUG nova.network.neutron [req-a2052364-a8a2-41bf-a76c-3f0b20f290a8 req-13f585c8-b0ec-423e-a2f2-ffb36ec06ada dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Updated VIF entry in instance network info cache for port d768580b-cf31-4877-8394-7501915ecf87. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:45:55 compute-0 nova_compute[192810]: 2025-09-30 21:45:55.649 2 DEBUG nova.network.neutron [req-a2052364-a8a2-41bf-a76c-3f0b20f290a8 req-13f585c8-b0ec-423e-a2f2-ffb36ec06ada dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Updating instance_info_cache with network_info: [{"id": "d768580b-cf31-4877-8394-7501915ecf87", "address": "fa:16:3e:b1:eb:aa", "network": {"id": "42bc7c38-1ece-4458-bbe7-6a2159483355", "bridge": "br-int", "label": "tempest-network-smoke--145108959", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd768580b-cf", "ovs_interfaceid": "d768580b-cf31-4877-8394-7501915ecf87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "1ce81fa7-f0cd-41b6-bf59-835593359c9f", "address": "fa:16:3e:f0:5b:98", "network": {"id": "2d27a62d-95fd-4cbb-bc4d-20b0757663cc", "bridge": "br-int", "label": "tempest-network-smoke--589443991", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fef0:5b98", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ce81fa7-f0", "ovs_interfaceid": "1ce81fa7-f0cd-41b6-bf59-835593359c9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:45:55 compute-0 nova_compute[192810]: 2025-09-30 21:45:55.673 2 DEBUG oslo_concurrency.lockutils [req-a2052364-a8a2-41bf-a76c-3f0b20f290a8 req-13f585c8-b0ec-423e-a2f2-ffb36ec06ada dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-b95d1bc0-5bee-4787-b4ff-a0908361ae3e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:45:55 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:45:55.950 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '35'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:45:56 compute-0 nova_compute[192810]: 2025-09-30 21:45:56.764 2 DEBUG nova.network.neutron [-] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:45:56 compute-0 nova_compute[192810]: 2025-09-30 21:45:56.798 2 INFO nova.compute.manager [-] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Took 4.14 seconds to deallocate network for instance.
Sep 30 21:45:56 compute-0 nova_compute[192810]: 2025-09-30 21:45:56.806 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:45:56 compute-0 nova_compute[192810]: 2025-09-30 21:45:56.874 2 DEBUG nova.compute.manager [req-8176aff3-c36b-465a-a53e-88815938997f req-bf8f7d39-7a0b-42cf-8aa6-d73ad6edce64 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Received event network-vif-deleted-d768580b-cf31-4877-8394-7501915ecf87 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:45:56 compute-0 nova_compute[192810]: 2025-09-30 21:45:56.875 2 DEBUG nova.compute.manager [req-8176aff3-c36b-465a-a53e-88815938997f req-bf8f7d39-7a0b-42cf-8aa6-d73ad6edce64 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Received event network-vif-deleted-1ce81fa7-f0cd-41b6-bf59-835593359c9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:45:56 compute-0 nova_compute[192810]: 2025-09-30 21:45:56.887 2 DEBUG oslo_concurrency.lockutils [None req-bf442e01-f84b-4d5a-b932-6857d9571e7b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:45:56 compute-0 nova_compute[192810]: 2025-09-30 21:45:56.888 2 DEBUG oslo_concurrency.lockutils [None req-bf442e01-f84b-4d5a-b932-6857d9571e7b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:45:57 compute-0 nova_compute[192810]: 2025-09-30 21:45:57.007 2 DEBUG nova.compute.provider_tree [None req-bf442e01-f84b-4d5a-b932-6857d9571e7b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:45:57 compute-0 nova_compute[192810]: 2025-09-30 21:45:57.028 2 DEBUG nova.scheduler.client.report [None req-bf442e01-f84b-4d5a-b932-6857d9571e7b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:45:57 compute-0 nova_compute[192810]: 2025-09-30 21:45:57.048 2 DEBUG oslo_concurrency.lockutils [None req-bf442e01-f84b-4d5a-b932-6857d9571e7b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:57 compute-0 nova_compute[192810]: 2025-09-30 21:45:57.075 2 INFO nova.scheduler.client.report [None req-bf442e01-f84b-4d5a-b932-6857d9571e7b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Deleted allocations for instance b95d1bc0-5bee-4787-b4ff-a0908361ae3e
Sep 30 21:45:57 compute-0 nova_compute[192810]: 2025-09-30 21:45:57.171 2 DEBUG oslo_concurrency.lockutils [None req-bf442e01-f84b-4d5a-b932-6857d9571e7b 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "b95d1bc0-5bee-4787-b4ff-a0908361ae3e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.962s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:45:57 compute-0 nova_compute[192810]: 2025-09-30 21:45:57.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:57 compute-0 nova_compute[192810]: 2025-09-30 21:45:57.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:57 compute-0 nova_compute[192810]: 2025-09-30 21:45:57.930 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:45:59 compute-0 podman[244042]: 2025-09-30 21:45:59.310181249 +0000 UTC m=+0.049178720 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:45:59 compute-0 podman[244043]: 2025-09-30 21:45:59.310662881 +0000 UTC m=+0.046901345 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 21:45:59 compute-0 podman[244041]: 2025-09-30 21:45:59.312527026 +0000 UTC m=+0.053759252 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0)
Sep 30 21:46:00 compute-0 nova_compute[192810]: 2025-09-30 21:46:00.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:46:00 compute-0 nova_compute[192810]: 2025-09-30 21:46:00.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:46:01 compute-0 nova_compute[192810]: 2025-09-30 21:46:01.069 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:01 compute-0 nova_compute[192810]: 2025-09-30 21:46:01.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:01 compute-0 nova_compute[192810]: 2025-09-30 21:46:01.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:46:01 compute-0 nova_compute[192810]: 2025-09-30 21:46:01.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:02 compute-0 nova_compute[192810]: 2025-09-30 21:46:02.555 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:02 compute-0 nova_compute[192810]: 2025-09-30 21:46:02.783 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:46:02 compute-0 nova_compute[192810]: 2025-09-30 21:46:02.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:46:02 compute-0 nova_compute[192810]: 2025-09-30 21:46:02.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:46:02 compute-0 nova_compute[192810]: 2025-09-30 21:46:02.843 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:46:02 compute-0 nova_compute[192810]: 2025-09-30 21:46:02.843 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:46:02 compute-0 nova_compute[192810]: 2025-09-30 21:46:02.843 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:46:02 compute-0 nova_compute[192810]: 2025-09-30 21:46:02.844 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:46:02 compute-0 nova_compute[192810]: 2025-09-30 21:46:02.931 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:03 compute-0 nova_compute[192810]: 2025-09-30 21:46:03.017 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:46:03 compute-0 nova_compute[192810]: 2025-09-30 21:46:03.019 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5694MB free_disk=73.22930908203125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:46:03 compute-0 nova_compute[192810]: 2025-09-30 21:46:03.019 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:46:03 compute-0 nova_compute[192810]: 2025-09-30 21:46:03.019 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:46:03 compute-0 nova_compute[192810]: 2025-09-30 21:46:03.101 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:46:03 compute-0 nova_compute[192810]: 2025-09-30 21:46:03.101 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:46:03 compute-0 nova_compute[192810]: 2025-09-30 21:46:03.133 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:46:03 compute-0 nova_compute[192810]: 2025-09-30 21:46:03.154 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:46:03 compute-0 nova_compute[192810]: 2025-09-30 21:46:03.193 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:46:03 compute-0 nova_compute[192810]: 2025-09-30 21:46:03.193 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:46:04 compute-0 nova_compute[192810]: 2025-09-30 21:46:04.198 2 DEBUG oslo_concurrency.lockutils [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "c1ffcb24-6b20-4ed8-8287-65fb4e98d88c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:46:04 compute-0 nova_compute[192810]: 2025-09-30 21:46:04.199 2 DEBUG oslo_concurrency.lockutils [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "c1ffcb24-6b20-4ed8-8287-65fb4e98d88c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:46:04 compute-0 nova_compute[192810]: 2025-09-30 21:46:04.245 2 DEBUG nova.compute.manager [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:46:04 compute-0 nova_compute[192810]: 2025-09-30 21:46:04.323 2 DEBUG oslo_concurrency.lockutils [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:46:04 compute-0 nova_compute[192810]: 2025-09-30 21:46:04.323 2 DEBUG oslo_concurrency.lockutils [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:46:04 compute-0 nova_compute[192810]: 2025-09-30 21:46:04.328 2 DEBUG nova.virt.hardware [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:46:04 compute-0 nova_compute[192810]: 2025-09-30 21:46:04.328 2 INFO nova.compute.claims [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:46:04 compute-0 nova_compute[192810]: 2025-09-30 21:46:04.479 2 DEBUG nova.compute.provider_tree [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:46:04 compute-0 nova_compute[192810]: 2025-09-30 21:46:04.492 2 DEBUG nova.scheduler.client.report [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:46:04 compute-0 nova_compute[192810]: 2025-09-30 21:46:04.522 2 DEBUG oslo_concurrency.lockutils [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.198s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:46:04 compute-0 nova_compute[192810]: 2025-09-30 21:46:04.522 2 DEBUG nova.compute.manager [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:46:04 compute-0 nova_compute[192810]: 2025-09-30 21:46:04.576 2 DEBUG nova.compute.manager [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:46:04 compute-0 nova_compute[192810]: 2025-09-30 21:46:04.576 2 DEBUG nova.network.neutron [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:46:04 compute-0 nova_compute[192810]: 2025-09-30 21:46:04.601 2 INFO nova.virt.libvirt.driver [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:46:04 compute-0 nova_compute[192810]: 2025-09-30 21:46:04.616 2 DEBUG nova.compute.manager [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:46:04 compute-0 nova_compute[192810]: 2025-09-30 21:46:04.731 2 DEBUG nova.compute.manager [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:46:04 compute-0 nova_compute[192810]: 2025-09-30 21:46:04.732 2 DEBUG nova.virt.libvirt.driver [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:46:04 compute-0 nova_compute[192810]: 2025-09-30 21:46:04.733 2 INFO nova.virt.libvirt.driver [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Creating image(s)
Sep 30 21:46:04 compute-0 nova_compute[192810]: 2025-09-30 21:46:04.733 2 DEBUG oslo_concurrency.lockutils [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "/var/lib/nova/instances/c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:46:04 compute-0 nova_compute[192810]: 2025-09-30 21:46:04.733 2 DEBUG oslo_concurrency.lockutils [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "/var/lib/nova/instances/c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:46:04 compute-0 nova_compute[192810]: 2025-09-30 21:46:04.734 2 DEBUG oslo_concurrency.lockutils [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "/var/lib/nova/instances/c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:46:04 compute-0 nova_compute[192810]: 2025-09-30 21:46:04.746 2 DEBUG oslo_concurrency.processutils [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:46:04 compute-0 nova_compute[192810]: 2025-09-30 21:46:04.803 2 DEBUG nova.policy [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:46:04 compute-0 nova_compute[192810]: 2025-09-30 21:46:04.806 2 DEBUG oslo_concurrency.processutils [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:46:04 compute-0 nova_compute[192810]: 2025-09-30 21:46:04.807 2 DEBUG oslo_concurrency.lockutils [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:46:04 compute-0 nova_compute[192810]: 2025-09-30 21:46:04.807 2 DEBUG oslo_concurrency.lockutils [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:46:04 compute-0 nova_compute[192810]: 2025-09-30 21:46:04.819 2 DEBUG oslo_concurrency.processutils [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:46:04 compute-0 nova_compute[192810]: 2025-09-30 21:46:04.876 2 DEBUG oslo_concurrency.processutils [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:46:04 compute-0 nova_compute[192810]: 2025-09-30 21:46:04.877 2 DEBUG oslo_concurrency.processutils [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:46:05 compute-0 nova_compute[192810]: 2025-09-30 21:46:05.028 2 DEBUG oslo_concurrency.processutils [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk 1073741824" returned: 0 in 0.151s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:46:05 compute-0 nova_compute[192810]: 2025-09-30 21:46:05.029 2 DEBUG oslo_concurrency.lockutils [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.222s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:46:05 compute-0 nova_compute[192810]: 2025-09-30 21:46:05.029 2 DEBUG oslo_concurrency.processutils [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:46:05 compute-0 nova_compute[192810]: 2025-09-30 21:46:05.089 2 DEBUG oslo_concurrency.processutils [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:46:05 compute-0 nova_compute[192810]: 2025-09-30 21:46:05.091 2 DEBUG nova.virt.disk.api [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Checking if we can resize image /var/lib/nova/instances/c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:46:05 compute-0 nova_compute[192810]: 2025-09-30 21:46:05.091 2 DEBUG oslo_concurrency.processutils [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:46:05 compute-0 nova_compute[192810]: 2025-09-30 21:46:05.162 2 DEBUG oslo_concurrency.processutils [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:46:05 compute-0 nova_compute[192810]: 2025-09-30 21:46:05.163 2 DEBUG nova.virt.disk.api [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Cannot resize image /var/lib/nova/instances/c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:46:05 compute-0 nova_compute[192810]: 2025-09-30 21:46:05.163 2 DEBUG nova.objects.instance [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'migration_context' on Instance uuid c1ffcb24-6b20-4ed8-8287-65fb4e98d88c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:46:05 compute-0 nova_compute[192810]: 2025-09-30 21:46:05.174 2 DEBUG nova.virt.libvirt.driver [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:46:05 compute-0 nova_compute[192810]: 2025-09-30 21:46:05.175 2 DEBUG nova.virt.libvirt.driver [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Ensure instance console log exists: /var/lib/nova/instances/c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:46:05 compute-0 nova_compute[192810]: 2025-09-30 21:46:05.175 2 DEBUG oslo_concurrency.lockutils [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:46:05 compute-0 nova_compute[192810]: 2025-09-30 21:46:05.176 2 DEBUG oslo_concurrency.lockutils [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:46:05 compute-0 nova_compute[192810]: 2025-09-30 21:46:05.176 2 DEBUG oslo_concurrency.lockutils [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:46:05 compute-0 nova_compute[192810]: 2025-09-30 21:46:05.193 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:46:05 compute-0 nova_compute[192810]: 2025-09-30 21:46:05.193 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:46:05 compute-0 nova_compute[192810]: 2025-09-30 21:46:05.210 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 21:46:05 compute-0 nova_compute[192810]: 2025-09-30 21:46:05.210 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:46:05 compute-0 nova_compute[192810]: 2025-09-30 21:46:05.211 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:46:05 compute-0 nova_compute[192810]: 2025-09-30 21:46:05.800 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:46:05 compute-0 nova_compute[192810]: 2025-09-30 21:46:05.911 2 DEBUG nova.network.neutron [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Successfully created port: d7d27cdd-9f0b-467f-901a-08b4834d1496 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:46:07 compute-0 nova_compute[192810]: 2025-09-30 21:46:07.392 2 DEBUG nova.network.neutron [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Successfully updated port: d7d27cdd-9f0b-467f-901a-08b4834d1496 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:46:07 compute-0 nova_compute[192810]: 2025-09-30 21:46:07.414 2 DEBUG oslo_concurrency.lockutils [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "refresh_cache-c1ffcb24-6b20-4ed8-8287-65fb4e98d88c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:46:07 compute-0 nova_compute[192810]: 2025-09-30 21:46:07.415 2 DEBUG oslo_concurrency.lockutils [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquired lock "refresh_cache-c1ffcb24-6b20-4ed8-8287-65fb4e98d88c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:46:07 compute-0 nova_compute[192810]: 2025-09-30 21:46:07.415 2 DEBUG nova.network.neutron [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:46:07 compute-0 nova_compute[192810]: 2025-09-30 21:46:07.511 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268752.509221, b95d1bc0-5bee-4787-b4ff-a0908361ae3e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:46:07 compute-0 nova_compute[192810]: 2025-09-30 21:46:07.512 2 INFO nova.compute.manager [-] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] VM Stopped (Lifecycle Event)
Sep 30 21:46:07 compute-0 nova_compute[192810]: 2025-09-30 21:46:07.514 2 DEBUG nova.compute.manager [req-6d9321c3-2b96-4a3b-998c-f1357216a8a8 req-f2f3a2f8-f6be-4b45-b8e6-dfd84a57cc61 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Received event network-changed-d7d27cdd-9f0b-467f-901a-08b4834d1496 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:46:07 compute-0 nova_compute[192810]: 2025-09-30 21:46:07.514 2 DEBUG nova.compute.manager [req-6d9321c3-2b96-4a3b-998c-f1357216a8a8 req-f2f3a2f8-f6be-4b45-b8e6-dfd84a57cc61 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Refreshing instance network info cache due to event network-changed-d7d27cdd-9f0b-467f-901a-08b4834d1496. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:46:07 compute-0 nova_compute[192810]: 2025-09-30 21:46:07.514 2 DEBUG oslo_concurrency.lockutils [req-6d9321c3-2b96-4a3b-998c-f1357216a8a8 req-f2f3a2f8-f6be-4b45-b8e6-dfd84a57cc61 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-c1ffcb24-6b20-4ed8-8287-65fb4e98d88c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:46:07 compute-0 nova_compute[192810]: 2025-09-30 21:46:07.532 2 DEBUG nova.compute.manager [None req-bbbc6243-49eb-4ea5-84cb-03e3166b6f1b - - - - - -] [instance: b95d1bc0-5bee-4787-b4ff-a0908361ae3e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:46:07 compute-0 nova_compute[192810]: 2025-09-30 21:46:07.557 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:07 compute-0 nova_compute[192810]: 2025-09-30 21:46:07.581 2 DEBUG nova.network.neutron [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:46:07 compute-0 nova_compute[192810]: 2025-09-30 21:46:07.934 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:08 compute-0 nova_compute[192810]: 2025-09-30 21:46:08.529 2 DEBUG nova.network.neutron [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Updating instance_info_cache with network_info: [{"id": "d7d27cdd-9f0b-467f-901a-08b4834d1496", "address": "fa:16:3e:0d:ba:7b", "network": {"id": "5bda010c-f47e-4b74-9f9e-0682da4daba2", "bridge": "br-int", "label": "tempest-network-smoke--746378667", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7d27cdd-9f", "ovs_interfaceid": "d7d27cdd-9f0b-467f-901a-08b4834d1496", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:46:08 compute-0 nova_compute[192810]: 2025-09-30 21:46:08.551 2 DEBUG oslo_concurrency.lockutils [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Releasing lock "refresh_cache-c1ffcb24-6b20-4ed8-8287-65fb4e98d88c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:46:08 compute-0 nova_compute[192810]: 2025-09-30 21:46:08.551 2 DEBUG nova.compute.manager [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Instance network_info: |[{"id": "d7d27cdd-9f0b-467f-901a-08b4834d1496", "address": "fa:16:3e:0d:ba:7b", "network": {"id": "5bda010c-f47e-4b74-9f9e-0682da4daba2", "bridge": "br-int", "label": "tempest-network-smoke--746378667", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7d27cdd-9f", "ovs_interfaceid": "d7d27cdd-9f0b-467f-901a-08b4834d1496", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:46:08 compute-0 nova_compute[192810]: 2025-09-30 21:46:08.551 2 DEBUG oslo_concurrency.lockutils [req-6d9321c3-2b96-4a3b-998c-f1357216a8a8 req-f2f3a2f8-f6be-4b45-b8e6-dfd84a57cc61 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-c1ffcb24-6b20-4ed8-8287-65fb4e98d88c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:46:08 compute-0 nova_compute[192810]: 2025-09-30 21:46:08.552 2 DEBUG nova.network.neutron [req-6d9321c3-2b96-4a3b-998c-f1357216a8a8 req-f2f3a2f8-f6be-4b45-b8e6-dfd84a57cc61 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Refreshing network info cache for port d7d27cdd-9f0b-467f-901a-08b4834d1496 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:46:08 compute-0 nova_compute[192810]: 2025-09-30 21:46:08.555 2 DEBUG nova.virt.libvirt.driver [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Start _get_guest_xml network_info=[{"id": "d7d27cdd-9f0b-467f-901a-08b4834d1496", "address": "fa:16:3e:0d:ba:7b", "network": {"id": "5bda010c-f47e-4b74-9f9e-0682da4daba2", "bridge": "br-int", "label": "tempest-network-smoke--746378667", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7d27cdd-9f", "ovs_interfaceid": "d7d27cdd-9f0b-467f-901a-08b4834d1496", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:46:08 compute-0 nova_compute[192810]: 2025-09-30 21:46:08.561 2 WARNING nova.virt.libvirt.driver [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:46:08 compute-0 nova_compute[192810]: 2025-09-30 21:46:08.569 2 DEBUG nova.virt.libvirt.host [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:46:08 compute-0 nova_compute[192810]: 2025-09-30 21:46:08.570 2 DEBUG nova.virt.libvirt.host [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:46:08 compute-0 nova_compute[192810]: 2025-09-30 21:46:08.575 2 DEBUG nova.virt.libvirt.host [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:46:08 compute-0 nova_compute[192810]: 2025-09-30 21:46:08.575 2 DEBUG nova.virt.libvirt.host [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:46:08 compute-0 nova_compute[192810]: 2025-09-30 21:46:08.577 2 DEBUG nova.virt.libvirt.driver [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:46:08 compute-0 nova_compute[192810]: 2025-09-30 21:46:08.577 2 DEBUG nova.virt.hardware [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:46:08 compute-0 nova_compute[192810]: 2025-09-30 21:46:08.578 2 DEBUG nova.virt.hardware [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:46:08 compute-0 nova_compute[192810]: 2025-09-30 21:46:08.578 2 DEBUG nova.virt.hardware [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:46:08 compute-0 nova_compute[192810]: 2025-09-30 21:46:08.578 2 DEBUG nova.virt.hardware [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:46:08 compute-0 nova_compute[192810]: 2025-09-30 21:46:08.578 2 DEBUG nova.virt.hardware [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:46:08 compute-0 nova_compute[192810]: 2025-09-30 21:46:08.579 2 DEBUG nova.virt.hardware [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:46:08 compute-0 nova_compute[192810]: 2025-09-30 21:46:08.579 2 DEBUG nova.virt.hardware [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:46:08 compute-0 nova_compute[192810]: 2025-09-30 21:46:08.579 2 DEBUG nova.virt.hardware [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:46:08 compute-0 nova_compute[192810]: 2025-09-30 21:46:08.579 2 DEBUG nova.virt.hardware [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:46:08 compute-0 nova_compute[192810]: 2025-09-30 21:46:08.580 2 DEBUG nova.virt.hardware [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:46:08 compute-0 nova_compute[192810]: 2025-09-30 21:46:08.580 2 DEBUG nova.virt.hardware [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:46:08 compute-0 nova_compute[192810]: 2025-09-30 21:46:08.584 2 DEBUG nova.virt.libvirt.vif [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:46:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-179264408',display_name='tempest-TestNetworkAdvancedServerOps-server-179264408',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-179264408',id=152,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOjZFWrOkGLtM22QkbWpGIMaBp+BaIFIIqX3JnKMPx79D7bs+sVkeOWPP9lZzJjeCe2zeve1vj7ZAmUVopE269mNakJeQ6oTVT41jDINH3f5N8CxQSQQns3MXA4tC9JjTQ==',key_name='tempest-TestNetworkAdvancedServerOps-144739839',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='075b1efc4c8e4cb1b28d61b042c451e9',ramdisk_id='',reservation_id='r-xa5oqg6r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-374190229',owner_user_name='tempest-TestNetworkAdvancedServerOps-374190229-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:46:04Z,user_data=None,user_id='185cc8ad7e1445d2ab5006153ab19700',uuid=c1ffcb24-6b20-4ed8-8287-65fb4e98d88c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d7d27cdd-9f0b-467f-901a-08b4834d1496", "address": "fa:16:3e:0d:ba:7b", "network": {"id": "5bda010c-f47e-4b74-9f9e-0682da4daba2", "bridge": "br-int", "label": "tempest-network-smoke--746378667", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7d27cdd-9f", "ovs_interfaceid": "d7d27cdd-9f0b-467f-901a-08b4834d1496", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:46:08 compute-0 nova_compute[192810]: 2025-09-30 21:46:08.584 2 DEBUG nova.network.os_vif_util [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converting VIF {"id": "d7d27cdd-9f0b-467f-901a-08b4834d1496", "address": "fa:16:3e:0d:ba:7b", "network": {"id": "5bda010c-f47e-4b74-9f9e-0682da4daba2", "bridge": "br-int", "label": "tempest-network-smoke--746378667", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7d27cdd-9f", "ovs_interfaceid": "d7d27cdd-9f0b-467f-901a-08b4834d1496", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:46:08 compute-0 nova_compute[192810]: 2025-09-30 21:46:08.585 2 DEBUG nova.network.os_vif_util [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0d:ba:7b,bridge_name='br-int',has_traffic_filtering=True,id=d7d27cdd-9f0b-467f-901a-08b4834d1496,network=Network(5bda010c-f47e-4b74-9f9e-0682da4daba2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7d27cdd-9f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:46:08 compute-0 nova_compute[192810]: 2025-09-30 21:46:08.586 2 DEBUG nova.objects.instance [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'pci_devices' on Instance uuid c1ffcb24-6b20-4ed8-8287-65fb4e98d88c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:46:08 compute-0 nova_compute[192810]: 2025-09-30 21:46:08.599 2 DEBUG nova.virt.libvirt.driver [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:46:08 compute-0 nova_compute[192810]:   <uuid>c1ffcb24-6b20-4ed8-8287-65fb4e98d88c</uuid>
Sep 30 21:46:08 compute-0 nova_compute[192810]:   <name>instance-00000098</name>
Sep 30 21:46:08 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:46:08 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:46:08 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:46:08 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-179264408</nova:name>
Sep 30 21:46:08 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:46:08</nova:creationTime>
Sep 30 21:46:08 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:46:08 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:46:08 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:46:08 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:46:08 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:46:08 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:46:08 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:46:08 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:46:08 compute-0 nova_compute[192810]:         <nova:user uuid="185cc8ad7e1445d2ab5006153ab19700">tempest-TestNetworkAdvancedServerOps-374190229-project-member</nova:user>
Sep 30 21:46:08 compute-0 nova_compute[192810]:         <nova:project uuid="075b1efc4c8e4cb1b28d61b042c451e9">tempest-TestNetworkAdvancedServerOps-374190229</nova:project>
Sep 30 21:46:08 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:46:08 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:46:08 compute-0 nova_compute[192810]:         <nova:port uuid="d7d27cdd-9f0b-467f-901a-08b4834d1496">
Sep 30 21:46:08 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:46:08 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:46:08 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:46:08 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:46:08 compute-0 nova_compute[192810]:     <system>
Sep 30 21:46:08 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:46:08 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:46:08 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:46:08 compute-0 nova_compute[192810]:       <entry name="serial">c1ffcb24-6b20-4ed8-8287-65fb4e98d88c</entry>
Sep 30 21:46:08 compute-0 nova_compute[192810]:       <entry name="uuid">c1ffcb24-6b20-4ed8-8287-65fb4e98d88c</entry>
Sep 30 21:46:08 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     </system>
Sep 30 21:46:08 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:46:08 compute-0 nova_compute[192810]:   <os>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:   </os>
Sep 30 21:46:08 compute-0 nova_compute[192810]:   <features>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:   </features>
Sep 30 21:46:08 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:46:08 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:46:08 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:46:08 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:46:08 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:46:08 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:46:08 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk.config"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:46:08 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:0d:ba:7b"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:       <target dev="tapd7d27cdd-9f"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:46:08 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/console.log" append="off"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     <video>
Sep 30 21:46:08 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     </video>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:46:08 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:46:08 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:46:08 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:46:08 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:46:08 compute-0 nova_compute[192810]: </domain>
Sep 30 21:46:08 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:46:08 compute-0 nova_compute[192810]: 2025-09-30 21:46:08.601 2 DEBUG nova.compute.manager [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Preparing to wait for external event network-vif-plugged-d7d27cdd-9f0b-467f-901a-08b4834d1496 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:46:08 compute-0 nova_compute[192810]: 2025-09-30 21:46:08.601 2 DEBUG oslo_concurrency.lockutils [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:46:08 compute-0 nova_compute[192810]: 2025-09-30 21:46:08.601 2 DEBUG oslo_concurrency.lockutils [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:46:08 compute-0 nova_compute[192810]: 2025-09-30 21:46:08.602 2 DEBUG oslo_concurrency.lockutils [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:46:08 compute-0 nova_compute[192810]: 2025-09-30 21:46:08.602 2 DEBUG nova.virt.libvirt.vif [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:46:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-179264408',display_name='tempest-TestNetworkAdvancedServerOps-server-179264408',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-179264408',id=152,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOjZFWrOkGLtM22QkbWpGIMaBp+BaIFIIqX3JnKMPx79D7bs+sVkeOWPP9lZzJjeCe2zeve1vj7ZAmUVopE269mNakJeQ6oTVT41jDINH3f5N8CxQSQQns3MXA4tC9JjTQ==',key_name='tempest-TestNetworkAdvancedServerOps-144739839',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='075b1efc4c8e4cb1b28d61b042c451e9',ramdisk_id='',reservation_id='r-xa5oqg6r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-374190229',owner_user_name='tempest-TestNetworkAdvancedServerOps-374190229-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:46:04Z,user_data=None,user_id='185cc8ad7e1445d2ab5006153ab19700',uuid=c1ffcb24-6b20-4ed8-8287-65fb4e98d88c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d7d27cdd-9f0b-467f-901a-08b4834d1496", "address": "fa:16:3e:0d:ba:7b", "network": {"id": "5bda010c-f47e-4b74-9f9e-0682da4daba2", "bridge": "br-int", "label": "tempest-network-smoke--746378667", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7d27cdd-9f", "ovs_interfaceid": "d7d27cdd-9f0b-467f-901a-08b4834d1496", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:46:08 compute-0 nova_compute[192810]: 2025-09-30 21:46:08.603 2 DEBUG nova.network.os_vif_util [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converting VIF {"id": "d7d27cdd-9f0b-467f-901a-08b4834d1496", "address": "fa:16:3e:0d:ba:7b", "network": {"id": "5bda010c-f47e-4b74-9f9e-0682da4daba2", "bridge": "br-int", "label": "tempest-network-smoke--746378667", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7d27cdd-9f", "ovs_interfaceid": "d7d27cdd-9f0b-467f-901a-08b4834d1496", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:46:08 compute-0 nova_compute[192810]: 2025-09-30 21:46:08.603 2 DEBUG nova.network.os_vif_util [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0d:ba:7b,bridge_name='br-int',has_traffic_filtering=True,id=d7d27cdd-9f0b-467f-901a-08b4834d1496,network=Network(5bda010c-f47e-4b74-9f9e-0682da4daba2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7d27cdd-9f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:46:08 compute-0 nova_compute[192810]: 2025-09-30 21:46:08.604 2 DEBUG os_vif [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0d:ba:7b,bridge_name='br-int',has_traffic_filtering=True,id=d7d27cdd-9f0b-467f-901a-08b4834d1496,network=Network(5bda010c-f47e-4b74-9f9e-0682da4daba2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7d27cdd-9f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:46:08 compute-0 nova_compute[192810]: 2025-09-30 21:46:08.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:08 compute-0 nova_compute[192810]: 2025-09-30 21:46:08.605 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:46:08 compute-0 nova_compute[192810]: 2025-09-30 21:46:08.605 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:46:08 compute-0 nova_compute[192810]: 2025-09-30 21:46:08.608 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:08 compute-0 nova_compute[192810]: 2025-09-30 21:46:08.608 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd7d27cdd-9f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:46:08 compute-0 nova_compute[192810]: 2025-09-30 21:46:08.609 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd7d27cdd-9f, col_values=(('external_ids', {'iface-id': 'd7d27cdd-9f0b-467f-901a-08b4834d1496', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0d:ba:7b', 'vm-uuid': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:46:08 compute-0 nova_compute[192810]: 2025-09-30 21:46:08.610 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:08 compute-0 NetworkManager[51733]: <info>  [1759268768.6113] manager: (tapd7d27cdd-9f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/257)
Sep 30 21:46:08 compute-0 nova_compute[192810]: 2025-09-30 21:46:08.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:46:08 compute-0 nova_compute[192810]: 2025-09-30 21:46:08.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:08 compute-0 nova_compute[192810]: 2025-09-30 21:46:08.617 2 INFO os_vif [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0d:ba:7b,bridge_name='br-int',has_traffic_filtering=True,id=d7d27cdd-9f0b-467f-901a-08b4834d1496,network=Network(5bda010c-f47e-4b74-9f9e-0682da4daba2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7d27cdd-9f')
Sep 30 21:46:08 compute-0 nova_compute[192810]: 2025-09-30 21:46:08.751 2 DEBUG nova.virt.libvirt.driver [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:46:08 compute-0 nova_compute[192810]: 2025-09-30 21:46:08.751 2 DEBUG nova.virt.libvirt.driver [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:46:08 compute-0 nova_compute[192810]: 2025-09-30 21:46:08.751 2 DEBUG nova.virt.libvirt.driver [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] No VIF found with MAC fa:16:3e:0d:ba:7b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:46:08 compute-0 nova_compute[192810]: 2025-09-30 21:46:08.752 2 INFO nova.virt.libvirt.driver [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Using config drive
Sep 30 21:46:09 compute-0 nova_compute[192810]: 2025-09-30 21:46:09.549 2 INFO nova.virt.libvirt.driver [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Creating config drive at /var/lib/nova/instances/c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk.config
Sep 30 21:46:09 compute-0 nova_compute[192810]: 2025-09-30 21:46:09.554 2 DEBUG oslo_concurrency.processutils [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpx0bjzffz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:46:09 compute-0 nova_compute[192810]: 2025-09-30 21:46:09.676 2 DEBUG oslo_concurrency.processutils [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpx0bjzffz" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:46:09 compute-0 kernel: tapd7d27cdd-9f: entered promiscuous mode
Sep 30 21:46:09 compute-0 NetworkManager[51733]: <info>  [1759268769.7311] manager: (tapd7d27cdd-9f): new Tun device (/org/freedesktop/NetworkManager/Devices/258)
Sep 30 21:46:09 compute-0 nova_compute[192810]: 2025-09-30 21:46:09.731 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:09 compute-0 ovn_controller[94912]: 2025-09-30T21:46:09Z|00584|binding|INFO|Claiming lport d7d27cdd-9f0b-467f-901a-08b4834d1496 for this chassis.
Sep 30 21:46:09 compute-0 ovn_controller[94912]: 2025-09-30T21:46:09Z|00585|binding|INFO|d7d27cdd-9f0b-467f-901a-08b4834d1496: Claiming fa:16:3e:0d:ba:7b 10.100.0.4
Sep 30 21:46:09 compute-0 nova_compute[192810]: 2025-09-30 21:46:09.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:09 compute-0 nova_compute[192810]: 2025-09-30 21:46:09.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:09 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:09.747 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0d:ba:7b 10.100.0.4'], port_security=['fa:16:3e:0d:ba:7b 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5bda010c-f47e-4b74-9f9e-0682da4daba2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cf7d6f50-5bb5-4e27-9140-45f441e5642f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5da0e449-d17e-4f7d-8049-f4559cb24f52, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=d7d27cdd-9f0b-467f-901a-08b4834d1496) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:46:09 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:09.748 103867 INFO neutron.agent.ovn.metadata.agent [-] Port d7d27cdd-9f0b-467f-901a-08b4834d1496 in datapath 5bda010c-f47e-4b74-9f9e-0682da4daba2 bound to our chassis
Sep 30 21:46:09 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:09.750 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5bda010c-f47e-4b74-9f9e-0682da4daba2
Sep 30 21:46:09 compute-0 systemd-udevd[244139]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:46:09 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:09.760 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[040fb5f9-125e-490a-a6c4-2291649a8cc3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:09 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:09.761 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5bda010c-f1 in ovnmeta-5bda010c-f47e-4b74-9f9e-0682da4daba2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:46:09 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:09.762 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5bda010c-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:46:09 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:09.763 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[56e8d9fa-9579-485f-9bbb-6fd319e4f897]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:09 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:09.763 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[7634bb3c-f64a-4960-8fc5-e8a19846d5a4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:09 compute-0 NetworkManager[51733]: <info>  [1759268769.7731] device (tapd7d27cdd-9f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:46:09 compute-0 NetworkManager[51733]: <info>  [1759268769.7740] device (tapd7d27cdd-9f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:46:09 compute-0 systemd-machined[152794]: New machine qemu-74-instance-00000098.
Sep 30 21:46:09 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:09.775 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[b1d646b4-0180-49cf-9fae-72897e9c51e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:09 compute-0 systemd[1]: Started Virtual Machine qemu-74-instance-00000098.
Sep 30 21:46:09 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:09.791 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[284aed85-2337-4733-adee-eeacf9e87cc7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:09 compute-0 nova_compute[192810]: 2025-09-30 21:46:09.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:09 compute-0 ovn_controller[94912]: 2025-09-30T21:46:09Z|00586|binding|INFO|Setting lport d7d27cdd-9f0b-467f-901a-08b4834d1496 ovn-installed in OVS
Sep 30 21:46:09 compute-0 ovn_controller[94912]: 2025-09-30T21:46:09Z|00587|binding|INFO|Setting lport d7d27cdd-9f0b-467f-901a-08b4834d1496 up in Southbound
Sep 30 21:46:09 compute-0 nova_compute[192810]: 2025-09-30 21:46:09.797 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:09 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:09.821 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[b3740f33-ece1-44a3-9810-65e2529f41e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:09 compute-0 NetworkManager[51733]: <info>  [1759268769.8268] manager: (tap5bda010c-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/259)
Sep 30 21:46:09 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:09.826 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[8fa226c9-17ce-4082-bfcf-946bf4f78561]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:09 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:09.863 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[0caf35ef-d2a3-45a4-b784-8ff4f21a8e07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:09 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:09.868 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[c751dc74-f33d-44f8-99ea-5f175f9fc2c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:09 compute-0 NetworkManager[51733]: <info>  [1759268769.8943] device (tap5bda010c-f0): carrier: link connected
Sep 30 21:46:09 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:09.900 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[7f6d378f-6a1b-4ae1-a9b8-a951ffb1f995]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:09 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:09.918 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[4933a644-754f-4c3e-b5c9-fc8e5bd528b0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5bda010c-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3e:01:5e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 177], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 546551, 'reachable_time': 41048, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244172, 'error': None, 'target': 'ovnmeta-5bda010c-f47e-4b74-9f9e-0682da4daba2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:09 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:09.934 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[28e3c1ba-e8c1-45a3-b5a0-0d9248c71f59]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe3e:15e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 546551, 'tstamp': 546551}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 244173, 'error': None, 'target': 'ovnmeta-5bda010c-f47e-4b74-9f9e-0682da4daba2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:09 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:09.949 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[3f3bbddd-44c8-4269-8bb5-bffdb6d3a6ed]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5bda010c-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3e:01:5e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 177], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 546551, 'reachable_time': 41048, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 244174, 'error': None, 'target': 'ovnmeta-5bda010c-f47e-4b74-9f9e-0682da4daba2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:09 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:09.981 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[c72c8f7f-99ab-423a-ae65-efd870b26368]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:10.032 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[766d18eb-cdfe-4950-86de-aa3ca4529758]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:10.034 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5bda010c-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:46:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:10.034 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:46:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:10.035 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5bda010c-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:46:10 compute-0 nova_compute[192810]: 2025-09-30 21:46:10.036 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:10 compute-0 kernel: tap5bda010c-f0: entered promiscuous mode
Sep 30 21:46:10 compute-0 NetworkManager[51733]: <info>  [1759268770.0373] manager: (tap5bda010c-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/260)
Sep 30 21:46:10 compute-0 nova_compute[192810]: 2025-09-30 21:46:10.038 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:10.039 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5bda010c-f0, col_values=(('external_ids', {'iface-id': '1e783af7-70e3-416e-b2fa-da6490df0917'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:46:10 compute-0 nova_compute[192810]: 2025-09-30 21:46:10.040 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:10 compute-0 ovn_controller[94912]: 2025-09-30T21:46:10Z|00588|binding|INFO|Releasing lport 1e783af7-70e3-416e-b2fa-da6490df0917 from this chassis (sb_readonly=0)
Sep 30 21:46:10 compute-0 nova_compute[192810]: 2025-09-30 21:46:10.051 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:10.052 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5bda010c-f47e-4b74-9f9e-0682da4daba2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5bda010c-f47e-4b74-9f9e-0682da4daba2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:46:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:10.053 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[384ed52d-eec4-48f6-90b4-3166687c7099]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:10.054 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:46:10 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:46:10 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:46:10 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-5bda010c-f47e-4b74-9f9e-0682da4daba2
Sep 30 21:46:10 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:46:10 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:46:10 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:46:10 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/5bda010c-f47e-4b74-9f9e-0682da4daba2.pid.haproxy
Sep 30 21:46:10 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:46:10 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:46:10 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:46:10 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:46:10 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:46:10 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:46:10 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:46:10 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:46:10 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:46:10 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:46:10 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:46:10 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:46:10 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:46:10 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:46:10 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:46:10 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:46:10 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:46:10 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:46:10 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:46:10 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:46:10 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID 5bda010c-f47e-4b74-9f9e-0682da4daba2
Sep 30 21:46:10 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:46:10 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:10.055 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5bda010c-f47e-4b74-9f9e-0682da4daba2', 'env', 'PROCESS_TAG=haproxy-5bda010c-f47e-4b74-9f9e-0682da4daba2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5bda010c-f47e-4b74-9f9e-0682da4daba2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:46:10 compute-0 nova_compute[192810]: 2025-09-30 21:46:10.060 2 DEBUG nova.compute.manager [req-aee8250a-3626-4789-97a8-2b4d46bc8943 req-268eddd5-b358-4ce3-bd37-c68b9daa4fdc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Received event network-vif-plugged-d7d27cdd-9f0b-467f-901a-08b4834d1496 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:46:10 compute-0 nova_compute[192810]: 2025-09-30 21:46:10.061 2 DEBUG oslo_concurrency.lockutils [req-aee8250a-3626-4789-97a8-2b4d46bc8943 req-268eddd5-b358-4ce3-bd37-c68b9daa4fdc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:46:10 compute-0 nova_compute[192810]: 2025-09-30 21:46:10.061 2 DEBUG oslo_concurrency.lockutils [req-aee8250a-3626-4789-97a8-2b4d46bc8943 req-268eddd5-b358-4ce3-bd37-c68b9daa4fdc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:46:10 compute-0 nova_compute[192810]: 2025-09-30 21:46:10.061 2 DEBUG oslo_concurrency.lockutils [req-aee8250a-3626-4789-97a8-2b4d46bc8943 req-268eddd5-b358-4ce3-bd37-c68b9daa4fdc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:46:10 compute-0 nova_compute[192810]: 2025-09-30 21:46:10.062 2 DEBUG nova.compute.manager [req-aee8250a-3626-4789-97a8-2b4d46bc8943 req-268eddd5-b358-4ce3-bd37-c68b9daa4fdc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Processing event network-vif-plugged-d7d27cdd-9f0b-467f-901a-08b4834d1496 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:46:10 compute-0 nova_compute[192810]: 2025-09-30 21:46:10.319 2 DEBUG nova.network.neutron [req-6d9321c3-2b96-4a3b-998c-f1357216a8a8 req-f2f3a2f8-f6be-4b45-b8e6-dfd84a57cc61 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Updated VIF entry in instance network info cache for port d7d27cdd-9f0b-467f-901a-08b4834d1496. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:46:10 compute-0 nova_compute[192810]: 2025-09-30 21:46:10.319 2 DEBUG nova.network.neutron [req-6d9321c3-2b96-4a3b-998c-f1357216a8a8 req-f2f3a2f8-f6be-4b45-b8e6-dfd84a57cc61 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Updating instance_info_cache with network_info: [{"id": "d7d27cdd-9f0b-467f-901a-08b4834d1496", "address": "fa:16:3e:0d:ba:7b", "network": {"id": "5bda010c-f47e-4b74-9f9e-0682da4daba2", "bridge": "br-int", "label": "tempest-network-smoke--746378667", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7d27cdd-9f", "ovs_interfaceid": "d7d27cdd-9f0b-467f-901a-08b4834d1496", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:46:10 compute-0 nova_compute[192810]: 2025-09-30 21:46:10.348 2 DEBUG oslo_concurrency.lockutils [req-6d9321c3-2b96-4a3b-998c-f1357216a8a8 req-f2f3a2f8-f6be-4b45-b8e6-dfd84a57cc61 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-c1ffcb24-6b20-4ed8-8287-65fb4e98d88c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:46:10 compute-0 podman[244208]: 2025-09-30 21:46:10.390184294 +0000 UTC m=+0.051885856 container create 1fbd81f65cf2dd4cc60999a8bbc1ad03a6dacfbd9e513e3ce38bcfc62cc54000 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5bda010c-f47e-4b74-9f9e-0682da4daba2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Sep 30 21:46:10 compute-0 systemd[1]: Started libpod-conmon-1fbd81f65cf2dd4cc60999a8bbc1ad03a6dacfbd9e513e3ce38bcfc62cc54000.scope.
Sep 30 21:46:10 compute-0 podman[244208]: 2025-09-30 21:46:10.359889765 +0000 UTC m=+0.021591347 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:46:10 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:46:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4445f6268d02deab68fbeb61bdc54140966c12eb40d3a7960a9c7debef465485/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:46:10 compute-0 podman[244208]: 2025-09-30 21:46:10.487924848 +0000 UTC m=+0.149626440 container init 1fbd81f65cf2dd4cc60999a8bbc1ad03a6dacfbd9e513e3ce38bcfc62cc54000 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5bda010c-f47e-4b74-9f9e-0682da4daba2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:46:10 compute-0 podman[244208]: 2025-09-30 21:46:10.493160886 +0000 UTC m=+0.154862448 container start 1fbd81f65cf2dd4cc60999a8bbc1ad03a6dacfbd9e513e3ce38bcfc62cc54000 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5bda010c-f47e-4b74-9f9e-0682da4daba2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:46:10 compute-0 neutron-haproxy-ovnmeta-5bda010c-f47e-4b74-9f9e-0682da4daba2[244229]: [NOTICE]   (244233) : New worker (244235) forked
Sep 30 21:46:10 compute-0 neutron-haproxy-ovnmeta-5bda010c-f47e-4b74-9f9e-0682da4daba2[244229]: [NOTICE]   (244233) : Loading success.
Sep 30 21:46:10 compute-0 nova_compute[192810]: 2025-09-30 21:46:10.809 2 DEBUG nova.compute.manager [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:46:10 compute-0 nova_compute[192810]: 2025-09-30 21:46:10.811 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268770.809494, c1ffcb24-6b20-4ed8-8287-65fb4e98d88c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:46:10 compute-0 nova_compute[192810]: 2025-09-30 21:46:10.811 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] VM Started (Lifecycle Event)
Sep 30 21:46:10 compute-0 nova_compute[192810]: 2025-09-30 21:46:10.815 2 DEBUG nova.virt.libvirt.driver [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:46:10 compute-0 nova_compute[192810]: 2025-09-30 21:46:10.818 2 INFO nova.virt.libvirt.driver [-] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Instance spawned successfully.
Sep 30 21:46:10 compute-0 nova_compute[192810]: 2025-09-30 21:46:10.819 2 DEBUG nova.virt.libvirt.driver [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:46:10 compute-0 nova_compute[192810]: 2025-09-30 21:46:10.844 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:46:10 compute-0 nova_compute[192810]: 2025-09-30 21:46:10.850 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:46:10 compute-0 nova_compute[192810]: 2025-09-30 21:46:10.855 2 DEBUG nova.virt.libvirt.driver [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:46:10 compute-0 nova_compute[192810]: 2025-09-30 21:46:10.855 2 DEBUG nova.virt.libvirt.driver [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:46:10 compute-0 nova_compute[192810]: 2025-09-30 21:46:10.855 2 DEBUG nova.virt.libvirt.driver [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:46:10 compute-0 nova_compute[192810]: 2025-09-30 21:46:10.856 2 DEBUG nova.virt.libvirt.driver [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:46:10 compute-0 nova_compute[192810]: 2025-09-30 21:46:10.856 2 DEBUG nova.virt.libvirt.driver [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:46:10 compute-0 nova_compute[192810]: 2025-09-30 21:46:10.857 2 DEBUG nova.virt.libvirt.driver [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:46:10 compute-0 nova_compute[192810]: 2025-09-30 21:46:10.904 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:46:10 compute-0 nova_compute[192810]: 2025-09-30 21:46:10.904 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268770.8097022, c1ffcb24-6b20-4ed8-8287-65fb4e98d88c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:46:10 compute-0 nova_compute[192810]: 2025-09-30 21:46:10.905 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] VM Paused (Lifecycle Event)
Sep 30 21:46:10 compute-0 nova_compute[192810]: 2025-09-30 21:46:10.938 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:46:10 compute-0 nova_compute[192810]: 2025-09-30 21:46:10.941 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268770.813059, c1ffcb24-6b20-4ed8-8287-65fb4e98d88c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:46:10 compute-0 nova_compute[192810]: 2025-09-30 21:46:10.942 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] VM Resumed (Lifecycle Event)
Sep 30 21:46:10 compute-0 nova_compute[192810]: 2025-09-30 21:46:10.962 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:46:10 compute-0 nova_compute[192810]: 2025-09-30 21:46:10.964 2 INFO nova.compute.manager [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Took 6.23 seconds to spawn the instance on the hypervisor.
Sep 30 21:46:10 compute-0 nova_compute[192810]: 2025-09-30 21:46:10.965 2 DEBUG nova.compute.manager [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:46:10 compute-0 nova_compute[192810]: 2025-09-30 21:46:10.967 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:46:10 compute-0 nova_compute[192810]: 2025-09-30 21:46:10.994 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:46:11 compute-0 nova_compute[192810]: 2025-09-30 21:46:11.048 2 INFO nova.compute.manager [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Took 6.75 seconds to build instance.
Sep 30 21:46:11 compute-0 nova_compute[192810]: 2025-09-30 21:46:11.083 2 DEBUG oslo_concurrency.lockutils [None req-721b8bd4-b151-43f9-a384-f6549031975d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "c1ffcb24-6b20-4ed8-8287-65fb4e98d88c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.885s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:46:12 compute-0 nova_compute[192810]: 2025-09-30 21:46:12.172 2 DEBUG nova.compute.manager [req-8eb0ed13-b210-4b25-8cde-9644d19ccd35 req-660ef279-b549-4812-8d17-1119d5880718 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Received event network-vif-plugged-d7d27cdd-9f0b-467f-901a-08b4834d1496 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:46:12 compute-0 nova_compute[192810]: 2025-09-30 21:46:12.173 2 DEBUG oslo_concurrency.lockutils [req-8eb0ed13-b210-4b25-8cde-9644d19ccd35 req-660ef279-b549-4812-8d17-1119d5880718 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:46:12 compute-0 nova_compute[192810]: 2025-09-30 21:46:12.173 2 DEBUG oslo_concurrency.lockutils [req-8eb0ed13-b210-4b25-8cde-9644d19ccd35 req-660ef279-b549-4812-8d17-1119d5880718 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:46:12 compute-0 nova_compute[192810]: 2025-09-30 21:46:12.173 2 DEBUG oslo_concurrency.lockutils [req-8eb0ed13-b210-4b25-8cde-9644d19ccd35 req-660ef279-b549-4812-8d17-1119d5880718 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:46:12 compute-0 nova_compute[192810]: 2025-09-30 21:46:12.174 2 DEBUG nova.compute.manager [req-8eb0ed13-b210-4b25-8cde-9644d19ccd35 req-660ef279-b549-4812-8d17-1119d5880718 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] No waiting events found dispatching network-vif-plugged-d7d27cdd-9f0b-467f-901a-08b4834d1496 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:46:12 compute-0 nova_compute[192810]: 2025-09-30 21:46:12.174 2 WARNING nova.compute.manager [req-8eb0ed13-b210-4b25-8cde-9644d19ccd35 req-660ef279-b549-4812-8d17-1119d5880718 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Received unexpected event network-vif-plugged-d7d27cdd-9f0b-467f-901a-08b4834d1496 for instance with vm_state active and task_state None.
Sep 30 21:46:12 compute-0 nova_compute[192810]: 2025-09-30 21:46:12.935 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:13 compute-0 nova_compute[192810]: 2025-09-30 21:46:13.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:14 compute-0 nova_compute[192810]: 2025-09-30 21:46:14.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:14 compute-0 NetworkManager[51733]: <info>  [1759268774.0438] manager: (patch-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/261)
Sep 30 21:46:14 compute-0 NetworkManager[51733]: <info>  [1759268774.0452] manager: (patch-br-int-to-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/262)
Sep 30 21:46:14 compute-0 nova_compute[192810]: 2025-09-30 21:46:14.163 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:14 compute-0 ovn_controller[94912]: 2025-09-30T21:46:14Z|00589|binding|INFO|Releasing lport 1e783af7-70e3-416e-b2fa-da6490df0917 from this chassis (sb_readonly=0)
Sep 30 21:46:14 compute-0 nova_compute[192810]: 2025-09-30 21:46:14.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:14 compute-0 podman[244246]: 2025-09-30 21:46:14.312515155 +0000 UTC m=+0.050217836 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Sep 30 21:46:14 compute-0 podman[244247]: 2025-09-30 21:46:14.32257436 +0000 UTC m=+0.057695988 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923)
Sep 30 21:46:14 compute-0 podman[244245]: 2025-09-30 21:46:14.338904259 +0000 UTC m=+0.076607020 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true)
Sep 30 21:46:14 compute-0 nova_compute[192810]: 2025-09-30 21:46:14.516 2 DEBUG nova.compute.manager [req-1fb0cc45-837b-4671-af1d-72c9a5c531ad req-cdf2a06b-57d3-4f9c-ae68-4a6d11d47053 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Received event network-changed-d7d27cdd-9f0b-467f-901a-08b4834d1496 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:46:14 compute-0 nova_compute[192810]: 2025-09-30 21:46:14.517 2 DEBUG nova.compute.manager [req-1fb0cc45-837b-4671-af1d-72c9a5c531ad req-cdf2a06b-57d3-4f9c-ae68-4a6d11d47053 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Refreshing instance network info cache due to event network-changed-d7d27cdd-9f0b-467f-901a-08b4834d1496. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:46:14 compute-0 nova_compute[192810]: 2025-09-30 21:46:14.517 2 DEBUG oslo_concurrency.lockutils [req-1fb0cc45-837b-4671-af1d-72c9a5c531ad req-cdf2a06b-57d3-4f9c-ae68-4a6d11d47053 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-c1ffcb24-6b20-4ed8-8287-65fb4e98d88c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:46:14 compute-0 nova_compute[192810]: 2025-09-30 21:46:14.517 2 DEBUG oslo_concurrency.lockutils [req-1fb0cc45-837b-4671-af1d-72c9a5c531ad req-cdf2a06b-57d3-4f9c-ae68-4a6d11d47053 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-c1ffcb24-6b20-4ed8-8287-65fb4e98d88c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:46:14 compute-0 nova_compute[192810]: 2025-09-30 21:46:14.517 2 DEBUG nova.network.neutron [req-1fb0cc45-837b-4671-af1d-72c9a5c531ad req-cdf2a06b-57d3-4f9c-ae68-4a6d11d47053 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Refreshing network info cache for port d7d27cdd-9f0b-467f-901a-08b4834d1496 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:46:16 compute-0 nova_compute[192810]: 2025-09-30 21:46:16.132 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:16 compute-0 nova_compute[192810]: 2025-09-30 21:46:16.570 2 DEBUG nova.network.neutron [req-1fb0cc45-837b-4671-af1d-72c9a5c531ad req-cdf2a06b-57d3-4f9c-ae68-4a6d11d47053 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Updated VIF entry in instance network info cache for port d7d27cdd-9f0b-467f-901a-08b4834d1496. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:46:16 compute-0 nova_compute[192810]: 2025-09-30 21:46:16.571 2 DEBUG nova.network.neutron [req-1fb0cc45-837b-4671-af1d-72c9a5c531ad req-cdf2a06b-57d3-4f9c-ae68-4a6d11d47053 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Updating instance_info_cache with network_info: [{"id": "d7d27cdd-9f0b-467f-901a-08b4834d1496", "address": "fa:16:3e:0d:ba:7b", "network": {"id": "5bda010c-f47e-4b74-9f9e-0682da4daba2", "bridge": "br-int", "label": "tempest-network-smoke--746378667", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7d27cdd-9f", "ovs_interfaceid": "d7d27cdd-9f0b-467f-901a-08b4834d1496", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:46:16 compute-0 nova_compute[192810]: 2025-09-30 21:46:16.595 2 DEBUG oslo_concurrency.lockutils [req-1fb0cc45-837b-4671-af1d-72c9a5c531ad req-cdf2a06b-57d3-4f9c-ae68-4a6d11d47053 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-c1ffcb24-6b20-4ed8-8287-65fb4e98d88c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:46:17 compute-0 nova_compute[192810]: 2025-09-30 21:46:17.936 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:18 compute-0 nova_compute[192810]: 2025-09-30 21:46:18.613 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:19 compute-0 nova_compute[192810]: 2025-09-30 21:46:19.962 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:21.787 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b2:b1:8a 2001:db8:0:1:f816:3eff:feb2:b18a 2001:db8::f816:3eff:feb2:b18a'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:feb2:b18a/64 2001:db8::f816:3eff:feb2:b18a/64', 'neutron:device_id': 'ovnmeta-d22f103a-1a95-4031-ae6e-c474eae9834e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d22f103a-1a95-4031-ae6e-c474eae9834e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9942446e-c20f-4a7d-bedc-ac08b4f4b886, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=4bd0bdb1-40a0-42f8-98d1-c84ba21808c1) old=Port_Binding(mac=['fa:16:3e:b2:b1:8a 2001:db8::f816:3eff:feb2:b18a'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:feb2:b18a/64', 'neutron:device_id': 'ovnmeta-d22f103a-1a95-4031-ae6e-c474eae9834e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d22f103a-1a95-4031-ae6e-c474eae9834e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:46:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:21.789 103867 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 4bd0bdb1-40a0-42f8-98d1-c84ba21808c1 in datapath d22f103a-1a95-4031-ae6e-c474eae9834e updated
Sep 30 21:46:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:21.790 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d22f103a-1a95-4031-ae6e-c474eae9834e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:46:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:21.791 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[4d31dd12-8b76-4ee0-afef-9d1891f297e9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:22 compute-0 podman[244323]: 2025-09-30 21:46:22.315929078 +0000 UTC m=+0.052438300 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 21:46:22 compute-0 podman[244324]: 2025-09-30 21:46:22.348456412 +0000 UTC m=+0.083730044 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=9.6, io.openshift.tags=minimal rhel9, release=1755695350, maintainer=Red Hat, Inc., name=ubi9-minimal)
Sep 30 21:46:22 compute-0 nova_compute[192810]: 2025-09-30 21:46:22.938 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:23 compute-0 nova_compute[192810]: 2025-09-30 21:46:23.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:24 compute-0 ovn_controller[94912]: 2025-09-30T21:46:24Z|00062|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:0d:ba:7b 10.100.0.4
Sep 30 21:46:24 compute-0 ovn_controller[94912]: 2025-09-30T21:46:24Z|00063|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0d:ba:7b 10.100.0.4
Sep 30 21:46:27 compute-0 nova_compute[192810]: 2025-09-30 21:46:27.939 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:28 compute-0 nova_compute[192810]: 2025-09-30 21:46:28.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:29 compute-0 nova_compute[192810]: 2025-09-30 21:46:29.823 2 INFO nova.compute.manager [None req-83eac70d-a582-4d4b-9a23-ff97fc6e5863 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Get console output
Sep 30 21:46:29 compute-0 nova_compute[192810]: 2025-09-30 21:46:29.829 54 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Sep 30 21:46:30 compute-0 podman[244369]: 2025-09-30 21:46:30.319520563 +0000 UTC m=+0.047354456 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 21:46:30 compute-0 podman[244367]: 2025-09-30 21:46:30.319398 +0000 UTC m=+0.052280936 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.build-date=20250923, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 21:46:30 compute-0 podman[244368]: 2025-09-30 21:46:30.346287516 +0000 UTC m=+0.078959847 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid)
Sep 30 21:46:30 compute-0 nova_compute[192810]: 2025-09-30 21:46:30.920 2 INFO nova.compute.manager [None req-edba0d0a-6cd6-4c81-ba13-0e65ede87c43 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Get console output
Sep 30 21:46:30 compute-0 nova_compute[192810]: 2025-09-30 21:46:30.924 54 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Sep 30 21:46:32 compute-0 nova_compute[192810]: 2025-09-30 21:46:32.941 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:33 compute-0 nova_compute[192810]: 2025-09-30 21:46:33.638 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:34 compute-0 nova_compute[192810]: 2025-09-30 21:46:34.549 2 DEBUG nova.virt.libvirt.driver [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Check if temp file /var/lib/nova/instances/tmp83k0dk2g exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Sep 30 21:46:34 compute-0 nova_compute[192810]: 2025-09-30 21:46:34.555 2 DEBUG oslo_concurrency.processutils [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:46:34 compute-0 nova_compute[192810]: 2025-09-30 21:46:34.609 2 DEBUG oslo_concurrency.processutils [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:46:34 compute-0 nova_compute[192810]: 2025-09-30 21:46:34.610 2 DEBUG oslo_concurrency.processutils [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:46:34 compute-0 nova_compute[192810]: 2025-09-30 21:46:34.675 2 DEBUG oslo_concurrency.processutils [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:46:34 compute-0 nova_compute[192810]: 2025-09-30 21:46:34.677 2 DEBUG nova.compute.manager [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp83k0dk2g',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c1ffcb24-6b20-4ed8-8287-65fb4e98d88c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Sep 30 21:46:35 compute-0 nova_compute[192810]: 2025-09-30 21:46:35.403 2 DEBUG oslo_concurrency.processutils [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:46:35 compute-0 nova_compute[192810]: 2025-09-30 21:46:35.459 2 DEBUG oslo_concurrency.processutils [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:46:35 compute-0 nova_compute[192810]: 2025-09-30 21:46:35.460 2 DEBUG oslo_concurrency.processutils [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:46:35 compute-0 nova_compute[192810]: 2025-09-30 21:46:35.511 2 DEBUG oslo_concurrency.processutils [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:46:37 compute-0 nova_compute[192810]: 2025-09-30 21:46:37.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:38 compute-0 nova_compute[192810]: 2025-09-30 21:46:38.683 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:38.753 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:46:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:38.754 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:46:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:38.754 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:46:39 compute-0 sshd-session[244442]: Accepted publickey for nova from 192.168.122.101 port 39784 ssh2: ECDSA SHA256:MZb8WjUIxCo1ZPhM/oSWWpmJKsqmELiNET2dwGEt9P4
Sep 30 21:46:39 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Sep 30 21:46:39 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Sep 30 21:46:39 compute-0 systemd-logind[792]: New session 40 of user nova.
Sep 30 21:46:39 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Sep 30 21:46:39 compute-0 systemd[1]: Starting User Manager for UID 42436...
Sep 30 21:46:39 compute-0 systemd[244446]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:46:39 compute-0 systemd[244446]: Queued start job for default target Main User Target.
Sep 30 21:46:39 compute-0 systemd[244446]: Created slice User Application Slice.
Sep 30 21:46:39 compute-0 systemd[244446]: Started Mark boot as successful after the user session has run 2 minutes.
Sep 30 21:46:39 compute-0 systemd[244446]: Started Daily Cleanup of User's Temporary Directories.
Sep 30 21:46:39 compute-0 systemd[244446]: Reached target Paths.
Sep 30 21:46:39 compute-0 systemd[244446]: Reached target Timers.
Sep 30 21:46:39 compute-0 systemd[244446]: Starting D-Bus User Message Bus Socket...
Sep 30 21:46:39 compute-0 systemd[244446]: Starting Create User's Volatile Files and Directories...
Sep 30 21:46:39 compute-0 systemd[244446]: Listening on D-Bus User Message Bus Socket.
Sep 30 21:46:39 compute-0 systemd[244446]: Reached target Sockets.
Sep 30 21:46:40 compute-0 systemd[244446]: Finished Create User's Volatile Files and Directories.
Sep 30 21:46:40 compute-0 systemd[244446]: Reached target Basic System.
Sep 30 21:46:40 compute-0 systemd[244446]: Reached target Main User Target.
Sep 30 21:46:40 compute-0 systemd[244446]: Startup finished in 162ms.
Sep 30 21:46:40 compute-0 systemd[1]: Started User Manager for UID 42436.
Sep 30 21:46:40 compute-0 systemd[1]: Started Session 40 of User nova.
Sep 30 21:46:40 compute-0 sshd-session[244442]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 21:46:40 compute-0 sshd-session[244461]: Received disconnect from 192.168.122.101 port 39784:11: disconnected by user
Sep 30 21:46:40 compute-0 sshd-session[244461]: Disconnected from user nova 192.168.122.101 port 39784
Sep 30 21:46:40 compute-0 sshd-session[244442]: pam_unix(sshd:session): session closed for user nova
Sep 30 21:46:40 compute-0 systemd[1]: session-40.scope: Deactivated successfully.
Sep 30 21:46:40 compute-0 systemd-logind[792]: Session 40 logged out. Waiting for processes to exit.
Sep 30 21:46:40 compute-0 systemd-logind[792]: Removed session 40.
Sep 30 21:46:42 compute-0 nova_compute[192810]: 2025-09-30 21:46:42.503 2 DEBUG nova.compute.manager [req-9f14e00e-8106-4ea3-9b97-4f306eb87963 req-17be199e-86bf-4031-8663-cf3c509d7da9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Received event network-vif-unplugged-d7d27cdd-9f0b-467f-901a-08b4834d1496 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:46:42 compute-0 nova_compute[192810]: 2025-09-30 21:46:42.505 2 DEBUG oslo_concurrency.lockutils [req-9f14e00e-8106-4ea3-9b97-4f306eb87963 req-17be199e-86bf-4031-8663-cf3c509d7da9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:46:42 compute-0 nova_compute[192810]: 2025-09-30 21:46:42.506 2 DEBUG oslo_concurrency.lockutils [req-9f14e00e-8106-4ea3-9b97-4f306eb87963 req-17be199e-86bf-4031-8663-cf3c509d7da9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:46:42 compute-0 nova_compute[192810]: 2025-09-30 21:46:42.506 2 DEBUG oslo_concurrency.lockutils [req-9f14e00e-8106-4ea3-9b97-4f306eb87963 req-17be199e-86bf-4031-8663-cf3c509d7da9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:46:42 compute-0 nova_compute[192810]: 2025-09-30 21:46:42.506 2 DEBUG nova.compute.manager [req-9f14e00e-8106-4ea3-9b97-4f306eb87963 req-17be199e-86bf-4031-8663-cf3c509d7da9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] No waiting events found dispatching network-vif-unplugged-d7d27cdd-9f0b-467f-901a-08b4834d1496 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:46:42 compute-0 nova_compute[192810]: 2025-09-30 21:46:42.506 2 DEBUG nova.compute.manager [req-9f14e00e-8106-4ea3-9b97-4f306eb87963 req-17be199e-86bf-4031-8663-cf3c509d7da9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Received event network-vif-unplugged-d7d27cdd-9f0b-467f-901a-08b4834d1496 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:46:42 compute-0 ovn_controller[94912]: 2025-09-30T21:46:42Z|00590|binding|INFO|Releasing lport 1e783af7-70e3-416e-b2fa-da6490df0917 from this chassis (sb_readonly=0)
Sep 30 21:46:42 compute-0 nova_compute[192810]: 2025-09-30 21:46:42.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:43 compute-0 nova_compute[192810]: 2025-09-30 21:46:43.268 2 INFO nova.compute.manager [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Took 7.76 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Sep 30 21:46:43 compute-0 nova_compute[192810]: 2025-09-30 21:46:43.269 2 DEBUG nova.compute.manager [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:46:43 compute-0 nova_compute[192810]: 2025-09-30 21:46:43.299 2 DEBUG nova.compute.manager [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp83k0dk2g',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c1ffcb24-6b20-4ed8-8287-65fb4e98d88c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(917e2289-d24f-4d71-badb-fcedb6e9a494),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Sep 30 21:46:43 compute-0 nova_compute[192810]: 2025-09-30 21:46:43.330 2 DEBUG nova.objects.instance [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Lazy-loading 'migration_context' on Instance uuid c1ffcb24-6b20-4ed8-8287-65fb4e98d88c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:46:43 compute-0 nova_compute[192810]: 2025-09-30 21:46:43.332 2 DEBUG nova.virt.libvirt.driver [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Sep 30 21:46:43 compute-0 nova_compute[192810]: 2025-09-30 21:46:43.334 2 DEBUG nova.virt.libvirt.driver [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Sep 30 21:46:43 compute-0 nova_compute[192810]: 2025-09-30 21:46:43.335 2 DEBUG nova.virt.libvirt.driver [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Sep 30 21:46:43 compute-0 nova_compute[192810]: 2025-09-30 21:46:43.350 2 DEBUG nova.virt.libvirt.vif [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:46:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-179264408',display_name='tempest-TestNetworkAdvancedServerOps-server-179264408',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-179264408',id=152,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOjZFWrOkGLtM22QkbWpGIMaBp+BaIFIIqX3JnKMPx79D7bs+sVkeOWPP9lZzJjeCe2zeve1vj7ZAmUVopE269mNakJeQ6oTVT41jDINH3f5N8CxQSQQns3MXA4tC9JjTQ==',key_name='tempest-TestNetworkAdvancedServerOps-144739839',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:46:10Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='075b1efc4c8e4cb1b28d61b042c451e9',ramdisk_id='',reservation_id='r-xa5oqg6r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-374190229',owner_user_name='tempest-TestNetworkAdvancedServerOps-374190229-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:46:11Z,user_data=None,user_id='185cc8ad7e1445d2ab5006153ab19700',uuid=c1ffcb24-6b20-4ed8-8287-65fb4e98d88c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d7d27cdd-9f0b-467f-901a-08b4834d1496", "address": "fa:16:3e:0d:ba:7b", "network": {"id": "5bda010c-f47e-4b74-9f9e-0682da4daba2", "bridge": "br-int", "label": "tempest-network-smoke--746378667", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapd7d27cdd-9f", "ovs_interfaceid": "d7d27cdd-9f0b-467f-901a-08b4834d1496", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:46:43 compute-0 nova_compute[192810]: 2025-09-30 21:46:43.351 2 DEBUG nova.network.os_vif_util [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Converting VIF {"id": "d7d27cdd-9f0b-467f-901a-08b4834d1496", "address": "fa:16:3e:0d:ba:7b", "network": {"id": "5bda010c-f47e-4b74-9f9e-0682da4daba2", "bridge": "br-int", "label": "tempest-network-smoke--746378667", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapd7d27cdd-9f", "ovs_interfaceid": "d7d27cdd-9f0b-467f-901a-08b4834d1496", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:46:43 compute-0 nova_compute[192810]: 2025-09-30 21:46:43.351 2 DEBUG nova.network.os_vif_util [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0d:ba:7b,bridge_name='br-int',has_traffic_filtering=True,id=d7d27cdd-9f0b-467f-901a-08b4834d1496,network=Network(5bda010c-f47e-4b74-9f9e-0682da4daba2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7d27cdd-9f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:46:43 compute-0 nova_compute[192810]: 2025-09-30 21:46:43.352 2 DEBUG nova.virt.libvirt.migration [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Updating guest XML with vif config: <interface type="ethernet">
Sep 30 21:46:43 compute-0 nova_compute[192810]:   <mac address="fa:16:3e:0d:ba:7b"/>
Sep 30 21:46:43 compute-0 nova_compute[192810]:   <model type="virtio"/>
Sep 30 21:46:43 compute-0 nova_compute[192810]:   <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:46:43 compute-0 nova_compute[192810]:   <mtu size="1442"/>
Sep 30 21:46:43 compute-0 nova_compute[192810]:   <target dev="tapd7d27cdd-9f"/>
Sep 30 21:46:43 compute-0 nova_compute[192810]: </interface>
Sep 30 21:46:43 compute-0 nova_compute[192810]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Sep 30 21:46:43 compute-0 nova_compute[192810]: 2025-09-30 21:46:43.352 2 DEBUG nova.virt.libvirt.driver [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Sep 30 21:46:43 compute-0 nova_compute[192810]: 2025-09-30 21:46:43.686 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:43 compute-0 nova_compute[192810]: 2025-09-30 21:46:43.838 2 DEBUG nova.virt.libvirt.migration [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Sep 30 21:46:43 compute-0 nova_compute[192810]: 2025-09-30 21:46:43.838 2 INFO nova.virt.libvirt.migration [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Increasing downtime to 50 ms after 0 sec elapsed time
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.911 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000098', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'hostId': '52e45b1d322680f820ff93b9ba92ccdf3eee0711b3aef168050ac4fc', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.911 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.922 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.923 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c94f6457-56f8-41cf-92d3-bf5671b02419', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-vda', 'timestamp': '2025-09-30T21:46:43.911730', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'instance-00000098', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': '52e45b1d322680f820ff93b9ba92ccdf3eee0711b3aef168050ac4fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f4ba430e-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5499.599226926, 'message_signature': 'bccfcc1e25ae11ef14d63a1167efce67aa8fac6e2706ae6d37c20f503bd232c1'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 
'resource_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-sda', 'timestamp': '2025-09-30T21:46:43.911730', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'instance-00000098', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': '52e45b1d322680f820ff93b9ba92ccdf3eee0711b3aef168050ac4fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f4ba55b0-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5499.599226926, 'message_signature': '35ba1917ecd199fab3fb1dc98efc3ad0a3268d7b3d9d19103015d3439ce0dea9'}]}, 'timestamp': '2025-09-30 21:46:43.923831', '_unique_id': '95f51db102ff4173973e405e2c1d45fe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.924 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.926 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.941 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk.device.write.bytes volume: 72962048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.942 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6cea0de8-8e71-400a-aff1-4aa1ce0bb180', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72962048, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-vda', 'timestamp': '2025-09-30T21:46:43.926284', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'instance-00000098', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': '52e45b1d322680f820ff93b9ba92ccdf3eee0711b3aef168050ac4fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f4bd1e26-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5499.613802841, 'message_signature': 'c17b905f9a221beb5682a7803080fee9276729a843988b5c8697bfe0ceed691f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 
'resource_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-sda', 'timestamp': '2025-09-30T21:46:43.926284', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'instance-00000098', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': '52e45b1d322680f820ff93b9ba92ccdf3eee0711b3aef168050ac4fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f4bd2c9a-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5499.613802841, 'message_signature': '6905f153ca81736c3e866cf347f7db180d5cae64f9fcfb4ca04b2f355ce08e50'}]}, 'timestamp': '2025-09-30 21:46:43.942432', '_unique_id': 'addf47086ae84c4680991a536f46831c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.943 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.944 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.947 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for c1ffcb24-6b20-4ed8-8287-65fb4e98d88c / tapd7d27cdd-9f inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.947 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/network.incoming.bytes volume: 10209 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '70152a5b-c819-480d-8752-395dee2f32e3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10209, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'instance-00000098-c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-tapd7d27cdd-9f', 'timestamp': '2025-09-30T21:46:43.944466', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'tapd7d27cdd-9f', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': '52e45b1d322680f820ff93b9ba92ccdf3eee0711b3aef168050ac4fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0d:ba:7b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd7d27cdd-9f'}, 'message_id': 'f4be09bc-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5499.632045836, 'message_signature': '36d6c092cf00dfabbb58cfe6063ecbc338354956ab9c7cab78fa2eb0ef1814dc'}]}, 'timestamp': '2025-09-30 21:46:43.948155', '_unique_id': 'd21cda5dc070439493a361d6c65d0b74'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.948 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.949 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.949 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4fb8a1ac-0bcd-4083-bf09-eb50a352de08', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'instance-00000098-c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-tapd7d27cdd-9f', 'timestamp': '2025-09-30T21:46:43.949728', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'tapd7d27cdd-9f', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': '52e45b1d322680f820ff93b9ba92ccdf3eee0711b3aef168050ac4fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0d:ba:7b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd7d27cdd-9f'}, 'message_id': 'f4be541c-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5499.632045836, 'message_signature': 'da5024cc069ea24426d0783452cc0087bdda2e63a519b701fc1eaf7a7feea82e'}]}, 'timestamp': '2025-09-30 21:46:43.950030', '_unique_id': '981a3c6a29f843af9c835a085a0c6f72'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.950 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.951 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Sep 30 21:46:43 compute-0 nova_compute[192810]: 2025-09-30 21:46:43.960 2 INFO nova.virt.libvirt.driver [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.965 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/cpu volume: 11640000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '55f92196-cc09-48b1-8707-a9c5f98f9372', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11640000000, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'timestamp': '2025-09-30T21:46:43.951581', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'instance-00000098', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': '52e45b1d322680f820ff93b9ba92ccdf3eee0711b3aef168050ac4fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'f4c0b96e-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5499.652669759, 'message_signature': '4453ff884fd3dad6206e4e4956d7b6a88472f75e8c1dd1b384aca91d9b196a11'}]}, 'timestamp': '2025-09-30 21:46:43.965775', '_unique_id': '9333887466884151a523da4ebb14ddbf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.966 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.967 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.967 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk.device.write.latency volume: 4634528276 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.967 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6cc475e4-a8bd-46e5-9377-f9968cbc2262', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4634528276, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-vda', 'timestamp': '2025-09-30T21:46:43.967450', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'instance-00000098', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': '52e45b1d322680f820ff93b9ba92ccdf3eee0711b3aef168050ac4fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f4c1090a-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5499.613802841, 'message_signature': '3d482caa76775c9e69958fc27714714fc7e5c36c5ae1352bd5460f4dea997667'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-sda', 'timestamp': '2025-09-30T21:46:43.967450', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'instance-00000098', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': '52e45b1d322680f820ff93b9ba92ccdf3eee0711b3aef168050ac4fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f4c1140e-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5499.613802841, 'message_signature': '50631bc20b327577d23502781a389860d01ce19a97fba96a300096d0957c6e3e'}]}, 'timestamp': '2025-09-30 21:46:43.968043', '_unique_id': 'e1fcb2fa26c14578a26fdc900a6274af'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.968 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.969 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.969 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/network.incoming.packets volume: 106 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a7f7cd53-cd7e-4cef-9ecb-05e83faeb4e8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 106, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'instance-00000098-c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-tapd7d27cdd-9f', 'timestamp': '2025-09-30T21:46:43.969609', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'tapd7d27cdd-9f', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': '52e45b1d322680f820ff93b9ba92ccdf3eee0711b3aef168050ac4fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0d:ba:7b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd7d27cdd-9f'}, 'message_id': 'f4c15d56-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5499.632045836, 'message_signature': '5e4a70c59590605df8401a25864fb43efb852bd988e46dd15fad50fffcc3fa39'}]}, 'timestamp': '2025-09-30 21:46:43.969923', '_unique_id': 'f27e38a63fe844f0b7d3687fa1f1e0f1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.970 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.971 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.971 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk.device.read.requests volume: 1137 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.971 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5f101b8f-d6a0-4d8c-bcd9-15de12bf901a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1137, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-vda', 'timestamp': '2025-09-30T21:46:43.971261', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'instance-00000098', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': '52e45b1d322680f820ff93b9ba92ccdf3eee0711b3aef168050ac4fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f4c19dca-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5499.613802841, 'message_signature': '4982f392fe62d0967b79e4ba1213d9f02b09f4c5d76e9fc706d479b2c54a9ea5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-sda', 'timestamp': '2025-09-30T21:46:43.971261', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'instance-00000098', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': '52e45b1d322680f820ff93b9ba92ccdf3eee0711b3aef168050ac4fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f4c1a874-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5499.613802841, 'message_signature': '2762fb005ed8d0cc80fe1ffb39aea6e7ba2c25b11975eb41e3f4474fea35ce1e'}]}, 'timestamp': '2025-09-30 21:46:43.971826', '_unique_id': '6280fd40a8144238a982289748000028'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.972 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.973 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.973 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aaed133e-6c89-467e-ba70-7da72998e067', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'instance-00000098-c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-tapd7d27cdd-9f', 'timestamp': '2025-09-30T21:46:43.973213', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'tapd7d27cdd-9f', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': '52e45b1d322680f820ff93b9ba92ccdf3eee0711b3aef168050ac4fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0d:ba:7b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd7d27cdd-9f'}, 'message_id': 'f4c1e938-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5499.632045836, 'message_signature': '4f9bf47550fb8d6e842c8a15842a12e31f167df70b17ce512d743642d14403a8'}]}, 'timestamp': '2025-09-30 21:46:43.973545', '_unique_id': 'e0df3d4d6df44209a5efeaefd9310674'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.974 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c9c2df63-905b-4df0-965e-657de872bb14', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'instance-00000098-c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-tapd7d27cdd-9f', 'timestamp': '2025-09-30T21:46:43.974901', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'tapd7d27cdd-9f', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': '52e45b1d322680f820ff93b9ba92ccdf3eee0711b3aef168050ac4fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0d:ba:7b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd7d27cdd-9f'}, 'message_id': 'f4c22bfa-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5499.632045836, 'message_signature': 'b921d566fa8023eebab492f007e334ee80bac835418d312cfac0e096fbf8b0de'}]}, 'timestamp': '2025-09-30 21:46:43.975225', '_unique_id': 'c8533b80b10841ed808a93fce5516aaf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.975 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.977 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.977 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.977 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-179264408>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-179264408>]
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.977 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.978 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/network.outgoing.packets volume: 102 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bfee452d-248a-42a3-8deb-1754455d64de', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 102, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'instance-00000098-c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-tapd7d27cdd-9f', 'timestamp': '2025-09-30T21:46:43.978000', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'tapd7d27cdd-9f', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': '52e45b1d322680f820ff93b9ba92ccdf3eee0711b3aef168050ac4fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0d:ba:7b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd7d27cdd-9f'}, 'message_id': 'f4c2a558-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5499.632045836, 'message_signature': 'a2011ab93fd64d7ee78d372ba853207bf7bf07a67d6da1b38e5ac817cdaf10a0'}]}, 'timestamp': '2025-09-30 21:46:43.978342', '_unique_id': 'ffaff20a458c4b659b9bbcc275b97dc9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.979 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.980 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.980 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk.device.read.latency volume: 647025016 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.980 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk.device.read.latency volume: 116775700 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '39b7693d-ab2f-47c3-9a82-3d8e3b54fec0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 647025016, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-vda', 'timestamp': '2025-09-30T21:46:43.980418', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'instance-00000098', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': '52e45b1d322680f820ff93b9ba92ccdf3eee0711b3aef168050ac4fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f4c30674-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5499.613802841, 'message_signature': '68b71edcd8fdfed9073cf823f1b6825608e0b071ff861c448e05f78b58440143'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 116775700, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-sda', 'timestamp': '2025-09-30T21:46:43.980418', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'instance-00000098', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': '52e45b1d322680f820ff93b9ba92ccdf3eee0711b3aef168050ac4fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f4c313a8-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5499.613802841, 'message_signature': '8665d953c6bb44af9bfc6ec98999a09d2ce90a3ffc12159c30ee2d84461a0b08'}]}, 'timestamp': '2025-09-30 21:46:43.981144', '_unique_id': 'b3777f27a97147beb5d5f8a8177560d4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.982 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.983 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.983 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk.device.write.requests volume: 286 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.984 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7dec39e0-bfe7-4a10-a0f8-bbacffa1761f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 286, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-vda', 'timestamp': '2025-09-30T21:46:43.983807', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'instance-00000098', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': '52e45b1d322680f820ff93b9ba92ccdf3eee0711b3aef168050ac4fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f4c38c2a-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5499.613802841, 'message_signature': 'f40d956302db5ca9b9fc7a7c682ce81719ea4d52f611cb3b01cccb54728afaf7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-sda', 'timestamp': '2025-09-30T21:46:43.983807', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'instance-00000098', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': '52e45b1d322680f820ff93b9ba92ccdf3eee0711b3aef168050ac4fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f4c39c24-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5499.613802841, 'message_signature': '7cfe10d4d6b1803fe3a2b839451b72f8ec29c7580a110033fc5e0f602ca6870c'}]}, 'timestamp': '2025-09-30 21:46:43.984650', '_unique_id': '32425b17bdfd49b68a02444d183d86a4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.985 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.986 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.986 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.987 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-179264408>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-179264408>]
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.987 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.987 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '53257dda-7f93-4341-a2de-ba7eab77f341', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'instance-00000098-c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-tapd7d27cdd-9f', 'timestamp': '2025-09-30T21:46:43.987436', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'tapd7d27cdd-9f', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': '52e45b1d322680f820ff93b9ba92ccdf3eee0711b3aef168050ac4fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0d:ba:7b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd7d27cdd-9f'}, 'message_id': 'f4c41870-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5499.632045836, 'message_signature': '182d95a26235795c89a11bf5f52baba318fa875de3c18031796b295bbebf73af'}]}, 'timestamp': '2025-09-30 21:46:43.987836', '_unique_id': '65797c8f7a954a4b93e702eb60090d09'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.988 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.989 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.989 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5407b9f4-7767-40b1-8975-fab741932527', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'instance-00000098-c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-tapd7d27cdd-9f', 'timestamp': '2025-09-30T21:46:43.989923', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'tapd7d27cdd-9f', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': '52e45b1d322680f820ff93b9ba92ccdf3eee0711b3aef168050ac4fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0d:ba:7b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd7d27cdd-9f'}, 'message_id': 'f4c47b26-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5499.632045836, 'message_signature': 'ef35b9d2afe2d4a0fc3bc1dab5c2b50bced997541ccfa1de5a67feb895fe26da'}]}, 'timestamp': '2025-09-30 21:46:43.990417', '_unique_id': '541276c6f79045769985fef13e332fd1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.991 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.992 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.992 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.992 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-179264408>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-179264408>]
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.992 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.992 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.993 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-179264408>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-179264408>]
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.993 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.993 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'efb06414-4e58-4c27-92f9-11206fd5e015', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'instance-00000098-c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-tapd7d27cdd-9f', 'timestamp': '2025-09-30T21:46:43.993263', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'tapd7d27cdd-9f', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': '52e45b1d322680f820ff93b9ba92ccdf3eee0711b3aef168050ac4fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0d:ba:7b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd7d27cdd-9f'}, 'message_id': 'f4c4f952-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5499.632045836, 'message_signature': 'bf0611d252e949eef705d0820b10ee8b1fcb5c608f0812924c9c9b5bcdce58c2'}]}, 'timestamp': '2025-09-30 21:46:43.993619', '_unique_id': 'f06f0128f086455fa6acad32c1b6f872'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.994 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.995 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.995 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk.device.read.bytes volume: 31025664 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.995 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9bf6a47a-5c9c-44ea-a5cb-38e806c811f1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31025664, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-vda', 'timestamp': '2025-09-30T21:46:43.995167', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'instance-00000098', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': '52e45b1d322680f820ff93b9ba92ccdf3eee0711b3aef168050ac4fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f4c54452-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5499.613802841, 'message_signature': '3259f3aadad1deaac1659af4023200ddcf1afd7a6d370e9bb16432a7aff81c0d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-sda', 'timestamp': '2025-09-30T21:46:43.995167', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'instance-00000098', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': '52e45b1d322680f820ff93b9ba92ccdf3eee0711b3aef168050ac4fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f4c54fd8-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5499.613802841, 'message_signature': '40efe357f6871939b0dc2d5d61a834f48f3143ecce45ea134011d7fe0012cc01'}]}, 'timestamp': '2025-09-30 21:46:43.995780', '_unique_id': '130a64ab05094f03b076505eaee7bc51'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.996 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.997 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.997 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/memory.usage volume: 42.78515625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9c7d2063-f1b4-4c3b-af7f-3cd7fa5d6c0a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.78515625, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'timestamp': '2025-09-30T21:46:43.997262', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'instance-00000098', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': '52e45b1d322680f820ff93b9ba92ccdf3eee0711b3aef168050ac4fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'f4c5952e-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5499.652669759, 'message_signature': 'ace0ae5e76db2ac0224bef61941e2c8ebf0b0c8388d4642f6766221f794c0839'}]}, 'timestamp': '2025-09-30 21:46:43.997581', '_unique_id': '2229474bc25d45d3901509ba2cdcb94b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:46:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.998 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.999 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:43.999 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4621e13b-355d-4f00-bb16-d564c6526a52', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-vda', 'timestamp': '2025-09-30T21:46:43.999009', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'instance-00000098', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': '52e45b1d322680f820ff93b9ba92ccdf3eee0711b3aef168050ac4fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f4c5d958-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5499.599226926, 'message_signature': '809d0f5f47aeee3d5576ca7690368b3b1205fc53dbdd4d03e34dff64a3745601'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 
'resource_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-sda', 'timestamp': '2025-09-30T21:46:43.999009', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'instance-00000098', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': '52e45b1d322680f820ff93b9ba92ccdf3eee0711b3aef168050ac4fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f4c5e484-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5499.599226926, 'message_signature': '3a5de133169122012816b792c880b756e39c6ff9044070f9b9c356536cefaff0'}]}, 'timestamp': '2025-09-30 21:46:43.999609', '_unique_id': 'e80c585487494f4694f9346e8422b31b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.000 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.001 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.001 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0f0e71e1-1630-488b-a0fe-359a364e4c0f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-vda', 'timestamp': '2025-09-30T21:46:44.001052', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'instance-00000098', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': '52e45b1d322680f820ff93b9ba92ccdf3eee0711b3aef168050ac4fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f4c6291c-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5499.599226926, 'message_signature': 'e7f734d720ef0f675ced3ddd530b98628adb580ad71d5b427edb38bff83c83c6'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 
'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-sda', 'timestamp': '2025-09-30T21:46:44.001052', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'instance-00000098', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': '52e45b1d322680f820ff93b9ba92ccdf3eee0711b3aef168050ac4fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f4c63466-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5499.599226926, 'message_signature': '92fc6753b8f311682e3770066072a2bf13a71659c0b14f2c0f1c2e9270ee28e7'}]}, 'timestamp': '2025-09-30 21:46:44.001647', '_unique_id': 'c77fba727f3046788d8306da4d8966ea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.002 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 DEBUG ceilometer.compute.pollsters [-] c1ffcb24-6b20-4ed8-8287-65fb4e98d88c/network.outgoing.bytes volume: 7937 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2d49499d-0eea-4dd5-a2d6-a865786d5ce6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7937, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_name': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_name': None, 'resource_id': 'instance-00000098-c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-tapd7d27cdd-9f', 'timestamp': '2025-09-30T21:46:44.003073', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-179264408', 'name': 'tapd7d27cdd-9f', 'instance_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'instance_type': 'm1.nano', 'host': '52e45b1d322680f820ff93b9ba92ccdf3eee0711b3aef168050ac4fc', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0d:ba:7b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd7d27cdd-9f'}, 'message_id': 'f4c67840-9e46-11f0-a153-fa163e09b122', 'monotonic_time': 5499.632045836, 'message_signature': 'c652498c722aad0675c60585b30f92b3e81beab2735c372981aef02aa08cf6db'}]}, 'timestamp': '2025-09-30 21:46:44.003406', '_unique_id': '550c1e1be5224fe0875f14844f8dc6c1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:46:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:46:44.003 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:46:44 compute-0 nova_compute[192810]: 2025-09-30 21:46:44.463 2 DEBUG nova.virt.libvirt.migration [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Sep 30 21:46:44 compute-0 nova_compute[192810]: 2025-09-30 21:46:44.464 2 DEBUG nova.virt.libvirt.migration [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Sep 30 21:46:44 compute-0 nova_compute[192810]: 2025-09-30 21:46:44.616 2 DEBUG nova.compute.manager [req-2be156e0-6850-416b-ae5d-eae57ac6d4e5 req-08b4674c-e120-4a95-a1a0-4658a283bb8b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Received event network-vif-plugged-d7d27cdd-9f0b-467f-901a-08b4834d1496 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:46:44 compute-0 nova_compute[192810]: 2025-09-30 21:46:44.616 2 DEBUG oslo_concurrency.lockutils [req-2be156e0-6850-416b-ae5d-eae57ac6d4e5 req-08b4674c-e120-4a95-a1a0-4658a283bb8b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:46:44 compute-0 nova_compute[192810]: 2025-09-30 21:46:44.616 2 DEBUG oslo_concurrency.lockutils [req-2be156e0-6850-416b-ae5d-eae57ac6d4e5 req-08b4674c-e120-4a95-a1a0-4658a283bb8b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:46:44 compute-0 nova_compute[192810]: 2025-09-30 21:46:44.616 2 DEBUG oslo_concurrency.lockutils [req-2be156e0-6850-416b-ae5d-eae57ac6d4e5 req-08b4674c-e120-4a95-a1a0-4658a283bb8b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:46:44 compute-0 nova_compute[192810]: 2025-09-30 21:46:44.617 2 DEBUG nova.compute.manager [req-2be156e0-6850-416b-ae5d-eae57ac6d4e5 req-08b4674c-e120-4a95-a1a0-4658a283bb8b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] No waiting events found dispatching network-vif-plugged-d7d27cdd-9f0b-467f-901a-08b4834d1496 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:46:44 compute-0 nova_compute[192810]: 2025-09-30 21:46:44.617 2 WARNING nova.compute.manager [req-2be156e0-6850-416b-ae5d-eae57ac6d4e5 req-08b4674c-e120-4a95-a1a0-4658a283bb8b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Received unexpected event network-vif-plugged-d7d27cdd-9f0b-467f-901a-08b4834d1496 for instance with vm_state active and task_state migrating.
Sep 30 21:46:44 compute-0 nova_compute[192810]: 2025-09-30 21:46:44.617 2 DEBUG nova.compute.manager [req-2be156e0-6850-416b-ae5d-eae57ac6d4e5 req-08b4674c-e120-4a95-a1a0-4658a283bb8b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Received event network-changed-d7d27cdd-9f0b-467f-901a-08b4834d1496 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:46:44 compute-0 nova_compute[192810]: 2025-09-30 21:46:44.617 2 DEBUG nova.compute.manager [req-2be156e0-6850-416b-ae5d-eae57ac6d4e5 req-08b4674c-e120-4a95-a1a0-4658a283bb8b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Refreshing instance network info cache due to event network-changed-d7d27cdd-9f0b-467f-901a-08b4834d1496. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:46:44 compute-0 nova_compute[192810]: 2025-09-30 21:46:44.618 2 DEBUG oslo_concurrency.lockutils [req-2be156e0-6850-416b-ae5d-eae57ac6d4e5 req-08b4674c-e120-4a95-a1a0-4658a283bb8b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-c1ffcb24-6b20-4ed8-8287-65fb4e98d88c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:46:44 compute-0 nova_compute[192810]: 2025-09-30 21:46:44.618 2 DEBUG oslo_concurrency.lockutils [req-2be156e0-6850-416b-ae5d-eae57ac6d4e5 req-08b4674c-e120-4a95-a1a0-4658a283bb8b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-c1ffcb24-6b20-4ed8-8287-65fb4e98d88c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:46:44 compute-0 nova_compute[192810]: 2025-09-30 21:46:44.618 2 DEBUG nova.network.neutron [req-2be156e0-6850-416b-ae5d-eae57ac6d4e5 req-08b4674c-e120-4a95-a1a0-4658a283bb8b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Refreshing network info cache for port d7d27cdd-9f0b-467f-901a-08b4834d1496 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:46:44 compute-0 nova_compute[192810]: 2025-09-30 21:46:44.966 2 DEBUG nova.virt.libvirt.migration [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Sep 30 21:46:44 compute-0 nova_compute[192810]: 2025-09-30 21:46:44.967 2 DEBUG nova.virt.libvirt.migration [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Sep 30 21:46:45 compute-0 podman[244469]: 2025-09-30 21:46:45.343420734 +0000 UTC m=+0.068675526 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Sep 30 21:46:45 compute-0 podman[244470]: 2025-09-30 21:46:45.343772133 +0000 UTC m=+0.065480238 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=edpm, 
container_name=ceilometer_agent_compute, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:46:45 compute-0 podman[244468]: 2025-09-30 21:46:45.368080416 +0000 UTC m=+0.094624209 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Sep 30 21:46:45 compute-0 nova_compute[192810]: 2025-09-30 21:46:45.471 2 DEBUG nova.virt.libvirt.migration [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Sep 30 21:46:45 compute-0 nova_compute[192810]: 2025-09-30 21:46:45.471 2 DEBUG nova.virt.libvirt.migration [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Sep 30 21:46:45 compute-0 nova_compute[192810]: 2025-09-30 21:46:45.595 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268805.594374, c1ffcb24-6b20-4ed8-8287-65fb4e98d88c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:46:45 compute-0 nova_compute[192810]: 2025-09-30 21:46:45.595 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] VM Paused (Lifecycle Event)
Sep 30 21:46:45 compute-0 nova_compute[192810]: 2025-09-30 21:46:45.620 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:46:45 compute-0 nova_compute[192810]: 2025-09-30 21:46:45.625 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:46:45 compute-0 nova_compute[192810]: 2025-09-30 21:46:45.668 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] During sync_power_state the instance has a pending task (migrating). Skip.
Sep 30 21:46:45 compute-0 kernel: tapd7d27cdd-9f (unregistering): left promiscuous mode
Sep 30 21:46:45 compute-0 NetworkManager[51733]: <info>  [1759268805.7336] device (tapd7d27cdd-9f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:46:45 compute-0 ovn_controller[94912]: 2025-09-30T21:46:45Z|00591|binding|INFO|Releasing lport d7d27cdd-9f0b-467f-901a-08b4834d1496 from this chassis (sb_readonly=0)
Sep 30 21:46:45 compute-0 ovn_controller[94912]: 2025-09-30T21:46:45Z|00592|binding|INFO|Setting lport d7d27cdd-9f0b-467f-901a-08b4834d1496 down in Southbound
Sep 30 21:46:45 compute-0 nova_compute[192810]: 2025-09-30 21:46:45.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:45 compute-0 ovn_controller[94912]: 2025-09-30T21:46:45Z|00593|binding|INFO|Removing iface tapd7d27cdd-9f ovn-installed in OVS
Sep 30 21:46:45 compute-0 nova_compute[192810]: 2025-09-30 21:46:45.748 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:45.755 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0d:ba:7b 10.100.0.4'], port_security=['fa:16:3e:0d:ba:7b 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '78438f8f-1ac2-4393-90b7-0b62e0665947'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5bda010c-f47e-4b74-9f9e-0682da4daba2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'cf7d6f50-5bb5-4e27-9140-45f441e5642f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.221'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5da0e449-d17e-4f7d-8049-f4559cb24f52, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=d7d27cdd-9f0b-467f-901a-08b4834d1496) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:46:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:45.757 103867 INFO neutron.agent.ovn.metadata.agent [-] Port d7d27cdd-9f0b-467f-901a-08b4834d1496 in datapath 5bda010c-f47e-4b74-9f9e-0682da4daba2 unbound from our chassis
Sep 30 21:46:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:45.760 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5bda010c-f47e-4b74-9f9e-0682da4daba2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:46:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:45.763 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[d0cfb405-c787-4157-a982-e754bcdeb2a6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:45.765 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5bda010c-f47e-4b74-9f9e-0682da4daba2 namespace which is not needed anymore
Sep 30 21:46:45 compute-0 nova_compute[192810]: 2025-09-30 21:46:45.772 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:45 compute-0 systemd[1]: machine-qemu\x2d74\x2dinstance\x2d00000098.scope: Deactivated successfully.
Sep 30 21:46:45 compute-0 systemd[1]: machine-qemu\x2d74\x2dinstance\x2d00000098.scope: Consumed 14.290s CPU time.
Sep 30 21:46:45 compute-0 systemd-machined[152794]: Machine qemu-74-instance-00000098 terminated.
Sep 30 21:46:45 compute-0 neutron-haproxy-ovnmeta-5bda010c-f47e-4b74-9f9e-0682da4daba2[244229]: [NOTICE]   (244233) : haproxy version is 2.8.14-c23fe91
Sep 30 21:46:45 compute-0 neutron-haproxy-ovnmeta-5bda010c-f47e-4b74-9f9e-0682da4daba2[244229]: [NOTICE]   (244233) : path to executable is /usr/sbin/haproxy
Sep 30 21:46:45 compute-0 neutron-haproxy-ovnmeta-5bda010c-f47e-4b74-9f9e-0682da4daba2[244229]: [WARNING]  (244233) : Exiting Master process...
Sep 30 21:46:45 compute-0 neutron-haproxy-ovnmeta-5bda010c-f47e-4b74-9f9e-0682da4daba2[244229]: [ALERT]    (244233) : Current worker (244235) exited with code 143 (Terminated)
Sep 30 21:46:45 compute-0 neutron-haproxy-ovnmeta-5bda010c-f47e-4b74-9f9e-0682da4daba2[244229]: [WARNING]  (244233) : All workers exited. Exiting... (0)
Sep 30 21:46:45 compute-0 systemd[1]: libpod-1fbd81f65cf2dd4cc60999a8bbc1ad03a6dacfbd9e513e3ce38bcfc62cc54000.scope: Deactivated successfully.
Sep 30 21:46:45 compute-0 podman[244550]: 2025-09-30 21:46:45.961634254 +0000 UTC m=+0.097770646 container died 1fbd81f65cf2dd4cc60999a8bbc1ad03a6dacfbd9e513e3ce38bcfc62cc54000 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5bda010c-f47e-4b74-9f9e-0682da4daba2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:46:45 compute-0 nova_compute[192810]: 2025-09-30 21:46:45.973 2 DEBUG nova.virt.libvirt.driver [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Sep 30 21:46:45 compute-0 nova_compute[192810]: 2025-09-30 21:46:45.974 2 DEBUG nova.virt.libvirt.driver [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Sep 30 21:46:45 compute-0 nova_compute[192810]: 2025-09-30 21:46:45.974 2 DEBUG nova.virt.libvirt.driver [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Sep 30 21:46:45 compute-0 nova_compute[192810]: 2025-09-30 21:46:45.974 2 DEBUG nova.virt.libvirt.guest [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid 'c1ffcb24-6b20-4ed8-8287-65fb4e98d88c' (instance-00000098) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Sep 30 21:46:45 compute-0 nova_compute[192810]: 2025-09-30 21:46:45.974 2 INFO nova.virt.libvirt.driver [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Migration operation has completed
Sep 30 21:46:45 compute-0 nova_compute[192810]: 2025-09-30 21:46:45.975 2 INFO nova.compute.manager [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] _post_live_migration() is started..
Sep 30 21:46:46 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1fbd81f65cf2dd4cc60999a8bbc1ad03a6dacfbd9e513e3ce38bcfc62cc54000-userdata-shm.mount: Deactivated successfully.
Sep 30 21:46:46 compute-0 systemd[1]: var-lib-containers-storage-overlay-4445f6268d02deab68fbeb61bdc54140966c12eb40d3a7960a9c7debef465485-merged.mount: Deactivated successfully.
Sep 30 21:46:46 compute-0 podman[244550]: 2025-09-30 21:46:46.301188866 +0000 UTC m=+0.437325228 container cleanup 1fbd81f65cf2dd4cc60999a8bbc1ad03a6dacfbd9e513e3ce38bcfc62cc54000 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5bda010c-f47e-4b74-9f9e-0682da4daba2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:46:46 compute-0 systemd[1]: libpod-conmon-1fbd81f65cf2dd4cc60999a8bbc1ad03a6dacfbd9e513e3ce38bcfc62cc54000.scope: Deactivated successfully.
Sep 30 21:46:46 compute-0 podman[244598]: 2025-09-30 21:46:46.377851506 +0000 UTC m=+0.044856045 container remove 1fbd81f65cf2dd4cc60999a8bbc1ad03a6dacfbd9e513e3ce38bcfc62cc54000 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5bda010c-f47e-4b74-9f9e-0682da4daba2, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Sep 30 21:46:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:46.383 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[c8d1c9ee-d868-4370-a4a4-98465443b316]: (4, ('Tue Sep 30 09:46:45 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-5bda010c-f47e-4b74-9f9e-0682da4daba2 (1fbd81f65cf2dd4cc60999a8bbc1ad03a6dacfbd9e513e3ce38bcfc62cc54000)\n1fbd81f65cf2dd4cc60999a8bbc1ad03a6dacfbd9e513e3ce38bcfc62cc54000\nTue Sep 30 09:46:46 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-5bda010c-f47e-4b74-9f9e-0682da4daba2 (1fbd81f65cf2dd4cc60999a8bbc1ad03a6dacfbd9e513e3ce38bcfc62cc54000)\n1fbd81f65cf2dd4cc60999a8bbc1ad03a6dacfbd9e513e3ce38bcfc62cc54000\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:46.384 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[cf6a2347-0790-42f1-a2aa-efb98885be52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:46.385 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5bda010c-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:46:46 compute-0 kernel: tap5bda010c-f0: left promiscuous mode
Sep 30 21:46:46 compute-0 nova_compute[192810]: 2025-09-30 21:46:46.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:46 compute-0 nova_compute[192810]: 2025-09-30 21:46:46.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:46.405 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[8955e58a-8a42-4c24-aa23-b185ae9bcdd4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:46.444 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[b2f9d1f2-e507-41e0-a085-e0ad40c20a75]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:46.445 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[60076a6e-ff7b-4736-84ea-feda1872865b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:46 compute-0 nova_compute[192810]: 2025-09-30 21:46:46.454 2 DEBUG nova.compute.manager [req-2b7ee77c-d842-445b-91bc-ad149332cb49 req-0c969d93-32d3-4f1f-a4fd-dedd96e13c15 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Received event network-vif-unplugged-d7d27cdd-9f0b-467f-901a-08b4834d1496 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:46:46 compute-0 nova_compute[192810]: 2025-09-30 21:46:46.454 2 DEBUG oslo_concurrency.lockutils [req-2b7ee77c-d842-445b-91bc-ad149332cb49 req-0c969d93-32d3-4f1f-a4fd-dedd96e13c15 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:46:46 compute-0 nova_compute[192810]: 2025-09-30 21:46:46.455 2 DEBUG oslo_concurrency.lockutils [req-2b7ee77c-d842-445b-91bc-ad149332cb49 req-0c969d93-32d3-4f1f-a4fd-dedd96e13c15 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:46:46 compute-0 nova_compute[192810]: 2025-09-30 21:46:46.455 2 DEBUG oslo_concurrency.lockutils [req-2b7ee77c-d842-445b-91bc-ad149332cb49 req-0c969d93-32d3-4f1f-a4fd-dedd96e13c15 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:46:46 compute-0 nova_compute[192810]: 2025-09-30 21:46:46.455 2 DEBUG nova.compute.manager [req-2b7ee77c-d842-445b-91bc-ad149332cb49 req-0c969d93-32d3-4f1f-a4fd-dedd96e13c15 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] No waiting events found dispatching network-vif-unplugged-d7d27cdd-9f0b-467f-901a-08b4834d1496 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:46:46 compute-0 nova_compute[192810]: 2025-09-30 21:46:46.455 2 DEBUG nova.compute.manager [req-2b7ee77c-d842-445b-91bc-ad149332cb49 req-0c969d93-32d3-4f1f-a4fd-dedd96e13c15 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Received event network-vif-unplugged-d7d27cdd-9f0b-467f-901a-08b4834d1496 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:46:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:46.460 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[41f6f5df-ec0f-44b6-af2a-b1c450f78401]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 546544, 'reachable_time': 28105, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244615, 'error': None, 'target': 'ovnmeta-5bda010c-f47e-4b74-9f9e-0682da4daba2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:46 compute-0 systemd[1]: run-netns-ovnmeta\x2d5bda010c\x2df47e\x2d4b74\x2d9f9e\x2d0682da4daba2.mount: Deactivated successfully.
Sep 30 21:46:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:46.462 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5bda010c-f47e-4b74-9f9e-0682da4daba2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:46:46 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:46.462 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[3133cf16-1121-49a0-9beb-c4f3ffc21afb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:46:46 compute-0 nova_compute[192810]: 2025-09-30 21:46:46.869 2 DEBUG nova.network.neutron [req-2be156e0-6850-416b-ae5d-eae57ac6d4e5 req-08b4674c-e120-4a95-a1a0-4658a283bb8b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Updated VIF entry in instance network info cache for port d7d27cdd-9f0b-467f-901a-08b4834d1496. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:46:46 compute-0 nova_compute[192810]: 2025-09-30 21:46:46.870 2 DEBUG nova.network.neutron [req-2be156e0-6850-416b-ae5d-eae57ac6d4e5 req-08b4674c-e120-4a95-a1a0-4658a283bb8b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Updating instance_info_cache with network_info: [{"id": "d7d27cdd-9f0b-467f-901a-08b4834d1496", "address": "fa:16:3e:0d:ba:7b", "network": {"id": "5bda010c-f47e-4b74-9f9e-0682da4daba2", "bridge": "br-int", "label": "tempest-network-smoke--746378667", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7d27cdd-9f", "ovs_interfaceid": "d7d27cdd-9f0b-467f-901a-08b4834d1496", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:46:46 compute-0 nova_compute[192810]: 2025-09-30 21:46:46.888 2 DEBUG oslo_concurrency.lockutils [req-2be156e0-6850-416b-ae5d-eae57ac6d4e5 req-08b4674c-e120-4a95-a1a0-4658a283bb8b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-c1ffcb24-6b20-4ed8-8287-65fb4e98d88c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:46:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:47.741 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=36, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=35) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:46:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:47.742 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:46:47 compute-0 nova_compute[192810]: 2025-09-30 21:46:47.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:47 compute-0 nova_compute[192810]: 2025-09-30 21:46:47.836 2 DEBUG nova.network.neutron [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Activated binding for port d7d27cdd-9f0b-467f-901a-08b4834d1496 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Sep 30 21:46:47 compute-0 nova_compute[192810]: 2025-09-30 21:46:47.836 2 DEBUG nova.compute.manager [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "d7d27cdd-9f0b-467f-901a-08b4834d1496", "address": "fa:16:3e:0d:ba:7b", "network": {"id": "5bda010c-f47e-4b74-9f9e-0682da4daba2", "bridge": "br-int", "label": "tempest-network-smoke--746378667", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7d27cdd-9f", "ovs_interfaceid": "d7d27cdd-9f0b-467f-901a-08b4834d1496", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Sep 30 21:46:47 compute-0 nova_compute[192810]: 2025-09-30 21:46:47.837 2 DEBUG nova.virt.libvirt.vif [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:46:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-179264408',display_name='tempest-TestNetworkAdvancedServerOps-server-179264408',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-179264408',id=152,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOjZFWrOkGLtM22QkbWpGIMaBp+BaIFIIqX3JnKMPx79D7bs+sVkeOWPP9lZzJjeCe2zeve1vj7ZAmUVopE269mNakJeQ6oTVT41jDINH3f5N8CxQSQQns3MXA4tC9JjTQ==',key_name='tempest-TestNetworkAdvancedServerOps-144739839',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:46:10Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='075b1efc4c8e4cb1b28d61b042c451e9',ramdisk_id='',reservation_id='r-xa5oqg6r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-374190229',owner_user_name='tempest-TestNetworkAdvancedServerOps-374190229-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:46:32Z,user_data=None,user_id='185cc8ad7e1445d2ab5006153ab19700',uuid=c1ffcb24-6b20-4ed8-8287-65fb4e98d88c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d7d27cdd-9f0b-467f-901a-08b4834d1496", "address": "fa:16:3e:0d:ba:7b", "network": {"id": "5bda010c-f47e-4b74-9f9e-0682da4daba2", "bridge": "br-int", "label": "tempest-network-smoke--746378667", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7d27cdd-9f", "ovs_interfaceid": "d7d27cdd-9f0b-467f-901a-08b4834d1496", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:46:47 compute-0 nova_compute[192810]: 2025-09-30 21:46:47.837 2 DEBUG nova.network.os_vif_util [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Converting VIF {"id": "d7d27cdd-9f0b-467f-901a-08b4834d1496", "address": "fa:16:3e:0d:ba:7b", "network": {"id": "5bda010c-f47e-4b74-9f9e-0682da4daba2", "bridge": "br-int", "label": "tempest-network-smoke--746378667", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.221", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7d27cdd-9f", "ovs_interfaceid": "d7d27cdd-9f0b-467f-901a-08b4834d1496", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:46:47 compute-0 nova_compute[192810]: 2025-09-30 21:46:47.838 2 DEBUG nova.network.os_vif_util [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0d:ba:7b,bridge_name='br-int',has_traffic_filtering=True,id=d7d27cdd-9f0b-467f-901a-08b4834d1496,network=Network(5bda010c-f47e-4b74-9f9e-0682da4daba2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7d27cdd-9f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:46:47 compute-0 nova_compute[192810]: 2025-09-30 21:46:47.838 2 DEBUG os_vif [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0d:ba:7b,bridge_name='br-int',has_traffic_filtering=True,id=d7d27cdd-9f0b-467f-901a-08b4834d1496,network=Network(5bda010c-f47e-4b74-9f9e-0682da4daba2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7d27cdd-9f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:46:47 compute-0 nova_compute[192810]: 2025-09-30 21:46:47.839 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:47 compute-0 nova_compute[192810]: 2025-09-30 21:46:47.840 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7d27cdd-9f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:46:47 compute-0 nova_compute[192810]: 2025-09-30 21:46:47.841 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:47 compute-0 nova_compute[192810]: 2025-09-30 21:46:47.844 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:46:47 compute-0 nova_compute[192810]: 2025-09-30 21:46:47.846 2 INFO os_vif [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0d:ba:7b,bridge_name='br-int',has_traffic_filtering=True,id=d7d27cdd-9f0b-467f-901a-08b4834d1496,network=Network(5bda010c-f47e-4b74-9f9e-0682da4daba2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7d27cdd-9f')
Sep 30 21:46:47 compute-0 nova_compute[192810]: 2025-09-30 21:46:47.847 2 DEBUG oslo_concurrency.lockutils [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:46:47 compute-0 nova_compute[192810]: 2025-09-30 21:46:47.847 2 DEBUG oslo_concurrency.lockutils [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:46:47 compute-0 nova_compute[192810]: 2025-09-30 21:46:47.847 2 DEBUG oslo_concurrency.lockutils [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:46:47 compute-0 nova_compute[192810]: 2025-09-30 21:46:47.847 2 DEBUG nova.compute.manager [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Sep 30 21:46:47 compute-0 nova_compute[192810]: 2025-09-30 21:46:47.848 2 INFO nova.virt.libvirt.driver [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Deleting instance files /var/lib/nova/instances/c1ffcb24-6b20-4ed8-8287-65fb4e98d88c_del
Sep 30 21:46:47 compute-0 nova_compute[192810]: 2025-09-30 21:46:47.848 2 INFO nova.virt.libvirt.driver [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Deletion of /var/lib/nova/instances/c1ffcb24-6b20-4ed8-8287-65fb4e98d88c_del complete
Sep 30 21:46:47 compute-0 nova_compute[192810]: 2025-09-30 21:46:47.973 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:48 compute-0 nova_compute[192810]: 2025-09-30 21:46:48.578 2 DEBUG nova.compute.manager [req-e77f512f-0052-4e96-9266-8e9429119638 req-76678d9d-4026-41b3-aa06-1995e2b1bdfa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Received event network-vif-plugged-d7d27cdd-9f0b-467f-901a-08b4834d1496 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:46:48 compute-0 nova_compute[192810]: 2025-09-30 21:46:48.579 2 DEBUG oslo_concurrency.lockutils [req-e77f512f-0052-4e96-9266-8e9429119638 req-76678d9d-4026-41b3-aa06-1995e2b1bdfa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:46:48 compute-0 nova_compute[192810]: 2025-09-30 21:46:48.579 2 DEBUG oslo_concurrency.lockutils [req-e77f512f-0052-4e96-9266-8e9429119638 req-76678d9d-4026-41b3-aa06-1995e2b1bdfa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:46:48 compute-0 nova_compute[192810]: 2025-09-30 21:46:48.580 2 DEBUG oslo_concurrency.lockutils [req-e77f512f-0052-4e96-9266-8e9429119638 req-76678d9d-4026-41b3-aa06-1995e2b1bdfa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:46:48 compute-0 nova_compute[192810]: 2025-09-30 21:46:48.580 2 DEBUG nova.compute.manager [req-e77f512f-0052-4e96-9266-8e9429119638 req-76678d9d-4026-41b3-aa06-1995e2b1bdfa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] No waiting events found dispatching network-vif-plugged-d7d27cdd-9f0b-467f-901a-08b4834d1496 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:46:48 compute-0 nova_compute[192810]: 2025-09-30 21:46:48.580 2 WARNING nova.compute.manager [req-e77f512f-0052-4e96-9266-8e9429119638 req-76678d9d-4026-41b3-aa06-1995e2b1bdfa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Received unexpected event network-vif-plugged-d7d27cdd-9f0b-467f-901a-08b4834d1496 for instance with vm_state active and task_state migrating.
Sep 30 21:46:48 compute-0 nova_compute[192810]: 2025-09-30 21:46:48.581 2 DEBUG nova.compute.manager [req-e77f512f-0052-4e96-9266-8e9429119638 req-76678d9d-4026-41b3-aa06-1995e2b1bdfa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Received event network-vif-plugged-d7d27cdd-9f0b-467f-901a-08b4834d1496 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:46:48 compute-0 nova_compute[192810]: 2025-09-30 21:46:48.581 2 DEBUG oslo_concurrency.lockutils [req-e77f512f-0052-4e96-9266-8e9429119638 req-76678d9d-4026-41b3-aa06-1995e2b1bdfa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:46:48 compute-0 nova_compute[192810]: 2025-09-30 21:46:48.581 2 DEBUG oslo_concurrency.lockutils [req-e77f512f-0052-4e96-9266-8e9429119638 req-76678d9d-4026-41b3-aa06-1995e2b1bdfa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:46:48 compute-0 nova_compute[192810]: 2025-09-30 21:46:48.582 2 DEBUG oslo_concurrency.lockutils [req-e77f512f-0052-4e96-9266-8e9429119638 req-76678d9d-4026-41b3-aa06-1995e2b1bdfa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:46:48 compute-0 nova_compute[192810]: 2025-09-30 21:46:48.582 2 DEBUG nova.compute.manager [req-e77f512f-0052-4e96-9266-8e9429119638 req-76678d9d-4026-41b3-aa06-1995e2b1bdfa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] No waiting events found dispatching network-vif-plugged-d7d27cdd-9f0b-467f-901a-08b4834d1496 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:46:48 compute-0 nova_compute[192810]: 2025-09-30 21:46:48.583 2 WARNING nova.compute.manager [req-e77f512f-0052-4e96-9266-8e9429119638 req-76678d9d-4026-41b3-aa06-1995e2b1bdfa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Received unexpected event network-vif-plugged-d7d27cdd-9f0b-467f-901a-08b4834d1496 for instance with vm_state active and task_state migrating.
Sep 30 21:46:49 compute-0 nova_compute[192810]: 2025-09-30 21:46:49.739 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:50 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Sep 30 21:46:50 compute-0 systemd[244446]: Activating special unit Exit the Session...
Sep 30 21:46:50 compute-0 systemd[244446]: Stopped target Main User Target.
Sep 30 21:46:50 compute-0 systemd[244446]: Stopped target Basic System.
Sep 30 21:46:50 compute-0 systemd[244446]: Stopped target Paths.
Sep 30 21:46:50 compute-0 systemd[244446]: Stopped target Sockets.
Sep 30 21:46:50 compute-0 systemd[244446]: Stopped target Timers.
Sep 30 21:46:50 compute-0 systemd[244446]: Stopped Mark boot as successful after the user session has run 2 minutes.
Sep 30 21:46:50 compute-0 systemd[244446]: Stopped Daily Cleanup of User's Temporary Directories.
Sep 30 21:46:50 compute-0 systemd[244446]: Closed D-Bus User Message Bus Socket.
Sep 30 21:46:50 compute-0 systemd[244446]: Stopped Create User's Volatile Files and Directories.
Sep 30 21:46:50 compute-0 systemd[244446]: Removed slice User Application Slice.
Sep 30 21:46:50 compute-0 systemd[244446]: Reached target Shutdown.
Sep 30 21:46:50 compute-0 systemd[244446]: Finished Exit the Session.
Sep 30 21:46:50 compute-0 systemd[244446]: Reached target Exit the Session.
Sep 30 21:46:50 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Sep 30 21:46:50 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Sep 30 21:46:50 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Sep 30 21:46:50 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Sep 30 21:46:50 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Sep 30 21:46:50 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Sep 30 21:46:50 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Sep 30 21:46:50 compute-0 nova_compute[192810]: 2025-09-30 21:46:50.700 2 DEBUG nova.compute.manager [req-c2803538-3de4-47a8-9238-09eab037ffb4 req-da6fe8d1-2703-4cdc-9c25-85b387971205 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Received event network-vif-plugged-d7d27cdd-9f0b-467f-901a-08b4834d1496 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:46:50 compute-0 nova_compute[192810]: 2025-09-30 21:46:50.701 2 DEBUG oslo_concurrency.lockutils [req-c2803538-3de4-47a8-9238-09eab037ffb4 req-da6fe8d1-2703-4cdc-9c25-85b387971205 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:46:50 compute-0 nova_compute[192810]: 2025-09-30 21:46:50.701 2 DEBUG oslo_concurrency.lockutils [req-c2803538-3de4-47a8-9238-09eab037ffb4 req-da6fe8d1-2703-4cdc-9c25-85b387971205 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:46:50 compute-0 nova_compute[192810]: 2025-09-30 21:46:50.702 2 DEBUG oslo_concurrency.lockutils [req-c2803538-3de4-47a8-9238-09eab037ffb4 req-da6fe8d1-2703-4cdc-9c25-85b387971205 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:46:50 compute-0 nova_compute[192810]: 2025-09-30 21:46:50.702 2 DEBUG nova.compute.manager [req-c2803538-3de4-47a8-9238-09eab037ffb4 req-da6fe8d1-2703-4cdc-9c25-85b387971205 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] No waiting events found dispatching network-vif-plugged-d7d27cdd-9f0b-467f-901a-08b4834d1496 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:46:50 compute-0 nova_compute[192810]: 2025-09-30 21:46:50.702 2 WARNING nova.compute.manager [req-c2803538-3de4-47a8-9238-09eab037ffb4 req-da6fe8d1-2703-4cdc-9c25-85b387971205 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Received unexpected event network-vif-plugged-d7d27cdd-9f0b-467f-901a-08b4834d1496 for instance with vm_state active and task_state migrating.
Sep 30 21:46:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:46:52.743 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '36'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:46:52 compute-0 nova_compute[192810]: 2025-09-30 21:46:52.843 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:52 compute-0 nova_compute[192810]: 2025-09-30 21:46:52.976 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:53 compute-0 podman[244618]: 2025-09-30 21:46:53.325036227 +0000 UTC m=+0.063142841 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, release=1755695350, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, vendor=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_id=edpm, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible)
Sep 30 21:46:53 compute-0 podman[244617]: 2025-09-30 21:46:53.325124069 +0000 UTC m=+0.057968385 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Sep 30 21:46:53 compute-0 nova_compute[192810]: 2025-09-30 21:46:53.920 2 DEBUG oslo_concurrency.lockutils [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Acquiring lock "c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:46:53 compute-0 nova_compute[192810]: 2025-09-30 21:46:53.920 2 DEBUG oslo_concurrency.lockutils [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Lock "c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:46:53 compute-0 nova_compute[192810]: 2025-09-30 21:46:53.920 2 DEBUG oslo_concurrency.lockutils [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Lock "c1ffcb24-6b20-4ed8-8287-65fb4e98d88c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:46:53 compute-0 nova_compute[192810]: 2025-09-30 21:46:53.964 2 DEBUG oslo_concurrency.lockutils [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:46:53 compute-0 nova_compute[192810]: 2025-09-30 21:46:53.965 2 DEBUG oslo_concurrency.lockutils [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:46:53 compute-0 nova_compute[192810]: 2025-09-30 21:46:53.965 2 DEBUG oslo_concurrency.lockutils [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:46:53 compute-0 nova_compute[192810]: 2025-09-30 21:46:53.965 2 DEBUG nova.compute.resource_tracker [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:46:54 compute-0 nova_compute[192810]: 2025-09-30 21:46:54.120 2 WARNING nova.virt.libvirt.driver [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:46:54 compute-0 nova_compute[192810]: 2025-09-30 21:46:54.121 2 DEBUG nova.compute.resource_tracker [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5683MB free_disk=73.22930145263672GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:46:54 compute-0 nova_compute[192810]: 2025-09-30 21:46:54.122 2 DEBUG oslo_concurrency.lockutils [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:46:54 compute-0 nova_compute[192810]: 2025-09-30 21:46:54.122 2 DEBUG oslo_concurrency.lockutils [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:46:54 compute-0 nova_compute[192810]: 2025-09-30 21:46:54.171 2 DEBUG nova.compute.resource_tracker [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Migration for instance c1ffcb24-6b20-4ed8-8287-65fb4e98d88c refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Sep 30 21:46:54 compute-0 nova_compute[192810]: 2025-09-30 21:46:54.196 2 DEBUG nova.compute.resource_tracker [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Sep 30 21:46:54 compute-0 nova_compute[192810]: 2025-09-30 21:46:54.240 2 DEBUG nova.compute.resource_tracker [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Migration 917e2289-d24f-4d71-badb-fcedb6e9a494 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Sep 30 21:46:54 compute-0 nova_compute[192810]: 2025-09-30 21:46:54.241 2 DEBUG nova.compute.resource_tracker [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:46:54 compute-0 nova_compute[192810]: 2025-09-30 21:46:54.241 2 DEBUG nova.compute.resource_tracker [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:46:54 compute-0 nova_compute[192810]: 2025-09-30 21:46:54.280 2 DEBUG nova.compute.provider_tree [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:46:54 compute-0 nova_compute[192810]: 2025-09-30 21:46:54.302 2 DEBUG nova.scheduler.client.report [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:46:54 compute-0 nova_compute[192810]: 2025-09-30 21:46:54.326 2 DEBUG nova.compute.resource_tracker [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:46:54 compute-0 nova_compute[192810]: 2025-09-30 21:46:54.327 2 DEBUG oslo_concurrency.lockutils [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.205s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:46:54 compute-0 nova_compute[192810]: 2025-09-30 21:46:54.345 2 INFO nova.compute.manager [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Sep 30 21:46:54 compute-0 nova_compute[192810]: 2025-09-30 21:46:54.436 2 INFO nova.scheduler.client.report [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] Deleted allocation for migration 917e2289-d24f-4d71-badb-fcedb6e9a494
Sep 30 21:46:54 compute-0 nova_compute[192810]: 2025-09-30 21:46:54.436 2 DEBUG nova.virt.libvirt.driver [None req-1b21fd72-d917-4390-8244-a5c687be245e 4ffee83a5c5d43bd80c8ce0beac480a8 2befdacf1f7e46b1879c89240cb6db76 - - default default] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Sep 30 21:46:55 compute-0 nova_compute[192810]: 2025-09-30 21:46:55.702 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:56 compute-0 nova_compute[192810]: 2025-09-30 21:46:56.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:46:57 compute-0 nova_compute[192810]: 2025-09-30 21:46:57.846 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:46:57 compute-0 nova_compute[192810]: 2025-09-30 21:46:57.978 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:00 compute-0 nova_compute[192810]: 2025-09-30 21:47:00.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:47:00 compute-0 nova_compute[192810]: 2025-09-30 21:47:00.787 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:47:00 compute-0 nova_compute[192810]: 2025-09-30 21:47:00.971 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268805.9706626, c1ffcb24-6b20-4ed8-8287-65fb4e98d88c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:47:00 compute-0 nova_compute[192810]: 2025-09-30 21:47:00.972 2 INFO nova.compute.manager [-] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] VM Stopped (Lifecycle Event)
Sep 30 21:47:01 compute-0 podman[244665]: 2025-09-30 21:47:01.310199686 +0000 UTC m=+0.046385443 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20250923, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:47:01 compute-0 nova_compute[192810]: 2025-09-30 21:47:01.320 2 DEBUG nova.compute.manager [None req-5ce0bf71-e261-493a-9b35-fdbf216a2395 - - - - - -] [instance: c1ffcb24-6b20-4ed8-8287-65fb4e98d88c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:47:01 compute-0 podman[244664]: 2025-09-30 21:47:01.334061038 +0000 UTC m=+0.073252438 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd)
Sep 30 21:47:01 compute-0 podman[244666]: 2025-09-30 21:47:01.336117768 +0000 UTC m=+0.069795834 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:47:01 compute-0 nova_compute[192810]: 2025-09-30 21:47:01.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:47:02 compute-0 nova_compute[192810]: 2025-09-30 21:47:02.694 2 DEBUG oslo_concurrency.lockutils [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "c16f682b-0141-4298-b21c-d374321c2f4b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:02 compute-0 nova_compute[192810]: 2025-09-30 21:47:02.695 2 DEBUG oslo_concurrency.lockutils [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "c16f682b-0141-4298-b21c-d374321c2f4b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:02 compute-0 nova_compute[192810]: 2025-09-30 21:47:02.714 2 DEBUG nova.compute.manager [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:47:02 compute-0 nova_compute[192810]: 2025-09-30 21:47:02.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:02 compute-0 nova_compute[192810]: 2025-09-30 21:47:02.889 2 DEBUG oslo_concurrency.lockutils [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:02 compute-0 nova_compute[192810]: 2025-09-30 21:47:02.889 2 DEBUG oslo_concurrency.lockutils [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:02 compute-0 nova_compute[192810]: 2025-09-30 21:47:02.899 2 DEBUG nova.virt.hardware [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:47:02 compute-0 nova_compute[192810]: 2025-09-30 21:47:02.900 2 INFO nova.compute.claims [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:47:03 compute-0 nova_compute[192810]: 2025-09-30 21:47:03.011 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:03 compute-0 nova_compute[192810]: 2025-09-30 21:47:03.032 2 DEBUG nova.scheduler.client.report [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Refreshing inventories for resource provider fe423b93-de5a-41f7-97d1-9622ea46af54 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Sep 30 21:47:03 compute-0 nova_compute[192810]: 2025-09-30 21:47:03.077 2 DEBUG nova.scheduler.client.report [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Updating ProviderTree inventory for provider fe423b93-de5a-41f7-97d1-9622ea46af54 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Sep 30 21:47:03 compute-0 nova_compute[192810]: 2025-09-30 21:47:03.077 2 DEBUG nova.compute.provider_tree [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Updating inventory in ProviderTree for provider fe423b93-de5a-41f7-97d1-9622ea46af54 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Sep 30 21:47:03 compute-0 nova_compute[192810]: 2025-09-30 21:47:03.129 2 DEBUG nova.scheduler.client.report [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Refreshing aggregate associations for resource provider fe423b93-de5a-41f7-97d1-9622ea46af54, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Sep 30 21:47:03 compute-0 nova_compute[192810]: 2025-09-30 21:47:03.185 2 DEBUG nova.scheduler.client.report [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Refreshing trait associations for resource provider fe423b93-de5a-41f7-97d1-9622ea46af54, traits: COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_RESCUE_BFV,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_1_2,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE42,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Sep 30 21:47:03 compute-0 nova_compute[192810]: 2025-09-30 21:47:03.272 2 DEBUG nova.compute.provider_tree [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:47:03 compute-0 nova_compute[192810]: 2025-09-30 21:47:03.286 2 DEBUG nova.scheduler.client.report [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:47:03 compute-0 nova_compute[192810]: 2025-09-30 21:47:03.324 2 DEBUG oslo_concurrency.lockutils [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.434s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:03 compute-0 nova_compute[192810]: 2025-09-30 21:47:03.325 2 DEBUG nova.compute.manager [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:47:03 compute-0 nova_compute[192810]: 2025-09-30 21:47:03.396 2 DEBUG nova.compute.manager [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:47:03 compute-0 nova_compute[192810]: 2025-09-30 21:47:03.396 2 DEBUG nova.network.neutron [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:47:03 compute-0 nova_compute[192810]: 2025-09-30 21:47:03.423 2 INFO nova.virt.libvirt.driver [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:47:03 compute-0 nova_compute[192810]: 2025-09-30 21:47:03.467 2 DEBUG nova.compute.manager [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:47:03 compute-0 nova_compute[192810]: 2025-09-30 21:47:03.608 2 DEBUG nova.compute.manager [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:47:03 compute-0 nova_compute[192810]: 2025-09-30 21:47:03.609 2 DEBUG nova.virt.libvirt.driver [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:47:03 compute-0 nova_compute[192810]: 2025-09-30 21:47:03.610 2 INFO nova.virt.libvirt.driver [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Creating image(s)
Sep 30 21:47:03 compute-0 nova_compute[192810]: 2025-09-30 21:47:03.610 2 DEBUG oslo_concurrency.lockutils [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "/var/lib/nova/instances/c16f682b-0141-4298-b21c-d374321c2f4b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:03 compute-0 nova_compute[192810]: 2025-09-30 21:47:03.611 2 DEBUG oslo_concurrency.lockutils [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "/var/lib/nova/instances/c16f682b-0141-4298-b21c-d374321c2f4b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:03 compute-0 nova_compute[192810]: 2025-09-30 21:47:03.611 2 DEBUG oslo_concurrency.lockutils [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "/var/lib/nova/instances/c16f682b-0141-4298-b21c-d374321c2f4b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:03 compute-0 nova_compute[192810]: 2025-09-30 21:47:03.624 2 DEBUG oslo_concurrency.processutils [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:47:03 compute-0 nova_compute[192810]: 2025-09-30 21:47:03.642 2 DEBUG nova.policy [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:47:03 compute-0 nova_compute[192810]: 2025-09-30 21:47:03.680 2 DEBUG oslo_concurrency.processutils [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:47:03 compute-0 nova_compute[192810]: 2025-09-30 21:47:03.681 2 DEBUG oslo_concurrency.lockutils [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:03 compute-0 nova_compute[192810]: 2025-09-30 21:47:03.681 2 DEBUG oslo_concurrency.lockutils [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:03 compute-0 nova_compute[192810]: 2025-09-30 21:47:03.692 2 DEBUG oslo_concurrency.processutils [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:47:03 compute-0 nova_compute[192810]: 2025-09-30 21:47:03.759 2 DEBUG oslo_concurrency.processutils [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:47:03 compute-0 nova_compute[192810]: 2025-09-30 21:47:03.760 2 DEBUG oslo_concurrency.processutils [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/c16f682b-0141-4298-b21c-d374321c2f4b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:47:04 compute-0 nova_compute[192810]: 2025-09-30 21:47:04.026 2 DEBUG oslo_concurrency.processutils [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/c16f682b-0141-4298-b21c-d374321c2f4b/disk 1073741824" returned: 0 in 0.266s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:47:04 compute-0 nova_compute[192810]: 2025-09-30 21:47:04.027 2 DEBUG oslo_concurrency.lockutils [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.346s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:04 compute-0 nova_compute[192810]: 2025-09-30 21:47:04.027 2 DEBUG oslo_concurrency.processutils [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:47:04 compute-0 nova_compute[192810]: 2025-09-30 21:47:04.089 2 DEBUG oslo_concurrency.processutils [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:47:04 compute-0 nova_compute[192810]: 2025-09-30 21:47:04.090 2 DEBUG nova.virt.disk.api [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Checking if we can resize image /var/lib/nova/instances/c16f682b-0141-4298-b21c-d374321c2f4b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:47:04 compute-0 nova_compute[192810]: 2025-09-30 21:47:04.090 2 DEBUG oslo_concurrency.processutils [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c16f682b-0141-4298-b21c-d374321c2f4b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:47:04 compute-0 nova_compute[192810]: 2025-09-30 21:47:04.161 2 DEBUG oslo_concurrency.processutils [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c16f682b-0141-4298-b21c-d374321c2f4b/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:47:04 compute-0 nova_compute[192810]: 2025-09-30 21:47:04.163 2 DEBUG nova.virt.disk.api [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Cannot resize image /var/lib/nova/instances/c16f682b-0141-4298-b21c-d374321c2f4b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:47:04 compute-0 nova_compute[192810]: 2025-09-30 21:47:04.163 2 DEBUG nova.objects.instance [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lazy-loading 'migration_context' on Instance uuid c16f682b-0141-4298-b21c-d374321c2f4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:47:04 compute-0 nova_compute[192810]: 2025-09-30 21:47:04.190 2 DEBUG nova.virt.libvirt.driver [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:47:04 compute-0 nova_compute[192810]: 2025-09-30 21:47:04.191 2 DEBUG nova.virt.libvirt.driver [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Ensure instance console log exists: /var/lib/nova/instances/c16f682b-0141-4298-b21c-d374321c2f4b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:47:04 compute-0 nova_compute[192810]: 2025-09-30 21:47:04.191 2 DEBUG oslo_concurrency.lockutils [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:04 compute-0 nova_compute[192810]: 2025-09-30 21:47:04.192 2 DEBUG oslo_concurrency.lockutils [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:04 compute-0 nova_compute[192810]: 2025-09-30 21:47:04.192 2 DEBUG oslo_concurrency.lockutils [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:04 compute-0 nova_compute[192810]: 2025-09-30 21:47:04.535 2 DEBUG nova.network.neutron [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Successfully created port: 3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:47:04 compute-0 nova_compute[192810]: 2025-09-30 21:47:04.783 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:47:04 compute-0 nova_compute[192810]: 2025-09-30 21:47:04.786 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:47:04 compute-0 nova_compute[192810]: 2025-09-30 21:47:04.787 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:47:04 compute-0 nova_compute[192810]: 2025-09-30 21:47:04.787 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:47:04 compute-0 nova_compute[192810]: 2025-09-30 21:47:04.810 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Sep 30 21:47:04 compute-0 nova_compute[192810]: 2025-09-30 21:47:04.810 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 21:47:04 compute-0 nova_compute[192810]: 2025-09-30 21:47:04.811 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:47:04 compute-0 nova_compute[192810]: 2025-09-30 21:47:04.811 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:47:04 compute-0 nova_compute[192810]: 2025-09-30 21:47:04.840 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:04 compute-0 nova_compute[192810]: 2025-09-30 21:47:04.841 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:04 compute-0 nova_compute[192810]: 2025-09-30 21:47:04.841 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:04 compute-0 nova_compute[192810]: 2025-09-30 21:47:04.841 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:47:05 compute-0 nova_compute[192810]: 2025-09-30 21:47:05.001 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:47:05 compute-0 nova_compute[192810]: 2025-09-30 21:47:05.002 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5690MB free_disk=73.22957229614258GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:47:05 compute-0 nova_compute[192810]: 2025-09-30 21:47:05.002 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:05 compute-0 nova_compute[192810]: 2025-09-30 21:47:05.003 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:05 compute-0 nova_compute[192810]: 2025-09-30 21:47:05.084 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Instance c16f682b-0141-4298-b21c-d374321c2f4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:47:05 compute-0 nova_compute[192810]: 2025-09-30 21:47:05.084 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:47:05 compute-0 nova_compute[192810]: 2025-09-30 21:47:05.084 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:47:05 compute-0 nova_compute[192810]: 2025-09-30 21:47:05.166 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:47:05 compute-0 nova_compute[192810]: 2025-09-30 21:47:05.186 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:47:05 compute-0 nova_compute[192810]: 2025-09-30 21:47:05.263 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:47:05 compute-0 nova_compute[192810]: 2025-09-30 21:47:05.264 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.261s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:06 compute-0 nova_compute[192810]: 2025-09-30 21:47:06.240 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:47:06 compute-0 nova_compute[192810]: 2025-09-30 21:47:06.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:47:07 compute-0 nova_compute[192810]: 2025-09-30 21:47:07.478 2 DEBUG nova.network.neutron [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Successfully updated port: 3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:47:07 compute-0 nova_compute[192810]: 2025-09-30 21:47:07.745 2 DEBUG nova.compute.manager [req-8c80a431-4f7e-4753-b53a-371e18afce88 req-c43ffd4e-c501-4506-ad8a-91e7be12caae dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Received event network-changed-3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:47:07 compute-0 nova_compute[192810]: 2025-09-30 21:47:07.746 2 DEBUG nova.compute.manager [req-8c80a431-4f7e-4753-b53a-371e18afce88 req-c43ffd4e-c501-4506-ad8a-91e7be12caae dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Refreshing instance network info cache due to event network-changed-3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:47:07 compute-0 nova_compute[192810]: 2025-09-30 21:47:07.746 2 DEBUG oslo_concurrency.lockutils [req-8c80a431-4f7e-4753-b53a-371e18afce88 req-c43ffd4e-c501-4506-ad8a-91e7be12caae dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-c16f682b-0141-4298-b21c-d374321c2f4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:47:07 compute-0 nova_compute[192810]: 2025-09-30 21:47:07.747 2 DEBUG oslo_concurrency.lockutils [req-8c80a431-4f7e-4753-b53a-371e18afce88 req-c43ffd4e-c501-4506-ad8a-91e7be12caae dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-c16f682b-0141-4298-b21c-d374321c2f4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:47:07 compute-0 nova_compute[192810]: 2025-09-30 21:47:07.747 2 DEBUG nova.network.neutron [req-8c80a431-4f7e-4753-b53a-371e18afce88 req-c43ffd4e-c501-4506-ad8a-91e7be12caae dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Refreshing network info cache for port 3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:47:07 compute-0 nova_compute[192810]: 2025-09-30 21:47:07.770 2 DEBUG oslo_concurrency.lockutils [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "refresh_cache-c16f682b-0141-4298-b21c-d374321c2f4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:47:07 compute-0 nova_compute[192810]: 2025-09-30 21:47:07.853 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:07 compute-0 nova_compute[192810]: 2025-09-30 21:47:07.941 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:08 compute-0 nova_compute[192810]: 2025-09-30 21:47:08.112 2 DEBUG nova.network.neutron [req-8c80a431-4f7e-4753-b53a-371e18afce88 req-c43ffd4e-c501-4506-ad8a-91e7be12caae dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:47:08 compute-0 nova_compute[192810]: 2025-09-30 21:47:08.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:08 compute-0 nova_compute[192810]: 2025-09-30 21:47:08.166 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:08 compute-0 nova_compute[192810]: 2025-09-30 21:47:08.548 2 DEBUG nova.network.neutron [req-8c80a431-4f7e-4753-b53a-371e18afce88 req-c43ffd4e-c501-4506-ad8a-91e7be12caae dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:47:08 compute-0 nova_compute[192810]: 2025-09-30 21:47:08.636 2 DEBUG oslo_concurrency.lockutils [req-8c80a431-4f7e-4753-b53a-371e18afce88 req-c43ffd4e-c501-4506-ad8a-91e7be12caae dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-c16f682b-0141-4298-b21c-d374321c2f4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:47:08 compute-0 nova_compute[192810]: 2025-09-30 21:47:08.637 2 DEBUG oslo_concurrency.lockutils [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquired lock "refresh_cache-c16f682b-0141-4298-b21c-d374321c2f4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:47:08 compute-0 nova_compute[192810]: 2025-09-30 21:47:08.637 2 DEBUG nova.network.neutron [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:47:09 compute-0 nova_compute[192810]: 2025-09-30 21:47:09.568 2 DEBUG nova.network.neutron [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:47:10 compute-0 nova_compute[192810]: 2025-09-30 21:47:10.439 2 DEBUG nova.network.neutron [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Updating instance_info_cache with network_info: [{"id": "3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6", "address": "fa:16:3e:52:d5:60", "network": {"id": "de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a", "bridge": "br-int", "label": "tempest-network-smoke--286237116", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f61a4f1-d1", "ovs_interfaceid": "3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:47:10 compute-0 nova_compute[192810]: 2025-09-30 21:47:10.468 2 DEBUG oslo_concurrency.lockutils [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Releasing lock "refresh_cache-c16f682b-0141-4298-b21c-d374321c2f4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:47:10 compute-0 nova_compute[192810]: 2025-09-30 21:47:10.469 2 DEBUG nova.compute.manager [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Instance network_info: |[{"id": "3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6", "address": "fa:16:3e:52:d5:60", "network": {"id": "de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a", "bridge": "br-int", "label": "tempest-network-smoke--286237116", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f61a4f1-d1", "ovs_interfaceid": "3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:47:10 compute-0 nova_compute[192810]: 2025-09-30 21:47:10.471 2 DEBUG nova.virt.libvirt.driver [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Start _get_guest_xml network_info=[{"id": "3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6", "address": "fa:16:3e:52:d5:60", "network": {"id": "de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a", "bridge": "br-int", "label": "tempest-network-smoke--286237116", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f61a4f1-d1", "ovs_interfaceid": "3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:47:10 compute-0 nova_compute[192810]: 2025-09-30 21:47:10.476 2 WARNING nova.virt.libvirt.driver [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:47:10 compute-0 nova_compute[192810]: 2025-09-30 21:47:10.481 2 DEBUG nova.virt.libvirt.host [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:47:10 compute-0 nova_compute[192810]: 2025-09-30 21:47:10.481 2 DEBUG nova.virt.libvirt.host [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:47:10 compute-0 nova_compute[192810]: 2025-09-30 21:47:10.484 2 DEBUG nova.virt.libvirt.host [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:47:10 compute-0 nova_compute[192810]: 2025-09-30 21:47:10.485 2 DEBUG nova.virt.libvirt.host [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:47:10 compute-0 nova_compute[192810]: 2025-09-30 21:47:10.486 2 DEBUG nova.virt.libvirt.driver [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:47:10 compute-0 nova_compute[192810]: 2025-09-30 21:47:10.486 2 DEBUG nova.virt.hardware [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:47:10 compute-0 nova_compute[192810]: 2025-09-30 21:47:10.487 2 DEBUG nova.virt.hardware [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:47:10 compute-0 nova_compute[192810]: 2025-09-30 21:47:10.487 2 DEBUG nova.virt.hardware [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:47:10 compute-0 nova_compute[192810]: 2025-09-30 21:47:10.487 2 DEBUG nova.virt.hardware [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:47:10 compute-0 nova_compute[192810]: 2025-09-30 21:47:10.487 2 DEBUG nova.virt.hardware [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:47:10 compute-0 nova_compute[192810]: 2025-09-30 21:47:10.488 2 DEBUG nova.virt.hardware [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:47:10 compute-0 nova_compute[192810]: 2025-09-30 21:47:10.488 2 DEBUG nova.virt.hardware [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:47:10 compute-0 nova_compute[192810]: 2025-09-30 21:47:10.488 2 DEBUG nova.virt.hardware [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:47:10 compute-0 nova_compute[192810]: 2025-09-30 21:47:10.488 2 DEBUG nova.virt.hardware [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:47:10 compute-0 nova_compute[192810]: 2025-09-30 21:47:10.488 2 DEBUG nova.virt.hardware [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:47:10 compute-0 nova_compute[192810]: 2025-09-30 21:47:10.489 2 DEBUG nova.virt.hardware [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:47:10 compute-0 nova_compute[192810]: 2025-09-30 21:47:10.492 2 DEBUG nova.virt.libvirt.vif [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:47:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1264516311',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1264516311',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2108116341-ac',id=156,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE4+MYuJWLfNBVP0wpYYzJxFrenLU5zgnS4SMasXyiRNm0dScaN7o0VPG4pxRsoU9vaxfEh15PDw/b0q/pvNoO7EJ1m5Y6fW708LJtHheq7BeanpBXBHMJIyw9csPp7m6g==',key_name='tempest-TestSecurityGroupsBasicOps-376900167',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1ff42902541948f7a6df344fac87c2b7',ramdisk_id='',reservation_id='r-p598wpjp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2108116341',owner_user_name='tempest-TestSecurityGroupsBasicOps-2108116341-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:47:03Z,user_data=None,user_id='c33a752ef8234bba917ace1e73763490',uuid=c16f682b-0141-4298-b21c-d374321c2f4b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6", "address": "fa:16:3e:52:d5:60", "network": {"id": "de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a", "bridge": "br-int", "label": "tempest-network-smoke--286237116", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f61a4f1-d1", "ovs_interfaceid": "3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:47:10 compute-0 nova_compute[192810]: 2025-09-30 21:47:10.493 2 DEBUG nova.network.os_vif_util [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Converting VIF {"id": "3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6", "address": "fa:16:3e:52:d5:60", "network": {"id": "de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a", "bridge": "br-int", "label": "tempest-network-smoke--286237116", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f61a4f1-d1", "ovs_interfaceid": "3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:47:10 compute-0 nova_compute[192810]: 2025-09-30 21:47:10.493 2 DEBUG nova.network.os_vif_util [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:52:d5:60,bridge_name='br-int',has_traffic_filtering=True,id=3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6,network=Network(de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3f61a4f1-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:47:10 compute-0 nova_compute[192810]: 2025-09-30 21:47:10.494 2 DEBUG nova.objects.instance [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lazy-loading 'pci_devices' on Instance uuid c16f682b-0141-4298-b21c-d374321c2f4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:47:10 compute-0 nova_compute[192810]: 2025-09-30 21:47:10.518 2 DEBUG nova.virt.libvirt.driver [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:47:10 compute-0 nova_compute[192810]:   <uuid>c16f682b-0141-4298-b21c-d374321c2f4b</uuid>
Sep 30 21:47:10 compute-0 nova_compute[192810]:   <name>instance-0000009c</name>
Sep 30 21:47:10 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:47:10 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:47:10 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:47:10 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:       <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1264516311</nova:name>
Sep 30 21:47:10 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:47:10</nova:creationTime>
Sep 30 21:47:10 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:47:10 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:47:10 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:47:10 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:47:10 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:47:10 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:47:10 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:47:10 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:47:10 compute-0 nova_compute[192810]:         <nova:user uuid="c33a752ef8234bba917ace1e73763490">tempest-TestSecurityGroupsBasicOps-2108116341-project-member</nova:user>
Sep 30 21:47:10 compute-0 nova_compute[192810]:         <nova:project uuid="1ff42902541948f7a6df344fac87c2b7">tempest-TestSecurityGroupsBasicOps-2108116341</nova:project>
Sep 30 21:47:10 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:47:10 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:47:10 compute-0 nova_compute[192810]:         <nova:port uuid="3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6">
Sep 30 21:47:10 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:47:10 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:47:10 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:47:10 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:47:10 compute-0 nova_compute[192810]:     <system>
Sep 30 21:47:10 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:47:10 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:47:10 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:47:10 compute-0 nova_compute[192810]:       <entry name="serial">c16f682b-0141-4298-b21c-d374321c2f4b</entry>
Sep 30 21:47:10 compute-0 nova_compute[192810]:       <entry name="uuid">c16f682b-0141-4298-b21c-d374321c2f4b</entry>
Sep 30 21:47:10 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     </system>
Sep 30 21:47:10 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:47:10 compute-0 nova_compute[192810]:   <os>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:   </os>
Sep 30 21:47:10 compute-0 nova_compute[192810]:   <features>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:   </features>
Sep 30 21:47:10 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:47:10 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:47:10 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:47:10 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:47:10 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:47:10 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/c16f682b-0141-4298-b21c-d374321c2f4b/disk"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:47:10 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/c16f682b-0141-4298-b21c-d374321c2f4b/disk.config"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:47:10 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:52:d5:60"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:       <target dev="tap3f61a4f1-d1"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:47:10 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/c16f682b-0141-4298-b21c-d374321c2f4b/console.log" append="off"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     <video>
Sep 30 21:47:10 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     </video>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:47:10 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:47:10 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:47:10 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:47:10 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:47:10 compute-0 nova_compute[192810]: </domain>
Sep 30 21:47:10 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:47:10 compute-0 nova_compute[192810]: 2025-09-30 21:47:10.520 2 DEBUG nova.compute.manager [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Preparing to wait for external event network-vif-plugged-3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:47:10 compute-0 nova_compute[192810]: 2025-09-30 21:47:10.520 2 DEBUG oslo_concurrency.lockutils [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "c16f682b-0141-4298-b21c-d374321c2f4b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:10 compute-0 nova_compute[192810]: 2025-09-30 21:47:10.520 2 DEBUG oslo_concurrency.lockutils [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "c16f682b-0141-4298-b21c-d374321c2f4b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:10 compute-0 nova_compute[192810]: 2025-09-30 21:47:10.521 2 DEBUG oslo_concurrency.lockutils [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "c16f682b-0141-4298-b21c-d374321c2f4b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:10 compute-0 nova_compute[192810]: 2025-09-30 21:47:10.521 2 DEBUG nova.virt.libvirt.vif [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:47:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1264516311',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1264516311',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2108116341-ac',id=156,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE4+MYuJWLfNBVP0wpYYzJxFrenLU5zgnS4SMasXyiRNm0dScaN7o0VPG4pxRsoU9vaxfEh15PDw/b0q/pvNoO7EJ1m5Y6fW708LJtHheq7BeanpBXBHMJIyw9csPp7m6g==',key_name='tempest-TestSecurityGroupsBasicOps-376900167',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1ff42902541948f7a6df344fac87c2b7',ramdisk_id='',reservation_id='r-p598wpjp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2108116341',owner_user_name='tempest-TestSecurityGroupsBasicOps-2108116341-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:47:03Z,user_data=None,user_id='c33a752ef8234bba917ace1e73763490',uuid=c16f682b-0141-4298-b21c-d374321c2f4b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6", "address": "fa:16:3e:52:d5:60", "network": {"id": "de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a", "bridge": "br-int", "label": "tempest-network-smoke--286237116", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f61a4f1-d1", "ovs_interfaceid": "3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:47:10 compute-0 nova_compute[192810]: 2025-09-30 21:47:10.522 2 DEBUG nova.network.os_vif_util [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Converting VIF {"id": "3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6", "address": "fa:16:3e:52:d5:60", "network": {"id": "de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a", "bridge": "br-int", "label": "tempest-network-smoke--286237116", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f61a4f1-d1", "ovs_interfaceid": "3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:47:10 compute-0 nova_compute[192810]: 2025-09-30 21:47:10.523 2 DEBUG nova.network.os_vif_util [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:52:d5:60,bridge_name='br-int',has_traffic_filtering=True,id=3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6,network=Network(de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3f61a4f1-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:47:10 compute-0 nova_compute[192810]: 2025-09-30 21:47:10.523 2 DEBUG os_vif [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:52:d5:60,bridge_name='br-int',has_traffic_filtering=True,id=3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6,network=Network(de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3f61a4f1-d1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:47:10 compute-0 nova_compute[192810]: 2025-09-30 21:47:10.524 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:10 compute-0 nova_compute[192810]: 2025-09-30 21:47:10.524 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:47:10 compute-0 nova_compute[192810]: 2025-09-30 21:47:10.524 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:47:10 compute-0 nova_compute[192810]: 2025-09-30 21:47:10.527 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:10 compute-0 nova_compute[192810]: 2025-09-30 21:47:10.527 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3f61a4f1-d1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:47:10 compute-0 nova_compute[192810]: 2025-09-30 21:47:10.528 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3f61a4f1-d1, col_values=(('external_ids', {'iface-id': '3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:52:d5:60', 'vm-uuid': 'c16f682b-0141-4298-b21c-d374321c2f4b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:47:10 compute-0 NetworkManager[51733]: <info>  [1759268830.5305] manager: (tap3f61a4f1-d1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/263)
Sep 30 21:47:10 compute-0 nova_compute[192810]: 2025-09-30 21:47:10.531 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:47:10 compute-0 nova_compute[192810]: 2025-09-30 21:47:10.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:10 compute-0 nova_compute[192810]: 2025-09-30 21:47:10.535 2 INFO os_vif [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:52:d5:60,bridge_name='br-int',has_traffic_filtering=True,id=3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6,network=Network(de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3f61a4f1-d1')
Sep 30 21:47:10 compute-0 nova_compute[192810]: 2025-09-30 21:47:10.629 2 DEBUG nova.virt.libvirt.driver [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:47:10 compute-0 nova_compute[192810]: 2025-09-30 21:47:10.629 2 DEBUG nova.virt.libvirt.driver [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:47:10 compute-0 nova_compute[192810]: 2025-09-30 21:47:10.629 2 DEBUG nova.virt.libvirt.driver [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] No VIF found with MAC fa:16:3e:52:d5:60, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:47:10 compute-0 nova_compute[192810]: 2025-09-30 21:47:10.630 2 INFO nova.virt.libvirt.driver [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Using config drive
Sep 30 21:47:11 compute-0 nova_compute[192810]: 2025-09-30 21:47:11.081 2 INFO nova.virt.libvirt.driver [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Creating config drive at /var/lib/nova/instances/c16f682b-0141-4298-b21c-d374321c2f4b/disk.config
Sep 30 21:47:11 compute-0 nova_compute[192810]: 2025-09-30 21:47:11.085 2 DEBUG oslo_concurrency.processutils [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c16f682b-0141-4298-b21c-d374321c2f4b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpn7bplspl execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:47:11 compute-0 nova_compute[192810]: 2025-09-30 21:47:11.210 2 DEBUG oslo_concurrency.processutils [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c16f682b-0141-4298-b21c-d374321c2f4b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpn7bplspl" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:47:11 compute-0 kernel: tap3f61a4f1-d1: entered promiscuous mode
Sep 30 21:47:11 compute-0 NetworkManager[51733]: <info>  [1759268831.2645] manager: (tap3f61a4f1-d1): new Tun device (/org/freedesktop/NetworkManager/Devices/264)
Sep 30 21:47:11 compute-0 ovn_controller[94912]: 2025-09-30T21:47:11Z|00594|binding|INFO|Claiming lport 3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6 for this chassis.
Sep 30 21:47:11 compute-0 nova_compute[192810]: 2025-09-30 21:47:11.265 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:11 compute-0 ovn_controller[94912]: 2025-09-30T21:47:11Z|00595|binding|INFO|3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6: Claiming fa:16:3e:52:d5:60 10.100.0.6
Sep 30 21:47:11 compute-0 nova_compute[192810]: 2025-09-30 21:47:11.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:11 compute-0 systemd-udevd[244764]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:47:11.300 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:52:d5:60 10.100.0.6'], port_security=['fa:16:3e:52:d5:60 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'c16f682b-0141-4298-b21c-d374321c2f4b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ff42902541948f7a6df344fac87c2b7', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4b0a7047-e1ba-4a65-996b-36ec918f1716 70c8b6e4-c5d7-427d-b92f-d008f783e5e3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d352e746-3e44-45ee-a4e9-724f90959366, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:47:11.301 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6 in datapath de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a bound to our chassis
Sep 30 21:47:11 compute-0 NetworkManager[51733]: <info>  [1759268831.3041] device (tap3f61a4f1-d1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:47:11.305 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a
Sep 30 21:47:11 compute-0 NetworkManager[51733]: <info>  [1759268831.3062] device (tap3f61a4f1-d1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:47:11 compute-0 systemd-machined[152794]: New machine qemu-75-instance-0000009c.
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:47:11.316 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[d0b3ea29-783c-4497-926a-1d1cd9633137]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:47:11.317 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapde17a3c3-f1 in ovnmeta-de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:47:11.320 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapde17a3c3-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:47:11.320 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[29c4b703-8d5a-4d15-ae40-8f6dbfcfcc39]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:11 compute-0 nova_compute[192810]: 2025-09-30 21:47:11.321 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:47:11.321 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[7a51f1fc-3221-48c9-8514-d5bfdb3be732]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:11 compute-0 systemd[1]: Started Virtual Machine qemu-75-instance-0000009c.
Sep 30 21:47:11 compute-0 ovn_controller[94912]: 2025-09-30T21:47:11Z|00596|binding|INFO|Setting lport 3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6 ovn-installed in OVS
Sep 30 21:47:11 compute-0 ovn_controller[94912]: 2025-09-30T21:47:11Z|00597|binding|INFO|Setting lport 3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6 up in Southbound
Sep 30 21:47:11 compute-0 nova_compute[192810]: 2025-09-30 21:47:11.329 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:47:11.335 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[5b9cc450-2005-4d9c-9d60-58a916667d50]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:47:11.360 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[4a047e7d-0ce5-4b4c-a2b9-053f15cc3d9e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:47:11.388 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[2d417b48-48d8-4a90-a98c-6babd567304c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:47:11.393 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a3c265cf-b418-4f9d-a511-bbac802ca01b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:11 compute-0 NetworkManager[51733]: <info>  [1759268831.3944] manager: (tapde17a3c3-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/265)
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:47:11.423 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[a708bfe8-8740-404e-a287-6ac1a28e1f40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:47:11.426 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[a1c7cd78-d8e3-43a1-8c5c-c3f76959b7d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:11 compute-0 NetworkManager[51733]: <info>  [1759268831.4464] device (tapde17a3c3-f0): carrier: link connected
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:47:11.451 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[22220cdf-3a5f-4a1c-90dd-85f9ab9abd4b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:47:11.465 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[de82f1c2-0e67-4afa-a09a-a76bd25ef830]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapde17a3c3-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:94:98:be'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 180], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 552707, 'reachable_time': 38081, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244800, 'error': None, 'target': 'ovnmeta-de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:47:11.476 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[81dbcf1d-b563-4a92-92b2-e7d2125c65c7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe94:98be'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 552707, 'tstamp': 552707}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 244801, 'error': None, 'target': 'ovnmeta-de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:47:11.488 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[5228f8f9-adc0-443e-9d41-ebd4b18ef9d7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapde17a3c3-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:94:98:be'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 180], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 552707, 'reachable_time': 38081, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 244802, 'error': None, 'target': 'ovnmeta-de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:47:11.515 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[851ab8ea-bd9d-4f47-ac4e-054973655db2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:47:11.569 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[42e41041-0fdb-424e-85da-35ab10ce5ab6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:47:11.570 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapde17a3c3-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:47:11.570 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:47:11.570 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapde17a3c3-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:47:11 compute-0 NetworkManager[51733]: <info>  [1759268831.5727] manager: (tapde17a3c3-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/266)
Sep 30 21:47:11 compute-0 kernel: tapde17a3c3-f0: entered promiscuous mode
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:47:11.575 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapde17a3c3-f0, col_values=(('external_ids', {'iface-id': '49357550-70ff-49b5-806e-8d3ad548f4db'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:47:11 compute-0 ovn_controller[94912]: 2025-09-30T21:47:11Z|00598|binding|INFO|Releasing lport 49357550-70ff-49b5-806e-8d3ad548f4db from this chassis (sb_readonly=0)
Sep 30 21:47:11 compute-0 nova_compute[192810]: 2025-09-30 21:47:11.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:11 compute-0 nova_compute[192810]: 2025-09-30 21:47:11.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:47:11.592 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:47:11.592 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[baca8ed7-7b7e-48e9-bc30-b81c8d967f08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:47:11.593 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a.pid.haproxy
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:47:11 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:47:11.594 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a', 'env', 'PROCESS_TAG=haproxy-de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:47:11 compute-0 nova_compute[192810]: 2025-09-30 21:47:11.846 2 DEBUG nova.compute.manager [req-cdd2058c-5d50-4265-aa95-178dd308bd96 req-65d6ec13-6b42-4cd9-adb6-700cb04814fc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Received event network-vif-plugged-3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:47:11 compute-0 nova_compute[192810]: 2025-09-30 21:47:11.847 2 DEBUG oslo_concurrency.lockutils [req-cdd2058c-5d50-4265-aa95-178dd308bd96 req-65d6ec13-6b42-4cd9-adb6-700cb04814fc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c16f682b-0141-4298-b21c-d374321c2f4b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:11 compute-0 nova_compute[192810]: 2025-09-30 21:47:11.847 2 DEBUG oslo_concurrency.lockutils [req-cdd2058c-5d50-4265-aa95-178dd308bd96 req-65d6ec13-6b42-4cd9-adb6-700cb04814fc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c16f682b-0141-4298-b21c-d374321c2f4b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:11 compute-0 nova_compute[192810]: 2025-09-30 21:47:11.847 2 DEBUG oslo_concurrency.lockutils [req-cdd2058c-5d50-4265-aa95-178dd308bd96 req-65d6ec13-6b42-4cd9-adb6-700cb04814fc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c16f682b-0141-4298-b21c-d374321c2f4b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:11 compute-0 nova_compute[192810]: 2025-09-30 21:47:11.848 2 DEBUG nova.compute.manager [req-cdd2058c-5d50-4265-aa95-178dd308bd96 req-65d6ec13-6b42-4cd9-adb6-700cb04814fc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Processing event network-vif-plugged-3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:47:12 compute-0 podman[244840]: 2025-09-30 21:47:11.933053629 +0000 UTC m=+0.022146161 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:47:12 compute-0 nova_compute[192810]: 2025-09-30 21:47:12.113 2 DEBUG nova.compute.manager [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:47:12 compute-0 nova_compute[192810]: 2025-09-30 21:47:12.114 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268832.1127636, c16f682b-0141-4298-b21c-d374321c2f4b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:47:12 compute-0 nova_compute[192810]: 2025-09-30 21:47:12.114 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] VM Started (Lifecycle Event)
Sep 30 21:47:12 compute-0 nova_compute[192810]: 2025-09-30 21:47:12.117 2 DEBUG nova.virt.libvirt.driver [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:47:12 compute-0 nova_compute[192810]: 2025-09-30 21:47:12.120 2 INFO nova.virt.libvirt.driver [-] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Instance spawned successfully.
Sep 30 21:47:12 compute-0 nova_compute[192810]: 2025-09-30 21:47:12.120 2 DEBUG nova.virt.libvirt.driver [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:47:12 compute-0 podman[244840]: 2025-09-30 21:47:12.129401239 +0000 UTC m=+0.218493741 container create cf7a043f21b26f514ddedf01b13d29d43a297d7635086116770b9d699744a8bd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Sep 30 21:47:12 compute-0 systemd[1]: Started libpod-conmon-cf7a043f21b26f514ddedf01b13d29d43a297d7635086116770b9d699744a8bd.scope.
Sep 30 21:47:12 compute-0 nova_compute[192810]: 2025-09-30 21:47:12.253 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:47:12 compute-0 nova_compute[192810]: 2025-09-30 21:47:12.264 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:47:12 compute-0 nova_compute[192810]: 2025-09-30 21:47:12.269 2 DEBUG nova.virt.libvirt.driver [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:47:12 compute-0 nova_compute[192810]: 2025-09-30 21:47:12.269 2 DEBUG nova.virt.libvirt.driver [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:47:12 compute-0 nova_compute[192810]: 2025-09-30 21:47:12.270 2 DEBUG nova.virt.libvirt.driver [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:47:12 compute-0 nova_compute[192810]: 2025-09-30 21:47:12.270 2 DEBUG nova.virt.libvirt.driver [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:47:12 compute-0 nova_compute[192810]: 2025-09-30 21:47:12.271 2 DEBUG nova.virt.libvirt.driver [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:47:12 compute-0 nova_compute[192810]: 2025-09-30 21:47:12.271 2 DEBUG nova.virt.libvirt.driver [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:47:12 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:47:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4daaae3682cb3ada703946b12ae0e324fe50961d55d28f84de0eb2d6598bd98e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:47:12 compute-0 nova_compute[192810]: 2025-09-30 21:47:12.317 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:47:12 compute-0 nova_compute[192810]: 2025-09-30 21:47:12.318 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268832.113739, c16f682b-0141-4298-b21c-d374321c2f4b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:47:12 compute-0 nova_compute[192810]: 2025-09-30 21:47:12.318 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] VM Paused (Lifecycle Event)
Sep 30 21:47:12 compute-0 nova_compute[192810]: 2025-09-30 21:47:12.374 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:47:12 compute-0 nova_compute[192810]: 2025-09-30 21:47:12.379 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268832.1163607, c16f682b-0141-4298-b21c-d374321c2f4b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:47:12 compute-0 nova_compute[192810]: 2025-09-30 21:47:12.379 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] VM Resumed (Lifecycle Event)
Sep 30 21:47:12 compute-0 podman[244840]: 2025-09-30 21:47:12.422485357 +0000 UTC m=+0.511577859 container init cf7a043f21b26f514ddedf01b13d29d43a297d7635086116770b9d699744a8bd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:47:12 compute-0 podman[244840]: 2025-09-30 21:47:12.429628202 +0000 UTC m=+0.518720704 container start cf7a043f21b26f514ddedf01b13d29d43a297d7635086116770b9d699744a8bd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923)
Sep 30 21:47:12 compute-0 nova_compute[192810]: 2025-09-30 21:47:12.433 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:47:12 compute-0 nova_compute[192810]: 2025-09-30 21:47:12.437 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:47:12 compute-0 neutron-haproxy-ovnmeta-de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a[244856]: [NOTICE]   (244860) : New worker (244862) forked
Sep 30 21:47:12 compute-0 neutron-haproxy-ovnmeta-de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a[244856]: [NOTICE]   (244860) : Loading success.
Sep 30 21:47:12 compute-0 nova_compute[192810]: 2025-09-30 21:47:12.482 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:47:12 compute-0 nova_compute[192810]: 2025-09-30 21:47:12.497 2 INFO nova.compute.manager [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Took 8.89 seconds to spawn the instance on the hypervisor.
Sep 30 21:47:12 compute-0 nova_compute[192810]: 2025-09-30 21:47:12.498 2 DEBUG nova.compute.manager [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:47:12 compute-0 nova_compute[192810]: 2025-09-30 21:47:12.831 2 INFO nova.compute.manager [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Took 10.05 seconds to build instance.
Sep 30 21:47:12 compute-0 nova_compute[192810]: 2025-09-30 21:47:12.856 2 DEBUG oslo_concurrency.lockutils [None req-5ad6c74f-6b8f-4cc0-a5d7-a535c5b73acd c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "c16f682b-0141-4298-b21c-d374321c2f4b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:13 compute-0 nova_compute[192810]: 2025-09-30 21:47:13.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:13 compute-0 nova_compute[192810]: 2025-09-30 21:47:13.952 2 DEBUG nova.compute.manager [req-23438952-8cf9-4d1f-a593-728307d38313 req-f72628e1-a930-4b24-b7f2-92f2d5eed164 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Received event network-vif-plugged-3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:47:13 compute-0 nova_compute[192810]: 2025-09-30 21:47:13.952 2 DEBUG oslo_concurrency.lockutils [req-23438952-8cf9-4d1f-a593-728307d38313 req-f72628e1-a930-4b24-b7f2-92f2d5eed164 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c16f682b-0141-4298-b21c-d374321c2f4b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:13 compute-0 nova_compute[192810]: 2025-09-30 21:47:13.952 2 DEBUG oslo_concurrency.lockutils [req-23438952-8cf9-4d1f-a593-728307d38313 req-f72628e1-a930-4b24-b7f2-92f2d5eed164 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c16f682b-0141-4298-b21c-d374321c2f4b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:13 compute-0 nova_compute[192810]: 2025-09-30 21:47:13.953 2 DEBUG oslo_concurrency.lockutils [req-23438952-8cf9-4d1f-a593-728307d38313 req-f72628e1-a930-4b24-b7f2-92f2d5eed164 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c16f682b-0141-4298-b21c-d374321c2f4b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:13 compute-0 nova_compute[192810]: 2025-09-30 21:47:13.953 2 DEBUG nova.compute.manager [req-23438952-8cf9-4d1f-a593-728307d38313 req-f72628e1-a930-4b24-b7f2-92f2d5eed164 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] No waiting events found dispatching network-vif-plugged-3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:47:13 compute-0 nova_compute[192810]: 2025-09-30 21:47:13.953 2 WARNING nova.compute.manager [req-23438952-8cf9-4d1f-a593-728307d38313 req-f72628e1-a930-4b24-b7f2-92f2d5eed164 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Received unexpected event network-vif-plugged-3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6 for instance with vm_state active and task_state None.
Sep 30 21:47:15 compute-0 nova_compute[192810]: 2025-09-30 21:47:15.533 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:16 compute-0 podman[244872]: 2025-09-30 21:47:16.322820102 +0000 UTC m=+0.054187043 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:47:16 compute-0 podman[244873]: 2025-09-30 21:47:16.332961309 +0000 UTC m=+0.063516750 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 21:47:16 compute-0 podman[244871]: 2025-09-30 21:47:16.357751104 +0000 UTC m=+0.092708413 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923)
Sep 30 21:47:18 compute-0 nova_compute[192810]: 2025-09-30 21:47:18.156 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:18 compute-0 NetworkManager[51733]: <info>  [1759268838.4657] manager: (patch-br-int-to-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/267)
Sep 30 21:47:18 compute-0 NetworkManager[51733]: <info>  [1759268838.4666] manager: (patch-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/268)
Sep 30 21:47:18 compute-0 nova_compute[192810]: 2025-09-30 21:47:18.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:18 compute-0 nova_compute[192810]: 2025-09-30 21:47:18.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:18 compute-0 ovn_controller[94912]: 2025-09-30T21:47:18Z|00599|binding|INFO|Releasing lport 49357550-70ff-49b5-806e-8d3ad548f4db from this chassis (sb_readonly=0)
Sep 30 21:47:18 compute-0 nova_compute[192810]: 2025-09-30 21:47:18.566 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:18 compute-0 nova_compute[192810]: 2025-09-30 21:47:18.847 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:19 compute-0 nova_compute[192810]: 2025-09-30 21:47:19.200 2 DEBUG nova.compute.manager [req-aab874f0-93ae-4566-a864-b5deb8eb226f req-bcc2785c-fbc3-4a41-b073-82dcb604d7b4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Received event network-changed-3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:47:19 compute-0 nova_compute[192810]: 2025-09-30 21:47:19.200 2 DEBUG nova.compute.manager [req-aab874f0-93ae-4566-a864-b5deb8eb226f req-bcc2785c-fbc3-4a41-b073-82dcb604d7b4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Refreshing instance network info cache due to event network-changed-3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:47:19 compute-0 nova_compute[192810]: 2025-09-30 21:47:19.201 2 DEBUG oslo_concurrency.lockutils [req-aab874f0-93ae-4566-a864-b5deb8eb226f req-bcc2785c-fbc3-4a41-b073-82dcb604d7b4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-c16f682b-0141-4298-b21c-d374321c2f4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:47:19 compute-0 nova_compute[192810]: 2025-09-30 21:47:19.201 2 DEBUG oslo_concurrency.lockutils [req-aab874f0-93ae-4566-a864-b5deb8eb226f req-bcc2785c-fbc3-4a41-b073-82dcb604d7b4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-c16f682b-0141-4298-b21c-d374321c2f4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:47:19 compute-0 nova_compute[192810]: 2025-09-30 21:47:19.201 2 DEBUG nova.network.neutron [req-aab874f0-93ae-4566-a864-b5deb8eb226f req-bcc2785c-fbc3-4a41-b073-82dcb604d7b4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Refreshing network info cache for port 3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:47:19 compute-0 nova_compute[192810]: 2025-09-30 21:47:19.719 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:20 compute-0 nova_compute[192810]: 2025-09-30 21:47:20.535 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:21 compute-0 nova_compute[192810]: 2025-09-30 21:47:21.176 2 DEBUG nova.network.neutron [req-aab874f0-93ae-4566-a864-b5deb8eb226f req-bcc2785c-fbc3-4a41-b073-82dcb604d7b4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Updated VIF entry in instance network info cache for port 3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:47:21 compute-0 nova_compute[192810]: 2025-09-30 21:47:21.177 2 DEBUG nova.network.neutron [req-aab874f0-93ae-4566-a864-b5deb8eb226f req-bcc2785c-fbc3-4a41-b073-82dcb604d7b4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Updating instance_info_cache with network_info: [{"id": "3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6", "address": "fa:16:3e:52:d5:60", "network": {"id": "de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a", "bridge": "br-int", "label": "tempest-network-smoke--286237116", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f61a4f1-d1", "ovs_interfaceid": "3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:47:21 compute-0 nova_compute[192810]: 2025-09-30 21:47:21.208 2 DEBUG oslo_concurrency.lockutils [req-aab874f0-93ae-4566-a864-b5deb8eb226f req-bcc2785c-fbc3-4a41-b073-82dcb604d7b4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-c16f682b-0141-4298-b21c-d374321c2f4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:47:23 compute-0 nova_compute[192810]: 2025-09-30 21:47:23.158 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:23 compute-0 nova_compute[192810]: 2025-09-30 21:47:23.889 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:24 compute-0 nova_compute[192810]: 2025-09-30 21:47:24.311 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:24 compute-0 podman[244942]: 2025-09-30 21:47:24.338772719 +0000 UTC m=+0.064049683 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:47:24 compute-0 podman[244943]: 2025-09-30 21:47:24.342621593 +0000 UTC m=+0.066217086 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, container_name=openstack_network_exporter, architecture=x86_64, distribution-scope=public)
Sep 30 21:47:25 compute-0 nova_compute[192810]: 2025-09-30 21:47:25.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:25 compute-0 ovn_controller[94912]: 2025-09-30T21:47:25Z|00064|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:52:d5:60 10.100.0.6
Sep 30 21:47:25 compute-0 ovn_controller[94912]: 2025-09-30T21:47:25Z|00065|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:52:d5:60 10.100.0.6
Sep 30 21:47:28 compute-0 nova_compute[192810]: 2025-09-30 21:47:28.160 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:30 compute-0 nova_compute[192810]: 2025-09-30 21:47:30.540 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:32 compute-0 podman[244995]: 2025-09-30 21:47:32.320306718 +0000 UTC m=+0.050665567 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 21:47:32 compute-0 podman[244994]: 2025-09-30 21:47:32.320726728 +0000 UTC m=+0.053696160 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923)
Sep 30 21:47:32 compute-0 podman[244993]: 2025-09-30 21:47:32.329848611 +0000 UTC m=+0.064635038 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, container_name=multipathd)
Sep 30 21:47:33 compute-0 nova_compute[192810]: 2025-09-30 21:47:33.161 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:35 compute-0 nova_compute[192810]: 2025-09-30 21:47:35.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:38 compute-0 nova_compute[192810]: 2025-09-30 21:47:38.162 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:47:38.755 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:47:38.756 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:47:38.757 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:40 compute-0 nova_compute[192810]: 2025-09-30 21:47:40.049 2 DEBUG oslo_concurrency.lockutils [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "ac9a07b2-30ac-48eb-aa5a-1ca2dd051899" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:40 compute-0 nova_compute[192810]: 2025-09-30 21:47:40.050 2 DEBUG oslo_concurrency.lockutils [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "ac9a07b2-30ac-48eb-aa5a-1ca2dd051899" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:40 compute-0 nova_compute[192810]: 2025-09-30 21:47:40.073 2 DEBUG nova.compute.manager [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:47:40 compute-0 nova_compute[192810]: 2025-09-30 21:47:40.235 2 DEBUG oslo_concurrency.lockutils [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:40 compute-0 nova_compute[192810]: 2025-09-30 21:47:40.236 2 DEBUG oslo_concurrency.lockutils [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:40 compute-0 nova_compute[192810]: 2025-09-30 21:47:40.248 2 DEBUG nova.virt.hardware [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:47:40 compute-0 nova_compute[192810]: 2025-09-30 21:47:40.248 2 INFO nova.compute.claims [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:47:40 compute-0 nova_compute[192810]: 2025-09-30 21:47:40.391 2 DEBUG nova.compute.provider_tree [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:47:40 compute-0 nova_compute[192810]: 2025-09-30 21:47:40.408 2 DEBUG nova.scheduler.client.report [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:47:40 compute-0 nova_compute[192810]: 2025-09-30 21:47:40.434 2 DEBUG oslo_concurrency.lockutils [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.198s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:40 compute-0 nova_compute[192810]: 2025-09-30 21:47:40.434 2 DEBUG nova.compute.manager [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:47:40 compute-0 nova_compute[192810]: 2025-09-30 21:47:40.494 2 DEBUG nova.compute.manager [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:47:40 compute-0 nova_compute[192810]: 2025-09-30 21:47:40.495 2 DEBUG nova.network.neutron [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:47:40 compute-0 nova_compute[192810]: 2025-09-30 21:47:40.517 2 INFO nova.virt.libvirt.driver [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:47:40 compute-0 nova_compute[192810]: 2025-09-30 21:47:40.537 2 DEBUG nova.compute.manager [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:47:40 compute-0 nova_compute[192810]: 2025-09-30 21:47:40.547 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:40 compute-0 nova_compute[192810]: 2025-09-30 21:47:40.684 2 DEBUG nova.compute.manager [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:47:40 compute-0 nova_compute[192810]: 2025-09-30 21:47:40.685 2 DEBUG nova.virt.libvirt.driver [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:47:40 compute-0 nova_compute[192810]: 2025-09-30 21:47:40.686 2 INFO nova.virt.libvirt.driver [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Creating image(s)
Sep 30 21:47:40 compute-0 nova_compute[192810]: 2025-09-30 21:47:40.686 2 DEBUG oslo_concurrency.lockutils [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "/var/lib/nova/instances/ac9a07b2-30ac-48eb-aa5a-1ca2dd051899/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:40 compute-0 nova_compute[192810]: 2025-09-30 21:47:40.687 2 DEBUG oslo_concurrency.lockutils [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "/var/lib/nova/instances/ac9a07b2-30ac-48eb-aa5a-1ca2dd051899/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:40 compute-0 nova_compute[192810]: 2025-09-30 21:47:40.687 2 DEBUG oslo_concurrency.lockutils [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "/var/lib/nova/instances/ac9a07b2-30ac-48eb-aa5a-1ca2dd051899/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:40 compute-0 nova_compute[192810]: 2025-09-30 21:47:40.705 2 DEBUG oslo_concurrency.processutils [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:47:40 compute-0 nova_compute[192810]: 2025-09-30 21:47:40.766 2 DEBUG oslo_concurrency.processutils [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:47:40 compute-0 nova_compute[192810]: 2025-09-30 21:47:40.767 2 DEBUG oslo_concurrency.lockutils [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:40 compute-0 nova_compute[192810]: 2025-09-30 21:47:40.768 2 DEBUG oslo_concurrency.lockutils [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:40 compute-0 nova_compute[192810]: 2025-09-30 21:47:40.783 2 DEBUG oslo_concurrency.processutils [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:47:40 compute-0 nova_compute[192810]: 2025-09-30 21:47:40.805 2 DEBUG nova.policy [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:47:40 compute-0 nova_compute[192810]: 2025-09-30 21:47:40.840 2 DEBUG oslo_concurrency.processutils [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:47:40 compute-0 nova_compute[192810]: 2025-09-30 21:47:40.841 2 DEBUG oslo_concurrency.processutils [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/ac9a07b2-30ac-48eb-aa5a-1ca2dd051899/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:47:41 compute-0 nova_compute[192810]: 2025-09-30 21:47:41.097 2 DEBUG oslo_concurrency.processutils [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/ac9a07b2-30ac-48eb-aa5a-1ca2dd051899/disk 1073741824" returned: 0 in 0.256s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:47:41 compute-0 nova_compute[192810]: 2025-09-30 21:47:41.098 2 DEBUG oslo_concurrency.lockutils [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.330s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:41 compute-0 nova_compute[192810]: 2025-09-30 21:47:41.099 2 DEBUG oslo_concurrency.processutils [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:47:41 compute-0 nova_compute[192810]: 2025-09-30 21:47:41.167 2 DEBUG oslo_concurrency.processutils [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:47:41 compute-0 nova_compute[192810]: 2025-09-30 21:47:41.168 2 DEBUG nova.virt.disk.api [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Checking if we can resize image /var/lib/nova/instances/ac9a07b2-30ac-48eb-aa5a-1ca2dd051899/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:47:41 compute-0 nova_compute[192810]: 2025-09-30 21:47:41.169 2 DEBUG oslo_concurrency.processutils [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ac9a07b2-30ac-48eb-aa5a-1ca2dd051899/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:47:41 compute-0 nova_compute[192810]: 2025-09-30 21:47:41.229 2 DEBUG oslo_concurrency.processutils [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ac9a07b2-30ac-48eb-aa5a-1ca2dd051899/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:47:41 compute-0 nova_compute[192810]: 2025-09-30 21:47:41.230 2 DEBUG nova.virt.disk.api [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Cannot resize image /var/lib/nova/instances/ac9a07b2-30ac-48eb-aa5a-1ca2dd051899/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:47:41 compute-0 nova_compute[192810]: 2025-09-30 21:47:41.230 2 DEBUG nova.objects.instance [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lazy-loading 'migration_context' on Instance uuid ac9a07b2-30ac-48eb-aa5a-1ca2dd051899 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:47:41 compute-0 nova_compute[192810]: 2025-09-30 21:47:41.253 2 DEBUG nova.virt.libvirt.driver [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:47:41 compute-0 nova_compute[192810]: 2025-09-30 21:47:41.253 2 DEBUG nova.virt.libvirt.driver [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Ensure instance console log exists: /var/lib/nova/instances/ac9a07b2-30ac-48eb-aa5a-1ca2dd051899/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:47:41 compute-0 nova_compute[192810]: 2025-09-30 21:47:41.254 2 DEBUG oslo_concurrency.lockutils [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:41 compute-0 nova_compute[192810]: 2025-09-30 21:47:41.254 2 DEBUG oslo_concurrency.lockutils [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:41 compute-0 nova_compute[192810]: 2025-09-30 21:47:41.254 2 DEBUG oslo_concurrency.lockutils [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:42 compute-0 nova_compute[192810]: 2025-09-30 21:47:42.008 2 DEBUG nova.network.neutron [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Successfully created port: 5f6f316e-fa72-471a-b66a-0d8be2032711 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:47:42 compute-0 nova_compute[192810]: 2025-09-30 21:47:42.814 2 DEBUG nova.network.neutron [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Successfully updated port: 5f6f316e-fa72-471a-b66a-0d8be2032711 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:47:42 compute-0 nova_compute[192810]: 2025-09-30 21:47:42.829 2 DEBUG oslo_concurrency.lockutils [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "refresh_cache-ac9a07b2-30ac-48eb-aa5a-1ca2dd051899" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:47:42 compute-0 nova_compute[192810]: 2025-09-30 21:47:42.829 2 DEBUG oslo_concurrency.lockutils [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquired lock "refresh_cache-ac9a07b2-30ac-48eb-aa5a-1ca2dd051899" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:47:42 compute-0 nova_compute[192810]: 2025-09-30 21:47:42.830 2 DEBUG nova.network.neutron [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:47:42 compute-0 nova_compute[192810]: 2025-09-30 21:47:42.931 2 DEBUG nova.compute.manager [req-85242589-5863-4b73-96b0-7af847f6cd84 req-80bebed1-ba5f-43d3-bbba-4c51c08c74a7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Received event network-changed-5f6f316e-fa72-471a-b66a-0d8be2032711 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:47:42 compute-0 nova_compute[192810]: 2025-09-30 21:47:42.932 2 DEBUG nova.compute.manager [req-85242589-5863-4b73-96b0-7af847f6cd84 req-80bebed1-ba5f-43d3-bbba-4c51c08c74a7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Refreshing instance network info cache due to event network-changed-5f6f316e-fa72-471a-b66a-0d8be2032711. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:47:42 compute-0 nova_compute[192810]: 2025-09-30 21:47:42.932 2 DEBUG oslo_concurrency.lockutils [req-85242589-5863-4b73-96b0-7af847f6cd84 req-80bebed1-ba5f-43d3-bbba-4c51c08c74a7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-ac9a07b2-30ac-48eb-aa5a-1ca2dd051899" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:47:43 compute-0 nova_compute[192810]: 2025-09-30 21:47:43.049 2 DEBUG nova.network.neutron [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:47:43 compute-0 nova_compute[192810]: 2025-09-30 21:47:43.164 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.003 2 DEBUG nova.network.neutron [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Updating instance_info_cache with network_info: [{"id": "5f6f316e-fa72-471a-b66a-0d8be2032711", "address": "fa:16:3e:dd:92:4b", "network": {"id": "de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a", "bridge": "br-int", "label": "tempest-network-smoke--286237116", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f6f316e-fa", "ovs_interfaceid": "5f6f316e-fa72-471a-b66a-0d8be2032711", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.022 2 DEBUG oslo_concurrency.lockutils [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Releasing lock "refresh_cache-ac9a07b2-30ac-48eb-aa5a-1ca2dd051899" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.023 2 DEBUG nova.compute.manager [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Instance network_info: |[{"id": "5f6f316e-fa72-471a-b66a-0d8be2032711", "address": "fa:16:3e:dd:92:4b", "network": {"id": "de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a", "bridge": "br-int", "label": "tempest-network-smoke--286237116", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f6f316e-fa", "ovs_interfaceid": "5f6f316e-fa72-471a-b66a-0d8be2032711", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.023 2 DEBUG oslo_concurrency.lockutils [req-85242589-5863-4b73-96b0-7af847f6cd84 req-80bebed1-ba5f-43d3-bbba-4c51c08c74a7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-ac9a07b2-30ac-48eb-aa5a-1ca2dd051899" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.023 2 DEBUG nova.network.neutron [req-85242589-5863-4b73-96b0-7af847f6cd84 req-80bebed1-ba5f-43d3-bbba-4c51c08c74a7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Refreshing network info cache for port 5f6f316e-fa72-471a-b66a-0d8be2032711 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.026 2 DEBUG nova.virt.libvirt.driver [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Start _get_guest_xml network_info=[{"id": "5f6f316e-fa72-471a-b66a-0d8be2032711", "address": "fa:16:3e:dd:92:4b", "network": {"id": "de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a", "bridge": "br-int", "label": "tempest-network-smoke--286237116", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f6f316e-fa", "ovs_interfaceid": "5f6f316e-fa72-471a-b66a-0d8be2032711", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.030 2 WARNING nova.virt.libvirt.driver [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.054 2 DEBUG nova.virt.libvirt.host [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.055 2 DEBUG nova.virt.libvirt.host [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.060 2 DEBUG nova.virt.libvirt.host [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.061 2 DEBUG nova.virt.libvirt.host [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.062 2 DEBUG nova.virt.libvirt.driver [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.063 2 DEBUG nova.virt.hardware [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.063 2 DEBUG nova.virt.hardware [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.063 2 DEBUG nova.virt.hardware [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.063 2 DEBUG nova.virt.hardware [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.064 2 DEBUG nova.virt.hardware [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.064 2 DEBUG nova.virt.hardware [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.064 2 DEBUG nova.virt.hardware [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.064 2 DEBUG nova.virt.hardware [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.064 2 DEBUG nova.virt.hardware [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.065 2 DEBUG nova.virt.hardware [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.065 2 DEBUG nova.virt.hardware [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.068 2 DEBUG nova.virt.libvirt.vif [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:47:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-gen-1-586542230',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-gen-1-586542230',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2108116341-ge',id=160,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE4+MYuJWLfNBVP0wpYYzJxFrenLU5zgnS4SMasXyiRNm0dScaN7o0VPG4pxRsoU9vaxfEh15PDw/b0q/pvNoO7EJ1m5Y6fW708LJtHheq7BeanpBXBHMJIyw9csPp7m6g==',key_name='tempest-TestSecurityGroupsBasicOps-376900167',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1ff42902541948f7a6df344fac87c2b7',ramdisk_id='',reservation_id='r-0hhugji1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2108116341',owner_user_name='tempest-TestSecurityGroupsBasicOps-2108116341-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:47:40Z,user_data=None,user_id='c33a752ef8234bba917ace1e73763490',uuid=ac9a07b2-30ac-48eb-aa5a-1ca2dd051899,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5f6f316e-fa72-471a-b66a-0d8be2032711", "address": "fa:16:3e:dd:92:4b", "network": {"id": "de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a", "bridge": "br-int", "label": "tempest-network-smoke--286237116", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f6f316e-fa", "ovs_interfaceid": "5f6f316e-fa72-471a-b66a-0d8be2032711", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.069 2 DEBUG nova.network.os_vif_util [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Converting VIF {"id": "5f6f316e-fa72-471a-b66a-0d8be2032711", "address": "fa:16:3e:dd:92:4b", "network": {"id": "de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a", "bridge": "br-int", "label": "tempest-network-smoke--286237116", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f6f316e-fa", "ovs_interfaceid": "5f6f316e-fa72-471a-b66a-0d8be2032711", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.069 2 DEBUG nova.network.os_vif_util [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dd:92:4b,bridge_name='br-int',has_traffic_filtering=True,id=5f6f316e-fa72-471a-b66a-0d8be2032711,network=Network(de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f6f316e-fa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.070 2 DEBUG nova.objects.instance [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lazy-loading 'pci_devices' on Instance uuid ac9a07b2-30ac-48eb-aa5a-1ca2dd051899 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.090 2 DEBUG nova.virt.libvirt.driver [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:47:44 compute-0 nova_compute[192810]:   <uuid>ac9a07b2-30ac-48eb-aa5a-1ca2dd051899</uuid>
Sep 30 21:47:44 compute-0 nova_compute[192810]:   <name>instance-000000a0</name>
Sep 30 21:47:44 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:47:44 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:47:44 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:47:44 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:       <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-gen-1-586542230</nova:name>
Sep 30 21:47:44 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:47:44</nova:creationTime>
Sep 30 21:47:44 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:47:44 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:47:44 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:47:44 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:47:44 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:47:44 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:47:44 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:47:44 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:47:44 compute-0 nova_compute[192810]:         <nova:user uuid="c33a752ef8234bba917ace1e73763490">tempest-TestSecurityGroupsBasicOps-2108116341-project-member</nova:user>
Sep 30 21:47:44 compute-0 nova_compute[192810]:         <nova:project uuid="1ff42902541948f7a6df344fac87c2b7">tempest-TestSecurityGroupsBasicOps-2108116341</nova:project>
Sep 30 21:47:44 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:47:44 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:47:44 compute-0 nova_compute[192810]:         <nova:port uuid="5f6f316e-fa72-471a-b66a-0d8be2032711">
Sep 30 21:47:44 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:47:44 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:47:44 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:47:44 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:47:44 compute-0 nova_compute[192810]:     <system>
Sep 30 21:47:44 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:47:44 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:47:44 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:47:44 compute-0 nova_compute[192810]:       <entry name="serial">ac9a07b2-30ac-48eb-aa5a-1ca2dd051899</entry>
Sep 30 21:47:44 compute-0 nova_compute[192810]:       <entry name="uuid">ac9a07b2-30ac-48eb-aa5a-1ca2dd051899</entry>
Sep 30 21:47:44 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     </system>
Sep 30 21:47:44 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:47:44 compute-0 nova_compute[192810]:   <os>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:   </os>
Sep 30 21:47:44 compute-0 nova_compute[192810]:   <features>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:   </features>
Sep 30 21:47:44 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:47:44 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:47:44 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:47:44 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:47:44 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:47:44 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/ac9a07b2-30ac-48eb-aa5a-1ca2dd051899/disk"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:47:44 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/ac9a07b2-30ac-48eb-aa5a-1ca2dd051899/disk.config"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:47:44 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:dd:92:4b"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:       <target dev="tap5f6f316e-fa"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:47:44 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/ac9a07b2-30ac-48eb-aa5a-1ca2dd051899/console.log" append="off"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     <video>
Sep 30 21:47:44 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     </video>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:47:44 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:47:44 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:47:44 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:47:44 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:47:44 compute-0 nova_compute[192810]: </domain>
Sep 30 21:47:44 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.092 2 DEBUG nova.compute.manager [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Preparing to wait for external event network-vif-plugged-5f6f316e-fa72-471a-b66a-0d8be2032711 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.092 2 DEBUG oslo_concurrency.lockutils [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "ac9a07b2-30ac-48eb-aa5a-1ca2dd051899-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.092 2 DEBUG oslo_concurrency.lockutils [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "ac9a07b2-30ac-48eb-aa5a-1ca2dd051899-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.092 2 DEBUG oslo_concurrency.lockutils [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "ac9a07b2-30ac-48eb-aa5a-1ca2dd051899-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.093 2 DEBUG nova.virt.libvirt.vif [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:47:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-gen-1-586542230',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-gen-1-586542230',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2108116341-ge',id=160,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE4+MYuJWLfNBVP0wpYYzJxFrenLU5zgnS4SMasXyiRNm0dScaN7o0VPG4pxRsoU9vaxfEh15PDw/b0q/pvNoO7EJ1m5Y6fW708LJtHheq7BeanpBXBHMJIyw9csPp7m6g==',key_name='tempest-TestSecurityGroupsBasicOps-376900167',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1ff42902541948f7a6df344fac87c2b7',ramdisk_id='',reservation_id='r-0hhugji1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2108116341',owner_user_name='tempest-TestSecurityGroupsBasicOps-2108116341-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:47:40Z,user_data=None,user_id='c33a752ef8234bba917ace1e73763490',uuid=ac9a07b2-30ac-48eb-aa5a-1ca2dd051899,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5f6f316e-fa72-471a-b66a-0d8be2032711", "address": "fa:16:3e:dd:92:4b", "network": {"id": "de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a", "bridge": "br-int", "label": "tempest-network-smoke--286237116", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f6f316e-fa", "ovs_interfaceid": "5f6f316e-fa72-471a-b66a-0d8be2032711", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.093 2 DEBUG nova.network.os_vif_util [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Converting VIF {"id": "5f6f316e-fa72-471a-b66a-0d8be2032711", "address": "fa:16:3e:dd:92:4b", "network": {"id": "de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a", "bridge": "br-int", "label": "tempest-network-smoke--286237116", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f6f316e-fa", "ovs_interfaceid": "5f6f316e-fa72-471a-b66a-0d8be2032711", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.094 2 DEBUG nova.network.os_vif_util [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dd:92:4b,bridge_name='br-int',has_traffic_filtering=True,id=5f6f316e-fa72-471a-b66a-0d8be2032711,network=Network(de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f6f316e-fa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.094 2 DEBUG os_vif [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:92:4b,bridge_name='br-int',has_traffic_filtering=True,id=5f6f316e-fa72-471a-b66a-0d8be2032711,network=Network(de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f6f316e-fa') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.095 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.095 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.095 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.098 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5f6f316e-fa, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.098 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5f6f316e-fa, col_values=(('external_ids', {'iface-id': '5f6f316e-fa72-471a-b66a-0d8be2032711', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:dd:92:4b', 'vm-uuid': 'ac9a07b2-30ac-48eb-aa5a-1ca2dd051899'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.100 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:44 compute-0 NetworkManager[51733]: <info>  [1759268864.1014] manager: (tap5f6f316e-fa): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/269)
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.109 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.109 2 INFO os_vif [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:92:4b,bridge_name='br-int',has_traffic_filtering=True,id=5f6f316e-fa72-471a-b66a-0d8be2032711,network=Network(de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f6f316e-fa')
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.231 2 DEBUG nova.virt.libvirt.driver [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.231 2 DEBUG nova.virt.libvirt.driver [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.232 2 DEBUG nova.virt.libvirt.driver [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] No VIF found with MAC fa:16:3e:dd:92:4b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.232 2 INFO nova.virt.libvirt.driver [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Using config drive
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.966 2 INFO nova.virt.libvirt.driver [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Creating config drive at /var/lib/nova/instances/ac9a07b2-30ac-48eb-aa5a-1ca2dd051899/disk.config
Sep 30 21:47:44 compute-0 nova_compute[192810]: 2025-09-30 21:47:44.971 2 DEBUG oslo_concurrency.processutils [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ac9a07b2-30ac-48eb-aa5a-1ca2dd051899/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpy7ztgr47 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:47:45 compute-0 nova_compute[192810]: 2025-09-30 21:47:45.103 2 DEBUG oslo_concurrency.processutils [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ac9a07b2-30ac-48eb-aa5a-1ca2dd051899/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpy7ztgr47" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:47:45 compute-0 NetworkManager[51733]: <info>  [1759268865.1772] manager: (tap5f6f316e-fa): new Tun device (/org/freedesktop/NetworkManager/Devices/270)
Sep 30 21:47:45 compute-0 kernel: tap5f6f316e-fa: entered promiscuous mode
Sep 30 21:47:45 compute-0 nova_compute[192810]: 2025-09-30 21:47:45.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:45 compute-0 ovn_controller[94912]: 2025-09-30T21:47:45Z|00600|binding|INFO|Claiming lport 5f6f316e-fa72-471a-b66a-0d8be2032711 for this chassis.
Sep 30 21:47:45 compute-0 ovn_controller[94912]: 2025-09-30T21:47:45Z|00601|binding|INFO|5f6f316e-fa72-471a-b66a-0d8be2032711: Claiming fa:16:3e:dd:92:4b 10.100.0.12
Sep 30 21:47:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:47:45.194 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:92:4b 10.100.0.12'], port_security=['fa:16:3e:dd:92:4b 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'ac9a07b2-30ac-48eb-aa5a-1ca2dd051899', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ff42902541948f7a6df344fac87c2b7', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4b0a7047-e1ba-4a65-996b-36ec918f1716', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d352e746-3e44-45ee-a4e9-724f90959366, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=5f6f316e-fa72-471a-b66a-0d8be2032711) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:47:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:47:45.195 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 5f6f316e-fa72-471a-b66a-0d8be2032711 in datapath de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a bound to our chassis
Sep 30 21:47:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:47:45.197 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a
Sep 30 21:47:45 compute-0 systemd-udevd[245089]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:47:45 compute-0 ovn_controller[94912]: 2025-09-30T21:47:45Z|00602|binding|INFO|Setting lport 5f6f316e-fa72-471a-b66a-0d8be2032711 up in Southbound
Sep 30 21:47:45 compute-0 ovn_controller[94912]: 2025-09-30T21:47:45Z|00603|binding|INFO|Setting lport 5f6f316e-fa72-471a-b66a-0d8be2032711 ovn-installed in OVS
Sep 30 21:47:45 compute-0 nova_compute[192810]: 2025-09-30 21:47:45.215 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:47:45.215 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[358c69e8-29dc-4a5e-bec4-4784c5fe2fd4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:45 compute-0 nova_compute[192810]: 2025-09-30 21:47:45.217 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:45 compute-0 NetworkManager[51733]: <info>  [1759268865.2287] device (tap5f6f316e-fa): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:47:45 compute-0 systemd-machined[152794]: New machine qemu-76-instance-000000a0.
Sep 30 21:47:45 compute-0 NetworkManager[51733]: <info>  [1759268865.2294] device (tap5f6f316e-fa): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:47:45 compute-0 systemd[1]: Started Virtual Machine qemu-76-instance-000000a0.
Sep 30 21:47:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:47:45.245 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[a40bdbc3-456c-423c-9e5e-3e4b0270f42d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:47:45.248 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[d6488d8a-0025-4521-a279-a49a4005f472]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:47:45.277 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[81cd967a-ef8b-42c8-be5f-c419a9c6377f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:47:45.293 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[c3a5a060-bd78-4b66-9342-6ce22669a122]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapde17a3c3-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:94:98:be'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 180], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 552707, 'reachable_time': 38081, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245099, 'error': None, 'target': 'ovnmeta-de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:47:45.310 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[56abe6b4-494b-49b3-9c66-122245b571b3]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapde17a3c3-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 552716, 'tstamp': 552716}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 245103, 'error': None, 'target': 'ovnmeta-de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapde17a3c3-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 552718, 'tstamp': 552718}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 245103, 'error': None, 'target': 'ovnmeta-de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:47:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:47:45.312 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapde17a3c3-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:47:45 compute-0 nova_compute[192810]: 2025-09-30 21:47:45.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:45 compute-0 nova_compute[192810]: 2025-09-30 21:47:45.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:47:45.315 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapde17a3c3-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:47:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:47:45.315 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:47:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:47:45.315 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapde17a3c3-f0, col_values=(('external_ids', {'iface-id': '49357550-70ff-49b5-806e-8d3ad548f4db'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:47:45 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:47:45.316 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:47:46 compute-0 nova_compute[192810]: 2025-09-30 21:47:46.553 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268866.5530903, ac9a07b2-30ac-48eb-aa5a-1ca2dd051899 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:47:46 compute-0 nova_compute[192810]: 2025-09-30 21:47:46.554 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] VM Started (Lifecycle Event)
Sep 30 21:47:46 compute-0 nova_compute[192810]: 2025-09-30 21:47:46.580 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:47:46 compute-0 nova_compute[192810]: 2025-09-30 21:47:46.584 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268866.554032, ac9a07b2-30ac-48eb-aa5a-1ca2dd051899 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:47:46 compute-0 nova_compute[192810]: 2025-09-30 21:47:46.584 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] VM Paused (Lifecycle Event)
Sep 30 21:47:46 compute-0 nova_compute[192810]: 2025-09-30 21:47:46.615 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:47:46 compute-0 nova_compute[192810]: 2025-09-30 21:47:46.618 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:47:46 compute-0 nova_compute[192810]: 2025-09-30 21:47:46.652 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:47:46 compute-0 nova_compute[192810]: 2025-09-30 21:47:46.707 2 DEBUG nova.network.neutron [req-85242589-5863-4b73-96b0-7af847f6cd84 req-80bebed1-ba5f-43d3-bbba-4c51c08c74a7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Updated VIF entry in instance network info cache for port 5f6f316e-fa72-471a-b66a-0d8be2032711. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:47:46 compute-0 nova_compute[192810]: 2025-09-30 21:47:46.708 2 DEBUG nova.network.neutron [req-85242589-5863-4b73-96b0-7af847f6cd84 req-80bebed1-ba5f-43d3-bbba-4c51c08c74a7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Updating instance_info_cache with network_info: [{"id": "5f6f316e-fa72-471a-b66a-0d8be2032711", "address": "fa:16:3e:dd:92:4b", "network": {"id": "de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a", "bridge": "br-int", "label": "tempest-network-smoke--286237116", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f6f316e-fa", "ovs_interfaceid": "5f6f316e-fa72-471a-b66a-0d8be2032711", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:47:46 compute-0 nova_compute[192810]: 2025-09-30 21:47:46.733 2 DEBUG oslo_concurrency.lockutils [req-85242589-5863-4b73-96b0-7af847f6cd84 req-80bebed1-ba5f-43d3-bbba-4c51c08c74a7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-ac9a07b2-30ac-48eb-aa5a-1ca2dd051899" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:47:47 compute-0 podman[245113]: 2025-09-30 21:47:47.321399014 +0000 UTC m=+0.057869532 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent)
Sep 30 21:47:47 compute-0 podman[245114]: 2025-09-30 21:47:47.328529498 +0000 UTC m=+0.064785831 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 21:47:47 compute-0 podman[245112]: 2025-09-30 21:47:47.349384227 +0000 UTC m=+0.085310132 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Sep 30 21:47:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:47:47.808 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=37, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=36) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:47:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:47:47.809 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:47:47 compute-0 nova_compute[192810]: 2025-09-30 21:47:47.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:48 compute-0 nova_compute[192810]: 2025-09-30 21:47:48.144 2 DEBUG nova.compute.manager [req-bece0134-063e-4507-9b10-71567da48f45 req-a9a5c374-e7f1-4cb6-bccf-1237fafdb8de dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Received event network-vif-plugged-5f6f316e-fa72-471a-b66a-0d8be2032711 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:47:48 compute-0 nova_compute[192810]: 2025-09-30 21:47:48.145 2 DEBUG oslo_concurrency.lockutils [req-bece0134-063e-4507-9b10-71567da48f45 req-a9a5c374-e7f1-4cb6-bccf-1237fafdb8de dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "ac9a07b2-30ac-48eb-aa5a-1ca2dd051899-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:48 compute-0 nova_compute[192810]: 2025-09-30 21:47:48.145 2 DEBUG oslo_concurrency.lockutils [req-bece0134-063e-4507-9b10-71567da48f45 req-a9a5c374-e7f1-4cb6-bccf-1237fafdb8de dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "ac9a07b2-30ac-48eb-aa5a-1ca2dd051899-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:48 compute-0 nova_compute[192810]: 2025-09-30 21:47:48.145 2 DEBUG oslo_concurrency.lockutils [req-bece0134-063e-4507-9b10-71567da48f45 req-a9a5c374-e7f1-4cb6-bccf-1237fafdb8de dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "ac9a07b2-30ac-48eb-aa5a-1ca2dd051899-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:48 compute-0 nova_compute[192810]: 2025-09-30 21:47:48.146 2 DEBUG nova.compute.manager [req-bece0134-063e-4507-9b10-71567da48f45 req-a9a5c374-e7f1-4cb6-bccf-1237fafdb8de dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Processing event network-vif-plugged-5f6f316e-fa72-471a-b66a-0d8be2032711 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:47:48 compute-0 nova_compute[192810]: 2025-09-30 21:47:48.146 2 DEBUG nova.compute.manager [req-bece0134-063e-4507-9b10-71567da48f45 req-a9a5c374-e7f1-4cb6-bccf-1237fafdb8de dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Received event network-vif-plugged-5f6f316e-fa72-471a-b66a-0d8be2032711 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:47:48 compute-0 nova_compute[192810]: 2025-09-30 21:47:48.146 2 DEBUG oslo_concurrency.lockutils [req-bece0134-063e-4507-9b10-71567da48f45 req-a9a5c374-e7f1-4cb6-bccf-1237fafdb8de dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "ac9a07b2-30ac-48eb-aa5a-1ca2dd051899-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:47:48 compute-0 nova_compute[192810]: 2025-09-30 21:47:48.146 2 DEBUG oslo_concurrency.lockutils [req-bece0134-063e-4507-9b10-71567da48f45 req-a9a5c374-e7f1-4cb6-bccf-1237fafdb8de dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "ac9a07b2-30ac-48eb-aa5a-1ca2dd051899-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:47:48 compute-0 nova_compute[192810]: 2025-09-30 21:47:48.147 2 DEBUG oslo_concurrency.lockutils [req-bece0134-063e-4507-9b10-71567da48f45 req-a9a5c374-e7f1-4cb6-bccf-1237fafdb8de dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "ac9a07b2-30ac-48eb-aa5a-1ca2dd051899-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:48 compute-0 nova_compute[192810]: 2025-09-30 21:47:48.147 2 DEBUG nova.compute.manager [req-bece0134-063e-4507-9b10-71567da48f45 req-a9a5c374-e7f1-4cb6-bccf-1237fafdb8de dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] No waiting events found dispatching network-vif-plugged-5f6f316e-fa72-471a-b66a-0d8be2032711 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:47:48 compute-0 nova_compute[192810]: 2025-09-30 21:47:48.147 2 WARNING nova.compute.manager [req-bece0134-063e-4507-9b10-71567da48f45 req-a9a5c374-e7f1-4cb6-bccf-1237fafdb8de dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Received unexpected event network-vif-plugged-5f6f316e-fa72-471a-b66a-0d8be2032711 for instance with vm_state building and task_state spawning.
Sep 30 21:47:48 compute-0 nova_compute[192810]: 2025-09-30 21:47:48.147 2 DEBUG nova.compute.manager [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:47:48 compute-0 nova_compute[192810]: 2025-09-30 21:47:48.151 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268868.150849, ac9a07b2-30ac-48eb-aa5a-1ca2dd051899 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:47:48 compute-0 nova_compute[192810]: 2025-09-30 21:47:48.151 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] VM Resumed (Lifecycle Event)
Sep 30 21:47:48 compute-0 nova_compute[192810]: 2025-09-30 21:47:48.153 2 DEBUG nova.virt.libvirt.driver [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:47:48 compute-0 nova_compute[192810]: 2025-09-30 21:47:48.156 2 INFO nova.virt.libvirt.driver [-] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Instance spawned successfully.
Sep 30 21:47:48 compute-0 nova_compute[192810]: 2025-09-30 21:47:48.156 2 DEBUG nova.virt.libvirt.driver [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:47:48 compute-0 nova_compute[192810]: 2025-09-30 21:47:48.198 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:47:48 compute-0 nova_compute[192810]: 2025-09-30 21:47:48.199 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:48 compute-0 nova_compute[192810]: 2025-09-30 21:47:48.203 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:47:48 compute-0 nova_compute[192810]: 2025-09-30 21:47:48.210 2 DEBUG nova.virt.libvirt.driver [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:47:48 compute-0 nova_compute[192810]: 2025-09-30 21:47:48.210 2 DEBUG nova.virt.libvirt.driver [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:47:48 compute-0 nova_compute[192810]: 2025-09-30 21:47:48.211 2 DEBUG nova.virt.libvirt.driver [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:47:48 compute-0 nova_compute[192810]: 2025-09-30 21:47:48.211 2 DEBUG nova.virt.libvirt.driver [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:47:48 compute-0 nova_compute[192810]: 2025-09-30 21:47:48.212 2 DEBUG nova.virt.libvirt.driver [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:47:48 compute-0 nova_compute[192810]: 2025-09-30 21:47:48.212 2 DEBUG nova.virt.libvirt.driver [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:47:48 compute-0 nova_compute[192810]: 2025-09-30 21:47:48.248 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:47:48 compute-0 nova_compute[192810]: 2025-09-30 21:47:48.304 2 INFO nova.compute.manager [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Took 7.62 seconds to spawn the instance on the hypervisor.
Sep 30 21:47:48 compute-0 nova_compute[192810]: 2025-09-30 21:47:48.304 2 DEBUG nova.compute.manager [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:47:48 compute-0 nova_compute[192810]: 2025-09-30 21:47:48.394 2 INFO nova.compute.manager [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Took 8.23 seconds to build instance.
Sep 30 21:47:48 compute-0 nova_compute[192810]: 2025-09-30 21:47:48.439 2 DEBUG oslo_concurrency.lockutils [None req-2934c17a-92f9-4f4c-90e6-792b3ee66d81 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "ac9a07b2-30ac-48eb-aa5a-1ca2dd051899" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.390s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:47:49 compute-0 nova_compute[192810]: 2025-09-30 21:47:49.101 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:50 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:47:50.811 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '37'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:47:53 compute-0 nova_compute[192810]: 2025-09-30 21:47:53.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:53 compute-0 nova_compute[192810]: 2025-09-30 21:47:53.997 2 DEBUG nova.compute.manager [req-8e8e6404-cee2-406d-a7f7-35999b33091f req-d0c844c9-a998-4e15-be14-66020f3ee676 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Received event network-changed-5f6f316e-fa72-471a-b66a-0d8be2032711 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:47:53 compute-0 nova_compute[192810]: 2025-09-30 21:47:53.998 2 DEBUG nova.compute.manager [req-8e8e6404-cee2-406d-a7f7-35999b33091f req-d0c844c9-a998-4e15-be14-66020f3ee676 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Refreshing instance network info cache due to event network-changed-5f6f316e-fa72-471a-b66a-0d8be2032711. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:47:53 compute-0 nova_compute[192810]: 2025-09-30 21:47:53.998 2 DEBUG oslo_concurrency.lockutils [req-8e8e6404-cee2-406d-a7f7-35999b33091f req-d0c844c9-a998-4e15-be14-66020f3ee676 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-ac9a07b2-30ac-48eb-aa5a-1ca2dd051899" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:47:53 compute-0 nova_compute[192810]: 2025-09-30 21:47:53.999 2 DEBUG oslo_concurrency.lockutils [req-8e8e6404-cee2-406d-a7f7-35999b33091f req-d0c844c9-a998-4e15-be14-66020f3ee676 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-ac9a07b2-30ac-48eb-aa5a-1ca2dd051899" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:47:53 compute-0 nova_compute[192810]: 2025-09-30 21:47:53.999 2 DEBUG nova.network.neutron [req-8e8e6404-cee2-406d-a7f7-35999b33091f req-d0c844c9-a998-4e15-be14-66020f3ee676 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Refreshing network info cache for port 5f6f316e-fa72-471a-b66a-0d8be2032711 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:47:54 compute-0 nova_compute[192810]: 2025-09-30 21:47:54.102 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:55 compute-0 podman[245175]: 2025-09-30 21:47:55.308474249 +0000 UTC m=+0.049280333 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Sep 30 21:47:55 compute-0 podman[245176]: 2025-09-30 21:47:55.310753874 +0000 UTC m=+0.049565570 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, version=9.6, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9)
Sep 30 21:47:55 compute-0 nova_compute[192810]: 2025-09-30 21:47:55.757 2 DEBUG nova.network.neutron [req-8e8e6404-cee2-406d-a7f7-35999b33091f req-d0c844c9-a998-4e15-be14-66020f3ee676 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Updated VIF entry in instance network info cache for port 5f6f316e-fa72-471a-b66a-0d8be2032711. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:47:55 compute-0 nova_compute[192810]: 2025-09-30 21:47:55.758 2 DEBUG nova.network.neutron [req-8e8e6404-cee2-406d-a7f7-35999b33091f req-d0c844c9-a998-4e15-be14-66020f3ee676 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Updating instance_info_cache with network_info: [{"id": "5f6f316e-fa72-471a-b66a-0d8be2032711", "address": "fa:16:3e:dd:92:4b", "network": {"id": "de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a", "bridge": "br-int", "label": "tempest-network-smoke--286237116", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f6f316e-fa", "ovs_interfaceid": "5f6f316e-fa72-471a-b66a-0d8be2032711", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:47:55 compute-0 nova_compute[192810]: 2025-09-30 21:47:55.776 2 DEBUG oslo_concurrency.lockutils [req-8e8e6404-cee2-406d-a7f7-35999b33091f req-d0c844c9-a998-4e15-be14-66020f3ee676 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-ac9a07b2-30ac-48eb-aa5a-1ca2dd051899" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:47:56 compute-0 nova_compute[192810]: 2025-09-30 21:47:56.098 2 DEBUG nova.compute.manager [req-a30610a8-f4ab-4918-8477-06314c943906 req-ab2f3965-b2a2-4943-ae65-2e3c16ebd85c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Received event network-changed-5f6f316e-fa72-471a-b66a-0d8be2032711 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:47:56 compute-0 nova_compute[192810]: 2025-09-30 21:47:56.098 2 DEBUG nova.compute.manager [req-a30610a8-f4ab-4918-8477-06314c943906 req-ab2f3965-b2a2-4943-ae65-2e3c16ebd85c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Refreshing instance network info cache due to event network-changed-5f6f316e-fa72-471a-b66a-0d8be2032711. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:47:56 compute-0 nova_compute[192810]: 2025-09-30 21:47:56.098 2 DEBUG oslo_concurrency.lockutils [req-a30610a8-f4ab-4918-8477-06314c943906 req-ab2f3965-b2a2-4943-ae65-2e3c16ebd85c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-ac9a07b2-30ac-48eb-aa5a-1ca2dd051899" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:47:56 compute-0 nova_compute[192810]: 2025-09-30 21:47:56.099 2 DEBUG oslo_concurrency.lockutils [req-a30610a8-f4ab-4918-8477-06314c943906 req-ab2f3965-b2a2-4943-ae65-2e3c16ebd85c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-ac9a07b2-30ac-48eb-aa5a-1ca2dd051899" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:47:56 compute-0 nova_compute[192810]: 2025-09-30 21:47:56.099 2 DEBUG nova.network.neutron [req-a30610a8-f4ab-4918-8477-06314c943906 req-ab2f3965-b2a2-4943-ae65-2e3c16ebd85c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Refreshing network info cache for port 5f6f316e-fa72-471a-b66a-0d8be2032711 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:47:57 compute-0 nova_compute[192810]: 2025-09-30 21:47:57.859 2 DEBUG nova.network.neutron [req-a30610a8-f4ab-4918-8477-06314c943906 req-ab2f3965-b2a2-4943-ae65-2e3c16ebd85c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Updated VIF entry in instance network info cache for port 5f6f316e-fa72-471a-b66a-0d8be2032711. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:47:57 compute-0 nova_compute[192810]: 2025-09-30 21:47:57.860 2 DEBUG nova.network.neutron [req-a30610a8-f4ab-4918-8477-06314c943906 req-ab2f3965-b2a2-4943-ae65-2e3c16ebd85c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Updating instance_info_cache with network_info: [{"id": "5f6f316e-fa72-471a-b66a-0d8be2032711", "address": "fa:16:3e:dd:92:4b", "network": {"id": "de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a", "bridge": "br-int", "label": "tempest-network-smoke--286237116", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f6f316e-fa", "ovs_interfaceid": "5f6f316e-fa72-471a-b66a-0d8be2032711", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:47:57 compute-0 nova_compute[192810]: 2025-09-30 21:47:57.884 2 DEBUG oslo_concurrency.lockutils [req-a30610a8-f4ab-4918-8477-06314c943906 req-ab2f3965-b2a2-4943-ae65-2e3c16ebd85c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-ac9a07b2-30ac-48eb-aa5a-1ca2dd051899" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:47:58 compute-0 nova_compute[192810]: 2025-09-30 21:47:58.201 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:47:58 compute-0 nova_compute[192810]: 2025-09-30 21:47:58.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:47:59 compute-0 nova_compute[192810]: 2025-09-30 21:47:59.105 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:00 compute-0 ovn_controller[94912]: 2025-09-30T21:48:00Z|00066|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:dd:92:4b 10.100.0.12
Sep 30 21:48:00 compute-0 ovn_controller[94912]: 2025-09-30T21:48:00Z|00067|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:dd:92:4b 10.100.0.12
Sep 30 21:48:01 compute-0 ovn_controller[94912]: 2025-09-30T21:48:01Z|00604|binding|INFO|Releasing lport 49357550-70ff-49b5-806e-8d3ad548f4db from this chassis (sb_readonly=0)
Sep 30 21:48:01 compute-0 nova_compute[192810]: 2025-09-30 21:48:01.786 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:48:01 compute-0 nova_compute[192810]: 2025-09-30 21:48:01.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:48:01 compute-0 nova_compute[192810]: 2025-09-30 21:48:01.787 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:48:01 compute-0 nova_compute[192810]: 2025-09-30 21:48:01.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:03 compute-0 nova_compute[192810]: 2025-09-30 21:48:03.203 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:03 compute-0 podman[245240]: 2025-09-30 21:48:03.313363217 +0000 UTC m=+0.045710186 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 21:48:03 compute-0 podman[245239]: 2025-09-30 21:48:03.324568021 +0000 UTC m=+0.058891078 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, managed_by=edpm_ansible)
Sep 30 21:48:03 compute-0 podman[245238]: 2025-09-30 21:48:03.370341547 +0000 UTC m=+0.097265163 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250923, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Sep 30 21:48:04 compute-0 nova_compute[192810]: 2025-09-30 21:48:04.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:05 compute-0 nova_compute[192810]: 2025-09-30 21:48:05.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:48:05 compute-0 nova_compute[192810]: 2025-09-30 21:48:05.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:48:06 compute-0 nova_compute[192810]: 2025-09-30 21:48:06.784 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:48:06 compute-0 nova_compute[192810]: 2025-09-30 21:48:06.786 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:48:06 compute-0 nova_compute[192810]: 2025-09-30 21:48:06.787 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:48:06 compute-0 nova_compute[192810]: 2025-09-30 21:48:06.787 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:48:06 compute-0 nova_compute[192810]: 2025-09-30 21:48:06.858 2 DEBUG oslo_concurrency.lockutils [None req-22c8ee9c-0ceb-4d49-81e7-2995fa50f6af c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "ac9a07b2-30ac-48eb-aa5a-1ca2dd051899" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:06 compute-0 nova_compute[192810]: 2025-09-30 21:48:06.858 2 DEBUG oslo_concurrency.lockutils [None req-22c8ee9c-0ceb-4d49-81e7-2995fa50f6af c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "ac9a07b2-30ac-48eb-aa5a-1ca2dd051899" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:06 compute-0 nova_compute[192810]: 2025-09-30 21:48:06.859 2 DEBUG oslo_concurrency.lockutils [None req-22c8ee9c-0ceb-4d49-81e7-2995fa50f6af c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "ac9a07b2-30ac-48eb-aa5a-1ca2dd051899-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:06 compute-0 nova_compute[192810]: 2025-09-30 21:48:06.859 2 DEBUG oslo_concurrency.lockutils [None req-22c8ee9c-0ceb-4d49-81e7-2995fa50f6af c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "ac9a07b2-30ac-48eb-aa5a-1ca2dd051899-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:06 compute-0 nova_compute[192810]: 2025-09-30 21:48:06.859 2 DEBUG oslo_concurrency.lockutils [None req-22c8ee9c-0ceb-4d49-81e7-2995fa50f6af c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "ac9a07b2-30ac-48eb-aa5a-1ca2dd051899-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:06 compute-0 nova_compute[192810]: 2025-09-30 21:48:06.868 2 INFO nova.compute.manager [None req-22c8ee9c-0ceb-4d49-81e7-2995fa50f6af c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Terminating instance
Sep 30 21:48:06 compute-0 nova_compute[192810]: 2025-09-30 21:48:06.878 2 DEBUG nova.compute.manager [None req-22c8ee9c-0ceb-4d49-81e7-2995fa50f6af c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:48:06 compute-0 kernel: tap5f6f316e-fa (unregistering): left promiscuous mode
Sep 30 21:48:06 compute-0 NetworkManager[51733]: <info>  [1759268886.8983] device (tap5f6f316e-fa): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:48:06 compute-0 ovn_controller[94912]: 2025-09-30T21:48:06Z|00605|binding|INFO|Releasing lport 5f6f316e-fa72-471a-b66a-0d8be2032711 from this chassis (sb_readonly=0)
Sep 30 21:48:06 compute-0 ovn_controller[94912]: 2025-09-30T21:48:06Z|00606|binding|INFO|Setting lport 5f6f316e-fa72-471a-b66a-0d8be2032711 down in Southbound
Sep 30 21:48:06 compute-0 ovn_controller[94912]: 2025-09-30T21:48:06Z|00607|binding|INFO|Removing iface tap5f6f316e-fa ovn-installed in OVS
Sep 30 21:48:06 compute-0 nova_compute[192810]: 2025-09-30 21:48:06.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:06.914 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:92:4b 10.100.0.12', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'ac9a07b2-30ac-48eb-aa5a-1ca2dd051899', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ff42902541948f7a6df344fac87c2b7', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d352e746-3e44-45ee-a4e9-724f90959366, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=5f6f316e-fa72-471a-b66a-0d8be2032711) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:48:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:06.915 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 5f6f316e-fa72-471a-b66a-0d8be2032711 in datapath de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a unbound from our chassis
Sep 30 21:48:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:06.916 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a
Sep 30 21:48:06 compute-0 nova_compute[192810]: 2025-09-30 21:48:06.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:06.931 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[b33ae6b2-a33f-4fe6-930b-cb9692b3bbd8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:06.957 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[1023028c-519e-4580-aef3-30932ac1507b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:06.960 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[cfe6c710-b6d4-4c18-80bf-07a69ac8d6cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:06 compute-0 systemd[1]: machine-qemu\x2d76\x2dinstance\x2d000000a0.scope: Deactivated successfully.
Sep 30 21:48:06 compute-0 systemd[1]: machine-qemu\x2d76\x2dinstance\x2d000000a0.scope: Consumed 12.971s CPU time.
Sep 30 21:48:06 compute-0 systemd-machined[152794]: Machine qemu-76-instance-000000a0 terminated.
Sep 30 21:48:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:06.985 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[0fc2a104-79a6-425b-9c81-0d51e8f09357]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:06 compute-0 nova_compute[192810]: 2025-09-30 21:48:06.993 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "refresh_cache-c16f682b-0141-4298-b21c-d374321c2f4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:48:06 compute-0 nova_compute[192810]: 2025-09-30 21:48:06.993 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquired lock "refresh_cache-c16f682b-0141-4298-b21c-d374321c2f4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:48:06 compute-0 nova_compute[192810]: 2025-09-30 21:48:06.993 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Sep 30 21:48:06 compute-0 nova_compute[192810]: 2025-09-30 21:48:06.993 2 DEBUG nova.objects.instance [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lazy-loading 'info_cache' on Instance uuid c16f682b-0141-4298-b21c-d374321c2f4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:48:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:07.004 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[88b00858-4643-4c27-931c-c3a9ef8bf5fd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapde17a3c3-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:94:98:be'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 21, 'tx_packets': 7, 'rx_bytes': 1670, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 21, 'tx_packets': 7, 'rx_bytes': 1670, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 180], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 552707, 'reachable_time': 38081, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 15, 'inoctets': 1208, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 15, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 1208, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 15, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245312, 'error': None, 'target': 'ovnmeta-de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:07.020 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[afe3703f-fcbc-415a-b8cc-efe8f29f7d8a]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapde17a3c3-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 552716, 'tstamp': 552716}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 245313, 'error': None, 'target': 'ovnmeta-de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapde17a3c3-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 552718, 'tstamp': 552718}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 245313, 'error': None, 'target': 'ovnmeta-de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:07.022 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapde17a3c3-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:48:07 compute-0 nova_compute[192810]: 2025-09-30 21:48:07.023 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:07 compute-0 nova_compute[192810]: 2025-09-30 21:48:07.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:07.028 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapde17a3c3-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:48:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:07.028 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:48:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:07.029 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapde17a3c3-f0, col_values=(('external_ids', {'iface-id': '49357550-70ff-49b5-806e-8d3ad548f4db'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:48:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:07.029 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:48:07 compute-0 nova_compute[192810]: 2025-09-30 21:48:07.135 2 INFO nova.virt.libvirt.driver [-] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Instance destroyed successfully.
Sep 30 21:48:07 compute-0 nova_compute[192810]: 2025-09-30 21:48:07.136 2 DEBUG nova.objects.instance [None req-22c8ee9c-0ceb-4d49-81e7-2995fa50f6af c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lazy-loading 'resources' on Instance uuid ac9a07b2-30ac-48eb-aa5a-1ca2dd051899 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:48:07 compute-0 nova_compute[192810]: 2025-09-30 21:48:07.163 2 DEBUG nova.virt.libvirt.vif [None req-22c8ee9c-0ceb-4d49-81e7-2995fa50f6af c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:47:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-gen-1-586542230',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-gen-1-586542230',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2108116341-ge',id=160,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE4+MYuJWLfNBVP0wpYYzJxFrenLU5zgnS4SMasXyiRNm0dScaN7o0VPG4pxRsoU9vaxfEh15PDw/b0q/pvNoO7EJ1m5Y6fW708LJtHheq7BeanpBXBHMJIyw9csPp7m6g==',key_name='tempest-TestSecurityGroupsBasicOps-376900167',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:47:48Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1ff42902541948f7a6df344fac87c2b7',ramdisk_id='',reservation_id='r-0hhugji1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-2108116341',owner_user_name='tempest-TestSecurityGroupsBasicOps-2108116341-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:47:48Z,user_data=None,user_id='c33a752ef8234bba917ace1e73763490',uuid=ac9a07b2-30ac-48eb-aa5a-1ca2dd051899,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5f6f316e-fa72-471a-b66a-0d8be2032711", "address": "fa:16:3e:dd:92:4b", "network": {"id": "de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a", "bridge": "br-int", "label": "tempest-network-smoke--286237116", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f6f316e-fa", "ovs_interfaceid": "5f6f316e-fa72-471a-b66a-0d8be2032711", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:48:07 compute-0 nova_compute[192810]: 2025-09-30 21:48:07.163 2 DEBUG nova.network.os_vif_util [None req-22c8ee9c-0ceb-4d49-81e7-2995fa50f6af c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Converting VIF {"id": "5f6f316e-fa72-471a-b66a-0d8be2032711", "address": "fa:16:3e:dd:92:4b", "network": {"id": "de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a", "bridge": "br-int", "label": "tempest-network-smoke--286237116", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5f6f316e-fa", "ovs_interfaceid": "5f6f316e-fa72-471a-b66a-0d8be2032711", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:48:07 compute-0 nova_compute[192810]: 2025-09-30 21:48:07.164 2 DEBUG nova.network.os_vif_util [None req-22c8ee9c-0ceb-4d49-81e7-2995fa50f6af c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:dd:92:4b,bridge_name='br-int',has_traffic_filtering=True,id=5f6f316e-fa72-471a-b66a-0d8be2032711,network=Network(de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f6f316e-fa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:48:07 compute-0 nova_compute[192810]: 2025-09-30 21:48:07.164 2 DEBUG os_vif [None req-22c8ee9c-0ceb-4d49-81e7-2995fa50f6af c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:dd:92:4b,bridge_name='br-int',has_traffic_filtering=True,id=5f6f316e-fa72-471a-b66a-0d8be2032711,network=Network(de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f6f316e-fa') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:48:07 compute-0 nova_compute[192810]: 2025-09-30 21:48:07.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:07 compute-0 nova_compute[192810]: 2025-09-30 21:48:07.166 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5f6f316e-fa, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:48:07 compute-0 nova_compute[192810]: 2025-09-30 21:48:07.167 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:07 compute-0 nova_compute[192810]: 2025-09-30 21:48:07.169 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:07 compute-0 nova_compute[192810]: 2025-09-30 21:48:07.170 2 INFO os_vif [None req-22c8ee9c-0ceb-4d49-81e7-2995fa50f6af c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:dd:92:4b,bridge_name='br-int',has_traffic_filtering=True,id=5f6f316e-fa72-471a-b66a-0d8be2032711,network=Network(de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5f6f316e-fa')
Sep 30 21:48:07 compute-0 nova_compute[192810]: 2025-09-30 21:48:07.171 2 INFO nova.virt.libvirt.driver [None req-22c8ee9c-0ceb-4d49-81e7-2995fa50f6af c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Deleting instance files /var/lib/nova/instances/ac9a07b2-30ac-48eb-aa5a-1ca2dd051899_del
Sep 30 21:48:07 compute-0 nova_compute[192810]: 2025-09-30 21:48:07.171 2 INFO nova.virt.libvirt.driver [None req-22c8ee9c-0ceb-4d49-81e7-2995fa50f6af c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Deletion of /var/lib/nova/instances/ac9a07b2-30ac-48eb-aa5a-1ca2dd051899_del complete
Sep 30 21:48:07 compute-0 nova_compute[192810]: 2025-09-30 21:48:07.246 2 INFO nova.compute.manager [None req-22c8ee9c-0ceb-4d49-81e7-2995fa50f6af c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Took 0.37 seconds to destroy the instance on the hypervisor.
Sep 30 21:48:07 compute-0 nova_compute[192810]: 2025-09-30 21:48:07.247 2 DEBUG oslo.service.loopingcall [None req-22c8ee9c-0ceb-4d49-81e7-2995fa50f6af c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:48:07 compute-0 nova_compute[192810]: 2025-09-30 21:48:07.247 2 DEBUG nova.compute.manager [-] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:48:07 compute-0 nova_compute[192810]: 2025-09-30 21:48:07.247 2 DEBUG nova.network.neutron [-] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:48:07 compute-0 nova_compute[192810]: 2025-09-30 21:48:07.322 2 DEBUG nova.compute.manager [req-2c3310a3-b282-4dd2-b430-58f399d25f0b req-d5340cbb-a2da-4657-a6fe-483e83e8bdc1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Received event network-vif-unplugged-5f6f316e-fa72-471a-b66a-0d8be2032711 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:48:07 compute-0 nova_compute[192810]: 2025-09-30 21:48:07.323 2 DEBUG oslo_concurrency.lockutils [req-2c3310a3-b282-4dd2-b430-58f399d25f0b req-d5340cbb-a2da-4657-a6fe-483e83e8bdc1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "ac9a07b2-30ac-48eb-aa5a-1ca2dd051899-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:07 compute-0 nova_compute[192810]: 2025-09-30 21:48:07.323 2 DEBUG oslo_concurrency.lockutils [req-2c3310a3-b282-4dd2-b430-58f399d25f0b req-d5340cbb-a2da-4657-a6fe-483e83e8bdc1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "ac9a07b2-30ac-48eb-aa5a-1ca2dd051899-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:07 compute-0 nova_compute[192810]: 2025-09-30 21:48:07.324 2 DEBUG oslo_concurrency.lockutils [req-2c3310a3-b282-4dd2-b430-58f399d25f0b req-d5340cbb-a2da-4657-a6fe-483e83e8bdc1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "ac9a07b2-30ac-48eb-aa5a-1ca2dd051899-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:07 compute-0 nova_compute[192810]: 2025-09-30 21:48:07.324 2 DEBUG nova.compute.manager [req-2c3310a3-b282-4dd2-b430-58f399d25f0b req-d5340cbb-a2da-4657-a6fe-483e83e8bdc1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] No waiting events found dispatching network-vif-unplugged-5f6f316e-fa72-471a-b66a-0d8be2032711 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:48:07 compute-0 nova_compute[192810]: 2025-09-30 21:48:07.324 2 DEBUG nova.compute.manager [req-2c3310a3-b282-4dd2-b430-58f399d25f0b req-d5340cbb-a2da-4657-a6fe-483e83e8bdc1 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Received event network-vif-unplugged-5f6f316e-fa72-471a-b66a-0d8be2032711 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:48:08 compute-0 nova_compute[192810]: 2025-09-30 21:48:08.183 2 DEBUG nova.network.neutron [-] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:48:08 compute-0 nova_compute[192810]: 2025-09-30 21:48:08.204 2 INFO nova.compute.manager [-] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Took 0.96 seconds to deallocate network for instance.
Sep 30 21:48:08 compute-0 nova_compute[192810]: 2025-09-30 21:48:08.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:08 compute-0 nova_compute[192810]: 2025-09-30 21:48:08.317 2 DEBUG oslo_concurrency.lockutils [None req-22c8ee9c-0ceb-4d49-81e7-2995fa50f6af c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:08 compute-0 nova_compute[192810]: 2025-09-30 21:48:08.318 2 DEBUG oslo_concurrency.lockutils [None req-22c8ee9c-0ceb-4d49-81e7-2995fa50f6af c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:08 compute-0 nova_compute[192810]: 2025-09-30 21:48:08.430 2 DEBUG nova.compute.provider_tree [None req-22c8ee9c-0ceb-4d49-81e7-2995fa50f6af c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:48:08 compute-0 nova_compute[192810]: 2025-09-30 21:48:08.444 2 DEBUG nova.scheduler.client.report [None req-22c8ee9c-0ceb-4d49-81e7-2995fa50f6af c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:48:08 compute-0 nova_compute[192810]: 2025-09-30 21:48:08.471 2 DEBUG oslo_concurrency.lockutils [None req-22c8ee9c-0ceb-4d49-81e7-2995fa50f6af c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:08 compute-0 nova_compute[192810]: 2025-09-30 21:48:08.517 2 INFO nova.scheduler.client.report [None req-22c8ee9c-0ceb-4d49-81e7-2995fa50f6af c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Deleted allocations for instance ac9a07b2-30ac-48eb-aa5a-1ca2dd051899
Sep 30 21:48:08 compute-0 nova_compute[192810]: 2025-09-30 21:48:08.626 2 DEBUG oslo_concurrency.lockutils [None req-22c8ee9c-0ceb-4d49-81e7-2995fa50f6af c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "ac9a07b2-30ac-48eb-aa5a-1ca2dd051899" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.767s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:09 compute-0 nova_compute[192810]: 2025-09-30 21:48:09.457 2 DEBUG nova.compute.manager [req-dc0df89f-7e32-4162-b18a-a8ef2123b4da req-30a20ccd-1759-4819-a23a-a9440f5c556f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Received event network-vif-plugged-5f6f316e-fa72-471a-b66a-0d8be2032711 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:48:09 compute-0 nova_compute[192810]: 2025-09-30 21:48:09.458 2 DEBUG oslo_concurrency.lockutils [req-dc0df89f-7e32-4162-b18a-a8ef2123b4da req-30a20ccd-1759-4819-a23a-a9440f5c556f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "ac9a07b2-30ac-48eb-aa5a-1ca2dd051899-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:09 compute-0 nova_compute[192810]: 2025-09-30 21:48:09.458 2 DEBUG oslo_concurrency.lockutils [req-dc0df89f-7e32-4162-b18a-a8ef2123b4da req-30a20ccd-1759-4819-a23a-a9440f5c556f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "ac9a07b2-30ac-48eb-aa5a-1ca2dd051899-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:09 compute-0 nova_compute[192810]: 2025-09-30 21:48:09.458 2 DEBUG oslo_concurrency.lockutils [req-dc0df89f-7e32-4162-b18a-a8ef2123b4da req-30a20ccd-1759-4819-a23a-a9440f5c556f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "ac9a07b2-30ac-48eb-aa5a-1ca2dd051899-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:09 compute-0 nova_compute[192810]: 2025-09-30 21:48:09.458 2 DEBUG nova.compute.manager [req-dc0df89f-7e32-4162-b18a-a8ef2123b4da req-30a20ccd-1759-4819-a23a-a9440f5c556f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] No waiting events found dispatching network-vif-plugged-5f6f316e-fa72-471a-b66a-0d8be2032711 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:48:09 compute-0 nova_compute[192810]: 2025-09-30 21:48:09.459 2 WARNING nova.compute.manager [req-dc0df89f-7e32-4162-b18a-a8ef2123b4da req-30a20ccd-1759-4819-a23a-a9440f5c556f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Received unexpected event network-vif-plugged-5f6f316e-fa72-471a-b66a-0d8be2032711 for instance with vm_state deleted and task_state None.
Sep 30 21:48:09 compute-0 nova_compute[192810]: 2025-09-30 21:48:09.459 2 DEBUG nova.compute.manager [req-dc0df89f-7e32-4162-b18a-a8ef2123b4da req-30a20ccd-1759-4819-a23a-a9440f5c556f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Received event network-vif-deleted-5f6f316e-fa72-471a-b66a-0d8be2032711 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:48:09 compute-0 nova_compute[192810]: 2025-09-30 21:48:09.642 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Updating instance_info_cache with network_info: [{"id": "3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6", "address": "fa:16:3e:52:d5:60", "network": {"id": "de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a", "bridge": "br-int", "label": "tempest-network-smoke--286237116", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f61a4f1-d1", "ovs_interfaceid": "3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:48:09 compute-0 nova_compute[192810]: 2025-09-30 21:48:09.659 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Releasing lock "refresh_cache-c16f682b-0141-4298-b21c-d374321c2f4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:48:09 compute-0 nova_compute[192810]: 2025-09-30 21:48:09.659 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Sep 30 21:48:09 compute-0 nova_compute[192810]: 2025-09-30 21:48:09.659 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:48:09 compute-0 nova_compute[192810]: 2025-09-30 21:48:09.660 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:48:09 compute-0 nova_compute[192810]: 2025-09-30 21:48:09.679 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:09 compute-0 nova_compute[192810]: 2025-09-30 21:48:09.679 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:09 compute-0 nova_compute[192810]: 2025-09-30 21:48:09.679 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:09 compute-0 nova_compute[192810]: 2025-09-30 21:48:09.679 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:48:09 compute-0 nova_compute[192810]: 2025-09-30 21:48:09.748 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c16f682b-0141-4298-b21c-d374321c2f4b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:48:09 compute-0 nova_compute[192810]: 2025-09-30 21:48:09.848 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c16f682b-0141-4298-b21c-d374321c2f4b/disk --force-share --output=json" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:48:09 compute-0 nova_compute[192810]: 2025-09-30 21:48:09.849 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c16f682b-0141-4298-b21c-d374321c2f4b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:48:09 compute-0 nova_compute[192810]: 2025-09-30 21:48:09.901 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c16f682b-0141-4298-b21c-d374321c2f4b/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:48:10 compute-0 nova_compute[192810]: 2025-09-30 21:48:10.064 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:48:10 compute-0 nova_compute[192810]: 2025-09-30 21:48:10.065 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5475MB free_disk=73.20115280151367GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:48:10 compute-0 nova_compute[192810]: 2025-09-30 21:48:10.065 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:10 compute-0 nova_compute[192810]: 2025-09-30 21:48:10.066 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:10 compute-0 nova_compute[192810]: 2025-09-30 21:48:10.153 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Instance c16f682b-0141-4298-b21c-d374321c2f4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:48:10 compute-0 nova_compute[192810]: 2025-09-30 21:48:10.154 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:48:10 compute-0 nova_compute[192810]: 2025-09-30 21:48:10.154 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:48:10 compute-0 nova_compute[192810]: 2025-09-30 21:48:10.202 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:48:10 compute-0 nova_compute[192810]: 2025-09-30 21:48:10.216 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:48:10 compute-0 nova_compute[192810]: 2025-09-30 21:48:10.261 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:48:10 compute-0 nova_compute[192810]: 2025-09-30 21:48:10.262 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.196s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:11 compute-0 nova_compute[192810]: 2025-09-30 21:48:11.257 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:48:11 compute-0 nova_compute[192810]: 2025-09-30 21:48:11.587 2 DEBUG nova.compute.manager [req-f94ddf4e-0d1c-4607-aa1a-d81e82a32062 req-0231848e-8495-4ba6-82db-88716f31af8b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Received event network-changed-3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:48:11 compute-0 nova_compute[192810]: 2025-09-30 21:48:11.588 2 DEBUG nova.compute.manager [req-f94ddf4e-0d1c-4607-aa1a-d81e82a32062 req-0231848e-8495-4ba6-82db-88716f31af8b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Refreshing instance network info cache due to event network-changed-3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:48:11 compute-0 nova_compute[192810]: 2025-09-30 21:48:11.588 2 DEBUG oslo_concurrency.lockutils [req-f94ddf4e-0d1c-4607-aa1a-d81e82a32062 req-0231848e-8495-4ba6-82db-88716f31af8b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-c16f682b-0141-4298-b21c-d374321c2f4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:48:11 compute-0 nova_compute[192810]: 2025-09-30 21:48:11.588 2 DEBUG oslo_concurrency.lockutils [req-f94ddf4e-0d1c-4607-aa1a-d81e82a32062 req-0231848e-8495-4ba6-82db-88716f31af8b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-c16f682b-0141-4298-b21c-d374321c2f4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:48:11 compute-0 nova_compute[192810]: 2025-09-30 21:48:11.589 2 DEBUG nova.network.neutron [req-f94ddf4e-0d1c-4607-aa1a-d81e82a32062 req-0231848e-8495-4ba6-82db-88716f31af8b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Refreshing network info cache for port 3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:48:11 compute-0 nova_compute[192810]: 2025-09-30 21:48:11.745 2 DEBUG oslo_concurrency.lockutils [None req-da71e416-6923-46ff-a889-3b4faeb70747 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "c16f682b-0141-4298-b21c-d374321c2f4b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:11 compute-0 nova_compute[192810]: 2025-09-30 21:48:11.746 2 DEBUG oslo_concurrency.lockutils [None req-da71e416-6923-46ff-a889-3b4faeb70747 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "c16f682b-0141-4298-b21c-d374321c2f4b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:11 compute-0 nova_compute[192810]: 2025-09-30 21:48:11.746 2 DEBUG oslo_concurrency.lockutils [None req-da71e416-6923-46ff-a889-3b4faeb70747 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "c16f682b-0141-4298-b21c-d374321c2f4b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:11 compute-0 nova_compute[192810]: 2025-09-30 21:48:11.746 2 DEBUG oslo_concurrency.lockutils [None req-da71e416-6923-46ff-a889-3b4faeb70747 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "c16f682b-0141-4298-b21c-d374321c2f4b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:11 compute-0 nova_compute[192810]: 2025-09-30 21:48:11.747 2 DEBUG oslo_concurrency.lockutils [None req-da71e416-6923-46ff-a889-3b4faeb70747 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "c16f682b-0141-4298-b21c-d374321c2f4b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:11 compute-0 nova_compute[192810]: 2025-09-30 21:48:11.759 2 INFO nova.compute.manager [None req-da71e416-6923-46ff-a889-3b4faeb70747 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Terminating instance
Sep 30 21:48:11 compute-0 nova_compute[192810]: 2025-09-30 21:48:11.775 2 DEBUG nova.compute.manager [None req-da71e416-6923-46ff-a889-3b4faeb70747 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:48:11 compute-0 kernel: tap3f61a4f1-d1 (unregistering): left promiscuous mode
Sep 30 21:48:11 compute-0 NetworkManager[51733]: <info>  [1759268891.8000] device (tap3f61a4f1-d1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:48:11 compute-0 nova_compute[192810]: 2025-09-30 21:48:11.807 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:11 compute-0 ovn_controller[94912]: 2025-09-30T21:48:11Z|00608|binding|INFO|Releasing lport 3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6 from this chassis (sb_readonly=0)
Sep 30 21:48:11 compute-0 ovn_controller[94912]: 2025-09-30T21:48:11Z|00609|binding|INFO|Setting lport 3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6 down in Southbound
Sep 30 21:48:11 compute-0 ovn_controller[94912]: 2025-09-30T21:48:11Z|00610|binding|INFO|Removing iface tap3f61a4f1-d1 ovn-installed in OVS
Sep 30 21:48:11 compute-0 nova_compute[192810]: 2025-09-30 21:48:11.810 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:11 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:11.820 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:52:d5:60 10.100.0.6'], port_security=['fa:16:3e:52:d5:60 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'c16f682b-0141-4298-b21c-d374321c2f4b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ff42902541948f7a6df344fac87c2b7', 'neutron:revision_number': '4', 'neutron:security_group_ids': '4b0a7047-e1ba-4a65-996b-36ec918f1716 70c8b6e4-c5d7-427d-b92f-d008f783e5e3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d352e746-3e44-45ee-a4e9-724f90959366, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:48:11 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:11.821 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6 in datapath de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a unbound from our chassis
Sep 30 21:48:11 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:11.822 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:48:11 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:11.823 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[857536f7-e5c8-4779-a647-4db36e8355cc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:11 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:11.823 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a namespace which is not needed anymore
Sep 30 21:48:11 compute-0 nova_compute[192810]: 2025-09-30 21:48:11.834 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:11 compute-0 systemd[1]: machine-qemu\x2d75\x2dinstance\x2d0000009c.scope: Deactivated successfully.
Sep 30 21:48:11 compute-0 systemd[1]: machine-qemu\x2d75\x2dinstance\x2d0000009c.scope: Consumed 15.613s CPU time.
Sep 30 21:48:11 compute-0 systemd-machined[152794]: Machine qemu-75-instance-0000009c terminated.
Sep 30 21:48:11 compute-0 neutron-haproxy-ovnmeta-de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a[244856]: [NOTICE]   (244860) : haproxy version is 2.8.14-c23fe91
Sep 30 21:48:11 compute-0 neutron-haproxy-ovnmeta-de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a[244856]: [NOTICE]   (244860) : path to executable is /usr/sbin/haproxy
Sep 30 21:48:11 compute-0 neutron-haproxy-ovnmeta-de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a[244856]: [ALERT]    (244860) : Current worker (244862) exited with code 143 (Terminated)
Sep 30 21:48:11 compute-0 neutron-haproxy-ovnmeta-de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a[244856]: [WARNING]  (244860) : All workers exited. Exiting... (0)
Sep 30 21:48:11 compute-0 systemd[1]: libpod-cf7a043f21b26f514ddedf01b13d29d43a297d7635086116770b9d699744a8bd.scope: Deactivated successfully.
Sep 30 21:48:11 compute-0 podman[245363]: 2025-09-30 21:48:11.953589973 +0000 UTC m=+0.043965683 container died cf7a043f21b26f514ddedf01b13d29d43a297d7635086116770b9d699744a8bd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:48:11 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cf7a043f21b26f514ddedf01b13d29d43a297d7635086116770b9d699744a8bd-userdata-shm.mount: Deactivated successfully.
Sep 30 21:48:11 compute-0 systemd[1]: var-lib-containers-storage-overlay-4daaae3682cb3ada703946b12ae0e324fe50961d55d28f84de0eb2d6598bd98e-merged.mount: Deactivated successfully.
Sep 30 21:48:11 compute-0 podman[245363]: 2025-09-30 21:48:11.995016663 +0000 UTC m=+0.085392363 container cleanup cf7a043f21b26f514ddedf01b13d29d43a297d7635086116770b9d699744a8bd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Sep 30 21:48:12 compute-0 systemd[1]: libpod-conmon-cf7a043f21b26f514ddedf01b13d29d43a297d7635086116770b9d699744a8bd.scope: Deactivated successfully.
Sep 30 21:48:12 compute-0 nova_compute[192810]: 2025-09-30 21:48:12.038 2 INFO nova.virt.libvirt.driver [-] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Instance destroyed successfully.
Sep 30 21:48:12 compute-0 nova_compute[192810]: 2025-09-30 21:48:12.039 2 DEBUG nova.objects.instance [None req-da71e416-6923-46ff-a889-3b4faeb70747 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lazy-loading 'resources' on Instance uuid c16f682b-0141-4298-b21c-d374321c2f4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:48:12 compute-0 podman[245400]: 2025-09-30 21:48:12.059909806 +0000 UTC m=+0.041294788 container remove cf7a043f21b26f514ddedf01b13d29d43a297d7635086116770b9d699744a8bd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:48:12 compute-0 nova_compute[192810]: 2025-09-30 21:48:12.066 2 DEBUG nova.virt.libvirt.vif [None req-da71e416-6923-46ff-a889-3b4faeb70747 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:47:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1264516311',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1264516311',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2108116341-ac',id=156,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE4+MYuJWLfNBVP0wpYYzJxFrenLU5zgnS4SMasXyiRNm0dScaN7o0VPG4pxRsoU9vaxfEh15PDw/b0q/pvNoO7EJ1m5Y6fW708LJtHheq7BeanpBXBHMJIyw9csPp7m6g==',key_name='tempest-TestSecurityGroupsBasicOps-376900167',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:47:12Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1ff42902541948f7a6df344fac87c2b7',ramdisk_id='',reservation_id='r-p598wpjp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-2108116341',owner_user_name='tempest-TestSecurityGroupsBasicOps-2108116341-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:47:12Z,user_data=None,user_id='c33a752ef8234bba917ace1e73763490',uuid=c16f682b-0141-4298-b21c-d374321c2f4b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6", "address": "fa:16:3e:52:d5:60", "network": {"id": "de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a", "bridge": "br-int", "label": "tempest-network-smoke--286237116", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f61a4f1-d1", "ovs_interfaceid": "3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:48:12 compute-0 nova_compute[192810]: 2025-09-30 21:48:12.066 2 DEBUG nova.network.os_vif_util [None req-da71e416-6923-46ff-a889-3b4faeb70747 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Converting VIF {"id": "3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6", "address": "fa:16:3e:52:d5:60", "network": {"id": "de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a", "bridge": "br-int", "label": "tempest-network-smoke--286237116", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f61a4f1-d1", "ovs_interfaceid": "3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:48:12 compute-0 nova_compute[192810]: 2025-09-30 21:48:12.067 2 DEBUG nova.network.os_vif_util [None req-da71e416-6923-46ff-a889-3b4faeb70747 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:52:d5:60,bridge_name='br-int',has_traffic_filtering=True,id=3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6,network=Network(de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3f61a4f1-d1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:48:12 compute-0 nova_compute[192810]: 2025-09-30 21:48:12.068 2 DEBUG os_vif [None req-da71e416-6923-46ff-a889-3b4faeb70747 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:52:d5:60,bridge_name='br-int',has_traffic_filtering=True,id=3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6,network=Network(de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3f61a4f1-d1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:48:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:12.067 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[eac05697-bd82-4aa1-a5d0-3f487268f348]: (4, ('Tue Sep 30 09:48:11 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a (cf7a043f21b26f514ddedf01b13d29d43a297d7635086116770b9d699744a8bd)\ncf7a043f21b26f514ddedf01b13d29d43a297d7635086116770b9d699744a8bd\nTue Sep 30 09:48:12 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a (cf7a043f21b26f514ddedf01b13d29d43a297d7635086116770b9d699744a8bd)\ncf7a043f21b26f514ddedf01b13d29d43a297d7635086116770b9d699744a8bd\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:12.069 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[4735f435-5d35-4089-ac26-9e451ea97af0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:12 compute-0 nova_compute[192810]: 2025-09-30 21:48:12.070 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:12.070 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapde17a3c3-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:48:12 compute-0 nova_compute[192810]: 2025-09-30 21:48:12.070 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3f61a4f1-d1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:48:12 compute-0 kernel: tapde17a3c3-f0: left promiscuous mode
Sep 30 21:48:12 compute-0 nova_compute[192810]: 2025-09-30 21:48:12.073 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:48:12 compute-0 nova_compute[192810]: 2025-09-30 21:48:12.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:12 compute-0 nova_compute[192810]: 2025-09-30 21:48:12.092 2 INFO os_vif [None req-da71e416-6923-46ff-a889-3b4faeb70747 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:52:d5:60,bridge_name='br-int',has_traffic_filtering=True,id=3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6,network=Network(de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3f61a4f1-d1')
Sep 30 21:48:12 compute-0 nova_compute[192810]: 2025-09-30 21:48:12.092 2 INFO nova.virt.libvirt.driver [None req-da71e416-6923-46ff-a889-3b4faeb70747 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Deleting instance files /var/lib/nova/instances/c16f682b-0141-4298-b21c-d374321c2f4b_del
Sep 30 21:48:12 compute-0 nova_compute[192810]: 2025-09-30 21:48:12.093 2 INFO nova.virt.libvirt.driver [None req-da71e416-6923-46ff-a889-3b4faeb70747 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Deletion of /var/lib/nova/instances/c16f682b-0141-4298-b21c-d374321c2f4b_del complete
Sep 30 21:48:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:12.093 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a866a1dc-7080-475e-b4fd-ce058f8c0852]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:12.122 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[246e8037-cdd2-4796-8e50-4bb7b0e186db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:12.123 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[6698a0f4-cbff-4042-bf35-0011d0df1188]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:12.138 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e1c246d0-a80b-438c-9919-e2611baaf286]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 552700, 'reachable_time': 17972, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245424, 'error': None, 'target': 'ovnmeta-de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:12 compute-0 systemd[1]: run-netns-ovnmeta\x2dde17a3c3\x2dfc0e\x2d4c83\x2da7b7\x2d9fed6068ca7a.mount: Deactivated successfully.
Sep 30 21:48:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:12.141 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:48:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:12.141 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[b23bb022-5a1c-4a1a-9fbf-2ac732af3abc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:12 compute-0 nova_compute[192810]: 2025-09-30 21:48:12.186 2 INFO nova.compute.manager [None req-da71e416-6923-46ff-a889-3b4faeb70747 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Took 0.41 seconds to destroy the instance on the hypervisor.
Sep 30 21:48:12 compute-0 nova_compute[192810]: 2025-09-30 21:48:12.187 2 DEBUG oslo.service.loopingcall [None req-da71e416-6923-46ff-a889-3b4faeb70747 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:48:12 compute-0 nova_compute[192810]: 2025-09-30 21:48:12.187 2 DEBUG nova.compute.manager [-] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:48:12 compute-0 nova_compute[192810]: 2025-09-30 21:48:12.187 2 DEBUG nova.network.neutron [-] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:48:12 compute-0 nova_compute[192810]: 2025-09-30 21:48:12.328 2 DEBUG nova.compute.manager [req-b0f61ad9-f141-4dca-97d2-4b8059e16bd0 req-06430f78-3ea5-4d37-9ddc-ff96b6e1e579 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Received event network-vif-unplugged-3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:48:12 compute-0 nova_compute[192810]: 2025-09-30 21:48:12.328 2 DEBUG oslo_concurrency.lockutils [req-b0f61ad9-f141-4dca-97d2-4b8059e16bd0 req-06430f78-3ea5-4d37-9ddc-ff96b6e1e579 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c16f682b-0141-4298-b21c-d374321c2f4b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:12 compute-0 nova_compute[192810]: 2025-09-30 21:48:12.328 2 DEBUG oslo_concurrency.lockutils [req-b0f61ad9-f141-4dca-97d2-4b8059e16bd0 req-06430f78-3ea5-4d37-9ddc-ff96b6e1e579 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c16f682b-0141-4298-b21c-d374321c2f4b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:12 compute-0 nova_compute[192810]: 2025-09-30 21:48:12.328 2 DEBUG oslo_concurrency.lockutils [req-b0f61ad9-f141-4dca-97d2-4b8059e16bd0 req-06430f78-3ea5-4d37-9ddc-ff96b6e1e579 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c16f682b-0141-4298-b21c-d374321c2f4b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:12 compute-0 nova_compute[192810]: 2025-09-30 21:48:12.329 2 DEBUG nova.compute.manager [req-b0f61ad9-f141-4dca-97d2-4b8059e16bd0 req-06430f78-3ea5-4d37-9ddc-ff96b6e1e579 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] No waiting events found dispatching network-vif-unplugged-3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:48:12 compute-0 nova_compute[192810]: 2025-09-30 21:48:12.329 2 DEBUG nova.compute.manager [req-b0f61ad9-f141-4dca-97d2-4b8059e16bd0 req-06430f78-3ea5-4d37-9ddc-ff96b6e1e579 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Received event network-vif-unplugged-3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:48:12 compute-0 nova_compute[192810]: 2025-09-30 21:48:12.910 2 DEBUG nova.network.neutron [-] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:48:12 compute-0 nova_compute[192810]: 2025-09-30 21:48:12.936 2 INFO nova.compute.manager [-] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Took 0.75 seconds to deallocate network for instance.
Sep 30 21:48:13 compute-0 nova_compute[192810]: 2025-09-30 21:48:13.036 2 DEBUG nova.compute.manager [req-5180ac16-d1ab-4cdc-b871-b368571d562d req-d8bf133f-2e0b-4e51-a902-eddb1ae633f2 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Received event network-vif-deleted-3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:48:13 compute-0 nova_compute[192810]: 2025-09-30 21:48:13.053 2 DEBUG oslo_concurrency.lockutils [None req-da71e416-6923-46ff-a889-3b4faeb70747 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:13 compute-0 nova_compute[192810]: 2025-09-30 21:48:13.054 2 DEBUG oslo_concurrency.lockutils [None req-da71e416-6923-46ff-a889-3b4faeb70747 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:13 compute-0 nova_compute[192810]: 2025-09-30 21:48:13.130 2 DEBUG nova.compute.provider_tree [None req-da71e416-6923-46ff-a889-3b4faeb70747 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:48:13 compute-0 nova_compute[192810]: 2025-09-30 21:48:13.137 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:13 compute-0 nova_compute[192810]: 2025-09-30 21:48:13.153 2 DEBUG nova.scheduler.client.report [None req-da71e416-6923-46ff-a889-3b4faeb70747 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:48:13 compute-0 nova_compute[192810]: 2025-09-30 21:48:13.189 2 DEBUG oslo_concurrency.lockutils [None req-da71e416-6923-46ff-a889-3b4faeb70747 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:13 compute-0 nova_compute[192810]: 2025-09-30 21:48:13.268 2 INFO nova.scheduler.client.report [None req-da71e416-6923-46ff-a889-3b4faeb70747 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Deleted allocations for instance c16f682b-0141-4298-b21c-d374321c2f4b
Sep 30 21:48:13 compute-0 nova_compute[192810]: 2025-09-30 21:48:13.289 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:13 compute-0 nova_compute[192810]: 2025-09-30 21:48:13.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:13 compute-0 nova_compute[192810]: 2025-09-30 21:48:13.346 2 DEBUG oslo_concurrency.lockutils [None req-da71e416-6923-46ff-a889-3b4faeb70747 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "c16f682b-0141-4298-b21c-d374321c2f4b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.600s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:13 compute-0 nova_compute[192810]: 2025-09-30 21:48:13.430 2 DEBUG nova.network.neutron [req-f94ddf4e-0d1c-4607-aa1a-d81e82a32062 req-0231848e-8495-4ba6-82db-88716f31af8b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Updated VIF entry in instance network info cache for port 3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:48:13 compute-0 nova_compute[192810]: 2025-09-30 21:48:13.430 2 DEBUG nova.network.neutron [req-f94ddf4e-0d1c-4607-aa1a-d81e82a32062 req-0231848e-8495-4ba6-82db-88716f31af8b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Updating instance_info_cache with network_info: [{"id": "3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6", "address": "fa:16:3e:52:d5:60", "network": {"id": "de17a3c3-fc0e-4c83-a7b7-9fed6068ca7a", "bridge": "br-int", "label": "tempest-network-smoke--286237116", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3f61a4f1-d1", "ovs_interfaceid": "3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:48:13 compute-0 nova_compute[192810]: 2025-09-30 21:48:13.468 2 DEBUG oslo_concurrency.lockutils [req-f94ddf4e-0d1c-4607-aa1a-d81e82a32062 req-0231848e-8495-4ba6-82db-88716f31af8b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-c16f682b-0141-4298-b21c-d374321c2f4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:48:14 compute-0 nova_compute[192810]: 2025-09-30 21:48:14.474 2 DEBUG nova.compute.manager [req-b6c6066c-9595-4f00-a96d-08ff2ca94a30 req-ff53d1db-09c3-4d45-b9f9-8014c8be0977 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Received event network-vif-plugged-3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:48:14 compute-0 nova_compute[192810]: 2025-09-30 21:48:14.475 2 DEBUG oslo_concurrency.lockutils [req-b6c6066c-9595-4f00-a96d-08ff2ca94a30 req-ff53d1db-09c3-4d45-b9f9-8014c8be0977 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c16f682b-0141-4298-b21c-d374321c2f4b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:14 compute-0 nova_compute[192810]: 2025-09-30 21:48:14.475 2 DEBUG oslo_concurrency.lockutils [req-b6c6066c-9595-4f00-a96d-08ff2ca94a30 req-ff53d1db-09c3-4d45-b9f9-8014c8be0977 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c16f682b-0141-4298-b21c-d374321c2f4b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:14 compute-0 nova_compute[192810]: 2025-09-30 21:48:14.475 2 DEBUG oslo_concurrency.lockutils [req-b6c6066c-9595-4f00-a96d-08ff2ca94a30 req-ff53d1db-09c3-4d45-b9f9-8014c8be0977 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c16f682b-0141-4298-b21c-d374321c2f4b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:14 compute-0 nova_compute[192810]: 2025-09-30 21:48:14.475 2 DEBUG nova.compute.manager [req-b6c6066c-9595-4f00-a96d-08ff2ca94a30 req-ff53d1db-09c3-4d45-b9f9-8014c8be0977 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] No waiting events found dispatching network-vif-plugged-3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:48:14 compute-0 nova_compute[192810]: 2025-09-30 21:48:14.476 2 WARNING nova.compute.manager [req-b6c6066c-9595-4f00-a96d-08ff2ca94a30 req-ff53d1db-09c3-4d45-b9f9-8014c8be0977 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Received unexpected event network-vif-plugged-3f61a4f1-d1d5-4cc4-96fe-839b3becc5e6 for instance with vm_state deleted and task_state None.
Sep 30 21:48:17 compute-0 nova_compute[192810]: 2025-09-30 21:48:17.072 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:18 compute-0 nova_compute[192810]: 2025-09-30 21:48:18.292 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:18 compute-0 podman[245427]: 2025-09-30 21:48:18.320690464 +0000 UTC m=+0.054169433 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 21:48:18 compute-0 podman[245428]: 2025-09-30 21:48:18.328711259 +0000 UTC m=+0.059473531 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20250923)
Sep 30 21:48:18 compute-0 podman[245426]: 2025-09-30 21:48:18.351610138 +0000 UTC m=+0.086978833 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Sep 30 21:48:22 compute-0 nova_compute[192810]: 2025-09-30 21:48:22.074 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:22 compute-0 nova_compute[192810]: 2025-09-30 21:48:22.134 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268887.1335466, ac9a07b2-30ac-48eb-aa5a-1ca2dd051899 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:48:22 compute-0 nova_compute[192810]: 2025-09-30 21:48:22.135 2 INFO nova.compute.manager [-] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] VM Stopped (Lifecycle Event)
Sep 30 21:48:22 compute-0 nova_compute[192810]: 2025-09-30 21:48:22.160 2 DEBUG nova.compute.manager [None req-eb87b9ba-095e-485f-bfbb-17fbe422292c - - - - - -] [instance: ac9a07b2-30ac-48eb-aa5a-1ca2dd051899] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:48:23 compute-0 nova_compute[192810]: 2025-09-30 21:48:23.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:26 compute-0 podman[245488]: 2025-09-30 21:48:26.342527337 +0000 UTC m=+0.079345986 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, version=9.6, build-date=2025-08-20T13:12:41, release=1755695350, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.buildah.version=1.33.7, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, managed_by=edpm_ansible, container_name=openstack_network_exporter, maintainer=Red Hat, Inc.)
Sep 30 21:48:26 compute-0 podman[245487]: 2025-09-30 21:48:26.342997849 +0000 UTC m=+0.084101073 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:48:27 compute-0 nova_compute[192810]: 2025-09-30 21:48:27.037 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268892.0362833, c16f682b-0141-4298-b21c-d374321c2f4b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:48:27 compute-0 nova_compute[192810]: 2025-09-30 21:48:27.038 2 INFO nova.compute.manager [-] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] VM Stopped (Lifecycle Event)
Sep 30 21:48:27 compute-0 nova_compute[192810]: 2025-09-30 21:48:27.059 2 DEBUG nova.compute.manager [None req-ca88178d-a1f9-4748-86fa-5cb237e97812 - - - - - -] [instance: c16f682b-0141-4298-b21c-d374321c2f4b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:48:27 compute-0 nova_compute[192810]: 2025-09-30 21:48:27.075 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:28 compute-0 nova_compute[192810]: 2025-09-30 21:48:28.296 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:30.458 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2c:a9:51 2001:db8:0:1:f816:3eff:fe2c:a951 2001:db8::f816:3eff:fe2c:a951'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe2c:a951/64 2001:db8::f816:3eff:fe2c:a951/64', 'neutron:device_id': 'ovnmeta-d1ec18dd-20d4-4643-8e73-7d404d8b8493', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d1ec18dd-20d4-4643-8e73-7d404d8b8493', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5d03d334-4f0d-47df-a94a-1c647c7026cb, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=4d7fdd9a-d25a-4be0-8653-dd976ce2c1d5) old=Port_Binding(mac=['fa:16:3e:2c:a9:51 2001:db8::f816:3eff:fe2c:a951'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe2c:a951/64', 'neutron:device_id': 'ovnmeta-d1ec18dd-20d4-4643-8e73-7d404d8b8493', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d1ec18dd-20d4-4643-8e73-7d404d8b8493', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:48:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:30.459 103867 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 4d7fdd9a-d25a-4be0-8653-dd976ce2c1d5 in datapath d1ec18dd-20d4-4643-8e73-7d404d8b8493 updated
Sep 30 21:48:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:30.460 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d1ec18dd-20d4-4643-8e73-7d404d8b8493, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:48:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:30.462 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[5bfb95b9-00b7-4cd7-9490-ed3bd0c089be]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:32 compute-0 nova_compute[192810]: 2025-09-30 21:48:32.077 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:33 compute-0 nova_compute[192810]: 2025-09-30 21:48:33.299 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:34 compute-0 podman[245533]: 2025-09-30 21:48:34.325214416 +0000 UTC m=+0.048310019 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 21:48:34 compute-0 podman[245531]: 2025-09-30 21:48:34.332032293 +0000 UTC m=+0.062077476 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.vendor=CentOS)
Sep 30 21:48:34 compute-0 podman[245532]: 2025-09-30 21:48:34.351369674 +0000 UTC m=+0.067680232 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:48:36 compute-0 nova_compute[192810]: 2025-09-30 21:48:36.595 2 DEBUG oslo_concurrency.lockutils [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "4ee4a775-05d1-45fc-b4f9-566ab8159710" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:36 compute-0 nova_compute[192810]: 2025-09-30 21:48:36.595 2 DEBUG oslo_concurrency.lockutils [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "4ee4a775-05d1-45fc-b4f9-566ab8159710" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:36 compute-0 nova_compute[192810]: 2025-09-30 21:48:36.622 2 DEBUG nova.compute.manager [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:48:36 compute-0 nova_compute[192810]: 2025-09-30 21:48:36.768 2 DEBUG oslo_concurrency.lockutils [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:36 compute-0 nova_compute[192810]: 2025-09-30 21:48:36.769 2 DEBUG oslo_concurrency.lockutils [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:36 compute-0 nova_compute[192810]: 2025-09-30 21:48:36.774 2 DEBUG nova.virt.hardware [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:48:36 compute-0 nova_compute[192810]: 2025-09-30 21:48:36.774 2 INFO nova.compute.claims [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:48:36 compute-0 nova_compute[192810]: 2025-09-30 21:48:36.898 2 DEBUG nova.compute.provider_tree [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:48:36 compute-0 nova_compute[192810]: 2025-09-30 21:48:36.913 2 DEBUG nova.scheduler.client.report [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:48:36 compute-0 nova_compute[192810]: 2025-09-30 21:48:36.942 2 DEBUG oslo_concurrency.lockutils [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:36 compute-0 nova_compute[192810]: 2025-09-30 21:48:36.943 2 DEBUG nova.compute.manager [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:48:37 compute-0 nova_compute[192810]: 2025-09-30 21:48:37.014 2 DEBUG nova.compute.manager [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:48:37 compute-0 nova_compute[192810]: 2025-09-30 21:48:37.014 2 DEBUG nova.network.neutron [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:48:37 compute-0 nova_compute[192810]: 2025-09-30 21:48:37.032 2 INFO nova.virt.libvirt.driver [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:48:37 compute-0 nova_compute[192810]: 2025-09-30 21:48:37.053 2 DEBUG nova.compute.manager [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:48:37 compute-0 nova_compute[192810]: 2025-09-30 21:48:37.079 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:37 compute-0 nova_compute[192810]: 2025-09-30 21:48:37.180 2 DEBUG nova.compute.manager [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:48:37 compute-0 nova_compute[192810]: 2025-09-30 21:48:37.181 2 DEBUG nova.virt.libvirt.driver [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:48:37 compute-0 nova_compute[192810]: 2025-09-30 21:48:37.181 2 INFO nova.virt.libvirt.driver [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Creating image(s)
Sep 30 21:48:37 compute-0 nova_compute[192810]: 2025-09-30 21:48:37.182 2 DEBUG oslo_concurrency.lockutils [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "/var/lib/nova/instances/4ee4a775-05d1-45fc-b4f9-566ab8159710/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:37 compute-0 nova_compute[192810]: 2025-09-30 21:48:37.182 2 DEBUG oslo_concurrency.lockutils [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "/var/lib/nova/instances/4ee4a775-05d1-45fc-b4f9-566ab8159710/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:37 compute-0 nova_compute[192810]: 2025-09-30 21:48:37.182 2 DEBUG oslo_concurrency.lockutils [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "/var/lib/nova/instances/4ee4a775-05d1-45fc-b4f9-566ab8159710/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:37 compute-0 nova_compute[192810]: 2025-09-30 21:48:37.193 2 DEBUG oslo_concurrency.processutils [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:48:37 compute-0 nova_compute[192810]: 2025-09-30 21:48:37.249 2 DEBUG oslo_concurrency.processutils [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:48:37 compute-0 nova_compute[192810]: 2025-09-30 21:48:37.250 2 DEBUG oslo_concurrency.lockutils [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:37 compute-0 nova_compute[192810]: 2025-09-30 21:48:37.251 2 DEBUG oslo_concurrency.lockutils [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:37 compute-0 nova_compute[192810]: 2025-09-30 21:48:37.261 2 DEBUG oslo_concurrency.processutils [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:48:37 compute-0 nova_compute[192810]: 2025-09-30 21:48:37.313 2 DEBUG oslo_concurrency.processutils [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:48:37 compute-0 nova_compute[192810]: 2025-09-30 21:48:37.314 2 DEBUG oslo_concurrency.processutils [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/4ee4a775-05d1-45fc-b4f9-566ab8159710/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:48:37 compute-0 nova_compute[192810]: 2025-09-30 21:48:37.703 2 DEBUG nova.policy [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:48:37 compute-0 nova_compute[192810]: 2025-09-30 21:48:37.824 2 DEBUG oslo_concurrency.processutils [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/4ee4a775-05d1-45fc-b4f9-566ab8159710/disk 1073741824" returned: 0 in 0.510s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:48:37 compute-0 nova_compute[192810]: 2025-09-30 21:48:37.825 2 DEBUG oslo_concurrency.lockutils [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.574s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:37 compute-0 nova_compute[192810]: 2025-09-30 21:48:37.825 2 DEBUG oslo_concurrency.processutils [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:48:37 compute-0 nova_compute[192810]: 2025-09-30 21:48:37.880 2 DEBUG oslo_concurrency.processutils [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:48:37 compute-0 nova_compute[192810]: 2025-09-30 21:48:37.881 2 DEBUG nova.virt.disk.api [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Checking if we can resize image /var/lib/nova/instances/4ee4a775-05d1-45fc-b4f9-566ab8159710/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:48:37 compute-0 nova_compute[192810]: 2025-09-30 21:48:37.882 2 DEBUG oslo_concurrency.processutils [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4ee4a775-05d1-45fc-b4f9-566ab8159710/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:48:37 compute-0 nova_compute[192810]: 2025-09-30 21:48:37.944 2 DEBUG oslo_concurrency.processutils [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4ee4a775-05d1-45fc-b4f9-566ab8159710/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:48:37 compute-0 nova_compute[192810]: 2025-09-30 21:48:37.945 2 DEBUG nova.virt.disk.api [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Cannot resize image /var/lib/nova/instances/4ee4a775-05d1-45fc-b4f9-566ab8159710/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:48:37 compute-0 nova_compute[192810]: 2025-09-30 21:48:37.946 2 DEBUG nova.objects.instance [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lazy-loading 'migration_context' on Instance uuid 4ee4a775-05d1-45fc-b4f9-566ab8159710 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:48:37 compute-0 nova_compute[192810]: 2025-09-30 21:48:37.960 2 DEBUG nova.virt.libvirt.driver [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:48:37 compute-0 nova_compute[192810]: 2025-09-30 21:48:37.961 2 DEBUG nova.virt.libvirt.driver [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Ensure instance console log exists: /var/lib/nova/instances/4ee4a775-05d1-45fc-b4f9-566ab8159710/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:48:37 compute-0 nova_compute[192810]: 2025-09-30 21:48:37.961 2 DEBUG oslo_concurrency.lockutils [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:37 compute-0 nova_compute[192810]: 2025-09-30 21:48:37.962 2 DEBUG oslo_concurrency.lockutils [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:37 compute-0 nova_compute[192810]: 2025-09-30 21:48:37.962 2 DEBUG oslo_concurrency.lockutils [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:38 compute-0 nova_compute[192810]: 2025-09-30 21:48:38.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:38 compute-0 nova_compute[192810]: 2025-09-30 21:48:38.571 2 DEBUG nova.network.neutron [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Successfully created port: 9a0264ee-cc33-48a8-b015-36ebc5bfdd43 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:48:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:38.755 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:38.756 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:38.756 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:40 compute-0 nova_compute[192810]: 2025-09-30 21:48:40.118 2 DEBUG nova.network.neutron [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Successfully updated port: 9a0264ee-cc33-48a8-b015-36ebc5bfdd43 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:48:40 compute-0 nova_compute[192810]: 2025-09-30 21:48:40.138 2 DEBUG oslo_concurrency.lockutils [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "refresh_cache-4ee4a775-05d1-45fc-b4f9-566ab8159710" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:48:40 compute-0 nova_compute[192810]: 2025-09-30 21:48:40.139 2 DEBUG oslo_concurrency.lockutils [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquired lock "refresh_cache-4ee4a775-05d1-45fc-b4f9-566ab8159710" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:48:40 compute-0 nova_compute[192810]: 2025-09-30 21:48:40.140 2 DEBUG nova.network.neutron [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:48:40 compute-0 nova_compute[192810]: 2025-09-30 21:48:40.635 2 DEBUG nova.network.neutron [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:48:41 compute-0 nova_compute[192810]: 2025-09-30 21:48:41.139 2 DEBUG nova.compute.manager [req-00e15836-2f2e-4f23-ab67-517f433697a9 req-4a73079a-3f86-4d44-ab88-cbdc16e2c4c3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Received event network-changed-9a0264ee-cc33-48a8-b015-36ebc5bfdd43 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:48:41 compute-0 nova_compute[192810]: 2025-09-30 21:48:41.139 2 DEBUG nova.compute.manager [req-00e15836-2f2e-4f23-ab67-517f433697a9 req-4a73079a-3f86-4d44-ab88-cbdc16e2c4c3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Refreshing instance network info cache due to event network-changed-9a0264ee-cc33-48a8-b015-36ebc5bfdd43. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:48:41 compute-0 nova_compute[192810]: 2025-09-30 21:48:41.140 2 DEBUG oslo_concurrency.lockutils [req-00e15836-2f2e-4f23-ab67-517f433697a9 req-4a73079a-3f86-4d44-ab88-cbdc16e2c4c3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-4ee4a775-05d1-45fc-b4f9-566ab8159710" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.081 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.645 2 DEBUG nova.network.neutron [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Updating instance_info_cache with network_info: [{"id": "9a0264ee-cc33-48a8-b015-36ebc5bfdd43", "address": "fa:16:3e:6f:cf:2b", "network": {"id": "62012511-a944-47e7-b858-e59eabaf741d", "bridge": "br-int", "label": "tempest-network-smoke--1420028275", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a0264ee-cc", "ovs_interfaceid": "9a0264ee-cc33-48a8-b015-36ebc5bfdd43", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.693 2 DEBUG oslo_concurrency.lockutils [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Releasing lock "refresh_cache-4ee4a775-05d1-45fc-b4f9-566ab8159710" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.693 2 DEBUG nova.compute.manager [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Instance network_info: |[{"id": "9a0264ee-cc33-48a8-b015-36ebc5bfdd43", "address": "fa:16:3e:6f:cf:2b", "network": {"id": "62012511-a944-47e7-b858-e59eabaf741d", "bridge": "br-int", "label": "tempest-network-smoke--1420028275", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a0264ee-cc", "ovs_interfaceid": "9a0264ee-cc33-48a8-b015-36ebc5bfdd43", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.693 2 DEBUG oslo_concurrency.lockutils [req-00e15836-2f2e-4f23-ab67-517f433697a9 req-4a73079a-3f86-4d44-ab88-cbdc16e2c4c3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-4ee4a775-05d1-45fc-b4f9-566ab8159710" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.694 2 DEBUG nova.network.neutron [req-00e15836-2f2e-4f23-ab67-517f433697a9 req-4a73079a-3f86-4d44-ab88-cbdc16e2c4c3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Refreshing network info cache for port 9a0264ee-cc33-48a8-b015-36ebc5bfdd43 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.696 2 DEBUG nova.virt.libvirt.driver [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Start _get_guest_xml network_info=[{"id": "9a0264ee-cc33-48a8-b015-36ebc5bfdd43", "address": "fa:16:3e:6f:cf:2b", "network": {"id": "62012511-a944-47e7-b858-e59eabaf741d", "bridge": "br-int", "label": "tempest-network-smoke--1420028275", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a0264ee-cc", "ovs_interfaceid": "9a0264ee-cc33-48a8-b015-36ebc5bfdd43", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.701 2 WARNING nova.virt.libvirt.driver [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.707 2 DEBUG nova.virt.libvirt.host [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.708 2 DEBUG nova.virt.libvirt.host [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.713 2 DEBUG nova.virt.libvirt.host [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.713 2 DEBUG nova.virt.libvirt.host [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.714 2 DEBUG nova.virt.libvirt.driver [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.714 2 DEBUG nova.virt.hardware [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.715 2 DEBUG nova.virt.hardware [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.715 2 DEBUG nova.virt.hardware [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.715 2 DEBUG nova.virt.hardware [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.715 2 DEBUG nova.virt.hardware [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.716 2 DEBUG nova.virt.hardware [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.716 2 DEBUG nova.virt.hardware [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.716 2 DEBUG nova.virt.hardware [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.716 2 DEBUG nova.virt.hardware [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.716 2 DEBUG nova.virt.hardware [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.717 2 DEBUG nova.virt.hardware [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.720 2 DEBUG nova.virt.libvirt.vif [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:48:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1704200209',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1704200209',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2108116341-ac',id=162,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBP8+ySuIworn5+VdMqrwf2ZDaHL6j+oMxWlSGL0WjntH9xtXyOZttwOPpv9QtW35sVLgwjU06yIyW6DTo/xQcotOSzqfnBCevinC2EH2rKNVNrC+n51UyDfRK5Jp7gOR1A==',key_name='tempest-TestSecurityGroupsBasicOps-670853740',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1ff42902541948f7a6df344fac87c2b7',ramdisk_id='',reservation_id='r-w00y4yxk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2108116341',owner_user_name='tempest-TestSecurityGroupsBasicOps-2108116341-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:48:37Z,user_data=None,user_id='c33a752ef8234bba917ace1e73763490',uuid=4ee4a775-05d1-45fc-b4f9-566ab8159710,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9a0264ee-cc33-48a8-b015-36ebc5bfdd43", "address": "fa:16:3e:6f:cf:2b", "network": {"id": "62012511-a944-47e7-b858-e59eabaf741d", "bridge": "br-int", "label": "tempest-network-smoke--1420028275", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a0264ee-cc", "ovs_interfaceid": "9a0264ee-cc33-48a8-b015-36ebc5bfdd43", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.720 2 DEBUG nova.network.os_vif_util [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Converting VIF {"id": "9a0264ee-cc33-48a8-b015-36ebc5bfdd43", "address": "fa:16:3e:6f:cf:2b", "network": {"id": "62012511-a944-47e7-b858-e59eabaf741d", "bridge": "br-int", "label": "tempest-network-smoke--1420028275", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a0264ee-cc", "ovs_interfaceid": "9a0264ee-cc33-48a8-b015-36ebc5bfdd43", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.721 2 DEBUG nova.network.os_vif_util [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6f:cf:2b,bridge_name='br-int',has_traffic_filtering=True,id=9a0264ee-cc33-48a8-b015-36ebc5bfdd43,network=Network(62012511-a944-47e7-b858-e59eabaf741d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a0264ee-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.721 2 DEBUG nova.objects.instance [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4ee4a775-05d1-45fc-b4f9-566ab8159710 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.734 2 DEBUG nova.virt.libvirt.driver [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:48:42 compute-0 nova_compute[192810]:   <uuid>4ee4a775-05d1-45fc-b4f9-566ab8159710</uuid>
Sep 30 21:48:42 compute-0 nova_compute[192810]:   <name>instance-000000a2</name>
Sep 30 21:48:42 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:48:42 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:48:42 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:48:42 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:       <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1704200209</nova:name>
Sep 30 21:48:42 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:48:42</nova:creationTime>
Sep 30 21:48:42 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:48:42 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:48:42 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:48:42 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:48:42 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:48:42 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:48:42 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:48:42 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:48:42 compute-0 nova_compute[192810]:         <nova:user uuid="c33a752ef8234bba917ace1e73763490">tempest-TestSecurityGroupsBasicOps-2108116341-project-member</nova:user>
Sep 30 21:48:42 compute-0 nova_compute[192810]:         <nova:project uuid="1ff42902541948f7a6df344fac87c2b7">tempest-TestSecurityGroupsBasicOps-2108116341</nova:project>
Sep 30 21:48:42 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:48:42 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:48:42 compute-0 nova_compute[192810]:         <nova:port uuid="9a0264ee-cc33-48a8-b015-36ebc5bfdd43">
Sep 30 21:48:42 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:48:42 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:48:42 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:48:42 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:48:42 compute-0 nova_compute[192810]:     <system>
Sep 30 21:48:42 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:48:42 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:48:42 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:48:42 compute-0 nova_compute[192810]:       <entry name="serial">4ee4a775-05d1-45fc-b4f9-566ab8159710</entry>
Sep 30 21:48:42 compute-0 nova_compute[192810]:       <entry name="uuid">4ee4a775-05d1-45fc-b4f9-566ab8159710</entry>
Sep 30 21:48:42 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     </system>
Sep 30 21:48:42 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:48:42 compute-0 nova_compute[192810]:   <os>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:   </os>
Sep 30 21:48:42 compute-0 nova_compute[192810]:   <features>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:   </features>
Sep 30 21:48:42 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:48:42 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:48:42 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:48:42 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:48:42 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:48:42 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/4ee4a775-05d1-45fc-b4f9-566ab8159710/disk"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:48:42 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/4ee4a775-05d1-45fc-b4f9-566ab8159710/disk.config"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:48:42 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:6f:cf:2b"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:       <target dev="tap9a0264ee-cc"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:48:42 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/4ee4a775-05d1-45fc-b4f9-566ab8159710/console.log" append="off"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     <video>
Sep 30 21:48:42 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     </video>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:48:42 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:48:42 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:48:42 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:48:42 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:48:42 compute-0 nova_compute[192810]: </domain>
Sep 30 21:48:42 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.736 2 DEBUG nova.compute.manager [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Preparing to wait for external event network-vif-plugged-9a0264ee-cc33-48a8-b015-36ebc5bfdd43 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.737 2 DEBUG oslo_concurrency.lockutils [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "4ee4a775-05d1-45fc-b4f9-566ab8159710-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.737 2 DEBUG oslo_concurrency.lockutils [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "4ee4a775-05d1-45fc-b4f9-566ab8159710-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.737 2 DEBUG oslo_concurrency.lockutils [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "4ee4a775-05d1-45fc-b4f9-566ab8159710-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.738 2 DEBUG nova.virt.libvirt.vif [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:48:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1704200209',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1704200209',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2108116341-ac',id=162,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBP8+ySuIworn5+VdMqrwf2ZDaHL6j+oMxWlSGL0WjntH9xtXyOZttwOPpv9QtW35sVLgwjU06yIyW6DTo/xQcotOSzqfnBCevinC2EH2rKNVNrC+n51UyDfRK5Jp7gOR1A==',key_name='tempest-TestSecurityGroupsBasicOps-670853740',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1ff42902541948f7a6df344fac87c2b7',ramdisk_id='',reservation_id='r-w00y4yxk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2108116341',owner_user_name='tempest-TestSecurityGroupsBasicOps-2108116341-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:48:37Z,user_data=None,user_id='c33a752ef8234bba917ace1e73763490',uuid=4ee4a775-05d1-45fc-b4f9-566ab8159710,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9a0264ee-cc33-48a8-b015-36ebc5bfdd43", "address": "fa:16:3e:6f:cf:2b", "network": {"id": "62012511-a944-47e7-b858-e59eabaf741d", "bridge": "br-int", "label": "tempest-network-smoke--1420028275", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a0264ee-cc", "ovs_interfaceid": "9a0264ee-cc33-48a8-b015-36ebc5bfdd43", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.738 2 DEBUG nova.network.os_vif_util [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Converting VIF {"id": "9a0264ee-cc33-48a8-b015-36ebc5bfdd43", "address": "fa:16:3e:6f:cf:2b", "network": {"id": "62012511-a944-47e7-b858-e59eabaf741d", "bridge": "br-int", "label": "tempest-network-smoke--1420028275", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a0264ee-cc", "ovs_interfaceid": "9a0264ee-cc33-48a8-b015-36ebc5bfdd43", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.739 2 DEBUG nova.network.os_vif_util [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6f:cf:2b,bridge_name='br-int',has_traffic_filtering=True,id=9a0264ee-cc33-48a8-b015-36ebc5bfdd43,network=Network(62012511-a944-47e7-b858-e59eabaf741d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a0264ee-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.739 2 DEBUG os_vif [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6f:cf:2b,bridge_name='br-int',has_traffic_filtering=True,id=9a0264ee-cc33-48a8-b015-36ebc5bfdd43,network=Network(62012511-a944-47e7-b858-e59eabaf741d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a0264ee-cc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.742 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.742 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.745 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9a0264ee-cc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.745 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9a0264ee-cc, col_values=(('external_ids', {'iface-id': '9a0264ee-cc33-48a8-b015-36ebc5bfdd43', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6f:cf:2b', 'vm-uuid': '4ee4a775-05d1-45fc-b4f9-566ab8159710'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:42 compute-0 NetworkManager[51733]: <info>  [1759268922.7481] manager: (tap9a0264ee-cc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/271)
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.749 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.754 2 INFO os_vif [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6f:cf:2b,bridge_name='br-int',has_traffic_filtering=True,id=9a0264ee-cc33-48a8-b015-36ebc5bfdd43,network=Network(62012511-a944-47e7-b858-e59eabaf741d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a0264ee-cc')
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.881 2 DEBUG nova.virt.libvirt.driver [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.882 2 DEBUG nova.virt.libvirt.driver [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.882 2 DEBUG nova.virt.libvirt.driver [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] No VIF found with MAC fa:16:3e:6f:cf:2b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:48:42 compute-0 nova_compute[192810]: 2025-09-30 21:48:42.883 2 INFO nova.virt.libvirt.driver [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Using config drive
Sep 30 21:48:43 compute-0 nova_compute[192810]: 2025-09-30 21:48:43.090 2 DEBUG oslo_concurrency.lockutils [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:43 compute-0 nova_compute[192810]: 2025-09-30 21:48:43.090 2 DEBUG oslo_concurrency.lockutils [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:43 compute-0 nova_compute[192810]: 2025-09-30 21:48:43.109 2 DEBUG nova.compute.manager [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:48:43 compute-0 nova_compute[192810]: 2025-09-30 21:48:43.230 2 DEBUG oslo_concurrency.lockutils [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:43 compute-0 nova_compute[192810]: 2025-09-30 21:48:43.231 2 DEBUG oslo_concurrency.lockutils [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:43 compute-0 nova_compute[192810]: 2025-09-30 21:48:43.237 2 DEBUG nova.virt.hardware [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:48:43 compute-0 nova_compute[192810]: 2025-09-30 21:48:43.238 2 INFO nova.compute.claims [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:48:43 compute-0 nova_compute[192810]: 2025-09-30 21:48:43.304 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:43 compute-0 nova_compute[192810]: 2025-09-30 21:48:43.345 2 INFO nova.virt.libvirt.driver [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Creating config drive at /var/lib/nova/instances/4ee4a775-05d1-45fc-b4f9-566ab8159710/disk.config
Sep 30 21:48:43 compute-0 nova_compute[192810]: 2025-09-30 21:48:43.350 2 DEBUG oslo_concurrency.processutils [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4ee4a775-05d1-45fc-b4f9-566ab8159710/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwovtorbd execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:48:43 compute-0 nova_compute[192810]: 2025-09-30 21:48:43.394 2 DEBUG nova.compute.provider_tree [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:48:43 compute-0 nova_compute[192810]: 2025-09-30 21:48:43.418 2 DEBUG nova.scheduler.client.report [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:48:43 compute-0 nova_compute[192810]: 2025-09-30 21:48:43.445 2 DEBUG oslo_concurrency.lockutils [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.214s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:43 compute-0 nova_compute[192810]: 2025-09-30 21:48:43.446 2 DEBUG nova.compute.manager [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:48:43 compute-0 nova_compute[192810]: 2025-09-30 21:48:43.484 2 DEBUG oslo_concurrency.processutils [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4ee4a775-05d1-45fc-b4f9-566ab8159710/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwovtorbd" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:48:43 compute-0 nova_compute[192810]: 2025-09-30 21:48:43.511 2 DEBUG nova.compute.manager [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:48:43 compute-0 nova_compute[192810]: 2025-09-30 21:48:43.512 2 DEBUG nova.network.neutron [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:48:43 compute-0 nova_compute[192810]: 2025-09-30 21:48:43.528 2 INFO nova.virt.libvirt.driver [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:48:43 compute-0 kernel: tap9a0264ee-cc: entered promiscuous mode
Sep 30 21:48:43 compute-0 NetworkManager[51733]: <info>  [1759268923.5498] manager: (tap9a0264ee-cc): new Tun device (/org/freedesktop/NetworkManager/Devices/272)
Sep 30 21:48:43 compute-0 ovn_controller[94912]: 2025-09-30T21:48:43Z|00611|binding|INFO|Claiming lport 9a0264ee-cc33-48a8-b015-36ebc5bfdd43 for this chassis.
Sep 30 21:48:43 compute-0 ovn_controller[94912]: 2025-09-30T21:48:43Z|00612|binding|INFO|9a0264ee-cc33-48a8-b015-36ebc5bfdd43: Claiming fa:16:3e:6f:cf:2b 10.100.0.9
Sep 30 21:48:43 compute-0 nova_compute[192810]: 2025-09-30 21:48:43.548 2 DEBUG nova.compute.manager [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:48:43 compute-0 nova_compute[192810]: 2025-09-30 21:48:43.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:43 compute-0 nova_compute[192810]: 2025-09-30 21:48:43.556 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:43 compute-0 nova_compute[192810]: 2025-09-30 21:48:43.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:43.572 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6f:cf:2b 10.100.0.9'], port_security=['fa:16:3e:6f:cf:2b 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '4ee4a775-05d1-45fc-b4f9-566ab8159710', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62012511-a944-47e7-b858-e59eabaf741d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ff42902541948f7a6df344fac87c2b7', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1e439ed0-2f15-472a-8dfd-6e6fb79309ff 6cc48b7c-7cab-41b4-b3d1-c55036450f25', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=818bfa61-11e5-4d95-929c-1b2d08c53e7a, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=9a0264ee-cc33-48a8-b015-36ebc5bfdd43) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:43.573 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 9a0264ee-cc33-48a8-b015-36ebc5bfdd43 in datapath 62012511-a944-47e7-b858-e59eabaf741d bound to our chassis
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:43.574 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 62012511-a944-47e7-b858-e59eabaf741d
Sep 30 21:48:43 compute-0 systemd-udevd[245625]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:48:43 compute-0 systemd-machined[152794]: New machine qemu-77-instance-000000a2.
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:43.587 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[bc70c440-ee8f-4d5a-884e-a7fb1f35d79d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:43.589 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap62012511-a1 in ovnmeta-62012511-a944-47e7-b858-e59eabaf741d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:43.591 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap62012511-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:43.591 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[1ee4d4b3-5814-4384-902c-f5adb773b04a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:43.592 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[8c4963ca-b9a6-4cff-91fe-d00b0c1ab822]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:43 compute-0 NetworkManager[51733]: <info>  [1759268923.5974] device (tap9a0264ee-cc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:48:43 compute-0 NetworkManager[51733]: <info>  [1759268923.5983] device (tap9a0264ee-cc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:43.605 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[d52bdd73-ba68-4de1-ad95-60f257eb97db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:43 compute-0 ovn_controller[94912]: 2025-09-30T21:48:43Z|00613|binding|INFO|Setting lport 9a0264ee-cc33-48a8-b015-36ebc5bfdd43 ovn-installed in OVS
Sep 30 21:48:43 compute-0 ovn_controller[94912]: 2025-09-30T21:48:43Z|00614|binding|INFO|Setting lport 9a0264ee-cc33-48a8-b015-36ebc5bfdd43 up in Southbound
Sep 30 21:48:43 compute-0 nova_compute[192810]: 2025-09-30 21:48:43.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:43 compute-0 systemd[1]: Started Virtual Machine qemu-77-instance-000000a2.
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:43.628 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[de8c99d8-6858-4268-9bd2-11a66bc57776]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:43.657 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[a78b01d4-7768-4f5b-98d5-476275320dd1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:43.662 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[fb93d5c3-7888-4601-bf8f-7a6bd2c6699e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:43 compute-0 NetworkManager[51733]: <info>  [1759268923.6633] manager: (tap62012511-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/273)
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:43.694 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[9965c20b-de25-4779-981a-2532f89fcd8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:43.697 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[95cc0e44-42bc-4a69-8b34-e944649b90b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:43 compute-0 NetworkManager[51733]: <info>  [1759268923.7180] device (tap62012511-a0): carrier: link connected
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:43.723 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[8fca2dbc-d168-4aa9-811e-344af3848d75]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:43 compute-0 nova_compute[192810]: 2025-09-30 21:48:43.725 2 DEBUG nova.compute.manager [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:48:43 compute-0 nova_compute[192810]: 2025-09-30 21:48:43.726 2 DEBUG nova.virt.libvirt.driver [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:48:43 compute-0 nova_compute[192810]: 2025-09-30 21:48:43.727 2 INFO nova.virt.libvirt.driver [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Creating image(s)
Sep 30 21:48:43 compute-0 nova_compute[192810]: 2025-09-30 21:48:43.727 2 DEBUG oslo_concurrency.lockutils [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "/var/lib/nova/instances/a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:43 compute-0 nova_compute[192810]: 2025-09-30 21:48:43.727 2 DEBUG oslo_concurrency.lockutils [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "/var/lib/nova/instances/a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:43 compute-0 nova_compute[192810]: 2025-09-30 21:48:43.728 2 DEBUG oslo_concurrency.lockutils [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "/var/lib/nova/instances/a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:43 compute-0 nova_compute[192810]: 2025-09-30 21:48:43.739 2 DEBUG oslo_concurrency.processutils [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:43.741 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[0024639e-db96-4187-a193-a0709a31a854]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62012511-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3f:85:f3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 185], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 561934, 'reachable_time': 29896, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245659, 'error': None, 'target': 'ovnmeta-62012511-a944-47e7-b858-e59eabaf741d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:43.756 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e804551a-9eb6-4498-ba60-c52b9a15dff9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe3f:85f3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 561934, 'tstamp': 561934}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 245661, 'error': None, 'target': 'ovnmeta-62012511-a944-47e7-b858-e59eabaf741d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:43 compute-0 nova_compute[192810]: 2025-09-30 21:48:43.762 2 DEBUG nova.policy [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:43.772 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[221a38ab-70ee-4ae4-82fe-b07e20d36541]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap62012511-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3f:85:f3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 185], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 561934, 'reachable_time': 29896, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 245662, 'error': None, 'target': 'ovnmeta-62012511-a944-47e7-b858-e59eabaf741d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:43 compute-0 nova_compute[192810]: 2025-09-30 21:48:43.799 2 DEBUG oslo_concurrency.processutils [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:43.799 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[0d0d9706-ecfb-4b05-b476-074ef50223e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:43 compute-0 nova_compute[192810]: 2025-09-30 21:48:43.800 2 DEBUG oslo_concurrency.lockutils [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:43 compute-0 nova_compute[192810]: 2025-09-30 21:48:43.800 2 DEBUG oslo_concurrency.lockutils [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:43 compute-0 nova_compute[192810]: 2025-09-30 21:48:43.811 2 DEBUG oslo_concurrency.processutils [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:43.848 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[4635fb23-5ed3-4740-9739-48de35f44126]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:43.850 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62012511-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:43.850 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:43.851 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap62012511-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:48:43 compute-0 kernel: tap62012511-a0: entered promiscuous mode
Sep 30 21:48:43 compute-0 NetworkManager[51733]: <info>  [1759268923.8547] manager: (tap62012511-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/274)
Sep 30 21:48:43 compute-0 nova_compute[192810]: 2025-09-30 21:48:43.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:43.859 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap62012511-a0, col_values=(('external_ids', {'iface-id': '1dad0101-af5c-4580-b216-17a88dc97c78'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:48:43 compute-0 ovn_controller[94912]: 2025-09-30T21:48:43Z|00615|binding|INFO|Releasing lport 1dad0101-af5c-4580-b216-17a88dc97c78 from this chassis (sb_readonly=0)
Sep 30 21:48:43 compute-0 nova_compute[192810]: 2025-09-30 21:48:43.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:43.862 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/62012511-a944-47e7-b858-e59eabaf741d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/62012511-a944-47e7-b858-e59eabaf741d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:48:43 compute-0 nova_compute[192810]: 2025-09-30 21:48:43.869 2 DEBUG oslo_concurrency.processutils [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:48:43 compute-0 nova_compute[192810]: 2025-09-30 21:48:43.870 2 DEBUG oslo_concurrency.processutils [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:43.869 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[42f55f71-c15f-4607-835b-b3a1c194132f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:43.870 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-62012511-a944-47e7-b858-e59eabaf741d
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/62012511-a944-47e7-b858-e59eabaf741d.pid.haproxy
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID 62012511-a944-47e7-b858-e59eabaf741d
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:48:43 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:43.871 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-62012511-a944-47e7-b858-e59eabaf741d', 'env', 'PROCESS_TAG=haproxy-62012511-a944-47e7-b858-e59eabaf741d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/62012511-a944-47e7-b858-e59eabaf741d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:48:43 compute-0 nova_compute[192810]: 2025-09-30 21:48:43.889 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.911 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '4ee4a775-05d1-45fc-b4f9-566ab8159710', 'name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1704200209', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000a2', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'paused', 'tenant_id': '1ff42902541948f7a6df344fac87c2b7', 'user_id': 'c33a752ef8234bba917ace1e73763490', 'hostId': 'b026ae5ee459f2114e282edfc7173c1634c0f1835360be8cb37a0e87', 'status': 'paused', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.913 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.915 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 4ee4a775-05d1-45fc-b4f9-566ab8159710 / tap9a0264ee-cc inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.916 12 DEBUG ceilometer.compute.pollsters [-] 4ee4a775-05d1-45fc-b4f9-566ab8159710/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f4282185-65d6-4354-8ec2-96b2b28925c3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': 'instance-000000a2-4ee4a775-05d1-45fc-b4f9-566ab8159710-tap9a0264ee-cc', 'timestamp': '2025-09-30T21:48:43.913277', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1704200209', 'name': 'tap9a0264ee-cc', 'instance_id': '4ee4a775-05d1-45fc-b4f9-566ab8159710', 'instance_type': 'm1.nano', 'host': 'b026ae5ee459f2114e282edfc7173c1634c0f1835360be8cb37a0e87', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6f:cf:2b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9a0264ee-cc'}, 'message_id': '3c3fd0f4-9e47-11f0-a153-fa163e09b122', 'monotonic_time': 5619.600781932, 'message_signature': '944b28c1d6f8fdc786864a5413c07a830bd55b82acbeb9a448a66982d0d52a7e'}]}, 'timestamp': '2025-09-30 21:48:43.916887', '_unique_id': '8fb2d99a8aa6442583428901c7ffa6f4'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.917 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:43.920 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Sep 30 21:48:43 compute-0 nova_compute[192810]: 2025-09-30 21:48:43.942 2 DEBUG nova.compute.manager [req-be0f7fb6-f403-4eed-b3f8-8439811d6df9 req-fe1733bc-48db-4661-b517-88bad4baf6cd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Received event network-vif-plugged-9a0264ee-cc33-48a8-b015-36ebc5bfdd43 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:48:43 compute-0 nova_compute[192810]: 2025-09-30 21:48:43.944 2 DEBUG oslo_concurrency.lockutils [req-be0f7fb6-f403-4eed-b3f8-8439811d6df9 req-fe1733bc-48db-4661-b517-88bad4baf6cd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "4ee4a775-05d1-45fc-b4f9-566ab8159710-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:43 compute-0 nova_compute[192810]: 2025-09-30 21:48:43.945 2 DEBUG oslo_concurrency.lockutils [req-be0f7fb6-f403-4eed-b3f8-8439811d6df9 req-fe1733bc-48db-4661-b517-88bad4baf6cd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "4ee4a775-05d1-45fc-b4f9-566ab8159710-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:43 compute-0 nova_compute[192810]: 2025-09-30 21:48:43.945 2 DEBUG oslo_concurrency.lockutils [req-be0f7fb6-f403-4eed-b3f8-8439811d6df9 req-fe1733bc-48db-4661-b517-88bad4baf6cd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "4ee4a775-05d1-45fc-b4f9-566ab8159710-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:43 compute-0 nova_compute[192810]: 2025-09-30 21:48:43.945 2 DEBUG nova.compute.manager [req-be0f7fb6-f403-4eed-b3f8-8439811d6df9 req-fe1733bc-48db-4661-b517-88bad4baf6cd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Processing event network-vif-plugged-9a0264ee-cc33-48a8-b015-36ebc5bfdd43 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:48:43 compute-0 nova_compute[192810]: 2025-09-30 21:48:43.972 2 DEBUG oslo_concurrency.processutils [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8/disk 1073741824" returned: 0 in 0.102s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:48:43 compute-0 nova_compute[192810]: 2025-09-30 21:48:43.973 2 DEBUG oslo_concurrency.lockutils [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.172s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:43 compute-0 nova_compute[192810]: 2025-09-30 21:48:43.973 2 DEBUG oslo_concurrency.processutils [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:48:44 compute-0 nova_compute[192810]: 2025-09-30 21:48:44.030 2 DEBUG oslo_concurrency.processutils [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:48:44 compute-0 nova_compute[192810]: 2025-09-30 21:48:44.031 2 DEBUG nova.virt.disk.api [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Checking if we can resize image /var/lib/nova/instances/a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:48:44 compute-0 nova_compute[192810]: 2025-09-30 21:48:44.031 2 DEBUG oslo_concurrency.processutils [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:48:44 compute-0 nova_compute[192810]: 2025-09-30 21:48:44.097 2 DEBUG oslo_concurrency.processutils [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:48:44 compute-0 nova_compute[192810]: 2025-09-30 21:48:44.098 2 DEBUG nova.virt.disk.api [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Cannot resize image /var/lib/nova/instances/a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:48:44 compute-0 nova_compute[192810]: 2025-09-30 21:48:44.098 2 DEBUG nova.objects.instance [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'migration_context' on Instance uuid a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:48:44 compute-0 nova_compute[192810]: 2025-09-30 21:48:44.119 2 DEBUG nova.virt.libvirt.driver [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:48:44 compute-0 nova_compute[192810]: 2025-09-30 21:48:44.120 2 DEBUG nova.virt.libvirt.driver [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Ensure instance console log exists: /var/lib/nova/instances/a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:48:44 compute-0 nova_compute[192810]: 2025-09-30 21:48:44.120 2 DEBUG oslo_concurrency.lockutils [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:44 compute-0 nova_compute[192810]: 2025-09-30 21:48:44.121 2 DEBUG oslo_concurrency.lockutils [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:44 compute-0 nova_compute[192810]: 2025-09-30 21:48:44.121 2 DEBUG oslo_concurrency.lockutils [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:44 compute-0 podman[245713]: 2025-09-30 21:48:44.213442442 +0000 UTC m=+0.024670242 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:48:44 compute-0 nova_compute[192810]: 2025-09-30 21:48:44.347 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268924.3469994, 4ee4a775-05d1-45fc-b4f9-566ab8159710 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:48:44 compute-0 nova_compute[192810]: 2025-09-30 21:48:44.347 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] VM Started (Lifecycle Event)
Sep 30 21:48:44 compute-0 nova_compute[192810]: 2025-09-30 21:48:44.349 2 DEBUG nova.compute.manager [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:48:44 compute-0 nova_compute[192810]: 2025-09-30 21:48:44.352 2 DEBUG nova.virt.libvirt.driver [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:48:44 compute-0 nova_compute[192810]: 2025-09-30 21:48:44.355 2 INFO nova.virt.libvirt.driver [-] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Instance spawned successfully.
Sep 30 21:48:44 compute-0 nova_compute[192810]: 2025-09-30 21:48:44.355 2 DEBUG nova.virt.libvirt.driver [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.355 12 DEBUG ceilometer.compute.pollsters [-] 4ee4a775-05d1-45fc-b4f9-566ab8159710/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.356 12 DEBUG ceilometer.compute.pollsters [-] 4ee4a775-05d1-45fc-b4f9-566ab8159710/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bd73f504-30bd-4014-b4e4-4c9ddccc93ce', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': '4ee4a775-05d1-45fc-b4f9-566ab8159710-vda', 'timestamp': '2025-09-30T21:48:43.921011', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1704200209', 'name': 'instance-000000a2', 'instance_id': '4ee4a775-05d1-45fc-b4f9-566ab8159710', 'instance_type': 'm1.nano', 'host': 'b026ae5ee459f2114e282edfc7173c1634c0f1835360be8cb37a0e87', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3c82f884-9e47-11f0-a153-fa163e09b122', 'monotonic_time': 5619.608514451, 'message_signature': 'eee527644ea996370b73010b746a64dddf9e5d83cb506b4bdfb438258e106c1c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 
'project_name': None, 'resource_id': '4ee4a775-05d1-45fc-b4f9-566ab8159710-sda', 'timestamp': '2025-09-30T21:48:43.921011', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1704200209', 'name': 'instance-000000a2', 'instance_id': '4ee4a775-05d1-45fc-b4f9-566ab8159710', 'instance_type': 'm1.nano', 'host': 'b026ae5ee459f2114e282edfc7173c1634c0f1835360be8cb37a0e87', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3c8307ca-9e47-11f0-a153-fa163e09b122', 'monotonic_time': 5619.608514451, 'message_signature': 'a45d419f98c780725999077b55aa4f3917d5528779693e8ad53273a2fb6bc275'}]}, 'timestamp': '2025-09-30 21:48:44.357346', '_unique_id': '19537519c62843f2b69fa92d515a94de'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.358 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.359 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Sep 30 21:48:44 compute-0 nova_compute[192810]: 2025-09-30 21:48:44.366 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:48:44 compute-0 nova_compute[192810]: 2025-09-30 21:48:44.371 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:48:44 compute-0 nova_compute[192810]: 2025-09-30 21:48:44.375 2 DEBUG nova.virt.libvirt.driver [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:48:44 compute-0 nova_compute[192810]: 2025-09-30 21:48:44.375 2 DEBUG nova.virt.libvirt.driver [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:48:44 compute-0 nova_compute[192810]: 2025-09-30 21:48:44.376 2 DEBUG nova.virt.libvirt.driver [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:48:44 compute-0 nova_compute[192810]: 2025-09-30 21:48:44.376 2 DEBUG nova.virt.libvirt.driver [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:48:44 compute-0 nova_compute[192810]: 2025-09-30 21:48:44.377 2 DEBUG nova.virt.libvirt.driver [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:48:44 compute-0 nova_compute[192810]: 2025-09-30 21:48:44.377 2 DEBUG nova.virt.libvirt.driver [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.377 12 DEBUG ceilometer.compute.pollsters [-] 4ee4a775-05d1-45fc-b4f9-566ab8159710/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.378 12 DEBUG ceilometer.compute.pollsters [-] 4ee4a775-05d1-45fc-b4f9-566ab8159710/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0460e2b0-98ae-4a22-9ee6-7f3c99f57cca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': '4ee4a775-05d1-45fc-b4f9-566ab8159710-vda', 'timestamp': '2025-09-30T21:48:44.359534', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1704200209', 'name': 'instance-000000a2', 'instance_id': '4ee4a775-05d1-45fc-b4f9-566ab8159710', 'instance_type': 'm1.nano', 'host': 'b026ae5ee459f2114e282edfc7173c1634c0f1835360be8cb37a0e87', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3c863c1a-9e47-11f0-a153-fa163e09b122', 'monotonic_time': 5620.047070448, 'message_signature': '57a7ccfcddfe6a031242783f788732789b897cc12112187307e47fd4542e7cb0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': 
'1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': '4ee4a775-05d1-45fc-b4f9-566ab8159710-sda', 'timestamp': '2025-09-30T21:48:44.359534', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1704200209', 'name': 'instance-000000a2', 'instance_id': '4ee4a775-05d1-45fc-b4f9-566ab8159710', 'instance_type': 'm1.nano', 'host': 'b026ae5ee459f2114e282edfc7173c1634c0f1835360be8cb37a0e87', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3c86480e-9e47-11f0-a153-fa163e09b122', 'monotonic_time': 5620.047070448, 'message_signature': 'bbc6a4c4b36e8e02fe18e5bb6de01c2b2cb64c2ac73144bedb62ca87f69a7b8c'}]}, 'timestamp': '2025-09-30 21:48:44.378651', '_unique_id': 'ad51e033637143ed9a2f08860d77dbd4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.379 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.380 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.380 12 DEBUG ceilometer.compute.pollsters [-] 4ee4a775-05d1-45fc-b4f9-566ab8159710/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.380 12 DEBUG ceilometer.compute.pollsters [-] 4ee4a775-05d1-45fc-b4f9-566ab8159710/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '98f1e5c2-fbd3-4231-9eba-f46092af254a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': '4ee4a775-05d1-45fc-b4f9-566ab8159710-vda', 'timestamp': '2025-09-30T21:48:44.380617', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1704200209', 'name': 'instance-000000a2', 'instance_id': '4ee4a775-05d1-45fc-b4f9-566ab8159710', 'instance_type': 'm1.nano', 'host': 'b026ae5ee459f2114e282edfc7173c1634c0f1835360be8cb37a0e87', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3c869ff2-9e47-11f0-a153-fa163e09b122', 'monotonic_time': 5620.047070448, 'message_signature': '7a9862ac7bc28343d3d6539cbf5c30e4ebd4e66923b4b73d3d269db5d5093126'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': '4ee4a775-05d1-45fc-b4f9-566ab8159710-sda', 'timestamp': '2025-09-30T21:48:44.380617', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1704200209', 'name': 'instance-000000a2', 'instance_id': '4ee4a775-05d1-45fc-b4f9-566ab8159710', 'instance_type': 'm1.nano', 'host': 'b026ae5ee459f2114e282edfc7173c1634c0f1835360be8cb37a0e87', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3c86a90c-9e47-11f0-a153-fa163e09b122', 'monotonic_time': 5620.047070448, 'message_signature': '095df241245d6bd7c13557b054171911784b3d34f7593db97fcebf002b20efcc'}]}, 'timestamp': '2025-09-30 21:48:44.381103', '_unique_id': '8f23c3fc87414313a406aed310476c04'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.381 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.382 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.382 12 DEBUG ceilometer.compute.pollsters [-] 4ee4a775-05d1-45fc-b4f9-566ab8159710/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6d4d1a9b-2ec7-4d60-888d-08b6747f9023', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': 'instance-000000a2-4ee4a775-05d1-45fc-b4f9-566ab8159710-tap9a0264ee-cc', 'timestamp': '2025-09-30T21:48:44.382530', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1704200209', 'name': 'tap9a0264ee-cc', 'instance_id': '4ee4a775-05d1-45fc-b4f9-566ab8159710', 'instance_type': 'm1.nano', 'host': 'b026ae5ee459f2114e282edfc7173c1634c0f1835360be8cb37a0e87', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6f:cf:2b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9a0264ee-cc'}, 'message_id': '3c86eb7e-9e47-11f0-a153-fa163e09b122', 'monotonic_time': 5619.600781932, 'message_signature': '8e8eca3747e6a35e3ad9971ec8c6c90e5787ef2904c342ff192b0c251b93c635'}]}, 'timestamp': '2025-09-30 21:48:44.382818', '_unique_id': '60045e38405c4ff092d49f3573ab2f23'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.383 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.384 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.384 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.384 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1704200209>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1704200209>]
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.384 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.384 12 DEBUG ceilometer.compute.pollsters [-] 4ee4a775-05d1-45fc-b4f9-566ab8159710/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.384 12 DEBUG ceilometer.compute.pollsters [-] 4ee4a775-05d1-45fc-b4f9-566ab8159710/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '711a6f66-1ab7-4d12-ae1c-17b0d9ff6dca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': '4ee4a775-05d1-45fc-b4f9-566ab8159710-vda', 'timestamp': '2025-09-30T21:48:44.384722', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1704200209', 'name': 'instance-000000a2', 'instance_id': '4ee4a775-05d1-45fc-b4f9-566ab8159710', 'instance_type': 'm1.nano', 'host': 'b026ae5ee459f2114e282edfc7173c1634c0f1835360be8cb37a0e87', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3c873fd4-9e47-11f0-a153-fa163e09b122', 'monotonic_time': 5620.047070448, 'message_signature': '55797a96bb9ee97a55c468fd0d05f566c646bc90181ff5b7cce41c29b069a2c6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': '4ee4a775-05d1-45fc-b4f9-566ab8159710-sda', 'timestamp': '2025-09-30T21:48:44.384722', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1704200209', 'name': 'instance-000000a2', 'instance_id': '4ee4a775-05d1-45fc-b4f9-566ab8159710', 'instance_type': 'm1.nano', 'host': 'b026ae5ee459f2114e282edfc7173c1634c0f1835360be8cb37a0e87', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3c8748da-9e47-11f0-a153-fa163e09b122', 'monotonic_time': 5620.047070448, 'message_signature': '748cf13f9f4574018686a02393786bad64107a1e25e4211b3070f16104065dea'}]}, 'timestamp': '2025-09-30 21:48:44.385196', '_unique_id': '347f69c17ebd40fba672d64cd6b2f030'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.385 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.386 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.399 12 DEBUG ceilometer.compute.pollsters [-] 4ee4a775-05d1-45fc-b4f9-566ab8159710/cpu volume: 20000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f20dcf78-0f83-408d-8558-dc9002b0dedd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 20000000, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': '4ee4a775-05d1-45fc-b4f9-566ab8159710', 'timestamp': '2025-09-30T21:48:44.386755', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1704200209', 'name': 'instance-000000a2', 'instance_id': '4ee4a775-05d1-45fc-b4f9-566ab8159710', 'instance_type': 'm1.nano', 'host': 'b026ae5ee459f2114e282edfc7173c1634c0f1835360be8cb37a0e87', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '3c898b7c-9e47-11f0-a153-fa163e09b122', 'monotonic_time': 5620.087034403, 'message_signature': '6c039a0b6954a084e2c551dec09d0f4658012e31f45e3412ab6b728d75cdfaad'}]}, 'timestamp': '2025-09-30 21:48:44.400083', '_unique_id': '65d67f53f69f477e95a824eaa77dac09'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.400 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.402 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.402 12 DEBUG ceilometer.compute.pollsters [-] 4ee4a775-05d1-45fc-b4f9-566ab8159710/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.402 12 DEBUG ceilometer.compute.pollsters [-] 4ee4a775-05d1-45fc-b4f9-566ab8159710/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '89dad81a-90be-46ec-94d5-6f8026346b4d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': '4ee4a775-05d1-45fc-b4f9-566ab8159710-vda', 'timestamp': '2025-09-30T21:48:44.402402', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1704200209', 'name': 'instance-000000a2', 'instance_id': '4ee4a775-05d1-45fc-b4f9-566ab8159710', 'instance_type': 'm1.nano', 'host': 'b026ae5ee459f2114e282edfc7173c1634c0f1835360be8cb37a0e87', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3c89f58a-9e47-11f0-a153-fa163e09b122', 'monotonic_time': 5620.047070448, 'message_signature': '79b2d74bb69c0ccda3ebfca9920cb3fa69fe3889e8a06fade43939eb65bfd821'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': '4ee4a775-05d1-45fc-b4f9-566ab8159710-sda', 'timestamp': '2025-09-30T21:48:44.402402', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1704200209', 'name': 'instance-000000a2', 'instance_id': '4ee4a775-05d1-45fc-b4f9-566ab8159710', 'instance_type': 'm1.nano', 'host': 'b026ae5ee459f2114e282edfc7173c1634c0f1835360be8cb37a0e87', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3c8a0228-9e47-11f0-a153-fa163e09b122', 'monotonic_time': 5620.047070448, 'message_signature': '5f9ef39a7635511662dce8b33ec2ae774224d4e94f675f51d9b1cd826304fab2'}]}, 'timestamp': '2025-09-30 21:48:44.403088', '_unique_id': 'a73701f16a4f47b7af45a44c45ab3f49'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.403 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.404 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.404 12 DEBUG ceilometer.compute.pollsters [-] 4ee4a775-05d1-45fc-b4f9-566ab8159710/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 DEBUG ceilometer.compute.pollsters [-] 4ee4a775-05d1-45fc-b4f9-566ab8159710/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd80033ad-f157-4105-8944-bcc13392dea6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': '4ee4a775-05d1-45fc-b4f9-566ab8159710-vda', 'timestamp': '2025-09-30T21:48:44.404755', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1704200209', 'name': 'instance-000000a2', 'instance_id': '4ee4a775-05d1-45fc-b4f9-566ab8159710', 'instance_type': 'm1.nano', 'host': 'b026ae5ee459f2114e282edfc7173c1634c0f1835360be8cb37a0e87', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3c8a50de-9e47-11f0-a153-fa163e09b122', 'monotonic_time': 5620.047070448, 'message_signature': 'bce846e2927fa22a0f71a1352cb463268d8fa7a7eb00f63820d29cfc8e4bf7d3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': '4ee4a775-05d1-45fc-b4f9-566ab8159710-sda', 'timestamp': '2025-09-30T21:48:44.404755', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1704200209', 'name': 'instance-000000a2', 'instance_id': '4ee4a775-05d1-45fc-b4f9-566ab8159710', 'instance_type': 'm1.nano', 'host': 'b026ae5ee459f2114e282edfc7173c1634c0f1835360be8cb37a0e87', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3c8a5b1a-9e47-11f0-a153-fa163e09b122', 'monotonic_time': 5620.047070448, 'message_signature': '22b013876af4aae2ebf428fcb660ce7d47f3e574c6e924daefdcbc44c5fe8d85'}]}, 'timestamp': '2025-09-30 21:48:44.405341', '_unique_id': 'db7913536a2a45c09abc012b41c8f26f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.405 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.407 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.407 12 DEBUG ceilometer.compute.pollsters [-] 4ee4a775-05d1-45fc-b4f9-566ab8159710/network.incoming.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c667696c-6607-4a1a-8537-89c94285b83b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': 'instance-000000a2-4ee4a775-05d1-45fc-b4f9-566ab8159710-tap9a0264ee-cc', 'timestamp': '2025-09-30T21:48:44.407417', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1704200209', 'name': 'tap9a0264ee-cc', 'instance_id': '4ee4a775-05d1-45fc-b4f9-566ab8159710', 'instance_type': 'm1.nano', 'host': 'b026ae5ee459f2114e282edfc7173c1634c0f1835360be8cb37a0e87', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6f:cf:2b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9a0264ee-cc'}, 'message_id': '3c8ab970-9e47-11f0-a153-fa163e09b122', 'monotonic_time': 5619.600781932, 'message_signature': '9a031ca421b37c0851c9e559e1ab6c81bd45d932131282efef2d9eee32a0caed'}]}, 'timestamp': '2025-09-30 21:48:44.407763', '_unique_id': 'a9f3f0e6e67b48fdbaf110285589e8e3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.408 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.409 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.409 12 DEBUG ceilometer.compute.pollsters [-] 4ee4a775-05d1-45fc-b4f9-566ab8159710/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '760a40c9-5be8-4454-8a83-c6b019cf8ab2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': 'instance-000000a2-4ee4a775-05d1-45fc-b4f9-566ab8159710-tap9a0264ee-cc', 'timestamp': '2025-09-30T21:48:44.409442', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1704200209', 'name': 'tap9a0264ee-cc', 'instance_id': '4ee4a775-05d1-45fc-b4f9-566ab8159710', 'instance_type': 'm1.nano', 'host': 'b026ae5ee459f2114e282edfc7173c1634c0f1835360be8cb37a0e87', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6f:cf:2b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9a0264ee-cc'}, 'message_id': '3c8b0894-9e47-11f0-a153-fa163e09b122', 'monotonic_time': 5619.600781932, 'message_signature': 'df82d4e71586ed13330dbbab7840f31f7eb7d25c821f29cb479f1ff61f34430e'}]}, 'timestamp': '2025-09-30 21:48:44.409811', '_unique_id': 'f0f7e605d00249098a319cd6667f4281'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.410 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.411 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.411 12 DEBUG ceilometer.compute.pollsters [-] 4ee4a775-05d1-45fc-b4f9-566ab8159710/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:48:44 compute-0 nova_compute[192810]: 2025-09-30 21:48:44.412 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a1271820-97ce-4e96-8c5d-aa42fbb095d8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': 'instance-000000a2-4ee4a775-05d1-45fc-b4f9-566ab8159710-tap9a0264ee-cc', 'timestamp': '2025-09-30T21:48:44.411427', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1704200209', 'name': 'tap9a0264ee-cc', 'instance_id': '4ee4a775-05d1-45fc-b4f9-566ab8159710', 'instance_type': 'm1.nano', 'host': 'b026ae5ee459f2114e282edfc7173c1634c0f1835360be8cb37a0e87', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6f:cf:2b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9a0264ee-cc'}, 'message_id': '3c8b5452-9e47-11f0-a153-fa163e09b122', 'monotonic_time': 5619.600781932, 'message_signature': '59c4cf32a8e40f77e981dd0bf22ec95bf65debb37e0f2f5e2b1120d26892fdff'}]}, 'timestamp': '2025-09-30 21:48:44.411717', '_unique_id': 'de0c0d58d17c4a3eba70b99bf90d8a6e'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:48:44 compute-0 nova_compute[192810]: 2025-09-30 21:48:44.412 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268924.347155, 4ee4a775-05d1-45fc-b4f9-566ab8159710 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.412 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.413 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Sep 30 21:48:44 compute-0 nova_compute[192810]: 2025-09-30 21:48:44.413 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] VM Paused (Lifecycle Event)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.413 12 DEBUG ceilometer.compute.pollsters [-] 4ee4a775-05d1-45fc-b4f9-566ab8159710/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.413 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic is not available for instance 4ee4a775-05d1-45fc-b4f9-566ab8159710: ceilometer.compute.pollsters.NoVolumeException
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.413 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.413 12 DEBUG ceilometer.compute.pollsters [-] 4ee4a775-05d1-45fc-b4f9-566ab8159710/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.413 12 DEBUG ceilometer.compute.pollsters [-] 4ee4a775-05d1-45fc-b4f9-566ab8159710/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fa5449b0-3571-4526-ad3d-713b71eac410', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': '4ee4a775-05d1-45fc-b4f9-566ab8159710-vda', 'timestamp': '2025-09-30T21:48:44.413525', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1704200209', 'name': 'instance-000000a2', 'instance_id': '4ee4a775-05d1-45fc-b4f9-566ab8159710', 'instance_type': 'm1.nano', 'host': 'b026ae5ee459f2114e282edfc7173c1634c0f1835360be8cb37a0e87', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3c8ba92a-9e47-11f0-a153-fa163e09b122', 'monotonic_time': 5619.608514451, 'message_signature': '574fd5b404d7a167fd27355404cf8539500389341b8e747ed70132b1dba8637a'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': '4ee4a775-05d1-45fc-b4f9-566ab8159710-sda', 'timestamp': '2025-09-30T21:48:44.413525', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1704200209', 'name': 'instance-000000a2', 'instance_id': '4ee4a775-05d1-45fc-b4f9-566ab8159710', 'instance_type': 'm1.nano', 'host': 'b026ae5ee459f2114e282edfc7173c1634c0f1835360be8cb37a0e87', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3c8bb2e4-9e47-11f0-a153-fa163e09b122', 'monotonic_time': 5619.608514451, 'message_signature': 'c662f5674e05a3987bf6f201bcbb62e7e586a07e6d26390476bc11ea28795468'}]}, 'timestamp': '2025-09-30 21:48:44.414126', '_unique_id': '61d80d9c91184776ab9d2e3646773ef1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.414 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.415 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.415 12 DEBUG ceilometer.compute.pollsters [-] 4ee4a775-05d1-45fc-b4f9-566ab8159710/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 DEBUG ceilometer.compute.pollsters [-] 4ee4a775-05d1-45fc-b4f9-566ab8159710/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4945708a-e268-48ba-81b1-29f0cd0d3eb3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': '4ee4a775-05d1-45fc-b4f9-566ab8159710-vda', 'timestamp': '2025-09-30T21:48:44.415778', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1704200209', 'name': 'instance-000000a2', 'instance_id': '4ee4a775-05d1-45fc-b4f9-566ab8159710', 'instance_type': 'm1.nano', 'host': 'b026ae5ee459f2114e282edfc7173c1634c0f1835360be8cb37a0e87', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3c8bff88-9e47-11f0-a153-fa163e09b122', 'monotonic_time': 5620.047070448, 'message_signature': 'abe2d32fcdb8b859ce66db33aff39e60cdf43b5936913ab065d4d1512bb9f9d9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': '4ee4a775-05d1-45fc-b4f9-566ab8159710-sda', 'timestamp': '2025-09-30T21:48:44.415778', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1704200209', 'name': 'instance-000000a2', 'instance_id': '4ee4a775-05d1-45fc-b4f9-566ab8159710', 'instance_type': 'm1.nano', 'host': 'b026ae5ee459f2114e282edfc7173c1634c0f1835360be8cb37a0e87', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3c8c08c0-9e47-11f0-a153-fa163e09b122', 'monotonic_time': 5620.047070448, 'message_signature': 'e5a34021838e3b85ed8ef101382a65a361342f07b1bac8d2e867eaa1b0213244'}]}, 'timestamp': '2025-09-30 21:48:44.416320', '_unique_id': '87729902ec0a4e01b8a44ec368d3d3ec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.416 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.417 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.417 12 DEBUG ceilometer.compute.pollsters [-] 4ee4a775-05d1-45fc-b4f9-566ab8159710/network.incoming.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b781b61e-2432-4f43-9062-ff94a8fd8ba3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': 'instance-000000a2-4ee4a775-05d1-45fc-b4f9-566ab8159710-tap9a0264ee-cc', 'timestamp': '2025-09-30T21:48:44.417812', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1704200209', 'name': 'tap9a0264ee-cc', 'instance_id': '4ee4a775-05d1-45fc-b4f9-566ab8159710', 'instance_type': 'm1.nano', 'host': 'b026ae5ee459f2114e282edfc7173c1634c0f1835360be8cb37a0e87', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6f:cf:2b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9a0264ee-cc'}, 'message_id': '3c8c4cf4-9e47-11f0-a153-fa163e09b122', 'monotonic_time': 5619.600781932, 'message_signature': '72773f8677a1fdf409057813720f3ad1871382cdac986c351a38058e80027ee9'}]}, 'timestamp': '2025-09-30 21:48:44.418081', '_unique_id': 'd1c7928c9e624016abec88bffa3b8d71'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.418 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.419 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.419 12 DEBUG ceilometer.compute.pollsters [-] 4ee4a775-05d1-45fc-b4f9-566ab8159710/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.419 12 DEBUG ceilometer.compute.pollsters [-] 4ee4a775-05d1-45fc-b4f9-566ab8159710/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '77176292-18a4-4917-9127-6e30dd6fbe24', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': '4ee4a775-05d1-45fc-b4f9-566ab8159710-vda', 'timestamp': '2025-09-30T21:48:44.419529', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1704200209', 'name': 'instance-000000a2', 'instance_id': '4ee4a775-05d1-45fc-b4f9-566ab8159710', 'instance_type': 'm1.nano', 'host': 'b026ae5ee459f2114e282edfc7173c1634c0f1835360be8cb37a0e87', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3c8c922c-9e47-11f0-a153-fa163e09b122', 'monotonic_time': 5619.608514451, 'message_signature': '0ef6febff1a8e99b86092662f04801e410ad72016080e37ce66954c9ec02d3e9'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': '4ee4a775-05d1-45fc-b4f9-566ab8159710-sda', 'timestamp': '2025-09-30T21:48:44.419529', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1704200209', 'name': 'instance-000000a2', 'instance_id': '4ee4a775-05d1-45fc-b4f9-566ab8159710', 'instance_type': 'm1.nano', 'host': 'b026ae5ee459f2114e282edfc7173c1634c0f1835360be8cb37a0e87', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3c8c9e8e-9e47-11f0-a153-fa163e09b122', 'monotonic_time': 5619.608514451, 'message_signature': '981795044348dc0155365397702215ebc2c602a3b42a4b0bbb3ba83db0ec8f4d'}]}, 'timestamp': '2025-09-30 21:48:44.420197', '_unique_id': '27f4f0daae6247e583b086a4b23044e9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:48:44 compute-0 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:48:44 compute-0 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.420 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.422 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.422 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.422 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1704200209>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1704200209>]
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.422 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.422 12 DEBUG ceilometer.compute.pollsters [-] 4ee4a775-05d1-45fc-b4f9-566ab8159710/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '89f3c8bb-9d7f-4dbf-9b23-e87164f0f763', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': 'instance-000000a2-4ee4a775-05d1-45fc-b4f9-566ab8159710-tap9a0264ee-cc', 'timestamp': '2025-09-30T21:48:44.422533', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1704200209', 'name': 'tap9a0264ee-cc', 'instance_id': '4ee4a775-05d1-45fc-b4f9-566ab8159710', 'instance_type': 'm1.nano', 'host': 'b026ae5ee459f2114e282edfc7173c1634c0f1835360be8cb37a0e87', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6f:cf:2b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9a0264ee-cc'}, 'message_id': '3c8d0716-9e47-11f0-a153-fa163e09b122', 'monotonic_time': 5619.600781932, 'message_signature': '045a9d8e0f1dcbfaa453c6011b53ed99c9ac03a3b7e79b250909a8b64d96fffc'}]}, 'timestamp': '2025-09-30 21:48:44.422864', '_unique_id': '6d0e2c8425614d83859ae9fbc86cc41c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.423 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.424 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.424 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.424 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1704200209>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1704200209>]
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.425 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.425 12 DEBUG ceilometer.compute.pollsters [-] 4ee4a775-05d1-45fc-b4f9-566ab8159710/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '97e436b6-676b-4e00-be61-c5335a06aab0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': 'instance-000000a2-4ee4a775-05d1-45fc-b4f9-566ab8159710-tap9a0264ee-cc', 'timestamp': '2025-09-30T21:48:44.425344', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1704200209', 'name': 'tap9a0264ee-cc', 'instance_id': '4ee4a775-05d1-45fc-b4f9-566ab8159710', 'instance_type': 'm1.nano', 'host': 'b026ae5ee459f2114e282edfc7173c1634c0f1835360be8cb37a0e87', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6f:cf:2b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9a0264ee-cc'}, 'message_id': '3c8d8074-9e47-11f0-a153-fa163e09b122', 'monotonic_time': 5619.600781932, 'message_signature': '81bf76f0d91198009fc4bf2fe89ea43a15404308fb2e1d83d5be66ed6fa94e49'}]}, 'timestamp': '2025-09-30 21:48:44.425968', '_unique_id': '5fe54b68682c4aa28b29d81a1a0857c2'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.426 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.427 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.427 12 DEBUG ceilometer.compute.pollsters [-] 4ee4a775-05d1-45fc-b4f9-566ab8159710/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5b90b7dc-b885-4926-b7a1-db62ffd226d0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': 'instance-000000a2-4ee4a775-05d1-45fc-b4f9-566ab8159710-tap9a0264ee-cc', 'timestamp': '2025-09-30T21:48:44.427579', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1704200209', 'name': 'tap9a0264ee-cc', 'instance_id': '4ee4a775-05d1-45fc-b4f9-566ab8159710', 'instance_type': 'm1.nano', 'host': 'b026ae5ee459f2114e282edfc7173c1634c0f1835360be8cb37a0e87', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6f:cf:2b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9a0264ee-cc'}, 'message_id': '3c8dca98-9e47-11f0-a153-fa163e09b122', 'monotonic_time': 5619.600781932, 'message_signature': '86677b91dadb8e4a8ce43ee505ce6f95d778ead8abf1d5f9a126c6f4b0d5e1f0'}]}, 'timestamp': '2025-09-30 21:48:44.427866', '_unique_id': 'c336b965bc1e4783b4d740d41943fd90'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.428 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.429 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.429 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.429 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1704200209>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1704200209>]
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.429 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.429 12 DEBUG ceilometer.compute.pollsters [-] 4ee4a775-05d1-45fc-b4f9-566ab8159710/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Sep 30 21:48:44 compute-0 nova_compute[192810]: 2025-09-30 21:48:44.429 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '94e39936-eb07-4cdc-a49d-7d9934e8ba7d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'c33a752ef8234bba917ace1e73763490', 'user_name': None, 'project_id': '1ff42902541948f7a6df344fac87c2b7', 'project_name': None, 'resource_id': 'instance-000000a2-4ee4a775-05d1-45fc-b4f9-566ab8159710-tap9a0264ee-cc', 'timestamp': '2025-09-30T21:48:44.429713', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1704200209', 'name': 'tap9a0264ee-cc', 'instance_id': '4ee4a775-05d1-45fc-b4f9-566ab8159710', 'instance_type': 'm1.nano', 'host': 'b026ae5ee459f2114e282edfc7173c1634c0f1835360be8cb37a0e87', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'image_ref': '86b6907c-d747-4e98-8897-42105915831d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6f:cf:2b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9a0264ee-cc'}, 'message_id': '3c8e1d90-9e47-11f0-a153-fa163e09b122', 'monotonic_time': 5619.600781932, 'message_signature': '7f3b72b01bebc912abfac075195732de7a358e3d9f513df1dc169e97cfef0406'}]}, 'timestamp': '2025-09-30 21:48:44.429973', '_unique_id': '7035f6095197461eb359117c5d081596'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging     yield
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Sep 30 21:48:44 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:48:44.430 12 ERROR oslo_messaging.notify.messaging 
Sep 30 21:48:44 compute-0 nova_compute[192810]: 2025-09-30 21:48:44.437 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268924.3518765, 4ee4a775-05d1-45fc-b4f9-566ab8159710 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:48:44 compute-0 nova_compute[192810]: 2025-09-30 21:48:44.437 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] VM Resumed (Lifecycle Event)
Sep 30 21:48:44 compute-0 nova_compute[192810]: 2025-09-30 21:48:44.463 2 INFO nova.compute.manager [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Took 7.28 seconds to spawn the instance on the hypervisor.
Sep 30 21:48:44 compute-0 nova_compute[192810]: 2025-09-30 21:48:44.463 2 DEBUG nova.compute.manager [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:48:44 compute-0 nova_compute[192810]: 2025-09-30 21:48:44.473 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:48:44 compute-0 nova_compute[192810]: 2025-09-30 21:48:44.475 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:48:44 compute-0 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 21:48:44 compute-0 nova_compute[192810]: 2025-09-30 21:48:44.516 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:48:44 compute-0 nova_compute[192810]: 2025-09-30 21:48:44.575 2 INFO nova.compute.manager [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Took 7.84 seconds to build instance.
Sep 30 21:48:44 compute-0 podman[245713]: 2025-09-30 21:48:44.589760161 +0000 UTC m=+0.400987931 container create 7a851eda03a0ee4834a8c43db0ec8531a6a7c80e79b70a947ad00ab3b8558611 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62012511-a944-47e7-b858-e59eabaf741d, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS)
Sep 30 21:48:44 compute-0 nova_compute[192810]: 2025-09-30 21:48:44.604 2 DEBUG oslo_concurrency.lockutils [None req-b7c670bd-b830-4472-a977-4d6557dfc149 c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "4ee4a775-05d1-45fc-b4f9-566ab8159710" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.008s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:44 compute-0 nova_compute[192810]: 2025-09-30 21:48:44.647 2 DEBUG nova.network.neutron [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Successfully created port: 2b51d8f4-bdd2-4223-8c9f-2bff82319a8a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:48:44 compute-0 nova_compute[192810]: 2025-09-30 21:48:44.667 2 DEBUG nova.network.neutron [req-00e15836-2f2e-4f23-ab67-517f433697a9 req-4a73079a-3f86-4d44-ab88-cbdc16e2c4c3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Updated VIF entry in instance network info cache for port 9a0264ee-cc33-48a8-b015-36ebc5bfdd43. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:48:44 compute-0 nova_compute[192810]: 2025-09-30 21:48:44.668 2 DEBUG nova.network.neutron [req-00e15836-2f2e-4f23-ab67-517f433697a9 req-4a73079a-3f86-4d44-ab88-cbdc16e2c4c3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Updating instance_info_cache with network_info: [{"id": "9a0264ee-cc33-48a8-b015-36ebc5bfdd43", "address": "fa:16:3e:6f:cf:2b", "network": {"id": "62012511-a944-47e7-b858-e59eabaf741d", "bridge": "br-int", "label": "tempest-network-smoke--1420028275", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a0264ee-cc", "ovs_interfaceid": "9a0264ee-cc33-48a8-b015-36ebc5bfdd43", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:48:44 compute-0 nova_compute[192810]: 2025-09-30 21:48:44.683 2 DEBUG oslo_concurrency.lockutils [req-00e15836-2f2e-4f23-ab67-517f433697a9 req-4a73079a-3f86-4d44-ab88-cbdc16e2c4c3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-4ee4a775-05d1-45fc-b4f9-566ab8159710" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:48:44 compute-0 systemd[1]: Started libpod-conmon-7a851eda03a0ee4834a8c43db0ec8531a6a7c80e79b70a947ad00ab3b8558611.scope.
Sep 30 21:48:44 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:48:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4e51f8436c345a32ba7a85d9912b0f31aba28ac7e78ab022322e8ff5e9e272fb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:48:44 compute-0 podman[245713]: 2025-09-30 21:48:44.900326796 +0000 UTC m=+0.711554586 container init 7a851eda03a0ee4834a8c43db0ec8531a6a7c80e79b70a947ad00ab3b8558611 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62012511-a944-47e7-b858-e59eabaf741d, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS)
Sep 30 21:48:44 compute-0 podman[245713]: 2025-09-30 21:48:44.905527763 +0000 UTC m=+0.716755533 container start 7a851eda03a0ee4834a8c43db0ec8531a6a7c80e79b70a947ad00ab3b8558611 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62012511-a944-47e7-b858-e59eabaf741d, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:48:44 compute-0 neutron-haproxy-ovnmeta-62012511-a944-47e7-b858-e59eabaf741d[245729]: [NOTICE]   (245733) : New worker (245735) forked
Sep 30 21:48:44 compute-0 neutron-haproxy-ovnmeta-62012511-a944-47e7-b858-e59eabaf741d[245729]: [NOTICE]   (245733) : Loading success.
Sep 30 21:48:45 compute-0 nova_compute[192810]: 2025-09-30 21:48:45.473 2 DEBUG nova.network.neutron [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Successfully updated port: 2b51d8f4-bdd2-4223-8c9f-2bff82319a8a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:48:45 compute-0 nova_compute[192810]: 2025-09-30 21:48:45.489 2 DEBUG oslo_concurrency.lockutils [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "refresh_cache-a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:48:45 compute-0 nova_compute[192810]: 2025-09-30 21:48:45.489 2 DEBUG oslo_concurrency.lockutils [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquired lock "refresh_cache-a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:48:45 compute-0 nova_compute[192810]: 2025-09-30 21:48:45.489 2 DEBUG nova.network.neutron [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:48:45 compute-0 nova_compute[192810]: 2025-09-30 21:48:45.822 2 DEBUG nova.network.neutron [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:48:46 compute-0 nova_compute[192810]: 2025-09-30 21:48:46.006 2 DEBUG nova.compute.manager [req-b5e05be0-4c36-486b-a307-f79d833f5210 req-e1ea52a3-98c7-4232-a091-733066d83824 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Received event network-changed-2b51d8f4-bdd2-4223-8c9f-2bff82319a8a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:48:46 compute-0 nova_compute[192810]: 2025-09-30 21:48:46.006 2 DEBUG nova.compute.manager [req-b5e05be0-4c36-486b-a307-f79d833f5210 req-e1ea52a3-98c7-4232-a091-733066d83824 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Refreshing instance network info cache due to event network-changed-2b51d8f4-bdd2-4223-8c9f-2bff82319a8a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:48:46 compute-0 nova_compute[192810]: 2025-09-30 21:48:46.007 2 DEBUG oslo_concurrency.lockutils [req-b5e05be0-4c36-486b-a307-f79d833f5210 req-e1ea52a3-98c7-4232-a091-733066d83824 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:48:46 compute-0 nova_compute[192810]: 2025-09-30 21:48:46.165 2 DEBUG nova.compute.manager [req-65a2cc41-6b96-4845-9f04-06b3b42c1c15 req-a30f7f1e-1e78-4cc3-9d97-188918e5b668 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Received event network-vif-plugged-9a0264ee-cc33-48a8-b015-36ebc5bfdd43 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:48:46 compute-0 nova_compute[192810]: 2025-09-30 21:48:46.166 2 DEBUG oslo_concurrency.lockutils [req-65a2cc41-6b96-4845-9f04-06b3b42c1c15 req-a30f7f1e-1e78-4cc3-9d97-188918e5b668 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "4ee4a775-05d1-45fc-b4f9-566ab8159710-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:46 compute-0 nova_compute[192810]: 2025-09-30 21:48:46.167 2 DEBUG oslo_concurrency.lockutils [req-65a2cc41-6b96-4845-9f04-06b3b42c1c15 req-a30f7f1e-1e78-4cc3-9d97-188918e5b668 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "4ee4a775-05d1-45fc-b4f9-566ab8159710-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:46 compute-0 nova_compute[192810]: 2025-09-30 21:48:46.167 2 DEBUG oslo_concurrency.lockutils [req-65a2cc41-6b96-4845-9f04-06b3b42c1c15 req-a30f7f1e-1e78-4cc3-9d97-188918e5b668 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "4ee4a775-05d1-45fc-b4f9-566ab8159710-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:46 compute-0 nova_compute[192810]: 2025-09-30 21:48:46.167 2 DEBUG nova.compute.manager [req-65a2cc41-6b96-4845-9f04-06b3b42c1c15 req-a30f7f1e-1e78-4cc3-9d97-188918e5b668 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] No waiting events found dispatching network-vif-plugged-9a0264ee-cc33-48a8-b015-36ebc5bfdd43 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:48:46 compute-0 nova_compute[192810]: 2025-09-30 21:48:46.168 2 WARNING nova.compute.manager [req-65a2cc41-6b96-4845-9f04-06b3b42c1c15 req-a30f7f1e-1e78-4cc3-9d97-188918e5b668 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Received unexpected event network-vif-plugged-9a0264ee-cc33-48a8-b015-36ebc5bfdd43 for instance with vm_state active and task_state None.
Sep 30 21:48:46 compute-0 nova_compute[192810]: 2025-09-30 21:48:46.932 2 DEBUG nova.network.neutron [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Updating instance_info_cache with network_info: [{"id": "2b51d8f4-bdd2-4223-8c9f-2bff82319a8a", "address": "fa:16:3e:8b:fb:b1", "network": {"id": "dd58cd77-6f38-46a4-b4e3-b10538139b43", "bridge": "br-int", "label": "tempest-network-smoke--1535188764", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b51d8f4-bd", "ovs_interfaceid": "2b51d8f4-bdd2-4223-8c9f-2bff82319a8a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:48:46 compute-0 nova_compute[192810]: 2025-09-30 21:48:46.963 2 DEBUG oslo_concurrency.lockutils [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Releasing lock "refresh_cache-a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:48:46 compute-0 nova_compute[192810]: 2025-09-30 21:48:46.964 2 DEBUG nova.compute.manager [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Instance network_info: |[{"id": "2b51d8f4-bdd2-4223-8c9f-2bff82319a8a", "address": "fa:16:3e:8b:fb:b1", "network": {"id": "dd58cd77-6f38-46a4-b4e3-b10538139b43", "bridge": "br-int", "label": "tempest-network-smoke--1535188764", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b51d8f4-bd", "ovs_interfaceid": "2b51d8f4-bdd2-4223-8c9f-2bff82319a8a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:48:46 compute-0 nova_compute[192810]: 2025-09-30 21:48:46.964 2 DEBUG oslo_concurrency.lockutils [req-b5e05be0-4c36-486b-a307-f79d833f5210 req-e1ea52a3-98c7-4232-a091-733066d83824 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:48:46 compute-0 nova_compute[192810]: 2025-09-30 21:48:46.964 2 DEBUG nova.network.neutron [req-b5e05be0-4c36-486b-a307-f79d833f5210 req-e1ea52a3-98c7-4232-a091-733066d83824 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Refreshing network info cache for port 2b51d8f4-bdd2-4223-8c9f-2bff82319a8a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:48:46 compute-0 nova_compute[192810]: 2025-09-30 21:48:46.967 2 DEBUG nova.virt.libvirt.driver [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Start _get_guest_xml network_info=[{"id": "2b51d8f4-bdd2-4223-8c9f-2bff82319a8a", "address": "fa:16:3e:8b:fb:b1", "network": {"id": "dd58cd77-6f38-46a4-b4e3-b10538139b43", "bridge": "br-int", "label": "tempest-network-smoke--1535188764", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b51d8f4-bd", "ovs_interfaceid": "2b51d8f4-bdd2-4223-8c9f-2bff82319a8a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:48:46 compute-0 nova_compute[192810]: 2025-09-30 21:48:46.970 2 WARNING nova.virt.libvirt.driver [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:48:46 compute-0 nova_compute[192810]: 2025-09-30 21:48:46.975 2 DEBUG nova.virt.libvirt.host [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:48:46 compute-0 nova_compute[192810]: 2025-09-30 21:48:46.976 2 DEBUG nova.virt.libvirt.host [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:48:46 compute-0 nova_compute[192810]: 2025-09-30 21:48:46.989 2 DEBUG nova.virt.libvirt.host [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:48:46 compute-0 nova_compute[192810]: 2025-09-30 21:48:46.990 2 DEBUG nova.virt.libvirt.host [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:48:46 compute-0 nova_compute[192810]: 2025-09-30 21:48:46.991 2 DEBUG nova.virt.libvirt.driver [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:48:46 compute-0 nova_compute[192810]: 2025-09-30 21:48:46.991 2 DEBUG nova.virt.hardware [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:48:46 compute-0 nova_compute[192810]: 2025-09-30 21:48:46.992 2 DEBUG nova.virt.hardware [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:48:46 compute-0 nova_compute[192810]: 2025-09-30 21:48:46.992 2 DEBUG nova.virt.hardware [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:48:46 compute-0 nova_compute[192810]: 2025-09-30 21:48:46.992 2 DEBUG nova.virt.hardware [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:48:46 compute-0 nova_compute[192810]: 2025-09-30 21:48:46.992 2 DEBUG nova.virt.hardware [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:48:46 compute-0 nova_compute[192810]: 2025-09-30 21:48:46.993 2 DEBUG nova.virt.hardware [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:48:46 compute-0 nova_compute[192810]: 2025-09-30 21:48:46.993 2 DEBUG nova.virt.hardware [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:48:46 compute-0 nova_compute[192810]: 2025-09-30 21:48:46.993 2 DEBUG nova.virt.hardware [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:48:46 compute-0 nova_compute[192810]: 2025-09-30 21:48:46.993 2 DEBUG nova.virt.hardware [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:48:46 compute-0 nova_compute[192810]: 2025-09-30 21:48:46.993 2 DEBUG nova.virt.hardware [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:48:46 compute-0 nova_compute[192810]: 2025-09-30 21:48:46.994 2 DEBUG nova.virt.hardware [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:48:46 compute-0 nova_compute[192810]: 2025-09-30 21:48:46.997 2 DEBUG nova.virt.libvirt.vif [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:48:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1118195670',display_name='tempest-TestNetworkAdvancedServerOps-server-1118195670',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1118195670',id=164,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLDpYp0NCufFhMupTnpTCr4Vcpo04rzkSDzZOneKedO728EI3/G29UkyaAkIHtV3QtRBxeFQEO2d2Hne8odaDxHvM2zlyxvg24YgrhCa5Ls4Q4dyozrdXslYt9KnqD2nxg==',key_name='tempest-TestNetworkAdvancedServerOps-296871907',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='075b1efc4c8e4cb1b28d61b042c451e9',ramdisk_id='',reservation_id='r-0reom9zn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-374190229',owner_user_name='tempest-TestNetworkAdvancedServerOps-374190229-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:48:43Z,user_data=None,user_id='185cc8ad7e1445d2ab5006153ab19700',uuid=a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2b51d8f4-bdd2-4223-8c9f-2bff82319a8a", "address": "fa:16:3e:8b:fb:b1", "network": {"id": "dd58cd77-6f38-46a4-b4e3-b10538139b43", "bridge": "br-int", "label": "tempest-network-smoke--1535188764", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b51d8f4-bd", "ovs_interfaceid": "2b51d8f4-bdd2-4223-8c9f-2bff82319a8a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:48:46 compute-0 nova_compute[192810]: 2025-09-30 21:48:46.997 2 DEBUG nova.network.os_vif_util [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converting VIF {"id": "2b51d8f4-bdd2-4223-8c9f-2bff82319a8a", "address": "fa:16:3e:8b:fb:b1", "network": {"id": "dd58cd77-6f38-46a4-b4e3-b10538139b43", "bridge": "br-int", "label": "tempest-network-smoke--1535188764", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b51d8f4-bd", "ovs_interfaceid": "2b51d8f4-bdd2-4223-8c9f-2bff82319a8a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:48:46 compute-0 nova_compute[192810]: 2025-09-30 21:48:46.998 2 DEBUG nova.network.os_vif_util [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:fb:b1,bridge_name='br-int',has_traffic_filtering=True,id=2b51d8f4-bdd2-4223-8c9f-2bff82319a8a,network=Network(dd58cd77-6f38-46a4-b4e3-b10538139b43),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2b51d8f4-bd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:48:46 compute-0 nova_compute[192810]: 2025-09-30 21:48:46.999 2 DEBUG nova.objects.instance [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'pci_devices' on Instance uuid a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:48:47 compute-0 nova_compute[192810]: 2025-09-30 21:48:47.015 2 DEBUG nova.virt.libvirt.driver [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:48:47 compute-0 nova_compute[192810]:   <uuid>a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8</uuid>
Sep 30 21:48:47 compute-0 nova_compute[192810]:   <name>instance-000000a4</name>
Sep 30 21:48:47 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:48:47 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:48:47 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:48:47 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-1118195670</nova:name>
Sep 30 21:48:47 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:48:46</nova:creationTime>
Sep 30 21:48:47 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:48:47 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:48:47 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:48:47 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:48:47 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:48:47 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:48:47 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:48:47 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:48:47 compute-0 nova_compute[192810]:         <nova:user uuid="185cc8ad7e1445d2ab5006153ab19700">tempest-TestNetworkAdvancedServerOps-374190229-project-member</nova:user>
Sep 30 21:48:47 compute-0 nova_compute[192810]:         <nova:project uuid="075b1efc4c8e4cb1b28d61b042c451e9">tempest-TestNetworkAdvancedServerOps-374190229</nova:project>
Sep 30 21:48:47 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:48:47 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:48:47 compute-0 nova_compute[192810]:         <nova:port uuid="2b51d8f4-bdd2-4223-8c9f-2bff82319a8a">
Sep 30 21:48:47 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:48:47 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:48:47 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:48:47 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:48:47 compute-0 nova_compute[192810]:     <system>
Sep 30 21:48:47 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:48:47 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:48:47 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:48:47 compute-0 nova_compute[192810]:       <entry name="serial">a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8</entry>
Sep 30 21:48:47 compute-0 nova_compute[192810]:       <entry name="uuid">a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8</entry>
Sep 30 21:48:47 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     </system>
Sep 30 21:48:47 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:48:47 compute-0 nova_compute[192810]:   <os>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:   </os>
Sep 30 21:48:47 compute-0 nova_compute[192810]:   <features>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:   </features>
Sep 30 21:48:47 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:48:47 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:48:47 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:48:47 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:48:47 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:48:47 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8/disk"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:48:47 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8/disk.config"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:48:47 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:8b:fb:b1"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:       <target dev="tap2b51d8f4-bd"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:48:47 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8/console.log" append="off"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     <video>
Sep 30 21:48:47 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     </video>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:48:47 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:48:47 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:48:47 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:48:47 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:48:47 compute-0 nova_compute[192810]: </domain>
Sep 30 21:48:47 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:48:47 compute-0 nova_compute[192810]: 2025-09-30 21:48:47.017 2 DEBUG nova.compute.manager [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Preparing to wait for external event network-vif-plugged-2b51d8f4-bdd2-4223-8c9f-2bff82319a8a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:48:47 compute-0 nova_compute[192810]: 2025-09-30 21:48:47.017 2 DEBUG oslo_concurrency.lockutils [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:47 compute-0 nova_compute[192810]: 2025-09-30 21:48:47.017 2 DEBUG oslo_concurrency.lockutils [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:47 compute-0 nova_compute[192810]: 2025-09-30 21:48:47.017 2 DEBUG oslo_concurrency.lockutils [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:47 compute-0 nova_compute[192810]: 2025-09-30 21:48:47.018 2 DEBUG nova.virt.libvirt.vif [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:48:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1118195670',display_name='tempest-TestNetworkAdvancedServerOps-server-1118195670',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1118195670',id=164,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLDpYp0NCufFhMupTnpTCr4Vcpo04rzkSDzZOneKedO728EI3/G29UkyaAkIHtV3QtRBxeFQEO2d2Hne8odaDxHvM2zlyxvg24YgrhCa5Ls4Q4dyozrdXslYt9KnqD2nxg==',key_name='tempest-TestNetworkAdvancedServerOps-296871907',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='075b1efc4c8e4cb1b28d61b042c451e9',ramdisk_id='',reservation_id='r-0reom9zn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-374190229',owner_user_name='tempest-TestNetworkAdvancedServerOps-374190229-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:48:43Z,user_data=None,user_id='185cc8ad7e1445d2ab5006153ab19700',uuid=a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2b51d8f4-bdd2-4223-8c9f-2bff82319a8a", "address": "fa:16:3e:8b:fb:b1", "network": {"id": "dd58cd77-6f38-46a4-b4e3-b10538139b43", "bridge": "br-int", "label": "tempest-network-smoke--1535188764", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b51d8f4-bd", "ovs_interfaceid": "2b51d8f4-bdd2-4223-8c9f-2bff82319a8a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:48:47 compute-0 nova_compute[192810]: 2025-09-30 21:48:47.018 2 DEBUG nova.network.os_vif_util [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converting VIF {"id": "2b51d8f4-bdd2-4223-8c9f-2bff82319a8a", "address": "fa:16:3e:8b:fb:b1", "network": {"id": "dd58cd77-6f38-46a4-b4e3-b10538139b43", "bridge": "br-int", "label": "tempest-network-smoke--1535188764", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b51d8f4-bd", "ovs_interfaceid": "2b51d8f4-bdd2-4223-8c9f-2bff82319a8a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:48:47 compute-0 nova_compute[192810]: 2025-09-30 21:48:47.019 2 DEBUG nova.network.os_vif_util [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:fb:b1,bridge_name='br-int',has_traffic_filtering=True,id=2b51d8f4-bdd2-4223-8c9f-2bff82319a8a,network=Network(dd58cd77-6f38-46a4-b4e3-b10538139b43),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2b51d8f4-bd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:48:47 compute-0 nova_compute[192810]: 2025-09-30 21:48:47.019 2 DEBUG os_vif [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:fb:b1,bridge_name='br-int',has_traffic_filtering=True,id=2b51d8f4-bdd2-4223-8c9f-2bff82319a8a,network=Network(dd58cd77-6f38-46a4-b4e3-b10538139b43),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2b51d8f4-bd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:48:47 compute-0 nova_compute[192810]: 2025-09-30 21:48:47.019 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:47 compute-0 nova_compute[192810]: 2025-09-30 21:48:47.020 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:48:47 compute-0 nova_compute[192810]: 2025-09-30 21:48:47.020 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:48:47 compute-0 nova_compute[192810]: 2025-09-30 21:48:47.023 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:47 compute-0 nova_compute[192810]: 2025-09-30 21:48:47.023 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2b51d8f4-bd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:48:47 compute-0 nova_compute[192810]: 2025-09-30 21:48:47.023 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2b51d8f4-bd, col_values=(('external_ids', {'iface-id': '2b51d8f4-bdd2-4223-8c9f-2bff82319a8a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8b:fb:b1', 'vm-uuid': 'a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:48:47 compute-0 NetworkManager[51733]: <info>  [1759268927.0255] manager: (tap2b51d8f4-bd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/275)
Sep 30 21:48:47 compute-0 nova_compute[192810]: 2025-09-30 21:48:47.026 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:48:47 compute-0 nova_compute[192810]: 2025-09-30 21:48:47.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:47 compute-0 nova_compute[192810]: 2025-09-30 21:48:47.032 2 INFO os_vif [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:fb:b1,bridge_name='br-int',has_traffic_filtering=True,id=2b51d8f4-bdd2-4223-8c9f-2bff82319a8a,network=Network(dd58cd77-6f38-46a4-b4e3-b10538139b43),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2b51d8f4-bd')
Sep 30 21:48:47 compute-0 nova_compute[192810]: 2025-09-30 21:48:47.118 2 DEBUG nova.virt.libvirt.driver [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:48:47 compute-0 nova_compute[192810]: 2025-09-30 21:48:47.118 2 DEBUG nova.virt.libvirt.driver [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:48:47 compute-0 nova_compute[192810]: 2025-09-30 21:48:47.119 2 DEBUG nova.virt.libvirt.driver [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] No VIF found with MAC fa:16:3e:8b:fb:b1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:48:47 compute-0 nova_compute[192810]: 2025-09-30 21:48:47.119 2 INFO nova.virt.libvirt.driver [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Using config drive
Sep 30 21:48:47 compute-0 nova_compute[192810]: 2025-09-30 21:48:47.523 2 INFO nova.virt.libvirt.driver [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Creating config drive at /var/lib/nova/instances/a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8/disk.config
Sep 30 21:48:47 compute-0 nova_compute[192810]: 2025-09-30 21:48:47.527 2 DEBUG oslo_concurrency.processutils [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp52bvi906 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:48:47 compute-0 nova_compute[192810]: 2025-09-30 21:48:47.651 2 DEBUG oslo_concurrency.processutils [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp52bvi906" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:48:47 compute-0 kernel: tap2b51d8f4-bd: entered promiscuous mode
Sep 30 21:48:47 compute-0 NetworkManager[51733]: <info>  [1759268927.7035] manager: (tap2b51d8f4-bd): new Tun device (/org/freedesktop/NetworkManager/Devices/276)
Sep 30 21:48:47 compute-0 nova_compute[192810]: 2025-09-30 21:48:47.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:47 compute-0 ovn_controller[94912]: 2025-09-30T21:48:47Z|00616|binding|INFO|Claiming lport 2b51d8f4-bdd2-4223-8c9f-2bff82319a8a for this chassis.
Sep 30 21:48:47 compute-0 ovn_controller[94912]: 2025-09-30T21:48:47Z|00617|binding|INFO|2b51d8f4-bdd2-4223-8c9f-2bff82319a8a: Claiming fa:16:3e:8b:fb:b1 10.100.0.13
Sep 30 21:48:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:47.720 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8b:fb:b1 10.100.0.13'], port_security=['fa:16:3e:8b:fb:b1 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dd58cd77-6f38-46a4-b4e3-b10538139b43', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'neutron:revision_number': '2', 'neutron:security_group_ids': '877e3925-1208-4d78-a647-d6e9b6515df1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f06fb115-6230-4a0b-83c9-81667590decb, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=2b51d8f4-bdd2-4223-8c9f-2bff82319a8a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:48:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:47.721 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 2b51d8f4-bdd2-4223-8c9f-2bff82319a8a in datapath dd58cd77-6f38-46a4-b4e3-b10538139b43 bound to our chassis
Sep 30 21:48:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:47.722 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network dd58cd77-6f38-46a4-b4e3-b10538139b43
Sep 30 21:48:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:47.733 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[da925837-9dc1-427a-b917-763ec57a1c8d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:47.734 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdd58cd77-61 in ovnmeta-dd58cd77-6f38-46a4-b4e3-b10538139b43 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:48:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:47.736 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdd58cd77-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:48:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:47.736 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[b2c04ab7-8f7b-4f70-ab88-efc00c0e2ade]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:47.737 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[db44626f-b0ec-4d29-a9d0-a43023f277ef]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:47 compute-0 NetworkManager[51733]: <info>  [1759268927.7468] manager: (patch-br-int-to-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/277)
Sep 30 21:48:47 compute-0 NetworkManager[51733]: <info>  [1759268927.7474] manager: (patch-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/278)
Sep 30 21:48:47 compute-0 nova_compute[192810]: 2025-09-30 21:48:47.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:47.748 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[cfcaa8d6-ac59-42c4-be94-fd24119144bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:47 compute-0 systemd-machined[152794]: New machine qemu-78-instance-000000a4.
Sep 30 21:48:47 compute-0 nova_compute[192810]: 2025-09-30 21:48:47.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:47 compute-0 systemd[1]: Started Virtual Machine qemu-78-instance-000000a4.
Sep 30 21:48:47 compute-0 nova_compute[192810]: 2025-09-30 21:48:47.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:47 compute-0 systemd-udevd[245767]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:48:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:47.780 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[8c015239-2170-4512-924f-c967bcd2df83]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:47 compute-0 ovn_controller[94912]: 2025-09-30T21:48:47Z|00618|binding|INFO|Releasing lport 1dad0101-af5c-4580-b216-17a88dc97c78 from this chassis (sb_readonly=0)
Sep 30 21:48:47 compute-0 NetworkManager[51733]: <info>  [1759268927.7926] device (tap2b51d8f4-bd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:48:47 compute-0 NetworkManager[51733]: <info>  [1759268927.7937] device (tap2b51d8f4-bd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:48:47 compute-0 nova_compute[192810]: 2025-09-30 21:48:47.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:47.814 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[0aa71239-11e1-4fdc-90b7-4da863c7e0f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:47 compute-0 NetworkManager[51733]: <info>  [1759268927.8311] manager: (tapdd58cd77-60): new Veth device (/org/freedesktop/NetworkManager/Devices/279)
Sep 30 21:48:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:47.830 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[9cbae638-be4b-4529-9416-e2aa71ba4000]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:47.870 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[bd0effea-280d-4cf3-b4c6-72805a67a42a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:47.873 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[5c19a619-9355-4ec6-b153-c520634cd35a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:47 compute-0 NetworkManager[51733]: <info>  [1759268927.8986] device (tapdd58cd77-60): carrier: link connected
Sep 30 21:48:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:47.904 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[4a0d2bce-b6a1-4cef-8522-503325592a39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:47 compute-0 ovn_controller[94912]: 2025-09-30T21:48:47Z|00619|binding|INFO|Setting lport 2b51d8f4-bdd2-4223-8c9f-2bff82319a8a ovn-installed in OVS
Sep 30 21:48:47 compute-0 ovn_controller[94912]: 2025-09-30T21:48:47Z|00620|binding|INFO|Setting lport 2b51d8f4-bdd2-4223-8c9f-2bff82319a8a up in Southbound
Sep 30 21:48:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:47.923 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[7101eedc-1fb6-4e97-8f11-be19bdd45efb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdd58cd77-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:19:65:85'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 187], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 562352, 'reachable_time': 44355, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245797, 'error': None, 'target': 'ovnmeta-dd58cd77-6f38-46a4-b4e3-b10538139b43', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:47 compute-0 nova_compute[192810]: 2025-09-30 21:48:47.924 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:47.939 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[0703ed0b-286f-4e0c-b39c-30151dac739a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe19:6585'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 562352, 'tstamp': 562352}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 245798, 'error': None, 'target': 'ovnmeta-dd58cd77-6f38-46a4-b4e3-b10538139b43', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:47.958 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[1a884942-9495-4b2e-bc08-9fc457619799]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdd58cd77-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:19:65:85'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 187], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 562352, 'reachable_time': 44355, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 245799, 'error': None, 'target': 'ovnmeta-dd58cd77-6f38-46a4-b4e3-b10538139b43', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:47 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:47.990 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[51fe66dc-7eaf-4d58-8c89-2b76d46bf210]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:48 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:48.043 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[d5267b99-0fd8-4492-a148-296ab1342fbe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:48 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:48.045 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdd58cd77-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:48:48 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:48.045 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:48:48 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:48.045 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdd58cd77-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:48:48 compute-0 NetworkManager[51733]: <info>  [1759268928.0477] manager: (tapdd58cd77-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/280)
Sep 30 21:48:48 compute-0 kernel: tapdd58cd77-60: entered promiscuous mode
Sep 30 21:48:48 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:48.050 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdd58cd77-60, col_values=(('external_ids', {'iface-id': 'aaea706f-be8f-4718-85d3-6723e1b10016'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:48:48 compute-0 nova_compute[192810]: 2025-09-30 21:48:48.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:48 compute-0 ovn_controller[94912]: 2025-09-30T21:48:48Z|00621|binding|INFO|Releasing lport aaea706f-be8f-4718-85d3-6723e1b10016 from this chassis (sb_readonly=0)
Sep 30 21:48:48 compute-0 nova_compute[192810]: 2025-09-30 21:48:48.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:48 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:48.064 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/dd58cd77-6f38-46a4-b4e3-b10538139b43.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/dd58cd77-6f38-46a4-b4e3-b10538139b43.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:48:48 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:48.065 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[dc63a2e8-a964-4048-aeff-a437c1f26564]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:48:48 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:48.066 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:48:48 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:48:48 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:48:48 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-dd58cd77-6f38-46a4-b4e3-b10538139b43
Sep 30 21:48:48 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:48:48 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:48:48 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:48:48 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/dd58cd77-6f38-46a4-b4e3-b10538139b43.pid.haproxy
Sep 30 21:48:48 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:48:48 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:48:48 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:48:48 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:48:48 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:48:48 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:48:48 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:48:48 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:48:48 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:48:48 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:48:48 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:48:48 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:48:48 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:48:48 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:48:48 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:48:48 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:48:48 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:48:48 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:48:48 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:48:48 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:48:48 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID dd58cd77-6f38-46a4-b4e3-b10538139b43
Sep 30 21:48:48 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:48:48 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:48.067 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-dd58cd77-6f38-46a4-b4e3-b10538139b43', 'env', 'PROCESS_TAG=haproxy-dd58cd77-6f38-46a4-b4e3-b10538139b43', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/dd58cd77-6f38-46a4-b4e3-b10538139b43.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:48:48 compute-0 nova_compute[192810]: 2025-09-30 21:48:48.295 2 DEBUG nova.compute.manager [req-ef502213-c05c-4861-944d-aade8ec8bf2d req-f06022ab-4e96-47a7-b10d-4b4021a4a4f4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Received event network-changed-9a0264ee-cc33-48a8-b015-36ebc5bfdd43 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:48:48 compute-0 nova_compute[192810]: 2025-09-30 21:48:48.299 2 DEBUG nova.compute.manager [req-ef502213-c05c-4861-944d-aade8ec8bf2d req-f06022ab-4e96-47a7-b10d-4b4021a4a4f4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Refreshing instance network info cache due to event network-changed-9a0264ee-cc33-48a8-b015-36ebc5bfdd43. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:48:48 compute-0 nova_compute[192810]: 2025-09-30 21:48:48.300 2 DEBUG oslo_concurrency.lockutils [req-ef502213-c05c-4861-944d-aade8ec8bf2d req-f06022ab-4e96-47a7-b10d-4b4021a4a4f4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-4ee4a775-05d1-45fc-b4f9-566ab8159710" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:48:48 compute-0 nova_compute[192810]: 2025-09-30 21:48:48.300 2 DEBUG oslo_concurrency.lockutils [req-ef502213-c05c-4861-944d-aade8ec8bf2d req-f06022ab-4e96-47a7-b10d-4b4021a4a4f4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-4ee4a775-05d1-45fc-b4f9-566ab8159710" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:48:48 compute-0 nova_compute[192810]: 2025-09-30 21:48:48.301 2 DEBUG nova.network.neutron [req-ef502213-c05c-4861-944d-aade8ec8bf2d req-f06022ab-4e96-47a7-b10d-4b4021a4a4f4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Refreshing network info cache for port 9a0264ee-cc33-48a8-b015-36ebc5bfdd43 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:48:48 compute-0 nova_compute[192810]: 2025-09-30 21:48:48.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:48 compute-0 podman[245831]: 2025-09-30 21:48:48.408780542 +0000 UTC m=+0.022986522 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:48:48 compute-0 podman[245831]: 2025-09-30 21:48:48.859640189 +0000 UTC m=+0.473846149 container create 019d7ea8dd4d974c02537b24a75794dc9c4d718a76920201914b70720c133ac8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd58cd77-6f38-46a4-b4e3-b10538139b43, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:48:48 compute-0 systemd[1]: Started libpod-conmon-019d7ea8dd4d974c02537b24a75794dc9c4d718a76920201914b70720c133ac8.scope.
Sep 30 21:48:48 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:48:48 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a48913e0c8b7f12e0f9ac14c951ecc42921bcf848fb94912cb7c364c924398b6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:48:49 compute-0 nova_compute[192810]: 2025-09-30 21:48:49.022 2 DEBUG nova.network.neutron [req-b5e05be0-4c36-486b-a307-f79d833f5210 req-e1ea52a3-98c7-4232-a091-733066d83824 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Updated VIF entry in instance network info cache for port 2b51d8f4-bdd2-4223-8c9f-2bff82319a8a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:48:49 compute-0 nova_compute[192810]: 2025-09-30 21:48:49.023 2 DEBUG nova.network.neutron [req-b5e05be0-4c36-486b-a307-f79d833f5210 req-e1ea52a3-98c7-4232-a091-733066d83824 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Updating instance_info_cache with network_info: [{"id": "2b51d8f4-bdd2-4223-8c9f-2bff82319a8a", "address": "fa:16:3e:8b:fb:b1", "network": {"id": "dd58cd77-6f38-46a4-b4e3-b10538139b43", "bridge": "br-int", "label": "tempest-network-smoke--1535188764", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b51d8f4-bd", "ovs_interfaceid": "2b51d8f4-bdd2-4223-8c9f-2bff82319a8a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:48:49 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:49.046 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=38, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=37) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:48:49 compute-0 nova_compute[192810]: 2025-09-30 21:48:49.047 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:49 compute-0 nova_compute[192810]: 2025-09-30 21:48:49.052 2 DEBUG oslo_concurrency.lockutils [req-b5e05be0-4c36-486b-a307-f79d833f5210 req-e1ea52a3-98c7-4232-a091-733066d83824 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:48:49 compute-0 nova_compute[192810]: 2025-09-30 21:48:49.094 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268929.0941405, a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:48:49 compute-0 nova_compute[192810]: 2025-09-30 21:48:49.095 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] VM Started (Lifecycle Event)
Sep 30 21:48:49 compute-0 podman[245831]: 2025-09-30 21:48:49.097387518 +0000 UTC m=+0.711593488 container init 019d7ea8dd4d974c02537b24a75794dc9c4d718a76920201914b70720c133ac8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd58cd77-6f38-46a4-b4e3-b10538139b43, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Sep 30 21:48:49 compute-0 podman[245854]: 2025-09-30 21:48:49.098652659 +0000 UTC m=+0.183390974 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:48:49 compute-0 podman[245853]: 2025-09-30 21:48:49.100925494 +0000 UTC m=+0.184140822 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Sep 30 21:48:49 compute-0 podman[245831]: 2025-09-30 21:48:49.105097816 +0000 UTC m=+0.719303776 container start 019d7ea8dd4d974c02537b24a75794dc9c4d718a76920201914b70720c133ac8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd58cd77-6f38-46a4-b4e3-b10538139b43, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Sep 30 21:48:49 compute-0 nova_compute[192810]: 2025-09-30 21:48:49.118 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:48:49 compute-0 nova_compute[192810]: 2025-09-30 21:48:49.123 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268929.0949879, a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:48:49 compute-0 nova_compute[192810]: 2025-09-30 21:48:49.124 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] VM Paused (Lifecycle Event)
Sep 30 21:48:49 compute-0 neutron-haproxy-ovnmeta-dd58cd77-6f38-46a4-b4e3-b10538139b43[245871]: [NOTICE]   (245907) : New worker (245909) forked
Sep 30 21:48:49 compute-0 neutron-haproxy-ovnmeta-dd58cd77-6f38-46a4-b4e3-b10538139b43[245871]: [NOTICE]   (245907) : Loading success.
Sep 30 21:48:49 compute-0 nova_compute[192810]: 2025-09-30 21:48:49.146 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:48:49 compute-0 nova_compute[192810]: 2025-09-30 21:48:49.150 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:48:49 compute-0 nova_compute[192810]: 2025-09-30 21:48:49.179 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:48:49 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:49.281 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:48:49 compute-0 podman[245855]: 2025-09-30 21:48:49.333113618 +0000 UTC m=+0.414269026 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:48:50 compute-0 nova_compute[192810]: 2025-09-30 21:48:50.364 2 DEBUG nova.compute.manager [req-520ccc1a-8ff6-4dd5-a8d1-4b0a48879abb req-6d4b3479-3233-4463-83bb-98eefe43cf2f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Received event network-vif-plugged-2b51d8f4-bdd2-4223-8c9f-2bff82319a8a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:48:50 compute-0 nova_compute[192810]: 2025-09-30 21:48:50.364 2 DEBUG oslo_concurrency.lockutils [req-520ccc1a-8ff6-4dd5-a8d1-4b0a48879abb req-6d4b3479-3233-4463-83bb-98eefe43cf2f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:50 compute-0 nova_compute[192810]: 2025-09-30 21:48:50.365 2 DEBUG oslo_concurrency.lockutils [req-520ccc1a-8ff6-4dd5-a8d1-4b0a48879abb req-6d4b3479-3233-4463-83bb-98eefe43cf2f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:50 compute-0 nova_compute[192810]: 2025-09-30 21:48:50.365 2 DEBUG oslo_concurrency.lockutils [req-520ccc1a-8ff6-4dd5-a8d1-4b0a48879abb req-6d4b3479-3233-4463-83bb-98eefe43cf2f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:50 compute-0 nova_compute[192810]: 2025-09-30 21:48:50.365 2 DEBUG nova.compute.manager [req-520ccc1a-8ff6-4dd5-a8d1-4b0a48879abb req-6d4b3479-3233-4463-83bb-98eefe43cf2f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Processing event network-vif-plugged-2b51d8f4-bdd2-4223-8c9f-2bff82319a8a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:48:50 compute-0 nova_compute[192810]: 2025-09-30 21:48:50.365 2 DEBUG nova.compute.manager [req-520ccc1a-8ff6-4dd5-a8d1-4b0a48879abb req-6d4b3479-3233-4463-83bb-98eefe43cf2f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Received event network-vif-plugged-2b51d8f4-bdd2-4223-8c9f-2bff82319a8a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:48:50 compute-0 nova_compute[192810]: 2025-09-30 21:48:50.366 2 DEBUG oslo_concurrency.lockutils [req-520ccc1a-8ff6-4dd5-a8d1-4b0a48879abb req-6d4b3479-3233-4463-83bb-98eefe43cf2f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:48:50 compute-0 nova_compute[192810]: 2025-09-30 21:48:50.366 2 DEBUG oslo_concurrency.lockutils [req-520ccc1a-8ff6-4dd5-a8d1-4b0a48879abb req-6d4b3479-3233-4463-83bb-98eefe43cf2f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:48:50 compute-0 nova_compute[192810]: 2025-09-30 21:48:50.366 2 DEBUG oslo_concurrency.lockutils [req-520ccc1a-8ff6-4dd5-a8d1-4b0a48879abb req-6d4b3479-3233-4463-83bb-98eefe43cf2f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:50 compute-0 nova_compute[192810]: 2025-09-30 21:48:50.367 2 DEBUG nova.compute.manager [req-520ccc1a-8ff6-4dd5-a8d1-4b0a48879abb req-6d4b3479-3233-4463-83bb-98eefe43cf2f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] No waiting events found dispatching network-vif-plugged-2b51d8f4-bdd2-4223-8c9f-2bff82319a8a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:48:50 compute-0 nova_compute[192810]: 2025-09-30 21:48:50.367 2 WARNING nova.compute.manager [req-520ccc1a-8ff6-4dd5-a8d1-4b0a48879abb req-6d4b3479-3233-4463-83bb-98eefe43cf2f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Received unexpected event network-vif-plugged-2b51d8f4-bdd2-4223-8c9f-2bff82319a8a for instance with vm_state building and task_state spawning.
Sep 30 21:48:50 compute-0 nova_compute[192810]: 2025-09-30 21:48:50.368 2 DEBUG nova.compute.manager [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:48:50 compute-0 nova_compute[192810]: 2025-09-30 21:48:50.382 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268930.3707051, a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:48:50 compute-0 nova_compute[192810]: 2025-09-30 21:48:50.383 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] VM Resumed (Lifecycle Event)
Sep 30 21:48:50 compute-0 nova_compute[192810]: 2025-09-30 21:48:50.398 2 DEBUG nova.virt.libvirt.driver [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:48:50 compute-0 nova_compute[192810]: 2025-09-30 21:48:50.410 2 INFO nova.virt.libvirt.driver [-] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Instance spawned successfully.
Sep 30 21:48:50 compute-0 nova_compute[192810]: 2025-09-30 21:48:50.411 2 DEBUG nova.virt.libvirt.driver [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:48:50 compute-0 nova_compute[192810]: 2025-09-30 21:48:50.495 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:48:50 compute-0 nova_compute[192810]: 2025-09-30 21:48:50.496 2 DEBUG nova.virt.libvirt.driver [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:48:50 compute-0 nova_compute[192810]: 2025-09-30 21:48:50.497 2 DEBUG nova.virt.libvirt.driver [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:48:50 compute-0 nova_compute[192810]: 2025-09-30 21:48:50.497 2 DEBUG nova.virt.libvirt.driver [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:48:50 compute-0 nova_compute[192810]: 2025-09-30 21:48:50.498 2 DEBUG nova.virt.libvirt.driver [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:48:50 compute-0 nova_compute[192810]: 2025-09-30 21:48:50.498 2 DEBUG nova.virt.libvirt.driver [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:48:50 compute-0 nova_compute[192810]: 2025-09-30 21:48:50.499 2 DEBUG nova.virt.libvirt.driver [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:48:50 compute-0 nova_compute[192810]: 2025-09-30 21:48:50.504 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:48:50 compute-0 nova_compute[192810]: 2025-09-30 21:48:50.591 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:48:50 compute-0 nova_compute[192810]: 2025-09-30 21:48:50.647 2 INFO nova.compute.manager [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Took 6.92 seconds to spawn the instance on the hypervisor.
Sep 30 21:48:50 compute-0 nova_compute[192810]: 2025-09-30 21:48:50.648 2 DEBUG nova.compute.manager [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:48:50 compute-0 nova_compute[192810]: 2025-09-30 21:48:50.756 2 DEBUG nova.network.neutron [req-ef502213-c05c-4861-944d-aade8ec8bf2d req-f06022ab-4e96-47a7-b10d-4b4021a4a4f4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Updated VIF entry in instance network info cache for port 9a0264ee-cc33-48a8-b015-36ebc5bfdd43. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:48:50 compute-0 nova_compute[192810]: 2025-09-30 21:48:50.757 2 DEBUG nova.network.neutron [req-ef502213-c05c-4861-944d-aade8ec8bf2d req-f06022ab-4e96-47a7-b10d-4b4021a4a4f4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Updating instance_info_cache with network_info: [{"id": "9a0264ee-cc33-48a8-b015-36ebc5bfdd43", "address": "fa:16:3e:6f:cf:2b", "network": {"id": "62012511-a944-47e7-b858-e59eabaf741d", "bridge": "br-int", "label": "tempest-network-smoke--1420028275", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a0264ee-cc", "ovs_interfaceid": "9a0264ee-cc33-48a8-b015-36ebc5bfdd43", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:48:50 compute-0 nova_compute[192810]: 2025-09-30 21:48:50.762 2 INFO nova.compute.manager [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Took 7.57 seconds to build instance.
Sep 30 21:48:50 compute-0 nova_compute[192810]: 2025-09-30 21:48:50.787 2 DEBUG oslo_concurrency.lockutils [req-ef502213-c05c-4861-944d-aade8ec8bf2d req-f06022ab-4e96-47a7-b10d-4b4021a4a4f4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-4ee4a775-05d1-45fc-b4f9-566ab8159710" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:48:50 compute-0 nova_compute[192810]: 2025-09-30 21:48:50.794 2 DEBUG oslo_concurrency.lockutils [None req-0dc2b287-bfd0-4065-98ca-57033f680c1d 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.704s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:48:52 compute-0 nova_compute[192810]: 2025-09-30 21:48:52.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:53 compute-0 nova_compute[192810]: 2025-09-30 21:48:53.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:57 compute-0 nova_compute[192810]: 2025-09-30 21:48:57.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:57 compute-0 podman[245940]: 2025-09-30 21:48:57.30290058 +0000 UTC m=+0.042773614 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:48:57 compute-0 podman[245941]: 2025-09-30 21:48:57.318229204 +0000 UTC m=+0.053838054 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal, maintainer=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, distribution-scope=public, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=9.6, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, com.redhat.component=ubi9-minimal-container)
Sep 30 21:48:58 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:48:58.284 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '38'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:48:58 compute-0 nova_compute[192810]: 2025-09-30 21:48:58.310 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:48:58 compute-0 nova_compute[192810]: 2025-09-30 21:48:58.338 2 DEBUG nova.compute.manager [req-ff232773-fd2e-4e7a-ab3e-7100a1dbe393 req-ba5e6d47-7a52-4b03-9225-f7998bcc6869 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Received event network-changed-2b51d8f4-bdd2-4223-8c9f-2bff82319a8a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:48:58 compute-0 nova_compute[192810]: 2025-09-30 21:48:58.339 2 DEBUG nova.compute.manager [req-ff232773-fd2e-4e7a-ab3e-7100a1dbe393 req-ba5e6d47-7a52-4b03-9225-f7998bcc6869 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Refreshing instance network info cache due to event network-changed-2b51d8f4-bdd2-4223-8c9f-2bff82319a8a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:48:58 compute-0 nova_compute[192810]: 2025-09-30 21:48:58.340 2 DEBUG oslo_concurrency.lockutils [req-ff232773-fd2e-4e7a-ab3e-7100a1dbe393 req-ba5e6d47-7a52-4b03-9225-f7998bcc6869 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:48:58 compute-0 nova_compute[192810]: 2025-09-30 21:48:58.340 2 DEBUG oslo_concurrency.lockutils [req-ff232773-fd2e-4e7a-ab3e-7100a1dbe393 req-ba5e6d47-7a52-4b03-9225-f7998bcc6869 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:48:58 compute-0 nova_compute[192810]: 2025-09-30 21:48:58.341 2 DEBUG nova.network.neutron [req-ff232773-fd2e-4e7a-ab3e-7100a1dbe393 req-ba5e6d47-7a52-4b03-9225-f7998bcc6869 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Refreshing network info cache for port 2b51d8f4-bdd2-4223-8c9f-2bff82319a8a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:48:59 compute-0 nova_compute[192810]: 2025-09-30 21:48:59.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:49:01 compute-0 nova_compute[192810]: 2025-09-30 21:49:01.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:49:01 compute-0 nova_compute[192810]: 2025-09-30 21:49:01.789 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:49:02 compute-0 anacron[105679]: Job `cron.daily' started
Sep 30 21:49:02 compute-0 ovn_controller[94912]: 2025-09-30T21:49:02Z|00068|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:6f:cf:2b 10.100.0.9
Sep 30 21:49:02 compute-0 ovn_controller[94912]: 2025-09-30T21:49:02Z|00069|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6f:cf:2b 10.100.0.9
Sep 30 21:49:02 compute-0 nova_compute[192810]: 2025-09-30 21:49:02.067 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:02 compute-0 anacron[105679]: Job `cron.daily' terminated
Sep 30 21:49:02 compute-0 nova_compute[192810]: 2025-09-30 21:49:02.159 2 DEBUG nova.network.neutron [req-ff232773-fd2e-4e7a-ab3e-7100a1dbe393 req-ba5e6d47-7a52-4b03-9225-f7998bcc6869 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Updated VIF entry in instance network info cache for port 2b51d8f4-bdd2-4223-8c9f-2bff82319a8a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:49:02 compute-0 nova_compute[192810]: 2025-09-30 21:49:02.159 2 DEBUG nova.network.neutron [req-ff232773-fd2e-4e7a-ab3e-7100a1dbe393 req-ba5e6d47-7a52-4b03-9225-f7998bcc6869 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Updating instance_info_cache with network_info: [{"id": "2b51d8f4-bdd2-4223-8c9f-2bff82319a8a", "address": "fa:16:3e:8b:fb:b1", "network": {"id": "dd58cd77-6f38-46a4-b4e3-b10538139b43", "bridge": "br-int", "label": "tempest-network-smoke--1535188764", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b51d8f4-bd", "ovs_interfaceid": "2b51d8f4-bdd2-4223-8c9f-2bff82319a8a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:49:02 compute-0 nova_compute[192810]: 2025-09-30 21:49:02.193 2 DEBUG oslo_concurrency.lockutils [req-ff232773-fd2e-4e7a-ab3e-7100a1dbe393 req-ba5e6d47-7a52-4b03-9225-f7998bcc6869 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:49:02 compute-0 nova_compute[192810]: 2025-09-30 21:49:02.790 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:49:03 compute-0 nova_compute[192810]: 2025-09-30 21:49:03.346 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:05 compute-0 podman[246009]: 2025-09-30 21:49:05.317382343 +0000 UTC m=+0.047554071 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:49:05 compute-0 podman[246008]: 2025-09-30 21:49:05.317374933 +0000 UTC m=+0.052004800 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=iscsid, io.buildah.version=1.41.3)
Sep 30 21:49:05 compute-0 podman[246007]: 2025-09-30 21:49:05.318462849 +0000 UTC m=+0.056046438 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:49:06 compute-0 nova_compute[192810]: 2025-09-30 21:49:06.786 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:49:06 compute-0 nova_compute[192810]: 2025-09-30 21:49:06.787 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:49:06 compute-0 nova_compute[192810]: 2025-09-30 21:49:06.812 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 21:49:06 compute-0 nova_compute[192810]: 2025-09-30 21:49:06.813 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:49:07 compute-0 nova_compute[192810]: 2025-09-30 21:49:07.070 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:07 compute-0 ovn_controller[94912]: 2025-09-30T21:49:07Z|00070|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:8b:fb:b1 10.100.0.13
Sep 30 21:49:07 compute-0 ovn_controller[94912]: 2025-09-30T21:49:07Z|00071|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8b:fb:b1 10.100.0.13
Sep 30 21:49:07 compute-0 nova_compute[192810]: 2025-09-30 21:49:07.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:49:08 compute-0 nova_compute[192810]: 2025-09-30 21:49:08.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:08 compute-0 nova_compute[192810]: 2025-09-30 21:49:08.783 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:49:08 compute-0 nova_compute[192810]: 2025-09-30 21:49:08.786 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:49:08 compute-0 nova_compute[192810]: 2025-09-30 21:49:08.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:49:08 compute-0 nova_compute[192810]: 2025-09-30 21:49:08.812 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:49:08 compute-0 nova_compute[192810]: 2025-09-30 21:49:08.812 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:49:08 compute-0 nova_compute[192810]: 2025-09-30 21:49:08.813 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:49:08 compute-0 nova_compute[192810]: 2025-09-30 21:49:08.813 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:49:08 compute-0 nova_compute[192810]: 2025-09-30 21:49:08.878 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:49:09 compute-0 nova_compute[192810]: 2025-09-30 21:49:09.005 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8/disk --force-share --output=json" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:49:09 compute-0 nova_compute[192810]: 2025-09-30 21:49:09.006 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:49:09 compute-0 nova_compute[192810]: 2025-09-30 21:49:09.059 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:49:09 compute-0 nova_compute[192810]: 2025-09-30 21:49:09.066 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4ee4a775-05d1-45fc-b4f9-566ab8159710/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:49:09 compute-0 nova_compute[192810]: 2025-09-30 21:49:09.132 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4ee4a775-05d1-45fc-b4f9-566ab8159710/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:49:09 compute-0 nova_compute[192810]: 2025-09-30 21:49:09.133 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4ee4a775-05d1-45fc-b4f9-566ab8159710/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:49:09 compute-0 nova_compute[192810]: 2025-09-30 21:49:09.193 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4ee4a775-05d1-45fc-b4f9-566ab8159710/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:49:09 compute-0 nova_compute[192810]: 2025-09-30 21:49:09.361 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:49:09 compute-0 nova_compute[192810]: 2025-09-30 21:49:09.365 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5306MB free_disk=73.17237854003906GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:49:09 compute-0 nova_compute[192810]: 2025-09-30 21:49:09.366 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:49:09 compute-0 nova_compute[192810]: 2025-09-30 21:49:09.366 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:49:09 compute-0 nova_compute[192810]: 2025-09-30 21:49:09.468 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Instance 4ee4a775-05d1-45fc-b4f9-566ab8159710 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:49:09 compute-0 nova_compute[192810]: 2025-09-30 21:49:09.469 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Instance a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:49:09 compute-0 nova_compute[192810]: 2025-09-30 21:49:09.469 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:49:09 compute-0 nova_compute[192810]: 2025-09-30 21:49:09.470 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:49:09 compute-0 nova_compute[192810]: 2025-09-30 21:49:09.533 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:49:09 compute-0 nova_compute[192810]: 2025-09-30 21:49:09.549 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:49:09 compute-0 nova_compute[192810]: 2025-09-30 21:49:09.571 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:49:09 compute-0 nova_compute[192810]: 2025-09-30 21:49:09.571 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.205s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:49:12 compute-0 nova_compute[192810]: 2025-09-30 21:49:12.073 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:13 compute-0 nova_compute[192810]: 2025-09-30 21:49:13.346 2 INFO nova.compute.manager [None req-76c97d23-73bf-4478-9057-e9a2b24314d8 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Get console output
Sep 30 21:49:13 compute-0 nova_compute[192810]: 2025-09-30 21:49:13.350 54 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Sep 30 21:49:13 compute-0 nova_compute[192810]: 2025-09-30 21:49:13.351 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:13 compute-0 nova_compute[192810]: 2025-09-30 21:49:13.716 2 DEBUG oslo_concurrency.lockutils [None req-e48496af-b651-4af1-ae01-a5ba1655a9ef 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:49:13 compute-0 nova_compute[192810]: 2025-09-30 21:49:13.717 2 DEBUG oslo_concurrency.lockutils [None req-e48496af-b651-4af1-ae01-a5ba1655a9ef 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:49:13 compute-0 nova_compute[192810]: 2025-09-30 21:49:13.717 2 INFO nova.compute.manager [None req-e48496af-b651-4af1-ae01-a5ba1655a9ef 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Rebooting instance
Sep 30 21:49:13 compute-0 nova_compute[192810]: 2025-09-30 21:49:13.735 2 DEBUG oslo_concurrency.lockutils [None req-e48496af-b651-4af1-ae01-a5ba1655a9ef 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "refresh_cache-a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:49:13 compute-0 nova_compute[192810]: 2025-09-30 21:49:13.736 2 DEBUG oslo_concurrency.lockutils [None req-e48496af-b651-4af1-ae01-a5ba1655a9ef 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquired lock "refresh_cache-a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:49:13 compute-0 nova_compute[192810]: 2025-09-30 21:49:13.736 2 DEBUG nova.network.neutron [None req-e48496af-b651-4af1-ae01-a5ba1655a9ef 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:49:17 compute-0 nova_compute[192810]: 2025-09-30 21:49:17.077 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:18 compute-0 nova_compute[192810]: 2025-09-30 21:49:18.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:19 compute-0 podman[246089]: 2025-09-30 21:49:19.313247757 +0000 UTC m=+0.049942439 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 21:49:19 compute-0 podman[246088]: 2025-09-30 21:49:19.327277179 +0000 UTC m=+0.069600459 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, container_name=ovn_controller)
Sep 30 21:49:19 compute-0 nova_compute[192810]: 2025-09-30 21:49:19.829 2 DEBUG nova.network.neutron [None req-e48496af-b651-4af1-ae01-a5ba1655a9ef 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Updating instance_info_cache with network_info: [{"id": "2b51d8f4-bdd2-4223-8c9f-2bff82319a8a", "address": "fa:16:3e:8b:fb:b1", "network": {"id": "dd58cd77-6f38-46a4-b4e3-b10538139b43", "bridge": "br-int", "label": "tempest-network-smoke--1535188764", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b51d8f4-bd", "ovs_interfaceid": "2b51d8f4-bdd2-4223-8c9f-2bff82319a8a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:49:19 compute-0 nova_compute[192810]: 2025-09-30 21:49:19.848 2 DEBUG oslo_concurrency.lockutils [None req-e48496af-b651-4af1-ae01-a5ba1655a9ef 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Releasing lock "refresh_cache-a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:49:19 compute-0 nova_compute[192810]: 2025-09-30 21:49:19.860 2 DEBUG nova.compute.manager [None req-e48496af-b651-4af1-ae01-a5ba1655a9ef 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:49:20 compute-0 podman[246130]: 2025-09-30 21:49:20.313514085 +0000 UTC m=+0.051770674 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:49:22 compute-0 nova_compute[192810]: 2025-09-30 21:49:22.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:22 compute-0 kernel: tap2b51d8f4-bd (unregistering): left promiscuous mode
Sep 30 21:49:22 compute-0 NetworkManager[51733]: <info>  [1759268962.2100] device (tap2b51d8f4-bd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:49:22 compute-0 ovn_controller[94912]: 2025-09-30T21:49:22Z|00622|binding|INFO|Releasing lport 2b51d8f4-bdd2-4223-8c9f-2bff82319a8a from this chassis (sb_readonly=0)
Sep 30 21:49:22 compute-0 ovn_controller[94912]: 2025-09-30T21:49:22Z|00623|binding|INFO|Setting lport 2b51d8f4-bdd2-4223-8c9f-2bff82319a8a down in Southbound
Sep 30 21:49:22 compute-0 ovn_controller[94912]: 2025-09-30T21:49:22Z|00624|binding|INFO|Removing iface tap2b51d8f4-bd ovn-installed in OVS
Sep 30 21:49:22 compute-0 nova_compute[192810]: 2025-09-30 21:49:22.217 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:22 compute-0 nova_compute[192810]: 2025-09-30 21:49:22.230 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:22.232 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8b:fb:b1 10.100.0.13'], port_security=['fa:16:3e:8b:fb:b1 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dd58cd77-6f38-46a4-b4e3-b10538139b43', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'neutron:revision_number': '4', 'neutron:security_group_ids': '877e3925-1208-4d78-a647-d6e9b6515df1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.249'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f06fb115-6230-4a0b-83c9-81667590decb, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=2b51d8f4-bdd2-4223-8c9f-2bff82319a8a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:49:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:22.233 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 2b51d8f4-bdd2-4223-8c9f-2bff82319a8a in datapath dd58cd77-6f38-46a4-b4e3-b10538139b43 unbound from our chassis
Sep 30 21:49:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:22.235 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network dd58cd77-6f38-46a4-b4e3-b10538139b43, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:49:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:22.236 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[9f643a4a-909e-4043-bf18-44310332b312]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:22.236 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-dd58cd77-6f38-46a4-b4e3-b10538139b43 namespace which is not needed anymore
Sep 30 21:49:22 compute-0 systemd[1]: machine-qemu\x2d78\x2dinstance\x2d000000a4.scope: Deactivated successfully.
Sep 30 21:49:22 compute-0 systemd[1]: machine-qemu\x2d78\x2dinstance\x2d000000a4.scope: Consumed 15.503s CPU time.
Sep 30 21:49:22 compute-0 systemd-machined[152794]: Machine qemu-78-instance-000000a4 terminated.
Sep 30 21:49:22 compute-0 neutron-haproxy-ovnmeta-dd58cd77-6f38-46a4-b4e3-b10538139b43[245871]: [NOTICE]   (245907) : haproxy version is 2.8.14-c23fe91
Sep 30 21:49:22 compute-0 neutron-haproxy-ovnmeta-dd58cd77-6f38-46a4-b4e3-b10538139b43[245871]: [NOTICE]   (245907) : path to executable is /usr/sbin/haproxy
Sep 30 21:49:22 compute-0 neutron-haproxy-ovnmeta-dd58cd77-6f38-46a4-b4e3-b10538139b43[245871]: [WARNING]  (245907) : Exiting Master process...
Sep 30 21:49:22 compute-0 neutron-haproxy-ovnmeta-dd58cd77-6f38-46a4-b4e3-b10538139b43[245871]: [ALERT]    (245907) : Current worker (245909) exited with code 143 (Terminated)
Sep 30 21:49:22 compute-0 neutron-haproxy-ovnmeta-dd58cd77-6f38-46a4-b4e3-b10538139b43[245871]: [WARNING]  (245907) : All workers exited. Exiting... (0)
Sep 30 21:49:22 compute-0 systemd[1]: libpod-019d7ea8dd4d974c02537b24a75794dc9c4d718a76920201914b70720c133ac8.scope: Deactivated successfully.
Sep 30 21:49:22 compute-0 podman[246173]: 2025-09-30 21:49:22.411707352 +0000 UTC m=+0.094539507 container died 019d7ea8dd4d974c02537b24a75794dc9c4d718a76920201914b70720c133ac8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd58cd77-6f38-46a4-b4e3-b10538139b43, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:49:22 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-019d7ea8dd4d974c02537b24a75794dc9c4d718a76920201914b70720c133ac8-userdata-shm.mount: Deactivated successfully.
Sep 30 21:49:22 compute-0 systemd[1]: var-lib-containers-storage-overlay-a48913e0c8b7f12e0f9ac14c951ecc42921bcf848fb94912cb7c364c924398b6-merged.mount: Deactivated successfully.
Sep 30 21:49:22 compute-0 podman[246173]: 2025-09-30 21:49:22.76174693 +0000 UTC m=+0.444579085 container cleanup 019d7ea8dd4d974c02537b24a75794dc9c4d718a76920201914b70720c133ac8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd58cd77-6f38-46a4-b4e3-b10538139b43, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:49:22 compute-0 systemd[1]: libpod-conmon-019d7ea8dd4d974c02537b24a75794dc9c4d718a76920201914b70720c133ac8.scope: Deactivated successfully.
Sep 30 21:49:22 compute-0 podman[246221]: 2025-09-30 21:49:22.9167066 +0000 UTC m=+0.133636751 container remove 019d7ea8dd4d974c02537b24a75794dc9c4d718a76920201914b70720c133ac8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd58cd77-6f38-46a4-b4e3-b10538139b43, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, io.buildah.version=1.41.3)
Sep 30 21:49:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:22.922 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[be13e973-3e33-4c1a-9ae5-0c9711bf6b5c]: (4, ('Tue Sep 30 09:49:22 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-dd58cd77-6f38-46a4-b4e3-b10538139b43 (019d7ea8dd4d974c02537b24a75794dc9c4d718a76920201914b70720c133ac8)\n019d7ea8dd4d974c02537b24a75794dc9c4d718a76920201914b70720c133ac8\nTue Sep 30 09:49:22 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-dd58cd77-6f38-46a4-b4e3-b10538139b43 (019d7ea8dd4d974c02537b24a75794dc9c4d718a76920201914b70720c133ac8)\n019d7ea8dd4d974c02537b24a75794dc9c4d718a76920201914b70720c133ac8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:22.923 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[1b53d215-14cd-4bb3-93db-7bf269b6d222]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:22.924 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdd58cd77-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:49:22 compute-0 nova_compute[192810]: 2025-09-30 21:49:22.925 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:22 compute-0 kernel: tapdd58cd77-60: left promiscuous mode
Sep 30 21:49:22 compute-0 nova_compute[192810]: 2025-09-30 21:49:22.946 2 DEBUG nova.compute.manager [req-04bf15b0-0fbe-45c1-bd24-1be7f999f283 req-836f0fde-2904-4288-a071-8aa90d499ca9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Received event network-vif-unplugged-2b51d8f4-bdd2-4223-8c9f-2bff82319a8a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:49:22 compute-0 nova_compute[192810]: 2025-09-30 21:49:22.946 2 DEBUG oslo_concurrency.lockutils [req-04bf15b0-0fbe-45c1-bd24-1be7f999f283 req-836f0fde-2904-4288-a071-8aa90d499ca9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:49:22 compute-0 nova_compute[192810]: 2025-09-30 21:49:22.947 2 DEBUG oslo_concurrency.lockutils [req-04bf15b0-0fbe-45c1-bd24-1be7f999f283 req-836f0fde-2904-4288-a071-8aa90d499ca9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:49:22 compute-0 nova_compute[192810]: 2025-09-30 21:49:22.947 2 DEBUG oslo_concurrency.lockutils [req-04bf15b0-0fbe-45c1-bd24-1be7f999f283 req-836f0fde-2904-4288-a071-8aa90d499ca9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:49:22 compute-0 nova_compute[192810]: 2025-09-30 21:49:22.948 2 DEBUG nova.compute.manager [req-04bf15b0-0fbe-45c1-bd24-1be7f999f283 req-836f0fde-2904-4288-a071-8aa90d499ca9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] No waiting events found dispatching network-vif-unplugged-2b51d8f4-bdd2-4223-8c9f-2bff82319a8a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:49:22 compute-0 nova_compute[192810]: 2025-09-30 21:49:22.948 2 WARNING nova.compute.manager [req-04bf15b0-0fbe-45c1-bd24-1be7f999f283 req-836f0fde-2904-4288-a071-8aa90d499ca9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Received unexpected event network-vif-unplugged-2b51d8f4-bdd2-4223-8c9f-2bff82319a8a for instance with vm_state active and task_state reboot_started.
Sep 30 21:49:22 compute-0 nova_compute[192810]: 2025-09-30 21:49:22.948 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:22.950 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[d072ae2c-bc61-4929-8920-f08b19daa2da]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:22.976 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[40e34982-780a-4d55-95c3-933115bc3ace]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:22.978 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[9b1ea0b3-29a2-4754-9baa-c3df03768082]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:23.002 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[9fad7515-afa4-4f66-87f8-0914ba8b7aa0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 562343, 'reachable_time': 16664, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246240, 'error': None, 'target': 'ovnmeta-dd58cd77-6f38-46a4-b4e3-b10538139b43', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:23 compute-0 systemd[1]: run-netns-ovnmeta\x2ddd58cd77\x2d6f38\x2d46a4\x2db4e3\x2db10538139b43.mount: Deactivated successfully.
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:23.007 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-dd58cd77-6f38-46a4-b4e3-b10538139b43 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:23.007 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[b7c90c62-9546-45b6-81ec-45dd87019a48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:23 compute-0 nova_compute[192810]: 2025-09-30 21:49:23.023 2 INFO nova.virt.libvirt.driver [None req-e48496af-b651-4af1-ae01-a5ba1655a9ef 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Instance shutdown successfully.
Sep 30 21:49:23 compute-0 kernel: tap2b51d8f4-bd: entered promiscuous mode
Sep 30 21:49:23 compute-0 systemd-udevd[246154]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:49:23 compute-0 NetworkManager[51733]: <info>  [1759268963.0822] manager: (tap2b51d8f4-bd): new Tun device (/org/freedesktop/NetworkManager/Devices/281)
Sep 30 21:49:23 compute-0 ovn_controller[94912]: 2025-09-30T21:49:23Z|00625|binding|INFO|Claiming lport 2b51d8f4-bdd2-4223-8c9f-2bff82319a8a for this chassis.
Sep 30 21:49:23 compute-0 nova_compute[192810]: 2025-09-30 21:49:23.085 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:23 compute-0 ovn_controller[94912]: 2025-09-30T21:49:23Z|00626|binding|INFO|2b51d8f4-bdd2-4223-8c9f-2bff82319a8a: Claiming fa:16:3e:8b:fb:b1 10.100.0.13
Sep 30 21:49:23 compute-0 NetworkManager[51733]: <info>  [1759268963.0941] device (tap2b51d8f4-bd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:49:23 compute-0 NetworkManager[51733]: <info>  [1759268963.0951] device (tap2b51d8f4-bd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:23.102 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8b:fb:b1 10.100.0.13'], port_security=['fa:16:3e:8b:fb:b1 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dd58cd77-6f38-46a4-b4e3-b10538139b43', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'neutron:revision_number': '5', 'neutron:security_group_ids': '877e3925-1208-4d78-a647-d6e9b6515df1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.249'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f06fb115-6230-4a0b-83c9-81667590decb, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=2b51d8f4-bdd2-4223-8c9f-2bff82319a8a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:23.103 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 2b51d8f4-bdd2-4223-8c9f-2bff82319a8a in datapath dd58cd77-6f38-46a4-b4e3-b10538139b43 bound to our chassis
Sep 30 21:49:23 compute-0 nova_compute[192810]: 2025-09-30 21:49:23.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:23.105 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network dd58cd77-6f38-46a4-b4e3-b10538139b43
Sep 30 21:49:23 compute-0 ovn_controller[94912]: 2025-09-30T21:49:23Z|00627|binding|INFO|Setting lport 2b51d8f4-bdd2-4223-8c9f-2bff82319a8a ovn-installed in OVS
Sep 30 21:49:23 compute-0 ovn_controller[94912]: 2025-09-30T21:49:23Z|00628|binding|INFO|Setting lport 2b51d8f4-bdd2-4223-8c9f-2bff82319a8a up in Southbound
Sep 30 21:49:23 compute-0 nova_compute[192810]: 2025-09-30 21:49:23.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:23 compute-0 nova_compute[192810]: 2025-09-30 21:49:23.112 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:23.118 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[94e6ef01-418b-41e0-a36e-509e7e30cea7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:23.119 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdd58cd77-61 in ovnmeta-dd58cd77-6f38-46a4-b4e3-b10538139b43 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:23.120 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdd58cd77-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:23.120 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[146f975e-7fed-41ac-b331-0631604444de]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:23.121 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[b9af199a-e2f0-45c3-948c-662015ce3056]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:23 compute-0 systemd-machined[152794]: New machine qemu-79-instance-000000a4.
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:23.130 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[a877923a-e5cd-4100-82f9-fa7dc76a1616]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:23 compute-0 systemd[1]: Started Virtual Machine qemu-79-instance-000000a4.
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:23.151 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[f015f2e3-43b4-41b3-8dd4-8c0e79ad6d51]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:23.176 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[49d3c710-fb06-41e1-a008-c6267d01944d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:23.181 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[d6285409-ccf4-42bc-8f95-32f43bfb852e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:23 compute-0 NetworkManager[51733]: <info>  [1759268963.1823] manager: (tapdd58cd77-60): new Veth device (/org/freedesktop/NetworkManager/Devices/282)
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:23.211 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[3ee12e2d-10e2-42e6-989c-af0703325a75]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:23.214 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[37cb9f75-cab5-4ae8-a360-aa5c6d60c346]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:23 compute-0 NetworkManager[51733]: <info>  [1759268963.2353] device (tapdd58cd77-60): carrier: link connected
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:23.241 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[a87b6b01-5823-4fa2-8997-2d3292d8cae1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:23.257 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[5088e60f-798a-4d47-97b4-f3671dd536c2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdd58cd77-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:19:65:85'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 190], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 565885, 'reachable_time': 34105, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246287, 'error': None, 'target': 'ovnmeta-dd58cd77-6f38-46a4-b4e3-b10538139b43', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:23.272 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[ad4ce8e2-c946-40e1-8f18-a70b8d54a8ef]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe19:6585'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 565885, 'tstamp': 565885}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 246288, 'error': None, 'target': 'ovnmeta-dd58cd77-6f38-46a4-b4e3-b10538139b43', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:23.289 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[6ff9e3b0-4c63-44c9-bf0f-8a12c7de451c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdd58cd77-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:19:65:85'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 190], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 565885, 'reachable_time': 34105, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 246289, 'error': None, 'target': 'ovnmeta-dd58cd77-6f38-46a4-b4e3-b10538139b43', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:23.318 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a42ab0b5-9e48-4de2-833a-45cf415a8aaa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:23 compute-0 nova_compute[192810]: 2025-09-30 21:49:23.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:23.372 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[135d2107-ca0f-4eb1-a466-c5c54066e2ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:23.373 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdd58cd77-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:23.374 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:23.374 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdd58cd77-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:49:23 compute-0 NetworkManager[51733]: <info>  [1759268963.3764] manager: (tapdd58cd77-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/283)
Sep 30 21:49:23 compute-0 nova_compute[192810]: 2025-09-30 21:49:23.376 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:23 compute-0 kernel: tapdd58cd77-60: entered promiscuous mode
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:23.381 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdd58cd77-60, col_values=(('external_ids', {'iface-id': 'aaea706f-be8f-4718-85d3-6723e1b10016'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:49:23 compute-0 ovn_controller[94912]: 2025-09-30T21:49:23Z|00629|binding|INFO|Releasing lport aaea706f-be8f-4718-85d3-6723e1b10016 from this chassis (sb_readonly=0)
Sep 30 21:49:23 compute-0 nova_compute[192810]: 2025-09-30 21:49:23.383 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:23.384 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/dd58cd77-6f38-46a4-b4e3-b10538139b43.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/dd58cd77-6f38-46a4-b4e3-b10538139b43.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:23.385 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[bdcc74a4-1f72-4932-9792-90ef4ae95372]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:23.386 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-dd58cd77-6f38-46a4-b4e3-b10538139b43
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/dd58cd77-6f38-46a4-b4e3-b10538139b43.pid.haproxy
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID dd58cd77-6f38-46a4-b4e3-b10538139b43
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:49:23 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:23.387 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-dd58cd77-6f38-46a4-b4e3-b10538139b43', 'env', 'PROCESS_TAG=haproxy-dd58cd77-6f38-46a4-b4e3-b10538139b43', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/dd58cd77-6f38-46a4-b4e3-b10538139b43.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:49:23 compute-0 nova_compute[192810]: 2025-09-30 21:49:23.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:23 compute-0 podman[246320]: 2025-09-30 21:49:23.743527387 +0000 UTC m=+0.055403932 container create 87dedb8a0298589ff19c51a987dca7741032e2f4c05111eeb79c20d2d5e9ebbe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd58cd77-6f38-46a4-b4e3-b10538139b43, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Sep 30 21:49:23 compute-0 systemd[1]: Started libpod-conmon-87dedb8a0298589ff19c51a987dca7741032e2f4c05111eeb79c20d2d5e9ebbe.scope.
Sep 30 21:49:23 compute-0 podman[246320]: 2025-09-30 21:49:23.709877146 +0000 UTC m=+0.021753691 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:49:23 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:49:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c8c5957afe77e61bb064daed4b454590bd4f8c4ac6922013efa49666d02fb832/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:49:23 compute-0 podman[246320]: 2025-09-30 21:49:23.835094621 +0000 UTC m=+0.146971176 container init 87dedb8a0298589ff19c51a987dca7741032e2f4c05111eeb79c20d2d5e9ebbe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd58cd77-6f38-46a4-b4e3-b10538139b43, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:49:23 compute-0 podman[246320]: 2025-09-30 21:49:23.840863391 +0000 UTC m=+0.152739926 container start 87dedb8a0298589ff19c51a987dca7741032e2f4c05111eeb79c20d2d5e9ebbe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd58cd77-6f38-46a4-b4e3-b10538139b43, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:49:23 compute-0 neutron-haproxy-ovnmeta-dd58cd77-6f38-46a4-b4e3-b10538139b43[246336]: [NOTICE]   (246340) : New worker (246342) forked
Sep 30 21:49:23 compute-0 neutron-haproxy-ovnmeta-dd58cd77-6f38-46a4-b4e3-b10538139b43[246336]: [NOTICE]   (246340) : Loading success.
Sep 30 21:49:24 compute-0 nova_compute[192810]: 2025-09-30 21:49:24.438 2 DEBUG nova.virt.libvirt.host [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Removed pending event for a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Sep 30 21:49:24 compute-0 nova_compute[192810]: 2025-09-30 21:49:24.439 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268964.4381275, a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:49:24 compute-0 nova_compute[192810]: 2025-09-30 21:49:24.439 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] VM Resumed (Lifecycle Event)
Sep 30 21:49:24 compute-0 nova_compute[192810]: 2025-09-30 21:49:24.444 2 INFO nova.virt.libvirt.driver [-] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Instance running successfully.
Sep 30 21:49:24 compute-0 nova_compute[192810]: 2025-09-30 21:49:24.444 2 INFO nova.virt.libvirt.driver [None req-e48496af-b651-4af1-ae01-a5ba1655a9ef 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Instance soft rebooted successfully.
Sep 30 21:49:24 compute-0 nova_compute[192810]: 2025-09-30 21:49:24.445 2 DEBUG nova.compute.manager [None req-e48496af-b651-4af1-ae01-a5ba1655a9ef 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:49:24 compute-0 nova_compute[192810]: 2025-09-30 21:49:24.468 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:49:24 compute-0 nova_compute[192810]: 2025-09-30 21:49:24.471 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:49:24 compute-0 nova_compute[192810]: 2025-09-30 21:49:24.499 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] During sync_power_state the instance has a pending task (reboot_started). Skip.
Sep 30 21:49:24 compute-0 nova_compute[192810]: 2025-09-30 21:49:24.499 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268964.439209, a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:49:24 compute-0 nova_compute[192810]: 2025-09-30 21:49:24.499 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] VM Started (Lifecycle Event)
Sep 30 21:49:24 compute-0 nova_compute[192810]: 2025-09-30 21:49:24.521 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:49:24 compute-0 nova_compute[192810]: 2025-09-30 21:49:24.527 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:49:24 compute-0 nova_compute[192810]: 2025-09-30 21:49:24.535 2 DEBUG oslo_concurrency.lockutils [None req-e48496af-b651-4af1-ae01-a5ba1655a9ef 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 10.818s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:49:25 compute-0 nova_compute[192810]: 2025-09-30 21:49:25.050 2 DEBUG nova.compute.manager [req-4bdaf650-a728-4f5c-85ac-a3b971b2f76b req-d470c8af-661f-4b08-a408-d939d876f1f8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Received event network-vif-plugged-2b51d8f4-bdd2-4223-8c9f-2bff82319a8a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:49:25 compute-0 nova_compute[192810]: 2025-09-30 21:49:25.051 2 DEBUG oslo_concurrency.lockutils [req-4bdaf650-a728-4f5c-85ac-a3b971b2f76b req-d470c8af-661f-4b08-a408-d939d876f1f8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:49:25 compute-0 nova_compute[192810]: 2025-09-30 21:49:25.051 2 DEBUG oslo_concurrency.lockutils [req-4bdaf650-a728-4f5c-85ac-a3b971b2f76b req-d470c8af-661f-4b08-a408-d939d876f1f8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:49:25 compute-0 nova_compute[192810]: 2025-09-30 21:49:25.051 2 DEBUG oslo_concurrency.lockutils [req-4bdaf650-a728-4f5c-85ac-a3b971b2f76b req-d470c8af-661f-4b08-a408-d939d876f1f8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:49:25 compute-0 nova_compute[192810]: 2025-09-30 21:49:25.052 2 DEBUG nova.compute.manager [req-4bdaf650-a728-4f5c-85ac-a3b971b2f76b req-d470c8af-661f-4b08-a408-d939d876f1f8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] No waiting events found dispatching network-vif-plugged-2b51d8f4-bdd2-4223-8c9f-2bff82319a8a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:49:25 compute-0 nova_compute[192810]: 2025-09-30 21:49:25.052 2 WARNING nova.compute.manager [req-4bdaf650-a728-4f5c-85ac-a3b971b2f76b req-d470c8af-661f-4b08-a408-d939d876f1f8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Received unexpected event network-vif-plugged-2b51d8f4-bdd2-4223-8c9f-2bff82319a8a for instance with vm_state active and task_state None.
Sep 30 21:49:25 compute-0 nova_compute[192810]: 2025-09-30 21:49:25.052 2 DEBUG nova.compute.manager [req-4bdaf650-a728-4f5c-85ac-a3b971b2f76b req-d470c8af-661f-4b08-a408-d939d876f1f8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Received event network-vif-plugged-2b51d8f4-bdd2-4223-8c9f-2bff82319a8a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:49:25 compute-0 nova_compute[192810]: 2025-09-30 21:49:25.053 2 DEBUG oslo_concurrency.lockutils [req-4bdaf650-a728-4f5c-85ac-a3b971b2f76b req-d470c8af-661f-4b08-a408-d939d876f1f8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:49:25 compute-0 nova_compute[192810]: 2025-09-30 21:49:25.053 2 DEBUG oslo_concurrency.lockutils [req-4bdaf650-a728-4f5c-85ac-a3b971b2f76b req-d470c8af-661f-4b08-a408-d939d876f1f8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:49:25 compute-0 nova_compute[192810]: 2025-09-30 21:49:25.053 2 DEBUG oslo_concurrency.lockutils [req-4bdaf650-a728-4f5c-85ac-a3b971b2f76b req-d470c8af-661f-4b08-a408-d939d876f1f8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:49:25 compute-0 nova_compute[192810]: 2025-09-30 21:49:25.054 2 DEBUG nova.compute.manager [req-4bdaf650-a728-4f5c-85ac-a3b971b2f76b req-d470c8af-661f-4b08-a408-d939d876f1f8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] No waiting events found dispatching network-vif-plugged-2b51d8f4-bdd2-4223-8c9f-2bff82319a8a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:49:25 compute-0 nova_compute[192810]: 2025-09-30 21:49:25.054 2 WARNING nova.compute.manager [req-4bdaf650-a728-4f5c-85ac-a3b971b2f76b req-d470c8af-661f-4b08-a408-d939d876f1f8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Received unexpected event network-vif-plugged-2b51d8f4-bdd2-4223-8c9f-2bff82319a8a for instance with vm_state active and task_state None.
Sep 30 21:49:25 compute-0 nova_compute[192810]: 2025-09-30 21:49:25.054 2 DEBUG nova.compute.manager [req-4bdaf650-a728-4f5c-85ac-a3b971b2f76b req-d470c8af-661f-4b08-a408-d939d876f1f8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Received event network-vif-plugged-2b51d8f4-bdd2-4223-8c9f-2bff82319a8a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:49:25 compute-0 nova_compute[192810]: 2025-09-30 21:49:25.054 2 DEBUG oslo_concurrency.lockutils [req-4bdaf650-a728-4f5c-85ac-a3b971b2f76b req-d470c8af-661f-4b08-a408-d939d876f1f8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:49:25 compute-0 nova_compute[192810]: 2025-09-30 21:49:25.055 2 DEBUG oslo_concurrency.lockutils [req-4bdaf650-a728-4f5c-85ac-a3b971b2f76b req-d470c8af-661f-4b08-a408-d939d876f1f8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:49:25 compute-0 nova_compute[192810]: 2025-09-30 21:49:25.055 2 DEBUG oslo_concurrency.lockutils [req-4bdaf650-a728-4f5c-85ac-a3b971b2f76b req-d470c8af-661f-4b08-a408-d939d876f1f8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:49:25 compute-0 nova_compute[192810]: 2025-09-30 21:49:25.055 2 DEBUG nova.compute.manager [req-4bdaf650-a728-4f5c-85ac-a3b971b2f76b req-d470c8af-661f-4b08-a408-d939d876f1f8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] No waiting events found dispatching network-vif-plugged-2b51d8f4-bdd2-4223-8c9f-2bff82319a8a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:49:25 compute-0 nova_compute[192810]: 2025-09-30 21:49:25.056 2 WARNING nova.compute.manager [req-4bdaf650-a728-4f5c-85ac-a3b971b2f76b req-d470c8af-661f-4b08-a408-d939d876f1f8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Received unexpected event network-vif-plugged-2b51d8f4-bdd2-4223-8c9f-2bff82319a8a for instance with vm_state active and task_state None.
Sep 30 21:49:27 compute-0 nova_compute[192810]: 2025-09-30 21:49:27.081 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:27 compute-0 nova_compute[192810]: 2025-09-30 21:49:27.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:49:27 compute-0 nova_compute[192810]: 2025-09-30 21:49:27.788 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:49:27 compute-0 nova_compute[192810]: 2025-09-30 21:49:27.789 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:49:27 compute-0 nova_compute[192810]: 2025-09-30 21:49:27.789 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:49:27 compute-0 nova_compute[192810]: 2025-09-30 21:49:27.789 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:49:27 compute-0 nova_compute[192810]: 2025-09-30 21:49:27.790 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:49:27 compute-0 nova_compute[192810]: 2025-09-30 21:49:27.790 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:49:27 compute-0 nova_compute[192810]: 2025-09-30 21:49:27.825 2 DEBUG nova.virt.libvirt.imagecache [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314
Sep 30 21:49:27 compute-0 nova_compute[192810]: 2025-09-30 21:49:27.825 2 DEBUG nova.virt.libvirt.imagecache [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Image id 86b6907c-d747-4e98-8897-42105915831d yields fingerprint e0a114b373fedfcc318870f9bde30baf716d5a2a _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319
Sep 30 21:49:27 compute-0 nova_compute[192810]: 2025-09-30 21:49:27.825 2 INFO nova.virt.libvirt.imagecache [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] image 86b6907c-d747-4e98-8897-42105915831d at (/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a): checking
Sep 30 21:49:27 compute-0 nova_compute[192810]: 2025-09-30 21:49:27.826 2 DEBUG nova.virt.libvirt.imagecache [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] image 86b6907c-d747-4e98-8897-42105915831d at (/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a): image is in use _mark_in_use /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:279
Sep 30 21:49:27 compute-0 nova_compute[192810]: 2025-09-30 21:49:27.827 2 DEBUG nova.virt.libvirt.imagecache [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Image id  yields fingerprint da39a3ee5e6b4b0d3255bfef95601890afd80709 _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319
Sep 30 21:49:27 compute-0 nova_compute[192810]: 2025-09-30 21:49:27.828 2 DEBUG nova.virt.libvirt.imagecache [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] 4ee4a775-05d1-45fc-b4f9-566ab8159710 is a valid instance name _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:126
Sep 30 21:49:27 compute-0 nova_compute[192810]: 2025-09-30 21:49:27.828 2 DEBUG nova.virt.libvirt.imagecache [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] 4ee4a775-05d1-45fc-b4f9-566ab8159710 has a disk file _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:129
Sep 30 21:49:27 compute-0 nova_compute[192810]: 2025-09-30 21:49:27.828 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4ee4a775-05d1-45fc-b4f9-566ab8159710/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:49:27 compute-0 nova_compute[192810]: 2025-09-30 21:49:27.895 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4ee4a775-05d1-45fc-b4f9-566ab8159710/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:49:27 compute-0 nova_compute[192810]: 2025-09-30 21:49:27.895 2 DEBUG nova.virt.libvirt.imagecache [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Instance 4ee4a775-05d1-45fc-b4f9-566ab8159710 is backed by e0a114b373fedfcc318870f9bde30baf716d5a2a _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:141
Sep 30 21:49:27 compute-0 nova_compute[192810]: 2025-09-30 21:49:27.896 2 DEBUG nova.virt.libvirt.imagecache [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8 is a valid instance name _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:126
Sep 30 21:49:27 compute-0 nova_compute[192810]: 2025-09-30 21:49:27.896 2 DEBUG nova.virt.libvirt.imagecache [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8 has a disk file _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:129
Sep 30 21:49:27 compute-0 nova_compute[192810]: 2025-09-30 21:49:27.896 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:49:27 compute-0 nova_compute[192810]: 2025-09-30 21:49:27.954 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:49:27 compute-0 nova_compute[192810]: 2025-09-30 21:49:27.954 2 DEBUG nova.virt.libvirt.imagecache [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Instance a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8 is backed by e0a114b373fedfcc318870f9bde30baf716d5a2a _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:141
Sep 30 21:49:27 compute-0 nova_compute[192810]: 2025-09-30 21:49:27.955 2 WARNING nova.virt.libvirt.imagecache [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Unknown base file: /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e
Sep 30 21:49:27 compute-0 nova_compute[192810]: 2025-09-30 21:49:27.955 2 WARNING nova.virt.libvirt.imagecache [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Unknown base file: /var/lib/nova/instances/_base/edd4aa4c8decf0f5a78c1abc69848feb809aa9e2
Sep 30 21:49:27 compute-0 nova_compute[192810]: 2025-09-30 21:49:27.955 2 WARNING nova.virt.libvirt.imagecache [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Unknown base file: /var/lib/nova/instances/_base/a9ab869ec5a6bd299955a400c40ca9514f1f8c02
Sep 30 21:49:27 compute-0 nova_compute[192810]: 2025-09-30 21:49:27.955 2 WARNING nova.virt.libvirt.imagecache [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Unknown base file: /var/lib/nova/instances/_base/6b8d62627514116d042a4973bfd575f038df9d61
Sep 30 21:49:27 compute-0 nova_compute[192810]: 2025-09-30 21:49:27.955 2 WARNING nova.virt.libvirt.imagecache [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Unknown base file: /var/lib/nova/instances/_base/840763c52fc7ef035fd3f982d581e13eeabd4dd6
Sep 30 21:49:27 compute-0 nova_compute[192810]: 2025-09-30 21:49:27.955 2 INFO nova.virt.libvirt.imagecache [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Active base files: /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a
Sep 30 21:49:27 compute-0 nova_compute[192810]: 2025-09-30 21:49:27.955 2 INFO nova.virt.libvirt.imagecache [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Removable base files: /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e /var/lib/nova/instances/_base/edd4aa4c8decf0f5a78c1abc69848feb809aa9e2 /var/lib/nova/instances/_base/a9ab869ec5a6bd299955a400c40ca9514f1f8c02 /var/lib/nova/instances/_base/6b8d62627514116d042a4973bfd575f038df9d61 /var/lib/nova/instances/_base/840763c52fc7ef035fd3f982d581e13eeabd4dd6
Sep 30 21:49:27 compute-0 nova_compute[192810]: 2025-09-30 21:49:27.956 2 INFO nova.virt.libvirt.imagecache [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/d794a27f8e0bfa3eee9759fbfddd316a7671c61e
Sep 30 21:49:27 compute-0 nova_compute[192810]: 2025-09-30 21:49:27.956 2 INFO nova.virt.libvirt.imagecache [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/edd4aa4c8decf0f5a78c1abc69848feb809aa9e2
Sep 30 21:49:27 compute-0 nova_compute[192810]: 2025-09-30 21:49:27.956 2 INFO nova.virt.libvirt.imagecache [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/a9ab869ec5a6bd299955a400c40ca9514f1f8c02
Sep 30 21:49:27 compute-0 nova_compute[192810]: 2025-09-30 21:49:27.956 2 INFO nova.virt.libvirt.imagecache [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/6b8d62627514116d042a4973bfd575f038df9d61
Sep 30 21:49:27 compute-0 nova_compute[192810]: 2025-09-30 21:49:27.956 2 INFO nova.virt.libvirt.imagecache [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/840763c52fc7ef035fd3f982d581e13eeabd4dd6
Sep 30 21:49:27 compute-0 nova_compute[192810]: 2025-09-30 21:49:27.957 2 DEBUG nova.virt.libvirt.imagecache [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350
Sep 30 21:49:27 compute-0 nova_compute[192810]: 2025-09-30 21:49:27.957 2 DEBUG nova.virt.libvirt.imagecache [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299
Sep 30 21:49:27 compute-0 nova_compute[192810]: 2025-09-30 21:49:27.957 2 DEBUG nova.virt.libvirt.imagecache [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284
Sep 30 21:49:28 compute-0 podman[246365]: 2025-09-30 21:49:28.314502599 +0000 UTC m=+0.047324345 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:49:28 compute-0 podman[246366]: 2025-09-30 21:49:28.325328673 +0000 UTC m=+0.058103268 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, config_id=edpm, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.component=ubi9-minimal-container, distribution-scope=public, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc.)
Sep 30 21:49:28 compute-0 nova_compute[192810]: 2025-09-30 21:49:28.357 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:29 compute-0 nova_compute[192810]: 2025-09-30 21:49:29.632 2 DEBUG oslo_concurrency.lockutils [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "2f0fbc4b-01ff-4422-bed8-aa7ca8934f51" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:49:29 compute-0 nova_compute[192810]: 2025-09-30 21:49:29.633 2 DEBUG oslo_concurrency.lockutils [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "2f0fbc4b-01ff-4422-bed8-aa7ca8934f51" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:49:29 compute-0 nova_compute[192810]: 2025-09-30 21:49:29.650 2 DEBUG nova.compute.manager [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:49:29 compute-0 nova_compute[192810]: 2025-09-30 21:49:29.771 2 DEBUG oslo_concurrency.lockutils [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:49:29 compute-0 nova_compute[192810]: 2025-09-30 21:49:29.772 2 DEBUG oslo_concurrency.lockutils [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:49:29 compute-0 nova_compute[192810]: 2025-09-30 21:49:29.780 2 DEBUG nova.virt.hardware [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:49:29 compute-0 nova_compute[192810]: 2025-09-30 21:49:29.780 2 INFO nova.compute.claims [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:49:29 compute-0 nova_compute[192810]: 2025-09-30 21:49:29.980 2 DEBUG nova.compute.provider_tree [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:49:29 compute-0 nova_compute[192810]: 2025-09-30 21:49:29.994 2 DEBUG nova.scheduler.client.report [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:49:30 compute-0 nova_compute[192810]: 2025-09-30 21:49:30.015 2 DEBUG oslo_concurrency.lockutils [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.243s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:49:30 compute-0 nova_compute[192810]: 2025-09-30 21:49:30.016 2 DEBUG nova.compute.manager [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:49:30 compute-0 nova_compute[192810]: 2025-09-30 21:49:30.090 2 DEBUG nova.compute.manager [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:49:30 compute-0 nova_compute[192810]: 2025-09-30 21:49:30.091 2 DEBUG nova.network.neutron [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:49:30 compute-0 nova_compute[192810]: 2025-09-30 21:49:30.112 2 INFO nova.virt.libvirt.driver [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:49:30 compute-0 nova_compute[192810]: 2025-09-30 21:49:30.135 2 DEBUG nova.compute.manager [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:49:30 compute-0 nova_compute[192810]: 2025-09-30 21:49:30.296 2 DEBUG nova.compute.manager [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:49:30 compute-0 nova_compute[192810]: 2025-09-30 21:49:30.298 2 DEBUG nova.virt.libvirt.driver [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:49:30 compute-0 nova_compute[192810]: 2025-09-30 21:49:30.298 2 INFO nova.virt.libvirt.driver [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Creating image(s)
Sep 30 21:49:30 compute-0 nova_compute[192810]: 2025-09-30 21:49:30.299 2 DEBUG oslo_concurrency.lockutils [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "/var/lib/nova/instances/2f0fbc4b-01ff-4422-bed8-aa7ca8934f51/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:49:30 compute-0 nova_compute[192810]: 2025-09-30 21:49:30.299 2 DEBUG oslo_concurrency.lockutils [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "/var/lib/nova/instances/2f0fbc4b-01ff-4422-bed8-aa7ca8934f51/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:49:30 compute-0 nova_compute[192810]: 2025-09-30 21:49:30.299 2 DEBUG oslo_concurrency.lockutils [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "/var/lib/nova/instances/2f0fbc4b-01ff-4422-bed8-aa7ca8934f51/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:49:30 compute-0 nova_compute[192810]: 2025-09-30 21:49:30.312 2 DEBUG oslo_concurrency.processutils [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:49:30 compute-0 nova_compute[192810]: 2025-09-30 21:49:30.357 2 DEBUG nova.policy [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:49:30 compute-0 nova_compute[192810]: 2025-09-30 21:49:30.378 2 DEBUG oslo_concurrency.processutils [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:49:30 compute-0 nova_compute[192810]: 2025-09-30 21:49:30.380 2 DEBUG oslo_concurrency.lockutils [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:49:30 compute-0 nova_compute[192810]: 2025-09-30 21:49:30.381 2 DEBUG oslo_concurrency.lockutils [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:49:30 compute-0 nova_compute[192810]: 2025-09-30 21:49:30.394 2 DEBUG oslo_concurrency.processutils [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:49:30 compute-0 nova_compute[192810]: 2025-09-30 21:49:30.459 2 DEBUG oslo_concurrency.processutils [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:49:30 compute-0 nova_compute[192810]: 2025-09-30 21:49:30.460 2 DEBUG oslo_concurrency.processutils [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/2f0fbc4b-01ff-4422-bed8-aa7ca8934f51/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:49:30 compute-0 nova_compute[192810]: 2025-09-30 21:49:30.630 2 DEBUG oslo_concurrency.processutils [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/2f0fbc4b-01ff-4422-bed8-aa7ca8934f51/disk 1073741824" returned: 0 in 0.170s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:49:30 compute-0 nova_compute[192810]: 2025-09-30 21:49:30.631 2 DEBUG oslo_concurrency.lockutils [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.250s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:49:30 compute-0 nova_compute[192810]: 2025-09-30 21:49:30.631 2 DEBUG oslo_concurrency.processutils [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:49:30 compute-0 nova_compute[192810]: 2025-09-30 21:49:30.686 2 DEBUG oslo_concurrency.processutils [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:49:30 compute-0 nova_compute[192810]: 2025-09-30 21:49:30.687 2 DEBUG nova.virt.disk.api [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Checking if we can resize image /var/lib/nova/instances/2f0fbc4b-01ff-4422-bed8-aa7ca8934f51/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:49:30 compute-0 nova_compute[192810]: 2025-09-30 21:49:30.687 2 DEBUG oslo_concurrency.processutils [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2f0fbc4b-01ff-4422-bed8-aa7ca8934f51/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:49:30 compute-0 nova_compute[192810]: 2025-09-30 21:49:30.740 2 DEBUG oslo_concurrency.processutils [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2f0fbc4b-01ff-4422-bed8-aa7ca8934f51/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:49:30 compute-0 nova_compute[192810]: 2025-09-30 21:49:30.741 2 DEBUG nova.virt.disk.api [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Cannot resize image /var/lib/nova/instances/2f0fbc4b-01ff-4422-bed8-aa7ca8934f51/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:49:30 compute-0 nova_compute[192810]: 2025-09-30 21:49:30.742 2 DEBUG nova.objects.instance [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lazy-loading 'migration_context' on Instance uuid 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:49:30 compute-0 nova_compute[192810]: 2025-09-30 21:49:30.755 2 DEBUG nova.virt.libvirt.driver [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:49:30 compute-0 nova_compute[192810]: 2025-09-30 21:49:30.756 2 DEBUG nova.virt.libvirt.driver [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Ensure instance console log exists: /var/lib/nova/instances/2f0fbc4b-01ff-4422-bed8-aa7ca8934f51/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:49:30 compute-0 nova_compute[192810]: 2025-09-30 21:49:30.756 2 DEBUG oslo_concurrency.lockutils [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:49:30 compute-0 nova_compute[192810]: 2025-09-30 21:49:30.757 2 DEBUG oslo_concurrency.lockutils [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:49:30 compute-0 nova_compute[192810]: 2025-09-30 21:49:30.757 2 DEBUG oslo_concurrency.lockutils [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:49:31 compute-0 nova_compute[192810]: 2025-09-30 21:49:31.158 2 DEBUG nova.network.neutron [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Successfully created port: 15d18429-a32d-4645-bc6f-dcdc238c5de9 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:49:32 compute-0 nova_compute[192810]: 2025-09-30 21:49:32.047 2 DEBUG nova.network.neutron [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Successfully created port: 15c82795-bc8d-4e4d-9949-ded082705cd7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:49:32 compute-0 nova_compute[192810]: 2025-09-30 21:49:32.083 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:32 compute-0 nova_compute[192810]: 2025-09-30 21:49:32.846 2 DEBUG nova.network.neutron [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Successfully updated port: 15d18429-a32d-4645-bc6f-dcdc238c5de9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:49:33 compute-0 nova_compute[192810]: 2025-09-30 21:49:33.360 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:33 compute-0 nova_compute[192810]: 2025-09-30 21:49:33.477 2 DEBUG nova.compute.manager [req-4ef7e8d8-44d7-495f-90ef-c22c32d89012 req-e29ceeaf-ee78-4428-a414-6373be919d41 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Received event network-changed-15d18429-a32d-4645-bc6f-dcdc238c5de9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:49:33 compute-0 nova_compute[192810]: 2025-09-30 21:49:33.478 2 DEBUG nova.compute.manager [req-4ef7e8d8-44d7-495f-90ef-c22c32d89012 req-e29ceeaf-ee78-4428-a414-6373be919d41 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Refreshing instance network info cache due to event network-changed-15d18429-a32d-4645-bc6f-dcdc238c5de9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:49:33 compute-0 nova_compute[192810]: 2025-09-30 21:49:33.478 2 DEBUG oslo_concurrency.lockutils [req-4ef7e8d8-44d7-495f-90ef-c22c32d89012 req-e29ceeaf-ee78-4428-a414-6373be919d41 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-2f0fbc4b-01ff-4422-bed8-aa7ca8934f51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:49:33 compute-0 nova_compute[192810]: 2025-09-30 21:49:33.479 2 DEBUG oslo_concurrency.lockutils [req-4ef7e8d8-44d7-495f-90ef-c22c32d89012 req-e29ceeaf-ee78-4428-a414-6373be919d41 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-2f0fbc4b-01ff-4422-bed8-aa7ca8934f51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:49:33 compute-0 nova_compute[192810]: 2025-09-30 21:49:33.479 2 DEBUG nova.network.neutron [req-4ef7e8d8-44d7-495f-90ef-c22c32d89012 req-e29ceeaf-ee78-4428-a414-6373be919d41 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Refreshing network info cache for port 15d18429-a32d-4645-bc6f-dcdc238c5de9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:49:33 compute-0 nova_compute[192810]: 2025-09-30 21:49:33.630 2 DEBUG nova.network.neutron [req-4ef7e8d8-44d7-495f-90ef-c22c32d89012 req-e29ceeaf-ee78-4428-a414-6373be919d41 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:49:33 compute-0 nova_compute[192810]: 2025-09-30 21:49:33.652 2 DEBUG nova.network.neutron [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Successfully updated port: 15c82795-bc8d-4e4d-9949-ded082705cd7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:49:33 compute-0 nova_compute[192810]: 2025-09-30 21:49:33.667 2 DEBUG oslo_concurrency.lockutils [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "refresh_cache-2f0fbc4b-01ff-4422-bed8-aa7ca8934f51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:49:33 compute-0 nova_compute[192810]: 2025-09-30 21:49:33.936 2 DEBUG nova.network.neutron [req-4ef7e8d8-44d7-495f-90ef-c22c32d89012 req-e29ceeaf-ee78-4428-a414-6373be919d41 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:49:33 compute-0 nova_compute[192810]: 2025-09-30 21:49:33.958 2 DEBUG oslo_concurrency.lockutils [req-4ef7e8d8-44d7-495f-90ef-c22c32d89012 req-e29ceeaf-ee78-4428-a414-6373be919d41 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-2f0fbc4b-01ff-4422-bed8-aa7ca8934f51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:49:33 compute-0 nova_compute[192810]: 2025-09-30 21:49:33.959 2 DEBUG oslo_concurrency.lockutils [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquired lock "refresh_cache-2f0fbc4b-01ff-4422-bed8-aa7ca8934f51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:49:33 compute-0 nova_compute[192810]: 2025-09-30 21:49:33.959 2 DEBUG nova.network.neutron [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:49:34 compute-0 nova_compute[192810]: 2025-09-30 21:49:34.107 2 DEBUG nova.network.neutron [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:49:35 compute-0 nova_compute[192810]: 2025-09-30 21:49:35.610 2 DEBUG nova.compute.manager [req-4b981dc7-6a58-4466-976a-9f85dbf0d030 req-a4c8608c-9468-4953-bbdb-b0a3317ee292 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Received event network-changed-15c82795-bc8d-4e4d-9949-ded082705cd7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:49:35 compute-0 nova_compute[192810]: 2025-09-30 21:49:35.611 2 DEBUG nova.compute.manager [req-4b981dc7-6a58-4466-976a-9f85dbf0d030 req-a4c8608c-9468-4953-bbdb-b0a3317ee292 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Refreshing instance network info cache due to event network-changed-15c82795-bc8d-4e4d-9949-ded082705cd7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:49:35 compute-0 nova_compute[192810]: 2025-09-30 21:49:35.612 2 DEBUG oslo_concurrency.lockutils [req-4b981dc7-6a58-4466-976a-9f85dbf0d030 req-a4c8608c-9468-4953-bbdb-b0a3317ee292 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-2f0fbc4b-01ff-4422-bed8-aa7ca8934f51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:49:36 compute-0 podman[246440]: 2025-09-30 21:49:36.337608764 +0000 UTC m=+0.074351526 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:49:36 compute-0 podman[246441]: 2025-09-30 21:49:36.338290041 +0000 UTC m=+0.068065970 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=iscsid, io.buildah.version=1.41.3)
Sep 30 21:49:36 compute-0 podman[246442]: 2025-09-30 21:49:36.367404704 +0000 UTC m=+0.097107711 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:49:37 compute-0 nova_compute[192810]: 2025-09-30 21:49:37.087 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:37 compute-0 ovn_controller[94912]: 2025-09-30T21:49:37Z|00072|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8b:fb:b1 10.100.0.13
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.365 2 DEBUG nova.network.neutron [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Updating instance_info_cache with network_info: [{"id": "15d18429-a32d-4645-bc6f-dcdc238c5de9", "address": "fa:16:3e:23:0a:27", "network": {"id": "40cfa99d-fae5-4f7e-b4bc-e90e389ced61", "bridge": "br-int", "label": "tempest-network-smoke--804531616", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15d18429-a3", "ovs_interfaceid": "15d18429-a32d-4645-bc6f-dcdc238c5de9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "15c82795-bc8d-4e4d-9949-ded082705cd7", "address": "fa:16:3e:31:1e:bf", "network": {"id": "d1ec18dd-20d4-4643-8e73-7d404d8b8493", "bridge": "br-int", "label": "tempest-network-smoke--678224572", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe31:1ebf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe31:1ebf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15c82795-bc", "ovs_interfaceid": "15c82795-bc8d-4e4d-9949-ded082705cd7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.383 2 DEBUG oslo_concurrency.lockutils [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Releasing lock "refresh_cache-2f0fbc4b-01ff-4422-bed8-aa7ca8934f51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.383 2 DEBUG nova.compute.manager [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Instance network_info: |[{"id": "15d18429-a32d-4645-bc6f-dcdc238c5de9", "address": "fa:16:3e:23:0a:27", "network": {"id": "40cfa99d-fae5-4f7e-b4bc-e90e389ced61", "bridge": "br-int", "label": "tempest-network-smoke--804531616", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15d18429-a3", "ovs_interfaceid": "15d18429-a32d-4645-bc6f-dcdc238c5de9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "15c82795-bc8d-4e4d-9949-ded082705cd7", "address": "fa:16:3e:31:1e:bf", "network": {"id": "d1ec18dd-20d4-4643-8e73-7d404d8b8493", "bridge": "br-int", "label": "tempest-network-smoke--678224572", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe31:1ebf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, 
"meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe31:1ebf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15c82795-bc", "ovs_interfaceid": "15c82795-bc8d-4e4d-9949-ded082705cd7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.384 2 DEBUG oslo_concurrency.lockutils [req-4b981dc7-6a58-4466-976a-9f85dbf0d030 req-a4c8608c-9468-4953-bbdb-b0a3317ee292 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-2f0fbc4b-01ff-4422-bed8-aa7ca8934f51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.384 2 DEBUG nova.network.neutron [req-4b981dc7-6a58-4466-976a-9f85dbf0d030 req-a4c8608c-9468-4953-bbdb-b0a3317ee292 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Refreshing network info cache for port 15c82795-bc8d-4e4d-9949-ded082705cd7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.387 2 DEBUG nova.virt.libvirt.driver [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Start _get_guest_xml network_info=[{"id": "15d18429-a32d-4645-bc6f-dcdc238c5de9", "address": "fa:16:3e:23:0a:27", "network": {"id": "40cfa99d-fae5-4f7e-b4bc-e90e389ced61", "bridge": "br-int", "label": "tempest-network-smoke--804531616", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15d18429-a3", "ovs_interfaceid": "15d18429-a32d-4645-bc6f-dcdc238c5de9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "15c82795-bc8d-4e4d-9949-ded082705cd7", "address": "fa:16:3e:31:1e:bf", "network": {"id": "d1ec18dd-20d4-4643-8e73-7d404d8b8493", "bridge": "br-int", "label": "tempest-network-smoke--678224572", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe31:1ebf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", 
"version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe31:1ebf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15c82795-bc", "ovs_interfaceid": "15c82795-bc8d-4e4d-9949-ded082705cd7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.392 2 WARNING nova.virt.libvirt.driver [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.399 2 DEBUG nova.virt.libvirt.host [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.400 2 DEBUG nova.virt.libvirt.host [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.403 2 DEBUG nova.virt.libvirt.host [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.404 2 DEBUG nova.virt.libvirt.host [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.405 2 DEBUG nova.virt.libvirt.driver [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.405 2 DEBUG nova.virt.hardware [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.407 2 DEBUG nova.virt.hardware [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.407 2 DEBUG nova.virt.hardware [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.408 2 DEBUG nova.virt.hardware [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.408 2 DEBUG nova.virt.hardware [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.408 2 DEBUG nova.virt.hardware [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.409 2 DEBUG nova.virt.hardware [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.409 2 DEBUG nova.virt.hardware [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.409 2 DEBUG nova.virt.hardware [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.409 2 DEBUG nova.virt.hardware [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.410 2 DEBUG nova.virt.hardware [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.414 2 DEBUG nova.virt.libvirt.vif [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:49:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1303425693',display_name='tempest-TestGettingAddress-server-1303425693',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1303425693',id=167,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCGG0QbUCZfF/aMNgREbUjMb15nHLUKy3cMsM6riMQ1TEklisidkwt9SMk4vpWErg6fFTJOsgBrfd7Y56loMtMOFXQViH6JvZSbYiUd68BEKiwfiEj6LoMHEURYI7Qr6GQ==',key_name='tempest-TestGettingAddress-1486288610',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-n1t0ecif',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:49:30Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=2f0fbc4b-01ff-4422-bed8-aa7ca8934f51,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "15d18429-a32d-4645-bc6f-dcdc238c5de9", "address": "fa:16:3e:23:0a:27", "network": {"id": "40cfa99d-fae5-4f7e-b4bc-e90e389ced61", "bridge": "br-int", "label": "tempest-network-smoke--804531616", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15d18429-a3", "ovs_interfaceid": "15d18429-a32d-4645-bc6f-dcdc238c5de9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.414 2 DEBUG nova.network.os_vif_util [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "15d18429-a32d-4645-bc6f-dcdc238c5de9", "address": "fa:16:3e:23:0a:27", "network": {"id": "40cfa99d-fae5-4f7e-b4bc-e90e389ced61", "bridge": "br-int", "label": "tempest-network-smoke--804531616", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15d18429-a3", "ovs_interfaceid": "15d18429-a32d-4645-bc6f-dcdc238c5de9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.415 2 DEBUG nova.network.os_vif_util [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:23:0a:27,bridge_name='br-int',has_traffic_filtering=True,id=15d18429-a32d-4645-bc6f-dcdc238c5de9,network=Network(40cfa99d-fae5-4f7e-b4bc-e90e389ced61),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15d18429-a3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.416 2 DEBUG nova.virt.libvirt.vif [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:49:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1303425693',display_name='tempest-TestGettingAddress-server-1303425693',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1303425693',id=167,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCGG0QbUCZfF/aMNgREbUjMb15nHLUKy3cMsM6riMQ1TEklisidkwt9SMk4vpWErg6fFTJOsgBrfd7Y56loMtMOFXQViH6JvZSbYiUd68BEKiwfiEj6LoMHEURYI7Qr6GQ==',key_name='tempest-TestGettingAddress-1486288610',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-n1t0ecif',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:49:30Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=2f0fbc4b-01ff-4422-bed8-aa7ca8934f51,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "15c82795-bc8d-4e4d-9949-ded082705cd7", "address": "fa:16:3e:31:1e:bf", "network": {"id": "d1ec18dd-20d4-4643-8e73-7d404d8b8493", "bridge": "br-int", "label": "tempest-network-smoke--678224572", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe31:1ebf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe31:1ebf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15c82795-bc", "ovs_interfaceid": "15c82795-bc8d-4e4d-9949-ded082705cd7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.417 2 DEBUG nova.network.os_vif_util [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "15c82795-bc8d-4e4d-9949-ded082705cd7", "address": "fa:16:3e:31:1e:bf", "network": {"id": "d1ec18dd-20d4-4643-8e73-7d404d8b8493", "bridge": "br-int", "label": "tempest-network-smoke--678224572", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe31:1ebf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe31:1ebf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15c82795-bc", "ovs_interfaceid": "15c82795-bc8d-4e4d-9949-ded082705cd7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.417 2 DEBUG nova.network.os_vif_util [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:31:1e:bf,bridge_name='br-int',has_traffic_filtering=True,id=15c82795-bc8d-4e4d-9949-ded082705cd7,network=Network(d1ec18dd-20d4-4643-8e73-7d404d8b8493),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15c82795-bc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.418 2 DEBUG nova.objects.instance [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lazy-loading 'pci_devices' on Instance uuid 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.433 2 DEBUG nova.virt.libvirt.driver [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:49:38 compute-0 nova_compute[192810]:   <uuid>2f0fbc4b-01ff-4422-bed8-aa7ca8934f51</uuid>
Sep 30 21:49:38 compute-0 nova_compute[192810]:   <name>instance-000000a7</name>
Sep 30 21:49:38 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:49:38 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:49:38 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:49:38 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:       <nova:name>tempest-TestGettingAddress-server-1303425693</nova:name>
Sep 30 21:49:38 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:49:38</nova:creationTime>
Sep 30 21:49:38 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:49:38 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:49:38 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:49:38 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:49:38 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:49:38 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:49:38 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:49:38 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:49:38 compute-0 nova_compute[192810]:         <nova:user uuid="5ffd1d7824fe413499994bd48b9f820f">tempest-TestGettingAddress-2056138166-project-member</nova:user>
Sep 30 21:49:38 compute-0 nova_compute[192810]:         <nova:project uuid="71b1e8c3c45e4ff8bc99e66bd1bfef7c">tempest-TestGettingAddress-2056138166</nova:project>
Sep 30 21:49:38 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:49:38 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:49:38 compute-0 nova_compute[192810]:         <nova:port uuid="15d18429-a32d-4645-bc6f-dcdc238c5de9">
Sep 30 21:49:38 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:49:38 compute-0 nova_compute[192810]:         <nova:port uuid="15c82795-bc8d-4e4d-9949-ded082705cd7">
Sep 30 21:49:38 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fe31:1ebf" ipVersion="6"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe31:1ebf" ipVersion="6"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:49:38 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:49:38 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:49:38 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:49:38 compute-0 nova_compute[192810]:     <system>
Sep 30 21:49:38 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:49:38 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:49:38 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:49:38 compute-0 nova_compute[192810]:       <entry name="serial">2f0fbc4b-01ff-4422-bed8-aa7ca8934f51</entry>
Sep 30 21:49:38 compute-0 nova_compute[192810]:       <entry name="uuid">2f0fbc4b-01ff-4422-bed8-aa7ca8934f51</entry>
Sep 30 21:49:38 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     </system>
Sep 30 21:49:38 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:49:38 compute-0 nova_compute[192810]:   <os>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:   </os>
Sep 30 21:49:38 compute-0 nova_compute[192810]:   <features>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:   </features>
Sep 30 21:49:38 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:49:38 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:49:38 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:49:38 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:49:38 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:49:38 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/2f0fbc4b-01ff-4422-bed8-aa7ca8934f51/disk"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:49:38 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/2f0fbc4b-01ff-4422-bed8-aa7ca8934f51/disk.config"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:49:38 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:23:0a:27"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:       <target dev="tap15d18429-a3"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:49:38 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:31:1e:bf"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:       <target dev="tap15c82795-bc"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:49:38 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/2f0fbc4b-01ff-4422-bed8-aa7ca8934f51/console.log" append="off"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     <video>
Sep 30 21:49:38 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     </video>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:49:38 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:49:38 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:49:38 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:49:38 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:49:38 compute-0 nova_compute[192810]: </domain>
Sep 30 21:49:38 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.434 2 DEBUG nova.compute.manager [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Preparing to wait for external event network-vif-plugged-15d18429-a32d-4645-bc6f-dcdc238c5de9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.434 2 DEBUG oslo_concurrency.lockutils [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "2f0fbc4b-01ff-4422-bed8-aa7ca8934f51-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.435 2 DEBUG oslo_concurrency.lockutils [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "2f0fbc4b-01ff-4422-bed8-aa7ca8934f51-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.435 2 DEBUG oslo_concurrency.lockutils [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "2f0fbc4b-01ff-4422-bed8-aa7ca8934f51-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.435 2 DEBUG nova.compute.manager [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Preparing to wait for external event network-vif-plugged-15c82795-bc8d-4e4d-9949-ded082705cd7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.435 2 DEBUG oslo_concurrency.lockutils [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "2f0fbc4b-01ff-4422-bed8-aa7ca8934f51-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.435 2 DEBUG oslo_concurrency.lockutils [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "2f0fbc4b-01ff-4422-bed8-aa7ca8934f51-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.436 2 DEBUG oslo_concurrency.lockutils [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "2f0fbc4b-01ff-4422-bed8-aa7ca8934f51-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.436 2 DEBUG nova.virt.libvirt.vif [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:49:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1303425693',display_name='tempest-TestGettingAddress-server-1303425693',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1303425693',id=167,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCGG0QbUCZfF/aMNgREbUjMb15nHLUKy3cMsM6riMQ1TEklisidkwt9SMk4vpWErg6fFTJOsgBrfd7Y56loMtMOFXQViH6JvZSbYiUd68BEKiwfiEj6LoMHEURYI7Qr6GQ==',key_name='tempest-TestGettingAddress-1486288610',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-n1t0ecif',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:49:30Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=2f0fbc4b-01ff-4422-bed8-aa7ca8934f51,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "15d18429-a32d-4645-bc6f-dcdc238c5de9", "address": "fa:16:3e:23:0a:27", "network": {"id": "40cfa99d-fae5-4f7e-b4bc-e90e389ced61", "bridge": "br-int", "label": "tempest-network-smoke--804531616", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15d18429-a3", "ovs_interfaceid": "15d18429-a32d-4645-bc6f-dcdc238c5de9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.437 2 DEBUG nova.network.os_vif_util [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "15d18429-a32d-4645-bc6f-dcdc238c5de9", "address": "fa:16:3e:23:0a:27", "network": {"id": "40cfa99d-fae5-4f7e-b4bc-e90e389ced61", "bridge": "br-int", "label": "tempest-network-smoke--804531616", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15d18429-a3", "ovs_interfaceid": "15d18429-a32d-4645-bc6f-dcdc238c5de9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.437 2 DEBUG nova.network.os_vif_util [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:23:0a:27,bridge_name='br-int',has_traffic_filtering=True,id=15d18429-a32d-4645-bc6f-dcdc238c5de9,network=Network(40cfa99d-fae5-4f7e-b4bc-e90e389ced61),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15d18429-a3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.438 2 DEBUG os_vif [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:23:0a:27,bridge_name='br-int',has_traffic_filtering=True,id=15d18429-a32d-4645-bc6f-dcdc238c5de9,network=Network(40cfa99d-fae5-4f7e-b4bc-e90e389ced61),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15d18429-a3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.439 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.439 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.441 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap15d18429-a3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.442 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap15d18429-a3, col_values=(('external_ids', {'iface-id': '15d18429-a32d-4645-bc6f-dcdc238c5de9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:23:0a:27', 'vm-uuid': '2f0fbc4b-01ff-4422-bed8-aa7ca8934f51'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.443 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:38 compute-0 NetworkManager[51733]: <info>  [1759268978.4444] manager: (tap15d18429-a3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/284)
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.450 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.451 2 INFO os_vif [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:23:0a:27,bridge_name='br-int',has_traffic_filtering=True,id=15d18429-a32d-4645-bc6f-dcdc238c5de9,network=Network(40cfa99d-fae5-4f7e-b4bc-e90e389ced61),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15d18429-a3')
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.452 2 DEBUG nova.virt.libvirt.vif [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:49:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1303425693',display_name='tempest-TestGettingAddress-server-1303425693',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1303425693',id=167,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCGG0QbUCZfF/aMNgREbUjMb15nHLUKy3cMsM6riMQ1TEklisidkwt9SMk4vpWErg6fFTJOsgBrfd7Y56loMtMOFXQViH6JvZSbYiUd68BEKiwfiEj6LoMHEURYI7Qr6GQ==',key_name='tempest-TestGettingAddress-1486288610',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-n1t0ecif',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:49:30Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=2f0fbc4b-01ff-4422-bed8-aa7ca8934f51,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "15c82795-bc8d-4e4d-9949-ded082705cd7", "address": "fa:16:3e:31:1e:bf", "network": {"id": "d1ec18dd-20d4-4643-8e73-7d404d8b8493", "bridge": "br-int", "label": "tempest-network-smoke--678224572", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe31:1ebf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe31:1ebf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15c82795-bc", "ovs_interfaceid": "15c82795-bc8d-4e4d-9949-ded082705cd7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.452 2 DEBUG nova.network.os_vif_util [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "15c82795-bc8d-4e4d-9949-ded082705cd7", "address": "fa:16:3e:31:1e:bf", "network": {"id": "d1ec18dd-20d4-4643-8e73-7d404d8b8493", "bridge": "br-int", "label": "tempest-network-smoke--678224572", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe31:1ebf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe31:1ebf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15c82795-bc", "ovs_interfaceid": "15c82795-bc8d-4e4d-9949-ded082705cd7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.453 2 DEBUG nova.network.os_vif_util [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:31:1e:bf,bridge_name='br-int',has_traffic_filtering=True,id=15c82795-bc8d-4e4d-9949-ded082705cd7,network=Network(d1ec18dd-20d4-4643-8e73-7d404d8b8493),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15c82795-bc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.454 2 DEBUG os_vif [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:31:1e:bf,bridge_name='br-int',has_traffic_filtering=True,id=15c82795-bc8d-4e4d-9949-ded082705cd7,network=Network(d1ec18dd-20d4-4643-8e73-7d404d8b8493),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15c82795-bc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.455 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.456 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.459 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap15c82795-bc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.460 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap15c82795-bc, col_values=(('external_ids', {'iface-id': '15c82795-bc8d-4e4d-9949-ded082705cd7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:31:1e:bf', 'vm-uuid': '2f0fbc4b-01ff-4422-bed8-aa7ca8934f51'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:38 compute-0 NetworkManager[51733]: <info>  [1759268978.4627] manager: (tap15c82795-bc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/285)
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.464 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.468 2 INFO os_vif [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:31:1e:bf,bridge_name='br-int',has_traffic_filtering=True,id=15c82795-bc8d-4e4d-9949-ded082705cd7,network=Network(d1ec18dd-20d4-4643-8e73-7d404d8b8493),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15c82795-bc')
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.516 2 DEBUG nova.virt.libvirt.driver [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.517 2 DEBUG nova.virt.libvirt.driver [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.517 2 DEBUG nova.virt.libvirt.driver [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] No VIF found with MAC fa:16:3e:23:0a:27, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.517 2 DEBUG nova.virt.libvirt.driver [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] No VIF found with MAC fa:16:3e:31:1e:bf, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:49:38 compute-0 nova_compute[192810]: 2025-09-30 21:49:38.518 2 INFO nova.virt.libvirt.driver [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Using config drive
Sep 30 21:49:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:38.756 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:49:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:38.757 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:49:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:38.757 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:49:40 compute-0 nova_compute[192810]: 2025-09-30 21:49:40.807 2 INFO nova.virt.libvirt.driver [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Creating config drive at /var/lib/nova/instances/2f0fbc4b-01ff-4422-bed8-aa7ca8934f51/disk.config
Sep 30 21:49:40 compute-0 nova_compute[192810]: 2025-09-30 21:49:40.813 2 DEBUG oslo_concurrency.processutils [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2f0fbc4b-01ff-4422-bed8-aa7ca8934f51/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpewhg418x execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:49:40 compute-0 nova_compute[192810]: 2025-09-30 21:49:40.938 2 DEBUG oslo_concurrency.processutils [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2f0fbc4b-01ff-4422-bed8-aa7ca8934f51/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpewhg418x" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:49:40 compute-0 kernel: tap15d18429-a3: entered promiscuous mode
Sep 30 21:49:40 compute-0 NetworkManager[51733]: <info>  [1759268980.9969] manager: (tap15d18429-a3): new Tun device (/org/freedesktop/NetworkManager/Devices/286)
Sep 30 21:49:41 compute-0 ovn_controller[94912]: 2025-09-30T21:49:40Z|00630|binding|INFO|Claiming lport 15d18429-a32d-4645-bc6f-dcdc238c5de9 for this chassis.
Sep 30 21:49:41 compute-0 ovn_controller[94912]: 2025-09-30T21:49:40Z|00631|binding|INFO|15d18429-a32d-4645-bc6f-dcdc238c5de9: Claiming fa:16:3e:23:0a:27 10.100.0.9
Sep 30 21:49:41 compute-0 nova_compute[192810]: 2025-09-30 21:49:41.001 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:41.017 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:23:0a:27 10.100.0.9'], port_security=['fa:16:3e:23:0a:27 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '2f0fbc4b-01ff-4422-bed8-aa7ca8934f51', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-40cfa99d-fae5-4f7e-b4bc-e90e389ced61', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '272ca623-2e10-4dc9-b9ff-e2fd55c61f6d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dfb1184d-a559-4543-ae06-e2b48bcfb2c3, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=15d18429-a32d-4645-bc6f-dcdc238c5de9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:41.019 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 15d18429-a32d-4645-bc6f-dcdc238c5de9 in datapath 40cfa99d-fae5-4f7e-b4bc-e90e389ced61 bound to our chassis
Sep 30 21:49:41 compute-0 ovn_controller[94912]: 2025-09-30T21:49:41Z|00632|binding|INFO|Setting lport 15d18429-a32d-4645-bc6f-dcdc238c5de9 ovn-installed in OVS
Sep 30 21:49:41 compute-0 ovn_controller[94912]: 2025-09-30T21:49:41Z|00633|binding|INFO|Setting lport 15d18429-a32d-4645-bc6f-dcdc238c5de9 up in Southbound
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:41.020 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 40cfa99d-fae5-4f7e-b4bc-e90e389ced61
Sep 30 21:49:41 compute-0 nova_compute[192810]: 2025-09-30 21:49:41.021 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:41 compute-0 kernel: tap15c82795-bc: entered promiscuous mode
Sep 30 21:49:41 compute-0 NetworkManager[51733]: <info>  [1759268981.0276] manager: (tap15c82795-bc): new Tun device (/org/freedesktop/NetworkManager/Devices/287)
Sep 30 21:49:41 compute-0 systemd-udevd[246526]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:49:41 compute-0 systemd-udevd[246527]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:41.033 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[78fdfd89-87e7-4620-aebe-0c50eff25f1a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:41.034 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap40cfa99d-f1 in ovnmeta-40cfa99d-fae5-4f7e-b4bc-e90e389ced61 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:49:41 compute-0 ovn_controller[94912]: 2025-09-30T21:49:41Z|00634|binding|INFO|Claiming lport 15c82795-bc8d-4e4d-9949-ded082705cd7 for this chassis.
Sep 30 21:49:41 compute-0 ovn_controller[94912]: 2025-09-30T21:49:41Z|00635|binding|INFO|15c82795-bc8d-4e4d-9949-ded082705cd7: Claiming fa:16:3e:31:1e:bf 2001:db8:0:1:f816:3eff:fe31:1ebf 2001:db8::f816:3eff:fe31:1ebf
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:41.035 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap40cfa99d-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:41.035 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[1fe1e573-a622-4548-8644-128c10c2b508]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:41 compute-0 nova_compute[192810]: 2025-09-30 21:49:41.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:41 compute-0 nova_compute[192810]: 2025-09-30 21:49:41.036 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:41 compute-0 NetworkManager[51733]: <info>  [1759268981.0401] device (tap15c82795-bc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:49:41 compute-0 NetworkManager[51733]: <info>  [1759268981.0412] device (tap15c82795-bc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:41.042 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:31:1e:bf 2001:db8:0:1:f816:3eff:fe31:1ebf 2001:db8::f816:3eff:fe31:1ebf'], port_security=['fa:16:3e:31:1e:bf 2001:db8:0:1:f816:3eff:fe31:1ebf 2001:db8::f816:3eff:fe31:1ebf'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe31:1ebf/64 2001:db8::f816:3eff:fe31:1ebf/64', 'neutron:device_id': '2f0fbc4b-01ff-4422-bed8-aa7ca8934f51', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d1ec18dd-20d4-4643-8e73-7d404d8b8493', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '272ca623-2e10-4dc9-b9ff-e2fd55c61f6d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5d03d334-4f0d-47df-a94a-1c647c7026cb, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=15c82795-bc8d-4e4d-9949-ded082705cd7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:41.045 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[55c590e6-62f3-4b9c-9ef3-7dceb75aa8e0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:41 compute-0 ovn_controller[94912]: 2025-09-30T21:49:41Z|00636|binding|INFO|Setting lport 15c82795-bc8d-4e4d-9949-ded082705cd7 ovn-installed in OVS
Sep 30 21:49:41 compute-0 ovn_controller[94912]: 2025-09-30T21:49:41Z|00637|binding|INFO|Setting lport 15c82795-bc8d-4e4d-9949-ded082705cd7 up in Southbound
Sep 30 21:49:41 compute-0 NetworkManager[51733]: <info>  [1759268981.0514] device (tap15d18429-a3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:49:41 compute-0 NetworkManager[51733]: <info>  [1759268981.0520] device (tap15d18429-a3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:49:41 compute-0 nova_compute[192810]: 2025-09-30 21:49:41.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:41 compute-0 nova_compute[192810]: 2025-09-30 21:49:41.058 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:41.056 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[817c4f68-a783-4b85-8175-52898c1de162]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:41 compute-0 systemd-machined[152794]: New machine qemu-80-instance-000000a7.
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:41.081 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[8d3acd2e-425f-4fb3-ba5a-0e22106e0701]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:41 compute-0 systemd[1]: Started Virtual Machine qemu-80-instance-000000a7.
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:41.112 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[7a4bc9b8-d048-435e-a596-3bba3f7abf65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:41.117 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[392e4dcf-ca44-417c-a0aa-0cfde5a11281]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:41 compute-0 NetworkManager[51733]: <info>  [1759268981.1203] manager: (tap40cfa99d-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/288)
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:41.151 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[cf53ca58-147f-41fb-9882-36ba6cc64762]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:41.154 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[b9681b0d-7527-4115-885e-8fe23eb971c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:41 compute-0 NetworkManager[51733]: <info>  [1759268981.1767] device (tap40cfa99d-f0): carrier: link connected
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:41.183 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[4905963f-21b2-4f74-8ba6-e64b8ba6547a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:41.201 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[ddc67252-a2ba-4d3d-83ed-37e95ccfc034]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap40cfa99d-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:50:29:b2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 193], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 567680, 'reachable_time': 40118, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246564, 'error': None, 'target': 'ovnmeta-40cfa99d-fae5-4f7e-b4bc-e90e389ced61', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:41.224 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[35f28ac5-e10e-4cc9-ba22-1b854a6a9b75]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe50:29b2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 567680, 'tstamp': 567680}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 246565, 'error': None, 'target': 'ovnmeta-40cfa99d-fae5-4f7e-b4bc-e90e389ced61', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:41.244 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[4bd8a00e-e9e6-4189-bf8d-3093531e9405]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap40cfa99d-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:50:29:b2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 193], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 567680, 'reachable_time': 40118, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 246566, 'error': None, 'target': 'ovnmeta-40cfa99d-fae5-4f7e-b4bc-e90e389ced61', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:41.293 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[0349ebea-6671-439d-867d-5f31aa50fc4f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:41.359 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e2573850-98d9-4ca9-b09f-27436edb0248]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:41.361 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap40cfa99d-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:41.361 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:41.361 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap40cfa99d-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:49:41 compute-0 NetworkManager[51733]: <info>  [1759268981.3750] manager: (tap40cfa99d-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/289)
Sep 30 21:49:41 compute-0 nova_compute[192810]: 2025-09-30 21:49:41.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:41 compute-0 kernel: tap40cfa99d-f0: entered promiscuous mode
Sep 30 21:49:41 compute-0 nova_compute[192810]: 2025-09-30 21:49:41.377 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:41.378 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap40cfa99d-f0, col_values=(('external_ids', {'iface-id': 'db554134-d733-46a3-ad79-d127cd6e8575'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:49:41 compute-0 nova_compute[192810]: 2025-09-30 21:49:41.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:41 compute-0 ovn_controller[94912]: 2025-09-30T21:49:41Z|00638|binding|INFO|Releasing lport db554134-d733-46a3-ad79-d127cd6e8575 from this chassis (sb_readonly=0)
Sep 30 21:49:41 compute-0 nova_compute[192810]: 2025-09-30 21:49:41.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:41.398 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/40cfa99d-fae5-4f7e-b4bc-e90e389ced61.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/40cfa99d-fae5-4f7e-b4bc-e90e389ced61.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:41.399 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[8e16a36a-2f66-4397-adda-b5eb98675835]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:41.400 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-40cfa99d-fae5-4f7e-b4bc-e90e389ced61
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/40cfa99d-fae5-4f7e-b4bc-e90e389ced61.pid.haproxy
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID 40cfa99d-fae5-4f7e-b4bc-e90e389ced61
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:41.400 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-40cfa99d-fae5-4f7e-b4bc-e90e389ced61', 'env', 'PROCESS_TAG=haproxy-40cfa99d-fae5-4f7e-b4bc-e90e389ced61', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/40cfa99d-fae5-4f7e-b4bc-e90e389ced61.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:49:41 compute-0 nova_compute[192810]: 2025-09-30 21:49:41.753 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268981.753103, 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:49:41 compute-0 nova_compute[192810]: 2025-09-30 21:49:41.754 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] VM Started (Lifecycle Event)
Sep 30 21:49:41 compute-0 podman[246606]: 2025-09-30 21:49:41.778530442 +0000 UTC m=+0.066528063 container create dce7e999241520a5226f1799e3e9c077192a6f45f700150b7741554f2ab3e81a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-40cfa99d-fae5-4f7e-b4bc-e90e389ced61, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 21:49:41 compute-0 nova_compute[192810]: 2025-09-30 21:49:41.781 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:49:41 compute-0 nova_compute[192810]: 2025-09-30 21:49:41.785 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268981.7533543, 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:49:41 compute-0 nova_compute[192810]: 2025-09-30 21:49:41.786 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] VM Paused (Lifecycle Event)
Sep 30 21:49:41 compute-0 nova_compute[192810]: 2025-09-30 21:49:41.805 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:49:41 compute-0 nova_compute[192810]: 2025-09-30 21:49:41.809 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:49:41 compute-0 systemd[1]: Started libpod-conmon-dce7e999241520a5226f1799e3e9c077192a6f45f700150b7741554f2ab3e81a.scope.
Sep 30 21:49:41 compute-0 nova_compute[192810]: 2025-09-30 21:49:41.828 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:49:41 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:49:41 compute-0 podman[246606]: 2025-09-30 21:49:41.739129894 +0000 UTC m=+0.027127535 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:49:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/29b24882af127fa18feb73554336878a681f9f54b0acfc3899285f57a6236ecd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:49:41 compute-0 podman[246606]: 2025-09-30 21:49:41.852469248 +0000 UTC m=+0.140466889 container init dce7e999241520a5226f1799e3e9c077192a6f45f700150b7741554f2ab3e81a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-40cfa99d-fae5-4f7e-b4bc-e90e389ced61, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Sep 30 21:49:41 compute-0 podman[246606]: 2025-09-30 21:49:41.857917273 +0000 UTC m=+0.145914894 container start dce7e999241520a5226f1799e3e9c077192a6f45f700150b7741554f2ab3e81a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-40cfa99d-fae5-4f7e-b4bc-e90e389ced61, tcib_managed=true, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 21:49:41 compute-0 neutron-haproxy-ovnmeta-40cfa99d-fae5-4f7e-b4bc-e90e389ced61[246622]: [NOTICE]   (246626) : New worker (246628) forked
Sep 30 21:49:41 compute-0 neutron-haproxy-ovnmeta-40cfa99d-fae5-4f7e-b4bc-e90e389ced61[246622]: [NOTICE]   (246626) : Loading success.
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:41.904 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 15c82795-bc8d-4e4d-9949-ded082705cd7 in datapath d1ec18dd-20d4-4643-8e73-7d404d8b8493 unbound from our chassis
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:41.906 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d1ec18dd-20d4-4643-8e73-7d404d8b8493
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:41.917 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e23b90c5-9f2f-456e-8118-abae7f0d4c9e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:41.918 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd1ec18dd-21 in ovnmeta-d1ec18dd-20d4-4643-8e73-7d404d8b8493 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:41.919 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd1ec18dd-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:41.920 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a6b1ede5-da40-464d-8588-4e1d62ccb634]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:41.920 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[cce1fa29-23b8-4853-8350-68cf9968af8d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:41.930 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[4cccd7f4-b730-4c96-b530-af92fd34efce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:41.951 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a3a424ca-a47b-42ad-9254-2cc4aaf45577]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:41.977 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[97f084ac-fa28-4453-837e-0b63d523a1dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:41 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:41.987 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[9c899fc7-ffbd-47fd-b0e0-cb8acec253de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:41 compute-0 NetworkManager[51733]: <info>  [1759268981.9886] manager: (tapd1ec18dd-20): new Veth device (/org/freedesktop/NetworkManager/Devices/290)
Sep 30 21:49:41 compute-0 systemd-udevd[246541]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:49:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:42.016 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[c53e801d-e507-47f6-92d1-fa1778de7f62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:42.019 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[f47b71b3-bc84-4656-99cc-bb8245593d89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:42 compute-0 NetworkManager[51733]: <info>  [1759268982.0415] device (tapd1ec18dd-20): carrier: link connected
Sep 30 21:49:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:42.046 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[69b47b95-1ead-464f-9f07-38b77ae55c19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:42.063 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[f21cb4b5-9990-4158-bcab-6d563b20a6bf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd1ec18dd-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2c:a9:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 194], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 567766, 'reachable_time': 20932, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246649, 'error': None, 'target': 'ovnmeta-d1ec18dd-20d4-4643-8e73-7d404d8b8493', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:42.078 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[bd35876a-45d2-4be8-8abf-ac9c5976a37d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2c:a951'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 567766, 'tstamp': 567766}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 246650, 'error': None, 'target': 'ovnmeta-d1ec18dd-20d4-4643-8e73-7d404d8b8493', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:42 compute-0 nova_compute[192810]: 2025-09-30 21:49:42.080 2 DEBUG nova.network.neutron [req-4b981dc7-6a58-4466-976a-9f85dbf0d030 req-a4c8608c-9468-4953-bbdb-b0a3317ee292 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Updated VIF entry in instance network info cache for port 15c82795-bc8d-4e4d-9949-ded082705cd7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:49:42 compute-0 nova_compute[192810]: 2025-09-30 21:49:42.081 2 DEBUG nova.network.neutron [req-4b981dc7-6a58-4466-976a-9f85dbf0d030 req-a4c8608c-9468-4953-bbdb-b0a3317ee292 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Updating instance_info_cache with network_info: [{"id": "15d18429-a32d-4645-bc6f-dcdc238c5de9", "address": "fa:16:3e:23:0a:27", "network": {"id": "40cfa99d-fae5-4f7e-b4bc-e90e389ced61", "bridge": "br-int", "label": "tempest-network-smoke--804531616", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15d18429-a3", "ovs_interfaceid": "15d18429-a32d-4645-bc6f-dcdc238c5de9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "15c82795-bc8d-4e4d-9949-ded082705cd7", "address": "fa:16:3e:31:1e:bf", "network": {"id": "d1ec18dd-20d4-4643-8e73-7d404d8b8493", "bridge": "br-int", "label": "tempest-network-smoke--678224572", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe31:1ebf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": 
{"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe31:1ebf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15c82795-bc", "ovs_interfaceid": "15c82795-bc8d-4e4d-9949-ded082705cd7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:49:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:42.093 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[9ddc3293-d5d0-4721-ab99-92a2f8007601]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd1ec18dd-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2c:a9:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 194], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 567766, 'reachable_time': 20932, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 192, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 192, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 246651, 'error': None, 'target': 'ovnmeta-d1ec18dd-20d4-4643-8e73-7d404d8b8493', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:42 compute-0 nova_compute[192810]: 2025-09-30 21:49:42.102 2 DEBUG oslo_concurrency.lockutils [req-4b981dc7-6a58-4466-976a-9f85dbf0d030 req-a4c8608c-9468-4953-bbdb-b0a3317ee292 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-2f0fbc4b-01ff-4422-bed8-aa7ca8934f51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:49:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:42.127 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[321a14e5-c4b5-4e8d-950c-db34dda339d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:42.169 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[d70694d4-c196-4bdf-a119-bc5fb6bf6c04]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:42.170 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd1ec18dd-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:49:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:42.171 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:49:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:42.171 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd1ec18dd-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:49:42 compute-0 NetworkManager[51733]: <info>  [1759268982.1736] manager: (tapd1ec18dd-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/291)
Sep 30 21:49:42 compute-0 kernel: tapd1ec18dd-20: entered promiscuous mode
Sep 30 21:49:42 compute-0 nova_compute[192810]: 2025-09-30 21:49:42.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:42.177 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd1ec18dd-20, col_values=(('external_ids', {'iface-id': '4d7fdd9a-d25a-4be0-8653-dd976ce2c1d5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:49:42 compute-0 ovn_controller[94912]: 2025-09-30T21:49:42Z|00639|binding|INFO|Releasing lport 4d7fdd9a-d25a-4be0-8653-dd976ce2c1d5 from this chassis (sb_readonly=0)
Sep 30 21:49:42 compute-0 nova_compute[192810]: 2025-09-30 21:49:42.178 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:42.181 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d1ec18dd-20d4-4643-8e73-7d404d8b8493.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d1ec18dd-20d4-4643-8e73-7d404d8b8493.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:49:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:42.182 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e44c67b2-a7a3-47aa-8f34-01b12d86753d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:42.183 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:49:42 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:49:42 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:49:42 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-d1ec18dd-20d4-4643-8e73-7d404d8b8493
Sep 30 21:49:42 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:49:42 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:49:42 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:49:42 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/d1ec18dd-20d4-4643-8e73-7d404d8b8493.pid.haproxy
Sep 30 21:49:42 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:49:42 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:49:42 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:49:42 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:49:42 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:49:42 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:49:42 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:49:42 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:49:42 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:49:42 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:49:42 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:49:42 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:49:42 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:49:42 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:49:42 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:49:42 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:49:42 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:49:42 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:49:42 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:49:42 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:49:42 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID d1ec18dd-20d4-4643-8e73-7d404d8b8493
Sep 30 21:49:42 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:49:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:42.184 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d1ec18dd-20d4-4643-8e73-7d404d8b8493', 'env', 'PROCESS_TAG=haproxy-d1ec18dd-20d4-4643-8e73-7d404d8b8493', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d1ec18dd-20d4-4643-8e73-7d404d8b8493.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:49:42 compute-0 nova_compute[192810]: 2025-09-30 21:49:42.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:42 compute-0 podman[246680]: 2025-09-30 21:49:42.558840276 +0000 UTC m=+0.051171562 container create 7818b4e72b6eb562b8ee12d911fa8397099d737f8fa5c55ebe8b289e5b588e9c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d1ec18dd-20d4-4643-8e73-7d404d8b8493, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Sep 30 21:49:42 compute-0 systemd[1]: Started libpod-conmon-7818b4e72b6eb562b8ee12d911fa8397099d737f8fa5c55ebe8b289e5b588e9c.scope.
Sep 30 21:49:42 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:49:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1907327462c98f935560d97b63d7884bacd7bcb1b80fb9df4d38dca63178b823/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:49:42 compute-0 podman[246680]: 2025-09-30 21:49:42.527439306 +0000 UTC m=+0.019770612 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:49:42 compute-0 podman[246680]: 2025-09-30 21:49:42.632958786 +0000 UTC m=+0.125290092 container init 7818b4e72b6eb562b8ee12d911fa8397099d737f8fa5c55ebe8b289e5b588e9c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d1ec18dd-20d4-4643-8e73-7d404d8b8493, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923)
Sep 30 21:49:42 compute-0 podman[246680]: 2025-09-30 21:49:42.639608501 +0000 UTC m=+0.131939787 container start 7818b4e72b6eb562b8ee12d911fa8397099d737f8fa5c55ebe8b289e5b588e9c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d1ec18dd-20d4-4643-8e73-7d404d8b8493, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:49:42 compute-0 neutron-haproxy-ovnmeta-d1ec18dd-20d4-4643-8e73-7d404d8b8493[246695]: [NOTICE]   (246699) : New worker (246701) forked
Sep 30 21:49:42 compute-0 neutron-haproxy-ovnmeta-d1ec18dd-20d4-4643-8e73-7d404d8b8493[246695]: [NOTICE]   (246699) : Loading success.
Sep 30 21:49:43 compute-0 nova_compute[192810]: 2025-09-30 21:49:43.074 2 INFO nova.compute.manager [None req-7fa76c24-451b-4b92-9269-ca9334bdd807 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Get console output
Sep 30 21:49:43 compute-0 nova_compute[192810]: 2025-09-30 21:49:43.081 54 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Sep 30 21:49:43 compute-0 nova_compute[192810]: 2025-09-30 21:49:43.281 2 DEBUG nova.compute.manager [req-48488d19-a8ab-409a-adb2-763d55bcc2f5 req-0fff7cfd-c3a6-4d44-9c9b-c839dd95eedf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Received event network-vif-plugged-15d18429-a32d-4645-bc6f-dcdc238c5de9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:49:43 compute-0 nova_compute[192810]: 2025-09-30 21:49:43.281 2 DEBUG oslo_concurrency.lockutils [req-48488d19-a8ab-409a-adb2-763d55bcc2f5 req-0fff7cfd-c3a6-4d44-9c9b-c839dd95eedf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "2f0fbc4b-01ff-4422-bed8-aa7ca8934f51-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:49:43 compute-0 nova_compute[192810]: 2025-09-30 21:49:43.282 2 DEBUG oslo_concurrency.lockutils [req-48488d19-a8ab-409a-adb2-763d55bcc2f5 req-0fff7cfd-c3a6-4d44-9c9b-c839dd95eedf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2f0fbc4b-01ff-4422-bed8-aa7ca8934f51-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:49:43 compute-0 nova_compute[192810]: 2025-09-30 21:49:43.282 2 DEBUG oslo_concurrency.lockutils [req-48488d19-a8ab-409a-adb2-763d55bcc2f5 req-0fff7cfd-c3a6-4d44-9c9b-c839dd95eedf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2f0fbc4b-01ff-4422-bed8-aa7ca8934f51-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:49:43 compute-0 nova_compute[192810]: 2025-09-30 21:49:43.282 2 DEBUG nova.compute.manager [req-48488d19-a8ab-409a-adb2-763d55bcc2f5 req-0fff7cfd-c3a6-4d44-9c9b-c839dd95eedf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Processing event network-vif-plugged-15d18429-a32d-4645-bc6f-dcdc238c5de9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:49:43 compute-0 nova_compute[192810]: 2025-09-30 21:49:43.283 2 DEBUG nova.compute.manager [req-48488d19-a8ab-409a-adb2-763d55bcc2f5 req-0fff7cfd-c3a6-4d44-9c9b-c839dd95eedf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Received event network-vif-plugged-15d18429-a32d-4645-bc6f-dcdc238c5de9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:49:43 compute-0 nova_compute[192810]: 2025-09-30 21:49:43.283 2 DEBUG oslo_concurrency.lockutils [req-48488d19-a8ab-409a-adb2-763d55bcc2f5 req-0fff7cfd-c3a6-4d44-9c9b-c839dd95eedf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "2f0fbc4b-01ff-4422-bed8-aa7ca8934f51-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:49:43 compute-0 nova_compute[192810]: 2025-09-30 21:49:43.284 2 DEBUG oslo_concurrency.lockutils [req-48488d19-a8ab-409a-adb2-763d55bcc2f5 req-0fff7cfd-c3a6-4d44-9c9b-c839dd95eedf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2f0fbc4b-01ff-4422-bed8-aa7ca8934f51-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:49:43 compute-0 nova_compute[192810]: 2025-09-30 21:49:43.284 2 DEBUG oslo_concurrency.lockutils [req-48488d19-a8ab-409a-adb2-763d55bcc2f5 req-0fff7cfd-c3a6-4d44-9c9b-c839dd95eedf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2f0fbc4b-01ff-4422-bed8-aa7ca8934f51-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:49:43 compute-0 nova_compute[192810]: 2025-09-30 21:49:43.284 2 DEBUG nova.compute.manager [req-48488d19-a8ab-409a-adb2-763d55bcc2f5 req-0fff7cfd-c3a6-4d44-9c9b-c839dd95eedf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] No event matching network-vif-plugged-15d18429-a32d-4645-bc6f-dcdc238c5de9 in dict_keys([('network-vif-plugged', '15c82795-bc8d-4e4d-9949-ded082705cd7')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325
Sep 30 21:49:43 compute-0 nova_compute[192810]: 2025-09-30 21:49:43.285 2 WARNING nova.compute.manager [req-48488d19-a8ab-409a-adb2-763d55bcc2f5 req-0fff7cfd-c3a6-4d44-9c9b-c839dd95eedf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Received unexpected event network-vif-plugged-15d18429-a32d-4645-bc6f-dcdc238c5de9 for instance with vm_state building and task_state spawning.
Sep 30 21:49:43 compute-0 nova_compute[192810]: 2025-09-30 21:49:43.285 2 DEBUG nova.compute.manager [req-48488d19-a8ab-409a-adb2-763d55bcc2f5 req-0fff7cfd-c3a6-4d44-9c9b-c839dd95eedf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Received event network-vif-plugged-15c82795-bc8d-4e4d-9949-ded082705cd7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:49:43 compute-0 nova_compute[192810]: 2025-09-30 21:49:43.285 2 DEBUG oslo_concurrency.lockutils [req-48488d19-a8ab-409a-adb2-763d55bcc2f5 req-0fff7cfd-c3a6-4d44-9c9b-c839dd95eedf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "2f0fbc4b-01ff-4422-bed8-aa7ca8934f51-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:49:43 compute-0 nova_compute[192810]: 2025-09-30 21:49:43.286 2 DEBUG oslo_concurrency.lockutils [req-48488d19-a8ab-409a-adb2-763d55bcc2f5 req-0fff7cfd-c3a6-4d44-9c9b-c839dd95eedf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2f0fbc4b-01ff-4422-bed8-aa7ca8934f51-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:49:43 compute-0 nova_compute[192810]: 2025-09-30 21:49:43.286 2 DEBUG oslo_concurrency.lockutils [req-48488d19-a8ab-409a-adb2-763d55bcc2f5 req-0fff7cfd-c3a6-4d44-9c9b-c839dd95eedf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2f0fbc4b-01ff-4422-bed8-aa7ca8934f51-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:49:43 compute-0 nova_compute[192810]: 2025-09-30 21:49:43.286 2 DEBUG nova.compute.manager [req-48488d19-a8ab-409a-adb2-763d55bcc2f5 req-0fff7cfd-c3a6-4d44-9c9b-c839dd95eedf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Processing event network-vif-plugged-15c82795-bc8d-4e4d-9949-ded082705cd7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:49:43 compute-0 nova_compute[192810]: 2025-09-30 21:49:43.287 2 DEBUG nova.compute.manager [req-48488d19-a8ab-409a-adb2-763d55bcc2f5 req-0fff7cfd-c3a6-4d44-9c9b-c839dd95eedf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Received event network-vif-plugged-15c82795-bc8d-4e4d-9949-ded082705cd7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:49:43 compute-0 nova_compute[192810]: 2025-09-30 21:49:43.287 2 DEBUG oslo_concurrency.lockutils [req-48488d19-a8ab-409a-adb2-763d55bcc2f5 req-0fff7cfd-c3a6-4d44-9c9b-c839dd95eedf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "2f0fbc4b-01ff-4422-bed8-aa7ca8934f51-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:49:43 compute-0 nova_compute[192810]: 2025-09-30 21:49:43.287 2 DEBUG oslo_concurrency.lockutils [req-48488d19-a8ab-409a-adb2-763d55bcc2f5 req-0fff7cfd-c3a6-4d44-9c9b-c839dd95eedf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2f0fbc4b-01ff-4422-bed8-aa7ca8934f51-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:49:43 compute-0 nova_compute[192810]: 2025-09-30 21:49:43.287 2 DEBUG oslo_concurrency.lockutils [req-48488d19-a8ab-409a-adb2-763d55bcc2f5 req-0fff7cfd-c3a6-4d44-9c9b-c839dd95eedf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2f0fbc4b-01ff-4422-bed8-aa7ca8934f51-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:49:43 compute-0 nova_compute[192810]: 2025-09-30 21:49:43.288 2 DEBUG nova.compute.manager [req-48488d19-a8ab-409a-adb2-763d55bcc2f5 req-0fff7cfd-c3a6-4d44-9c9b-c839dd95eedf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] No waiting events found dispatching network-vif-plugged-15c82795-bc8d-4e4d-9949-ded082705cd7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:49:43 compute-0 nova_compute[192810]: 2025-09-30 21:49:43.288 2 WARNING nova.compute.manager [req-48488d19-a8ab-409a-adb2-763d55bcc2f5 req-0fff7cfd-c3a6-4d44-9c9b-c839dd95eedf dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Received unexpected event network-vif-plugged-15c82795-bc8d-4e4d-9949-ded082705cd7 for instance with vm_state building and task_state spawning.
Sep 30 21:49:43 compute-0 nova_compute[192810]: 2025-09-30 21:49:43.289 2 DEBUG nova.compute.manager [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Instance event wait completed in 1 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:49:43 compute-0 nova_compute[192810]: 2025-09-30 21:49:43.293 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759268983.292928, 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:49:43 compute-0 nova_compute[192810]: 2025-09-30 21:49:43.293 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] VM Resumed (Lifecycle Event)
Sep 30 21:49:43 compute-0 nova_compute[192810]: 2025-09-30 21:49:43.295 2 DEBUG nova.virt.libvirt.driver [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:49:43 compute-0 nova_compute[192810]: 2025-09-30 21:49:43.299 2 INFO nova.virt.libvirt.driver [-] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Instance spawned successfully.
Sep 30 21:49:43 compute-0 nova_compute[192810]: 2025-09-30 21:49:43.299 2 DEBUG nova.virt.libvirt.driver [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:49:43 compute-0 nova_compute[192810]: 2025-09-30 21:49:43.315 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:49:43 compute-0 nova_compute[192810]: 2025-09-30 21:49:43.320 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:49:43 compute-0 nova_compute[192810]: 2025-09-30 21:49:43.323 2 DEBUG nova.virt.libvirt.driver [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:49:43 compute-0 nova_compute[192810]: 2025-09-30 21:49:43.323 2 DEBUG nova.virt.libvirt.driver [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:49:43 compute-0 nova_compute[192810]: 2025-09-30 21:49:43.324 2 DEBUG nova.virt.libvirt.driver [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:49:43 compute-0 nova_compute[192810]: 2025-09-30 21:49:43.324 2 DEBUG nova.virt.libvirt.driver [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:49:43 compute-0 nova_compute[192810]: 2025-09-30 21:49:43.324 2 DEBUG nova.virt.libvirt.driver [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:49:43 compute-0 nova_compute[192810]: 2025-09-30 21:49:43.325 2 DEBUG nova.virt.libvirt.driver [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:49:43 compute-0 nova_compute[192810]: 2025-09-30 21:49:43.350 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:49:43 compute-0 nova_compute[192810]: 2025-09-30 21:49:43.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:43 compute-0 nova_compute[192810]: 2025-09-30 21:49:43.394 2 INFO nova.compute.manager [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Took 13.10 seconds to spawn the instance on the hypervisor.
Sep 30 21:49:43 compute-0 nova_compute[192810]: 2025-09-30 21:49:43.395 2 DEBUG nova.compute.manager [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:49:43 compute-0 nova_compute[192810]: 2025-09-30 21:49:43.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:43 compute-0 nova_compute[192810]: 2025-09-30 21:49:43.482 2 INFO nova.compute.manager [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Took 13.76 seconds to build instance.
Sep 30 21:49:43 compute-0 nova_compute[192810]: 2025-09-30 21:49:43.503 2 DEBUG oslo_concurrency.lockutils [None req-a89b378d-6ff3-45af-a915-9f45395e7c99 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "2f0fbc4b-01ff-4422-bed8-aa7ca8934f51" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.870s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:49:44 compute-0 nova_compute[192810]: 2025-09-30 21:49:44.080 2 DEBUG nova.compute.manager [req-e0f5ef77-fba5-4dc4-a2a6-457174779d8e req-a540aac2-3538-4505-a637-46bc9e66a420 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Received event network-changed-2b51d8f4-bdd2-4223-8c9f-2bff82319a8a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:49:44 compute-0 nova_compute[192810]: 2025-09-30 21:49:44.081 2 DEBUG nova.compute.manager [req-e0f5ef77-fba5-4dc4-a2a6-457174779d8e req-a540aac2-3538-4505-a637-46bc9e66a420 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Refreshing instance network info cache due to event network-changed-2b51d8f4-bdd2-4223-8c9f-2bff82319a8a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:49:44 compute-0 nova_compute[192810]: 2025-09-30 21:49:44.081 2 DEBUG oslo_concurrency.lockutils [req-e0f5ef77-fba5-4dc4-a2a6-457174779d8e req-a540aac2-3538-4505-a637-46bc9e66a420 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:49:44 compute-0 nova_compute[192810]: 2025-09-30 21:49:44.081 2 DEBUG oslo_concurrency.lockutils [req-e0f5ef77-fba5-4dc4-a2a6-457174779d8e req-a540aac2-3538-4505-a637-46bc9e66a420 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:49:44 compute-0 nova_compute[192810]: 2025-09-30 21:49:44.081 2 DEBUG nova.network.neutron [req-e0f5ef77-fba5-4dc4-a2a6-457174779d8e req-a540aac2-3538-4505-a637-46bc9e66a420 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Refreshing network info cache for port 2b51d8f4-bdd2-4223-8c9f-2bff82319a8a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:49:44 compute-0 nova_compute[192810]: 2025-09-30 21:49:44.292 2 DEBUG oslo_concurrency.lockutils [None req-f40d3998-42e5-4b52-bb64-617ac3befc04 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:49:44 compute-0 nova_compute[192810]: 2025-09-30 21:49:44.293 2 DEBUG oslo_concurrency.lockutils [None req-f40d3998-42e5-4b52-bb64-617ac3befc04 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:49:44 compute-0 nova_compute[192810]: 2025-09-30 21:49:44.293 2 DEBUG oslo_concurrency.lockutils [None req-f40d3998-42e5-4b52-bb64-617ac3befc04 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:49:44 compute-0 nova_compute[192810]: 2025-09-30 21:49:44.293 2 DEBUG oslo_concurrency.lockutils [None req-f40d3998-42e5-4b52-bb64-617ac3befc04 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:49:44 compute-0 nova_compute[192810]: 2025-09-30 21:49:44.293 2 DEBUG oslo_concurrency.lockutils [None req-f40d3998-42e5-4b52-bb64-617ac3befc04 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:49:44 compute-0 nova_compute[192810]: 2025-09-30 21:49:44.303 2 INFO nova.compute.manager [None req-f40d3998-42e5-4b52-bb64-617ac3befc04 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Terminating instance
Sep 30 21:49:44 compute-0 nova_compute[192810]: 2025-09-30 21:49:44.313 2 DEBUG nova.compute.manager [None req-f40d3998-42e5-4b52-bb64-617ac3befc04 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:49:44 compute-0 kernel: tap2b51d8f4-bd (unregistering): left promiscuous mode
Sep 30 21:49:44 compute-0 NetworkManager[51733]: <info>  [1759268984.3353] device (tap2b51d8f4-bd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:49:44 compute-0 ovn_controller[94912]: 2025-09-30T21:49:44Z|00640|binding|INFO|Releasing lport 2b51d8f4-bdd2-4223-8c9f-2bff82319a8a from this chassis (sb_readonly=0)
Sep 30 21:49:44 compute-0 ovn_controller[94912]: 2025-09-30T21:49:44Z|00641|binding|INFO|Setting lport 2b51d8f4-bdd2-4223-8c9f-2bff82319a8a down in Southbound
Sep 30 21:49:44 compute-0 ovn_controller[94912]: 2025-09-30T21:49:44Z|00642|binding|INFO|Removing iface tap2b51d8f4-bd ovn-installed in OVS
Sep 30 21:49:44 compute-0 nova_compute[192810]: 2025-09-30 21:49:44.344 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:44 compute-0 nova_compute[192810]: 2025-09-30 21:49:44.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:44.357 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8b:fb:b1 10.100.0.13'], port_security=['fa:16:3e:8b:fb:b1 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dd58cd77-6f38-46a4-b4e3-b10538139b43', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'neutron:revision_number': '6', 'neutron:security_group_ids': '877e3925-1208-4d78-a647-d6e9b6515df1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f06fb115-6230-4a0b-83c9-81667590decb, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=2b51d8f4-bdd2-4223-8c9f-2bff82319a8a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:49:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:44.358 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 2b51d8f4-bdd2-4223-8c9f-2bff82319a8a in datapath dd58cd77-6f38-46a4-b4e3-b10538139b43 unbound from our chassis
Sep 30 21:49:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:44.359 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network dd58cd77-6f38-46a4-b4e3-b10538139b43, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:49:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:44.360 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[9bb7dc6a-0d48-4232-982a-dcc6cbe21eea]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:44.360 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-dd58cd77-6f38-46a4-b4e3-b10538139b43 namespace which is not needed anymore
Sep 30 21:49:44 compute-0 nova_compute[192810]: 2025-09-30 21:49:44.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:44 compute-0 systemd[1]: machine-qemu\x2d79\x2dinstance\x2d000000a4.scope: Deactivated successfully.
Sep 30 21:49:44 compute-0 systemd[1]: machine-qemu\x2d79\x2dinstance\x2d000000a4.scope: Consumed 13.879s CPU time.
Sep 30 21:49:44 compute-0 systemd-machined[152794]: Machine qemu-79-instance-000000a4 terminated.
Sep 30 21:49:44 compute-0 neutron-haproxy-ovnmeta-dd58cd77-6f38-46a4-b4e3-b10538139b43[246336]: [NOTICE]   (246340) : haproxy version is 2.8.14-c23fe91
Sep 30 21:49:44 compute-0 neutron-haproxy-ovnmeta-dd58cd77-6f38-46a4-b4e3-b10538139b43[246336]: [NOTICE]   (246340) : path to executable is /usr/sbin/haproxy
Sep 30 21:49:44 compute-0 neutron-haproxy-ovnmeta-dd58cd77-6f38-46a4-b4e3-b10538139b43[246336]: [WARNING]  (246340) : Exiting Master process...
Sep 30 21:49:44 compute-0 neutron-haproxy-ovnmeta-dd58cd77-6f38-46a4-b4e3-b10538139b43[246336]: [ALERT]    (246340) : Current worker (246342) exited with code 143 (Terminated)
Sep 30 21:49:44 compute-0 neutron-haproxy-ovnmeta-dd58cd77-6f38-46a4-b4e3-b10538139b43[246336]: [WARNING]  (246340) : All workers exited. Exiting... (0)
Sep 30 21:49:44 compute-0 systemd[1]: libpod-87dedb8a0298589ff19c51a987dca7741032e2f4c05111eeb79c20d2d5e9ebbe.scope: Deactivated successfully.
Sep 30 21:49:44 compute-0 podman[246734]: 2025-09-30 21:49:44.499822677 +0000 UTC m=+0.045768317 container died 87dedb8a0298589ff19c51a987dca7741032e2f4c05111eeb79c20d2d5e9ebbe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd58cd77-6f38-46a4-b4e3-b10538139b43, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923)
Sep 30 21:49:44 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-87dedb8a0298589ff19c51a987dca7741032e2f4c05111eeb79c20d2d5e9ebbe-userdata-shm.mount: Deactivated successfully.
Sep 30 21:49:44 compute-0 systemd[1]: var-lib-containers-storage-overlay-c8c5957afe77e61bb064daed4b454590bd4f8c4ac6922013efa49666d02fb832-merged.mount: Deactivated successfully.
Sep 30 21:49:44 compute-0 nova_compute[192810]: 2025-09-30 21:49:44.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:44 compute-0 nova_compute[192810]: 2025-09-30 21:49:44.547 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:44 compute-0 podman[246734]: 2025-09-30 21:49:44.558432982 +0000 UTC m=+0.104378602 container cleanup 87dedb8a0298589ff19c51a987dca7741032e2f4c05111eeb79c20d2d5e9ebbe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd58cd77-6f38-46a4-b4e3-b10538139b43, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 21:49:44 compute-0 systemd[1]: libpod-conmon-87dedb8a0298589ff19c51a987dca7741032e2f4c05111eeb79c20d2d5e9ebbe.scope: Deactivated successfully.
Sep 30 21:49:44 compute-0 nova_compute[192810]: 2025-09-30 21:49:44.580 2 INFO nova.virt.libvirt.driver [-] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Instance destroyed successfully.
Sep 30 21:49:44 compute-0 nova_compute[192810]: 2025-09-30 21:49:44.581 2 DEBUG nova.objects.instance [None req-f40d3998-42e5-4b52-bb64-617ac3befc04 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'resources' on Instance uuid a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:49:44 compute-0 nova_compute[192810]: 2025-09-30 21:49:44.595 2 DEBUG nova.virt.libvirt.vif [None req-f40d3998-42e5-4b52-bb64-617ac3befc04 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:48:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1118195670',display_name='tempest-TestNetworkAdvancedServerOps-server-1118195670',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1118195670',id=164,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLDpYp0NCufFhMupTnpTCr4Vcpo04rzkSDzZOneKedO728EI3/G29UkyaAkIHtV3QtRBxeFQEO2d2Hne8odaDxHvM2zlyxvg24YgrhCa5Ls4Q4dyozrdXslYt9KnqD2nxg==',key_name='tempest-TestNetworkAdvancedServerOps-296871907',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:48:50Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='075b1efc4c8e4cb1b28d61b042c451e9',ramdisk_id='',reservation_id='r-0reom9zn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-374190229',owner_user_name='tempest-TestNetworkAdvancedServerOps-374190229-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:49:24Z,user_data=None,user_id='185cc8ad7e1445d2ab5006153ab19700',uuid=a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2b51d8f4-bdd2-4223-8c9f-2bff82319a8a", "address": "fa:16:3e:8b:fb:b1", "network": {"id": "dd58cd77-6f38-46a4-b4e3-b10538139b43", "bridge": "br-int", "label": "tempest-network-smoke--1535188764", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b51d8f4-bd", "ovs_interfaceid": "2b51d8f4-bdd2-4223-8c9f-2bff82319a8a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:49:44 compute-0 nova_compute[192810]: 2025-09-30 21:49:44.595 2 DEBUG nova.network.os_vif_util [None req-f40d3998-42e5-4b52-bb64-617ac3befc04 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converting VIF {"id": "2b51d8f4-bdd2-4223-8c9f-2bff82319a8a", "address": "fa:16:3e:8b:fb:b1", "network": {"id": "dd58cd77-6f38-46a4-b4e3-b10538139b43", "bridge": "br-int", "label": "tempest-network-smoke--1535188764", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b51d8f4-bd", "ovs_interfaceid": "2b51d8f4-bdd2-4223-8c9f-2bff82319a8a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:49:44 compute-0 nova_compute[192810]: 2025-09-30 21:49:44.596 2 DEBUG nova.network.os_vif_util [None req-f40d3998-42e5-4b52-bb64-617ac3befc04 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8b:fb:b1,bridge_name='br-int',has_traffic_filtering=True,id=2b51d8f4-bdd2-4223-8c9f-2bff82319a8a,network=Network(dd58cd77-6f38-46a4-b4e3-b10538139b43),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2b51d8f4-bd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:49:44 compute-0 nova_compute[192810]: 2025-09-30 21:49:44.597 2 DEBUG os_vif [None req-f40d3998-42e5-4b52-bb64-617ac3befc04 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8b:fb:b1,bridge_name='br-int',has_traffic_filtering=True,id=2b51d8f4-bdd2-4223-8c9f-2bff82319a8a,network=Network(dd58cd77-6f38-46a4-b4e3-b10538139b43),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2b51d8f4-bd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:49:44 compute-0 nova_compute[192810]: 2025-09-30 21:49:44.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:44 compute-0 nova_compute[192810]: 2025-09-30 21:49:44.599 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2b51d8f4-bd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:49:44 compute-0 nova_compute[192810]: 2025-09-30 21:49:44.602 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:49:44 compute-0 nova_compute[192810]: 2025-09-30 21:49:44.604 2 INFO os_vif [None req-f40d3998-42e5-4b52-bb64-617ac3befc04 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8b:fb:b1,bridge_name='br-int',has_traffic_filtering=True,id=2b51d8f4-bdd2-4223-8c9f-2bff82319a8a,network=Network(dd58cd77-6f38-46a4-b4e3-b10538139b43),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2b51d8f4-bd')
Sep 30 21:49:44 compute-0 nova_compute[192810]: 2025-09-30 21:49:44.604 2 INFO nova.virt.libvirt.driver [None req-f40d3998-42e5-4b52-bb64-617ac3befc04 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Deleting instance files /var/lib/nova/instances/a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8_del
Sep 30 21:49:44 compute-0 nova_compute[192810]: 2025-09-30 21:49:44.605 2 INFO nova.virt.libvirt.driver [None req-f40d3998-42e5-4b52-bb64-617ac3befc04 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Deletion of /var/lib/nova/instances/a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8_del complete
Sep 30 21:49:44 compute-0 podman[246780]: 2025-09-30 21:49:44.648442877 +0000 UTC m=+0.060757830 container remove 87dedb8a0298589ff19c51a987dca7741032e2f4c05111eeb79c20d2d5e9ebbe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dd58cd77-6f38-46a4-b4e3-b10538139b43, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20250923)
Sep 30 21:49:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:44.652 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[0bcb53d1-9ac4-4a4c-84c5-fd6b6290adae]: (4, ('Tue Sep 30 09:49:44 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-dd58cd77-6f38-46a4-b4e3-b10538139b43 (87dedb8a0298589ff19c51a987dca7741032e2f4c05111eeb79c20d2d5e9ebbe)\n87dedb8a0298589ff19c51a987dca7741032e2f4c05111eeb79c20d2d5e9ebbe\nTue Sep 30 09:49:44 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-dd58cd77-6f38-46a4-b4e3-b10538139b43 (87dedb8a0298589ff19c51a987dca7741032e2f4c05111eeb79c20d2d5e9ebbe)\n87dedb8a0298589ff19c51a987dca7741032e2f4c05111eeb79c20d2d5e9ebbe\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:44.653 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[17d1838b-4127-441b-894f-693d8edadb85]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:44.654 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdd58cd77-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:49:44 compute-0 kernel: tapdd58cd77-60: left promiscuous mode
Sep 30 21:49:44 compute-0 nova_compute[192810]: 2025-09-30 21:49:44.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:44 compute-0 nova_compute[192810]: 2025-09-30 21:49:44.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:44.675 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[27483777-7360-438c-a748-d721211d9283]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:44 compute-0 nova_compute[192810]: 2025-09-30 21:49:44.687 2 INFO nova.compute.manager [None req-f40d3998-42e5-4b52-bb64-617ac3befc04 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Took 0.37 seconds to destroy the instance on the hypervisor.
Sep 30 21:49:44 compute-0 nova_compute[192810]: 2025-09-30 21:49:44.688 2 DEBUG oslo.service.loopingcall [None req-f40d3998-42e5-4b52-bb64-617ac3befc04 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:49:44 compute-0 nova_compute[192810]: 2025-09-30 21:49:44.688 2 DEBUG nova.compute.manager [-] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:49:44 compute-0 nova_compute[192810]: 2025-09-30 21:49:44.688 2 DEBUG nova.network.neutron [-] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:49:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:44.703 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[b893f844-3c85-4504-9a31-4928b8328e89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:44.703 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e28d83a1-4b34-44e9-931f-c428a25fbecb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:44.720 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[ec4c15e6-67ca-45aa-a40f-6f601907af4e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 565879, 'reachable_time': 23447, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246794, 'error': None, 'target': 'ovnmeta-dd58cd77-6f38-46a4-b4e3-b10538139b43', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:44 compute-0 systemd[1]: run-netns-ovnmeta\x2ddd58cd77\x2d6f38\x2d46a4\x2db4e3\x2db10538139b43.mount: Deactivated successfully.
Sep 30 21:49:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:44.722 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-dd58cd77-6f38-46a4-b4e3-b10538139b43 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:49:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:44.722 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[979586c0-ac1c-43ae-babd-05c9e843cb80]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:46 compute-0 nova_compute[192810]: 2025-09-30 21:49:46.226 2 DEBUG nova.compute.manager [req-fe49ac6c-f07d-4a9b-bc65-923cc884dc3b req-5d9bdd6f-1055-4c09-a389-f8d3449e78d3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Received event network-vif-unplugged-2b51d8f4-bdd2-4223-8c9f-2bff82319a8a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:49:46 compute-0 nova_compute[192810]: 2025-09-30 21:49:46.227 2 DEBUG oslo_concurrency.lockutils [req-fe49ac6c-f07d-4a9b-bc65-923cc884dc3b req-5d9bdd6f-1055-4c09-a389-f8d3449e78d3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:49:46 compute-0 nova_compute[192810]: 2025-09-30 21:49:46.227 2 DEBUG oslo_concurrency.lockutils [req-fe49ac6c-f07d-4a9b-bc65-923cc884dc3b req-5d9bdd6f-1055-4c09-a389-f8d3449e78d3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:49:46 compute-0 nova_compute[192810]: 2025-09-30 21:49:46.228 2 DEBUG oslo_concurrency.lockutils [req-fe49ac6c-f07d-4a9b-bc65-923cc884dc3b req-5d9bdd6f-1055-4c09-a389-f8d3449e78d3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:49:46 compute-0 nova_compute[192810]: 2025-09-30 21:49:46.228 2 DEBUG nova.compute.manager [req-fe49ac6c-f07d-4a9b-bc65-923cc884dc3b req-5d9bdd6f-1055-4c09-a389-f8d3449e78d3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] No waiting events found dispatching network-vif-unplugged-2b51d8f4-bdd2-4223-8c9f-2bff82319a8a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:49:46 compute-0 nova_compute[192810]: 2025-09-30 21:49:46.228 2 DEBUG nova.compute.manager [req-fe49ac6c-f07d-4a9b-bc65-923cc884dc3b req-5d9bdd6f-1055-4c09-a389-f8d3449e78d3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Received event network-vif-unplugged-2b51d8f4-bdd2-4223-8c9f-2bff82319a8a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:49:46 compute-0 nova_compute[192810]: 2025-09-30 21:49:46.229 2 DEBUG nova.compute.manager [req-fe49ac6c-f07d-4a9b-bc65-923cc884dc3b req-5d9bdd6f-1055-4c09-a389-f8d3449e78d3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Received event network-vif-plugged-2b51d8f4-bdd2-4223-8c9f-2bff82319a8a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:49:46 compute-0 nova_compute[192810]: 2025-09-30 21:49:46.229 2 DEBUG oslo_concurrency.lockutils [req-fe49ac6c-f07d-4a9b-bc65-923cc884dc3b req-5d9bdd6f-1055-4c09-a389-f8d3449e78d3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:49:46 compute-0 nova_compute[192810]: 2025-09-30 21:49:46.229 2 DEBUG oslo_concurrency.lockutils [req-fe49ac6c-f07d-4a9b-bc65-923cc884dc3b req-5d9bdd6f-1055-4c09-a389-f8d3449e78d3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:49:46 compute-0 nova_compute[192810]: 2025-09-30 21:49:46.230 2 DEBUG oslo_concurrency.lockutils [req-fe49ac6c-f07d-4a9b-bc65-923cc884dc3b req-5d9bdd6f-1055-4c09-a389-f8d3449e78d3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:49:46 compute-0 nova_compute[192810]: 2025-09-30 21:49:46.230 2 DEBUG nova.compute.manager [req-fe49ac6c-f07d-4a9b-bc65-923cc884dc3b req-5d9bdd6f-1055-4c09-a389-f8d3449e78d3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] No waiting events found dispatching network-vif-plugged-2b51d8f4-bdd2-4223-8c9f-2bff82319a8a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:49:46 compute-0 nova_compute[192810]: 2025-09-30 21:49:46.230 2 WARNING nova.compute.manager [req-fe49ac6c-f07d-4a9b-bc65-923cc884dc3b req-5d9bdd6f-1055-4c09-a389-f8d3449e78d3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Received unexpected event network-vif-plugged-2b51d8f4-bdd2-4223-8c9f-2bff82319a8a for instance with vm_state active and task_state deleting.
Sep 30 21:49:46 compute-0 nova_compute[192810]: 2025-09-30 21:49:46.323 2 DEBUG nova.network.neutron [-] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:49:46 compute-0 nova_compute[192810]: 2025-09-30 21:49:46.343 2 INFO nova.compute.manager [-] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Took 1.65 seconds to deallocate network for instance.
Sep 30 21:49:46 compute-0 nova_compute[192810]: 2025-09-30 21:49:46.420 2 DEBUG oslo_concurrency.lockutils [None req-f40d3998-42e5-4b52-bb64-617ac3befc04 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:49:46 compute-0 nova_compute[192810]: 2025-09-30 21:49:46.422 2 DEBUG oslo_concurrency.lockutils [None req-f40d3998-42e5-4b52-bb64-617ac3befc04 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:49:46 compute-0 nova_compute[192810]: 2025-09-30 21:49:46.456 2 DEBUG nova.compute.manager [req-2b402d56-1e0a-443c-8696-90ea71aa5647 req-35c28eae-b553-46a6-b850-3eda5a402c34 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Received event network-vif-deleted-2b51d8f4-bdd2-4223-8c9f-2bff82319a8a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:49:46 compute-0 nova_compute[192810]: 2025-09-30 21:49:46.487 2 DEBUG nova.network.neutron [req-e0f5ef77-fba5-4dc4-a2a6-457174779d8e req-a540aac2-3538-4505-a637-46bc9e66a420 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Updated VIF entry in instance network info cache for port 2b51d8f4-bdd2-4223-8c9f-2bff82319a8a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:49:46 compute-0 nova_compute[192810]: 2025-09-30 21:49:46.488 2 DEBUG nova.network.neutron [req-e0f5ef77-fba5-4dc4-a2a6-457174779d8e req-a540aac2-3538-4505-a637-46bc9e66a420 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Updating instance_info_cache with network_info: [{"id": "2b51d8f4-bdd2-4223-8c9f-2bff82319a8a", "address": "fa:16:3e:8b:fb:b1", "network": {"id": "dd58cd77-6f38-46a4-b4e3-b10538139b43", "bridge": "br-int", "label": "tempest-network-smoke--1535188764", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2b51d8f4-bd", "ovs_interfaceid": "2b51d8f4-bdd2-4223-8c9f-2bff82319a8a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:49:46 compute-0 nova_compute[192810]: 2025-09-30 21:49:46.516 2 DEBUG oslo_concurrency.lockutils [req-e0f5ef77-fba5-4dc4-a2a6-457174779d8e req-a540aac2-3538-4505-a637-46bc9e66a420 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:49:46 compute-0 nova_compute[192810]: 2025-09-30 21:49:46.527 2 DEBUG nova.compute.provider_tree [None req-f40d3998-42e5-4b52-bb64-617ac3befc04 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:49:46 compute-0 nova_compute[192810]: 2025-09-30 21:49:46.544 2 DEBUG nova.scheduler.client.report [None req-f40d3998-42e5-4b52-bb64-617ac3befc04 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:49:46 compute-0 nova_compute[192810]: 2025-09-30 21:49:46.564 2 DEBUG oslo_concurrency.lockutils [None req-f40d3998-42e5-4b52-bb64-617ac3befc04 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.143s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:49:46 compute-0 nova_compute[192810]: 2025-09-30 21:49:46.583 2 INFO nova.scheduler.client.report [None req-f40d3998-42e5-4b52-bb64-617ac3befc04 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Deleted allocations for instance a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8
Sep 30 21:49:46 compute-0 nova_compute[192810]: 2025-09-30 21:49:46.681 2 DEBUG oslo_concurrency.lockutils [None req-f40d3998-42e5-4b52-bb64-617ac3befc04 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.388s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:49:46 compute-0 ovn_controller[94912]: 2025-09-30T21:49:46Z|00643|binding|INFO|Releasing lport 4d7fdd9a-d25a-4be0-8653-dd976ce2c1d5 from this chassis (sb_readonly=0)
Sep 30 21:49:46 compute-0 ovn_controller[94912]: 2025-09-30T21:49:46Z|00644|binding|INFO|Releasing lport 1dad0101-af5c-4580-b216-17a88dc97c78 from this chassis (sb_readonly=0)
Sep 30 21:49:46 compute-0 ovn_controller[94912]: 2025-09-30T21:49:46Z|00645|binding|INFO|Releasing lport db554134-d733-46a3-ad79-d127cd6e8575 from this chassis (sb_readonly=0)
Sep 30 21:49:46 compute-0 nova_compute[192810]: 2025-09-30 21:49:46.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:48 compute-0 nova_compute[192810]: 2025-09-30 21:49:48.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:48 compute-0 nova_compute[192810]: 2025-09-30 21:49:48.385 2 DEBUG nova.compute.manager [req-06882eaa-6ffb-48eb-9db9-f6d5f5c4a7e0 req-4c071e49-21eb-4a52-af74-e8a8c58f9289 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Received event network-changed-15d18429-a32d-4645-bc6f-dcdc238c5de9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:49:48 compute-0 nova_compute[192810]: 2025-09-30 21:49:48.385 2 DEBUG nova.compute.manager [req-06882eaa-6ffb-48eb-9db9-f6d5f5c4a7e0 req-4c071e49-21eb-4a52-af74-e8a8c58f9289 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Refreshing instance network info cache due to event network-changed-15d18429-a32d-4645-bc6f-dcdc238c5de9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:49:48 compute-0 nova_compute[192810]: 2025-09-30 21:49:48.386 2 DEBUG oslo_concurrency.lockutils [req-06882eaa-6ffb-48eb-9db9-f6d5f5c4a7e0 req-4c071e49-21eb-4a52-af74-e8a8c58f9289 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-2f0fbc4b-01ff-4422-bed8-aa7ca8934f51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:49:48 compute-0 nova_compute[192810]: 2025-09-30 21:49:48.386 2 DEBUG oslo_concurrency.lockutils [req-06882eaa-6ffb-48eb-9db9-f6d5f5c4a7e0 req-4c071e49-21eb-4a52-af74-e8a8c58f9289 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-2f0fbc4b-01ff-4422-bed8-aa7ca8934f51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:49:48 compute-0 nova_compute[192810]: 2025-09-30 21:49:48.386 2 DEBUG nova.network.neutron [req-06882eaa-6ffb-48eb-9db9-f6d5f5c4a7e0 req-4c071e49-21eb-4a52-af74-e8a8c58f9289 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Refreshing network info cache for port 15d18429-a32d-4645-bc6f-dcdc238c5de9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:49:49 compute-0 nova_compute[192810]: 2025-09-30 21:49:49.601 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:49 compute-0 nova_compute[192810]: 2025-09-30 21:49:49.761 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:49 compute-0 nova_compute[192810]: 2025-09-30 21:49:49.904 2 DEBUG nova.network.neutron [req-06882eaa-6ffb-48eb-9db9-f6d5f5c4a7e0 req-4c071e49-21eb-4a52-af74-e8a8c58f9289 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Updated VIF entry in instance network info cache for port 15d18429-a32d-4645-bc6f-dcdc238c5de9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:49:49 compute-0 nova_compute[192810]: 2025-09-30 21:49:49.905 2 DEBUG nova.network.neutron [req-06882eaa-6ffb-48eb-9db9-f6d5f5c4a7e0 req-4c071e49-21eb-4a52-af74-e8a8c58f9289 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Updating instance_info_cache with network_info: [{"id": "15d18429-a32d-4645-bc6f-dcdc238c5de9", "address": "fa:16:3e:23:0a:27", "network": {"id": "40cfa99d-fae5-4f7e-b4bc-e90e389ced61", "bridge": "br-int", "label": "tempest-network-smoke--804531616", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15d18429-a3", "ovs_interfaceid": "15d18429-a32d-4645-bc6f-dcdc238c5de9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "15c82795-bc8d-4e4d-9949-ded082705cd7", "address": "fa:16:3e:31:1e:bf", "network": {"id": "d1ec18dd-20d4-4643-8e73-7d404d8b8493", "bridge": "br-int", "label": "tempest-network-smoke--678224572", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe31:1ebf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe31:1ebf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15c82795-bc", "ovs_interfaceid": "15c82795-bc8d-4e4d-9949-ded082705cd7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:49:49 compute-0 nova_compute[192810]: 2025-09-30 21:49:49.935 2 DEBUG oslo_concurrency.lockutils [req-06882eaa-6ffb-48eb-9db9-f6d5f5c4a7e0 req-4c071e49-21eb-4a52-af74-e8a8c58f9289 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-2f0fbc4b-01ff-4422-bed8-aa7ca8934f51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:49:50 compute-0 podman[246798]: 2025-09-30 21:49:50.343343402 +0000 UTC m=+0.072557583 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 21:49:50 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:50.348 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=39, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=38) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:49:50 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:50.349 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:49:50 compute-0 nova_compute[192810]: 2025-09-30 21:49:50.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:50 compute-0 podman[246797]: 2025-09-30 21:49:50.404611334 +0000 UTC m=+0.132850280 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:49:50 compute-0 podman[246841]: 2025-09-30 21:49:50.481269077 +0000 UTC m=+0.061768624 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Sep 30 21:49:50 compute-0 ovn_controller[94912]: 2025-09-30T21:49:50Z|00646|binding|INFO|Releasing lport 4d7fdd9a-d25a-4be0-8653-dd976ce2c1d5 from this chassis (sb_readonly=0)
Sep 30 21:49:50 compute-0 ovn_controller[94912]: 2025-09-30T21:49:50Z|00647|binding|INFO|Releasing lport 1dad0101-af5c-4580-b216-17a88dc97c78 from this chassis (sb_readonly=0)
Sep 30 21:49:50 compute-0 ovn_controller[94912]: 2025-09-30T21:49:50Z|00648|binding|INFO|Releasing lport db554134-d733-46a3-ad79-d127cd6e8575 from this chassis (sb_readonly=0)
Sep 30 21:49:50 compute-0 nova_compute[192810]: 2025-09-30 21:49:50.846 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:53 compute-0 nova_compute[192810]: 2025-09-30 21:49:53.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:54 compute-0 nova_compute[192810]: 2025-09-30 21:49:54.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:54 compute-0 nova_compute[192810]: 2025-09-30 21:49:54.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:55 compute-0 nova_compute[192810]: 2025-09-30 21:49:55.995 2 DEBUG nova.compute.manager [req-36ae8849-dca2-424d-bdf8-13a78ca65824 req-6a3864db-4913-4a27-b264-324aa1dce2e4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Received event network-changed-9a0264ee-cc33-48a8-b015-36ebc5bfdd43 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:49:55 compute-0 nova_compute[192810]: 2025-09-30 21:49:55.995 2 DEBUG nova.compute.manager [req-36ae8849-dca2-424d-bdf8-13a78ca65824 req-6a3864db-4913-4a27-b264-324aa1dce2e4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Refreshing instance network info cache due to event network-changed-9a0264ee-cc33-48a8-b015-36ebc5bfdd43. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:49:55 compute-0 nova_compute[192810]: 2025-09-30 21:49:55.996 2 DEBUG oslo_concurrency.lockutils [req-36ae8849-dca2-424d-bdf8-13a78ca65824 req-6a3864db-4913-4a27-b264-324aa1dce2e4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-4ee4a775-05d1-45fc-b4f9-566ab8159710" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:49:55 compute-0 nova_compute[192810]: 2025-09-30 21:49:55.996 2 DEBUG oslo_concurrency.lockutils [req-36ae8849-dca2-424d-bdf8-13a78ca65824 req-6a3864db-4913-4a27-b264-324aa1dce2e4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-4ee4a775-05d1-45fc-b4f9-566ab8159710" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:49:55 compute-0 nova_compute[192810]: 2025-09-30 21:49:55.997 2 DEBUG nova.network.neutron [req-36ae8849-dca2-424d-bdf8-13a78ca65824 req-6a3864db-4913-4a27-b264-324aa1dce2e4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Refreshing network info cache for port 9a0264ee-cc33-48a8-b015-36ebc5bfdd43 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:49:56 compute-0 nova_compute[192810]: 2025-09-30 21:49:56.109 2 DEBUG oslo_concurrency.lockutils [None req-d226a579-249e-4607-8c88-7e6b852a53ff c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "4ee4a775-05d1-45fc-b4f9-566ab8159710" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:49:56 compute-0 nova_compute[192810]: 2025-09-30 21:49:56.109 2 DEBUG oslo_concurrency.lockutils [None req-d226a579-249e-4607-8c88-7e6b852a53ff c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "4ee4a775-05d1-45fc-b4f9-566ab8159710" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:49:56 compute-0 nova_compute[192810]: 2025-09-30 21:49:56.110 2 DEBUG oslo_concurrency.lockutils [None req-d226a579-249e-4607-8c88-7e6b852a53ff c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "4ee4a775-05d1-45fc-b4f9-566ab8159710-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:49:56 compute-0 nova_compute[192810]: 2025-09-30 21:49:56.110 2 DEBUG oslo_concurrency.lockutils [None req-d226a579-249e-4607-8c88-7e6b852a53ff c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "4ee4a775-05d1-45fc-b4f9-566ab8159710-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:49:56 compute-0 nova_compute[192810]: 2025-09-30 21:49:56.110 2 DEBUG oslo_concurrency.lockutils [None req-d226a579-249e-4607-8c88-7e6b852a53ff c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "4ee4a775-05d1-45fc-b4f9-566ab8159710-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:49:56 compute-0 nova_compute[192810]: 2025-09-30 21:49:56.123 2 INFO nova.compute.manager [None req-d226a579-249e-4607-8c88-7e6b852a53ff c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Terminating instance
Sep 30 21:49:56 compute-0 nova_compute[192810]: 2025-09-30 21:49:56.139 2 DEBUG nova.compute.manager [None req-d226a579-249e-4607-8c88-7e6b852a53ff c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:49:56 compute-0 kernel: tap9a0264ee-cc (unregistering): left promiscuous mode
Sep 30 21:49:56 compute-0 NetworkManager[51733]: <info>  [1759268996.1671] device (tap9a0264ee-cc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:49:56 compute-0 nova_compute[192810]: 2025-09-30 21:49:56.175 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:56 compute-0 ovn_controller[94912]: 2025-09-30T21:49:56Z|00649|binding|INFO|Releasing lport 9a0264ee-cc33-48a8-b015-36ebc5bfdd43 from this chassis (sb_readonly=0)
Sep 30 21:49:56 compute-0 ovn_controller[94912]: 2025-09-30T21:49:56Z|00650|binding|INFO|Setting lport 9a0264ee-cc33-48a8-b015-36ebc5bfdd43 down in Southbound
Sep 30 21:49:56 compute-0 ovn_controller[94912]: 2025-09-30T21:49:56Z|00651|binding|INFO|Removing iface tap9a0264ee-cc ovn-installed in OVS
Sep 30 21:49:56 compute-0 nova_compute[192810]: 2025-09-30 21:49:56.177 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:56.183 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6f:cf:2b 10.100.0.9'], port_security=['fa:16:3e:6f:cf:2b 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '4ee4a775-05d1-45fc-b4f9-566ab8159710', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-62012511-a944-47e7-b858-e59eabaf741d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1ff42902541948f7a6df344fac87c2b7', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1e439ed0-2f15-472a-8dfd-6e6fb79309ff 6cc48b7c-7cab-41b4-b3d1-c55036450f25', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=818bfa61-11e5-4d95-929c-1b2d08c53e7a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=9a0264ee-cc33-48a8-b015-36ebc5bfdd43) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:49:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:56.184 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 9a0264ee-cc33-48a8-b015-36ebc5bfdd43 in datapath 62012511-a944-47e7-b858-e59eabaf741d unbound from our chassis
Sep 30 21:49:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:56.185 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 62012511-a944-47e7-b858-e59eabaf741d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:49:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:56.188 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a8873eb4-9369-4f5b-97f5-aee1cd529b10]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:56.189 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-62012511-a944-47e7-b858-e59eabaf741d namespace which is not needed anymore
Sep 30 21:49:56 compute-0 nova_compute[192810]: 2025-09-30 21:49:56.189 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:56 compute-0 systemd[1]: machine-qemu\x2d77\x2dinstance\x2d000000a2.scope: Deactivated successfully.
Sep 30 21:49:56 compute-0 systemd[1]: machine-qemu\x2d77\x2dinstance\x2d000000a2.scope: Consumed 16.215s CPU time.
Sep 30 21:49:56 compute-0 systemd-machined[152794]: Machine qemu-77-instance-000000a2 terminated.
Sep 30 21:49:56 compute-0 neutron-haproxy-ovnmeta-62012511-a944-47e7-b858-e59eabaf741d[245729]: [NOTICE]   (245733) : haproxy version is 2.8.14-c23fe91
Sep 30 21:49:56 compute-0 neutron-haproxy-ovnmeta-62012511-a944-47e7-b858-e59eabaf741d[245729]: [NOTICE]   (245733) : path to executable is /usr/sbin/haproxy
Sep 30 21:49:56 compute-0 neutron-haproxy-ovnmeta-62012511-a944-47e7-b858-e59eabaf741d[245729]: [WARNING]  (245733) : Exiting Master process...
Sep 30 21:49:56 compute-0 neutron-haproxy-ovnmeta-62012511-a944-47e7-b858-e59eabaf741d[245729]: [WARNING]  (245733) : Exiting Master process...
Sep 30 21:49:56 compute-0 neutron-haproxy-ovnmeta-62012511-a944-47e7-b858-e59eabaf741d[245729]: [ALERT]    (245733) : Current worker (245735) exited with code 143 (Terminated)
Sep 30 21:49:56 compute-0 neutron-haproxy-ovnmeta-62012511-a944-47e7-b858-e59eabaf741d[245729]: [WARNING]  (245733) : All workers exited. Exiting... (0)
Sep 30 21:49:56 compute-0 systemd[1]: libpod-7a851eda03a0ee4834a8c43db0ec8531a6a7c80e79b70a947ad00ab3b8558611.scope: Deactivated successfully.
Sep 30 21:49:56 compute-0 podman[246903]: 2025-09-30 21:49:56.335643061 +0000 UTC m=+0.046840714 container died 7a851eda03a0ee4834a8c43db0ec8531a6a7c80e79b70a947ad00ab3b8558611 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62012511-a944-47e7-b858-e59eabaf741d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:49:56 compute-0 ovn_controller[94912]: 2025-09-30T21:49:56Z|00073|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:23:0a:27 10.100.0.9
Sep 30 21:49:56 compute-0 ovn_controller[94912]: 2025-09-30T21:49:56Z|00074|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:23:0a:27 10.100.0.9
Sep 30 21:49:56 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7a851eda03a0ee4834a8c43db0ec8531a6a7c80e79b70a947ad00ab3b8558611-userdata-shm.mount: Deactivated successfully.
Sep 30 21:49:56 compute-0 systemd[1]: var-lib-containers-storage-overlay-4e51f8436c345a32ba7a85d9912b0f31aba28ac7e78ab022322e8ff5e9e272fb-merged.mount: Deactivated successfully.
Sep 30 21:49:56 compute-0 nova_compute[192810]: 2025-09-30 21:49:56.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:56 compute-0 podman[246903]: 2025-09-30 21:49:56.371837229 +0000 UTC m=+0.083034872 container cleanup 7a851eda03a0ee4834a8c43db0ec8531a6a7c80e79b70a947ad00ab3b8558611 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62012511-a944-47e7-b858-e59eabaf741d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:49:56 compute-0 nova_compute[192810]: 2025-09-30 21:49:56.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:56 compute-0 systemd[1]: libpod-conmon-7a851eda03a0ee4834a8c43db0ec8531a6a7c80e79b70a947ad00ab3b8558611.scope: Deactivated successfully.
Sep 30 21:49:56 compute-0 nova_compute[192810]: 2025-09-30 21:49:56.406 2 INFO nova.virt.libvirt.driver [-] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Instance destroyed successfully.
Sep 30 21:49:56 compute-0 nova_compute[192810]: 2025-09-30 21:49:56.406 2 DEBUG nova.objects.instance [None req-d226a579-249e-4607-8c88-7e6b852a53ff c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lazy-loading 'resources' on Instance uuid 4ee4a775-05d1-45fc-b4f9-566ab8159710 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:49:56 compute-0 nova_compute[192810]: 2025-09-30 21:49:56.434 2 DEBUG nova.virt.libvirt.vif [None req-d226a579-249e-4607-8c88-7e6b852a53ff c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:48:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1704200209',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2108116341-access_point-1704200209',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2108116341-ac',id=162,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBP8+ySuIworn5+VdMqrwf2ZDaHL6j+oMxWlSGL0WjntH9xtXyOZttwOPpv9QtW35sVLgwjU06yIyW6DTo/xQcotOSzqfnBCevinC2EH2rKNVNrC+n51UyDfRK5Jp7gOR1A==',key_name='tempest-TestSecurityGroupsBasicOps-670853740',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:48:44Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1ff42902541948f7a6df344fac87c2b7',ramdisk_id='',reservation_id='r-w00y4yxk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-2108116341',owner_user_name='tempest-TestSecurityGroupsBasicOps-2108116341-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:48:44Z,user_data=None,user_id='c33a752ef8234bba917ace1e73763490',uuid=4ee4a775-05d1-45fc-b4f9-566ab8159710,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9a0264ee-cc33-48a8-b015-36ebc5bfdd43", "address": "fa:16:3e:6f:cf:2b", "network": {"id": "62012511-a944-47e7-b858-e59eabaf741d", "bridge": "br-int", "label": "tempest-network-smoke--1420028275", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a0264ee-cc", "ovs_interfaceid": "9a0264ee-cc33-48a8-b015-36ebc5bfdd43", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:49:56 compute-0 nova_compute[192810]: 2025-09-30 21:49:56.435 2 DEBUG nova.network.os_vif_util [None req-d226a579-249e-4607-8c88-7e6b852a53ff c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Converting VIF {"id": "9a0264ee-cc33-48a8-b015-36ebc5bfdd43", "address": "fa:16:3e:6f:cf:2b", "network": {"id": "62012511-a944-47e7-b858-e59eabaf741d", "bridge": "br-int", "label": "tempest-network-smoke--1420028275", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a0264ee-cc", "ovs_interfaceid": "9a0264ee-cc33-48a8-b015-36ebc5bfdd43", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:49:56 compute-0 podman[246940]: 2025-09-30 21:49:56.43630241 +0000 UTC m=+0.040034625 container remove 7a851eda03a0ee4834a8c43db0ec8531a6a7c80e79b70a947ad00ab3b8558611 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-62012511-a944-47e7-b858-e59eabaf741d, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, tcib_managed=true, io.buildah.version=1.41.3)
Sep 30 21:49:56 compute-0 nova_compute[192810]: 2025-09-30 21:49:56.436 2 DEBUG nova.network.os_vif_util [None req-d226a579-249e-4607-8c88-7e6b852a53ff c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6f:cf:2b,bridge_name='br-int',has_traffic_filtering=True,id=9a0264ee-cc33-48a8-b015-36ebc5bfdd43,network=Network(62012511-a944-47e7-b858-e59eabaf741d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a0264ee-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:49:56 compute-0 nova_compute[192810]: 2025-09-30 21:49:56.436 2 DEBUG os_vif [None req-d226a579-249e-4607-8c88-7e6b852a53ff c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:6f:cf:2b,bridge_name='br-int',has_traffic_filtering=True,id=9a0264ee-cc33-48a8-b015-36ebc5bfdd43,network=Network(62012511-a944-47e7-b858-e59eabaf741d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a0264ee-cc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:49:56 compute-0 nova_compute[192810]: 2025-09-30 21:49:56.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:56 compute-0 nova_compute[192810]: 2025-09-30 21:49:56.439 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9a0264ee-cc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:49:56 compute-0 nova_compute[192810]: 2025-09-30 21:49:56.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:49:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:56.441 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[df72db7a-32bb-47ac-a9e0-a36ee2ed8441]: (4, ('Tue Sep 30 09:49:56 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-62012511-a944-47e7-b858-e59eabaf741d (7a851eda03a0ee4834a8c43db0ec8531a6a7c80e79b70a947ad00ab3b8558611)\n7a851eda03a0ee4834a8c43db0ec8531a6a7c80e79b70a947ad00ab3b8558611\nTue Sep 30 09:49:56 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-62012511-a944-47e7-b858-e59eabaf741d (7a851eda03a0ee4834a8c43db0ec8531a6a7c80e79b70a947ad00ab3b8558611)\n7a851eda03a0ee4834a8c43db0ec8531a6a7c80e79b70a947ad00ab3b8558611\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:56.444 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[8d1abb46-2d8a-48a9-b0df-f5943622af9a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:56 compute-0 nova_compute[192810]: 2025-09-30 21:49:56.444 2 INFO os_vif [None req-d226a579-249e-4607-8c88-7e6b852a53ff c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:6f:cf:2b,bridge_name='br-int',has_traffic_filtering=True,id=9a0264ee-cc33-48a8-b015-36ebc5bfdd43,network=Network(62012511-a944-47e7-b858-e59eabaf741d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9a0264ee-cc')
Sep 30 21:49:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:56.444 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62012511-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:49:56 compute-0 nova_compute[192810]: 2025-09-30 21:49:56.445 2 INFO nova.virt.libvirt.driver [None req-d226a579-249e-4607-8c88-7e6b852a53ff c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Deleting instance files /var/lib/nova/instances/4ee4a775-05d1-45fc-b4f9-566ab8159710_del
Sep 30 21:49:56 compute-0 kernel: tap62012511-a0: left promiscuous mode
Sep 30 21:49:56 compute-0 nova_compute[192810]: 2025-09-30 21:49:56.446 2 INFO nova.virt.libvirt.driver [None req-d226a579-249e-4607-8c88-7e6b852a53ff c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Deletion of /var/lib/nova/instances/4ee4a775-05d1-45fc-b4f9-566ab8159710_del complete
Sep 30 21:49:56 compute-0 nova_compute[192810]: 2025-09-30 21:49:56.450 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:56 compute-0 nova_compute[192810]: 2025-09-30 21:49:56.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:56.462 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[c14d4d64-2cdc-4529-b10c-b07211f95295]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:56.501 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[f0a9be69-3b7a-4366-ac29-62da2062679f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:56.502 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[d357d917-104d-40d1-9bc5-66586d64b766]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:56.515 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[ff2e1a58-dfdc-456c-90cf-60da676cb067]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 561927, 'reachable_time': 36519, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 246961, 'error': None, 'target': 'ovnmeta-62012511-a944-47e7-b858-e59eabaf741d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:56.518 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-62012511-a944-47e7-b858-e59eabaf741d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:49:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:49:56.518 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[db679cbb-e23e-4d76-8458-0190d9b7ad30]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:49:56 compute-0 systemd[1]: run-netns-ovnmeta\x2d62012511\x2da944\x2d47e7\x2db858\x2de59eabaf741d.mount: Deactivated successfully.
Sep 30 21:49:56 compute-0 nova_compute[192810]: 2025-09-30 21:49:56.521 2 INFO nova.compute.manager [None req-d226a579-249e-4607-8c88-7e6b852a53ff c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Took 0.38 seconds to destroy the instance on the hypervisor.
Sep 30 21:49:56 compute-0 nova_compute[192810]: 2025-09-30 21:49:56.522 2 DEBUG oslo.service.loopingcall [None req-d226a579-249e-4607-8c88-7e6b852a53ff c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:49:56 compute-0 nova_compute[192810]: 2025-09-30 21:49:56.522 2 DEBUG nova.compute.manager [-] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:49:56 compute-0 nova_compute[192810]: 2025-09-30 21:49:56.523 2 DEBUG nova.network.neutron [-] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:49:56 compute-0 nova_compute[192810]: 2025-09-30 21:49:56.952 2 DEBUG nova.compute.manager [req-bf99fdf7-594d-4490-8d3f-c3667a6a7893 req-02b9c66a-b119-4f8e-a58c-ac664631f629 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Received event network-vif-unplugged-9a0264ee-cc33-48a8-b015-36ebc5bfdd43 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:49:56 compute-0 nova_compute[192810]: 2025-09-30 21:49:56.952 2 DEBUG oslo_concurrency.lockutils [req-bf99fdf7-594d-4490-8d3f-c3667a6a7893 req-02b9c66a-b119-4f8e-a58c-ac664631f629 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "4ee4a775-05d1-45fc-b4f9-566ab8159710-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:49:56 compute-0 nova_compute[192810]: 2025-09-30 21:49:56.952 2 DEBUG oslo_concurrency.lockutils [req-bf99fdf7-594d-4490-8d3f-c3667a6a7893 req-02b9c66a-b119-4f8e-a58c-ac664631f629 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "4ee4a775-05d1-45fc-b4f9-566ab8159710-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:49:56 compute-0 nova_compute[192810]: 2025-09-30 21:49:56.952 2 DEBUG oslo_concurrency.lockutils [req-bf99fdf7-594d-4490-8d3f-c3667a6a7893 req-02b9c66a-b119-4f8e-a58c-ac664631f629 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "4ee4a775-05d1-45fc-b4f9-566ab8159710-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:49:56 compute-0 nova_compute[192810]: 2025-09-30 21:49:56.953 2 DEBUG nova.compute.manager [req-bf99fdf7-594d-4490-8d3f-c3667a6a7893 req-02b9c66a-b119-4f8e-a58c-ac664631f629 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] No waiting events found dispatching network-vif-unplugged-9a0264ee-cc33-48a8-b015-36ebc5bfdd43 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:49:56 compute-0 nova_compute[192810]: 2025-09-30 21:49:56.953 2 DEBUG nova.compute.manager [req-bf99fdf7-594d-4490-8d3f-c3667a6a7893 req-02b9c66a-b119-4f8e-a58c-ac664631f629 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Received event network-vif-unplugged-9a0264ee-cc33-48a8-b015-36ebc5bfdd43 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:49:57 compute-0 nova_compute[192810]: 2025-09-30 21:49:57.199 2 DEBUG nova.network.neutron [-] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:49:57 compute-0 nova_compute[192810]: 2025-09-30 21:49:57.234 2 INFO nova.compute.manager [-] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Took 0.71 seconds to deallocate network for instance.
Sep 30 21:49:57 compute-0 nova_compute[192810]: 2025-09-30 21:49:57.311 2 DEBUG oslo_concurrency.lockutils [None req-d226a579-249e-4607-8c88-7e6b852a53ff c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:49:57 compute-0 nova_compute[192810]: 2025-09-30 21:49:57.311 2 DEBUG oslo_concurrency.lockutils [None req-d226a579-249e-4607-8c88-7e6b852a53ff c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:49:57 compute-0 nova_compute[192810]: 2025-09-30 21:49:57.389 2 DEBUG nova.compute.provider_tree [None req-d226a579-249e-4607-8c88-7e6b852a53ff c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:49:57 compute-0 nova_compute[192810]: 2025-09-30 21:49:57.404 2 DEBUG nova.scheduler.client.report [None req-d226a579-249e-4607-8c88-7e6b852a53ff c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:49:57 compute-0 nova_compute[192810]: 2025-09-30 21:49:57.429 2 DEBUG oslo_concurrency.lockutils [None req-d226a579-249e-4607-8c88-7e6b852a53ff c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:49:57 compute-0 nova_compute[192810]: 2025-09-30 21:49:57.435 2 DEBUG nova.network.neutron [req-36ae8849-dca2-424d-bdf8-13a78ca65824 req-6a3864db-4913-4a27-b264-324aa1dce2e4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Updated VIF entry in instance network info cache for port 9a0264ee-cc33-48a8-b015-36ebc5bfdd43. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:49:57 compute-0 nova_compute[192810]: 2025-09-30 21:49:57.436 2 DEBUG nova.network.neutron [req-36ae8849-dca2-424d-bdf8-13a78ca65824 req-6a3864db-4913-4a27-b264-324aa1dce2e4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Updating instance_info_cache with network_info: [{"id": "9a0264ee-cc33-48a8-b015-36ebc5bfdd43", "address": "fa:16:3e:6f:cf:2b", "network": {"id": "62012511-a944-47e7-b858-e59eabaf741d", "bridge": "br-int", "label": "tempest-network-smoke--1420028275", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1ff42902541948f7a6df344fac87c2b7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9a0264ee-cc", "ovs_interfaceid": "9a0264ee-cc33-48a8-b015-36ebc5bfdd43", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:49:57 compute-0 nova_compute[192810]: 2025-09-30 21:49:57.462 2 DEBUG oslo_concurrency.lockutils [req-36ae8849-dca2-424d-bdf8-13a78ca65824 req-6a3864db-4913-4a27-b264-324aa1dce2e4 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-4ee4a775-05d1-45fc-b4f9-566ab8159710" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:49:57 compute-0 nova_compute[192810]: 2025-09-30 21:49:57.473 2 INFO nova.scheduler.client.report [None req-d226a579-249e-4607-8c88-7e6b852a53ff c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Deleted allocations for instance 4ee4a775-05d1-45fc-b4f9-566ab8159710
Sep 30 21:49:57 compute-0 nova_compute[192810]: 2025-09-30 21:49:57.559 2 DEBUG oslo_concurrency.lockutils [None req-d226a579-249e-4607-8c88-7e6b852a53ff c33a752ef8234bba917ace1e73763490 1ff42902541948f7a6df344fac87c2b7 - - default default] Lock "4ee4a775-05d1-45fc-b4f9-566ab8159710" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.449s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:49:58 compute-0 nova_compute[192810]: 2025-09-30 21:49:58.104 2 DEBUG nova.compute.manager [req-02ebc4e8-3c8b-45f8-8018-b52259b55563 req-76938283-79e4-45dd-9b35-4cf99ac93831 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Received event network-vif-deleted-9a0264ee-cc33-48a8-b015-36ebc5bfdd43 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:49:58 compute-0 nova_compute[192810]: 2025-09-30 21:49:58.104 2 INFO nova.compute.manager [req-02ebc4e8-3c8b-45f8-8018-b52259b55563 req-76938283-79e4-45dd-9b35-4cf99ac93831 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Neutron deleted interface 9a0264ee-cc33-48a8-b015-36ebc5bfdd43; detaching it from the instance and deleting it from the info cache
Sep 30 21:49:58 compute-0 nova_compute[192810]: 2025-09-30 21:49:58.104 2 DEBUG nova.network.neutron [req-02ebc4e8-3c8b-45f8-8018-b52259b55563 req-76938283-79e4-45dd-9b35-4cf99ac93831 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106
Sep 30 21:49:58 compute-0 nova_compute[192810]: 2025-09-30 21:49:58.106 2 DEBUG nova.compute.manager [req-02ebc4e8-3c8b-45f8-8018-b52259b55563 req-76938283-79e4-45dd-9b35-4cf99ac93831 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Detach interface failed, port_id=9a0264ee-cc33-48a8-b015-36ebc5bfdd43, reason: Instance 4ee4a775-05d1-45fc-b4f9-566ab8159710 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Sep 30 21:49:58 compute-0 nova_compute[192810]: 2025-09-30 21:49:58.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:59 compute-0 nova_compute[192810]: 2025-09-30 21:49:59.057 2 DEBUG nova.compute.manager [req-813cebdf-12fb-461d-a67b-f2a8bb4c5486 req-0dd55b93-6580-4456-924b-d67a3d7986b3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Received event network-vif-plugged-9a0264ee-cc33-48a8-b015-36ebc5bfdd43 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:49:59 compute-0 nova_compute[192810]: 2025-09-30 21:49:59.058 2 DEBUG oslo_concurrency.lockutils [req-813cebdf-12fb-461d-a67b-f2a8bb4c5486 req-0dd55b93-6580-4456-924b-d67a3d7986b3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "4ee4a775-05d1-45fc-b4f9-566ab8159710-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:49:59 compute-0 nova_compute[192810]: 2025-09-30 21:49:59.058 2 DEBUG oslo_concurrency.lockutils [req-813cebdf-12fb-461d-a67b-f2a8bb4c5486 req-0dd55b93-6580-4456-924b-d67a3d7986b3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "4ee4a775-05d1-45fc-b4f9-566ab8159710-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:49:59 compute-0 nova_compute[192810]: 2025-09-30 21:49:59.058 2 DEBUG oslo_concurrency.lockutils [req-813cebdf-12fb-461d-a67b-f2a8bb4c5486 req-0dd55b93-6580-4456-924b-d67a3d7986b3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "4ee4a775-05d1-45fc-b4f9-566ab8159710-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:49:59 compute-0 nova_compute[192810]: 2025-09-30 21:49:59.058 2 DEBUG nova.compute.manager [req-813cebdf-12fb-461d-a67b-f2a8bb4c5486 req-0dd55b93-6580-4456-924b-d67a3d7986b3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] No waiting events found dispatching network-vif-plugged-9a0264ee-cc33-48a8-b015-36ebc5bfdd43 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:49:59 compute-0 nova_compute[192810]: 2025-09-30 21:49:59.058 2 WARNING nova.compute.manager [req-813cebdf-12fb-461d-a67b-f2a8bb4c5486 req-0dd55b93-6580-4456-924b-d67a3d7986b3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Received unexpected event network-vif-plugged-9a0264ee-cc33-48a8-b015-36ebc5bfdd43 for instance with vm_state deleted and task_state None.
Sep 30 21:49:59 compute-0 nova_compute[192810]: 2025-09-30 21:49:59.070 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:49:59 compute-0 podman[246962]: 2025-09-30 21:49:59.321836982 +0000 UTC m=+0.060581075 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Sep 30 21:49:59 compute-0 podman[246963]: 2025-09-30 21:49:59.349346165 +0000 UTC m=+0.077074594 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, release=1755695350, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=edpm, container_name=openstack_network_exporter, name=ubi9-minimal, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public)
Sep 30 21:49:59 compute-0 nova_compute[192810]: 2025-09-30 21:49:59.579 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268984.5783608, a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:49:59 compute-0 nova_compute[192810]: 2025-09-30 21:49:59.579 2 INFO nova.compute.manager [-] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] VM Stopped (Lifecycle Event)
Sep 30 21:49:59 compute-0 nova_compute[192810]: 2025-09-30 21:49:59.597 2 DEBUG nova.compute.manager [None req-acd79bd9-f56d-4f2b-ab29-6ee1e2f87e21 - - - - - -] [instance: a36c68a5-9a53-4c6f-a82c-c2c0ee4d6ba8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:50:00 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:00.350 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '39'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:50:00 compute-0 ovn_controller[94912]: 2025-09-30T21:50:00Z|00652|binding|INFO|Releasing lport 4d7fdd9a-d25a-4be0-8653-dd976ce2c1d5 from this chassis (sb_readonly=0)
Sep 30 21:50:00 compute-0 ovn_controller[94912]: 2025-09-30T21:50:00Z|00653|binding|INFO|Releasing lport db554134-d733-46a3-ad79-d127cd6e8575 from this chassis (sb_readonly=0)
Sep 30 21:50:00 compute-0 nova_compute[192810]: 2025-09-30 21:50:00.814 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:00 compute-0 nova_compute[192810]: 2025-09-30 21:50:00.959 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:50:01 compute-0 nova_compute[192810]: 2025-09-30 21:50:01.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:02 compute-0 nova_compute[192810]: 2025-09-30 21:50:02.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:50:02 compute-0 nova_compute[192810]: 2025-09-30 21:50:02.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:50:03 compute-0 nova_compute[192810]: 2025-09-30 21:50:03.140 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:03 compute-0 nova_compute[192810]: 2025-09-30 21:50:03.375 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:03 compute-0 nova_compute[192810]: 2025-09-30 21:50:03.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:50:04 compute-0 unix_chkpwd[247006]: password check failed for user (root)
Sep 30 21:50:04 compute-0 sshd-session[246878]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40  user=root
Sep 30 21:50:05 compute-0 nova_compute[192810]: 2025-09-30 21:50:05.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:50:05 compute-0 nova_compute[192810]: 2025-09-30 21:50:05.789 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.358 2 DEBUG nova.compute.manager [req-1dcfef04-3602-4b76-8e48-a62e55065e0e req-762bc9ad-86ec-4fec-a3f3-3933be6ad53d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Received event network-changed-15d18429-a32d-4645-bc6f-dcdc238c5de9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.358 2 DEBUG nova.compute.manager [req-1dcfef04-3602-4b76-8e48-a62e55065e0e req-762bc9ad-86ec-4fec-a3f3-3933be6ad53d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Refreshing instance network info cache due to event network-changed-15d18429-a32d-4645-bc6f-dcdc238c5de9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.358 2 DEBUG oslo_concurrency.lockutils [req-1dcfef04-3602-4b76-8e48-a62e55065e0e req-762bc9ad-86ec-4fec-a3f3-3933be6ad53d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-2f0fbc4b-01ff-4422-bed8-aa7ca8934f51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.358 2 DEBUG oslo_concurrency.lockutils [req-1dcfef04-3602-4b76-8e48-a62e55065e0e req-762bc9ad-86ec-4fec-a3f3-3933be6ad53d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-2f0fbc4b-01ff-4422-bed8-aa7ca8934f51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.359 2 DEBUG nova.network.neutron [req-1dcfef04-3602-4b76-8e48-a62e55065e0e req-762bc9ad-86ec-4fec-a3f3-3933be6ad53d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Refreshing network info cache for port 15d18429-a32d-4645-bc6f-dcdc238c5de9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:50:06 compute-0 sshd-session[246878]: Failed password for root from 8.210.178.40 port 53462 ssh2
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.444 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.517 2 DEBUG oslo_concurrency.lockutils [None req-51784d32-de8c-45bc-b7d7-3d654a43b64a 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "2f0fbc4b-01ff-4422-bed8-aa7ca8934f51" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.517 2 DEBUG oslo_concurrency.lockutils [None req-51784d32-de8c-45bc-b7d7-3d654a43b64a 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "2f0fbc4b-01ff-4422-bed8-aa7ca8934f51" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.517 2 DEBUG oslo_concurrency.lockutils [None req-51784d32-de8c-45bc-b7d7-3d654a43b64a 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "2f0fbc4b-01ff-4422-bed8-aa7ca8934f51-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.517 2 DEBUG oslo_concurrency.lockutils [None req-51784d32-de8c-45bc-b7d7-3d654a43b64a 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "2f0fbc4b-01ff-4422-bed8-aa7ca8934f51-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.518 2 DEBUG oslo_concurrency.lockutils [None req-51784d32-de8c-45bc-b7d7-3d654a43b64a 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "2f0fbc4b-01ff-4422-bed8-aa7ca8934f51-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.528 2 INFO nova.compute.manager [None req-51784d32-de8c-45bc-b7d7-3d654a43b64a 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Terminating instance
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.541 2 DEBUG nova.compute.manager [None req-51784d32-de8c-45bc-b7d7-3d654a43b64a 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:50:06 compute-0 kernel: tap15d18429-a3 (unregistering): left promiscuous mode
Sep 30 21:50:06 compute-0 NetworkManager[51733]: <info>  [1759269006.5712] device (tap15d18429-a3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.580 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:06 compute-0 ovn_controller[94912]: 2025-09-30T21:50:06Z|00654|binding|INFO|Releasing lport 15d18429-a32d-4645-bc6f-dcdc238c5de9 from this chassis (sb_readonly=0)
Sep 30 21:50:06 compute-0 ovn_controller[94912]: 2025-09-30T21:50:06Z|00655|binding|INFO|Setting lport 15d18429-a32d-4645-bc6f-dcdc238c5de9 down in Southbound
Sep 30 21:50:06 compute-0 ovn_controller[94912]: 2025-09-30T21:50:06Z|00656|binding|INFO|Removing iface tap15d18429-a3 ovn-installed in OVS
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.582 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:06.593 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:23:0a:27 10.100.0.9'], port_security=['fa:16:3e:23:0a:27 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '2f0fbc4b-01ff-4422-bed8-aa7ca8934f51', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-40cfa99d-fae5-4f7e-b4bc-e90e389ced61', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '272ca623-2e10-4dc9-b9ff-e2fd55c61f6d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dfb1184d-a559-4543-ae06-e2b48bcfb2c3, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=15d18429-a32d-4645-bc6f-dcdc238c5de9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:50:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:06.595 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 15d18429-a32d-4645-bc6f-dcdc238c5de9 in datapath 40cfa99d-fae5-4f7e-b4bc-e90e389ced61 unbound from our chassis
Sep 30 21:50:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:06.596 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 40cfa99d-fae5-4f7e-b4bc-e90e389ced61, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:50:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:06.596 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[c575c3c7-022b-41b3-8fe2-4417e76daa37]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:06.597 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-40cfa99d-fae5-4f7e-b4bc-e90e389ced61 namespace which is not needed anymore
Sep 30 21:50:06 compute-0 kernel: tap15c82795-bc (unregistering): left promiscuous mode
Sep 30 21:50:06 compute-0 NetworkManager[51733]: <info>  [1759269006.6060] device (tap15c82795-bc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.617 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:06 compute-0 ovn_controller[94912]: 2025-09-30T21:50:06Z|00657|binding|INFO|Releasing lport 15c82795-bc8d-4e4d-9949-ded082705cd7 from this chassis (sb_readonly=0)
Sep 30 21:50:06 compute-0 ovn_controller[94912]: 2025-09-30T21:50:06Z|00658|binding|INFO|Setting lport 15c82795-bc8d-4e4d-9949-ded082705cd7 down in Southbound
Sep 30 21:50:06 compute-0 ovn_controller[94912]: 2025-09-30T21:50:06Z|00659|binding|INFO|Removing iface tap15c82795-bc ovn-installed in OVS
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.620 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:06.625 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:31:1e:bf 2001:db8:0:1:f816:3eff:fe31:1ebf 2001:db8::f816:3eff:fe31:1ebf'], port_security=['fa:16:3e:31:1e:bf 2001:db8:0:1:f816:3eff:fe31:1ebf 2001:db8::f816:3eff:fe31:1ebf'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe31:1ebf/64 2001:db8::f816:3eff:fe31:1ebf/64', 'neutron:device_id': '2f0fbc4b-01ff-4422-bed8-aa7ca8934f51', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d1ec18dd-20d4-4643-8e73-7d404d8b8493', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '272ca623-2e10-4dc9-b9ff-e2fd55c61f6d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5d03d334-4f0d-47df-a94a-1c647c7026cb, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=15c82795-bc8d-4e4d-9949-ded082705cd7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:06 compute-0 systemd[1]: machine-qemu\x2d80\x2dinstance\x2d000000a7.scope: Deactivated successfully.
Sep 30 21:50:06 compute-0 systemd[1]: machine-qemu\x2d80\x2dinstance\x2d000000a7.scope: Consumed 12.649s CPU time.
Sep 30 21:50:06 compute-0 systemd-machined[152794]: Machine qemu-80-instance-000000a7 terminated.
Sep 30 21:50:06 compute-0 podman[247012]: 2025-09-30 21:50:06.668601369 +0000 UTC m=+0.067092327 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 21:50:06 compute-0 podman[247008]: 2025-09-30 21:50:06.678505435 +0000 UTC m=+0.085422592 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:50:06 compute-0 podman[247011]: 2025-09-30 21:50:06.706656734 +0000 UTC m=+0.106157267 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:50:06 compute-0 neutron-haproxy-ovnmeta-40cfa99d-fae5-4f7e-b4bc-e90e389ced61[246622]: [NOTICE]   (246626) : haproxy version is 2.8.14-c23fe91
Sep 30 21:50:06 compute-0 neutron-haproxy-ovnmeta-40cfa99d-fae5-4f7e-b4bc-e90e389ced61[246622]: [NOTICE]   (246626) : path to executable is /usr/sbin/haproxy
Sep 30 21:50:06 compute-0 neutron-haproxy-ovnmeta-40cfa99d-fae5-4f7e-b4bc-e90e389ced61[246622]: [WARNING]  (246626) : Exiting Master process...
Sep 30 21:50:06 compute-0 neutron-haproxy-ovnmeta-40cfa99d-fae5-4f7e-b4bc-e90e389ced61[246622]: [ALERT]    (246626) : Current worker (246628) exited with code 143 (Terminated)
Sep 30 21:50:06 compute-0 neutron-haproxy-ovnmeta-40cfa99d-fae5-4f7e-b4bc-e90e389ced61[246622]: [WARNING]  (246626) : All workers exited. Exiting... (0)
Sep 30 21:50:06 compute-0 systemd[1]: libpod-dce7e999241520a5226f1799e3e9c077192a6f45f700150b7741554f2ab3e81a.scope: Deactivated successfully.
Sep 30 21:50:06 compute-0 podman[247093]: 2025-09-30 21:50:06.736241018 +0000 UTC m=+0.043480680 container died dce7e999241520a5226f1799e3e9c077192a6f45f700150b7741554f2ab3e81a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-40cfa99d-fae5-4f7e-b4bc-e90e389ced61, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:50:06 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-dce7e999241520a5226f1799e3e9c077192a6f45f700150b7741554f2ab3e81a-userdata-shm.mount: Deactivated successfully.
Sep 30 21:50:06 compute-0 systemd[1]: var-lib-containers-storage-overlay-29b24882af127fa18feb73554336878a681f9f54b0acfc3899285f57a6236ecd-merged.mount: Deactivated successfully.
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:06 compute-0 NetworkManager[51733]: <info>  [1759269006.7749] manager: (tap15c82795-bc): new Tun device (/org/freedesktop/NetworkManager/Devices/292)
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.776 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:06 compute-0 podman[247093]: 2025-09-30 21:50:06.776892888 +0000 UTC m=+0.084132590 container cleanup dce7e999241520a5226f1799e3e9c077192a6f45f700150b7741554f2ab3e81a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-40cfa99d-fae5-4f7e-b4bc-e90e389ced61, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:06 compute-0 systemd[1]: libpod-conmon-dce7e999241520a5226f1799e3e9c077192a6f45f700150b7741554f2ab3e81a.scope: Deactivated successfully.
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.826 2 INFO nova.virt.libvirt.driver [-] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Instance destroyed successfully.
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.826 2 DEBUG nova.objects.instance [None req-51784d32-de8c-45bc-b7d7-3d654a43b64a 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lazy-loading 'resources' on Instance uuid 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.838 2 DEBUG nova.virt.libvirt.vif [None req-51784d32-de8c-45bc-b7d7-3d654a43b64a 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:49:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1303425693',display_name='tempest-TestGettingAddress-server-1303425693',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1303425693',id=167,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCGG0QbUCZfF/aMNgREbUjMb15nHLUKy3cMsM6riMQ1TEklisidkwt9SMk4vpWErg6fFTJOsgBrfd7Y56loMtMOFXQViH6JvZSbYiUd68BEKiwfiEj6LoMHEURYI7Qr6GQ==',key_name='tempest-TestGettingAddress-1486288610',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:49:43Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-n1t0ecif',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:49:43Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=2f0fbc4b-01ff-4422-bed8-aa7ca8934f51,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "15d18429-a32d-4645-bc6f-dcdc238c5de9", "address": "fa:16:3e:23:0a:27", "network": {"id": "40cfa99d-fae5-4f7e-b4bc-e90e389ced61", "bridge": "br-int", "label": "tempest-network-smoke--804531616", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15d18429-a3", "ovs_interfaceid": "15d18429-a32d-4645-bc6f-dcdc238c5de9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.839 2 DEBUG nova.network.os_vif_util [None req-51784d32-de8c-45bc-b7d7-3d654a43b64a 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "15d18429-a32d-4645-bc6f-dcdc238c5de9", "address": "fa:16:3e:23:0a:27", "network": {"id": "40cfa99d-fae5-4f7e-b4bc-e90e389ced61", "bridge": "br-int", "label": "tempest-network-smoke--804531616", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15d18429-a3", "ovs_interfaceid": "15d18429-a32d-4645-bc6f-dcdc238c5de9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.839 2 DEBUG nova.network.os_vif_util [None req-51784d32-de8c-45bc-b7d7-3d654a43b64a 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:23:0a:27,bridge_name='br-int',has_traffic_filtering=True,id=15d18429-a32d-4645-bc6f-dcdc238c5de9,network=Network(40cfa99d-fae5-4f7e-b4bc-e90e389ced61),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15d18429-a3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.839 2 DEBUG os_vif [None req-51784d32-de8c-45bc-b7d7-3d654a43b64a 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:23:0a:27,bridge_name='br-int',has_traffic_filtering=True,id=15d18429-a32d-4645-bc6f-dcdc238c5de9,network=Network(40cfa99d-fae5-4f7e-b4bc-e90e389ced61),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15d18429-a3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.841 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.841 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap15d18429-a3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.845 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.847 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.849 2 INFO os_vif [None req-51784d32-de8c-45bc-b7d7-3d654a43b64a 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:23:0a:27,bridge_name='br-int',has_traffic_filtering=True,id=15d18429-a32d-4645-bc6f-dcdc238c5de9,network=Network(40cfa99d-fae5-4f7e-b4bc-e90e389ced61),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15d18429-a3')
Sep 30 21:50:06 compute-0 podman[247140]: 2025-09-30 21:50:06.850482575 +0000 UTC m=+0.046339192 container remove dce7e999241520a5226f1799e3e9c077192a6f45f700150b7741554f2ab3e81a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-40cfa99d-fae5-4f7e-b4bc-e90e389ced61, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.850 2 DEBUG nova.virt.libvirt.vif [None req-51784d32-de8c-45bc-b7d7-3d654a43b64a 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:49:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1303425693',display_name='tempest-TestGettingAddress-server-1303425693',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1303425693',id=167,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCGG0QbUCZfF/aMNgREbUjMb15nHLUKy3cMsM6riMQ1TEklisidkwt9SMk4vpWErg6fFTJOsgBrfd7Y56loMtMOFXQViH6JvZSbYiUd68BEKiwfiEj6LoMHEURYI7Qr6GQ==',key_name='tempest-TestGettingAddress-1486288610',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:49:43Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-n1t0ecif',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:49:43Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=2f0fbc4b-01ff-4422-bed8-aa7ca8934f51,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "15c82795-bc8d-4e4d-9949-ded082705cd7", "address": "fa:16:3e:31:1e:bf", "network": {"id": "d1ec18dd-20d4-4643-8e73-7d404d8b8493", "bridge": "br-int", "label": "tempest-network-smoke--678224572", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe31:1ebf", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe31:1ebf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15c82795-bc", "ovs_interfaceid": "15c82795-bc8d-4e4d-9949-ded082705cd7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.850 2 DEBUG nova.network.os_vif_util [None req-51784d32-de8c-45bc-b7d7-3d654a43b64a 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "15c82795-bc8d-4e4d-9949-ded082705cd7", "address": "fa:16:3e:31:1e:bf", "network": {"id": "d1ec18dd-20d4-4643-8e73-7d404d8b8493", "bridge": "br-int", "label": "tempest-network-smoke--678224572", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe31:1ebf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe31:1ebf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15c82795-bc", "ovs_interfaceid": "15c82795-bc8d-4e4d-9949-ded082705cd7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.851 2 DEBUG nova.network.os_vif_util [None req-51784d32-de8c-45bc-b7d7-3d654a43b64a 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:31:1e:bf,bridge_name='br-int',has_traffic_filtering=True,id=15c82795-bc8d-4e4d-9949-ded082705cd7,network=Network(d1ec18dd-20d4-4643-8e73-7d404d8b8493),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15c82795-bc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.851 2 DEBUG os_vif [None req-51784d32-de8c-45bc-b7d7-3d654a43b64a 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:31:1e:bf,bridge_name='br-int',has_traffic_filtering=True,id=15c82795-bc8d-4e4d-9949-ded082705cd7,network=Network(d1ec18dd-20d4-4643-8e73-7d404d8b8493),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15c82795-bc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.852 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap15c82795-bc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.856 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:50:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:06.856 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a8ca64d2-84f3-4b1f-ae34-4e10a8a9994b]: (4, ('Tue Sep 30 09:50:06 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-40cfa99d-fae5-4f7e-b4bc-e90e389ced61 (dce7e999241520a5226f1799e3e9c077192a6f45f700150b7741554f2ab3e81a)\ndce7e999241520a5226f1799e3e9c077192a6f45f700150b7741554f2ab3e81a\nTue Sep 30 09:50:06 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-40cfa99d-fae5-4f7e-b4bc-e90e389ced61 (dce7e999241520a5226f1799e3e9c077192a6f45f700150b7741554f2ab3e81a)\ndce7e999241520a5226f1799e3e9c077192a6f45f700150b7741554f2ab3e81a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.858 2 INFO os_vif [None req-51784d32-de8c-45bc-b7d7-3d654a43b64a 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:31:1e:bf,bridge_name='br-int',has_traffic_filtering=True,id=15c82795-bc8d-4e4d-9949-ded082705cd7,network=Network(d1ec18dd-20d4-4643-8e73-7d404d8b8493),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15c82795-bc')
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.859 2 INFO nova.virt.libvirt.driver [None req-51784d32-de8c-45bc-b7d7-3d654a43b64a 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Deleting instance files /var/lib/nova/instances/2f0fbc4b-01ff-4422-bed8-aa7ca8934f51_del
Sep 30 21:50:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:06.859 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[57026780-6d18-4b0a-83da-5abb064bb46a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.860 2 INFO nova.virt.libvirt.driver [None req-51784d32-de8c-45bc-b7d7-3d654a43b64a 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Deletion of /var/lib/nova/instances/2f0fbc4b-01ff-4422-bed8-aa7ca8934f51_del complete
Sep 30 21:50:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:06.861 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap40cfa99d-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:50:06 compute-0 kernel: tap40cfa99d-f0: left promiscuous mode
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.863 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:06.877 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[52e89f4e-b705-46a0-8627-4bedc1bbfc9f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:06.909 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e5ae54b2-4cbf-4206-b490-f4a812152661]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:06.910 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[d55c558b-24cb-437a-9c5a-a0903479df7d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.922 2 DEBUG nova.compute.manager [req-02372a08-5585-4a56-b609-a8029793cd55 req-faa51094-8b03-44b1-8176-b1fef0e6be87 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Received event network-vif-unplugged-15d18429-a32d-4645-bc6f-dcdc238c5de9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.922 2 DEBUG oslo_concurrency.lockutils [req-02372a08-5585-4a56-b609-a8029793cd55 req-faa51094-8b03-44b1-8176-b1fef0e6be87 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "2f0fbc4b-01ff-4422-bed8-aa7ca8934f51-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.922 2 DEBUG oslo_concurrency.lockutils [req-02372a08-5585-4a56-b609-a8029793cd55 req-faa51094-8b03-44b1-8176-b1fef0e6be87 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2f0fbc4b-01ff-4422-bed8-aa7ca8934f51-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.922 2 DEBUG oslo_concurrency.lockutils [req-02372a08-5585-4a56-b609-a8029793cd55 req-faa51094-8b03-44b1-8176-b1fef0e6be87 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2f0fbc4b-01ff-4422-bed8-aa7ca8934f51-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.923 2 DEBUG nova.compute.manager [req-02372a08-5585-4a56-b609-a8029793cd55 req-faa51094-8b03-44b1-8176-b1fef0e6be87 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] No waiting events found dispatching network-vif-unplugged-15d18429-a32d-4645-bc6f-dcdc238c5de9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.923 2 DEBUG nova.compute.manager [req-02372a08-5585-4a56-b609-a8029793cd55 req-faa51094-8b03-44b1-8176-b1fef0e6be87 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Received event network-vif-unplugged-15d18429-a32d-4645-bc6f-dcdc238c5de9 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:50:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:06.926 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[878ea447-1d97-424f-b760-5b885a69eb30]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 567672, 'reachable_time': 19524, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247161, 'error': None, 'target': 'ovnmeta-40cfa99d-fae5-4f7e-b4bc-e90e389ced61', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:06.928 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-40cfa99d-fae5-4f7e-b4bc-e90e389ced61 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:50:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:06.928 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[cd39d05f-bedc-4dc6-8622-c4ff54e59fde]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:06.929 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 15c82795-bc8d-4e4d-9949-ded082705cd7 in datapath d1ec18dd-20d4-4643-8e73-7d404d8b8493 unbound from our chassis
Sep 30 21:50:06 compute-0 systemd[1]: run-netns-ovnmeta\x2d40cfa99d\x2dfae5\x2d4f7e\x2db4bc\x2de90e389ced61.mount: Deactivated successfully.
Sep 30 21:50:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:06.930 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d1ec18dd-20d4-4643-8e73-7d404d8b8493, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:50:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:06.932 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[7bd20f23-18cd-4aab-addf-0affb460df2b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:06.932 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d1ec18dd-20d4-4643-8e73-7d404d8b8493 namespace which is not needed anymore
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.936 2 INFO nova.compute.manager [None req-51784d32-de8c-45bc-b7d7-3d654a43b64a 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Took 0.40 seconds to destroy the instance on the hypervisor.
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.937 2 DEBUG oslo.service.loopingcall [None req-51784d32-de8c-45bc-b7d7-3d654a43b64a 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.938 2 DEBUG nova.compute.manager [-] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:50:06 compute-0 nova_compute[192810]: 2025-09-30 21:50:06.938 2 DEBUG nova.network.neutron [-] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:50:07 compute-0 neutron-haproxy-ovnmeta-d1ec18dd-20d4-4643-8e73-7d404d8b8493[246695]: [NOTICE]   (246699) : haproxy version is 2.8.14-c23fe91
Sep 30 21:50:07 compute-0 neutron-haproxy-ovnmeta-d1ec18dd-20d4-4643-8e73-7d404d8b8493[246695]: [NOTICE]   (246699) : path to executable is /usr/sbin/haproxy
Sep 30 21:50:07 compute-0 neutron-haproxy-ovnmeta-d1ec18dd-20d4-4643-8e73-7d404d8b8493[246695]: [WARNING]  (246699) : Exiting Master process...
Sep 30 21:50:07 compute-0 neutron-haproxy-ovnmeta-d1ec18dd-20d4-4643-8e73-7d404d8b8493[246695]: [ALERT]    (246699) : Current worker (246701) exited with code 143 (Terminated)
Sep 30 21:50:07 compute-0 neutron-haproxy-ovnmeta-d1ec18dd-20d4-4643-8e73-7d404d8b8493[246695]: [WARNING]  (246699) : All workers exited. Exiting... (0)
Sep 30 21:50:07 compute-0 systemd[1]: libpod-7818b4e72b6eb562b8ee12d911fa8397099d737f8fa5c55ebe8b289e5b588e9c.scope: Deactivated successfully.
Sep 30 21:50:07 compute-0 podman[247179]: 2025-09-30 21:50:07.072320613 +0000 UTC m=+0.048501766 container died 7818b4e72b6eb562b8ee12d911fa8397099d737f8fa5c55ebe8b289e5b588e9c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d1ec18dd-20d4-4643-8e73-7d404d8b8493, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:50:07 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7818b4e72b6eb562b8ee12d911fa8397099d737f8fa5c55ebe8b289e5b588e9c-userdata-shm.mount: Deactivated successfully.
Sep 30 21:50:07 compute-0 systemd[1]: var-lib-containers-storage-overlay-1907327462c98f935560d97b63d7884bacd7bcb1b80fb9df4d38dca63178b823-merged.mount: Deactivated successfully.
Sep 30 21:50:07 compute-0 podman[247179]: 2025-09-30 21:50:07.101464266 +0000 UTC m=+0.077645419 container cleanup 7818b4e72b6eb562b8ee12d911fa8397099d737f8fa5c55ebe8b289e5b588e9c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d1ec18dd-20d4-4643-8e73-7d404d8b8493, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:50:07 compute-0 systemd[1]: libpod-conmon-7818b4e72b6eb562b8ee12d911fa8397099d737f8fa5c55ebe8b289e5b588e9c.scope: Deactivated successfully.
Sep 30 21:50:07 compute-0 podman[247211]: 2025-09-30 21:50:07.168426239 +0000 UTC m=+0.044964468 container remove 7818b4e72b6eb562b8ee12d911fa8397099d737f8fa5c55ebe8b289e5b588e9c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d1ec18dd-20d4-4643-8e73-7d404d8b8493, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0)
Sep 30 21:50:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:07.173 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[47a11875-2bf5-4fbf-af98-e0cfaf849e28]: (4, ('Tue Sep 30 09:50:07 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d1ec18dd-20d4-4643-8e73-7d404d8b8493 (7818b4e72b6eb562b8ee12d911fa8397099d737f8fa5c55ebe8b289e5b588e9c)\n7818b4e72b6eb562b8ee12d911fa8397099d737f8fa5c55ebe8b289e5b588e9c\nTue Sep 30 09:50:07 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d1ec18dd-20d4-4643-8e73-7d404d8b8493 (7818b4e72b6eb562b8ee12d911fa8397099d737f8fa5c55ebe8b289e5b588e9c)\n7818b4e72b6eb562b8ee12d911fa8397099d737f8fa5c55ebe8b289e5b588e9c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:07.174 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[d95bd6a5-45c3-4a0a-9351-bab1e487ea64]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:07.175 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd1ec18dd-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:50:07 compute-0 nova_compute[192810]: 2025-09-30 21:50:07.177 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:07 compute-0 kernel: tapd1ec18dd-20: left promiscuous mode
Sep 30 21:50:07 compute-0 nova_compute[192810]: 2025-09-30 21:50:07.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:07.190 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[f050cd28-60b2-4bf9-98e7-be3d33f381ca]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:07.217 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[773076c9-a095-454b-abc5-0924208391e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:07.218 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[817fa778-9f09-4913-88d1-268e15795a6c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:07.233 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[c3c4fa0c-90bc-4bbc-9359-23a93c299d94]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 567759, 'reachable_time': 40310, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247226, 'error': None, 'target': 'ovnmeta-d1ec18dd-20d4-4643-8e73-7d404d8b8493', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:07.234 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d1ec18dd-20d4-4643-8e73-7d404d8b8493 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:50:07 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:07.234 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[266b7f79-ee62-404f-a404-a434fc76b0e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:07 compute-0 systemd[1]: run-netns-ovnmeta\x2dd1ec18dd\x2d20d4\x2d4643\x2d8e73\x2d7d404d8b8493.mount: Deactivated successfully.
Sep 30 21:50:07 compute-0 nova_compute[192810]: 2025-09-30 21:50:07.811 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:50:07 compute-0 nova_compute[192810]: 2025-09-30 21:50:07.832 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:50:07 compute-0 nova_compute[192810]: 2025-09-30 21:50:07.832 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:50:07 compute-0 nova_compute[192810]: 2025-09-30 21:50:07.832 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:50:07 compute-0 nova_compute[192810]: 2025-09-30 21:50:07.845 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875
Sep 30 21:50:07 compute-0 nova_compute[192810]: 2025-09-30 21:50:07.846 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 21:50:07 compute-0 nova_compute[192810]: 2025-09-30 21:50:07.846 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:50:08 compute-0 nova_compute[192810]: 2025-09-30 21:50:08.185 2 DEBUG nova.network.neutron [req-1dcfef04-3602-4b76-8e48-a62e55065e0e req-762bc9ad-86ec-4fec-a3f3-3933be6ad53d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Updated VIF entry in instance network info cache for port 15d18429-a32d-4645-bc6f-dcdc238c5de9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:50:08 compute-0 nova_compute[192810]: 2025-09-30 21:50:08.186 2 DEBUG nova.network.neutron [req-1dcfef04-3602-4b76-8e48-a62e55065e0e req-762bc9ad-86ec-4fec-a3f3-3933be6ad53d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Updating instance_info_cache with network_info: [{"id": "15d18429-a32d-4645-bc6f-dcdc238c5de9", "address": "fa:16:3e:23:0a:27", "network": {"id": "40cfa99d-fae5-4f7e-b4bc-e90e389ced61", "bridge": "br-int", "label": "tempest-network-smoke--804531616", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15d18429-a3", "ovs_interfaceid": "15d18429-a32d-4645-bc6f-dcdc238c5de9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "15c82795-bc8d-4e4d-9949-ded082705cd7", "address": "fa:16:3e:31:1e:bf", "network": {"id": "d1ec18dd-20d4-4643-8e73-7d404d8b8493", "bridge": "br-int", "label": "tempest-network-smoke--678224572", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe31:1ebf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe31:1ebf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15c82795-bc", "ovs_interfaceid": "15c82795-bc8d-4e4d-9949-ded082705cd7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:50:08 compute-0 nova_compute[192810]: 2025-09-30 21:50:08.228 2 DEBUG oslo_concurrency.lockutils [req-1dcfef04-3602-4b76-8e48-a62e55065e0e req-762bc9ad-86ec-4fec-a3f3-3933be6ad53d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-2f0fbc4b-01ff-4422-bed8-aa7ca8934f51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:50:08 compute-0 nova_compute[192810]: 2025-09-30 21:50:08.279 2 DEBUG nova.network.neutron [-] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:50:08 compute-0 nova_compute[192810]: 2025-09-30 21:50:08.297 2 INFO nova.compute.manager [-] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Took 1.36 seconds to deallocate network for instance.
Sep 30 21:50:08 compute-0 nova_compute[192810]: 2025-09-30 21:50:08.363 2 DEBUG oslo_concurrency.lockutils [None req-51784d32-de8c-45bc-b7d7-3d654a43b64a 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:08 compute-0 nova_compute[192810]: 2025-09-30 21:50:08.364 2 DEBUG oslo_concurrency.lockutils [None req-51784d32-de8c-45bc-b7d7-3d654a43b64a 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:08 compute-0 nova_compute[192810]: 2025-09-30 21:50:08.376 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:08 compute-0 nova_compute[192810]: 2025-09-30 21:50:08.488 2 DEBUG nova.compute.manager [req-86396482-b297-444d-9404-98736005ab65 req-91b1850b-c3ad-4e92-a6c1-73ac681e880a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Received event network-vif-deleted-15d18429-a32d-4645-bc6f-dcdc238c5de9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:50:08 compute-0 nova_compute[192810]: 2025-09-30 21:50:08.489 2 DEBUG nova.compute.manager [req-86396482-b297-444d-9404-98736005ab65 req-91b1850b-c3ad-4e92-a6c1-73ac681e880a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Received event network-vif-deleted-15c82795-bc8d-4e4d-9949-ded082705cd7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:50:08 compute-0 nova_compute[192810]: 2025-09-30 21:50:08.591 2 DEBUG nova.compute.provider_tree [None req-51784d32-de8c-45bc-b7d7-3d654a43b64a 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:50:08 compute-0 nova_compute[192810]: 2025-09-30 21:50:08.610 2 DEBUG nova.scheduler.client.report [None req-51784d32-de8c-45bc-b7d7-3d654a43b64a 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:50:08 compute-0 nova_compute[192810]: 2025-09-30 21:50:08.629 2 DEBUG oslo_concurrency.lockutils [None req-51784d32-de8c-45bc-b7d7-3d654a43b64a 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.266s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:08 compute-0 nova_compute[192810]: 2025-09-30 21:50:08.668 2 INFO nova.scheduler.client.report [None req-51784d32-de8c-45bc-b7d7-3d654a43b64a 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Deleted allocations for instance 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51
Sep 30 21:50:08 compute-0 nova_compute[192810]: 2025-09-30 21:50:08.786 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:50:08 compute-0 nova_compute[192810]: 2025-09-30 21:50:08.794 2 DEBUG oslo_concurrency.lockutils [None req-51784d32-de8c-45bc-b7d7-3d654a43b64a 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "2f0fbc4b-01ff-4422-bed8-aa7ca8934f51" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.277s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:09 compute-0 nova_compute[192810]: 2025-09-30 21:50:09.045 2 DEBUG nova.compute.manager [req-a0d2b5c2-cb26-412c-a1bc-2f91b6f02769 req-7bdb42a4-ebae-4526-93b6-d2661aa4d4e9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Received event network-vif-plugged-15d18429-a32d-4645-bc6f-dcdc238c5de9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:50:09 compute-0 nova_compute[192810]: 2025-09-30 21:50:09.046 2 DEBUG oslo_concurrency.lockutils [req-a0d2b5c2-cb26-412c-a1bc-2f91b6f02769 req-7bdb42a4-ebae-4526-93b6-d2661aa4d4e9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "2f0fbc4b-01ff-4422-bed8-aa7ca8934f51-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:09 compute-0 nova_compute[192810]: 2025-09-30 21:50:09.046 2 DEBUG oslo_concurrency.lockutils [req-a0d2b5c2-cb26-412c-a1bc-2f91b6f02769 req-7bdb42a4-ebae-4526-93b6-d2661aa4d4e9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2f0fbc4b-01ff-4422-bed8-aa7ca8934f51-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:09 compute-0 nova_compute[192810]: 2025-09-30 21:50:09.047 2 DEBUG oslo_concurrency.lockutils [req-a0d2b5c2-cb26-412c-a1bc-2f91b6f02769 req-7bdb42a4-ebae-4526-93b6-d2661aa4d4e9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2f0fbc4b-01ff-4422-bed8-aa7ca8934f51-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:09 compute-0 nova_compute[192810]: 2025-09-30 21:50:09.047 2 DEBUG nova.compute.manager [req-a0d2b5c2-cb26-412c-a1bc-2f91b6f02769 req-7bdb42a4-ebae-4526-93b6-d2661aa4d4e9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] No waiting events found dispatching network-vif-plugged-15d18429-a32d-4645-bc6f-dcdc238c5de9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:50:09 compute-0 nova_compute[192810]: 2025-09-30 21:50:09.048 2 WARNING nova.compute.manager [req-a0d2b5c2-cb26-412c-a1bc-2f91b6f02769 req-7bdb42a4-ebae-4526-93b6-d2661aa4d4e9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Received unexpected event network-vif-plugged-15d18429-a32d-4645-bc6f-dcdc238c5de9 for instance with vm_state deleted and task_state None.
Sep 30 21:50:09 compute-0 nova_compute[192810]: 2025-09-30 21:50:09.048 2 DEBUG nova.compute.manager [req-a0d2b5c2-cb26-412c-a1bc-2f91b6f02769 req-7bdb42a4-ebae-4526-93b6-d2661aa4d4e9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Received event network-vif-unplugged-15c82795-bc8d-4e4d-9949-ded082705cd7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:50:09 compute-0 nova_compute[192810]: 2025-09-30 21:50:09.049 2 DEBUG oslo_concurrency.lockutils [req-a0d2b5c2-cb26-412c-a1bc-2f91b6f02769 req-7bdb42a4-ebae-4526-93b6-d2661aa4d4e9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "2f0fbc4b-01ff-4422-bed8-aa7ca8934f51-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:09 compute-0 nova_compute[192810]: 2025-09-30 21:50:09.049 2 DEBUG oslo_concurrency.lockutils [req-a0d2b5c2-cb26-412c-a1bc-2f91b6f02769 req-7bdb42a4-ebae-4526-93b6-d2661aa4d4e9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2f0fbc4b-01ff-4422-bed8-aa7ca8934f51-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:09 compute-0 nova_compute[192810]: 2025-09-30 21:50:09.050 2 DEBUG oslo_concurrency.lockutils [req-a0d2b5c2-cb26-412c-a1bc-2f91b6f02769 req-7bdb42a4-ebae-4526-93b6-d2661aa4d4e9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2f0fbc4b-01ff-4422-bed8-aa7ca8934f51-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:09 compute-0 nova_compute[192810]: 2025-09-30 21:50:09.050 2 DEBUG nova.compute.manager [req-a0d2b5c2-cb26-412c-a1bc-2f91b6f02769 req-7bdb42a4-ebae-4526-93b6-d2661aa4d4e9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] No waiting events found dispatching network-vif-unplugged-15c82795-bc8d-4e4d-9949-ded082705cd7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:50:09 compute-0 nova_compute[192810]: 2025-09-30 21:50:09.051 2 WARNING nova.compute.manager [req-a0d2b5c2-cb26-412c-a1bc-2f91b6f02769 req-7bdb42a4-ebae-4526-93b6-d2661aa4d4e9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Received unexpected event network-vif-unplugged-15c82795-bc8d-4e4d-9949-ded082705cd7 for instance with vm_state deleted and task_state None.
Sep 30 21:50:09 compute-0 nova_compute[192810]: 2025-09-30 21:50:09.051 2 DEBUG nova.compute.manager [req-a0d2b5c2-cb26-412c-a1bc-2f91b6f02769 req-7bdb42a4-ebae-4526-93b6-d2661aa4d4e9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Received event network-vif-plugged-15c82795-bc8d-4e4d-9949-ded082705cd7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:50:09 compute-0 nova_compute[192810]: 2025-09-30 21:50:09.052 2 DEBUG oslo_concurrency.lockutils [req-a0d2b5c2-cb26-412c-a1bc-2f91b6f02769 req-7bdb42a4-ebae-4526-93b6-d2661aa4d4e9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "2f0fbc4b-01ff-4422-bed8-aa7ca8934f51-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:09 compute-0 nova_compute[192810]: 2025-09-30 21:50:09.052 2 DEBUG oslo_concurrency.lockutils [req-a0d2b5c2-cb26-412c-a1bc-2f91b6f02769 req-7bdb42a4-ebae-4526-93b6-d2661aa4d4e9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2f0fbc4b-01ff-4422-bed8-aa7ca8934f51-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:09 compute-0 nova_compute[192810]: 2025-09-30 21:50:09.053 2 DEBUG oslo_concurrency.lockutils [req-a0d2b5c2-cb26-412c-a1bc-2f91b6f02769 req-7bdb42a4-ebae-4526-93b6-d2661aa4d4e9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "2f0fbc4b-01ff-4422-bed8-aa7ca8934f51-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:09 compute-0 nova_compute[192810]: 2025-09-30 21:50:09.053 2 DEBUG nova.compute.manager [req-a0d2b5c2-cb26-412c-a1bc-2f91b6f02769 req-7bdb42a4-ebae-4526-93b6-d2661aa4d4e9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] No waiting events found dispatching network-vif-plugged-15c82795-bc8d-4e4d-9949-ded082705cd7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:50:09 compute-0 nova_compute[192810]: 2025-09-30 21:50:09.054 2 WARNING nova.compute.manager [req-a0d2b5c2-cb26-412c-a1bc-2f91b6f02769 req-7bdb42a4-ebae-4526-93b6-d2661aa4d4e9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Received unexpected event network-vif-plugged-15c82795-bc8d-4e4d-9949-ded082705cd7 for instance with vm_state deleted and task_state None.
Sep 30 21:50:09 compute-0 sshd-session[246878]: Connection closed by authenticating user root 8.210.178.40 port 53462 [preauth]
Sep 30 21:50:09 compute-0 nova_compute[192810]: 2025-09-30 21:50:09.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:50:10 compute-0 nova_compute[192810]: 2025-09-30 21:50:10.784 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:50:10 compute-0 nova_compute[192810]: 2025-09-30 21:50:10.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:50:10 compute-0 nova_compute[192810]: 2025-09-30 21:50:10.806 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:10 compute-0 nova_compute[192810]: 2025-09-30 21:50:10.806 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:10 compute-0 nova_compute[192810]: 2025-09-30 21:50:10.807 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:10 compute-0 nova_compute[192810]: 2025-09-30 21:50:10.807 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:50:10 compute-0 unix_chkpwd[247230]: password check failed for user (root)
Sep 30 21:50:10 compute-0 sshd-session[247227]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40  user=root
Sep 30 21:50:10 compute-0 nova_compute[192810]: 2025-09-30 21:50:10.952 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:50:10 compute-0 nova_compute[192810]: 2025-09-30 21:50:10.953 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5657MB free_disk=73.22970962524414GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:50:10 compute-0 nova_compute[192810]: 2025-09-30 21:50:10.954 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:10 compute-0 nova_compute[192810]: 2025-09-30 21:50:10.954 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:11 compute-0 nova_compute[192810]: 2025-09-30 21:50:11.012 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:50:11 compute-0 nova_compute[192810]: 2025-09-30 21:50:11.012 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:50:11 compute-0 nova_compute[192810]: 2025-09-30 21:50:11.036 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:50:11 compute-0 nova_compute[192810]: 2025-09-30 21:50:11.053 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:50:11 compute-0 nova_compute[192810]: 2025-09-30 21:50:11.078 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:50:11 compute-0 nova_compute[192810]: 2025-09-30 21:50:11.078 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:11 compute-0 nova_compute[192810]: 2025-09-30 21:50:11.078 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:50:11 compute-0 nova_compute[192810]: 2025-09-30 21:50:11.079 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Sep 30 21:50:11 compute-0 nova_compute[192810]: 2025-09-30 21:50:11.104 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Sep 30 21:50:11 compute-0 nova_compute[192810]: 2025-09-30 21:50:11.404 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759268996.4028947, 4ee4a775-05d1-45fc-b4f9-566ab8159710 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:50:11 compute-0 nova_compute[192810]: 2025-09-30 21:50:11.404 2 INFO nova.compute.manager [-] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] VM Stopped (Lifecycle Event)
Sep 30 21:50:11 compute-0 nova_compute[192810]: 2025-09-30 21:50:11.431 2 DEBUG nova.compute.manager [None req-5605c1db-987b-4fe4-ab60-556845a2c07a - - - - - -] [instance: 4ee4a775-05d1-45fc-b4f9-566ab8159710] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:50:11 compute-0 nova_compute[192810]: 2025-09-30 21:50:11.735 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:50:11 compute-0 nova_compute[192810]: 2025-09-30 21:50:11.855 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:12 compute-0 nova_compute[192810]: 2025-09-30 21:50:12.653 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:12 compute-0 sshd-session[247227]: Failed password for root from 8.210.178.40 port 54090 ssh2
Sep 30 21:50:12 compute-0 unix_chkpwd[247231]: password check failed for user (root)
Sep 30 21:50:13 compute-0 nova_compute[192810]: 2025-09-30 21:50:13.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:15 compute-0 sshd-session[247227]: Failed password for root from 8.210.178.40 port 54090 ssh2
Sep 30 21:50:16 compute-0 nova_compute[192810]: 2025-09-30 21:50:16.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:16 compute-0 unix_chkpwd[247232]: password check failed for user (root)
Sep 30 21:50:17 compute-0 nova_compute[192810]: 2025-09-30 21:50:17.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:50:18 compute-0 nova_compute[192810]: 2025-09-30 21:50:18.380 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:18 compute-0 sshd-session[247227]: Failed password for root from 8.210.178.40 port 54090 ssh2
Sep 30 21:50:20 compute-0 nova_compute[192810]: 2025-09-30 21:50:20.236 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:20 compute-0 unix_chkpwd[247233]: password check failed for user (root)
Sep 30 21:50:21 compute-0 podman[247235]: 2025-09-30 21:50:21.321942743 +0000 UTC m=+0.055536649 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Sep 30 21:50:21 compute-0 podman[247234]: 2025-09-30 21:50:21.356405559 +0000 UTC m=+0.089303648 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true)
Sep 30 21:50:21 compute-0 podman[247236]: 2025-09-30 21:50:21.356912402 +0000 UTC m=+0.088044677 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 21:50:21 compute-0 nova_compute[192810]: 2025-09-30 21:50:21.825 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759269006.8242133, 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:50:21 compute-0 nova_compute[192810]: 2025-09-30 21:50:21.826 2 INFO nova.compute.manager [-] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] VM Stopped (Lifecycle Event)
Sep 30 21:50:21 compute-0 nova_compute[192810]: 2025-09-30 21:50:21.866 2 DEBUG nova.compute.manager [None req-eb2915b9-33b4-4180-8384-1734e16bde40 - - - - - -] [instance: 2f0fbc4b-01ff-4422-bed8-aa7ca8934f51] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:50:21 compute-0 nova_compute[192810]: 2025-09-30 21:50:21.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:22 compute-0 sshd-session[247227]: Failed password for root from 8.210.178.40 port 54090 ssh2
Sep 30 21:50:23 compute-0 unix_chkpwd[247297]: password check failed for user (root)
Sep 30 21:50:23 compute-0 nova_compute[192810]: 2025-09-30 21:50:23.382 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:25 compute-0 sshd-session[247227]: Failed password for root from 8.210.178.40 port 54090 ssh2
Sep 30 21:50:26 compute-0 nova_compute[192810]: 2025-09-30 21:50:26.876 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:27 compute-0 unix_chkpwd[247299]: password check failed for user (root)
Sep 30 21:50:27 compute-0 unix_chkpwd[247301]: password check failed for user (root)
Sep 30 21:50:27 compute-0 sshd-session[247298]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=185.156.73.233  user=root
Sep 30 21:50:28 compute-0 nova_compute[192810]: 2025-09-30 21:50:28.385 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:29 compute-0 sshd-session[247227]: Failed password for root from 8.210.178.40 port 54090 ssh2
Sep 30 21:50:29 compute-0 nova_compute[192810]: 2025-09-30 21:50:29.995 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:30 compute-0 sshd-session[247298]: Failed password for root from 185.156.73.233 port 16138 ssh2
Sep 30 21:50:30 compute-0 podman[247302]: 2025-09-30 21:50:30.319323481 +0000 UTC m=+0.055554130 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:50:30 compute-0 podman[247303]: 2025-09-30 21:50:30.329271858 +0000 UTC m=+0.063456686 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, vendor=Red Hat, Inc., config_id=edpm, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, release=1755695350, io.openshift.tags=minimal rhel9, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, maintainer=Red Hat, Inc., name=ubi9-minimal)
Sep 30 21:50:30 compute-0 nova_compute[192810]: 2025-09-30 21:50:30.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:30 compute-0 nova_compute[192810]: 2025-09-30 21:50:30.724 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:30 compute-0 sshd-session[247227]: error: maximum authentication attempts exceeded for root from 8.210.178.40 port 54090 ssh2 [preauth]
Sep 30 21:50:30 compute-0 sshd-session[247227]: Disconnecting authenticating user root 8.210.178.40 port 54090: Too many authentication failures [preauth]
Sep 30 21:50:30 compute-0 sshd-session[247227]: PAM 5 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40  user=root
Sep 30 21:50:30 compute-0 sshd-session[247227]: PAM service(sshd) ignoring max retries; 6 > 3
Sep 30 21:50:31 compute-0 sshd-session[247298]: Connection closed by authenticating user root 185.156.73.233 port 16138 [preauth]
Sep 30 21:50:31 compute-0 nova_compute[192810]: 2025-09-30 21:50:31.879 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:33 compute-0 unix_chkpwd[247350]: password check failed for user (root)
Sep 30 21:50:33 compute-0 sshd-session[247348]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40  user=root
Sep 30 21:50:33 compute-0 nova_compute[192810]: 2025-09-30 21:50:33.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:35 compute-0 sshd-session[247348]: Failed password for root from 8.210.178.40 port 54872 ssh2
Sep 30 21:50:36 compute-0 nova_compute[192810]: 2025-09-30 21:50:36.882 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:37 compute-0 unix_chkpwd[247351]: password check failed for user (root)
Sep 30 21:50:37 compute-0 podman[247354]: 2025-09-30 21:50:37.319422792 +0000 UTC m=+0.050621108 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:50:37 compute-0 podman[247352]: 2025-09-30 21:50:37.324495628 +0000 UTC m=+0.062102093 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=multipathd)
Sep 30 21:50:37 compute-0 podman[247353]: 2025-09-30 21:50:37.324434216 +0000 UTC m=+0.055297334 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=iscsid, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Sep 30 21:50:38 compute-0 nova_compute[192810]: 2025-09-30 21:50:38.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:38 compute-0 sshd-session[247348]: Failed password for root from 8.210.178.40 port 54872 ssh2
Sep 30 21:50:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:38.756 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:38.757 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:38.757 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:39 compute-0 unix_chkpwd[247410]: password check failed for user (root)
Sep 30 21:50:41 compute-0 sshd-session[247348]: Failed password for root from 8.210.178.40 port 54872 ssh2
Sep 30 21:50:41 compute-0 nova_compute[192810]: 2025-09-30 21:50:41.885 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:43 compute-0 unix_chkpwd[247412]: password check failed for user (root)
Sep 30 21:50:43 compute-0 nova_compute[192810]: 2025-09-30 21:50:43.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:50:43.910 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:50:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:50:43.910 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:50:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:50:43.910 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:50:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:50:43.910 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:50:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:50:43.911 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:50:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:50:43.911 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:50:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:50:43.911 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:50:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:50:43.911 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:50:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:50:43.911 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:50:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:50:43.911 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:50:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:50:43.911 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:50:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:50:43.911 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:50:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:50:43.911 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:50:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:50:43.911 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:50:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:50:43.911 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:50:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:50:43.911 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:50:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:50:43.911 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:50:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:50:43.911 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:50:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:50:43.912 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:50:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:50:43.912 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:50:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:50:43.912 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:50:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:50:43.912 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:50:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:50:43.912 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:50:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:50:43.912 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:50:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:50:43.912 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:50:45 compute-0 nova_compute[192810]: 2025-09-30 21:50:45.296 2 DEBUG oslo_concurrency.lockutils [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Acquiring lock "49fa1fe9-fac0-491b-a5d9-ee6230be2b1f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:45 compute-0 nova_compute[192810]: 2025-09-30 21:50:45.298 2 DEBUG oslo_concurrency.lockutils [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Lock "49fa1fe9-fac0-491b-a5d9-ee6230be2b1f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:45 compute-0 sshd-session[247348]: Failed password for root from 8.210.178.40 port 54872 ssh2
Sep 30 21:50:45 compute-0 nova_compute[192810]: 2025-09-30 21:50:45.371 2 DEBUG nova.compute.manager [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:50:45 compute-0 nova_compute[192810]: 2025-09-30 21:50:45.590 2 DEBUG oslo_concurrency.lockutils [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:45 compute-0 nova_compute[192810]: 2025-09-30 21:50:45.590 2 DEBUG oslo_concurrency.lockutils [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:45 compute-0 nova_compute[192810]: 2025-09-30 21:50:45.600 2 DEBUG nova.virt.hardware [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:50:45 compute-0 nova_compute[192810]: 2025-09-30 21:50:45.601 2 INFO nova.compute.claims [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:50:45 compute-0 nova_compute[192810]: 2025-09-30 21:50:45.822 2 DEBUG nova.compute.provider_tree [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:50:45 compute-0 nova_compute[192810]: 2025-09-30 21:50:45.838 2 DEBUG nova.scheduler.client.report [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:50:45 compute-0 nova_compute[192810]: 2025-09-30 21:50:45.865 2 DEBUG oslo_concurrency.lockutils [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.274s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:45 compute-0 nova_compute[192810]: 2025-09-30 21:50:45.865 2 DEBUG nova.compute.manager [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:50:46 compute-0 nova_compute[192810]: 2025-09-30 21:50:46.007 2 DEBUG nova.compute.manager [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:50:46 compute-0 nova_compute[192810]: 2025-09-30 21:50:46.008 2 DEBUG nova.network.neutron [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:50:46 compute-0 nova_compute[192810]: 2025-09-30 21:50:46.030 2 INFO nova.virt.libvirt.driver [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:50:46 compute-0 nova_compute[192810]: 2025-09-30 21:50:46.076 2 DEBUG nova.compute.manager [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:50:46 compute-0 nova_compute[192810]: 2025-09-30 21:50:46.349 2 DEBUG nova.compute.manager [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:50:46 compute-0 nova_compute[192810]: 2025-09-30 21:50:46.350 2 DEBUG nova.virt.libvirt.driver [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:50:46 compute-0 nova_compute[192810]: 2025-09-30 21:50:46.351 2 INFO nova.virt.libvirt.driver [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Creating image(s)
Sep 30 21:50:46 compute-0 nova_compute[192810]: 2025-09-30 21:50:46.351 2 DEBUG oslo_concurrency.lockutils [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Acquiring lock "/var/lib/nova/instances/49fa1fe9-fac0-491b-a5d9-ee6230be2b1f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:46 compute-0 nova_compute[192810]: 2025-09-30 21:50:46.352 2 DEBUG oslo_concurrency.lockutils [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Lock "/var/lib/nova/instances/49fa1fe9-fac0-491b-a5d9-ee6230be2b1f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:46 compute-0 nova_compute[192810]: 2025-09-30 21:50:46.352 2 DEBUG oslo_concurrency.lockutils [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Lock "/var/lib/nova/instances/49fa1fe9-fac0-491b-a5d9-ee6230be2b1f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:46 compute-0 nova_compute[192810]: 2025-09-30 21:50:46.367 2 DEBUG oslo_concurrency.processutils [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:50:46 compute-0 nova_compute[192810]: 2025-09-30 21:50:46.421 2 DEBUG oslo_concurrency.processutils [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:50:46 compute-0 nova_compute[192810]: 2025-09-30 21:50:46.422 2 DEBUG oslo_concurrency.lockutils [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:46 compute-0 nova_compute[192810]: 2025-09-30 21:50:46.423 2 DEBUG oslo_concurrency.lockutils [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:46 compute-0 nova_compute[192810]: 2025-09-30 21:50:46.434 2 DEBUG oslo_concurrency.processutils [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:50:46 compute-0 nova_compute[192810]: 2025-09-30 21:50:46.487 2 DEBUG oslo_concurrency.processutils [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:50:46 compute-0 nova_compute[192810]: 2025-09-30 21:50:46.488 2 DEBUG oslo_concurrency.processutils [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/49fa1fe9-fac0-491b-a5d9-ee6230be2b1f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:50:46 compute-0 nova_compute[192810]: 2025-09-30 21:50:46.541 2 DEBUG oslo_concurrency.processutils [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/49fa1fe9-fac0-491b-a5d9-ee6230be2b1f/disk 1073741824" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:50:46 compute-0 nova_compute[192810]: 2025-09-30 21:50:46.542 2 DEBUG oslo_concurrency.lockutils [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:46 compute-0 nova_compute[192810]: 2025-09-30 21:50:46.542 2 DEBUG oslo_concurrency.processutils [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:50:46 compute-0 nova_compute[192810]: 2025-09-30 21:50:46.594 2 DEBUG oslo_concurrency.processutils [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:50:46 compute-0 nova_compute[192810]: 2025-09-30 21:50:46.595 2 DEBUG nova.virt.disk.api [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Checking if we can resize image /var/lib/nova/instances/49fa1fe9-fac0-491b-a5d9-ee6230be2b1f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:50:46 compute-0 nova_compute[192810]: 2025-09-30 21:50:46.596 2 DEBUG oslo_concurrency.processutils [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/49fa1fe9-fac0-491b-a5d9-ee6230be2b1f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:50:46 compute-0 nova_compute[192810]: 2025-09-30 21:50:46.650 2 DEBUG oslo_concurrency.processutils [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/49fa1fe9-fac0-491b-a5d9-ee6230be2b1f/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:50:46 compute-0 nova_compute[192810]: 2025-09-30 21:50:46.651 2 DEBUG nova.virt.disk.api [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Cannot resize image /var/lib/nova/instances/49fa1fe9-fac0-491b-a5d9-ee6230be2b1f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:50:46 compute-0 nova_compute[192810]: 2025-09-30 21:50:46.651 2 DEBUG nova.objects.instance [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Lazy-loading 'migration_context' on Instance uuid 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:50:46 compute-0 nova_compute[192810]: 2025-09-30 21:50:46.698 2 DEBUG nova.virt.libvirt.driver [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:50:46 compute-0 nova_compute[192810]: 2025-09-30 21:50:46.699 2 DEBUG nova.virt.libvirt.driver [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Ensure instance console log exists: /var/lib/nova/instances/49fa1fe9-fac0-491b-a5d9-ee6230be2b1f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:50:46 compute-0 nova_compute[192810]: 2025-09-30 21:50:46.699 2 DEBUG oslo_concurrency.lockutils [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:46 compute-0 nova_compute[192810]: 2025-09-30 21:50:46.699 2 DEBUG oslo_concurrency.lockutils [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:46 compute-0 nova_compute[192810]: 2025-09-30 21:50:46.700 2 DEBUG oslo_concurrency.lockutils [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:46 compute-0 nova_compute[192810]: 2025-09-30 21:50:46.887 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:47 compute-0 unix_chkpwd[247428]: password check failed for user (root)
Sep 30 21:50:47 compute-0 nova_compute[192810]: 2025-09-30 21:50:47.835 2 DEBUG nova.network.neutron [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Successfully created port: 08103e16-9b92-4dc8-a0e5-925d87c4fbfd _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:50:48 compute-0 nova_compute[192810]: 2025-09-30 21:50:48.495 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:49 compute-0 nova_compute[192810]: 2025-09-30 21:50:49.217 2 DEBUG nova.network.neutron [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Successfully updated port: 08103e16-9b92-4dc8-a0e5-925d87c4fbfd _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:50:49 compute-0 nova_compute[192810]: 2025-09-30 21:50:49.264 2 DEBUG oslo_concurrency.lockutils [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Acquiring lock "refresh_cache-49fa1fe9-fac0-491b-a5d9-ee6230be2b1f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:50:49 compute-0 nova_compute[192810]: 2025-09-30 21:50:49.264 2 DEBUG oslo_concurrency.lockutils [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Acquired lock "refresh_cache-49fa1fe9-fac0-491b-a5d9-ee6230be2b1f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:50:49 compute-0 nova_compute[192810]: 2025-09-30 21:50:49.264 2 DEBUG nova.network.neutron [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:50:49 compute-0 nova_compute[192810]: 2025-09-30 21:50:49.439 2 DEBUG nova.compute.manager [req-f52b5287-c96b-4217-9ff4-12736c459faa req-7208c14d-0930-4f09-bd90-b10b3f19b7fc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Received event network-changed-08103e16-9b92-4dc8-a0e5-925d87c4fbfd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:50:49 compute-0 nova_compute[192810]: 2025-09-30 21:50:49.440 2 DEBUG nova.compute.manager [req-f52b5287-c96b-4217-9ff4-12736c459faa req-7208c14d-0930-4f09-bd90-b10b3f19b7fc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Refreshing instance network info cache due to event network-changed-08103e16-9b92-4dc8-a0e5-925d87c4fbfd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:50:49 compute-0 nova_compute[192810]: 2025-09-30 21:50:49.440 2 DEBUG oslo_concurrency.lockutils [req-f52b5287-c96b-4217-9ff4-12736c459faa req-7208c14d-0930-4f09-bd90-b10b3f19b7fc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-49fa1fe9-fac0-491b-a5d9-ee6230be2b1f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:50:49 compute-0 sshd-session[247348]: Failed password for root from 8.210.178.40 port 54872 ssh2
Sep 30 21:50:49 compute-0 nova_compute[192810]: 2025-09-30 21:50:49.599 2 DEBUG nova.network.neutron [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:50:50 compute-0 nova_compute[192810]: 2025-09-30 21:50:50.418 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:50 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:50.418 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=40, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=39) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:50:50 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:50.419 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:50:51 compute-0 unix_chkpwd[247429]: password check failed for user (root)
Sep 30 21:50:51 compute-0 nova_compute[192810]: 2025-09-30 21:50:51.332 2 DEBUG nova.network.neutron [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Updating instance_info_cache with network_info: [{"id": "08103e16-9b92-4dc8-a0e5-925d87c4fbfd", "address": "fa:16:3e:40:bc:0a", "network": {"id": "7b3b9e17-84a1-4306-91a0-7da7cacfb7ff", "bridge": "br-int", "label": "tempest-TestServerMultinode-185555687-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f43a7c47a87248c1b68d9a785baccf21", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08103e16-9b", "ovs_interfaceid": "08103e16-9b92-4dc8-a0e5-925d87c4fbfd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:50:51 compute-0 nova_compute[192810]: 2025-09-30 21:50:51.349 2 DEBUG oslo_concurrency.lockutils [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Releasing lock "refresh_cache-49fa1fe9-fac0-491b-a5d9-ee6230be2b1f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:50:51 compute-0 nova_compute[192810]: 2025-09-30 21:50:51.349 2 DEBUG nova.compute.manager [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Instance network_info: |[{"id": "08103e16-9b92-4dc8-a0e5-925d87c4fbfd", "address": "fa:16:3e:40:bc:0a", "network": {"id": "7b3b9e17-84a1-4306-91a0-7da7cacfb7ff", "bridge": "br-int", "label": "tempest-TestServerMultinode-185555687-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f43a7c47a87248c1b68d9a785baccf21", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08103e16-9b", "ovs_interfaceid": "08103e16-9b92-4dc8-a0e5-925d87c4fbfd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:50:51 compute-0 nova_compute[192810]: 2025-09-30 21:50:51.350 2 DEBUG oslo_concurrency.lockutils [req-f52b5287-c96b-4217-9ff4-12736c459faa req-7208c14d-0930-4f09-bd90-b10b3f19b7fc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-49fa1fe9-fac0-491b-a5d9-ee6230be2b1f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:50:51 compute-0 nova_compute[192810]: 2025-09-30 21:50:51.350 2 DEBUG nova.network.neutron [req-f52b5287-c96b-4217-9ff4-12736c459faa req-7208c14d-0930-4f09-bd90-b10b3f19b7fc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Refreshing network info cache for port 08103e16-9b92-4dc8-a0e5-925d87c4fbfd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:50:51 compute-0 nova_compute[192810]: 2025-09-30 21:50:51.352 2 DEBUG nova.virt.libvirt.driver [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Start _get_guest_xml network_info=[{"id": "08103e16-9b92-4dc8-a0e5-925d87c4fbfd", "address": "fa:16:3e:40:bc:0a", "network": {"id": "7b3b9e17-84a1-4306-91a0-7da7cacfb7ff", "bridge": "br-int", "label": "tempest-TestServerMultinode-185555687-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f43a7c47a87248c1b68d9a785baccf21", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08103e16-9b", "ovs_interfaceid": "08103e16-9b92-4dc8-a0e5-925d87c4fbfd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:50:51 compute-0 nova_compute[192810]: 2025-09-30 21:50:51.355 2 WARNING nova.virt.libvirt.driver [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:50:51 compute-0 nova_compute[192810]: 2025-09-30 21:50:51.366 2 DEBUG nova.virt.libvirt.host [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:50:51 compute-0 nova_compute[192810]: 2025-09-30 21:50:51.367 2 DEBUG nova.virt.libvirt.host [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:50:51 compute-0 nova_compute[192810]: 2025-09-30 21:50:51.371 2 DEBUG nova.virt.libvirt.host [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:50:51 compute-0 nova_compute[192810]: 2025-09-30 21:50:51.371 2 DEBUG nova.virt.libvirt.host [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:50:51 compute-0 nova_compute[192810]: 2025-09-30 21:50:51.372 2 DEBUG nova.virt.libvirt.driver [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:50:51 compute-0 nova_compute[192810]: 2025-09-30 21:50:51.372 2 DEBUG nova.virt.hardware [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:50:51 compute-0 nova_compute[192810]: 2025-09-30 21:50:51.373 2 DEBUG nova.virt.hardware [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:50:51 compute-0 nova_compute[192810]: 2025-09-30 21:50:51.373 2 DEBUG nova.virt.hardware [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:50:51 compute-0 nova_compute[192810]: 2025-09-30 21:50:51.373 2 DEBUG nova.virt.hardware [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:50:51 compute-0 nova_compute[192810]: 2025-09-30 21:50:51.373 2 DEBUG nova.virt.hardware [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:50:51 compute-0 nova_compute[192810]: 2025-09-30 21:50:51.374 2 DEBUG nova.virt.hardware [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:50:51 compute-0 nova_compute[192810]: 2025-09-30 21:50:51.374 2 DEBUG nova.virt.hardware [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:50:51 compute-0 nova_compute[192810]: 2025-09-30 21:50:51.374 2 DEBUG nova.virt.hardware [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:50:51 compute-0 nova_compute[192810]: 2025-09-30 21:50:51.374 2 DEBUG nova.virt.hardware [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:50:51 compute-0 nova_compute[192810]: 2025-09-30 21:50:51.375 2 DEBUG nova.virt.hardware [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:50:51 compute-0 nova_compute[192810]: 2025-09-30 21:50:51.375 2 DEBUG nova.virt.hardware [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:50:51 compute-0 nova_compute[192810]: 2025-09-30 21:50:51.378 2 DEBUG nova.virt.libvirt.vif [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:50:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-2112337209',display_name='tempest-TestServerMultinode-server-2112337209',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testservermultinode-server-2112337209',id=171,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='03a9c17c97284fddad18a4babc1ac469',ramdisk_id='',reservation_id='r-r58w64d0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-104702739',owner_user_name='tempest-TestServerMultinode-104702739
-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:50:46Z,user_data=None,user_id='39fdea9ccc694a1aa18451c9c7b3bdcc',uuid=49fa1fe9-fac0-491b-a5d9-ee6230be2b1f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "08103e16-9b92-4dc8-a0e5-925d87c4fbfd", "address": "fa:16:3e:40:bc:0a", "network": {"id": "7b3b9e17-84a1-4306-91a0-7da7cacfb7ff", "bridge": "br-int", "label": "tempest-TestServerMultinode-185555687-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f43a7c47a87248c1b68d9a785baccf21", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08103e16-9b", "ovs_interfaceid": "08103e16-9b92-4dc8-a0e5-925d87c4fbfd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:50:51 compute-0 nova_compute[192810]: 2025-09-30 21:50:51.378 2 DEBUG nova.network.os_vif_util [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Converting VIF {"id": "08103e16-9b92-4dc8-a0e5-925d87c4fbfd", "address": "fa:16:3e:40:bc:0a", "network": {"id": "7b3b9e17-84a1-4306-91a0-7da7cacfb7ff", "bridge": "br-int", "label": "tempest-TestServerMultinode-185555687-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f43a7c47a87248c1b68d9a785baccf21", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08103e16-9b", "ovs_interfaceid": "08103e16-9b92-4dc8-a0e5-925d87c4fbfd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:50:51 compute-0 nova_compute[192810]: 2025-09-30 21:50:51.379 2 DEBUG nova.network.os_vif_util [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:40:bc:0a,bridge_name='br-int',has_traffic_filtering=True,id=08103e16-9b92-4dc8-a0e5-925d87c4fbfd,network=Network(7b3b9e17-84a1-4306-91a0-7da7cacfb7ff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap08103e16-9b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:50:51 compute-0 nova_compute[192810]: 2025-09-30 21:50:51.380 2 DEBUG nova.objects.instance [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Lazy-loading 'pci_devices' on Instance uuid 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:50:51 compute-0 nova_compute[192810]: 2025-09-30 21:50:51.392 2 DEBUG nova.virt.libvirt.driver [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:50:51 compute-0 nova_compute[192810]:   <uuid>49fa1fe9-fac0-491b-a5d9-ee6230be2b1f</uuid>
Sep 30 21:50:51 compute-0 nova_compute[192810]:   <name>instance-000000ab</name>
Sep 30 21:50:51 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:50:51 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:50:51 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:50:51 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:       <nova:name>tempest-TestServerMultinode-server-2112337209</nova:name>
Sep 30 21:50:51 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:50:51</nova:creationTime>
Sep 30 21:50:51 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:50:51 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:50:51 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:50:51 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:50:51 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:50:51 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:50:51 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:50:51 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:50:51 compute-0 nova_compute[192810]:         <nova:user uuid="39fdea9ccc694a1aa18451c9c7b3bdcc">tempest-TestServerMultinode-104702739-project-admin</nova:user>
Sep 30 21:50:51 compute-0 nova_compute[192810]:         <nova:project uuid="03a9c17c97284fddad18a4babc1ac469">tempest-TestServerMultinode-104702739</nova:project>
Sep 30 21:50:51 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:50:51 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:50:51 compute-0 nova_compute[192810]:         <nova:port uuid="08103e16-9b92-4dc8-a0e5-925d87c4fbfd">
Sep 30 21:50:51 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:50:51 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:50:51 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:50:51 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:50:51 compute-0 nova_compute[192810]:     <system>
Sep 30 21:50:51 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:50:51 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:50:51 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:50:51 compute-0 nova_compute[192810]:       <entry name="serial">49fa1fe9-fac0-491b-a5d9-ee6230be2b1f</entry>
Sep 30 21:50:51 compute-0 nova_compute[192810]:       <entry name="uuid">49fa1fe9-fac0-491b-a5d9-ee6230be2b1f</entry>
Sep 30 21:50:51 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     </system>
Sep 30 21:50:51 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:50:51 compute-0 nova_compute[192810]:   <os>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:   </os>
Sep 30 21:50:51 compute-0 nova_compute[192810]:   <features>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:   </features>
Sep 30 21:50:51 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:50:51 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:50:51 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:50:51 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:50:51 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:50:51 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/49fa1fe9-fac0-491b-a5d9-ee6230be2b1f/disk"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:50:51 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/49fa1fe9-fac0-491b-a5d9-ee6230be2b1f/disk.config"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:50:51 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:40:bc:0a"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:       <target dev="tap08103e16-9b"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:50:51 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/49fa1fe9-fac0-491b-a5d9-ee6230be2b1f/console.log" append="off"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     <video>
Sep 30 21:50:51 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     </video>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:50:51 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:50:51 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:50:51 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:50:51 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:50:51 compute-0 nova_compute[192810]: </domain>
Sep 30 21:50:51 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:50:51 compute-0 nova_compute[192810]: 2025-09-30 21:50:51.393 2 DEBUG nova.compute.manager [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Preparing to wait for external event network-vif-plugged-08103e16-9b92-4dc8-a0e5-925d87c4fbfd prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:50:51 compute-0 nova_compute[192810]: 2025-09-30 21:50:51.393 2 DEBUG oslo_concurrency.lockutils [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Acquiring lock "49fa1fe9-fac0-491b-a5d9-ee6230be2b1f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:51 compute-0 nova_compute[192810]: 2025-09-30 21:50:51.393 2 DEBUG oslo_concurrency.lockutils [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Lock "49fa1fe9-fac0-491b-a5d9-ee6230be2b1f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:51 compute-0 nova_compute[192810]: 2025-09-30 21:50:51.394 2 DEBUG oslo_concurrency.lockutils [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Lock "49fa1fe9-fac0-491b-a5d9-ee6230be2b1f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:51 compute-0 nova_compute[192810]: 2025-09-30 21:50:51.394 2 DEBUG nova.virt.libvirt.vif [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:50:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-2112337209',display_name='tempest-TestServerMultinode-server-2112337209',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testservermultinode-server-2112337209',id=171,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='03a9c17c97284fddad18a4babc1ac469',ramdisk_id='',reservation_id='r-r58w64d0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-104702739',owner_user_name='tempest-TestServerMultinode
-104702739-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:50:46Z,user_data=None,user_id='39fdea9ccc694a1aa18451c9c7b3bdcc',uuid=49fa1fe9-fac0-491b-a5d9-ee6230be2b1f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "08103e16-9b92-4dc8-a0e5-925d87c4fbfd", "address": "fa:16:3e:40:bc:0a", "network": {"id": "7b3b9e17-84a1-4306-91a0-7da7cacfb7ff", "bridge": "br-int", "label": "tempest-TestServerMultinode-185555687-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f43a7c47a87248c1b68d9a785baccf21", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08103e16-9b", "ovs_interfaceid": "08103e16-9b92-4dc8-a0e5-925d87c4fbfd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:50:51 compute-0 nova_compute[192810]: 2025-09-30 21:50:51.395 2 DEBUG nova.network.os_vif_util [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Converting VIF {"id": "08103e16-9b92-4dc8-a0e5-925d87c4fbfd", "address": "fa:16:3e:40:bc:0a", "network": {"id": "7b3b9e17-84a1-4306-91a0-7da7cacfb7ff", "bridge": "br-int", "label": "tempest-TestServerMultinode-185555687-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f43a7c47a87248c1b68d9a785baccf21", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08103e16-9b", "ovs_interfaceid": "08103e16-9b92-4dc8-a0e5-925d87c4fbfd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:50:51 compute-0 nova_compute[192810]: 2025-09-30 21:50:51.395 2 DEBUG nova.network.os_vif_util [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:40:bc:0a,bridge_name='br-int',has_traffic_filtering=True,id=08103e16-9b92-4dc8-a0e5-925d87c4fbfd,network=Network(7b3b9e17-84a1-4306-91a0-7da7cacfb7ff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap08103e16-9b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:50:51 compute-0 nova_compute[192810]: 2025-09-30 21:50:51.395 2 DEBUG os_vif [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:40:bc:0a,bridge_name='br-int',has_traffic_filtering=True,id=08103e16-9b92-4dc8-a0e5-925d87c4fbfd,network=Network(7b3b9e17-84a1-4306-91a0-7da7cacfb7ff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap08103e16-9b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:50:51 compute-0 nova_compute[192810]: 2025-09-30 21:50:51.396 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:51 compute-0 nova_compute[192810]: 2025-09-30 21:50:51.396 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:50:51 compute-0 nova_compute[192810]: 2025-09-30 21:50:51.396 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:50:51 compute-0 nova_compute[192810]: 2025-09-30 21:50:51.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:51 compute-0 nova_compute[192810]: 2025-09-30 21:50:51.400 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap08103e16-9b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:50:51 compute-0 nova_compute[192810]: 2025-09-30 21:50:51.400 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap08103e16-9b, col_values=(('external_ids', {'iface-id': '08103e16-9b92-4dc8-a0e5-925d87c4fbfd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:40:bc:0a', 'vm-uuid': '49fa1fe9-fac0-491b-a5d9-ee6230be2b1f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:50:51 compute-0 nova_compute[192810]: 2025-09-30 21:50:51.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:51 compute-0 NetworkManager[51733]: <info>  [1759269051.4020] manager: (tap08103e16-9b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/293)
Sep 30 21:50:51 compute-0 nova_compute[192810]: 2025-09-30 21:50:51.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:50:51 compute-0 nova_compute[192810]: 2025-09-30 21:50:51.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:51 compute-0 nova_compute[192810]: 2025-09-30 21:50:51.407 2 INFO os_vif [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:40:bc:0a,bridge_name='br-int',has_traffic_filtering=True,id=08103e16-9b92-4dc8-a0e5-925d87c4fbfd,network=Network(7b3b9e17-84a1-4306-91a0-7da7cacfb7ff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap08103e16-9b')
Sep 30 21:50:51 compute-0 nova_compute[192810]: 2025-09-30 21:50:51.452 2 DEBUG nova.virt.libvirt.driver [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:50:51 compute-0 nova_compute[192810]: 2025-09-30 21:50:51.453 2 DEBUG nova.virt.libvirt.driver [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:50:51 compute-0 nova_compute[192810]: 2025-09-30 21:50:51.453 2 DEBUG nova.virt.libvirt.driver [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] No VIF found with MAC fa:16:3e:40:bc:0a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:50:51 compute-0 nova_compute[192810]: 2025-09-30 21:50:51.453 2 INFO nova.virt.libvirt.driver [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Using config drive
Sep 30 21:50:52 compute-0 nova_compute[192810]: 2025-09-30 21:50:52.138 2 INFO nova.virt.libvirt.driver [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Creating config drive at /var/lib/nova/instances/49fa1fe9-fac0-491b-a5d9-ee6230be2b1f/disk.config
Sep 30 21:50:52 compute-0 nova_compute[192810]: 2025-09-30 21:50:52.142 2 DEBUG oslo_concurrency.processutils [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/49fa1fe9-fac0-491b-a5d9-ee6230be2b1f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsvbnwtje execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:50:52 compute-0 nova_compute[192810]: 2025-09-30 21:50:52.268 2 DEBUG oslo_concurrency.processutils [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/49fa1fe9-fac0-491b-a5d9-ee6230be2b1f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsvbnwtje" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:50:52 compute-0 podman[247436]: 2025-09-30 21:50:52.314343288 +0000 UTC m=+0.048410153 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, 
io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Sep 30 21:50:52 compute-0 kernel: tap08103e16-9b: entered promiscuous mode
Sep 30 21:50:52 compute-0 NetworkManager[51733]: <info>  [1759269052.3264] manager: (tap08103e16-9b): new Tun device (/org/freedesktop/NetworkManager/Devices/294)
Sep 30 21:50:52 compute-0 podman[247437]: 2025-09-30 21:50:52.32773221 +0000 UTC m=+0.058069373 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, io.buildah.version=1.41.3, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 21:50:52 compute-0 nova_compute[192810]: 2025-09-30 21:50:52.327 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:52 compute-0 nova_compute[192810]: 2025-09-30 21:50:52.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:52 compute-0 ovn_controller[94912]: 2025-09-30T21:50:52Z|00660|binding|INFO|Claiming lport 08103e16-9b92-4dc8-a0e5-925d87c4fbfd for this chassis.
Sep 30 21:50:52 compute-0 ovn_controller[94912]: 2025-09-30T21:50:52Z|00661|binding|INFO|08103e16-9b92-4dc8-a0e5-925d87c4fbfd: Claiming fa:16:3e:40:bc:0a 10.100.0.3
Sep 30 21:50:52 compute-0 systemd-udevd[247509]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:52.361 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:40:bc:0a 10.100.0.3'], port_security=['fa:16:3e:40:bc:0a 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '49fa1fe9-fac0-491b-a5d9-ee6230be2b1f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '03a9c17c97284fddad18a4babc1ac469', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ea43e653-1ffa-4770-a68a-2b8fd3d4fb27', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2923dc02-5672-47c9-85d4-ce6c19aec13e, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=08103e16-9b92-4dc8-a0e5-925d87c4fbfd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:52.362 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 08103e16-9b92-4dc8-a0e5-925d87c4fbfd in datapath 7b3b9e17-84a1-4306-91a0-7da7cacfb7ff bound to our chassis
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:52.363 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7b3b9e17-84a1-4306-91a0-7da7cacfb7ff
Sep 30 21:50:52 compute-0 systemd-machined[152794]: New machine qemu-81-instance-000000ab.
Sep 30 21:50:52 compute-0 NetworkManager[51733]: <info>  [1759269052.3726] device (tap08103e16-9b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:50:52 compute-0 NetworkManager[51733]: <info>  [1759269052.3735] device (tap08103e16-9b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:52.374 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[dfb88bc7-925c-4bca-8f02-13f0d48bafc5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:52.375 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7b3b9e17-81 in ovnmeta-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:52.376 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7b3b9e17-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:52.376 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[f6bc3a95-f2b7-4f6f-9e0f-6cfda4bf0aa6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:52.377 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[515b1ded-5fe5-4aae-9c43-d8ad7b3a2938]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:52 compute-0 systemd[1]: Started Virtual Machine qemu-81-instance-000000ab.
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:52.389 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[a3ec3367-e5a2-49b9-b8eb-c1f6765e85c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:52 compute-0 ovn_controller[94912]: 2025-09-30T21:50:52Z|00662|binding|INFO|Setting lport 08103e16-9b92-4dc8-a0e5-925d87c4fbfd ovn-installed in OVS
Sep 30 21:50:52 compute-0 ovn_controller[94912]: 2025-09-30T21:50:52Z|00663|binding|INFO|Setting lport 08103e16-9b92-4dc8-a0e5-925d87c4fbfd up in Southbound
Sep 30 21:50:52 compute-0 nova_compute[192810]: 2025-09-30 21:50:52.392 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:52.411 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e57045b1-d47c-40c7-915a-7ab4b039c261]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:52 compute-0 podman[247435]: 2025-09-30 21:50:52.416949095 +0000 UTC m=+0.153137033 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.build-date=20250923)
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:52.437 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[f3af10df-d7d7-4cd0-8186-fe30f84fd0cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:52.441 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[901a06cd-3c37-4c37-92a3-b1b42e382591]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:52 compute-0 NetworkManager[51733]: <info>  [1759269052.4427] manager: (tap7b3b9e17-80): new Veth device (/org/freedesktop/NetworkManager/Devices/295)
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:52.474 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[317ec7c2-92b1-4e3b-a855-6d4b647d9c35]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:52.477 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[b3992c16-5fec-41d9-a8f2-22ecb64410a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:52 compute-0 NetworkManager[51733]: <info>  [1759269052.4979] device (tap7b3b9e17-80): carrier: link connected
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:52.502 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[243344da-6fc6-4176-bb35-9dbe88959aef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:52.518 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[f28ac043-03b9-4a47-87ab-be2ed0f4aac5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7b3b9e17-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:92:22:9d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 200], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 574812, 'reachable_time': 23963, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247542, 'error': None, 'target': 'ovnmeta-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:52.533 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[eb37e13e-1a45-4bff-90bb-06cbb525fb9f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe92:229d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 574812, 'tstamp': 574812}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 247543, 'error': None, 'target': 'ovnmeta-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:52.548 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[93eedb88-24a0-4cfd-8bb0-0f5a27a32067]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7b3b9e17-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:92:22:9d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 200], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 574812, 'reachable_time': 23963, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 247544, 'error': None, 'target': 'ovnmeta-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:52.575 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[5605fb06-347e-43f3-970c-1003eeb26e0e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:52.631 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[18faa029-3009-4803-845d-cdb8ee6a6cb4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:52.632 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b3b9e17-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:52.633 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:52.633 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7b3b9e17-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:50:52 compute-0 nova_compute[192810]: 2025-09-30 21:50:52.634 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:52 compute-0 NetworkManager[51733]: <info>  [1759269052.6352] manager: (tap7b3b9e17-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/296)
Sep 30 21:50:52 compute-0 kernel: tap7b3b9e17-80: entered promiscuous mode
Sep 30 21:50:52 compute-0 nova_compute[192810]: 2025-09-30 21:50:52.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:52.638 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7b3b9e17-80, col_values=(('external_ids', {'iface-id': '2454d4c5-8981-43c0-9c8f-e79a575dc065'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:50:52 compute-0 nova_compute[192810]: 2025-09-30 21:50:52.638 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:52 compute-0 ovn_controller[94912]: 2025-09-30T21:50:52Z|00664|binding|INFO|Releasing lport 2454d4c5-8981-43c0-9c8f-e79a575dc065 from this chassis (sb_readonly=0)
Sep 30 21:50:52 compute-0 nova_compute[192810]: 2025-09-30 21:50:52.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:52.651 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7b3b9e17-84a1-4306-91a0-7da7cacfb7ff.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7b3b9e17-84a1-4306-91a0-7da7cacfb7ff.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:52.652 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[dce5e782-8e92-47a3-bc9d-afd792ca72d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:52.653 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/7b3b9e17-84a1-4306-91a0-7da7cacfb7ff.pid.haproxy
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID 7b3b9e17-84a1-4306-91a0-7da7cacfb7ff
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:50:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:52.655 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff', 'env', 'PROCESS_TAG=haproxy-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7b3b9e17-84a1-4306-91a0-7da7cacfb7ff.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:50:53 compute-0 podman[247583]: 2025-09-30 21:50:53.022633083 +0000 UTC m=+0.052969796 container create 3a278d2a69bc51d5044a515a7f45e3fc75559ccfe693a646772b559b9cd419f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Sep 30 21:50:53 compute-0 systemd[1]: Started libpod-conmon-3a278d2a69bc51d5044a515a7f45e3fc75559ccfe693a646772b559b9cd419f4.scope.
Sep 30 21:50:53 compute-0 podman[247583]: 2025-09-30 21:50:52.995235803 +0000 UTC m=+0.025572546 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:50:53 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:50:53 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bb79fea702c238397c1dad81781d6d9347d0e6c3b4d1801fc8aa465f115cd386/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:50:53 compute-0 podman[247583]: 2025-09-30 21:50:53.11957388 +0000 UTC m=+0.149910613 container init 3a278d2a69bc51d5044a515a7f45e3fc75559ccfe693a646772b559b9cd419f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:50:53 compute-0 podman[247583]: 2025-09-30 21:50:53.126258646 +0000 UTC m=+0.156595359 container start 3a278d2a69bc51d5044a515a7f45e3fc75559ccfe693a646772b559b9cd419f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20250923)
Sep 30 21:50:53 compute-0 neutron-haproxy-ovnmeta-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff[247598]: [NOTICE]   (247602) : New worker (247604) forked
Sep 30 21:50:53 compute-0 neutron-haproxy-ovnmeta-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff[247598]: [NOTICE]   (247602) : Loading success.
Sep 30 21:50:53 compute-0 nova_compute[192810]: 2025-09-30 21:50:53.216 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759269053.2158139, 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:50:53 compute-0 nova_compute[192810]: 2025-09-30 21:50:53.217 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] VM Started (Lifecycle Event)
Sep 30 21:50:53 compute-0 nova_compute[192810]: 2025-09-30 21:50:53.238 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:50:53 compute-0 nova_compute[192810]: 2025-09-30 21:50:53.242 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759269053.2159312, 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:50:53 compute-0 nova_compute[192810]: 2025-09-30 21:50:53.242 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] VM Paused (Lifecycle Event)
Sep 30 21:50:53 compute-0 nova_compute[192810]: 2025-09-30 21:50:53.259 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:50:53 compute-0 nova_compute[192810]: 2025-09-30 21:50:53.263 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:50:53 compute-0 nova_compute[192810]: 2025-09-30 21:50:53.282 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:50:53 compute-0 sshd-session[247348]: Failed password for root from 8.210.178.40 port 54872 ssh2
Sep 30 21:50:53 compute-0 nova_compute[192810]: 2025-09-30 21:50:53.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:53 compute-0 nova_compute[192810]: 2025-09-30 21:50:53.619 2 DEBUG nova.network.neutron [req-f52b5287-c96b-4217-9ff4-12736c459faa req-7208c14d-0930-4f09-bd90-b10b3f19b7fc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Updated VIF entry in instance network info cache for port 08103e16-9b92-4dc8-a0e5-925d87c4fbfd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:50:53 compute-0 nova_compute[192810]: 2025-09-30 21:50:53.620 2 DEBUG nova.network.neutron [req-f52b5287-c96b-4217-9ff4-12736c459faa req-7208c14d-0930-4f09-bd90-b10b3f19b7fc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Updating instance_info_cache with network_info: [{"id": "08103e16-9b92-4dc8-a0e5-925d87c4fbfd", "address": "fa:16:3e:40:bc:0a", "network": {"id": "7b3b9e17-84a1-4306-91a0-7da7cacfb7ff", "bridge": "br-int", "label": "tempest-TestServerMultinode-185555687-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f43a7c47a87248c1b68d9a785baccf21", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08103e16-9b", "ovs_interfaceid": "08103e16-9b92-4dc8-a0e5-925d87c4fbfd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:50:53 compute-0 nova_compute[192810]: 2025-09-30 21:50:53.637 2 DEBUG oslo_concurrency.lockutils [req-f52b5287-c96b-4217-9ff4-12736c459faa req-7208c14d-0930-4f09-bd90-b10b3f19b7fc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-49fa1fe9-fac0-491b-a5d9-ee6230be2b1f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:50:55 compute-0 sshd-session[247348]: error: maximum authentication attempts exceeded for root from 8.210.178.40 port 54872 ssh2 [preauth]
Sep 30 21:50:55 compute-0 sshd-session[247348]: Disconnecting authenticating user root 8.210.178.40 port 54872: Too many authentication failures [preauth]
Sep 30 21:50:55 compute-0 sshd-session[247348]: PAM 5 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40  user=root
Sep 30 21:50:55 compute-0 sshd-session[247348]: PAM service(sshd) ignoring max retries; 6 > 3
Sep 30 21:50:56 compute-0 nova_compute[192810]: 2025-09-30 21:50:56.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:50:56.420 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '40'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:50:56 compute-0 unix_chkpwd[247615]: password check failed for user (root)
Sep 30 21:50:56 compute-0 sshd-session[247613]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40  user=root
Sep 30 21:50:58 compute-0 nova_compute[192810]: 2025-09-30 21:50:58.203 2 DEBUG nova.compute.manager [req-685c20e6-85c6-45a0-89a9-320e7e5194a3 req-86c84a9e-b382-4657-b99d-dc86900b6682 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Received event network-vif-plugged-08103e16-9b92-4dc8-a0e5-925d87c4fbfd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:50:58 compute-0 nova_compute[192810]: 2025-09-30 21:50:58.204 2 DEBUG oslo_concurrency.lockutils [req-685c20e6-85c6-45a0-89a9-320e7e5194a3 req-86c84a9e-b382-4657-b99d-dc86900b6682 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "49fa1fe9-fac0-491b-a5d9-ee6230be2b1f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:50:58 compute-0 nova_compute[192810]: 2025-09-30 21:50:58.204 2 DEBUG oslo_concurrency.lockutils [req-685c20e6-85c6-45a0-89a9-320e7e5194a3 req-86c84a9e-b382-4657-b99d-dc86900b6682 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "49fa1fe9-fac0-491b-a5d9-ee6230be2b1f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:50:58 compute-0 nova_compute[192810]: 2025-09-30 21:50:58.205 2 DEBUG oslo_concurrency.lockutils [req-685c20e6-85c6-45a0-89a9-320e7e5194a3 req-86c84a9e-b382-4657-b99d-dc86900b6682 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "49fa1fe9-fac0-491b-a5d9-ee6230be2b1f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:58 compute-0 nova_compute[192810]: 2025-09-30 21:50:58.205 2 DEBUG nova.compute.manager [req-685c20e6-85c6-45a0-89a9-320e7e5194a3 req-86c84a9e-b382-4657-b99d-dc86900b6682 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Processing event network-vif-plugged-08103e16-9b92-4dc8-a0e5-925d87c4fbfd _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:50:58 compute-0 nova_compute[192810]: 2025-09-30 21:50:58.206 2 DEBUG nova.compute.manager [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Instance event wait completed in 4 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:50:58 compute-0 nova_compute[192810]: 2025-09-30 21:50:58.209 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759269058.2090278, 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:50:58 compute-0 nova_compute[192810]: 2025-09-30 21:50:58.209 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] VM Resumed (Lifecycle Event)
Sep 30 21:50:58 compute-0 nova_compute[192810]: 2025-09-30 21:50:58.211 2 DEBUG nova.virt.libvirt.driver [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:50:58 compute-0 nova_compute[192810]: 2025-09-30 21:50:58.214 2 INFO nova.virt.libvirt.driver [-] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Instance spawned successfully.
Sep 30 21:50:58 compute-0 nova_compute[192810]: 2025-09-30 21:50:58.215 2 DEBUG nova.virt.libvirt.driver [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:50:58 compute-0 nova_compute[192810]: 2025-09-30 21:50:58.249 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:50:58 compute-0 nova_compute[192810]: 2025-09-30 21:50:58.254 2 DEBUG nova.virt.libvirt.driver [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:50:58 compute-0 nova_compute[192810]: 2025-09-30 21:50:58.254 2 DEBUG nova.virt.libvirt.driver [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:50:58 compute-0 nova_compute[192810]: 2025-09-30 21:50:58.255 2 DEBUG nova.virt.libvirt.driver [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:50:58 compute-0 nova_compute[192810]: 2025-09-30 21:50:58.255 2 DEBUG nova.virt.libvirt.driver [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:50:58 compute-0 nova_compute[192810]: 2025-09-30 21:50:58.255 2 DEBUG nova.virt.libvirt.driver [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:50:58 compute-0 nova_compute[192810]: 2025-09-30 21:50:58.256 2 DEBUG nova.virt.libvirt.driver [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:50:58 compute-0 nova_compute[192810]: 2025-09-30 21:50:58.260 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:50:58 compute-0 nova_compute[192810]: 2025-09-30 21:50:58.318 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:50:58 compute-0 nova_compute[192810]: 2025-09-30 21:50:58.358 2 INFO nova.compute.manager [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Took 12.01 seconds to spawn the instance on the hypervisor.
Sep 30 21:50:58 compute-0 nova_compute[192810]: 2025-09-30 21:50:58.359 2 DEBUG nova.compute.manager [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:50:58 compute-0 nova_compute[192810]: 2025-09-30 21:50:58.450 2 INFO nova.compute.manager [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Took 12.92 seconds to build instance.
Sep 30 21:50:58 compute-0 nova_compute[192810]: 2025-09-30 21:50:58.477 2 DEBUG oslo_concurrency.lockutils [None req-082c4bbf-2b00-4003-83e3-ef7a30686aa9 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Lock "49fa1fe9-fac0-491b-a5d9-ee6230be2b1f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.179s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:50:58 compute-0 nova_compute[192810]: 2025-09-30 21:50:58.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:50:58 compute-0 sshd-session[247613]: Failed password for root from 8.210.178.40 port 55688 ssh2
Sep 30 21:50:59 compute-0 unix_chkpwd[247616]: password check failed for user (root)
Sep 30 21:51:01 compute-0 sshd-session[247613]: Failed password for root from 8.210.178.40 port 55688 ssh2
Sep 30 21:51:01 compute-0 podman[247617]: 2025-09-30 21:51:01.327326144 +0000 UTC m=+0.058793111 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Sep 30 21:51:01 compute-0 podman[247618]: 2025-09-30 21:51:01.33238503 +0000 UTC m=+0.064029231 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:51:01 compute-0 nova_compute[192810]: 2025-09-30 21:51:01.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:01 compute-0 nova_compute[192810]: 2025-09-30 21:51:01.804 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:51:02 compute-0 nova_compute[192810]: 2025-09-30 21:51:02.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:51:02 compute-0 nova_compute[192810]: 2025-09-30 21:51:02.787 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:51:02 compute-0 unix_chkpwd[247659]: password check failed for user (root)
Sep 30 21:51:03 compute-0 nova_compute[192810]: 2025-09-30 21:51:03.423 2 DEBUG nova.compute.manager [req-40924562-19d6-4e0e-a62f-55240561743a req-71396419-f4d7-4820-91ed-708987e172b0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Received event network-vif-plugged-08103e16-9b92-4dc8-a0e5-925d87c4fbfd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:51:03 compute-0 nova_compute[192810]: 2025-09-30 21:51:03.424 2 DEBUG oslo_concurrency.lockutils [req-40924562-19d6-4e0e-a62f-55240561743a req-71396419-f4d7-4820-91ed-708987e172b0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "49fa1fe9-fac0-491b-a5d9-ee6230be2b1f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:03 compute-0 nova_compute[192810]: 2025-09-30 21:51:03.424 2 DEBUG oslo_concurrency.lockutils [req-40924562-19d6-4e0e-a62f-55240561743a req-71396419-f4d7-4820-91ed-708987e172b0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "49fa1fe9-fac0-491b-a5d9-ee6230be2b1f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:03 compute-0 nova_compute[192810]: 2025-09-30 21:51:03.424 2 DEBUG oslo_concurrency.lockutils [req-40924562-19d6-4e0e-a62f-55240561743a req-71396419-f4d7-4820-91ed-708987e172b0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "49fa1fe9-fac0-491b-a5d9-ee6230be2b1f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:03 compute-0 nova_compute[192810]: 2025-09-30 21:51:03.424 2 DEBUG nova.compute.manager [req-40924562-19d6-4e0e-a62f-55240561743a req-71396419-f4d7-4820-91ed-708987e172b0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] No waiting events found dispatching network-vif-plugged-08103e16-9b92-4dc8-a0e5-925d87c4fbfd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:51:03 compute-0 nova_compute[192810]: 2025-09-30 21:51:03.425 2 WARNING nova.compute.manager [req-40924562-19d6-4e0e-a62f-55240561743a req-71396419-f4d7-4820-91ed-708987e172b0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Received unexpected event network-vif-plugged-08103e16-9b92-4dc8-a0e5-925d87c4fbfd for instance with vm_state active and task_state None.
Sep 30 21:51:03 compute-0 nova_compute[192810]: 2025-09-30 21:51:03.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:04 compute-0 nova_compute[192810]: 2025-09-30 21:51:04.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:51:05 compute-0 sshd-session[247613]: Failed password for root from 8.210.178.40 port 55688 ssh2
Sep 30 21:51:06 compute-0 nova_compute[192810]: 2025-09-30 21:51:06.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:06 compute-0 unix_chkpwd[247660]: password check failed for user (root)
Sep 30 21:51:07 compute-0 nova_compute[192810]: 2025-09-30 21:51:07.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:51:07 compute-0 nova_compute[192810]: 2025-09-30 21:51:07.949 2 DEBUG oslo_concurrency.lockutils [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "06072d75-591c-4422-8b92-2176427d6b4d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:07 compute-0 nova_compute[192810]: 2025-09-30 21:51:07.949 2 DEBUG oslo_concurrency.lockutils [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "06072d75-591c-4422-8b92-2176427d6b4d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:08 compute-0 nova_compute[192810]: 2025-09-30 21:51:08.056 2 DEBUG nova.compute.manager [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:51:08 compute-0 podman[247661]: 2025-09-30 21:51:08.3224464 +0000 UTC m=+0.057007137 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 21:51:08 compute-0 nova_compute[192810]: 2025-09-30 21:51:08.328 2 DEBUG oslo_concurrency.lockutils [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Acquiring lock "92360dac-27fd-4b9d-a568-c4679e939762" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:08 compute-0 nova_compute[192810]: 2025-09-30 21:51:08.328 2 DEBUG oslo_concurrency.lockutils [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Lock "92360dac-27fd-4b9d-a568-c4679e939762" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:08 compute-0 podman[247662]: 2025-09-30 21:51:08.33573844 +0000 UTC m=+0.066674277 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:51:08 compute-0 podman[247663]: 2025-09-30 21:51:08.339305498 +0000 UTC m=+0.063137698 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 21:51:08 compute-0 nova_compute[192810]: 2025-09-30 21:51:08.450 2 DEBUG oslo_concurrency.lockutils [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:08 compute-0 nova_compute[192810]: 2025-09-30 21:51:08.450 2 DEBUG oslo_concurrency.lockutils [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:08 compute-0 nova_compute[192810]: 2025-09-30 21:51:08.455 2 DEBUG nova.compute.manager [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:51:08 compute-0 nova_compute[192810]: 2025-09-30 21:51:08.460 2 DEBUG nova.virt.hardware [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:51:08 compute-0 nova_compute[192810]: 2025-09-30 21:51:08.461 2 INFO nova.compute.claims [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:51:08 compute-0 nova_compute[192810]: 2025-09-30 21:51:08.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:08 compute-0 nova_compute[192810]: 2025-09-30 21:51:08.841 2 DEBUG oslo_concurrency.lockutils [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.123 2 DEBUG nova.compute.provider_tree [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.147 2 DEBUG nova.scheduler.client.report [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.179 2 DEBUG oslo_concurrency.lockutils [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.729s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.180 2 DEBUG nova.compute.manager [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.183 2 DEBUG oslo_concurrency.lockutils [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.342s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.189 2 DEBUG nova.virt.hardware [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.189 2 INFO nova.compute.claims [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.294 2 DEBUG nova.compute.manager [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.294 2 DEBUG nova.network.neutron [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.316 2 INFO nova.virt.libvirt.driver [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.340 2 DEBUG nova.compute.manager [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.429 2 DEBUG nova.compute.provider_tree [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:51:09 compute-0 sshd-session[247613]: Failed password for root from 8.210.178.40 port 55688 ssh2
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.456 2 DEBUG nova.scheduler.client.report [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.488 2 DEBUG nova.compute.manager [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.489 2 DEBUG nova.virt.libvirt.driver [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.489 2 INFO nova.virt.libvirt.driver [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Creating image(s)
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.490 2 DEBUG oslo_concurrency.lockutils [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "/var/lib/nova/instances/06072d75-591c-4422-8b92-2176427d6b4d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.490 2 DEBUG oslo_concurrency.lockutils [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "/var/lib/nova/instances/06072d75-591c-4422-8b92-2176427d6b4d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.491 2 DEBUG oslo_concurrency.lockutils [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "/var/lib/nova/instances/06072d75-591c-4422-8b92-2176427d6b4d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.502 2 DEBUG oslo_concurrency.lockutils [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.319s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.502 2 DEBUG nova.compute.manager [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.505 2 DEBUG oslo_concurrency.processutils [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.560 2 DEBUG oslo_concurrency.processutils [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.561 2 DEBUG nova.compute.manager [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.562 2 DEBUG nova.network.neutron [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.565 2 DEBUG oslo_concurrency.lockutils [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.566 2 DEBUG oslo_concurrency.lockutils [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.581 2 DEBUG oslo_concurrency.processutils [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.602 2 INFO nova.virt.libvirt.driver [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.626 2 DEBUG nova.compute.manager [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.633 2 DEBUG oslo_concurrency.processutils [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.634 2 DEBUG oslo_concurrency.processutils [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/06072d75-591c-4422-8b92-2176427d6b4d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.664 2 DEBUG oslo_concurrency.processutils [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/06072d75-591c-4422-8b92-2176427d6b4d/disk 1073741824" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.665 2 DEBUG oslo_concurrency.lockutils [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.666 2 DEBUG oslo_concurrency.processutils [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.717 2 DEBUG oslo_concurrency.processutils [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.717 2 DEBUG nova.virt.disk.api [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Checking if we can resize image /var/lib/nova/instances/06072d75-591c-4422-8b92-2176427d6b4d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.718 2 DEBUG oslo_concurrency.processutils [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/06072d75-591c-4422-8b92-2176427d6b4d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.773 2 DEBUG oslo_concurrency.processutils [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/06072d75-591c-4422-8b92-2176427d6b4d/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.775 2 DEBUG nova.virt.disk.api [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Cannot resize image /var/lib/nova/instances/06072d75-591c-4422-8b92-2176427d6b4d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.776 2 DEBUG nova.objects.instance [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lazy-loading 'migration_context' on Instance uuid 06072d75-591c-4422-8b92-2176427d6b4d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.785 2 DEBUG nova.compute.manager [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.788 2 DEBUG nova.virt.libvirt.driver [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.788 2 INFO nova.virt.libvirt.driver [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Creating image(s)
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.789 2 DEBUG oslo_concurrency.lockutils [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Acquiring lock "/var/lib/nova/instances/92360dac-27fd-4b9d-a568-c4679e939762/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.790 2 DEBUG oslo_concurrency.lockutils [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Lock "/var/lib/nova/instances/92360dac-27fd-4b9d-a568-c4679e939762/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.791 2 DEBUG oslo_concurrency.lockutils [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Lock "/var/lib/nova/instances/92360dac-27fd-4b9d-a568-c4679e939762/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.817 2 DEBUG nova.virt.libvirt.driver [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.818 2 DEBUG nova.virt.libvirt.driver [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Ensure instance console log exists: /var/lib/nova/instances/06072d75-591c-4422-8b92-2176427d6b4d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.818 2 DEBUG oslo_concurrency.lockutils [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.819 2 DEBUG oslo_concurrency.lockutils [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.819 2 DEBUG oslo_concurrency.lockutils [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.820 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.820 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.820 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.823 2 DEBUG oslo_concurrency.processutils [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.846 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.847 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.879 2 DEBUG nova.policy [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '11769cb7bff84bcc8c33eaff10e3e1ba', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e9cc864ebd6746dc8aae9d41edaa3753', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.883 2 DEBUG oslo_concurrency.processutils [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.883 2 DEBUG oslo_concurrency.lockutils [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.884 2 DEBUG oslo_concurrency.lockutils [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.895 2 DEBUG oslo_concurrency.processutils [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.915 2 DEBUG nova.policy [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.954 2 DEBUG oslo_concurrency.processutils [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.955 2 DEBUG oslo_concurrency.processutils [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/92360dac-27fd-4b9d-a568-c4679e939762/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.998 2 DEBUG oslo_concurrency.processutils [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/92360dac-27fd-4b9d-a568-c4679e939762/disk 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:51:09 compute-0 nova_compute[192810]: 2025-09-30 21:51:09.999 2 DEBUG oslo_concurrency.lockutils [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:10 compute-0 nova_compute[192810]: 2025-09-30 21:51:10.000 2 DEBUG oslo_concurrency.processutils [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:51:10 compute-0 nova_compute[192810]: 2025-09-30 21:51:10.058 2 DEBUG oslo_concurrency.processutils [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:51:10 compute-0 nova_compute[192810]: 2025-09-30 21:51:10.060 2 DEBUG nova.virt.disk.api [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Checking if we can resize image /var/lib/nova/instances/92360dac-27fd-4b9d-a568-c4679e939762/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:51:10 compute-0 nova_compute[192810]: 2025-09-30 21:51:10.060 2 DEBUG oslo_concurrency.processutils [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/92360dac-27fd-4b9d-a568-c4679e939762/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:51:10 compute-0 nova_compute[192810]: 2025-09-30 21:51:10.114 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "refresh_cache-49fa1fe9-fac0-491b-a5d9-ee6230be2b1f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:51:10 compute-0 nova_compute[192810]: 2025-09-30 21:51:10.115 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquired lock "refresh_cache-49fa1fe9-fac0-491b-a5d9-ee6230be2b1f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:51:10 compute-0 nova_compute[192810]: 2025-09-30 21:51:10.115 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Sep 30 21:51:10 compute-0 nova_compute[192810]: 2025-09-30 21:51:10.115 2 DEBUG nova.objects.instance [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lazy-loading 'info_cache' on Instance uuid 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:51:10 compute-0 nova_compute[192810]: 2025-09-30 21:51:10.119 2 DEBUG oslo_concurrency.processutils [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/92360dac-27fd-4b9d-a568-c4679e939762/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:51:10 compute-0 nova_compute[192810]: 2025-09-30 21:51:10.119 2 DEBUG nova.virt.disk.api [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Cannot resize image /var/lib/nova/instances/92360dac-27fd-4b9d-a568-c4679e939762/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:51:10 compute-0 nova_compute[192810]: 2025-09-30 21:51:10.120 2 DEBUG nova.objects.instance [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Lazy-loading 'migration_context' on Instance uuid 92360dac-27fd-4b9d-a568-c4679e939762 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:51:10 compute-0 nova_compute[192810]: 2025-09-30 21:51:10.148 2 DEBUG nova.virt.libvirt.driver [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:51:10 compute-0 nova_compute[192810]: 2025-09-30 21:51:10.148 2 DEBUG nova.virt.libvirt.driver [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Ensure instance console log exists: /var/lib/nova/instances/92360dac-27fd-4b9d-a568-c4679e939762/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:51:10 compute-0 nova_compute[192810]: 2025-09-30 21:51:10.149 2 DEBUG oslo_concurrency.lockutils [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:10 compute-0 nova_compute[192810]: 2025-09-30 21:51:10.149 2 DEBUG oslo_concurrency.lockutils [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:10 compute-0 nova_compute[192810]: 2025-09-30 21:51:10.150 2 DEBUG oslo_concurrency.lockutils [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:10 compute-0 unix_chkpwd[247776]: password check failed for user (root)
Sep 30 21:51:11 compute-0 ovn_controller[94912]: 2025-09-30T21:51:11Z|00075|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:40:bc:0a 10.100.0.3
Sep 30 21:51:11 compute-0 ovn_controller[94912]: 2025-09-30T21:51:11Z|00076|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:40:bc:0a 10.100.0.3
Sep 30 21:51:11 compute-0 nova_compute[192810]: 2025-09-30 21:51:11.044 2 DEBUG nova.network.neutron [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Successfully created port: f8816163-e9f2-4e5b-8430-35d972cba5ec _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:51:11 compute-0 nova_compute[192810]: 2025-09-30 21:51:11.321 2 DEBUG nova.network.neutron [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Successfully created port: 25422545-2c61-41ab-9982-ca7761a24544 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:51:11 compute-0 nova_compute[192810]: 2025-09-30 21:51:11.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.005 2 DEBUG oslo_concurrency.lockutils [None req-ee7767ba-c912-44a7-8535-c94a76f6e208 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Acquiring lock "49fa1fe9-fac0-491b-a5d9-ee6230be2b1f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.006 2 DEBUG oslo_concurrency.lockutils [None req-ee7767ba-c912-44a7-8535-c94a76f6e208 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Lock "49fa1fe9-fac0-491b-a5d9-ee6230be2b1f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.006 2 DEBUG oslo_concurrency.lockutils [None req-ee7767ba-c912-44a7-8535-c94a76f6e208 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Acquiring lock "49fa1fe9-fac0-491b-a5d9-ee6230be2b1f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.006 2 DEBUG oslo_concurrency.lockutils [None req-ee7767ba-c912-44a7-8535-c94a76f6e208 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Lock "49fa1fe9-fac0-491b-a5d9-ee6230be2b1f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.007 2 DEBUG oslo_concurrency.lockutils [None req-ee7767ba-c912-44a7-8535-c94a76f6e208 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Lock "49fa1fe9-fac0-491b-a5d9-ee6230be2b1f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.016 2 INFO nova.compute.manager [None req-ee7767ba-c912-44a7-8535-c94a76f6e208 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Terminating instance
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.028 2 DEBUG nova.compute.manager [None req-ee7767ba-c912-44a7-8535-c94a76f6e208 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:51:12 compute-0 kernel: tap08103e16-9b (unregistering): left promiscuous mode
Sep 30 21:51:12 compute-0 NetworkManager[51733]: <info>  [1759269072.0597] device (tap08103e16-9b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:51:12 compute-0 ovn_controller[94912]: 2025-09-30T21:51:12Z|00665|binding|INFO|Releasing lport 08103e16-9b92-4dc8-a0e5-925d87c4fbfd from this chassis (sb_readonly=0)
Sep 30 21:51:12 compute-0 ovn_controller[94912]: 2025-09-30T21:51:12Z|00666|binding|INFO|Setting lport 08103e16-9b92-4dc8-a0e5-925d87c4fbfd down in Southbound
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.065 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:12 compute-0 ovn_controller[94912]: 2025-09-30T21:51:12Z|00667|binding|INFO|Removing iface tap08103e16-9b ovn-installed in OVS
Sep 30 21:51:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:12.075 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:40:bc:0a 10.100.0.3'], port_security=['fa:16:3e:40:bc:0a 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '49fa1fe9-fac0-491b-a5d9-ee6230be2b1f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '03a9c17c97284fddad18a4babc1ac469', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ea43e653-1ffa-4770-a68a-2b8fd3d4fb27', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2923dc02-5672-47c9-85d4-ce6c19aec13e, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=08103e16-9b92-4dc8-a0e5-925d87c4fbfd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:51:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:12.077 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 08103e16-9b92-4dc8-a0e5-925d87c4fbfd in datapath 7b3b9e17-84a1-4306-91a0-7da7cacfb7ff unbound from our chassis
Sep 30 21:51:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:12.078 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7b3b9e17-84a1-4306-91a0-7da7cacfb7ff, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:51:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:12.080 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[33fc4a74-6dd3-4c2f-ab26-45bd02a511cf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:12.080 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff namespace which is not needed anymore
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.084 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:12 compute-0 systemd[1]: machine-qemu\x2d81\x2dinstance\x2d000000ab.scope: Deactivated successfully.
Sep 30 21:51:12 compute-0 systemd[1]: machine-qemu\x2d81\x2dinstance\x2d000000ab.scope: Consumed 12.482s CPU time.
Sep 30 21:51:12 compute-0 systemd-machined[152794]: Machine qemu-81-instance-000000ab terminated.
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.179 2 DEBUG nova.network.neutron [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Successfully created port: 6c5de877-502e-47f1-b767-d9d472689330 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:51:12 compute-0 neutron-haproxy-ovnmeta-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff[247598]: [NOTICE]   (247602) : haproxy version is 2.8.14-c23fe91
Sep 30 21:51:12 compute-0 neutron-haproxy-ovnmeta-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff[247598]: [NOTICE]   (247602) : path to executable is /usr/sbin/haproxy
Sep 30 21:51:12 compute-0 neutron-haproxy-ovnmeta-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff[247598]: [WARNING]  (247602) : Exiting Master process...
Sep 30 21:51:12 compute-0 neutron-haproxy-ovnmeta-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff[247598]: [ALERT]    (247602) : Current worker (247604) exited with code 143 (Terminated)
Sep 30 21:51:12 compute-0 neutron-haproxy-ovnmeta-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff[247598]: [WARNING]  (247602) : All workers exited. Exiting... (0)
Sep 30 21:51:12 compute-0 systemd[1]: libpod-3a278d2a69bc51d5044a515a7f45e3fc75559ccfe693a646772b559b9cd419f4.scope: Deactivated successfully.
Sep 30 21:51:12 compute-0 podman[247801]: 2025-09-30 21:51:12.197205443 +0000 UTC m=+0.041823069 container died 3a278d2a69bc51d5044a515a7f45e3fc75559ccfe693a646772b559b9cd419f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Sep 30 21:51:12 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3a278d2a69bc51d5044a515a7f45e3fc75559ccfe693a646772b559b9cd419f4-userdata-shm.mount: Deactivated successfully.
Sep 30 21:51:12 compute-0 systemd[1]: var-lib-containers-storage-overlay-bb79fea702c238397c1dad81781d6d9347d0e6c3b4d1801fc8aa465f115cd386-merged.mount: Deactivated successfully.
Sep 30 21:51:12 compute-0 podman[247801]: 2025-09-30 21:51:12.228906911 +0000 UTC m=+0.073524537 container cleanup 3a278d2a69bc51d5044a515a7f45e3fc75559ccfe693a646772b559b9cd419f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:51:12 compute-0 systemd[1]: libpod-conmon-3a278d2a69bc51d5044a515a7f45e3fc75559ccfe693a646772b559b9cd419f4.scope: Deactivated successfully.
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.286 2 INFO nova.virt.libvirt.driver [-] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Instance destroyed successfully.
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.287 2 DEBUG nova.objects.instance [None req-ee7767ba-c912-44a7-8535-c94a76f6e208 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Lazy-loading 'resources' on Instance uuid 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:51:12 compute-0 podman[247830]: 2025-09-30 21:51:12.289499735 +0000 UTC m=+0.042143227 container remove 3a278d2a69bc51d5044a515a7f45e3fc75559ccfe693a646772b559b9cd419f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.291 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Updating instance_info_cache with network_info: [{"id": "08103e16-9b92-4dc8-a0e5-925d87c4fbfd", "address": "fa:16:3e:40:bc:0a", "network": {"id": "7b3b9e17-84a1-4306-91a0-7da7cacfb7ff", "bridge": "br-int", "label": "tempest-TestServerMultinode-185555687-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f43a7c47a87248c1b68d9a785baccf21", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08103e16-9b", "ovs_interfaceid": "08103e16-9b92-4dc8-a0e5-925d87c4fbfd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:51:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:12.297 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[cb2522b8-7e67-4748-98f5-6f42b119160b]: (4, ('Tue Sep 30 09:51:12 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff (3a278d2a69bc51d5044a515a7f45e3fc75559ccfe693a646772b559b9cd419f4)\n3a278d2a69bc51d5044a515a7f45e3fc75559ccfe693a646772b559b9cd419f4\nTue Sep 30 09:51:12 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff (3a278d2a69bc51d5044a515a7f45e3fc75559ccfe693a646772b559b9cd419f4)\n3a278d2a69bc51d5044a515a7f45e3fc75559ccfe693a646772b559b9cd419f4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:12.298 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[d34c6f23-6ab2-432a-bfeb-920613679077]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:12.299 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b3b9e17-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:12 compute-0 kernel: tap7b3b9e17-80: left promiscuous mode
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:12.318 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[de15deac-3081-48a9-ab09-0c20d7d38285]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.318 2 DEBUG nova.virt.libvirt.vif [None req-ee7767ba-c912-44a7-8535-c94a76f6e208 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:50:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerMultinode-server-2112337209',display_name='tempest-TestServerMultinode-server-2112337209',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testservermultinode-server-2112337209',id=171,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:50:58Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='03a9c17c97284fddad18a4babc1ac469',ramdisk_id='',reservation_id='r-r58w64d0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_dis
k='1',image_min_ram='0',owner_project_name='tempest-TestServerMultinode-104702739',owner_user_name='tempest-TestServerMultinode-104702739-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:50:58Z,user_data=None,user_id='39fdea9ccc694a1aa18451c9c7b3bdcc',uuid=49fa1fe9-fac0-491b-a5d9-ee6230be2b1f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "08103e16-9b92-4dc8-a0e5-925d87c4fbfd", "address": "fa:16:3e:40:bc:0a", "network": {"id": "7b3b9e17-84a1-4306-91a0-7da7cacfb7ff", "bridge": "br-int", "label": "tempest-TestServerMultinode-185555687-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f43a7c47a87248c1b68d9a785baccf21", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08103e16-9b", "ovs_interfaceid": "08103e16-9b92-4dc8-a0e5-925d87c4fbfd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.319 2 DEBUG nova.network.os_vif_util [None req-ee7767ba-c912-44a7-8535-c94a76f6e208 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Converting VIF {"id": "08103e16-9b92-4dc8-a0e5-925d87c4fbfd", "address": "fa:16:3e:40:bc:0a", "network": {"id": "7b3b9e17-84a1-4306-91a0-7da7cacfb7ff", "bridge": "br-int", "label": "tempest-TestServerMultinode-185555687-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "f43a7c47a87248c1b68d9a785baccf21", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap08103e16-9b", "ovs_interfaceid": "08103e16-9b92-4dc8-a0e5-925d87c4fbfd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.319 2 DEBUG nova.network.os_vif_util [None req-ee7767ba-c912-44a7-8535-c94a76f6e208 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:40:bc:0a,bridge_name='br-int',has_traffic_filtering=True,id=08103e16-9b92-4dc8-a0e5-925d87c4fbfd,network=Network(7b3b9e17-84a1-4306-91a0-7da7cacfb7ff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap08103e16-9b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.319 2 DEBUG os_vif [None req-ee7767ba-c912-44a7-8535-c94a76f6e208 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:40:bc:0a,bridge_name='br-int',has_traffic_filtering=True,id=08103e16-9b92-4dc8-a0e5-925d87c4fbfd,network=Network(7b3b9e17-84a1-4306-91a0-7da7cacfb7ff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap08103e16-9b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.323 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.324 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap08103e16-9b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.325 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Releasing lock "refresh_cache-49fa1fe9-fac0-491b-a5d9-ee6230be2b1f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.326 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.326 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.326 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.327 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.327 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.329 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.331 2 INFO os_vif [None req-ee7767ba-c912-44a7-8535-c94a76f6e208 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:40:bc:0a,bridge_name='br-int',has_traffic_filtering=True,id=08103e16-9b92-4dc8-a0e5-925d87c4fbfd,network=Network(7b3b9e17-84a1-4306-91a0-7da7cacfb7ff),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap08103e16-9b')
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.331 2 INFO nova.virt.libvirt.driver [None req-ee7767ba-c912-44a7-8535-c94a76f6e208 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Deleting instance files /var/lib/nova/instances/49fa1fe9-fac0-491b-a5d9-ee6230be2b1f_del
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.332 2 INFO nova.virt.libvirt.driver [None req-ee7767ba-c912-44a7-8535-c94a76f6e208 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Deletion of /var/lib/nova/instances/49fa1fe9-fac0-491b-a5d9-ee6230be2b1f_del complete
Sep 30 21:51:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:12.349 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[db4d8db1-db56-4063-9ed9-5328bc660a93]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:12.350 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[8762ba23-dec4-4ec0-92f2-61661f364b3c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:12.365 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[b443c1a7-ed94-4a9f-bddf-7c9b2cc0a487]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 574805, 'reachable_time': 31166, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247866, 'error': None, 'target': 'ovnmeta-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:12.367 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7b3b9e17-84a1-4306-91a0-7da7cacfb7ff deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:51:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:12.367 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[9c944924-687f-4e62-b8d0-89dd0831e4d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:12 compute-0 systemd[1]: run-netns-ovnmeta\x2d7b3b9e17\x2d84a1\x2d4306\x2d91a0\x2d7da7cacfb7ff.mount: Deactivated successfully.
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.394 2 DEBUG nova.network.neutron [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Successfully updated port: f8816163-e9f2-4e5b-8430-35d972cba5ec _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.410 2 DEBUG nova.compute.manager [req-f94b85d6-db72-41a6-9f3a-b3906cdf550e req-d9120af7-ff7c-4d21-a570-68da40de3365 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Received event network-vif-unplugged-08103e16-9b92-4dc8-a0e5-925d87c4fbfd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.410 2 DEBUG oslo_concurrency.lockutils [req-f94b85d6-db72-41a6-9f3a-b3906cdf550e req-d9120af7-ff7c-4d21-a570-68da40de3365 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "49fa1fe9-fac0-491b-a5d9-ee6230be2b1f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.410 2 DEBUG oslo_concurrency.lockutils [req-f94b85d6-db72-41a6-9f3a-b3906cdf550e req-d9120af7-ff7c-4d21-a570-68da40de3365 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "49fa1fe9-fac0-491b-a5d9-ee6230be2b1f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.411 2 DEBUG oslo_concurrency.lockutils [req-f94b85d6-db72-41a6-9f3a-b3906cdf550e req-d9120af7-ff7c-4d21-a570-68da40de3365 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "49fa1fe9-fac0-491b-a5d9-ee6230be2b1f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.411 2 DEBUG nova.compute.manager [req-f94b85d6-db72-41a6-9f3a-b3906cdf550e req-d9120af7-ff7c-4d21-a570-68da40de3365 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] No waiting events found dispatching network-vif-unplugged-08103e16-9b92-4dc8-a0e5-925d87c4fbfd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.411 2 DEBUG nova.compute.manager [req-f94b85d6-db72-41a6-9f3a-b3906cdf550e req-d9120af7-ff7c-4d21-a570-68da40de3365 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Received event network-vif-unplugged-08103e16-9b92-4dc8-a0e5-925d87c4fbfd for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.412 2 DEBUG oslo_concurrency.lockutils [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Acquiring lock "refresh_cache-92360dac-27fd-4b9d-a568-c4679e939762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.412 2 DEBUG oslo_concurrency.lockutils [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Acquired lock "refresh_cache-92360dac-27fd-4b9d-a568-c4679e939762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.412 2 DEBUG nova.network.neutron [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.453 2 INFO nova.compute.manager [None req-ee7767ba-c912-44a7-8535-c94a76f6e208 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Took 0.42 seconds to destroy the instance on the hypervisor.
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.453 2 DEBUG oslo.service.loopingcall [None req-ee7767ba-c912-44a7-8535-c94a76f6e208 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.453 2 DEBUG nova.compute.manager [-] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.454 2 DEBUG nova.network.neutron [-] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.574 2 DEBUG nova.compute.manager [req-300fd847-dc35-4a97-a561-3be1bc82203f req-e6d237eb-8f80-4e4a-8918-1c372629f9e6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Received event network-changed-f8816163-e9f2-4e5b-8430-35d972cba5ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.574 2 DEBUG nova.compute.manager [req-300fd847-dc35-4a97-a561-3be1bc82203f req-e6d237eb-8f80-4e4a-8918-1c372629f9e6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Refreshing instance network info cache due to event network-changed-f8816163-e9f2-4e5b-8430-35d972cba5ec. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.574 2 DEBUG oslo_concurrency.lockutils [req-300fd847-dc35-4a97-a561-3be1bc82203f req-e6d237eb-8f80-4e4a-8918-1c372629f9e6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-92360dac-27fd-4b9d-a568-c4679e939762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.690 2 DEBUG nova.network.neutron [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.815 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.816 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.816 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.816 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:51:12 compute-0 sshd-session[247613]: Failed password for root from 8.210.178.40 port 55688 ssh2
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.988 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.989 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5682MB free_disk=73.22882080078125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.989 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:12 compute-0 nova_compute[192810]: 2025-09-30 21:51:12.990 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:13 compute-0 nova_compute[192810]: 2025-09-30 21:51:13.095 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Instance 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:51:13 compute-0 nova_compute[192810]: 2025-09-30 21:51:13.095 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Instance 06072d75-591c-4422-8b92-2176427d6b4d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:51:13 compute-0 nova_compute[192810]: 2025-09-30 21:51:13.095 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Instance 92360dac-27fd-4b9d-a568-c4679e939762 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:51:13 compute-0 nova_compute[192810]: 2025-09-30 21:51:13.096 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:51:13 compute-0 nova_compute[192810]: 2025-09-30 21:51:13.096 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:51:13 compute-0 nova_compute[192810]: 2025-09-30 21:51:13.152 2 DEBUG nova.network.neutron [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Successfully updated port: 25422545-2c61-41ab-9982-ca7761a24544 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:51:13 compute-0 nova_compute[192810]: 2025-09-30 21:51:13.229 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:51:13 compute-0 nova_compute[192810]: 2025-09-30 21:51:13.245 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:51:13 compute-0 nova_compute[192810]: 2025-09-30 21:51:13.280 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:51:13 compute-0 nova_compute[192810]: 2025-09-30 21:51:13.281 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.291s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:13 compute-0 nova_compute[192810]: 2025-09-30 21:51:13.293 2 DEBUG nova.network.neutron [-] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:51:13 compute-0 nova_compute[192810]: 2025-09-30 21:51:13.319 2 INFO nova.compute.manager [-] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Took 0.87 seconds to deallocate network for instance.
Sep 30 21:51:13 compute-0 nova_compute[192810]: 2025-09-30 21:51:13.405 2 DEBUG oslo_concurrency.lockutils [None req-ee7767ba-c912-44a7-8535-c94a76f6e208 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:13 compute-0 nova_compute[192810]: 2025-09-30 21:51:13.406 2 DEBUG oslo_concurrency.lockutils [None req-ee7767ba-c912-44a7-8535-c94a76f6e208 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:13 compute-0 nova_compute[192810]: 2025-09-30 21:51:13.497 2 DEBUG nova.compute.provider_tree [None req-ee7767ba-c912-44a7-8535-c94a76f6e208 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:51:13 compute-0 nova_compute[192810]: 2025-09-30 21:51:13.505 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:13 compute-0 nova_compute[192810]: 2025-09-30 21:51:13.513 2 DEBUG nova.scheduler.client.report [None req-ee7767ba-c912-44a7-8535-c94a76f6e208 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:51:13 compute-0 nova_compute[192810]: 2025-09-30 21:51:13.550 2 DEBUG oslo_concurrency.lockutils [None req-ee7767ba-c912-44a7-8535-c94a76f6e208 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:13 compute-0 nova_compute[192810]: 2025-09-30 21:51:13.613 2 INFO nova.scheduler.client.report [None req-ee7767ba-c912-44a7-8535-c94a76f6e208 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Deleted allocations for instance 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f
Sep 30 21:51:13 compute-0 nova_compute[192810]: 2025-09-30 21:51:13.704 2 DEBUG oslo_concurrency.lockutils [None req-ee7767ba-c912-44a7-8535-c94a76f6e208 39fdea9ccc694a1aa18451c9c7b3bdcc 03a9c17c97284fddad18a4babc1ac469 - - default default] Lock "49fa1fe9-fac0-491b-a5d9-ee6230be2b1f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.698s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.118 2 DEBUG nova.network.neutron [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Updating instance_info_cache with network_info: [{"id": "f8816163-e9f2-4e5b-8430-35d972cba5ec", "address": "fa:16:3e:5f:48:ea", "network": {"id": "0eb83f8f-b5f5-45c8-b684-88cef476aa77", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-1825927682-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e9cc864ebd6746dc8aae9d41edaa3753", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8816163-e9", "ovs_interfaceid": "f8816163-e9f2-4e5b-8430-35d972cba5ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.154 2 DEBUG oslo_concurrency.lockutils [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Releasing lock "refresh_cache-92360dac-27fd-4b9d-a568-c4679e939762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.154 2 DEBUG nova.compute.manager [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Instance network_info: |[{"id": "f8816163-e9f2-4e5b-8430-35d972cba5ec", "address": "fa:16:3e:5f:48:ea", "network": {"id": "0eb83f8f-b5f5-45c8-b684-88cef476aa77", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-1825927682-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e9cc864ebd6746dc8aae9d41edaa3753", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8816163-e9", "ovs_interfaceid": "f8816163-e9f2-4e5b-8430-35d972cba5ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.155 2 DEBUG oslo_concurrency.lockutils [req-300fd847-dc35-4a97-a561-3be1bc82203f req-e6d237eb-8f80-4e4a-8918-1c372629f9e6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-92360dac-27fd-4b9d-a568-c4679e939762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.155 2 DEBUG nova.network.neutron [req-300fd847-dc35-4a97-a561-3be1bc82203f req-e6d237eb-8f80-4e4a-8918-1c372629f9e6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Refreshing network info cache for port f8816163-e9f2-4e5b-8430-35d972cba5ec _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.157 2 DEBUG nova.virt.libvirt.driver [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Start _get_guest_xml network_info=[{"id": "f8816163-e9f2-4e5b-8430-35d972cba5ec", "address": "fa:16:3e:5f:48:ea", "network": {"id": "0eb83f8f-b5f5-45c8-b684-88cef476aa77", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-1825927682-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e9cc864ebd6746dc8aae9d41edaa3753", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8816163-e9", "ovs_interfaceid": "f8816163-e9f2-4e5b-8430-35d972cba5ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.162 2 WARNING nova.virt.libvirt.driver [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.188 2 DEBUG nova.virt.libvirt.host [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.189 2 DEBUG nova.virt.libvirt.host [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.196 2 DEBUG nova.virt.libvirt.host [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.196 2 DEBUG nova.virt.libvirt.host [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.197 2 DEBUG nova.virt.libvirt.driver [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.197 2 DEBUG nova.virt.hardware [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.198 2 DEBUG nova.virt.hardware [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.198 2 DEBUG nova.virt.hardware [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.198 2 DEBUG nova.virt.hardware [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.198 2 DEBUG nova.virt.hardware [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.199 2 DEBUG nova.virt.hardware [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.199 2 DEBUG nova.virt.hardware [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.199 2 DEBUG nova.virt.hardware [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.199 2 DEBUG nova.virt.hardware [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.199 2 DEBUG nova.virt.hardware [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.200 2 DEBUG nova.virt.hardware [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.203 2 DEBUG nova.virt.libvirt.vif [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:51:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerAdvancedOps-server-1327581824',display_name='tempest-TestServerAdvancedOps-server-1327581824',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testserveradvancedops-server-1327581824',id=174,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e9cc864ebd6746dc8aae9d41edaa3753',ramdisk_id='',reservation_id='r-2lfte01b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerAdvancedOps-517934233',owner_user_name='tempest-TestServerAdvancedOps-517934233-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:51:09Z,user_data=None,user_id='11769cb7bff84bcc8c33eaff10e3e1ba',uuid=92360dac-27fd-4b9d-a568-c4679e939762,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f8816163-e9f2-4e5b-8430-35d972cba5ec", "address": "fa:16:3e:5f:48:ea", "network": {"id": "0eb83f8f-b5f5-45c8-b684-88cef476aa77", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-1825927682-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e9cc864ebd6746dc8aae9d41edaa3753", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8816163-e9", "ovs_interfaceid": "f8816163-e9f2-4e5b-8430-35d972cba5ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.203 2 DEBUG nova.network.os_vif_util [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Converting VIF {"id": "f8816163-e9f2-4e5b-8430-35d972cba5ec", "address": "fa:16:3e:5f:48:ea", "network": {"id": "0eb83f8f-b5f5-45c8-b684-88cef476aa77", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-1825927682-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e9cc864ebd6746dc8aae9d41edaa3753", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8816163-e9", "ovs_interfaceid": "f8816163-e9f2-4e5b-8430-35d972cba5ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.204 2 DEBUG nova.network.os_vif_util [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:48:ea,bridge_name='br-int',has_traffic_filtering=True,id=f8816163-e9f2-4e5b-8430-35d972cba5ec,network=Network(0eb83f8f-b5f5-45c8-b684-88cef476aa77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf8816163-e9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.205 2 DEBUG nova.objects.instance [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Lazy-loading 'pci_devices' on Instance uuid 92360dac-27fd-4b9d-a568-c4679e939762 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.216 2 DEBUG nova.virt.libvirt.driver [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:51:14 compute-0 nova_compute[192810]:   <uuid>92360dac-27fd-4b9d-a568-c4679e939762</uuid>
Sep 30 21:51:14 compute-0 nova_compute[192810]:   <name>instance-000000ae</name>
Sep 30 21:51:14 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:51:14 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:51:14 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:51:14 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:       <nova:name>tempest-TestServerAdvancedOps-server-1327581824</nova:name>
Sep 30 21:51:14 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:51:14</nova:creationTime>
Sep 30 21:51:14 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:51:14 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:51:14 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:51:14 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:51:14 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:51:14 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:51:14 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:51:14 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:51:14 compute-0 nova_compute[192810]:         <nova:user uuid="11769cb7bff84bcc8c33eaff10e3e1ba">tempest-TestServerAdvancedOps-517934233-project-member</nova:user>
Sep 30 21:51:14 compute-0 nova_compute[192810]:         <nova:project uuid="e9cc864ebd6746dc8aae9d41edaa3753">tempest-TestServerAdvancedOps-517934233</nova:project>
Sep 30 21:51:14 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:51:14 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:51:14 compute-0 nova_compute[192810]:         <nova:port uuid="f8816163-e9f2-4e5b-8430-35d972cba5ec">
Sep 30 21:51:14 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.2" ipVersion="4"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:51:14 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:51:14 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:51:14 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:51:14 compute-0 nova_compute[192810]:     <system>
Sep 30 21:51:14 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:51:14 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:51:14 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:51:14 compute-0 nova_compute[192810]:       <entry name="serial">92360dac-27fd-4b9d-a568-c4679e939762</entry>
Sep 30 21:51:14 compute-0 nova_compute[192810]:       <entry name="uuid">92360dac-27fd-4b9d-a568-c4679e939762</entry>
Sep 30 21:51:14 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     </system>
Sep 30 21:51:14 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:51:14 compute-0 nova_compute[192810]:   <os>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:   </os>
Sep 30 21:51:14 compute-0 nova_compute[192810]:   <features>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:   </features>
Sep 30 21:51:14 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:51:14 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:51:14 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:51:14 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:51:14 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:51:14 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/92360dac-27fd-4b9d-a568-c4679e939762/disk"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:51:14 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/92360dac-27fd-4b9d-a568-c4679e939762/disk.config"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:51:14 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:5f:48:ea"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:       <target dev="tapf8816163-e9"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:51:14 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/92360dac-27fd-4b9d-a568-c4679e939762/console.log" append="off"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     <video>
Sep 30 21:51:14 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     </video>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:51:14 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:51:14 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:51:14 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:51:14 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:51:14 compute-0 nova_compute[192810]: </domain>
Sep 30 21:51:14 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.218 2 DEBUG nova.compute.manager [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Preparing to wait for external event network-vif-plugged-f8816163-e9f2-4e5b-8430-35d972cba5ec prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.218 2 DEBUG oslo_concurrency.lockutils [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Acquiring lock "92360dac-27fd-4b9d-a568-c4679e939762-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.218 2 DEBUG oslo_concurrency.lockutils [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Lock "92360dac-27fd-4b9d-a568-c4679e939762-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.219 2 DEBUG oslo_concurrency.lockutils [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Lock "92360dac-27fd-4b9d-a568-c4679e939762-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.219 2 DEBUG nova.virt.libvirt.vif [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:51:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerAdvancedOps-server-1327581824',display_name='tempest-TestServerAdvancedOps-server-1327581824',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testserveradvancedops-server-1327581824',id=174,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e9cc864ebd6746dc8aae9d41edaa3753',ramdisk_id='',reservation_id='r-2lfte01b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerAdvancedOps-517934233',owner_user_name='tempest-TestServerAdvancedOps-517934233-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:51:09Z,user_data=None,user_id='11769cb7bff84bcc8c33eaff10e3e1ba',uuid=92360dac-27fd-4b9d-a568-c4679e939762,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f8816163-e9f2-4e5b-8430-35d972cba5ec", "address": "fa:16:3e:5f:48:ea", "network": {"id": "0eb83f8f-b5f5-45c8-b684-88cef476aa77", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-1825927682-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e9cc864ebd6746dc8aae9d41edaa3753", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8816163-e9", "ovs_interfaceid": "f8816163-e9f2-4e5b-8430-35d972cba5ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.220 2 DEBUG nova.network.os_vif_util [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Converting VIF {"id": "f8816163-e9f2-4e5b-8430-35d972cba5ec", "address": "fa:16:3e:5f:48:ea", "network": {"id": "0eb83f8f-b5f5-45c8-b684-88cef476aa77", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-1825927682-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e9cc864ebd6746dc8aae9d41edaa3753", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8816163-e9", "ovs_interfaceid": "f8816163-e9f2-4e5b-8430-35d972cba5ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.220 2 DEBUG nova.network.os_vif_util [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:48:ea,bridge_name='br-int',has_traffic_filtering=True,id=f8816163-e9f2-4e5b-8430-35d972cba5ec,network=Network(0eb83f8f-b5f5-45c8-b684-88cef476aa77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf8816163-e9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.220 2 DEBUG os_vif [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:48:ea,bridge_name='br-int',has_traffic_filtering=True,id=f8816163-e9f2-4e5b-8430-35d972cba5ec,network=Network(0eb83f8f-b5f5-45c8-b684-88cef476aa77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf8816163-e9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.221 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.222 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.224 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.224 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf8816163-e9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.225 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf8816163-e9, col_values=(('external_ids', {'iface-id': 'f8816163-e9f2-4e5b-8430-35d972cba5ec', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5f:48:ea', 'vm-uuid': '92360dac-27fd-4b9d-a568-c4679e939762'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:14 compute-0 NetworkManager[51733]: <info>  [1759269074.2269] manager: (tapf8816163-e9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/297)
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.233 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.233 2 INFO os_vif [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:48:ea,bridge_name='br-int',has_traffic_filtering=True,id=f8816163-e9f2-4e5b-8430-35d972cba5ec,network=Network(0eb83f8f-b5f5-45c8-b684-88cef476aa77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf8816163-e9')
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.282 2 DEBUG nova.virt.libvirt.driver [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.282 2 DEBUG nova.virt.libvirt.driver [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.282 2 DEBUG nova.virt.libvirt.driver [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] No VIF found with MAC fa:16:3e:5f:48:ea, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.283 2 INFO nova.virt.libvirt.driver [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Using config drive
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.646 2 DEBUG nova.compute.manager [req-48b766b8-6e17-4cbd-9841-dfce81cce8c3 req-007a5178-0e30-4694-bc77-1f7f92b35a86 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Received event network-vif-plugged-08103e16-9b92-4dc8-a0e5-925d87c4fbfd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.646 2 DEBUG oslo_concurrency.lockutils [req-48b766b8-6e17-4cbd-9841-dfce81cce8c3 req-007a5178-0e30-4694-bc77-1f7f92b35a86 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "49fa1fe9-fac0-491b-a5d9-ee6230be2b1f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.647 2 DEBUG oslo_concurrency.lockutils [req-48b766b8-6e17-4cbd-9841-dfce81cce8c3 req-007a5178-0e30-4694-bc77-1f7f92b35a86 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "49fa1fe9-fac0-491b-a5d9-ee6230be2b1f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.647 2 DEBUG oslo_concurrency.lockutils [req-48b766b8-6e17-4cbd-9841-dfce81cce8c3 req-007a5178-0e30-4694-bc77-1f7f92b35a86 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "49fa1fe9-fac0-491b-a5d9-ee6230be2b1f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.647 2 DEBUG nova.compute.manager [req-48b766b8-6e17-4cbd-9841-dfce81cce8c3 req-007a5178-0e30-4694-bc77-1f7f92b35a86 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] No waiting events found dispatching network-vif-plugged-08103e16-9b92-4dc8-a0e5-925d87c4fbfd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.647 2 WARNING nova.compute.manager [req-48b766b8-6e17-4cbd-9841-dfce81cce8c3 req-007a5178-0e30-4694-bc77-1f7f92b35a86 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Received unexpected event network-vif-plugged-08103e16-9b92-4dc8-a0e5-925d87c4fbfd for instance with vm_state deleted and task_state None.
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.698 2 DEBUG nova.network.neutron [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Successfully updated port: 6c5de877-502e-47f1-b767-d9d472689330 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.730 2 DEBUG oslo_concurrency.lockutils [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "refresh_cache-06072d75-591c-4422-8b92-2176427d6b4d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.731 2 DEBUG oslo_concurrency.lockutils [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquired lock "refresh_cache-06072d75-591c-4422-8b92-2176427d6b4d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.731 2 DEBUG nova.network.neutron [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.791 2 DEBUG nova.compute.manager [req-42bcc379-fc95-478f-9f23-daa807225e49 req-92805f3f-be04-4a66-abc9-ea1c3e749c1c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Received event network-vif-deleted-08103e16-9b92-4dc8-a0e5-925d87c4fbfd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.891 2 INFO nova.virt.libvirt.driver [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Creating config drive at /var/lib/nova/instances/92360dac-27fd-4b9d-a568-c4679e939762/disk.config
Sep 30 21:51:14 compute-0 nova_compute[192810]: 2025-09-30 21:51:14.895 2 DEBUG oslo_concurrency.processutils [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/92360dac-27fd-4b9d-a568-c4679e939762/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmwhp4gel execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:51:14 compute-0 unix_chkpwd[247873]: password check failed for user (root)
Sep 30 21:51:15 compute-0 nova_compute[192810]: 2025-09-30 21:51:15.021 2 DEBUG oslo_concurrency.processutils [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/92360dac-27fd-4b9d-a568-c4679e939762/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmwhp4gel" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:51:15 compute-0 kernel: tapf8816163-e9: entered promiscuous mode
Sep 30 21:51:15 compute-0 NetworkManager[51733]: <info>  [1759269075.0771] manager: (tapf8816163-e9): new Tun device (/org/freedesktop/NetworkManager/Devices/298)
Sep 30 21:51:15 compute-0 ovn_controller[94912]: 2025-09-30T21:51:15Z|00668|binding|INFO|Claiming lport f8816163-e9f2-4e5b-8430-35d972cba5ec for this chassis.
Sep 30 21:51:15 compute-0 ovn_controller[94912]: 2025-09-30T21:51:15Z|00669|binding|INFO|f8816163-e9f2-4e5b-8430-35d972cba5ec: Claiming fa:16:3e:5f:48:ea 10.100.0.2
Sep 30 21:51:15 compute-0 nova_compute[192810]: 2025-09-30 21:51:15.077 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:15.089 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:48:ea 10.100.0.2'], port_security=['fa:16:3e:5f:48:ea 10.100.0.2'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': '92360dac-27fd-4b9d-a568-c4679e939762', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0eb83f8f-b5f5-45c8-b684-88cef476aa77', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e9cc864ebd6746dc8aae9d41edaa3753', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8d894498-b287-4558-85ab-ca9a26ef74bb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bef4e0af-8132-4565-9bbb-def95e4b0a48, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=f8816163-e9f2-4e5b-8430-35d972cba5ec) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:51:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:15.090 103867 INFO neutron.agent.ovn.metadata.agent [-] Port f8816163-e9f2-4e5b-8430-35d972cba5ec in datapath 0eb83f8f-b5f5-45c8-b684-88cef476aa77 bound to our chassis
Sep 30 21:51:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:15.091 103867 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 0eb83f8f-b5f5-45c8-b684-88cef476aa77 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Sep 30 21:51:15 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:15.092 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a48c7645-23fd-481d-a8ce-5cdd3b51a327]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:15 compute-0 systemd-udevd[247887]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:51:15 compute-0 nova_compute[192810]: 2025-09-30 21:51:15.112 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:15 compute-0 ovn_controller[94912]: 2025-09-30T21:51:15Z|00670|binding|INFO|Setting lport f8816163-e9f2-4e5b-8430-35d972cba5ec ovn-installed in OVS
Sep 30 21:51:15 compute-0 ovn_controller[94912]: 2025-09-30T21:51:15Z|00671|binding|INFO|Setting lport f8816163-e9f2-4e5b-8430-35d972cba5ec up in Southbound
Sep 30 21:51:15 compute-0 nova_compute[192810]: 2025-09-30 21:51:15.116 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:15 compute-0 NetworkManager[51733]: <info>  [1759269075.1191] device (tapf8816163-e9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:51:15 compute-0 systemd-machined[152794]: New machine qemu-82-instance-000000ae.
Sep 30 21:51:15 compute-0 NetworkManager[51733]: <info>  [1759269075.1214] device (tapf8816163-e9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:51:15 compute-0 systemd[1]: Started Virtual Machine qemu-82-instance-000000ae.
Sep 30 21:51:15 compute-0 nova_compute[192810]: 2025-09-30 21:51:15.827 2 DEBUG nova.network.neutron [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:51:15 compute-0 nova_compute[192810]: 2025-09-30 21:51:15.903 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759269075.902619, 92360dac-27fd-4b9d-a568-c4679e939762 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:51:15 compute-0 nova_compute[192810]: 2025-09-30 21:51:15.903 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] VM Started (Lifecycle Event)
Sep 30 21:51:15 compute-0 nova_compute[192810]: 2025-09-30 21:51:15.921 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:51:15 compute-0 nova_compute[192810]: 2025-09-30 21:51:15.925 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759269075.9028685, 92360dac-27fd-4b9d-a568-c4679e939762 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:51:15 compute-0 nova_compute[192810]: 2025-09-30 21:51:15.925 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] VM Paused (Lifecycle Event)
Sep 30 21:51:15 compute-0 nova_compute[192810]: 2025-09-30 21:51:15.960 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:51:15 compute-0 nova_compute[192810]: 2025-09-30 21:51:15.962 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:51:15 compute-0 nova_compute[192810]: 2025-09-30 21:51:15.996 2 DEBUG nova.compute.manager [req-274453f2-036a-4ed5-99e7-f98e0e2391b9 req-134cfa37-29e1-431c-8309-338a2e778809 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Received event network-changed-25422545-2c61-41ab-9982-ca7761a24544 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:51:15 compute-0 nova_compute[192810]: 2025-09-30 21:51:15.996 2 DEBUG nova.compute.manager [req-274453f2-036a-4ed5-99e7-f98e0e2391b9 req-134cfa37-29e1-431c-8309-338a2e778809 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Refreshing instance network info cache due to event network-changed-25422545-2c61-41ab-9982-ca7761a24544. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:51:15 compute-0 nova_compute[192810]: 2025-09-30 21:51:15.996 2 DEBUG oslo_concurrency.lockutils [req-274453f2-036a-4ed5-99e7-f98e0e2391b9 req-134cfa37-29e1-431c-8309-338a2e778809 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-06072d75-591c-4422-8b92-2176427d6b4d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:51:15 compute-0 nova_compute[192810]: 2025-09-30 21:51:15.997 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:51:16 compute-0 nova_compute[192810]: 2025-09-30 21:51:16.838 2 DEBUG nova.compute.manager [req-c413dd53-ecad-44ab-9b6b-06443a611f6f req-862c1902-4767-4f38-a2c8-b04544c194fe dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Received event network-vif-plugged-f8816163-e9f2-4e5b-8430-35d972cba5ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:51:16 compute-0 nova_compute[192810]: 2025-09-30 21:51:16.839 2 DEBUG oslo_concurrency.lockutils [req-c413dd53-ecad-44ab-9b6b-06443a611f6f req-862c1902-4767-4f38-a2c8-b04544c194fe dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "92360dac-27fd-4b9d-a568-c4679e939762-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:16 compute-0 nova_compute[192810]: 2025-09-30 21:51:16.839 2 DEBUG oslo_concurrency.lockutils [req-c413dd53-ecad-44ab-9b6b-06443a611f6f req-862c1902-4767-4f38-a2c8-b04544c194fe dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "92360dac-27fd-4b9d-a568-c4679e939762-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:16 compute-0 nova_compute[192810]: 2025-09-30 21:51:16.839 2 DEBUG oslo_concurrency.lockutils [req-c413dd53-ecad-44ab-9b6b-06443a611f6f req-862c1902-4767-4f38-a2c8-b04544c194fe dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "92360dac-27fd-4b9d-a568-c4679e939762-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:16 compute-0 nova_compute[192810]: 2025-09-30 21:51:16.839 2 DEBUG nova.compute.manager [req-c413dd53-ecad-44ab-9b6b-06443a611f6f req-862c1902-4767-4f38-a2c8-b04544c194fe dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Processing event network-vif-plugged-f8816163-e9f2-4e5b-8430-35d972cba5ec _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:51:16 compute-0 nova_compute[192810]: 2025-09-30 21:51:16.840 2 DEBUG nova.compute.manager [req-c413dd53-ecad-44ab-9b6b-06443a611f6f req-862c1902-4767-4f38-a2c8-b04544c194fe dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Received event network-vif-plugged-f8816163-e9f2-4e5b-8430-35d972cba5ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:51:16 compute-0 nova_compute[192810]: 2025-09-30 21:51:16.840 2 DEBUG oslo_concurrency.lockutils [req-c413dd53-ecad-44ab-9b6b-06443a611f6f req-862c1902-4767-4f38-a2c8-b04544c194fe dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "92360dac-27fd-4b9d-a568-c4679e939762-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:16 compute-0 nova_compute[192810]: 2025-09-30 21:51:16.840 2 DEBUG oslo_concurrency.lockutils [req-c413dd53-ecad-44ab-9b6b-06443a611f6f req-862c1902-4767-4f38-a2c8-b04544c194fe dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "92360dac-27fd-4b9d-a568-c4679e939762-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:16 compute-0 nova_compute[192810]: 2025-09-30 21:51:16.840 2 DEBUG oslo_concurrency.lockutils [req-c413dd53-ecad-44ab-9b6b-06443a611f6f req-862c1902-4767-4f38-a2c8-b04544c194fe dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "92360dac-27fd-4b9d-a568-c4679e939762-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:16 compute-0 nova_compute[192810]: 2025-09-30 21:51:16.840 2 DEBUG nova.compute.manager [req-c413dd53-ecad-44ab-9b6b-06443a611f6f req-862c1902-4767-4f38-a2c8-b04544c194fe dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] No waiting events found dispatching network-vif-plugged-f8816163-e9f2-4e5b-8430-35d972cba5ec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:51:16 compute-0 nova_compute[192810]: 2025-09-30 21:51:16.841 2 WARNING nova.compute.manager [req-c413dd53-ecad-44ab-9b6b-06443a611f6f req-862c1902-4767-4f38-a2c8-b04544c194fe dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Received unexpected event network-vif-plugged-f8816163-e9f2-4e5b-8430-35d972cba5ec for instance with vm_state building and task_state spawning.
Sep 30 21:51:16 compute-0 nova_compute[192810]: 2025-09-30 21:51:16.841 2 DEBUG nova.compute.manager [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:51:16 compute-0 nova_compute[192810]: 2025-09-30 21:51:16.845 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759269076.8451915, 92360dac-27fd-4b9d-a568-c4679e939762 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:51:16 compute-0 nova_compute[192810]: 2025-09-30 21:51:16.845 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] VM Resumed (Lifecycle Event)
Sep 30 21:51:16 compute-0 nova_compute[192810]: 2025-09-30 21:51:16.847 2 DEBUG nova.virt.libvirt.driver [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:51:16 compute-0 nova_compute[192810]: 2025-09-30 21:51:16.850 2 INFO nova.virt.libvirt.driver [-] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Instance spawned successfully.
Sep 30 21:51:16 compute-0 nova_compute[192810]: 2025-09-30 21:51:16.850 2 DEBUG nova.virt.libvirt.driver [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:51:16 compute-0 sshd-session[247613]: Failed password for root from 8.210.178.40 port 55688 ssh2
Sep 30 21:51:16 compute-0 nova_compute[192810]: 2025-09-30 21:51:16.879 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:51:16 compute-0 nova_compute[192810]: 2025-09-30 21:51:16.884 2 DEBUG nova.virt.libvirt.driver [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:51:16 compute-0 nova_compute[192810]: 2025-09-30 21:51:16.885 2 DEBUG nova.virt.libvirt.driver [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:51:16 compute-0 nova_compute[192810]: 2025-09-30 21:51:16.886 2 DEBUG nova.virt.libvirt.driver [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:51:16 compute-0 nova_compute[192810]: 2025-09-30 21:51:16.886 2 DEBUG nova.virt.libvirt.driver [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:51:16 compute-0 nova_compute[192810]: 2025-09-30 21:51:16.887 2 DEBUG nova.virt.libvirt.driver [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:51:16 compute-0 nova_compute[192810]: 2025-09-30 21:51:16.887 2 DEBUG nova.virt.libvirt.driver [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:51:16 compute-0 nova_compute[192810]: 2025-09-30 21:51:16.892 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:51:16 compute-0 nova_compute[192810]: 2025-09-30 21:51:16.929 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:51:16 compute-0 nova_compute[192810]: 2025-09-30 21:51:16.982 2 INFO nova.compute.manager [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Took 7.20 seconds to spawn the instance on the hypervisor.
Sep 30 21:51:16 compute-0 nova_compute[192810]: 2025-09-30 21:51:16.982 2 DEBUG nova.compute.manager [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:51:17 compute-0 nova_compute[192810]: 2025-09-30 21:51:17.084 2 INFO nova.compute.manager [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Took 8.28 seconds to build instance.
Sep 30 21:51:17 compute-0 nova_compute[192810]: 2025-09-30 21:51:17.113 2 DEBUG oslo_concurrency.lockutils [None req-26507c8a-30a2-487e-9aec-91effd782c7f 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Lock "92360dac-27fd-4b9d-a568-c4679e939762" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.785s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:17 compute-0 nova_compute[192810]: 2025-09-30 21:51:17.256 2 DEBUG nova.network.neutron [req-300fd847-dc35-4a97-a561-3be1bc82203f req-e6d237eb-8f80-4e4a-8918-1c372629f9e6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Updated VIF entry in instance network info cache for port f8816163-e9f2-4e5b-8430-35d972cba5ec. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:51:17 compute-0 nova_compute[192810]: 2025-09-30 21:51:17.257 2 DEBUG nova.network.neutron [req-300fd847-dc35-4a97-a561-3be1bc82203f req-e6d237eb-8f80-4e4a-8918-1c372629f9e6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Updating instance_info_cache with network_info: [{"id": "f8816163-e9f2-4e5b-8430-35d972cba5ec", "address": "fa:16:3e:5f:48:ea", "network": {"id": "0eb83f8f-b5f5-45c8-b684-88cef476aa77", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-1825927682-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e9cc864ebd6746dc8aae9d41edaa3753", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8816163-e9", "ovs_interfaceid": "f8816163-e9f2-4e5b-8430-35d972cba5ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:51:17 compute-0 nova_compute[192810]: 2025-09-30 21:51:17.278 2 DEBUG oslo_concurrency.lockutils [req-300fd847-dc35-4a97-a561-3be1bc82203f req-e6d237eb-8f80-4e4a-8918-1c372629f9e6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-92360dac-27fd-4b9d-a568-c4679e939762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:51:18 compute-0 nova_compute[192810]: 2025-09-30 21:51:18.114 2 DEBUG nova.compute.manager [req-2bfc6c0b-b2da-41bb-b831-35c5b46f0ee8 req-d028e806-5920-4633-b33a-c4214f873eac dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Received event network-changed-6c5de877-502e-47f1-b767-d9d472689330 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:51:18 compute-0 nova_compute[192810]: 2025-09-30 21:51:18.115 2 DEBUG nova.compute.manager [req-2bfc6c0b-b2da-41bb-b831-35c5b46f0ee8 req-d028e806-5920-4633-b33a-c4214f873eac dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Refreshing instance network info cache due to event network-changed-6c5de877-502e-47f1-b767-d9d472689330. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:51:18 compute-0 nova_compute[192810]: 2025-09-30 21:51:18.115 2 DEBUG oslo_concurrency.lockutils [req-2bfc6c0b-b2da-41bb-b831-35c5b46f0ee8 req-d028e806-5920-4633-b33a-c4214f873eac dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-06072d75-591c-4422-8b92-2176427d6b4d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:51:18 compute-0 nova_compute[192810]: 2025-09-30 21:51:18.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:18 compute-0 sshd-session[247613]: error: maximum authentication attempts exceeded for root from 8.210.178.40 port 55688 ssh2 [preauth]
Sep 30 21:51:18 compute-0 sshd-session[247613]: Disconnecting authenticating user root 8.210.178.40 port 55688: Too many authentication failures [preauth]
Sep 30 21:51:18 compute-0 sshd-session[247613]: PAM 5 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40  user=root
Sep 30 21:51:18 compute-0 sshd-session[247613]: PAM service(sshd) ignoring max retries; 6 > 3
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.881 2 DEBUG nova.network.neutron [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Updating instance_info_cache with network_info: [{"id": "25422545-2c61-41ab-9982-ca7761a24544", "address": "fa:16:3e:b0:c4:93", "network": {"id": "4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc", "bridge": "br-int", "label": "tempest-network-smoke--1429782878", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25422545-2c", "ovs_interfaceid": "25422545-2c61-41ab-9982-ca7761a24544", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6c5de877-502e-47f1-b767-d9d472689330", "address": "fa:16:3e:31:ed:85", "network": {"id": "cd6da069-7a88-49b7-bea7-1ceb7132f614", "bridge": "br-int", "label": "tempest-network-smoke--1719254685", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe31:ed85", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c5de877-50", "ovs_interfaceid": "6c5de877-502e-47f1-b767-d9d472689330", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.913 2 DEBUG oslo_concurrency.lockutils [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Releasing lock "refresh_cache-06072d75-591c-4422-8b92-2176427d6b4d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.913 2 DEBUG nova.compute.manager [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Instance network_info: |[{"id": "25422545-2c61-41ab-9982-ca7761a24544", "address": "fa:16:3e:b0:c4:93", "network": {"id": "4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc", "bridge": "br-int", "label": "tempest-network-smoke--1429782878", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25422545-2c", "ovs_interfaceid": "25422545-2c61-41ab-9982-ca7761a24544", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6c5de877-502e-47f1-b767-d9d472689330", "address": "fa:16:3e:31:ed:85", "network": {"id": "cd6da069-7a88-49b7-bea7-1ceb7132f614", "bridge": "br-int", "label": "tempest-network-smoke--1719254685", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe31:ed85", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c5de877-50", "ovs_interfaceid": "6c5de877-502e-47f1-b767-d9d472689330", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.914 2 DEBUG oslo_concurrency.lockutils [req-274453f2-036a-4ed5-99e7-f98e0e2391b9 req-134cfa37-29e1-431c-8309-338a2e778809 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-06072d75-591c-4422-8b92-2176427d6b4d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.914 2 DEBUG nova.network.neutron [req-274453f2-036a-4ed5-99e7-f98e0e2391b9 req-134cfa37-29e1-431c-8309-338a2e778809 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Refreshing network info cache for port 25422545-2c61-41ab-9982-ca7761a24544 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.917 2 DEBUG nova.virt.libvirt.driver [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Start _get_guest_xml network_info=[{"id": "25422545-2c61-41ab-9982-ca7761a24544", "address": "fa:16:3e:b0:c4:93", "network": {"id": "4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc", "bridge": "br-int", "label": "tempest-network-smoke--1429782878", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25422545-2c", "ovs_interfaceid": "25422545-2c61-41ab-9982-ca7761a24544", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6c5de877-502e-47f1-b767-d9d472689330", "address": "fa:16:3e:31:ed:85", "network": {"id": "cd6da069-7a88-49b7-bea7-1ceb7132f614", "bridge": "br-int", "label": "tempest-network-smoke--1719254685", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe31:ed85", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c5de877-50", "ovs_interfaceid": "6c5de877-502e-47f1-b767-d9d472689330", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.921 2 WARNING nova.virt.libvirt.driver [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.925 2 DEBUG nova.virt.libvirt.host [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.927 2 DEBUG nova.virt.libvirt.host [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.932 2 DEBUG nova.virt.libvirt.host [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.933 2 DEBUG nova.virt.libvirt.host [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.934 2 DEBUG nova.virt.libvirt.driver [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.934 2 DEBUG nova.virt.hardware [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.934 2 DEBUG nova.virt.hardware [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.935 2 DEBUG nova.virt.hardware [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.935 2 DEBUG nova.virt.hardware [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.935 2 DEBUG nova.virt.hardware [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.935 2 DEBUG nova.virt.hardware [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.935 2 DEBUG nova.virt.hardware [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.936 2 DEBUG nova.virt.hardware [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.936 2 DEBUG nova.virt.hardware [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.936 2 DEBUG nova.virt.hardware [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.936 2 DEBUG nova.virt.hardware [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.940 2 DEBUG nova.virt.libvirt.vif [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:51:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-26189983',display_name='tempest-TestGettingAddress-server-26189983',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-26189983',id=173,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC6IA1ZMHqMnx6vajVu3Rxu0mLHAgi9ZiEWJq0mK7b7+FNXbXiIUokQaeP2RlIYQG/rnW9lpPKmkg9fl2BJnF3yaf57+/J6ArqVMsWD0IV/NeNPPLmOErVJN8uCNukd8DA==',key_name='tempest-TestGettingAddress-1179853319',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-nr0jiinx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:51:09Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=06072d75-591c-4422-8b92-2176427d6b4d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "25422545-2c61-41ab-9982-ca7761a24544", "address": "fa:16:3e:b0:c4:93", "network": {"id": "4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc", "bridge": "br-int", "label": "tempest-network-smoke--1429782878", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25422545-2c", "ovs_interfaceid": "25422545-2c61-41ab-9982-ca7761a24544", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.940 2 DEBUG nova.network.os_vif_util [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "25422545-2c61-41ab-9982-ca7761a24544", "address": "fa:16:3e:b0:c4:93", "network": {"id": "4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc", "bridge": "br-int", "label": "tempest-network-smoke--1429782878", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25422545-2c", "ovs_interfaceid": "25422545-2c61-41ab-9982-ca7761a24544", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.941 2 DEBUG nova.network.os_vif_util [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b0:c4:93,bridge_name='br-int',has_traffic_filtering=True,id=25422545-2c61-41ab-9982-ca7761a24544,network=Network(4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25422545-2c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.941 2 DEBUG nova.virt.libvirt.vif [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:51:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-26189983',display_name='tempest-TestGettingAddress-server-26189983',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-26189983',id=173,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC6IA1ZMHqMnx6vajVu3Rxu0mLHAgi9ZiEWJq0mK7b7+FNXbXiIUokQaeP2RlIYQG/rnW9lpPKmkg9fl2BJnF3yaf57+/J6ArqVMsWD0IV/NeNPPLmOErVJN8uCNukd8DA==',key_name='tempest-TestGettingAddress-1179853319',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-nr0jiinx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:51:09Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=06072d75-591c-4422-8b92-2176427d6b4d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6c5de877-502e-47f1-b767-d9d472689330", "address": "fa:16:3e:31:ed:85", "network": {"id": "cd6da069-7a88-49b7-bea7-1ceb7132f614", "bridge": "br-int", "label": "tempest-network-smoke--1719254685", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe31:ed85", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c5de877-50", "ovs_interfaceid": "6c5de877-502e-47f1-b767-d9d472689330", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.942 2 DEBUG nova.network.os_vif_util [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "6c5de877-502e-47f1-b767-d9d472689330", "address": "fa:16:3e:31:ed:85", "network": {"id": "cd6da069-7a88-49b7-bea7-1ceb7132f614", "bridge": "br-int", "label": "tempest-network-smoke--1719254685", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe31:ed85", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c5de877-50", "ovs_interfaceid": "6c5de877-502e-47f1-b767-d9d472689330", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.942 2 DEBUG nova.network.os_vif_util [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:31:ed:85,bridge_name='br-int',has_traffic_filtering=True,id=6c5de877-502e-47f1-b767-d9d472689330,network=Network(cd6da069-7a88-49b7-bea7-1ceb7132f614),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c5de877-50') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.944 2 DEBUG nova.objects.instance [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lazy-loading 'pci_devices' on Instance uuid 06072d75-591c-4422-8b92-2176427d6b4d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.969 2 DEBUG nova.virt.libvirt.driver [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:51:19 compute-0 nova_compute[192810]:   <uuid>06072d75-591c-4422-8b92-2176427d6b4d</uuid>
Sep 30 21:51:19 compute-0 nova_compute[192810]:   <name>instance-000000ad</name>
Sep 30 21:51:19 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:51:19 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:51:19 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:51:19 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:       <nova:name>tempest-TestGettingAddress-server-26189983</nova:name>
Sep 30 21:51:19 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:51:19</nova:creationTime>
Sep 30 21:51:19 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:51:19 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:51:19 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:51:19 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:51:19 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:51:19 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:51:19 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:51:19 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:51:19 compute-0 nova_compute[192810]:         <nova:user uuid="5ffd1d7824fe413499994bd48b9f820f">tempest-TestGettingAddress-2056138166-project-member</nova:user>
Sep 30 21:51:19 compute-0 nova_compute[192810]:         <nova:project uuid="71b1e8c3c45e4ff8bc99e66bd1bfef7c">tempest-TestGettingAddress-2056138166</nova:project>
Sep 30 21:51:19 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:51:19 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:51:19 compute-0 nova_compute[192810]:         <nova:port uuid="25422545-2c61-41ab-9982-ca7761a24544">
Sep 30 21:51:19 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:51:19 compute-0 nova_compute[192810]:         <nova:port uuid="6c5de877-502e-47f1-b767-d9d472689330">
Sep 30 21:51:19 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fe31:ed85" ipVersion="6"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:51:19 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:51:19 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:51:19 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:51:19 compute-0 nova_compute[192810]:     <system>
Sep 30 21:51:19 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:51:19 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:51:19 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:51:19 compute-0 nova_compute[192810]:       <entry name="serial">06072d75-591c-4422-8b92-2176427d6b4d</entry>
Sep 30 21:51:19 compute-0 nova_compute[192810]:       <entry name="uuid">06072d75-591c-4422-8b92-2176427d6b4d</entry>
Sep 30 21:51:19 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     </system>
Sep 30 21:51:19 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:51:19 compute-0 nova_compute[192810]:   <os>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:   </os>
Sep 30 21:51:19 compute-0 nova_compute[192810]:   <features>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:   </features>
Sep 30 21:51:19 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:51:19 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:51:19 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:51:19 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:51:19 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:51:19 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/06072d75-591c-4422-8b92-2176427d6b4d/disk"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:51:19 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/06072d75-591c-4422-8b92-2176427d6b4d/disk.config"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:51:19 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:b0:c4:93"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:       <target dev="tap25422545-2c"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:51:19 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:31:ed:85"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:       <target dev="tap6c5de877-50"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:51:19 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/06072d75-591c-4422-8b92-2176427d6b4d/console.log" append="off"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     <video>
Sep 30 21:51:19 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     </video>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:51:19 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:51:19 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:51:19 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:51:19 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:51:19 compute-0 nova_compute[192810]: </domain>
Sep 30 21:51:19 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.970 2 DEBUG nova.compute.manager [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Preparing to wait for external event network-vif-plugged-25422545-2c61-41ab-9982-ca7761a24544 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.970 2 DEBUG oslo_concurrency.lockutils [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "06072d75-591c-4422-8b92-2176427d6b4d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.970 2 DEBUG oslo_concurrency.lockutils [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "06072d75-591c-4422-8b92-2176427d6b4d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.971 2 DEBUG oslo_concurrency.lockutils [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "06072d75-591c-4422-8b92-2176427d6b4d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.971 2 DEBUG nova.compute.manager [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Preparing to wait for external event network-vif-plugged-6c5de877-502e-47f1-b767-d9d472689330 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.971 2 DEBUG oslo_concurrency.lockutils [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "06072d75-591c-4422-8b92-2176427d6b4d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.971 2 DEBUG oslo_concurrency.lockutils [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "06072d75-591c-4422-8b92-2176427d6b4d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.971 2 DEBUG oslo_concurrency.lockutils [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "06072d75-591c-4422-8b92-2176427d6b4d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.972 2 DEBUG nova.virt.libvirt.vif [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:51:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-26189983',display_name='tempest-TestGettingAddress-server-26189983',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-26189983',id=173,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC6IA1ZMHqMnx6vajVu3Rxu0mLHAgi9ZiEWJq0mK7b7+FNXbXiIUokQaeP2RlIYQG/rnW9lpPKmkg9fl2BJnF3yaf57+/J6ArqVMsWD0IV/NeNPPLmOErVJN8uCNukd8DA==',key_name='tempest-TestGettingAddress-1179853319',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-nr0jiinx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:51:09Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=06072d75-591c-4422-8b92-2176427d6b4d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "25422545-2c61-41ab-9982-ca7761a24544", "address": "fa:16:3e:b0:c4:93", "network": {"id": "4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc", "bridge": "br-int", "label": "tempest-network-smoke--1429782878", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25422545-2c", "ovs_interfaceid": "25422545-2c61-41ab-9982-ca7761a24544", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.972 2 DEBUG nova.network.os_vif_util [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "25422545-2c61-41ab-9982-ca7761a24544", "address": "fa:16:3e:b0:c4:93", "network": {"id": "4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc", "bridge": "br-int", "label": "tempest-network-smoke--1429782878", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25422545-2c", "ovs_interfaceid": "25422545-2c61-41ab-9982-ca7761a24544", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.973 2 DEBUG nova.network.os_vif_util [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b0:c4:93,bridge_name='br-int',has_traffic_filtering=True,id=25422545-2c61-41ab-9982-ca7761a24544,network=Network(4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25422545-2c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.973 2 DEBUG os_vif [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b0:c4:93,bridge_name='br-int',has_traffic_filtering=True,id=25422545-2c61-41ab-9982-ca7761a24544,network=Network(4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25422545-2c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.974 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.975 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.979 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.979 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap25422545-2c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.980 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap25422545-2c, col_values=(('external_ids', {'iface-id': '25422545-2c61-41ab-9982-ca7761a24544', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b0:c4:93', 'vm-uuid': '06072d75-591c-4422-8b92-2176427d6b4d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:51:19 compute-0 NetworkManager[51733]: <info>  [1759269079.9821] manager: (tap25422545-2c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/299)
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.983 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.988 2 INFO os_vif [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b0:c4:93,bridge_name='br-int',has_traffic_filtering=True,id=25422545-2c61-41ab-9982-ca7761a24544,network=Network(4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25422545-2c')
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.988 2 DEBUG nova.virt.libvirt.vif [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:51:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-26189983',display_name='tempest-TestGettingAddress-server-26189983',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-26189983',id=173,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC6IA1ZMHqMnx6vajVu3Rxu0mLHAgi9ZiEWJq0mK7b7+FNXbXiIUokQaeP2RlIYQG/rnW9lpPKmkg9fl2BJnF3yaf57+/J6ArqVMsWD0IV/NeNPPLmOErVJN8uCNukd8DA==',key_name='tempest-TestGettingAddress-1179853319',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-nr0jiinx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:51:09Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=06072d75-591c-4422-8b92-2176427d6b4d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6c5de877-502e-47f1-b767-d9d472689330", "address": "fa:16:3e:31:ed:85", "network": {"id": "cd6da069-7a88-49b7-bea7-1ceb7132f614", "bridge": "br-int", "label": "tempest-network-smoke--1719254685", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe31:ed85", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c5de877-50", "ovs_interfaceid": "6c5de877-502e-47f1-b767-d9d472689330", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.989 2 DEBUG nova.network.os_vif_util [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "6c5de877-502e-47f1-b767-d9d472689330", "address": "fa:16:3e:31:ed:85", "network": {"id": "cd6da069-7a88-49b7-bea7-1ceb7132f614", "bridge": "br-int", "label": "tempest-network-smoke--1719254685", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe31:ed85", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c5de877-50", "ovs_interfaceid": "6c5de877-502e-47f1-b767-d9d472689330", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.989 2 DEBUG nova.network.os_vif_util [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:31:ed:85,bridge_name='br-int',has_traffic_filtering=True,id=6c5de877-502e-47f1-b767-d9d472689330,network=Network(cd6da069-7a88-49b7-bea7-1ceb7132f614),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c5de877-50') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.990 2 DEBUG os_vif [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:31:ed:85,bridge_name='br-int',has_traffic_filtering=True,id=6c5de877-502e-47f1-b767-d9d472689330,network=Network(cd6da069-7a88-49b7-bea7-1ceb7132f614),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c5de877-50') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.990 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.991 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.991 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.993 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.993 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6c5de877-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.993 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6c5de877-50, col_values=(('external_ids', {'iface-id': '6c5de877-502e-47f1-b767-d9d472689330', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:31:ed:85', 'vm-uuid': '06072d75-591c-4422-8b92-2176427d6b4d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.995 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:19 compute-0 NetworkManager[51733]: <info>  [1759269079.9959] manager: (tap6c5de877-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/300)
Sep 30 21:51:19 compute-0 nova_compute[192810]: 2025-09-30 21:51:19.998 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:51:20 compute-0 nova_compute[192810]: 2025-09-30 21:51:20.001 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:20 compute-0 nova_compute[192810]: 2025-09-30 21:51:20.002 2 INFO os_vif [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:31:ed:85,bridge_name='br-int',has_traffic_filtering=True,id=6c5de877-502e-47f1-b767-d9d472689330,network=Network(cd6da069-7a88-49b7-bea7-1ceb7132f614),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c5de877-50')
Sep 30 21:51:20 compute-0 nova_compute[192810]: 2025-09-30 21:51:20.054 2 DEBUG nova.virt.libvirt.driver [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:51:20 compute-0 nova_compute[192810]: 2025-09-30 21:51:20.055 2 DEBUG nova.virt.libvirt.driver [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:51:20 compute-0 nova_compute[192810]: 2025-09-30 21:51:20.055 2 DEBUG nova.virt.libvirt.driver [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] No VIF found with MAC fa:16:3e:b0:c4:93, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:51:20 compute-0 nova_compute[192810]: 2025-09-30 21:51:20.056 2 DEBUG nova.virt.libvirt.driver [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] No VIF found with MAC fa:16:3e:31:ed:85, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:51:20 compute-0 nova_compute[192810]: 2025-09-30 21:51:20.056 2 INFO nova.virt.libvirt.driver [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Using config drive
Sep 30 21:51:20 compute-0 unix_chkpwd[247911]: password check failed for user (root)
Sep 30 21:51:20 compute-0 sshd-session[247904]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40  user=root
Sep 30 21:51:20 compute-0 nova_compute[192810]: 2025-09-30 21:51:20.598 2 INFO nova.virt.libvirt.driver [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Creating config drive at /var/lib/nova/instances/06072d75-591c-4422-8b92-2176427d6b4d/disk.config
Sep 30 21:51:20 compute-0 nova_compute[192810]: 2025-09-30 21:51:20.609 2 DEBUG oslo_concurrency.processutils [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/06072d75-591c-4422-8b92-2176427d6b4d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmhp2y_1u execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:51:20 compute-0 nova_compute[192810]: 2025-09-30 21:51:20.738 2 DEBUG oslo_concurrency.processutils [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/06072d75-591c-4422-8b92-2176427d6b4d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmhp2y_1u" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:51:20 compute-0 NetworkManager[51733]: <info>  [1759269080.8020] manager: (tap25422545-2c): new Tun device (/org/freedesktop/NetworkManager/Devices/301)
Sep 30 21:51:20 compute-0 kernel: tap25422545-2c: entered promiscuous mode
Sep 30 21:51:20 compute-0 nova_compute[192810]: 2025-09-30 21:51:20.811 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:20 compute-0 ovn_controller[94912]: 2025-09-30T21:51:20Z|00672|binding|INFO|Claiming lport 25422545-2c61-41ab-9982-ca7761a24544 for this chassis.
Sep 30 21:51:20 compute-0 ovn_controller[94912]: 2025-09-30T21:51:20Z|00673|binding|INFO|25422545-2c61-41ab-9982-ca7761a24544: Claiming fa:16:3e:b0:c4:93 10.100.0.4
Sep 30 21:51:20 compute-0 NetworkManager[51733]: <info>  [1759269080.8263] manager: (tap6c5de877-50): new Tun device (/org/freedesktop/NetworkManager/Devices/302)
Sep 30 21:51:20 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:20.826 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b0:c4:93 10.100.0.4'], port_security=['fa:16:3e:b0:c4:93 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '06072d75-591c-4422-8b92-2176427d6b4d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '15ea3623-c6c2-499c-8e00-9a1f5fdaf5c2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=79474bd3-ac2c-4f66-83f8-3a487e22d9d3, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=25422545-2c61-41ab-9982-ca7761a24544) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:51:20 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:20.828 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 25422545-2c61-41ab-9982-ca7761a24544 in datapath 4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc bound to our chassis
Sep 30 21:51:20 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:20.829 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc
Sep 30 21:51:20 compute-0 systemd-udevd[247931]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:51:20 compute-0 systemd-udevd[247932]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:51:20 compute-0 NetworkManager[51733]: <info>  [1759269080.8551] device (tap25422545-2c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:51:20 compute-0 NetworkManager[51733]: <info>  [1759269080.8571] device (tap25422545-2c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:51:20 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:20.857 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[d6e107c3-45d1-4ede-8fb7-9155ba86f6af]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:20 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:20.857 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4c85d4d4-31 in ovnmeta-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:51:20 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:20.859 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4c85d4d4-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:51:20 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:20.859 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[5b8796c5-7818-4557-977f-80f624b31e14]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:20 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:20.861 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[05660456-2b35-4293-abb4-7d7243fc6a91]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:20 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:20.871 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[e3c05b19-de66-4318-ab09-3e98f67fb97b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:20 compute-0 systemd-machined[152794]: New machine qemu-83-instance-000000ad.
Sep 30 21:51:20 compute-0 systemd[1]: Started Virtual Machine qemu-83-instance-000000ad.
Sep 30 21:51:20 compute-0 kernel: tap6c5de877-50: entered promiscuous mode
Sep 30 21:51:20 compute-0 NetworkManager[51733]: <info>  [1759269080.9004] device (tap6c5de877-50): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:51:20 compute-0 NetworkManager[51733]: <info>  [1759269080.9029] device (tap6c5de877-50): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:51:20 compute-0 ovn_controller[94912]: 2025-09-30T21:51:20Z|00674|binding|INFO|Claiming lport 6c5de877-502e-47f1-b767-d9d472689330 for this chassis.
Sep 30 21:51:20 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:20.901 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[af76db92-c63f-4d17-bb65-dc65724e0bb4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:20 compute-0 ovn_controller[94912]: 2025-09-30T21:51:20Z|00675|binding|INFO|6c5de877-502e-47f1-b767-d9d472689330: Claiming fa:16:3e:31:ed:85 2001:db8::f816:3eff:fe31:ed85
Sep 30 21:51:20 compute-0 nova_compute[192810]: 2025-09-30 21:51:20.903 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:20 compute-0 ovn_controller[94912]: 2025-09-30T21:51:20Z|00676|binding|INFO|Setting lport 25422545-2c61-41ab-9982-ca7761a24544 ovn-installed in OVS
Sep 30 21:51:20 compute-0 nova_compute[192810]: 2025-09-30 21:51:20.913 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:20 compute-0 ovn_controller[94912]: 2025-09-30T21:51:20Z|00677|binding|INFO|Setting lport 25422545-2c61-41ab-9982-ca7761a24544 up in Southbound
Sep 30 21:51:20 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:20.917 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:31:ed:85 2001:db8::f816:3eff:fe31:ed85'], port_security=['fa:16:3e:31:ed:85 2001:db8::f816:3eff:fe31:ed85'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe31:ed85/64', 'neutron:device_id': '06072d75-591c-4422-8b92-2176427d6b4d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cd6da069-7a88-49b7-bea7-1ceb7132f614', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '15ea3623-c6c2-499c-8e00-9a1f5fdaf5c2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5ebdd42c-51b5-4b83-8a22-52988a595a24, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=6c5de877-502e-47f1-b767-d9d472689330) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:51:20 compute-0 ovn_controller[94912]: 2025-09-30T21:51:20Z|00678|binding|INFO|Setting lport 6c5de877-502e-47f1-b767-d9d472689330 ovn-installed in OVS
Sep 30 21:51:20 compute-0 nova_compute[192810]: 2025-09-30 21:51:20.922 2 DEBUG nova.objects.instance [None req-8958e1cb-e283-4a05-a5de-c4d2267cee3e 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Lazy-loading 'pci_devices' on Instance uuid 92360dac-27fd-4b9d-a568-c4679e939762 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:51:20 compute-0 nova_compute[192810]: 2025-09-30 21:51:20.923 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:20 compute-0 ovn_controller[94912]: 2025-09-30T21:51:20Z|00679|binding|INFO|Setting lport 6c5de877-502e-47f1-b767-d9d472689330 up in Southbound
Sep 30 21:51:20 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:20.933 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[191a5d45-cdd1-4a76-841a-c8c31d24c05c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:20 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:20.941 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[5d595846-aed2-4708-b496-8fcab770356a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:20 compute-0 NetworkManager[51733]: <info>  [1759269080.9423] manager: (tap4c85d4d4-30): new Veth device (/org/freedesktop/NetworkManager/Devices/303)
Sep 30 21:51:20 compute-0 nova_compute[192810]: 2025-09-30 21:51:20.955 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759269080.9549084, 92360dac-27fd-4b9d-a568-c4679e939762 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:51:20 compute-0 nova_compute[192810]: 2025-09-30 21:51:20.955 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] VM Paused (Lifecycle Event)
Sep 30 21:51:20 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:20.973 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[20bf0869-b830-4264-b0d2-eca69e774f84]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:20 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:20.976 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[c76204f3-0ce2-4f07-9b46-ed7094a9ef85]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:20 compute-0 nova_compute[192810]: 2025-09-30 21:51:20.981 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:51:20 compute-0 nova_compute[192810]: 2025-09-30 21:51:20.988 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:51:20 compute-0 NetworkManager[51733]: <info>  [1759269080.9958] device (tap4c85d4d4-30): carrier: link connected
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:21.004 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[812e990b-d20b-4bde-869b-5f7d2c8ed9fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:21 compute-0 nova_compute[192810]: 2025-09-30 21:51:21.010 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] During sync_power_state the instance has a pending task (suspending). Skip.
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:21.020 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[d1441bde-2f78-452b-a772-833e73693fa4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4c85d4d4-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4d:b3:ff'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 205], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 577661, 'reachable_time': 21144, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 247971, 'error': None, 'target': 'ovnmeta-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:21.035 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[b5ea0c53-1a3c-411a-a40f-ee69c49ad550]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4d:b3ff'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 577661, 'tstamp': 577661}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 247972, 'error': None, 'target': 'ovnmeta-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:21.053 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[c613135e-86b0-440b-b075-88e5266f5c17]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4c85d4d4-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4d:b3:ff'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 205], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 577661, 'reachable_time': 21144, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 247973, 'error': None, 'target': 'ovnmeta-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:21.082 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[bf56951c-ccd7-4421-b74b-092c00093b99]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:21.135 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[380a1830-8ac6-4471-a94a-02f6fab9796c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:21.137 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4c85d4d4-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:21.137 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:21.137 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4c85d4d4-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:51:21 compute-0 kernel: tap4c85d4d4-30: entered promiscuous mode
Sep 30 21:51:21 compute-0 NetworkManager[51733]: <info>  [1759269081.1401] manager: (tap4c85d4d4-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/304)
Sep 30 21:51:21 compute-0 nova_compute[192810]: 2025-09-30 21:51:21.140 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:21 compute-0 nova_compute[192810]: 2025-09-30 21:51:21.141 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:21.142 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4c85d4d4-30, col_values=(('external_ids', {'iface-id': 'edbccbea-ad79-490c-a6b0-46510606db95'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:51:21 compute-0 ovn_controller[94912]: 2025-09-30T21:51:21Z|00680|binding|INFO|Releasing lport edbccbea-ad79-490c-a6b0-46510606db95 from this chassis (sb_readonly=0)
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:21.144 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:51:21 compute-0 nova_compute[192810]: 2025-09-30 21:51:21.145 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:21.145 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[c16119a4-a4f5-4625-8ba0-84ec8c8b8ec6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:21.146 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc.pid.haproxy
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID 4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:21.147 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc', 'env', 'PROCESS_TAG=haproxy-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:51:21 compute-0 nova_compute[192810]: 2025-09-30 21:51:21.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:21 compute-0 nova_compute[192810]: 2025-09-30 21:51:21.257 2 DEBUG nova.compute.manager [req-871288bf-9ae5-4232-b213-ebd31f8e8718 req-749426c7-c399-4e80-ad3f-d9a8198708bc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Received event network-vif-plugged-6c5de877-502e-47f1-b767-d9d472689330 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:51:21 compute-0 nova_compute[192810]: 2025-09-30 21:51:21.258 2 DEBUG oslo_concurrency.lockutils [req-871288bf-9ae5-4232-b213-ebd31f8e8718 req-749426c7-c399-4e80-ad3f-d9a8198708bc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "06072d75-591c-4422-8b92-2176427d6b4d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:21 compute-0 nova_compute[192810]: 2025-09-30 21:51:21.258 2 DEBUG oslo_concurrency.lockutils [req-871288bf-9ae5-4232-b213-ebd31f8e8718 req-749426c7-c399-4e80-ad3f-d9a8198708bc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "06072d75-591c-4422-8b92-2176427d6b4d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:21 compute-0 nova_compute[192810]: 2025-09-30 21:51:21.259 2 DEBUG oslo_concurrency.lockutils [req-871288bf-9ae5-4232-b213-ebd31f8e8718 req-749426c7-c399-4e80-ad3f-d9a8198708bc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "06072d75-591c-4422-8b92-2176427d6b4d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:21 compute-0 nova_compute[192810]: 2025-09-30 21:51:21.259 2 DEBUG nova.compute.manager [req-871288bf-9ae5-4232-b213-ebd31f8e8718 req-749426c7-c399-4e80-ad3f-d9a8198708bc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Processing event network-vif-plugged-6c5de877-502e-47f1-b767-d9d472689330 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:51:21 compute-0 nova_compute[192810]: 2025-09-30 21:51:21.280 2 DEBUG nova.compute.manager [req-fa830e0c-cab6-44a0-8449-c1902e04f6b3 req-04425c9a-2423-4e1e-b79e-4ad2594b125e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Received event network-vif-plugged-25422545-2c61-41ab-9982-ca7761a24544 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:51:21 compute-0 nova_compute[192810]: 2025-09-30 21:51:21.281 2 DEBUG oslo_concurrency.lockutils [req-fa830e0c-cab6-44a0-8449-c1902e04f6b3 req-04425c9a-2423-4e1e-b79e-4ad2594b125e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "06072d75-591c-4422-8b92-2176427d6b4d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:21 compute-0 nova_compute[192810]: 2025-09-30 21:51:21.281 2 DEBUG oslo_concurrency.lockutils [req-fa830e0c-cab6-44a0-8449-c1902e04f6b3 req-04425c9a-2423-4e1e-b79e-4ad2594b125e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "06072d75-591c-4422-8b92-2176427d6b4d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:21 compute-0 nova_compute[192810]: 2025-09-30 21:51:21.281 2 DEBUG oslo_concurrency.lockutils [req-fa830e0c-cab6-44a0-8449-c1902e04f6b3 req-04425c9a-2423-4e1e-b79e-4ad2594b125e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "06072d75-591c-4422-8b92-2176427d6b4d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:21 compute-0 nova_compute[192810]: 2025-09-30 21:51:21.282 2 DEBUG nova.compute.manager [req-fa830e0c-cab6-44a0-8449-c1902e04f6b3 req-04425c9a-2423-4e1e-b79e-4ad2594b125e dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Processing event network-vif-plugged-25422545-2c61-41ab-9982-ca7761a24544 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:51:21 compute-0 kernel: tapf8816163-e9 (unregistering): left promiscuous mode
Sep 30 21:51:21 compute-0 NetworkManager[51733]: <info>  [1759269081.4805] device (tapf8816163-e9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:51:21 compute-0 ovn_controller[94912]: 2025-09-30T21:51:21Z|00681|binding|INFO|Releasing lport f8816163-e9f2-4e5b-8430-35d972cba5ec from this chassis (sb_readonly=0)
Sep 30 21:51:21 compute-0 ovn_controller[94912]: 2025-09-30T21:51:21Z|00682|binding|INFO|Setting lport f8816163-e9f2-4e5b-8430-35d972cba5ec down in Southbound
Sep 30 21:51:21 compute-0 nova_compute[192810]: 2025-09-30 21:51:21.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:21 compute-0 ovn_controller[94912]: 2025-09-30T21:51:21Z|00683|binding|INFO|Removing iface tapf8816163-e9 ovn-installed in OVS
Sep 30 21:51:21 compute-0 nova_compute[192810]: 2025-09-30 21:51:21.491 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:21 compute-0 nova_compute[192810]: 2025-09-30 21:51:21.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:21.504 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:48:ea 10.100.0.2'], port_security=['fa:16:3e:5f:48:ea 10.100.0.2'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': '92360dac-27fd-4b9d-a568-c4679e939762', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0eb83f8f-b5f5-45c8-b684-88cef476aa77', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e9cc864ebd6746dc8aae9d41edaa3753', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8d894498-b287-4558-85ab-ca9a26ef74bb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bef4e0af-8132-4565-9bbb-def95e4b0a48, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=f8816163-e9f2-4e5b-8430-35d972cba5ec) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:51:21 compute-0 systemd[1]: machine-qemu\x2d82\x2dinstance\x2d000000ae.scope: Deactivated successfully.
Sep 30 21:51:21 compute-0 systemd[1]: machine-qemu\x2d82\x2dinstance\x2d000000ae.scope: Consumed 4.983s CPU time.
Sep 30 21:51:21 compute-0 systemd-machined[152794]: Machine qemu-82-instance-000000ae terminated.
Sep 30 21:51:21 compute-0 podman[248013]: 2025-09-30 21:51:21.548552271 +0000 UTC m=+0.090145579 container create def1e7b32e2eeaa6858be2b9f530e220493f44651d18b50ebf15367f4eb03aec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:51:21 compute-0 podman[248013]: 2025-09-30 21:51:21.484953482 +0000 UTC m=+0.026546810 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:51:21 compute-0 systemd[1]: Started libpod-conmon-def1e7b32e2eeaa6858be2b9f530e220493f44651d18b50ebf15367f4eb03aec.scope.
Sep 30 21:51:21 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:51:21 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c1c891a36d8d8426f4ee9b195cafff3117ea5eb85451e2bdfebf1901e9dcae64/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:51:21 compute-0 podman[248013]: 2025-09-30 21:51:21.665371502 +0000 UTC m=+0.206964840 container init def1e7b32e2eeaa6858be2b9f530e220493f44651d18b50ebf15367f4eb03aec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:51:21 compute-0 podman[248013]: 2025-09-30 21:51:21.674347314 +0000 UTC m=+0.215940622 container start def1e7b32e2eeaa6858be2b9f530e220493f44651d18b50ebf15367f4eb03aec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS)
Sep 30 21:51:21 compute-0 neutron-haproxy-ovnmeta-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc[248029]: [NOTICE]   (248035) : New worker (248046) forked
Sep 30 21:51:21 compute-0 nova_compute[192810]: 2025-09-30 21:51:21.700 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759269081.699107, 06072d75-591c-4422-8b92-2176427d6b4d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:51:21 compute-0 neutron-haproxy-ovnmeta-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc[248029]: [NOTICE]   (248035) : Loading success.
Sep 30 21:51:21 compute-0 nova_compute[192810]: 2025-09-30 21:51:21.701 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] VM Started (Lifecycle Event)
Sep 30 21:51:21 compute-0 nova_compute[192810]: 2025-09-30 21:51:21.703 2 DEBUG nova.compute.manager [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Instance event wait completed in 0 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:51:21 compute-0 nova_compute[192810]: 2025-09-30 21:51:21.723 2 DEBUG nova.virt.libvirt.driver [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:51:21 compute-0 nova_compute[192810]: 2025-09-30 21:51:21.728 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:51:21 compute-0 nova_compute[192810]: 2025-09-30 21:51:21.732 2 INFO nova.virt.libvirt.driver [-] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Instance spawned successfully.
Sep 30 21:51:21 compute-0 nova_compute[192810]: 2025-09-30 21:51:21.733 2 DEBUG nova.virt.libvirt.driver [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:51:21 compute-0 nova_compute[192810]: 2025-09-30 21:51:21.735 2 DEBUG nova.compute.manager [None req-8958e1cb-e283-4a05-a5de-c4d2267cee3e 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:51:21 compute-0 nova_compute[192810]: 2025-09-30 21:51:21.737 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:51:21 compute-0 nova_compute[192810]: 2025-09-30 21:51:21.758 2 DEBUG nova.network.neutron [req-274453f2-036a-4ed5-99e7-f98e0e2391b9 req-134cfa37-29e1-431c-8309-338a2e778809 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Updated VIF entry in instance network info cache for port 25422545-2c61-41ab-9982-ca7761a24544. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:51:21 compute-0 nova_compute[192810]: 2025-09-30 21:51:21.758 2 DEBUG nova.network.neutron [req-274453f2-036a-4ed5-99e7-f98e0e2391b9 req-134cfa37-29e1-431c-8309-338a2e778809 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Updating instance_info_cache with network_info: [{"id": "25422545-2c61-41ab-9982-ca7761a24544", "address": "fa:16:3e:b0:c4:93", "network": {"id": "4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc", "bridge": "br-int", "label": "tempest-network-smoke--1429782878", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25422545-2c", "ovs_interfaceid": "25422545-2c61-41ab-9982-ca7761a24544", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6c5de877-502e-47f1-b767-d9d472689330", "address": "fa:16:3e:31:ed:85", "network": {"id": "cd6da069-7a88-49b7-bea7-1ceb7132f614", "bridge": "br-int", "label": "tempest-network-smoke--1719254685", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe31:ed85", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c5de877-50", "ovs_interfaceid": "6c5de877-502e-47f1-b767-d9d472689330", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:21.779 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 6c5de877-502e-47f1-b767-d9d472689330 in datapath cd6da069-7a88-49b7-bea7-1ceb7132f614 unbound from our chassis
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:21.780 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network cd6da069-7a88-49b7-bea7-1ceb7132f614
Sep 30 21:51:21 compute-0 nova_compute[192810]: 2025-09-30 21:51:21.782 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:51:21 compute-0 nova_compute[192810]: 2025-09-30 21:51:21.782 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759269081.6992066, 06072d75-591c-4422-8b92-2176427d6b4d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:51:21 compute-0 nova_compute[192810]: 2025-09-30 21:51:21.782 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] VM Paused (Lifecycle Event)
Sep 30 21:51:21 compute-0 nova_compute[192810]: 2025-09-30 21:51:21.787 2 DEBUG nova.virt.libvirt.driver [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:51:21 compute-0 nova_compute[192810]: 2025-09-30 21:51:21.787 2 DEBUG nova.virt.libvirt.driver [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:51:21 compute-0 nova_compute[192810]: 2025-09-30 21:51:21.788 2 DEBUG nova.virt.libvirt.driver [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:51:21 compute-0 nova_compute[192810]: 2025-09-30 21:51:21.788 2 DEBUG nova.virt.libvirt.driver [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:51:21 compute-0 nova_compute[192810]: 2025-09-30 21:51:21.789 2 DEBUG nova.virt.libvirt.driver [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:51:21 compute-0 nova_compute[192810]: 2025-09-30 21:51:21.789 2 DEBUG nova.virt.libvirt.driver [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:21.793 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[784293af-48f2-4693-b0dd-5b2c6c192621]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:21.794 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapcd6da069-71 in ovnmeta-cd6da069-7a88-49b7-bea7-1ceb7132f614 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:21.796 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapcd6da069-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:21.796 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[7d99611d-2c9d-44cd-8135-9fc30e26f219]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:21.797 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[c8bdd872-74e2-4da5-8925-37a6ebef9f49]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:21.807 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[02fe328f-e175-4f5d-be4d-644da8c61ba4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:21 compute-0 nova_compute[192810]: 2025-09-30 21:51:21.823 2 DEBUG oslo_concurrency.lockutils [req-274453f2-036a-4ed5-99e7-f98e0e2391b9 req-134cfa37-29e1-431c-8309-338a2e778809 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-06072d75-591c-4422-8b92-2176427d6b4d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:51:21 compute-0 nova_compute[192810]: 2025-09-30 21:51:21.824 2 DEBUG oslo_concurrency.lockutils [req-2bfc6c0b-b2da-41bb-b831-35c5b46f0ee8 req-d028e806-5920-4633-b33a-c4214f873eac dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-06072d75-591c-4422-8b92-2176427d6b4d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:51:21 compute-0 nova_compute[192810]: 2025-09-30 21:51:21.825 2 DEBUG nova.network.neutron [req-2bfc6c0b-b2da-41bb-b831-35c5b46f0ee8 req-d028e806-5920-4633-b33a-c4214f873eac dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Refreshing network info cache for port 6c5de877-502e-47f1-b767-d9d472689330 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:51:21 compute-0 nova_compute[192810]: 2025-09-30 21:51:21.827 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:51:21 compute-0 nova_compute[192810]: 2025-09-30 21:51:21.830 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759269081.713679, 06072d75-591c-4422-8b92-2176427d6b4d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:21.830 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e35a801e-6570-47e1-b586-af0dd8476a7c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:21 compute-0 nova_compute[192810]: 2025-09-30 21:51:21.830 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] VM Resumed (Lifecycle Event)
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:21.859 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[02de1cb6-281f-48ef-ab52-619f421838d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:21 compute-0 systemd-udevd[247956]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:51:21 compute-0 NetworkManager[51733]: <info>  [1759269081.8681] manager: (tapcd6da069-70): new Veth device (/org/freedesktop/NetworkManager/Devices/305)
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:21.867 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[c46ad9c2-00fa-4201-81f0-4f687b9cb28d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:21 compute-0 nova_compute[192810]: 2025-09-30 21:51:21.882 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:51:21 compute-0 nova_compute[192810]: 2025-09-30 21:51:21.886 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:21.900 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[c7372df2-e539-4fc0-a971-ad9b2216156a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:21.903 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[6490edce-498e-45a8-81df-82a115e1ddfb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:21 compute-0 nova_compute[192810]: 2025-09-30 21:51:21.923 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:51:21 compute-0 NetworkManager[51733]: <info>  [1759269081.9264] device (tapcd6da069-70): carrier: link connected
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:21.933 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[47f0523b-ef46-45ab-a615-acc5f94803b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:21 compute-0 nova_compute[192810]: 2025-09-30 21:51:21.940 2 INFO nova.compute.manager [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Took 12.45 seconds to spawn the instance on the hypervisor.
Sep 30 21:51:21 compute-0 nova_compute[192810]: 2025-09-30 21:51:21.940 2 DEBUG nova.compute.manager [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:21.952 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[0afe729f-be11-4683-9393-4d2e3fae558c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcd6da069-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ff:39:71'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 207], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 577755, 'reachable_time': 35703, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 248074, 'error': None, 'target': 'ovnmeta-cd6da069-7a88-49b7-bea7-1ceb7132f614', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:21.968 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[4542cbe8-c367-4fff-93f1-d86ab138b992]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feff:3971'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 577755, 'tstamp': 577755}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 248075, 'error': None, 'target': 'ovnmeta-cd6da069-7a88-49b7-bea7-1ceb7132f614', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:21.985 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[de47e35f-7a99-4881-8bcc-9a988276a0a5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapcd6da069-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ff:39:71'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 207], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 577755, 'reachable_time': 35703, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 248076, 'error': None, 'target': 'ovnmeta-cd6da069-7a88-49b7-bea7-1ceb7132f614', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:22.020 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[c8f2f511-70cd-4645-b632-d7db7def4b71]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:22 compute-0 nova_compute[192810]: 2025-09-30 21:51:22.031 2 INFO nova.compute.manager [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Took 13.71 seconds to build instance.
Sep 30 21:51:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:22.047 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[daad2371-4fba-4bea-9596-7098872a141c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:22.048 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcd6da069-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:51:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:22.048 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:51:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:22.049 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcd6da069-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:51:22 compute-0 nova_compute[192810]: 2025-09-30 21:51:22.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:22 compute-0 NetworkManager[51733]: <info>  [1759269082.0513] manager: (tapcd6da069-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/306)
Sep 30 21:51:22 compute-0 kernel: tapcd6da069-70: entered promiscuous mode
Sep 30 21:51:22 compute-0 nova_compute[192810]: 2025-09-30 21:51:22.056 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:22.056 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapcd6da069-70, col_values=(('external_ids', {'iface-id': 'e5655641-a4f8-4024-be6c-065dbfac4615'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:51:22 compute-0 ovn_controller[94912]: 2025-09-30T21:51:22Z|00684|binding|INFO|Releasing lport e5655641-a4f8-4024-be6c-065dbfac4615 from this chassis (sb_readonly=0)
Sep 30 21:51:22 compute-0 nova_compute[192810]: 2025-09-30 21:51:22.057 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:22 compute-0 nova_compute[192810]: 2025-09-30 21:51:22.062 2 DEBUG oslo_concurrency.lockutils [None req-44d2de2a-232c-4cf0-a7f1-6b9d38aed047 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "06072d75-591c-4422-8b92-2176427d6b4d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:22 compute-0 nova_compute[192810]: 2025-09-30 21:51:22.069 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:22 compute-0 nova_compute[192810]: 2025-09-30 21:51:22.073 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:22.073 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/cd6da069-7a88-49b7-bea7-1ceb7132f614.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/cd6da069-7a88-49b7-bea7-1ceb7132f614.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:51:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:22.074 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[dd162ca8-33ee-4eb4-9edd-20cbfb1e9495]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:22.074 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:51:22 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:51:22 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:51:22 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-cd6da069-7a88-49b7-bea7-1ceb7132f614
Sep 30 21:51:22 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:51:22 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:51:22 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:51:22 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/cd6da069-7a88-49b7-bea7-1ceb7132f614.pid.haproxy
Sep 30 21:51:22 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:51:22 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:51:22 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:51:22 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:51:22 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:51:22 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:51:22 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:51:22 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:51:22 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:51:22 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:51:22 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:51:22 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:51:22 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:51:22 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:51:22 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:51:22 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:51:22 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:51:22 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:51:22 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:51:22 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:51:22 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID cd6da069-7a88-49b7-bea7-1ceb7132f614
Sep 30 21:51:22 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:51:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:22.075 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-cd6da069-7a88-49b7-bea7-1ceb7132f614', 'env', 'PROCESS_TAG=haproxy-cd6da069-7a88-49b7-bea7-1ceb7132f614', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/cd6da069-7a88-49b7-bea7-1ceb7132f614.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:51:22 compute-0 sshd-session[247904]: Failed password for root from 8.210.178.40 port 56374 ssh2
Sep 30 21:51:22 compute-0 podman[248106]: 2025-09-30 21:51:22.441320417 +0000 UTC m=+0.065593369 container create 7b8f6316b2eba5925630ec40fa31649f0743df55257f9b3759f700922ebcb129 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd6da069-7a88-49b7-bea7-1ceb7132f614, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:51:22 compute-0 systemd[1]: Started libpod-conmon-7b8f6316b2eba5925630ec40fa31649f0743df55257f9b3759f700922ebcb129.scope.
Sep 30 21:51:22 compute-0 podman[248106]: 2025-09-30 21:51:22.397440948 +0000 UTC m=+0.021713890 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:51:22 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:51:22 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fc082b0ee0d5f1400b9c26adbe7f9a06bf5ff4af2ec5d8e7c3687cd157a57538/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:51:22 compute-0 podman[248106]: 2025-09-30 21:51:22.539077565 +0000 UTC m=+0.163350487 container init 7b8f6316b2eba5925630ec40fa31649f0743df55257f9b3759f700922ebcb129 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd6da069-7a88-49b7-bea7-1ceb7132f614, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3)
Sep 30 21:51:22 compute-0 podman[248106]: 2025-09-30 21:51:22.545145125 +0000 UTC m=+0.169418047 container start 7b8f6316b2eba5925630ec40fa31649f0743df55257f9b3759f700922ebcb129 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd6da069-7a88-49b7-bea7-1ceb7132f614, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Sep 30 21:51:22 compute-0 podman[248124]: 2025-09-30 21:51:22.555606555 +0000 UTC m=+0.063887758 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator 
team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:51:22 compute-0 neutron-haproxy-ovnmeta-cd6da069-7a88-49b7-bea7-1ceb7132f614[248132]: [NOTICE]   (248182) : New worker (248188) forked
Sep 30 21:51:22 compute-0 neutron-haproxy-ovnmeta-cd6da069-7a88-49b7-bea7-1ceb7132f614[248132]: [NOTICE]   (248182) : Loading success.
Sep 30 21:51:22 compute-0 podman[248123]: 2025-09-30 21:51:22.573624872 +0000 UTC m=+0.085559175 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:51:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:22.598 103867 INFO neutron.agent.ovn.metadata.agent [-] Port f8816163-e9f2-4e5b-8430-35d972cba5ec in datapath 0eb83f8f-b5f5-45c8-b684-88cef476aa77 unbound from our chassis
Sep 30 21:51:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:22.600 103867 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 0eb83f8f-b5f5-45c8-b684-88cef476aa77 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Sep 30 21:51:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:22.601 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[39bf7bd4-34fc-4b46-a9d6-3d5974d80917]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:22 compute-0 podman[248120]: 2025-09-30 21:51:22.60976872 +0000 UTC m=+0.122748769 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ovn_controller, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923)
Sep 30 21:51:22 compute-0 ovn_controller[94912]: 2025-09-30T21:51:22Z|00685|binding|INFO|Releasing lport e5655641-a4f8-4024-be6c-065dbfac4615 from this chassis (sb_readonly=0)
Sep 30 21:51:22 compute-0 ovn_controller[94912]: 2025-09-30T21:51:22Z|00686|binding|INFO|Releasing lport edbccbea-ad79-490c-a6b0-46510606db95 from this chassis (sb_readonly=0)
Sep 30 21:51:22 compute-0 nova_compute[192810]: 2025-09-30 21:51:22.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:23 compute-0 nova_compute[192810]: 2025-09-30 21:51:23.394 2 DEBUG nova.compute.manager [req-9ef421be-cf25-458d-bd81-c5797059f670 req-fae66079-8502-45ee-811d-7d9574ff663d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Received event network-vif-plugged-6c5de877-502e-47f1-b767-d9d472689330 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:51:23 compute-0 nova_compute[192810]: 2025-09-30 21:51:23.395 2 DEBUG oslo_concurrency.lockutils [req-9ef421be-cf25-458d-bd81-c5797059f670 req-fae66079-8502-45ee-811d-7d9574ff663d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "06072d75-591c-4422-8b92-2176427d6b4d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:23 compute-0 nova_compute[192810]: 2025-09-30 21:51:23.395 2 DEBUG oslo_concurrency.lockutils [req-9ef421be-cf25-458d-bd81-c5797059f670 req-fae66079-8502-45ee-811d-7d9574ff663d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "06072d75-591c-4422-8b92-2176427d6b4d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:23 compute-0 nova_compute[192810]: 2025-09-30 21:51:23.395 2 DEBUG oslo_concurrency.lockutils [req-9ef421be-cf25-458d-bd81-c5797059f670 req-fae66079-8502-45ee-811d-7d9574ff663d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "06072d75-591c-4422-8b92-2176427d6b4d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:23 compute-0 nova_compute[192810]: 2025-09-30 21:51:23.396 2 DEBUG nova.compute.manager [req-9ef421be-cf25-458d-bd81-c5797059f670 req-fae66079-8502-45ee-811d-7d9574ff663d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] No waiting events found dispatching network-vif-plugged-6c5de877-502e-47f1-b767-d9d472689330 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:51:23 compute-0 nova_compute[192810]: 2025-09-30 21:51:23.396 2 WARNING nova.compute.manager [req-9ef421be-cf25-458d-bd81-c5797059f670 req-fae66079-8502-45ee-811d-7d9574ff663d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Received unexpected event network-vif-plugged-6c5de877-502e-47f1-b767-d9d472689330 for instance with vm_state active and task_state None.
Sep 30 21:51:23 compute-0 nova_compute[192810]: 2025-09-30 21:51:23.396 2 DEBUG nova.compute.manager [req-9ef421be-cf25-458d-bd81-c5797059f670 req-fae66079-8502-45ee-811d-7d9574ff663d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Received event network-vif-unplugged-f8816163-e9f2-4e5b-8430-35d972cba5ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:51:23 compute-0 nova_compute[192810]: 2025-09-30 21:51:23.396 2 DEBUG oslo_concurrency.lockutils [req-9ef421be-cf25-458d-bd81-c5797059f670 req-fae66079-8502-45ee-811d-7d9574ff663d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "92360dac-27fd-4b9d-a568-c4679e939762-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:23 compute-0 nova_compute[192810]: 2025-09-30 21:51:23.397 2 DEBUG oslo_concurrency.lockutils [req-9ef421be-cf25-458d-bd81-c5797059f670 req-fae66079-8502-45ee-811d-7d9574ff663d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "92360dac-27fd-4b9d-a568-c4679e939762-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:23 compute-0 nova_compute[192810]: 2025-09-30 21:51:23.397 2 DEBUG oslo_concurrency.lockutils [req-9ef421be-cf25-458d-bd81-c5797059f670 req-fae66079-8502-45ee-811d-7d9574ff663d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "92360dac-27fd-4b9d-a568-c4679e939762-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:23 compute-0 nova_compute[192810]: 2025-09-30 21:51:23.397 2 DEBUG nova.compute.manager [req-9ef421be-cf25-458d-bd81-c5797059f670 req-fae66079-8502-45ee-811d-7d9574ff663d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] No waiting events found dispatching network-vif-unplugged-f8816163-e9f2-4e5b-8430-35d972cba5ec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:51:23 compute-0 nova_compute[192810]: 2025-09-30 21:51:23.398 2 WARNING nova.compute.manager [req-9ef421be-cf25-458d-bd81-c5797059f670 req-fae66079-8502-45ee-811d-7d9574ff663d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Received unexpected event network-vif-unplugged-f8816163-e9f2-4e5b-8430-35d972cba5ec for instance with vm_state suspended and task_state None.
Sep 30 21:51:23 compute-0 nova_compute[192810]: 2025-09-30 21:51:23.398 2 DEBUG nova.compute.manager [req-9ef421be-cf25-458d-bd81-c5797059f670 req-fae66079-8502-45ee-811d-7d9574ff663d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Received event network-vif-plugged-f8816163-e9f2-4e5b-8430-35d972cba5ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:51:23 compute-0 nova_compute[192810]: 2025-09-30 21:51:23.398 2 DEBUG oslo_concurrency.lockutils [req-9ef421be-cf25-458d-bd81-c5797059f670 req-fae66079-8502-45ee-811d-7d9574ff663d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "92360dac-27fd-4b9d-a568-c4679e939762-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:23 compute-0 nova_compute[192810]: 2025-09-30 21:51:23.398 2 DEBUG oslo_concurrency.lockutils [req-9ef421be-cf25-458d-bd81-c5797059f670 req-fae66079-8502-45ee-811d-7d9574ff663d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "92360dac-27fd-4b9d-a568-c4679e939762-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:23 compute-0 nova_compute[192810]: 2025-09-30 21:51:23.399 2 DEBUG oslo_concurrency.lockutils [req-9ef421be-cf25-458d-bd81-c5797059f670 req-fae66079-8502-45ee-811d-7d9574ff663d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "92360dac-27fd-4b9d-a568-c4679e939762-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:23 compute-0 nova_compute[192810]: 2025-09-30 21:51:23.399 2 DEBUG nova.compute.manager [req-9ef421be-cf25-458d-bd81-c5797059f670 req-fae66079-8502-45ee-811d-7d9574ff663d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] No waiting events found dispatching network-vif-plugged-f8816163-e9f2-4e5b-8430-35d972cba5ec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:51:23 compute-0 nova_compute[192810]: 2025-09-30 21:51:23.399 2 WARNING nova.compute.manager [req-9ef421be-cf25-458d-bd81-c5797059f670 req-fae66079-8502-45ee-811d-7d9574ff663d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Received unexpected event network-vif-plugged-f8816163-e9f2-4e5b-8430-35d972cba5ec for instance with vm_state suspended and task_state None.
Sep 30 21:51:23 compute-0 nova_compute[192810]: 2025-09-30 21:51:23.411 2 DEBUG nova.compute.manager [req-3de7630a-8dcc-4435-bbb4-f76fb478eca5 req-e923f505-0750-416b-8af1-59d17a37b0e6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Received event network-vif-plugged-25422545-2c61-41ab-9982-ca7761a24544 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:51:23 compute-0 nova_compute[192810]: 2025-09-30 21:51:23.412 2 DEBUG oslo_concurrency.lockutils [req-3de7630a-8dcc-4435-bbb4-f76fb478eca5 req-e923f505-0750-416b-8af1-59d17a37b0e6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "06072d75-591c-4422-8b92-2176427d6b4d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:23 compute-0 nova_compute[192810]: 2025-09-30 21:51:23.412 2 DEBUG oslo_concurrency.lockutils [req-3de7630a-8dcc-4435-bbb4-f76fb478eca5 req-e923f505-0750-416b-8af1-59d17a37b0e6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "06072d75-591c-4422-8b92-2176427d6b4d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:23 compute-0 nova_compute[192810]: 2025-09-30 21:51:23.412 2 DEBUG oslo_concurrency.lockutils [req-3de7630a-8dcc-4435-bbb4-f76fb478eca5 req-e923f505-0750-416b-8af1-59d17a37b0e6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "06072d75-591c-4422-8b92-2176427d6b4d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:23 compute-0 nova_compute[192810]: 2025-09-30 21:51:23.412 2 DEBUG nova.compute.manager [req-3de7630a-8dcc-4435-bbb4-f76fb478eca5 req-e923f505-0750-416b-8af1-59d17a37b0e6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] No waiting events found dispatching network-vif-plugged-25422545-2c61-41ab-9982-ca7761a24544 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:51:23 compute-0 nova_compute[192810]: 2025-09-30 21:51:23.413 2 WARNING nova.compute.manager [req-3de7630a-8dcc-4435-bbb4-f76fb478eca5 req-e923f505-0750-416b-8af1-59d17a37b0e6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Received unexpected event network-vif-plugged-25422545-2c61-41ab-9982-ca7761a24544 for instance with vm_state active and task_state None.
Sep 30 21:51:23 compute-0 nova_compute[192810]: 2025-09-30 21:51:23.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:24 compute-0 nova_compute[192810]: 2025-09-30 21:51:24.123 2 DEBUG nova.network.neutron [req-2bfc6c0b-b2da-41bb-b831-35c5b46f0ee8 req-d028e806-5920-4633-b33a-c4214f873eac dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Updated VIF entry in instance network info cache for port 6c5de877-502e-47f1-b767-d9d472689330. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:51:24 compute-0 nova_compute[192810]: 2025-09-30 21:51:24.124 2 DEBUG nova.network.neutron [req-2bfc6c0b-b2da-41bb-b831-35c5b46f0ee8 req-d028e806-5920-4633-b33a-c4214f873eac dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Updating instance_info_cache with network_info: [{"id": "25422545-2c61-41ab-9982-ca7761a24544", "address": "fa:16:3e:b0:c4:93", "network": {"id": "4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc", "bridge": "br-int", "label": "tempest-network-smoke--1429782878", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25422545-2c", "ovs_interfaceid": "25422545-2c61-41ab-9982-ca7761a24544", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6c5de877-502e-47f1-b767-d9d472689330", "address": "fa:16:3e:31:ed:85", "network": {"id": "cd6da069-7a88-49b7-bea7-1ceb7132f614", "bridge": "br-int", "label": "tempest-network-smoke--1719254685", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe31:ed85", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c5de877-50", "ovs_interfaceid": "6c5de877-502e-47f1-b767-d9d472689330", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:51:24 compute-0 nova_compute[192810]: 2025-09-30 21:51:24.151 2 INFO nova.compute.manager [None req-c0d83966-0497-4d0f-9319-07cd8f7e3bce 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Resuming
Sep 30 21:51:24 compute-0 nova_compute[192810]: 2025-09-30 21:51:24.152 2 DEBUG nova.objects.instance [None req-c0d83966-0497-4d0f-9319-07cd8f7e3bce 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Lazy-loading 'flavor' on Instance uuid 92360dac-27fd-4b9d-a568-c4679e939762 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:51:24 compute-0 nova_compute[192810]: 2025-09-30 21:51:24.186 2 DEBUG oslo_concurrency.lockutils [req-2bfc6c0b-b2da-41bb-b831-35c5b46f0ee8 req-d028e806-5920-4633-b33a-c4214f873eac dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-06072d75-591c-4422-8b92-2176427d6b4d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:51:24 compute-0 nova_compute[192810]: 2025-09-30 21:51:24.212 2 DEBUG oslo_concurrency.lockutils [None req-c0d83966-0497-4d0f-9319-07cd8f7e3bce 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Acquiring lock "refresh_cache-92360dac-27fd-4b9d-a568-c4679e939762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:51:24 compute-0 nova_compute[192810]: 2025-09-30 21:51:24.213 2 DEBUG oslo_concurrency.lockutils [None req-c0d83966-0497-4d0f-9319-07cd8f7e3bce 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Acquired lock "refresh_cache-92360dac-27fd-4b9d-a568-c4679e939762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:51:24 compute-0 nova_compute[192810]: 2025-09-30 21:51:24.213 2 DEBUG nova.network.neutron [None req-c0d83966-0497-4d0f-9319-07cd8f7e3bce 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:51:24 compute-0 unix_chkpwd[248201]: password check failed for user (root)
Sep 30 21:51:24 compute-0 nova_compute[192810]: 2025-09-30 21:51:24.997 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:25 compute-0 nova_compute[192810]: 2025-09-30 21:51:25.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:25 compute-0 NetworkManager[51733]: <info>  [1759269085.0249] manager: (patch-br-int-to-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/307)
Sep 30 21:51:25 compute-0 NetworkManager[51733]: <info>  [1759269085.0258] manager: (patch-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/308)
Sep 30 21:51:25 compute-0 ovn_controller[94912]: 2025-09-30T21:51:25Z|00687|binding|INFO|Releasing lport e5655641-a4f8-4024-be6c-065dbfac4615 from this chassis (sb_readonly=0)
Sep 30 21:51:25 compute-0 ovn_controller[94912]: 2025-09-30T21:51:25Z|00688|binding|INFO|Releasing lport edbccbea-ad79-490c-a6b0-46510606db95 from this chassis (sb_readonly=0)
Sep 30 21:51:25 compute-0 nova_compute[192810]: 2025-09-30 21:51:25.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:25 compute-0 nova_compute[192810]: 2025-09-30 21:51:25.562 2 DEBUG nova.compute.manager [req-00db00c4-7cce-44e5-a401-44c22c2bce89 req-e56ec250-8444-48d6-99bb-6bab00be7862 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Received event network-changed-25422545-2c61-41ab-9982-ca7761a24544 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:51:25 compute-0 nova_compute[192810]: 2025-09-30 21:51:25.563 2 DEBUG nova.compute.manager [req-00db00c4-7cce-44e5-a401-44c22c2bce89 req-e56ec250-8444-48d6-99bb-6bab00be7862 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Refreshing instance network info cache due to event network-changed-25422545-2c61-41ab-9982-ca7761a24544. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:51:25 compute-0 nova_compute[192810]: 2025-09-30 21:51:25.563 2 DEBUG oslo_concurrency.lockutils [req-00db00c4-7cce-44e5-a401-44c22c2bce89 req-e56ec250-8444-48d6-99bb-6bab00be7862 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-06072d75-591c-4422-8b92-2176427d6b4d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:51:25 compute-0 nova_compute[192810]: 2025-09-30 21:51:25.563 2 DEBUG oslo_concurrency.lockutils [req-00db00c4-7cce-44e5-a401-44c22c2bce89 req-e56ec250-8444-48d6-99bb-6bab00be7862 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-06072d75-591c-4422-8b92-2176427d6b4d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:51:25 compute-0 nova_compute[192810]: 2025-09-30 21:51:25.564 2 DEBUG nova.network.neutron [req-00db00c4-7cce-44e5-a401-44c22c2bce89 req-e56ec250-8444-48d6-99bb-6bab00be7862 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Refreshing network info cache for port 25422545-2c61-41ab-9982-ca7761a24544 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:51:26 compute-0 sshd-session[247904]: Failed password for root from 8.210.178.40 port 56374 ssh2
Sep 30 21:51:26 compute-0 nova_compute[192810]: 2025-09-30 21:51:26.956 2 DEBUG nova.network.neutron [None req-c0d83966-0497-4d0f-9319-07cd8f7e3bce 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Updating instance_info_cache with network_info: [{"id": "f8816163-e9f2-4e5b-8430-35d972cba5ec", "address": "fa:16:3e:5f:48:ea", "network": {"id": "0eb83f8f-b5f5-45c8-b684-88cef476aa77", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-1825927682-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e9cc864ebd6746dc8aae9d41edaa3753", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8816163-e9", "ovs_interfaceid": "f8816163-e9f2-4e5b-8430-35d972cba5ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:51:26 compute-0 nova_compute[192810]: 2025-09-30 21:51:26.985 2 DEBUG oslo_concurrency.lockutils [None req-c0d83966-0497-4d0f-9319-07cd8f7e3bce 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Releasing lock "refresh_cache-92360dac-27fd-4b9d-a568-c4679e939762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:51:27 compute-0 nova_compute[192810]: 2025-09-30 21:51:27.029 2 DEBUG nova.virt.libvirt.vif [None req-c0d83966-0497-4d0f-9319-07cd8f7e3bce 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:51:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerAdvancedOps-server-1327581824',display_name='tempest-TestServerAdvancedOps-server-1327581824',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testserveradvancedops-server-1327581824',id=174,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:51:16Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='e9cc864ebd6746dc8aae9d41edaa3753',ramdisk_id='',reservation_id='r-2lfte01b',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestServerAdvancedOps-517934233',owner_user_name='tempest-TestServerAdvancedOps-517934233-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:51:21Z,user_data=None,user_id='11769cb7bff84bcc8c33eaff10e3e1ba',uuid=92360dac-27fd-4b9d-a568-c4679e939762,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "f8816163-e9f2-4e5b-8430-35d972cba5ec", "address": "fa:16:3e:5f:48:ea", "network": {"id": "0eb83f8f-b5f5-45c8-b684-88cef476aa77", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-1825927682-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e9cc864ebd6746dc8aae9d41edaa3753", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8816163-e9", "ovs_interfaceid": "f8816163-e9f2-4e5b-8430-35d972cba5ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:51:27 compute-0 nova_compute[192810]: 2025-09-30 21:51:27.031 2 DEBUG nova.network.os_vif_util [None req-c0d83966-0497-4d0f-9319-07cd8f7e3bce 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Converting VIF {"id": "f8816163-e9f2-4e5b-8430-35d972cba5ec", "address": "fa:16:3e:5f:48:ea", "network": {"id": "0eb83f8f-b5f5-45c8-b684-88cef476aa77", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-1825927682-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e9cc864ebd6746dc8aae9d41edaa3753", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8816163-e9", "ovs_interfaceid": "f8816163-e9f2-4e5b-8430-35d972cba5ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:51:27 compute-0 nova_compute[192810]: 2025-09-30 21:51:27.032 2 DEBUG nova.network.os_vif_util [None req-c0d83966-0497-4d0f-9319-07cd8f7e3bce 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:48:ea,bridge_name='br-int',has_traffic_filtering=True,id=f8816163-e9f2-4e5b-8430-35d972cba5ec,network=Network(0eb83f8f-b5f5-45c8-b684-88cef476aa77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf8816163-e9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:51:27 compute-0 nova_compute[192810]: 2025-09-30 21:51:27.032 2 DEBUG os_vif [None req-c0d83966-0497-4d0f-9319-07cd8f7e3bce 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:48:ea,bridge_name='br-int',has_traffic_filtering=True,id=f8816163-e9f2-4e5b-8430-35d972cba5ec,network=Network(0eb83f8f-b5f5-45c8-b684-88cef476aa77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf8816163-e9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:51:27 compute-0 nova_compute[192810]: 2025-09-30 21:51:27.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:27 compute-0 nova_compute[192810]: 2025-09-30 21:51:27.033 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:51:27 compute-0 nova_compute[192810]: 2025-09-30 21:51:27.034 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:51:27 compute-0 nova_compute[192810]: 2025-09-30 21:51:27.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:27 compute-0 nova_compute[192810]: 2025-09-30 21:51:27.036 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:27 compute-0 nova_compute[192810]: 2025-09-30 21:51:27.037 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf8816163-e9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:51:27 compute-0 nova_compute[192810]: 2025-09-30 21:51:27.037 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf8816163-e9, col_values=(('external_ids', {'iface-id': 'f8816163-e9f2-4e5b-8430-35d972cba5ec', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5f:48:ea', 'vm-uuid': '92360dac-27fd-4b9d-a568-c4679e939762'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:51:27 compute-0 nova_compute[192810]: 2025-09-30 21:51:27.038 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:51:27 compute-0 nova_compute[192810]: 2025-09-30 21:51:27.038 2 INFO os_vif [None req-c0d83966-0497-4d0f-9319-07cd8f7e3bce 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:48:ea,bridge_name='br-int',has_traffic_filtering=True,id=f8816163-e9f2-4e5b-8430-35d972cba5ec,network=Network(0eb83f8f-b5f5-45c8-b684-88cef476aa77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf8816163-e9')
Sep 30 21:51:27 compute-0 nova_compute[192810]: 2025-09-30 21:51:27.063 2 DEBUG nova.objects.instance [None req-c0d83966-0497-4d0f-9319-07cd8f7e3bce 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Lazy-loading 'numa_topology' on Instance uuid 92360dac-27fd-4b9d-a568-c4679e939762 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:51:27 compute-0 NetworkManager[51733]: <info>  [1759269087.1485] manager: (tapf8816163-e9): new Tun device (/org/freedesktop/NetworkManager/Devices/309)
Sep 30 21:51:27 compute-0 kernel: tapf8816163-e9: entered promiscuous mode
Sep 30 21:51:27 compute-0 ovn_controller[94912]: 2025-09-30T21:51:27Z|00689|binding|INFO|Claiming lport f8816163-e9f2-4e5b-8430-35d972cba5ec for this chassis.
Sep 30 21:51:27 compute-0 ovn_controller[94912]: 2025-09-30T21:51:27Z|00690|binding|INFO|f8816163-e9f2-4e5b-8430-35d972cba5ec: Claiming fa:16:3e:5f:48:ea 10.100.0.2
Sep 30 21:51:27 compute-0 nova_compute[192810]: 2025-09-30 21:51:27.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:27 compute-0 systemd-udevd[248219]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:51:27 compute-0 systemd-machined[152794]: New machine qemu-84-instance-000000ae.
Sep 30 21:51:27 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:27.183 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:48:ea 10.100.0.2'], port_security=['fa:16:3e:5f:48:ea 10.100.0.2'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': '92360dac-27fd-4b9d-a568-c4679e939762', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0eb83f8f-b5f5-45c8-b684-88cef476aa77', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e9cc864ebd6746dc8aae9d41edaa3753', 'neutron:revision_number': '5', 'neutron:security_group_ids': '8d894498-b287-4558-85ab-ca9a26ef74bb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bef4e0af-8132-4565-9bbb-def95e4b0a48, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=f8816163-e9f2-4e5b-8430-35d972cba5ec) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:51:27 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:27.185 103867 INFO neutron.agent.ovn.metadata.agent [-] Port f8816163-e9f2-4e5b-8430-35d972cba5ec in datapath 0eb83f8f-b5f5-45c8-b684-88cef476aa77 bound to our chassis
Sep 30 21:51:27 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:27.186 103867 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 0eb83f8f-b5f5-45c8-b684-88cef476aa77 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Sep 30 21:51:27 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:27.187 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[822b67bc-20f8-43fb-97fb-f2e02ebddc56]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:27 compute-0 NetworkManager[51733]: <info>  [1759269087.1897] device (tapf8816163-e9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:51:27 compute-0 NetworkManager[51733]: <info>  [1759269087.1908] device (tapf8816163-e9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:51:27 compute-0 nova_compute[192810]: 2025-09-30 21:51:27.193 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:27 compute-0 ovn_controller[94912]: 2025-09-30T21:51:27Z|00691|binding|INFO|Setting lport f8816163-e9f2-4e5b-8430-35d972cba5ec ovn-installed in OVS
Sep 30 21:51:27 compute-0 ovn_controller[94912]: 2025-09-30T21:51:27Z|00692|binding|INFO|Setting lport f8816163-e9f2-4e5b-8430-35d972cba5ec up in Southbound
Sep 30 21:51:27 compute-0 nova_compute[192810]: 2025-09-30 21:51:27.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:27 compute-0 systemd[1]: Started Virtual Machine qemu-84-instance-000000ae.
Sep 30 21:51:27 compute-0 nova_compute[192810]: 2025-09-30 21:51:27.284 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759269072.2828846, 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:51:27 compute-0 nova_compute[192810]: 2025-09-30 21:51:27.285 2 INFO nova.compute.manager [-] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] VM Stopped (Lifecycle Event)
Sep 30 21:51:27 compute-0 nova_compute[192810]: 2025-09-30 21:51:27.314 2 DEBUG nova.compute.manager [None req-157f350a-94a7-4df7-982c-953d54abeccf - - - - - -] [instance: 49fa1fe9-fac0-491b-a5d9-ee6230be2b1f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:51:27 compute-0 nova_compute[192810]: 2025-09-30 21:51:27.860 2 DEBUG nova.virt.libvirt.host [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Removed pending event for 92360dac-27fd-4b9d-a568-c4679e939762 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Sep 30 21:51:27 compute-0 nova_compute[192810]: 2025-09-30 21:51:27.861 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759269087.8599885, 92360dac-27fd-4b9d-a568-c4679e939762 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:51:27 compute-0 nova_compute[192810]: 2025-09-30 21:51:27.861 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] VM Started (Lifecycle Event)
Sep 30 21:51:27 compute-0 nova_compute[192810]: 2025-09-30 21:51:27.881 2 DEBUG nova.compute.manager [None req-c0d83966-0497-4d0f-9319-07cd8f7e3bce 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:51:27 compute-0 nova_compute[192810]: 2025-09-30 21:51:27.881 2 DEBUG nova.objects.instance [None req-c0d83966-0497-4d0f-9319-07cd8f7e3bce 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Lazy-loading 'pci_devices' on Instance uuid 92360dac-27fd-4b9d-a568-c4679e939762 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:51:27 compute-0 nova_compute[192810]: 2025-09-30 21:51:27.885 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:51:27 compute-0 nova_compute[192810]: 2025-09-30 21:51:27.889 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Synchronizing instance power state after lifecycle event "Started"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:51:27 compute-0 nova_compute[192810]: 2025-09-30 21:51:27.899 2 INFO nova.virt.libvirt.driver [-] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Instance running successfully.
Sep 30 21:51:27 compute-0 virtqemud[192233]: argument unsupported: QEMU guest agent is not configured
Sep 30 21:51:27 compute-0 nova_compute[192810]: 2025-09-30 21:51:27.902 2 DEBUG nova.virt.libvirt.guest [None req-c0d83966-0497-4d0f-9319-07cd8f7e3bce 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Sep 30 21:51:27 compute-0 nova_compute[192810]: 2025-09-30 21:51:27.902 2 DEBUG nova.compute.manager [None req-c0d83966-0497-4d0f-9319-07cd8f7e3bce 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:51:27 compute-0 nova_compute[192810]: 2025-09-30 21:51:27.929 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] During sync_power_state the instance has a pending task (resuming). Skip.
Sep 30 21:51:27 compute-0 nova_compute[192810]: 2025-09-30 21:51:27.930 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759269087.8705037, 92360dac-27fd-4b9d-a568-c4679e939762 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:51:27 compute-0 nova_compute[192810]: 2025-09-30 21:51:27.930 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] VM Resumed (Lifecycle Event)
Sep 30 21:51:27 compute-0 nova_compute[192810]: 2025-09-30 21:51:27.963 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:51:27 compute-0 nova_compute[192810]: 2025-09-30 21:51:27.966 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:51:28 compute-0 ovn_controller[94912]: 2025-09-30T21:51:28Z|00693|binding|INFO|Releasing lport e5655641-a4f8-4024-be6c-065dbfac4615 from this chassis (sb_readonly=0)
Sep 30 21:51:28 compute-0 ovn_controller[94912]: 2025-09-30T21:51:28Z|00694|binding|INFO|Releasing lport edbccbea-ad79-490c-a6b0-46510606db95 from this chassis (sb_readonly=0)
Sep 30 21:51:28 compute-0 nova_compute[192810]: 2025-09-30 21:51:28.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:28 compute-0 nova_compute[192810]: 2025-09-30 21:51:28.451 2 DEBUG nova.network.neutron [req-00db00c4-7cce-44e5-a401-44c22c2bce89 req-e56ec250-8444-48d6-99bb-6bab00be7862 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Updated VIF entry in instance network info cache for port 25422545-2c61-41ab-9982-ca7761a24544. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:51:28 compute-0 nova_compute[192810]: 2025-09-30 21:51:28.452 2 DEBUG nova.network.neutron [req-00db00c4-7cce-44e5-a401-44c22c2bce89 req-e56ec250-8444-48d6-99bb-6bab00be7862 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Updating instance_info_cache with network_info: [{"id": "25422545-2c61-41ab-9982-ca7761a24544", "address": "fa:16:3e:b0:c4:93", "network": {"id": "4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc", "bridge": "br-int", "label": "tempest-network-smoke--1429782878", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.226", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25422545-2c", "ovs_interfaceid": "25422545-2c61-41ab-9982-ca7761a24544", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6c5de877-502e-47f1-b767-d9d472689330", "address": "fa:16:3e:31:ed:85", "network": {"id": "cd6da069-7a88-49b7-bea7-1ceb7132f614", "bridge": "br-int", "label": "tempest-network-smoke--1719254685", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe31:ed85", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c5de877-50", "ovs_interfaceid": "6c5de877-502e-47f1-b767-d9d472689330", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:51:28 compute-0 nova_compute[192810]: 2025-09-30 21:51:28.492 2 DEBUG oslo_concurrency.lockutils [req-00db00c4-7cce-44e5-a401-44c22c2bce89 req-e56ec250-8444-48d6-99bb-6bab00be7862 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-06072d75-591c-4422-8b92-2176427d6b4d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:51:28 compute-0 nova_compute[192810]: 2025-09-30 21:51:28.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:28 compute-0 unix_chkpwd[248236]: password check failed for user (root)
Sep 30 21:51:29 compute-0 nova_compute[192810]: 2025-09-30 21:51:29.426 2 DEBUG nova.objects.instance [None req-4cfbd929-0642-4681-96b1-80ec63fd98eb 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Lazy-loading 'pci_devices' on Instance uuid 92360dac-27fd-4b9d-a568-c4679e939762 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:51:29 compute-0 nova_compute[192810]: 2025-09-30 21:51:29.447 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759269089.4470391, 92360dac-27fd-4b9d-a568-c4679e939762 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:51:29 compute-0 nova_compute[192810]: 2025-09-30 21:51:29.448 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] VM Paused (Lifecycle Event)
Sep 30 21:51:29 compute-0 nova_compute[192810]: 2025-09-30 21:51:29.469 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:51:29 compute-0 nova_compute[192810]: 2025-09-30 21:51:29.476 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:51:29 compute-0 nova_compute[192810]: 2025-09-30 21:51:29.501 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] During sync_power_state the instance has a pending task (suspending). Skip.
Sep 30 21:51:29 compute-0 kernel: tapf8816163-e9 (unregistering): left promiscuous mode
Sep 30 21:51:29 compute-0 NetworkManager[51733]: <info>  [1759269089.9192] device (tapf8816163-e9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:51:29 compute-0 ovn_controller[94912]: 2025-09-30T21:51:29Z|00695|binding|INFO|Releasing lport f8816163-e9f2-4e5b-8430-35d972cba5ec from this chassis (sb_readonly=0)
Sep 30 21:51:29 compute-0 ovn_controller[94912]: 2025-09-30T21:51:29Z|00696|binding|INFO|Setting lport f8816163-e9f2-4e5b-8430-35d972cba5ec down in Southbound
Sep 30 21:51:29 compute-0 ovn_controller[94912]: 2025-09-30T21:51:29Z|00697|binding|INFO|Removing iface tapf8816163-e9 ovn-installed in OVS
Sep 30 21:51:29 compute-0 nova_compute[192810]: 2025-09-30 21:51:29.929 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:29.939 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:48:ea 10.100.0.2'], port_security=['fa:16:3e:5f:48:ea 10.100.0.2'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': '92360dac-27fd-4b9d-a568-c4679e939762', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0eb83f8f-b5f5-45c8-b684-88cef476aa77', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e9cc864ebd6746dc8aae9d41edaa3753', 'neutron:revision_number': '5', 'neutron:security_group_ids': '8d894498-b287-4558-85ab-ca9a26ef74bb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bef4e0af-8132-4565-9bbb-def95e4b0a48, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=f8816163-e9f2-4e5b-8430-35d972cba5ec) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:51:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:29.940 103867 INFO neutron.agent.ovn.metadata.agent [-] Port f8816163-e9f2-4e5b-8430-35d972cba5ec in datapath 0eb83f8f-b5f5-45c8-b684-88cef476aa77 unbound from our chassis
Sep 30 21:51:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:29.941 103867 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 0eb83f8f-b5f5-45c8-b684-88cef476aa77 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Sep 30 21:51:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:29.943 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[033d39f5-4b4b-440e-b0c4-fea3344b2fa1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:29 compute-0 nova_compute[192810]: 2025-09-30 21:51:29.944 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:29 compute-0 systemd[1]: machine-qemu\x2d84\x2dinstance\x2d000000ae.scope: Deactivated successfully.
Sep 30 21:51:29 compute-0 systemd[1]: machine-qemu\x2d84\x2dinstance\x2d000000ae.scope: Consumed 2.202s CPU time.
Sep 30 21:51:29 compute-0 systemd-machined[152794]: Machine qemu-84-instance-000000ae terminated.
Sep 30 21:51:29 compute-0 nova_compute[192810]: 2025-09-30 21:51:29.999 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:30 compute-0 nova_compute[192810]: 2025-09-30 21:51:30.151 2 DEBUG nova.compute.manager [None req-4cfbd929-0642-4681-96b1-80ec63fd98eb 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:51:30 compute-0 nova_compute[192810]: 2025-09-30 21:51:30.412 2 DEBUG nova.compute.manager [req-295ce3fd-67b1-43e0-9409-f2bc20bae19f req-c7d5ca8f-b274-4503-beed-e21b40b8b566 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Received event network-vif-plugged-f8816163-e9f2-4e5b-8430-35d972cba5ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:51:30 compute-0 nova_compute[192810]: 2025-09-30 21:51:30.412 2 DEBUG oslo_concurrency.lockutils [req-295ce3fd-67b1-43e0-9409-f2bc20bae19f req-c7d5ca8f-b274-4503-beed-e21b40b8b566 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "92360dac-27fd-4b9d-a568-c4679e939762-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:30 compute-0 nova_compute[192810]: 2025-09-30 21:51:30.413 2 DEBUG oslo_concurrency.lockutils [req-295ce3fd-67b1-43e0-9409-f2bc20bae19f req-c7d5ca8f-b274-4503-beed-e21b40b8b566 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "92360dac-27fd-4b9d-a568-c4679e939762-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:30 compute-0 nova_compute[192810]: 2025-09-30 21:51:30.413 2 DEBUG oslo_concurrency.lockutils [req-295ce3fd-67b1-43e0-9409-f2bc20bae19f req-c7d5ca8f-b274-4503-beed-e21b40b8b566 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "92360dac-27fd-4b9d-a568-c4679e939762-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:30 compute-0 nova_compute[192810]: 2025-09-30 21:51:30.413 2 DEBUG nova.compute.manager [req-295ce3fd-67b1-43e0-9409-f2bc20bae19f req-c7d5ca8f-b274-4503-beed-e21b40b8b566 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] No waiting events found dispatching network-vif-plugged-f8816163-e9f2-4e5b-8430-35d972cba5ec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:51:30 compute-0 nova_compute[192810]: 2025-09-30 21:51:30.413 2 WARNING nova.compute.manager [req-295ce3fd-67b1-43e0-9409-f2bc20bae19f req-c7d5ca8f-b274-4503-beed-e21b40b8b566 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Received unexpected event network-vif-plugged-f8816163-e9f2-4e5b-8430-35d972cba5ec for instance with vm_state suspended and task_state None.
Sep 30 21:51:31 compute-0 sshd-session[247904]: Failed password for root from 8.210.178.40 port 56374 ssh2
Sep 30 21:51:32 compute-0 podman[248261]: 2025-09-30 21:51:32.320657622 +0000 UTC m=+0.055473668 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:51:32 compute-0 podman[248262]: 2025-09-30 21:51:32.321975445 +0000 UTC m=+0.056331220 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, managed_by=edpm_ansible, vcs-type=git, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9-minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6)
Sep 30 21:51:32 compute-0 nova_compute[192810]: 2025-09-30 21:51:32.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:32 compute-0 unix_chkpwd[248306]: password check failed for user (root)
Sep 30 21:51:32 compute-0 nova_compute[192810]: 2025-09-30 21:51:32.585 2 DEBUG nova.compute.manager [req-f32d552e-689e-4e41-b931-2ce3503d4444 req-70e01c7e-6bb9-401f-adc9-10b90ef922ad dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Received event network-vif-plugged-f8816163-e9f2-4e5b-8430-35d972cba5ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:51:32 compute-0 nova_compute[192810]: 2025-09-30 21:51:32.585 2 DEBUG oslo_concurrency.lockutils [req-f32d552e-689e-4e41-b931-2ce3503d4444 req-70e01c7e-6bb9-401f-adc9-10b90ef922ad dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "92360dac-27fd-4b9d-a568-c4679e939762-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:32 compute-0 nova_compute[192810]: 2025-09-30 21:51:32.586 2 DEBUG oslo_concurrency.lockutils [req-f32d552e-689e-4e41-b931-2ce3503d4444 req-70e01c7e-6bb9-401f-adc9-10b90ef922ad dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "92360dac-27fd-4b9d-a568-c4679e939762-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:32 compute-0 nova_compute[192810]: 2025-09-30 21:51:32.586 2 DEBUG oslo_concurrency.lockutils [req-f32d552e-689e-4e41-b931-2ce3503d4444 req-70e01c7e-6bb9-401f-adc9-10b90ef922ad dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "92360dac-27fd-4b9d-a568-c4679e939762-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:32 compute-0 nova_compute[192810]: 2025-09-30 21:51:32.586 2 DEBUG nova.compute.manager [req-f32d552e-689e-4e41-b931-2ce3503d4444 req-70e01c7e-6bb9-401f-adc9-10b90ef922ad dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] No waiting events found dispatching network-vif-plugged-f8816163-e9f2-4e5b-8430-35d972cba5ec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:51:32 compute-0 nova_compute[192810]: 2025-09-30 21:51:32.586 2 WARNING nova.compute.manager [req-f32d552e-689e-4e41-b931-2ce3503d4444 req-70e01c7e-6bb9-401f-adc9-10b90ef922ad dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Received unexpected event network-vif-plugged-f8816163-e9f2-4e5b-8430-35d972cba5ec for instance with vm_state suspended and task_state None.
Sep 30 21:51:32 compute-0 nova_compute[192810]: 2025-09-30 21:51:32.586 2 DEBUG nova.compute.manager [req-f32d552e-689e-4e41-b931-2ce3503d4444 req-70e01c7e-6bb9-401f-adc9-10b90ef922ad dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Received event network-vif-unplugged-f8816163-e9f2-4e5b-8430-35d972cba5ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:51:32 compute-0 nova_compute[192810]: 2025-09-30 21:51:32.586 2 DEBUG oslo_concurrency.lockutils [req-f32d552e-689e-4e41-b931-2ce3503d4444 req-70e01c7e-6bb9-401f-adc9-10b90ef922ad dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "92360dac-27fd-4b9d-a568-c4679e939762-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:32 compute-0 nova_compute[192810]: 2025-09-30 21:51:32.587 2 DEBUG oslo_concurrency.lockutils [req-f32d552e-689e-4e41-b931-2ce3503d4444 req-70e01c7e-6bb9-401f-adc9-10b90ef922ad dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "92360dac-27fd-4b9d-a568-c4679e939762-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:32 compute-0 nova_compute[192810]: 2025-09-30 21:51:32.587 2 DEBUG oslo_concurrency.lockutils [req-f32d552e-689e-4e41-b931-2ce3503d4444 req-70e01c7e-6bb9-401f-adc9-10b90ef922ad dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "92360dac-27fd-4b9d-a568-c4679e939762-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:32 compute-0 nova_compute[192810]: 2025-09-30 21:51:32.587 2 DEBUG nova.compute.manager [req-f32d552e-689e-4e41-b931-2ce3503d4444 req-70e01c7e-6bb9-401f-adc9-10b90ef922ad dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] No waiting events found dispatching network-vif-unplugged-f8816163-e9f2-4e5b-8430-35d972cba5ec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:51:32 compute-0 nova_compute[192810]: 2025-09-30 21:51:32.587 2 WARNING nova.compute.manager [req-f32d552e-689e-4e41-b931-2ce3503d4444 req-70e01c7e-6bb9-401f-adc9-10b90ef922ad dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Received unexpected event network-vif-unplugged-f8816163-e9f2-4e5b-8430-35d972cba5ec for instance with vm_state suspended and task_state None.
Sep 30 21:51:32 compute-0 nova_compute[192810]: 2025-09-30 21:51:32.587 2 DEBUG nova.compute.manager [req-f32d552e-689e-4e41-b931-2ce3503d4444 req-70e01c7e-6bb9-401f-adc9-10b90ef922ad dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Received event network-vif-plugged-f8816163-e9f2-4e5b-8430-35d972cba5ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:51:32 compute-0 nova_compute[192810]: 2025-09-30 21:51:32.587 2 DEBUG oslo_concurrency.lockutils [req-f32d552e-689e-4e41-b931-2ce3503d4444 req-70e01c7e-6bb9-401f-adc9-10b90ef922ad dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "92360dac-27fd-4b9d-a568-c4679e939762-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:32 compute-0 nova_compute[192810]: 2025-09-30 21:51:32.588 2 DEBUG oslo_concurrency.lockutils [req-f32d552e-689e-4e41-b931-2ce3503d4444 req-70e01c7e-6bb9-401f-adc9-10b90ef922ad dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "92360dac-27fd-4b9d-a568-c4679e939762-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:32 compute-0 nova_compute[192810]: 2025-09-30 21:51:32.588 2 DEBUG oslo_concurrency.lockutils [req-f32d552e-689e-4e41-b931-2ce3503d4444 req-70e01c7e-6bb9-401f-adc9-10b90ef922ad dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "92360dac-27fd-4b9d-a568-c4679e939762-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:32 compute-0 nova_compute[192810]: 2025-09-30 21:51:32.588 2 DEBUG nova.compute.manager [req-f32d552e-689e-4e41-b931-2ce3503d4444 req-70e01c7e-6bb9-401f-adc9-10b90ef922ad dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] No waiting events found dispatching network-vif-plugged-f8816163-e9f2-4e5b-8430-35d972cba5ec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:51:32 compute-0 nova_compute[192810]: 2025-09-30 21:51:32.588 2 WARNING nova.compute.manager [req-f32d552e-689e-4e41-b931-2ce3503d4444 req-70e01c7e-6bb9-401f-adc9-10b90ef922ad dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Received unexpected event network-vif-plugged-f8816163-e9f2-4e5b-8430-35d972cba5ec for instance with vm_state suspended and task_state None.
Sep 30 21:51:33 compute-0 nova_compute[192810]: 2025-09-30 21:51:33.274 2 INFO nova.compute.manager [None req-483edcb2-36ed-4b41-99ad-838b49b3d879 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Resuming
Sep 30 21:51:33 compute-0 nova_compute[192810]: 2025-09-30 21:51:33.274 2 DEBUG nova.objects.instance [None req-483edcb2-36ed-4b41-99ad-838b49b3d879 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Lazy-loading 'flavor' on Instance uuid 92360dac-27fd-4b9d-a568-c4679e939762 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:51:33 compute-0 nova_compute[192810]: 2025-09-30 21:51:33.458 2 DEBUG oslo_concurrency.lockutils [None req-483edcb2-36ed-4b41-99ad-838b49b3d879 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Acquiring lock "refresh_cache-92360dac-27fd-4b9d-a568-c4679e939762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:51:33 compute-0 nova_compute[192810]: 2025-09-30 21:51:33.458 2 DEBUG oslo_concurrency.lockutils [None req-483edcb2-36ed-4b41-99ad-838b49b3d879 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Acquired lock "refresh_cache-92360dac-27fd-4b9d-a568-c4679e939762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:51:33 compute-0 nova_compute[192810]: 2025-09-30 21:51:33.458 2 DEBUG nova.network.neutron [None req-483edcb2-36ed-4b41-99ad-838b49b3d879 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:51:33 compute-0 nova_compute[192810]: 2025-09-30 21:51:33.513 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:33 compute-0 ovn_controller[94912]: 2025-09-30T21:51:33Z|00077|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b0:c4:93 10.100.0.4
Sep 30 21:51:33 compute-0 ovn_controller[94912]: 2025-09-30T21:51:33Z|00078|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b0:c4:93 10.100.0.4
Sep 30 21:51:34 compute-0 sshd-session[247904]: Failed password for root from 8.210.178.40 port 56374 ssh2
Sep 30 21:51:35 compute-0 nova_compute[192810]: 2025-09-30 21:51:35.001 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:35 compute-0 nova_compute[192810]: 2025-09-30 21:51:35.436 2 DEBUG nova.network.neutron [None req-483edcb2-36ed-4b41-99ad-838b49b3d879 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Updating instance_info_cache with network_info: [{"id": "f8816163-e9f2-4e5b-8430-35d972cba5ec", "address": "fa:16:3e:5f:48:ea", "network": {"id": "0eb83f8f-b5f5-45c8-b684-88cef476aa77", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-1825927682-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e9cc864ebd6746dc8aae9d41edaa3753", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8816163-e9", "ovs_interfaceid": "f8816163-e9f2-4e5b-8430-35d972cba5ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:51:35 compute-0 nova_compute[192810]: 2025-09-30 21:51:35.456 2 DEBUG oslo_concurrency.lockutils [None req-483edcb2-36ed-4b41-99ad-838b49b3d879 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Releasing lock "refresh_cache-92360dac-27fd-4b9d-a568-c4679e939762" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:51:35 compute-0 nova_compute[192810]: 2025-09-30 21:51:35.459 2 DEBUG nova.virt.libvirt.vif [None req-483edcb2-36ed-4b41-99ad-838b49b3d879 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:51:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerAdvancedOps-server-1327581824',display_name='tempest-TestServerAdvancedOps-server-1327581824',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testserveradvancedops-server-1327581824',id=174,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:51:16Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='e9cc864ebd6746dc8aae9d41edaa3753',ramdisk_id='',reservation_id='r-2lfte01b',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk=
'1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestServerAdvancedOps-517934233',owner_user_name='tempest-TestServerAdvancedOps-517934233-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:51:30Z,user_data=None,user_id='11769cb7bff84bcc8c33eaff10e3e1ba',uuid=92360dac-27fd-4b9d-a568-c4679e939762,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "f8816163-e9f2-4e5b-8430-35d972cba5ec", "address": "fa:16:3e:5f:48:ea", "network": {"id": "0eb83f8f-b5f5-45c8-b684-88cef476aa77", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-1825927682-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e9cc864ebd6746dc8aae9d41edaa3753", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8816163-e9", "ovs_interfaceid": "f8816163-e9f2-4e5b-8430-35d972cba5ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:51:35 compute-0 nova_compute[192810]: 2025-09-30 21:51:35.460 2 DEBUG nova.network.os_vif_util [None req-483edcb2-36ed-4b41-99ad-838b49b3d879 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Converting VIF {"id": "f8816163-e9f2-4e5b-8430-35d972cba5ec", "address": "fa:16:3e:5f:48:ea", "network": {"id": "0eb83f8f-b5f5-45c8-b684-88cef476aa77", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-1825927682-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e9cc864ebd6746dc8aae9d41edaa3753", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8816163-e9", "ovs_interfaceid": "f8816163-e9f2-4e5b-8430-35d972cba5ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:51:35 compute-0 nova_compute[192810]: 2025-09-30 21:51:35.460 2 DEBUG nova.network.os_vif_util [None req-483edcb2-36ed-4b41-99ad-838b49b3d879 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:48:ea,bridge_name='br-int',has_traffic_filtering=True,id=f8816163-e9f2-4e5b-8430-35d972cba5ec,network=Network(0eb83f8f-b5f5-45c8-b684-88cef476aa77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf8816163-e9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:51:35 compute-0 nova_compute[192810]: 2025-09-30 21:51:35.460 2 DEBUG os_vif [None req-483edcb2-36ed-4b41-99ad-838b49b3d879 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:48:ea,bridge_name='br-int',has_traffic_filtering=True,id=f8816163-e9f2-4e5b-8430-35d972cba5ec,network=Network(0eb83f8f-b5f5-45c8-b684-88cef476aa77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf8816163-e9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:51:35 compute-0 nova_compute[192810]: 2025-09-30 21:51:35.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:35 compute-0 nova_compute[192810]: 2025-09-30 21:51:35.461 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:51:35 compute-0 nova_compute[192810]: 2025-09-30 21:51:35.461 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:51:35 compute-0 nova_compute[192810]: 2025-09-30 21:51:35.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:35 compute-0 nova_compute[192810]: 2025-09-30 21:51:35.463 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf8816163-e9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:51:35 compute-0 nova_compute[192810]: 2025-09-30 21:51:35.463 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf8816163-e9, col_values=(('external_ids', {'iface-id': 'f8816163-e9f2-4e5b-8430-35d972cba5ec', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5f:48:ea', 'vm-uuid': '92360dac-27fd-4b9d-a568-c4679e939762'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:51:35 compute-0 nova_compute[192810]: 2025-09-30 21:51:35.464 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:51:35 compute-0 nova_compute[192810]: 2025-09-30 21:51:35.464 2 INFO os_vif [None req-483edcb2-36ed-4b41-99ad-838b49b3d879 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:48:ea,bridge_name='br-int',has_traffic_filtering=True,id=f8816163-e9f2-4e5b-8430-35d972cba5ec,network=Network(0eb83f8f-b5f5-45c8-b684-88cef476aa77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf8816163-e9')
Sep 30 21:51:35 compute-0 nova_compute[192810]: 2025-09-30 21:51:35.485 2 DEBUG nova.objects.instance [None req-483edcb2-36ed-4b41-99ad-838b49b3d879 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Lazy-loading 'numa_topology' on Instance uuid 92360dac-27fd-4b9d-a568-c4679e939762 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:51:35 compute-0 kernel: tapf8816163-e9: entered promiscuous mode
Sep 30 21:51:35 compute-0 NetworkManager[51733]: <info>  [1759269095.5510] manager: (tapf8816163-e9): new Tun device (/org/freedesktop/NetworkManager/Devices/310)
Sep 30 21:51:35 compute-0 nova_compute[192810]: 2025-09-30 21:51:35.555 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:35 compute-0 ovn_controller[94912]: 2025-09-30T21:51:35Z|00698|binding|INFO|Claiming lport f8816163-e9f2-4e5b-8430-35d972cba5ec for this chassis.
Sep 30 21:51:35 compute-0 ovn_controller[94912]: 2025-09-30T21:51:35Z|00699|binding|INFO|f8816163-e9f2-4e5b-8430-35d972cba5ec: Claiming fa:16:3e:5f:48:ea 10.100.0.2
Sep 30 21:51:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:35.568 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:48:ea 10.100.0.2'], port_security=['fa:16:3e:5f:48:ea 10.100.0.2'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': '92360dac-27fd-4b9d-a568-c4679e939762', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0eb83f8f-b5f5-45c8-b684-88cef476aa77', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e9cc864ebd6746dc8aae9d41edaa3753', 'neutron:revision_number': '7', 'neutron:security_group_ids': '8d894498-b287-4558-85ab-ca9a26ef74bb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bef4e0af-8132-4565-9bbb-def95e4b0a48, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=f8816163-e9f2-4e5b-8430-35d972cba5ec) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:51:35 compute-0 ovn_controller[94912]: 2025-09-30T21:51:35Z|00700|binding|INFO|Setting lport f8816163-e9f2-4e5b-8430-35d972cba5ec ovn-installed in OVS
Sep 30 21:51:35 compute-0 ovn_controller[94912]: 2025-09-30T21:51:35Z|00701|binding|INFO|Setting lport f8816163-e9f2-4e5b-8430-35d972cba5ec up in Southbound
Sep 30 21:51:35 compute-0 nova_compute[192810]: 2025-09-30 21:51:35.570 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:35.569 103867 INFO neutron.agent.ovn.metadata.agent [-] Port f8816163-e9f2-4e5b-8430-35d972cba5ec in datapath 0eb83f8f-b5f5-45c8-b684-88cef476aa77 bound to our chassis
Sep 30 21:51:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:35.570 103867 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 0eb83f8f-b5f5-45c8-b684-88cef476aa77 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Sep 30 21:51:35 compute-0 nova_compute[192810]: 2025-09-30 21:51:35.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:35.572 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[72d98c3c-a739-4468-8a5a-9b8b006658b6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:35 compute-0 systemd-udevd[248343]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:51:35 compute-0 systemd-machined[152794]: New machine qemu-85-instance-000000ae.
Sep 30 21:51:35 compute-0 NetworkManager[51733]: <info>  [1759269095.5971] device (tapf8816163-e9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:51:35 compute-0 NetworkManager[51733]: <info>  [1759269095.5983] device (tapf8816163-e9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:51:35 compute-0 systemd[1]: Started Virtual Machine qemu-85-instance-000000ae.
Sep 30 21:51:36 compute-0 nova_compute[192810]: 2025-09-30 21:51:36.065 2 DEBUG nova.compute.manager [req-8e6b5747-63fb-456c-b07a-b3ceaa139648 req-31ad2a08-40e2-4707-8f75-531b921bef30 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Received event network-vif-plugged-f8816163-e9f2-4e5b-8430-35d972cba5ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:51:36 compute-0 nova_compute[192810]: 2025-09-30 21:51:36.066 2 DEBUG oslo_concurrency.lockutils [req-8e6b5747-63fb-456c-b07a-b3ceaa139648 req-31ad2a08-40e2-4707-8f75-531b921bef30 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "92360dac-27fd-4b9d-a568-c4679e939762-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:36 compute-0 nova_compute[192810]: 2025-09-30 21:51:36.066 2 DEBUG oslo_concurrency.lockutils [req-8e6b5747-63fb-456c-b07a-b3ceaa139648 req-31ad2a08-40e2-4707-8f75-531b921bef30 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "92360dac-27fd-4b9d-a568-c4679e939762-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:36 compute-0 nova_compute[192810]: 2025-09-30 21:51:36.066 2 DEBUG oslo_concurrency.lockutils [req-8e6b5747-63fb-456c-b07a-b3ceaa139648 req-31ad2a08-40e2-4707-8f75-531b921bef30 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "92360dac-27fd-4b9d-a568-c4679e939762-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:36 compute-0 nova_compute[192810]: 2025-09-30 21:51:36.066 2 DEBUG nova.compute.manager [req-8e6b5747-63fb-456c-b07a-b3ceaa139648 req-31ad2a08-40e2-4707-8f75-531b921bef30 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] No waiting events found dispatching network-vif-plugged-f8816163-e9f2-4e5b-8430-35d972cba5ec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:51:36 compute-0 nova_compute[192810]: 2025-09-30 21:51:36.066 2 WARNING nova.compute.manager [req-8e6b5747-63fb-456c-b07a-b3ceaa139648 req-31ad2a08-40e2-4707-8f75-531b921bef30 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Received unexpected event network-vif-plugged-f8816163-e9f2-4e5b-8430-35d972cba5ec for instance with vm_state suspended and task_state resuming.
Sep 30 21:51:36 compute-0 nova_compute[192810]: 2025-09-30 21:51:36.285 2 DEBUG nova.virt.libvirt.host [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Removed pending event for 92360dac-27fd-4b9d-a568-c4679e939762 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Sep 30 21:51:36 compute-0 nova_compute[192810]: 2025-09-30 21:51:36.285 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759269096.284725, 92360dac-27fd-4b9d-a568-c4679e939762 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:51:36 compute-0 nova_compute[192810]: 2025-09-30 21:51:36.285 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] VM Started (Lifecycle Event)
Sep 30 21:51:36 compute-0 nova_compute[192810]: 2025-09-30 21:51:36.308 2 DEBUG nova.compute.manager [None req-483edcb2-36ed-4b41-99ad-838b49b3d879 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:51:36 compute-0 nova_compute[192810]: 2025-09-30 21:51:36.308 2 DEBUG nova.objects.instance [None req-483edcb2-36ed-4b41-99ad-838b49b3d879 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Lazy-loading 'pci_devices' on Instance uuid 92360dac-27fd-4b9d-a568-c4679e939762 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:51:36 compute-0 nova_compute[192810]: 2025-09-30 21:51:36.314 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:51:36 compute-0 nova_compute[192810]: 2025-09-30 21:51:36.317 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Synchronizing instance power state after lifecycle event "Started"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:51:36 compute-0 nova_compute[192810]: 2025-09-30 21:51:36.326 2 INFO nova.virt.libvirt.driver [-] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Instance running successfully.
Sep 30 21:51:36 compute-0 virtqemud[192233]: argument unsupported: QEMU guest agent is not configured
Sep 30 21:51:36 compute-0 nova_compute[192810]: 2025-09-30 21:51:36.329 2 DEBUG nova.virt.libvirt.guest [None req-483edcb2-36ed-4b41-99ad-838b49b3d879 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Sep 30 21:51:36 compute-0 nova_compute[192810]: 2025-09-30 21:51:36.330 2 DEBUG nova.compute.manager [None req-483edcb2-36ed-4b41-99ad-838b49b3d879 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:51:36 compute-0 nova_compute[192810]: 2025-09-30 21:51:36.354 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] During sync_power_state the instance has a pending task (resuming). Skip.
Sep 30 21:51:36 compute-0 nova_compute[192810]: 2025-09-30 21:51:36.355 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759269096.292291, 92360dac-27fd-4b9d-a568-c4679e939762 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:51:36 compute-0 nova_compute[192810]: 2025-09-30 21:51:36.355 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] VM Resumed (Lifecycle Event)
Sep 30 21:51:36 compute-0 nova_compute[192810]: 2025-09-30 21:51:36.385 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:51:36 compute-0 nova_compute[192810]: 2025-09-30 21:51:36.387 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:51:36 compute-0 nova_compute[192810]: 2025-09-30 21:51:36.426 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] During sync_power_state the instance has a pending task (resuming). Skip.
Sep 30 21:51:36 compute-0 unix_chkpwd[248359]: password check failed for user (root)
Sep 30 21:51:37 compute-0 nova_compute[192810]: 2025-09-30 21:51:37.713 2 DEBUG oslo_concurrency.lockutils [None req-b468a305-011b-46b0-a08a-b6a931fd06fc 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Acquiring lock "92360dac-27fd-4b9d-a568-c4679e939762" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:37 compute-0 nova_compute[192810]: 2025-09-30 21:51:37.713 2 DEBUG oslo_concurrency.lockutils [None req-b468a305-011b-46b0-a08a-b6a931fd06fc 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Lock "92360dac-27fd-4b9d-a568-c4679e939762" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:37 compute-0 nova_compute[192810]: 2025-09-30 21:51:37.713 2 DEBUG oslo_concurrency.lockutils [None req-b468a305-011b-46b0-a08a-b6a931fd06fc 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Acquiring lock "92360dac-27fd-4b9d-a568-c4679e939762-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:37 compute-0 nova_compute[192810]: 2025-09-30 21:51:37.713 2 DEBUG oslo_concurrency.lockutils [None req-b468a305-011b-46b0-a08a-b6a931fd06fc 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Lock "92360dac-27fd-4b9d-a568-c4679e939762-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:37 compute-0 nova_compute[192810]: 2025-09-30 21:51:37.714 2 DEBUG oslo_concurrency.lockutils [None req-b468a305-011b-46b0-a08a-b6a931fd06fc 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Lock "92360dac-27fd-4b9d-a568-c4679e939762-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:37 compute-0 nova_compute[192810]: 2025-09-30 21:51:37.724 2 INFO nova.compute.manager [None req-b468a305-011b-46b0-a08a-b6a931fd06fc 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Terminating instance
Sep 30 21:51:37 compute-0 nova_compute[192810]: 2025-09-30 21:51:37.734 2 DEBUG nova.compute.manager [None req-b468a305-011b-46b0-a08a-b6a931fd06fc 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:51:37 compute-0 kernel: tapf8816163-e9 (unregistering): left promiscuous mode
Sep 30 21:51:37 compute-0 NetworkManager[51733]: <info>  [1759269097.7530] device (tapf8816163-e9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:51:37 compute-0 nova_compute[192810]: 2025-09-30 21:51:37.760 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:37 compute-0 ovn_controller[94912]: 2025-09-30T21:51:37Z|00702|binding|INFO|Releasing lport f8816163-e9f2-4e5b-8430-35d972cba5ec from this chassis (sb_readonly=0)
Sep 30 21:51:37 compute-0 ovn_controller[94912]: 2025-09-30T21:51:37Z|00703|binding|INFO|Setting lport f8816163-e9f2-4e5b-8430-35d972cba5ec down in Southbound
Sep 30 21:51:37 compute-0 ovn_controller[94912]: 2025-09-30T21:51:37Z|00704|binding|INFO|Removing iface tapf8816163-e9 ovn-installed in OVS
Sep 30 21:51:37 compute-0 nova_compute[192810]: 2025-09-30 21:51:37.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:37.775 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:48:ea 10.100.0.2'], port_security=['fa:16:3e:5f:48:ea 10.100.0.2'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': '92360dac-27fd-4b9d-a568-c4679e939762', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0eb83f8f-b5f5-45c8-b684-88cef476aa77', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e9cc864ebd6746dc8aae9d41edaa3753', 'neutron:revision_number': '8', 'neutron:security_group_ids': '8d894498-b287-4558-85ab-ca9a26ef74bb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bef4e0af-8132-4565-9bbb-def95e4b0a48, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=f8816163-e9f2-4e5b-8430-35d972cba5ec) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:51:37 compute-0 nova_compute[192810]: 2025-09-30 21:51:37.776 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:37.776 103867 INFO neutron.agent.ovn.metadata.agent [-] Port f8816163-e9f2-4e5b-8430-35d972cba5ec in datapath 0eb83f8f-b5f5-45c8-b684-88cef476aa77 unbound from our chassis
Sep 30 21:51:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:37.777 103867 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 0eb83f8f-b5f5-45c8-b684-88cef476aa77 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Sep 30 21:51:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:37.778 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[ca9b678f-a7eb-47e9-adf0-e2d54915ad55]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:51:37 compute-0 systemd[1]: machine-qemu\x2d85\x2dinstance\x2d000000ae.scope: Deactivated successfully.
Sep 30 21:51:37 compute-0 systemd[1]: machine-qemu\x2d85\x2dinstance\x2d000000ae.scope: Consumed 2.055s CPU time.
Sep 30 21:51:37 compute-0 systemd-machined[152794]: Machine qemu-85-instance-000000ae terminated.
Sep 30 21:51:38 compute-0 nova_compute[192810]: 2025-09-30 21:51:38.000 2 INFO nova.virt.libvirt.driver [-] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Instance destroyed successfully.
Sep 30 21:51:38 compute-0 nova_compute[192810]: 2025-09-30 21:51:38.000 2 DEBUG nova.objects.instance [None req-b468a305-011b-46b0-a08a-b6a931fd06fc 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Lazy-loading 'resources' on Instance uuid 92360dac-27fd-4b9d-a568-c4679e939762 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:51:38 compute-0 nova_compute[192810]: 2025-09-30 21:51:38.022 2 DEBUG nova.virt.libvirt.vif [None req-b468a305-011b-46b0-a08a-b6a931fd06fc 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:51:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerAdvancedOps-server-1327581824',display_name='tempest-TestServerAdvancedOps-server-1327581824',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testserveradvancedops-server-1327581824',id=174,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:51:16Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e9cc864ebd6746dc8aae9d41edaa3753',ramdisk_id='',reservation_id='r-2lfte01b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_dis
k='1',image_min_ram='0',owner_project_name='tempest-TestServerAdvancedOps-517934233',owner_user_name='tempest-TestServerAdvancedOps-517934233-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:51:36Z,user_data=None,user_id='11769cb7bff84bcc8c33eaff10e3e1ba',uuid=92360dac-27fd-4b9d-a568-c4679e939762,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f8816163-e9f2-4e5b-8430-35d972cba5ec", "address": "fa:16:3e:5f:48:ea", "network": {"id": "0eb83f8f-b5f5-45c8-b684-88cef476aa77", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-1825927682-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e9cc864ebd6746dc8aae9d41edaa3753", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8816163-e9", "ovs_interfaceid": "f8816163-e9f2-4e5b-8430-35d972cba5ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:51:38 compute-0 nova_compute[192810]: 2025-09-30 21:51:38.022 2 DEBUG nova.network.os_vif_util [None req-b468a305-011b-46b0-a08a-b6a931fd06fc 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Converting VIF {"id": "f8816163-e9f2-4e5b-8430-35d972cba5ec", "address": "fa:16:3e:5f:48:ea", "network": {"id": "0eb83f8f-b5f5-45c8-b684-88cef476aa77", "bridge": "br-int", "label": "tempest-TestServerAdvancedOps-1825927682-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "e9cc864ebd6746dc8aae9d41edaa3753", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf8816163-e9", "ovs_interfaceid": "f8816163-e9f2-4e5b-8430-35d972cba5ec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:51:38 compute-0 nova_compute[192810]: 2025-09-30 21:51:38.023 2 DEBUG nova.network.os_vif_util [None req-b468a305-011b-46b0-a08a-b6a931fd06fc 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:48:ea,bridge_name='br-int',has_traffic_filtering=True,id=f8816163-e9f2-4e5b-8430-35d972cba5ec,network=Network(0eb83f8f-b5f5-45c8-b684-88cef476aa77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf8816163-e9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:51:38 compute-0 nova_compute[192810]: 2025-09-30 21:51:38.023 2 DEBUG os_vif [None req-b468a305-011b-46b0-a08a-b6a931fd06fc 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:48:ea,bridge_name='br-int',has_traffic_filtering=True,id=f8816163-e9f2-4e5b-8430-35d972cba5ec,network=Network(0eb83f8f-b5f5-45c8-b684-88cef476aa77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf8816163-e9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:51:38 compute-0 nova_compute[192810]: 2025-09-30 21:51:38.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:38 compute-0 nova_compute[192810]: 2025-09-30 21:51:38.025 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf8816163-e9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:51:38 compute-0 nova_compute[192810]: 2025-09-30 21:51:38.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:38 compute-0 nova_compute[192810]: 2025-09-30 21:51:38.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:38 compute-0 nova_compute[192810]: 2025-09-30 21:51:38.030 2 INFO os_vif [None req-b468a305-011b-46b0-a08a-b6a931fd06fc 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:48:ea,bridge_name='br-int',has_traffic_filtering=True,id=f8816163-e9f2-4e5b-8430-35d972cba5ec,network=Network(0eb83f8f-b5f5-45c8-b684-88cef476aa77),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf8816163-e9')
Sep 30 21:51:38 compute-0 nova_compute[192810]: 2025-09-30 21:51:38.031 2 INFO nova.virt.libvirt.driver [None req-b468a305-011b-46b0-a08a-b6a931fd06fc 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Deleting instance files /var/lib/nova/instances/92360dac-27fd-4b9d-a568-c4679e939762_del
Sep 30 21:51:38 compute-0 nova_compute[192810]: 2025-09-30 21:51:38.031 2 INFO nova.virt.libvirt.driver [None req-b468a305-011b-46b0-a08a-b6a931fd06fc 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Deletion of /var/lib/nova/instances/92360dac-27fd-4b9d-a568-c4679e939762_del complete
Sep 30 21:51:38 compute-0 nova_compute[192810]: 2025-09-30 21:51:38.109 2 INFO nova.compute.manager [None req-b468a305-011b-46b0-a08a-b6a931fd06fc 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Took 0.37 seconds to destroy the instance on the hypervisor.
Sep 30 21:51:38 compute-0 nova_compute[192810]: 2025-09-30 21:51:38.109 2 DEBUG oslo.service.loopingcall [None req-b468a305-011b-46b0-a08a-b6a931fd06fc 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:51:38 compute-0 nova_compute[192810]: 2025-09-30 21:51:38.109 2 DEBUG nova.compute.manager [-] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:51:38 compute-0 nova_compute[192810]: 2025-09-30 21:51:38.109 2 DEBUG nova.network.neutron [-] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:51:38 compute-0 nova_compute[192810]: 2025-09-30 21:51:38.238 2 DEBUG nova.compute.manager [req-96889df4-99ff-40aa-8443-dd6e539a2bc6 req-f2963db5-cc51-4231-a0b5-138517032d9c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Received event network-vif-plugged-f8816163-e9f2-4e5b-8430-35d972cba5ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:51:38 compute-0 nova_compute[192810]: 2025-09-30 21:51:38.239 2 DEBUG oslo_concurrency.lockutils [req-96889df4-99ff-40aa-8443-dd6e539a2bc6 req-f2963db5-cc51-4231-a0b5-138517032d9c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "92360dac-27fd-4b9d-a568-c4679e939762-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:38 compute-0 nova_compute[192810]: 2025-09-30 21:51:38.239 2 DEBUG oslo_concurrency.lockutils [req-96889df4-99ff-40aa-8443-dd6e539a2bc6 req-f2963db5-cc51-4231-a0b5-138517032d9c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "92360dac-27fd-4b9d-a568-c4679e939762-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:38 compute-0 nova_compute[192810]: 2025-09-30 21:51:38.239 2 DEBUG oslo_concurrency.lockutils [req-96889df4-99ff-40aa-8443-dd6e539a2bc6 req-f2963db5-cc51-4231-a0b5-138517032d9c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "92360dac-27fd-4b9d-a568-c4679e939762-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:38 compute-0 nova_compute[192810]: 2025-09-30 21:51:38.239 2 DEBUG nova.compute.manager [req-96889df4-99ff-40aa-8443-dd6e539a2bc6 req-f2963db5-cc51-4231-a0b5-138517032d9c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] No waiting events found dispatching network-vif-plugged-f8816163-e9f2-4e5b-8430-35d972cba5ec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:51:38 compute-0 nova_compute[192810]: 2025-09-30 21:51:38.239 2 WARNING nova.compute.manager [req-96889df4-99ff-40aa-8443-dd6e539a2bc6 req-f2963db5-cc51-4231-a0b5-138517032d9c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Received unexpected event network-vif-plugged-f8816163-e9f2-4e5b-8430-35d972cba5ec for instance with vm_state active and task_state deleting.
Sep 30 21:51:38 compute-0 nova_compute[192810]: 2025-09-30 21:51:38.240 2 DEBUG nova.compute.manager [req-96889df4-99ff-40aa-8443-dd6e539a2bc6 req-f2963db5-cc51-4231-a0b5-138517032d9c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Received event network-vif-unplugged-f8816163-e9f2-4e5b-8430-35d972cba5ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:51:38 compute-0 nova_compute[192810]: 2025-09-30 21:51:38.240 2 DEBUG oslo_concurrency.lockutils [req-96889df4-99ff-40aa-8443-dd6e539a2bc6 req-f2963db5-cc51-4231-a0b5-138517032d9c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "92360dac-27fd-4b9d-a568-c4679e939762-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:38 compute-0 nova_compute[192810]: 2025-09-30 21:51:38.240 2 DEBUG oslo_concurrency.lockutils [req-96889df4-99ff-40aa-8443-dd6e539a2bc6 req-f2963db5-cc51-4231-a0b5-138517032d9c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "92360dac-27fd-4b9d-a568-c4679e939762-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:38 compute-0 nova_compute[192810]: 2025-09-30 21:51:38.240 2 DEBUG oslo_concurrency.lockutils [req-96889df4-99ff-40aa-8443-dd6e539a2bc6 req-f2963db5-cc51-4231-a0b5-138517032d9c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "92360dac-27fd-4b9d-a568-c4679e939762-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:38 compute-0 nova_compute[192810]: 2025-09-30 21:51:38.241 2 DEBUG nova.compute.manager [req-96889df4-99ff-40aa-8443-dd6e539a2bc6 req-f2963db5-cc51-4231-a0b5-138517032d9c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] No waiting events found dispatching network-vif-unplugged-f8816163-e9f2-4e5b-8430-35d972cba5ec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:51:38 compute-0 nova_compute[192810]: 2025-09-30 21:51:38.241 2 DEBUG nova.compute.manager [req-96889df4-99ff-40aa-8443-dd6e539a2bc6 req-f2963db5-cc51-4231-a0b5-138517032d9c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Received event network-vif-unplugged-f8816163-e9f2-4e5b-8430-35d972cba5ec for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:51:38 compute-0 nova_compute[192810]: 2025-09-30 21:51:38.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:38 compute-0 nova_compute[192810]: 2025-09-30 21:51:38.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:38.757 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:38.758 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:38.759 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:38 compute-0 nova_compute[192810]: 2025-09-30 21:51:38.902 2 DEBUG nova.network.neutron [-] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:51:38 compute-0 nova_compute[192810]: 2025-09-30 21:51:38.940 2 INFO nova.compute.manager [-] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Took 0.83 seconds to deallocate network for instance.
Sep 30 21:51:39 compute-0 nova_compute[192810]: 2025-09-30 21:51:39.045 2 DEBUG oslo_concurrency.lockutils [None req-b468a305-011b-46b0-a08a-b6a931fd06fc 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:39 compute-0 nova_compute[192810]: 2025-09-30 21:51:39.045 2 DEBUG oslo_concurrency.lockutils [None req-b468a305-011b-46b0-a08a-b6a931fd06fc 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:39 compute-0 nova_compute[192810]: 2025-09-30 21:51:39.073 2 DEBUG nova.compute.manager [req-7049f1e9-59fa-4dd4-b902-400355fcb406 req-f49f5717-17c1-43d4-a45a-13809e62c1c0 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Received event network-vif-deleted-f8816163-e9f2-4e5b-8430-35d972cba5ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:51:39 compute-0 nova_compute[192810]: 2025-09-30 21:51:39.133 2 DEBUG nova.compute.provider_tree [None req-b468a305-011b-46b0-a08a-b6a931fd06fc 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:51:39 compute-0 sshd-session[247904]: Failed password for root from 8.210.178.40 port 56374 ssh2
Sep 30 21:51:39 compute-0 nova_compute[192810]: 2025-09-30 21:51:39.191 2 DEBUG nova.scheduler.client.report [None req-b468a305-011b-46b0-a08a-b6a931fd06fc 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:51:39 compute-0 nova_compute[192810]: 2025-09-30 21:51:39.224 2 DEBUG oslo_concurrency.lockutils [None req-b468a305-011b-46b0-a08a-b6a931fd06fc 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.178s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:39 compute-0 nova_compute[192810]: 2025-09-30 21:51:39.254 2 INFO nova.scheduler.client.report [None req-b468a305-011b-46b0-a08a-b6a931fd06fc 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Deleted allocations for instance 92360dac-27fd-4b9d-a568-c4679e939762
Sep 30 21:51:39 compute-0 podman[248382]: 2025-09-30 21:51:39.322978345 +0000 UTC m=+0.056536014 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=iscsid, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2)
Sep 30 21:51:39 compute-0 podman[248383]: 2025-09-30 21:51:39.324454522 +0000 UTC m=+0.057116029 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 21:51:39 compute-0 nova_compute[192810]: 2025-09-30 21:51:39.339 2 DEBUG oslo_concurrency.lockutils [None req-b468a305-011b-46b0-a08a-b6a931fd06fc 11769cb7bff84bcc8c33eaff10e3e1ba e9cc864ebd6746dc8aae9d41edaa3753 - - default default] Lock "92360dac-27fd-4b9d-a568-c4679e939762" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.626s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:39 compute-0 podman[248381]: 2025-09-30 21:51:39.354307103 +0000 UTC m=+0.087875193 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:51:40 compute-0 nova_compute[192810]: 2025-09-30 21:51:40.413 2 DEBUG nova.compute.manager [req-77def278-470a-4184-96bb-4e19b6df582a req-9d3ac481-0c88-4e68-b54e-a8853b2b707c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Received event network-vif-plugged-f8816163-e9f2-4e5b-8430-35d972cba5ec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:51:40 compute-0 nova_compute[192810]: 2025-09-30 21:51:40.413 2 DEBUG oslo_concurrency.lockutils [req-77def278-470a-4184-96bb-4e19b6df582a req-9d3ac481-0c88-4e68-b54e-a8853b2b707c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "92360dac-27fd-4b9d-a568-c4679e939762-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:51:40 compute-0 nova_compute[192810]: 2025-09-30 21:51:40.413 2 DEBUG oslo_concurrency.lockutils [req-77def278-470a-4184-96bb-4e19b6df582a req-9d3ac481-0c88-4e68-b54e-a8853b2b707c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "92360dac-27fd-4b9d-a568-c4679e939762-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:51:40 compute-0 nova_compute[192810]: 2025-09-30 21:51:40.414 2 DEBUG oslo_concurrency.lockutils [req-77def278-470a-4184-96bb-4e19b6df582a req-9d3ac481-0c88-4e68-b54e-a8853b2b707c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "92360dac-27fd-4b9d-a568-c4679e939762-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:51:40 compute-0 nova_compute[192810]: 2025-09-30 21:51:40.414 2 DEBUG nova.compute.manager [req-77def278-470a-4184-96bb-4e19b6df582a req-9d3ac481-0c88-4e68-b54e-a8853b2b707c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] No waiting events found dispatching network-vif-plugged-f8816163-e9f2-4e5b-8430-35d972cba5ec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:51:40 compute-0 nova_compute[192810]: 2025-09-30 21:51:40.414 2 WARNING nova.compute.manager [req-77def278-470a-4184-96bb-4e19b6df582a req-9d3ac481-0c88-4e68-b54e-a8853b2b707c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Received unexpected event network-vif-plugged-f8816163-e9f2-4e5b-8430-35d972cba5ec for instance with vm_state deleted and task_state None.
Sep 30 21:51:40 compute-0 unix_chkpwd[248442]: password check failed for user (root)
Sep 30 21:51:40 compute-0 ovn_controller[94912]: 2025-09-30T21:51:40Z|00705|binding|INFO|Releasing lport e5655641-a4f8-4024-be6c-065dbfac4615 from this chassis (sb_readonly=0)
Sep 30 21:51:40 compute-0 ovn_controller[94912]: 2025-09-30T21:51:40Z|00706|binding|INFO|Releasing lport edbccbea-ad79-490c-a6b0-46510606db95 from this chassis (sb_readonly=0)
Sep 30 21:51:40 compute-0 nova_compute[192810]: 2025-09-30 21:51:40.584 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:41 compute-0 nova_compute[192810]: 2025-09-30 21:51:41.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:43 compute-0 sshd-session[247904]: Failed password for root from 8.210.178.40 port 56374 ssh2
Sep 30 21:51:43 compute-0 nova_compute[192810]: 2025-09-30 21:51:43.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:43 compute-0 nova_compute[192810]: 2025-09-30 21:51:43.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:44 compute-0 sshd-session[247904]: error: maximum authentication attempts exceeded for root from 8.210.178.40 port 56374 ssh2 [preauth]
Sep 30 21:51:44 compute-0 sshd-session[247904]: Disconnecting authenticating user root 8.210.178.40 port 56374: Too many authentication failures [preauth]
Sep 30 21:51:44 compute-0 sshd-session[247904]: PAM 5 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40  user=root
Sep 30 21:51:44 compute-0 sshd-session[247904]: PAM service(sshd) ignoring max retries; 6 > 3
Sep 30 21:51:46 compute-0 unix_chkpwd[248446]: password check failed for user (root)
Sep 30 21:51:46 compute-0 sshd-session[248444]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40  user=root
Sep 30 21:51:48 compute-0 sshd-session[248444]: Failed password for root from 8.210.178.40 port 57070 ssh2
Sep 30 21:51:48 compute-0 nova_compute[192810]: 2025-09-30 21:51:48.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:48 compute-0 unix_chkpwd[248447]: password check failed for user (root)
Sep 30 21:51:48 compute-0 nova_compute[192810]: 2025-09-30 21:51:48.555 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:49 compute-0 sshd[128205]: Timeout before authentication for connection from 113.240.110.90 to 38.102.83.69, pid = 246795
Sep 30 21:51:50 compute-0 nova_compute[192810]: 2025-09-30 21:51:50.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:51 compute-0 sshd-session[248444]: Failed password for root from 8.210.178.40 port 57070 ssh2
Sep 30 21:51:51 compute-0 nova_compute[192810]: 2025-09-30 21:51:51.583 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:51 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:51.583 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=41, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=40) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:51:51 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:51.586 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:51:52 compute-0 unix_chkpwd[248448]: password check failed for user (root)
Sep 30 21:51:52 compute-0 nova_compute[192810]: 2025-09-30 21:51:52.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:52 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:51:52.587 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '41'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:51:53 compute-0 nova_compute[192810]: 2025-09-30 21:51:53.000 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759269097.9988587, 92360dac-27fd-4b9d-a568-c4679e939762 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:51:53 compute-0 nova_compute[192810]: 2025-09-30 21:51:53.001 2 INFO nova.compute.manager [-] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] VM Stopped (Lifecycle Event)
Sep 30 21:51:53 compute-0 nova_compute[192810]: 2025-09-30 21:51:53.030 2 DEBUG nova.compute.manager [None req-b95f1d8a-0bb6-4b13-ac76-9bf85a8948c8 - - - - - -] [instance: 92360dac-27fd-4b9d-a568-c4679e939762] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:51:53 compute-0 nova_compute[192810]: 2025-09-30 21:51:53.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:53 compute-0 podman[248450]: 2025-09-30 21:51:53.329478512 +0000 UTC m=+0.059702444 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Sep 30 21:51:53 compute-0 podman[248449]: 2025-09-30 21:51:53.369218198 +0000 UTC m=+0.099027799 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2)
Sep 30 21:51:53 compute-0 podman[248451]: 2025-09-30 21:51:53.385424771 +0000 UTC m=+0.115325455 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20250923)
Sep 30 21:51:53 compute-0 nova_compute[192810]: 2025-09-30 21:51:53.557 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:54 compute-0 sshd-session[248444]: Failed password for root from 8.210.178.40 port 57070 ssh2
Sep 30 21:51:55 compute-0 nova_compute[192810]: 2025-09-30 21:51:55.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:56 compute-0 nova_compute[192810]: 2025-09-30 21:51:56.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:56 compute-0 unix_chkpwd[248513]: password check failed for user (root)
Sep 30 21:51:58 compute-0 nova_compute[192810]: 2025-09-30 21:51:58.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:51:58 compute-0 sshd-session[248444]: Failed password for root from 8.210.178.40 port 57070 ssh2
Sep 30 21:51:58 compute-0 unix_chkpwd[248514]: password check failed for user (root)
Sep 30 21:51:58 compute-0 nova_compute[192810]: 2025-09-30 21:51:58.559 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:01 compute-0 sshd-session[248444]: Failed password for root from 8.210.178.40 port 57070 ssh2
Sep 30 21:52:02 compute-0 nova_compute[192810]: 2025-09-30 21:52:02.283 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:52:02 compute-0 unix_chkpwd[248515]: password check failed for user (root)
Sep 30 21:52:02 compute-0 nova_compute[192810]: 2025-09-30 21:52:02.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:52:02 compute-0 nova_compute[192810]: 2025-09-30 21:52:02.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:52:03 compute-0 nova_compute[192810]: 2025-09-30 21:52:03.040 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:03 compute-0 podman[248516]: 2025-09-30 21:52:03.330167302 +0000 UTC m=+0.058933425 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:52:03 compute-0 podman[248517]: 2025-09-30 21:52:03.331131366 +0000 UTC m=+0.056420632 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_id=edpm, name=ubi9-minimal, vendor=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., distribution-scope=public, architecture=x86_64)
Sep 30 21:52:03 compute-0 nova_compute[192810]: 2025-09-30 21:52:03.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:04 compute-0 sshd-session[248444]: Failed password for root from 8.210.178.40 port 57070 ssh2
Sep 30 21:52:04 compute-0 sshd-session[248444]: error: maximum authentication attempts exceeded for root from 8.210.178.40 port 57070 ssh2 [preauth]
Sep 30 21:52:04 compute-0 sshd-session[248444]: Disconnecting authenticating user root 8.210.178.40 port 57070: Too many authentication failures [preauth]
Sep 30 21:52:04 compute-0 sshd-session[248444]: PAM 5 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40  user=root
Sep 30 21:52:04 compute-0 sshd-session[248444]: PAM service(sshd) ignoring max retries; 6 > 3
Sep 30 21:52:04 compute-0 nova_compute[192810]: 2025-09-30 21:52:04.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:04 compute-0 nova_compute[192810]: 2025-09-30 21:52:04.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:52:05 compute-0 nova_compute[192810]: 2025-09-30 21:52:05.597 2 DEBUG oslo_concurrency.lockutils [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Acquiring lock "f346a37a-3dd5-4a2b-a445-9a0fe47a9194" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:05 compute-0 nova_compute[192810]: 2025-09-30 21:52:05.598 2 DEBUG oslo_concurrency.lockutils [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Lock "f346a37a-3dd5-4a2b-a445-9a0fe47a9194" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:05 compute-0 nova_compute[192810]: 2025-09-30 21:52:05.733 2 DEBUG nova.compute.manager [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:52:05 compute-0 nova_compute[192810]: 2025-09-30 21:52:05.958 2 DEBUG oslo_concurrency.lockutils [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:05 compute-0 nova_compute[192810]: 2025-09-30 21:52:05.958 2 DEBUG oslo_concurrency.lockutils [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:05 compute-0 nova_compute[192810]: 2025-09-30 21:52:05.964 2 DEBUG nova.virt.hardware [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:52:05 compute-0 nova_compute[192810]: 2025-09-30 21:52:05.965 2 INFO nova.compute.claims [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:52:06 compute-0 unix_chkpwd[248563]: password check failed for user (root)
Sep 30 21:52:06 compute-0 sshd-session[248561]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40  user=root
Sep 30 21:52:06 compute-0 nova_compute[192810]: 2025-09-30 21:52:06.179 2 DEBUG nova.scheduler.client.report [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Refreshing inventories for resource provider fe423b93-de5a-41f7-97d1-9622ea46af54 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Sep 30 21:52:06 compute-0 nova_compute[192810]: 2025-09-30 21:52:06.245 2 DEBUG nova.scheduler.client.report [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Updating ProviderTree inventory for provider fe423b93-de5a-41f7-97d1-9622ea46af54 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Sep 30 21:52:06 compute-0 nova_compute[192810]: 2025-09-30 21:52:06.245 2 DEBUG nova.compute.provider_tree [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Updating inventory in ProviderTree for provider fe423b93-de5a-41f7-97d1-9622ea46af54 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Sep 30 21:52:06 compute-0 nova_compute[192810]: 2025-09-30 21:52:06.257 2 DEBUG nova.scheduler.client.report [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Refreshing aggregate associations for resource provider fe423b93-de5a-41f7-97d1-9622ea46af54, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Sep 30 21:52:06 compute-0 nova_compute[192810]: 2025-09-30 21:52:06.275 2 DEBUG nova.scheduler.client.report [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Refreshing trait associations for resource provider fe423b93-de5a-41f7-97d1-9622ea46af54, traits: COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_RESCUE_BFV,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_1_2,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE42,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Sep 30 21:52:06 compute-0 nova_compute[192810]: 2025-09-30 21:52:06.345 2 DEBUG nova.compute.provider_tree [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:52:06 compute-0 nova_compute[192810]: 2025-09-30 21:52:06.368 2 DEBUG nova.scheduler.client.report [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:52:06 compute-0 nova_compute[192810]: 2025-09-30 21:52:06.434 2 DEBUG oslo_concurrency.lockutils [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.476s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:06 compute-0 nova_compute[192810]: 2025-09-30 21:52:06.435 2 DEBUG nova.compute.manager [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:52:06 compute-0 nova_compute[192810]: 2025-09-30 21:52:06.574 2 DEBUG nova.compute.manager [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:52:06 compute-0 nova_compute[192810]: 2025-09-30 21:52:06.575 2 DEBUG nova.network.neutron [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:52:06 compute-0 nova_compute[192810]: 2025-09-30 21:52:06.616 2 INFO nova.virt.libvirt.driver [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:52:06 compute-0 nova_compute[192810]: 2025-09-30 21:52:06.725 2 DEBUG nova.compute.manager [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:52:06 compute-0 nova_compute[192810]: 2025-09-30 21:52:06.910 2 DEBUG nova.compute.manager [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:52:06 compute-0 nova_compute[192810]: 2025-09-30 21:52:06.912 2 DEBUG nova.virt.libvirt.driver [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:52:06 compute-0 nova_compute[192810]: 2025-09-30 21:52:06.913 2 INFO nova.virt.libvirt.driver [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Creating image(s)
Sep 30 21:52:06 compute-0 nova_compute[192810]: 2025-09-30 21:52:06.915 2 DEBUG oslo_concurrency.lockutils [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Acquiring lock "/var/lib/nova/instances/f346a37a-3dd5-4a2b-a445-9a0fe47a9194/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:06 compute-0 nova_compute[192810]: 2025-09-30 21:52:06.915 2 DEBUG oslo_concurrency.lockutils [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Lock "/var/lib/nova/instances/f346a37a-3dd5-4a2b-a445-9a0fe47a9194/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:06 compute-0 nova_compute[192810]: 2025-09-30 21:52:06.916 2 DEBUG oslo_concurrency.lockutils [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Lock "/var/lib/nova/instances/f346a37a-3dd5-4a2b-a445-9a0fe47a9194/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:06 compute-0 nova_compute[192810]: 2025-09-30 21:52:06.932 2 DEBUG oslo_concurrency.processutils [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:52:06 compute-0 nova_compute[192810]: 2025-09-30 21:52:06.987 2 DEBUG nova.policy [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2a25e9b373084bfa9bc194a299a3ac4a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8f69d27778a54a45a6527a19643d1fe0', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:52:06 compute-0 nova_compute[192810]: 2025-09-30 21:52:06.990 2 DEBUG oslo_concurrency.processutils [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:52:06 compute-0 nova_compute[192810]: 2025-09-30 21:52:06.991 2 DEBUG oslo_concurrency.lockutils [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:06 compute-0 nova_compute[192810]: 2025-09-30 21:52:06.991 2 DEBUG oslo_concurrency.lockutils [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:07 compute-0 nova_compute[192810]: 2025-09-30 21:52:07.005 2 DEBUG oslo_concurrency.processutils [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:52:07 compute-0 nova_compute[192810]: 2025-09-30 21:52:07.110 2 DEBUG oslo_concurrency.processutils [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.104s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:52:07 compute-0 nova_compute[192810]: 2025-09-30 21:52:07.111 2 DEBUG oslo_concurrency.processutils [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/f346a37a-3dd5-4a2b-a445-9a0fe47a9194/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:52:07 compute-0 nova_compute[192810]: 2025-09-30 21:52:07.200 2 DEBUG oslo_concurrency.processutils [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/f346a37a-3dd5-4a2b-a445-9a0fe47a9194/disk 1073741824" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:52:07 compute-0 nova_compute[192810]: 2025-09-30 21:52:07.201 2 DEBUG oslo_concurrency.lockutils [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.210s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:07 compute-0 nova_compute[192810]: 2025-09-30 21:52:07.202 2 DEBUG oslo_concurrency.processutils [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:52:07 compute-0 nova_compute[192810]: 2025-09-30 21:52:07.295 2 DEBUG oslo_concurrency.processutils [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:52:07 compute-0 nova_compute[192810]: 2025-09-30 21:52:07.296 2 DEBUG nova.virt.disk.api [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Checking if we can resize image /var/lib/nova/instances/f346a37a-3dd5-4a2b-a445-9a0fe47a9194/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:52:07 compute-0 nova_compute[192810]: 2025-09-30 21:52:07.296 2 DEBUG oslo_concurrency.processutils [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f346a37a-3dd5-4a2b-a445-9a0fe47a9194/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:52:07 compute-0 nova_compute[192810]: 2025-09-30 21:52:07.356 2 DEBUG oslo_concurrency.processutils [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f346a37a-3dd5-4a2b-a445-9a0fe47a9194/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:52:07 compute-0 nova_compute[192810]: 2025-09-30 21:52:07.357 2 DEBUG nova.virt.disk.api [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Cannot resize image /var/lib/nova/instances/f346a37a-3dd5-4a2b-a445-9a0fe47a9194/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:52:07 compute-0 nova_compute[192810]: 2025-09-30 21:52:07.357 2 DEBUG nova.objects.instance [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Lazy-loading 'migration_context' on Instance uuid f346a37a-3dd5-4a2b-a445-9a0fe47a9194 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:52:07 compute-0 nova_compute[192810]: 2025-09-30 21:52:07.389 2 DEBUG nova.virt.libvirt.driver [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:52:07 compute-0 nova_compute[192810]: 2025-09-30 21:52:07.390 2 DEBUG nova.virt.libvirt.driver [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Ensure instance console log exists: /var/lib/nova/instances/f346a37a-3dd5-4a2b-a445-9a0fe47a9194/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:52:07 compute-0 nova_compute[192810]: 2025-09-30 21:52:07.390 2 DEBUG oslo_concurrency.lockutils [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:07 compute-0 nova_compute[192810]: 2025-09-30 21:52:07.391 2 DEBUG oslo_concurrency.lockutils [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:07 compute-0 nova_compute[192810]: 2025-09-30 21:52:07.391 2 DEBUG oslo_concurrency.lockutils [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:08 compute-0 nova_compute[192810]: 2025-09-30 21:52:08.043 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:08 compute-0 nova_compute[192810]: 2025-09-30 21:52:08.136 2 DEBUG nova.network.neutron [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Successfully created port: 6f02d077-990b-41dc-87e5-d4dfb5b0397d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:52:08 compute-0 sshd-session[248561]: Failed password for root from 8.210.178.40 port 57618 ssh2
Sep 30 21:52:08 compute-0 nova_compute[192810]: 2025-09-30 21:52:08.565 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:09 compute-0 nova_compute[192810]: 2025-09-30 21:52:09.207 2 DEBUG nova.network.neutron [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Successfully updated port: 6f02d077-990b-41dc-87e5-d4dfb5b0397d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:52:09 compute-0 nova_compute[192810]: 2025-09-30 21:52:09.255 2 DEBUG oslo_concurrency.lockutils [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Acquiring lock "refresh_cache-f346a37a-3dd5-4a2b-a445-9a0fe47a9194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:52:09 compute-0 nova_compute[192810]: 2025-09-30 21:52:09.256 2 DEBUG oslo_concurrency.lockutils [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Acquired lock "refresh_cache-f346a37a-3dd5-4a2b-a445-9a0fe47a9194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:52:09 compute-0 nova_compute[192810]: 2025-09-30 21:52:09.256 2 DEBUG nova.network.neutron [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:52:09 compute-0 nova_compute[192810]: 2025-09-30 21:52:09.359 2 DEBUG nova.compute.manager [req-8b4282a1-49e9-4ac9-bd9c-4e3cab77db69 req-fe7426d2-4d5f-470d-86b5-593ffad823ef dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Received event network-changed-6f02d077-990b-41dc-87e5-d4dfb5b0397d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:52:09 compute-0 nova_compute[192810]: 2025-09-30 21:52:09.360 2 DEBUG nova.compute.manager [req-8b4282a1-49e9-4ac9-bd9c-4e3cab77db69 req-fe7426d2-4d5f-470d-86b5-593ffad823ef dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Refreshing instance network info cache due to event network-changed-6f02d077-990b-41dc-87e5-d4dfb5b0397d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:52:09 compute-0 nova_compute[192810]: 2025-09-30 21:52:09.360 2 DEBUG oslo_concurrency.lockutils [req-8b4282a1-49e9-4ac9-bd9c-4e3cab77db69 req-fe7426d2-4d5f-470d-86b5-593ffad823ef dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-f346a37a-3dd5-4a2b-a445-9a0fe47a9194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:52:09 compute-0 nova_compute[192810]: 2025-09-30 21:52:09.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:09 compute-0 nova_compute[192810]: 2025-09-30 21:52:09.562 2 DEBUG nova.network.neutron [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:52:09 compute-0 nova_compute[192810]: 2025-09-30 21:52:09.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:52:09 compute-0 nova_compute[192810]: 2025-09-30 21:52:09.787 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:52:09 compute-0 nova_compute[192810]: 2025-09-30 21:52:09.787 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:52:09 compute-0 nova_compute[192810]: 2025-09-30 21:52:09.824 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Sep 30 21:52:10 compute-0 nova_compute[192810]: 2025-09-30 21:52:10.080 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "refresh_cache-06072d75-591c-4422-8b92-2176427d6b4d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:52:10 compute-0 nova_compute[192810]: 2025-09-30 21:52:10.081 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquired lock "refresh_cache-06072d75-591c-4422-8b92-2176427d6b4d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:52:10 compute-0 nova_compute[192810]: 2025-09-30 21:52:10.081 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Sep 30 21:52:10 compute-0 nova_compute[192810]: 2025-09-30 21:52:10.081 2 DEBUG nova.objects.instance [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lazy-loading 'info_cache' on Instance uuid 06072d75-591c-4422-8b92-2176427d6b4d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:52:10 compute-0 unix_chkpwd[248579]: password check failed for user (root)
Sep 30 21:52:10 compute-0 podman[248580]: 2025-09-30 21:52:10.314659723 +0000 UTC m=+0.052311750 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Sep 30 21:52:10 compute-0 podman[248582]: 2025-09-30 21:52:10.317078793 +0000 UTC m=+0.048443474 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:52:10 compute-0 podman[248581]: 2025-09-30 21:52:10.348650757 +0000 UTC m=+0.071828075 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:52:10 compute-0 nova_compute[192810]: 2025-09-30 21:52:10.930 2 DEBUG nova.network.neutron [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Updating instance_info_cache with network_info: [{"id": "6f02d077-990b-41dc-87e5-d4dfb5b0397d", "address": "fa:16:3e:ee:93:c6", "network": {"id": "37abdced-e5ee-484f-bdda-8b1a7b19a7f0", "bridge": "br-int", "label": "tempest-TestShelveInstance-1740163448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f69d27778a54a45a6527a19643d1fe0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f02d077-99", "ovs_interfaceid": "6f02d077-990b-41dc-87e5-d4dfb5b0397d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:52:10 compute-0 nova_compute[192810]: 2025-09-30 21:52:10.990 2 DEBUG oslo_concurrency.lockutils [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Releasing lock "refresh_cache-f346a37a-3dd5-4a2b-a445-9a0fe47a9194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:52:10 compute-0 nova_compute[192810]: 2025-09-30 21:52:10.991 2 DEBUG nova.compute.manager [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Instance network_info: |[{"id": "6f02d077-990b-41dc-87e5-d4dfb5b0397d", "address": "fa:16:3e:ee:93:c6", "network": {"id": "37abdced-e5ee-484f-bdda-8b1a7b19a7f0", "bridge": "br-int", "label": "tempest-TestShelveInstance-1740163448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f69d27778a54a45a6527a19643d1fe0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f02d077-99", "ovs_interfaceid": "6f02d077-990b-41dc-87e5-d4dfb5b0397d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:52:10 compute-0 nova_compute[192810]: 2025-09-30 21:52:10.991 2 DEBUG oslo_concurrency.lockutils [req-8b4282a1-49e9-4ac9-bd9c-4e3cab77db69 req-fe7426d2-4d5f-470d-86b5-593ffad823ef dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-f346a37a-3dd5-4a2b-a445-9a0fe47a9194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:52:10 compute-0 nova_compute[192810]: 2025-09-30 21:52:10.991 2 DEBUG nova.network.neutron [req-8b4282a1-49e9-4ac9-bd9c-4e3cab77db69 req-fe7426d2-4d5f-470d-86b5-593ffad823ef dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Refreshing network info cache for port 6f02d077-990b-41dc-87e5-d4dfb5b0397d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:52:10 compute-0 nova_compute[192810]: 2025-09-30 21:52:10.993 2 DEBUG nova.virt.libvirt.driver [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Start _get_guest_xml network_info=[{"id": "6f02d077-990b-41dc-87e5-d4dfb5b0397d", "address": "fa:16:3e:ee:93:c6", "network": {"id": "37abdced-e5ee-484f-bdda-8b1a7b19a7f0", "bridge": "br-int", "label": "tempest-TestShelveInstance-1740163448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f69d27778a54a45a6527a19643d1fe0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f02d077-99", "ovs_interfaceid": "6f02d077-990b-41dc-87e5-d4dfb5b0397d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:52:10 compute-0 nova_compute[192810]: 2025-09-30 21:52:10.997 2 WARNING nova.virt.libvirt.driver [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:52:11 compute-0 nova_compute[192810]: 2025-09-30 21:52:11.002 2 DEBUG nova.virt.libvirt.host [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:52:11 compute-0 nova_compute[192810]: 2025-09-30 21:52:11.003 2 DEBUG nova.virt.libvirt.host [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:52:11 compute-0 nova_compute[192810]: 2025-09-30 21:52:11.007 2 DEBUG nova.virt.libvirt.host [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:52:11 compute-0 nova_compute[192810]: 2025-09-30 21:52:11.007 2 DEBUG nova.virt.libvirt.host [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:52:11 compute-0 nova_compute[192810]: 2025-09-30 21:52:11.009 2 DEBUG nova.virt.libvirt.driver [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:52:11 compute-0 nova_compute[192810]: 2025-09-30 21:52:11.009 2 DEBUG nova.virt.hardware [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:52:11 compute-0 nova_compute[192810]: 2025-09-30 21:52:11.009 2 DEBUG nova.virt.hardware [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:52:11 compute-0 nova_compute[192810]: 2025-09-30 21:52:11.010 2 DEBUG nova.virt.hardware [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:52:11 compute-0 nova_compute[192810]: 2025-09-30 21:52:11.010 2 DEBUG nova.virt.hardware [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:52:11 compute-0 nova_compute[192810]: 2025-09-30 21:52:11.010 2 DEBUG nova.virt.hardware [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:52:11 compute-0 nova_compute[192810]: 2025-09-30 21:52:11.011 2 DEBUG nova.virt.hardware [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:52:11 compute-0 nova_compute[192810]: 2025-09-30 21:52:11.011 2 DEBUG nova.virt.hardware [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:52:11 compute-0 nova_compute[192810]: 2025-09-30 21:52:11.011 2 DEBUG nova.virt.hardware [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:52:11 compute-0 nova_compute[192810]: 2025-09-30 21:52:11.011 2 DEBUG nova.virt.hardware [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:52:11 compute-0 nova_compute[192810]: 2025-09-30 21:52:11.012 2 DEBUG nova.virt.hardware [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:52:11 compute-0 nova_compute[192810]: 2025-09-30 21:52:11.012 2 DEBUG nova.virt.hardware [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:52:11 compute-0 nova_compute[192810]: 2025-09-30 21:52:11.016 2 DEBUG nova.virt.libvirt.vif [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:52:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestShelveInstance-server-144130882',display_name='tempest-TestShelveInstance-server-144130882',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testshelveinstance-server-144130882',id=177,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLqJH0XUHnV+DFJ4EEdvUcnnJtAN3d0qXBSayXC/1r7r1r6VLAmC4Ityxcv3hObBaxsXAuhiLyJJeNSDzDDKD8CyDvitwtpXo1jlbjfL2MJDye78MFsuQz3G/vPFK9b4zw==',key_name='tempest-TestShelveInstance-1916761761',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8f69d27778a54a45a6527a19643d1fe0',ramdisk_id='',reservation_id='r-u088hjgo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestShelveInstance-1185384137',owner_user_name='tempest-TestShelveInstance-1185384137-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:52:06Z,user_data=None,user_id='2a25e9b373084bfa9bc194a299a3ac4a',uuid=f346a37a-3dd5-4a2b-a445-9a0fe47a9194,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6f02d077-990b-41dc-87e5-d4dfb5b0397d", "address": "fa:16:3e:ee:93:c6", "network": {"id": "37abdced-e5ee-484f-bdda-8b1a7b19a7f0", "bridge": "br-int", "label": "tempest-TestShelveInstance-1740163448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f69d27778a54a45a6527a19643d1fe0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f02d077-99", "ovs_interfaceid": "6f02d077-990b-41dc-87e5-d4dfb5b0397d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:52:11 compute-0 nova_compute[192810]: 2025-09-30 21:52:11.016 2 DEBUG nova.network.os_vif_util [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Converting VIF {"id": "6f02d077-990b-41dc-87e5-d4dfb5b0397d", "address": "fa:16:3e:ee:93:c6", "network": {"id": "37abdced-e5ee-484f-bdda-8b1a7b19a7f0", "bridge": "br-int", "label": "tempest-TestShelveInstance-1740163448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f69d27778a54a45a6527a19643d1fe0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f02d077-99", "ovs_interfaceid": "6f02d077-990b-41dc-87e5-d4dfb5b0397d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:52:11 compute-0 nova_compute[192810]: 2025-09-30 21:52:11.017 2 DEBUG nova.network.os_vif_util [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ee:93:c6,bridge_name='br-int',has_traffic_filtering=True,id=6f02d077-990b-41dc-87e5-d4dfb5b0397d,network=Network(37abdced-e5ee-484f-bdda-8b1a7b19a7f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6f02d077-99') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:52:11 compute-0 nova_compute[192810]: 2025-09-30 21:52:11.018 2 DEBUG nova.objects.instance [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Lazy-loading 'pci_devices' on Instance uuid f346a37a-3dd5-4a2b-a445-9a0fe47a9194 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:52:11 compute-0 nova_compute[192810]: 2025-09-30 21:52:11.072 2 DEBUG nova.virt.libvirt.driver [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:52:11 compute-0 nova_compute[192810]:   <uuid>f346a37a-3dd5-4a2b-a445-9a0fe47a9194</uuid>
Sep 30 21:52:11 compute-0 nova_compute[192810]:   <name>instance-000000b1</name>
Sep 30 21:52:11 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:52:11 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:52:11 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:52:11 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:       <nova:name>tempest-TestShelveInstance-server-144130882</nova:name>
Sep 30 21:52:11 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:52:10</nova:creationTime>
Sep 30 21:52:11 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:52:11 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:52:11 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:52:11 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:52:11 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:52:11 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:52:11 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:52:11 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:52:11 compute-0 nova_compute[192810]:         <nova:user uuid="2a25e9b373084bfa9bc194a299a3ac4a">tempest-TestShelveInstance-1185384137-project-member</nova:user>
Sep 30 21:52:11 compute-0 nova_compute[192810]:         <nova:project uuid="8f69d27778a54a45a6527a19643d1fe0">tempest-TestShelveInstance-1185384137</nova:project>
Sep 30 21:52:11 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:52:11 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:52:11 compute-0 nova_compute[192810]:         <nova:port uuid="6f02d077-990b-41dc-87e5-d4dfb5b0397d">
Sep 30 21:52:11 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:52:11 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:52:11 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:52:11 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:52:11 compute-0 nova_compute[192810]:     <system>
Sep 30 21:52:11 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:52:11 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:52:11 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:52:11 compute-0 nova_compute[192810]:       <entry name="serial">f346a37a-3dd5-4a2b-a445-9a0fe47a9194</entry>
Sep 30 21:52:11 compute-0 nova_compute[192810]:       <entry name="uuid">f346a37a-3dd5-4a2b-a445-9a0fe47a9194</entry>
Sep 30 21:52:11 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     </system>
Sep 30 21:52:11 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:52:11 compute-0 nova_compute[192810]:   <os>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:   </os>
Sep 30 21:52:11 compute-0 nova_compute[192810]:   <features>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:   </features>
Sep 30 21:52:11 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:52:11 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:52:11 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:52:11 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:52:11 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:52:11 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/f346a37a-3dd5-4a2b-a445-9a0fe47a9194/disk"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:52:11 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/f346a37a-3dd5-4a2b-a445-9a0fe47a9194/disk.config"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:52:11 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:ee:93:c6"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:       <target dev="tap6f02d077-99"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:52:11 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/f346a37a-3dd5-4a2b-a445-9a0fe47a9194/console.log" append="off"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     <video>
Sep 30 21:52:11 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     </video>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:52:11 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:52:11 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:52:11 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:52:11 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:52:11 compute-0 nova_compute[192810]: </domain>
Sep 30 21:52:11 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:52:11 compute-0 nova_compute[192810]: 2025-09-30 21:52:11.074 2 DEBUG nova.compute.manager [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Preparing to wait for external event network-vif-plugged-6f02d077-990b-41dc-87e5-d4dfb5b0397d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:52:11 compute-0 nova_compute[192810]: 2025-09-30 21:52:11.074 2 DEBUG oslo_concurrency.lockutils [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Acquiring lock "f346a37a-3dd5-4a2b-a445-9a0fe47a9194-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:11 compute-0 nova_compute[192810]: 2025-09-30 21:52:11.074 2 DEBUG oslo_concurrency.lockutils [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Lock "f346a37a-3dd5-4a2b-a445-9a0fe47a9194-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:11 compute-0 nova_compute[192810]: 2025-09-30 21:52:11.075 2 DEBUG oslo_concurrency.lockutils [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Lock "f346a37a-3dd5-4a2b-a445-9a0fe47a9194-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:11 compute-0 nova_compute[192810]: 2025-09-30 21:52:11.075 2 DEBUG nova.virt.libvirt.vif [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:52:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestShelveInstance-server-144130882',display_name='tempest-TestShelveInstance-server-144130882',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testshelveinstance-server-144130882',id=177,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLqJH0XUHnV+DFJ4EEdvUcnnJtAN3d0qXBSayXC/1r7r1r6VLAmC4Ityxcv3hObBaxsXAuhiLyJJeNSDzDDKD8CyDvitwtpXo1jlbjfL2MJDye78MFsuQz3G/vPFK9b4zw==',key_name='tempest-TestShelveInstance-1916761761',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8f69d27778a54a45a6527a19643d1fe0',ramdisk_id='',reservation_id='r-u088hjgo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestShelveInstance-1185384137',owner_user_name='tempest-TestShelveInstance-1185384137-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:52:06Z,user_data=None,user_id='2a25e9b373084bfa9bc194a299a3ac4a',uuid=f346a37a-3dd5-4a2b-a445-9a0fe47a9194,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6f02d077-990b-41dc-87e5-d4dfb5b0397d", "address": "fa:16:3e:ee:93:c6", "network": {"id": "37abdced-e5ee-484f-bdda-8b1a7b19a7f0", "bridge": "br-int", "label": "tempest-TestShelveInstance-1740163448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "8f69d27778a54a45a6527a19643d1fe0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f02d077-99", "ovs_interfaceid": "6f02d077-990b-41dc-87e5-d4dfb5b0397d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:52:11 compute-0 nova_compute[192810]: 2025-09-30 21:52:11.076 2 DEBUG nova.network.os_vif_util [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Converting VIF {"id": "6f02d077-990b-41dc-87e5-d4dfb5b0397d", "address": "fa:16:3e:ee:93:c6", "network": {"id": "37abdced-e5ee-484f-bdda-8b1a7b19a7f0", "bridge": "br-int", "label": "tempest-TestShelveInstance-1740163448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f69d27778a54a45a6527a19643d1fe0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f02d077-99", "ovs_interfaceid": "6f02d077-990b-41dc-87e5-d4dfb5b0397d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:52:11 compute-0 nova_compute[192810]: 2025-09-30 21:52:11.076 2 DEBUG nova.network.os_vif_util [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ee:93:c6,bridge_name='br-int',has_traffic_filtering=True,id=6f02d077-990b-41dc-87e5-d4dfb5b0397d,network=Network(37abdced-e5ee-484f-bdda-8b1a7b19a7f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6f02d077-99') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:52:11 compute-0 nova_compute[192810]: 2025-09-30 21:52:11.077 2 DEBUG os_vif [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:93:c6,bridge_name='br-int',has_traffic_filtering=True,id=6f02d077-990b-41dc-87e5-d4dfb5b0397d,network=Network(37abdced-e5ee-484f-bdda-8b1a7b19a7f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6f02d077-99') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:52:11 compute-0 nova_compute[192810]: 2025-09-30 21:52:11.077 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:11 compute-0 nova_compute[192810]: 2025-09-30 21:52:11.077 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:52:11 compute-0 nova_compute[192810]: 2025-09-30 21:52:11.078 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:52:11 compute-0 nova_compute[192810]: 2025-09-30 21:52:11.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:11 compute-0 nova_compute[192810]: 2025-09-30 21:52:11.080 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6f02d077-99, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:52:11 compute-0 nova_compute[192810]: 2025-09-30 21:52:11.080 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6f02d077-99, col_values=(('external_ids', {'iface-id': '6f02d077-990b-41dc-87e5-d4dfb5b0397d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ee:93:c6', 'vm-uuid': 'f346a37a-3dd5-4a2b-a445-9a0fe47a9194'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:52:11 compute-0 nova_compute[192810]: 2025-09-30 21:52:11.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:11 compute-0 NetworkManager[51733]: <info>  [1759269131.0932] manager: (tap6f02d077-99): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/311)
Sep 30 21:52:11 compute-0 nova_compute[192810]: 2025-09-30 21:52:11.095 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:52:11 compute-0 nova_compute[192810]: 2025-09-30 21:52:11.100 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:11 compute-0 nova_compute[192810]: 2025-09-30 21:52:11.100 2 INFO os_vif [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:93:c6,bridge_name='br-int',has_traffic_filtering=True,id=6f02d077-990b-41dc-87e5-d4dfb5b0397d,network=Network(37abdced-e5ee-484f-bdda-8b1a7b19a7f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6f02d077-99')
Sep 30 21:52:11 compute-0 nova_compute[192810]: 2025-09-30 21:52:11.163 2 DEBUG nova.virt.libvirt.driver [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:52:11 compute-0 nova_compute[192810]: 2025-09-30 21:52:11.163 2 DEBUG nova.virt.libvirt.driver [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:52:11 compute-0 nova_compute[192810]: 2025-09-30 21:52:11.164 2 DEBUG nova.virt.libvirt.driver [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] No VIF found with MAC fa:16:3e:ee:93:c6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:52:11 compute-0 nova_compute[192810]: 2025-09-30 21:52:11.164 2 INFO nova.virt.libvirt.driver [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Using config drive
Sep 30 21:52:11 compute-0 nova_compute[192810]: 2025-09-30 21:52:11.883 2 INFO nova.virt.libvirt.driver [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Creating config drive at /var/lib/nova/instances/f346a37a-3dd5-4a2b-a445-9a0fe47a9194/disk.config
Sep 30 21:52:11 compute-0 nova_compute[192810]: 2025-09-30 21:52:11.888 2 DEBUG oslo_concurrency.processutils [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f346a37a-3dd5-4a2b-a445-9a0fe47a9194/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdm3dcygh execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:52:12 compute-0 nova_compute[192810]: 2025-09-30 21:52:12.015 2 DEBUG oslo_concurrency.processutils [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f346a37a-3dd5-4a2b-a445-9a0fe47a9194/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdm3dcygh" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:52:12 compute-0 kernel: tap6f02d077-99: entered promiscuous mode
Sep 30 21:52:12 compute-0 ovn_controller[94912]: 2025-09-30T21:52:12Z|00707|binding|INFO|Claiming lport 6f02d077-990b-41dc-87e5-d4dfb5b0397d for this chassis.
Sep 30 21:52:12 compute-0 ovn_controller[94912]: 2025-09-30T21:52:12Z|00708|binding|INFO|6f02d077-990b-41dc-87e5-d4dfb5b0397d: Claiming fa:16:3e:ee:93:c6 10.100.0.13
Sep 30 21:52:12 compute-0 NetworkManager[51733]: <info>  [1759269132.0757] manager: (tap6f02d077-99): new Tun device (/org/freedesktop/NetworkManager/Devices/312)
Sep 30 21:52:12 compute-0 nova_compute[192810]: 2025-09-30 21:52:12.074 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:12.088 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ee:93:c6 10.100.0.13'], port_security=['fa:16:3e:ee:93:c6 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'f346a37a-3dd5-4a2b-a445-9a0fe47a9194', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-37abdced-e5ee-484f-bdda-8b1a7b19a7f0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f69d27778a54a45a6527a19643d1fe0', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a1f2b095-bc71-4c21-8e62-a47f9c684104', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cee6e55-2dcb-411d-b86b-9118cc3f4c81, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=6f02d077-990b-41dc-87e5-d4dfb5b0397d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:12.090 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 6f02d077-990b-41dc-87e5-d4dfb5b0397d in datapath 37abdced-e5ee-484f-bdda-8b1a7b19a7f0 bound to our chassis
Sep 30 21:52:12 compute-0 nova_compute[192810]: 2025-09-30 21:52:12.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:12.092 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 37abdced-e5ee-484f-bdda-8b1a7b19a7f0
Sep 30 21:52:12 compute-0 nova_compute[192810]: 2025-09-30 21:52:12.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:12 compute-0 ovn_controller[94912]: 2025-09-30T21:52:12Z|00709|binding|INFO|Setting lport 6f02d077-990b-41dc-87e5-d4dfb5b0397d ovn-installed in OVS
Sep 30 21:52:12 compute-0 ovn_controller[94912]: 2025-09-30T21:52:12Z|00710|binding|INFO|Setting lport 6f02d077-990b-41dc-87e5-d4dfb5b0397d up in Southbound
Sep 30 21:52:12 compute-0 nova_compute[192810]: 2025-09-30 21:52:12.094 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:12.105 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a05d5b27-fc02-4b5b-837b-22a37c0d392e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:12.106 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap37abdced-e1 in ovnmeta-37abdced-e5ee-484f-bdda-8b1a7b19a7f0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:52:12 compute-0 systemd-udevd[248663]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:12.108 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap37abdced-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:12.109 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[5401ac74-5415-45c1-9f8b-257696689c7e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:12.109 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[16825a78-22d0-4046-bd71-8883c8833171]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:12.119 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[07cfd84e-a116-4ccd-a039-e598354fd23b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:12 compute-0 NetworkManager[51733]: <info>  [1759269132.1225] device (tap6f02d077-99): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:52:12 compute-0 NetworkManager[51733]: <info>  [1759269132.1238] device (tap6f02d077-99): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:52:12 compute-0 systemd-machined[152794]: New machine qemu-86-instance-000000b1.
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:12.136 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[8117ae38-a302-4ef3-9214-1d7cb671ab8a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:12 compute-0 systemd[1]: Started Virtual Machine qemu-86-instance-000000b1.
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:12.170 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[e29f63ba-dbad-41b2-a864-1e133c18d24b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:12 compute-0 systemd-udevd[248667]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:52:12 compute-0 NetworkManager[51733]: <info>  [1759269132.1794] manager: (tap37abdced-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/313)
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:12.178 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[b70823bd-48a1-4c0a-b406-00aafab446bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:12.218 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[291205ee-8df5-4961-b5d6-23741a55630b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:12.222 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[7624ff98-acf8-4259-be2c-35a7b4e6c348]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:12 compute-0 NetworkManager[51733]: <info>  [1759269132.2502] device (tap37abdced-e0): carrier: link connected
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:12.256 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[02986877-4d94-4cb8-a192-ac42b3fdce3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:12.273 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[ab74ad7b-6360-4794-8627-eb7fa2bfcbb0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap37abdced-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:89:17:39'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 213], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 582787, 'reachable_time': 33200, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 248696, 'error': None, 'target': 'ovnmeta-37abdced-e5ee-484f-bdda-8b1a7b19a7f0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:12.286 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[9691c91e-eb0b-495e-9646-59629f7ada1f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe89:1739'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 582787, 'tstamp': 582787}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 248697, 'error': None, 'target': 'ovnmeta-37abdced-e5ee-484f-bdda-8b1a7b19a7f0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:12.303 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[62d2d142-ce4f-4c41-892f-6203f07f232b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap37abdced-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:89:17:39'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 213], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 582787, 'reachable_time': 33200, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 248698, 'error': None, 'target': 'ovnmeta-37abdced-e5ee-484f-bdda-8b1a7b19a7f0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:12.339 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e338c247-ecc8-4f73-8f1b-fa8f02eeeb71]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:12.420 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[07873cdc-8711-4a37-b6d4-a2526e6b26f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:12.425 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap37abdced-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:12.426 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:12.428 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap37abdced-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:52:12 compute-0 NetworkManager[51733]: <info>  [1759269132.4330] manager: (tap37abdced-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/314)
Sep 30 21:52:12 compute-0 kernel: tap37abdced-e0: entered promiscuous mode
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:12.436 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap37abdced-e0, col_values=(('external_ids', {'iface-id': 'cf7ed38e-459c-4152-addb-db64052a2be8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:52:12 compute-0 ovn_controller[94912]: 2025-09-30T21:52:12Z|00711|binding|INFO|Releasing lport cf7ed38e-459c-4152-addb-db64052a2be8 from this chassis (sb_readonly=0)
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:12.439 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/37abdced-e5ee-484f-bdda-8b1a7b19a7f0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/37abdced-e5ee-484f-bdda-8b1a7b19a7f0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:52:12 compute-0 nova_compute[192810]: 2025-09-30 21:52:12.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:12.452 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[74e4db50-6375-4d76-a90f-bf798e32b3b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:12.454 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-37abdced-e5ee-484f-bdda-8b1a7b19a7f0
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/37abdced-e5ee-484f-bdda-8b1a7b19a7f0.pid.haproxy
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID 37abdced-e5ee-484f-bdda-8b1a7b19a7f0
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:52:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:12.455 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-37abdced-e5ee-484f-bdda-8b1a7b19a7f0', 'env', 'PROCESS_TAG=haproxy-37abdced-e5ee-484f-bdda-8b1a7b19a7f0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/37abdced-e5ee-484f-bdda-8b1a7b19a7f0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:52:12 compute-0 nova_compute[192810]: 2025-09-30 21:52:12.524 2 DEBUG nova.compute.manager [req-c7a91ece-b019-4a80-bc90-0042802b847c req-1b11aeb5-6f3d-4ecb-bc41-ff1d00a9038f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Received event network-vif-plugged-6f02d077-990b-41dc-87e5-d4dfb5b0397d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:52:12 compute-0 nova_compute[192810]: 2025-09-30 21:52:12.524 2 DEBUG oslo_concurrency.lockutils [req-c7a91ece-b019-4a80-bc90-0042802b847c req-1b11aeb5-6f3d-4ecb-bc41-ff1d00a9038f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "f346a37a-3dd5-4a2b-a445-9a0fe47a9194-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:12 compute-0 nova_compute[192810]: 2025-09-30 21:52:12.525 2 DEBUG oslo_concurrency.lockutils [req-c7a91ece-b019-4a80-bc90-0042802b847c req-1b11aeb5-6f3d-4ecb-bc41-ff1d00a9038f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "f346a37a-3dd5-4a2b-a445-9a0fe47a9194-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:12 compute-0 nova_compute[192810]: 2025-09-30 21:52:12.525 2 DEBUG oslo_concurrency.lockutils [req-c7a91ece-b019-4a80-bc90-0042802b847c req-1b11aeb5-6f3d-4ecb-bc41-ff1d00a9038f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "f346a37a-3dd5-4a2b-a445-9a0fe47a9194-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:12 compute-0 nova_compute[192810]: 2025-09-30 21:52:12.525 2 DEBUG nova.compute.manager [req-c7a91ece-b019-4a80-bc90-0042802b847c req-1b11aeb5-6f3d-4ecb-bc41-ff1d00a9038f dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Processing event network-vif-plugged-6f02d077-990b-41dc-87e5-d4dfb5b0397d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:52:12 compute-0 sshd-session[248561]: Failed password for root from 8.210.178.40 port 57618 ssh2
Sep 30 21:52:12 compute-0 podman[248730]: 2025-09-30 21:52:12.831582684 +0000 UTC m=+0.051545861 container create 96d193df5d88bfe317057446892ad94c20c640f764995f3c187978c6094973ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-37abdced-e5ee-484f-bdda-8b1a7b19a7f0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Sep 30 21:52:12 compute-0 systemd[1]: Started libpod-conmon-96d193df5d88bfe317057446892ad94c20c640f764995f3c187978c6094973ed.scope.
Sep 30 21:52:12 compute-0 podman[248730]: 2025-09-30 21:52:12.80325126 +0000 UTC m=+0.023214447 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:52:12 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:52:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e2f842d9f86dae412d508fca824ce032bc689912070bf98cd5d565e176728481/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:52:12 compute-0 podman[248730]: 2025-09-30 21:52:12.920247545 +0000 UTC m=+0.140210752 container init 96d193df5d88bfe317057446892ad94c20c640f764995f3c187978c6094973ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-37abdced-e5ee-484f-bdda-8b1a7b19a7f0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923)
Sep 30 21:52:12 compute-0 podman[248730]: 2025-09-30 21:52:12.925103566 +0000 UTC m=+0.145066753 container start 96d193df5d88bfe317057446892ad94c20c640f764995f3c187978c6094973ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-37abdced-e5ee-484f-bdda-8b1a7b19a7f0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Sep 30 21:52:12 compute-0 neutron-haproxy-ovnmeta-37abdced-e5ee-484f-bdda-8b1a7b19a7f0[248745]: [NOTICE]   (248755) : New worker (248757) forked
Sep 30 21:52:12 compute-0 neutron-haproxy-ovnmeta-37abdced-e5ee-484f-bdda-8b1a7b19a7f0[248745]: [NOTICE]   (248755) : Loading success.
Sep 30 21:52:13 compute-0 nova_compute[192810]: 2025-09-30 21:52:13.012 2 DEBUG nova.network.neutron [req-8b4282a1-49e9-4ac9-bd9c-4e3cab77db69 req-fe7426d2-4d5f-470d-86b5-593ffad823ef dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Updated VIF entry in instance network info cache for port 6f02d077-990b-41dc-87e5-d4dfb5b0397d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:52:13 compute-0 nova_compute[192810]: 2025-09-30 21:52:13.012 2 DEBUG nova.network.neutron [req-8b4282a1-49e9-4ac9-bd9c-4e3cab77db69 req-fe7426d2-4d5f-470d-86b5-593ffad823ef dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Updating instance_info_cache with network_info: [{"id": "6f02d077-990b-41dc-87e5-d4dfb5b0397d", "address": "fa:16:3e:ee:93:c6", "network": {"id": "37abdced-e5ee-484f-bdda-8b1a7b19a7f0", "bridge": "br-int", "label": "tempest-TestShelveInstance-1740163448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f69d27778a54a45a6527a19643d1fe0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f02d077-99", "ovs_interfaceid": "6f02d077-990b-41dc-87e5-d4dfb5b0397d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:52:13 compute-0 nova_compute[192810]: 2025-09-30 21:52:13.044 2 DEBUG oslo_concurrency.lockutils [req-8b4282a1-49e9-4ac9-bd9c-4e3cab77db69 req-fe7426d2-4d5f-470d-86b5-593ffad823ef dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-f346a37a-3dd5-4a2b-a445-9a0fe47a9194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:52:13 compute-0 nova_compute[192810]: 2025-09-30 21:52:13.361 2 DEBUG nova.compute.manager [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:52:13 compute-0 nova_compute[192810]: 2025-09-30 21:52:13.361 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759269133.3606834, f346a37a-3dd5-4a2b-a445-9a0fe47a9194 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:52:13 compute-0 nova_compute[192810]: 2025-09-30 21:52:13.362 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] VM Started (Lifecycle Event)
Sep 30 21:52:13 compute-0 nova_compute[192810]: 2025-09-30 21:52:13.366 2 DEBUG nova.virt.libvirt.driver [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:52:13 compute-0 nova_compute[192810]: 2025-09-30 21:52:13.369 2 INFO nova.virt.libvirt.driver [-] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Instance spawned successfully.
Sep 30 21:52:13 compute-0 nova_compute[192810]: 2025-09-30 21:52:13.369 2 DEBUG nova.virt.libvirt.driver [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:52:13 compute-0 nova_compute[192810]: 2025-09-30 21:52:13.401 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:52:13 compute-0 nova_compute[192810]: 2025-09-30 21:52:13.405 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:52:13 compute-0 nova_compute[192810]: 2025-09-30 21:52:13.407 2 DEBUG nova.virt.libvirt.driver [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:52:13 compute-0 nova_compute[192810]: 2025-09-30 21:52:13.407 2 DEBUG nova.virt.libvirt.driver [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:52:13 compute-0 nova_compute[192810]: 2025-09-30 21:52:13.408 2 DEBUG nova.virt.libvirt.driver [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:52:13 compute-0 nova_compute[192810]: 2025-09-30 21:52:13.408 2 DEBUG nova.virt.libvirt.driver [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:52:13 compute-0 nova_compute[192810]: 2025-09-30 21:52:13.408 2 DEBUG nova.virt.libvirt.driver [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:52:13 compute-0 nova_compute[192810]: 2025-09-30 21:52:13.409 2 DEBUG nova.virt.libvirt.driver [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:52:13 compute-0 nova_compute[192810]: 2025-09-30 21:52:13.435 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:52:13 compute-0 nova_compute[192810]: 2025-09-30 21:52:13.435 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759269133.3632104, f346a37a-3dd5-4a2b-a445-9a0fe47a9194 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:52:13 compute-0 nova_compute[192810]: 2025-09-30 21:52:13.436 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] VM Paused (Lifecycle Event)
Sep 30 21:52:13 compute-0 nova_compute[192810]: 2025-09-30 21:52:13.470 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:52:13 compute-0 nova_compute[192810]: 2025-09-30 21:52:13.472 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759269133.3652883, f346a37a-3dd5-4a2b-a445-9a0fe47a9194 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:52:13 compute-0 nova_compute[192810]: 2025-09-30 21:52:13.473 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] VM Resumed (Lifecycle Event)
Sep 30 21:52:13 compute-0 nova_compute[192810]: 2025-09-30 21:52:13.511 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:52:13 compute-0 nova_compute[192810]: 2025-09-30 21:52:13.513 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:52:13 compute-0 nova_compute[192810]: 2025-09-30 21:52:13.543 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:52:13 compute-0 nova_compute[192810]: 2025-09-30 21:52:13.557 2 INFO nova.compute.manager [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Took 6.65 seconds to spawn the instance on the hypervisor.
Sep 30 21:52:13 compute-0 nova_compute[192810]: 2025-09-30 21:52:13.557 2 DEBUG nova.compute.manager [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:52:13 compute-0 nova_compute[192810]: 2025-09-30 21:52:13.566 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:13 compute-0 nova_compute[192810]: 2025-09-30 21:52:13.736 2 INFO nova.compute.manager [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Took 7.83 seconds to build instance.
Sep 30 21:52:13 compute-0 nova_compute[192810]: 2025-09-30 21:52:13.767 2 DEBUG oslo_concurrency.lockutils [None req-84c179f7-affb-4b30-b790-6206c43ff90d 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Lock "f346a37a-3dd5-4a2b-a445-9a0fe47a9194" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:13 compute-0 nova_compute[192810]: 2025-09-30 21:52:13.926 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Updating instance_info_cache with network_info: [{"id": "25422545-2c61-41ab-9982-ca7761a24544", "address": "fa:16:3e:b0:c4:93", "network": {"id": "4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc", "bridge": "br-int", "label": "tempest-network-smoke--1429782878", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.226", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25422545-2c", "ovs_interfaceid": "25422545-2c61-41ab-9982-ca7761a24544", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6c5de877-502e-47f1-b767-d9d472689330", "address": "fa:16:3e:31:ed:85", "network": {"id": "cd6da069-7a88-49b7-bea7-1ceb7132f614", "bridge": "br-int", "label": "tempest-network-smoke--1719254685", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe31:ed85", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c5de877-50", "ovs_interfaceid": "6c5de877-502e-47f1-b767-d9d472689330", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:52:13 compute-0 nova_compute[192810]: 2025-09-30 21:52:13.953 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Releasing lock "refresh_cache-06072d75-591c-4422-8b92-2176427d6b4d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:52:13 compute-0 nova_compute[192810]: 2025-09-30 21:52:13.954 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Sep 30 21:52:13 compute-0 nova_compute[192810]: 2025-09-30 21:52:13.954 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:52:13 compute-0 nova_compute[192810]: 2025-09-30 21:52:13.955 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:52:13 compute-0 nova_compute[192810]: 2025-09-30 21:52:13.955 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:52:13 compute-0 nova_compute[192810]: 2025-09-30 21:52:13.956 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:52:13 compute-0 nova_compute[192810]: 2025-09-30 21:52:13.984 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:13 compute-0 nova_compute[192810]: 2025-09-30 21:52:13.985 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:13 compute-0 nova_compute[192810]: 2025-09-30 21:52:13.986 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:13 compute-0 nova_compute[192810]: 2025-09-30 21:52:13.986 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:52:14 compute-0 unix_chkpwd[248768]: password check failed for user (root)
Sep 30 21:52:14 compute-0 nova_compute[192810]: 2025-09-30 21:52:14.102 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/06072d75-591c-4422-8b92-2176427d6b4d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:52:14 compute-0 nova_compute[192810]: 2025-09-30 21:52:14.159 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/06072d75-591c-4422-8b92-2176427d6b4d/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:52:14 compute-0 nova_compute[192810]: 2025-09-30 21:52:14.160 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/06072d75-591c-4422-8b92-2176427d6b4d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:52:14 compute-0 nova_compute[192810]: 2025-09-30 21:52:14.218 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/06072d75-591c-4422-8b92-2176427d6b4d/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:52:14 compute-0 nova_compute[192810]: 2025-09-30 21:52:14.227 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f346a37a-3dd5-4a2b-a445-9a0fe47a9194/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:52:14 compute-0 nova_compute[192810]: 2025-09-30 21:52:14.296 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f346a37a-3dd5-4a2b-a445-9a0fe47a9194/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:52:14 compute-0 nova_compute[192810]: 2025-09-30 21:52:14.297 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f346a37a-3dd5-4a2b-a445-9a0fe47a9194/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:52:14 compute-0 nova_compute[192810]: 2025-09-30 21:52:14.374 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f346a37a-3dd5-4a2b-a445-9a0fe47a9194/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:52:14 compute-0 nova_compute[192810]: 2025-09-30 21:52:14.594 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:52:14 compute-0 nova_compute[192810]: 2025-09-30 21:52:14.596 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5448MB free_disk=73.1995849609375GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:52:14 compute-0 nova_compute[192810]: 2025-09-30 21:52:14.596 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:14 compute-0 nova_compute[192810]: 2025-09-30 21:52:14.597 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:14 compute-0 nova_compute[192810]: 2025-09-30 21:52:14.619 2 DEBUG nova.compute.manager [req-70759e6d-196f-4062-ab9a-d7bf5fbfb62b req-662ecbe5-5a58-40c8-9e54-2d53786c1a80 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Received event network-vif-plugged-6f02d077-990b-41dc-87e5-d4dfb5b0397d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:52:14 compute-0 nova_compute[192810]: 2025-09-30 21:52:14.620 2 DEBUG oslo_concurrency.lockutils [req-70759e6d-196f-4062-ab9a-d7bf5fbfb62b req-662ecbe5-5a58-40c8-9e54-2d53786c1a80 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "f346a37a-3dd5-4a2b-a445-9a0fe47a9194-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:14 compute-0 nova_compute[192810]: 2025-09-30 21:52:14.620 2 DEBUG oslo_concurrency.lockutils [req-70759e6d-196f-4062-ab9a-d7bf5fbfb62b req-662ecbe5-5a58-40c8-9e54-2d53786c1a80 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "f346a37a-3dd5-4a2b-a445-9a0fe47a9194-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:14 compute-0 nova_compute[192810]: 2025-09-30 21:52:14.620 2 DEBUG oslo_concurrency.lockutils [req-70759e6d-196f-4062-ab9a-d7bf5fbfb62b req-662ecbe5-5a58-40c8-9e54-2d53786c1a80 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "f346a37a-3dd5-4a2b-a445-9a0fe47a9194-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:14 compute-0 nova_compute[192810]: 2025-09-30 21:52:14.620 2 DEBUG nova.compute.manager [req-70759e6d-196f-4062-ab9a-d7bf5fbfb62b req-662ecbe5-5a58-40c8-9e54-2d53786c1a80 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] No waiting events found dispatching network-vif-plugged-6f02d077-990b-41dc-87e5-d4dfb5b0397d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:52:14 compute-0 nova_compute[192810]: 2025-09-30 21:52:14.621 2 WARNING nova.compute.manager [req-70759e6d-196f-4062-ab9a-d7bf5fbfb62b req-662ecbe5-5a58-40c8-9e54-2d53786c1a80 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Received unexpected event network-vif-plugged-6f02d077-990b-41dc-87e5-d4dfb5b0397d for instance with vm_state active and task_state None.
Sep 30 21:52:14 compute-0 nova_compute[192810]: 2025-09-30 21:52:14.671 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Instance 06072d75-591c-4422-8b92-2176427d6b4d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:52:14 compute-0 nova_compute[192810]: 2025-09-30 21:52:14.672 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Instance f346a37a-3dd5-4a2b-a445-9a0fe47a9194 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:52:14 compute-0 nova_compute[192810]: 2025-09-30 21:52:14.672 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:52:14 compute-0 nova_compute[192810]: 2025-09-30 21:52:14.672 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:52:14 compute-0 nova_compute[192810]: 2025-09-30 21:52:14.744 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:52:14 compute-0 nova_compute[192810]: 2025-09-30 21:52:14.773 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:52:14 compute-0 nova_compute[192810]: 2025-09-30 21:52:14.834 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:52:14 compute-0 nova_compute[192810]: 2025-09-30 21:52:14.835 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.238s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:16 compute-0 nova_compute[192810]: 2025-09-30 21:52:16.095 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:16 compute-0 sshd-session[248561]: Failed password for root from 8.210.178.40 port 57618 ssh2
Sep 30 21:52:17 compute-0 nova_compute[192810]: 2025-09-30 21:52:17.831 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:52:17 compute-0 nova_compute[192810]: 2025-09-30 21:52:17.832 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:52:18 compute-0 unix_chkpwd[248781]: password check failed for user (root)
Sep 30 21:52:18 compute-0 nova_compute[192810]: 2025-09-30 21:52:18.172 2 DEBUG nova.compute.manager [req-98f83594-1278-4d74-82bb-c75612a68ba9 req-22710477-2a35-4bd7-8df6-a04bf19be283 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Received event network-changed-6f02d077-990b-41dc-87e5-d4dfb5b0397d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:52:18 compute-0 nova_compute[192810]: 2025-09-30 21:52:18.172 2 DEBUG nova.compute.manager [req-98f83594-1278-4d74-82bb-c75612a68ba9 req-22710477-2a35-4bd7-8df6-a04bf19be283 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Refreshing instance network info cache due to event network-changed-6f02d077-990b-41dc-87e5-d4dfb5b0397d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:52:18 compute-0 nova_compute[192810]: 2025-09-30 21:52:18.172 2 DEBUG oslo_concurrency.lockutils [req-98f83594-1278-4d74-82bb-c75612a68ba9 req-22710477-2a35-4bd7-8df6-a04bf19be283 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-f346a37a-3dd5-4a2b-a445-9a0fe47a9194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:52:18 compute-0 nova_compute[192810]: 2025-09-30 21:52:18.172 2 DEBUG oslo_concurrency.lockutils [req-98f83594-1278-4d74-82bb-c75612a68ba9 req-22710477-2a35-4bd7-8df6-a04bf19be283 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-f346a37a-3dd5-4a2b-a445-9a0fe47a9194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:52:18 compute-0 nova_compute[192810]: 2025-09-30 21:52:18.173 2 DEBUG nova.network.neutron [req-98f83594-1278-4d74-82bb-c75612a68ba9 req-22710477-2a35-4bd7-8df6-a04bf19be283 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Refreshing network info cache for port 6f02d077-990b-41dc-87e5-d4dfb5b0397d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:52:18 compute-0 nova_compute[192810]: 2025-09-30 21:52:18.569 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:19 compute-0 sshd-session[248561]: Failed password for root from 8.210.178.40 port 57618 ssh2
Sep 30 21:52:20 compute-0 unix_chkpwd[248782]: password check failed for user (root)
Sep 30 21:52:20 compute-0 nova_compute[192810]: 2025-09-30 21:52:20.241 2 DEBUG nova.network.neutron [req-98f83594-1278-4d74-82bb-c75612a68ba9 req-22710477-2a35-4bd7-8df6-a04bf19be283 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Updated VIF entry in instance network info cache for port 6f02d077-990b-41dc-87e5-d4dfb5b0397d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:52:20 compute-0 nova_compute[192810]: 2025-09-30 21:52:20.241 2 DEBUG nova.network.neutron [req-98f83594-1278-4d74-82bb-c75612a68ba9 req-22710477-2a35-4bd7-8df6-a04bf19be283 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Updating instance_info_cache with network_info: [{"id": "6f02d077-990b-41dc-87e5-d4dfb5b0397d", "address": "fa:16:3e:ee:93:c6", "network": {"id": "37abdced-e5ee-484f-bdda-8b1a7b19a7f0", "bridge": "br-int", "label": "tempest-TestShelveInstance-1740163448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f69d27778a54a45a6527a19643d1fe0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f02d077-99", "ovs_interfaceid": "6f02d077-990b-41dc-87e5-d4dfb5b0397d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:52:20 compute-0 nova_compute[192810]: 2025-09-30 21:52:20.323 2 DEBUG oslo_concurrency.lockutils [req-98f83594-1278-4d74-82bb-c75612a68ba9 req-22710477-2a35-4bd7-8df6-a04bf19be283 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-f346a37a-3dd5-4a2b-a445-9a0fe47a9194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:52:21 compute-0 nova_compute[192810]: 2025-09-30 21:52:21.098 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:21 compute-0 ovn_controller[94912]: 2025-09-30T21:52:21Z|00712|memory|INFO|peak resident set size grew 50% in last 3144.1 seconds, from 16256 kB to 24392 kB
Sep 30 21:52:21 compute-0 ovn_controller[94912]: 2025-09-30T21:52:21Z|00713|memory|INFO|idl-cells-OVN_Southbound:10840 idl-cells-Open_vSwitch:1041 if_status_mgr_ifaces_state_usage-KB:1 if_status_mgr_ifaces_usage-KB:1 lflow-cache-entries-cache-expr:383 lflow-cache-entries-cache-matches:288 lflow-cache-size-KB:1545 local_datapath_usage-KB:3 ofctrl_desired_flow_usage-KB:675 ofctrl_installed_flow_usage-KB:494 ofctrl_sb_flow_ref_usage-KB:253
Sep 30 21:52:21 compute-0 sshd-session[248561]: Failed password for root from 8.210.178.40 port 57618 ssh2
Sep 30 21:52:22 compute-0 unix_chkpwd[248783]: password check failed for user (root)
Sep 30 21:52:23 compute-0 nova_compute[192810]: 2025-09-30 21:52:23.571 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:24 compute-0 podman[248785]: 2025-09-30 21:52:24.327376943 +0000 UTC m=+0.055696834 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 21:52:24 compute-0 podman[248786]: 2025-09-30 21:52:24.340384276 +0000 UTC m=+0.065284522 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Sep 30 21:52:24 compute-0 podman[248784]: 2025-09-30 21:52:24.426007781 +0000 UTC m=+0.156298191 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:52:24 compute-0 sshd-session[248561]: Failed password for root from 8.210.178.40 port 57618 ssh2
Sep 30 21:52:26 compute-0 sshd-session[248561]: error: maximum authentication attempts exceeded for root from 8.210.178.40 port 57618 ssh2 [preauth]
Sep 30 21:52:26 compute-0 sshd-session[248561]: Disconnecting authenticating user root 8.210.178.40 port 57618: Too many authentication failures [preauth]
Sep 30 21:52:26 compute-0 sshd-session[248561]: PAM 5 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40  user=root
Sep 30 21:52:26 compute-0 sshd-session[248561]: PAM service(sshd) ignoring max retries; 6 > 3
Sep 30 21:52:26 compute-0 nova_compute[192810]: 2025-09-30 21:52:26.101 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:26 compute-0 ovn_controller[94912]: 2025-09-30T21:52:26Z|00079|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ee:93:c6 10.100.0.13
Sep 30 21:52:26 compute-0 ovn_controller[94912]: 2025-09-30T21:52:26Z|00080|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ee:93:c6 10.100.0.13
Sep 30 21:52:27 compute-0 unix_chkpwd[248870]: password check failed for user (root)
Sep 30 21:52:27 compute-0 sshd-session[248868]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40  user=root
Sep 30 21:52:28 compute-0 nova_compute[192810]: 2025-09-30 21:52:28.574 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:29 compute-0 sshd-session[248868]: Failed password for root from 8.210.178.40 port 58344 ssh2
Sep 30 21:52:29 compute-0 nova_compute[192810]: 2025-09-30 21:52:29.983 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:29.983 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=42, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=41) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:52:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:29.984 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:52:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:29.985 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '42'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:52:31 compute-0 nova_compute[192810]: 2025-09-30 21:52:31.102 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:31 compute-0 unix_chkpwd[248872]: password check failed for user (root)
Sep 30 21:52:33 compute-0 sshd-session[248868]: Failed password for root from 8.210.178.40 port 58344 ssh2
Sep 30 21:52:33 compute-0 nova_compute[192810]: 2025-09-30 21:52:33.576 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:33 compute-0 unix_chkpwd[248873]: password check failed for user (root)
Sep 30 21:52:34 compute-0 podman[248874]: 2025-09-30 21:52:34.333583417 +0000 UTC m=+0.072231625 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:52:34 compute-0 podman[248875]: 2025-09-30 21:52:34.339986536 +0000 UTC m=+0.073231530 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, release=1755695350, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, config_id=edpm, distribution-scope=public, maintainer=Red Hat, Inc.)
Sep 30 21:52:34 compute-0 nova_compute[192810]: 2025-09-30 21:52:34.666 2 DEBUG oslo_concurrency.lockutils [None req-d939b4e0-e8df-4f75-8a16-87ccce599258 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "06072d75-591c-4422-8b92-2176427d6b4d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:34 compute-0 nova_compute[192810]: 2025-09-30 21:52:34.666 2 DEBUG oslo_concurrency.lockutils [None req-d939b4e0-e8df-4f75-8a16-87ccce599258 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "06072d75-591c-4422-8b92-2176427d6b4d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:34 compute-0 nova_compute[192810]: 2025-09-30 21:52:34.666 2 DEBUG oslo_concurrency.lockutils [None req-d939b4e0-e8df-4f75-8a16-87ccce599258 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "06072d75-591c-4422-8b92-2176427d6b4d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:34 compute-0 nova_compute[192810]: 2025-09-30 21:52:34.667 2 DEBUG oslo_concurrency.lockutils [None req-d939b4e0-e8df-4f75-8a16-87ccce599258 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "06072d75-591c-4422-8b92-2176427d6b4d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:34 compute-0 nova_compute[192810]: 2025-09-30 21:52:34.667 2 DEBUG oslo_concurrency.lockutils [None req-d939b4e0-e8df-4f75-8a16-87ccce599258 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "06072d75-591c-4422-8b92-2176427d6b4d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:34 compute-0 nova_compute[192810]: 2025-09-30 21:52:34.701 2 INFO nova.compute.manager [None req-d939b4e0-e8df-4f75-8a16-87ccce599258 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Terminating instance
Sep 30 21:52:34 compute-0 nova_compute[192810]: 2025-09-30 21:52:34.731 2 DEBUG nova.compute.manager [None req-d939b4e0-e8df-4f75-8a16-87ccce599258 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:52:34 compute-0 kernel: tap25422545-2c (unregistering): left promiscuous mode
Sep 30 21:52:34 compute-0 NetworkManager[51733]: <info>  [1759269154.7630] device (tap25422545-2c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:52:34 compute-0 ovn_controller[94912]: 2025-09-30T21:52:34Z|00714|binding|INFO|Releasing lport 25422545-2c61-41ab-9982-ca7761a24544 from this chassis (sb_readonly=0)
Sep 30 21:52:34 compute-0 ovn_controller[94912]: 2025-09-30T21:52:34Z|00715|binding|INFO|Setting lport 25422545-2c61-41ab-9982-ca7761a24544 down in Southbound
Sep 30 21:52:34 compute-0 nova_compute[192810]: 2025-09-30 21:52:34.771 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:34 compute-0 ovn_controller[94912]: 2025-09-30T21:52:34Z|00716|binding|INFO|Removing iface tap25422545-2c ovn-installed in OVS
Sep 30 21:52:34 compute-0 nova_compute[192810]: 2025-09-30 21:52:34.774 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:34 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:34.783 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b0:c4:93 10.100.0.4'], port_security=['fa:16:3e:b0:c4:93 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '06072d75-591c-4422-8b92-2176427d6b4d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '15ea3623-c6c2-499c-8e00-9a1f5fdaf5c2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=79474bd3-ac2c-4f66-83f8-3a487e22d9d3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=25422545-2c61-41ab-9982-ca7761a24544) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:52:34 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:34.785 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 25422545-2c61-41ab-9982-ca7761a24544 in datapath 4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc unbound from our chassis
Sep 30 21:52:34 compute-0 nova_compute[192810]: 2025-09-30 21:52:34.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:34 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:34.787 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:52:34 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:34.788 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[8983141e-cb2d-4e4d-acc0-82339dc8fcb5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:34 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:34.788 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc namespace which is not needed anymore
Sep 30 21:52:34 compute-0 kernel: tap6c5de877-50 (unregistering): left promiscuous mode
Sep 30 21:52:34 compute-0 NetworkManager[51733]: <info>  [1759269154.8094] device (tap6c5de877-50): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:52:34 compute-0 nova_compute[192810]: 2025-09-30 21:52:34.818 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:34 compute-0 ovn_controller[94912]: 2025-09-30T21:52:34Z|00717|binding|INFO|Releasing lport 6c5de877-502e-47f1-b767-d9d472689330 from this chassis (sb_readonly=0)
Sep 30 21:52:34 compute-0 ovn_controller[94912]: 2025-09-30T21:52:34Z|00718|binding|INFO|Setting lport 6c5de877-502e-47f1-b767-d9d472689330 down in Southbound
Sep 30 21:52:34 compute-0 ovn_controller[94912]: 2025-09-30T21:52:34Z|00719|binding|INFO|Removing iface tap6c5de877-50 ovn-installed in OVS
Sep 30 21:52:34 compute-0 nova_compute[192810]: 2025-09-30 21:52:34.821 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:34 compute-0 nova_compute[192810]: 2025-09-30 21:52:34.831 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:34 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:34.845 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:31:ed:85 2001:db8::f816:3eff:fe31:ed85'], port_security=['fa:16:3e:31:ed:85 2001:db8::f816:3eff:fe31:ed85'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe31:ed85/64', 'neutron:device_id': '06072d75-591c-4422-8b92-2176427d6b4d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cd6da069-7a88-49b7-bea7-1ceb7132f614', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '15ea3623-c6c2-499c-8e00-9a1f5fdaf5c2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5ebdd42c-51b5-4b83-8a22-52988a595a24, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=6c5de877-502e-47f1-b767-d9d472689330) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:52:34 compute-0 nova_compute[192810]: 2025-09-30 21:52:34.864 2 DEBUG oslo_concurrency.lockutils [None req-d65e2564-1053-4e31-bbb2-c70becf1e421 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Acquiring lock "f346a37a-3dd5-4a2b-a445-9a0fe47a9194" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:34 compute-0 nova_compute[192810]: 2025-09-30 21:52:34.865 2 DEBUG oslo_concurrency.lockutils [None req-d65e2564-1053-4e31-bbb2-c70becf1e421 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Lock "f346a37a-3dd5-4a2b-a445-9a0fe47a9194" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:34 compute-0 nova_compute[192810]: 2025-09-30 21:52:34.865 2 INFO nova.compute.manager [None req-d65e2564-1053-4e31-bbb2-c70becf1e421 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Shelving
Sep 30 21:52:34 compute-0 systemd[1]: machine-qemu\x2d83\x2dinstance\x2d000000ad.scope: Deactivated successfully.
Sep 30 21:52:34 compute-0 systemd[1]: machine-qemu\x2d83\x2dinstance\x2d000000ad.scope: Consumed 14.928s CPU time.
Sep 30 21:52:34 compute-0 systemd-machined[152794]: Machine qemu-83-instance-000000ad terminated.
Sep 30 21:52:34 compute-0 nova_compute[192810]: 2025-09-30 21:52:34.961 2 DEBUG nova.virt.libvirt.driver [None req-d65e2564-1053-4e31-bbb2-c70becf1e421 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Sep 30 21:52:34 compute-0 neutron-haproxy-ovnmeta-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc[248029]: [NOTICE]   (248035) : haproxy version is 2.8.14-c23fe91
Sep 30 21:52:34 compute-0 neutron-haproxy-ovnmeta-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc[248029]: [NOTICE]   (248035) : path to executable is /usr/sbin/haproxy
Sep 30 21:52:34 compute-0 neutron-haproxy-ovnmeta-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc[248029]: [WARNING]  (248035) : Exiting Master process...
Sep 30 21:52:34 compute-0 neutron-haproxy-ovnmeta-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc[248029]: [ALERT]    (248035) : Current worker (248046) exited with code 143 (Terminated)
Sep 30 21:52:34 compute-0 neutron-haproxy-ovnmeta-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc[248029]: [WARNING]  (248035) : All workers exited. Exiting... (0)
Sep 30 21:52:34 compute-0 systemd[1]: libpod-def1e7b32e2eeaa6858be2b9f530e220493f44651d18b50ebf15367f4eb03aec.scope: Deactivated successfully.
Sep 30 21:52:34 compute-0 NetworkManager[51733]: <info>  [1759269154.9698] manager: (tap6c5de877-50): new Tun device (/org/freedesktop/NetworkManager/Devices/315)
Sep 30 21:52:34 compute-0 podman[248945]: 2025-09-30 21:52:34.973747601 +0000 UTC m=+0.089138694 container died def1e7b32e2eeaa6858be2b9f530e220493f44651d18b50ebf15367f4eb03aec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 21:52:35 compute-0 nova_compute[192810]: 2025-09-30 21:52:35.022 2 INFO nova.virt.libvirt.driver [-] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Instance destroyed successfully.
Sep 30 21:52:35 compute-0 nova_compute[192810]: 2025-09-30 21:52:35.023 2 DEBUG nova.objects.instance [None req-d939b4e0-e8df-4f75-8a16-87ccce599258 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lazy-loading 'resources' on Instance uuid 06072d75-591c-4422-8b92-2176427d6b4d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:52:35 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-def1e7b32e2eeaa6858be2b9f530e220493f44651d18b50ebf15367f4eb03aec-userdata-shm.mount: Deactivated successfully.
Sep 30 21:52:35 compute-0 nova_compute[192810]: 2025-09-30 21:52:35.062 2 DEBUG nova.virt.libvirt.vif [None req-d939b4e0-e8df-4f75-8a16-87ccce599258 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:51:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-26189983',display_name='tempest-TestGettingAddress-server-26189983',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-26189983',id=173,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC6IA1ZMHqMnx6vajVu3Rxu0mLHAgi9ZiEWJq0mK7b7+FNXbXiIUokQaeP2RlIYQG/rnW9lpPKmkg9fl2BJnF3yaf57+/J6ArqVMsWD0IV/NeNPPLmOErVJN8uCNukd8DA==',key_name='tempest-TestGettingAddress-1179853319',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:51:21Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-nr0jiinx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:51:21Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=06072d75-591c-4422-8b92-2176427d6b4d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "25422545-2c61-41ab-9982-ca7761a24544", "address": "fa:16:3e:b0:c4:93", "network": {"id": "4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc", "bridge": "br-int", "label": "tempest-network-smoke--1429782878", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.226", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25422545-2c", "ovs_interfaceid": "25422545-2c61-41ab-9982-ca7761a24544", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:52:35 compute-0 nova_compute[192810]: 2025-09-30 21:52:35.064 2 DEBUG nova.network.os_vif_util [None req-d939b4e0-e8df-4f75-8a16-87ccce599258 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "25422545-2c61-41ab-9982-ca7761a24544", "address": "fa:16:3e:b0:c4:93", "network": {"id": "4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc", "bridge": "br-int", "label": "tempest-network-smoke--1429782878", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.226", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25422545-2c", "ovs_interfaceid": "25422545-2c61-41ab-9982-ca7761a24544", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:52:35 compute-0 systemd[1]: var-lib-containers-storage-overlay-c1c891a36d8d8426f4ee9b195cafff3117ea5eb85451e2bdfebf1901e9dcae64-merged.mount: Deactivated successfully.
Sep 30 21:52:35 compute-0 nova_compute[192810]: 2025-09-30 21:52:35.066 2 DEBUG nova.network.os_vif_util [None req-d939b4e0-e8df-4f75-8a16-87ccce599258 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b0:c4:93,bridge_name='br-int',has_traffic_filtering=True,id=25422545-2c61-41ab-9982-ca7761a24544,network=Network(4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25422545-2c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:52:35 compute-0 nova_compute[192810]: 2025-09-30 21:52:35.066 2 DEBUG os_vif [None req-d939b4e0-e8df-4f75-8a16-87ccce599258 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b0:c4:93,bridge_name='br-int',has_traffic_filtering=True,id=25422545-2c61-41ab-9982-ca7761a24544,network=Network(4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25422545-2c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:52:35 compute-0 nova_compute[192810]: 2025-09-30 21:52:35.071 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:35 compute-0 nova_compute[192810]: 2025-09-30 21:52:35.072 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap25422545-2c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:52:35 compute-0 nova_compute[192810]: 2025-09-30 21:52:35.075 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:35 compute-0 nova_compute[192810]: 2025-09-30 21:52:35.077 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:52:35 compute-0 nova_compute[192810]: 2025-09-30 21:52:35.081 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:35 compute-0 nova_compute[192810]: 2025-09-30 21:52:35.093 2 INFO os_vif [None req-d939b4e0-e8df-4f75-8a16-87ccce599258 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b0:c4:93,bridge_name='br-int',has_traffic_filtering=True,id=25422545-2c61-41ab-9982-ca7761a24544,network=Network(4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25422545-2c')
Sep 30 21:52:35 compute-0 nova_compute[192810]: 2025-09-30 21:52:35.095 2 DEBUG nova.virt.libvirt.vif [None req-d939b4e0-e8df-4f75-8a16-87ccce599258 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:51:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-26189983',display_name='tempest-TestGettingAddress-server-26189983',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-26189983',id=173,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC6IA1ZMHqMnx6vajVu3Rxu0mLHAgi9ZiEWJq0mK7b7+FNXbXiIUokQaeP2RlIYQG/rnW9lpPKmkg9fl2BJnF3yaf57+/J6ArqVMsWD0IV/NeNPPLmOErVJN8uCNukd8DA==',key_name='tempest-TestGettingAddress-1179853319',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:51:21Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-nr0jiinx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:51:21Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=06072d75-591c-4422-8b92-2176427d6b4d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6c5de877-502e-47f1-b767-d9d472689330", "address": "fa:16:3e:31:ed:85", "network": {"id": "cd6da069-7a88-49b7-bea7-1ceb7132f614", "bridge": "br-int", "label": "tempest-network-smoke--1719254685", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe31:ed85", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c5de877-50", "ovs_interfaceid": "6c5de877-502e-47f1-b767-d9d472689330", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:52:35 compute-0 nova_compute[192810]: 2025-09-30 21:52:35.095 2 DEBUG nova.network.os_vif_util [None req-d939b4e0-e8df-4f75-8a16-87ccce599258 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "6c5de877-502e-47f1-b767-d9d472689330", "address": "fa:16:3e:31:ed:85", "network": {"id": "cd6da069-7a88-49b7-bea7-1ceb7132f614", "bridge": "br-int", "label": "tempest-network-smoke--1719254685", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe31:ed85", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c5de877-50", "ovs_interfaceid": "6c5de877-502e-47f1-b767-d9d472689330", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:52:35 compute-0 nova_compute[192810]: 2025-09-30 21:52:35.097 2 DEBUG nova.network.os_vif_util [None req-d939b4e0-e8df-4f75-8a16-87ccce599258 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:31:ed:85,bridge_name='br-int',has_traffic_filtering=True,id=6c5de877-502e-47f1-b767-d9d472689330,network=Network(cd6da069-7a88-49b7-bea7-1ceb7132f614),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c5de877-50') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:52:35 compute-0 nova_compute[192810]: 2025-09-30 21:52:35.097 2 DEBUG os_vif [None req-d939b4e0-e8df-4f75-8a16-87ccce599258 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:31:ed:85,bridge_name='br-int',has_traffic_filtering=True,id=6c5de877-502e-47f1-b767-d9d472689330,network=Network(cd6da069-7a88-49b7-bea7-1ceb7132f614),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c5de877-50') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:52:35 compute-0 nova_compute[192810]: 2025-09-30 21:52:35.100 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:35 compute-0 nova_compute[192810]: 2025-09-30 21:52:35.100 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6c5de877-50, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:52:35 compute-0 nova_compute[192810]: 2025-09-30 21:52:35.102 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:35 compute-0 nova_compute[192810]: 2025-09-30 21:52:35.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:35 compute-0 nova_compute[192810]: 2025-09-30 21:52:35.108 2 INFO os_vif [None req-d939b4e0-e8df-4f75-8a16-87ccce599258 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:31:ed:85,bridge_name='br-int',has_traffic_filtering=True,id=6c5de877-502e-47f1-b767-d9d472689330,network=Network(cd6da069-7a88-49b7-bea7-1ceb7132f614),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c5de877-50')
Sep 30 21:52:35 compute-0 nova_compute[192810]: 2025-09-30 21:52:35.109 2 INFO nova.virt.libvirt.driver [None req-d939b4e0-e8df-4f75-8a16-87ccce599258 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Deleting instance files /var/lib/nova/instances/06072d75-591c-4422-8b92-2176427d6b4d_del
Sep 30 21:52:35 compute-0 nova_compute[192810]: 2025-09-30 21:52:35.110 2 INFO nova.virt.libvirt.driver [None req-d939b4e0-e8df-4f75-8a16-87ccce599258 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Deletion of /var/lib/nova/instances/06072d75-591c-4422-8b92-2176427d6b4d_del complete
Sep 30 21:52:35 compute-0 podman[248945]: 2025-09-30 21:52:35.148815608 +0000 UTC m=+0.264206691 container cleanup def1e7b32e2eeaa6858be2b9f530e220493f44651d18b50ebf15367f4eb03aec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:52:35 compute-0 systemd[1]: libpod-conmon-def1e7b32e2eeaa6858be2b9f530e220493f44651d18b50ebf15367f4eb03aec.scope: Deactivated successfully.
Sep 30 21:52:35 compute-0 nova_compute[192810]: 2025-09-30 21:52:35.224 2 INFO nova.compute.manager [None req-d939b4e0-e8df-4f75-8a16-87ccce599258 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Took 0.49 seconds to destroy the instance on the hypervisor.
Sep 30 21:52:35 compute-0 nova_compute[192810]: 2025-09-30 21:52:35.225 2 DEBUG oslo.service.loopingcall [None req-d939b4e0-e8df-4f75-8a16-87ccce599258 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:52:35 compute-0 nova_compute[192810]: 2025-09-30 21:52:35.225 2 DEBUG nova.compute.manager [-] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:52:35 compute-0 nova_compute[192810]: 2025-09-30 21:52:35.225 2 DEBUG nova.network.neutron [-] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:52:35 compute-0 podman[249006]: 2025-09-30 21:52:35.240481144 +0000 UTC m=+0.070524182 container remove def1e7b32e2eeaa6858be2b9f530e220493f44651d18b50ebf15367f4eb03aec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:52:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:35.246 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[b9c8524f-511f-4c82-9e17-1a8945c36eb2]: (4, ('Tue Sep 30 09:52:34 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc (def1e7b32e2eeaa6858be2b9f530e220493f44651d18b50ebf15367f4eb03aec)\ndef1e7b32e2eeaa6858be2b9f530e220493f44651d18b50ebf15367f4eb03aec\nTue Sep 30 09:52:35 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc (def1e7b32e2eeaa6858be2b9f530e220493f44651d18b50ebf15367f4eb03aec)\ndef1e7b32e2eeaa6858be2b9f530e220493f44651d18b50ebf15367f4eb03aec\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:35.248 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[dbabee10-2d6d-4f34-b125-8e180b1680ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:35.249 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4c85d4d4-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:52:35 compute-0 nova_compute[192810]: 2025-09-30 21:52:35.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:35 compute-0 kernel: tap4c85d4d4-30: left promiscuous mode
Sep 30 21:52:35 compute-0 nova_compute[192810]: 2025-09-30 21:52:35.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:35.258 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[fd095f84-8ef1-4af2-9e3a-80bb6e8349b0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:35 compute-0 nova_compute[192810]: 2025-09-30 21:52:35.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:35.286 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[b11c1bbc-102a-40b9-9341-18928de1c567]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:35.287 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[3b8badd7-72b1-4ce7-aa56-9e195f18b5c9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:35.306 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[43c78235-c404-4320-a622-3a87279a8ab9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 577655, 'reachable_time': 22855, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249021, 'error': None, 'target': 'ovnmeta-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:35 compute-0 systemd[1]: run-netns-ovnmeta\x2d4c85d4d4\x2d3eb7\x2d42ce\x2dbfd6\x2d6de03fb781cc.mount: Deactivated successfully.
Sep 30 21:52:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:35.311 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:52:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:35.311 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[7f5b0b0c-c991-40d6-ae35-03db706eb91c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:35.311 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 6c5de877-502e-47f1-b767-d9d472689330 in datapath cd6da069-7a88-49b7-bea7-1ceb7132f614 unbound from our chassis
Sep 30 21:52:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:35.313 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cd6da069-7a88-49b7-bea7-1ceb7132f614, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:52:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:35.313 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e4078a46-29cf-4c9d-be48-fe459d675cce]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:35.314 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-cd6da069-7a88-49b7-bea7-1ceb7132f614 namespace which is not needed anymore
Sep 30 21:52:35 compute-0 nova_compute[192810]: 2025-09-30 21:52:35.371 2 DEBUG nova.compute.manager [req-bbc7cb3f-bb93-4fed-a3d7-403bfccec58c req-b0351b63-f0fa-4c83-90b4-ff03312f5d86 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Received event network-changed-25422545-2c61-41ab-9982-ca7761a24544 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:52:35 compute-0 nova_compute[192810]: 2025-09-30 21:52:35.372 2 DEBUG nova.compute.manager [req-bbc7cb3f-bb93-4fed-a3d7-403bfccec58c req-b0351b63-f0fa-4c83-90b4-ff03312f5d86 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Refreshing instance network info cache due to event network-changed-25422545-2c61-41ab-9982-ca7761a24544. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:52:35 compute-0 nova_compute[192810]: 2025-09-30 21:52:35.372 2 DEBUG oslo_concurrency.lockutils [req-bbc7cb3f-bb93-4fed-a3d7-403bfccec58c req-b0351b63-f0fa-4c83-90b4-ff03312f5d86 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-06072d75-591c-4422-8b92-2176427d6b4d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:52:35 compute-0 nova_compute[192810]: 2025-09-30 21:52:35.372 2 DEBUG oslo_concurrency.lockutils [req-bbc7cb3f-bb93-4fed-a3d7-403bfccec58c req-b0351b63-f0fa-4c83-90b4-ff03312f5d86 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-06072d75-591c-4422-8b92-2176427d6b4d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:52:35 compute-0 nova_compute[192810]: 2025-09-30 21:52:35.372 2 DEBUG nova.network.neutron [req-bbc7cb3f-bb93-4fed-a3d7-403bfccec58c req-b0351b63-f0fa-4c83-90b4-ff03312f5d86 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Refreshing network info cache for port 25422545-2c61-41ab-9982-ca7761a24544 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:52:35 compute-0 neutron-haproxy-ovnmeta-cd6da069-7a88-49b7-bea7-1ceb7132f614[248132]: [NOTICE]   (248182) : haproxy version is 2.8.14-c23fe91
Sep 30 21:52:35 compute-0 neutron-haproxy-ovnmeta-cd6da069-7a88-49b7-bea7-1ceb7132f614[248132]: [NOTICE]   (248182) : path to executable is /usr/sbin/haproxy
Sep 30 21:52:35 compute-0 neutron-haproxy-ovnmeta-cd6da069-7a88-49b7-bea7-1ceb7132f614[248132]: [WARNING]  (248182) : Exiting Master process...
Sep 30 21:52:35 compute-0 neutron-haproxy-ovnmeta-cd6da069-7a88-49b7-bea7-1ceb7132f614[248132]: [WARNING]  (248182) : Exiting Master process...
Sep 30 21:52:35 compute-0 neutron-haproxy-ovnmeta-cd6da069-7a88-49b7-bea7-1ceb7132f614[248132]: [ALERT]    (248182) : Current worker (248188) exited with code 143 (Terminated)
Sep 30 21:52:35 compute-0 neutron-haproxy-ovnmeta-cd6da069-7a88-49b7-bea7-1ceb7132f614[248132]: [WARNING]  (248182) : All workers exited. Exiting... (0)
Sep 30 21:52:35 compute-0 systemd[1]: libpod-7b8f6316b2eba5925630ec40fa31649f0743df55257f9b3759f700922ebcb129.scope: Deactivated successfully.
Sep 30 21:52:35 compute-0 podman[249040]: 2025-09-30 21:52:35.462111225 +0000 UTC m=+0.067731432 container died 7b8f6316b2eba5925630ec40fa31649f0743df55257f9b3759f700922ebcb129 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd6da069-7a88-49b7-bea7-1ceb7132f614, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:52:35 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7b8f6316b2eba5925630ec40fa31649f0743df55257f9b3759f700922ebcb129-userdata-shm.mount: Deactivated successfully.
Sep 30 21:52:35 compute-0 systemd[1]: var-lib-containers-storage-overlay-fc082b0ee0d5f1400b9c26adbe7f9a06bf5ff4af2ec5d8e7c3687cd157a57538-merged.mount: Deactivated successfully.
Sep 30 21:52:35 compute-0 podman[249040]: 2025-09-30 21:52:35.613670918 +0000 UTC m=+0.219291115 container cleanup 7b8f6316b2eba5925630ec40fa31649f0743df55257f9b3759f700922ebcb129 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd6da069-7a88-49b7-bea7-1ceb7132f614, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:52:35 compute-0 systemd[1]: libpod-conmon-7b8f6316b2eba5925630ec40fa31649f0743df55257f9b3759f700922ebcb129.scope: Deactivated successfully.
Sep 30 21:52:35 compute-0 podman[249070]: 2025-09-30 21:52:35.685154463 +0000 UTC m=+0.049275574 container remove 7b8f6316b2eba5925630ec40fa31649f0743df55257f9b3759f700922ebcb129 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-cd6da069-7a88-49b7-bea7-1ceb7132f614, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 21:52:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:35.691 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[40fba5b9-2f1e-405a-bc99-1c742292f37b]: (4, ('Tue Sep 30 09:52:35 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-cd6da069-7a88-49b7-bea7-1ceb7132f614 (7b8f6316b2eba5925630ec40fa31649f0743df55257f9b3759f700922ebcb129)\n7b8f6316b2eba5925630ec40fa31649f0743df55257f9b3759f700922ebcb129\nTue Sep 30 09:52:35 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-cd6da069-7a88-49b7-bea7-1ceb7132f614 (7b8f6316b2eba5925630ec40fa31649f0743df55257f9b3759f700922ebcb129)\n7b8f6316b2eba5925630ec40fa31649f0743df55257f9b3759f700922ebcb129\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:35.693 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[5a3786d1-d1a6-42db-b7d7-50cefc4e6f9f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:35.695 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcd6da069-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:52:35 compute-0 kernel: tapcd6da069-70: left promiscuous mode
Sep 30 21:52:35 compute-0 nova_compute[192810]: 2025-09-30 21:52:35.697 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:35 compute-0 nova_compute[192810]: 2025-09-30 21:52:35.708 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:35 compute-0 nova_compute[192810]: 2025-09-30 21:52:35.710 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:35.712 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[3ca00d28-be3f-4973-a55e-29f1d411da7f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:35.744 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[f131d5ce-5f83-42fb-a663-257ba7b1355f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:35.746 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a989a7df-517a-4fbe-9e5f-6c6ee724893f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:35.765 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[6101e970-01f5-49dd-83f7-f26f2501808f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 577747, 'reachable_time': 23056, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249085, 'error': None, 'target': 'ovnmeta-cd6da069-7a88-49b7-bea7-1ceb7132f614', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:35.769 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-cd6da069-7a88-49b7-bea7-1ceb7132f614 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:52:35 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:35.769 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[9cca32b5-eb70-495a-a9db-2c240ca11700]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:36 compute-0 systemd[1]: run-netns-ovnmeta\x2dcd6da069\x2d7a88\x2d49b7\x2dbea7\x2d1ceb7132f614.mount: Deactivated successfully.
Sep 30 21:52:36 compute-0 sshd-session[248868]: Failed password for root from 8.210.178.40 port 58344 ssh2
Sep 30 21:52:36 compute-0 nova_compute[192810]: 2025-09-30 21:52:36.359 2 DEBUG nova.compute.manager [req-f5e06997-eb70-45fb-8402-afe230714b6e req-5f3b8ffd-37a1-4ec5-9c30-8178f90025fa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Received event network-vif-unplugged-25422545-2c61-41ab-9982-ca7761a24544 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:52:36 compute-0 nova_compute[192810]: 2025-09-30 21:52:36.360 2 DEBUG oslo_concurrency.lockutils [req-f5e06997-eb70-45fb-8402-afe230714b6e req-5f3b8ffd-37a1-4ec5-9c30-8178f90025fa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "06072d75-591c-4422-8b92-2176427d6b4d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:36 compute-0 nova_compute[192810]: 2025-09-30 21:52:36.361 2 DEBUG oslo_concurrency.lockutils [req-f5e06997-eb70-45fb-8402-afe230714b6e req-5f3b8ffd-37a1-4ec5-9c30-8178f90025fa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "06072d75-591c-4422-8b92-2176427d6b4d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:36 compute-0 nova_compute[192810]: 2025-09-30 21:52:36.361 2 DEBUG oslo_concurrency.lockutils [req-f5e06997-eb70-45fb-8402-afe230714b6e req-5f3b8ffd-37a1-4ec5-9c30-8178f90025fa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "06072d75-591c-4422-8b92-2176427d6b4d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:36 compute-0 nova_compute[192810]: 2025-09-30 21:52:36.361 2 DEBUG nova.compute.manager [req-f5e06997-eb70-45fb-8402-afe230714b6e req-5f3b8ffd-37a1-4ec5-9c30-8178f90025fa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] No waiting events found dispatching network-vif-unplugged-25422545-2c61-41ab-9982-ca7761a24544 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:52:36 compute-0 nova_compute[192810]: 2025-09-30 21:52:36.362 2 DEBUG nova.compute.manager [req-f5e06997-eb70-45fb-8402-afe230714b6e req-5f3b8ffd-37a1-4ec5-9c30-8178f90025fa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Received event network-vif-unplugged-25422545-2c61-41ab-9982-ca7761a24544 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:52:36 compute-0 nova_compute[192810]: 2025-09-30 21:52:36.362 2 DEBUG nova.compute.manager [req-f5e06997-eb70-45fb-8402-afe230714b6e req-5f3b8ffd-37a1-4ec5-9c30-8178f90025fa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Received event network-vif-plugged-25422545-2c61-41ab-9982-ca7761a24544 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:52:36 compute-0 nova_compute[192810]: 2025-09-30 21:52:36.362 2 DEBUG oslo_concurrency.lockutils [req-f5e06997-eb70-45fb-8402-afe230714b6e req-5f3b8ffd-37a1-4ec5-9c30-8178f90025fa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "06072d75-591c-4422-8b92-2176427d6b4d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:36 compute-0 nova_compute[192810]: 2025-09-30 21:52:36.363 2 DEBUG oslo_concurrency.lockutils [req-f5e06997-eb70-45fb-8402-afe230714b6e req-5f3b8ffd-37a1-4ec5-9c30-8178f90025fa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "06072d75-591c-4422-8b92-2176427d6b4d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:36 compute-0 nova_compute[192810]: 2025-09-30 21:52:36.363 2 DEBUG oslo_concurrency.lockutils [req-f5e06997-eb70-45fb-8402-afe230714b6e req-5f3b8ffd-37a1-4ec5-9c30-8178f90025fa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "06072d75-591c-4422-8b92-2176427d6b4d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:36 compute-0 nova_compute[192810]: 2025-09-30 21:52:36.363 2 DEBUG nova.compute.manager [req-f5e06997-eb70-45fb-8402-afe230714b6e req-5f3b8ffd-37a1-4ec5-9c30-8178f90025fa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] No waiting events found dispatching network-vif-plugged-25422545-2c61-41ab-9982-ca7761a24544 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:52:36 compute-0 nova_compute[192810]: 2025-09-30 21:52:36.364 2 WARNING nova.compute.manager [req-f5e06997-eb70-45fb-8402-afe230714b6e req-5f3b8ffd-37a1-4ec5-9c30-8178f90025fa dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Received unexpected event network-vif-plugged-25422545-2c61-41ab-9982-ca7761a24544 for instance with vm_state active and task_state deleting.
Sep 30 21:52:37 compute-0 nova_compute[192810]: 2025-09-30 21:52:37.017 2 DEBUG nova.network.neutron [-] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:52:37 compute-0 nova_compute[192810]: 2025-09-30 21:52:37.036 2 INFO nova.compute.manager [-] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Took 1.81 seconds to deallocate network for instance.
Sep 30 21:52:37 compute-0 nova_compute[192810]: 2025-09-30 21:52:37.137 2 DEBUG oslo_concurrency.lockutils [None req-d939b4e0-e8df-4f75-8a16-87ccce599258 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:37 compute-0 nova_compute[192810]: 2025-09-30 21:52:37.138 2 DEBUG oslo_concurrency.lockutils [None req-d939b4e0-e8df-4f75-8a16-87ccce599258 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:37 compute-0 nova_compute[192810]: 2025-09-30 21:52:37.200 2 DEBUG nova.compute.provider_tree [None req-d939b4e0-e8df-4f75-8a16-87ccce599258 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:52:37 compute-0 nova_compute[192810]: 2025-09-30 21:52:37.202 2 DEBUG nova.network.neutron [req-bbc7cb3f-bb93-4fed-a3d7-403bfccec58c req-b0351b63-f0fa-4c83-90b4-ff03312f5d86 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Updated VIF entry in instance network info cache for port 25422545-2c61-41ab-9982-ca7761a24544. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:52:37 compute-0 nova_compute[192810]: 2025-09-30 21:52:37.203 2 DEBUG nova.network.neutron [req-bbc7cb3f-bb93-4fed-a3d7-403bfccec58c req-b0351b63-f0fa-4c83-90b4-ff03312f5d86 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Updating instance_info_cache with network_info: [{"id": "25422545-2c61-41ab-9982-ca7761a24544", "address": "fa:16:3e:b0:c4:93", "network": {"id": "4c85d4d4-3eb7-42ce-bfd6-6de03fb781cc", "bridge": "br-int", "label": "tempest-network-smoke--1429782878", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25422545-2c", "ovs_interfaceid": "25422545-2c61-41ab-9982-ca7761a24544", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6c5de877-502e-47f1-b767-d9d472689330", "address": "fa:16:3e:31:ed:85", "network": {"id": "cd6da069-7a88-49b7-bea7-1ceb7132f614", "bridge": "br-int", "label": "tempest-network-smoke--1719254685", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe31:ed85", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c5de877-50", "ovs_interfaceid": "6c5de877-502e-47f1-b767-d9d472689330", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:52:37 compute-0 nova_compute[192810]: 2025-09-30 21:52:37.220 2 DEBUG nova.scheduler.client.report [None req-d939b4e0-e8df-4f75-8a16-87ccce599258 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:52:37 compute-0 nova_compute[192810]: 2025-09-30 21:52:37.224 2 DEBUG oslo_concurrency.lockutils [req-bbc7cb3f-bb93-4fed-a3d7-403bfccec58c req-b0351b63-f0fa-4c83-90b4-ff03312f5d86 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-06072d75-591c-4422-8b92-2176427d6b4d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:52:37 compute-0 nova_compute[192810]: 2025-09-30 21:52:37.224 2 DEBUG nova.compute.manager [req-bbc7cb3f-bb93-4fed-a3d7-403bfccec58c req-b0351b63-f0fa-4c83-90b4-ff03312f5d86 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Received event network-vif-unplugged-6c5de877-502e-47f1-b767-d9d472689330 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:52:37 compute-0 nova_compute[192810]: 2025-09-30 21:52:37.224 2 DEBUG oslo_concurrency.lockutils [req-bbc7cb3f-bb93-4fed-a3d7-403bfccec58c req-b0351b63-f0fa-4c83-90b4-ff03312f5d86 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "06072d75-591c-4422-8b92-2176427d6b4d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:37 compute-0 nova_compute[192810]: 2025-09-30 21:52:37.225 2 DEBUG oslo_concurrency.lockutils [req-bbc7cb3f-bb93-4fed-a3d7-403bfccec58c req-b0351b63-f0fa-4c83-90b4-ff03312f5d86 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "06072d75-591c-4422-8b92-2176427d6b4d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:37 compute-0 nova_compute[192810]: 2025-09-30 21:52:37.225 2 DEBUG oslo_concurrency.lockutils [req-bbc7cb3f-bb93-4fed-a3d7-403bfccec58c req-b0351b63-f0fa-4c83-90b4-ff03312f5d86 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "06072d75-591c-4422-8b92-2176427d6b4d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:37 compute-0 nova_compute[192810]: 2025-09-30 21:52:37.225 2 DEBUG nova.compute.manager [req-bbc7cb3f-bb93-4fed-a3d7-403bfccec58c req-b0351b63-f0fa-4c83-90b4-ff03312f5d86 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] No waiting events found dispatching network-vif-unplugged-6c5de877-502e-47f1-b767-d9d472689330 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:52:37 compute-0 nova_compute[192810]: 2025-09-30 21:52:37.225 2 DEBUG nova.compute.manager [req-bbc7cb3f-bb93-4fed-a3d7-403bfccec58c req-b0351b63-f0fa-4c83-90b4-ff03312f5d86 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Received event network-vif-unplugged-6c5de877-502e-47f1-b767-d9d472689330 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:52:37 compute-0 nova_compute[192810]: 2025-09-30 21:52:37.241 2 DEBUG oslo_concurrency.lockutils [None req-d939b4e0-e8df-4f75-8a16-87ccce599258 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:37 compute-0 kernel: tap6f02d077-99 (unregistering): left promiscuous mode
Sep 30 21:52:37 compute-0 NetworkManager[51733]: <info>  [1759269157.2576] device (tap6f02d077-99): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:52:37 compute-0 ovn_controller[94912]: 2025-09-30T21:52:37Z|00720|binding|INFO|Releasing lport 6f02d077-990b-41dc-87e5-d4dfb5b0397d from this chassis (sb_readonly=0)
Sep 30 21:52:37 compute-0 nova_compute[192810]: 2025-09-30 21:52:37.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:37 compute-0 ovn_controller[94912]: 2025-09-30T21:52:37Z|00721|binding|INFO|Setting lport 6f02d077-990b-41dc-87e5-d4dfb5b0397d down in Southbound
Sep 30 21:52:37 compute-0 ovn_controller[94912]: 2025-09-30T21:52:37Z|00722|binding|INFO|Removing iface tap6f02d077-99 ovn-installed in OVS
Sep 30 21:52:37 compute-0 nova_compute[192810]: 2025-09-30 21:52:37.265 2 INFO nova.scheduler.client.report [None req-d939b4e0-e8df-4f75-8a16-87ccce599258 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Deleted allocations for instance 06072d75-591c-4422-8b92-2176427d6b4d
Sep 30 21:52:37 compute-0 nova_compute[192810]: 2025-09-30 21:52:37.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:37.272 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ee:93:c6 10.100.0.13'], port_security=['fa:16:3e:ee:93:c6 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'f346a37a-3dd5-4a2b-a445-9a0fe47a9194', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-37abdced-e5ee-484f-bdda-8b1a7b19a7f0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f69d27778a54a45a6527a19643d1fe0', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a1f2b095-bc71-4c21-8e62-a47f9c684104', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.217'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cee6e55-2dcb-411d-b86b-9118cc3f4c81, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=6f02d077-990b-41dc-87e5-d4dfb5b0397d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:52:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:37.275 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 6f02d077-990b-41dc-87e5-d4dfb5b0397d in datapath 37abdced-e5ee-484f-bdda-8b1a7b19a7f0 unbound from our chassis
Sep 30 21:52:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:37.278 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 37abdced-e5ee-484f-bdda-8b1a7b19a7f0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:52:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:37.279 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[bfae0e41-a325-4f49-b5b7-b7bdd07e39a5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:37.280 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-37abdced-e5ee-484f-bdda-8b1a7b19a7f0 namespace which is not needed anymore
Sep 30 21:52:37 compute-0 nova_compute[192810]: 2025-09-30 21:52:37.282 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:37 compute-0 systemd[1]: machine-qemu\x2d86\x2dinstance\x2d000000b1.scope: Deactivated successfully.
Sep 30 21:52:37 compute-0 systemd[1]: machine-qemu\x2d86\x2dinstance\x2d000000b1.scope: Consumed 14.332s CPU time.
Sep 30 21:52:37 compute-0 systemd-machined[152794]: Machine qemu-86-instance-000000b1 terminated.
Sep 30 21:52:37 compute-0 nova_compute[192810]: 2025-09-30 21:52:37.359 2 DEBUG oslo_concurrency.lockutils [None req-d939b4e0-e8df-4f75-8a16-87ccce599258 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "06072d75-591c-4422-8b92-2176427d6b4d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.693s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:37 compute-0 neutron-haproxy-ovnmeta-37abdced-e5ee-484f-bdda-8b1a7b19a7f0[248745]: [NOTICE]   (248755) : haproxy version is 2.8.14-c23fe91
Sep 30 21:52:37 compute-0 neutron-haproxy-ovnmeta-37abdced-e5ee-484f-bdda-8b1a7b19a7f0[248745]: [NOTICE]   (248755) : path to executable is /usr/sbin/haproxy
Sep 30 21:52:37 compute-0 neutron-haproxy-ovnmeta-37abdced-e5ee-484f-bdda-8b1a7b19a7f0[248745]: [WARNING]  (248755) : Exiting Master process...
Sep 30 21:52:37 compute-0 neutron-haproxy-ovnmeta-37abdced-e5ee-484f-bdda-8b1a7b19a7f0[248745]: [WARNING]  (248755) : Exiting Master process...
Sep 30 21:52:37 compute-0 neutron-haproxy-ovnmeta-37abdced-e5ee-484f-bdda-8b1a7b19a7f0[248745]: [ALERT]    (248755) : Current worker (248757) exited with code 143 (Terminated)
Sep 30 21:52:37 compute-0 neutron-haproxy-ovnmeta-37abdced-e5ee-484f-bdda-8b1a7b19a7f0[248745]: [WARNING]  (248755) : All workers exited. Exiting... (0)
Sep 30 21:52:37 compute-0 systemd[1]: libpod-96d193df5d88bfe317057446892ad94c20c640f764995f3c187978c6094973ed.scope: Deactivated successfully.
Sep 30 21:52:37 compute-0 nova_compute[192810]: 2025-09-30 21:52:37.480 2 DEBUG nova.compute.manager [req-4800c87d-dd58-4d35-b55c-ca048e0ec5df req-c4d557eb-27c1-45e4-a888-098c071d84ef dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Received event network-vif-unplugged-6f02d077-990b-41dc-87e5-d4dfb5b0397d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:52:37 compute-0 nova_compute[192810]: 2025-09-30 21:52:37.480 2 DEBUG oslo_concurrency.lockutils [req-4800c87d-dd58-4d35-b55c-ca048e0ec5df req-c4d557eb-27c1-45e4-a888-098c071d84ef dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "f346a37a-3dd5-4a2b-a445-9a0fe47a9194-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:37 compute-0 nova_compute[192810]: 2025-09-30 21:52:37.480 2 DEBUG oslo_concurrency.lockutils [req-4800c87d-dd58-4d35-b55c-ca048e0ec5df req-c4d557eb-27c1-45e4-a888-098c071d84ef dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "f346a37a-3dd5-4a2b-a445-9a0fe47a9194-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:37 compute-0 nova_compute[192810]: 2025-09-30 21:52:37.481 2 DEBUG oslo_concurrency.lockutils [req-4800c87d-dd58-4d35-b55c-ca048e0ec5df req-c4d557eb-27c1-45e4-a888-098c071d84ef dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "f346a37a-3dd5-4a2b-a445-9a0fe47a9194-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:37 compute-0 nova_compute[192810]: 2025-09-30 21:52:37.481 2 DEBUG nova.compute.manager [req-4800c87d-dd58-4d35-b55c-ca048e0ec5df req-c4d557eb-27c1-45e4-a888-098c071d84ef dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] No waiting events found dispatching network-vif-unplugged-6f02d077-990b-41dc-87e5-d4dfb5b0397d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:52:37 compute-0 nova_compute[192810]: 2025-09-30 21:52:37.481 2 WARNING nova.compute.manager [req-4800c87d-dd58-4d35-b55c-ca048e0ec5df req-c4d557eb-27c1-45e4-a888-098c071d84ef dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Received unexpected event network-vif-unplugged-6f02d077-990b-41dc-87e5-d4dfb5b0397d for instance with vm_state active and task_state shelving.
Sep 30 21:52:37 compute-0 podman[249108]: 2025-09-30 21:52:37.484114129 +0000 UTC m=+0.118285308 container died 96d193df5d88bfe317057446892ad94c20c640f764995f3c187978c6094973ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-37abdced-e5ee-484f-bdda-8b1a7b19a7f0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:52:37 compute-0 nova_compute[192810]: 2025-09-30 21:52:37.485 2 DEBUG nova.compute.manager [req-3d77251f-2b01-47b8-8a7a-e0b727175bd0 req-4b02e317-f682-4529-af21-63f61bfe3961 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Received event network-vif-plugged-6c5de877-502e-47f1-b767-d9d472689330 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:52:37 compute-0 nova_compute[192810]: 2025-09-30 21:52:37.485 2 DEBUG oslo_concurrency.lockutils [req-3d77251f-2b01-47b8-8a7a-e0b727175bd0 req-4b02e317-f682-4529-af21-63f61bfe3961 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "06072d75-591c-4422-8b92-2176427d6b4d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:37 compute-0 nova_compute[192810]: 2025-09-30 21:52:37.486 2 DEBUG oslo_concurrency.lockutils [req-3d77251f-2b01-47b8-8a7a-e0b727175bd0 req-4b02e317-f682-4529-af21-63f61bfe3961 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "06072d75-591c-4422-8b92-2176427d6b4d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:37 compute-0 nova_compute[192810]: 2025-09-30 21:52:37.486 2 DEBUG oslo_concurrency.lockutils [req-3d77251f-2b01-47b8-8a7a-e0b727175bd0 req-4b02e317-f682-4529-af21-63f61bfe3961 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "06072d75-591c-4422-8b92-2176427d6b4d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:37 compute-0 nova_compute[192810]: 2025-09-30 21:52:37.486 2 DEBUG nova.compute.manager [req-3d77251f-2b01-47b8-8a7a-e0b727175bd0 req-4b02e317-f682-4529-af21-63f61bfe3961 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] No waiting events found dispatching network-vif-plugged-6c5de877-502e-47f1-b767-d9d472689330 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:52:37 compute-0 nova_compute[192810]: 2025-09-30 21:52:37.486 2 WARNING nova.compute.manager [req-3d77251f-2b01-47b8-8a7a-e0b727175bd0 req-4b02e317-f682-4529-af21-63f61bfe3961 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Received unexpected event network-vif-plugged-6c5de877-502e-47f1-b767-d9d472689330 for instance with vm_state deleted and task_state None.
Sep 30 21:52:37 compute-0 nova_compute[192810]: 2025-09-30 21:52:37.486 2 DEBUG nova.compute.manager [req-3d77251f-2b01-47b8-8a7a-e0b727175bd0 req-4b02e317-f682-4529-af21-63f61bfe3961 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Received event network-vif-deleted-6c5de877-502e-47f1-b767-d9d472689330 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:52:37 compute-0 nova_compute[192810]: 2025-09-30 21:52:37.486 2 INFO nova.compute.manager [req-3d77251f-2b01-47b8-8a7a-e0b727175bd0 req-4b02e317-f682-4529-af21-63f61bfe3961 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Neutron deleted interface 6c5de877-502e-47f1-b767-d9d472689330; detaching it from the instance and deleting it from the info cache
Sep 30 21:52:37 compute-0 nova_compute[192810]: 2025-09-30 21:52:37.487 2 DEBUG nova.network.neutron [req-3d77251f-2b01-47b8-8a7a-e0b727175bd0 req-4b02e317-f682-4529-af21-63f61bfe3961 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106
Sep 30 21:52:37 compute-0 nova_compute[192810]: 2025-09-30 21:52:37.489 2 DEBUG nova.compute.manager [req-3d77251f-2b01-47b8-8a7a-e0b727175bd0 req-4b02e317-f682-4529-af21-63f61bfe3961 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Detach interface failed, port_id=6c5de877-502e-47f1-b767-d9d472689330, reason: Instance 06072d75-591c-4422-8b92-2176427d6b4d could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Sep 30 21:52:37 compute-0 nova_compute[192810]: 2025-09-30 21:52:37.489 2 DEBUG nova.compute.manager [req-3d77251f-2b01-47b8-8a7a-e0b727175bd0 req-4b02e317-f682-4529-af21-63f61bfe3961 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Received event network-vif-deleted-25422545-2c61-41ab-9982-ca7761a24544 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:52:37 compute-0 nova_compute[192810]: 2025-09-30 21:52:37.489 2 INFO nova.compute.manager [req-3d77251f-2b01-47b8-8a7a-e0b727175bd0 req-4b02e317-f682-4529-af21-63f61bfe3961 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Neutron deleted interface 25422545-2c61-41ab-9982-ca7761a24544; detaching it from the instance and deleting it from the info cache
Sep 30 21:52:37 compute-0 nova_compute[192810]: 2025-09-30 21:52:37.490 2 DEBUG nova.network.neutron [req-3d77251f-2b01-47b8-8a7a-e0b727175bd0 req-4b02e317-f682-4529-af21-63f61bfe3961 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106
Sep 30 21:52:37 compute-0 nova_compute[192810]: 2025-09-30 21:52:37.494 2 DEBUG nova.compute.manager [req-3d77251f-2b01-47b8-8a7a-e0b727175bd0 req-4b02e317-f682-4529-af21-63f61bfe3961 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Detach interface failed, port_id=25422545-2c61-41ab-9982-ca7761a24544, reason: Instance 06072d75-591c-4422-8b92-2176427d6b4d could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Sep 30 21:52:37 compute-0 NetworkManager[51733]: <info>  [1759269157.4999] manager: (tap6f02d077-99): new Tun device (/org/freedesktop/NetworkManager/Devices/316)
Sep 30 21:52:37 compute-0 nova_compute[192810]: 2025-09-30 21:52:37.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:37 compute-0 nova_compute[192810]: 2025-09-30 21:52:37.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:37 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-96d193df5d88bfe317057446892ad94c20c640f764995f3c187978c6094973ed-userdata-shm.mount: Deactivated successfully.
Sep 30 21:52:37 compute-0 systemd[1]: var-lib-containers-storage-overlay-e2f842d9f86dae412d508fca824ce032bc689912070bf98cd5d565e176728481-merged.mount: Deactivated successfully.
Sep 30 21:52:37 compute-0 podman[249108]: 2025-09-30 21:52:37.638207795 +0000 UTC m=+0.272379004 container cleanup 96d193df5d88bfe317057446892ad94c20c640f764995f3c187978c6094973ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-37abdced-e5ee-484f-bdda-8b1a7b19a7f0, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 21:52:37 compute-0 systemd[1]: libpod-conmon-96d193df5d88bfe317057446892ad94c20c640f764995f3c187978c6094973ed.scope: Deactivated successfully.
Sep 30 21:52:37 compute-0 unix_chkpwd[249167]: password check failed for user (root)
Sep 30 21:52:37 compute-0 podman[249154]: 2025-09-30 21:52:37.795234533 +0000 UTC m=+0.119476007 container remove 96d193df5d88bfe317057446892ad94c20c640f764995f3c187978c6094973ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-37abdced-e5ee-484f-bdda-8b1a7b19a7f0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:52:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:37.805 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[8e935b18-2d75-4e06-aee3-9d65ff840e39]: (4, ('Tue Sep 30 09:52:37 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-37abdced-e5ee-484f-bdda-8b1a7b19a7f0 (96d193df5d88bfe317057446892ad94c20c640f764995f3c187978c6094973ed)\n96d193df5d88bfe317057446892ad94c20c640f764995f3c187978c6094973ed\nTue Sep 30 09:52:37 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-37abdced-e5ee-484f-bdda-8b1a7b19a7f0 (96d193df5d88bfe317057446892ad94c20c640f764995f3c187978c6094973ed)\n96d193df5d88bfe317057446892ad94c20c640f764995f3c187978c6094973ed\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:37.810 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[1e30b810-af93-4df2-8306-521612630f69]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:37.812 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap37abdced-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:52:37 compute-0 nova_compute[192810]: 2025-09-30 21:52:37.814 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:37 compute-0 kernel: tap37abdced-e0: left promiscuous mode
Sep 30 21:52:37 compute-0 nova_compute[192810]: 2025-09-30 21:52:37.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:37.837 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[2123c860-f818-4a06-a1b0-81170de751c3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:37.873 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[0110ab24-d287-4298-8057-ac266b6dd4cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:37.875 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[202d49a1-89d2-4d33-9991-791d71871a6c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:37.900 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[d2bb0a1e-0ee8-4753-8ff8-36eb005832bd]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 582778, 'reachable_time': 27581, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249172, 'error': None, 'target': 'ovnmeta-37abdced-e5ee-484f-bdda-8b1a7b19a7f0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:37.905 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-37abdced-e5ee-484f-bdda-8b1a7b19a7f0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:52:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:37.905 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[cec67a55-6984-48f8-84f7-86ed3c744ee2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:52:37 compute-0 systemd[1]: run-netns-ovnmeta\x2d37abdced\x2de5ee\x2d484f\x2dbdda\x2d8b1a7b19a7f0.mount: Deactivated successfully.
Sep 30 21:52:37 compute-0 nova_compute[192810]: 2025-09-30 21:52:37.985 2 INFO nova.virt.libvirt.driver [None req-d65e2564-1053-4e31-bbb2-c70becf1e421 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Instance shutdown successfully after 3 seconds.
Sep 30 21:52:37 compute-0 nova_compute[192810]: 2025-09-30 21:52:37.993 2 INFO nova.virt.libvirt.driver [-] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Instance destroyed successfully.
Sep 30 21:52:37 compute-0 nova_compute[192810]: 2025-09-30 21:52:37.993 2 DEBUG nova.objects.instance [None req-d65e2564-1053-4e31-bbb2-c70becf1e421 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Lazy-loading 'numa_topology' on Instance uuid f346a37a-3dd5-4a2b-a445-9a0fe47a9194 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:52:38 compute-0 nova_compute[192810]: 2025-09-30 21:52:38.269 2 INFO nova.virt.libvirt.driver [None req-d65e2564-1053-4e31-bbb2-c70becf1e421 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Beginning cold snapshot process
Sep 30 21:52:38 compute-0 nova_compute[192810]: 2025-09-30 21:52:38.497 2 DEBUG nova.privsep.utils [None req-d65e2564-1053-4e31-bbb2-c70becf1e421 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Sep 30 21:52:38 compute-0 nova_compute[192810]: 2025-09-30 21:52:38.497 2 DEBUG oslo_concurrency.processutils [None req-d65e2564-1053-4e31-bbb2-c70becf1e421 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/f346a37a-3dd5-4a2b-a445-9a0fe47a9194/disk /var/lib/nova/instances/snapshots/tmpk1q6blr0/a5cd79d1b94245e48706b49376fee5a7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:52:38 compute-0 nova_compute[192810]: 2025-09-30 21:52:38.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:38.757 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:38.758 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:52:38.759 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:39 compute-0 nova_compute[192810]: 2025-09-30 21:52:39.054 2 DEBUG oslo_concurrency.processutils [None req-d65e2564-1053-4e31-bbb2-c70becf1e421 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/f346a37a-3dd5-4a2b-a445-9a0fe47a9194/disk /var/lib/nova/instances/snapshots/tmpk1q6blr0/a5cd79d1b94245e48706b49376fee5a7" returned: 0 in 0.557s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:52:39 compute-0 nova_compute[192810]: 2025-09-30 21:52:39.055 2 INFO nova.virt.libvirt.driver [None req-d65e2564-1053-4e31-bbb2-c70becf1e421 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Snapshot extracted, beginning image upload
Sep 30 21:52:39 compute-0 sshd-session[248868]: Failed password for root from 8.210.178.40 port 58344 ssh2
Sep 30 21:52:39 compute-0 nova_compute[192810]: 2025-09-30 21:52:39.568 2 DEBUG nova.compute.manager [req-2d45346f-1828-4f2e-8691-98c7b72f54fc req-954a2158-b99c-4108-ac12-5ac285b88610 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Received event network-vif-plugged-6f02d077-990b-41dc-87e5-d4dfb5b0397d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:52:39 compute-0 nova_compute[192810]: 2025-09-30 21:52:39.569 2 DEBUG oslo_concurrency.lockutils [req-2d45346f-1828-4f2e-8691-98c7b72f54fc req-954a2158-b99c-4108-ac12-5ac285b88610 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "f346a37a-3dd5-4a2b-a445-9a0fe47a9194-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:39 compute-0 nova_compute[192810]: 2025-09-30 21:52:39.570 2 DEBUG oslo_concurrency.lockutils [req-2d45346f-1828-4f2e-8691-98c7b72f54fc req-954a2158-b99c-4108-ac12-5ac285b88610 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "f346a37a-3dd5-4a2b-a445-9a0fe47a9194-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:39 compute-0 nova_compute[192810]: 2025-09-30 21:52:39.571 2 DEBUG oslo_concurrency.lockutils [req-2d45346f-1828-4f2e-8691-98c7b72f54fc req-954a2158-b99c-4108-ac12-5ac285b88610 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "f346a37a-3dd5-4a2b-a445-9a0fe47a9194-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:39 compute-0 nova_compute[192810]: 2025-09-30 21:52:39.571 2 DEBUG nova.compute.manager [req-2d45346f-1828-4f2e-8691-98c7b72f54fc req-954a2158-b99c-4108-ac12-5ac285b88610 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] No waiting events found dispatching network-vif-plugged-6f02d077-990b-41dc-87e5-d4dfb5b0397d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:52:39 compute-0 nova_compute[192810]: 2025-09-30 21:52:39.572 2 WARNING nova.compute.manager [req-2d45346f-1828-4f2e-8691-98c7b72f54fc req-954a2158-b99c-4108-ac12-5ac285b88610 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Received unexpected event network-vif-plugged-6f02d077-990b-41dc-87e5-d4dfb5b0397d for instance with vm_state active and task_state shelving_image_uploading.
Sep 30 21:52:39 compute-0 unix_chkpwd[249187]: password check failed for user (root)
Sep 30 21:52:40 compute-0 nova_compute[192810]: 2025-09-30 21:52:40.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:41 compute-0 podman[249191]: 2025-09-30 21:52:41.327669248 +0000 UTC m=+0.057698784 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:52:41 compute-0 podman[249189]: 2025-09-30 21:52:41.434831596 +0000 UTC m=+0.172193114 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3)
Sep 30 21:52:41 compute-0 podman[249190]: 2025-09-30 21:52:41.448381774 +0000 UTC m=+0.175104978 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:52:41 compute-0 sshd-session[248868]: Failed password for root from 8.210.178.40 port 58344 ssh2
Sep 30 21:52:42 compute-0 nova_compute[192810]: 2025-09-30 21:52:42.034 2 INFO nova.virt.libvirt.driver [None req-d65e2564-1053-4e31-bbb2-c70becf1e421 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Snapshot image upload complete
Sep 30 21:52:42 compute-0 nova_compute[192810]: 2025-09-30 21:52:42.036 2 DEBUG nova.compute.manager [None req-d65e2564-1053-4e31-bbb2-c70becf1e421 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:52:42 compute-0 nova_compute[192810]: 2025-09-30 21:52:42.169 2 INFO nova.compute.manager [None req-d65e2564-1053-4e31-bbb2-c70becf1e421 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Shelve offloading
Sep 30 21:52:42 compute-0 nova_compute[192810]: 2025-09-30 21:52:42.191 2 INFO nova.virt.libvirt.driver [-] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Instance destroyed successfully.
Sep 30 21:52:42 compute-0 nova_compute[192810]: 2025-09-30 21:52:42.192 2 DEBUG nova.compute.manager [None req-d65e2564-1053-4e31-bbb2-c70becf1e421 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:52:42 compute-0 nova_compute[192810]: 2025-09-30 21:52:42.194 2 DEBUG oslo_concurrency.lockutils [None req-d65e2564-1053-4e31-bbb2-c70becf1e421 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Acquiring lock "refresh_cache-f346a37a-3dd5-4a2b-a445-9a0fe47a9194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:52:42 compute-0 nova_compute[192810]: 2025-09-30 21:52:42.194 2 DEBUG oslo_concurrency.lockutils [None req-d65e2564-1053-4e31-bbb2-c70becf1e421 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Acquired lock "refresh_cache-f346a37a-3dd5-4a2b-a445-9a0fe47a9194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:52:42 compute-0 nova_compute[192810]: 2025-09-30 21:52:42.194 2 DEBUG nova.network.neutron [None req-d65e2564-1053-4e31-bbb2-c70becf1e421 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:52:42 compute-0 nova_compute[192810]: 2025-09-30 21:52:42.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:42 compute-0 nova_compute[192810]: 2025-09-30 21:52:42.583 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:43 compute-0 nova_compute[192810]: 2025-09-30 21:52:43.579 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:43 compute-0 unix_chkpwd[249253]: password check failed for user (root)
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.914 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f346a37a-3dd5-4a2b-a445-9a0fe47a9194', 'name': 'tempest-TestShelveInstance-server-144130882', 'flavor': {'id': 'afe5c12d-500a-499b-9438-9e9c37698acc', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '86b6907c-d747-4e98-8897-42105915831d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-000000b1', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': '8f69d27778a54a45a6527a19643d1fe0', 'user_id': '2a25e9b373084bfa9bc194a299a3ac4a', 'hostId': 'a246e0382736deea09e8c8fff7ed1e5758c27a64e5666820e0fa71ee', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.915 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.917 12 DEBUG ceilometer.compute.pollsters [-] Instance f346a37a-3dd5-4a2b-a445-9a0fe47a9194 was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-000000b1, id=f346a37a-3dd5-4a2b-a445-9a0fe47a9194>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.917 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.917 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.917 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestShelveInstance-server-144130882>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestShelveInstance-server-144130882>]
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.917 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.918 12 DEBUG ceilometer.compute.pollsters [-] Instance f346a37a-3dd5-4a2b-a445-9a0fe47a9194 was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-000000b1, id=f346a37a-3dd5-4a2b-a445-9a0fe47a9194>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.918 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.919 12 DEBUG ceilometer.compute.pollsters [-] Instance f346a37a-3dd5-4a2b-a445-9a0fe47a9194 was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance <name=instance-000000b1, id=f346a37a-3dd5-4a2b-a445-9a0fe47a9194>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.919 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.919 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.919 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestShelveInstance-server-144130882>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestShelveInstance-server-144130882>]
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.919 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.920 12 DEBUG ceilometer.compute.pollsters [-] Instance f346a37a-3dd5-4a2b-a445-9a0fe47a9194 was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance <name=instance-000000b1, id=f346a37a-3dd5-4a2b-a445-9a0fe47a9194>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.920 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.920 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.920 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestShelveInstance-server-144130882>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestShelveInstance-server-144130882>]
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.920 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.921 12 DEBUG ceilometer.compute.pollsters [-] Instance f346a37a-3dd5-4a2b-a445-9a0fe47a9194 was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance <name=instance-000000b1, id=f346a37a-3dd5-4a2b-a445-9a0fe47a9194>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.921 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.922 12 DEBUG ceilometer.compute.pollsters [-] Instance f346a37a-3dd5-4a2b-a445-9a0fe47a9194 was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-000000b1, id=f346a37a-3dd5-4a2b-a445-9a0fe47a9194>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.922 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.923 12 DEBUG ceilometer.compute.pollsters [-] Instance f346a37a-3dd5-4a2b-a445-9a0fe47a9194 was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-000000b1, id=f346a37a-3dd5-4a2b-a445-9a0fe47a9194>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.923 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.924 12 DEBUG ceilometer.compute.pollsters [-] Instance f346a37a-3dd5-4a2b-a445-9a0fe47a9194 was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-000000b1, id=f346a37a-3dd5-4a2b-a445-9a0fe47a9194>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.924 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.924 12 DEBUG ceilometer.compute.pollsters [-] Instance f346a37a-3dd5-4a2b-a445-9a0fe47a9194 was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-000000b1, id=f346a37a-3dd5-4a2b-a445-9a0fe47a9194>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.924 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.925 12 DEBUG ceilometer.compute.pollsters [-] Instance f346a37a-3dd5-4a2b-a445-9a0fe47a9194 was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance <name=instance-000000b1, id=f346a37a-3dd5-4a2b-a445-9a0fe47a9194>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.925 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.926 12 DEBUG ceilometer.compute.pollsters [-] Instance f346a37a-3dd5-4a2b-a445-9a0fe47a9194 was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance <name=instance-000000b1, id=f346a37a-3dd5-4a2b-a445-9a0fe47a9194>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.926 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.926 12 DEBUG ceilometer.compute.pollsters [-] Instance f346a37a-3dd5-4a2b-a445-9a0fe47a9194 was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-000000b1, id=f346a37a-3dd5-4a2b-a445-9a0fe47a9194>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.926 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.927 12 DEBUG ceilometer.compute.pollsters [-] Instance f346a37a-3dd5-4a2b-a445-9a0fe47a9194 was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-000000b1, id=f346a37a-3dd5-4a2b-a445-9a0fe47a9194>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.927 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.928 12 DEBUG ceilometer.compute.pollsters [-] Instance f346a37a-3dd5-4a2b-a445-9a0fe47a9194 was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance <name=instance-000000b1, id=f346a37a-3dd5-4a2b-a445-9a0fe47a9194>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.928 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.928 12 DEBUG ceilometer.compute.pollsters [-] Instance f346a37a-3dd5-4a2b-a445-9a0fe47a9194 was shut off while getting sample of network.incoming.packets.drop: Failed to inspect data of instance <name=instance-000000b1, id=f346a37a-3dd5-4a2b-a445-9a0fe47a9194>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.928 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.929 12 DEBUG ceilometer.compute.pollsters [-] Instance f346a37a-3dd5-4a2b-a445-9a0fe47a9194 was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-000000b1, id=f346a37a-3dd5-4a2b-a445-9a0fe47a9194>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.929 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.930 12 DEBUG ceilometer.compute.pollsters [-] Instance f346a37a-3dd5-4a2b-a445-9a0fe47a9194 was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-000000b1, id=f346a37a-3dd5-4a2b-a445-9a0fe47a9194>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.930 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.930 12 DEBUG ceilometer.compute.pollsters [-] Instance f346a37a-3dd5-4a2b-a445-9a0fe47a9194 was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-000000b1, id=f346a37a-3dd5-4a2b-a445-9a0fe47a9194>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.931 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.931 12 DEBUG ceilometer.compute.pollsters [-] Instance f346a37a-3dd5-4a2b-a445-9a0fe47a9194 was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-000000b1, id=f346a37a-3dd5-4a2b-a445-9a0fe47a9194>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.931 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.931 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.931 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestShelveInstance-server-144130882>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestShelveInstance-server-144130882>]
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.932 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.932 12 DEBUG ceilometer.compute.pollsters [-] Instance f346a37a-3dd5-4a2b-a445-9a0fe47a9194 was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance <name=instance-000000b1, id=f346a37a-3dd5-4a2b-a445-9a0fe47a9194>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.932 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Sep 30 21:52:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:52:43.933 12 DEBUG ceilometer.compute.pollsters [-] Instance f346a37a-3dd5-4a2b-a445-9a0fe47a9194 was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-000000b1, id=f346a37a-3dd5-4a2b-a445-9a0fe47a9194>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Sep 30 21:52:45 compute-0 nova_compute[192810]: 2025-09-30 21:52:45.105 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:45 compute-0 sshd-session[248868]: Failed password for root from 8.210.178.40 port 58344 ssh2
Sep 30 21:52:45 compute-0 nova_compute[192810]: 2025-09-30 21:52:45.268 2 DEBUG nova.network.neutron [None req-d65e2564-1053-4e31-bbb2-c70becf1e421 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Updating instance_info_cache with network_info: [{"id": "6f02d077-990b-41dc-87e5-d4dfb5b0397d", "address": "fa:16:3e:ee:93:c6", "network": {"id": "37abdced-e5ee-484f-bdda-8b1a7b19a7f0", "bridge": "br-int", "label": "tempest-TestShelveInstance-1740163448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f69d27778a54a45a6527a19643d1fe0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f02d077-99", "ovs_interfaceid": "6f02d077-990b-41dc-87e5-d4dfb5b0397d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:52:45 compute-0 nova_compute[192810]: 2025-09-30 21:52:45.293 2 DEBUG oslo_concurrency.lockutils [None req-d65e2564-1053-4e31-bbb2-c70becf1e421 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Releasing lock "refresh_cache-f346a37a-3dd5-4a2b-a445-9a0fe47a9194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:52:45 compute-0 sshd-session[248868]: error: maximum authentication attempts exceeded for root from 8.210.178.40 port 58344 ssh2 [preauth]
Sep 30 21:52:45 compute-0 sshd-session[248868]: Disconnecting authenticating user root 8.210.178.40 port 58344: Too many authentication failures [preauth]
Sep 30 21:52:45 compute-0 sshd-session[248868]: PAM 5 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40  user=root
Sep 30 21:52:45 compute-0 sshd-session[248868]: PAM service(sshd) ignoring max retries; 6 > 3
Sep 30 21:52:46 compute-0 nova_compute[192810]: 2025-09-30 21:52:46.428 2 INFO nova.virt.libvirt.driver [-] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Instance destroyed successfully.
Sep 30 21:52:46 compute-0 nova_compute[192810]: 2025-09-30 21:52:46.428 2 DEBUG nova.objects.instance [None req-d65e2564-1053-4e31-bbb2-c70becf1e421 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Lazy-loading 'resources' on Instance uuid f346a37a-3dd5-4a2b-a445-9a0fe47a9194 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:52:46 compute-0 nova_compute[192810]: 2025-09-30 21:52:46.445 2 DEBUG nova.virt.libvirt.vif [None req-d65e2564-1053-4e31-bbb2-c70becf1e421 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:52:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestShelveInstance-server-144130882',display_name='tempest-TestShelveInstance-server-144130882',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testshelveinstance-server-144130882',id=177,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLqJH0XUHnV+DFJ4EEdvUcnnJtAN3d0qXBSayXC/1r7r1r6VLAmC4Ityxcv3hObBaxsXAuhiLyJJeNSDzDDKD8CyDvitwtpXo1jlbjfL2MJDye78MFsuQz3G/vPFK9b4zw==',key_name='tempest-TestShelveInstance-1916761761',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:52:13Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='8f69d27778a54a45a6527a19643d1fe0',ramdisk_id='',reservation_id='r-u088hjgo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestShelveInstance-1185384137',owner_user_name='tempest-TestShelveInstance-1185384137-project-member',shelved_at='2025-09-30T21:52:42.035433',shelved_host='compute-0.ctlplane.example.com',shelved_image_id='a10325dd-c3a0-484c-b481-f0815c9caaa6'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:52:39Z,user_data=None,user_id='2a25e9b373084bfa9bc194a299a3ac4a',uuid=f346a37a-3dd5-4a2b-a445-9a0fe47a9194,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "6f02d077-990b-41dc-87e5-d4dfb5b0397d", "address": "fa:16:3e:ee:93:c6", "network": {"id": "37abdced-e5ee-484f-bdda-8b1a7b19a7f0", "bridge": "br-int", "label": "tempest-TestShelveInstance-1740163448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f69d27778a54a45a6527a19643d1fe0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f02d077-99", "ovs_interfaceid": "6f02d077-990b-41dc-87e5-d4dfb5b0397d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:52:46 compute-0 nova_compute[192810]: 2025-09-30 21:52:46.445 2 DEBUG nova.network.os_vif_util [None req-d65e2564-1053-4e31-bbb2-c70becf1e421 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Converting VIF {"id": "6f02d077-990b-41dc-87e5-d4dfb5b0397d", "address": "fa:16:3e:ee:93:c6", "network": {"id": "37abdced-e5ee-484f-bdda-8b1a7b19a7f0", "bridge": "br-int", "label": "tempest-TestShelveInstance-1740163448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f69d27778a54a45a6527a19643d1fe0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f02d077-99", "ovs_interfaceid": "6f02d077-990b-41dc-87e5-d4dfb5b0397d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:52:46 compute-0 nova_compute[192810]: 2025-09-30 21:52:46.446 2 DEBUG nova.network.os_vif_util [None req-d65e2564-1053-4e31-bbb2-c70becf1e421 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ee:93:c6,bridge_name='br-int',has_traffic_filtering=True,id=6f02d077-990b-41dc-87e5-d4dfb5b0397d,network=Network(37abdced-e5ee-484f-bdda-8b1a7b19a7f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6f02d077-99') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:52:46 compute-0 nova_compute[192810]: 2025-09-30 21:52:46.446 2 DEBUG os_vif [None req-d65e2564-1053-4e31-bbb2-c70becf1e421 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:93:c6,bridge_name='br-int',has_traffic_filtering=True,id=6f02d077-990b-41dc-87e5-d4dfb5b0397d,network=Network(37abdced-e5ee-484f-bdda-8b1a7b19a7f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6f02d077-99') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:52:46 compute-0 nova_compute[192810]: 2025-09-30 21:52:46.448 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:46 compute-0 nova_compute[192810]: 2025-09-30 21:52:46.448 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6f02d077-99, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:52:46 compute-0 nova_compute[192810]: 2025-09-30 21:52:46.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:52:46 compute-0 nova_compute[192810]: 2025-09-30 21:52:46.454 2 INFO os_vif [None req-d65e2564-1053-4e31-bbb2-c70becf1e421 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:93:c6,bridge_name='br-int',has_traffic_filtering=True,id=6f02d077-990b-41dc-87e5-d4dfb5b0397d,network=Network(37abdced-e5ee-484f-bdda-8b1a7b19a7f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6f02d077-99')
Sep 30 21:52:46 compute-0 nova_compute[192810]: 2025-09-30 21:52:46.454 2 INFO nova.virt.libvirt.driver [None req-d65e2564-1053-4e31-bbb2-c70becf1e421 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Deleting instance files /var/lib/nova/instances/f346a37a-3dd5-4a2b-a445-9a0fe47a9194_del
Sep 30 21:52:46 compute-0 nova_compute[192810]: 2025-09-30 21:52:46.460 2 INFO nova.virt.libvirt.driver [None req-d65e2564-1053-4e31-bbb2-c70becf1e421 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Deletion of /var/lib/nova/instances/f346a37a-3dd5-4a2b-a445-9a0fe47a9194_del complete
Sep 30 21:52:46 compute-0 nova_compute[192810]: 2025-09-30 21:52:46.624 2 INFO nova.scheduler.client.report [None req-d65e2564-1053-4e31-bbb2-c70becf1e421 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Deleted allocations for instance f346a37a-3dd5-4a2b-a445-9a0fe47a9194
Sep 30 21:52:46 compute-0 nova_compute[192810]: 2025-09-30 21:52:46.703 2 DEBUG oslo_concurrency.lockutils [None req-d65e2564-1053-4e31-bbb2-c70becf1e421 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:46 compute-0 nova_compute[192810]: 2025-09-30 21:52:46.703 2 DEBUG oslo_concurrency.lockutils [None req-d65e2564-1053-4e31-bbb2-c70becf1e421 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:46 compute-0 nova_compute[192810]: 2025-09-30 21:52:46.796 2 DEBUG nova.compute.provider_tree [None req-d65e2564-1053-4e31-bbb2-c70becf1e421 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:52:46 compute-0 nova_compute[192810]: 2025-09-30 21:52:46.809 2 DEBUG nova.scheduler.client.report [None req-d65e2564-1053-4e31-bbb2-c70becf1e421 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:52:46 compute-0 nova_compute[192810]: 2025-09-30 21:52:46.836 2 DEBUG nova.compute.manager [req-90052e6e-94c0-4b85-993d-2ea0bc1debb8 req-879d5a47-e074-48dc-ad9f-053523ab441c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Received event network-changed-6f02d077-990b-41dc-87e5-d4dfb5b0397d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:52:46 compute-0 nova_compute[192810]: 2025-09-30 21:52:46.836 2 DEBUG nova.compute.manager [req-90052e6e-94c0-4b85-993d-2ea0bc1debb8 req-879d5a47-e074-48dc-ad9f-053523ab441c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Refreshing instance network info cache due to event network-changed-6f02d077-990b-41dc-87e5-d4dfb5b0397d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:52:46 compute-0 nova_compute[192810]: 2025-09-30 21:52:46.837 2 DEBUG oslo_concurrency.lockutils [req-90052e6e-94c0-4b85-993d-2ea0bc1debb8 req-879d5a47-e074-48dc-ad9f-053523ab441c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-f346a37a-3dd5-4a2b-a445-9a0fe47a9194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:52:46 compute-0 nova_compute[192810]: 2025-09-30 21:52:46.837 2 DEBUG oslo_concurrency.lockutils [req-90052e6e-94c0-4b85-993d-2ea0bc1debb8 req-879d5a47-e074-48dc-ad9f-053523ab441c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-f346a37a-3dd5-4a2b-a445-9a0fe47a9194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:52:46 compute-0 nova_compute[192810]: 2025-09-30 21:52:46.837 2 DEBUG nova.network.neutron [req-90052e6e-94c0-4b85-993d-2ea0bc1debb8 req-879d5a47-e074-48dc-ad9f-053523ab441c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Refreshing network info cache for port 6f02d077-990b-41dc-87e5-d4dfb5b0397d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:52:46 compute-0 nova_compute[192810]: 2025-09-30 21:52:46.845 2 DEBUG oslo_concurrency.lockutils [None req-d65e2564-1053-4e31-bbb2-c70becf1e421 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:46 compute-0 nova_compute[192810]: 2025-09-30 21:52:46.918 2 DEBUG oslo_concurrency.lockutils [None req-d65e2564-1053-4e31-bbb2-c70becf1e421 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Lock "f346a37a-3dd5-4a2b-a445-9a0fe47a9194" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 12.053s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:47 compute-0 unix_chkpwd[249256]: password check failed for user (root)
Sep 30 21:52:47 compute-0 sshd-session[249254]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40  user=root
Sep 30 21:52:48 compute-0 nova_compute[192810]: 2025-09-30 21:52:48.148 2 DEBUG nova.network.neutron [req-90052e6e-94c0-4b85-993d-2ea0bc1debb8 req-879d5a47-e074-48dc-ad9f-053523ab441c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Updated VIF entry in instance network info cache for port 6f02d077-990b-41dc-87e5-d4dfb5b0397d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:52:48 compute-0 nova_compute[192810]: 2025-09-30 21:52:48.148 2 DEBUG nova.network.neutron [req-90052e6e-94c0-4b85-993d-2ea0bc1debb8 req-879d5a47-e074-48dc-ad9f-053523ab441c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Updating instance_info_cache with network_info: [{"id": "6f02d077-990b-41dc-87e5-d4dfb5b0397d", "address": "fa:16:3e:ee:93:c6", "network": {"id": "37abdced-e5ee-484f-bdda-8b1a7b19a7f0", "bridge": null, "label": "tempest-TestShelveInstance-1740163448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f69d27778a54a45a6527a19643d1fe0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tap6f02d077-99", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:52:48 compute-0 nova_compute[192810]: 2025-09-30 21:52:48.173 2 DEBUG oslo_concurrency.lockutils [req-90052e6e-94c0-4b85-993d-2ea0bc1debb8 req-879d5a47-e074-48dc-ad9f-053523ab441c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-f346a37a-3dd5-4a2b-a445-9a0fe47a9194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:52:48 compute-0 nova_compute[192810]: 2025-09-30 21:52:48.581 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:49 compute-0 sshd-session[249254]: Failed password for root from 8.210.178.40 port 58946 ssh2
Sep 30 21:52:50 compute-0 nova_compute[192810]: 2025-09-30 21:52:50.021 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759269155.019908, 06072d75-591c-4422-8b92-2176427d6b4d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:52:50 compute-0 nova_compute[192810]: 2025-09-30 21:52:50.022 2 INFO nova.compute.manager [-] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] VM Stopped (Lifecycle Event)
Sep 30 21:52:50 compute-0 nova_compute[192810]: 2025-09-30 21:52:50.043 2 DEBUG nova.compute.manager [None req-76c62744-3f5b-4ea7-b940-9fb962e2ac30 - - - - - -] [instance: 06072d75-591c-4422-8b92-2176427d6b4d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:52:51 compute-0 nova_compute[192810]: 2025-09-30 21:52:51.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:51 compute-0 unix_chkpwd[249257]: password check failed for user (root)
Sep 30 21:52:51 compute-0 nova_compute[192810]: 2025-09-30 21:52:51.818 2 DEBUG oslo_concurrency.lockutils [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Acquiring lock "f346a37a-3dd5-4a2b-a445-9a0fe47a9194" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:51 compute-0 nova_compute[192810]: 2025-09-30 21:52:51.818 2 DEBUG oslo_concurrency.lockutils [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Lock "f346a37a-3dd5-4a2b-a445-9a0fe47a9194" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:51 compute-0 nova_compute[192810]: 2025-09-30 21:52:51.819 2 INFO nova.compute.manager [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Unshelving
Sep 30 21:52:52 compute-0 nova_compute[192810]: 2025-09-30 21:52:52.122 2 DEBUG oslo_concurrency.lockutils [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:52 compute-0 nova_compute[192810]: 2025-09-30 21:52:52.122 2 DEBUG oslo_concurrency.lockutils [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:52 compute-0 nova_compute[192810]: 2025-09-30 21:52:52.125 2 DEBUG nova.objects.instance [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Lazy-loading 'pci_requests' on Instance uuid f346a37a-3dd5-4a2b-a445-9a0fe47a9194 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:52:52 compute-0 nova_compute[192810]: 2025-09-30 21:52:52.136 2 DEBUG nova.objects.instance [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Lazy-loading 'numa_topology' on Instance uuid f346a37a-3dd5-4a2b-a445-9a0fe47a9194 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:52:52 compute-0 nova_compute[192810]: 2025-09-30 21:52:52.147 2 DEBUG nova.virt.hardware [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:52:52 compute-0 nova_compute[192810]: 2025-09-30 21:52:52.147 2 INFO nova.compute.claims [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:52:52 compute-0 nova_compute[192810]: 2025-09-30 21:52:52.272 2 DEBUG nova.compute.provider_tree [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:52:52 compute-0 nova_compute[192810]: 2025-09-30 21:52:52.296 2 DEBUG nova.scheduler.client.report [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:52:52 compute-0 nova_compute[192810]: 2025-09-30 21:52:52.313 2 DEBUG oslo_concurrency.lockutils [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.191s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:52 compute-0 nova_compute[192810]: 2025-09-30 21:52:52.543 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759269157.541967, f346a37a-3dd5-4a2b-a445-9a0fe47a9194 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:52:52 compute-0 nova_compute[192810]: 2025-09-30 21:52:52.543 2 INFO nova.compute.manager [-] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] VM Stopped (Lifecycle Event)
Sep 30 21:52:52 compute-0 nova_compute[192810]: 2025-09-30 21:52:52.583 2 DEBUG nova.compute.manager [None req-f70bfd23-fae2-4200-91cb-8474f6371c30 - - - - - -] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:52:52 compute-0 nova_compute[192810]: 2025-09-30 21:52:52.982 2 INFO nova.network.neutron [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Updating port 6f02d077-990b-41dc-87e5-d4dfb5b0397d with attributes {'binding:host_id': 'compute-0.ctlplane.example.com', 'device_owner': 'compute:nova'}
Sep 30 21:52:53 compute-0 sshd-session[249254]: Failed password for root from 8.210.178.40 port 58946 ssh2
Sep 30 21:52:53 compute-0 nova_compute[192810]: 2025-09-30 21:52:53.584 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:53 compute-0 unix_chkpwd[249258]: password check failed for user (root)
Sep 30 21:52:54 compute-0 nova_compute[192810]: 2025-09-30 21:52:54.045 2 DEBUG oslo_concurrency.lockutils [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Acquiring lock "refresh_cache-f346a37a-3dd5-4a2b-a445-9a0fe47a9194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:52:54 compute-0 nova_compute[192810]: 2025-09-30 21:52:54.045 2 DEBUG oslo_concurrency.lockutils [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Acquired lock "refresh_cache-f346a37a-3dd5-4a2b-a445-9a0fe47a9194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:52:54 compute-0 nova_compute[192810]: 2025-09-30 21:52:54.046 2 DEBUG nova.network.neutron [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:52:54 compute-0 nova_compute[192810]: 2025-09-30 21:52:54.167 2 DEBUG nova.compute.manager [req-83018a2a-d0f3-406d-bed1-e4b51bc081dd req-0a7649fa-a43f-4bc7-bb64-294cb90b97a9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Received event network-changed-6f02d077-990b-41dc-87e5-d4dfb5b0397d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:52:54 compute-0 nova_compute[192810]: 2025-09-30 21:52:54.167 2 DEBUG nova.compute.manager [req-83018a2a-d0f3-406d-bed1-e4b51bc081dd req-0a7649fa-a43f-4bc7-bb64-294cb90b97a9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Refreshing instance network info cache due to event network-changed-6f02d077-990b-41dc-87e5-d4dfb5b0397d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:52:54 compute-0 nova_compute[192810]: 2025-09-30 21:52:54.168 2 DEBUG oslo_concurrency.lockutils [req-83018a2a-d0f3-406d-bed1-e4b51bc081dd req-0a7649fa-a43f-4bc7-bb64-294cb90b97a9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-f346a37a-3dd5-4a2b-a445-9a0fe47a9194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:52:55 compute-0 podman[249260]: 2025-09-30 21:52:55.320329361 +0000 UTC m=+0.053530870 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Sep 30 21:52:55 compute-0 podman[249261]: 2025-09-30 21:52:55.327009006 +0000 UTC m=+0.053693164 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20250923)
Sep 30 21:52:55 compute-0 podman[249259]: 2025-09-30 21:52:55.344984723 +0000 UTC m=+0.078171202 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:52:55 compute-0 nova_compute[192810]: 2025-09-30 21:52:55.557 2 DEBUG nova.network.neutron [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Updating instance_info_cache with network_info: [{"id": "6f02d077-990b-41dc-87e5-d4dfb5b0397d", "address": "fa:16:3e:ee:93:c6", "network": {"id": "37abdced-e5ee-484f-bdda-8b1a7b19a7f0", "bridge": "br-int", "label": "tempest-TestShelveInstance-1740163448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f69d27778a54a45a6527a19643d1fe0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f02d077-99", "ovs_interfaceid": "6f02d077-990b-41dc-87e5-d4dfb5b0397d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:52:55 compute-0 nova_compute[192810]: 2025-09-30 21:52:55.585 2 DEBUG oslo_concurrency.lockutils [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Releasing lock "refresh_cache-f346a37a-3dd5-4a2b-a445-9a0fe47a9194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:52:55 compute-0 nova_compute[192810]: 2025-09-30 21:52:55.587 2 DEBUG nova.virt.libvirt.driver [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:52:55 compute-0 nova_compute[192810]: 2025-09-30 21:52:55.588 2 INFO nova.virt.libvirt.driver [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Creating image(s)
Sep 30 21:52:55 compute-0 nova_compute[192810]: 2025-09-30 21:52:55.589 2 DEBUG oslo_concurrency.lockutils [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Acquiring lock "/var/lib/nova/instances/f346a37a-3dd5-4a2b-a445-9a0fe47a9194/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:55 compute-0 nova_compute[192810]: 2025-09-30 21:52:55.589 2 DEBUG oslo_concurrency.lockutils [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Lock "/var/lib/nova/instances/f346a37a-3dd5-4a2b-a445-9a0fe47a9194/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:55 compute-0 nova_compute[192810]: 2025-09-30 21:52:55.590 2 DEBUG oslo_concurrency.lockutils [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Lock "/var/lib/nova/instances/f346a37a-3dd5-4a2b-a445-9a0fe47a9194/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:55 compute-0 nova_compute[192810]: 2025-09-30 21:52:55.591 2 DEBUG nova.objects.instance [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Lazy-loading 'trusted_certs' on Instance uuid f346a37a-3dd5-4a2b-a445-9a0fe47a9194 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:52:55 compute-0 nova_compute[192810]: 2025-09-30 21:52:55.592 2 DEBUG oslo_concurrency.lockutils [req-83018a2a-d0f3-406d-bed1-e4b51bc081dd req-0a7649fa-a43f-4bc7-bb64-294cb90b97a9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-f346a37a-3dd5-4a2b-a445-9a0fe47a9194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:52:55 compute-0 nova_compute[192810]: 2025-09-30 21:52:55.593 2 DEBUG nova.network.neutron [req-83018a2a-d0f3-406d-bed1-e4b51bc081dd req-0a7649fa-a43f-4bc7-bb64-294cb90b97a9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Refreshing network info cache for port 6f02d077-990b-41dc-87e5-d4dfb5b0397d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:52:55 compute-0 nova_compute[192810]: 2025-09-30 21:52:55.608 2 DEBUG oslo_concurrency.lockutils [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Acquiring lock "9b4cc05c9bca2c840c39239e394bcf52c549dde6" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:55 compute-0 nova_compute[192810]: 2025-09-30 21:52:55.610 2 DEBUG oslo_concurrency.lockutils [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Lock "9b4cc05c9bca2c840c39239e394bcf52c549dde6" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:55 compute-0 sshd-session[249254]: Failed password for root from 8.210.178.40 port 58946 ssh2
Sep 30 21:52:56 compute-0 nova_compute[192810]: 2025-09-30 21:52:56.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:57 compute-0 nova_compute[192810]: 2025-09-30 21:52:57.345 2 DEBUG oslo_concurrency.processutils [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9b4cc05c9bca2c840c39239e394bcf52c549dde6.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:52:57 compute-0 nova_compute[192810]: 2025-09-30 21:52:57.437 2 DEBUG oslo_concurrency.processutils [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9b4cc05c9bca2c840c39239e394bcf52c549dde6.part --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:52:57 compute-0 nova_compute[192810]: 2025-09-30 21:52:57.439 2 DEBUG nova.virt.images [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] a10325dd-c3a0-484c-b481-f0815c9caaa6 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Sep 30 21:52:57 compute-0 nova_compute[192810]: 2025-09-30 21:52:57.451 2 DEBUG nova.privsep.utils [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Sep 30 21:52:57 compute-0 nova_compute[192810]: 2025-09-30 21:52:57.452 2 DEBUG oslo_concurrency.processutils [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/9b4cc05c9bca2c840c39239e394bcf52c549dde6.part /var/lib/nova/instances/_base/9b4cc05c9bca2c840c39239e394bcf52c549dde6.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:52:57 compute-0 unix_chkpwd[249332]: password check failed for user (root)
Sep 30 21:52:58 compute-0 nova_compute[192810]: 2025-09-30 21:52:58.409 2 DEBUG oslo_concurrency.processutils [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/9b4cc05c9bca2c840c39239e394bcf52c549dde6.part /var/lib/nova/instances/_base/9b4cc05c9bca2c840c39239e394bcf52c549dde6.converted" returned: 0 in 0.957s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:52:58 compute-0 nova_compute[192810]: 2025-09-30 21:52:58.421 2 DEBUG oslo_concurrency.processutils [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9b4cc05c9bca2c840c39239e394bcf52c549dde6.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:52:58 compute-0 nova_compute[192810]: 2025-09-30 21:52:58.485 2 DEBUG oslo_concurrency.processutils [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9b4cc05c9bca2c840c39239e394bcf52c549dde6.converted --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:52:58 compute-0 nova_compute[192810]: 2025-09-30 21:52:58.487 2 DEBUG oslo_concurrency.lockutils [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Lock "9b4cc05c9bca2c840c39239e394bcf52c549dde6" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.877s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:58 compute-0 nova_compute[192810]: 2025-09-30 21:52:58.500 2 DEBUG oslo_concurrency.processutils [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9b4cc05c9bca2c840c39239e394bcf52c549dde6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:52:58 compute-0 nova_compute[192810]: 2025-09-30 21:52:58.555 2 DEBUG oslo_concurrency.processutils [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9b4cc05c9bca2c840c39239e394bcf52c549dde6 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:52:58 compute-0 nova_compute[192810]: 2025-09-30 21:52:58.556 2 DEBUG oslo_concurrency.lockutils [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Acquiring lock "9b4cc05c9bca2c840c39239e394bcf52c549dde6" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:52:58 compute-0 nova_compute[192810]: 2025-09-30 21:52:58.557 2 DEBUG oslo_concurrency.lockutils [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Lock "9b4cc05c9bca2c840c39239e394bcf52c549dde6" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:52:58 compute-0 nova_compute[192810]: 2025-09-30 21:52:58.568 2 DEBUG oslo_concurrency.processutils [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9b4cc05c9bca2c840c39239e394bcf52c549dde6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:52:58 compute-0 nova_compute[192810]: 2025-09-30 21:52:58.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:52:58 compute-0 nova_compute[192810]: 2025-09-30 21:52:58.624 2 DEBUG oslo_concurrency.processutils [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9b4cc05c9bca2c840c39239e394bcf52c549dde6 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:52:58 compute-0 nova_compute[192810]: 2025-09-30 21:52:58.625 2 DEBUG oslo_concurrency.processutils [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/9b4cc05c9bca2c840c39239e394bcf52c549dde6,backing_fmt=raw /var/lib/nova/instances/f346a37a-3dd5-4a2b-a445-9a0fe47a9194/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:52:58 compute-0 nova_compute[192810]: 2025-09-30 21:52:58.665 2 DEBUG oslo_concurrency.processutils [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/9b4cc05c9bca2c840c39239e394bcf52c549dde6,backing_fmt=raw /var/lib/nova/instances/f346a37a-3dd5-4a2b-a445-9a0fe47a9194/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:52:58 compute-0 nova_compute[192810]: 2025-09-30 21:52:58.666 2 DEBUG oslo_concurrency.lockutils [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Lock "9b4cc05c9bca2c840c39239e394bcf52c549dde6" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:52:58 compute-0 nova_compute[192810]: 2025-09-30 21:52:58.667 2 DEBUG oslo_concurrency.processutils [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9b4cc05c9bca2c840c39239e394bcf52c549dde6 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:52:58 compute-0 nova_compute[192810]: 2025-09-30 21:52:58.718 2 DEBUG oslo_concurrency.processutils [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/9b4cc05c9bca2c840c39239e394bcf52c549dde6 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:52:58 compute-0 nova_compute[192810]: 2025-09-30 21:52:58.719 2 DEBUG nova.objects.instance [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Lazy-loading 'migration_context' on Instance uuid f346a37a-3dd5-4a2b-a445-9a0fe47a9194 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:52:58 compute-0 nova_compute[192810]: 2025-09-30 21:52:58.762 2 INFO nova.virt.libvirt.driver [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Rebasing disk image.
Sep 30 21:52:58 compute-0 nova_compute[192810]: 2025-09-30 21:52:58.762 2 DEBUG oslo_concurrency.processutils [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:52:58 compute-0 nova_compute[192810]: 2025-09-30 21:52:58.815 2 DEBUG oslo_concurrency.processutils [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:52:58 compute-0 nova_compute[192810]: 2025-09-30 21:52:58.816 2 DEBUG oslo_concurrency.processutils [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Running cmd (subprocess): qemu-img rebase -b /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a -F raw /var/lib/nova/instances/f346a37a-3dd5-4a2b-a445-9a0fe47a9194/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:52:59 compute-0 nova_compute[192810]: 2025-09-30 21:52:59.081 2 DEBUG nova.network.neutron [req-83018a2a-d0f3-406d-bed1-e4b51bc081dd req-0a7649fa-a43f-4bc7-bb64-294cb90b97a9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Updated VIF entry in instance network info cache for port 6f02d077-990b-41dc-87e5-d4dfb5b0397d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:52:59 compute-0 nova_compute[192810]: 2025-09-30 21:52:59.083 2 DEBUG nova.network.neutron [req-83018a2a-d0f3-406d-bed1-e4b51bc081dd req-0a7649fa-a43f-4bc7-bb64-294cb90b97a9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Updating instance_info_cache with network_info: [{"id": "6f02d077-990b-41dc-87e5-d4dfb5b0397d", "address": "fa:16:3e:ee:93:c6", "network": {"id": "37abdced-e5ee-484f-bdda-8b1a7b19a7f0", "bridge": "br-int", "label": "tempest-TestShelveInstance-1740163448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f69d27778a54a45a6527a19643d1fe0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f02d077-99", "ovs_interfaceid": "6f02d077-990b-41dc-87e5-d4dfb5b0397d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:52:59 compute-0 nova_compute[192810]: 2025-09-30 21:52:59.106 2 DEBUG oslo_concurrency.lockutils [req-83018a2a-d0f3-406d-bed1-e4b51bc081dd req-0a7649fa-a43f-4bc7-bb64-294cb90b97a9 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-f346a37a-3dd5-4a2b-a445-9a0fe47a9194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:53:00 compute-0 sshd-session[249254]: Failed password for root from 8.210.178.40 port 58946 ssh2
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.157 2 DEBUG oslo_concurrency.processutils [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] CMD "qemu-img rebase -b /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a -F raw /var/lib/nova/instances/f346a37a-3dd5-4a2b-a445-9a0fe47a9194/disk" returned: 0 in 1.341s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.158 2 DEBUG nova.virt.libvirt.driver [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.158 2 DEBUG nova.virt.libvirt.driver [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Ensure instance console log exists: /var/lib/nova/instances/f346a37a-3dd5-4a2b-a445-9a0fe47a9194/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.158 2 DEBUG oslo_concurrency.lockutils [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.159 2 DEBUG oslo_concurrency.lockutils [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.159 2 DEBUG oslo_concurrency.lockutils [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.161 2 DEBUG nova.virt.libvirt.driver [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Start _get_guest_xml network_info=[{"id": "6f02d077-990b-41dc-87e5-d4dfb5b0397d", "address": "fa:16:3e:ee:93:c6", "network": {"id": "37abdced-e5ee-484f-bdda-8b1a7b19a7f0", "bridge": "br-int", "label": "tempest-TestShelveInstance-1740163448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f69d27778a54a45a6527a19643d1fe0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f02d077-99", "ovs_interfaceid": "6f02d077-990b-41dc-87e5-d4dfb5b0397d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='5b30e7641766e739fe924e5bcac37b68',container_format='bare',created_at=2025-09-30T21:52:34Z,direct_url=<?>,disk_format='qcow2',id=a10325dd-c3a0-484c-b481-f0815c9caaa6,min_disk=1,min_ram=0,name='tempest-TestShelveInstance-server-144130882-shelved',owner='8f69d27778a54a45a6527a19643d1fe0',properties=ImageMetaProps,protected=<?>,size=52363264,status='active',tags=<?>,updated_at=2025-09-30T21:52:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.164 2 WARNING nova.virt.libvirt.driver [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.168 2 DEBUG nova.virt.libvirt.host [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.168 2 DEBUG nova.virt.libvirt.host [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.171 2 DEBUG nova.virt.libvirt.host [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.171 2 DEBUG nova.virt.libvirt.host [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.172 2 DEBUG nova.virt.libvirt.driver [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.173 2 DEBUG nova.virt.hardware [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='5b30e7641766e739fe924e5bcac37b68',container_format='bare',created_at=2025-09-30T21:52:34Z,direct_url=<?>,disk_format='qcow2',id=a10325dd-c3a0-484c-b481-f0815c9caaa6,min_disk=1,min_ram=0,name='tempest-TestShelveInstance-server-144130882-shelved',owner='8f69d27778a54a45a6527a19643d1fe0',properties=ImageMetaProps,protected=<?>,size=52363264,status='active',tags=<?>,updated_at=2025-09-30T21:52:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.173 2 DEBUG nova.virt.hardware [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.173 2 DEBUG nova.virt.hardware [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.174 2 DEBUG nova.virt.hardware [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.174 2 DEBUG nova.virt.hardware [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.174 2 DEBUG nova.virt.hardware [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.174 2 DEBUG nova.virt.hardware [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.174 2 DEBUG nova.virt.hardware [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.175 2 DEBUG nova.virt.hardware [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.175 2 DEBUG nova.virt.hardware [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.175 2 DEBUG nova.virt.hardware [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.175 2 DEBUG nova.objects.instance [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Lazy-loading 'vcpu_model' on Instance uuid f346a37a-3dd5-4a2b-a445-9a0fe47a9194 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.213 2 DEBUG nova.virt.libvirt.vif [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-09-30T21:52:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestShelveInstance-server-144130882',display_name='tempest-TestShelveInstance-server-144130882',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testshelveinstance-server-144130882',id=177,image_ref='a10325dd-c3a0-484c-b481-f0815c9caaa6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-TestShelveInstance-1916761761',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:52:13Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='8f69d27778a54a45a6527a19643d1fe0',ramdisk_id='',reservation_id='r-u088hjgo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestShelveInstance-1185384137',owner_user_name='tempest-TestShelveInstance-1185384137-project-member',shelved_at='2025-09-30T21:52:42.035433',shelved_host='compute-0.ctlplane.example.com',shelved_image_id='a10325dd-c3a0-484c-b481-f0815c9caaa6'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:52:52Z,user_data=None,user_id='2a25e9b373084bfa9bc194a299a3ac4a',uuid=f346a37a-3dd5-4a2b-a445-9a0fe47a9194,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "6f02d077-990b-41dc-87e5-d4dfb5b0397d", "address": "fa:16:3e:ee:93:c6", "network": {"id": "37abdced-e5ee-484f-bdda-8b1a7b19a7f0", "bridge": "br-int", "label": "tempest-TestShelveInstance-1740163448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f69d27778a54a45a6527a19643d1fe0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f02d077-99", "ovs_interfaceid": "6f02d077-990b-41dc-87e5-d4dfb5b0397d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.214 2 DEBUG nova.network.os_vif_util [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Converting VIF {"id": "6f02d077-990b-41dc-87e5-d4dfb5b0397d", "address": "fa:16:3e:ee:93:c6", "network": {"id": "37abdced-e5ee-484f-bdda-8b1a7b19a7f0", "bridge": "br-int", "label": "tempest-TestShelveInstance-1740163448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f69d27778a54a45a6527a19643d1fe0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f02d077-99", "ovs_interfaceid": "6f02d077-990b-41dc-87e5-d4dfb5b0397d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.214 2 DEBUG nova.network.os_vif_util [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ee:93:c6,bridge_name='br-int',has_traffic_filtering=True,id=6f02d077-990b-41dc-87e5-d4dfb5b0397d,network=Network(37abdced-e5ee-484f-bdda-8b1a7b19a7f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6f02d077-99') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.215 2 DEBUG nova.objects.instance [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Lazy-loading 'pci_devices' on Instance uuid f346a37a-3dd5-4a2b-a445-9a0fe47a9194 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.235 2 DEBUG nova.virt.libvirt.driver [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:53:00 compute-0 nova_compute[192810]:   <uuid>f346a37a-3dd5-4a2b-a445-9a0fe47a9194</uuid>
Sep 30 21:53:00 compute-0 nova_compute[192810]:   <name>instance-000000b1</name>
Sep 30 21:53:00 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:53:00 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:53:00 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:53:00 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:       <nova:name>tempest-TestShelveInstance-server-144130882</nova:name>
Sep 30 21:53:00 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:53:00</nova:creationTime>
Sep 30 21:53:00 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:53:00 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:53:00 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:53:00 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:53:00 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:53:00 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:53:00 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:53:00 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:53:00 compute-0 nova_compute[192810]:         <nova:user uuid="2a25e9b373084bfa9bc194a299a3ac4a">tempest-TestShelveInstance-1185384137-project-member</nova:user>
Sep 30 21:53:00 compute-0 nova_compute[192810]:         <nova:project uuid="8f69d27778a54a45a6527a19643d1fe0">tempest-TestShelveInstance-1185384137</nova:project>
Sep 30 21:53:00 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:53:00 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="a10325dd-c3a0-484c-b481-f0815c9caaa6"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:53:00 compute-0 nova_compute[192810]:         <nova:port uuid="6f02d077-990b-41dc-87e5-d4dfb5b0397d">
Sep 30 21:53:00 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:53:00 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:53:00 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:53:00 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:53:00 compute-0 nova_compute[192810]:     <system>
Sep 30 21:53:00 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:53:00 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:53:00 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:53:00 compute-0 nova_compute[192810]:       <entry name="serial">f346a37a-3dd5-4a2b-a445-9a0fe47a9194</entry>
Sep 30 21:53:00 compute-0 nova_compute[192810]:       <entry name="uuid">f346a37a-3dd5-4a2b-a445-9a0fe47a9194</entry>
Sep 30 21:53:00 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     </system>
Sep 30 21:53:00 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:53:00 compute-0 nova_compute[192810]:   <os>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:   </os>
Sep 30 21:53:00 compute-0 nova_compute[192810]:   <features>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:   </features>
Sep 30 21:53:00 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:53:00 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:53:00 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:53:00 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:53:00 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:53:00 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/f346a37a-3dd5-4a2b-a445-9a0fe47a9194/disk"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:53:00 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/f346a37a-3dd5-4a2b-a445-9a0fe47a9194/disk.config"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:53:00 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:ee:93:c6"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:       <target dev="tap6f02d077-99"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:53:00 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/f346a37a-3dd5-4a2b-a445-9a0fe47a9194/console.log" append="off"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     <video>
Sep 30 21:53:00 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     </video>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     <input type="keyboard" bus="usb"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:53:00 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:53:00 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:53:00 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:53:00 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:53:00 compute-0 nova_compute[192810]: </domain>
Sep 30 21:53:00 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.237 2 DEBUG nova.compute.manager [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Preparing to wait for external event network-vif-plugged-6f02d077-990b-41dc-87e5-d4dfb5b0397d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.237 2 DEBUG oslo_concurrency.lockutils [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Acquiring lock "f346a37a-3dd5-4a2b-a445-9a0fe47a9194-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.237 2 DEBUG oslo_concurrency.lockutils [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Lock "f346a37a-3dd5-4a2b-a445-9a0fe47a9194-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.237 2 DEBUG oslo_concurrency.lockutils [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Lock "f346a37a-3dd5-4a2b-a445-9a0fe47a9194-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.238 2 DEBUG nova.virt.libvirt.vif [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-09-30T21:52:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestShelveInstance-server-144130882',display_name='tempest-TestShelveInstance-server-144130882',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testshelveinstance-server-144130882',id=177,image_ref='a10325dd-c3a0-484c-b481-f0815c9caaa6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-TestShelveInstance-1916761761',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:52:13Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='8f69d27778a54a45a6527a19643d1fe0',ramdisk_id='',reservation_id='r-u088hjgo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestShelveInstance-1185384137',owner_user_name='tempest-TestShelveInstance-1185384137-project-member',shelved_at='2025-09-30T21:52:42.035433',shelved_host='compute-0.ctlplane.example.com',shelved_image_id='a10325dd-c3a0-484c-b481-f0815c9caaa6'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:52:52Z,user_data=None,user_id='2a25e9b373084bfa9bc194a299a3ac4a',uuid=f346a37a-3dd5-4a2b-a445-9a0fe47a9194,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "6f02d077-990b-41dc-87e5-d4dfb5b0397d", "address": "fa:16:3e:ee:93:c6", "network": {"id": "37abdced-e5ee-484f-bdda-8b1a7b19a7f0", "bridge": "br-int", "label": "tempest-TestShelveInstance-1740163448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f69d27778a54a45a6527a19643d1fe0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f02d077-99", "ovs_interfaceid": "6f02d077-990b-41dc-87e5-d4dfb5b0397d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.238 2 DEBUG nova.network.os_vif_util [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Converting VIF {"id": "6f02d077-990b-41dc-87e5-d4dfb5b0397d", "address": "fa:16:3e:ee:93:c6", "network": {"id": "37abdced-e5ee-484f-bdda-8b1a7b19a7f0", "bridge": "br-int", "label": "tempest-TestShelveInstance-1740163448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f69d27778a54a45a6527a19643d1fe0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f02d077-99", "ovs_interfaceid": "6f02d077-990b-41dc-87e5-d4dfb5b0397d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.239 2 DEBUG nova.network.os_vif_util [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ee:93:c6,bridge_name='br-int',has_traffic_filtering=True,id=6f02d077-990b-41dc-87e5-d4dfb5b0397d,network=Network(37abdced-e5ee-484f-bdda-8b1a7b19a7f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6f02d077-99') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.239 2 DEBUG os_vif [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:93:c6,bridge_name='br-int',has_traffic_filtering=True,id=6f02d077-990b-41dc-87e5-d4dfb5b0397d,network=Network(37abdced-e5ee-484f-bdda-8b1a7b19a7f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6f02d077-99') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.240 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.240 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.241 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.243 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.243 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6f02d077-99, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.243 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6f02d077-99, col_values=(('external_ids', {'iface-id': '6f02d077-990b-41dc-87e5-d4dfb5b0397d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ee:93:c6', 'vm-uuid': 'f346a37a-3dd5-4a2b-a445-9a0fe47a9194'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:00 compute-0 NetworkManager[51733]: <info>  [1759269180.2461] manager: (tap6f02d077-99): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/317)
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.251 2 INFO os_vif [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ee:93:c6,bridge_name='br-int',has_traffic_filtering=True,id=6f02d077-990b-41dc-87e5-d4dfb5b0397d,network=Network(37abdced-e5ee-484f-bdda-8b1a7b19a7f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6f02d077-99')
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.316 2 DEBUG nova.virt.libvirt.driver [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.317 2 DEBUG nova.virt.libvirt.driver [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.317 2 DEBUG nova.virt.libvirt.driver [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] No VIF found with MAC fa:16:3e:ee:93:c6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.318 2 INFO nova.virt.libvirt.driver [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Using config drive
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.339 2 DEBUG nova.objects.instance [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Lazy-loading 'ec2_ids' on Instance uuid f346a37a-3dd5-4a2b-a445-9a0fe47a9194 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.416 2 DEBUG nova.objects.instance [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Lazy-loading 'keypairs' on Instance uuid f346a37a-3dd5-4a2b-a445-9a0fe47a9194 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.756 2 INFO nova.virt.libvirt.driver [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Creating config drive at /var/lib/nova/instances/f346a37a-3dd5-4a2b-a445-9a0fe47a9194/disk.config
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.761 2 DEBUG oslo_concurrency.processutils [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f346a37a-3dd5-4a2b-a445-9a0fe47a9194/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmg8sypvt execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.889 2 DEBUG oslo_concurrency.processutils [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f346a37a-3dd5-4a2b-a445-9a0fe47a9194/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmg8sypvt" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:53:00 compute-0 kernel: tap6f02d077-99: entered promiscuous mode
Sep 30 21:53:00 compute-0 NetworkManager[51733]: <info>  [1759269180.9487] manager: (tap6f02d077-99): new Tun device (/org/freedesktop/NetworkManager/Devices/318)
Sep 30 21:53:00 compute-0 ovn_controller[94912]: 2025-09-30T21:53:00Z|00723|binding|INFO|Claiming lport 6f02d077-990b-41dc-87e5-d4dfb5b0397d for this chassis.
Sep 30 21:53:00 compute-0 ovn_controller[94912]: 2025-09-30T21:53:00Z|00724|binding|INFO|6f02d077-990b-41dc-87e5-d4dfb5b0397d: Claiming fa:16:3e:ee:93:c6 10.100.0.13
Sep 30 21:53:00 compute-0 systemd-udevd[249372]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.993 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:00 compute-0 nova_compute[192810]: 2025-09-30 21:53:00.998 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:01 compute-0 NetworkManager[51733]: <info>  [1759269181.0053] device (tap6f02d077-99): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:53:01 compute-0 NetworkManager[51733]: <info>  [1759269181.0064] device (tap6f02d077-99): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:53:01 compute-0 NetworkManager[51733]: <info>  [1759269181.0135] manager: (patch-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/319)
Sep 30 21:53:01 compute-0 nova_compute[192810]: 2025-09-30 21:53:01.012 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:01 compute-0 NetworkManager[51733]: <info>  [1759269181.0142] manager: (patch-br-int-to-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/320)
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:01.022 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ee:93:c6 10.100.0.13'], port_security=['fa:16:3e:ee:93:c6 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'f346a37a-3dd5-4a2b-a445-9a0fe47a9194', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-37abdced-e5ee-484f-bdda-8b1a7b19a7f0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f69d27778a54a45a6527a19643d1fe0', 'neutron:revision_number': '7', 'neutron:security_group_ids': 'a1f2b095-bc71-4c21-8e62-a47f9c684104', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.217'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cee6e55-2dcb-411d-b86b-9118cc3f4c81, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=6f02d077-990b-41dc-87e5-d4dfb5b0397d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:01.023 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 6f02d077-990b-41dc-87e5-d4dfb5b0397d in datapath 37abdced-e5ee-484f-bdda-8b1a7b19a7f0 bound to our chassis
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:01.025 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 37abdced-e5ee-484f-bdda-8b1a7b19a7f0
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:01.035 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[f43e49a1-ed14-4e91-b170-750532cf846d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:01.036 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap37abdced-e1 in ovnmeta-37abdced-e5ee-484f-bdda-8b1a7b19a7f0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:01.038 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap37abdced-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:01.038 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[866897a1-6efd-4d8e-a041-65064d270286]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:01.039 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[4f9374f4-a5d1-484d-80fb-49e43934d0b8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:01.049 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[5f0f59ac-07d9-4780-a336-08be919e8cf8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:01.076 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[4f4b7e14-6ea8-4544-b93d-bfdf2eb9e464]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:01.106 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[1a9ee05d-616f-4e2f-adf2-9a2804c8b0aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:01 compute-0 systemd-machined[152794]: New machine qemu-87-instance-000000b1.
Sep 30 21:53:01 compute-0 systemd-udevd[249374]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:53:01 compute-0 NetworkManager[51733]: <info>  [1759269181.1210] manager: (tap37abdced-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/321)
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:01.122 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[87ba5f9f-f670-437c-9ab2-794ed6ba69cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:01.154 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[c94ec9f3-84b4-470b-824c-85d4862818c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:01.157 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[a6de6591-f5da-46ad-bb32-ef30b8332a11]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:01 compute-0 systemd[1]: Started Virtual Machine qemu-87-instance-000000b1.
Sep 30 21:53:01 compute-0 NetworkManager[51733]: <info>  [1759269181.1769] device (tap37abdced-e0): carrier: link connected
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:01.182 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[d3b43b67-9ade-4683-9593-6c1ef8806204]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:01.199 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[5ef95821-35a1-4086-a730-b46db2a6cfc2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap37abdced-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:89:17:39'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 218], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 587680, 'reachable_time': 19414, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249403, 'error': None, 'target': 'ovnmeta-37abdced-e5ee-484f-bdda-8b1a7b19a7f0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:01.212 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[d2bba6be-6373-4a5e-9c79-04cee3eb081d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe89:1739'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 587680, 'tstamp': 587680}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 249408, 'error': None, 'target': 'ovnmeta-37abdced-e5ee-484f-bdda-8b1a7b19a7f0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:01.228 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[5996f4fa-75a3-4c8e-97cf-c35475e2e5c0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap37abdced-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:89:17:39'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 218], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 587680, 'reachable_time': 19414, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 192, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 192, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 249409, 'error': None, 'target': 'ovnmeta-37abdced-e5ee-484f-bdda-8b1a7b19a7f0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:01 compute-0 nova_compute[192810]: 2025-09-30 21:53:01.241 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:01.258 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e73c2e12-4c55-4cc3-a4c1-0e6cb853e445]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:01 compute-0 nova_compute[192810]: 2025-09-30 21:53:01.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:01 compute-0 ovn_controller[94912]: 2025-09-30T21:53:01Z|00725|binding|INFO|Setting lport 6f02d077-990b-41dc-87e5-d4dfb5b0397d ovn-installed in OVS
Sep 30 21:53:01 compute-0 ovn_controller[94912]: 2025-09-30T21:53:01Z|00726|binding|INFO|Setting lport 6f02d077-990b-41dc-87e5-d4dfb5b0397d up in Southbound
Sep 30 21:53:01 compute-0 nova_compute[192810]: 2025-09-30 21:53:01.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:01.313 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[2bb85742-4320-4b91-b13e-c93eed4b14e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:01.315 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap37abdced-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:01.315 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:01.315 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap37abdced-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:53:01 compute-0 nova_compute[192810]: 2025-09-30 21:53:01.317 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:01 compute-0 kernel: tap37abdced-e0: entered promiscuous mode
Sep 30 21:53:01 compute-0 NetworkManager[51733]: <info>  [1759269181.3178] manager: (tap37abdced-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/322)
Sep 30 21:53:01 compute-0 nova_compute[192810]: 2025-09-30 21:53:01.319 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:01.319 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap37abdced-e0, col_values=(('external_ids', {'iface-id': 'cf7ed38e-459c-4152-addb-db64052a2be8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:53:01 compute-0 nova_compute[192810]: 2025-09-30 21:53:01.320 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:01 compute-0 ovn_controller[94912]: 2025-09-30T21:53:01Z|00727|binding|INFO|Releasing lport cf7ed38e-459c-4152-addb-db64052a2be8 from this chassis (sb_readonly=0)
Sep 30 21:53:01 compute-0 nova_compute[192810]: 2025-09-30 21:53:01.332 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:01.333 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/37abdced-e5ee-484f-bdda-8b1a7b19a7f0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/37abdced-e5ee-484f-bdda-8b1a7b19a7f0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:01.334 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[7403c25c-227e-40c3-8e3c-3b42b86a4bbd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:01.335 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-37abdced-e5ee-484f-bdda-8b1a7b19a7f0
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/37abdced-e5ee-484f-bdda-8b1a7b19a7f0.pid.haproxy
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID 37abdced-e5ee-484f-bdda-8b1a7b19a7f0
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:01.335 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-37abdced-e5ee-484f-bdda-8b1a7b19a7f0', 'env', 'PROCESS_TAG=haproxy-37abdced-e5ee-484f-bdda-8b1a7b19a7f0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/37abdced-e5ee-484f-bdda-8b1a7b19a7f0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:53:01 compute-0 unix_chkpwd[249434]: password check failed for user (root)
Sep 30 21:53:01 compute-0 podman[249440]: 2025-09-30 21:53:01.762428125 +0000 UTC m=+0.100325412 container create dd482c1c5f5585873f59e091b565878b46e1f626f399de4e7167b16c76110516 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-37abdced-e5ee-484f-bdda-8b1a7b19a7f0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Sep 30 21:53:01 compute-0 podman[249440]: 2025-09-30 21:53:01.684708135 +0000 UTC m=+0.022605452 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:53:01 compute-0 nova_compute[192810]: 2025-09-30 21:53:01.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:53:01 compute-0 systemd[1]: Started libpod-conmon-dd482c1c5f5585873f59e091b565878b46e1f626f399de4e7167b16c76110516.scope.
Sep 30 21:53:01 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:53:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4721a1d5ecd66bb2ca025f5187d7463b1e20137275fb21032b9df1b8861d088a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:53:01 compute-0 podman[249440]: 2025-09-30 21:53:01.866055388 +0000 UTC m=+0.203952695 container init dd482c1c5f5585873f59e091b565878b46e1f626f399de4e7167b16c76110516 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-37abdced-e5ee-484f-bdda-8b1a7b19a7f0, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:53:01 compute-0 podman[249440]: 2025-09-30 21:53:01.871302818 +0000 UTC m=+0.209200095 container start dd482c1c5f5585873f59e091b565878b46e1f626f399de4e7167b16c76110516 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-37abdced-e5ee-484f-bdda-8b1a7b19a7f0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:53:01 compute-0 neutron-haproxy-ovnmeta-37abdced-e5ee-484f-bdda-8b1a7b19a7f0[249462]: [NOTICE]   (249466) : New worker (249468) forked
Sep 30 21:53:01 compute-0 neutron-haproxy-ovnmeta-37abdced-e5ee-484f-bdda-8b1a7b19a7f0[249462]: [NOTICE]   (249466) : Loading success.
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:01.927 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a2:75:8c 10.100.0.2 2001:db8::f816:3eff:fea2:758c'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fea2:758c/64', 'neutron:device_id': 'ovnmeta-59f11ff9-50c1-45e8-ac0d-a61faf820997', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-59f11ff9-50c1-45e8-ac0d-a61faf820997', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=94a95e41-6166-48de-bdf2-67ffa578edb2, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=7661d9ad-20f6-48e0-8cdf-e095331fbd29) old=Port_Binding(mac=['fa:16:3e:a2:75:8c 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-59f11ff9-50c1-45e8-ac0d-a61faf820997', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-59f11ff9-50c1-45e8-ac0d-a61faf820997', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:01.951 103867 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 7661d9ad-20f6-48e0-8cdf-e095331fbd29 in datapath 59f11ff9-50c1-45e8-ac0d-a61faf820997 updated
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:01.954 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 59f11ff9-50c1-45e8-ac0d-a61faf820997, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:53:01 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:01.955 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[523de3d7-d6fd-4374-9411-8b0cbbbc3e41]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:02 compute-0 nova_compute[192810]: 2025-09-30 21:53:02.025 2 DEBUG nova.compute.manager [req-558276c5-4bbe-4cb7-b016-ccfa22ce105c req-629e3445-2c66-45ad-aece-432fd1da0010 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Received event network-vif-plugged-6f02d077-990b-41dc-87e5-d4dfb5b0397d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:53:02 compute-0 nova_compute[192810]: 2025-09-30 21:53:02.026 2 DEBUG oslo_concurrency.lockutils [req-558276c5-4bbe-4cb7-b016-ccfa22ce105c req-629e3445-2c66-45ad-aece-432fd1da0010 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "f346a37a-3dd5-4a2b-a445-9a0fe47a9194-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:53:02 compute-0 nova_compute[192810]: 2025-09-30 21:53:02.026 2 DEBUG oslo_concurrency.lockutils [req-558276c5-4bbe-4cb7-b016-ccfa22ce105c req-629e3445-2c66-45ad-aece-432fd1da0010 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "f346a37a-3dd5-4a2b-a445-9a0fe47a9194-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:53:02 compute-0 nova_compute[192810]: 2025-09-30 21:53:02.027 2 DEBUG oslo_concurrency.lockutils [req-558276c5-4bbe-4cb7-b016-ccfa22ce105c req-629e3445-2c66-45ad-aece-432fd1da0010 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "f346a37a-3dd5-4a2b-a445-9a0fe47a9194-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:53:02 compute-0 nova_compute[192810]: 2025-09-30 21:53:02.027 2 DEBUG nova.compute.manager [req-558276c5-4bbe-4cb7-b016-ccfa22ce105c req-629e3445-2c66-45ad-aece-432fd1da0010 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Processing event network-vif-plugged-6f02d077-990b-41dc-87e5-d4dfb5b0397d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:53:02 compute-0 nova_compute[192810]: 2025-09-30 21:53:02.169 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759269182.168975, f346a37a-3dd5-4a2b-a445-9a0fe47a9194 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:53:02 compute-0 nova_compute[192810]: 2025-09-30 21:53:02.170 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] VM Started (Lifecycle Event)
Sep 30 21:53:02 compute-0 nova_compute[192810]: 2025-09-30 21:53:02.173 2 DEBUG nova.compute.manager [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:53:02 compute-0 nova_compute[192810]: 2025-09-30 21:53:02.176 2 DEBUG nova.virt.libvirt.driver [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:53:02 compute-0 nova_compute[192810]: 2025-09-30 21:53:02.179 2 INFO nova.virt.libvirt.driver [-] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Instance spawned successfully.
Sep 30 21:53:02 compute-0 nova_compute[192810]: 2025-09-30 21:53:02.193 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:53:02 compute-0 nova_compute[192810]: 2025-09-30 21:53:02.197 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Synchronizing instance power state after lifecycle event "Started"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:53:02 compute-0 nova_compute[192810]: 2025-09-30 21:53:02.228 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:53:02 compute-0 nova_compute[192810]: 2025-09-30 21:53:02.229 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759269182.1690843, f346a37a-3dd5-4a2b-a445-9a0fe47a9194 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:53:02 compute-0 nova_compute[192810]: 2025-09-30 21:53:02.229 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] VM Paused (Lifecycle Event)
Sep 30 21:53:02 compute-0 nova_compute[192810]: 2025-09-30 21:53:02.257 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:53:02 compute-0 nova_compute[192810]: 2025-09-30 21:53:02.261 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759269182.1754904, f346a37a-3dd5-4a2b-a445-9a0fe47a9194 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:53:02 compute-0 nova_compute[192810]: 2025-09-30 21:53:02.261 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] VM Resumed (Lifecycle Event)
Sep 30 21:53:02 compute-0 nova_compute[192810]: 2025-09-30 21:53:02.288 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:53:02 compute-0 nova_compute[192810]: 2025-09-30 21:53:02.291 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:53:02 compute-0 nova_compute[192810]: 2025-09-30 21:53:02.318 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:53:02 compute-0 nova_compute[192810]: 2025-09-30 21:53:02.944 2 DEBUG nova.compute.manager [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:53:03 compute-0 nova_compute[192810]: 2025-09-30 21:53:03.051 2 DEBUG oslo_concurrency.lockutils [None req-032f6a14-f67f-4946-9f97-59a1d5c03eec 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Lock "f346a37a-3dd5-4a2b-a445-9a0fe47a9194" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 11.233s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:53:03 compute-0 nova_compute[192810]: 2025-09-30 21:53:03.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:03 compute-0 sshd-session[249254]: Failed password for root from 8.210.178.40 port 58946 ssh2
Sep 30 21:53:03 compute-0 nova_compute[192810]: 2025-09-30 21:53:03.589 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:03 compute-0 nova_compute[192810]: 2025-09-30 21:53:03.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:53:03 compute-0 nova_compute[192810]: 2025-09-30 21:53:03.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:53:04 compute-0 nova_compute[192810]: 2025-09-30 21:53:04.177 2 DEBUG nova.compute.manager [req-42bb024c-3580-4956-92a3-0ed19e568781 req-6f994981-b55a-43e1-b6ba-6907a35d57ee dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Received event network-vif-plugged-6f02d077-990b-41dc-87e5-d4dfb5b0397d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:53:04 compute-0 nova_compute[192810]: 2025-09-30 21:53:04.178 2 DEBUG oslo_concurrency.lockutils [req-42bb024c-3580-4956-92a3-0ed19e568781 req-6f994981-b55a-43e1-b6ba-6907a35d57ee dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "f346a37a-3dd5-4a2b-a445-9a0fe47a9194-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:53:04 compute-0 nova_compute[192810]: 2025-09-30 21:53:04.179 2 DEBUG oslo_concurrency.lockutils [req-42bb024c-3580-4956-92a3-0ed19e568781 req-6f994981-b55a-43e1-b6ba-6907a35d57ee dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "f346a37a-3dd5-4a2b-a445-9a0fe47a9194-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:53:04 compute-0 nova_compute[192810]: 2025-09-30 21:53:04.179 2 DEBUG oslo_concurrency.lockutils [req-42bb024c-3580-4956-92a3-0ed19e568781 req-6f994981-b55a-43e1-b6ba-6907a35d57ee dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "f346a37a-3dd5-4a2b-a445-9a0fe47a9194-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:53:04 compute-0 nova_compute[192810]: 2025-09-30 21:53:04.180 2 DEBUG nova.compute.manager [req-42bb024c-3580-4956-92a3-0ed19e568781 req-6f994981-b55a-43e1-b6ba-6907a35d57ee dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] No waiting events found dispatching network-vif-plugged-6f02d077-990b-41dc-87e5-d4dfb5b0397d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:53:04 compute-0 nova_compute[192810]: 2025-09-30 21:53:04.180 2 WARNING nova.compute.manager [req-42bb024c-3580-4956-92a3-0ed19e568781 req-6f994981-b55a-43e1-b6ba-6907a35d57ee dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Received unexpected event network-vif-plugged-6f02d077-990b-41dc-87e5-d4dfb5b0397d for instance with vm_state active and task_state None.
Sep 30 21:53:05 compute-0 nova_compute[192810]: 2025-09-30 21:53:05.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:05 compute-0 podman[249478]: 2025-09-30 21:53:05.32835365 +0000 UTC m=+0.065050356 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 21:53:05 compute-0 podman[249479]: 2025-09-30 21:53:05.356025927 +0000 UTC m=+0.092323693 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, name=ubi9-minimal, io.openshift.expose-services=, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vcs-type=git, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, container_name=openstack_network_exporter)
Sep 30 21:53:05 compute-0 unix_chkpwd[249521]: password check failed for user (root)
Sep 30 21:53:05 compute-0 nova_compute[192810]: 2025-09-30 21:53:05.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:53:07 compute-0 sshd-session[249254]: Failed password for root from 8.210.178.40 port 58946 ssh2
Sep 30 21:53:07 compute-0 sshd-session[249254]: error: maximum authentication attempts exceeded for root from 8.210.178.40 port 58946 ssh2 [preauth]
Sep 30 21:53:07 compute-0 sshd-session[249254]: Disconnecting authenticating user root 8.210.178.40 port 58946: Too many authentication failures [preauth]
Sep 30 21:53:07 compute-0 sshd-session[249254]: PAM 5 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40  user=root
Sep 30 21:53:07 compute-0 sshd-session[249254]: PAM service(sshd) ignoring max retries; 6 > 3
Sep 30 21:53:08 compute-0 nova_compute[192810]: 2025-09-30 21:53:08.590 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:09 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:09.413 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a2:75:8c 10.100.0.2 2001:db8:0:1:f816:3eff:fea2:758c 2001:db8::f816:3eff:fea2:758c'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8:0:1:f816:3eff:fea2:758c/64 2001:db8::f816:3eff:fea2:758c/64', 'neutron:device_id': 'ovnmeta-59f11ff9-50c1-45e8-ac0d-a61faf820997', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-59f11ff9-50c1-45e8-ac0d-a61faf820997', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=94a95e41-6166-48de-bdf2-67ffa578edb2, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=7661d9ad-20f6-48e0-8cdf-e095331fbd29) old=Port_Binding(mac=['fa:16:3e:a2:75:8c 10.100.0.2 2001:db8::f816:3eff:fea2:758c'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fea2:758c/64', 'neutron:device_id': 'ovnmeta-59f11ff9-50c1-45e8-ac0d-a61faf820997', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-59f11ff9-50c1-45e8-ac0d-a61faf820997', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:53:09 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:09.415 103867 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 7661d9ad-20f6-48e0-8cdf-e095331fbd29 in datapath 59f11ff9-50c1-45e8-ac0d-a61faf820997 updated
Sep 30 21:53:09 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:09.418 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 59f11ff9-50c1-45e8-ac0d-a61faf820997, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:53:09 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:09.425 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[6d69a4ad-c102-4383-bc57-24aebf7b3f9a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:09 compute-0 unix_chkpwd[249524]: password check failed for user (root)
Sep 30 21:53:09 compute-0 sshd-session[249522]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40  user=root
Sep 30 21:53:10 compute-0 nova_compute[192810]: 2025-09-30 21:53:10.249 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:10 compute-0 nova_compute[192810]: 2025-09-30 21:53:10.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:53:10 compute-0 nova_compute[192810]: 2025-09-30 21:53:10.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:53:10 compute-0 nova_compute[192810]: 2025-09-30 21:53:10.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:53:10 compute-0 nova_compute[192810]: 2025-09-30 21:53:10.988 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "refresh_cache-f346a37a-3dd5-4a2b-a445-9a0fe47a9194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:53:10 compute-0 nova_compute[192810]: 2025-09-30 21:53:10.989 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquired lock "refresh_cache-f346a37a-3dd5-4a2b-a445-9a0fe47a9194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:53:10 compute-0 nova_compute[192810]: 2025-09-30 21:53:10.989 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Sep 30 21:53:10 compute-0 nova_compute[192810]: 2025-09-30 21:53:10.989 2 DEBUG nova.objects.instance [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lazy-loading 'info_cache' on Instance uuid f346a37a-3dd5-4a2b-a445-9a0fe47a9194 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:53:11 compute-0 sshd-session[249522]: Failed password for root from 8.210.178.40 port 59580 ssh2
Sep 30 21:53:12 compute-0 podman[249527]: 2025-09-30 21:53:12.338639635 +0000 UTC m=+0.058959316 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:53:12 compute-0 podman[249525]: 2025-09-30 21:53:12.351668868 +0000 UTC m=+0.080580722 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 21:53:12 compute-0 podman[249526]: 2025-09-30 21:53:12.356522959 +0000 UTC m=+0.073813735 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=iscsid, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 21:53:13 compute-0 unix_chkpwd[249586]: password check failed for user (root)
Sep 30 21:53:13 compute-0 nova_compute[192810]: 2025-09-30 21:53:13.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:14 compute-0 nova_compute[192810]: 2025-09-30 21:53:14.079 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Updating instance_info_cache with network_info: [{"id": "6f02d077-990b-41dc-87e5-d4dfb5b0397d", "address": "fa:16:3e:ee:93:c6", "network": {"id": "37abdced-e5ee-484f-bdda-8b1a7b19a7f0", "bridge": "br-int", "label": "tempest-TestShelveInstance-1740163448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f69d27778a54a45a6527a19643d1fe0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f02d077-99", "ovs_interfaceid": "6f02d077-990b-41dc-87e5-d4dfb5b0397d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:53:14 compute-0 nova_compute[192810]: 2025-09-30 21:53:14.114 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Releasing lock "refresh_cache-f346a37a-3dd5-4a2b-a445-9a0fe47a9194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:53:14 compute-0 nova_compute[192810]: 2025-09-30 21:53:14.115 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Sep 30 21:53:14 compute-0 nova_compute[192810]: 2025-09-30 21:53:14.117 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:53:14 compute-0 nova_compute[192810]: 2025-09-30 21:53:14.117 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:53:14 compute-0 nova_compute[192810]: 2025-09-30 21:53:14.118 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:53:15 compute-0 sshd-session[249522]: Failed password for root from 8.210.178.40 port 59580 ssh2
Sep 30 21:53:15 compute-0 nova_compute[192810]: 2025-09-30 21:53:15.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:15 compute-0 nova_compute[192810]: 2025-09-30 21:53:15.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:53:15 compute-0 nova_compute[192810]: 2025-09-30 21:53:15.789 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:53:15 compute-0 nova_compute[192810]: 2025-09-30 21:53:15.812 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:53:15 compute-0 nova_compute[192810]: 2025-09-30 21:53:15.813 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:53:15 compute-0 nova_compute[192810]: 2025-09-30 21:53:15.813 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:53:15 compute-0 nova_compute[192810]: 2025-09-30 21:53:15.814 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:53:15 compute-0 nova_compute[192810]: 2025-09-30 21:53:15.884 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f346a37a-3dd5-4a2b-a445-9a0fe47a9194/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:53:15 compute-0 nova_compute[192810]: 2025-09-30 21:53:15.950 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f346a37a-3dd5-4a2b-a445-9a0fe47a9194/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:53:15 compute-0 nova_compute[192810]: 2025-09-30 21:53:15.951 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f346a37a-3dd5-4a2b-a445-9a0fe47a9194/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:53:16 compute-0 nova_compute[192810]: 2025-09-30 21:53:16.015 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f346a37a-3dd5-4a2b-a445-9a0fe47a9194/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:53:16 compute-0 unix_chkpwd[249601]: password check failed for user (root)
Sep 30 21:53:16 compute-0 nova_compute[192810]: 2025-09-30 21:53:16.171 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:53:16 compute-0 nova_compute[192810]: 2025-09-30 21:53:16.173 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5593MB free_disk=73.13264465332031GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:53:16 compute-0 nova_compute[192810]: 2025-09-30 21:53:16.173 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:53:16 compute-0 nova_compute[192810]: 2025-09-30 21:53:16.174 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:53:16 compute-0 nova_compute[192810]: 2025-09-30 21:53:16.249 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Instance f346a37a-3dd5-4a2b-a445-9a0fe47a9194 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:53:16 compute-0 nova_compute[192810]: 2025-09-30 21:53:16.250 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:53:16 compute-0 nova_compute[192810]: 2025-09-30 21:53:16.250 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:53:16 compute-0 nova_compute[192810]: 2025-09-30 21:53:16.287 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:53:16 compute-0 nova_compute[192810]: 2025-09-30 21:53:16.304 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:53:16 compute-0 nova_compute[192810]: 2025-09-30 21:53:16.374 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:53:16 compute-0 nova_compute[192810]: 2025-09-30 21:53:16.375 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.201s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:53:16 compute-0 ovn_controller[94912]: 2025-09-30T21:53:16Z|00081|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ee:93:c6 10.100.0.13
Sep 30 21:53:18 compute-0 sshd-session[249522]: Failed password for root from 8.210.178.40 port 59580 ssh2
Sep 30 21:53:18 compute-0 nova_compute[192810]: 2025-09-30 21:53:18.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:20 compute-0 unix_chkpwd[249602]: password check failed for user (root)
Sep 30 21:53:20 compute-0 nova_compute[192810]: 2025-09-30 21:53:20.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:20 compute-0 ovn_controller[94912]: 2025-09-30T21:53:20Z|00728|binding|INFO|Releasing lport cf7ed38e-459c-4152-addb-db64052a2be8 from this chassis (sb_readonly=0)
Sep 30 21:53:20 compute-0 nova_compute[192810]: 2025-09-30 21:53:20.892 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:21 compute-0 sshd-session[249522]: Failed password for root from 8.210.178.40 port 59580 ssh2
Sep 30 21:53:22 compute-0 unix_chkpwd[249603]: password check failed for user (root)
Sep 30 21:53:23 compute-0 nova_compute[192810]: 2025-09-30 21:53:23.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:24 compute-0 sshd-session[249522]: Failed password for root from 8.210.178.40 port 59580 ssh2
Sep 30 21:53:24 compute-0 ovn_controller[94912]: 2025-09-30T21:53:24Z|00729|binding|INFO|Releasing lport cf7ed38e-459c-4152-addb-db64052a2be8 from this chassis (sb_readonly=0)
Sep 30 21:53:24 compute-0 nova_compute[192810]: 2025-09-30 21:53:24.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:25 compute-0 nova_compute[192810]: 2025-09-30 21:53:25.321 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:26 compute-0 unix_chkpwd[249604]: password check failed for user (root)
Sep 30 21:53:26 compute-0 podman[249607]: 2025-09-30 21:53:26.362644495 +0000 UTC m=+0.082793477 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm)
Sep 30 21:53:26 compute-0 podman[249606]: 2025-09-30 21:53:26.364260805 +0000 UTC m=+0.088401196 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:53:26 compute-0 podman[249605]: 2025-09-30 21:53:26.399360436 +0000 UTC m=+0.131375372 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Sep 30 21:53:27 compute-0 sshd-session[249522]: Failed password for root from 8.210.178.40 port 59580 ssh2
Sep 30 21:53:27 compute-0 sshd-session[249522]: error: maximum authentication attempts exceeded for root from 8.210.178.40 port 59580 ssh2 [preauth]
Sep 30 21:53:27 compute-0 sshd-session[249522]: Disconnecting authenticating user root 8.210.178.40 port 59580: Too many authentication failures [preauth]
Sep 30 21:53:27 compute-0 sshd-session[249522]: PAM 5 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40  user=root
Sep 30 21:53:27 compute-0 sshd-session[249522]: PAM service(sshd) ignoring max retries; 6 > 3
Sep 30 21:53:28 compute-0 nova_compute[192810]: 2025-09-30 21:53:28.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:29 compute-0 unix_chkpwd[249669]: password check failed for user (root)
Sep 30 21:53:29 compute-0 sshd-session[249667]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40  user=root
Sep 30 21:53:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:30.090 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=43, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=42) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:53:30 compute-0 nova_compute[192810]: 2025-09-30 21:53:30.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:30.091 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:53:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:30.092 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '43'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:53:30 compute-0 nova_compute[192810]: 2025-09-30 21:53:30.323 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:31 compute-0 sshd-session[249667]: Failed password for root from 8.210.178.40 port 60296 ssh2
Sep 30 21:53:31 compute-0 unix_chkpwd[249670]: password check failed for user (root)
Sep 30 21:53:32 compute-0 sshd-session[249667]: Failed password for root from 8.210.178.40 port 60296 ssh2
Sep 30 21:53:33 compute-0 nova_compute[192810]: 2025-09-30 21:53:33.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:33 compute-0 nova_compute[192810]: 2025-09-30 21:53:33.655 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:34 compute-0 sshd-session[249667]: Disconnecting authenticating user root 8.210.178.40 port 60296: Change of username or service not allowed: (root,ssh-connection) -> (admin,ssh-connection) [preauth]
Sep 30 21:53:34 compute-0 sshd-session[249667]: PAM 1 more authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40  user=root
Sep 30 21:53:35 compute-0 nova_compute[192810]: 2025-09-30 21:53:35.326 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:36 compute-0 sshd-session[249671]: Invalid user admin from 8.210.178.40 port 60572
Sep 30 21:53:36 compute-0 sshd-session[249671]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:53:36 compute-0 sshd-session[249671]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40
Sep 30 21:53:36 compute-0 podman[249673]: 2025-09-30 21:53:36.251545037 +0000 UTC m=+0.056589726 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:53:36 compute-0 podman[249674]: 2025-09-30 21:53:36.273774369 +0000 UTC m=+0.063749484 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, distribution-scope=public, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., release=1755695350, version=9.6)
Sep 30 21:53:37 compute-0 nova_compute[192810]: 2025-09-30 21:53:37.535 2 DEBUG nova.compute.manager [req-4b5a9d18-9b8f-4426-b3af-aa1dee3f8f7f req-5a03e5c5-e942-40fe-a90f-58393991ca80 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Received event network-changed-6f02d077-990b-41dc-87e5-d4dfb5b0397d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:53:37 compute-0 nova_compute[192810]: 2025-09-30 21:53:37.536 2 DEBUG nova.compute.manager [req-4b5a9d18-9b8f-4426-b3af-aa1dee3f8f7f req-5a03e5c5-e942-40fe-a90f-58393991ca80 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Refreshing instance network info cache due to event network-changed-6f02d077-990b-41dc-87e5-d4dfb5b0397d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:53:37 compute-0 nova_compute[192810]: 2025-09-30 21:53:37.536 2 DEBUG oslo_concurrency.lockutils [req-4b5a9d18-9b8f-4426-b3af-aa1dee3f8f7f req-5a03e5c5-e942-40fe-a90f-58393991ca80 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-f346a37a-3dd5-4a2b-a445-9a0fe47a9194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:53:37 compute-0 nova_compute[192810]: 2025-09-30 21:53:37.536 2 DEBUG oslo_concurrency.lockutils [req-4b5a9d18-9b8f-4426-b3af-aa1dee3f8f7f req-5a03e5c5-e942-40fe-a90f-58393991ca80 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-f346a37a-3dd5-4a2b-a445-9a0fe47a9194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:53:37 compute-0 nova_compute[192810]: 2025-09-30 21:53:37.536 2 DEBUG nova.network.neutron [req-4b5a9d18-9b8f-4426-b3af-aa1dee3f8f7f req-5a03e5c5-e942-40fe-a90f-58393991ca80 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Refreshing network info cache for port 6f02d077-990b-41dc-87e5-d4dfb5b0397d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:53:37 compute-0 nova_compute[192810]: 2025-09-30 21:53:37.663 2 DEBUG oslo_concurrency.lockutils [None req-4b0a7c2f-a208-4c20-ad42-3a5463d1326a 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Acquiring lock "f346a37a-3dd5-4a2b-a445-9a0fe47a9194" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:53:37 compute-0 nova_compute[192810]: 2025-09-30 21:53:37.664 2 DEBUG oslo_concurrency.lockutils [None req-4b0a7c2f-a208-4c20-ad42-3a5463d1326a 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Lock "f346a37a-3dd5-4a2b-a445-9a0fe47a9194" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:53:37 compute-0 nova_compute[192810]: 2025-09-30 21:53:37.664 2 DEBUG oslo_concurrency.lockutils [None req-4b0a7c2f-a208-4c20-ad42-3a5463d1326a 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Acquiring lock "f346a37a-3dd5-4a2b-a445-9a0fe47a9194-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:53:37 compute-0 nova_compute[192810]: 2025-09-30 21:53:37.665 2 DEBUG oslo_concurrency.lockutils [None req-4b0a7c2f-a208-4c20-ad42-3a5463d1326a 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Lock "f346a37a-3dd5-4a2b-a445-9a0fe47a9194-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:53:37 compute-0 nova_compute[192810]: 2025-09-30 21:53:37.665 2 DEBUG oslo_concurrency.lockutils [None req-4b0a7c2f-a208-4c20-ad42-3a5463d1326a 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Lock "f346a37a-3dd5-4a2b-a445-9a0fe47a9194-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:53:37 compute-0 nova_compute[192810]: 2025-09-30 21:53:37.691 2 INFO nova.compute.manager [None req-4b0a7c2f-a208-4c20-ad42-3a5463d1326a 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Terminating instance
Sep 30 21:53:37 compute-0 nova_compute[192810]: 2025-09-30 21:53:37.704 2 DEBUG nova.compute.manager [None req-4b0a7c2f-a208-4c20-ad42-3a5463d1326a 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:53:37 compute-0 kernel: tap6f02d077-99 (unregistering): left promiscuous mode
Sep 30 21:53:37 compute-0 NetworkManager[51733]: <info>  [1759269217.7319] device (tap6f02d077-99): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:53:37 compute-0 ovn_controller[94912]: 2025-09-30T21:53:37Z|00730|binding|INFO|Releasing lport 6f02d077-990b-41dc-87e5-d4dfb5b0397d from this chassis (sb_readonly=0)
Sep 30 21:53:37 compute-0 nova_compute[192810]: 2025-09-30 21:53:37.739 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:37 compute-0 ovn_controller[94912]: 2025-09-30T21:53:37Z|00731|binding|INFO|Setting lport 6f02d077-990b-41dc-87e5-d4dfb5b0397d down in Southbound
Sep 30 21:53:37 compute-0 ovn_controller[94912]: 2025-09-30T21:53:37Z|00732|binding|INFO|Removing iface tap6f02d077-99 ovn-installed in OVS
Sep 30 21:53:37 compute-0 nova_compute[192810]: 2025-09-30 21:53:37.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:37.752 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ee:93:c6 10.100.0.13'], port_security=['fa:16:3e:ee:93:c6 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'f346a37a-3dd5-4a2b-a445-9a0fe47a9194', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-37abdced-e5ee-484f-bdda-8b1a7b19a7f0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8f69d27778a54a45a6527a19643d1fe0', 'neutron:revision_number': '9', 'neutron:security_group_ids': 'a1f2b095-bc71-4c21-8e62-a47f9c684104', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0cee6e55-2dcb-411d-b86b-9118cc3f4c81, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=6f02d077-990b-41dc-87e5-d4dfb5b0397d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:53:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:37.754 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 6f02d077-990b-41dc-87e5-d4dfb5b0397d in datapath 37abdced-e5ee-484f-bdda-8b1a7b19a7f0 unbound from our chassis
Sep 30 21:53:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:37.755 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 37abdced-e5ee-484f-bdda-8b1a7b19a7f0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:53:37 compute-0 nova_compute[192810]: 2025-09-30 21:53:37.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:37.758 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[c753ad88-212c-493b-83f8-29c7330841cd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:37.759 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-37abdced-e5ee-484f-bdda-8b1a7b19a7f0 namespace which is not needed anymore
Sep 30 21:53:37 compute-0 systemd[1]: machine-qemu\x2d87\x2dinstance\x2d000000b1.scope: Deactivated successfully.
Sep 30 21:53:37 compute-0 systemd[1]: machine-qemu\x2d87\x2dinstance\x2d000000b1.scope: Consumed 15.185s CPU time.
Sep 30 21:53:37 compute-0 systemd-machined[152794]: Machine qemu-87-instance-000000b1 terminated.
Sep 30 21:53:37 compute-0 neutron-haproxy-ovnmeta-37abdced-e5ee-484f-bdda-8b1a7b19a7f0[249462]: [NOTICE]   (249466) : haproxy version is 2.8.14-c23fe91
Sep 30 21:53:37 compute-0 neutron-haproxy-ovnmeta-37abdced-e5ee-484f-bdda-8b1a7b19a7f0[249462]: [NOTICE]   (249466) : path to executable is /usr/sbin/haproxy
Sep 30 21:53:37 compute-0 neutron-haproxy-ovnmeta-37abdced-e5ee-484f-bdda-8b1a7b19a7f0[249462]: [WARNING]  (249466) : Exiting Master process...
Sep 30 21:53:37 compute-0 neutron-haproxy-ovnmeta-37abdced-e5ee-484f-bdda-8b1a7b19a7f0[249462]: [ALERT]    (249466) : Current worker (249468) exited with code 143 (Terminated)
Sep 30 21:53:37 compute-0 neutron-haproxy-ovnmeta-37abdced-e5ee-484f-bdda-8b1a7b19a7f0[249462]: [WARNING]  (249466) : All workers exited. Exiting... (0)
Sep 30 21:53:37 compute-0 systemd[1]: libpod-dd482c1c5f5585873f59e091b565878b46e1f626f399de4e7167b16c76110516.scope: Deactivated successfully.
Sep 30 21:53:37 compute-0 podman[249741]: 2025-09-30 21:53:37.924677959 +0000 UTC m=+0.050533116 container died dd482c1c5f5585873f59e091b565878b46e1f626f399de4e7167b16c76110516 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-37abdced-e5ee-484f-bdda-8b1a7b19a7f0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:53:37 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-dd482c1c5f5585873f59e091b565878b46e1f626f399de4e7167b16c76110516-userdata-shm.mount: Deactivated successfully.
Sep 30 21:53:37 compute-0 systemd[1]: var-lib-containers-storage-overlay-4721a1d5ecd66bb2ca025f5187d7463b1e20137275fb21032b9df1b8861d088a-merged.mount: Deactivated successfully.
Sep 30 21:53:37 compute-0 podman[249741]: 2025-09-30 21:53:37.970927747 +0000 UTC m=+0.096782864 container cleanup dd482c1c5f5585873f59e091b565878b46e1f626f399de4e7167b16c76110516 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-37abdced-e5ee-484f-bdda-8b1a7b19a7f0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923)
Sep 30 21:53:37 compute-0 nova_compute[192810]: 2025-09-30 21:53:37.972 2 INFO nova.virt.libvirt.driver [-] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Instance destroyed successfully.
Sep 30 21:53:37 compute-0 nova_compute[192810]: 2025-09-30 21:53:37.973 2 DEBUG nova.objects.instance [None req-4b0a7c2f-a208-4c20-ad42-3a5463d1326a 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Lazy-loading 'resources' on Instance uuid f346a37a-3dd5-4a2b-a445-9a0fe47a9194 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:53:37 compute-0 systemd[1]: libpod-conmon-dd482c1c5f5585873f59e091b565878b46e1f626f399de4e7167b16c76110516.scope: Deactivated successfully.
Sep 30 21:53:38 compute-0 nova_compute[192810]: 2025-09-30 21:53:38.006 2 DEBUG nova.virt.libvirt.vif [None req-4b0a7c2f-a208-4c20-ad42-3a5463d1326a 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-09-30T21:52:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestShelveInstance-server-144130882',display_name='tempest-TestShelveInstance-server-144130882',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testshelveinstance-server-144130882',id=177,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLqJH0XUHnV+DFJ4EEdvUcnnJtAN3d0qXBSayXC/1r7r1r6VLAmC4Ityxcv3hObBaxsXAuhiLyJJeNSDzDDKD8CyDvitwtpXo1jlbjfL2MJDye78MFsuQz3G/vPFK9b4zw==',key_name='tempest-TestShelveInstance-1916761761',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:53:02Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8f69d27778a54a45a6527a19643d1fe0',ramdisk_id='',reservation_id='r-u088hjgo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestShelveInstance-1185384137',owner_user_name='tempest-TestShelveInstance-1185384137-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:53:02Z,user_data=None,user_id='2a25e9b373084bfa9bc194a299a3ac4a',uuid=f346a37a-3dd5-4a2b-a445-9a0fe47a9194,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6f02d077-990b-41dc-87e5-d4dfb5b0397d", "address": "fa:16:3e:ee:93:c6", "network": {"id": "37abdced-e5ee-484f-bdda-8b1a7b19a7f0", "bridge": "br-int", "label": "tempest-TestShelveInstance-1740163448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f69d27778a54a45a6527a19643d1fe0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f02d077-99", "ovs_interfaceid": "6f02d077-990b-41dc-87e5-d4dfb5b0397d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:53:38 compute-0 nova_compute[192810]: 2025-09-30 21:53:38.007 2 DEBUG nova.network.os_vif_util [None req-4b0a7c2f-a208-4c20-ad42-3a5463d1326a 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Converting VIF {"id": "6f02d077-990b-41dc-87e5-d4dfb5b0397d", "address": "fa:16:3e:ee:93:c6", "network": {"id": "37abdced-e5ee-484f-bdda-8b1a7b19a7f0", "bridge": "br-int", "label": "tempest-TestShelveInstance-1740163448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f69d27778a54a45a6527a19643d1fe0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f02d077-99", "ovs_interfaceid": "6f02d077-990b-41dc-87e5-d4dfb5b0397d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:53:38 compute-0 nova_compute[192810]: 2025-09-30 21:53:38.008 2 DEBUG nova.network.os_vif_util [None req-4b0a7c2f-a208-4c20-ad42-3a5463d1326a 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ee:93:c6,bridge_name='br-int',has_traffic_filtering=True,id=6f02d077-990b-41dc-87e5-d4dfb5b0397d,network=Network(37abdced-e5ee-484f-bdda-8b1a7b19a7f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6f02d077-99') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:53:38 compute-0 nova_compute[192810]: 2025-09-30 21:53:38.009 2 DEBUG os_vif [None req-4b0a7c2f-a208-4c20-ad42-3a5463d1326a 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ee:93:c6,bridge_name='br-int',has_traffic_filtering=True,id=6f02d077-990b-41dc-87e5-d4dfb5b0397d,network=Network(37abdced-e5ee-484f-bdda-8b1a7b19a7f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6f02d077-99') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:53:38 compute-0 nova_compute[192810]: 2025-09-30 21:53:38.012 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:38 compute-0 nova_compute[192810]: 2025-09-30 21:53:38.013 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6f02d077-99, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:53:38 compute-0 nova_compute[192810]: 2025-09-30 21:53:38.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:38 compute-0 nova_compute[192810]: 2025-09-30 21:53:38.016 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:38 compute-0 nova_compute[192810]: 2025-09-30 21:53:38.022 2 INFO os_vif [None req-4b0a7c2f-a208-4c20-ad42-3a5463d1326a 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ee:93:c6,bridge_name='br-int',has_traffic_filtering=True,id=6f02d077-990b-41dc-87e5-d4dfb5b0397d,network=Network(37abdced-e5ee-484f-bdda-8b1a7b19a7f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6f02d077-99')
Sep 30 21:53:38 compute-0 nova_compute[192810]: 2025-09-30 21:53:38.023 2 INFO nova.virt.libvirt.driver [None req-4b0a7c2f-a208-4c20-ad42-3a5463d1326a 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Deleting instance files /var/lib/nova/instances/f346a37a-3dd5-4a2b-a445-9a0fe47a9194_del
Sep 30 21:53:38 compute-0 nova_compute[192810]: 2025-09-30 21:53:38.031 2 INFO nova.virt.libvirt.driver [None req-4b0a7c2f-a208-4c20-ad42-3a5463d1326a 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Deletion of /var/lib/nova/instances/f346a37a-3dd5-4a2b-a445-9a0fe47a9194_del complete
Sep 30 21:53:38 compute-0 podman[249789]: 2025-09-30 21:53:38.048674347 +0000 UTC m=+0.051201282 container remove dd482c1c5f5585873f59e091b565878b46e1f626f399de4e7167b16c76110516 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-37abdced-e5ee-484f-bdda-8b1a7b19a7f0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 21:53:38 compute-0 nova_compute[192810]: 2025-09-30 21:53:38.052 2 DEBUG nova.compute.manager [req-4c9bba85-811e-4984-87e1-11eba92e11f8 req-7c3e0aac-50bf-4c09-a908-17fab20b522b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Received event network-vif-unplugged-6f02d077-990b-41dc-87e5-d4dfb5b0397d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:53:38 compute-0 nova_compute[192810]: 2025-09-30 21:53:38.052 2 DEBUG oslo_concurrency.lockutils [req-4c9bba85-811e-4984-87e1-11eba92e11f8 req-7c3e0aac-50bf-4c09-a908-17fab20b522b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "f346a37a-3dd5-4a2b-a445-9a0fe47a9194-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:53:38 compute-0 nova_compute[192810]: 2025-09-30 21:53:38.053 2 DEBUG oslo_concurrency.lockutils [req-4c9bba85-811e-4984-87e1-11eba92e11f8 req-7c3e0aac-50bf-4c09-a908-17fab20b522b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "f346a37a-3dd5-4a2b-a445-9a0fe47a9194-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:53:38 compute-0 nova_compute[192810]: 2025-09-30 21:53:38.053 2 DEBUG oslo_concurrency.lockutils [req-4c9bba85-811e-4984-87e1-11eba92e11f8 req-7c3e0aac-50bf-4c09-a908-17fab20b522b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "f346a37a-3dd5-4a2b-a445-9a0fe47a9194-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:53:38 compute-0 nova_compute[192810]: 2025-09-30 21:53:38.053 2 DEBUG nova.compute.manager [req-4c9bba85-811e-4984-87e1-11eba92e11f8 req-7c3e0aac-50bf-4c09-a908-17fab20b522b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] No waiting events found dispatching network-vif-unplugged-6f02d077-990b-41dc-87e5-d4dfb5b0397d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:53:38 compute-0 nova_compute[192810]: 2025-09-30 21:53:38.053 2 DEBUG nova.compute.manager [req-4c9bba85-811e-4984-87e1-11eba92e11f8 req-7c3e0aac-50bf-4c09-a908-17fab20b522b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Received event network-vif-unplugged-6f02d077-990b-41dc-87e5-d4dfb5b0397d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:53:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:38.056 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[1b2cb436-184b-4935-9c62-8e4b573a3db2]: (4, ('Tue Sep 30 09:53:37 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-37abdced-e5ee-484f-bdda-8b1a7b19a7f0 (dd482c1c5f5585873f59e091b565878b46e1f626f399de4e7167b16c76110516)\ndd482c1c5f5585873f59e091b565878b46e1f626f399de4e7167b16c76110516\nTue Sep 30 09:53:37 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-37abdced-e5ee-484f-bdda-8b1a7b19a7f0 (dd482c1c5f5585873f59e091b565878b46e1f626f399de4e7167b16c76110516)\ndd482c1c5f5585873f59e091b565878b46e1f626f399de4e7167b16c76110516\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:38.059 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[07847ae5-9892-4066-ad27-127752bbc976]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:38.062 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap37abdced-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:53:38 compute-0 nova_compute[192810]: 2025-09-30 21:53:38.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:38 compute-0 kernel: tap37abdced-e0: left promiscuous mode
Sep 30 21:53:38 compute-0 nova_compute[192810]: 2025-09-30 21:53:38.076 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:38.078 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[6d966bb1-c1f6-421d-9dce-d3f3bdb5e1c3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:38.118 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[92d1e2c4-d553-4d83-a47c-223646e4e380]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:38.119 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[4305f119-3abc-48f1-aac3-000e9d11e83a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:38 compute-0 nova_compute[192810]: 2025-09-30 21:53:38.131 2 INFO nova.compute.manager [None req-4b0a7c2f-a208-4c20-ad42-3a5463d1326a 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Took 0.43 seconds to destroy the instance on the hypervisor.
Sep 30 21:53:38 compute-0 nova_compute[192810]: 2025-09-30 21:53:38.131 2 DEBUG oslo.service.loopingcall [None req-4b0a7c2f-a208-4c20-ad42-3a5463d1326a 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:53:38 compute-0 nova_compute[192810]: 2025-09-30 21:53:38.132 2 DEBUG nova.compute.manager [-] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:53:38 compute-0 nova_compute[192810]: 2025-09-30 21:53:38.132 2 DEBUG nova.network.neutron [-] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:53:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:38.139 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[4bc56751-561f-46ff-a59f-4750baf9d7e9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 587672, 'reachable_time': 36626, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249805, 'error': None, 'target': 'ovnmeta-37abdced-e5ee-484f-bdda-8b1a7b19a7f0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:38.143 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-37abdced-e5ee-484f-bdda-8b1a7b19a7f0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:53:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:38.143 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[829e55a1-cbcf-4a23-bde7-70f47425a125]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:53:38 compute-0 systemd[1]: run-netns-ovnmeta\x2d37abdced\x2de5ee\x2d484f\x2dbdda\x2d8b1a7b19a7f0.mount: Deactivated successfully.
Sep 30 21:53:38 compute-0 sshd-session[249671]: Failed password for invalid user admin from 8.210.178.40 port 60572 ssh2
Sep 30 21:53:38 compute-0 nova_compute[192810]: 2025-09-30 21:53:38.296 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:38 compute-0 nova_compute[192810]: 2025-09-30 21:53:38.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:38.759 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:53:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:38.759 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:53:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:53:38.760 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:53:39 compute-0 nova_compute[192810]: 2025-09-30 21:53:39.208 2 DEBUG nova.network.neutron [-] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:53:39 compute-0 nova_compute[192810]: 2025-09-30 21:53:39.284 2 DEBUG nova.network.neutron [req-4b5a9d18-9b8f-4426-b3af-aa1dee3f8f7f req-5a03e5c5-e942-40fe-a90f-58393991ca80 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Updated VIF entry in instance network info cache for port 6f02d077-990b-41dc-87e5-d4dfb5b0397d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:53:39 compute-0 nova_compute[192810]: 2025-09-30 21:53:39.285 2 DEBUG nova.network.neutron [req-4b5a9d18-9b8f-4426-b3af-aa1dee3f8f7f req-5a03e5c5-e942-40fe-a90f-58393991ca80 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Updating instance_info_cache with network_info: [{"id": "6f02d077-990b-41dc-87e5-d4dfb5b0397d", "address": "fa:16:3e:ee:93:c6", "network": {"id": "37abdced-e5ee-484f-bdda-8b1a7b19a7f0", "bridge": "br-int", "label": "tempest-TestShelveInstance-1740163448-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8f69d27778a54a45a6527a19643d1fe0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6f02d077-99", "ovs_interfaceid": "6f02d077-990b-41dc-87e5-d4dfb5b0397d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:53:39 compute-0 sshd-session[249671]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:53:39 compute-0 nova_compute[192810]: 2025-09-30 21:53:39.910 2 INFO nova.compute.manager [-] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Took 1.78 seconds to deallocate network for instance.
Sep 30 21:53:40 compute-0 nova_compute[192810]: 2025-09-30 21:53:40.118 2 DEBUG nova.compute.manager [req-e774573f-5024-497d-8534-c667cc11a6e1 req-d2580565-bd36-4101-b52d-760f7d2594a6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Received event network-vif-deleted-6f02d077-990b-41dc-87e5-d4dfb5b0397d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:53:40 compute-0 nova_compute[192810]: 2025-09-30 21:53:40.118 2 INFO nova.compute.manager [req-e774573f-5024-497d-8534-c667cc11a6e1 req-d2580565-bd36-4101-b52d-760f7d2594a6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Neutron deleted interface 6f02d077-990b-41dc-87e5-d4dfb5b0397d; detaching it from the instance and deleting it from the info cache
Sep 30 21:53:40 compute-0 nova_compute[192810]: 2025-09-30 21:53:40.119 2 DEBUG nova.network.neutron [req-e774573f-5024-497d-8534-c667cc11a6e1 req-d2580565-bd36-4101-b52d-760f7d2594a6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:53:40 compute-0 nova_compute[192810]: 2025-09-30 21:53:40.205 2 DEBUG oslo_concurrency.lockutils [req-4b5a9d18-9b8f-4426-b3af-aa1dee3f8f7f req-5a03e5c5-e942-40fe-a90f-58393991ca80 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-f346a37a-3dd5-4a2b-a445-9a0fe47a9194" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:53:40 compute-0 nova_compute[192810]: 2025-09-30 21:53:40.220 2 DEBUG nova.compute.manager [req-8fbd518f-340f-403c-bde7-363de63961c7 req-1c402d00-fcb2-4c08-843a-e6296f8a4ffd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Received event network-vif-plugged-6f02d077-990b-41dc-87e5-d4dfb5b0397d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:53:40 compute-0 nova_compute[192810]: 2025-09-30 21:53:40.221 2 DEBUG oslo_concurrency.lockutils [req-8fbd518f-340f-403c-bde7-363de63961c7 req-1c402d00-fcb2-4c08-843a-e6296f8a4ffd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "f346a37a-3dd5-4a2b-a445-9a0fe47a9194-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:53:40 compute-0 nova_compute[192810]: 2025-09-30 21:53:40.221 2 DEBUG oslo_concurrency.lockutils [req-8fbd518f-340f-403c-bde7-363de63961c7 req-1c402d00-fcb2-4c08-843a-e6296f8a4ffd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "f346a37a-3dd5-4a2b-a445-9a0fe47a9194-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:53:40 compute-0 nova_compute[192810]: 2025-09-30 21:53:40.222 2 DEBUG oslo_concurrency.lockutils [req-8fbd518f-340f-403c-bde7-363de63961c7 req-1c402d00-fcb2-4c08-843a-e6296f8a4ffd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "f346a37a-3dd5-4a2b-a445-9a0fe47a9194-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:53:40 compute-0 nova_compute[192810]: 2025-09-30 21:53:40.222 2 DEBUG nova.compute.manager [req-8fbd518f-340f-403c-bde7-363de63961c7 req-1c402d00-fcb2-4c08-843a-e6296f8a4ffd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] No waiting events found dispatching network-vif-plugged-6f02d077-990b-41dc-87e5-d4dfb5b0397d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:53:40 compute-0 nova_compute[192810]: 2025-09-30 21:53:40.222 2 WARNING nova.compute.manager [req-8fbd518f-340f-403c-bde7-363de63961c7 req-1c402d00-fcb2-4c08-843a-e6296f8a4ffd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Received unexpected event network-vif-plugged-6f02d077-990b-41dc-87e5-d4dfb5b0397d for instance with vm_state active and task_state deleting.
Sep 30 21:53:40 compute-0 nova_compute[192810]: 2025-09-30 21:53:40.327 2 DEBUG nova.compute.manager [req-e774573f-5024-497d-8534-c667cc11a6e1 req-d2580565-bd36-4101-b52d-760f7d2594a6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Detach interface failed, port_id=6f02d077-990b-41dc-87e5-d4dfb5b0397d, reason: Instance f346a37a-3dd5-4a2b-a445-9a0fe47a9194 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Sep 30 21:53:40 compute-0 nova_compute[192810]: 2025-09-30 21:53:40.400 2 DEBUG oslo_concurrency.lockutils [None req-4b0a7c2f-a208-4c20-ad42-3a5463d1326a 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:53:40 compute-0 nova_compute[192810]: 2025-09-30 21:53:40.401 2 DEBUG oslo_concurrency.lockutils [None req-4b0a7c2f-a208-4c20-ad42-3a5463d1326a 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:53:40 compute-0 nova_compute[192810]: 2025-09-30 21:53:40.462 2 DEBUG nova.compute.provider_tree [None req-4b0a7c2f-a208-4c20-ad42-3a5463d1326a 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:53:40 compute-0 nova_compute[192810]: 2025-09-30 21:53:40.530 2 DEBUG nova.scheduler.client.report [None req-4b0a7c2f-a208-4c20-ad42-3a5463d1326a 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:53:40 compute-0 nova_compute[192810]: 2025-09-30 21:53:40.597 2 DEBUG oslo_concurrency.lockutils [None req-4b0a7c2f-a208-4c20-ad42-3a5463d1326a 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.196s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:53:40 compute-0 nova_compute[192810]: 2025-09-30 21:53:40.655 2 INFO nova.scheduler.client.report [None req-4b0a7c2f-a208-4c20-ad42-3a5463d1326a 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Deleted allocations for instance f346a37a-3dd5-4a2b-a445-9a0fe47a9194
Sep 30 21:53:40 compute-0 nova_compute[192810]: 2025-09-30 21:53:40.943 2 DEBUG oslo_concurrency.lockutils [None req-4b0a7c2f-a208-4c20-ad42-3a5463d1326a 2a25e9b373084bfa9bc194a299a3ac4a 8f69d27778a54a45a6527a19643d1fe0 - - default default] Lock "f346a37a-3dd5-4a2b-a445-9a0fe47a9194" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.278s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:53:41 compute-0 sshd-session[249671]: Failed password for invalid user admin from 8.210.178.40 port 60572 ssh2
Sep 30 21:53:43 compute-0 nova_compute[192810]: 2025-09-30 21:53:43.017 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:43 compute-0 podman[249807]: 2025-09-30 21:53:43.319472931 +0000 UTC m=+0.054042082 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Sep 30 21:53:43 compute-0 podman[249809]: 2025-09-30 21:53:43.319575894 +0000 UTC m=+0.048630428 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 21:53:43 compute-0 podman[249808]: 2025-09-30 21:53:43.327416938 +0000 UTC m=+0.060207955 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 21:53:43 compute-0 sshd-session[249671]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:53:43 compute-0 nova_compute[192810]: 2025-09-30 21:53:43.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:45 compute-0 sshd-session[249671]: Failed password for invalid user admin from 8.210.178.40 port 60572 ssh2
Sep 30 21:53:47 compute-0 sshd-session[249671]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:53:47 compute-0 nova_compute[192810]: 2025-09-30 21:53:47.927 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:48 compute-0 nova_compute[192810]: 2025-09-30 21:53:48.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:48 compute-0 nova_compute[192810]: 2025-09-30 21:53:48.156 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:48 compute-0 nova_compute[192810]: 2025-09-30 21:53:48.662 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:49 compute-0 sshd-session[249671]: Failed password for invalid user admin from 8.210.178.40 port 60572 ssh2
Sep 30 21:53:50 compute-0 nova_compute[192810]: 2025-09-30 21:53:50.571 2 DEBUG oslo_concurrency.lockutils [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "c5bbd76f-27d7-4aea-9212-994ecc27dbe5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:53:50 compute-0 nova_compute[192810]: 2025-09-30 21:53:50.572 2 DEBUG oslo_concurrency.lockutils [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "c5bbd76f-27d7-4aea-9212-994ecc27dbe5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:53:50 compute-0 nova_compute[192810]: 2025-09-30 21:53:50.598 2 DEBUG nova.compute.manager [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:53:50 compute-0 nova_compute[192810]: 2025-09-30 21:53:50.755 2 DEBUG oslo_concurrency.lockutils [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:53:50 compute-0 nova_compute[192810]: 2025-09-30 21:53:50.756 2 DEBUG oslo_concurrency.lockutils [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:53:50 compute-0 nova_compute[192810]: 2025-09-30 21:53:50.765 2 DEBUG nova.virt.hardware [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:53:50 compute-0 nova_compute[192810]: 2025-09-30 21:53:50.766 2 INFO nova.compute.claims [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:53:50 compute-0 sshd-session[249671]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:53:50 compute-0 nova_compute[192810]: 2025-09-30 21:53:50.901 2 DEBUG nova.compute.provider_tree [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:53:50 compute-0 nova_compute[192810]: 2025-09-30 21:53:50.915 2 DEBUG nova.scheduler.client.report [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:53:50 compute-0 nova_compute[192810]: 2025-09-30 21:53:50.938 2 DEBUG oslo_concurrency.lockutils [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.182s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:53:50 compute-0 nova_compute[192810]: 2025-09-30 21:53:50.939 2 DEBUG nova.compute.manager [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:53:51 compute-0 nova_compute[192810]: 2025-09-30 21:53:51.000 2 DEBUG nova.compute.manager [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:53:51 compute-0 nova_compute[192810]: 2025-09-30 21:53:51.001 2 DEBUG nova.network.neutron [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:53:51 compute-0 nova_compute[192810]: 2025-09-30 21:53:51.018 2 INFO nova.virt.libvirt.driver [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:53:51 compute-0 nova_compute[192810]: 2025-09-30 21:53:51.047 2 DEBUG nova.compute.manager [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:53:51 compute-0 nova_compute[192810]: 2025-09-30 21:53:51.212 2 DEBUG nova.compute.manager [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:53:51 compute-0 nova_compute[192810]: 2025-09-30 21:53:51.213 2 DEBUG nova.virt.libvirt.driver [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:53:51 compute-0 nova_compute[192810]: 2025-09-30 21:53:51.214 2 INFO nova.virt.libvirt.driver [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Creating image(s)
Sep 30 21:53:51 compute-0 nova_compute[192810]: 2025-09-30 21:53:51.214 2 DEBUG oslo_concurrency.lockutils [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "/var/lib/nova/instances/c5bbd76f-27d7-4aea-9212-994ecc27dbe5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:53:51 compute-0 nova_compute[192810]: 2025-09-30 21:53:51.215 2 DEBUG oslo_concurrency.lockutils [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "/var/lib/nova/instances/c5bbd76f-27d7-4aea-9212-994ecc27dbe5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:53:51 compute-0 nova_compute[192810]: 2025-09-30 21:53:51.215 2 DEBUG oslo_concurrency.lockutils [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "/var/lib/nova/instances/c5bbd76f-27d7-4aea-9212-994ecc27dbe5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:53:51 compute-0 nova_compute[192810]: 2025-09-30 21:53:51.233 2 DEBUG oslo_concurrency.processutils [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:53:51 compute-0 nova_compute[192810]: 2025-09-30 21:53:51.326 2 DEBUG nova.policy [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5ffd1d7824fe413499994bd48b9f820f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:53:51 compute-0 nova_compute[192810]: 2025-09-30 21:53:51.329 2 DEBUG oslo_concurrency.processutils [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:53:51 compute-0 nova_compute[192810]: 2025-09-30 21:53:51.330 2 DEBUG oslo_concurrency.lockutils [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:53:51 compute-0 nova_compute[192810]: 2025-09-30 21:53:51.330 2 DEBUG oslo_concurrency.lockutils [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:53:51 compute-0 nova_compute[192810]: 2025-09-30 21:53:51.341 2 DEBUG oslo_concurrency.processutils [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:53:51 compute-0 nova_compute[192810]: 2025-09-30 21:53:51.409 2 DEBUG oslo_concurrency.processutils [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:53:51 compute-0 nova_compute[192810]: 2025-09-30 21:53:51.411 2 DEBUG oslo_concurrency.processutils [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/c5bbd76f-27d7-4aea-9212-994ecc27dbe5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:53:51 compute-0 nova_compute[192810]: 2025-09-30 21:53:51.612 2 DEBUG oslo_concurrency.processutils [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/c5bbd76f-27d7-4aea-9212-994ecc27dbe5/disk 1073741824" returned: 0 in 0.201s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:53:51 compute-0 nova_compute[192810]: 2025-09-30 21:53:51.613 2 DEBUG oslo_concurrency.lockutils [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.282s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:53:51 compute-0 nova_compute[192810]: 2025-09-30 21:53:51.613 2 DEBUG oslo_concurrency.processutils [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:53:51 compute-0 nova_compute[192810]: 2025-09-30 21:53:51.673 2 DEBUG oslo_concurrency.processutils [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:53:51 compute-0 nova_compute[192810]: 2025-09-30 21:53:51.674 2 DEBUG nova.virt.disk.api [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Checking if we can resize image /var/lib/nova/instances/c5bbd76f-27d7-4aea-9212-994ecc27dbe5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:53:51 compute-0 nova_compute[192810]: 2025-09-30 21:53:51.675 2 DEBUG oslo_concurrency.processutils [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c5bbd76f-27d7-4aea-9212-994ecc27dbe5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:53:51 compute-0 nova_compute[192810]: 2025-09-30 21:53:51.770 2 DEBUG oslo_concurrency.processutils [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c5bbd76f-27d7-4aea-9212-994ecc27dbe5/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:53:51 compute-0 nova_compute[192810]: 2025-09-30 21:53:51.771 2 DEBUG nova.virt.disk.api [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Cannot resize image /var/lib/nova/instances/c5bbd76f-27d7-4aea-9212-994ecc27dbe5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:53:51 compute-0 nova_compute[192810]: 2025-09-30 21:53:51.771 2 DEBUG nova.objects.instance [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lazy-loading 'migration_context' on Instance uuid c5bbd76f-27d7-4aea-9212-994ecc27dbe5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:53:52 compute-0 nova_compute[192810]: 2025-09-30 21:53:52.047 2 DEBUG nova.virt.libvirt.driver [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:53:52 compute-0 nova_compute[192810]: 2025-09-30 21:53:52.048 2 DEBUG nova.virt.libvirt.driver [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Ensure instance console log exists: /var/lib/nova/instances/c5bbd76f-27d7-4aea-9212-994ecc27dbe5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:53:52 compute-0 nova_compute[192810]: 2025-09-30 21:53:52.048 2 DEBUG oslo_concurrency.lockutils [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:53:52 compute-0 nova_compute[192810]: 2025-09-30 21:53:52.048 2 DEBUG oslo_concurrency.lockutils [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:53:52 compute-0 nova_compute[192810]: 2025-09-30 21:53:52.049 2 DEBUG oslo_concurrency.lockutils [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:53:52 compute-0 sshd-session[249671]: Failed password for invalid user admin from 8.210.178.40 port 60572 ssh2
Sep 30 21:53:52 compute-0 sshd-session[249671]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:53:52 compute-0 nova_compute[192810]: 2025-09-30 21:53:52.971 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759269217.9691591, f346a37a-3dd5-4a2b-a445-9a0fe47a9194 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:53:52 compute-0 nova_compute[192810]: 2025-09-30 21:53:52.972 2 INFO nova.compute.manager [-] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] VM Stopped (Lifecycle Event)
Sep 30 21:53:53 compute-0 nova_compute[192810]: 2025-09-30 21:53:53.021 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:53 compute-0 nova_compute[192810]: 2025-09-30 21:53:53.087 2 DEBUG nova.compute.manager [None req-333d8554-c3f3-48f5-8314-98b0eeb76fc9 - - - - - -] [instance: f346a37a-3dd5-4a2b-a445-9a0fe47a9194] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:53:53 compute-0 nova_compute[192810]: 2025-09-30 21:53:53.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:54 compute-0 nova_compute[192810]: 2025-09-30 21:53:54.170 2 DEBUG nova.network.neutron [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Successfully created port: 7d22699d-19d2-418b-9ba1-d2cbd2be283e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:53:54 compute-0 sshd-session[249671]: Failed password for invalid user admin from 8.210.178.40 port 60572 ssh2
Sep 30 21:53:54 compute-0 sshd-session[249671]: error: maximum authentication attempts exceeded for invalid user admin from 8.210.178.40 port 60572 ssh2 [preauth]
Sep 30 21:53:54 compute-0 sshd-session[249671]: Disconnecting invalid user admin 8.210.178.40 port 60572: Too many authentication failures [preauth]
Sep 30 21:53:54 compute-0 sshd-session[249671]: PAM 5 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40
Sep 30 21:53:54 compute-0 sshd-session[249671]: PAM service(sshd) ignoring max retries; 6 > 3
Sep 30 21:53:55 compute-0 nova_compute[192810]: 2025-09-30 21:53:55.564 2 DEBUG nova.network.neutron [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Successfully updated port: 7d22699d-19d2-418b-9ba1-d2cbd2be283e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:53:55 compute-0 nova_compute[192810]: 2025-09-30 21:53:55.822 2 DEBUG oslo_concurrency.lockutils [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "refresh_cache-c5bbd76f-27d7-4aea-9212-994ecc27dbe5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:53:55 compute-0 nova_compute[192810]: 2025-09-30 21:53:55.823 2 DEBUG oslo_concurrency.lockutils [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquired lock "refresh_cache-c5bbd76f-27d7-4aea-9212-994ecc27dbe5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:53:55 compute-0 nova_compute[192810]: 2025-09-30 21:53:55.823 2 DEBUG nova.network.neutron [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:53:55 compute-0 nova_compute[192810]: 2025-09-30 21:53:55.960 2 DEBUG nova.compute.manager [req-3910b7e5-d4b7-4736-9a5f-0ff188aefb5d req-e5cb2d29-0bd4-4830-bc5b-0ded59c2ca5c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Received event network-changed-7d22699d-19d2-418b-9ba1-d2cbd2be283e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:53:55 compute-0 nova_compute[192810]: 2025-09-30 21:53:55.960 2 DEBUG nova.compute.manager [req-3910b7e5-d4b7-4736-9a5f-0ff188aefb5d req-e5cb2d29-0bd4-4830-bc5b-0ded59c2ca5c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Refreshing instance network info cache due to event network-changed-7d22699d-19d2-418b-9ba1-d2cbd2be283e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:53:55 compute-0 nova_compute[192810]: 2025-09-30 21:53:55.961 2 DEBUG oslo_concurrency.lockutils [req-3910b7e5-d4b7-4736-9a5f-0ff188aefb5d req-e5cb2d29-0bd4-4830-bc5b-0ded59c2ca5c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-c5bbd76f-27d7-4aea-9212-994ecc27dbe5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:53:56 compute-0 nova_compute[192810]: 2025-09-30 21:53:56.174 2 DEBUG nova.network.neutron [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:53:56 compute-0 sshd-session[249879]: Invalid user admin from 8.210.178.40 port 32942
Sep 30 21:53:56 compute-0 sshd-session[249879]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:53:56 compute-0 sshd-session[249879]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40
Sep 30 21:53:57 compute-0 podman[249883]: 2025-09-30 21:53:57.332279751 +0000 UTC m=+0.063758738 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923)
Sep 30 21:53:57 compute-0 podman[249882]: 2025-09-30 21:53:57.339035062 +0000 UTC m=+0.075254729 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, 
container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Sep 30 21:53:57 compute-0 podman[249881]: 2025-09-30 21:53:57.344584422 +0000 UTC m=+0.083746414 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, config_id=ovn_controller)
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.097 2 DEBUG nova.network.neutron [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Updating instance_info_cache with network_info: [{"id": "7d22699d-19d2-418b-9ba1-d2cbd2be283e", "address": "fa:16:3e:82:b6:3c", "network": {"id": "59f11ff9-50c1-45e8-ac0d-a61faf820997", "bridge": "br-int", "label": "tempest-network-smoke--426153268", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe82:b63c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe82:b63c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d22699d-19", "ovs_interfaceid": "7d22699d-19d2-418b-9ba1-d2cbd2be283e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.176 2 DEBUG oslo_concurrency.lockutils [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Releasing lock "refresh_cache-c5bbd76f-27d7-4aea-9212-994ecc27dbe5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.177 2 DEBUG nova.compute.manager [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Instance network_info: |[{"id": "7d22699d-19d2-418b-9ba1-d2cbd2be283e", "address": "fa:16:3e:82:b6:3c", "network": {"id": "59f11ff9-50c1-45e8-ac0d-a61faf820997", "bridge": "br-int", "label": "tempest-network-smoke--426153268", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe82:b63c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe82:b63c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d22699d-19", "ovs_interfaceid": "7d22699d-19d2-418b-9ba1-d2cbd2be283e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, 
"meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.177 2 DEBUG oslo_concurrency.lockutils [req-3910b7e5-d4b7-4736-9a5f-0ff188aefb5d req-e5cb2d29-0bd4-4830-bc5b-0ded59c2ca5c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-c5bbd76f-27d7-4aea-9212-994ecc27dbe5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.177 2 DEBUG nova.network.neutron [req-3910b7e5-d4b7-4736-9a5f-0ff188aefb5d req-e5cb2d29-0bd4-4830-bc5b-0ded59c2ca5c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Refreshing network info cache for port 7d22699d-19d2-418b-9ba1-d2cbd2be283e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.181 2 DEBUG nova.virt.libvirt.driver [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Start _get_guest_xml network_info=[{"id": "7d22699d-19d2-418b-9ba1-d2cbd2be283e", "address": "fa:16:3e:82:b6:3c", "network": {"id": "59f11ff9-50c1-45e8-ac0d-a61faf820997", "bridge": "br-int", "label": "tempest-network-smoke--426153268", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe82:b63c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe82:b63c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d22699d-19", "ovs_interfaceid": "7d22699d-19d2-418b-9ba1-d2cbd2be283e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.184 2 WARNING nova.virt.libvirt.driver [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.189 2 DEBUG nova.virt.libvirt.host [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.190 2 DEBUG nova.virt.libvirt.host [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.193 2 DEBUG nova.virt.libvirt.host [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.193 2 DEBUG nova.virt.libvirt.host [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.194 2 DEBUG nova.virt.libvirt.driver [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.195 2 DEBUG nova.virt.hardware [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.195 2 DEBUG nova.virt.hardware [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.195 2 DEBUG nova.virt.hardware [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.195 2 DEBUG nova.virt.hardware [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.196 2 DEBUG nova.virt.hardware [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.196 2 DEBUG nova.virt.hardware [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.196 2 DEBUG nova.virt.hardware [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.196 2 DEBUG nova.virt.hardware [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.197 2 DEBUG nova.virt.hardware [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.197 2 DEBUG nova.virt.hardware [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.197 2 DEBUG nova.virt.hardware [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.200 2 DEBUG nova.virt.libvirt.vif [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:53:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-479363663',display_name='tempest-TestGettingAddress-server-479363663',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-479363663',id=181,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHaKKFXezuLQJ35hTX4kX/ORJuCknLRD09XzO++7W6PCzpf9c+MCg5oCIZzvt/CMgYmfYWG3t5d/mfoP+2Waw3w3U3HORUFksDiuDVb0iBTVz8fSU+5VwnmEmsF0NEfZDw==',key_name='tempest-TestGettingAddress-1608592292',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-bx4dfq05',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:53:51Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=c5bbd76f-27d7-4aea-9212-994ecc27dbe5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7d22699d-19d2-418b-9ba1-d2cbd2be283e", "address": "fa:16:3e:82:b6:3c", "network": {"id": "59f11ff9-50c1-45e8-ac0d-a61faf820997", "bridge": "br-int", "label": "tempest-network-smoke--426153268", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe82:b63c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe82:b63c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d22699d-19", "ovs_interfaceid": "7d22699d-19d2-418b-9ba1-d2cbd2be283e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.200 2 DEBUG nova.network.os_vif_util [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "7d22699d-19d2-418b-9ba1-d2cbd2be283e", "address": "fa:16:3e:82:b6:3c", "network": {"id": "59f11ff9-50c1-45e8-ac0d-a61faf820997", "bridge": "br-int", "label": "tempest-network-smoke--426153268", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe82:b63c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe82:b63c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d22699d-19", "ovs_interfaceid": "7d22699d-19d2-418b-9ba1-d2cbd2be283e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.201 2 DEBUG nova.network.os_vif_util [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:b6:3c,bridge_name='br-int',has_traffic_filtering=True,id=7d22699d-19d2-418b-9ba1-d2cbd2be283e,network=Network(59f11ff9-50c1-45e8-ac0d-a61faf820997),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7d22699d-19') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.202 2 DEBUG nova.objects.instance [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lazy-loading 'pci_devices' on Instance uuid c5bbd76f-27d7-4aea-9212-994ecc27dbe5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.298 2 DEBUG nova.virt.libvirt.driver [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:53:58 compute-0 nova_compute[192810]:   <uuid>c5bbd76f-27d7-4aea-9212-994ecc27dbe5</uuid>
Sep 30 21:53:58 compute-0 nova_compute[192810]:   <name>instance-000000b5</name>
Sep 30 21:53:58 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:53:58 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:53:58 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:53:58 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:       <nova:name>tempest-TestGettingAddress-server-479363663</nova:name>
Sep 30 21:53:58 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:53:58</nova:creationTime>
Sep 30 21:53:58 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:53:58 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:53:58 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:53:58 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:53:58 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:53:58 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:53:58 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:53:58 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:53:58 compute-0 nova_compute[192810]:         <nova:user uuid="5ffd1d7824fe413499994bd48b9f820f">tempest-TestGettingAddress-2056138166-project-member</nova:user>
Sep 30 21:53:58 compute-0 nova_compute[192810]:         <nova:project uuid="71b1e8c3c45e4ff8bc99e66bd1bfef7c">tempest-TestGettingAddress-2056138166</nova:project>
Sep 30 21:53:58 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:53:58 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:53:58 compute-0 nova_compute[192810]:         <nova:port uuid="7d22699d-19d2-418b-9ba1-d2cbd2be283e">
Sep 30 21:53:58 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fe82:b63c" ipVersion="6"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe82:b63c" ipVersion="6"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:53:58 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:53:58 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:53:58 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:53:58 compute-0 nova_compute[192810]:     <system>
Sep 30 21:53:58 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:53:58 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:53:58 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:53:58 compute-0 nova_compute[192810]:       <entry name="serial">c5bbd76f-27d7-4aea-9212-994ecc27dbe5</entry>
Sep 30 21:53:58 compute-0 nova_compute[192810]:       <entry name="uuid">c5bbd76f-27d7-4aea-9212-994ecc27dbe5</entry>
Sep 30 21:53:58 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     </system>
Sep 30 21:53:58 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:53:58 compute-0 nova_compute[192810]:   <os>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:   </os>
Sep 30 21:53:58 compute-0 nova_compute[192810]:   <features>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:   </features>
Sep 30 21:53:58 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:53:58 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:53:58 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:53:58 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:53:58 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:53:58 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/c5bbd76f-27d7-4aea-9212-994ecc27dbe5/disk"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:53:58 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/c5bbd76f-27d7-4aea-9212-994ecc27dbe5/disk.config"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:53:58 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:82:b6:3c"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:       <target dev="tap7d22699d-19"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:53:58 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/c5bbd76f-27d7-4aea-9212-994ecc27dbe5/console.log" append="off"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     <video>
Sep 30 21:53:58 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     </video>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:53:58 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:53:58 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:53:58 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:53:58 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:53:58 compute-0 nova_compute[192810]: </domain>
Sep 30 21:53:58 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.299 2 DEBUG nova.compute.manager [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Preparing to wait for external event network-vif-plugged-7d22699d-19d2-418b-9ba1-d2cbd2be283e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.299 2 DEBUG oslo_concurrency.lockutils [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "c5bbd76f-27d7-4aea-9212-994ecc27dbe5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.300 2 DEBUG oslo_concurrency.lockutils [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "c5bbd76f-27d7-4aea-9212-994ecc27dbe5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.300 2 DEBUG oslo_concurrency.lockutils [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "c5bbd76f-27d7-4aea-9212-994ecc27dbe5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.300 2 DEBUG nova.virt.libvirt.vif [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:53:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-479363663',display_name='tempest-TestGettingAddress-server-479363663',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-479363663',id=181,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHaKKFXezuLQJ35hTX4kX/ORJuCknLRD09XzO++7W6PCzpf9c+MCg5oCIZzvt/CMgYmfYWG3t5d/mfoP+2Waw3w3U3HORUFksDiuDVb0iBTVz8fSU+5VwnmEmsF0NEfZDw==',key_name='tempest-TestGettingAddress-1608592292',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-bx4dfq05',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:53:51Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=c5bbd76f-27d7-4aea-9212-994ecc27dbe5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7d22699d-19d2-418b-9ba1-d2cbd2be283e", "address": "fa:16:3e:82:b6:3c", "network": {"id": "59f11ff9-50c1-45e8-ac0d-a61faf820997", "bridge": "br-int", "label": "tempest-network-smoke--426153268", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe82:b63c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe82:b63c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d22699d-19", "ovs_interfaceid": "7d22699d-19d2-418b-9ba1-d2cbd2be283e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.301 2 DEBUG nova.network.os_vif_util [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "7d22699d-19d2-418b-9ba1-d2cbd2be283e", "address": "fa:16:3e:82:b6:3c", "network": {"id": "59f11ff9-50c1-45e8-ac0d-a61faf820997", "bridge": "br-int", "label": "tempest-network-smoke--426153268", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe82:b63c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe82:b63c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d22699d-19", "ovs_interfaceid": "7d22699d-19d2-418b-9ba1-d2cbd2be283e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.301 2 DEBUG nova.network.os_vif_util [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:b6:3c,bridge_name='br-int',has_traffic_filtering=True,id=7d22699d-19d2-418b-9ba1-d2cbd2be283e,network=Network(59f11ff9-50c1-45e8-ac0d-a61faf820997),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7d22699d-19') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.302 2 DEBUG os_vif [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:b6:3c,bridge_name='br-int',has_traffic_filtering=True,id=7d22699d-19d2-418b-9ba1-d2cbd2be283e,network=Network(59f11ff9-50c1-45e8-ac0d-a61faf820997),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7d22699d-19') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.302 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.303 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.306 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7d22699d-19, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.307 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7d22699d-19, col_values=(('external_ids', {'iface-id': '7d22699d-19d2-418b-9ba1-d2cbd2be283e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:82:b6:3c', 'vm-uuid': 'c5bbd76f-27d7-4aea-9212-994ecc27dbe5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:58 compute-0 NetworkManager[51733]: <info>  [1759269238.3091] manager: (tap7d22699d-19): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/323)
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.310 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.316 2 INFO os_vif [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:b6:3c,bridge_name='br-int',has_traffic_filtering=True,id=7d22699d-19d2-418b-9ba1-d2cbd2be283e,network=Network(59f11ff9-50c1-45e8-ac0d-a61faf820997),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7d22699d-19')
Sep 30 21:53:58 compute-0 sshd-session[249879]: Failed password for invalid user admin from 8.210.178.40 port 32942 ssh2
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.481 2 DEBUG nova.virt.libvirt.driver [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.481 2 DEBUG nova.virt.libvirt.driver [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.481 2 DEBUG nova.virt.libvirt.driver [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] No VIF found with MAC fa:16:3e:82:b6:3c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.482 2 INFO nova.virt.libvirt.driver [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Using config drive
Sep 30 21:53:58 compute-0 nova_compute[192810]: 2025-09-30 21:53:58.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:59 compute-0 nova_compute[192810]: 2025-09-30 21:53:59.761 2 INFO nova.virt.libvirt.driver [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Creating config drive at /var/lib/nova/instances/c5bbd76f-27d7-4aea-9212-994ecc27dbe5/disk.config
Sep 30 21:53:59 compute-0 nova_compute[192810]: 2025-09-30 21:53:59.765 2 DEBUG oslo_concurrency.processutils [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c5bbd76f-27d7-4aea-9212-994ecc27dbe5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxfbiqra2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:53:59 compute-0 sshd-session[249879]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:53:59 compute-0 nova_compute[192810]: 2025-09-30 21:53:59.890 2 DEBUG oslo_concurrency.processutils [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c5bbd76f-27d7-4aea-9212-994ecc27dbe5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxfbiqra2" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:53:59 compute-0 kernel: tap7d22699d-19: entered promiscuous mode
Sep 30 21:53:59 compute-0 NetworkManager[51733]: <info>  [1759269239.9380] manager: (tap7d22699d-19): new Tun device (/org/freedesktop/NetworkManager/Devices/324)
Sep 30 21:53:59 compute-0 ovn_controller[94912]: 2025-09-30T21:53:59Z|00733|binding|INFO|Claiming lport 7d22699d-19d2-418b-9ba1-d2cbd2be283e for this chassis.
Sep 30 21:53:59 compute-0 ovn_controller[94912]: 2025-09-30T21:53:59Z|00734|binding|INFO|7d22699d-19d2-418b-9ba1-d2cbd2be283e: Claiming fa:16:3e:82:b6:3c 10.100.0.10 2001:db8:0:1:f816:3eff:fe82:b63c 2001:db8::f816:3eff:fe82:b63c
Sep 30 21:53:59 compute-0 nova_compute[192810]: 2025-09-30 21:53:59.939 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:59 compute-0 nova_compute[192810]: 2025-09-30 21:53:59.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:59 compute-0 nova_compute[192810]: 2025-09-30 21:53:59.948 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:59 compute-0 nova_compute[192810]: 2025-09-30 21:53:59.951 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:53:59 compute-0 NetworkManager[51733]: <info>  [1759269239.9523] manager: (patch-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/325)
Sep 30 21:53:59 compute-0 NetworkManager[51733]: <info>  [1759269239.9533] manager: (patch-br-int-to-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/326)
Sep 30 21:53:59 compute-0 systemd-udevd[249966]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:53:59 compute-0 systemd-machined[152794]: New machine qemu-88-instance-000000b5.
Sep 30 21:53:59 compute-0 NetworkManager[51733]: <info>  [1759269239.9797] device (tap7d22699d-19): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:53:59 compute-0 NetworkManager[51733]: <info>  [1759269239.9806] device (tap7d22699d-19): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:00.042 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:b6:3c 10.100.0.10 2001:db8:0:1:f816:3eff:fe82:b63c 2001:db8::f816:3eff:fe82:b63c'], port_security=['fa:16:3e:82:b6:3c 10.100.0.10 2001:db8:0:1:f816:3eff:fe82:b63c 2001:db8::f816:3eff:fe82:b63c'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28 2001:db8:0:1:f816:3eff:fe82:b63c/64 2001:db8::f816:3eff:fe82:b63c/64', 'neutron:device_id': 'c5bbd76f-27d7-4aea-9212-994ecc27dbe5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-59f11ff9-50c1-45e8-ac0d-a61faf820997', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '752da93d-dfe1-4de0-a4d1-6724679e1663', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=94a95e41-6166-48de-bdf2-67ffa578edb2, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=7d22699d-19d2-418b-9ba1-d2cbd2be283e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:00.043 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 7d22699d-19d2-418b-9ba1-d2cbd2be283e in datapath 59f11ff9-50c1-45e8-ac0d-a61faf820997 bound to our chassis
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:00.044 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 59f11ff9-50c1-45e8-ac0d-a61faf820997
Sep 30 21:54:00 compute-0 systemd[1]: Started Virtual Machine qemu-88-instance-000000b5.
Sep 30 21:54:00 compute-0 nova_compute[192810]: 2025-09-30 21:54:00.054 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:00.057 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[1b1c0896-659e-47cb-83d2-2cfd5ca06f89]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:00.058 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap59f11ff9-51 in ovnmeta-59f11ff9-50c1-45e8-ac0d-a61faf820997 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:00.060 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap59f11ff9-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:00.060 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[95be46ee-27b4-4c80-bb15-0bc436ddcc8d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:00.061 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[bb58e231-0684-4575-9880-7a726606632d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:00.071 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[2b88e30a-386d-4608-b53c-4ee0ac0ea2db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:00 compute-0 nova_compute[192810]: 2025-09-30 21:54:00.072 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:00 compute-0 ovn_controller[94912]: 2025-09-30T21:54:00Z|00735|binding|INFO|Setting lport 7d22699d-19d2-418b-9ba1-d2cbd2be283e ovn-installed in OVS
Sep 30 21:54:00 compute-0 ovn_controller[94912]: 2025-09-30T21:54:00Z|00736|binding|INFO|Setting lport 7d22699d-19d2-418b-9ba1-d2cbd2be283e up in Southbound
Sep 30 21:54:00 compute-0 nova_compute[192810]: 2025-09-30 21:54:00.078 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:00.093 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e23e0eb4-129b-4163-8aff-9516a3c200c4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:00.121 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[70a31810-acb0-4aa1-a161-8181b3e77bf5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:00.125 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[4a98f29e-45d5-42ed-84dc-c2ef1fdef5e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:00 compute-0 systemd-udevd[249968]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:54:00 compute-0 NetworkManager[51733]: <info>  [1759269240.1281] manager: (tap59f11ff9-50): new Veth device (/org/freedesktop/NetworkManager/Devices/327)
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:00.160 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[dfd0c9b7-77b9-4a69-bfce-cfb420ac027c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:00.164 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[e1ba716f-138e-4bb7-8485-280831b8f43c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:00 compute-0 NetworkManager[51733]: <info>  [1759269240.1884] device (tap59f11ff9-50): carrier: link connected
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:00.195 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[343c5204-ddf3-4830-a145-dbb126482039]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:00.209 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[29433a16-88b1-4a5f-8a43-5d068fb426a8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap59f11ff9-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a2:75:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 221], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 593581, 'reachable_time': 28489, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249999, 'error': None, 'target': 'ovnmeta-59f11ff9-50c1-45e8-ac0d-a61faf820997', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:00.223 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[82e881bd-5721-469c-8124-9a3e7b8a7d42]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea2:758c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 593581, 'tstamp': 593581}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 250000, 'error': None, 'target': 'ovnmeta-59f11ff9-50c1-45e8-ac0d-a61faf820997', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:00.237 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[42be515e-906e-4b45-8cd4-2d0e8a92ab10]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap59f11ff9-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a2:75:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 221], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 593581, 'reachable_time': 28489, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 250001, 'error': None, 'target': 'ovnmeta-59f11ff9-50c1-45e8-ac0d-a61faf820997', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:00.262 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[559e2bdb-01e1-4646-8058-cb548812ab21]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:00.320 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[f286c0d7-4156-47a4-bec5-a8f990de7a2e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:00.321 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap59f11ff9-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:00.322 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:00.322 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap59f11ff9-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:54:00 compute-0 NetworkManager[51733]: <info>  [1759269240.3251] manager: (tap59f11ff9-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/328)
Sep 30 21:54:00 compute-0 kernel: tap59f11ff9-50: entered promiscuous mode
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:00.327 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap59f11ff9-50, col_values=(('external_ids', {'iface-id': '7661d9ad-20f6-48e0-8cdf-e095331fbd29'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:54:00 compute-0 ovn_controller[94912]: 2025-09-30T21:54:00Z|00737|binding|INFO|Releasing lport 7661d9ad-20f6-48e0-8cdf-e095331fbd29 from this chassis (sb_readonly=0)
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:00.330 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/59f11ff9-50c1-45e8-ac0d-a61faf820997.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/59f11ff9-50c1-45e8-ac0d-a61faf820997.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:00.331 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[9da40349-7863-4c40-924c-2049f55b071f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:00.331 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-59f11ff9-50c1-45e8-ac0d-a61faf820997
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/59f11ff9-50c1-45e8-ac0d-a61faf820997.pid.haproxy
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID 59f11ff9-50c1-45e8-ac0d-a61faf820997
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:54:00 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:00.332 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-59f11ff9-50c1-45e8-ac0d-a61faf820997', 'env', 'PROCESS_TAG=haproxy-59f11ff9-50c1-45e8-ac0d-a61faf820997', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/59f11ff9-50c1-45e8-ac0d-a61faf820997.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:54:00 compute-0 nova_compute[192810]: 2025-09-30 21:54:00.338 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:00 compute-0 nova_compute[192810]: 2025-09-30 21:54:00.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:00 compute-0 podman[250033]: 2025-09-30 21:54:00.661578833 +0000 UTC m=+0.047041494 container create 5a3c251d32d3117018facefb7a71b76be250b2c2cdf184e6f0367a52761cb003 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-59f11ff9-50c1-45e8-ac0d-a61faf820997, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20250923)
Sep 30 21:54:00 compute-0 systemd[1]: Started libpod-conmon-5a3c251d32d3117018facefb7a71b76be250b2c2cdf184e6f0367a52761cb003.scope.
Sep 30 21:54:00 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:54:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/89118cdf226f4a1d1fd0c10d270095b8e25d45b9b1b3e14895a62f2d994b3923/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:54:00 compute-0 podman[250033]: 2025-09-30 21:54:00.634626839 +0000 UTC m=+0.020089520 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:54:00 compute-0 podman[250033]: 2025-09-30 21:54:00.737052046 +0000 UTC m=+0.122514727 container init 5a3c251d32d3117018facefb7a71b76be250b2c2cdf184e6f0367a52761cb003 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-59f11ff9-50c1-45e8-ac0d-a61faf820997, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Sep 30 21:54:00 compute-0 podman[250033]: 2025-09-30 21:54:00.742370391 +0000 UTC m=+0.127833052 container start 5a3c251d32d3117018facefb7a71b76be250b2c2cdf184e6f0367a52761cb003 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-59f11ff9-50c1-45e8-ac0d-a61faf820997, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923)
Sep 30 21:54:00 compute-0 neutron-haproxy-ovnmeta-59f11ff9-50c1-45e8-ac0d-a61faf820997[250048]: [NOTICE]   (250052) : New worker (250054) forked
Sep 30 21:54:00 compute-0 neutron-haproxy-ovnmeta-59f11ff9-50c1-45e8-ac0d-a61faf820997[250048]: [NOTICE]   (250052) : Loading success.
Sep 30 21:54:00 compute-0 nova_compute[192810]: 2025-09-30 21:54:00.930 2 DEBUG nova.compute.manager [req-2028c2c8-22dd-4b9d-b44e-232e203ddd45 req-3b405e20-8a06-4a08-9375-c50af5dbe2de dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Received event network-vif-plugged-7d22699d-19d2-418b-9ba1-d2cbd2be283e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:54:00 compute-0 nova_compute[192810]: 2025-09-30 21:54:00.931 2 DEBUG oslo_concurrency.lockutils [req-2028c2c8-22dd-4b9d-b44e-232e203ddd45 req-3b405e20-8a06-4a08-9375-c50af5dbe2de dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c5bbd76f-27d7-4aea-9212-994ecc27dbe5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:54:00 compute-0 nova_compute[192810]: 2025-09-30 21:54:00.931 2 DEBUG oslo_concurrency.lockutils [req-2028c2c8-22dd-4b9d-b44e-232e203ddd45 req-3b405e20-8a06-4a08-9375-c50af5dbe2de dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c5bbd76f-27d7-4aea-9212-994ecc27dbe5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:54:00 compute-0 nova_compute[192810]: 2025-09-30 21:54:00.931 2 DEBUG oslo_concurrency.lockutils [req-2028c2c8-22dd-4b9d-b44e-232e203ddd45 req-3b405e20-8a06-4a08-9375-c50af5dbe2de dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c5bbd76f-27d7-4aea-9212-994ecc27dbe5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:54:00 compute-0 nova_compute[192810]: 2025-09-30 21:54:00.932 2 DEBUG nova.compute.manager [req-2028c2c8-22dd-4b9d-b44e-232e203ddd45 req-3b405e20-8a06-4a08-9375-c50af5dbe2de dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Processing event network-vif-plugged-7d22699d-19d2-418b-9ba1-d2cbd2be283e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:54:01 compute-0 nova_compute[192810]: 2025-09-30 21:54:01.607 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759269241.6069987, c5bbd76f-27d7-4aea-9212-994ecc27dbe5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:54:01 compute-0 nova_compute[192810]: 2025-09-30 21:54:01.607 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] VM Started (Lifecycle Event)
Sep 30 21:54:01 compute-0 nova_compute[192810]: 2025-09-30 21:54:01.609 2 DEBUG nova.compute.manager [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:54:01 compute-0 nova_compute[192810]: 2025-09-30 21:54:01.613 2 DEBUG nova.virt.libvirt.driver [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:54:01 compute-0 nova_compute[192810]: 2025-09-30 21:54:01.616 2 INFO nova.virt.libvirt.driver [-] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Instance spawned successfully.
Sep 30 21:54:01 compute-0 nova_compute[192810]: 2025-09-30 21:54:01.616 2 DEBUG nova.virt.libvirt.driver [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:54:01 compute-0 nova_compute[192810]: 2025-09-30 21:54:01.640 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:54:01 compute-0 nova_compute[192810]: 2025-09-30 21:54:01.643 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:54:01 compute-0 sshd-session[249879]: Failed password for invalid user admin from 8.210.178.40 port 32942 ssh2
Sep 30 21:54:01 compute-0 nova_compute[192810]: 2025-09-30 21:54:01.832 2 DEBUG nova.virt.libvirt.driver [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:54:01 compute-0 nova_compute[192810]: 2025-09-30 21:54:01.832 2 DEBUG nova.virt.libvirt.driver [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:54:01 compute-0 nova_compute[192810]: 2025-09-30 21:54:01.833 2 DEBUG nova.virt.libvirt.driver [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:54:01 compute-0 nova_compute[192810]: 2025-09-30 21:54:01.833 2 DEBUG nova.virt.libvirt.driver [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:54:01 compute-0 nova_compute[192810]: 2025-09-30 21:54:01.833 2 DEBUG nova.virt.libvirt.driver [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:54:01 compute-0 nova_compute[192810]: 2025-09-30 21:54:01.834 2 DEBUG nova.virt.libvirt.driver [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:54:01 compute-0 nova_compute[192810]: 2025-09-30 21:54:01.876 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:54:01 compute-0 nova_compute[192810]: 2025-09-30 21:54:01.876 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759269241.6071732, c5bbd76f-27d7-4aea-9212-994ecc27dbe5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:54:01 compute-0 nova_compute[192810]: 2025-09-30 21:54:01.877 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] VM Paused (Lifecycle Event)
Sep 30 21:54:01 compute-0 nova_compute[192810]: 2025-09-30 21:54:01.971 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:54:01 compute-0 nova_compute[192810]: 2025-09-30 21:54:01.975 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759269241.6122155, c5bbd76f-27d7-4aea-9212-994ecc27dbe5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:54:01 compute-0 nova_compute[192810]: 2025-09-30 21:54:01.975 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] VM Resumed (Lifecycle Event)
Sep 30 21:54:02 compute-0 nova_compute[192810]: 2025-09-30 21:54:02.100 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:54:02 compute-0 nova_compute[192810]: 2025-09-30 21:54:02.104 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:54:02 compute-0 nova_compute[192810]: 2025-09-30 21:54:02.171 2 INFO nova.compute.manager [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Took 10.96 seconds to spawn the instance on the hypervisor.
Sep 30 21:54:02 compute-0 nova_compute[192810]: 2025-09-30 21:54:02.172 2 DEBUG nova.compute.manager [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:54:02 compute-0 nova_compute[192810]: 2025-09-30 21:54:02.198 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:54:02 compute-0 nova_compute[192810]: 2025-09-30 21:54:02.517 2 DEBUG nova.network.neutron [req-3910b7e5-d4b7-4736-9a5f-0ff188aefb5d req-e5cb2d29-0bd4-4830-bc5b-0ded59c2ca5c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Updated VIF entry in instance network info cache for port 7d22699d-19d2-418b-9ba1-d2cbd2be283e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:54:02 compute-0 nova_compute[192810]: 2025-09-30 21:54:02.517 2 DEBUG nova.network.neutron [req-3910b7e5-d4b7-4736-9a5f-0ff188aefb5d req-e5cb2d29-0bd4-4830-bc5b-0ded59c2ca5c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Updating instance_info_cache with network_info: [{"id": "7d22699d-19d2-418b-9ba1-d2cbd2be283e", "address": "fa:16:3e:82:b6:3c", "network": {"id": "59f11ff9-50c1-45e8-ac0d-a61faf820997", "bridge": "br-int", "label": "tempest-network-smoke--426153268", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe82:b63c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe82:b63c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d22699d-19", "ovs_interfaceid": "7d22699d-19d2-418b-9ba1-d2cbd2be283e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:54:02 compute-0 nova_compute[192810]: 2025-09-30 21:54:02.706 2 DEBUG oslo_concurrency.lockutils [req-3910b7e5-d4b7-4736-9a5f-0ff188aefb5d req-e5cb2d29-0bd4-4830-bc5b-0ded59c2ca5c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-c5bbd76f-27d7-4aea-9212-994ecc27dbe5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:54:03 compute-0 nova_compute[192810]: 2025-09-30 21:54:03.097 2 INFO nova.compute.manager [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Took 12.40 seconds to build instance.
Sep 30 21:54:03 compute-0 nova_compute[192810]: 2025-09-30 21:54:03.310 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:03 compute-0 nova_compute[192810]: 2025-09-30 21:54:03.316 2 DEBUG nova.compute.manager [req-f269b992-a56a-4edd-896e-505d1f9fd976 req-935ace5d-f85c-48ce-9103-b37ef55c0c9b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Received event network-vif-plugged-7d22699d-19d2-418b-9ba1-d2cbd2be283e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:54:03 compute-0 nova_compute[192810]: 2025-09-30 21:54:03.317 2 DEBUG oslo_concurrency.lockutils [req-f269b992-a56a-4edd-896e-505d1f9fd976 req-935ace5d-f85c-48ce-9103-b37ef55c0c9b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c5bbd76f-27d7-4aea-9212-994ecc27dbe5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:54:03 compute-0 nova_compute[192810]: 2025-09-30 21:54:03.317 2 DEBUG oslo_concurrency.lockutils [req-f269b992-a56a-4edd-896e-505d1f9fd976 req-935ace5d-f85c-48ce-9103-b37ef55c0c9b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c5bbd76f-27d7-4aea-9212-994ecc27dbe5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:54:03 compute-0 nova_compute[192810]: 2025-09-30 21:54:03.317 2 DEBUG oslo_concurrency.lockutils [req-f269b992-a56a-4edd-896e-505d1f9fd976 req-935ace5d-f85c-48ce-9103-b37ef55c0c9b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c5bbd76f-27d7-4aea-9212-994ecc27dbe5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:54:03 compute-0 nova_compute[192810]: 2025-09-30 21:54:03.318 2 DEBUG nova.compute.manager [req-f269b992-a56a-4edd-896e-505d1f9fd976 req-935ace5d-f85c-48ce-9103-b37ef55c0c9b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] No waiting events found dispatching network-vif-plugged-7d22699d-19d2-418b-9ba1-d2cbd2be283e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:54:03 compute-0 nova_compute[192810]: 2025-09-30 21:54:03.318 2 WARNING nova.compute.manager [req-f269b992-a56a-4edd-896e-505d1f9fd976 req-935ace5d-f85c-48ce-9103-b37ef55c0c9b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Received unexpected event network-vif-plugged-7d22699d-19d2-418b-9ba1-d2cbd2be283e for instance with vm_state active and task_state None.
Sep 30 21:54:03 compute-0 nova_compute[192810]: 2025-09-30 21:54:03.373 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:54:03 compute-0 nova_compute[192810]: 2025-09-30 21:54:03.404 2 DEBUG oslo_concurrency.lockutils [None req-dc1083a7-6e7e-4428-8980-28c654ba4a67 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "c5bbd76f-27d7-4aea-9212-994ecc27dbe5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.832s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:54:03 compute-0 sshd-session[249879]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:54:03 compute-0 nova_compute[192810]: 2025-09-30 21:54:03.667 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:03 compute-0 nova_compute[192810]: 2025-09-30 21:54:03.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:54:03 compute-0 nova_compute[192810]: 2025-09-30 21:54:03.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:54:05 compute-0 sshd-session[249879]: Failed password for invalid user admin from 8.210.178.40 port 32942 ssh2
Sep 30 21:54:05 compute-0 nova_compute[192810]: 2025-09-30 21:54:05.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:54:07 compute-0 sshd-session[249879]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:54:07 compute-0 podman[250071]: 2025-09-30 21:54:07.345668474 +0000 UTC m=+0.067165144 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:54:07 compute-0 podman[250072]: 2025-09-30 21:54:07.362272885 +0000 UTC m=+0.083791885 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.buildah.version=1.33.7, managed_by=edpm_ansible)
Sep 30 21:54:08 compute-0 nova_compute[192810]: 2025-09-30 21:54:08.315 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:08 compute-0 nova_compute[192810]: 2025-09-30 21:54:08.675 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:09 compute-0 sshd-session[249879]: Failed password for invalid user admin from 8.210.178.40 port 32942 ssh2
Sep 30 21:54:09 compute-0 nova_compute[192810]: 2025-09-30 21:54:09.296 2 DEBUG nova.compute.manager [req-b26dd4c0-b083-459c-a17b-78db2e0f163e req-45f3bf04-4966-4c78-96e3-8272c06f824c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Received event network-changed-7d22699d-19d2-418b-9ba1-d2cbd2be283e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:54:09 compute-0 nova_compute[192810]: 2025-09-30 21:54:09.297 2 DEBUG nova.compute.manager [req-b26dd4c0-b083-459c-a17b-78db2e0f163e req-45f3bf04-4966-4c78-96e3-8272c06f824c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Refreshing instance network info cache due to event network-changed-7d22699d-19d2-418b-9ba1-d2cbd2be283e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:54:09 compute-0 nova_compute[192810]: 2025-09-30 21:54:09.297 2 DEBUG oslo_concurrency.lockutils [req-b26dd4c0-b083-459c-a17b-78db2e0f163e req-45f3bf04-4966-4c78-96e3-8272c06f824c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-c5bbd76f-27d7-4aea-9212-994ecc27dbe5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:54:09 compute-0 nova_compute[192810]: 2025-09-30 21:54:09.297 2 DEBUG oslo_concurrency.lockutils [req-b26dd4c0-b083-459c-a17b-78db2e0f163e req-45f3bf04-4966-4c78-96e3-8272c06f824c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-c5bbd76f-27d7-4aea-9212-994ecc27dbe5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:54:09 compute-0 nova_compute[192810]: 2025-09-30 21:54:09.297 2 DEBUG nova.network.neutron [req-b26dd4c0-b083-459c-a17b-78db2e0f163e req-45f3bf04-4966-4c78-96e3-8272c06f824c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Refreshing network info cache for port 7d22699d-19d2-418b-9ba1-d2cbd2be283e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:54:10 compute-0 nova_compute[192810]: 2025-09-30 21:54:10.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:54:10 compute-0 nova_compute[192810]: 2025-09-30 21:54:10.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:54:10 compute-0 nova_compute[192810]: 2025-09-30 21:54:10.789 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:54:10 compute-0 sshd-session[249879]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:54:11 compute-0 nova_compute[192810]: 2025-09-30 21:54:11.060 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "refresh_cache-c5bbd76f-27d7-4aea-9212-994ecc27dbe5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:54:12 compute-0 sshd-session[249879]: Failed password for invalid user admin from 8.210.178.40 port 32942 ssh2
Sep 30 21:54:13 compute-0 nova_compute[192810]: 2025-09-30 21:54:13.317 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:13 compute-0 nova_compute[192810]: 2025-09-30 21:54:13.678 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:14 compute-0 podman[250134]: 2025-09-30 21:54:14.36000563 +0000 UTC m=+0.079618340 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:54:14 compute-0 podman[250135]: 2025-09-30 21:54:14.363381446 +0000 UTC m=+0.074291995 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 21:54:14 compute-0 podman[250133]: 2025-09-30 21:54:14.365300034 +0000 UTC m=+0.082782990 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=multipathd, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:54:14 compute-0 sshd-session[249879]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:54:15 compute-0 ovn_controller[94912]: 2025-09-30T21:54:15Z|00082|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:82:b6:3c 10.100.0.10
Sep 30 21:54:15 compute-0 ovn_controller[94912]: 2025-09-30T21:54:15Z|00083|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:82:b6:3c 10.100.0.10
Sep 30 21:54:15 compute-0 nova_compute[192810]: 2025-09-30 21:54:15.579 2 DEBUG nova.network.neutron [req-b26dd4c0-b083-459c-a17b-78db2e0f163e req-45f3bf04-4966-4c78-96e3-8272c06f824c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Updated VIF entry in instance network info cache for port 7d22699d-19d2-418b-9ba1-d2cbd2be283e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:54:15 compute-0 nova_compute[192810]: 2025-09-30 21:54:15.579 2 DEBUG nova.network.neutron [req-b26dd4c0-b083-459c-a17b-78db2e0f163e req-45f3bf04-4966-4c78-96e3-8272c06f824c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Updating instance_info_cache with network_info: [{"id": "7d22699d-19d2-418b-9ba1-d2cbd2be283e", "address": "fa:16:3e:82:b6:3c", "network": {"id": "59f11ff9-50c1-45e8-ac0d-a61faf820997", "bridge": "br-int", "label": "tempest-network-smoke--426153268", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe82:b63c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe82:b63c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d22699d-19", "ovs_interfaceid": "7d22699d-19d2-418b-9ba1-d2cbd2be283e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:54:15 compute-0 nova_compute[192810]: 2025-09-30 21:54:15.614 2 DEBUG oslo_concurrency.lockutils [req-b26dd4c0-b083-459c-a17b-78db2e0f163e req-45f3bf04-4966-4c78-96e3-8272c06f824c dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-c5bbd76f-27d7-4aea-9212-994ecc27dbe5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:54:15 compute-0 nova_compute[192810]: 2025-09-30 21:54:15.615 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquired lock "refresh_cache-c5bbd76f-27d7-4aea-9212-994ecc27dbe5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:54:15 compute-0 nova_compute[192810]: 2025-09-30 21:54:15.615 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Sep 30 21:54:15 compute-0 nova_compute[192810]: 2025-09-30 21:54:15.615 2 DEBUG nova.objects.instance [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lazy-loading 'info_cache' on Instance uuid c5bbd76f-27d7-4aea-9212-994ecc27dbe5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:54:16 compute-0 sshd-session[249879]: Failed password for invalid user admin from 8.210.178.40 port 32942 ssh2
Sep 30 21:54:16 compute-0 sshd-session[249879]: error: maximum authentication attempts exceeded for invalid user admin from 8.210.178.40 port 32942 ssh2 [preauth]
Sep 30 21:54:16 compute-0 sshd-session[249879]: Disconnecting invalid user admin 8.210.178.40 port 32942: Too many authentication failures [preauth]
Sep 30 21:54:16 compute-0 sshd-session[249879]: PAM 5 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40
Sep 30 21:54:16 compute-0 sshd-session[249879]: PAM service(sshd) ignoring max retries; 6 > 3
Sep 30 21:54:16 compute-0 nova_compute[192810]: 2025-09-30 21:54:16.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:16 compute-0 nova_compute[192810]: 2025-09-30 21:54:16.625 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:16 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:16.625 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=44, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=43) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:54:16 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:16.627 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:54:17 compute-0 sshd-session[250197]: Invalid user admin from 8.210.178.40 port 33656
Sep 30 21:54:17 compute-0 sshd-session[250197]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:54:17 compute-0 sshd-session[250197]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40
Sep 30 21:54:18 compute-0 nova_compute[192810]: 2025-09-30 21:54:18.324 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:18 compute-0 nova_compute[192810]: 2025-09-30 21:54:18.681 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:18 compute-0 nova_compute[192810]: 2025-09-30 21:54:18.962 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Updating instance_info_cache with network_info: [{"id": "7d22699d-19d2-418b-9ba1-d2cbd2be283e", "address": "fa:16:3e:82:b6:3c", "network": {"id": "59f11ff9-50c1-45e8-ac0d-a61faf820997", "bridge": "br-int", "label": "tempest-network-smoke--426153268", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe82:b63c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe82:b63c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d22699d-19", "ovs_interfaceid": "7d22699d-19d2-418b-9ba1-d2cbd2be283e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:54:18 compute-0 nova_compute[192810]: 2025-09-30 21:54:18.988 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Releasing lock "refresh_cache-c5bbd76f-27d7-4aea-9212-994ecc27dbe5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:54:18 compute-0 nova_compute[192810]: 2025-09-30 21:54:18.989 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Sep 30 21:54:18 compute-0 nova_compute[192810]: 2025-09-30 21:54:18.989 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:54:18 compute-0 nova_compute[192810]: 2025-09-30 21:54:18.989 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:54:18 compute-0 nova_compute[192810]: 2025-09-30 21:54:18.990 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:54:18 compute-0 nova_compute[192810]: 2025-09-30 21:54:18.990 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:54:19 compute-0 nova_compute[192810]: 2025-09-30 21:54:19.022 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:54:19 compute-0 nova_compute[192810]: 2025-09-30 21:54:19.023 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:54:19 compute-0 nova_compute[192810]: 2025-09-30 21:54:19.023 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:54:19 compute-0 nova_compute[192810]: 2025-09-30 21:54:19.023 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:54:19 compute-0 nova_compute[192810]: 2025-09-30 21:54:19.134 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c5bbd76f-27d7-4aea-9212-994ecc27dbe5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:54:19 compute-0 nova_compute[192810]: 2025-09-30 21:54:19.197 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c5bbd76f-27d7-4aea-9212-994ecc27dbe5/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:54:19 compute-0 nova_compute[192810]: 2025-09-30 21:54:19.198 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c5bbd76f-27d7-4aea-9212-994ecc27dbe5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:54:19 compute-0 nova_compute[192810]: 2025-09-30 21:54:19.256 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c5bbd76f-27d7-4aea-9212-994ecc27dbe5/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:54:19 compute-0 nova_compute[192810]: 2025-09-30 21:54:19.399 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:54:19 compute-0 nova_compute[192810]: 2025-09-30 21:54:19.401 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5536MB free_disk=73.13273620605469GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:54:19 compute-0 nova_compute[192810]: 2025-09-30 21:54:19.401 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:54:19 compute-0 nova_compute[192810]: 2025-09-30 21:54:19.401 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:54:19 compute-0 nova_compute[192810]: 2025-09-30 21:54:19.501 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Instance c5bbd76f-27d7-4aea-9212-994ecc27dbe5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:54:19 compute-0 nova_compute[192810]: 2025-09-30 21:54:19.502 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:54:19 compute-0 nova_compute[192810]: 2025-09-30 21:54:19.502 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:54:19 compute-0 nova_compute[192810]: 2025-09-30 21:54:19.537 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:54:19 compute-0 nova_compute[192810]: 2025-09-30 21:54:19.596 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:54:19 compute-0 nova_compute[192810]: 2025-09-30 21:54:19.622 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:54:19 compute-0 nova_compute[192810]: 2025-09-30 21:54:19.622 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.221s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:54:20 compute-0 sshd-session[250197]: Failed password for invalid user admin from 8.210.178.40 port 33656 ssh2
Sep 30 21:54:20 compute-0 ovn_controller[94912]: 2025-09-30T21:54:20Z|00738|binding|INFO|Releasing lport 7661d9ad-20f6-48e0-8cdf-e095331fbd29 from this chassis (sb_readonly=0)
Sep 30 21:54:20 compute-0 nova_compute[192810]: 2025-09-30 21:54:20.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:21 compute-0 nova_compute[192810]: 2025-09-30 21:54:21.311 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:21 compute-0 sshd-session[250197]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:54:23 compute-0 nova_compute[192810]: 2025-09-30 21:54:23.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:23 compute-0 sshd-session[250197]: Failed password for invalid user admin from 8.210.178.40 port 33656 ssh2
Sep 30 21:54:23 compute-0 nova_compute[192810]: 2025-09-30 21:54:23.618 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:54:23 compute-0 nova_compute[192810]: 2025-09-30 21:54:23.618 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:54:23 compute-0 nova_compute[192810]: 2025-09-30 21:54:23.685 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:25 compute-0 sshd-session[250197]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:54:25 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:25.628 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '44'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:54:27 compute-0 sshd-session[250197]: Failed password for invalid user admin from 8.210.178.40 port 33656 ssh2
Sep 30 21:54:27 compute-0 nova_compute[192810]: 2025-09-30 21:54:27.864 2 DEBUG nova.compute.manager [req-a1f92a0c-bb94-448e-b8f9-a4230480e093 req-5f00a2e2-0642-4a24-925b-29a6ca41a7e7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Received event network-changed-7d22699d-19d2-418b-9ba1-d2cbd2be283e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:54:27 compute-0 nova_compute[192810]: 2025-09-30 21:54:27.865 2 DEBUG nova.compute.manager [req-a1f92a0c-bb94-448e-b8f9-a4230480e093 req-5f00a2e2-0642-4a24-925b-29a6ca41a7e7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Refreshing instance network info cache due to event network-changed-7d22699d-19d2-418b-9ba1-d2cbd2be283e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:54:27 compute-0 nova_compute[192810]: 2025-09-30 21:54:27.865 2 DEBUG oslo_concurrency.lockutils [req-a1f92a0c-bb94-448e-b8f9-a4230480e093 req-5f00a2e2-0642-4a24-925b-29a6ca41a7e7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-c5bbd76f-27d7-4aea-9212-994ecc27dbe5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:54:27 compute-0 nova_compute[192810]: 2025-09-30 21:54:27.865 2 DEBUG oslo_concurrency.lockutils [req-a1f92a0c-bb94-448e-b8f9-a4230480e093 req-5f00a2e2-0642-4a24-925b-29a6ca41a7e7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-c5bbd76f-27d7-4aea-9212-994ecc27dbe5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:54:27 compute-0 nova_compute[192810]: 2025-09-30 21:54:27.866 2 DEBUG nova.network.neutron [req-a1f92a0c-bb94-448e-b8f9-a4230480e093 req-5f00a2e2-0642-4a24-925b-29a6ca41a7e7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Refreshing network info cache for port 7d22699d-19d2-418b-9ba1-d2cbd2be283e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:54:27 compute-0 nova_compute[192810]: 2025-09-30 21:54:27.953 2 DEBUG oslo_concurrency.lockutils [None req-356a90da-da65-4f5f-8a8b-f001a8758285 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "c5bbd76f-27d7-4aea-9212-994ecc27dbe5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:54:27 compute-0 nova_compute[192810]: 2025-09-30 21:54:27.954 2 DEBUG oslo_concurrency.lockutils [None req-356a90da-da65-4f5f-8a8b-f001a8758285 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "c5bbd76f-27d7-4aea-9212-994ecc27dbe5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:54:27 compute-0 nova_compute[192810]: 2025-09-30 21:54:27.954 2 DEBUG oslo_concurrency.lockutils [None req-356a90da-da65-4f5f-8a8b-f001a8758285 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "c5bbd76f-27d7-4aea-9212-994ecc27dbe5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:54:27 compute-0 nova_compute[192810]: 2025-09-30 21:54:27.954 2 DEBUG oslo_concurrency.lockutils [None req-356a90da-da65-4f5f-8a8b-f001a8758285 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "c5bbd76f-27d7-4aea-9212-994ecc27dbe5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:54:27 compute-0 nova_compute[192810]: 2025-09-30 21:54:27.955 2 DEBUG oslo_concurrency.lockutils [None req-356a90da-da65-4f5f-8a8b-f001a8758285 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "c5bbd76f-27d7-4aea-9212-994ecc27dbe5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:54:27 compute-0 nova_compute[192810]: 2025-09-30 21:54:27.969 2 INFO nova.compute.manager [None req-356a90da-da65-4f5f-8a8b-f001a8758285 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Terminating instance
Sep 30 21:54:27 compute-0 nova_compute[192810]: 2025-09-30 21:54:27.982 2 DEBUG nova.compute.manager [None req-356a90da-da65-4f5f-8a8b-f001a8758285 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:54:28 compute-0 kernel: tap7d22699d-19 (unregistering): left promiscuous mode
Sep 30 21:54:28 compute-0 NetworkManager[51733]: <info>  [1759269268.0041] device (tap7d22699d-19): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:54:28 compute-0 ovn_controller[94912]: 2025-09-30T21:54:28Z|00739|binding|INFO|Releasing lport 7d22699d-19d2-418b-9ba1-d2cbd2be283e from this chassis (sb_readonly=0)
Sep 30 21:54:28 compute-0 ovn_controller[94912]: 2025-09-30T21:54:28Z|00740|binding|INFO|Setting lport 7d22699d-19d2-418b-9ba1-d2cbd2be283e down in Southbound
Sep 30 21:54:28 compute-0 nova_compute[192810]: 2025-09-30 21:54:28.011 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:28 compute-0 ovn_controller[94912]: 2025-09-30T21:54:28Z|00741|binding|INFO|Removing iface tap7d22699d-19 ovn-installed in OVS
Sep 30 21:54:28 compute-0 nova_compute[192810]: 2025-09-30 21:54:28.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:28 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:28.021 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:b6:3c 10.100.0.10 2001:db8:0:1:f816:3eff:fe82:b63c 2001:db8::f816:3eff:fe82:b63c'], port_security=['fa:16:3e:82:b6:3c 10.100.0.10 2001:db8:0:1:f816:3eff:fe82:b63c 2001:db8::f816:3eff:fe82:b63c'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28 2001:db8:0:1:f816:3eff:fe82:b63c/64 2001:db8::f816:3eff:fe82:b63c/64', 'neutron:device_id': 'c5bbd76f-27d7-4aea-9212-994ecc27dbe5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-59f11ff9-50c1-45e8-ac0d-a61faf820997', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '752da93d-dfe1-4de0-a4d1-6724679e1663', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=94a95e41-6166-48de-bdf2-67ffa578edb2, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=7d22699d-19d2-418b-9ba1-d2cbd2be283e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:54:28 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:28.022 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 7d22699d-19d2-418b-9ba1-d2cbd2be283e in datapath 59f11ff9-50c1-45e8-ac0d-a61faf820997 unbound from our chassis
Sep 30 21:54:28 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:28.023 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 59f11ff9-50c1-45e8-ac0d-a61faf820997, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:54:28 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:28.025 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[0464ea33-8bcd-4561-965a-729ba490710f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:28 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:28.026 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-59f11ff9-50c1-45e8-ac0d-a61faf820997 namespace which is not needed anymore
Sep 30 21:54:28 compute-0 nova_compute[192810]: 2025-09-30 21:54:28.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:28 compute-0 systemd[1]: machine-qemu\x2d88\x2dinstance\x2d000000b5.scope: Deactivated successfully.
Sep 30 21:54:28 compute-0 systemd[1]: machine-qemu\x2d88\x2dinstance\x2d000000b5.scope: Consumed 13.841s CPU time.
Sep 30 21:54:28 compute-0 systemd-machined[152794]: Machine qemu-88-instance-000000b5 terminated.
Sep 30 21:54:28 compute-0 podman[250209]: 2025-09-30 21:54:28.079809048 +0000 UTC m=+0.047904266 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Sep 30 21:54:28 compute-0 podman[250210]: 2025-09-30 21:54:28.096157232 +0000 UTC m=+0.060986297 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.build-date=20250923, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:54:28 compute-0 podman[250206]: 2025-09-30 21:54:28.121365911 +0000 UTC m=+0.092865595 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=ovn_controller)
Sep 30 21:54:28 compute-0 neutron-haproxy-ovnmeta-59f11ff9-50c1-45e8-ac0d-a61faf820997[250048]: [NOTICE]   (250052) : haproxy version is 2.8.14-c23fe91
Sep 30 21:54:28 compute-0 neutron-haproxy-ovnmeta-59f11ff9-50c1-45e8-ac0d-a61faf820997[250048]: [NOTICE]   (250052) : path to executable is /usr/sbin/haproxy
Sep 30 21:54:28 compute-0 neutron-haproxy-ovnmeta-59f11ff9-50c1-45e8-ac0d-a61faf820997[250048]: [WARNING]  (250052) : Exiting Master process...
Sep 30 21:54:28 compute-0 neutron-haproxy-ovnmeta-59f11ff9-50c1-45e8-ac0d-a61faf820997[250048]: [ALERT]    (250052) : Current worker (250054) exited with code 143 (Terminated)
Sep 30 21:54:28 compute-0 neutron-haproxy-ovnmeta-59f11ff9-50c1-45e8-ac0d-a61faf820997[250048]: [WARNING]  (250052) : All workers exited. Exiting... (0)
Sep 30 21:54:28 compute-0 systemd[1]: libpod-5a3c251d32d3117018facefb7a71b76be250b2c2cdf184e6f0367a52761cb003.scope: Deactivated successfully.
Sep 30 21:54:28 compute-0 conmon[250048]: conmon 5a3c251d32d3117018fa <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-5a3c251d32d3117018facefb7a71b76be250b2c2cdf184e6f0367a52761cb003.scope/container/memory.events
Sep 30 21:54:28 compute-0 podman[250291]: 2025-09-30 21:54:28.165219103 +0000 UTC m=+0.045914385 container died 5a3c251d32d3117018facefb7a71b76be250b2c2cdf184e6f0367a52761cb003 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-59f11ff9-50c1-45e8-ac0d-a61faf820997, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:54:28 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5a3c251d32d3117018facefb7a71b76be250b2c2cdf184e6f0367a52761cb003-userdata-shm.mount: Deactivated successfully.
Sep 30 21:54:28 compute-0 systemd[1]: var-lib-containers-storage-overlay-89118cdf226f4a1d1fd0c10d270095b8e25d45b9b1b3e14895a62f2d994b3923-merged.mount: Deactivated successfully.
Sep 30 21:54:28 compute-0 nova_compute[192810]: 2025-09-30 21:54:28.238 2 INFO nova.virt.libvirt.driver [-] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Instance destroyed successfully.
Sep 30 21:54:28 compute-0 nova_compute[192810]: 2025-09-30 21:54:28.238 2 DEBUG nova.objects.instance [None req-356a90da-da65-4f5f-8a8b-f001a8758285 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lazy-loading 'resources' on Instance uuid c5bbd76f-27d7-4aea-9212-994ecc27dbe5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:54:28 compute-0 nova_compute[192810]: 2025-09-30 21:54:28.254 2 DEBUG nova.virt.libvirt.vif [None req-356a90da-da65-4f5f-8a8b-f001a8758285 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:53:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-479363663',display_name='tempest-TestGettingAddress-server-479363663',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-479363663',id=181,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHaKKFXezuLQJ35hTX4kX/ORJuCknLRD09XzO++7W6PCzpf9c+MCg5oCIZzvt/CMgYmfYWG3t5d/mfoP+2Waw3w3U3HORUFksDiuDVb0iBTVz8fSU+5VwnmEmsF0NEfZDw==',key_name='tempest-TestGettingAddress-1608592292',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:54:02Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='71b1e8c3c45e4ff8bc99e66bd1bfef7c',ramdisk_id='',reservation_id='r-bx4dfq05',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-2056138166',owner_user_name='tempest-TestGettingAddress-2056138166-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:54:02Z,user_data=None,user_id='5ffd1d7824fe413499994bd48b9f820f',uuid=c5bbd76f-27d7-4aea-9212-994ecc27dbe5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7d22699d-19d2-418b-9ba1-d2cbd2be283e", "address": "fa:16:3e:82:b6:3c", "network": {"id": "59f11ff9-50c1-45e8-ac0d-a61faf820997", "bridge": "br-int", "label": "tempest-network-smoke--426153268", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe82:b63c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe82:b63c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d22699d-19", "ovs_interfaceid": "7d22699d-19d2-418b-9ba1-d2cbd2be283e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:54:28 compute-0 nova_compute[192810]: 2025-09-30 21:54:28.255 2 DEBUG nova.network.os_vif_util [None req-356a90da-da65-4f5f-8a8b-f001a8758285 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converting VIF {"id": "7d22699d-19d2-418b-9ba1-d2cbd2be283e", "address": "fa:16:3e:82:b6:3c", "network": {"id": "59f11ff9-50c1-45e8-ac0d-a61faf820997", "bridge": "br-int", "label": "tempest-network-smoke--426153268", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe82:b63c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe82:b63c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d22699d-19", "ovs_interfaceid": "7d22699d-19d2-418b-9ba1-d2cbd2be283e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:54:28 compute-0 nova_compute[192810]: 2025-09-30 21:54:28.255 2 DEBUG nova.network.os_vif_util [None req-356a90da-da65-4f5f-8a8b-f001a8758285 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:82:b6:3c,bridge_name='br-int',has_traffic_filtering=True,id=7d22699d-19d2-418b-9ba1-d2cbd2be283e,network=Network(59f11ff9-50c1-45e8-ac0d-a61faf820997),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7d22699d-19') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:54:28 compute-0 nova_compute[192810]: 2025-09-30 21:54:28.256 2 DEBUG os_vif [None req-356a90da-da65-4f5f-8a8b-f001a8758285 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:82:b6:3c,bridge_name='br-int',has_traffic_filtering=True,id=7d22699d-19d2-418b-9ba1-d2cbd2be283e,network=Network(59f11ff9-50c1-45e8-ac0d-a61faf820997),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7d22699d-19') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:54:28 compute-0 nova_compute[192810]: 2025-09-30 21:54:28.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:28 compute-0 nova_compute[192810]: 2025-09-30 21:54:28.257 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7d22699d-19, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:54:28 compute-0 nova_compute[192810]: 2025-09-30 21:54:28.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:54:28 compute-0 nova_compute[192810]: 2025-09-30 21:54:28.263 2 INFO os_vif [None req-356a90da-da65-4f5f-8a8b-f001a8758285 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:82:b6:3c,bridge_name='br-int',has_traffic_filtering=True,id=7d22699d-19d2-418b-9ba1-d2cbd2be283e,network=Network(59f11ff9-50c1-45e8-ac0d-a61faf820997),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7d22699d-19')
Sep 30 21:54:28 compute-0 nova_compute[192810]: 2025-09-30 21:54:28.263 2 INFO nova.virt.libvirt.driver [None req-356a90da-da65-4f5f-8a8b-f001a8758285 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Deleting instance files /var/lib/nova/instances/c5bbd76f-27d7-4aea-9212-994ecc27dbe5_del
Sep 30 21:54:28 compute-0 nova_compute[192810]: 2025-09-30 21:54:28.264 2 INFO nova.virt.libvirt.driver [None req-356a90da-da65-4f5f-8a8b-f001a8758285 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Deletion of /var/lib/nova/instances/c5bbd76f-27d7-4aea-9212-994ecc27dbe5_del complete
Sep 30 21:54:28 compute-0 nova_compute[192810]: 2025-09-30 21:54:28.376 2 INFO nova.compute.manager [None req-356a90da-da65-4f5f-8a8b-f001a8758285 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Took 0.39 seconds to destroy the instance on the hypervisor.
Sep 30 21:54:28 compute-0 nova_compute[192810]: 2025-09-30 21:54:28.376 2 DEBUG oslo.service.loopingcall [None req-356a90da-da65-4f5f-8a8b-f001a8758285 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:54:28 compute-0 nova_compute[192810]: 2025-09-30 21:54:28.377 2 DEBUG nova.compute.manager [-] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:54:28 compute-0 nova_compute[192810]: 2025-09-30 21:54:28.377 2 DEBUG nova.network.neutron [-] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:54:28 compute-0 podman[250291]: 2025-09-30 21:54:28.513126585 +0000 UTC m=+0.393821867 container cleanup 5a3c251d32d3117018facefb7a71b76be250b2c2cdf184e6f0367a52761cb003 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-59f11ff9-50c1-45e8-ac0d-a61faf820997, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Sep 30 21:54:28 compute-0 systemd[1]: libpod-conmon-5a3c251d32d3117018facefb7a71b76be250b2c2cdf184e6f0367a52761cb003.scope: Deactivated successfully.
Sep 30 21:54:28 compute-0 podman[250339]: 2025-09-30 21:54:28.574696386 +0000 UTC m=+0.040528909 container remove 5a3c251d32d3117018facefb7a71b76be250b2c2cdf184e6f0367a52761cb003 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-59f11ff9-50c1-45e8-ac0d-a61faf820997, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:54:28 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:28.579 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[237e417b-8f84-493a-ac23-9e40a1ff083e]: (4, ('Tue Sep 30 09:54:28 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-59f11ff9-50c1-45e8-ac0d-a61faf820997 (5a3c251d32d3117018facefb7a71b76be250b2c2cdf184e6f0367a52761cb003)\n5a3c251d32d3117018facefb7a71b76be250b2c2cdf184e6f0367a52761cb003\nTue Sep 30 09:54:28 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-59f11ff9-50c1-45e8-ac0d-a61faf820997 (5a3c251d32d3117018facefb7a71b76be250b2c2cdf184e6f0367a52761cb003)\n5a3c251d32d3117018facefb7a71b76be250b2c2cdf184e6f0367a52761cb003\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:28 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:28.581 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[f53fe6b0-4b60-420b-a37d-cfcce7a0b52d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:28 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:28.582 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap59f11ff9-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:54:28 compute-0 nova_compute[192810]: 2025-09-30 21:54:28.584 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:28 compute-0 kernel: tap59f11ff9-50: left promiscuous mode
Sep 30 21:54:28 compute-0 nova_compute[192810]: 2025-09-30 21:54:28.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:28 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:28.600 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[0b282d7d-769c-42a5-b359-6a2c15beb3d2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:28 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:28.625 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[ade26385-b4c8-449c-a581-da5e656ba00d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:28 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:28.626 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[bb3b0cfa-4bbf-4f2d-8319-e448dc76aead]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:28 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:28.644 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[61b967df-37c3-4f4b-958a-a86918e7ce38]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 593573, 'reachable_time': 20800, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 250354, 'error': None, 'target': 'ovnmeta-59f11ff9-50c1-45e8-ac0d-a61faf820997', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:28 compute-0 systemd[1]: run-netns-ovnmeta\x2d59f11ff9\x2d50c1\x2d45e8\x2dac0d\x2da61faf820997.mount: Deactivated successfully.
Sep 30 21:54:28 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:28.649 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-59f11ff9-50c1-45e8-ac0d-a61faf820997 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:54:28 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:28.649 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[1e41676f-3979-4e93-8288-256ed9478fca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:28 compute-0 nova_compute[192810]: 2025-09-30 21:54:28.686 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:29 compute-0 sshd-session[250197]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:54:31 compute-0 sshd-session[250197]: Failed password for invalid user admin from 8.210.178.40 port 33656 ssh2
Sep 30 21:54:31 compute-0 nova_compute[192810]: 2025-09-30 21:54:31.987 2 DEBUG nova.compute.manager [req-6eb7a103-c696-4797-8eac-fa8771d010e5 req-b509ef50-aa6f-489a-9f1a-841d5ea4137b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Received event network-vif-unplugged-7d22699d-19d2-418b-9ba1-d2cbd2be283e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:54:31 compute-0 nova_compute[192810]: 2025-09-30 21:54:31.988 2 DEBUG oslo_concurrency.lockutils [req-6eb7a103-c696-4797-8eac-fa8771d010e5 req-b509ef50-aa6f-489a-9f1a-841d5ea4137b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c5bbd76f-27d7-4aea-9212-994ecc27dbe5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:54:31 compute-0 nova_compute[192810]: 2025-09-30 21:54:31.988 2 DEBUG oslo_concurrency.lockutils [req-6eb7a103-c696-4797-8eac-fa8771d010e5 req-b509ef50-aa6f-489a-9f1a-841d5ea4137b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c5bbd76f-27d7-4aea-9212-994ecc27dbe5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:54:31 compute-0 nova_compute[192810]: 2025-09-30 21:54:31.988 2 DEBUG oslo_concurrency.lockutils [req-6eb7a103-c696-4797-8eac-fa8771d010e5 req-b509ef50-aa6f-489a-9f1a-841d5ea4137b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c5bbd76f-27d7-4aea-9212-994ecc27dbe5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:54:31 compute-0 nova_compute[192810]: 2025-09-30 21:54:31.989 2 DEBUG nova.compute.manager [req-6eb7a103-c696-4797-8eac-fa8771d010e5 req-b509ef50-aa6f-489a-9f1a-841d5ea4137b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] No waiting events found dispatching network-vif-unplugged-7d22699d-19d2-418b-9ba1-d2cbd2be283e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:54:31 compute-0 nova_compute[192810]: 2025-09-30 21:54:31.989 2 DEBUG nova.compute.manager [req-6eb7a103-c696-4797-8eac-fa8771d010e5 req-b509ef50-aa6f-489a-9f1a-841d5ea4137b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Received event network-vif-unplugged-7d22699d-19d2-418b-9ba1-d2cbd2be283e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:54:32 compute-0 nova_compute[192810]: 2025-09-30 21:54:32.308 2 DEBUG nova.network.neutron [-] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:54:32 compute-0 nova_compute[192810]: 2025-09-30 21:54:32.350 2 INFO nova.compute.manager [-] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Took 3.97 seconds to deallocate network for instance.
Sep 30 21:54:32 compute-0 nova_compute[192810]: 2025-09-30 21:54:32.436 2 DEBUG oslo_concurrency.lockutils [None req-356a90da-da65-4f5f-8a8b-f001a8758285 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:54:32 compute-0 nova_compute[192810]: 2025-09-30 21:54:32.437 2 DEBUG oslo_concurrency.lockutils [None req-356a90da-da65-4f5f-8a8b-f001a8758285 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:54:32 compute-0 nova_compute[192810]: 2025-09-30 21:54:32.494 2 DEBUG nova.compute.provider_tree [None req-356a90da-da65-4f5f-8a8b-f001a8758285 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:54:32 compute-0 nova_compute[192810]: 2025-09-30 21:54:32.612 2 DEBUG nova.scheduler.client.report [None req-356a90da-da65-4f5f-8a8b-f001a8758285 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:54:32 compute-0 nova_compute[192810]: 2025-09-30 21:54:32.653 2 DEBUG oslo_concurrency.lockutils [None req-356a90da-da65-4f5f-8a8b-f001a8758285 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.217s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:54:32 compute-0 nova_compute[192810]: 2025-09-30 21:54:32.706 2 INFO nova.scheduler.client.report [None req-356a90da-da65-4f5f-8a8b-f001a8758285 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Deleted allocations for instance c5bbd76f-27d7-4aea-9212-994ecc27dbe5
Sep 30 21:54:32 compute-0 sshd-session[250197]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:54:32 compute-0 nova_compute[192810]: 2025-09-30 21:54:32.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:32 compute-0 nova_compute[192810]: 2025-09-30 21:54:32.842 2 DEBUG oslo_concurrency.lockutils [None req-356a90da-da65-4f5f-8a8b-f001a8758285 5ffd1d7824fe413499994bd48b9f820f 71b1e8c3c45e4ff8bc99e66bd1bfef7c - - default default] Lock "c5bbd76f-27d7-4aea-9212-994ecc27dbe5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.889s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:54:33 compute-0 nova_compute[192810]: 2025-09-30 21:54:33.307 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:33 compute-0 nova_compute[192810]: 2025-09-30 21:54:33.584 2 DEBUG nova.network.neutron [req-a1f92a0c-bb94-448e-b8f9-a4230480e093 req-5f00a2e2-0642-4a24-925b-29a6ca41a7e7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Updated VIF entry in instance network info cache for port 7d22699d-19d2-418b-9ba1-d2cbd2be283e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:54:33 compute-0 nova_compute[192810]: 2025-09-30 21:54:33.584 2 DEBUG nova.network.neutron [req-a1f92a0c-bb94-448e-b8f9-a4230480e093 req-5f00a2e2-0642-4a24-925b-29a6ca41a7e7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Updating instance_info_cache with network_info: [{"id": "7d22699d-19d2-418b-9ba1-d2cbd2be283e", "address": "fa:16:3e:82:b6:3c", "network": {"id": "59f11ff9-50c1-45e8-ac0d-a61faf820997", "bridge": "br-int", "label": "tempest-network-smoke--426153268", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe82:b63c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe82:b63c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "71b1e8c3c45e4ff8bc99e66bd1bfef7c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d22699d-19", "ovs_interfaceid": "7d22699d-19d2-418b-9ba1-d2cbd2be283e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:54:33 compute-0 nova_compute[192810]: 2025-09-30 21:54:33.608 2 DEBUG oslo_concurrency.lockutils [req-a1f92a0c-bb94-448e-b8f9-a4230480e093 req-5f00a2e2-0642-4a24-925b-29a6ca41a7e7 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-c5bbd76f-27d7-4aea-9212-994ecc27dbe5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:54:33 compute-0 nova_compute[192810]: 2025-09-30 21:54:33.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:34 compute-0 nova_compute[192810]: 2025-09-30 21:54:34.144 2 DEBUG nova.compute.manager [req-ae947356-f303-48f0-8b1d-5f0f22651947 req-24392057-34d6-45c6-bee8-da70fb870fcd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Received event network-vif-plugged-7d22699d-19d2-418b-9ba1-d2cbd2be283e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:54:34 compute-0 nova_compute[192810]: 2025-09-30 21:54:34.144 2 DEBUG oslo_concurrency.lockutils [req-ae947356-f303-48f0-8b1d-5f0f22651947 req-24392057-34d6-45c6-bee8-da70fb870fcd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "c5bbd76f-27d7-4aea-9212-994ecc27dbe5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:54:34 compute-0 nova_compute[192810]: 2025-09-30 21:54:34.145 2 DEBUG oslo_concurrency.lockutils [req-ae947356-f303-48f0-8b1d-5f0f22651947 req-24392057-34d6-45c6-bee8-da70fb870fcd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c5bbd76f-27d7-4aea-9212-994ecc27dbe5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:54:34 compute-0 nova_compute[192810]: 2025-09-30 21:54:34.145 2 DEBUG oslo_concurrency.lockutils [req-ae947356-f303-48f0-8b1d-5f0f22651947 req-24392057-34d6-45c6-bee8-da70fb870fcd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "c5bbd76f-27d7-4aea-9212-994ecc27dbe5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:54:34 compute-0 nova_compute[192810]: 2025-09-30 21:54:34.145 2 DEBUG nova.compute.manager [req-ae947356-f303-48f0-8b1d-5f0f22651947 req-24392057-34d6-45c6-bee8-da70fb870fcd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] No waiting events found dispatching network-vif-plugged-7d22699d-19d2-418b-9ba1-d2cbd2be283e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:54:34 compute-0 nova_compute[192810]: 2025-09-30 21:54:34.145 2 WARNING nova.compute.manager [req-ae947356-f303-48f0-8b1d-5f0f22651947 req-24392057-34d6-45c6-bee8-da70fb870fcd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Received unexpected event network-vif-plugged-7d22699d-19d2-418b-9ba1-d2cbd2be283e for instance with vm_state deleted and task_state None.
Sep 30 21:54:34 compute-0 nova_compute[192810]: 2025-09-30 21:54:34.146 2 DEBUG nova.compute.manager [req-ae947356-f303-48f0-8b1d-5f0f22651947 req-24392057-34d6-45c6-bee8-da70fb870fcd dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Received event network-vif-deleted-7d22699d-19d2-418b-9ba1-d2cbd2be283e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:54:34 compute-0 sshd-session[250197]: Failed password for invalid user admin from 8.210.178.40 port 33656 ssh2
Sep 30 21:54:34 compute-0 sshd-session[250197]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:54:36 compute-0 nova_compute[192810]: 2025-09-30 21:54:36.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:36 compute-0 sshd-session[250197]: Failed password for invalid user admin from 8.210.178.40 port 33656 ssh2
Sep 30 21:54:36 compute-0 sshd-session[250197]: error: maximum authentication attempts exceeded for invalid user admin from 8.210.178.40 port 33656 ssh2 [preauth]
Sep 30 21:54:36 compute-0 sshd-session[250197]: Disconnecting invalid user admin 8.210.178.40 port 33656: Too many authentication failures [preauth]
Sep 30 21:54:36 compute-0 sshd-session[250197]: PAM 5 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40
Sep 30 21:54:36 compute-0 sshd-session[250197]: PAM service(sshd) ignoring max retries; 6 > 3
Sep 30 21:54:38 compute-0 sshd-session[250356]: Invalid user ubuntu from 8.210.178.40 port 34334
Sep 30 21:54:38 compute-0 sshd-session[250356]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:54:38 compute-0 sshd-session[250356]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40
Sep 30 21:54:38 compute-0 podman[250359]: 2025-09-30 21:54:38.194082781 +0000 UTC m=+0.054033711 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, version=9.6, build-date=2025-08-20T13:12:41, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.openshift.expose-services=)
Sep 30 21:54:38 compute-0 podman[250358]: 2025-09-30 21:54:38.20825943 +0000 UTC m=+0.066299571 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 21:54:38 compute-0 nova_compute[192810]: 2025-09-30 21:54:38.310 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:38 compute-0 nova_compute[192810]: 2025-09-30 21:54:38.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:38.760 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:54:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:38.761 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:54:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:38.761 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:54:39 compute-0 nova_compute[192810]: 2025-09-30 21:54:39.406 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:39 compute-0 sshd-session[250356]: Failed password for invalid user ubuntu from 8.210.178.40 port 34334 ssh2
Sep 30 21:54:40 compute-0 sshd-session[250356]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:54:40 compute-0 sshd-session[250355]: error: kex_exchange_identification: read: Connection timed out
Sep 30 21:54:40 compute-0 sshd-session[250355]: banner exchange: Connection from 113.240.110.90 port 39124: Connection timed out
Sep 30 21:54:42 compute-0 sshd-session[250356]: Failed password for invalid user ubuntu from 8.210.178.40 port 34334 ssh2
Sep 30 21:54:42 compute-0 sshd-session[250356]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:54:43 compute-0 nova_compute[192810]: 2025-09-30 21:54:43.239 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759269268.2364516, c5bbd76f-27d7-4aea-9212-994ecc27dbe5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:54:43 compute-0 nova_compute[192810]: 2025-09-30 21:54:43.240 2 INFO nova.compute.manager [-] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] VM Stopped (Lifecycle Event)
Sep 30 21:54:43 compute-0 nova_compute[192810]: 2025-09-30 21:54:43.279 2 DEBUG nova.compute.manager [None req-bb5790ec-0c59-4335-87ec-700f717d922f - - - - - -] [instance: c5bbd76f-27d7-4aea-9212-994ecc27dbe5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:54:43 compute-0 nova_compute[192810]: 2025-09-30 21:54:43.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:43 compute-0 nova_compute[192810]: 2025-09-30 21:54:43.693 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:54:43.911 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:54:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:54:43.912 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:54:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:54:43.912 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:54:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:54:43.912 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:54:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:54:43.912 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:54:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:54:43.912 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:54:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:54:43.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:54:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:54:43.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:54:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:54:43.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:54:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:54:43.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:54:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:54:43.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:54:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:54:43.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:54:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:54:43.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:54:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:54:43.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:54:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:54:43.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:54:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:54:43.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:54:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:54:43.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:54:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:54:43.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:54:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:54:43.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:54:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:54:43.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:54:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:54:43.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:54:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:54:43.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:54:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:54:43.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:54:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:54:43.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:54:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:54:43.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:54:44 compute-0 nova_compute[192810]: 2025-09-30 21:54:44.095 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:44 compute-0 sshd-session[250356]: Failed password for invalid user ubuntu from 8.210.178.40 port 34334 ssh2
Sep 30 21:54:45 compute-0 podman[250403]: 2025-09-30 21:54:45.319387711 +0000 UTC m=+0.053504117 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 21:54:45 compute-0 podman[250402]: 2025-09-30 21:54:45.326723707 +0000 UTC m=+0.060925616 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:54:45 compute-0 podman[250401]: 2025-09-30 21:54:45.353046044 +0000 UTC m=+0.093215324 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd)
Sep 30 21:54:46 compute-0 sshd-session[250356]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:54:48 compute-0 nova_compute[192810]: 2025-09-30 21:54:48.315 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:48 compute-0 sshd-session[250356]: Failed password for invalid user ubuntu from 8.210.178.40 port 34334 ssh2
Sep 30 21:54:48 compute-0 nova_compute[192810]: 2025-09-30 21:54:48.571 2 DEBUG oslo_concurrency.lockutils [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "0decb8c3-82ea-4251-8377-e18a207b6093" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:54:48 compute-0 nova_compute[192810]: 2025-09-30 21:54:48.572 2 DEBUG oslo_concurrency.lockutils [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "0decb8c3-82ea-4251-8377-e18a207b6093" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:54:48 compute-0 nova_compute[192810]: 2025-09-30 21:54:48.603 2 DEBUG nova.compute.manager [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:54:48 compute-0 nova_compute[192810]: 2025-09-30 21:54:48.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:48 compute-0 nova_compute[192810]: 2025-09-30 21:54:48.781 2 DEBUG oslo_concurrency.lockutils [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:54:48 compute-0 nova_compute[192810]: 2025-09-30 21:54:48.782 2 DEBUG oslo_concurrency.lockutils [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:54:48 compute-0 nova_compute[192810]: 2025-09-30 21:54:48.796 2 DEBUG nova.virt.hardware [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:54:48 compute-0 nova_compute[192810]: 2025-09-30 21:54:48.796 2 INFO nova.compute.claims [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:54:48 compute-0 sshd-session[250356]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:54:49 compute-0 nova_compute[192810]: 2025-09-30 21:54:49.123 2 DEBUG nova.compute.provider_tree [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:54:49 compute-0 nova_compute[192810]: 2025-09-30 21:54:49.137 2 DEBUG nova.scheduler.client.report [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:54:49 compute-0 nova_compute[192810]: 2025-09-30 21:54:49.177 2 DEBUG oslo_concurrency.lockutils [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.395s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:54:49 compute-0 nova_compute[192810]: 2025-09-30 21:54:49.178 2 DEBUG nova.compute.manager [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:54:49 compute-0 nova_compute[192810]: 2025-09-30 21:54:49.294 2 DEBUG nova.compute.manager [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:54:49 compute-0 nova_compute[192810]: 2025-09-30 21:54:49.295 2 DEBUG nova.network.neutron [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:54:49 compute-0 nova_compute[192810]: 2025-09-30 21:54:49.326 2 INFO nova.virt.libvirt.driver [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:54:49 compute-0 nova_compute[192810]: 2025-09-30 21:54:49.451 2 DEBUG nova.compute.manager [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:54:49 compute-0 nova_compute[192810]: 2025-09-30 21:54:49.577 2 DEBUG nova.compute.manager [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:54:49 compute-0 nova_compute[192810]: 2025-09-30 21:54:49.579 2 DEBUG nova.virt.libvirt.driver [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:54:49 compute-0 nova_compute[192810]: 2025-09-30 21:54:49.579 2 INFO nova.virt.libvirt.driver [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Creating image(s)
Sep 30 21:54:49 compute-0 nova_compute[192810]: 2025-09-30 21:54:49.579 2 DEBUG oslo_concurrency.lockutils [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "/var/lib/nova/instances/0decb8c3-82ea-4251-8377-e18a207b6093/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:54:49 compute-0 nova_compute[192810]: 2025-09-30 21:54:49.580 2 DEBUG oslo_concurrency.lockutils [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "/var/lib/nova/instances/0decb8c3-82ea-4251-8377-e18a207b6093/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:54:49 compute-0 nova_compute[192810]: 2025-09-30 21:54:49.581 2 DEBUG oslo_concurrency.lockutils [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "/var/lib/nova/instances/0decb8c3-82ea-4251-8377-e18a207b6093/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:54:49 compute-0 nova_compute[192810]: 2025-09-30 21:54:49.592 2 DEBUG oslo_concurrency.processutils [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:54:49 compute-0 nova_compute[192810]: 2025-09-30 21:54:49.638 2 DEBUG nova.policy [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '185cc8ad7e1445d2ab5006153ab19700', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:54:49 compute-0 nova_compute[192810]: 2025-09-30 21:54:49.648 2 DEBUG oslo_concurrency.processutils [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:54:49 compute-0 nova_compute[192810]: 2025-09-30 21:54:49.649 2 DEBUG oslo_concurrency.lockutils [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:54:49 compute-0 nova_compute[192810]: 2025-09-30 21:54:49.649 2 DEBUG oslo_concurrency.lockutils [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:54:49 compute-0 nova_compute[192810]: 2025-09-30 21:54:49.660 2 DEBUG oslo_concurrency.processutils [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:54:49 compute-0 nova_compute[192810]: 2025-09-30 21:54:49.715 2 DEBUG oslo_concurrency.processutils [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:54:49 compute-0 nova_compute[192810]: 2025-09-30 21:54:49.716 2 DEBUG oslo_concurrency.processutils [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/0decb8c3-82ea-4251-8377-e18a207b6093/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:54:49 compute-0 nova_compute[192810]: 2025-09-30 21:54:49.749 2 DEBUG oslo_concurrency.processutils [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a,backing_fmt=raw /var/lib/nova/instances/0decb8c3-82ea-4251-8377-e18a207b6093/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:54:49 compute-0 nova_compute[192810]: 2025-09-30 21:54:49.750 2 DEBUG oslo_concurrency.lockutils [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "e0a114b373fedfcc318870f9bde30baf716d5a2a" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:54:49 compute-0 nova_compute[192810]: 2025-09-30 21:54:49.750 2 DEBUG oslo_concurrency.processutils [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:54:49 compute-0 nova_compute[192810]: 2025-09-30 21:54:49.802 2 DEBUG oslo_concurrency.processutils [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e0a114b373fedfcc318870f9bde30baf716d5a2a --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:54:49 compute-0 nova_compute[192810]: 2025-09-30 21:54:49.803 2 DEBUG nova.virt.disk.api [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Checking if we can resize image /var/lib/nova/instances/0decb8c3-82ea-4251-8377-e18a207b6093/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Sep 30 21:54:49 compute-0 nova_compute[192810]: 2025-09-30 21:54:49.804 2 DEBUG oslo_concurrency.processutils [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0decb8c3-82ea-4251-8377-e18a207b6093/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:54:49 compute-0 nova_compute[192810]: 2025-09-30 21:54:49.860 2 DEBUG oslo_concurrency.processutils [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0decb8c3-82ea-4251-8377-e18a207b6093/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:54:49 compute-0 nova_compute[192810]: 2025-09-30 21:54:49.861 2 DEBUG nova.virt.disk.api [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Cannot resize image /var/lib/nova/instances/0decb8c3-82ea-4251-8377-e18a207b6093/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Sep 30 21:54:49 compute-0 nova_compute[192810]: 2025-09-30 21:54:49.862 2 DEBUG nova.objects.instance [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'migration_context' on Instance uuid 0decb8c3-82ea-4251-8377-e18a207b6093 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:54:49 compute-0 nova_compute[192810]: 2025-09-30 21:54:49.886 2 DEBUG nova.virt.libvirt.driver [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:54:49 compute-0 nova_compute[192810]: 2025-09-30 21:54:49.886 2 DEBUG nova.virt.libvirt.driver [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Ensure instance console log exists: /var/lib/nova/instances/0decb8c3-82ea-4251-8377-e18a207b6093/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:54:49 compute-0 nova_compute[192810]: 2025-09-30 21:54:49.887 2 DEBUG oslo_concurrency.lockutils [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:54:49 compute-0 nova_compute[192810]: 2025-09-30 21:54:49.887 2 DEBUG oslo_concurrency.lockutils [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:54:49 compute-0 nova_compute[192810]: 2025-09-30 21:54:49.887 2 DEBUG oslo_concurrency.lockutils [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:54:51 compute-0 nova_compute[192810]: 2025-09-30 21:54:51.076 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:51 compute-0 sshd-session[250356]: Failed password for invalid user ubuntu from 8.210.178.40 port 34334 ssh2
Sep 30 21:54:51 compute-0 nova_compute[192810]: 2025-09-30 21:54:51.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:51 compute-0 nova_compute[192810]: 2025-09-30 21:54:51.476 2 DEBUG nova.network.neutron [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Successfully created port: ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:54:53 compute-0 sshd-session[250356]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:54:53 compute-0 nova_compute[192810]: 2025-09-30 21:54:53.319 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:53 compute-0 nova_compute[192810]: 2025-09-30 21:54:53.320 2 DEBUG nova.network.neutron [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Successfully updated port: ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:54:53 compute-0 nova_compute[192810]: 2025-09-30 21:54:53.365 2 DEBUG oslo_concurrency.lockutils [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "refresh_cache-0decb8c3-82ea-4251-8377-e18a207b6093" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:54:53 compute-0 nova_compute[192810]: 2025-09-30 21:54:53.366 2 DEBUG oslo_concurrency.lockutils [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquired lock "refresh_cache-0decb8c3-82ea-4251-8377-e18a207b6093" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:54:53 compute-0 nova_compute[192810]: 2025-09-30 21:54:53.366 2 DEBUG nova.network.neutron [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:54:53 compute-0 nova_compute[192810]: 2025-09-30 21:54:53.560 2 DEBUG nova.compute.manager [req-33b40b72-9305-41a7-8039-aea9a9df3f35 req-fa24a6d2-831f-4bd9-8887-edd9c7ca417a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Received event network-changed-ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:54:53 compute-0 nova_compute[192810]: 2025-09-30 21:54:53.561 2 DEBUG nova.compute.manager [req-33b40b72-9305-41a7-8039-aea9a9df3f35 req-fa24a6d2-831f-4bd9-8887-edd9c7ca417a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Refreshing instance network info cache due to event network-changed-ebfc0c9a-67e0-40a1-abf3-105c7c3435b7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:54:53 compute-0 nova_compute[192810]: 2025-09-30 21:54:53.561 2 DEBUG oslo_concurrency.lockutils [req-33b40b72-9305-41a7-8039-aea9a9df3f35 req-fa24a6d2-831f-4bd9-8887-edd9c7ca417a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-0decb8c3-82ea-4251-8377-e18a207b6093" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:54:53 compute-0 nova_compute[192810]: 2025-09-30 21:54:53.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:54 compute-0 nova_compute[192810]: 2025-09-30 21:54:54.078 2 DEBUG nova.network.neutron [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:54:55 compute-0 sshd-session[250356]: Failed password for invalid user ubuntu from 8.210.178.40 port 34334 ssh2
Sep 30 21:54:55 compute-0 nova_compute[192810]: 2025-09-30 21:54:55.519 2 DEBUG nova.network.neutron [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Updating instance_info_cache with network_info: [{"id": "ebfc0c9a-67e0-40a1-abf3-105c7c3435b7", "address": "fa:16:3e:24:c3:46", "network": {"id": "724cab50-b368-4e40-a600-414c68f09e7c", "bridge": "br-int", "label": "tempest-network-smoke--1524279325", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebfc0c9a-67", "ovs_interfaceid": "ebfc0c9a-67e0-40a1-abf3-105c7c3435b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:54:55 compute-0 nova_compute[192810]: 2025-09-30 21:54:55.535 2 DEBUG oslo_concurrency.lockutils [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Releasing lock "refresh_cache-0decb8c3-82ea-4251-8377-e18a207b6093" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:54:55 compute-0 nova_compute[192810]: 2025-09-30 21:54:55.536 2 DEBUG nova.compute.manager [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Instance network_info: |[{"id": "ebfc0c9a-67e0-40a1-abf3-105c7c3435b7", "address": "fa:16:3e:24:c3:46", "network": {"id": "724cab50-b368-4e40-a600-414c68f09e7c", "bridge": "br-int", "label": "tempest-network-smoke--1524279325", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebfc0c9a-67", "ovs_interfaceid": "ebfc0c9a-67e0-40a1-abf3-105c7c3435b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:54:55 compute-0 nova_compute[192810]: 2025-09-30 21:54:55.537 2 DEBUG oslo_concurrency.lockutils [req-33b40b72-9305-41a7-8039-aea9a9df3f35 req-fa24a6d2-831f-4bd9-8887-edd9c7ca417a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-0decb8c3-82ea-4251-8377-e18a207b6093" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:54:55 compute-0 nova_compute[192810]: 2025-09-30 21:54:55.537 2 DEBUG nova.network.neutron [req-33b40b72-9305-41a7-8039-aea9a9df3f35 req-fa24a6d2-831f-4bd9-8887-edd9c7ca417a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Refreshing network info cache for port ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:54:55 compute-0 nova_compute[192810]: 2025-09-30 21:54:55.540 2 DEBUG nova.virt.libvirt.driver [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Start _get_guest_xml network_info=[{"id": "ebfc0c9a-67e0-40a1-abf3-105c7c3435b7", "address": "fa:16:3e:24:c3:46", "network": {"id": "724cab50-b368-4e40-a600-414c68f09e7c", "bridge": "br-int", "label": "tempest-network-smoke--1524279325", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebfc0c9a-67", "ovs_interfaceid": "ebfc0c9a-67e0-40a1-abf3-105c7c3435b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '86b6907c-d747-4e98-8897-42105915831d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:54:55 compute-0 nova_compute[192810]: 2025-09-30 21:54:55.544 2 WARNING nova.virt.libvirt.driver [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:54:55 compute-0 nova_compute[192810]: 2025-09-30 21:54:55.548 2 DEBUG nova.virt.libvirt.host [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:54:55 compute-0 nova_compute[192810]: 2025-09-30 21:54:55.549 2 DEBUG nova.virt.libvirt.host [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:54:55 compute-0 nova_compute[192810]: 2025-09-30 21:54:55.551 2 DEBUG nova.virt.libvirt.host [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:54:55 compute-0 nova_compute[192810]: 2025-09-30 21:54:55.552 2 DEBUG nova.virt.libvirt.host [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:54:55 compute-0 nova_compute[192810]: 2025-09-30 21:54:55.553 2 DEBUG nova.virt.libvirt.driver [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:54:55 compute-0 nova_compute[192810]: 2025-09-30 21:54:55.554 2 DEBUG nova.virt.hardware [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T21:15:08Z,direct_url=<?>,disk_format='qcow2',id=86b6907c-d747-4e98-8897-42105915831d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5fb5d4b07ed54e6cb716f880185e34d5',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T21:15:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:54:55 compute-0 nova_compute[192810]: 2025-09-30 21:54:55.554 2 DEBUG nova.virt.hardware [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:54:55 compute-0 nova_compute[192810]: 2025-09-30 21:54:55.554 2 DEBUG nova.virt.hardware [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:54:55 compute-0 nova_compute[192810]: 2025-09-30 21:54:55.555 2 DEBUG nova.virt.hardware [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:54:55 compute-0 nova_compute[192810]: 2025-09-30 21:54:55.555 2 DEBUG nova.virt.hardware [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:54:55 compute-0 nova_compute[192810]: 2025-09-30 21:54:55.555 2 DEBUG nova.virt.hardware [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:54:55 compute-0 nova_compute[192810]: 2025-09-30 21:54:55.555 2 DEBUG nova.virt.hardware [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:54:55 compute-0 nova_compute[192810]: 2025-09-30 21:54:55.556 2 DEBUG nova.virt.hardware [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:54:55 compute-0 nova_compute[192810]: 2025-09-30 21:54:55.556 2 DEBUG nova.virt.hardware [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:54:55 compute-0 nova_compute[192810]: 2025-09-30 21:54:55.556 2 DEBUG nova.virt.hardware [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:54:55 compute-0 nova_compute[192810]: 2025-09-30 21:54:55.557 2 DEBUG nova.virt.hardware [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:54:55 compute-0 nova_compute[192810]: 2025-09-30 21:54:55.560 2 DEBUG nova.virt.libvirt.vif [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:54:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1048714053',display_name='tempest-TestNetworkAdvancedServerOps-server-1048714053',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1048714053',id=183,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJausXmCoycNdXcrwJ/z2+iqGqSQ4qE5NNNaxnUrCp8xA543XlWTkIjz2Q7FVEsF1RcAtqLbp9Nk3Qb/yWSMZn8lsL4y/ez1BlHhptUsfXY/Ihxv4Dxwf0YYEI+dJFl0qA==',key_name='tempest-TestNetworkAdvancedServerOps-1891871344',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='075b1efc4c8e4cb1b28d61b042c451e9',ramdisk_id='',reservation_id='r-hg1e5fd1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-374190229',owner_user_name='tempest-TestNetworkAdvancedServerOps-374190229-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:54:49Z,user_data=None,user_id='185cc8ad7e1445d2ab5006153ab19700',uuid=0decb8c3-82ea-4251-8377-e18a207b6093,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ebfc0c9a-67e0-40a1-abf3-105c7c3435b7", "address": "fa:16:3e:24:c3:46", "network": {"id": "724cab50-b368-4e40-a600-414c68f09e7c", "bridge": "br-int", "label": "tempest-network-smoke--1524279325", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebfc0c9a-67", "ovs_interfaceid": "ebfc0c9a-67e0-40a1-abf3-105c7c3435b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:54:55 compute-0 nova_compute[192810]: 2025-09-30 21:54:55.561 2 DEBUG nova.network.os_vif_util [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converting VIF {"id": "ebfc0c9a-67e0-40a1-abf3-105c7c3435b7", "address": "fa:16:3e:24:c3:46", "network": {"id": "724cab50-b368-4e40-a600-414c68f09e7c", "bridge": "br-int", "label": "tempest-network-smoke--1524279325", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebfc0c9a-67", "ovs_interfaceid": "ebfc0c9a-67e0-40a1-abf3-105c7c3435b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:54:55 compute-0 nova_compute[192810]: 2025-09-30 21:54:55.562 2 DEBUG nova.network.os_vif_util [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:c3:46,bridge_name='br-int',has_traffic_filtering=True,id=ebfc0c9a-67e0-40a1-abf3-105c7c3435b7,network=Network(724cab50-b368-4e40-a600-414c68f09e7c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebfc0c9a-67') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:54:55 compute-0 nova_compute[192810]: 2025-09-30 21:54:55.563 2 DEBUG nova.objects.instance [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0decb8c3-82ea-4251-8377-e18a207b6093 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:54:55 compute-0 nova_compute[192810]: 2025-09-30 21:54:55.579 2 DEBUG nova.virt.libvirt.driver [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:54:55 compute-0 nova_compute[192810]:   <uuid>0decb8c3-82ea-4251-8377-e18a207b6093</uuid>
Sep 30 21:54:55 compute-0 nova_compute[192810]:   <name>instance-000000b7</name>
Sep 30 21:54:55 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:54:55 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:54:55 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:54:55 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-1048714053</nova:name>
Sep 30 21:54:55 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:54:55</nova:creationTime>
Sep 30 21:54:55 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:54:55 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:54:55 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:54:55 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:54:55 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:54:55 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:54:55 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:54:55 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:54:55 compute-0 nova_compute[192810]:         <nova:user uuid="185cc8ad7e1445d2ab5006153ab19700">tempest-TestNetworkAdvancedServerOps-374190229-project-member</nova:user>
Sep 30 21:54:55 compute-0 nova_compute[192810]:         <nova:project uuid="075b1efc4c8e4cb1b28d61b042c451e9">tempest-TestNetworkAdvancedServerOps-374190229</nova:project>
Sep 30 21:54:55 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:54:55 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="86b6907c-d747-4e98-8897-42105915831d"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:54:55 compute-0 nova_compute[192810]:         <nova:port uuid="ebfc0c9a-67e0-40a1-abf3-105c7c3435b7">
Sep 30 21:54:55 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:54:55 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:54:55 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:54:55 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:54:55 compute-0 nova_compute[192810]:     <system>
Sep 30 21:54:55 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:54:55 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:54:55 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:54:55 compute-0 nova_compute[192810]:       <entry name="serial">0decb8c3-82ea-4251-8377-e18a207b6093</entry>
Sep 30 21:54:55 compute-0 nova_compute[192810]:       <entry name="uuid">0decb8c3-82ea-4251-8377-e18a207b6093</entry>
Sep 30 21:54:55 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     </system>
Sep 30 21:54:55 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:54:55 compute-0 nova_compute[192810]:   <os>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:   </os>
Sep 30 21:54:55 compute-0 nova_compute[192810]:   <features>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:   </features>
Sep 30 21:54:55 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:54:55 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:54:55 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:54:55 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:54:55 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:54:55 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/0decb8c3-82ea-4251-8377-e18a207b6093/disk"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:54:55 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/0decb8c3-82ea-4251-8377-e18a207b6093/disk.config"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:54:55 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:24:c3:46"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:       <target dev="tapebfc0c9a-67"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:54:55 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/0decb8c3-82ea-4251-8377-e18a207b6093/console.log" append="off"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     <video>
Sep 30 21:54:55 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     </video>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:54:55 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:54:55 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:54:55 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:54:55 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:54:55 compute-0 nova_compute[192810]: </domain>
Sep 30 21:54:55 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:54:55 compute-0 nova_compute[192810]: 2025-09-30 21:54:55.580 2 DEBUG nova.compute.manager [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Preparing to wait for external event network-vif-plugged-ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:54:55 compute-0 nova_compute[192810]: 2025-09-30 21:54:55.580 2 DEBUG oslo_concurrency.lockutils [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "0decb8c3-82ea-4251-8377-e18a207b6093-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:54:55 compute-0 nova_compute[192810]: 2025-09-30 21:54:55.581 2 DEBUG oslo_concurrency.lockutils [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "0decb8c3-82ea-4251-8377-e18a207b6093-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:54:55 compute-0 nova_compute[192810]: 2025-09-30 21:54:55.581 2 DEBUG oslo_concurrency.lockutils [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "0decb8c3-82ea-4251-8377-e18a207b6093-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:54:55 compute-0 nova_compute[192810]: 2025-09-30 21:54:55.581 2 DEBUG nova.virt.libvirt.vif [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:54:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1048714053',display_name='tempest-TestNetworkAdvancedServerOps-server-1048714053',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1048714053',id=183,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJausXmCoycNdXcrwJ/z2+iqGqSQ4qE5NNNaxnUrCp8xA543XlWTkIjz2Q7FVEsF1RcAtqLbp9Nk3Qb/yWSMZn8lsL4y/ez1BlHhptUsfXY/Ihxv4Dxwf0YYEI+dJFl0qA==',key_name='tempest-TestNetworkAdvancedServerOps-1891871344',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='075b1efc4c8e4cb1b28d61b042c451e9',ramdisk_id='',reservation_id='r-hg1e5fd1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-374190229',owner_user_name='tempest-TestNetworkAdvancedServerOps-374190229-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:54:49Z,user_data=None,user_id='185cc8ad7e1445d2ab5006153ab19700',uuid=0decb8c3-82ea-4251-8377-e18a207b6093,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ebfc0c9a-67e0-40a1-abf3-105c7c3435b7", "address": "fa:16:3e:24:c3:46", "network": {"id": "724cab50-b368-4e40-a600-414c68f09e7c", "bridge": "br-int", "label": "tempest-network-smoke--1524279325", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebfc0c9a-67", "ovs_interfaceid": "ebfc0c9a-67e0-40a1-abf3-105c7c3435b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:54:55 compute-0 nova_compute[192810]: 2025-09-30 21:54:55.582 2 DEBUG nova.network.os_vif_util [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converting VIF {"id": "ebfc0c9a-67e0-40a1-abf3-105c7c3435b7", "address": "fa:16:3e:24:c3:46", "network": {"id": "724cab50-b368-4e40-a600-414c68f09e7c", "bridge": "br-int", "label": "tempest-network-smoke--1524279325", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebfc0c9a-67", "ovs_interfaceid": "ebfc0c9a-67e0-40a1-abf3-105c7c3435b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:54:55 compute-0 nova_compute[192810]: 2025-09-30 21:54:55.582 2 DEBUG nova.network.os_vif_util [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:c3:46,bridge_name='br-int',has_traffic_filtering=True,id=ebfc0c9a-67e0-40a1-abf3-105c7c3435b7,network=Network(724cab50-b368-4e40-a600-414c68f09e7c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebfc0c9a-67') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:54:55 compute-0 nova_compute[192810]: 2025-09-30 21:54:55.582 2 DEBUG os_vif [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:c3:46,bridge_name='br-int',has_traffic_filtering=True,id=ebfc0c9a-67e0-40a1-abf3-105c7c3435b7,network=Network(724cab50-b368-4e40-a600-414c68f09e7c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebfc0c9a-67') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:54:55 compute-0 nova_compute[192810]: 2025-09-30 21:54:55.583 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:55 compute-0 nova_compute[192810]: 2025-09-30 21:54:55.583 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:54:55 compute-0 nova_compute[192810]: 2025-09-30 21:54:55.583 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:54:55 compute-0 nova_compute[192810]: 2025-09-30 21:54:55.586 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:55 compute-0 nova_compute[192810]: 2025-09-30 21:54:55.586 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapebfc0c9a-67, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:54:55 compute-0 nova_compute[192810]: 2025-09-30 21:54:55.586 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapebfc0c9a-67, col_values=(('external_ids', {'iface-id': 'ebfc0c9a-67e0-40a1-abf3-105c7c3435b7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:24:c3:46', 'vm-uuid': '0decb8c3-82ea-4251-8377-e18a207b6093'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:54:55 compute-0 nova_compute[192810]: 2025-09-30 21:54:55.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:55 compute-0 NetworkManager[51733]: <info>  [1759269295.5887] manager: (tapebfc0c9a-67): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/329)
Sep 30 21:54:55 compute-0 nova_compute[192810]: 2025-09-30 21:54:55.590 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:54:55 compute-0 nova_compute[192810]: 2025-09-30 21:54:55.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:55 compute-0 nova_compute[192810]: 2025-09-30 21:54:55.595 2 INFO os_vif [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:c3:46,bridge_name='br-int',has_traffic_filtering=True,id=ebfc0c9a-67e0-40a1-abf3-105c7c3435b7,network=Network(724cab50-b368-4e40-a600-414c68f09e7c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebfc0c9a-67')
Sep 30 21:54:55 compute-0 nova_compute[192810]: 2025-09-30 21:54:55.650 2 DEBUG nova.virt.libvirt.driver [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:54:55 compute-0 nova_compute[192810]: 2025-09-30 21:54:55.651 2 DEBUG nova.virt.libvirt.driver [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:54:55 compute-0 nova_compute[192810]: 2025-09-30 21:54:55.651 2 DEBUG nova.virt.libvirt.driver [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] No VIF found with MAC fa:16:3e:24:c3:46, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:54:55 compute-0 nova_compute[192810]: 2025-09-30 21:54:55.652 2 INFO nova.virt.libvirt.driver [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Using config drive
Sep 30 21:54:56 compute-0 nova_compute[192810]: 2025-09-30 21:54:56.257 2 INFO nova.virt.libvirt.driver [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Creating config drive at /var/lib/nova/instances/0decb8c3-82ea-4251-8377-e18a207b6093/disk.config
Sep 30 21:54:56 compute-0 nova_compute[192810]: 2025-09-30 21:54:56.262 2 DEBUG oslo_concurrency.processutils [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0decb8c3-82ea-4251-8377-e18a207b6093/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1sgi9dnh execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:54:56 compute-0 nova_compute[192810]: 2025-09-30 21:54:56.390 2 DEBUG oslo_concurrency.processutils [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0decb8c3-82ea-4251-8377-e18a207b6093/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1sgi9dnh" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:54:56 compute-0 kernel: tapebfc0c9a-67: entered promiscuous mode
Sep 30 21:54:56 compute-0 NetworkManager[51733]: <info>  [1759269296.4545] manager: (tapebfc0c9a-67): new Tun device (/org/freedesktop/NetworkManager/Devices/330)
Sep 30 21:54:56 compute-0 ovn_controller[94912]: 2025-09-30T21:54:56Z|00742|binding|INFO|Claiming lport ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 for this chassis.
Sep 30 21:54:56 compute-0 ovn_controller[94912]: 2025-09-30T21:54:56Z|00743|binding|INFO|ebfc0c9a-67e0-40a1-abf3-105c7c3435b7: Claiming fa:16:3e:24:c3:46 10.100.0.7
Sep 30 21:54:56 compute-0 nova_compute[192810]: 2025-09-30 21:54:56.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:56 compute-0 nova_compute[192810]: 2025-09-30 21:54:56.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:56.467 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:24:c3:46 10.100.0.7'], port_security=['fa:16:3e:24:c3:46 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '0decb8c3-82ea-4251-8377-e18a207b6093', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-724cab50-b368-4e40-a600-414c68f09e7c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cc0e7b48-5c0d-472b-a638-38ad483cbcca', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=64928777-2ee7-41ca-81bb-a7afc7a99c4d, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=ebfc0c9a-67e0-40a1-abf3-105c7c3435b7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:56.469 103867 INFO neutron.agent.ovn.metadata.agent [-] Port ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 in datapath 724cab50-b368-4e40-a600-414c68f09e7c bound to our chassis
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:56.471 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 724cab50-b368-4e40-a600-414c68f09e7c
Sep 30 21:54:56 compute-0 systemd-machined[152794]: New machine qemu-89-instance-000000b7.
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:56.483 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[25c562be-21e2-49ea-b66d-2d5d94987186]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:56.484 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap724cab50-b1 in ovnmeta-724cab50-b368-4e40-a600-414c68f09e7c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:56.486 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap724cab50-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:56.486 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[6fccfa3f-b471-4e6d-8519-e36d13652181]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:56.487 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[9b2ac5ea-a0a8-4fde-aab0-48e4805ea642]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:56.496 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[7d1e81b6-b78b-43fc-a808-4c208de80d2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:56 compute-0 systemd[1]: Started Virtual Machine qemu-89-instance-000000b7.
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:56.510 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[60d9254e-37ce-47e1-a938-d73f5d7cb2d9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:56 compute-0 nova_compute[192810]: 2025-09-30 21:54:56.513 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:56 compute-0 ovn_controller[94912]: 2025-09-30T21:54:56Z|00744|binding|INFO|Setting lport ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 ovn-installed in OVS
Sep 30 21:54:56 compute-0 ovn_controller[94912]: 2025-09-30T21:54:56Z|00745|binding|INFO|Setting lport ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 up in Southbound
Sep 30 21:54:56 compute-0 nova_compute[192810]: 2025-09-30 21:54:56.518 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:56 compute-0 systemd-udevd[250502]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:54:56 compute-0 NetworkManager[51733]: <info>  [1759269296.5342] device (tapebfc0c9a-67): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:54:56 compute-0 NetworkManager[51733]: <info>  [1759269296.5354] device (tapebfc0c9a-67): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:56.538 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[40617fdc-1857-45ea-98c4-e9f57a52a111]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:56 compute-0 systemd-udevd[250505]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:54:56 compute-0 NetworkManager[51733]: <info>  [1759269296.5440] manager: (tap724cab50-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/331)
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:56.542 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[6c8bac9d-2119-4444-93f4-dabd940ddcad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:56 compute-0 nova_compute[192810]: 2025-09-30 21:54:56.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:56.545 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=45, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=44) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:56.575 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[53c8a873-51a3-44de-a5bf-f72be39afac4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:56.578 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[1ed9f216-d053-4506-8eb4-c3dbd9b0ed1f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:56 compute-0 NetworkManager[51733]: <info>  [1759269296.5981] device (tap724cab50-b0): carrier: link connected
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:56.602 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[807900d1-3c05-42e3-b6ae-56a5b22bbc07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:56.619 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[f312b758-8aaa-4a43-948e-6c340312c18a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap724cab50-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:70:61:a4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 224], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 599222, 'reachable_time': 39502, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 250531, 'error': None, 'target': 'ovnmeta-724cab50-b368-4e40-a600-414c68f09e7c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:56.634 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e236dec8-6dac-4a94-a763-408ae0e43d30]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe70:61a4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 599222, 'tstamp': 599222}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 250532, 'error': None, 'target': 'ovnmeta-724cab50-b368-4e40-a600-414c68f09e7c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:56.650 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[5c2bc55c-20b4-4e79-82c4-c3d598d0355f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap724cab50-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:70:61:a4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 224], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 599222, 'reachable_time': 39502, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 250533, 'error': None, 'target': 'ovnmeta-724cab50-b368-4e40-a600-414c68f09e7c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:56.681 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[fbe904e3-890d-49f0-b702-445ec968ded4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:56.743 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[7b9bface-50f9-4547-b579-e3dc1549c6d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:56.744 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap724cab50-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:56.745 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:56.745 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap724cab50-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:54:56 compute-0 kernel: tap724cab50-b0: entered promiscuous mode
Sep 30 21:54:56 compute-0 NetworkManager[51733]: <info>  [1759269296.7483] manager: (tap724cab50-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/332)
Sep 30 21:54:56 compute-0 nova_compute[192810]: 2025-09-30 21:54:56.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:56.750 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap724cab50-b0, col_values=(('external_ids', {'iface-id': '4630c540-3db6-41e7-8e0e-b05a231d7e73'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:54:56 compute-0 nova_compute[192810]: 2025-09-30 21:54:56.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:56 compute-0 ovn_controller[94912]: 2025-09-30T21:54:56Z|00746|binding|INFO|Releasing lport 4630c540-3db6-41e7-8e0e-b05a231d7e73 from this chassis (sb_readonly=0)
Sep 30 21:54:56 compute-0 nova_compute[192810]: 2025-09-30 21:54:56.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:56.755 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/724cab50-b368-4e40-a600-414c68f09e7c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/724cab50-b368-4e40-a600-414c68f09e7c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:56.756 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[4f3e91cf-23c5-4d77-844c-d557c2a3a377]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:56.757 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-724cab50-b368-4e40-a600-414c68f09e7c
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/724cab50-b368-4e40-a600-414c68f09e7c.pid.haproxy
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID 724cab50-b368-4e40-a600-414c68f09e7c
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:54:56 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:56.759 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-724cab50-b368-4e40-a600-414c68f09e7c', 'env', 'PROCESS_TAG=haproxy-724cab50-b368-4e40-a600-414c68f09e7c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/724cab50-b368-4e40-a600-414c68f09e7c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:54:56 compute-0 nova_compute[192810]: 2025-09-30 21:54:56.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:56 compute-0 nova_compute[192810]: 2025-09-30 21:54:56.873 2 DEBUG nova.compute.manager [req-8ea40cb5-0172-4d57-b0c0-19b987f1b841 req-74222f31-d1a2-4876-ade8-8a81ae7c06b3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Received event network-vif-plugged-ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:54:56 compute-0 nova_compute[192810]: 2025-09-30 21:54:56.874 2 DEBUG oslo_concurrency.lockutils [req-8ea40cb5-0172-4d57-b0c0-19b987f1b841 req-74222f31-d1a2-4876-ade8-8a81ae7c06b3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "0decb8c3-82ea-4251-8377-e18a207b6093-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:54:56 compute-0 nova_compute[192810]: 2025-09-30 21:54:56.874 2 DEBUG oslo_concurrency.lockutils [req-8ea40cb5-0172-4d57-b0c0-19b987f1b841 req-74222f31-d1a2-4876-ade8-8a81ae7c06b3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "0decb8c3-82ea-4251-8377-e18a207b6093-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:54:56 compute-0 nova_compute[192810]: 2025-09-30 21:54:56.874 2 DEBUG oslo_concurrency.lockutils [req-8ea40cb5-0172-4d57-b0c0-19b987f1b841 req-74222f31-d1a2-4876-ade8-8a81ae7c06b3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "0decb8c3-82ea-4251-8377-e18a207b6093-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:54:56 compute-0 nova_compute[192810]: 2025-09-30 21:54:56.875 2 DEBUG nova.compute.manager [req-8ea40cb5-0172-4d57-b0c0-19b987f1b841 req-74222f31-d1a2-4876-ade8-8a81ae7c06b3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Processing event network-vif-plugged-ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:54:56 compute-0 sshd-session[250356]: error: maximum authentication attempts exceeded for invalid user ubuntu from 8.210.178.40 port 34334 ssh2 [preauth]
Sep 30 21:54:56 compute-0 sshd-session[250356]: Disconnecting invalid user ubuntu 8.210.178.40 port 34334: Too many authentication failures [preauth]
Sep 30 21:54:56 compute-0 sshd-session[250356]: PAM 5 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40
Sep 30 21:54:56 compute-0 sshd-session[250356]: PAM service(sshd) ignoring max retries; 6 > 3
Sep 30 21:54:57 compute-0 nova_compute[192810]: 2025-09-30 21:54:57.087 2 DEBUG nova.network.neutron [req-33b40b72-9305-41a7-8039-aea9a9df3f35 req-fa24a6d2-831f-4bd9-8887-edd9c7ca417a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Updated VIF entry in instance network info cache for port ebfc0c9a-67e0-40a1-abf3-105c7c3435b7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:54:57 compute-0 nova_compute[192810]: 2025-09-30 21:54:57.088 2 DEBUG nova.network.neutron [req-33b40b72-9305-41a7-8039-aea9a9df3f35 req-fa24a6d2-831f-4bd9-8887-edd9c7ca417a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Updating instance_info_cache with network_info: [{"id": "ebfc0c9a-67e0-40a1-abf3-105c7c3435b7", "address": "fa:16:3e:24:c3:46", "network": {"id": "724cab50-b368-4e40-a600-414c68f09e7c", "bridge": "br-int", "label": "tempest-network-smoke--1524279325", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebfc0c9a-67", "ovs_interfaceid": "ebfc0c9a-67e0-40a1-abf3-105c7c3435b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:54:57 compute-0 nova_compute[192810]: 2025-09-30 21:54:57.112 2 DEBUG oslo_concurrency.lockutils [req-33b40b72-9305-41a7-8039-aea9a9df3f35 req-fa24a6d2-831f-4bd9-8887-edd9c7ca417a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-0decb8c3-82ea-4251-8377-e18a207b6093" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:54:57 compute-0 podman[250571]: 2025-09-30 21:54:57.146683137 +0000 UTC m=+0.054439162 container create c58720b01ef16d6e9aafbf12413f5a6481c51dff1ad2a1662a4ba4befa0ebd13 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-724cab50-b368-4e40-a600-414c68f09e7c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.license=GPLv2)
Sep 30 21:54:57 compute-0 systemd[1]: Started libpod-conmon-c58720b01ef16d6e9aafbf12413f5a6481c51dff1ad2a1662a4ba4befa0ebd13.scope.
Sep 30 21:54:57 compute-0 podman[250571]: 2025-09-30 21:54:57.117953658 +0000 UTC m=+0.025709693 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:54:57 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:54:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c223fa2b5bd6c713f6426c1616222d3f73b711426146c14b7451aa28cfe0d795/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:54:57 compute-0 podman[250571]: 2025-09-30 21:54:57.250129259 +0000 UTC m=+0.157885304 container init c58720b01ef16d6e9aafbf12413f5a6481c51dff1ad2a1662a4ba4befa0ebd13 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-724cab50-b368-4e40-a600-414c68f09e7c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Sep 30 21:54:57 compute-0 podman[250571]: 2025-09-30 21:54:57.255341362 +0000 UTC m=+0.163097377 container start c58720b01ef16d6e9aafbf12413f5a6481c51dff1ad2a1662a4ba4befa0ebd13 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-724cab50-b368-4e40-a600-414c68f09e7c, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:54:57 compute-0 neutron-haproxy-ovnmeta-724cab50-b368-4e40-a600-414c68f09e7c[250588]: [NOTICE]   (250592) : New worker (250594) forked
Sep 30 21:54:57 compute-0 neutron-haproxy-ovnmeta-724cab50-b368-4e40-a600-414c68f09e7c[250588]: [NOTICE]   (250592) : Loading success.
Sep 30 21:54:57 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:54:57.307 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:54:57 compute-0 nova_compute[192810]: 2025-09-30 21:54:57.538 2 DEBUG nova.compute.manager [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:54:57 compute-0 nova_compute[192810]: 2025-09-30 21:54:57.539 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759269297.5380516, 0decb8c3-82ea-4251-8377-e18a207b6093 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:54:57 compute-0 nova_compute[192810]: 2025-09-30 21:54:57.539 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] VM Started (Lifecycle Event)
Sep 30 21:54:57 compute-0 nova_compute[192810]: 2025-09-30 21:54:57.543 2 DEBUG nova.virt.libvirt.driver [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:54:57 compute-0 nova_compute[192810]: 2025-09-30 21:54:57.546 2 INFO nova.virt.libvirt.driver [-] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Instance spawned successfully.
Sep 30 21:54:57 compute-0 nova_compute[192810]: 2025-09-30 21:54:57.546 2 DEBUG nova.virt.libvirt.driver [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Sep 30 21:54:57 compute-0 nova_compute[192810]: 2025-09-30 21:54:57.569 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:54:57 compute-0 nova_compute[192810]: 2025-09-30 21:54:57.575 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:54:57 compute-0 nova_compute[192810]: 2025-09-30 21:54:57.578 2 DEBUG nova.virt.libvirt.driver [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:54:57 compute-0 nova_compute[192810]: 2025-09-30 21:54:57.578 2 DEBUG nova.virt.libvirt.driver [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:54:57 compute-0 nova_compute[192810]: 2025-09-30 21:54:57.579 2 DEBUG nova.virt.libvirt.driver [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:54:57 compute-0 nova_compute[192810]: 2025-09-30 21:54:57.579 2 DEBUG nova.virt.libvirt.driver [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:54:57 compute-0 nova_compute[192810]: 2025-09-30 21:54:57.579 2 DEBUG nova.virt.libvirt.driver [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:54:57 compute-0 nova_compute[192810]: 2025-09-30 21:54:57.580 2 DEBUG nova.virt.libvirt.driver [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Sep 30 21:54:57 compute-0 nova_compute[192810]: 2025-09-30 21:54:57.613 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:54:57 compute-0 nova_compute[192810]: 2025-09-30 21:54:57.613 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759269297.5391064, 0decb8c3-82ea-4251-8377-e18a207b6093 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:54:57 compute-0 nova_compute[192810]: 2025-09-30 21:54:57.613 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] VM Paused (Lifecycle Event)
Sep 30 21:54:57 compute-0 nova_compute[192810]: 2025-09-30 21:54:57.644 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:54:57 compute-0 nova_compute[192810]: 2025-09-30 21:54:57.647 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759269297.5424187, 0decb8c3-82ea-4251-8377-e18a207b6093 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:54:57 compute-0 nova_compute[192810]: 2025-09-30 21:54:57.647 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] VM Resumed (Lifecycle Event)
Sep 30 21:54:57 compute-0 nova_compute[192810]: 2025-09-30 21:54:57.673 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:54:57 compute-0 nova_compute[192810]: 2025-09-30 21:54:57.674 2 INFO nova.compute.manager [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Took 8.10 seconds to spawn the instance on the hypervisor.
Sep 30 21:54:57 compute-0 nova_compute[192810]: 2025-09-30 21:54:57.675 2 DEBUG nova.compute.manager [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:54:57 compute-0 nova_compute[192810]: 2025-09-30 21:54:57.678 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:54:57 compute-0 nova_compute[192810]: 2025-09-30 21:54:57.717 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:54:57 compute-0 nova_compute[192810]: 2025-09-30 21:54:57.762 2 INFO nova.compute.manager [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Took 9.05 seconds to build instance.
Sep 30 21:54:57 compute-0 nova_compute[192810]: 2025-09-30 21:54:57.788 2 DEBUG oslo_concurrency.lockutils [None req-b253376f-722f-46e9-8e4d-a18fd2bff5c9 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "0decb8c3-82ea-4251-8377-e18a207b6093" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.216s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:54:58 compute-0 podman[250606]: 2025-09-30 21:54:58.320318824 +0000 UTC m=+0.052647176 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250923)
Sep 30 21:54:58 compute-0 podman[250607]: 2025-09-30 21:54:58.334578645 +0000 UTC m=+0.061366657 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:54:58 compute-0 podman[250605]: 2025-09-30 21:54:58.351270769 +0000 UTC m=+0.078383899 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923)
Sep 30 21:54:58 compute-0 sshd-session[250603]: Invalid user ubuntu from 8.210.178.40 port 34982
Sep 30 21:54:58 compute-0 sshd-session[250603]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:54:58 compute-0 sshd-session[250603]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40
Sep 30 21:54:58 compute-0 nova_compute[192810]: 2025-09-30 21:54:58.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:54:58 compute-0 nova_compute[192810]: 2025-09-30 21:54:58.974 2 DEBUG nova.compute.manager [req-23209137-7a5e-4585-9a37-5e202836cf50 req-89e58946-5224-4401-bfa1-38166d5babdc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Received event network-vif-plugged-ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:54:58 compute-0 nova_compute[192810]: 2025-09-30 21:54:58.974 2 DEBUG oslo_concurrency.lockutils [req-23209137-7a5e-4585-9a37-5e202836cf50 req-89e58946-5224-4401-bfa1-38166d5babdc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "0decb8c3-82ea-4251-8377-e18a207b6093-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:54:58 compute-0 nova_compute[192810]: 2025-09-30 21:54:58.974 2 DEBUG oslo_concurrency.lockutils [req-23209137-7a5e-4585-9a37-5e202836cf50 req-89e58946-5224-4401-bfa1-38166d5babdc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "0decb8c3-82ea-4251-8377-e18a207b6093-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:54:58 compute-0 nova_compute[192810]: 2025-09-30 21:54:58.975 2 DEBUG oslo_concurrency.lockutils [req-23209137-7a5e-4585-9a37-5e202836cf50 req-89e58946-5224-4401-bfa1-38166d5babdc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "0decb8c3-82ea-4251-8377-e18a207b6093-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:54:58 compute-0 nova_compute[192810]: 2025-09-30 21:54:58.975 2 DEBUG nova.compute.manager [req-23209137-7a5e-4585-9a37-5e202836cf50 req-89e58946-5224-4401-bfa1-38166d5babdc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] No waiting events found dispatching network-vif-plugged-ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:54:58 compute-0 nova_compute[192810]: 2025-09-30 21:54:58.975 2 WARNING nova.compute.manager [req-23209137-7a5e-4585-9a37-5e202836cf50 req-89e58946-5224-4401-bfa1-38166d5babdc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Received unexpected event network-vif-plugged-ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 for instance with vm_state active and task_state None.
Sep 30 21:55:00 compute-0 sshd-session[250603]: Failed password for invalid user ubuntu from 8.210.178.40 port 34982 ssh2
Sep 30 21:55:00 compute-0 nova_compute[192810]: 2025-09-30 21:55:00.589 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:00 compute-0 sshd-session[250603]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:55:01 compute-0 nova_compute[192810]: 2025-09-30 21:55:01.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:01 compute-0 NetworkManager[51733]: <info>  [1759269301.2434] manager: (patch-br-int-to-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/333)
Sep 30 21:55:01 compute-0 NetworkManager[51733]: <info>  [1759269301.2443] manager: (patch-provnet-9d12d22a-d8f3-49f8-bf97-88cbf8c3d41a-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/334)
Sep 30 21:55:01 compute-0 nova_compute[192810]: 2025-09-30 21:55:01.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:01 compute-0 ovn_controller[94912]: 2025-09-30T21:55:01Z|00747|binding|INFO|Releasing lport 4630c540-3db6-41e7-8e0e-b05a231d7e73 from this chassis (sb_readonly=0)
Sep 30 21:55:01 compute-0 nova_compute[192810]: 2025-09-30 21:55:01.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:01 compute-0 nova_compute[192810]: 2025-09-30 21:55:01.448 2 DEBUG nova.compute.manager [req-5e3a8ee6-5e22-498a-a519-50bc14817013 req-688ab2bd-8f05-4b96-9977-22b803594574 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Received event network-changed-ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:55:01 compute-0 nova_compute[192810]: 2025-09-30 21:55:01.449 2 DEBUG nova.compute.manager [req-5e3a8ee6-5e22-498a-a519-50bc14817013 req-688ab2bd-8f05-4b96-9977-22b803594574 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Refreshing instance network info cache due to event network-changed-ebfc0c9a-67e0-40a1-abf3-105c7c3435b7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:55:01 compute-0 nova_compute[192810]: 2025-09-30 21:55:01.449 2 DEBUG oslo_concurrency.lockutils [req-5e3a8ee6-5e22-498a-a519-50bc14817013 req-688ab2bd-8f05-4b96-9977-22b803594574 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-0decb8c3-82ea-4251-8377-e18a207b6093" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:55:01 compute-0 nova_compute[192810]: 2025-09-30 21:55:01.449 2 DEBUG oslo_concurrency.lockutils [req-5e3a8ee6-5e22-498a-a519-50bc14817013 req-688ab2bd-8f05-4b96-9977-22b803594574 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-0decb8c3-82ea-4251-8377-e18a207b6093" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:55:01 compute-0 nova_compute[192810]: 2025-09-30 21:55:01.450 2 DEBUG nova.network.neutron [req-5e3a8ee6-5e22-498a-a519-50bc14817013 req-688ab2bd-8f05-4b96-9977-22b803594574 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Refreshing network info cache for port ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:55:02 compute-0 nova_compute[192810]: 2025-09-30 21:55:02.738 2 DEBUG nova.network.neutron [req-5e3a8ee6-5e22-498a-a519-50bc14817013 req-688ab2bd-8f05-4b96-9977-22b803594574 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Updated VIF entry in instance network info cache for port ebfc0c9a-67e0-40a1-abf3-105c7c3435b7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:55:02 compute-0 nova_compute[192810]: 2025-09-30 21:55:02.739 2 DEBUG nova.network.neutron [req-5e3a8ee6-5e22-498a-a519-50bc14817013 req-688ab2bd-8f05-4b96-9977-22b803594574 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Updating instance_info_cache with network_info: [{"id": "ebfc0c9a-67e0-40a1-abf3-105c7c3435b7", "address": "fa:16:3e:24:c3:46", "network": {"id": "724cab50-b368-4e40-a600-414c68f09e7c", "bridge": "br-int", "label": "tempest-network-smoke--1524279325", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebfc0c9a-67", "ovs_interfaceid": "ebfc0c9a-67e0-40a1-abf3-105c7c3435b7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:55:02 compute-0 nova_compute[192810]: 2025-09-30 21:55:02.766 2 DEBUG oslo_concurrency.lockutils [req-5e3a8ee6-5e22-498a-a519-50bc14817013 req-688ab2bd-8f05-4b96-9977-22b803594574 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-0decb8c3-82ea-4251-8377-e18a207b6093" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:55:02 compute-0 nova_compute[192810]: 2025-09-30 21:55:02.786 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:55:03 compute-0 sshd-session[250603]: Failed password for invalid user ubuntu from 8.210.178.40 port 34982 ssh2
Sep 30 21:55:03 compute-0 nova_compute[192810]: 2025-09-30 21:55:03.739 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:03 compute-0 nova_compute[192810]: 2025-09-30 21:55:03.911 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:04 compute-0 sshd-session[250603]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:55:05 compute-0 nova_compute[192810]: 2025-09-30 21:55:05.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:05 compute-0 nova_compute[192810]: 2025-09-30 21:55:05.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:55:05 compute-0 nova_compute[192810]: 2025-09-30 21:55:05.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:55:06 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:06.310 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '45'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:55:07 compute-0 sshd-session[250603]: Failed password for invalid user ubuntu from 8.210.178.40 port 34982 ssh2
Sep 30 21:55:07 compute-0 nova_compute[192810]: 2025-09-30 21:55:07.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:55:07 compute-0 nova_compute[192810]: 2025-09-30 21:55:07.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:55:07 compute-0 nova_compute[192810]: 2025-09-30 21:55:07.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Sep 30 21:55:08 compute-0 podman[250672]: 2025-09-30 21:55:08.332067867 +0000 UTC m=+0.058623327 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-type=git, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, distribution-scope=public, io.buildah.version=1.33.7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 21:55:08 compute-0 podman[250671]: 2025-09-30 21:55:08.339315751 +0000 UTC m=+0.065355998 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:55:08 compute-0 nova_compute[192810]: 2025-09-30 21:55:08.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:09 compute-0 sshd-session[250603]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:55:09 compute-0 ovn_controller[94912]: 2025-09-30T21:55:09Z|00084|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:24:c3:46 10.100.0.7
Sep 30 21:55:09 compute-0 ovn_controller[94912]: 2025-09-30T21:55:09Z|00085|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:24:c3:46 10.100.0.7
Sep 30 21:55:10 compute-0 nova_compute[192810]: 2025-09-30 21:55:10.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:10 compute-0 sshd-session[250603]: Failed password for invalid user ubuntu from 8.210.178.40 port 34982 ssh2
Sep 30 21:55:11 compute-0 sshd-session[250603]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:55:11 compute-0 nova_compute[192810]: 2025-09-30 21:55:11.803 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:55:11 compute-0 nova_compute[192810]: 2025-09-30 21:55:11.803 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:55:11 compute-0 nova_compute[192810]: 2025-09-30 21:55:11.803 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:55:12 compute-0 nova_compute[192810]: 2025-09-30 21:55:12.010 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "refresh_cache-0decb8c3-82ea-4251-8377-e18a207b6093" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:55:12 compute-0 nova_compute[192810]: 2025-09-30 21:55:12.010 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquired lock "refresh_cache-0decb8c3-82ea-4251-8377-e18a207b6093" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:55:12 compute-0 nova_compute[192810]: 2025-09-30 21:55:12.010 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Sep 30 21:55:12 compute-0 nova_compute[192810]: 2025-09-30 21:55:12.011 2 DEBUG nova.objects.instance [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lazy-loading 'info_cache' on Instance uuid 0decb8c3-82ea-4251-8377-e18a207b6093 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:55:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:12.029 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c0:38:30 10.100.0.2 2001:db8::f816:3eff:fec0:3830'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fec0:3830/64', 'neutron:device_id': 'ovnmeta-412a19ab-c94f-46ed-9c4e-c69fc7962be3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-412a19ab-c94f-46ed-9c4e-c69fc7962be3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f488970f-e2de-4ce9-9091-bbb5c56f0cb2, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=43244ebf-4de3-435e-a1f4-6cebae949e6e) old=Port_Binding(mac=['fa:16:3e:c0:38:30 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-412a19ab-c94f-46ed-9c4e-c69fc7962be3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-412a19ab-c94f-46ed-9c4e-c69fc7962be3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:55:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:12.031 103867 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 43244ebf-4de3-435e-a1f4-6cebae949e6e in datapath 412a19ab-c94f-46ed-9c4e-c69fc7962be3 updated
Sep 30 21:55:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:12.034 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 412a19ab-c94f-46ed-9c4e-c69fc7962be3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:55:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:12.036 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[0218f2af-e2ae-4ff7-ac74-efb6154dcb8c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:13 compute-0 sshd-session[250603]: Failed password for invalid user ubuntu from 8.210.178.40 port 34982 ssh2
Sep 30 21:55:13 compute-0 nova_compute[192810]: 2025-09-30 21:55:13.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:14 compute-0 nova_compute[192810]: 2025-09-30 21:55:14.019 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Updating instance_info_cache with network_info: [{"id": "ebfc0c9a-67e0-40a1-abf3-105c7c3435b7", "address": "fa:16:3e:24:c3:46", "network": {"id": "724cab50-b368-4e40-a600-414c68f09e7c", "bridge": "br-int", "label": "tempest-network-smoke--1524279325", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebfc0c9a-67", "ovs_interfaceid": "ebfc0c9a-67e0-40a1-abf3-105c7c3435b7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:55:14 compute-0 nova_compute[192810]: 2025-09-30 21:55:14.038 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Releasing lock "refresh_cache-0decb8c3-82ea-4251-8377-e18a207b6093" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:55:14 compute-0 nova_compute[192810]: 2025-09-30 21:55:14.039 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Sep 30 21:55:14 compute-0 nova_compute[192810]: 2025-09-30 21:55:14.039 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:55:14 compute-0 nova_compute[192810]: 2025-09-30 21:55:14.040 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Sep 30 21:55:14 compute-0 nova_compute[192810]: 2025-09-30 21:55:14.055 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Sep 30 21:55:14 compute-0 nova_compute[192810]: 2025-09-30 21:55:14.802 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:55:15 compute-0 sshd-session[250603]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:55:15 compute-0 podman[250731]: 2025-09-30 21:55:15.585471064 +0000 UTC m=+0.049932577 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_managed=true)
Sep 30 21:55:15 compute-0 podman[250730]: 2025-09-30 21:55:15.589507096 +0000 UTC m=+0.053780845 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Sep 30 21:55:15 compute-0 nova_compute[192810]: 2025-09-30 21:55:15.599 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:15 compute-0 podman[250732]: 2025-09-30 21:55:15.603603833 +0000 UTC m=+0.063840489 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:55:16 compute-0 nova_compute[192810]: 2025-09-30 21:55:16.683 2 INFO nova.compute.manager [None req-2cf64877-15dd-4ca3-bb0a-c0e7f995b054 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Get console output
Sep 30 21:55:16 compute-0 nova_compute[192810]: 2025-09-30 21:55:16.690 54 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Sep 30 21:55:16 compute-0 nova_compute[192810]: 2025-09-30 21:55:16.784 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:55:16 compute-0 nova_compute[192810]: 2025-09-30 21:55:16.786 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:55:17 compute-0 nova_compute[192810]: 2025-09-30 21:55:17.052 2 DEBUG nova.objects.instance [None req-ef2bd6fa-f6fd-4ce7-b73e-2c3ec4f61b7f 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0decb8c3-82ea-4251-8377-e18a207b6093 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:55:17 compute-0 nova_compute[192810]: 2025-09-30 21:55:17.074 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759269317.0745907, 0decb8c3-82ea-4251-8377-e18a207b6093 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:55:17 compute-0 nova_compute[192810]: 2025-09-30 21:55:17.075 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] VM Paused (Lifecycle Event)
Sep 30 21:55:17 compute-0 nova_compute[192810]: 2025-09-30 21:55:17.102 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:55:17 compute-0 nova_compute[192810]: 2025-09-30 21:55:17.108 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:55:17 compute-0 nova_compute[192810]: 2025-09-30 21:55:17.158 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] During sync_power_state the instance has a pending task (suspending). Skip.
Sep 30 21:55:17 compute-0 sshd-session[250603]: Failed password for invalid user ubuntu from 8.210.178.40 port 34982 ssh2
Sep 30 21:55:17 compute-0 kernel: tapebfc0c9a-67 (unregistering): left promiscuous mode
Sep 30 21:55:17 compute-0 NetworkManager[51733]: <info>  [1759269317.7853] device (tapebfc0c9a-67): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:55:17 compute-0 nova_compute[192810]: 2025-09-30 21:55:17.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:55:17 compute-0 ovn_controller[94912]: 2025-09-30T21:55:17Z|00748|binding|INFO|Releasing lport ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 from this chassis (sb_readonly=0)
Sep 30 21:55:17 compute-0 ovn_controller[94912]: 2025-09-30T21:55:17Z|00749|binding|INFO|Setting lport ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 down in Southbound
Sep 30 21:55:17 compute-0 ovn_controller[94912]: 2025-09-30T21:55:17Z|00750|binding|INFO|Removing iface tapebfc0c9a-67 ovn-installed in OVS
Sep 30 21:55:17 compute-0 nova_compute[192810]: 2025-09-30 21:55:17.803 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:17 compute-0 nova_compute[192810]: 2025-09-30 21:55:17.804 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:17 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:17.810 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:24:c3:46 10.100.0.7'], port_security=['fa:16:3e:24:c3:46 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '0decb8c3-82ea-4251-8377-e18a207b6093', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-724cab50-b368-4e40-a600-414c68f09e7c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cc0e7b48-5c0d-472b-a638-38ad483cbcca', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.214'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=64928777-2ee7-41ca-81bb-a7afc7a99c4d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=ebfc0c9a-67e0-40a1-abf3-105c7c3435b7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:55:17 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:17.811 103867 INFO neutron.agent.ovn.metadata.agent [-] Port ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 in datapath 724cab50-b368-4e40-a600-414c68f09e7c unbound from our chassis
Sep 30 21:55:17 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:17.812 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 724cab50-b368-4e40-a600-414c68f09e7c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:55:17 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:17.814 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[7791ff45-134b-461b-991e-3f52cdcafdd0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:17 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:17.814 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-724cab50-b368-4e40-a600-414c68f09e7c namespace which is not needed anymore
Sep 30 21:55:17 compute-0 nova_compute[192810]: 2025-09-30 21:55:17.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:17 compute-0 systemd[1]: machine-qemu\x2d89\x2dinstance\x2d000000b7.scope: Deactivated successfully.
Sep 30 21:55:17 compute-0 systemd[1]: machine-qemu\x2d89\x2dinstance\x2d000000b7.scope: Consumed 12.887s CPU time.
Sep 30 21:55:17 compute-0 systemd-machined[152794]: Machine qemu-89-instance-000000b7 terminated.
Sep 30 21:55:17 compute-0 neutron-haproxy-ovnmeta-724cab50-b368-4e40-a600-414c68f09e7c[250588]: [NOTICE]   (250592) : haproxy version is 2.8.14-c23fe91
Sep 30 21:55:17 compute-0 neutron-haproxy-ovnmeta-724cab50-b368-4e40-a600-414c68f09e7c[250588]: [NOTICE]   (250592) : path to executable is /usr/sbin/haproxy
Sep 30 21:55:17 compute-0 neutron-haproxy-ovnmeta-724cab50-b368-4e40-a600-414c68f09e7c[250588]: [WARNING]  (250592) : Exiting Master process...
Sep 30 21:55:17 compute-0 neutron-haproxy-ovnmeta-724cab50-b368-4e40-a600-414c68f09e7c[250588]: [ALERT]    (250592) : Current worker (250594) exited with code 143 (Terminated)
Sep 30 21:55:17 compute-0 neutron-haproxy-ovnmeta-724cab50-b368-4e40-a600-414c68f09e7c[250588]: [WARNING]  (250592) : All workers exited. Exiting... (0)
Sep 30 21:55:17 compute-0 systemd[1]: libpod-c58720b01ef16d6e9aafbf12413f5a6481c51dff1ad2a1662a4ba4befa0ebd13.scope: Deactivated successfully.
Sep 30 21:55:17 compute-0 podman[250824]: 2025-09-30 21:55:17.9608714 +0000 UTC m=+0.052404069 container died c58720b01ef16d6e9aafbf12413f5a6481c51dff1ad2a1662a4ba4befa0ebd13 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-724cab50-b368-4e40-a600-414c68f09e7c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Sep 30 21:55:17 compute-0 NetworkManager[51733]: <info>  [1759269317.9772] manager: (tapebfc0c9a-67): new Tun device (/org/freedesktop/NetworkManager/Devices/335)
Sep 30 21:55:17 compute-0 kernel: tapebfc0c9a-67: entered promiscuous mode
Sep 30 21:55:17 compute-0 systemd-udevd[250801]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:55:17 compute-0 kernel: tapebfc0c9a-67 (unregistering): left promiscuous mode
Sep 30 21:55:17 compute-0 ovn_controller[94912]: 2025-09-30T21:55:17Z|00751|binding|INFO|Claiming lport ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 for this chassis.
Sep 30 21:55:17 compute-0 nova_compute[192810]: 2025-09-30 21:55:17.978 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:17 compute-0 ovn_controller[94912]: 2025-09-30T21:55:17Z|00752|binding|INFO|ebfc0c9a-67e0-40a1-abf3-105c7c3435b7: Claiming fa:16:3e:24:c3:46 10.100.0.7
Sep 30 21:55:17 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c58720b01ef16d6e9aafbf12413f5a6481c51dff1ad2a1662a4ba4befa0ebd13-userdata-shm.mount: Deactivated successfully.
Sep 30 21:55:17 compute-0 systemd[1]: var-lib-containers-storage-overlay-c223fa2b5bd6c713f6426c1616222d3f73b711426146c14b7451aa28cfe0d795-merged.mount: Deactivated successfully.
Sep 30 21:55:17 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:17.997 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:24:c3:46 10.100.0.7'], port_security=['fa:16:3e:24:c3:46 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '0decb8c3-82ea-4251-8377-e18a207b6093', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-724cab50-b368-4e40-a600-414c68f09e7c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cc0e7b48-5c0d-472b-a638-38ad483cbcca', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.214'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=64928777-2ee7-41ca-81bb-a7afc7a99c4d, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=ebfc0c9a-67e0-40a1-abf3-105c7c3435b7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:55:18 compute-0 nova_compute[192810]: 2025-09-30 21:55:18.001 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:18 compute-0 ovn_controller[94912]: 2025-09-30T21:55:18Z|00753|binding|INFO|Setting lport ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 ovn-installed in OVS
Sep 30 21:55:18 compute-0 ovn_controller[94912]: 2025-09-30T21:55:18Z|00754|binding|INFO|Setting lport ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 up in Southbound
Sep 30 21:55:18 compute-0 ovn_controller[94912]: 2025-09-30T21:55:18Z|00755|binding|INFO|Releasing lport ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 from this chassis (sb_readonly=1)
Sep 30 21:55:18 compute-0 nova_compute[192810]: 2025-09-30 21:55:18.003 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:18 compute-0 ovn_controller[94912]: 2025-09-30T21:55:18Z|00756|if_status|INFO|Dropped 2 log messages in last 605 seconds (most recently, 605 seconds ago) due to excessive rate
Sep 30 21:55:18 compute-0 ovn_controller[94912]: 2025-09-30T21:55:18Z|00757|if_status|INFO|Not setting lport ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 down as sb is readonly
Sep 30 21:55:18 compute-0 ovn_controller[94912]: 2025-09-30T21:55:18Z|00758|binding|INFO|Removing iface tapebfc0c9a-67 ovn-installed in OVS
Sep 30 21:55:18 compute-0 nova_compute[192810]: 2025-09-30 21:55:18.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:18 compute-0 podman[250824]: 2025-09-30 21:55:18.00699323 +0000 UTC m=+0.098525899 container cleanup c58720b01ef16d6e9aafbf12413f5a6481c51dff1ad2a1662a4ba4befa0ebd13 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-724cab50-b368-4e40-a600-414c68f09e7c, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 21:55:18 compute-0 ovn_controller[94912]: 2025-09-30T21:55:18Z|00759|binding|INFO|Releasing lport ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 from this chassis (sb_readonly=0)
Sep 30 21:55:18 compute-0 ovn_controller[94912]: 2025-09-30T21:55:18Z|00760|binding|INFO|Setting lport ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 down in Southbound
Sep 30 21:55:18 compute-0 nova_compute[192810]: 2025-09-30 21:55:18.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:18.016 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:24:c3:46 10.100.0.7'], port_security=['fa:16:3e:24:c3:46 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '0decb8c3-82ea-4251-8377-e18a207b6093', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-724cab50-b368-4e40-a600-414c68f09e7c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cc0e7b48-5c0d-472b-a638-38ad483cbcca', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.214'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=64928777-2ee7-41ca-81bb-a7afc7a99c4d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=ebfc0c9a-67e0-40a1-abf3-105c7c3435b7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:55:18 compute-0 systemd[1]: libpod-conmon-c58720b01ef16d6e9aafbf12413f5a6481c51dff1ad2a1662a4ba4befa0ebd13.scope: Deactivated successfully.
Sep 30 21:55:18 compute-0 nova_compute[192810]: 2025-09-30 21:55:18.029 2 DEBUG nova.compute.manager [None req-ef2bd6fa-f6fd-4ce7-b73e-2c3ec4f61b7f 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:55:18 compute-0 podman[250864]: 2025-09-30 21:55:18.077141018 +0000 UTC m=+0.045472274 container remove c58720b01ef16d6e9aafbf12413f5a6481c51dff1ad2a1662a4ba4befa0ebd13 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-724cab50-b368-4e40-a600-414c68f09e7c, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 21:55:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:18.092 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[240ed6e7-53b4-4bae-bda4-9ceaa8f23d8c]: (4, ('Tue Sep 30 09:55:17 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-724cab50-b368-4e40-a600-414c68f09e7c (c58720b01ef16d6e9aafbf12413f5a6481c51dff1ad2a1662a4ba4befa0ebd13)\nc58720b01ef16d6e9aafbf12413f5a6481c51dff1ad2a1662a4ba4befa0ebd13\nTue Sep 30 09:55:18 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-724cab50-b368-4e40-a600-414c68f09e7c (c58720b01ef16d6e9aafbf12413f5a6481c51dff1ad2a1662a4ba4befa0ebd13)\nc58720b01ef16d6e9aafbf12413f5a6481c51dff1ad2a1662a4ba4befa0ebd13\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:18.094 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[8160486b-55a0-4cf4-8614-de2cdd72ce2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:18.095 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap724cab50-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:55:18 compute-0 nova_compute[192810]: 2025-09-30 21:55:18.133 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:18 compute-0 kernel: tap724cab50-b0: left promiscuous mode
Sep 30 21:55:18 compute-0 nova_compute[192810]: 2025-09-30 21:55:18.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:18 compute-0 nova_compute[192810]: 2025-09-30 21:55:18.149 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:18.150 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[0a381ab5-5fa6-45eb-9208-7182fde9eae7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:18.181 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[d32265a5-74fa-4930-a3ab-597e16f0a675]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:18.182 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[04948e7d-24c3-46e8-825f-4624d534cbb6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:18.196 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[ef49d3db-4cdb-4dd4-8692-d89704883fec]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 599215, 'reachable_time': 37657, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 250885, 'error': None, 'target': 'ovnmeta-724cab50-b368-4e40-a600-414c68f09e7c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:18 compute-0 systemd[1]: run-netns-ovnmeta\x2d724cab50\x2db368\x2d4e40\x2da600\x2d414c68f09e7c.mount: Deactivated successfully.
Sep 30 21:55:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:18.198 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-724cab50-b368-4e40-a600-414c68f09e7c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:55:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:18.198 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[7b3af612-e407-43ce-8a72-2857a22603cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:18.199 103867 INFO neutron.agent.ovn.metadata.agent [-] Port ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 in datapath 724cab50-b368-4e40-a600-414c68f09e7c unbound from our chassis
Sep 30 21:55:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:18.200 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 724cab50-b368-4e40-a600-414c68f09e7c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:55:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:18.200 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[fe45f198-7434-4851-9ef8-e9a32f890811]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:18.201 103867 INFO neutron.agent.ovn.metadata.agent [-] Port ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 in datapath 724cab50-b368-4e40-a600-414c68f09e7c unbound from our chassis
Sep 30 21:55:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:18.202 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 724cab50-b368-4e40-a600-414c68f09e7c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:55:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:18.202 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[119abdaf-8e8e-4c8d-aa2e-f75f7d182c0f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:18.297 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c0:38:30 10.100.0.2 2001:db8:0:1:f816:3eff:fec0:3830 2001:db8::f816:3eff:fec0:3830'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8:0:1:f816:3eff:fec0:3830/64 2001:db8::f816:3eff:fec0:3830/64', 'neutron:device_id': 'ovnmeta-412a19ab-c94f-46ed-9c4e-c69fc7962be3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-412a19ab-c94f-46ed-9c4e-c69fc7962be3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f488970f-e2de-4ce9-9091-bbb5c56f0cb2, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=43244ebf-4de3-435e-a1f4-6cebae949e6e) old=Port_Binding(mac=['fa:16:3e:c0:38:30 10.100.0.2 2001:db8::f816:3eff:fec0:3830'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fec0:3830/64', 'neutron:device_id': 'ovnmeta-412a19ab-c94f-46ed-9c4e-c69fc7962be3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-412a19ab-c94f-46ed-9c4e-c69fc7962be3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:55:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:18.298 103867 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 43244ebf-4de3-435e-a1f4-6cebae949e6e in datapath 412a19ab-c94f-46ed-9c4e-c69fc7962be3 updated
Sep 30 21:55:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:18.299 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 412a19ab-c94f-46ed-9c4e-c69fc7962be3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:55:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:18.299 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[5556709d-922a-4f6e-a6ea-4d5a56321250]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:18 compute-0 nova_compute[192810]: 2025-09-30 21:55:18.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:18 compute-0 nova_compute[192810]: 2025-09-30 21:55:18.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:55:18 compute-0 nova_compute[192810]: 2025-09-30 21:55:18.814 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:55:18 compute-0 nova_compute[192810]: 2025-09-30 21:55:18.815 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:55:18 compute-0 nova_compute[192810]: 2025-09-30 21:55:18.815 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:55:18 compute-0 nova_compute[192810]: 2025-09-30 21:55:18.815 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:55:18 compute-0 nova_compute[192810]: 2025-09-30 21:55:18.864 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0decb8c3-82ea-4251-8377-e18a207b6093/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:55:18 compute-0 nova_compute[192810]: 2025-09-30 21:55:18.921 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0decb8c3-82ea-4251-8377-e18a207b6093/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:55:18 compute-0 nova_compute[192810]: 2025-09-30 21:55:18.922 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0decb8c3-82ea-4251-8377-e18a207b6093/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:55:18 compute-0 nova_compute[192810]: 2025-09-30 21:55:18.976 2 DEBUG oslo_concurrency.processutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0decb8c3-82ea-4251-8377-e18a207b6093/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:55:19 compute-0 nova_compute[192810]: 2025-09-30 21:55:19.123 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:55:19 compute-0 nova_compute[192810]: 2025-09-30 21:55:19.125 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5705MB free_disk=73.02674102783203GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:55:19 compute-0 nova_compute[192810]: 2025-09-30 21:55:19.125 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:55:19 compute-0 nova_compute[192810]: 2025-09-30 21:55:19.125 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:55:19 compute-0 nova_compute[192810]: 2025-09-30 21:55:19.210 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Instance 0decb8c3-82ea-4251-8377-e18a207b6093 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Sep 30 21:55:19 compute-0 nova_compute[192810]: 2025-09-30 21:55:19.211 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:55:19 compute-0 nova_compute[192810]: 2025-09-30 21:55:19.211 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:55:19 compute-0 nova_compute[192810]: 2025-09-30 21:55:19.322 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:55:19 compute-0 nova_compute[192810]: 2025-09-30 21:55:19.348 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:55:19 compute-0 nova_compute[192810]: 2025-09-30 21:55:19.368 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:55:19 compute-0 nova_compute[192810]: 2025-09-30 21:55:19.368 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.243s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:55:19 compute-0 sshd-session[250603]: error: maximum authentication attempts exceeded for invalid user ubuntu from 8.210.178.40 port 34982 ssh2 [preauth]
Sep 30 21:55:19 compute-0 sshd-session[250603]: Disconnecting invalid user ubuntu 8.210.178.40 port 34982: Too many authentication failures [preauth]
Sep 30 21:55:19 compute-0 sshd-session[250603]: PAM 5 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40
Sep 30 21:55:19 compute-0 sshd-session[250603]: PAM service(sshd) ignoring max retries; 6 > 3
Sep 30 21:55:20 compute-0 nova_compute[192810]: 2025-09-30 21:55:20.602 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:20 compute-0 nova_compute[192810]: 2025-09-30 21:55:20.819 2 INFO nova.compute.manager [None req-0e13cea6-b006-490b-b04a-062ff85f313c 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Get console output
Sep 30 21:55:21 compute-0 sshd-session[250893]: Invalid user ubuntu from 8.210.178.40 port 35696
Sep 30 21:55:21 compute-0 sshd-session[250893]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:55:21 compute-0 sshd-session[250893]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40
Sep 30 21:55:21 compute-0 nova_compute[192810]: 2025-09-30 21:55:21.136 2 INFO nova.compute.manager [None req-0c47dbde-6308-49f8-a64a-9835561fa0f3 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Resuming
Sep 30 21:55:21 compute-0 nova_compute[192810]: 2025-09-30 21:55:21.137 2 DEBUG nova.objects.instance [None req-0c47dbde-6308-49f8-a64a-9835561fa0f3 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'flavor' on Instance uuid 0decb8c3-82ea-4251-8377-e18a207b6093 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:55:21 compute-0 nova_compute[192810]: 2025-09-30 21:55:21.184 2 DEBUG oslo_concurrency.lockutils [None req-0c47dbde-6308-49f8-a64a-9835561fa0f3 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "refresh_cache-0decb8c3-82ea-4251-8377-e18a207b6093" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:55:21 compute-0 nova_compute[192810]: 2025-09-30 21:55:21.185 2 DEBUG oslo_concurrency.lockutils [None req-0c47dbde-6308-49f8-a64a-9835561fa0f3 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquired lock "refresh_cache-0decb8c3-82ea-4251-8377-e18a207b6093" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:55:21 compute-0 nova_compute[192810]: 2025-09-30 21:55:21.185 2 DEBUG nova.network.neutron [None req-0c47dbde-6308-49f8-a64a-9835561fa0f3 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:55:22 compute-0 nova_compute[192810]: 2025-09-30 21:55:22.502 2 DEBUG nova.compute.manager [req-6c81b17e-45d9-4e84-92ec-675552e8f2ef req-fc030f48-855d-4b63-9bcc-a43ad5fcba0a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Received event network-vif-unplugged-ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:55:22 compute-0 nova_compute[192810]: 2025-09-30 21:55:22.503 2 DEBUG oslo_concurrency.lockutils [req-6c81b17e-45d9-4e84-92ec-675552e8f2ef req-fc030f48-855d-4b63-9bcc-a43ad5fcba0a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "0decb8c3-82ea-4251-8377-e18a207b6093-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:55:22 compute-0 nova_compute[192810]: 2025-09-30 21:55:22.504 2 DEBUG oslo_concurrency.lockutils [req-6c81b17e-45d9-4e84-92ec-675552e8f2ef req-fc030f48-855d-4b63-9bcc-a43ad5fcba0a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "0decb8c3-82ea-4251-8377-e18a207b6093-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:55:22 compute-0 nova_compute[192810]: 2025-09-30 21:55:22.504 2 DEBUG oslo_concurrency.lockutils [req-6c81b17e-45d9-4e84-92ec-675552e8f2ef req-fc030f48-855d-4b63-9bcc-a43ad5fcba0a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "0decb8c3-82ea-4251-8377-e18a207b6093-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:55:22 compute-0 nova_compute[192810]: 2025-09-30 21:55:22.504 2 DEBUG nova.compute.manager [req-6c81b17e-45d9-4e84-92ec-675552e8f2ef req-fc030f48-855d-4b63-9bcc-a43ad5fcba0a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] No waiting events found dispatching network-vif-unplugged-ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:55:22 compute-0 nova_compute[192810]: 2025-09-30 21:55:22.504 2 WARNING nova.compute.manager [req-6c81b17e-45d9-4e84-92ec-675552e8f2ef req-fc030f48-855d-4b63-9bcc-a43ad5fcba0a dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Received unexpected event network-vif-unplugged-ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 for instance with vm_state suspended and task_state resuming.
Sep 30 21:55:22 compute-0 sshd-session[250893]: Failed password for invalid user ubuntu from 8.210.178.40 port 35696 ssh2
Sep 30 21:55:22 compute-0 nova_compute[192810]: 2025-09-30 21:55:22.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:55:23 compute-0 sshd-session[250893]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:55:23 compute-0 nova_compute[192810]: 2025-09-30 21:55:23.748 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:24 compute-0 nova_compute[192810]: 2025-09-30 21:55:24.517 2 DEBUG nova.network.neutron [None req-0c47dbde-6308-49f8-a64a-9835561fa0f3 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Updating instance_info_cache with network_info: [{"id": "ebfc0c9a-67e0-40a1-abf3-105c7c3435b7", "address": "fa:16:3e:24:c3:46", "network": {"id": "724cab50-b368-4e40-a600-414c68f09e7c", "bridge": "br-int", "label": "tempest-network-smoke--1524279325", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebfc0c9a-67", "ovs_interfaceid": "ebfc0c9a-67e0-40a1-abf3-105c7c3435b7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:55:24 compute-0 nova_compute[192810]: 2025-09-30 21:55:24.536 2 DEBUG oslo_concurrency.lockutils [None req-0c47dbde-6308-49f8-a64a-9835561fa0f3 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Releasing lock "refresh_cache-0decb8c3-82ea-4251-8377-e18a207b6093" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:55:24 compute-0 nova_compute[192810]: 2025-09-30 21:55:24.542 2 DEBUG nova.virt.libvirt.vif [None req-0c47dbde-6308-49f8-a64a-9835561fa0f3 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:54:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1048714053',display_name='tempest-TestNetworkAdvancedServerOps-server-1048714053',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1048714053',id=183,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJausXmCoycNdXcrwJ/z2+iqGqSQ4qE5NNNaxnUrCp8xA543XlWTkIjz2Q7FVEsF1RcAtqLbp9Nk3Qb/yWSMZn8lsL4y/ez1BlHhptUsfXY/Ihxv4Dxwf0YYEI+dJFl0qA==',key_name='tempest-TestNetworkAdvancedServerOps-1891871344',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:54:57Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='075b1efc4c8e4cb1b28d61b042c451e9',ramdisk_id='',reservation_id='r-hg1e5fd1',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-374190229',owner_user_name='tempest-TestNetworkAdvancedServerOps-374190229-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:55:18Z,user_data=None,user_id='185cc8ad7e1445d2ab5006153ab19700',uuid=0decb8c3-82ea-4251-8377-e18a207b6093,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "ebfc0c9a-67e0-40a1-abf3-105c7c3435b7", "address": "fa:16:3e:24:c3:46", "network": {"id": "724cab50-b368-4e40-a600-414c68f09e7c", "bridge": "br-int", "label": "tempest-network-smoke--1524279325", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebfc0c9a-67", "ovs_interfaceid": "ebfc0c9a-67e0-40a1-abf3-105c7c3435b7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:55:24 compute-0 nova_compute[192810]: 2025-09-30 21:55:24.543 2 DEBUG nova.network.os_vif_util [None req-0c47dbde-6308-49f8-a64a-9835561fa0f3 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converting VIF {"id": "ebfc0c9a-67e0-40a1-abf3-105c7c3435b7", "address": "fa:16:3e:24:c3:46", "network": {"id": "724cab50-b368-4e40-a600-414c68f09e7c", "bridge": "br-int", "label": "tempest-network-smoke--1524279325", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebfc0c9a-67", "ovs_interfaceid": "ebfc0c9a-67e0-40a1-abf3-105c7c3435b7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:55:24 compute-0 nova_compute[192810]: 2025-09-30 21:55:24.544 2 DEBUG nova.network.os_vif_util [None req-0c47dbde-6308-49f8-a64a-9835561fa0f3 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:24:c3:46,bridge_name='br-int',has_traffic_filtering=True,id=ebfc0c9a-67e0-40a1-abf3-105c7c3435b7,network=Network(724cab50-b368-4e40-a600-414c68f09e7c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebfc0c9a-67') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:55:24 compute-0 nova_compute[192810]: 2025-09-30 21:55:24.545 2 DEBUG os_vif [None req-0c47dbde-6308-49f8-a64a-9835561fa0f3 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:24:c3:46,bridge_name='br-int',has_traffic_filtering=True,id=ebfc0c9a-67e0-40a1-abf3-105c7c3435b7,network=Network(724cab50-b368-4e40-a600-414c68f09e7c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebfc0c9a-67') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:55:24 compute-0 nova_compute[192810]: 2025-09-30 21:55:24.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:24 compute-0 nova_compute[192810]: 2025-09-30 21:55:24.547 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:55:24 compute-0 nova_compute[192810]: 2025-09-30 21:55:24.548 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:55:24 compute-0 nova_compute[192810]: 2025-09-30 21:55:24.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:24 compute-0 nova_compute[192810]: 2025-09-30 21:55:24.552 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapebfc0c9a-67, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:55:24 compute-0 nova_compute[192810]: 2025-09-30 21:55:24.552 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapebfc0c9a-67, col_values=(('external_ids', {'iface-id': 'ebfc0c9a-67e0-40a1-abf3-105c7c3435b7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:24:c3:46', 'vm-uuid': '0decb8c3-82ea-4251-8377-e18a207b6093'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:55:24 compute-0 nova_compute[192810]: 2025-09-30 21:55:24.553 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:55:24 compute-0 nova_compute[192810]: 2025-09-30 21:55:24.554 2 INFO os_vif [None req-0c47dbde-6308-49f8-a64a-9835561fa0f3 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:24:c3:46,bridge_name='br-int',has_traffic_filtering=True,id=ebfc0c9a-67e0-40a1-abf3-105c7c3435b7,network=Network(724cab50-b368-4e40-a600-414c68f09e7c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebfc0c9a-67')
Sep 30 21:55:24 compute-0 nova_compute[192810]: 2025-09-30 21:55:24.577 2 DEBUG nova.objects.instance [None req-0c47dbde-6308-49f8-a64a-9835561fa0f3 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'numa_topology' on Instance uuid 0decb8c3-82ea-4251-8377-e18a207b6093 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:55:24 compute-0 nova_compute[192810]: 2025-09-30 21:55:24.605 2 DEBUG nova.compute.manager [req-46b7e5b0-b2ee-415c-ba9d-56f3453ccc46 req-ff974e86-f0ff-404c-aa6a-e695a7482ca3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Received event network-vif-plugged-ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:55:24 compute-0 nova_compute[192810]: 2025-09-30 21:55:24.606 2 DEBUG oslo_concurrency.lockutils [req-46b7e5b0-b2ee-415c-ba9d-56f3453ccc46 req-ff974e86-f0ff-404c-aa6a-e695a7482ca3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "0decb8c3-82ea-4251-8377-e18a207b6093-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:55:24 compute-0 nova_compute[192810]: 2025-09-30 21:55:24.606 2 DEBUG oslo_concurrency.lockutils [req-46b7e5b0-b2ee-415c-ba9d-56f3453ccc46 req-ff974e86-f0ff-404c-aa6a-e695a7482ca3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "0decb8c3-82ea-4251-8377-e18a207b6093-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:55:24 compute-0 nova_compute[192810]: 2025-09-30 21:55:24.607 2 DEBUG oslo_concurrency.lockutils [req-46b7e5b0-b2ee-415c-ba9d-56f3453ccc46 req-ff974e86-f0ff-404c-aa6a-e695a7482ca3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "0decb8c3-82ea-4251-8377-e18a207b6093-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:55:24 compute-0 nova_compute[192810]: 2025-09-30 21:55:24.607 2 DEBUG nova.compute.manager [req-46b7e5b0-b2ee-415c-ba9d-56f3453ccc46 req-ff974e86-f0ff-404c-aa6a-e695a7482ca3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] No waiting events found dispatching network-vif-plugged-ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:55:24 compute-0 nova_compute[192810]: 2025-09-30 21:55:24.607 2 WARNING nova.compute.manager [req-46b7e5b0-b2ee-415c-ba9d-56f3453ccc46 req-ff974e86-f0ff-404c-aa6a-e695a7482ca3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Received unexpected event network-vif-plugged-ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 for instance with vm_state suspended and task_state resuming.
Sep 30 21:55:24 compute-0 nova_compute[192810]: 2025-09-30 21:55:24.608 2 DEBUG nova.compute.manager [req-46b7e5b0-b2ee-415c-ba9d-56f3453ccc46 req-ff974e86-f0ff-404c-aa6a-e695a7482ca3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Received event network-vif-plugged-ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:55:24 compute-0 nova_compute[192810]: 2025-09-30 21:55:24.608 2 DEBUG oslo_concurrency.lockutils [req-46b7e5b0-b2ee-415c-ba9d-56f3453ccc46 req-ff974e86-f0ff-404c-aa6a-e695a7482ca3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "0decb8c3-82ea-4251-8377-e18a207b6093-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:55:24 compute-0 nova_compute[192810]: 2025-09-30 21:55:24.608 2 DEBUG oslo_concurrency.lockutils [req-46b7e5b0-b2ee-415c-ba9d-56f3453ccc46 req-ff974e86-f0ff-404c-aa6a-e695a7482ca3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "0decb8c3-82ea-4251-8377-e18a207b6093-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:55:24 compute-0 nova_compute[192810]: 2025-09-30 21:55:24.608 2 DEBUG oslo_concurrency.lockutils [req-46b7e5b0-b2ee-415c-ba9d-56f3453ccc46 req-ff974e86-f0ff-404c-aa6a-e695a7482ca3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "0decb8c3-82ea-4251-8377-e18a207b6093-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:55:24 compute-0 nova_compute[192810]: 2025-09-30 21:55:24.609 2 DEBUG nova.compute.manager [req-46b7e5b0-b2ee-415c-ba9d-56f3453ccc46 req-ff974e86-f0ff-404c-aa6a-e695a7482ca3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] No waiting events found dispatching network-vif-plugged-ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:55:24 compute-0 nova_compute[192810]: 2025-09-30 21:55:24.609 2 WARNING nova.compute.manager [req-46b7e5b0-b2ee-415c-ba9d-56f3453ccc46 req-ff974e86-f0ff-404c-aa6a-e695a7482ca3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Received unexpected event network-vif-plugged-ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 for instance with vm_state suspended and task_state resuming.
Sep 30 21:55:24 compute-0 nova_compute[192810]: 2025-09-30 21:55:24.609 2 DEBUG nova.compute.manager [req-46b7e5b0-b2ee-415c-ba9d-56f3453ccc46 req-ff974e86-f0ff-404c-aa6a-e695a7482ca3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Received event network-vif-plugged-ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:55:24 compute-0 nova_compute[192810]: 2025-09-30 21:55:24.609 2 DEBUG oslo_concurrency.lockutils [req-46b7e5b0-b2ee-415c-ba9d-56f3453ccc46 req-ff974e86-f0ff-404c-aa6a-e695a7482ca3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "0decb8c3-82ea-4251-8377-e18a207b6093-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:55:24 compute-0 nova_compute[192810]: 2025-09-30 21:55:24.610 2 DEBUG oslo_concurrency.lockutils [req-46b7e5b0-b2ee-415c-ba9d-56f3453ccc46 req-ff974e86-f0ff-404c-aa6a-e695a7482ca3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "0decb8c3-82ea-4251-8377-e18a207b6093-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:55:24 compute-0 nova_compute[192810]: 2025-09-30 21:55:24.610 2 DEBUG oslo_concurrency.lockutils [req-46b7e5b0-b2ee-415c-ba9d-56f3453ccc46 req-ff974e86-f0ff-404c-aa6a-e695a7482ca3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "0decb8c3-82ea-4251-8377-e18a207b6093-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:55:24 compute-0 nova_compute[192810]: 2025-09-30 21:55:24.611 2 DEBUG nova.compute.manager [req-46b7e5b0-b2ee-415c-ba9d-56f3453ccc46 req-ff974e86-f0ff-404c-aa6a-e695a7482ca3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] No waiting events found dispatching network-vif-plugged-ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:55:24 compute-0 nova_compute[192810]: 2025-09-30 21:55:24.611 2 WARNING nova.compute.manager [req-46b7e5b0-b2ee-415c-ba9d-56f3453ccc46 req-ff974e86-f0ff-404c-aa6a-e695a7482ca3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Received unexpected event network-vif-plugged-ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 for instance with vm_state suspended and task_state resuming.
Sep 30 21:55:24 compute-0 nova_compute[192810]: 2025-09-30 21:55:24.611 2 DEBUG nova.compute.manager [req-46b7e5b0-b2ee-415c-ba9d-56f3453ccc46 req-ff974e86-f0ff-404c-aa6a-e695a7482ca3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Received event network-vif-unplugged-ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:55:24 compute-0 nova_compute[192810]: 2025-09-30 21:55:24.612 2 DEBUG oslo_concurrency.lockutils [req-46b7e5b0-b2ee-415c-ba9d-56f3453ccc46 req-ff974e86-f0ff-404c-aa6a-e695a7482ca3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "0decb8c3-82ea-4251-8377-e18a207b6093-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:55:24 compute-0 nova_compute[192810]: 2025-09-30 21:55:24.612 2 DEBUG oslo_concurrency.lockutils [req-46b7e5b0-b2ee-415c-ba9d-56f3453ccc46 req-ff974e86-f0ff-404c-aa6a-e695a7482ca3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "0decb8c3-82ea-4251-8377-e18a207b6093-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:55:24 compute-0 nova_compute[192810]: 2025-09-30 21:55:24.612 2 DEBUG oslo_concurrency.lockutils [req-46b7e5b0-b2ee-415c-ba9d-56f3453ccc46 req-ff974e86-f0ff-404c-aa6a-e695a7482ca3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "0decb8c3-82ea-4251-8377-e18a207b6093-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:55:24 compute-0 nova_compute[192810]: 2025-09-30 21:55:24.613 2 DEBUG nova.compute.manager [req-46b7e5b0-b2ee-415c-ba9d-56f3453ccc46 req-ff974e86-f0ff-404c-aa6a-e695a7482ca3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] No waiting events found dispatching network-vif-unplugged-ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:55:24 compute-0 nova_compute[192810]: 2025-09-30 21:55:24.613 2 WARNING nova.compute.manager [req-46b7e5b0-b2ee-415c-ba9d-56f3453ccc46 req-ff974e86-f0ff-404c-aa6a-e695a7482ca3 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Received unexpected event network-vif-unplugged-ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 for instance with vm_state suspended and task_state resuming.
Sep 30 21:55:24 compute-0 kernel: tapebfc0c9a-67: entered promiscuous mode
Sep 30 21:55:24 compute-0 NetworkManager[51733]: <info>  [1759269324.6613] manager: (tapebfc0c9a-67): new Tun device (/org/freedesktop/NetworkManager/Devices/336)
Sep 30 21:55:24 compute-0 ovn_controller[94912]: 2025-09-30T21:55:24Z|00761|binding|INFO|Claiming lport ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 for this chassis.
Sep 30 21:55:24 compute-0 ovn_controller[94912]: 2025-09-30T21:55:24Z|00762|binding|INFO|ebfc0c9a-67e0-40a1-abf3-105c7c3435b7: Claiming fa:16:3e:24:c3:46 10.100.0.7
Sep 30 21:55:24 compute-0 nova_compute[192810]: 2025-09-30 21:55:24.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:24.674 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:24:c3:46 10.100.0.7'], port_security=['fa:16:3e:24:c3:46 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '0decb8c3-82ea-4251-8377-e18a207b6093', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-724cab50-b368-4e40-a600-414c68f09e7c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'neutron:revision_number': '7', 'neutron:security_group_ids': 'cc0e7b48-5c0d-472b-a638-38ad483cbcca', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.214'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=64928777-2ee7-41ca-81bb-a7afc7a99c4d, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=ebfc0c9a-67e0-40a1-abf3-105c7c3435b7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:55:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:24.675 103867 INFO neutron.agent.ovn.metadata.agent [-] Port ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 in datapath 724cab50-b368-4e40-a600-414c68f09e7c bound to our chassis
Sep 30 21:55:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:24.677 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 724cab50-b368-4e40-a600-414c68f09e7c
Sep 30 21:55:24 compute-0 ovn_controller[94912]: 2025-09-30T21:55:24Z|00763|binding|INFO|Setting lport ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 ovn-installed in OVS
Sep 30 21:55:24 compute-0 ovn_controller[94912]: 2025-09-30T21:55:24Z|00764|binding|INFO|Setting lport ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 up in Southbound
Sep 30 21:55:24 compute-0 nova_compute[192810]: 2025-09-30 21:55:24.679 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:24 compute-0 nova_compute[192810]: 2025-09-30 21:55:24.682 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:24.690 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[2c91519b-3df8-4b70-9893-4f3bf346cc2c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:24.691 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap724cab50-b1 in ovnmeta-724cab50-b368-4e40-a600-414c68f09e7c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:55:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:24.693 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap724cab50-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:55:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:24.693 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[d90d35a5-6567-4458-b4a7-60a6802c82bb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:24.695 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[7662abcd-b2f6-4377-8b56-f37706a82247]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:24 compute-0 systemd-udevd[250909]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:55:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:24.705 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[73816eef-64b0-4430-af62-7b4c27abf1a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:24 compute-0 NetworkManager[51733]: <info>  [1759269324.7204] device (tapebfc0c9a-67): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:55:24 compute-0 NetworkManager[51733]: <info>  [1759269324.7218] device (tapebfc0c9a-67): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:55:24 compute-0 systemd-machined[152794]: New machine qemu-90-instance-000000b7.
Sep 30 21:55:24 compute-0 systemd[1]: Started Virtual Machine qemu-90-instance-000000b7.
Sep 30 21:55:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:24.730 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[fd694068-014b-49a0-95d4-ab6e2d730caf]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:24.760 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[463ffbfe-2474-4fbb-bee7-bf9e84ffc4f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:24.767 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[2e3295e3-33f1-41cd-a4bb-6cf15eb23156]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:24 compute-0 systemd-udevd[250915]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:55:24 compute-0 NetworkManager[51733]: <info>  [1759269324.7684] manager: (tap724cab50-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/337)
Sep 30 21:55:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:24.806 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[ad3bc766-94e0-4c75-b593-f5c6d8805a69]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:24.811 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[9183cb27-d648-4d8d-84a4-823b5824d401]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:24 compute-0 NetworkManager[51733]: <info>  [1759269324.8452] device (tap724cab50-b0): carrier: link connected
Sep 30 21:55:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:24.851 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[a6f7fcca-7133-4969-9bf3-b03a85f9f054]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:24.870 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[b30e012e-21ea-4ec5-ab5a-a67cbb6f6001]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap724cab50-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:70:61:a4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 227], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 602046, 'reachable_time': 30637, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 250943, 'error': None, 'target': 'ovnmeta-724cab50-b368-4e40-a600-414c68f09e7c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:24.885 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e88aa32f-b870-48cf-b93d-476e769df8b2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe70:61a4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 602046, 'tstamp': 602046}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 250944, 'error': None, 'target': 'ovnmeta-724cab50-b368-4e40-a600-414c68f09e7c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:24.898 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a2b3480b-cf6c-4491-b75a-88bd63d2d8a5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap724cab50-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:70:61:a4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 227], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 602046, 'reachable_time': 30637, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 250945, 'error': None, 'target': 'ovnmeta-724cab50-b368-4e40-a600-414c68f09e7c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:24 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:24.926 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e0f6ac74-1eb9-4ab8-bd66-ecd0f7ccd1ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:25 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:25.009 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[9ed2772a-28b0-4a2f-aaaf-eadf99eeb07c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:25 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:25.010 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap724cab50-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:55:25 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:25.010 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:55:25 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:25.011 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap724cab50-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:55:25 compute-0 NetworkManager[51733]: <info>  [1759269325.0134] manager: (tap724cab50-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/338)
Sep 30 21:55:25 compute-0 nova_compute[192810]: 2025-09-30 21:55:25.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:25 compute-0 kernel: tap724cab50-b0: entered promiscuous mode
Sep 30 21:55:25 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:25.016 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap724cab50-b0, col_values=(('external_ids', {'iface-id': '4630c540-3db6-41e7-8e0e-b05a231d7e73'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:55:25 compute-0 nova_compute[192810]: 2025-09-30 21:55:25.017 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:25 compute-0 ovn_controller[94912]: 2025-09-30T21:55:25Z|00765|binding|INFO|Releasing lport 4630c540-3db6-41e7-8e0e-b05a231d7e73 from this chassis (sb_readonly=0)
Sep 30 21:55:25 compute-0 nova_compute[192810]: 2025-09-30 21:55:25.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:25 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:25.030 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/724cab50-b368-4e40-a600-414c68f09e7c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/724cab50-b368-4e40-a600-414c68f09e7c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:55:25 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:25.031 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[35e08b25-c0f7-47d8-9857-6a7ddd94d4f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:25 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:25.032 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:55:25 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:55:25 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:55:25 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-724cab50-b368-4e40-a600-414c68f09e7c
Sep 30 21:55:25 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:55:25 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:55:25 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:55:25 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/724cab50-b368-4e40-a600-414c68f09e7c.pid.haproxy
Sep 30 21:55:25 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:55:25 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:55:25 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:55:25 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:55:25 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:55:25 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:55:25 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:55:25 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:55:25 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:55:25 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:55:25 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:55:25 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:55:25 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:55:25 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:55:25 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:55:25 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:55:25 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:55:25 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:55:25 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:55:25 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:55:25 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID 724cab50-b368-4e40-a600-414c68f09e7c
Sep 30 21:55:25 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:55:25 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:25.033 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-724cab50-b368-4e40-a600-414c68f09e7c', 'env', 'PROCESS_TAG=haproxy-724cab50-b368-4e40-a600-414c68f09e7c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/724cab50-b368-4e40-a600-414c68f09e7c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:55:25 compute-0 sshd-session[250893]: Failed password for invalid user ubuntu from 8.210.178.40 port 35696 ssh2
Sep 30 21:55:25 compute-0 podman[250984]: 2025-09-30 21:55:25.39934241 +0000 UTC m=+0.060574726 container create 83c775d22eb41fdc3334ea281ac85be08e667e3a337ff8e252bd080ad614d54b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-724cab50-b368-4e40-a600-414c68f09e7c, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:55:25 compute-0 systemd[1]: Started libpod-conmon-83c775d22eb41fdc3334ea281ac85be08e667e3a337ff8e252bd080ad614d54b.scope.
Sep 30 21:55:25 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:55:25 compute-0 podman[250984]: 2025-09-30 21:55:25.361982463 +0000 UTC m=+0.023214809 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:55:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f46767d95e87b0c26b024b968c51379797301e9970f12f17c1e4192d8ba6cf8a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:55:25 compute-0 podman[250984]: 2025-09-30 21:55:25.482333045 +0000 UTC m=+0.143565351 container init 83c775d22eb41fdc3334ea281ac85be08e667e3a337ff8e252bd080ad614d54b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-724cab50-b368-4e40-a600-414c68f09e7c, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:55:25 compute-0 podman[250984]: 2025-09-30 21:55:25.487637589 +0000 UTC m=+0.148869895 container start 83c775d22eb41fdc3334ea281ac85be08e667e3a337ff8e252bd080ad614d54b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-724cab50-b368-4e40-a600-414c68f09e7c, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Sep 30 21:55:25 compute-0 neutron-haproxy-ovnmeta-724cab50-b368-4e40-a600-414c68f09e7c[251000]: [NOTICE]   (251004) : New worker (251006) forked
Sep 30 21:55:25 compute-0 neutron-haproxy-ovnmeta-724cab50-b368-4e40-a600-414c68f09e7c[251000]: [NOTICE]   (251004) : Loading success.
Sep 30 21:55:25 compute-0 nova_compute[192810]: 2025-09-30 21:55:25.522 2 DEBUG nova.virt.libvirt.host [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Removed pending event for 0decb8c3-82ea-4251-8377-e18a207b6093 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Sep 30 21:55:25 compute-0 nova_compute[192810]: 2025-09-30 21:55:25.523 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759269325.52204, 0decb8c3-82ea-4251-8377-e18a207b6093 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:55:25 compute-0 nova_compute[192810]: 2025-09-30 21:55:25.523 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] VM Started (Lifecycle Event)
Sep 30 21:55:25 compute-0 nova_compute[192810]: 2025-09-30 21:55:25.547 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:55:25 compute-0 nova_compute[192810]: 2025-09-30 21:55:25.551 2 DEBUG nova.compute.manager [None req-0c47dbde-6308-49f8-a64a-9835561fa0f3 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:55:25 compute-0 nova_compute[192810]: 2025-09-30 21:55:25.551 2 DEBUG nova.objects.instance [None req-0c47dbde-6308-49f8-a64a-9835561fa0f3 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0decb8c3-82ea-4251-8377-e18a207b6093 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:55:25 compute-0 nova_compute[192810]: 2025-09-30 21:55:25.554 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Synchronizing instance power state after lifecycle event "Started"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:55:25 compute-0 nova_compute[192810]: 2025-09-30 21:55:25.576 2 INFO nova.virt.libvirt.driver [-] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Instance running successfully.
Sep 30 21:55:25 compute-0 virtqemud[192233]: argument unsupported: QEMU guest agent is not configured
Sep 30 21:55:25 compute-0 nova_compute[192810]: 2025-09-30 21:55:25.578 2 DEBUG nova.virt.libvirt.guest [None req-0c47dbde-6308-49f8-a64a-9835561fa0f3 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Sep 30 21:55:25 compute-0 nova_compute[192810]: 2025-09-30 21:55:25.579 2 DEBUG nova.compute.manager [None req-0c47dbde-6308-49f8-a64a-9835561fa0f3 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:55:25 compute-0 nova_compute[192810]: 2025-09-30 21:55:25.580 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] During sync_power_state the instance has a pending task (resuming). Skip.
Sep 30 21:55:25 compute-0 nova_compute[192810]: 2025-09-30 21:55:25.580 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759269325.5305204, 0decb8c3-82ea-4251-8377-e18a207b6093 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:55:25 compute-0 nova_compute[192810]: 2025-09-30 21:55:25.580 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] VM Resumed (Lifecycle Event)
Sep 30 21:55:25 compute-0 nova_compute[192810]: 2025-09-30 21:55:25.605 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:25 compute-0 nova_compute[192810]: 2025-09-30 21:55:25.612 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:55:25 compute-0 nova_compute[192810]: 2025-09-30 21:55:25.615 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:55:25 compute-0 nova_compute[192810]: 2025-09-30 21:55:25.636 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] During sync_power_state the instance has a pending task (resuming). Skip.
Sep 30 21:55:26 compute-0 nova_compute[192810]: 2025-09-30 21:55:26.492 2 DEBUG oslo_concurrency.lockutils [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Acquiring lock "0139492c-c61e-4cd8-b789-64e00c3d8ec8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:55:26 compute-0 nova_compute[192810]: 2025-09-30 21:55:26.492 2 DEBUG oslo_concurrency.lockutils [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Lock "0139492c-c61e-4cd8-b789-64e00c3d8ec8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:55:26 compute-0 nova_compute[192810]: 2025-09-30 21:55:26.513 2 DEBUG nova.compute.manager [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Sep 30 21:55:26 compute-0 nova_compute[192810]: 2025-09-30 21:55:26.613 2 DEBUG oslo_concurrency.lockutils [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:55:26 compute-0 nova_compute[192810]: 2025-09-30 21:55:26.613 2 DEBUG oslo_concurrency.lockutils [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:55:26 compute-0 nova_compute[192810]: 2025-09-30 21:55:26.619 2 DEBUG nova.virt.hardware [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Sep 30 21:55:26 compute-0 nova_compute[192810]: 2025-09-30 21:55:26.620 2 INFO nova.compute.claims [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Claim successful on node compute-0.ctlplane.example.com
Sep 30 21:55:26 compute-0 nova_compute[192810]: 2025-09-30 21:55:26.739 2 DEBUG nova.compute.manager [req-a12fa60e-8c92-4679-be6a-071b90cd3ad5 req-baaa55cb-47ec-4e1b-a197-9cee1366fb33 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Received event network-vif-plugged-ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:55:26 compute-0 nova_compute[192810]: 2025-09-30 21:55:26.740 2 DEBUG oslo_concurrency.lockutils [req-a12fa60e-8c92-4679-be6a-071b90cd3ad5 req-baaa55cb-47ec-4e1b-a197-9cee1366fb33 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "0decb8c3-82ea-4251-8377-e18a207b6093-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:55:26 compute-0 nova_compute[192810]: 2025-09-30 21:55:26.740 2 DEBUG oslo_concurrency.lockutils [req-a12fa60e-8c92-4679-be6a-071b90cd3ad5 req-baaa55cb-47ec-4e1b-a197-9cee1366fb33 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "0decb8c3-82ea-4251-8377-e18a207b6093-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:55:26 compute-0 nova_compute[192810]: 2025-09-30 21:55:26.740 2 DEBUG oslo_concurrency.lockutils [req-a12fa60e-8c92-4679-be6a-071b90cd3ad5 req-baaa55cb-47ec-4e1b-a197-9cee1366fb33 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "0decb8c3-82ea-4251-8377-e18a207b6093-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:55:26 compute-0 nova_compute[192810]: 2025-09-30 21:55:26.740 2 DEBUG nova.compute.manager [req-a12fa60e-8c92-4679-be6a-071b90cd3ad5 req-baaa55cb-47ec-4e1b-a197-9cee1366fb33 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] No waiting events found dispatching network-vif-plugged-ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:55:26 compute-0 nova_compute[192810]: 2025-09-30 21:55:26.740 2 WARNING nova.compute.manager [req-a12fa60e-8c92-4679-be6a-071b90cd3ad5 req-baaa55cb-47ec-4e1b-a197-9cee1366fb33 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Received unexpected event network-vif-plugged-ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 for instance with vm_state active and task_state None.
Sep 30 21:55:26 compute-0 nova_compute[192810]: 2025-09-30 21:55:26.741 2 DEBUG nova.compute.manager [req-a12fa60e-8c92-4679-be6a-071b90cd3ad5 req-baaa55cb-47ec-4e1b-a197-9cee1366fb33 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Received event network-vif-plugged-ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:55:26 compute-0 nova_compute[192810]: 2025-09-30 21:55:26.741 2 DEBUG oslo_concurrency.lockutils [req-a12fa60e-8c92-4679-be6a-071b90cd3ad5 req-baaa55cb-47ec-4e1b-a197-9cee1366fb33 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "0decb8c3-82ea-4251-8377-e18a207b6093-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:55:26 compute-0 nova_compute[192810]: 2025-09-30 21:55:26.741 2 DEBUG oslo_concurrency.lockutils [req-a12fa60e-8c92-4679-be6a-071b90cd3ad5 req-baaa55cb-47ec-4e1b-a197-9cee1366fb33 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "0decb8c3-82ea-4251-8377-e18a207b6093-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:55:26 compute-0 nova_compute[192810]: 2025-09-30 21:55:26.741 2 DEBUG oslo_concurrency.lockutils [req-a12fa60e-8c92-4679-be6a-071b90cd3ad5 req-baaa55cb-47ec-4e1b-a197-9cee1366fb33 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "0decb8c3-82ea-4251-8377-e18a207b6093-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:55:26 compute-0 nova_compute[192810]: 2025-09-30 21:55:26.741 2 DEBUG nova.compute.manager [req-a12fa60e-8c92-4679-be6a-071b90cd3ad5 req-baaa55cb-47ec-4e1b-a197-9cee1366fb33 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] No waiting events found dispatching network-vif-plugged-ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:55:26 compute-0 nova_compute[192810]: 2025-09-30 21:55:26.742 2 WARNING nova.compute.manager [req-a12fa60e-8c92-4679-be6a-071b90cd3ad5 req-baaa55cb-47ec-4e1b-a197-9cee1366fb33 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Received unexpected event network-vif-plugged-ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 for instance with vm_state active and task_state None.
Sep 30 21:55:26 compute-0 nova_compute[192810]: 2025-09-30 21:55:26.742 2 DEBUG nova.compute.manager [req-a12fa60e-8c92-4679-be6a-071b90cd3ad5 req-baaa55cb-47ec-4e1b-a197-9cee1366fb33 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Received event network-vif-plugged-ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:55:26 compute-0 nova_compute[192810]: 2025-09-30 21:55:26.742 2 DEBUG oslo_concurrency.lockutils [req-a12fa60e-8c92-4679-be6a-071b90cd3ad5 req-baaa55cb-47ec-4e1b-a197-9cee1366fb33 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "0decb8c3-82ea-4251-8377-e18a207b6093-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:55:26 compute-0 nova_compute[192810]: 2025-09-30 21:55:26.742 2 DEBUG oslo_concurrency.lockutils [req-a12fa60e-8c92-4679-be6a-071b90cd3ad5 req-baaa55cb-47ec-4e1b-a197-9cee1366fb33 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "0decb8c3-82ea-4251-8377-e18a207b6093-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:55:26 compute-0 nova_compute[192810]: 2025-09-30 21:55:26.742 2 DEBUG oslo_concurrency.lockutils [req-a12fa60e-8c92-4679-be6a-071b90cd3ad5 req-baaa55cb-47ec-4e1b-a197-9cee1366fb33 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "0decb8c3-82ea-4251-8377-e18a207b6093-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:55:26 compute-0 nova_compute[192810]: 2025-09-30 21:55:26.743 2 DEBUG nova.compute.manager [req-a12fa60e-8c92-4679-be6a-071b90cd3ad5 req-baaa55cb-47ec-4e1b-a197-9cee1366fb33 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] No waiting events found dispatching network-vif-plugged-ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:55:26 compute-0 nova_compute[192810]: 2025-09-30 21:55:26.743 2 WARNING nova.compute.manager [req-a12fa60e-8c92-4679-be6a-071b90cd3ad5 req-baaa55cb-47ec-4e1b-a197-9cee1366fb33 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Received unexpected event network-vif-plugged-ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 for instance with vm_state active and task_state None.
Sep 30 21:55:26 compute-0 nova_compute[192810]: 2025-09-30 21:55:26.777 2 DEBUG nova.compute.provider_tree [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:55:26 compute-0 nova_compute[192810]: 2025-09-30 21:55:26.790 2 DEBUG nova.scheduler.client.report [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:55:26 compute-0 nova_compute[192810]: 2025-09-30 21:55:26.809 2 DEBUG oslo_concurrency.lockutils [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.196s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:55:26 compute-0 nova_compute[192810]: 2025-09-30 21:55:26.810 2 DEBUG nova.compute.manager [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Sep 30 21:55:26 compute-0 nova_compute[192810]: 2025-09-30 21:55:26.861 2 DEBUG nova.compute.manager [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Sep 30 21:55:26 compute-0 nova_compute[192810]: 2025-09-30 21:55:26.861 2 DEBUG nova.network.neutron [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Sep 30 21:55:26 compute-0 nova_compute[192810]: 2025-09-30 21:55:26.877 2 INFO nova.virt.libvirt.driver [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 21:55:26 compute-0 nova_compute[192810]: 2025-09-30 21:55:26.897 2 DEBUG nova.compute.manager [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Sep 30 21:55:27 compute-0 nova_compute[192810]: 2025-09-30 21:55:27.020 2 DEBUG nova.compute.manager [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Sep 30 21:55:27 compute-0 nova_compute[192810]: 2025-09-30 21:55:27.021 2 DEBUG nova.virt.libvirt.driver [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Sep 30 21:55:27 compute-0 nova_compute[192810]: 2025-09-30 21:55:27.022 2 INFO nova.virt.libvirt.driver [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Creating image(s)
Sep 30 21:55:27 compute-0 nova_compute[192810]: 2025-09-30 21:55:27.022 2 DEBUG oslo_concurrency.lockutils [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Acquiring lock "/var/lib/nova/instances/0139492c-c61e-4cd8-b789-64e00c3d8ec8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:55:27 compute-0 nova_compute[192810]: 2025-09-30 21:55:27.023 2 DEBUG oslo_concurrency.lockutils [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Lock "/var/lib/nova/instances/0139492c-c61e-4cd8-b789-64e00c3d8ec8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:55:27 compute-0 nova_compute[192810]: 2025-09-30 21:55:27.023 2 DEBUG oslo_concurrency.lockutils [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Lock "/var/lib/nova/instances/0139492c-c61e-4cd8-b789-64e00c3d8ec8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:55:27 compute-0 nova_compute[192810]: 2025-09-30 21:55:27.024 2 DEBUG oslo_concurrency.lockutils [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Acquiring lock "534805cc590496cbc5396fb5b9746d26209aed69" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:55:27 compute-0 nova_compute[192810]: 2025-09-30 21:55:27.024 2 DEBUG oslo_concurrency.lockutils [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Lock "534805cc590496cbc5396fb5b9746d26209aed69" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:55:27 compute-0 nova_compute[192810]: 2025-09-30 21:55:27.051 2 DEBUG nova.policy [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd6d7afba807d47549781e37178a01774', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2c2e514e7322435988a7f3bf398623e4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Sep 30 21:55:27 compute-0 sshd-session[250893]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:55:28 compute-0 nova_compute[192810]: 2025-09-30 21:55:28.496 2 INFO nova.compute.manager [None req-b730bea7-ab80-4b7d-8f13-09d19dcff405 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Get console output
Sep 30 21:55:28 compute-0 nova_compute[192810]: 2025-09-30 21:55:28.503 54 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Sep 30 21:55:28 compute-0 nova_compute[192810]: 2025-09-30 21:55:28.719 2 DEBUG oslo_concurrency.processutils [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/534805cc590496cbc5396fb5b9746d26209aed69.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:55:28 compute-0 nova_compute[192810]: 2025-09-30 21:55:28.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:28 compute-0 nova_compute[192810]: 2025-09-30 21:55:28.774 2 DEBUG oslo_concurrency.processutils [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/534805cc590496cbc5396fb5b9746d26209aed69.part --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:55:28 compute-0 nova_compute[192810]: 2025-09-30 21:55:28.775 2 DEBUG nova.virt.images [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] 0dfd446c-9899-4202-86e2-c09f7b33b375 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Sep 30 21:55:28 compute-0 nova_compute[192810]: 2025-09-30 21:55:28.776 2 DEBUG nova.privsep.utils [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Sep 30 21:55:28 compute-0 nova_compute[192810]: 2025-09-30 21:55:28.778 2 DEBUG oslo_concurrency.processutils [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/534805cc590496cbc5396fb5b9746d26209aed69.part /var/lib/nova/instances/_base/534805cc590496cbc5396fb5b9746d26209aed69.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:55:29 compute-0 nova_compute[192810]: 2025-09-30 21:55:29.187 2 DEBUG nova.network.neutron [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Successfully created port: 4427a885-e447-4fc0-92ac-c64a4d480c16 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Sep 30 21:55:29 compute-0 nova_compute[192810]: 2025-09-30 21:55:29.191 2 DEBUG oslo_concurrency.processutils [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/534805cc590496cbc5396fb5b9746d26209aed69.part /var/lib/nova/instances/_base/534805cc590496cbc5396fb5b9746d26209aed69.converted" returned: 0 in 0.413s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:55:29 compute-0 nova_compute[192810]: 2025-09-30 21:55:29.200 2 DEBUG oslo_concurrency.processutils [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/534805cc590496cbc5396fb5b9746d26209aed69.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:55:29 compute-0 nova_compute[192810]: 2025-09-30 21:55:29.313 2 DEBUG oslo_concurrency.processutils [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/534805cc590496cbc5396fb5b9746d26209aed69.converted --force-share --output=json" returned: 0 in 0.113s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:55:29 compute-0 nova_compute[192810]: 2025-09-30 21:55:29.314 2 DEBUG oslo_concurrency.lockutils [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Lock "534805cc590496cbc5396fb5b9746d26209aed69" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.290s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:55:29 compute-0 nova_compute[192810]: 2025-09-30 21:55:29.327 2 DEBUG oslo_concurrency.processutils [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/534805cc590496cbc5396fb5b9746d26209aed69 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:55:29 compute-0 podman[251027]: 2025-09-30 21:55:29.335935431 +0000 UTC m=+0.067458721 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Sep 30 21:55:29 compute-0 podman[251028]: 2025-09-30 21:55:29.373405421 +0000 UTC m=+0.096948429 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.schema-version=1.0)
Sep 30 21:55:29 compute-0 sshd-session[250893]: Failed password for invalid user ubuntu from 8.210.178.40 port 35696 ssh2
Sep 30 21:55:29 compute-0 podman[251026]: 2025-09-30 21:55:29.39188016 +0000 UTC m=+0.123008110 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, container_name=ovn_controller)
Sep 30 21:55:29 compute-0 nova_compute[192810]: 2025-09-30 21:55:29.391 2 DEBUG oslo_concurrency.processutils [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/534805cc590496cbc5396fb5b9746d26209aed69 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:55:29 compute-0 nova_compute[192810]: 2025-09-30 21:55:29.392 2 DEBUG oslo_concurrency.lockutils [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Acquiring lock "534805cc590496cbc5396fb5b9746d26209aed69" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:55:29 compute-0 nova_compute[192810]: 2025-09-30 21:55:29.393 2 DEBUG oslo_concurrency.lockutils [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Lock "534805cc590496cbc5396fb5b9746d26209aed69" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:55:29 compute-0 nova_compute[192810]: 2025-09-30 21:55:29.404 2 DEBUG oslo_concurrency.processutils [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/534805cc590496cbc5396fb5b9746d26209aed69 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:55:29 compute-0 nova_compute[192810]: 2025-09-30 21:55:29.468 2 DEBUG oslo_concurrency.processutils [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/534805cc590496cbc5396fb5b9746d26209aed69 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:55:29 compute-0 nova_compute[192810]: 2025-09-30 21:55:29.469 2 DEBUG oslo_concurrency.processutils [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/534805cc590496cbc5396fb5b9746d26209aed69,backing_fmt=raw /var/lib/nova/instances/0139492c-c61e-4cd8-b789-64e00c3d8ec8/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:55:29 compute-0 nova_compute[192810]: 2025-09-30 21:55:29.503 2 DEBUG oslo_concurrency.processutils [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/534805cc590496cbc5396fb5b9746d26209aed69,backing_fmt=raw /var/lib/nova/instances/0139492c-c61e-4cd8-b789-64e00c3d8ec8/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:55:29 compute-0 nova_compute[192810]: 2025-09-30 21:55:29.504 2 DEBUG oslo_concurrency.lockutils [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Lock "534805cc590496cbc5396fb5b9746d26209aed69" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:55:29 compute-0 nova_compute[192810]: 2025-09-30 21:55:29.505 2 DEBUG oslo_concurrency.processutils [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/534805cc590496cbc5396fb5b9746d26209aed69 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:55:29 compute-0 nova_compute[192810]: 2025-09-30 21:55:29.560 2 DEBUG oslo_concurrency.processutils [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/534805cc590496cbc5396fb5b9746d26209aed69 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:55:29 compute-0 nova_compute[192810]: 2025-09-30 21:55:29.561 2 DEBUG nova.objects.instance [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Lazy-loading 'migration_context' on Instance uuid 0139492c-c61e-4cd8-b789-64e00c3d8ec8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:55:29 compute-0 nova_compute[192810]: 2025-09-30 21:55:29.573 2 DEBUG nova.virt.libvirt.driver [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Sep 30 21:55:29 compute-0 nova_compute[192810]: 2025-09-30 21:55:29.574 2 DEBUG nova.virt.libvirt.driver [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Ensure instance console log exists: /var/lib/nova/instances/0139492c-c61e-4cd8-b789-64e00c3d8ec8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Sep 30 21:55:29 compute-0 nova_compute[192810]: 2025-09-30 21:55:29.574 2 DEBUG oslo_concurrency.lockutils [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:55:29 compute-0 nova_compute[192810]: 2025-09-30 21:55:29.574 2 DEBUG oslo_concurrency.lockutils [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:55:29 compute-0 nova_compute[192810]: 2025-09-30 21:55:29.575 2 DEBUG oslo_concurrency.lockutils [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:55:29 compute-0 nova_compute[192810]: 2025-09-30 21:55:29.771 2 DEBUG nova.compute.manager [req-1a044343-9dd1-41d1-bd96-341683f0cd39 req-a4aa932c-c486-4a42-b75c-58d1f0cf392b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Received event network-changed-ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:55:29 compute-0 nova_compute[192810]: 2025-09-30 21:55:29.771 2 DEBUG nova.compute.manager [req-1a044343-9dd1-41d1-bd96-341683f0cd39 req-a4aa932c-c486-4a42-b75c-58d1f0cf392b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Refreshing instance network info cache due to event network-changed-ebfc0c9a-67e0-40a1-abf3-105c7c3435b7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:55:29 compute-0 nova_compute[192810]: 2025-09-30 21:55:29.771 2 DEBUG oslo_concurrency.lockutils [req-1a044343-9dd1-41d1-bd96-341683f0cd39 req-a4aa932c-c486-4a42-b75c-58d1f0cf392b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-0decb8c3-82ea-4251-8377-e18a207b6093" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:55:29 compute-0 nova_compute[192810]: 2025-09-30 21:55:29.772 2 DEBUG oslo_concurrency.lockutils [req-1a044343-9dd1-41d1-bd96-341683f0cd39 req-a4aa932c-c486-4a42-b75c-58d1f0cf392b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-0decb8c3-82ea-4251-8377-e18a207b6093" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:55:29 compute-0 nova_compute[192810]: 2025-09-30 21:55:29.772 2 DEBUG nova.network.neutron [req-1a044343-9dd1-41d1-bd96-341683f0cd39 req-a4aa932c-c486-4a42-b75c-58d1f0cf392b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Refreshing network info cache for port ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:55:29 compute-0 nova_compute[192810]: 2025-09-30 21:55:29.858 2 DEBUG oslo_concurrency.lockutils [None req-c3cdc6e6-6356-4939-96bb-fb56b4afd2b4 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "0decb8c3-82ea-4251-8377-e18a207b6093" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:55:29 compute-0 nova_compute[192810]: 2025-09-30 21:55:29.858 2 DEBUG oslo_concurrency.lockutils [None req-c3cdc6e6-6356-4939-96bb-fb56b4afd2b4 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "0decb8c3-82ea-4251-8377-e18a207b6093" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:55:29 compute-0 nova_compute[192810]: 2025-09-30 21:55:29.858 2 DEBUG oslo_concurrency.lockutils [None req-c3cdc6e6-6356-4939-96bb-fb56b4afd2b4 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "0decb8c3-82ea-4251-8377-e18a207b6093-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:55:29 compute-0 nova_compute[192810]: 2025-09-30 21:55:29.859 2 DEBUG oslo_concurrency.lockutils [None req-c3cdc6e6-6356-4939-96bb-fb56b4afd2b4 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "0decb8c3-82ea-4251-8377-e18a207b6093-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:55:29 compute-0 nova_compute[192810]: 2025-09-30 21:55:29.859 2 DEBUG oslo_concurrency.lockutils [None req-c3cdc6e6-6356-4939-96bb-fb56b4afd2b4 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "0decb8c3-82ea-4251-8377-e18a207b6093-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:55:29 compute-0 nova_compute[192810]: 2025-09-30 21:55:29.873 2 INFO nova.compute.manager [None req-c3cdc6e6-6356-4939-96bb-fb56b4afd2b4 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Terminating instance
Sep 30 21:55:29 compute-0 nova_compute[192810]: 2025-09-30 21:55:29.884 2 DEBUG nova.compute.manager [None req-c3cdc6e6-6356-4939-96bb-fb56b4afd2b4 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:55:29 compute-0 kernel: tapebfc0c9a-67 (unregistering): left promiscuous mode
Sep 30 21:55:29 compute-0 NetworkManager[51733]: <info>  [1759269329.9077] device (tapebfc0c9a-67): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:55:29 compute-0 ovn_controller[94912]: 2025-09-30T21:55:29Z|00766|binding|INFO|Releasing lport ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 from this chassis (sb_readonly=0)
Sep 30 21:55:29 compute-0 ovn_controller[94912]: 2025-09-30T21:55:29Z|00767|binding|INFO|Setting lport ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 down in Southbound
Sep 30 21:55:29 compute-0 nova_compute[192810]: 2025-09-30 21:55:29.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:29 compute-0 ovn_controller[94912]: 2025-09-30T21:55:29Z|00768|binding|INFO|Removing iface tapebfc0c9a-67 ovn-installed in OVS
Sep 30 21:55:29 compute-0 nova_compute[192810]: 2025-09-30 21:55:29.919 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:29.925 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:24:c3:46 10.100.0.7'], port_security=['fa:16:3e:24:c3:46 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '0decb8c3-82ea-4251-8377-e18a207b6093', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-724cab50-b368-4e40-a600-414c68f09e7c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '075b1efc4c8e4cb1b28d61b042c451e9', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'cc0e7b48-5c0d-472b-a638-38ad483cbcca', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=64928777-2ee7-41ca-81bb-a7afc7a99c4d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=ebfc0c9a-67e0-40a1-abf3-105c7c3435b7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:55:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:29.926 103867 INFO neutron.agent.ovn.metadata.agent [-] Port ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 in datapath 724cab50-b368-4e40-a600-414c68f09e7c unbound from our chassis
Sep 30 21:55:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:29.927 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 724cab50-b368-4e40-a600-414c68f09e7c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:55:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:29.928 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[6d683296-f233-4cab-b20b-62f3395c4059]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:29 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:29.929 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-724cab50-b368-4e40-a600-414c68f09e7c namespace which is not needed anymore
Sep 30 21:55:29 compute-0 nova_compute[192810]: 2025-09-30 21:55:29.933 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:29 compute-0 systemd[1]: machine-qemu\x2d90\x2dinstance\x2d000000b7.scope: Deactivated successfully.
Sep 30 21:55:29 compute-0 systemd-machined[152794]: Machine qemu-90-instance-000000b7 terminated.
Sep 30 21:55:30 compute-0 neutron-haproxy-ovnmeta-724cab50-b368-4e40-a600-414c68f09e7c[251000]: [NOTICE]   (251004) : haproxy version is 2.8.14-c23fe91
Sep 30 21:55:30 compute-0 neutron-haproxy-ovnmeta-724cab50-b368-4e40-a600-414c68f09e7c[251000]: [NOTICE]   (251004) : path to executable is /usr/sbin/haproxy
Sep 30 21:55:30 compute-0 neutron-haproxy-ovnmeta-724cab50-b368-4e40-a600-414c68f09e7c[251000]: [WARNING]  (251004) : Exiting Master process...
Sep 30 21:55:30 compute-0 neutron-haproxy-ovnmeta-724cab50-b368-4e40-a600-414c68f09e7c[251000]: [ALERT]    (251004) : Current worker (251006) exited with code 143 (Terminated)
Sep 30 21:55:30 compute-0 neutron-haproxy-ovnmeta-724cab50-b368-4e40-a600-414c68f09e7c[251000]: [WARNING]  (251004) : All workers exited. Exiting... (0)
Sep 30 21:55:30 compute-0 systemd[1]: libpod-83c775d22eb41fdc3334ea281ac85be08e667e3a337ff8e252bd080ad614d54b.scope: Deactivated successfully.
Sep 30 21:55:30 compute-0 podman[251126]: 2025-09-30 21:55:30.055201538 +0000 UTC m=+0.041057092 container died 83c775d22eb41fdc3334ea281ac85be08e667e3a337ff8e252bd080ad614d54b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-724cab50-b368-4e40-a600-414c68f09e7c, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Sep 30 21:55:30 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-83c775d22eb41fdc3334ea281ac85be08e667e3a337ff8e252bd080ad614d54b-userdata-shm.mount: Deactivated successfully.
Sep 30 21:55:30 compute-0 systemd[1]: var-lib-containers-storage-overlay-f46767d95e87b0c26b024b968c51379797301e9970f12f17c1e4192d8ba6cf8a-merged.mount: Deactivated successfully.
Sep 30 21:55:30 compute-0 podman[251126]: 2025-09-30 21:55:30.090516284 +0000 UTC m=+0.076371838 container cleanup 83c775d22eb41fdc3334ea281ac85be08e667e3a337ff8e252bd080ad614d54b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-724cab50-b368-4e40-a600-414c68f09e7c, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 21:55:30 compute-0 systemd[1]: libpod-conmon-83c775d22eb41fdc3334ea281ac85be08e667e3a337ff8e252bd080ad614d54b.scope: Deactivated successfully.
Sep 30 21:55:30 compute-0 nova_compute[192810]: 2025-09-30 21:55:30.102 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:30 compute-0 nova_compute[192810]: 2025-09-30 21:55:30.106 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:30 compute-0 nova_compute[192810]: 2025-09-30 21:55:30.141 2 INFO nova.virt.libvirt.driver [-] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Instance destroyed successfully.
Sep 30 21:55:30 compute-0 nova_compute[192810]: 2025-09-30 21:55:30.142 2 DEBUG nova.objects.instance [None req-c3cdc6e6-6356-4939-96bb-fb56b4afd2b4 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lazy-loading 'resources' on Instance uuid 0decb8c3-82ea-4251-8377-e18a207b6093 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:55:30 compute-0 nova_compute[192810]: 2025-09-30 21:55:30.146 2 DEBUG nova.network.neutron [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Successfully updated port: 4427a885-e447-4fc0-92ac-c64a4d480c16 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Sep 30 21:55:30 compute-0 podman[251158]: 2025-09-30 21:55:30.149286164 +0000 UTC m=+0.038130438 container remove 83c775d22eb41fdc3334ea281ac85be08e667e3a337ff8e252bd080ad614d54b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-724cab50-b368-4e40-a600-414c68f09e7c, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:55:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:30.153 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[1f6e7ddd-a0a4-4fc6-9a95-34e70cb64ba8]: (4, ('Tue Sep 30 09:55:30 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-724cab50-b368-4e40-a600-414c68f09e7c (83c775d22eb41fdc3334ea281ac85be08e667e3a337ff8e252bd080ad614d54b)\n83c775d22eb41fdc3334ea281ac85be08e667e3a337ff8e252bd080ad614d54b\nTue Sep 30 09:55:30 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-724cab50-b368-4e40-a600-414c68f09e7c (83c775d22eb41fdc3334ea281ac85be08e667e3a337ff8e252bd080ad614d54b)\n83c775d22eb41fdc3334ea281ac85be08e667e3a337ff8e252bd080ad614d54b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:30.155 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[527d58a7-16ee-4e4b-82f8-c4d3bf56428d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:30.156 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap724cab50-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:55:30 compute-0 kernel: tap724cab50-b0: left promiscuous mode
Sep 30 21:55:30 compute-0 nova_compute[192810]: 2025-09-30 21:55:30.159 2 DEBUG nova.virt.libvirt.vif [None req-c3cdc6e6-6356-4939-96bb-fb56b4afd2b4 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:54:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1048714053',display_name='tempest-TestNetworkAdvancedServerOps-server-1048714053',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1048714053',id=183,image_ref='86b6907c-d747-4e98-8897-42105915831d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJausXmCoycNdXcrwJ/z2+iqGqSQ4qE5NNNaxnUrCp8xA543XlWTkIjz2Q7FVEsF1RcAtqLbp9Nk3Qb/yWSMZn8lsL4y/ez1BlHhptUsfXY/Ihxv4Dxwf0YYEI+dJFl0qA==',key_name='tempest-TestNetworkAdvancedServerOps-1891871344',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:54:57Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='075b1efc4c8e4cb1b28d61b042c451e9',ramdisk_id='',reservation_id='r-hg1e5fd1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-374190229',owner_user_name='tempest-TestNetworkAdvancedServerOps-374190229-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:55:25Z,user_data=None,user_id='185cc8ad7e1445d2ab5006153ab19700',uuid=0decb8c3-82ea-4251-8377-e18a207b6093,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ebfc0c9a-67e0-40a1-abf3-105c7c3435b7", "address": "fa:16:3e:24:c3:46", "network": {"id": "724cab50-b368-4e40-a600-414c68f09e7c", "bridge": "br-int", "label": "tempest-network-smoke--1524279325", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebfc0c9a-67", "ovs_interfaceid": "ebfc0c9a-67e0-40a1-abf3-105c7c3435b7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:55:30 compute-0 nova_compute[192810]: 2025-09-30 21:55:30.159 2 DEBUG nova.network.os_vif_util [None req-c3cdc6e6-6356-4939-96bb-fb56b4afd2b4 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converting VIF {"id": "ebfc0c9a-67e0-40a1-abf3-105c7c3435b7", "address": "fa:16:3e:24:c3:46", "network": {"id": "724cab50-b368-4e40-a600-414c68f09e7c", "bridge": "br-int", "label": "tempest-network-smoke--1524279325", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebfc0c9a-67", "ovs_interfaceid": "ebfc0c9a-67e0-40a1-abf3-105c7c3435b7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:55:30 compute-0 nova_compute[192810]: 2025-09-30 21:55:30.160 2 DEBUG nova.network.os_vif_util [None req-c3cdc6e6-6356-4939-96bb-fb56b4afd2b4 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:24:c3:46,bridge_name='br-int',has_traffic_filtering=True,id=ebfc0c9a-67e0-40a1-abf3-105c7c3435b7,network=Network(724cab50-b368-4e40-a600-414c68f09e7c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebfc0c9a-67') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:55:30 compute-0 nova_compute[192810]: 2025-09-30 21:55:30.160 2 DEBUG os_vif [None req-c3cdc6e6-6356-4939-96bb-fb56b4afd2b4 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:24:c3:46,bridge_name='br-int',has_traffic_filtering=True,id=ebfc0c9a-67e0-40a1-abf3-105c7c3435b7,network=Network(724cab50-b368-4e40-a600-414c68f09e7c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebfc0c9a-67') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:55:30 compute-0 nova_compute[192810]: 2025-09-30 21:55:30.163 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:30 compute-0 nova_compute[192810]: 2025-09-30 21:55:30.164 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapebfc0c9a-67, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:55:30 compute-0 nova_compute[192810]: 2025-09-30 21:55:30.164 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:30 compute-0 nova_compute[192810]: 2025-09-30 21:55:30.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:55:30 compute-0 nova_compute[192810]: 2025-09-30 21:55:30.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:30 compute-0 nova_compute[192810]: 2025-09-30 21:55:30.173 2 DEBUG oslo_concurrency.lockutils [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Acquiring lock "refresh_cache-0139492c-c61e-4cd8-b789-64e00c3d8ec8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:55:30 compute-0 nova_compute[192810]: 2025-09-30 21:55:30.173 2 DEBUG oslo_concurrency.lockutils [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Acquired lock "refresh_cache-0139492c-c61e-4cd8-b789-64e00c3d8ec8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:55:30 compute-0 nova_compute[192810]: 2025-09-30 21:55:30.173 2 DEBUG nova.network.neutron [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Sep 30 21:55:30 compute-0 nova_compute[192810]: 2025-09-30 21:55:30.174 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:30.174 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[5cdcc218-a37d-4777-8a56-d5af137151a1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:30 compute-0 nova_compute[192810]: 2025-09-30 21:55:30.176 2 INFO os_vif [None req-c3cdc6e6-6356-4939-96bb-fb56b4afd2b4 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:24:c3:46,bridge_name='br-int',has_traffic_filtering=True,id=ebfc0c9a-67e0-40a1-abf3-105c7c3435b7,network=Network(724cab50-b368-4e40-a600-414c68f09e7c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebfc0c9a-67')
Sep 30 21:55:30 compute-0 nova_compute[192810]: 2025-09-30 21:55:30.177 2 INFO nova.virt.libvirt.driver [None req-c3cdc6e6-6356-4939-96bb-fb56b4afd2b4 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Deleting instance files /var/lib/nova/instances/0decb8c3-82ea-4251-8377-e18a207b6093_del
Sep 30 21:55:30 compute-0 nova_compute[192810]: 2025-09-30 21:55:30.178 2 INFO nova.virt.libvirt.driver [None req-c3cdc6e6-6356-4939-96bb-fb56b4afd2b4 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Deletion of /var/lib/nova/instances/0decb8c3-82ea-4251-8377-e18a207b6093_del complete
Sep 30 21:55:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:30.193 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a9f55c3c-9c78-48f5-9466-f0e0de343dfb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:30.195 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[d637bcfa-3ae2-42b5-86d8-9319747cc871]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:30.208 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a784d2ac-f74f-4f12-9451-083d581146d9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 602037, 'reachable_time': 39252, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 251189, 'error': None, 'target': 'ovnmeta-724cab50-b368-4e40-a600-414c68f09e7c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:30.209 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-724cab50-b368-4e40-a600-414c68f09e7c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:55:30 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:30.209 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[c9cdb90b-a018-4da4-aaef-13737960b219]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:30 compute-0 systemd[1]: run-netns-ovnmeta\x2d724cab50\x2db368\x2d4e40\x2da600\x2d414c68f09e7c.mount: Deactivated successfully.
Sep 30 21:55:30 compute-0 nova_compute[192810]: 2025-09-30 21:55:30.236 2 INFO nova.compute.manager [None req-c3cdc6e6-6356-4939-96bb-fb56b4afd2b4 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Took 0.35 seconds to destroy the instance on the hypervisor.
Sep 30 21:55:30 compute-0 nova_compute[192810]: 2025-09-30 21:55:30.237 2 DEBUG oslo.service.loopingcall [None req-c3cdc6e6-6356-4939-96bb-fb56b4afd2b4 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:55:30 compute-0 nova_compute[192810]: 2025-09-30 21:55:30.237 2 DEBUG nova.compute.manager [-] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:55:30 compute-0 nova_compute[192810]: 2025-09-30 21:55:30.237 2 DEBUG nova.network.neutron [-] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:55:30 compute-0 nova_compute[192810]: 2025-09-30 21:55:30.388 2 DEBUG nova.network.neutron [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Sep 30 21:55:31 compute-0 nova_compute[192810]: 2025-09-30 21:55:31.252 2 DEBUG nova.compute.manager [req-b7f72b63-4514-4a97-9554-1c59e2036441 req-52fef03b-e276-477a-b943-1e0808241aff dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Received event network-vif-unplugged-ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:55:31 compute-0 nova_compute[192810]: 2025-09-30 21:55:31.253 2 DEBUG oslo_concurrency.lockutils [req-b7f72b63-4514-4a97-9554-1c59e2036441 req-52fef03b-e276-477a-b943-1e0808241aff dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "0decb8c3-82ea-4251-8377-e18a207b6093-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:55:31 compute-0 nova_compute[192810]: 2025-09-30 21:55:31.253 2 DEBUG oslo_concurrency.lockutils [req-b7f72b63-4514-4a97-9554-1c59e2036441 req-52fef03b-e276-477a-b943-1e0808241aff dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "0decb8c3-82ea-4251-8377-e18a207b6093-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:55:31 compute-0 nova_compute[192810]: 2025-09-30 21:55:31.254 2 DEBUG oslo_concurrency.lockutils [req-b7f72b63-4514-4a97-9554-1c59e2036441 req-52fef03b-e276-477a-b943-1e0808241aff dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "0decb8c3-82ea-4251-8377-e18a207b6093-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:55:31 compute-0 nova_compute[192810]: 2025-09-30 21:55:31.254 2 DEBUG nova.compute.manager [req-b7f72b63-4514-4a97-9554-1c59e2036441 req-52fef03b-e276-477a-b943-1e0808241aff dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] No waiting events found dispatching network-vif-unplugged-ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:55:31 compute-0 nova_compute[192810]: 2025-09-30 21:55:31.255 2 DEBUG nova.compute.manager [req-b7f72b63-4514-4a97-9554-1c59e2036441 req-52fef03b-e276-477a-b943-1e0808241aff dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Received event network-vif-unplugged-ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:55:31 compute-0 nova_compute[192810]: 2025-09-30 21:55:31.255 2 DEBUG nova.compute.manager [req-b7f72b63-4514-4a97-9554-1c59e2036441 req-52fef03b-e276-477a-b943-1e0808241aff dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Received event network-vif-plugged-ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:55:31 compute-0 nova_compute[192810]: 2025-09-30 21:55:31.256 2 DEBUG oslo_concurrency.lockutils [req-b7f72b63-4514-4a97-9554-1c59e2036441 req-52fef03b-e276-477a-b943-1e0808241aff dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "0decb8c3-82ea-4251-8377-e18a207b6093-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:55:31 compute-0 nova_compute[192810]: 2025-09-30 21:55:31.256 2 DEBUG oslo_concurrency.lockutils [req-b7f72b63-4514-4a97-9554-1c59e2036441 req-52fef03b-e276-477a-b943-1e0808241aff dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "0decb8c3-82ea-4251-8377-e18a207b6093-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:55:31 compute-0 nova_compute[192810]: 2025-09-30 21:55:31.256 2 DEBUG oslo_concurrency.lockutils [req-b7f72b63-4514-4a97-9554-1c59e2036441 req-52fef03b-e276-477a-b943-1e0808241aff dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "0decb8c3-82ea-4251-8377-e18a207b6093-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:55:31 compute-0 nova_compute[192810]: 2025-09-30 21:55:31.257 2 DEBUG nova.compute.manager [req-b7f72b63-4514-4a97-9554-1c59e2036441 req-52fef03b-e276-477a-b943-1e0808241aff dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] No waiting events found dispatching network-vif-plugged-ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:55:31 compute-0 nova_compute[192810]: 2025-09-30 21:55:31.257 2 WARNING nova.compute.manager [req-b7f72b63-4514-4a97-9554-1c59e2036441 req-52fef03b-e276-477a-b943-1e0808241aff dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Received unexpected event network-vif-plugged-ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 for instance with vm_state active and task_state deleting.
Sep 30 21:55:31 compute-0 sshd-session[250893]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:55:31 compute-0 nova_compute[192810]: 2025-09-30 21:55:31.663 2 DEBUG nova.network.neutron [-] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:55:31 compute-0 nova_compute[192810]: 2025-09-30 21:55:31.681 2 INFO nova.compute.manager [-] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Took 1.44 seconds to deallocate network for instance.
Sep 30 21:55:31 compute-0 nova_compute[192810]: 2025-09-30 21:55:31.766 2 DEBUG oslo_concurrency.lockutils [None req-c3cdc6e6-6356-4939-96bb-fb56b4afd2b4 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:55:31 compute-0 nova_compute[192810]: 2025-09-30 21:55:31.767 2 DEBUG oslo_concurrency.lockutils [None req-c3cdc6e6-6356-4939-96bb-fb56b4afd2b4 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:55:31 compute-0 nova_compute[192810]: 2025-09-30 21:55:31.854 2 DEBUG nova.compute.provider_tree [None req-c3cdc6e6-6356-4939-96bb-fb56b4afd2b4 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:55:31 compute-0 nova_compute[192810]: 2025-09-30 21:55:31.870 2 DEBUG nova.scheduler.client.report [None req-c3cdc6e6-6356-4939-96bb-fb56b4afd2b4 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:55:31 compute-0 nova_compute[192810]: 2025-09-30 21:55:31.883 2 DEBUG nova.compute.manager [req-a321710b-9a44-420b-be96-84783b638345 req-02c66985-8de4-4d9b-bfa2-3ea427ce7808 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Received event network-changed-4427a885-e447-4fc0-92ac-c64a4d480c16 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:55:31 compute-0 nova_compute[192810]: 2025-09-30 21:55:31.883 2 DEBUG nova.compute.manager [req-a321710b-9a44-420b-be96-84783b638345 req-02c66985-8de4-4d9b-bfa2-3ea427ce7808 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Refreshing instance network info cache due to event network-changed-4427a885-e447-4fc0-92ac-c64a4d480c16. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:55:31 compute-0 nova_compute[192810]: 2025-09-30 21:55:31.883 2 DEBUG oslo_concurrency.lockutils [req-a321710b-9a44-420b-be96-84783b638345 req-02c66985-8de4-4d9b-bfa2-3ea427ce7808 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-0139492c-c61e-4cd8-b789-64e00c3d8ec8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:55:31 compute-0 nova_compute[192810]: 2025-09-30 21:55:31.888 2 DEBUG oslo_concurrency.lockutils [None req-c3cdc6e6-6356-4939-96bb-fb56b4afd2b4 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:55:31 compute-0 nova_compute[192810]: 2025-09-30 21:55:31.917 2 INFO nova.scheduler.client.report [None req-c3cdc6e6-6356-4939-96bb-fb56b4afd2b4 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Deleted allocations for instance 0decb8c3-82ea-4251-8377-e18a207b6093
Sep 30 21:55:31 compute-0 nova_compute[192810]: 2025-09-30 21:55:31.994 2 DEBUG oslo_concurrency.lockutils [None req-c3cdc6e6-6356-4939-96bb-fb56b4afd2b4 185cc8ad7e1445d2ab5006153ab19700 075b1efc4c8e4cb1b28d61b042c451e9 - - default default] Lock "0decb8c3-82ea-4251-8377-e18a207b6093" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.136s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.376 2 DEBUG nova.network.neutron [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Updating instance_info_cache with network_info: [{"id": "4427a885-e447-4fc0-92ac-c64a4d480c16", "address": "fa:16:3e:cf:d8:9d", "network": {"id": "36195885-e54a-4c05-b721-98be98333841", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-292865923-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c2e514e7322435988a7f3bf398623e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4427a885-e4", "ovs_interfaceid": "4427a885-e447-4fc0-92ac-c64a4d480c16", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.380 2 DEBUG nova.network.neutron [req-1a044343-9dd1-41d1-bd96-341683f0cd39 req-a4aa932c-c486-4a42-b75c-58d1f0cf392b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Updated VIF entry in instance network info cache for port ebfc0c9a-67e0-40a1-abf3-105c7c3435b7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.381 2 DEBUG nova.network.neutron [req-1a044343-9dd1-41d1-bd96-341683f0cd39 req-a4aa932c-c486-4a42-b75c-58d1f0cf392b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Updating instance_info_cache with network_info: [{"id": "ebfc0c9a-67e0-40a1-abf3-105c7c3435b7", "address": "fa:16:3e:24:c3:46", "network": {"id": "724cab50-b368-4e40-a600-414c68f09e7c", "bridge": "br-int", "label": "tempest-network-smoke--1524279325", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "075b1efc4c8e4cb1b28d61b042c451e9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebfc0c9a-67", "ovs_interfaceid": "ebfc0c9a-67e0-40a1-abf3-105c7c3435b7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.403 2 DEBUG oslo_concurrency.lockutils [req-1a044343-9dd1-41d1-bd96-341683f0cd39 req-a4aa932c-c486-4a42-b75c-58d1f0cf392b dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-0decb8c3-82ea-4251-8377-e18a207b6093" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.406 2 DEBUG oslo_concurrency.lockutils [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Releasing lock "refresh_cache-0139492c-c61e-4cd8-b789-64e00c3d8ec8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.406 2 DEBUG nova.compute.manager [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Instance network_info: |[{"id": "4427a885-e447-4fc0-92ac-c64a4d480c16", "address": "fa:16:3e:cf:d8:9d", "network": {"id": "36195885-e54a-4c05-b721-98be98333841", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-292865923-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c2e514e7322435988a7f3bf398623e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4427a885-e4", "ovs_interfaceid": "4427a885-e447-4fc0-92ac-c64a4d480c16", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.406 2 DEBUG oslo_concurrency.lockutils [req-a321710b-9a44-420b-be96-84783b638345 req-02c66985-8de4-4d9b-bfa2-3ea427ce7808 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-0139492c-c61e-4cd8-b789-64e00c3d8ec8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.406 2 DEBUG nova.network.neutron [req-a321710b-9a44-420b-be96-84783b638345 req-02c66985-8de4-4d9b-bfa2-3ea427ce7808 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Refreshing network info cache for port 4427a885-e447-4fc0-92ac-c64a4d480c16 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.408 2 DEBUG nova.virt.libvirt.driver [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Start _get_guest_xml network_info=[{"id": "4427a885-e447-4fc0-92ac-c64a4d480c16", "address": "fa:16:3e:cf:d8:9d", "network": {"id": "36195885-e54a-4c05-b721-98be98333841", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-292865923-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c2e514e7322435988a7f3bf398623e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4427a885-e4", "ovs_interfaceid": "4427a885-e447-4fc0-92ac-c64a4d480c16", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='3fc6ed4a50310e0751c0a03a45d6514d',container_format='bare',created_at=2025-09-30T21:55:16Z,direct_url=<?>,disk_format='qcow2',id=0dfd446c-9899-4202-86e2-c09f7b33b375,min_disk=1,min_ram=0,name='tempest-TestSnapshotPatternsnapshot-1030804312',owner='2c2e514e7322435988a7f3bf398623e4',properties=ImageMetaProps,protected=<?>,size=52297728,status='active',tags=<?>,updated_at=2025-09-30T21:55:22Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'guest_format': None, 'encrypted': False, 'size': 0, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'image_id': '0dfd446c-9899-4202-86e2-c09f7b33b375'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.413 2 WARNING nova.virt.libvirt.driver [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.418 2 DEBUG nova.virt.libvirt.host [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.418 2 DEBUG nova.virt.libvirt.host [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.422 2 DEBUG nova.virt.libvirt.host [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.422 2 DEBUG nova.virt.libvirt.host [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.423 2 DEBUG nova.virt.libvirt.driver [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.423 2 DEBUG nova.virt.hardware [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T21:15:04Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='afe5c12d-500a-499b-9438-9e9c37698acc',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='3fc6ed4a50310e0751c0a03a45d6514d',container_format='bare',created_at=2025-09-30T21:55:16Z,direct_url=<?>,disk_format='qcow2',id=0dfd446c-9899-4202-86e2-c09f7b33b375,min_disk=1,min_ram=0,name='tempest-TestSnapshotPatternsnapshot-1030804312',owner='2c2e514e7322435988a7f3bf398623e4',properties=ImageMetaProps,protected=<?>,size=52297728,status='active',tags=<?>,updated_at=2025-09-30T21:55:22Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.424 2 DEBUG nova.virt.hardware [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.424 2 DEBUG nova.virt.hardware [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.424 2 DEBUG nova.virt.hardware [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.424 2 DEBUG nova.virt.hardware [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.424 2 DEBUG nova.virt.hardware [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.425 2 DEBUG nova.virt.hardware [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.425 2 DEBUG nova.virt.hardware [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.425 2 DEBUG nova.virt.hardware [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.425 2 DEBUG nova.virt.hardware [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.425 2 DEBUG nova.virt.hardware [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.428 2 DEBUG nova.virt.libvirt.vif [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:55:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-1448730658',display_name='tempest-TestSnapshotPattern-server-1448730658',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-1448730658',id=185,image_ref='0dfd446c-9899-4202-86e2-c09f7b33b375',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLKL8WwnYuJEP3XpY/ju5SXa+fZd+0s4ElE/Ammc3JO3wP15Y53TJ4QGSyyMbttI4T5Fjj/YGgDR1amj6cHQX5O4wQ/GeWnvDWjS/d7Zz3S4MDwj0ljVzMOx5HSsjDMRAA==',key_name='tempest-TestSnapshotPattern-42320000',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2c2e514e7322435988a7f3bf398623e4',ramdisk_id='',reservation_id='r-fn3p0bkg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_boot_roles='reader,member',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='70bc4ef2-d80e-49af-b20a-6240f762b81e',image_min_disk='1',image_min_ram='0',image_owner_id='2c2e514e7322435988a7f3bf398623e4',image_owner_project_name='tempest-TestSnapshotPattern-1968938915',image_owner_user_name='tempest-TestSnapshotPattern-1968938915-project-member',image_user_id='d6d7afba807d47549781e37178a01774',image_version='8.0',network_allocated='True',owner_project_name='tempest-TestSnapshotPattern-1968938915',owner_user_name='tempest-TestSnapshotPattern-1968938915-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:55:26Z,user_data=None,user_id='d6d7afba807d47549781e37178a01774',uuid=0139492c-c61e-4cd8-b789-64e00c3d8ec8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4427a885-e447-4fc0-92ac-c64a4d480c16", "address": "fa:16:3e:cf:d8:9d", "network": {"id": "36195885-e54a-4c05-b721-98be98333841", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-292865923-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c2e514e7322435988a7f3bf398623e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4427a885-e4", "ovs_interfaceid": "4427a885-e447-4fc0-92ac-c64a4d480c16", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.428 2 DEBUG nova.network.os_vif_util [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Converting VIF {"id": "4427a885-e447-4fc0-92ac-c64a4d480c16", "address": "fa:16:3e:cf:d8:9d", "network": {"id": "36195885-e54a-4c05-b721-98be98333841", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-292865923-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c2e514e7322435988a7f3bf398623e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4427a885-e4", "ovs_interfaceid": "4427a885-e447-4fc0-92ac-c64a4d480c16", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.429 2 DEBUG nova.network.os_vif_util [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cf:d8:9d,bridge_name='br-int',has_traffic_filtering=True,id=4427a885-e447-4fc0-92ac-c64a4d480c16,network=Network(36195885-e54a-4c05-b721-98be98333841),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4427a885-e4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.430 2 DEBUG nova.objects.instance [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0139492c-c61e-4cd8-b789-64e00c3d8ec8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.441 2 DEBUG nova.virt.libvirt.driver [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] End _get_guest_xml xml=<domain type="kvm">
Sep 30 21:55:32 compute-0 nova_compute[192810]:   <uuid>0139492c-c61e-4cd8-b789-64e00c3d8ec8</uuid>
Sep 30 21:55:32 compute-0 nova_compute[192810]:   <name>instance-000000b9</name>
Sep 30 21:55:32 compute-0 nova_compute[192810]:   <memory>131072</memory>
Sep 30 21:55:32 compute-0 nova_compute[192810]:   <vcpu>1</vcpu>
Sep 30 21:55:32 compute-0 nova_compute[192810]:   <metadata>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 21:55:32 compute-0 nova_compute[192810]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:       <nova:name>tempest-TestSnapshotPattern-server-1448730658</nova:name>
Sep 30 21:55:32 compute-0 nova_compute[192810]:       <nova:creationTime>2025-09-30 21:55:32</nova:creationTime>
Sep 30 21:55:32 compute-0 nova_compute[192810]:       <nova:flavor name="m1.nano">
Sep 30 21:55:32 compute-0 nova_compute[192810]:         <nova:memory>128</nova:memory>
Sep 30 21:55:32 compute-0 nova_compute[192810]:         <nova:disk>1</nova:disk>
Sep 30 21:55:32 compute-0 nova_compute[192810]:         <nova:swap>0</nova:swap>
Sep 30 21:55:32 compute-0 nova_compute[192810]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 21:55:32 compute-0 nova_compute[192810]:         <nova:vcpus>1</nova:vcpus>
Sep 30 21:55:32 compute-0 nova_compute[192810]:       </nova:flavor>
Sep 30 21:55:32 compute-0 nova_compute[192810]:       <nova:owner>
Sep 30 21:55:32 compute-0 nova_compute[192810]:         <nova:user uuid="d6d7afba807d47549781e37178a01774">tempest-TestSnapshotPattern-1968938915-project-member</nova:user>
Sep 30 21:55:32 compute-0 nova_compute[192810]:         <nova:project uuid="2c2e514e7322435988a7f3bf398623e4">tempest-TestSnapshotPattern-1968938915</nova:project>
Sep 30 21:55:32 compute-0 nova_compute[192810]:       </nova:owner>
Sep 30 21:55:32 compute-0 nova_compute[192810]:       <nova:root type="image" uuid="0dfd446c-9899-4202-86e2-c09f7b33b375"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:       <nova:ports>
Sep 30 21:55:32 compute-0 nova_compute[192810]:         <nova:port uuid="4427a885-e447-4fc0-92ac-c64a4d480c16">
Sep 30 21:55:32 compute-0 nova_compute[192810]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:         </nova:port>
Sep 30 21:55:32 compute-0 nova_compute[192810]:       </nova:ports>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     </nova:instance>
Sep 30 21:55:32 compute-0 nova_compute[192810]:   </metadata>
Sep 30 21:55:32 compute-0 nova_compute[192810]:   <sysinfo type="smbios">
Sep 30 21:55:32 compute-0 nova_compute[192810]:     <system>
Sep 30 21:55:32 compute-0 nova_compute[192810]:       <entry name="manufacturer">RDO</entry>
Sep 30 21:55:32 compute-0 nova_compute[192810]:       <entry name="product">OpenStack Compute</entry>
Sep 30 21:55:32 compute-0 nova_compute[192810]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Sep 30 21:55:32 compute-0 nova_compute[192810]:       <entry name="serial">0139492c-c61e-4cd8-b789-64e00c3d8ec8</entry>
Sep 30 21:55:32 compute-0 nova_compute[192810]:       <entry name="uuid">0139492c-c61e-4cd8-b789-64e00c3d8ec8</entry>
Sep 30 21:55:32 compute-0 nova_compute[192810]:       <entry name="family">Virtual Machine</entry>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     </system>
Sep 30 21:55:32 compute-0 nova_compute[192810]:   </sysinfo>
Sep 30 21:55:32 compute-0 nova_compute[192810]:   <os>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     <boot dev="hd"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     <smbios mode="sysinfo"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:   </os>
Sep 30 21:55:32 compute-0 nova_compute[192810]:   <features>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     <acpi/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     <apic/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     <vmcoreinfo/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:   </features>
Sep 30 21:55:32 compute-0 nova_compute[192810]:   <clock offset="utc">
Sep 30 21:55:32 compute-0 nova_compute[192810]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     <timer name="hpet" present="no"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:   </clock>
Sep 30 21:55:32 compute-0 nova_compute[192810]:   <cpu mode="custom" match="exact">
Sep 30 21:55:32 compute-0 nova_compute[192810]:     <model>Nehalem</model>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:   </cpu>
Sep 30 21:55:32 compute-0 nova_compute[192810]:   <devices>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     <disk type="file" device="disk">
Sep 30 21:55:32 compute-0 nova_compute[192810]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/0139492c-c61e-4cd8-b789-64e00c3d8ec8/disk"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:       <target dev="vda" bus="virtio"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     <disk type="file" device="cdrom">
Sep 30 21:55:32 compute-0 nova_compute[192810]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:       <source file="/var/lib/nova/instances/0139492c-c61e-4cd8-b789-64e00c3d8ec8/disk.config"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:       <target dev="sda" bus="sata"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     </disk>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     <interface type="ethernet">
Sep 30 21:55:32 compute-0 nova_compute[192810]:       <mac address="fa:16:3e:cf:d8:9d"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:       <mtu size="1442"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:       <target dev="tap4427a885-e4"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     </interface>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     <serial type="pty">
Sep 30 21:55:32 compute-0 nova_compute[192810]:       <log file="/var/lib/nova/instances/0139492c-c61e-4cd8-b789-64e00c3d8ec8/console.log" append="off"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     </serial>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     <video>
Sep 30 21:55:32 compute-0 nova_compute[192810]:       <model type="virtio"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     </video>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     <input type="tablet" bus="usb"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     <input type="keyboard" bus="usb"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     <rng model="virtio">
Sep 30 21:55:32 compute-0 nova_compute[192810]:       <backend model="random">/dev/urandom</backend>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     </rng>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     <controller type="usb" index="0"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     <memballoon model="virtio">
Sep 30 21:55:32 compute-0 nova_compute[192810]:       <stats period="10"/>
Sep 30 21:55:32 compute-0 nova_compute[192810]:     </memballoon>
Sep 30 21:55:32 compute-0 nova_compute[192810]:   </devices>
Sep 30 21:55:32 compute-0 nova_compute[192810]: </domain>
Sep 30 21:55:32 compute-0 nova_compute[192810]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.442 2 DEBUG nova.compute.manager [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Preparing to wait for external event network-vif-plugged-4427a885-e447-4fc0-92ac-c64a4d480c16 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.442 2 DEBUG oslo_concurrency.lockutils [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Acquiring lock "0139492c-c61e-4cd8-b789-64e00c3d8ec8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.443 2 DEBUG oslo_concurrency.lockutils [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Lock "0139492c-c61e-4cd8-b789-64e00c3d8ec8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.443 2 DEBUG oslo_concurrency.lockutils [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Lock "0139492c-c61e-4cd8-b789-64e00c3d8ec8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.443 2 DEBUG nova.virt.libvirt.vif [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-09-30T21:55:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-1448730658',display_name='tempest-TestSnapshotPattern-server-1448730658',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-1448730658',id=185,image_ref='0dfd446c-9899-4202-86e2-c09f7b33b375',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLKL8WwnYuJEP3XpY/ju5SXa+fZd+0s4ElE/Ammc3JO3wP15Y53TJ4QGSyyMbttI4T5Fjj/YGgDR1amj6cHQX5O4wQ/GeWnvDWjS/d7Zz3S4MDwj0ljVzMOx5HSsjDMRAA==',key_name='tempest-TestSnapshotPattern-42320000',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2c2e514e7322435988a7f3bf398623e4',ramdisk_id='',reservation_id='r-fn3p0bkg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_boot_roles='reader,member',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='70bc4ef2-d80e-49af-b20a-6240f762b81e',image_min_disk='1',image_min_ram='0',image_owner_id='2c2e514e7322435988a7f3bf398623e4',image_owner_project_name='tempest-TestSnapshotPattern-1968938915',image_owner_user_name='tempest-TestSnapshotPattern-1968938915-project-member',image_user_id='d6d7afba807d47549781e37178a01774',image_version='8.0',network_allocated='True',owner_project_name='tempest-TestSnapshotPattern-1968938915',owner_user_name='tempest-TestSnapshotPattern-1968938915-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T21:55:26Z,user_data=None,user_id='d6d7afba807d47549781e37178a01774',uuid=0139492c-c61e-4cd8-b789-64e00c3d8ec8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4427a885-e447-4fc0-92ac-c64a4d480c16", "address": "fa:16:3e:cf:d8:9d", "network": {"id": "36195885-e54a-4c05-b721-98be98333841", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-292865923-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c2e514e7322435988a7f3bf398623e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4427a885-e4", "ovs_interfaceid": "4427a885-e447-4fc0-92ac-c64a4d480c16", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.444 2 DEBUG nova.network.os_vif_util [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Converting VIF {"id": "4427a885-e447-4fc0-92ac-c64a4d480c16", "address": "fa:16:3e:cf:d8:9d", "network": {"id": "36195885-e54a-4c05-b721-98be98333841", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-292865923-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c2e514e7322435988a7f3bf398623e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4427a885-e4", "ovs_interfaceid": "4427a885-e447-4fc0-92ac-c64a4d480c16", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.444 2 DEBUG nova.network.os_vif_util [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cf:d8:9d,bridge_name='br-int',has_traffic_filtering=True,id=4427a885-e447-4fc0-92ac-c64a4d480c16,network=Network(36195885-e54a-4c05-b721-98be98333841),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4427a885-e4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.444 2 DEBUG os_vif [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cf:d8:9d,bridge_name='br-int',has_traffic_filtering=True,id=4427a885-e447-4fc0-92ac-c64a4d480c16,network=Network(36195885-e54a-4c05-b721-98be98333841),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4427a885-e4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.445 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.445 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.447 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4427a885-e4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.448 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4427a885-e4, col_values=(('external_ids', {'iface-id': '4427a885-e447-4fc0-92ac-c64a4d480c16', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cf:d8:9d', 'vm-uuid': '0139492c-c61e-4cd8-b789-64e00c3d8ec8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.449 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:32 compute-0 NetworkManager[51733]: <info>  [1759269332.4509] manager: (tap4427a885-e4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/339)
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.455 2 INFO os_vif [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cf:d8:9d,bridge_name='br-int',has_traffic_filtering=True,id=4427a885-e447-4fc0-92ac-c64a4d480c16,network=Network(36195885-e54a-4c05-b721-98be98333841),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4427a885-e4')
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.515 2 DEBUG nova.virt.libvirt.driver [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.516 2 DEBUG nova.virt.libvirt.driver [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.516 2 DEBUG nova.virt.libvirt.driver [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] No VIF found with MAC fa:16:3e:cf:d8:9d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Sep 30 21:55:32 compute-0 nova_compute[192810]: 2025-09-30 21:55:32.517 2 INFO nova.virt.libvirt.driver [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Using config drive
Sep 30 21:55:33 compute-0 nova_compute[192810]: 2025-09-30 21:55:33.379 2 DEBUG nova.compute.manager [req-2b673025-f7ba-42c9-886c-cbe5ab12511e req-0d2efc31-0edb-456c-87b1-93ccc534b435 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Received event network-vif-deleted-ebfc0c9a-67e0-40a1-abf3-105c7c3435b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:55:33 compute-0 nova_compute[192810]: 2025-09-30 21:55:33.393 2 INFO nova.virt.libvirt.driver [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Creating config drive at /var/lib/nova/instances/0139492c-c61e-4cd8-b789-64e00c3d8ec8/disk.config
Sep 30 21:55:33 compute-0 nova_compute[192810]: 2025-09-30 21:55:33.398 2 DEBUG oslo_concurrency.processutils [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0139492c-c61e-4cd8-b789-64e00c3d8ec8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplwudq_d0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:55:33 compute-0 nova_compute[192810]: 2025-09-30 21:55:33.527 2 DEBUG oslo_concurrency.processutils [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0139492c-c61e-4cd8-b789-64e00c3d8ec8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplwudq_d0" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:55:33 compute-0 kernel: tap4427a885-e4: entered promiscuous mode
Sep 30 21:55:33 compute-0 NetworkManager[51733]: <info>  [1759269333.5776] manager: (tap4427a885-e4): new Tun device (/org/freedesktop/NetworkManager/Devices/340)
Sep 30 21:55:33 compute-0 ovn_controller[94912]: 2025-09-30T21:55:33Z|00769|binding|INFO|Claiming lport 4427a885-e447-4fc0-92ac-c64a4d480c16 for this chassis.
Sep 30 21:55:33 compute-0 ovn_controller[94912]: 2025-09-30T21:55:33Z|00770|binding|INFO|4427a885-e447-4fc0-92ac-c64a4d480c16: Claiming fa:16:3e:cf:d8:9d 10.100.0.14
Sep 30 21:55:33 compute-0 nova_compute[192810]: 2025-09-30 21:55:33.579 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:33.587 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cf:d8:9d 10.100.0.14'], port_security=['fa:16:3e:cf:d8:9d 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '0139492c-c61e-4cd8-b789-64e00c3d8ec8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-36195885-e54a-4c05-b721-98be98333841', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2c2e514e7322435988a7f3bf398623e4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '98fe3b07-2473-4235-becf-2f443e01bab9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1d62c416-32f7-4ff1-bc52-93428ed2b707, chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=4427a885-e447-4fc0-92ac-c64a4d480c16) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:33.588 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 4427a885-e447-4fc0-92ac-c64a4d480c16 in datapath 36195885-e54a-4c05-b721-98be98333841 bound to our chassis
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:33.590 103867 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 36195885-e54a-4c05-b721-98be98333841
Sep 30 21:55:33 compute-0 ovn_controller[94912]: 2025-09-30T21:55:33Z|00771|binding|INFO|Setting lport 4427a885-e447-4fc0-92ac-c64a4d480c16 ovn-installed in OVS
Sep 30 21:55:33 compute-0 ovn_controller[94912]: 2025-09-30T21:55:33Z|00772|binding|INFO|Setting lport 4427a885-e447-4fc0-92ac-c64a4d480c16 up in Southbound
Sep 30 21:55:33 compute-0 nova_compute[192810]: 2025-09-30 21:55:33.594 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:33.602 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[881d1988-46f4-4737-b2a3-7cfcad9f9750]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:33.603 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap36195885-e1 in ovnmeta-36195885-e54a-4c05-b721-98be98333841 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:33.604 220624 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap36195885-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:33.605 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[e728acf9-357e-4da2-aed1-457bbc42a3e7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:33.606 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[a537f011-5a5e-474a-8462-98ef48581c76]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:33 compute-0 systemd-udevd[251212]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 21:55:33 compute-0 systemd-machined[152794]: New machine qemu-91-instance-000000b9.
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:33.616 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[2e8ba8ca-ba62-4a06-9219-c1f6daa710b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:33 compute-0 NetworkManager[51733]: <info>  [1759269333.6246] device (tap4427a885-e4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 21:55:33 compute-0 NetworkManager[51733]: <info>  [1759269333.6255] device (tap4427a885-e4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 21:55:33 compute-0 systemd[1]: Started Virtual Machine qemu-91-instance-000000b9.
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:33.639 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[addb3fb1-fe1a-47ca-befa-7a727bde44fd]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:33.669 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[1f2a9594-569c-45a4-9675-d82649aa30be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:33.673 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[63878b26-7c94-4eb5-a594-42bc3d58cbeb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:33 compute-0 NetworkManager[51733]: <info>  [1759269333.6742] manager: (tap36195885-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/341)
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:33.707 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[d3b5f5e4-8af0-4ee5-abd5-56000ee0aaa4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:33.712 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[a4977a29-ba17-43f9-8ea9-8a6c03e0d960]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:33 compute-0 NetworkManager[51733]: <info>  [1759269333.7361] device (tap36195885-e0): carrier: link connected
Sep 30 21:55:33 compute-0 sshd-session[250893]: Failed password for invalid user ubuntu from 8.210.178.40 port 35696 ssh2
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:33.742 220638 DEBUG oslo.privsep.daemon [-] privsep: reply[17428076-8d7e-4a2e-93a5-9b9203998673]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:33 compute-0 nova_compute[192810]: 2025-09-30 21:55:33.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:33.759 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[f91c6638-4988-4939-9265-5dae624dadc3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap36195885-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:57:d5:fa'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 230], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 602936, 'reachable_time': 44302, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 251244, 'error': None, 'target': 'ovnmeta-36195885-e54a-4c05-b721-98be98333841', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:33.775 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[ff757d57-a69a-481c-9de8-c50c2364d48a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe57:d5fa'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 602936, 'tstamp': 602936}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 251245, 'error': None, 'target': 'ovnmeta-36195885-e54a-4c05-b721-98be98333841', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:33.790 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[141d4c57-b2d8-4cfe-ac7b-48b6701537b9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap36195885-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:57:d5:fa'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 230], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 602936, 'reachable_time': 44302, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 251246, 'error': None, 'target': 'ovnmeta-36195885-e54a-4c05-b721-98be98333841', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:33.820 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[ebb78a4e-c334-48c2-ad8f-e108aab67a41]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:33.873 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[9df91f0c-ab77-4b27-b747-c43a6ee3e0ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:33.875 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap36195885-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:33.875 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:33.875 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap36195885-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:55:33 compute-0 NetworkManager[51733]: <info>  [1759269333.8779] manager: (tap36195885-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/342)
Sep 30 21:55:33 compute-0 nova_compute[192810]: 2025-09-30 21:55:33.877 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:33 compute-0 kernel: tap36195885-e0: entered promiscuous mode
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:33.881 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap36195885-e0, col_values=(('external_ids', {'iface-id': '4398875c-5162-48bf-962e-225588b9b8b0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:55:33 compute-0 nova_compute[192810]: 2025-09-30 21:55:33.882 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:33 compute-0 nova_compute[192810]: 2025-09-30 21:55:33.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:33 compute-0 ovn_controller[94912]: 2025-09-30T21:55:33Z|00773|binding|INFO|Releasing lport 4398875c-5162-48bf-962e-225588b9b8b0 from this chassis (sb_readonly=0)
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:33.887 103867 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/36195885-e54a-4c05-b721-98be98333841.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/36195885-e54a-4c05-b721-98be98333841.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:33.888 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[d7ebae19-4921-4888-8349-d7ee5fcb98f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:33.890 103867 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]: global
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]:     log         /dev/log local0 debug
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]:     log-tag     haproxy-metadata-proxy-36195885-e54a-4c05-b721-98be98333841
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]:     user        root
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]:     group       root
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]:     maxconn     1024
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]:     pidfile     /var/lib/neutron/external/pids/36195885-e54a-4c05-b721-98be98333841.pid.haproxy
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]:     daemon
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]: defaults
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]:     log global
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]:     mode http
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]:     option httplog
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]:     option dontlognull
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]:     option http-server-close
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]:     option forwardfor
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]:     retries                 3
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]:     timeout http-request    30s
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]:     timeout connect         30s
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]:     timeout client          32s
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]:     timeout server          32s
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]:     timeout http-keep-alive 30s
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]: 
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]: listen listener
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]:     bind 169.254.169.254:80
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]:     http-request add-header X-OVN-Network-ID 36195885-e54a-4c05-b721-98be98333841
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Sep 30 21:55:33 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:33.891 103867 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-36195885-e54a-4c05-b721-98be98333841', 'env', 'PROCESS_TAG=haproxy-36195885-e54a-4c05-b721-98be98333841', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/36195885-e54a-4c05-b721-98be98333841.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Sep 30 21:55:33 compute-0 nova_compute[192810]: 2025-09-30 21:55:33.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:34 compute-0 podman[251285]: 2025-09-30 21:55:34.269485658 +0000 UTC m=+0.048838608 container create d7d0ea26832e146211d38cbce3b4caeacc5da7163bb8c78364b50be7cd8068c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-36195885-e54a-4c05-b721-98be98333841, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923)
Sep 30 21:55:34 compute-0 systemd[1]: Started libpod-conmon-d7d0ea26832e146211d38cbce3b4caeacc5da7163bb8c78364b50be7cd8068c8.scope.
Sep 30 21:55:34 compute-0 nova_compute[192810]: 2025-09-30 21:55:34.304 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759269334.303787, 0139492c-c61e-4cd8-b789-64e00c3d8ec8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:55:34 compute-0 nova_compute[192810]: 2025-09-30 21:55:34.305 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] VM Started (Lifecycle Event)
Sep 30 21:55:34 compute-0 systemd[1]: Started libcrun container.
Sep 30 21:55:34 compute-0 nova_compute[192810]: 2025-09-30 21:55:34.326 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:55:34 compute-0 nova_compute[192810]: 2025-09-30 21:55:34.329 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759269334.3039036, 0139492c-c61e-4cd8-b789-64e00c3d8ec8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:55:34 compute-0 nova_compute[192810]: 2025-09-30 21:55:34.330 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] VM Paused (Lifecycle Event)
Sep 30 21:55:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/02de7b089857629fed75727bad4e0cbf996ab16b57a2ab08ddd1abe9ce02cc39/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 21:55:34 compute-0 podman[251285]: 2025-09-30 21:55:34.342003637 +0000 UTC m=+0.121356617 container init d7d0ea26832e146211d38cbce3b4caeacc5da7163bb8c78364b50be7cd8068c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-36195885-e54a-4c05-b721-98be98333841, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923)
Sep 30 21:55:34 compute-0 podman[251285]: 2025-09-30 21:55:34.247784778 +0000 UTC m=+0.027137728 image pull aa21cc3d2531fe07b45a943d4ac1ba0268bfab26b0884a4a00fbad7695318ba9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Sep 30 21:55:34 compute-0 podman[251285]: 2025-09-30 21:55:34.346939482 +0000 UTC m=+0.126292452 container start d7d0ea26832e146211d38cbce3b4caeacc5da7163bb8c78364b50be7cd8068c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-36195885-e54a-4c05-b721-98be98333841, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Sep 30 21:55:34 compute-0 nova_compute[192810]: 2025-09-30 21:55:34.354 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:55:34 compute-0 nova_compute[192810]: 2025-09-30 21:55:34.358 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:55:34 compute-0 neutron-haproxy-ovnmeta-36195885-e54a-4c05-b721-98be98333841[251300]: [NOTICE]   (251304) : New worker (251306) forked
Sep 30 21:55:34 compute-0 neutron-haproxy-ovnmeta-36195885-e54a-4c05-b721-98be98333841[251300]: [NOTICE]   (251304) : Loading success.
Sep 30 21:55:34 compute-0 nova_compute[192810]: 2025-09-30 21:55:34.379 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:55:35 compute-0 nova_compute[192810]: 2025-09-30 21:55:35.312 2 DEBUG nova.network.neutron [req-a321710b-9a44-420b-be96-84783b638345 req-02c66985-8de4-4d9b-bfa2-3ea427ce7808 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Updated VIF entry in instance network info cache for port 4427a885-e447-4fc0-92ac-c64a4d480c16. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:55:35 compute-0 nova_compute[192810]: 2025-09-30 21:55:35.313 2 DEBUG nova.network.neutron [req-a321710b-9a44-420b-be96-84783b638345 req-02c66985-8de4-4d9b-bfa2-3ea427ce7808 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Updating instance_info_cache with network_info: [{"id": "4427a885-e447-4fc0-92ac-c64a4d480c16", "address": "fa:16:3e:cf:d8:9d", "network": {"id": "36195885-e54a-4c05-b721-98be98333841", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-292865923-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c2e514e7322435988a7f3bf398623e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4427a885-e4", "ovs_interfaceid": "4427a885-e447-4fc0-92ac-c64a4d480c16", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:55:35 compute-0 nova_compute[192810]: 2025-09-30 21:55:35.327 2 DEBUG oslo_concurrency.lockutils [req-a321710b-9a44-420b-be96-84783b638345 req-02c66985-8de4-4d9b-bfa2-3ea427ce7808 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-0139492c-c61e-4cd8-b789-64e00c3d8ec8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:55:35 compute-0 nova_compute[192810]: 2025-09-30 21:55:35.468 2 DEBUG nova.compute.manager [req-819947d7-3c7f-4929-be1e-f22ce15cc4f0 req-a23afef3-ae57-4221-89e9-0ae8870bcac8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Received event network-vif-plugged-4427a885-e447-4fc0-92ac-c64a4d480c16 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:55:35 compute-0 nova_compute[192810]: 2025-09-30 21:55:35.468 2 DEBUG oslo_concurrency.lockutils [req-819947d7-3c7f-4929-be1e-f22ce15cc4f0 req-a23afef3-ae57-4221-89e9-0ae8870bcac8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "0139492c-c61e-4cd8-b789-64e00c3d8ec8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:55:35 compute-0 nova_compute[192810]: 2025-09-30 21:55:35.468 2 DEBUG oslo_concurrency.lockutils [req-819947d7-3c7f-4929-be1e-f22ce15cc4f0 req-a23afef3-ae57-4221-89e9-0ae8870bcac8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "0139492c-c61e-4cd8-b789-64e00c3d8ec8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:55:35 compute-0 nova_compute[192810]: 2025-09-30 21:55:35.469 2 DEBUG oslo_concurrency.lockutils [req-819947d7-3c7f-4929-be1e-f22ce15cc4f0 req-a23afef3-ae57-4221-89e9-0ae8870bcac8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "0139492c-c61e-4cd8-b789-64e00c3d8ec8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:55:35 compute-0 nova_compute[192810]: 2025-09-30 21:55:35.469 2 DEBUG nova.compute.manager [req-819947d7-3c7f-4929-be1e-f22ce15cc4f0 req-a23afef3-ae57-4221-89e9-0ae8870bcac8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Processing event network-vif-plugged-4427a885-e447-4fc0-92ac-c64a4d480c16 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Sep 30 21:55:35 compute-0 nova_compute[192810]: 2025-09-30 21:55:35.469 2 DEBUG nova.compute.manager [req-819947d7-3c7f-4929-be1e-f22ce15cc4f0 req-a23afef3-ae57-4221-89e9-0ae8870bcac8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Received event network-vif-plugged-4427a885-e447-4fc0-92ac-c64a4d480c16 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:55:35 compute-0 nova_compute[192810]: 2025-09-30 21:55:35.469 2 DEBUG oslo_concurrency.lockutils [req-819947d7-3c7f-4929-be1e-f22ce15cc4f0 req-a23afef3-ae57-4221-89e9-0ae8870bcac8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "0139492c-c61e-4cd8-b789-64e00c3d8ec8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:55:35 compute-0 nova_compute[192810]: 2025-09-30 21:55:35.469 2 DEBUG oslo_concurrency.lockutils [req-819947d7-3c7f-4929-be1e-f22ce15cc4f0 req-a23afef3-ae57-4221-89e9-0ae8870bcac8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "0139492c-c61e-4cd8-b789-64e00c3d8ec8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:55:35 compute-0 nova_compute[192810]: 2025-09-30 21:55:35.470 2 DEBUG oslo_concurrency.lockutils [req-819947d7-3c7f-4929-be1e-f22ce15cc4f0 req-a23afef3-ae57-4221-89e9-0ae8870bcac8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "0139492c-c61e-4cd8-b789-64e00c3d8ec8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:55:35 compute-0 nova_compute[192810]: 2025-09-30 21:55:35.470 2 DEBUG nova.compute.manager [req-819947d7-3c7f-4929-be1e-f22ce15cc4f0 req-a23afef3-ae57-4221-89e9-0ae8870bcac8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] No waiting events found dispatching network-vif-plugged-4427a885-e447-4fc0-92ac-c64a4d480c16 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:55:35 compute-0 nova_compute[192810]: 2025-09-30 21:55:35.470 2 WARNING nova.compute.manager [req-819947d7-3c7f-4929-be1e-f22ce15cc4f0 req-a23afef3-ae57-4221-89e9-0ae8870bcac8 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Received unexpected event network-vif-plugged-4427a885-e447-4fc0-92ac-c64a4d480c16 for instance with vm_state building and task_state spawning.
Sep 30 21:55:35 compute-0 nova_compute[192810]: 2025-09-30 21:55:35.471 2 DEBUG nova.compute.manager [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Sep 30 21:55:35 compute-0 nova_compute[192810]: 2025-09-30 21:55:35.473 2 DEBUG nova.virt.driver [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] Emitting event <LifecycleEvent: 1759269335.4734626, 0139492c-c61e-4cd8-b789-64e00c3d8ec8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:55:35 compute-0 nova_compute[192810]: 2025-09-30 21:55:35.473 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] VM Resumed (Lifecycle Event)
Sep 30 21:55:35 compute-0 nova_compute[192810]: 2025-09-30 21:55:35.475 2 DEBUG nova.virt.libvirt.driver [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Sep 30 21:55:35 compute-0 nova_compute[192810]: 2025-09-30 21:55:35.478 2 INFO nova.virt.libvirt.driver [-] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Instance spawned successfully.
Sep 30 21:55:35 compute-0 nova_compute[192810]: 2025-09-30 21:55:35.478 2 INFO nova.compute.manager [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Took 8.46 seconds to spawn the instance on the hypervisor.
Sep 30 21:55:35 compute-0 nova_compute[192810]: 2025-09-30 21:55:35.479 2 DEBUG nova.compute.manager [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:55:35 compute-0 nova_compute[192810]: 2025-09-30 21:55:35.492 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:55:35 compute-0 nova_compute[192810]: 2025-09-30 21:55:35.495 2 DEBUG nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Sep 30 21:55:35 compute-0 nova_compute[192810]: 2025-09-30 21:55:35.526 2 INFO nova.compute.manager [None req-a69cff4d-b01c-4db2-aa9b-b284426e4eb4 - - - - - -] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] During sync_power_state the instance has a pending task (spawning). Skip.
Sep 30 21:55:35 compute-0 nova_compute[192810]: 2025-09-30 21:55:35.570 2 INFO nova.compute.manager [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Took 9.00 seconds to build instance.
Sep 30 21:55:35 compute-0 nova_compute[192810]: 2025-09-30 21:55:35.591 2 DEBUG oslo_concurrency.lockutils [None req-3e8e2e0a-b1bf-4a35-a336-72f14b33ca3c d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Lock "0139492c-c61e-4cd8-b789-64e00c3d8ec8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:55:35 compute-0 sshd-session[250893]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:55:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:37.334 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=46, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=45) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:55:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:37.335 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:55:37 compute-0 nova_compute[192810]: 2025-09-30 21:55:37.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:37 compute-0 nova_compute[192810]: 2025-09-30 21:55:37.449 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:37 compute-0 ovn_controller[94912]: 2025-09-30T21:55:37Z|00774|binding|INFO|Releasing lport 4398875c-5162-48bf-962e-225588b9b8b0 from this chassis (sb_readonly=0)
Sep 30 21:55:37 compute-0 nova_compute[192810]: 2025-09-30 21:55:37.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:37 compute-0 sshd-session[250893]: Failed password for invalid user ubuntu from 8.210.178.40 port 35696 ssh2
Sep 30 21:55:38 compute-0 nova_compute[192810]: 2025-09-30 21:55:38.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:38.760 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:55:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:38.761 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:55:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:38.761 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:55:39 compute-0 podman[251315]: 2025-09-30 21:55:39.341650471 +0000 UTC m=+0.074184492 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 21:55:39 compute-0 podman[251316]: 2025-09-30 21:55:39.349712075 +0000 UTC m=+0.076885860 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, container_name=openstack_network_exporter, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.tags=minimal rhel9, version=9.6)
Sep 30 21:55:39 compute-0 sshd-session[250893]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:55:40 compute-0 nova_compute[192810]: 2025-09-30 21:55:40.443 2 DEBUG nova.compute.manager [req-4e24e3ab-8c14-49ec-a6ac-a88d85509397 req-1401c7ee-9b4a-4ce0-8e99-ba7d04eba00d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Received event network-changed-4427a885-e447-4fc0-92ac-c64a4d480c16 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:55:40 compute-0 nova_compute[192810]: 2025-09-30 21:55:40.443 2 DEBUG nova.compute.manager [req-4e24e3ab-8c14-49ec-a6ac-a88d85509397 req-1401c7ee-9b4a-4ce0-8e99-ba7d04eba00d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Refreshing instance network info cache due to event network-changed-4427a885-e447-4fc0-92ac-c64a4d480c16. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:55:40 compute-0 nova_compute[192810]: 2025-09-30 21:55:40.443 2 DEBUG oslo_concurrency.lockutils [req-4e24e3ab-8c14-49ec-a6ac-a88d85509397 req-1401c7ee-9b4a-4ce0-8e99-ba7d04eba00d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-0139492c-c61e-4cd8-b789-64e00c3d8ec8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:55:40 compute-0 nova_compute[192810]: 2025-09-30 21:55:40.444 2 DEBUG oslo_concurrency.lockutils [req-4e24e3ab-8c14-49ec-a6ac-a88d85509397 req-1401c7ee-9b4a-4ce0-8e99-ba7d04eba00d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-0139492c-c61e-4cd8-b789-64e00c3d8ec8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:55:40 compute-0 nova_compute[192810]: 2025-09-30 21:55:40.444 2 DEBUG nova.network.neutron [req-4e24e3ab-8c14-49ec-a6ac-a88d85509397 req-1401c7ee-9b4a-4ce0-8e99-ba7d04eba00d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Refreshing network info cache for port 4427a885-e447-4fc0-92ac-c64a4d480c16 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:55:41 compute-0 sshd-session[250893]: Failed password for invalid user ubuntu from 8.210.178.40 port 35696 ssh2
Sep 30 21:55:42 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:55:42.337 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '46'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:55:42 compute-0 nova_compute[192810]: 2025-09-30 21:55:42.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:42 compute-0 nova_compute[192810]: 2025-09-30 21:55:42.466 2 DEBUG nova.network.neutron [req-4e24e3ab-8c14-49ec-a6ac-a88d85509397 req-1401c7ee-9b4a-4ce0-8e99-ba7d04eba00d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Updated VIF entry in instance network info cache for port 4427a885-e447-4fc0-92ac-c64a4d480c16. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:55:42 compute-0 nova_compute[192810]: 2025-09-30 21:55:42.467 2 DEBUG nova.network.neutron [req-4e24e3ab-8c14-49ec-a6ac-a88d85509397 req-1401c7ee-9b4a-4ce0-8e99-ba7d04eba00d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Updating instance_info_cache with network_info: [{"id": "4427a885-e447-4fc0-92ac-c64a4d480c16", "address": "fa:16:3e:cf:d8:9d", "network": {"id": "36195885-e54a-4c05-b721-98be98333841", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-292865923-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.227", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c2e514e7322435988a7f3bf398623e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4427a885-e4", "ovs_interfaceid": "4427a885-e447-4fc0-92ac-c64a4d480c16", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:55:42 compute-0 nova_compute[192810]: 2025-09-30 21:55:42.492 2 DEBUG oslo_concurrency.lockutils [req-4e24e3ab-8c14-49ec-a6ac-a88d85509397 req-1401c7ee-9b4a-4ce0-8e99-ba7d04eba00d dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-0139492c-c61e-4cd8-b789-64e00c3d8ec8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:55:43 compute-0 nova_compute[192810]: 2025-09-30 21:55:43.756 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:43 compute-0 sshd-session[250893]: error: maximum authentication attempts exceeded for invalid user ubuntu from 8.210.178.40 port 35696 ssh2 [preauth]
Sep 30 21:55:43 compute-0 sshd-session[250893]: Disconnecting invalid user ubuntu 8.210.178.40 port 35696: Too many authentication failures [preauth]
Sep 30 21:55:43 compute-0 sshd-session[250893]: PAM 5 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40
Sep 30 21:55:43 compute-0 sshd-session[250893]: PAM service(sshd) ignoring max retries; 6 > 3
Sep 30 21:55:45 compute-0 nova_compute[192810]: 2025-09-30 21:55:45.140 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759269330.1397686, 0decb8c3-82ea-4251-8377-e18a207b6093 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:55:45 compute-0 nova_compute[192810]: 2025-09-30 21:55:45.141 2 INFO nova.compute.manager [-] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] VM Stopped (Lifecycle Event)
Sep 30 21:55:45 compute-0 nova_compute[192810]: 2025-09-30 21:55:45.169 2 DEBUG nova.compute.manager [None req-56c9afcd-5956-4e20-9009-6212217756a8 - - - - - -] [instance: 0decb8c3-82ea-4251-8377-e18a207b6093] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:55:45 compute-0 sshd-session[251361]: Invalid user ubuntu from 8.210.178.40 port 36462
Sep 30 21:55:45 compute-0 sshd-session[251361]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:55:45 compute-0 sshd-session[251361]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40
Sep 30 21:55:46 compute-0 podman[251363]: 2025-09-30 21:55:46.32381558 +0000 UTC m=+0.060336761 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd)
Sep 30 21:55:46 compute-0 podman[251364]: 2025-09-30 21:55:46.335192759 +0000 UTC m=+0.071862083 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, config_id=iscsid, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Sep 30 21:55:46 compute-0 podman[251365]: 2025-09-30 21:55:46.356342245 +0000 UTC m=+0.078512512 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 21:55:46 compute-0 sshd-session[251361]: Failed password for invalid user ubuntu from 8.210.178.40 port 36462 ssh2
Sep 30 21:55:47 compute-0 nova_compute[192810]: 2025-09-30 21:55:47.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:47 compute-0 sshd-session[251361]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:55:47 compute-0 ovn_controller[94912]: 2025-09-30T21:55:47Z|00086|pinctrl(ovn_pinctrl0)|WARN|DHCPREQUEST requested IP 10.100.0.5 does not match offer 10.100.0.14
Sep 30 21:55:47 compute-0 ovn_controller[94912]: 2025-09-30T21:55:47Z|00087|pinctrl(ovn_pinctrl0)|INFO|DHCPNAK fa:16:3e:cf:d8:9d 10.100.0.14
Sep 30 21:55:48 compute-0 nova_compute[192810]: 2025-09-30 21:55:48.757 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:49 compute-0 sshd-session[251361]: Failed password for invalid user ubuntu from 8.210.178.40 port 36462 ssh2
Sep 30 21:55:51 compute-0 sshd-session[251361]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:55:52 compute-0 nova_compute[192810]: 2025-09-30 21:55:52.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:52 compute-0 ovn_controller[94912]: 2025-09-30T21:55:52Z|00088|pinctrl(ovn_pinctrl0)|WARN|DHCPREQUEST requested IP 10.100.0.5 does not match offer 10.100.0.14
Sep 30 21:55:52 compute-0 ovn_controller[94912]: 2025-09-30T21:55:52Z|00089|pinctrl(ovn_pinctrl0)|INFO|DHCPNAK fa:16:3e:cf:d8:9d 10.100.0.14
Sep 30 21:55:52 compute-0 ovn_controller[94912]: 2025-09-30T21:55:52Z|00090|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:cf:d8:9d 10.100.0.14
Sep 30 21:55:52 compute-0 ovn_controller[94912]: 2025-09-30T21:55:52Z|00091|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:cf:d8:9d 10.100.0.14
Sep 30 21:55:53 compute-0 sshd-session[251361]: Failed password for invalid user ubuntu from 8.210.178.40 port 36462 ssh2
Sep 30 21:55:53 compute-0 nova_compute[192810]: 2025-09-30 21:55:53.759 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:53 compute-0 sshd-session[251361]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:55:56 compute-0 sshd-session[251361]: Failed password for invalid user ubuntu from 8.210.178.40 port 36462 ssh2
Sep 30 21:55:57 compute-0 nova_compute[192810]: 2025-09-30 21:55:57.526 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:58 compute-0 sshd-session[251361]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:55:58 compute-0 nova_compute[192810]: 2025-09-30 21:55:58.760 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:55:59 compute-0 sshd-session[251361]: Failed password for invalid user ubuntu from 8.210.178.40 port 36462 ssh2
Sep 30 21:56:00 compute-0 sshd-session[251361]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:56:00 compute-0 podman[251435]: 2025-09-30 21:56:00.309293058 +0000 UTC m=+0.048302995 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Sep 30 21:56:00 compute-0 podman[251436]: 2025-09-30 21:56:00.317165278 +0000 UTC m=+0.053690392 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Sep 30 21:56:00 compute-0 podman[251434]: 2025-09-30 21:56:00.329981273 +0000 UTC m=+0.071346980 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:56:02 compute-0 sshd-session[251361]: Failed password for invalid user ubuntu from 8.210.178.40 port 36462 ssh2
Sep 30 21:56:02 compute-0 nova_compute[192810]: 2025-09-30 21:56:02.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:03 compute-0 nova_compute[192810]: 2025-09-30 21:56:03.762 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:04 compute-0 sshd-session[251361]: error: maximum authentication attempts exceeded for invalid user ubuntu from 8.210.178.40 port 36462 ssh2 [preauth]
Sep 30 21:56:04 compute-0 sshd-session[251361]: Disconnecting invalid user ubuntu 8.210.178.40 port 36462: Too many authentication failures [preauth]
Sep 30 21:56:04 compute-0 sshd-session[251361]: PAM 5 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40
Sep 30 21:56:04 compute-0 sshd-session[251361]: PAM service(sshd) ignoring max retries; 6 > 3
Sep 30 21:56:04 compute-0 nova_compute[192810]: 2025-09-30 21:56:04.803 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:56:05 compute-0 sshd-session[251497]: Invalid user ubuntu from 8.210.178.40 port 37086
Sep 30 21:56:05 compute-0 sshd-session[251497]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 21:56:05 compute-0 sshd-session[251497]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.210.178.40
Sep 30 21:56:07 compute-0 nova_compute[192810]: 2025-09-30 21:56:07.530 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:07 compute-0 nova_compute[192810]: 2025-09-30 21:56:07.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:56:07 compute-0 nova_compute[192810]: 2025-09-30 21:56:07.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:56:07 compute-0 nova_compute[192810]: 2025-09-30 21:56:07.787 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:56:08 compute-0 sshd-session[251497]: Failed password for invalid user ubuntu from 8.210.178.40 port 37086 ssh2
Sep 30 21:56:08 compute-0 nova_compute[192810]: 2025-09-30 21:56:08.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:09 compute-0 sshd-session[251497]: Connection closed by invalid user ubuntu 8.210.178.40 port 37086 [preauth]
Sep 30 21:56:10 compute-0 podman[251499]: 2025-09-30 21:56:10.322176369 +0000 UTC m=+0.058905974 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:56:10 compute-0 podman[251500]: 2025-09-30 21:56:10.352397146 +0000 UTC m=+0.087832248 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.openshift.expose-services=, vcs-type=git, container_name=openstack_network_exporter, name=ubi9-minimal, managed_by=edpm_ansible, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, maintainer=Red Hat, Inc., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Sep 30 21:56:12 compute-0 nova_compute[192810]: 2025-09-30 21:56:12.533 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:12 compute-0 nova_compute[192810]: 2025-09-30 21:56:12.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:56:12 compute-0 nova_compute[192810]: 2025-09-30 21:56:12.790 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:56:12 compute-0 nova_compute[192810]: 2025-09-30 21:56:12.790 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:56:13 compute-0 nova_compute[192810]: 2025-09-30 21:56:13.067 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "refresh_cache-0139492c-c61e-4cd8-b789-64e00c3d8ec8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:56:13 compute-0 nova_compute[192810]: 2025-09-30 21:56:13.067 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquired lock "refresh_cache-0139492c-c61e-4cd8-b789-64e00c3d8ec8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:56:13 compute-0 nova_compute[192810]: 2025-09-30 21:56:13.067 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Sep 30 21:56:13 compute-0 nova_compute[192810]: 2025-09-30 21:56:13.067 2 DEBUG nova.objects.instance [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lazy-loading 'info_cache' on Instance uuid 0139492c-c61e-4cd8-b789-64e00c3d8ec8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:56:13 compute-0 nova_compute[192810]: 2025-09-30 21:56:13.152 2 DEBUG nova.compute.manager [None req-4f2d4bac-3f73-4ff0-9700-b56d474f935d d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:56:13 compute-0 nova_compute[192810]: 2025-09-30 21:56:13.389 2 INFO nova.compute.manager [None req-4f2d4bac-3f73-4ff0-9700-b56d474f935d d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] instance snapshotting
Sep 30 21:56:13 compute-0 nova_compute[192810]: 2025-09-30 21:56:13.603 2 INFO nova.virt.libvirt.driver [None req-4f2d4bac-3f73-4ff0-9700-b56d474f935d d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Beginning live snapshot process
Sep 30 21:56:13 compute-0 nova_compute[192810]: 2025-09-30 21:56:13.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:13 compute-0 virtqemud[192233]: invalid argument: disk vda does not have an active block job
Sep 30 21:56:13 compute-0 nova_compute[192810]: 2025-09-30 21:56:13.787 2 DEBUG oslo_concurrency.processutils [None req-4f2d4bac-3f73-4ff0-9700-b56d474f935d d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0139492c-c61e-4cd8-b789-64e00c3d8ec8/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:56:13 compute-0 nova_compute[192810]: 2025-09-30 21:56:13.845 2 DEBUG oslo_concurrency.processutils [None req-4f2d4bac-3f73-4ff0-9700-b56d474f935d d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0139492c-c61e-4cd8-b789-64e00c3d8ec8/disk --force-share --output=json -f qcow2" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:56:13 compute-0 nova_compute[192810]: 2025-09-30 21:56:13.846 2 DEBUG oslo_concurrency.processutils [None req-4f2d4bac-3f73-4ff0-9700-b56d474f935d d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0139492c-c61e-4cd8-b789-64e00c3d8ec8/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:56:13 compute-0 nova_compute[192810]: 2025-09-30 21:56:13.899 2 DEBUG oslo_concurrency.processutils [None req-4f2d4bac-3f73-4ff0-9700-b56d474f935d d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0139492c-c61e-4cd8-b789-64e00c3d8ec8/disk --force-share --output=json -f qcow2" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:56:13 compute-0 nova_compute[192810]: 2025-09-30 21:56:13.912 2 DEBUG oslo_concurrency.processutils [None req-4f2d4bac-3f73-4ff0-9700-b56d474f935d d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/534805cc590496cbc5396fb5b9746d26209aed69 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:56:13 compute-0 nova_compute[192810]: 2025-09-30 21:56:13.971 2 DEBUG oslo_concurrency.processutils [None req-4f2d4bac-3f73-4ff0-9700-b56d474f935d d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/534805cc590496cbc5396fb5b9746d26209aed69 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:56:13 compute-0 nova_compute[192810]: 2025-09-30 21:56:13.972 2 DEBUG oslo_concurrency.processutils [None req-4f2d4bac-3f73-4ff0-9700-b56d474f935d d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/534805cc590496cbc5396fb5b9746d26209aed69,backing_fmt=raw /var/lib/nova/instances/snapshots/tmphnrf1h6j/a700b38fd85b4d64b129ca19c2891524.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:56:14 compute-0 nova_compute[192810]: 2025-09-30 21:56:14.010 2 DEBUG oslo_concurrency.processutils [None req-4f2d4bac-3f73-4ff0-9700-b56d474f935d d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/534805cc590496cbc5396fb5b9746d26209aed69,backing_fmt=raw /var/lib/nova/instances/snapshots/tmphnrf1h6j/a700b38fd85b4d64b129ca19c2891524.delta 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:56:14 compute-0 nova_compute[192810]: 2025-09-30 21:56:14.011 2 INFO nova.virt.libvirt.driver [None req-4f2d4bac-3f73-4ff0-9700-b56d474f935d d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Quiescing instance not available: QEMU guest agent is not enabled.
Sep 30 21:56:14 compute-0 nova_compute[192810]: 2025-09-30 21:56:14.066 2 DEBUG nova.virt.libvirt.guest [None req-4f2d4bac-3f73-4ff0-9700-b56d474f935d d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] COPY block job progress, current cursor: 0 final cursor: 1048576 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Sep 30 21:56:14 compute-0 nova_compute[192810]: 2025-09-30 21:56:14.570 2 DEBUG nova.virt.libvirt.guest [None req-4f2d4bac-3f73-4ff0-9700-b56d474f935d d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] COPY block job progress, current cursor: 1048576 final cursor: 1048576 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Sep 30 21:56:14 compute-0 nova_compute[192810]: 2025-09-30 21:56:14.573 2 INFO nova.virt.libvirt.driver [None req-4f2d4bac-3f73-4ff0-9700-b56d474f935d d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Skipping quiescing instance: QEMU guest agent is not enabled.
Sep 30 21:56:14 compute-0 nova_compute[192810]: 2025-09-30 21:56:14.612 2 DEBUG nova.privsep.utils [None req-4f2d4bac-3f73-4ff0-9700-b56d474f935d d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Sep 30 21:56:14 compute-0 nova_compute[192810]: 2025-09-30 21:56:14.612 2 DEBUG oslo_concurrency.processutils [None req-4f2d4bac-3f73-4ff0-9700-b56d474f935d d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmphnrf1h6j/a700b38fd85b4d64b129ca19c2891524.delta /var/lib/nova/instances/snapshots/tmphnrf1h6j/a700b38fd85b4d64b129ca19c2891524 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 21:56:15 compute-0 nova_compute[192810]: 2025-09-30 21:56:15.103 2 DEBUG oslo_concurrency.processutils [None req-4f2d4bac-3f73-4ff0-9700-b56d474f935d d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmphnrf1h6j/a700b38fd85b4d64b129ca19c2891524.delta /var/lib/nova/instances/snapshots/tmphnrf1h6j/a700b38fd85b4d64b129ca19c2891524" returned: 0 in 0.491s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 21:56:15 compute-0 nova_compute[192810]: 2025-09-30 21:56:15.105 2 INFO nova.virt.libvirt.driver [None req-4f2d4bac-3f73-4ff0-9700-b56d474f935d d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Snapshot extracted, beginning image upload
Sep 30 21:56:15 compute-0 nova_compute[192810]: 2025-09-30 21:56:15.113 2 DEBUG nova.network.neutron [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Updating instance_info_cache with network_info: [{"id": "4427a885-e447-4fc0-92ac-c64a4d480c16", "address": "fa:16:3e:cf:d8:9d", "network": {"id": "36195885-e54a-4c05-b721-98be98333841", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-292865923-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.227", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c2e514e7322435988a7f3bf398623e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4427a885-e4", "ovs_interfaceid": "4427a885-e447-4fc0-92ac-c64a4d480c16", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:56:15 compute-0 nova_compute[192810]: 2025-09-30 21:56:15.150 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Releasing lock "refresh_cache-0139492c-c61e-4cd8-b789-64e00c3d8ec8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:56:15 compute-0 nova_compute[192810]: 2025-09-30 21:56:15.150 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Sep 30 21:56:15 compute-0 nova_compute[192810]: 2025-09-30 21:56:15.151 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:56:17 compute-0 podman[251572]: 2025-09-30 21:56:17.310615988 +0000 UTC m=+0.053003715 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Sep 30 21:56:17 compute-0 podman[251573]: 2025-09-30 21:56:17.322319565 +0000 UTC m=+0.061849010 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 21:56:17 compute-0 podman[251574]: 2025-09-30 21:56:17.322353565 +0000 UTC m=+0.053644961 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 21:56:17 compute-0 nova_compute[192810]: 2025-09-30 21:56:17.536 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:17 compute-0 nova_compute[192810]: 2025-09-30 21:56:17.835 2 INFO nova.virt.libvirt.driver [None req-4f2d4bac-3f73-4ff0-9700-b56d474f935d d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Snapshot image upload complete
Sep 30 21:56:17 compute-0 nova_compute[192810]: 2025-09-30 21:56:17.835 2 INFO nova.compute.manager [None req-4f2d4bac-3f73-4ff0-9700-b56d474f935d d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Took 4.43 seconds to snapshot the instance on the hypervisor.
Sep 30 21:56:18 compute-0 nova_compute[192810]: 2025-09-30 21:56:18.768 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:18 compute-0 nova_compute[192810]: 2025-09-30 21:56:18.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:56:18 compute-0 nova_compute[192810]: 2025-09-30 21:56:18.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:56:18 compute-0 nova_compute[192810]: 2025-09-30 21:56:18.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:56:19 compute-0 nova_compute[192810]: 2025-09-30 21:56:19.321 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:19 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:56:19.325 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=47, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=46) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:56:19 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:56:19.326 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:56:19 compute-0 nova_compute[192810]: 2025-09-30 21:56:19.649 2 DEBUG oslo_concurrency.lockutils [None req-a5184cfe-0633-492a-828b-4140eed10797 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Acquiring lock "0139492c-c61e-4cd8-b789-64e00c3d8ec8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:56:19 compute-0 nova_compute[192810]: 2025-09-30 21:56:19.650 2 DEBUG oslo_concurrency.lockutils [None req-a5184cfe-0633-492a-828b-4140eed10797 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Lock "0139492c-c61e-4cd8-b789-64e00c3d8ec8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:56:19 compute-0 nova_compute[192810]: 2025-09-30 21:56:19.650 2 DEBUG oslo_concurrency.lockutils [None req-a5184cfe-0633-492a-828b-4140eed10797 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Acquiring lock "0139492c-c61e-4cd8-b789-64e00c3d8ec8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:56:19 compute-0 nova_compute[192810]: 2025-09-30 21:56:19.651 2 DEBUG oslo_concurrency.lockutils [None req-a5184cfe-0633-492a-828b-4140eed10797 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Lock "0139492c-c61e-4cd8-b789-64e00c3d8ec8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:56:19 compute-0 nova_compute[192810]: 2025-09-30 21:56:19.651 2 DEBUG oslo_concurrency.lockutils [None req-a5184cfe-0633-492a-828b-4140eed10797 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Lock "0139492c-c61e-4cd8-b789-64e00c3d8ec8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:56:19 compute-0 nova_compute[192810]: 2025-09-30 21:56:19.661 2 INFO nova.compute.manager [None req-a5184cfe-0633-492a-828b-4140eed10797 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Terminating instance
Sep 30 21:56:19 compute-0 nova_compute[192810]: 2025-09-30 21:56:19.670 2 DEBUG nova.compute.manager [None req-a5184cfe-0633-492a-828b-4140eed10797 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Sep 30 21:56:19 compute-0 kernel: tap4427a885-e4 (unregistering): left promiscuous mode
Sep 30 21:56:19 compute-0 NetworkManager[51733]: <info>  [1759269379.6913] device (tap4427a885-e4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 21:56:19 compute-0 nova_compute[192810]: 2025-09-30 21:56:19.698 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:19 compute-0 ovn_controller[94912]: 2025-09-30T21:56:19Z|00775|binding|INFO|Releasing lport 4427a885-e447-4fc0-92ac-c64a4d480c16 from this chassis (sb_readonly=0)
Sep 30 21:56:19 compute-0 ovn_controller[94912]: 2025-09-30T21:56:19Z|00776|binding|INFO|Setting lport 4427a885-e447-4fc0-92ac-c64a4d480c16 down in Southbound
Sep 30 21:56:19 compute-0 ovn_controller[94912]: 2025-09-30T21:56:19Z|00777|binding|INFO|Removing iface tap4427a885-e4 ovn-installed in OVS
Sep 30 21:56:19 compute-0 nova_compute[192810]: 2025-09-30 21:56:19.701 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:19 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:56:19.706 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cf:d8:9d 10.100.0.14'], port_security=['fa:16:3e:cf:d8:9d 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '0139492c-c61e-4cd8-b789-64e00c3d8ec8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-36195885-e54a-4c05-b721-98be98333841', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2c2e514e7322435988a7f3bf398623e4', 'neutron:revision_number': '4', 'neutron:security_group_ids': '98fe3b07-2473-4235-becf-2f443e01bab9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1d62c416-32f7-4ff1-bc52-93428ed2b707, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>], logical_port=4427a885-e447-4fc0-92ac-c64a4d480c16) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe0bd716670>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:56:19 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:56:19.707 103867 INFO neutron.agent.ovn.metadata.agent [-] Port 4427a885-e447-4fc0-92ac-c64a4d480c16 in datapath 36195885-e54a-4c05-b721-98be98333841 unbound from our chassis
Sep 30 21:56:19 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:56:19.708 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 36195885-e54a-4c05-b721-98be98333841, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:56:19 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:56:19.710 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[1ecadf03-019e-401f-9139-c0ea6b54cb9b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:56:19 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:56:19.710 103867 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-36195885-e54a-4c05-b721-98be98333841 namespace which is not needed anymore
Sep 30 21:56:19 compute-0 nova_compute[192810]: 2025-09-30 21:56:19.717 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:19 compute-0 systemd[1]: machine-qemu\x2d91\x2dinstance\x2d000000b9.scope: Deactivated successfully.
Sep 30 21:56:19 compute-0 systemd[1]: machine-qemu\x2d91\x2dinstance\x2d000000b9.scope: Consumed 13.867s CPU time.
Sep 30 21:56:19 compute-0 nova_compute[192810]: 2025-09-30 21:56:19.760 2 DEBUG nova.compute.manager [req-e16417a9-16d4-428d-988a-2a5e768f0459 req-12872110-ba8b-4f40-bccc-ceef8a514ddc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Received event network-changed-4427a885-e447-4fc0-92ac-c64a4d480c16 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:56:19 compute-0 nova_compute[192810]: 2025-09-30 21:56:19.761 2 DEBUG nova.compute.manager [req-e16417a9-16d4-428d-988a-2a5e768f0459 req-12872110-ba8b-4f40-bccc-ceef8a514ddc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Refreshing instance network info cache due to event network-changed-4427a885-e447-4fc0-92ac-c64a4d480c16. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Sep 30 21:56:19 compute-0 nova_compute[192810]: 2025-09-30 21:56:19.761 2 DEBUG oslo_concurrency.lockutils [req-e16417a9-16d4-428d-988a-2a5e768f0459 req-12872110-ba8b-4f40-bccc-ceef8a514ddc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "refresh_cache-0139492c-c61e-4cd8-b789-64e00c3d8ec8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Sep 30 21:56:19 compute-0 nova_compute[192810]: 2025-09-30 21:56:19.762 2 DEBUG oslo_concurrency.lockutils [req-e16417a9-16d4-428d-988a-2a5e768f0459 req-12872110-ba8b-4f40-bccc-ceef8a514ddc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquired lock "refresh_cache-0139492c-c61e-4cd8-b789-64e00c3d8ec8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Sep 30 21:56:19 compute-0 systemd-machined[152794]: Machine qemu-91-instance-000000b9 terminated.
Sep 30 21:56:19 compute-0 nova_compute[192810]: 2025-09-30 21:56:19.762 2 DEBUG nova.network.neutron [req-e16417a9-16d4-428d-988a-2a5e768f0459 req-12872110-ba8b-4f40-bccc-ceef8a514ddc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Refreshing network info cache for port 4427a885-e447-4fc0-92ac-c64a4d480c16 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Sep 30 21:56:19 compute-0 neutron-haproxy-ovnmeta-36195885-e54a-4c05-b721-98be98333841[251300]: [NOTICE]   (251304) : haproxy version is 2.8.14-c23fe91
Sep 30 21:56:19 compute-0 neutron-haproxy-ovnmeta-36195885-e54a-4c05-b721-98be98333841[251300]: [NOTICE]   (251304) : path to executable is /usr/sbin/haproxy
Sep 30 21:56:19 compute-0 neutron-haproxy-ovnmeta-36195885-e54a-4c05-b721-98be98333841[251300]: [WARNING]  (251304) : Exiting Master process...
Sep 30 21:56:19 compute-0 neutron-haproxy-ovnmeta-36195885-e54a-4c05-b721-98be98333841[251300]: [ALERT]    (251304) : Current worker (251306) exited with code 143 (Terminated)
Sep 30 21:56:19 compute-0 neutron-haproxy-ovnmeta-36195885-e54a-4c05-b721-98be98333841[251300]: [WARNING]  (251304) : All workers exited. Exiting... (0)
Sep 30 21:56:19 compute-0 systemd[1]: libpod-d7d0ea26832e146211d38cbce3b4caeacc5da7163bb8c78364b50be7cd8068c8.scope: Deactivated successfully.
Sep 30 21:56:19 compute-0 podman[251661]: 2025-09-30 21:56:19.849046951 +0000 UTC m=+0.043804612 container died d7d0ea26832e146211d38cbce3b4caeacc5da7163bb8c78364b50be7cd8068c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-36195885-e54a-4c05-b721-98be98333841, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:56:19 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d7d0ea26832e146211d38cbce3b4caeacc5da7163bb8c78364b50be7cd8068c8-userdata-shm.mount: Deactivated successfully.
Sep 30 21:56:19 compute-0 systemd[1]: var-lib-containers-storage-overlay-02de7b089857629fed75727bad4e0cbf996ab16b57a2ab08ddd1abe9ce02cc39-merged.mount: Deactivated successfully.
Sep 30 21:56:19 compute-0 nova_compute[192810]: 2025-09-30 21:56:19.897 2 DEBUG nova.compute.manager [req-6d49ddc5-cb2f-440d-a066-65eb658f999c req-d4d496e5-ba22-47de-8361-2005f7420d72 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Received event network-vif-unplugged-4427a885-e447-4fc0-92ac-c64a4d480c16 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:56:19 compute-0 nova_compute[192810]: 2025-09-30 21:56:19.898 2 DEBUG oslo_concurrency.lockutils [req-6d49ddc5-cb2f-440d-a066-65eb658f999c req-d4d496e5-ba22-47de-8361-2005f7420d72 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "0139492c-c61e-4cd8-b789-64e00c3d8ec8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:56:19 compute-0 podman[251661]: 2025-09-30 21:56:19.898402082 +0000 UTC m=+0.093159743 container cleanup d7d0ea26832e146211d38cbce3b4caeacc5da7163bb8c78364b50be7cd8068c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-36195885-e54a-4c05-b721-98be98333841, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Sep 30 21:56:19 compute-0 nova_compute[192810]: 2025-09-30 21:56:19.898 2 DEBUG oslo_concurrency.lockutils [req-6d49ddc5-cb2f-440d-a066-65eb658f999c req-d4d496e5-ba22-47de-8361-2005f7420d72 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "0139492c-c61e-4cd8-b789-64e00c3d8ec8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:56:19 compute-0 nova_compute[192810]: 2025-09-30 21:56:19.898 2 DEBUG oslo_concurrency.lockutils [req-6d49ddc5-cb2f-440d-a066-65eb658f999c req-d4d496e5-ba22-47de-8361-2005f7420d72 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "0139492c-c61e-4cd8-b789-64e00c3d8ec8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:56:19 compute-0 nova_compute[192810]: 2025-09-30 21:56:19.898 2 DEBUG nova.compute.manager [req-6d49ddc5-cb2f-440d-a066-65eb658f999c req-d4d496e5-ba22-47de-8361-2005f7420d72 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] No waiting events found dispatching network-vif-unplugged-4427a885-e447-4fc0-92ac-c64a4d480c16 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:56:19 compute-0 nova_compute[192810]: 2025-09-30 21:56:19.899 2 DEBUG nova.compute.manager [req-6d49ddc5-cb2f-440d-a066-65eb658f999c req-d4d496e5-ba22-47de-8361-2005f7420d72 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Received event network-vif-unplugged-4427a885-e447-4fc0-92ac-c64a4d480c16 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Sep 30 21:56:19 compute-0 systemd[1]: libpod-conmon-d7d0ea26832e146211d38cbce3b4caeacc5da7163bb8c78364b50be7cd8068c8.scope: Deactivated successfully.
Sep 30 21:56:19 compute-0 nova_compute[192810]: 2025-09-30 21:56:19.929 2 INFO nova.virt.libvirt.driver [-] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Instance destroyed successfully.
Sep 30 21:56:19 compute-0 nova_compute[192810]: 2025-09-30 21:56:19.930 2 DEBUG nova.objects.instance [None req-a5184cfe-0633-492a-828b-4140eed10797 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Lazy-loading 'resources' on Instance uuid 0139492c-c61e-4cd8-b789-64e00c3d8ec8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Sep 30 21:56:19 compute-0 podman[251703]: 2025-09-30 21:56:19.966786476 +0000 UTC m=+0.042704404 container remove d7d0ea26832e146211d38cbce3b4caeacc5da7163bb8c78364b50be7cd8068c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-36195885-e54a-4c05-b721-98be98333841, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20250923)
Sep 30 21:56:19 compute-0 nova_compute[192810]: 2025-09-30 21:56:19.966 2 DEBUG nova.virt.libvirt.vif [None req-a5184cfe-0633-492a-828b-4140eed10797 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-09-30T21:55:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-1448730658',display_name='tempest-TestSnapshotPattern-server-1448730658',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-1448730658',id=185,image_ref='0dfd446c-9899-4202-86e2-c09f7b33b375',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLKL8WwnYuJEP3XpY/ju5SXa+fZd+0s4ElE/Ammc3JO3wP15Y53TJ4QGSyyMbttI4T5Fjj/YGgDR1amj6cHQX5O4wQ/GeWnvDWjS/d7Zz3S4MDwj0ljVzMOx5HSsjDMRAA==',key_name='tempest-TestSnapshotPattern-42320000',keypairs=<?>,launch_index=0,launched_at=2025-09-30T21:55:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2c2e514e7322435988a7f3bf398623e4',ramdisk_id='',reservation_id='r-fn3p0bkg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='86b6907c-d747-4e98-8897-42105915831d',image_boot_roles='reader,member',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='70bc4ef2-d80e-49af-b20a-6240f762b81e',image_min_disk='1',image_min_ram='0',image_owner_id='2c2e514e7322435988a7f3bf398623e4',image_owner_project_name='tempest-TestSnapshotPattern-1968938915',image_owner_user_name='tempest-TestSnapshotPattern-1968938915-project-member',image_user_id='d6d7afba807d47549781e37178a01774',image_version='8.0',owner_project_name='tempest-TestSnapshotPattern-1968938915',owner_user_name='tempest-TestSnapshotPattern-1968938915-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T21:56:17Z,user_data=None,user_id='d6d7afba807d47549781e37178a01774',uuid=0139492c-c61e-4cd8-b789-64e00c3d8ec8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4427a885-e447-4fc0-92ac-c64a4d480c16", "address": "fa:16:3e:cf:d8:9d", "network": {"id": "36195885-e54a-4c05-b721-98be98333841", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-292865923-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.227", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c2e514e7322435988a7f3bf398623e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4427a885-e4", "ovs_interfaceid": "4427a885-e447-4fc0-92ac-c64a4d480c16", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Sep 30 21:56:19 compute-0 nova_compute[192810]: 2025-09-30 21:56:19.967 2 DEBUG nova.network.os_vif_util [None req-a5184cfe-0633-492a-828b-4140eed10797 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Converting VIF {"id": "4427a885-e447-4fc0-92ac-c64a4d480c16", "address": "fa:16:3e:cf:d8:9d", "network": {"id": "36195885-e54a-4c05-b721-98be98333841", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-292865923-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.227", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c2e514e7322435988a7f3bf398623e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4427a885-e4", "ovs_interfaceid": "4427a885-e447-4fc0-92ac-c64a4d480c16", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Sep 30 21:56:19 compute-0 nova_compute[192810]: 2025-09-30 21:56:19.968 2 DEBUG nova.network.os_vif_util [None req-a5184cfe-0633-492a-828b-4140eed10797 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:cf:d8:9d,bridge_name='br-int',has_traffic_filtering=True,id=4427a885-e447-4fc0-92ac-c64a4d480c16,network=Network(36195885-e54a-4c05-b721-98be98333841),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4427a885-e4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Sep 30 21:56:19 compute-0 nova_compute[192810]: 2025-09-30 21:56:19.968 2 DEBUG os_vif [None req-a5184cfe-0633-492a-828b-4140eed10797 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:cf:d8:9d,bridge_name='br-int',has_traffic_filtering=True,id=4427a885-e447-4fc0-92ac-c64a4d480c16,network=Network(36195885-e54a-4c05-b721-98be98333841),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4427a885-e4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Sep 30 21:56:19 compute-0 nova_compute[192810]: 2025-09-30 21:56:19.971 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:19 compute-0 nova_compute[192810]: 2025-09-30 21:56:19.971 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4427a885-e4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:56:19 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:56:19.971 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[9fe39370-ab0a-4794-8096-333966d1f139]: (4, ('Tue Sep 30 09:56:19 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-36195885-e54a-4c05-b721-98be98333841 (d7d0ea26832e146211d38cbce3b4caeacc5da7163bb8c78364b50be7cd8068c8)\nd7d0ea26832e146211d38cbce3b4caeacc5da7163bb8c78364b50be7cd8068c8\nTue Sep 30 09:56:19 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-36195885-e54a-4c05-b721-98be98333841 (d7d0ea26832e146211d38cbce3b4caeacc5da7163bb8c78364b50be7cd8068c8)\nd7d0ea26832e146211d38cbce3b4caeacc5da7163bb8c78364b50be7cd8068c8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:56:19 compute-0 nova_compute[192810]: 2025-09-30 21:56:19.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:19 compute-0 nova_compute[192810]: 2025-09-30 21:56:19.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:19 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:56:19.974 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[eeafc783-f46e-49a5-9b76-61330df97ff0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:56:19 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:56:19.975 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap36195885-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:56:19 compute-0 nova_compute[192810]: 2025-09-30 21:56:19.976 2 INFO os_vif [None req-a5184cfe-0633-492a-828b-4140eed10797 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:cf:d8:9d,bridge_name='br-int',has_traffic_filtering=True,id=4427a885-e447-4fc0-92ac-c64a4d480c16,network=Network(36195885-e54a-4c05-b721-98be98333841),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4427a885-e4')
Sep 30 21:56:19 compute-0 nova_compute[192810]: 2025-09-30 21:56:19.976 2 INFO nova.virt.libvirt.driver [None req-a5184cfe-0633-492a-828b-4140eed10797 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Deleting instance files /var/lib/nova/instances/0139492c-c61e-4cd8-b789-64e00c3d8ec8_del
Sep 30 21:56:19 compute-0 kernel: tap36195885-e0: left promiscuous mode
Sep 30 21:56:19 compute-0 nova_compute[192810]: 2025-09-30 21:56:19.978 2 INFO nova.virt.libvirt.driver [None req-a5184cfe-0633-492a-828b-4140eed10797 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Deletion of /var/lib/nova/instances/0139492c-c61e-4cd8-b789-64e00c3d8ec8_del complete
Sep 30 21:56:19 compute-0 nova_compute[192810]: 2025-09-30 21:56:19.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:19 compute-0 nova_compute[192810]: 2025-09-30 21:56:19.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:19 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:56:19.990 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[95be4886-4e87-4313-a83d-acd96bdab356]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:56:20 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:56:20.022 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[0119af10-c912-4358-82ae-c07b0e29b063]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:56:20 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:56:20.023 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[b8d6b930-2fe6-4778-b907-0fd61e1b7971]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:56:20 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:56:20.038 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[7cb13df9-a01f-4fc1-8653-6ff37677babf]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 602928, 'reachable_time': 26118, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 251723, 'error': None, 'target': 'ovnmeta-36195885-e54a-4c05-b721-98be98333841', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:56:20 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:56:20.040 103980 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-36195885-e54a-4c05-b721-98be98333841 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Sep 30 21:56:20 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:56:20.041 103980 DEBUG oslo.privsep.daemon [-] privsep: reply[83dfb7b2-d8ad-4683-a2fe-131f08adbb78]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:56:20 compute-0 systemd[1]: run-netns-ovnmeta\x2d36195885\x2de54a\x2d4c05\x2db721\x2d98be98333841.mount: Deactivated successfully.
Sep 30 21:56:20 compute-0 nova_compute[192810]: 2025-09-30 21:56:20.049 2 INFO nova.compute.manager [None req-a5184cfe-0633-492a-828b-4140eed10797 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Took 0.38 seconds to destroy the instance on the hypervisor.
Sep 30 21:56:20 compute-0 nova_compute[192810]: 2025-09-30 21:56:20.050 2 DEBUG oslo.service.loopingcall [None req-a5184cfe-0633-492a-828b-4140eed10797 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Sep 30 21:56:20 compute-0 nova_compute[192810]: 2025-09-30 21:56:20.050 2 DEBUG nova.compute.manager [-] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Sep 30 21:56:20 compute-0 nova_compute[192810]: 2025-09-30 21:56:20.050 2 DEBUG nova.network.neutron [-] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Sep 30 21:56:20 compute-0 nova_compute[192810]: 2025-09-30 21:56:20.555 2 DEBUG nova.network.neutron [-] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:56:20 compute-0 nova_compute[192810]: 2025-09-30 21:56:20.572 2 INFO nova.compute.manager [-] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Took 0.52 seconds to deallocate network for instance.
Sep 30 21:56:20 compute-0 nova_compute[192810]: 2025-09-30 21:56:20.648 2 DEBUG oslo_concurrency.lockutils [None req-a5184cfe-0633-492a-828b-4140eed10797 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:56:20 compute-0 nova_compute[192810]: 2025-09-30 21:56:20.648 2 DEBUG oslo_concurrency.lockutils [None req-a5184cfe-0633-492a-828b-4140eed10797 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:56:20 compute-0 nova_compute[192810]: 2025-09-30 21:56:20.783 2 DEBUG nova.compute.provider_tree [None req-a5184cfe-0633-492a-828b-4140eed10797 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:56:20 compute-0 nova_compute[192810]: 2025-09-30 21:56:20.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:56:20 compute-0 nova_compute[192810]: 2025-09-30 21:56:20.813 2 DEBUG nova.scheduler.client.report [None req-a5184cfe-0633-492a-828b-4140eed10797 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:56:20 compute-0 nova_compute[192810]: 2025-09-30 21:56:20.819 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:56:20 compute-0 nova_compute[192810]: 2025-09-30 21:56:20.837 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:56:20 compute-0 nova_compute[192810]: 2025-09-30 21:56:20.838 2 DEBUG oslo_concurrency.lockutils [None req-a5184cfe-0633-492a-828b-4140eed10797 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.190s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:56:20 compute-0 nova_compute[192810]: 2025-09-30 21:56:20.841 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:56:20 compute-0 nova_compute[192810]: 2025-09-30 21:56:20.841 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:56:20 compute-0 nova_compute[192810]: 2025-09-30 21:56:20.841 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:56:20 compute-0 nova_compute[192810]: 2025-09-30 21:56:20.872 2 INFO nova.scheduler.client.report [None req-a5184cfe-0633-492a-828b-4140eed10797 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Deleted allocations for instance 0139492c-c61e-4cd8-b789-64e00c3d8ec8
Sep 30 21:56:20 compute-0 nova_compute[192810]: 2025-09-30 21:56:20.940 2 DEBUG nova.network.neutron [req-e16417a9-16d4-428d-988a-2a5e768f0459 req-12872110-ba8b-4f40-bccc-ceef8a514ddc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Updated VIF entry in instance network info cache for port 4427a885-e447-4fc0-92ac-c64a4d480c16. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Sep 30 21:56:20 compute-0 nova_compute[192810]: 2025-09-30 21:56:20.941 2 DEBUG nova.network.neutron [req-e16417a9-16d4-428d-988a-2a5e768f0459 req-12872110-ba8b-4f40-bccc-ceef8a514ddc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Updating instance_info_cache with network_info: [{"id": "4427a885-e447-4fc0-92ac-c64a4d480c16", "address": "fa:16:3e:cf:d8:9d", "network": {"id": "36195885-e54a-4c05-b721-98be98333841", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-292865923-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2c2e514e7322435988a7f3bf398623e4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4427a885-e4", "ovs_interfaceid": "4427a885-e447-4fc0-92ac-c64a4d480c16", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Sep 30 21:56:20 compute-0 nova_compute[192810]: 2025-09-30 21:56:20.959 2 DEBUG oslo_concurrency.lockutils [None req-a5184cfe-0633-492a-828b-4140eed10797 d6d7afba807d47549781e37178a01774 2c2e514e7322435988a7f3bf398623e4 - - default default] Lock "0139492c-c61e-4cd8-b789-64e00c3d8ec8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.309s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:56:20 compute-0 nova_compute[192810]: 2025-09-30 21:56:20.963 2 DEBUG oslo_concurrency.lockutils [req-e16417a9-16d4-428d-988a-2a5e768f0459 req-12872110-ba8b-4f40-bccc-ceef8a514ddc dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Releasing lock "refresh_cache-0139492c-c61e-4cd8-b789-64e00c3d8ec8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Sep 30 21:56:21 compute-0 nova_compute[192810]: 2025-09-30 21:56:21.035 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:56:21 compute-0 nova_compute[192810]: 2025-09-30 21:56:21.036 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5676MB free_disk=73.09423446655273GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:56:21 compute-0 nova_compute[192810]: 2025-09-30 21:56:21.036 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:56:21 compute-0 nova_compute[192810]: 2025-09-30 21:56:21.037 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:56:21 compute-0 nova_compute[192810]: 2025-09-30 21:56:21.093 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:56:21 compute-0 nova_compute[192810]: 2025-09-30 21:56:21.094 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:56:21 compute-0 nova_compute[192810]: 2025-09-30 21:56:21.122 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:56:21 compute-0 nova_compute[192810]: 2025-09-30 21:56:21.146 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:56:21 compute-0 nova_compute[192810]: 2025-09-30 21:56:21.173 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:56:21 compute-0 nova_compute[192810]: 2025-09-30 21:56:21.173 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.137s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:56:21 compute-0 nova_compute[192810]: 2025-09-30 21:56:21.880 2 DEBUG nova.compute.manager [req-24e40c7a-22a3-4259-b483-90f160e026ce req-bc5d1f57-4af3-4c72-b962-1e034ef28a24 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Received event network-vif-deleted-4427a885-e447-4fc0-92ac-c64a4d480c16 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:56:21 compute-0 nova_compute[192810]: 2025-09-30 21:56:21.980 2 DEBUG nova.compute.manager [req-eeaab9d3-2d71-4d68-85ea-30950c1fd390 req-af9ce950-131e-4d19-afab-2f1862c3d5f6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Received event network-vif-plugged-4427a885-e447-4fc0-92ac-c64a4d480c16 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Sep 30 21:56:21 compute-0 nova_compute[192810]: 2025-09-30 21:56:21.980 2 DEBUG oslo_concurrency.lockutils [req-eeaab9d3-2d71-4d68-85ea-30950c1fd390 req-af9ce950-131e-4d19-afab-2f1862c3d5f6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Acquiring lock "0139492c-c61e-4cd8-b789-64e00c3d8ec8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:56:21 compute-0 nova_compute[192810]: 2025-09-30 21:56:21.981 2 DEBUG oslo_concurrency.lockutils [req-eeaab9d3-2d71-4d68-85ea-30950c1fd390 req-af9ce950-131e-4d19-afab-2f1862c3d5f6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "0139492c-c61e-4cd8-b789-64e00c3d8ec8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:56:21 compute-0 nova_compute[192810]: 2025-09-30 21:56:21.981 2 DEBUG oslo_concurrency.lockutils [req-eeaab9d3-2d71-4d68-85ea-30950c1fd390 req-af9ce950-131e-4d19-afab-2f1862c3d5f6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] Lock "0139492c-c61e-4cd8-b789-64e00c3d8ec8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:56:21 compute-0 nova_compute[192810]: 2025-09-30 21:56:21.981 2 DEBUG nova.compute.manager [req-eeaab9d3-2d71-4d68-85ea-30950c1fd390 req-af9ce950-131e-4d19-afab-2f1862c3d5f6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] No waiting events found dispatching network-vif-plugged-4427a885-e447-4fc0-92ac-c64a4d480c16 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Sep 30 21:56:21 compute-0 nova_compute[192810]: 2025-09-30 21:56:21.981 2 WARNING nova.compute.manager [req-eeaab9d3-2d71-4d68-85ea-30950c1fd390 req-af9ce950-131e-4d19-afab-2f1862c3d5f6 dfb6e89f7aaa48df8c2e7a808a873ead f1a1b5d7ca4a475685fb4155a461748d - - default default] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Received unexpected event network-vif-plugged-4427a885-e447-4fc0-92ac-c64a4d480c16 for instance with vm_state deleted and task_state None.
Sep 30 21:56:22 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:56:22.327 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '47'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:56:23 compute-0 nova_compute[192810]: 2025-09-30 21:56:23.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:24 compute-0 nova_compute[192810]: 2025-09-30 21:56:24.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:28 compute-0 nova_compute[192810]: 2025-09-30 21:56:28.772 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:29 compute-0 nova_compute[192810]: 2025-09-30 21:56:29.975 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:31 compute-0 podman[251726]: 2025-09-30 21:56:31.318619337 +0000 UTC m=+0.056130364 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:56:31 compute-0 podman[251727]: 2025-09-30 21:56:31.326780224 +0000 UTC m=+0.060042083 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:56:31 compute-0 podman[251725]: 2025-09-30 21:56:31.355269967 +0000 UTC m=+0.094124708 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:56:33 compute-0 nova_compute[192810]: 2025-09-30 21:56:33.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:33 compute-0 nova_compute[192810]: 2025-09-30 21:56:33.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:33 compute-0 nova_compute[192810]: 2025-09-30 21:56:33.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:34 compute-0 nova_compute[192810]: 2025-09-30 21:56:34.929 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759269379.9279404, 0139492c-c61e-4cd8-b789-64e00c3d8ec8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Sep 30 21:56:34 compute-0 nova_compute[192810]: 2025-09-30 21:56:34.929 2 INFO nova.compute.manager [-] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] VM Stopped (Lifecycle Event)
Sep 30 21:56:34 compute-0 nova_compute[192810]: 2025-09-30 21:56:34.951 2 DEBUG nova.compute.manager [None req-f603908e-5635-477c-a950-b0f22e04c58c - - - - - -] [instance: 0139492c-c61e-4cd8-b789-64e00c3d8ec8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Sep 30 21:56:34 compute-0 nova_compute[192810]: 2025-09-30 21:56:34.978 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:56:38.761 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:56:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:56:38.762 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:56:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:56:38.762 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:56:38 compute-0 nova_compute[192810]: 2025-09-30 21:56:38.774 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:39 compute-0 nova_compute[192810]: 2025-09-30 21:56:39.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:41 compute-0 podman[251791]: 2025-09-30 21:56:41.313404671 +0000 UTC m=+0.055075057 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:56:41 compute-0 podman[251792]: 2025-09-30 21:56:41.36187829 +0000 UTC m=+0.088017772 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, config_id=edpm, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, io.openshift.expose-services=, distribution-scope=public, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, vcs-type=git)
Sep 30 21:56:43 compute-0 nova_compute[192810]: 2025-09-30 21:56:43.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:56:43.912 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:56:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:56:43.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:56:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:56:43.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:56:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:56:43.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:56:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:56:43.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:56:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:56:43.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:56:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:56:43.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:56:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:56:43.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:56:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:56:43.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:56:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:56:43.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:56:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:56:43.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:56:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:56:43.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:56:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:56:43.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:56:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:56:43.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:56:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:56:43.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:56:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:56:43.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:56:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:56:43.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:56:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:56:43.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:56:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:56:43.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:56:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:56:43.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:56:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:56:43.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:56:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:56:43.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:56:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:56:43.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:56:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:56:43.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:56:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:56:43.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:56:44 compute-0 nova_compute[192810]: 2025-09-30 21:56:44.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:48 compute-0 podman[251842]: 2025-09-30 21:56:48.314731307 +0000 UTC m=+0.045957086 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:56:48 compute-0 podman[251840]: 2025-09-30 21:56:48.315187998 +0000 UTC m=+0.055181740 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 21:56:48 compute-0 podman[251841]: 2025-09-30 21:56:48.321445207 +0000 UTC m=+0.056070873 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=iscsid)
Sep 30 21:56:48 compute-0 nova_compute[192810]: 2025-09-30 21:56:48.778 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:49 compute-0 nova_compute[192810]: 2025-09-30 21:56:49.983 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:53 compute-0 nova_compute[192810]: 2025-09-30 21:56:53.780 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:55 compute-0 nova_compute[192810]: 2025-09-30 21:56:55.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:56:58 compute-0 nova_compute[192810]: 2025-09-30 21:56:58.792 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:00 compute-0 nova_compute[192810]: 2025-09-30 21:57:00.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:01 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:57:01.858 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f3:df:1d 10.100.0.2 2001:db8::f816:3eff:fef3:df1d'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fef3:df1d/64', 'neutron:device_id': 'ovnmeta-b5e8390b-42ff-40d7-bb46-05b4a7f0a027', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b5e8390b-42ff-40d7-bb46-05b4a7f0a027', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=14f43e9d-ff95-45ca-8ef3-d794e65d228a, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=54acfaa1-75af-46fc-b755-5aabbfb79138) old=Port_Binding(mac=['fa:16:3e:f3:df:1d 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-b5e8390b-42ff-40d7-bb46-05b4a7f0a027', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b5e8390b-42ff-40d7-bb46-05b4a7f0a027', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71b1e8c3c45e4ff8bc99e66bd1bfef7c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:57:01 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:57:01.860 103867 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 54acfaa1-75af-46fc-b755-5aabbfb79138 in datapath b5e8390b-42ff-40d7-bb46-05b4a7f0a027 updated
Sep 30 21:57:01 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:57:01.862 103867 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b5e8390b-42ff-40d7-bb46-05b4a7f0a027, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Sep 30 21:57:01 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:57:01.863 220624 DEBUG oslo.privsep.daemon [-] privsep: reply[b30ff903-7133-4f80-af5d-aa5988a78845]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Sep 30 21:57:02 compute-0 podman[251902]: 2025-09-30 21:57:02.314429552 +0000 UTC m=+0.050448560 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20250923, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 21:57:02 compute-0 podman[251903]: 2025-09-30 21:57:02.323239596 +0000 UTC m=+0.055731955 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250923)
Sep 30 21:57:02 compute-0 podman[251901]: 2025-09-30 21:57:02.368417751 +0000 UTC m=+0.102238013 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Sep 30 21:57:03 compute-0 nova_compute[192810]: 2025-09-30 21:57:03.792 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:05 compute-0 nova_compute[192810]: 2025-09-30 21:57:05.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:07 compute-0 nova_compute[192810]: 2025-09-30 21:57:07.141 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:57:07 compute-0 nova_compute[192810]: 2025-09-30 21:57:07.786 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:57:07 compute-0 nova_compute[192810]: 2025-09-30 21:57:07.787 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:57:08 compute-0 nova_compute[192810]: 2025-09-30 21:57:08.794 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:09 compute-0 nova_compute[192810]: 2025-09-30 21:57:09.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:57:10 compute-0 nova_compute[192810]: 2025-09-30 21:57:10.036 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:12 compute-0 podman[251965]: 2025-09-30 21:57:12.321517956 +0000 UTC m=+0.058446453 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:57:12 compute-0 podman[251966]: 2025-09-30 21:57:12.360440473 +0000 UTC m=+0.090037804 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.expose-services=, release=1755695350, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc., config_id=edpm)
Sep 30 21:57:13 compute-0 nova_compute[192810]: 2025-09-30 21:57:13.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:14 compute-0 nova_compute[192810]: 2025-09-30 21:57:14.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:57:14 compute-0 nova_compute[192810]: 2025-09-30 21:57:14.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:57:14 compute-0 nova_compute[192810]: 2025-09-30 21:57:14.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:57:14 compute-0 nova_compute[192810]: 2025-09-30 21:57:14.809 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 21:57:15 compute-0 nova_compute[192810]: 2025-09-30 21:57:15.038 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:16 compute-0 nova_compute[192810]: 2025-09-30 21:57:16.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:57:17 compute-0 nova_compute[192810]: 2025-09-30 21:57:17.782 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:57:18 compute-0 nova_compute[192810]: 2025-09-30 21:57:18.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:57:18 compute-0 nova_compute[192810]: 2025-09-30 21:57:18.796 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:19 compute-0 podman[252010]: 2025-09-30 21:57:19.312488737 +0000 UTC m=+0.050246935 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923)
Sep 30 21:57:19 compute-0 podman[252011]: 2025-09-30 21:57:19.321349132 +0000 UTC m=+0.054017761 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=iscsid, org.label-schema.vendor=CentOS)
Sep 30 21:57:19 compute-0 podman[252012]: 2025-09-30 21:57:19.321578298 +0000 UTC m=+0.050414239 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:57:20 compute-0 nova_compute[192810]: 2025-09-30 21:57:20.040 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:20 compute-0 nova_compute[192810]: 2025-09-30 21:57:20.786 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:57:20 compute-0 nova_compute[192810]: 2025-09-30 21:57:20.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:57:20 compute-0 nova_compute[192810]: 2025-09-30 21:57:20.815 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:57:20 compute-0 nova_compute[192810]: 2025-09-30 21:57:20.816 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:57:20 compute-0 nova_compute[192810]: 2025-09-30 21:57:20.816 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:57:20 compute-0 nova_compute[192810]: 2025-09-30 21:57:20.816 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:57:20 compute-0 nova_compute[192810]: 2025-09-30 21:57:20.972 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:57:20 compute-0 nova_compute[192810]: 2025-09-30 21:57:20.973 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5722MB free_disk=73.09423446655273GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:57:20 compute-0 nova_compute[192810]: 2025-09-30 21:57:20.973 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:57:20 compute-0 nova_compute[192810]: 2025-09-30 21:57:20.973 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:57:21 compute-0 nova_compute[192810]: 2025-09-30 21:57:21.047 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:57:21 compute-0 nova_compute[192810]: 2025-09-30 21:57:21.047 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:57:21 compute-0 nova_compute[192810]: 2025-09-30 21:57:21.305 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Refreshing inventories for resource provider fe423b93-de5a-41f7-97d1-9622ea46af54 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Sep 30 21:57:21 compute-0 nova_compute[192810]: 2025-09-30 21:57:21.329 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Updating ProviderTree inventory for provider fe423b93-de5a-41f7-97d1-9622ea46af54 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Sep 30 21:57:21 compute-0 nova_compute[192810]: 2025-09-30 21:57:21.329 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Updating inventory in ProviderTree for provider fe423b93-de5a-41f7-97d1-9622ea46af54 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Sep 30 21:57:21 compute-0 nova_compute[192810]: 2025-09-30 21:57:21.348 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Refreshing aggregate associations for resource provider fe423b93-de5a-41f7-97d1-9622ea46af54, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Sep 30 21:57:21 compute-0 nova_compute[192810]: 2025-09-30 21:57:21.386 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Refreshing trait associations for resource provider fe423b93-de5a-41f7-97d1-9622ea46af54, traits: COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_RESCUE_BFV,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_1_2,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE42,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Sep 30 21:57:21 compute-0 nova_compute[192810]: 2025-09-30 21:57:21.413 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:57:21 compute-0 nova_compute[192810]: 2025-09-30 21:57:21.448 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:57:21 compute-0 nova_compute[192810]: 2025-09-30 21:57:21.449 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:57:21 compute-0 nova_compute[192810]: 2025-09-30 21:57:21.450 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.477s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:57:21 compute-0 nova_compute[192810]: 2025-09-30 21:57:21.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:57:21.535 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=48, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=47) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:57:21 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:57:21.536 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:57:23 compute-0 nova_compute[192810]: 2025-09-30 21:57:23.797 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:25 compute-0 nova_compute[192810]: 2025-09-30 21:57:25.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:28 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:57:28.538 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '48'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:57:28 compute-0 nova_compute[192810]: 2025-09-30 21:57:28.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:28 compute-0 ovn_controller[94912]: 2025-09-30T21:57:28Z|00778|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Sep 30 21:57:30 compute-0 nova_compute[192810]: 2025-09-30 21:57:30.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:33 compute-0 podman[252069]: 2025-09-30 21:57:33.32706123 +0000 UTC m=+0.055451637 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:57:33 compute-0 podman[252068]: 2025-09-30 21:57:33.356512397 +0000 UTC m=+0.092700942 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_controller, tcib_managed=true)
Sep 30 21:57:33 compute-0 podman[252070]: 2025-09-30 21:57:33.36729738 +0000 UTC m=+0.094586689 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Sep 30 21:57:33 compute-0 nova_compute[192810]: 2025-09-30 21:57:33.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:35 compute-0 nova_compute[192810]: 2025-09-30 21:57:35.046 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:57:38.762 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:57:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:57:38.763 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:57:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:57:38.763 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:57:38 compute-0 nova_compute[192810]: 2025-09-30 21:57:38.803 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:40 compute-0 nova_compute[192810]: 2025-09-30 21:57:40.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:43 compute-0 podman[252132]: 2025-09-30 21:57:43.308537236 +0000 UTC m=+0.051206790 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:57:43 compute-0 podman[252133]: 2025-09-30 21:57:43.308649859 +0000 UTC m=+0.046919451 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, version=9.6, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm)
Sep 30 21:57:43 compute-0 nova_compute[192810]: 2025-09-30 21:57:43.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:45 compute-0 nova_compute[192810]: 2025-09-30 21:57:45.049 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:48 compute-0 nova_compute[192810]: 2025-09-30 21:57:48.807 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:50 compute-0 nova_compute[192810]: 2025-09-30 21:57:50.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:50 compute-0 podman[252175]: 2025-09-30 21:57:50.306882406 +0000 UTC m=+0.048613374 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=multipathd)
Sep 30 21:57:50 compute-0 podman[252177]: 2025-09-30 21:57:50.313241437 +0000 UTC m=+0.048694476 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 21:57:50 compute-0 podman[252176]: 2025-09-30 21:57:50.319242359 +0000 UTC m=+0.057256013 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=iscsid, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid)
Sep 30 21:57:53 compute-0 nova_compute[192810]: 2025-09-30 21:57:53.808 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:55 compute-0 nova_compute[192810]: 2025-09-30 21:57:55.052 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:57:58 compute-0 nova_compute[192810]: 2025-09-30 21:57:58.811 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:58:00 compute-0 nova_compute[192810]: 2025-09-30 21:58:00.054 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:58:03 compute-0 nova_compute[192810]: 2025-09-30 21:58:03.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:58:04 compute-0 podman[252237]: 2025-09-30 21:58:04.312216245 +0000 UTC m=+0.047069084 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute)
Sep 30 21:58:04 compute-0 podman[252235]: 2025-09-30 21:58:04.338107332 +0000 UTC m=+0.077195088 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible, org.label-schema.build-date=20250923)
Sep 30 21:58:04 compute-0 podman[252236]: 2025-09-30 21:58:04.338336248 +0000 UTC m=+0.073760491 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Sep 30 21:58:05 compute-0 nova_compute[192810]: 2025-09-30 21:58:05.057 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:58:08 compute-0 nova_compute[192810]: 2025-09-30 21:58:08.451 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:58:08 compute-0 nova_compute[192810]: 2025-09-30 21:58:08.818 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:58:09 compute-0 nova_compute[192810]: 2025-09-30 21:58:09.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:58:09 compute-0 nova_compute[192810]: 2025-09-30 21:58:09.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:58:09 compute-0 nova_compute[192810]: 2025-09-30 21:58:09.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:58:10 compute-0 nova_compute[192810]: 2025-09-30 21:58:10.058 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:58:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:58:12.226 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=49, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=48) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 21:58:12 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:58:12.227 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 21:58:12 compute-0 nova_compute[192810]: 2025-09-30 21:58:12.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:58:13 compute-0 nova_compute[192810]: 2025-09-30 21:58:13.819 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:58:14 compute-0 podman[252294]: 2025-09-30 21:58:14.314296163 +0000 UTC m=+0.052573804 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Sep 30 21:58:14 compute-0 podman[252295]: 2025-09-30 21:58:14.320183793 +0000 UTC m=+0.055461127 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, config_id=edpm, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, release=1755695350, vendor=Red Hat, Inc., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6)
Sep 30 21:58:15 compute-0 nova_compute[192810]: 2025-09-30 21:58:15.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:58:15 compute-0 nova_compute[192810]: 2025-09-30 21:58:15.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:58:15 compute-0 nova_compute[192810]: 2025-09-30 21:58:15.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:58:15 compute-0 nova_compute[192810]: 2025-09-30 21:58:15.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:58:15 compute-0 nova_compute[192810]: 2025-09-30 21:58:15.803 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 21:58:17 compute-0 nova_compute[192810]: 2025-09-30 21:58:17.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:58:18 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:58:18.229 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '49'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 21:58:18 compute-0 nova_compute[192810]: 2025-09-30 21:58:18.783 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:58:18 compute-0 nova_compute[192810]: 2025-09-30 21:58:18.822 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:58:20 compute-0 nova_compute[192810]: 2025-09-30 21:58:20.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:58:20 compute-0 nova_compute[192810]: 2025-09-30 21:58:20.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:58:20 compute-0 nova_compute[192810]: 2025-09-30 21:58:20.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:58:21 compute-0 podman[252340]: 2025-09-30 21:58:21.314730566 +0000 UTC m=+0.046301545 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 21:58:21 compute-0 podman[252339]: 2025-09-30 21:58:21.315055865 +0000 UTC m=+0.049542868 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:58:21 compute-0 podman[252338]: 2025-09-30 21:58:21.340366706 +0000 UTC m=+0.079100096 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_managed=true)
Sep 30 21:58:21 compute-0 nova_compute[192810]: 2025-09-30 21:58:21.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:58:21 compute-0 nova_compute[192810]: 2025-09-30 21:58:21.823 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:58:21 compute-0 nova_compute[192810]: 2025-09-30 21:58:21.823 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:58:21 compute-0 nova_compute[192810]: 2025-09-30 21:58:21.824 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:58:21 compute-0 nova_compute[192810]: 2025-09-30 21:58:21.824 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:58:21 compute-0 nova_compute[192810]: 2025-09-30 21:58:21.955 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:58:21 compute-0 nova_compute[192810]: 2025-09-30 21:58:21.956 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5709MB free_disk=73.09478759765625GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:58:21 compute-0 nova_compute[192810]: 2025-09-30 21:58:21.956 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:58:21 compute-0 nova_compute[192810]: 2025-09-30 21:58:21.956 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:58:22 compute-0 nova_compute[192810]: 2025-09-30 21:58:22.006 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:58:22 compute-0 nova_compute[192810]: 2025-09-30 21:58:22.006 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:58:22 compute-0 nova_compute[192810]: 2025-09-30 21:58:22.027 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:58:22 compute-0 nova_compute[192810]: 2025-09-30 21:58:22.045 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:58:22 compute-0 nova_compute[192810]: 2025-09-30 21:58:22.046 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:58:22 compute-0 nova_compute[192810]: 2025-09-30 21:58:22.046 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.090s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:58:23 compute-0 nova_compute[192810]: 2025-09-30 21:58:23.042 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:58:23 compute-0 nova_compute[192810]: 2025-09-30 21:58:23.824 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:58:25 compute-0 nova_compute[192810]: 2025-09-30 21:58:25.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:58:28 compute-0 nova_compute[192810]: 2025-09-30 21:58:28.826 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:58:30 compute-0 nova_compute[192810]: 2025-09-30 21:58:30.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:58:33 compute-0 nova_compute[192810]: 2025-09-30 21:58:33.828 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:58:35 compute-0 nova_compute[192810]: 2025-09-30 21:58:35.066 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:58:35 compute-0 podman[252398]: 2025-09-30 21:58:35.31629438 +0000 UTC m=+0.050076741 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, 
container_name=ovn_metadata_agent, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 21:58:35 compute-0 podman[252399]: 2025-09-30 21:58:35.35103469 +0000 UTC m=+0.078328757 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 21:58:35 compute-0 podman[252397]: 2025-09-30 21:58:35.37627715 +0000 UTC m=+0.110185064 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Sep 30 21:58:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:58:38.763 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:58:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:58:38.764 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:58:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:58:38.764 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:58:38 compute-0 nova_compute[192810]: 2025-09-30 21:58:38.830 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:58:40 compute-0 nova_compute[192810]: 2025-09-30 21:58:40.067 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:58:43 compute-0 nova_compute[192810]: 2025-09-30 21:58:43.831 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:58:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:58:43.912 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:58:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:58:43.912 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:58:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:58:43.912 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:58:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:58:43.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:58:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:58:43.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:58:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:58:43.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:58:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:58:43.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:58:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:58:43.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:58:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:58:43.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:58:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:58:43.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:58:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:58:43.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:58:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:58:43.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:58:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:58:43.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:58:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:58:43.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:58:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:58:43.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:58:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:58:43.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:58:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:58:43.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:58:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:58:43.913 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:58:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:58:43.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:58:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:58:43.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:58:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:58:43.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:58:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:58:43.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:58:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:58:43.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:58:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:58:43.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:58:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 21:58:43.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 21:58:45 compute-0 nova_compute[192810]: 2025-09-30 21:58:45.068 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:58:45 compute-0 podman[252462]: 2025-09-30 21:58:45.31727144 +0000 UTC m=+0.053203089 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, version=9.6, io.openshift.tags=minimal rhel9, release=1755695350, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64)
Sep 30 21:58:45 compute-0 podman[252461]: 2025-09-30 21:58:45.338618962 +0000 UTC m=+0.076823099 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:58:45 compute-0 sshd[128205]: Timeout before authentication for connection from 113.240.110.90 to 38.102.83.69, pid = 251838
Sep 30 21:58:48 compute-0 nova_compute[192810]: 2025-09-30 21:58:48.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:58:50 compute-0 nova_compute[192810]: 2025-09-30 21:58:50.070 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:58:52 compute-0 podman[252502]: 2025-09-30 21:58:52.357541143 +0000 UTC m=+0.081382465 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, managed_by=edpm_ansible)
Sep 30 21:58:52 compute-0 podman[252503]: 2025-09-30 21:58:52.358876407 +0000 UTC m=+0.074817438 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 21:58:52 compute-0 podman[252501]: 2025-09-30 21:58:52.374479522 +0000 UTC m=+0.095105972 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20250923, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Sep 30 21:58:53 compute-0 nova_compute[192810]: 2025-09-30 21:58:53.834 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:58:55 compute-0 nova_compute[192810]: 2025-09-30 21:58:55.072 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:58:58 compute-0 nova_compute[192810]: 2025-09-30 21:58:58.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:59:00 compute-0 nova_compute[192810]: 2025-09-30 21:59:00.075 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:59:03 compute-0 nova_compute[192810]: 2025-09-30 21:59:03.863 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:59:05 compute-0 nova_compute[192810]: 2025-09-30 21:59:05.087 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:59:06 compute-0 podman[252564]: 2025-09-30 21:59:06.337734353 +0000 UTC m=+0.059603482 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250923)
Sep 30 21:59:06 compute-0 podman[252565]: 2025-09-30 21:59:06.353534304 +0000 UTC m=+0.065881672 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Sep 30 21:59:06 compute-0 podman[252563]: 2025-09-30 21:59:06.366631036 +0000 UTC m=+0.093431580 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Sep 30 21:59:08 compute-0 nova_compute[192810]: 2025-09-30 21:59:08.864 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:59:09 compute-0 nova_compute[192810]: 2025-09-30 21:59:09.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:59:09 compute-0 nova_compute[192810]: 2025-09-30 21:59:09.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:59:09 compute-0 nova_compute[192810]: 2025-09-30 21:59:09.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 21:59:10 compute-0 nova_compute[192810]: 2025-09-30 21:59:10.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:59:10 compute-0 nova_compute[192810]: 2025-09-30 21:59:10.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:59:13 compute-0 nova_compute[192810]: 2025-09-30 21:59:13.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:59:15 compute-0 nova_compute[192810]: 2025-09-30 21:59:15.109 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:59:16 compute-0 podman[252625]: 2025-09-30 21:59:16.322973405 +0000 UTC m=+0.060827493 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 21:59:16 compute-0 podman[252626]: 2025-09-30 21:59:16.324855643 +0000 UTC m=+0.060515616 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, build-date=2025-08-20T13:12:41, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Sep 30 21:59:17 compute-0 nova_compute[192810]: 2025-09-30 21:59:17.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:59:17 compute-0 nova_compute[192810]: 2025-09-30 21:59:17.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 21:59:17 compute-0 nova_compute[192810]: 2025-09-30 21:59:17.789 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 21:59:17 compute-0 nova_compute[192810]: 2025-09-30 21:59:17.811 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 21:59:18 compute-0 nova_compute[192810]: 2025-09-30 21:59:18.868 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:59:19 compute-0 nova_compute[192810]: 2025-09-30 21:59:19.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:59:19 compute-0 nova_compute[192810]: 2025-09-30 21:59:19.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:59:20 compute-0 nova_compute[192810]: 2025-09-30 21:59:20.111 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:59:21 compute-0 nova_compute[192810]: 2025-09-30 21:59:21.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:59:21 compute-0 nova_compute[192810]: 2025-09-30 21:59:21.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:59:21 compute-0 nova_compute[192810]: 2025-09-30 21:59:21.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 21:59:21 compute-0 nova_compute[192810]: 2025-09-30 21:59:21.825 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:59:21 compute-0 nova_compute[192810]: 2025-09-30 21:59:21.826 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:59:21 compute-0 nova_compute[192810]: 2025-09-30 21:59:21.826 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:59:21 compute-0 nova_compute[192810]: 2025-09-30 21:59:21.826 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 21:59:21 compute-0 nova_compute[192810]: 2025-09-30 21:59:21.997 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 21:59:21 compute-0 nova_compute[192810]: 2025-09-30 21:59:21.998 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5718MB free_disk=73.09478759765625GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 21:59:21 compute-0 nova_compute[192810]: 2025-09-30 21:59:21.998 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:59:21 compute-0 nova_compute[192810]: 2025-09-30 21:59:21.998 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:59:22 compute-0 nova_compute[192810]: 2025-09-30 21:59:22.100 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 21:59:22 compute-0 nova_compute[192810]: 2025-09-30 21:59:22.101 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 21:59:22 compute-0 nova_compute[192810]: 2025-09-30 21:59:22.139 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 21:59:22 compute-0 nova_compute[192810]: 2025-09-30 21:59:22.163 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 21:59:22 compute-0 nova_compute[192810]: 2025-09-30 21:59:22.164 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 21:59:22 compute-0 nova_compute[192810]: 2025-09-30 21:59:22.164 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:59:23 compute-0 podman[252670]: 2025-09-30 21:59:23.322382571 +0000 UTC m=+0.055312762 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Sep 30 21:59:23 compute-0 podman[252671]: 2025-09-30 21:59:23.323273184 +0000 UTC m=+0.054065571 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 21:59:23 compute-0 podman[252669]: 2025-09-30 21:59:23.345585129 +0000 UTC m=+0.081931397 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 21:59:23 compute-0 nova_compute[192810]: 2025-09-30 21:59:23.869 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:59:25 compute-0 nova_compute[192810]: 2025-09-30 21:59:25.113 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:59:28 compute-0 nova_compute[192810]: 2025-09-30 21:59:28.870 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:59:30 compute-0 nova_compute[192810]: 2025-09-30 21:59:30.115 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:59:33 compute-0 nova_compute[192810]: 2025-09-30 21:59:33.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:59:35 compute-0 nova_compute[192810]: 2025-09-30 21:59:35.116 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:59:37 compute-0 podman[252730]: 2025-09-30 21:59:37.308512055 +0000 UTC m=+0.045596017 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 21:59:37 compute-0 podman[252731]: 2025-09-30 21:59:37.34933261 +0000 UTC m=+0.083646152 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 21:59:37 compute-0 podman[252729]: 2025-09-30 21:59:37.402649222 +0000 UTC m=+0.143247743 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 21:59:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:59:38.764 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 21:59:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:59:38.765 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 21:59:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 21:59:38.765 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 21:59:38 compute-0 nova_compute[192810]: 2025-09-30 21:59:38.873 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:59:40 compute-0 nova_compute[192810]: 2025-09-30 21:59:40.117 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:59:43 compute-0 nova_compute[192810]: 2025-09-30 21:59:43.875 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:59:45 compute-0 nova_compute[192810]: 2025-09-30 21:59:45.142 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:59:47 compute-0 podman[252793]: 2025-09-30 21:59:47.30768639 +0000 UTC m=+0.050203695 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 21:59:47 compute-0 podman[252794]: 2025-09-30 21:59:47.323360097 +0000 UTC m=+0.061928851 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., config_id=edpm, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, version=9.6, vcs-type=git)
Sep 30 21:59:48 compute-0 nova_compute[192810]: 2025-09-30 21:59:48.878 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:59:50 compute-0 nova_compute[192810]: 2025-09-30 21:59:50.143 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:59:53 compute-0 nova_compute[192810]: 2025-09-30 21:59:53.878 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:59:54 compute-0 podman[252834]: 2025-09-30 21:59:54.311401176 +0000 UTC m=+0.050402769 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Sep 30 21:59:54 compute-0 podman[252835]: 2025-09-30 21:59:54.32340366 +0000 UTC m=+0.058135555 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 21:59:54 compute-0 podman[252836]: 2025-09-30 21:59:54.323429341 +0000 UTC m=+0.056043812 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 21:59:55 compute-0 nova_compute[192810]: 2025-09-30 21:59:55.146 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 21:59:55 compute-0 sshd-session[252891]: Accepted publickey for zuul from 192.168.122.10 port 51996 ssh2: ECDSA SHA256:SmCicXXyU0CyMnob1MNtb+B3Td3Ord5lbeuM/VGGA5o
Sep 30 21:59:55 compute-0 systemd-logind[792]: New session 42 of user zuul.
Sep 30 21:59:55 compute-0 systemd[1]: Started Session 42 of User zuul.
Sep 30 21:59:55 compute-0 sshd-session[252891]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 21:59:55 compute-0 sudo[252895]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp -p container,openstack_edpm,system,storage,virt'
Sep 30 21:59:55 compute-0 sudo[252895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 21:59:58 compute-0 nova_compute[192810]: 2025-09-30 21:59:58.879 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:00:00 compute-0 nova_compute[192810]: 2025-09-30 22:00:00.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:00:01 compute-0 ovs-vsctl[253070]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Sep 30 22:00:01 compute-0 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 252919 (sos)
Sep 30 22:00:01 compute-0 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Sep 30 22:00:01 compute-0 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Sep 30 22:00:02 compute-0 virtqemud[192233]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Sep 30 22:00:02 compute-0 virtqemud[192233]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Sep 30 22:00:02 compute-0 virtqemud[192233]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Sep 30 22:00:02 compute-0 kernel: block sr0: the capability attribute has been deprecated.
Sep 30 22:00:03 compute-0 crontab[253499]: (root) LIST (root)
Sep 30 22:00:03 compute-0 nova_compute[192810]: 2025-09-30 22:00:03.880 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:00:05 compute-0 nova_compute[192810]: 2025-09-30 22:00:05.149 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:00:05 compute-0 systemd[1]: Starting Hostname Service...
Sep 30 22:00:05 compute-0 systemd[1]: Started Hostname Service.
Sep 30 22:00:07 compute-0 podman[253707]: 2025-09-30 22:00:07.862043545 +0000 UTC m=+0.061120400 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Sep 30 22:00:07 compute-0 podman[253705]: 2025-09-30 22:00:07.862674941 +0000 UTC m=+0.066414715 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 22:00:07 compute-0 podman[253702]: 2025-09-30 22:00:07.911847968 +0000 UTC m=+0.119097791 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 22:00:08 compute-0 nova_compute[192810]: 2025-09-30 22:00:08.882 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:00:09 compute-0 nova_compute[192810]: 2025-09-30 22:00:09.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:00:09 compute-0 nova_compute[192810]: 2025-09-30 22:00:09.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:00:09 compute-0 nova_compute[192810]: 2025-09-30 22:00:09.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 22:00:09 compute-0 nova_compute[192810]: 2025-09-30 22:00:09.789 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:00:09 compute-0 nova_compute[192810]: 2025-09-30 22:00:09.789 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Sep 30 22:00:10 compute-0 nova_compute[192810]: 2025-09-30 22:00:10.151 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:00:11 compute-0 nova_compute[192810]: 2025-09-30 22:00:11.433 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:00:11 compute-0 nova_compute[192810]: 2025-09-30 22:00:11.836 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:00:12 compute-0 ovs-appctl[254595]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Sep 30 22:00:12 compute-0 ovs-appctl[254599]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Sep 30 22:00:12 compute-0 ovs-appctl[254603]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Sep 30 22:00:13 compute-0 nova_compute[192810]: 2025-09-30 22:00:13.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:00:15 compute-0 nova_compute[192810]: 2025-09-30 22:00:15.151 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:00:18 compute-0 podman[255806]: 2025-09-30 22:00:18.070386523 +0000 UTC m=+0.069904263 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 22:00:18 compute-0 podman[255807]: 2025-09-30 22:00:18.070971568 +0000 UTC m=+0.069010571 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.openshift.tags=minimal rhel9, release=1755695350, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, managed_by=edpm_ansible, container_name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, distribution-scope=public, name=ubi9-minimal, version=9.6, config_id=edpm)
Sep 30 22:00:18 compute-0 nova_compute[192810]: 2025-09-30 22:00:18.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:00:18 compute-0 nova_compute[192810]: 2025-09-30 22:00:18.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 22:00:18 compute-0 nova_compute[192810]: 2025-09-30 22:00:18.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 22:00:18 compute-0 nova_compute[192810]: 2025-09-30 22:00:18.885 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:00:18 compute-0 nova_compute[192810]: 2025-09-30 22:00:18.950 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 22:00:19 compute-0 unix_chkpwd[256145]: password check failed for user (root)
Sep 30 22:00:19 compute-0 sshd-session[255870]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.95.116  user=root
Sep 30 22:00:19 compute-0 virtqemud[192233]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Sep 30 22:00:20 compute-0 nova_compute[192810]: 2025-09-30 22:00:20.152 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:00:20 compute-0 nova_compute[192810]: 2025-09-30 22:00:20.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:00:20 compute-0 nova_compute[192810]: 2025-09-30 22:00:20.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:00:21 compute-0 nova_compute[192810]: 2025-09-30 22:00:21.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:00:21 compute-0 nova_compute[192810]: 2025-09-30 22:00:21.832 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:00:21 compute-0 nova_compute[192810]: 2025-09-30 22:00:21.832 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:00:21 compute-0 nova_compute[192810]: 2025-09-30 22:00:21.833 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:00:21 compute-0 nova_compute[192810]: 2025-09-30 22:00:21.833 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 22:00:21 compute-0 systemd[1]: Starting Time & Date Service...
Sep 30 22:00:21 compute-0 sshd-session[255870]: Failed password for root from 80.94.95.116 port 61644 ssh2
Sep 30 22:00:21 compute-0 nova_compute[192810]: 2025-09-30 22:00:21.978 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 22:00:21 compute-0 nova_compute[192810]: 2025-09-30 22:00:21.979 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5524MB free_disk=72.49165344238281GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 22:00:21 compute-0 nova_compute[192810]: 2025-09-30 22:00:21.979 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:00:21 compute-0 nova_compute[192810]: 2025-09-30 22:00:21.980 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:00:22 compute-0 systemd[1]: Started Time & Date Service.
Sep 30 22:00:22 compute-0 nova_compute[192810]: 2025-09-30 22:00:22.279 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 22:00:22 compute-0 nova_compute[192810]: 2025-09-30 22:00:22.280 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 22:00:22 compute-0 nova_compute[192810]: 2025-09-30 22:00:22.300 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 22:00:22 compute-0 nova_compute[192810]: 2025-09-30 22:00:22.314 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 22:00:22 compute-0 nova_compute[192810]: 2025-09-30 22:00:22.345 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 22:00:22 compute-0 nova_compute[192810]: 2025-09-30 22:00:22.346 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.366s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:00:23 compute-0 nova_compute[192810]: 2025-09-30 22:00:23.347 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:00:23 compute-0 sshd-session[255870]: Connection closed by authenticating user root 80.94.95.116 port 61644 [preauth]
Sep 30 22:00:23 compute-0 nova_compute[192810]: 2025-09-30 22:00:23.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:00:23 compute-0 nova_compute[192810]: 2025-09-30 22:00:23.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:00:23 compute-0 nova_compute[192810]: 2025-09-30 22:00:23.789 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Sep 30 22:00:23 compute-0 nova_compute[192810]: 2025-09-30 22:00:23.810 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Sep 30 22:00:23 compute-0 nova_compute[192810]: 2025-09-30 22:00:23.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:00:25 compute-0 nova_compute[192810]: 2025-09-30 22:00:25.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:00:25 compute-0 podman[256334]: 2025-09-30 22:00:25.334247236 +0000 UTC m=+0.062968718 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=iscsid, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250923)
Sep 30 22:00:25 compute-0 podman[256335]: 2025-09-30 22:00:25.337490438 +0000 UTC m=+0.060706170 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 22:00:25 compute-0 podman[256333]: 2025-09-30 22:00:25.363465127 +0000 UTC m=+0.092169838 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Sep 30 22:00:27 compute-0 nova_compute[192810]: 2025-09-30 22:00:27.805 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:00:28 compute-0 nova_compute[192810]: 2025-09-30 22:00:28.887 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:00:30 compute-0 nova_compute[192810]: 2025-09-30 22:00:30.199 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:00:32 compute-0 nova_compute[192810]: 2025-09-30 22:00:32.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:00:33 compute-0 nova_compute[192810]: 2025-09-30 22:00:33.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:00:35 compute-0 nova_compute[192810]: 2025-09-30 22:00:35.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:00:38 compute-0 podman[256398]: 2025-09-30 22:00:38.337641603 +0000 UTC m=+0.067866322 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 22:00:38 compute-0 podman[256399]: 2025-09-30 22:00:38.341682805 +0000 UTC m=+0.069795580 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true)
Sep 30 22:00:38 compute-0 podman[256397]: 2025-09-30 22:00:38.390304378 +0000 UTC m=+0.120391243 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 22:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 22:00:38.765 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 22:00:38.765 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:00:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 22:00:38.766 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:00:38 compute-0 nova_compute[192810]: 2025-09-30 22:00:38.891 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:00:40 compute-0 nova_compute[192810]: 2025-09-30 22:00:40.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:00:43 compute-0 nova_compute[192810]: 2025-09-30 22:00:43.893 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:00:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:00:43.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:00:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:00:43.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:00:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:00:43.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:00:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:00:43.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:00:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:00:43.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:00:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:00:43.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:00:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:00:43.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:00:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:00:43.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:00:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:00:43.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:00:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:00:43.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:00:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:00:43.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:00:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:00:43.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:00:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:00:43.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:00:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:00:43.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:00:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:00:43.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:00:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:00:43.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:00:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:00:43.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:00:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:00:43.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:00:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:00:43.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:00:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:00:43.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:00:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:00:43.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:00:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:00:43.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:00:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:00:43.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:00:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:00:43.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:00:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:00:43.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:00:45 compute-0 nova_compute[192810]: 2025-09-30 22:00:45.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:00:48 compute-0 podman[256457]: 2025-09-30 22:00:48.335202676 +0000 UTC m=+0.073172146 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Sep 30 22:00:48 compute-0 podman[256458]: 2025-09-30 22:00:48.335847933 +0000 UTC m=+0.070133830 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.openshift.expose-services=, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, version=9.6)
Sep 30 22:00:48 compute-0 nova_compute[192810]: 2025-09-30 22:00:48.893 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:00:49 compute-0 sudo[252895]: pam_unix(sudo:session): session closed for user root
Sep 30 22:00:49 compute-0 sshd-session[252894]: Received disconnect from 192.168.122.10 port 51996:11: disconnected by user
Sep 30 22:00:49 compute-0 sshd-session[252894]: Disconnected from user zuul 192.168.122.10 port 51996
Sep 30 22:00:49 compute-0 sshd-session[252891]: pam_unix(sshd:session): session closed for user zuul
Sep 30 22:00:49 compute-0 systemd[1]: session-42.scope: Deactivated successfully.
Sep 30 22:00:49 compute-0 systemd[1]: session-42.scope: Consumed 1min 23.799s CPU time, 732.4M memory peak, read 241.6M from disk, written 18.4M to disk.
Sep 30 22:00:49 compute-0 systemd-logind[792]: Session 42 logged out. Waiting for processes to exit.
Sep 30 22:00:49 compute-0 systemd-logind[792]: Removed session 42.
Sep 30 22:00:49 compute-0 sshd-session[256501]: Accepted publickey for zuul from 192.168.122.10 port 57680 ssh2: ECDSA SHA256:SmCicXXyU0CyMnob1MNtb+B3Td3Ord5lbeuM/VGGA5o
Sep 30 22:00:49 compute-0 systemd-logind[792]: New session 43 of user zuul.
Sep 30 22:00:49 compute-0 systemd[1]: Started Session 43 of User zuul.
Sep 30 22:00:49 compute-0 sshd-session[256501]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 22:00:49 compute-0 sudo[256505]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/cat /var/tmp/sos-osp/sosreport-compute-0-2025-09-30-trtjzvi.tar.xz
Sep 30 22:00:49 compute-0 sudo[256505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 22:00:49 compute-0 sudo[256505]: pam_unix(sudo:session): session closed for user root
Sep 30 22:00:49 compute-0 sshd-session[256504]: Received disconnect from 192.168.122.10 port 57680:11: disconnected by user
Sep 30 22:00:49 compute-0 sshd-session[256504]: Disconnected from user zuul 192.168.122.10 port 57680
Sep 30 22:00:49 compute-0 sshd-session[256501]: pam_unix(sshd:session): session closed for user zuul
Sep 30 22:00:49 compute-0 systemd[1]: session-43.scope: Deactivated successfully.
Sep 30 22:00:49 compute-0 systemd-logind[792]: Session 43 logged out. Waiting for processes to exit.
Sep 30 22:00:49 compute-0 systemd-logind[792]: Removed session 43.
Sep 30 22:00:49 compute-0 sshd-session[256530]: Accepted publickey for zuul from 192.168.122.10 port 57696 ssh2: ECDSA SHA256:SmCicXXyU0CyMnob1MNtb+B3Td3Ord5lbeuM/VGGA5o
Sep 30 22:00:49 compute-0 systemd-logind[792]: New session 44 of user zuul.
Sep 30 22:00:49 compute-0 systemd[1]: Started Session 44 of User zuul.
Sep 30 22:00:49 compute-0 sshd-session[256530]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 22:00:49 compute-0 sudo[256534]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rm -rf /var/tmp/sos-osp
Sep 30 22:00:49 compute-0 sudo[256534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 22:00:49 compute-0 sudo[256534]: pam_unix(sudo:session): session closed for user root
Sep 30 22:00:49 compute-0 sshd-session[256533]: Received disconnect from 192.168.122.10 port 57696:11: disconnected by user
Sep 30 22:00:49 compute-0 sshd-session[256533]: Disconnected from user zuul 192.168.122.10 port 57696
Sep 30 22:00:49 compute-0 sshd-session[256530]: pam_unix(sshd:session): session closed for user zuul
Sep 30 22:00:49 compute-0 systemd[1]: session-44.scope: Deactivated successfully.
Sep 30 22:00:49 compute-0 systemd-logind[792]: Session 44 logged out. Waiting for processes to exit.
Sep 30 22:00:49 compute-0 systemd-logind[792]: Removed session 44.
Sep 30 22:00:50 compute-0 nova_compute[192810]: 2025-09-30 22:00:50.212 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:00:52 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Sep 30 22:00:52 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Sep 30 22:00:53 compute-0 nova_compute[192810]: 2025-09-30 22:00:53.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:00:55 compute-0 nova_compute[192810]: 2025-09-30 22:00:55.217 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:00:56 compute-0 podman[256565]: 2025-09-30 22:00:56.312851267 +0000 UTC m=+0.048133122 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 22:00:56 compute-0 podman[256564]: 2025-09-30 22:00:56.31930112 +0000 UTC m=+0.055777225 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team)
Sep 30 22:00:56 compute-0 podman[256563]: 2025-09-30 22:00:56.345345651 +0000 UTC m=+0.084208596 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Sep 30 22:00:58 compute-0 nova_compute[192810]: 2025-09-30 22:00:58.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:01:00 compute-0 nova_compute[192810]: 2025-09-30 22:01:00.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:01:01 compute-0 CROND[256623]: (root) CMD (run-parts /etc/cron.hourly)
Sep 30 22:01:01 compute-0 run-parts[256626]: (/etc/cron.hourly) starting 0anacron
Sep 30 22:01:01 compute-0 run-parts[256632]: (/etc/cron.hourly) finished 0anacron
Sep 30 22:01:01 compute-0 CROND[256622]: (root) CMDEND (run-parts /etc/cron.hourly)
Sep 30 22:01:03 compute-0 nova_compute[192810]: 2025-09-30 22:01:03.899 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:01:05 compute-0 nova_compute[192810]: 2025-09-30 22:01:05.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:01:08 compute-0 nova_compute[192810]: 2025-09-30 22:01:08.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:01:09 compute-0 podman[256634]: 2025-09-30 22:01:09.313355129 +0000 UTC m=+0.046984802 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true)
Sep 30 22:01:09 compute-0 podman[256635]: 2025-09-30 22:01:09.328091143 +0000 UTC m=+0.052450451 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=edpm, container_name=ceilometer_agent_compute)
Sep 30 22:01:09 compute-0 podman[256633]: 2025-09-30 22:01:09.344356585 +0000 UTC m=+0.081331123 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_controller, tcib_managed=true)
Sep 30 22:01:10 compute-0 nova_compute[192810]: 2025-09-30 22:01:10.303 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:01:11 compute-0 nova_compute[192810]: 2025-09-30 22:01:11.807 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:01:11 compute-0 nova_compute[192810]: 2025-09-30 22:01:11.807 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:01:11 compute-0 nova_compute[192810]: 2025-09-30 22:01:11.807 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 22:01:12 compute-0 nova_compute[192810]: 2025-09-30 22:01:12.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:01:13 compute-0 nova_compute[192810]: 2025-09-30 22:01:13.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:01:15 compute-0 nova_compute[192810]: 2025-09-30 22:01:15.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:01:18 compute-0 nova_compute[192810]: 2025-09-30 22:01:18.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:01:18 compute-0 nova_compute[192810]: 2025-09-30 22:01:18.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 22:01:18 compute-0 nova_compute[192810]: 2025-09-30 22:01:18.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 22:01:18 compute-0 nova_compute[192810]: 2025-09-30 22:01:18.808 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 22:01:18 compute-0 nova_compute[192810]: 2025-09-30 22:01:18.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:01:19 compute-0 podman[256699]: 2025-09-30 22:01:19.309845378 +0000 UTC m=+0.052558899 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 22:01:19 compute-0 podman[256700]: 2025-09-30 22:01:19.344655865 +0000 UTC m=+0.071648245 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, distribution-scope=public, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, maintainer=Red Hat, Inc., name=ubi9-minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, version=9.6, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, container_name=openstack_network_exporter)
Sep 30 22:01:20 compute-0 nova_compute[192810]: 2025-09-30 22:01:20.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:01:21 compute-0 nova_compute[192810]: 2025-09-30 22:01:21.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:01:21 compute-0 nova_compute[192810]: 2025-09-30 22:01:21.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:01:23 compute-0 nova_compute[192810]: 2025-09-30 22:01:23.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:01:23 compute-0 nova_compute[192810]: 2025-09-30 22:01:23.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:01:23 compute-0 nova_compute[192810]: 2025-09-30 22:01:23.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:01:23 compute-0 nova_compute[192810]: 2025-09-30 22:01:23.846 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:01:23 compute-0 nova_compute[192810]: 2025-09-30 22:01:23.846 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:01:23 compute-0 nova_compute[192810]: 2025-09-30 22:01:23.847 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:01:23 compute-0 nova_compute[192810]: 2025-09-30 22:01:23.847 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 22:01:23 compute-0 nova_compute[192810]: 2025-09-30 22:01:23.903 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:01:24 compute-0 nova_compute[192810]: 2025-09-30 22:01:24.030 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 22:01:24 compute-0 nova_compute[192810]: 2025-09-30 22:01:24.031 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5669MB free_disk=73.09438705444336GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 22:01:24 compute-0 nova_compute[192810]: 2025-09-30 22:01:24.031 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:01:24 compute-0 nova_compute[192810]: 2025-09-30 22:01:24.032 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:01:24 compute-0 nova_compute[192810]: 2025-09-30 22:01:24.480 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 22:01:24 compute-0 nova_compute[192810]: 2025-09-30 22:01:24.480 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 22:01:24 compute-0 nova_compute[192810]: 2025-09-30 22:01:24.515 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 22:01:24 compute-0 nova_compute[192810]: 2025-09-30 22:01:24.532 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 22:01:24 compute-0 nova_compute[192810]: 2025-09-30 22:01:24.551 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 22:01:24 compute-0 nova_compute[192810]: 2025-09-30 22:01:24.552 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.520s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:01:25 compute-0 nova_compute[192810]: 2025-09-30 22:01:25.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:01:27 compute-0 podman[256746]: 2025-09-30 22:01:27.326887972 +0000 UTC m=+0.061649926 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 22:01:27 compute-0 podman[256745]: 2025-09-30 22:01:27.331886606 +0000 UTC m=+0.064854005 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 22:01:27 compute-0 podman[256747]: 2025-09-30 22:01:27.332065901 +0000 UTC m=+0.065061141 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 22:01:28 compute-0 nova_compute[192810]: 2025-09-30 22:01:28.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:01:30 compute-0 nova_compute[192810]: 2025-09-30 22:01:30.310 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:01:33 compute-0 nova_compute[192810]: 2025-09-30 22:01:33.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:01:35 compute-0 nova_compute[192810]: 2025-09-30 22:01:35.311 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:01:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 22:01:38.766 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:01:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 22:01:38.767 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:01:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 22:01:38.767 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:01:38 compute-0 nova_compute[192810]: 2025-09-30 22:01:38.937 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:01:40 compute-0 nova_compute[192810]: 2025-09-30 22:01:40.311 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:01:40 compute-0 podman[256805]: 2025-09-30 22:01:40.321997825 +0000 UTC m=+0.054542189 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 22:01:40 compute-0 podman[256806]: 2025-09-30 22:01:40.335739697 +0000 UTC m=+0.064622350 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250923, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Sep 30 22:01:40 compute-0 podman[256804]: 2025-09-30 22:01:40.378421499 +0000 UTC m=+0.104947293 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 22:01:43 compute-0 nova_compute[192810]: 2025-09-30 22:01:43.938 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:01:45 compute-0 nova_compute[192810]: 2025-09-30 22:01:45.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:01:48 compute-0 nova_compute[192810]: 2025-09-30 22:01:48.941 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:01:50 compute-0 podman[256866]: 2025-09-30 22:01:50.312669453 +0000 UTC m=+0.050018567 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 22:01:50 compute-0 nova_compute[192810]: 2025-09-30 22:01:50.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:01:50 compute-0 podman[256867]: 2025-09-30 22:01:50.317470402 +0000 UTC m=+0.047478373 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, architecture=x86_64, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-type=git, version=9.6, io.openshift.expose-services=)
Sep 30 22:01:53 compute-0 nova_compute[192810]: 2025-09-30 22:01:53.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:01:55 compute-0 nova_compute[192810]: 2025-09-30 22:01:55.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:01:58 compute-0 podman[256915]: 2025-09-30 22:01:58.317407331 +0000 UTC m=+0.044873858 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 22:01:58 compute-0 podman[256914]: 2025-09-30 22:01:58.317462672 +0000 UTC m=+0.047652777 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, container_name=iscsid, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Sep 30 22:01:58 compute-0 podman[256913]: 2025-09-30 22:01:58.319292628 +0000 UTC m=+0.052787815 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2)
Sep 30 22:01:58 compute-0 nova_compute[192810]: 2025-09-30 22:01:58.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:02:00 compute-0 nova_compute[192810]: 2025-09-30 22:02:00.317 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:02:03 compute-0 nova_compute[192810]: 2025-09-30 22:02:03.944 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:02:05 compute-0 nova_compute[192810]: 2025-09-30 22:02:05.319 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:02:08 compute-0 nova_compute[192810]: 2025-09-30 22:02:08.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:02:10 compute-0 nova_compute[192810]: 2025-09-30 22:02:10.321 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:02:11 compute-0 podman[256973]: 2025-09-30 22:02:11.340005369 +0000 UTC m=+0.060982529 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 22:02:11 compute-0 podman[256974]: 2025-09-30 22:02:11.348646684 +0000 UTC m=+0.064752423 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Sep 30 22:02:11 compute-0 podman[256972]: 2025-09-30 22:02:11.372440396 +0000 UTC m=+0.099982360 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 22:02:12 compute-0 nova_compute[192810]: 2025-09-30 22:02:12.553 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:02:12 compute-0 nova_compute[192810]: 2025-09-30 22:02:12.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:02:12 compute-0 nova_compute[192810]: 2025-09-30 22:02:12.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:02:12 compute-0 nova_compute[192810]: 2025-09-30 22:02:12.787 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 22:02:13 compute-0 nova_compute[192810]: 2025-09-30 22:02:13.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:02:15 compute-0 nova_compute[192810]: 2025-09-30 22:02:15.377 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:02:18 compute-0 nova_compute[192810]: 2025-09-30 22:02:18.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:02:19 compute-0 nova_compute[192810]: 2025-09-30 22:02:19.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:02:19 compute-0 nova_compute[192810]: 2025-09-30 22:02:19.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 22:02:19 compute-0 nova_compute[192810]: 2025-09-30 22:02:19.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 22:02:19 compute-0 nova_compute[192810]: 2025-09-30 22:02:19.804 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 22:02:20 compute-0 nova_compute[192810]: 2025-09-30 22:02:20.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:02:21 compute-0 podman[257031]: 2025-09-30 22:02:21.305408148 +0000 UTC m=+0.047792542 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Sep 30 22:02:21 compute-0 podman[257032]: 2025-09-30 22:02:21.341461055 +0000 UTC m=+0.080557897 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, architecture=x86_64, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 22:02:21 compute-0 nova_compute[192810]: 2025-09-30 22:02:21.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:02:21 compute-0 nova_compute[192810]: 2025-09-30 22:02:21.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:02:23 compute-0 nova_compute[192810]: 2025-09-30 22:02:23.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:02:24 compute-0 nova_compute[192810]: 2025-09-30 22:02:24.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:02:24 compute-0 nova_compute[192810]: 2025-09-30 22:02:24.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:02:25 compute-0 nova_compute[192810]: 2025-09-30 22:02:25.380 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:02:25 compute-0 nova_compute[192810]: 2025-09-30 22:02:25.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:02:25 compute-0 nova_compute[192810]: 2025-09-30 22:02:25.991 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:02:25 compute-0 nova_compute[192810]: 2025-09-30 22:02:25.991 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:02:25 compute-0 nova_compute[192810]: 2025-09-30 22:02:25.991 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:02:25 compute-0 nova_compute[192810]: 2025-09-30 22:02:25.992 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 22:02:26 compute-0 nova_compute[192810]: 2025-09-30 22:02:26.123 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 22:02:26 compute-0 nova_compute[192810]: 2025-09-30 22:02:26.124 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5688MB free_disk=73.09436798095703GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 22:02:26 compute-0 nova_compute[192810]: 2025-09-30 22:02:26.124 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:02:26 compute-0 nova_compute[192810]: 2025-09-30 22:02:26.125 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:02:26 compute-0 nova_compute[192810]: 2025-09-30 22:02:26.365 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 22:02:26 compute-0 nova_compute[192810]: 2025-09-30 22:02:26.365 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 22:02:26 compute-0 nova_compute[192810]: 2025-09-30 22:02:26.501 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Refreshing inventories for resource provider fe423b93-de5a-41f7-97d1-9622ea46af54 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Sep 30 22:02:26 compute-0 nova_compute[192810]: 2025-09-30 22:02:26.535 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Updating ProviderTree inventory for provider fe423b93-de5a-41f7-97d1-9622ea46af54 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Sep 30 22:02:26 compute-0 nova_compute[192810]: 2025-09-30 22:02:26.535 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Updating inventory in ProviderTree for provider fe423b93-de5a-41f7-97d1-9622ea46af54 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Sep 30 22:02:26 compute-0 nova_compute[192810]: 2025-09-30 22:02:26.568 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Refreshing aggregate associations for resource provider fe423b93-de5a-41f7-97d1-9622ea46af54, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Sep 30 22:02:26 compute-0 nova_compute[192810]: 2025-09-30 22:02:26.618 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Refreshing trait associations for resource provider fe423b93-de5a-41f7-97d1-9622ea46af54, traits: COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_RESCUE_BFV,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_1_2,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE42,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Sep 30 22:02:26 compute-0 nova_compute[192810]: 2025-09-30 22:02:26.654 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 22:02:26 compute-0 nova_compute[192810]: 2025-09-30 22:02:26.691 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 22:02:26 compute-0 nova_compute[192810]: 2025-09-30 22:02:26.693 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 22:02:26 compute-0 nova_compute[192810]: 2025-09-30 22:02:26.693 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.568s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:02:28 compute-0 nova_compute[192810]: 2025-09-30 22:02:28.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:02:29 compute-0 podman[257075]: 2025-09-30 22:02:29.317918181 +0000 UTC m=+0.054697280 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd)
Sep 30 22:02:29 compute-0 podman[257077]: 2025-09-30 22:02:29.31906037 +0000 UTC m=+0.046862494 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 22:02:29 compute-0 podman[257076]: 2025-09-30 22:02:29.336348783 +0000 UTC m=+0.057968073 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250923, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 22:02:30 compute-0 nova_compute[192810]: 2025-09-30 22:02:30.382 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:02:31 compute-0 nova_compute[192810]: 2025-09-30 22:02:31.689 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:02:33 compute-0 nova_compute[192810]: 2025-09-30 22:02:33.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:02:35 compute-0 nova_compute[192810]: 2025-09-30 22:02:35.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:02:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 22:02:38.767 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:02:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 22:02:38.767 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:02:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 22:02:38.768 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:02:38 compute-0 nova_compute[192810]: 2025-09-30 22:02:38.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:02:40 compute-0 nova_compute[192810]: 2025-09-30 22:02:40.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:02:42 compute-0 podman[257140]: 2025-09-30 22:02:42.337510196 +0000 UTC m=+0.062659211 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Sep 30 22:02:42 compute-0 podman[257135]: 2025-09-30 22:02:42.337970808 +0000 UTC m=+0.065854781 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Sep 30 22:02:42 compute-0 podman[257134]: 2025-09-30 22:02:42.357592449 +0000 UTC m=+0.094564330 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 22:02:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:02:43.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:02:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:02:43.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:02:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:02:43.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:02:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:02:43.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:02:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:02:43.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:02:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:02:43.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:02:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:02:43.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:02:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:02:43.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:02:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:02:43.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:02:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:02:43.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:02:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:02:43.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:02:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:02:43.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:02:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:02:43.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:02:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:02:43.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:02:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:02:43.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:02:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:02:43.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:02:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:02:43.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:02:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:02:43.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:02:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:02:43.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:02:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:02:43.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:02:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:02:43.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:02:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:02:43.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:02:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:02:43.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:02:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:02:43.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:02:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:02:43.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:02:43 compute-0 nova_compute[192810]: 2025-09-30 22:02:43.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:02:45 compute-0 nova_compute[192810]: 2025-09-30 22:02:45.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:02:48 compute-0 nova_compute[192810]: 2025-09-30 22:02:48.992 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:02:50 compute-0 nova_compute[192810]: 2025-09-30 22:02:50.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:02:52 compute-0 podman[257196]: 2025-09-30 22:02:52.304265726 +0000 UTC m=+0.045691816 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 22:02:52 compute-0 podman[257197]: 2025-09-30 22:02:52.30801806 +0000 UTC m=+0.048260720 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., release=1755695350, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Sep 30 22:02:53 compute-0 nova_compute[192810]: 2025-09-30 22:02:53.992 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:02:55 compute-0 nova_compute[192810]: 2025-09-30 22:02:55.391 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:02:58 compute-0 nova_compute[192810]: 2025-09-30 22:02:58.993 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:03:00 compute-0 podman[257241]: 2025-09-30 22:03:00.31333484 +0000 UTC m=+0.053527192 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250923, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 22:03:00 compute-0 podman[257248]: 2025-09-30 22:03:00.324458569 +0000 UTC m=+0.053578643 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 22:03:00 compute-0 podman[257242]: 2025-09-30 22:03:00.327926056 +0000 UTC m=+0.062444536 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, container_name=iscsid, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3)
Sep 30 22:03:00 compute-0 nova_compute[192810]: 2025-09-30 22:03:00.392 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:03:03 compute-0 nova_compute[192810]: 2025-09-30 22:03:03.995 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:03:05 compute-0 nova_compute[192810]: 2025-09-30 22:03:05.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:03:08 compute-0 nova_compute[192810]: 2025-09-30 22:03:08.998 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:03:10 compute-0 nova_compute[192810]: 2025-09-30 22:03:10.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:03:11 compute-0 nova_compute[192810]: 2025-09-30 22:03:11.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:03:12 compute-0 sshd[128205]: Timeout before authentication for connection from 113.240.110.90 to 38.102.83.69, pid = 256697
Sep 30 22:03:12 compute-0 podman[257304]: 2025-09-30 22:03:12.434339141 +0000 UTC m=+0.049491150 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3)
Sep 30 22:03:12 compute-0 podman[257305]: 2025-09-30 22:03:12.435049289 +0000 UTC m=+0.050264540 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.build-date=20250923, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, 
container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true)
Sep 30 22:03:12 compute-0 podman[257306]: 2025-09-30 22:03:12.461455721 +0000 UTC m=+0.072947689 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Sep 30 22:03:14 compute-0 nova_compute[192810]: 2025-09-30 22:03:13.999 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:03:14 compute-0 nova_compute[192810]: 2025-09-30 22:03:14.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:03:14 compute-0 nova_compute[192810]: 2025-09-30 22:03:14.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:03:14 compute-0 nova_compute[192810]: 2025-09-30 22:03:14.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 22:03:15 compute-0 nova_compute[192810]: 2025-09-30 22:03:15.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:03:19 compute-0 nova_compute[192810]: 2025-09-30 22:03:19.002 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:03:19 compute-0 nova_compute[192810]: 2025-09-30 22:03:19.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:03:19 compute-0 nova_compute[192810]: 2025-09-30 22:03:19.789 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 22:03:19 compute-0 nova_compute[192810]: 2025-09-30 22:03:19.789 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 22:03:19 compute-0 nova_compute[192810]: 2025-09-30 22:03:19.804 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 22:03:20 compute-0 nova_compute[192810]: 2025-09-30 22:03:20.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:03:21 compute-0 nova_compute[192810]: 2025-09-30 22:03:21.798 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:03:22 compute-0 sshd[128205]: drop connection #0 from [113.240.110.90]:60112 on [38.102.83.69]:22 penalty: exceeded LoginGraceTime
Sep 30 22:03:22 compute-0 nova_compute[192810]: 2025-09-30 22:03:22.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:03:23 compute-0 podman[257370]: 2025-09-30 22:03:23.318440095 +0000 UTC m=+0.053287045 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, distribution-scope=public, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.openshift.expose-services=, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vendor=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc.)
Sep 30 22:03:23 compute-0 podman[257369]: 2025-09-30 22:03:23.334352424 +0000 UTC m=+0.072943018 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 22:03:24 compute-0 nova_compute[192810]: 2025-09-30 22:03:24.003 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:03:24 compute-0 nova_compute[192810]: 2025-09-30 22:03:24.786 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:03:25 compute-0 nova_compute[192810]: 2025-09-30 22:03:25.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:03:26 compute-0 nova_compute[192810]: 2025-09-30 22:03:26.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:03:26 compute-0 nova_compute[192810]: 2025-09-30 22:03:26.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:03:26 compute-0 nova_compute[192810]: 2025-09-30 22:03:26.845 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:03:26 compute-0 nova_compute[192810]: 2025-09-30 22:03:26.846 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:03:26 compute-0 nova_compute[192810]: 2025-09-30 22:03:26.846 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:03:26 compute-0 nova_compute[192810]: 2025-09-30 22:03:26.847 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 22:03:26 compute-0 nova_compute[192810]: 2025-09-30 22:03:26.987 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 22:03:26 compute-0 nova_compute[192810]: 2025-09-30 22:03:26.988 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5709MB free_disk=73.09444808959961GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 22:03:26 compute-0 nova_compute[192810]: 2025-09-30 22:03:26.988 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:03:26 compute-0 nova_compute[192810]: 2025-09-30 22:03:26.988 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:03:27 compute-0 nova_compute[192810]: 2025-09-30 22:03:27.054 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 22:03:27 compute-0 nova_compute[192810]: 2025-09-30 22:03:27.054 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 22:03:27 compute-0 nova_compute[192810]: 2025-09-30 22:03:27.080 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 22:03:27 compute-0 nova_compute[192810]: 2025-09-30 22:03:27.102 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 22:03:27 compute-0 nova_compute[192810]: 2025-09-30 22:03:27.104 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 22:03:27 compute-0 nova_compute[192810]: 2025-09-30 22:03:27.104 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:03:29 compute-0 nova_compute[192810]: 2025-09-30 22:03:29.004 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:03:30 compute-0 nova_compute[192810]: 2025-09-30 22:03:30.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:03:31 compute-0 podman[257414]: 2025-09-30 22:03:31.313982041 +0000 UTC m=+0.045793208 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 22:03:31 compute-0 podman[257412]: 2025-09-30 22:03:31.316384961 +0000 UTC m=+0.053014939 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 22:03:31 compute-0 podman[257413]: 2025-09-30 22:03:31.329368166 +0000 UTC m=+0.062467516 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=iscsid)
Sep 30 22:03:34 compute-0 nova_compute[192810]: 2025-09-30 22:03:34.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:03:35 compute-0 nova_compute[192810]: 2025-09-30 22:03:35.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:03:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 22:03:38.768 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:03:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 22:03:38.768 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:03:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 22:03:38.769 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:03:39 compute-0 nova_compute[192810]: 2025-09-30 22:03:39.008 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:03:40 compute-0 nova_compute[192810]: 2025-09-30 22:03:40.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:03:43 compute-0 podman[257476]: 2025-09-30 22:03:43.330844364 +0000 UTC m=+0.066383974 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250923)
Sep 30 22:03:43 compute-0 podman[257475]: 2025-09-30 22:03:43.338903236 +0000 UTC m=+0.080184430 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 22:03:43 compute-0 podman[257477]: 2025-09-30 22:03:43.353593064 +0000 UTC m=+0.085888153 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Sep 30 22:03:44 compute-0 nova_compute[192810]: 2025-09-30 22:03:44.010 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:03:45 compute-0 nova_compute[192810]: 2025-09-30 22:03:45.406 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:03:49 compute-0 nova_compute[192810]: 2025-09-30 22:03:49.011 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:03:50 compute-0 nova_compute[192810]: 2025-09-30 22:03:50.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:03:54 compute-0 nova_compute[192810]: 2025-09-30 22:03:54.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:03:54 compute-0 podman[257540]: 2025-09-30 22:03:54.314897721 +0000 UTC m=+0.049998363 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, architecture=x86_64, distribution-scope=public, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, vcs-type=git, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.openshift.expose-services=, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9)
Sep 30 22:03:54 compute-0 podman[257539]: 2025-09-30 22:03:54.330530613 +0000 UTC m=+0.071688087 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 22:03:55 compute-0 nova_compute[192810]: 2025-09-30 22:03:55.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:03:59 compute-0 nova_compute[192810]: 2025-09-30 22:03:59.017 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:04:00 compute-0 nova_compute[192810]: 2025-09-30 22:04:00.411 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:04:02 compute-0 podman[257583]: 2025-09-30 22:04:02.31378264 +0000 UTC m=+0.047515281 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 22:04:02 compute-0 podman[257582]: 2025-09-30 22:04:02.326914059 +0000 UTC m=+0.064865376 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.license=GPLv2, config_id=iscsid, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 22:04:02 compute-0 podman[257581]: 2025-09-30 22:04:02.340475279 +0000 UTC m=+0.081781060 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250923)
Sep 30 22:04:04 compute-0 nova_compute[192810]: 2025-09-30 22:04:04.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:04:05 compute-0 nova_compute[192810]: 2025-09-30 22:04:05.413 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:04:09 compute-0 nova_compute[192810]: 2025-09-30 22:04:09.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:04:10 compute-0 nova_compute[192810]: 2025-09-30 22:04:10.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:04:14 compute-0 nova_compute[192810]: 2025-09-30 22:04:14.020 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:04:14 compute-0 nova_compute[192810]: 2025-09-30 22:04:14.105 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:04:14 compute-0 podman[257646]: 2025-09-30 22:04:14.314305844 +0000 UTC m=+0.048856295 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 22:04:14 compute-0 podman[257644]: 2025-09-30 22:04:14.333283609 +0000 UTC m=+0.074918858 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923)
Sep 30 22:04:14 compute-0 podman[257645]: 2025-09-30 22:04:14.333523775 +0000 UTC m=+0.072038476 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Sep 30 22:04:14 compute-0 nova_compute[192810]: 2025-09-30 22:04:14.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:04:14 compute-0 nova_compute[192810]: 2025-09-30 22:04:14.787 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 22:04:15 compute-0 nova_compute[192810]: 2025-09-30 22:04:15.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:04:16 compute-0 nova_compute[192810]: 2025-09-30 22:04:16.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:04:19 compute-0 nova_compute[192810]: 2025-09-30 22:04:19.022 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:04:20 compute-0 nova_compute[192810]: 2025-09-30 22:04:20.419 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:04:21 compute-0 nova_compute[192810]: 2025-09-30 22:04:21.783 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:04:21 compute-0 nova_compute[192810]: 2025-09-30 22:04:21.786 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:04:21 compute-0 nova_compute[192810]: 2025-09-30 22:04:21.787 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 22:04:21 compute-0 nova_compute[192810]: 2025-09-30 22:04:21.787 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 22:04:21 compute-0 nova_compute[192810]: 2025-09-30 22:04:21.932 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 22:04:22 compute-0 nova_compute[192810]: 2025-09-30 22:04:22.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:04:24 compute-0 nova_compute[192810]: 2025-09-30 22:04:24.022 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:04:25 compute-0 podman[257710]: 2025-09-30 22:04:25.300874304 +0000 UTC m=+0.042539557 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 22:04:25 compute-0 podman[257711]: 2025-09-30 22:04:25.312372142 +0000 UTC m=+0.050670421 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, version=9.6, release=1755695350, vcs-type=git, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., distribution-scope=public, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter)
Sep 30 22:04:25 compute-0 nova_compute[192810]: 2025-09-30 22:04:25.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:04:25 compute-0 nova_compute[192810]: 2025-09-30 22:04:25.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:04:26 compute-0 nova_compute[192810]: 2025-09-30 22:04:26.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:04:26 compute-0 nova_compute[192810]: 2025-09-30 22:04:26.876 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:04:26 compute-0 nova_compute[192810]: 2025-09-30 22:04:26.876 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:04:26 compute-0 nova_compute[192810]: 2025-09-30 22:04:26.876 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:04:26 compute-0 nova_compute[192810]: 2025-09-30 22:04:26.877 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 22:04:26 compute-0 nova_compute[192810]: 2025-09-30 22:04:26.995 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 22:04:26 compute-0 nova_compute[192810]: 2025-09-30 22:04:26.996 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5713MB free_disk=73.09444808959961GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 22:04:26 compute-0 nova_compute[192810]: 2025-09-30 22:04:26.996 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:04:26 compute-0 nova_compute[192810]: 2025-09-30 22:04:26.997 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:04:27 compute-0 nova_compute[192810]: 2025-09-30 22:04:27.210 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 22:04:27 compute-0 nova_compute[192810]: 2025-09-30 22:04:27.211 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 22:04:27 compute-0 nova_compute[192810]: 2025-09-30 22:04:27.270 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 22:04:27 compute-0 nova_compute[192810]: 2025-09-30 22:04:27.287 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 22:04:27 compute-0 nova_compute[192810]: 2025-09-30 22:04:27.288 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 22:04:27 compute-0 nova_compute[192810]: 2025-09-30 22:04:27.288 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.292s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:04:28 compute-0 nova_compute[192810]: 2025-09-30 22:04:28.288 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:04:29 compute-0 nova_compute[192810]: 2025-09-30 22:04:29.023 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:04:30 compute-0 nova_compute[192810]: 2025-09-30 22:04:30.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:04:33 compute-0 nova_compute[192810]: 2025-09-30 22:04:33.022 2 DEBUG oslo_concurrency.processutils [None req-6b86cd02-5ccd-4f89-9da4-d90bf5d34515 9765353c43d34d7a870f0c35895bd320 5fb5d4b07ed54e6cb716f880185e34d5 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Sep 30 22:04:33 compute-0 nova_compute[192810]: 2025-09-30 22:04:33.042 2 DEBUG oslo_concurrency.processutils [None req-6b86cd02-5ccd-4f89-9da4-d90bf5d34515 9765353c43d34d7a870f0c35895bd320 5fb5d4b07ed54e6cb716f880185e34d5 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.020s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Sep 30 22:04:33 compute-0 podman[257755]: 2025-09-30 22:04:33.31398855 +0000 UTC m=+0.045472221 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 22:04:33 compute-0 podman[257753]: 2025-09-30 22:04:33.313955309 +0000 UTC m=+0.053057690 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250923)
Sep 30 22:04:33 compute-0 podman[257754]: 2025-09-30 22:04:33.314061162 +0000 UTC m=+0.047575603 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=iscsid)
Sep 30 22:04:33 compute-0 nova_compute[192810]: 2025-09-30 22:04:33.783 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:04:34 compute-0 nova_compute[192810]: 2025-09-30 22:04:34.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:04:35 compute-0 nova_compute[192810]: 2025-09-30 22:04:35.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:04:37 compute-0 nova_compute[192810]: 2025-09-30 22:04:37.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:04:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 22:04:37.891 103867 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=50, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '86:19:25', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '8e:3a:19:78:09:10'}, ipsec=False) old=SB_Global(nb_cfg=49) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Sep 30 22:04:37 compute-0 ovn_metadata_agent[103862]: 2025-09-30 22:04:37.893 103867 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Sep 30 22:04:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 22:04:38.769 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:04:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 22:04:38.770 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:04:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 22:04:38.770 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:04:39 compute-0 nova_compute[192810]: 2025-09-30 22:04:39.026 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:04:40 compute-0 nova_compute[192810]: 2025-09-30 22:04:40.425 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:04:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:04:43.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:04:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:04:43.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:04:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:04:43.914 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:04:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:04:43.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:04:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:04:43.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:04:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:04:43.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:04:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:04:43.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:04:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:04:43.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:04:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:04:43.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:04:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:04:43.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:04:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:04:43.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:04:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:04:43.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:04:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:04:43.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:04:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:04:43.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:04:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:04:43.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:04:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:04:43.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:04:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:04:43.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:04:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:04:43.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:04:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:04:43.915 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:04:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:04:43.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:04:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:04:43.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:04:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:04:43.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:04:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:04:43.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:04:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:04:43.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:04:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:04:43.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:04:44 compute-0 nova_compute[192810]: 2025-09-30 22:04:44.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:04:44 compute-0 ovn_metadata_agent[103862]: 2025-09-30 22:04:44.895 103867 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3b817c7f-1137-4e8f-8263-8c5e6eddafa4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '50'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 22:04:45 compute-0 podman[257817]: 2025-09-30 22:04:45.315232003 +0000 UTC m=+0.048200938 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=edpm)
Sep 30 22:04:45 compute-0 podman[257816]: 2025-09-30 22:04:45.315418337 +0000 UTC m=+0.049136391 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Sep 30 22:04:45 compute-0 podman[257815]: 2025-09-30 22:04:45.336221618 +0000 UTC m=+0.075285235 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Sep 30 22:04:45 compute-0 nova_compute[192810]: 2025-09-30 22:04:45.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:04:49 compute-0 nova_compute[192810]: 2025-09-30 22:04:49.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:04:50 compute-0 nova_compute[192810]: 2025-09-30 22:04:50.428 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:04:54 compute-0 nova_compute[192810]: 2025-09-30 22:04:54.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:04:55 compute-0 nova_compute[192810]: 2025-09-30 22:04:55.430 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:04:56 compute-0 podman[257880]: 2025-09-30 22:04:56.306794437 +0000 UTC m=+0.048801613 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 22:04:56 compute-0 podman[257881]: 2025-09-30 22:04:56.318775047 +0000 UTC m=+0.053070410 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, version=9.6, architecture=x86_64, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=edpm, release=1755695350, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, container_name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container)
Sep 30 22:04:59 compute-0 nova_compute[192810]: 2025-09-30 22:04:59.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:05:00 compute-0 nova_compute[192810]: 2025-09-30 22:05:00.431 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:05:04 compute-0 nova_compute[192810]: 2025-09-30 22:05:04.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:05:04 compute-0 podman[257927]: 2025-09-30 22:05:04.316941029 +0000 UTC m=+0.052333332 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 22:05:04 compute-0 podman[257926]: 2025-09-30 22:05:04.316926719 +0000 UTC m=+0.056980869 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20250923, tcib_managed=true, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3)
Sep 30 22:05:04 compute-0 podman[257925]: 2025-09-30 22:05:04.319466652 +0000 UTC m=+0.058486106 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=multipathd)
Sep 30 22:05:05 compute-0 nova_compute[192810]: 2025-09-30 22:05:05.433 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:05:09 compute-0 nova_compute[192810]: 2025-09-30 22:05:09.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:05:10 compute-0 nova_compute[192810]: 2025-09-30 22:05:10.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:05:14 compute-0 nova_compute[192810]: 2025-09-30 22:05:14.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:05:14 compute-0 sshd-session[257984]: Connection reset by 198.235.24.193 port 63556 [preauth]
Sep 30 22:05:15 compute-0 nova_compute[192810]: 2025-09-30 22:05:15.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:05:15 compute-0 nova_compute[192810]: 2025-09-30 22:05:15.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:05:15 compute-0 nova_compute[192810]: 2025-09-30 22:05:15.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:05:15 compute-0 nova_compute[192810]: 2025-09-30 22:05:15.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 22:05:16 compute-0 podman[257987]: 2025-09-30 22:05:16.32254277 +0000 UTC m=+0.052522656 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, 
tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 22:05:16 compute-0 podman[257988]: 2025-09-30 22:05:16.33448387 +0000 UTC m=+0.061156753 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, 
tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 22:05:16 compute-0 podman[257986]: 2025-09-30 22:05:16.358091861 +0000 UTC m=+0.089352129 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3)
Sep 30 22:05:18 compute-0 nova_compute[192810]: 2025-09-30 22:05:18.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:05:19 compute-0 nova_compute[192810]: 2025-09-30 22:05:19.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:05:20 compute-0 nova_compute[192810]: 2025-09-30 22:05:20.437 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:05:23 compute-0 nova_compute[192810]: 2025-09-30 22:05:23.783 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:05:23 compute-0 nova_compute[192810]: 2025-09-30 22:05:23.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:05:23 compute-0 nova_compute[192810]: 2025-09-30 22:05:23.787 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 22:05:23 compute-0 nova_compute[192810]: 2025-09-30 22:05:23.787 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 22:05:23 compute-0 nova_compute[192810]: 2025-09-30 22:05:23.830 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 22:05:23 compute-0 nova_compute[192810]: 2025-09-30 22:05:23.831 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:05:23 compute-0 nova_compute[192810]: 2025-09-30 22:05:23.831 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:05:23 compute-0 nova_compute[192810]: 2025-09-30 22:05:23.831 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Sep 30 22:05:24 compute-0 nova_compute[192810]: 2025-09-30 22:05:24.095 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:05:25 compute-0 nova_compute[192810]: 2025-09-30 22:05:25.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:05:26 compute-0 nova_compute[192810]: 2025-09-30 22:05:26.807 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:05:26 compute-0 nova_compute[192810]: 2025-09-30 22:05:26.989 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:05:26 compute-0 nova_compute[192810]: 2025-09-30 22:05:26.990 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:05:26 compute-0 nova_compute[192810]: 2025-09-30 22:05:26.990 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:05:26 compute-0 nova_compute[192810]: 2025-09-30 22:05:26.991 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 22:05:27 compute-0 nova_compute[192810]: 2025-09-30 22:05:27.124 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 22:05:27 compute-0 nova_compute[192810]: 2025-09-30 22:05:27.125 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5704MB free_disk=73.0942153930664GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 22:05:27 compute-0 nova_compute[192810]: 2025-09-30 22:05:27.125 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:05:27 compute-0 nova_compute[192810]: 2025-09-30 22:05:27.126 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:05:27 compute-0 nova_compute[192810]: 2025-09-30 22:05:27.209 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 22:05:27 compute-0 nova_compute[192810]: 2025-09-30 22:05:27.209 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 22:05:27 compute-0 nova_compute[192810]: 2025-09-30 22:05:27.291 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 22:05:27 compute-0 podman[258049]: 2025-09-30 22:05:27.303335617 +0000 UTC m=+0.046179548 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Sep 30 22:05:27 compute-0 podman[258050]: 2025-09-30 22:05:27.30864803 +0000 UTC m=+0.048102026 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_id=edpm, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350)
Sep 30 22:05:27 compute-0 nova_compute[192810]: 2025-09-30 22:05:27.322 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 22:05:27 compute-0 nova_compute[192810]: 2025-09-30 22:05:27.323 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 22:05:27 compute-0 nova_compute[192810]: 2025-09-30 22:05:27.323 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.198s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:05:28 compute-0 nova_compute[192810]: 2025-09-30 22:05:28.304 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:05:28 compute-0 nova_compute[192810]: 2025-09-30 22:05:28.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:05:29 compute-0 nova_compute[192810]: 2025-09-30 22:05:29.096 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:05:30 compute-0 nova_compute[192810]: 2025-09-30 22:05:30.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:05:30 compute-0 nova_compute[192810]: 2025-09-30 22:05:30.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:05:30 compute-0 nova_compute[192810]: 2025-09-30 22:05:30.787 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Sep 30 22:05:30 compute-0 nova_compute[192810]: 2025-09-30 22:05:30.816 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Sep 30 22:05:34 compute-0 nova_compute[192810]: 2025-09-30 22:05:34.099 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:05:34 compute-0 sshd-session[258094]: Connection closed by 92.118.39.95 port 33276
Sep 30 22:05:35 compute-0 podman[258095]: 2025-09-30 22:05:35.31393607 +0000 UTC m=+0.055824359 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible)
Sep 30 22:05:35 compute-0 podman[258097]: 2025-09-30 22:05:35.337407648 +0000 UTC m=+0.072808375 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 22:05:35 compute-0 podman[258096]: 2025-09-30 22:05:35.337433079 +0000 UTC m=+0.076458747 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible, org.label-schema.build-date=20250923)
Sep 30 22:05:35 compute-0 nova_compute[192810]: 2025-09-30 22:05:35.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:05:35 compute-0 nova_compute[192810]: 2025-09-30 22:05:35.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:05:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 22:05:38.770 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:05:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 22:05:38.770 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:05:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 22:05:38.770 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:05:39 compute-0 nova_compute[192810]: 2025-09-30 22:05:39.102 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:05:40 compute-0 nova_compute[192810]: 2025-09-30 22:05:40.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:05:44 compute-0 nova_compute[192810]: 2025-09-30 22:05:44.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:05:45 compute-0 nova_compute[192810]: 2025-09-30 22:05:45.444 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:05:47 compute-0 podman[258159]: 2025-09-30 22:05:47.32314645 +0000 UTC m=+0.052327002 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 22:05:47 compute-0 podman[258158]: 2025-09-30 22:05:47.393499914 +0000 UTC m=+0.129433505 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Sep 30 22:05:47 compute-0 podman[258160]: 2025-09-30 22:05:47.400982792 +0000 UTC m=+0.128820930 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20250923, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Sep 30 22:05:49 compute-0 nova_compute[192810]: 2025-09-30 22:05:49.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:05:50 compute-0 nova_compute[192810]: 2025-09-30 22:05:50.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:05:54 compute-0 nova_compute[192810]: 2025-09-30 22:05:54.105 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:05:55 compute-0 nova_compute[192810]: 2025-09-30 22:05:55.446 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:05:58 compute-0 podman[258221]: 2025-09-30 22:05:58.303431534 +0000 UTC m=+0.044773613 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Sep 30 22:05:58 compute-0 podman[258222]: 2025-09-30 22:05:58.31248912 +0000 UTC m=+0.051407599 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, container_name=openstack_network_exporter, vendor=Red Hat, Inc., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., release=1755695350, version=9.6, io.buildah.version=1.33.7, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 22:05:59 compute-0 nova_compute[192810]: 2025-09-30 22:05:59.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:06:00 compute-0 nova_compute[192810]: 2025-09-30 22:06:00.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:06:04 compute-0 nova_compute[192810]: 2025-09-30 22:06:04.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:06:05 compute-0 nova_compute[192810]: 2025-09-30 22:06:05.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:06:06 compute-0 podman[258265]: 2025-09-30 22:06:06.348589702 +0000 UTC m=+0.065616035 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 22:06:06 compute-0 podman[258264]: 2025-09-30 22:06:06.354680554 +0000 UTC m=+0.084581240 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=iscsid, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team)
Sep 30 22:06:06 compute-0 podman[258263]: 2025-09-30 22:06:06.371744392 +0000 UTC m=+0.103382241 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, container_name=multipathd)
Sep 30 22:06:09 compute-0 nova_compute[192810]: 2025-09-30 22:06:09.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:06:10 compute-0 nova_compute[192810]: 2025-09-30 22:06:10.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:06:14 compute-0 nova_compute[192810]: 2025-09-30 22:06:14.152 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:06:15 compute-0 nova_compute[192810]: 2025-09-30 22:06:15.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:06:16 compute-0 nova_compute[192810]: 2025-09-30 22:06:16.810 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:06:16 compute-0 nova_compute[192810]: 2025-09-30 22:06:16.811 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:06:16 compute-0 nova_compute[192810]: 2025-09-30 22:06:16.811 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 22:06:18 compute-0 podman[258326]: 2025-09-30 22:06:18.319592847 +0000 UTC m=+0.050346542 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 22:06:18 compute-0 podman[258327]: 2025-09-30 22:06:18.33050856 +0000 UTC m=+0.061246015 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 22:06:18 compute-0 podman[258325]: 2025-09-30 22:06:18.348631595 +0000 UTC m=+0.082959650 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Sep 30 22:06:19 compute-0 nova_compute[192810]: 2025-09-30 22:06:19.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:06:19 compute-0 nova_compute[192810]: 2025-09-30 22:06:19.789 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:06:20 compute-0 nova_compute[192810]: 2025-09-30 22:06:20.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:06:23 compute-0 nova_compute[192810]: 2025-09-30 22:06:23.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:06:23 compute-0 nova_compute[192810]: 2025-09-30 22:06:23.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 22:06:23 compute-0 nova_compute[192810]: 2025-09-30 22:06:23.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 22:06:23 compute-0 nova_compute[192810]: 2025-09-30 22:06:23.807 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 22:06:24 compute-0 nova_compute[192810]: 2025-09-30 22:06:24.156 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:06:24 compute-0 sshd-session[258390]: Invalid user config from 80.94.95.115 port 24652
Sep 30 22:06:24 compute-0 nova_compute[192810]: 2025-09-30 22:06:24.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:06:24 compute-0 nova_compute[192810]: 2025-09-30 22:06:24.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:06:25 compute-0 nova_compute[192810]: 2025-09-30 22:06:25.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:06:25 compute-0 sshd-session[258390]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 22:06:25 compute-0 sshd-session[258390]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.95.115
Sep 30 22:06:28 compute-0 sshd-session[258390]: Failed password for invalid user config from 80.94.95.115 port 24652 ssh2
Sep 30 22:06:28 compute-0 sshd-session[258390]: Connection closed by invalid user config 80.94.95.115 port 24652 [preauth]
Sep 30 22:06:28 compute-0 nova_compute[192810]: 2025-09-30 22:06:28.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:06:28 compute-0 nova_compute[192810]: 2025-09-30 22:06:28.835 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:06:28 compute-0 nova_compute[192810]: 2025-09-30 22:06:28.836 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:06:28 compute-0 nova_compute[192810]: 2025-09-30 22:06:28.836 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:06:28 compute-0 nova_compute[192810]: 2025-09-30 22:06:28.836 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 22:06:28 compute-0 nova_compute[192810]: 2025-09-30 22:06:28.961 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 22:06:28 compute-0 nova_compute[192810]: 2025-09-30 22:06:28.962 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5698MB free_disk=73.0942153930664GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 22:06:28 compute-0 nova_compute[192810]: 2025-09-30 22:06:28.962 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:06:28 compute-0 nova_compute[192810]: 2025-09-30 22:06:28.962 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:06:29 compute-0 nova_compute[192810]: 2025-09-30 22:06:29.040 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 22:06:29 compute-0 nova_compute[192810]: 2025-09-30 22:06:29.040 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 22:06:29 compute-0 nova_compute[192810]: 2025-09-30 22:06:29.057 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 22:06:29 compute-0 nova_compute[192810]: 2025-09-30 22:06:29.077 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 22:06:29 compute-0 nova_compute[192810]: 2025-09-30 22:06:29.078 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 22:06:29 compute-0 nova_compute[192810]: 2025-09-30 22:06:29.078 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:06:29 compute-0 nova_compute[192810]: 2025-09-30 22:06:29.157 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:06:29 compute-0 podman[258392]: 2025-09-30 22:06:29.29830581 +0000 UTC m=+0.039986733 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Sep 30 22:06:29 compute-0 podman[258393]: 2025-09-30 22:06:29.304439184 +0000 UTC m=+0.044568338 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., config_id=edpm)
Sep 30 22:06:30 compute-0 nova_compute[192810]: 2025-09-30 22:06:30.078 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:06:30 compute-0 nova_compute[192810]: 2025-09-30 22:06:30.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:06:30 compute-0 nova_compute[192810]: 2025-09-30 22:06:30.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:06:34 compute-0 nova_compute[192810]: 2025-09-30 22:06:34.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:06:35 compute-0 nova_compute[192810]: 2025-09-30 22:06:35.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:06:37 compute-0 podman[258436]: 2025-09-30 22:06:37.304784009 +0000 UTC m=+0.042064465 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Sep 30 22:06:37 compute-0 podman[258435]: 2025-09-30 22:06:37.31122354 +0000 UTC m=+0.051718136 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20250923, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 22:06:37 compute-0 podman[258437]: 2025-09-30 22:06:37.343605122 +0000 UTC m=+0.076321303 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 22:06:37 compute-0 nova_compute[192810]: 2025-09-30 22:06:37.783 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:06:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 22:06:38.770 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:06:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 22:06:38.771 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:06:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 22:06:38.771 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:06:39 compute-0 nova_compute[192810]: 2025-09-30 22:06:39.163 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:06:40 compute-0 nova_compute[192810]: 2025-09-30 22:06:40.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:06:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:06:43.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:06:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:06:43.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:06:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:06:43.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:06:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:06:43.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:06:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:06:43.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:06:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:06:43.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:06:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:06:43.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:06:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:06:43.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:06:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:06:43.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:06:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:06:43.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:06:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:06:43.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:06:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:06:43.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:06:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:06:43.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:06:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:06:43.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:06:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:06:43.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:06:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:06:43.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:06:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:06:43.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:06:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:06:43.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:06:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:06:43.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:06:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:06:43.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:06:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:06:43.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:06:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:06:43.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:06:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:06:43.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:06:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:06:43.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:06:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:06:43.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:06:44 compute-0 nova_compute[192810]: 2025-09-30 22:06:44.166 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:06:45 compute-0 nova_compute[192810]: 2025-09-30 22:06:45.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:06:49 compute-0 nova_compute[192810]: 2025-09-30 22:06:49.168 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:06:49 compute-0 podman[258498]: 2025-09-30 22:06:49.319323004 +0000 UTC m=+0.051054399 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Sep 30 22:06:49 compute-0 podman[258497]: 2025-09-30 22:06:49.343771137 +0000 UTC m=+0.070722093 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Sep 30 22:06:49 compute-0 podman[258496]: 2025-09-30 22:06:49.350361872 +0000 UTC m=+0.089384870 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 22:06:50 compute-0 nova_compute[192810]: 2025-09-30 22:06:50.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:06:54 compute-0 nova_compute[192810]: 2025-09-30 22:06:54.217 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:06:55 compute-0 nova_compute[192810]: 2025-09-30 22:06:55.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:06:59 compute-0 nova_compute[192810]: 2025-09-30 22:06:59.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:07:00 compute-0 podman[258561]: 2025-09-30 22:07:00.314247854 +0000 UTC m=+0.050944477 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_id=edpm, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git)
Sep 30 22:07:00 compute-0 podman[258560]: 2025-09-30 22:07:00.328658725 +0000 UTC m=+0.067809929 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 22:07:00 compute-0 nova_compute[192810]: 2025-09-30 22:07:00.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:07:04 compute-0 nova_compute[192810]: 2025-09-30 22:07:04.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:07:05 compute-0 nova_compute[192810]: 2025-09-30 22:07:05.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:07:08 compute-0 podman[258605]: 2025-09-30 22:07:08.312243441 +0000 UTC m=+0.054423255 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Sep 30 22:07:08 compute-0 podman[258606]: 2025-09-30 22:07:08.318970199 +0000 UTC m=+0.056628319 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.build-date=20250923, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid)
Sep 30 22:07:08 compute-0 podman[258607]: 2025-09-30 22:07:08.339509104 +0000 UTC m=+0.075178374 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 22:07:09 compute-0 nova_compute[192810]: 2025-09-30 22:07:09.222 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:07:10 compute-0 nova_compute[192810]: 2025-09-30 22:07:10.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:07:14 compute-0 nova_compute[192810]: 2025-09-30 22:07:14.224 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:07:15 compute-0 nova_compute[192810]: 2025-09-30 22:07:15.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:07:16 compute-0 nova_compute[192810]: 2025-09-30 22:07:16.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:07:16 compute-0 nova_compute[192810]: 2025-09-30 22:07:16.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:07:16 compute-0 nova_compute[192810]: 2025-09-30 22:07:16.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 22:07:19 compute-0 nova_compute[192810]: 2025-09-30 22:07:19.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:07:20 compute-0 podman[258664]: 2025-09-30 22:07:20.315273597 +0000 UTC m=+0.051793659 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250923, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 22:07:20 compute-0 podman[258665]: 2025-09-30 22:07:20.321673387 +0000 UTC m=+0.055406729 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=edpm, io.buildah.version=1.41.3)
Sep 30 22:07:20 compute-0 podman[258663]: 2025-09-30 22:07:20.339382121 +0000 UTC m=+0.079591825 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 22:07:20 compute-0 nova_compute[192810]: 2025-09-30 22:07:20.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:07:21 compute-0 nova_compute[192810]: 2025-09-30 22:07:21.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:07:24 compute-0 nova_compute[192810]: 2025-09-30 22:07:24.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:07:24 compute-0 nova_compute[192810]: 2025-09-30 22:07:24.786 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:07:25 compute-0 nova_compute[192810]: 2025-09-30 22:07:25.477 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:07:25 compute-0 nova_compute[192810]: 2025-09-30 22:07:25.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:07:25 compute-0 nova_compute[192810]: 2025-09-30 22:07:25.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 22:07:25 compute-0 nova_compute[192810]: 2025-09-30 22:07:25.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 22:07:25 compute-0 nova_compute[192810]: 2025-09-30 22:07:25.807 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 22:07:26 compute-0 nova_compute[192810]: 2025-09-30 22:07:26.801 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:07:29 compute-0 nova_compute[192810]: 2025-09-30 22:07:29.230 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:07:29 compute-0 nova_compute[192810]: 2025-09-30 22:07:29.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:07:29 compute-0 nova_compute[192810]: 2025-09-30 22:07:29.820 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:07:29 compute-0 nova_compute[192810]: 2025-09-30 22:07:29.821 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:07:29 compute-0 nova_compute[192810]: 2025-09-30 22:07:29.821 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:07:29 compute-0 nova_compute[192810]: 2025-09-30 22:07:29.821 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 22:07:29 compute-0 nova_compute[192810]: 2025-09-30 22:07:29.977 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 22:07:29 compute-0 nova_compute[192810]: 2025-09-30 22:07:29.978 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5700MB free_disk=73.0942153930664GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 22:07:29 compute-0 nova_compute[192810]: 2025-09-30 22:07:29.978 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:07:29 compute-0 nova_compute[192810]: 2025-09-30 22:07:29.978 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:07:30 compute-0 nova_compute[192810]: 2025-09-30 22:07:30.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:07:30 compute-0 nova_compute[192810]: 2025-09-30 22:07:30.927 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 22:07:30 compute-0 nova_compute[192810]: 2025-09-30 22:07:30.927 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 22:07:31 compute-0 podman[258725]: 2025-09-30 22:07:31.309708674 +0000 UTC m=+0.052474456 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 22:07:31 compute-0 podman[258726]: 2025-09-30 22:07:31.310250167 +0000 UTC m=+0.051193553 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, version=9.6, maintainer=Red Hat, Inc., vcs-type=git, container_name=openstack_network_exporter, io.openshift.expose-services=, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Sep 30 22:07:31 compute-0 nova_compute[192810]: 2025-09-30 22:07:31.378 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Refreshing inventories for resource provider fe423b93-de5a-41f7-97d1-9622ea46af54 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Sep 30 22:07:31 compute-0 nova_compute[192810]: 2025-09-30 22:07:31.395 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Updating ProviderTree inventory for provider fe423b93-de5a-41f7-97d1-9622ea46af54 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Sep 30 22:07:31 compute-0 nova_compute[192810]: 2025-09-30 22:07:31.395 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Updating inventory in ProviderTree for provider fe423b93-de5a-41f7-97d1-9622ea46af54 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Sep 30 22:07:31 compute-0 nova_compute[192810]: 2025-09-30 22:07:31.415 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Refreshing aggregate associations for resource provider fe423b93-de5a-41f7-97d1-9622ea46af54, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Sep 30 22:07:31 compute-0 nova_compute[192810]: 2025-09-30 22:07:31.432 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Refreshing trait associations for resource provider fe423b93-de5a-41f7-97d1-9622ea46af54, traits: COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_RESCUE_BFV,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_1_2,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SSE42,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SATA _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Sep 30 22:07:31 compute-0 nova_compute[192810]: 2025-09-30 22:07:31.455 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 22:07:31 compute-0 nova_compute[192810]: 2025-09-30 22:07:31.480 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 22:07:31 compute-0 nova_compute[192810]: 2025-09-30 22:07:31.481 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 22:07:31 compute-0 nova_compute[192810]: 2025-09-30 22:07:31.482 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.504s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:07:33 compute-0 nova_compute[192810]: 2025-09-30 22:07:33.482 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:07:33 compute-0 nova_compute[192810]: 2025-09-30 22:07:33.482 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:07:34 compute-0 nova_compute[192810]: 2025-09-30 22:07:34.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:07:35 compute-0 nova_compute[192810]: 2025-09-30 22:07:35.484 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:07:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 22:07:38.772 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:07:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 22:07:38.773 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:07:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 22:07:38.773 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:07:39 compute-0 nova_compute[192810]: 2025-09-30 22:07:39.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:07:39 compute-0 podman[258771]: 2025-09-30 22:07:39.314539512 +0000 UTC m=+0.050032014 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 22:07:39 compute-0 podman[258772]: 2025-09-30 22:07:39.314652505 +0000 UTC m=+0.050145357 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 22:07:39 compute-0 podman[258770]: 2025-09-30 22:07:39.330658236 +0000 UTC m=+0.066825915 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 22:07:40 compute-0 nova_compute[192810]: 2025-09-30 22:07:40.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:07:44 compute-0 nova_compute[192810]: 2025-09-30 22:07:44.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:07:45 compute-0 nova_compute[192810]: 2025-09-30 22:07:45.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:07:49 compute-0 nova_compute[192810]: 2025-09-30 22:07:49.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:07:50 compute-0 nova_compute[192810]: 2025-09-30 22:07:50.491 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:07:51 compute-0 podman[258827]: 2025-09-30 22:07:51.320515132 +0000 UTC m=+0.057571224 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 22:07:51 compute-0 podman[258828]: 2025-09-30 22:07:51.338676357 +0000 UTC m=+0.065016300 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team)
Sep 30 22:07:51 compute-0 podman[258826]: 2025-09-30 22:07:51.376636088 +0000 UTC m=+0.116065929 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 22:07:54 compute-0 nova_compute[192810]: 2025-09-30 22:07:54.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:07:55 compute-0 nova_compute[192810]: 2025-09-30 22:07:55.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:07:59 compute-0 nova_compute[192810]: 2025-09-30 22:07:59.240 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:08:00 compute-0 nova_compute[192810]: 2025-09-30 22:08:00.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:08:02 compute-0 podman[258890]: 2025-09-30 22:08:02.3078589 +0000 UTC m=+0.049535981 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 22:08:02 compute-0 podman[258891]: 2025-09-30 22:08:02.320928178 +0000 UTC m=+0.060221269 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, release=1755695350, config_id=edpm, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal)
Sep 30 22:08:04 compute-0 nova_compute[192810]: 2025-09-30 22:08:04.241 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:08:05 compute-0 nova_compute[192810]: 2025-09-30 22:08:05.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:08:09 compute-0 nova_compute[192810]: 2025-09-30 22:08:09.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:08:10 compute-0 podman[258938]: 2025-09-30 22:08:10.328402252 +0000 UTC m=+0.058356853 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=iscsid, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 22:08:10 compute-0 podman[258937]: 2025-09-30 22:08:10.328468994 +0000 UTC m=+0.061818060 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20250923, maintainer=OpenStack Kubernetes Operator team)
Sep 30 22:08:10 compute-0 podman[258939]: 2025-09-30 22:08:10.329141621 +0000 UTC m=+0.058311332 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 22:08:10 compute-0 nova_compute[192810]: 2025-09-30 22:08:10.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:08:14 compute-0 nova_compute[192810]: 2025-09-30 22:08:14.243 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:08:15 compute-0 nova_compute[192810]: 2025-09-30 22:08:15.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:08:16 compute-0 nova_compute[192810]: 2025-09-30 22:08:16.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:08:18 compute-0 nova_compute[192810]: 2025-09-30 22:08:18.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:08:18 compute-0 nova_compute[192810]: 2025-09-30 22:08:18.787 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 22:08:19 compute-0 nova_compute[192810]: 2025-09-30 22:08:19.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:08:20 compute-0 nova_compute[192810]: 2025-09-30 22:08:20.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:08:21 compute-0 nova_compute[192810]: 2025-09-30 22:08:21.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:08:22 compute-0 podman[259000]: 2025-09-30 22:08:22.309780377 +0000 UTC m=+0.045872900 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 22:08:22 compute-0 podman[259001]: 2025-09-30 22:08:22.317315235 +0000 UTC m=+0.049348337 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 22:08:22 compute-0 podman[258999]: 2025-09-30 22:08:22.337655305 +0000 UTC m=+0.076252091 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Sep 30 22:08:24 compute-0 nova_compute[192810]: 2025-09-30 22:08:24.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:08:25 compute-0 nova_compute[192810]: 2025-09-30 22:08:25.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:08:25 compute-0 nova_compute[192810]: 2025-09-30 22:08:25.788 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:08:25 compute-0 nova_compute[192810]: 2025-09-30 22:08:25.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 22:08:25 compute-0 nova_compute[192810]: 2025-09-30 22:08:25.789 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 22:08:25 compute-0 nova_compute[192810]: 2025-09-30 22:08:25.806 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 22:08:26 compute-0 nova_compute[192810]: 2025-09-30 22:08:26.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:08:27 compute-0 nova_compute[192810]: 2025-09-30 22:08:27.783 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:08:29 compute-0 nova_compute[192810]: 2025-09-30 22:08:29.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:08:30 compute-0 nova_compute[192810]: 2025-09-30 22:08:30.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:08:30 compute-0 nova_compute[192810]: 2025-09-30 22:08:30.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:08:30 compute-0 nova_compute[192810]: 2025-09-30 22:08:30.814 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:08:30 compute-0 nova_compute[192810]: 2025-09-30 22:08:30.815 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:08:30 compute-0 nova_compute[192810]: 2025-09-30 22:08:30.816 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:08:30 compute-0 nova_compute[192810]: 2025-09-30 22:08:30.817 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 22:08:30 compute-0 nova_compute[192810]: 2025-09-30 22:08:30.983 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 22:08:30 compute-0 nova_compute[192810]: 2025-09-30 22:08:30.985 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5691MB free_disk=73.0942153930664GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 22:08:30 compute-0 nova_compute[192810]: 2025-09-30 22:08:30.985 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:08:30 compute-0 nova_compute[192810]: 2025-09-30 22:08:30.986 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:08:31 compute-0 nova_compute[192810]: 2025-09-30 22:08:31.132 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 22:08:31 compute-0 nova_compute[192810]: 2025-09-30 22:08:31.132 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 22:08:31 compute-0 nova_compute[192810]: 2025-09-30 22:08:31.158 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 22:08:31 compute-0 nova_compute[192810]: 2025-09-30 22:08:31.171 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 22:08:31 compute-0 nova_compute[192810]: 2025-09-30 22:08:31.172 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 22:08:31 compute-0 nova_compute[192810]: 2025-09-30 22:08:31.172 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.187s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:08:33 compute-0 podman[259065]: 2025-09-30 22:08:33.319414976 +0000 UTC m=+0.056486147 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., release=1755695350, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, name=ubi9-minimal, config_id=edpm, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Sep 30 22:08:33 compute-0 podman[259064]: 2025-09-30 22:08:33.31959528 +0000 UTC m=+0.058359253 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Sep 30 22:08:34 compute-0 nova_compute[192810]: 2025-09-30 22:08:34.173 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:08:34 compute-0 nova_compute[192810]: 2025-09-30 22:08:34.249 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:08:34 compute-0 nova_compute[192810]: 2025-09-30 22:08:34.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:08:35 compute-0 nova_compute[192810]: 2025-09-30 22:08:35.513 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:08:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 22:08:38.775 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:08:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 22:08:38.775 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:08:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 22:08:38.775 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:08:39 compute-0 nova_compute[192810]: 2025-09-30 22:08:39.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:08:40 compute-0 nova_compute[192810]: 2025-09-30 22:08:40.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:08:41 compute-0 podman[259111]: 2025-09-30 22:08:41.312307515 +0000 UTC m=+0.047006858 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 22:08:41 compute-0 podman[259109]: 2025-09-30 22:08:41.312838858 +0000 UTC m=+0.053552401 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 22:08:41 compute-0 podman[259110]: 2025-09-30 22:08:41.343491895 +0000 UTC m=+0.081664015 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20250923)
Sep 30 22:08:42 compute-0 nova_compute[192810]: 2025-09-30 22:08:42.783 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:08:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:08:43.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:08:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:08:43.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:08:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:08:43.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:08:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:08:43.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:08:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:08:43.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:08:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:08:43.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:08:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:08:43.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:08:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:08:43.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:08:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:08:43.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:08:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:08:43.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:08:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:08:43.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:08:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:08:43.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:08:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:08:43.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:08:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:08:43.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:08:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:08:43.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:08:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:08:43.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:08:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:08:43.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:08:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:08:43.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:08:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:08:43.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:08:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:08:43.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:08:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:08:43.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:08:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:08:43.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:08:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:08:43.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:08:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:08:43.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:08:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:08:43.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:08:44 compute-0 nova_compute[192810]: 2025-09-30 22:08:44.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:08:45 compute-0 nova_compute[192810]: 2025-09-30 22:08:45.518 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:08:49 compute-0 nova_compute[192810]: 2025-09-30 22:08:49.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:08:50 compute-0 nova_compute[192810]: 2025-09-30 22:08:50.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:08:53 compute-0 podman[259171]: 2025-09-30 22:08:53.315493052 +0000 UTC m=+0.050760432 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 22:08:53 compute-0 podman[259172]: 2025-09-30 22:08:53.32338062 +0000 UTC m=+0.054417614 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 22:08:53 compute-0 podman[259170]: 2025-09-30 22:08:53.343376751 +0000 UTC m=+0.081385179 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Sep 30 22:08:54 compute-0 nova_compute[192810]: 2025-09-30 22:08:54.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:08:55 compute-0 nova_compute[192810]: 2025-09-30 22:08:55.523 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:08:59 compute-0 nova_compute[192810]: 2025-09-30 22:08:59.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:09:00 compute-0 nova_compute[192810]: 2025-09-30 22:09:00.526 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:09:02 compute-0 anacron[105679]: Job `cron.weekly' started
Sep 30 22:09:02 compute-0 anacron[105679]: Job `cron.weekly' terminated
Sep 30 22:09:04 compute-0 nova_compute[192810]: 2025-09-30 22:09:04.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:09:04 compute-0 podman[259238]: 2025-09-30 22:09:04.326757021 +0000 UTC m=+0.057027899 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.openshift.expose-services=, maintainer=Red Hat, Inc., name=ubi9-minimal, managed_by=edpm_ansible, architecture=x86_64, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 22:09:04 compute-0 podman[259237]: 2025-09-30 22:09:04.335514461 +0000 UTC m=+0.073864342 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Sep 30 22:09:05 compute-0 nova_compute[192810]: 2025-09-30 22:09:05.529 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:09:09 compute-0 nova_compute[192810]: 2025-09-30 22:09:09.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:09:10 compute-0 nova_compute[192810]: 2025-09-30 22:09:10.532 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:09:12 compute-0 podman[259282]: 2025-09-30 22:09:12.331580289 +0000 UTC m=+0.056868166 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 22:09:12 compute-0 podman[259281]: 2025-09-30 22:09:12.332966334 +0000 UTC m=+0.060472746 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 22:09:12 compute-0 podman[259280]: 2025-09-30 22:09:12.348442472 +0000 UTC m=+0.081027871 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 22:09:14 compute-0 nova_compute[192810]: 2025-09-30 22:09:14.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:09:15 compute-0 nova_compute[192810]: 2025-09-30 22:09:15.535 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:09:17 compute-0 nova_compute[192810]: 2025-09-30 22:09:17.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:09:19 compute-0 nova_compute[192810]: 2025-09-30 22:09:19.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:09:20 compute-0 nova_compute[192810]: 2025-09-30 22:09:20.536 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:09:20 compute-0 nova_compute[192810]: 2025-09-30 22:09:20.786 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:09:20 compute-0 nova_compute[192810]: 2025-09-30 22:09:20.787 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 22:09:21 compute-0 nova_compute[192810]: 2025-09-30 22:09:21.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:09:24 compute-0 nova_compute[192810]: 2025-09-30 22:09:24.261 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:09:24 compute-0 podman[259339]: 2025-09-30 22:09:24.304025728 +0000 UTC m=+0.043674215 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 22:09:24 compute-0 podman[259338]: 2025-09-30 22:09:24.336632255 +0000 UTC m=+0.078417525 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 22:09:24 compute-0 podman[259340]: 2025-09-30 22:09:24.345661342 +0000 UTC m=+0.070198930 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 22:09:25 compute-0 nova_compute[192810]: 2025-09-30 22:09:25.538 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:09:26 compute-0 nova_compute[192810]: 2025-09-30 22:09:26.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:09:27 compute-0 nova_compute[192810]: 2025-09-30 22:09:27.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:09:27 compute-0 nova_compute[192810]: 2025-09-30 22:09:27.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 22:09:27 compute-0 nova_compute[192810]: 2025-09-30 22:09:27.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 22:09:27 compute-0 nova_compute[192810]: 2025-09-30 22:09:27.805 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 22:09:27 compute-0 nova_compute[192810]: 2025-09-30 22:09:27.806 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_shelved_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:09:29 compute-0 nova_compute[192810]: 2025-09-30 22:09:29.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:09:29 compute-0 nova_compute[192810]: 2025-09-30 22:09:29.802 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:09:30 compute-0 nova_compute[192810]: 2025-09-30 22:09:30.540 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:09:31 compute-0 nova_compute[192810]: 2025-09-30 22:09:31.786 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:09:31 compute-0 nova_compute[192810]: 2025-09-30 22:09:31.818 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:09:31 compute-0 nova_compute[192810]: 2025-09-30 22:09:31.818 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:09:31 compute-0 nova_compute[192810]: 2025-09-30 22:09:31.819 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:09:31 compute-0 nova_compute[192810]: 2025-09-30 22:09:31.819 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 22:09:31 compute-0 nova_compute[192810]: 2025-09-30 22:09:31.943 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 22:09:31 compute-0 nova_compute[192810]: 2025-09-30 22:09:31.944 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5715MB free_disk=73.09465789794922GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 22:09:31 compute-0 nova_compute[192810]: 2025-09-30 22:09:31.944 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:09:31 compute-0 nova_compute[192810]: 2025-09-30 22:09:31.945 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:09:31 compute-0 nova_compute[192810]: 2025-09-30 22:09:31.997 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 22:09:31 compute-0 nova_compute[192810]: 2025-09-30 22:09:31.997 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 22:09:32 compute-0 nova_compute[192810]: 2025-09-30 22:09:32.109 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 22:09:32 compute-0 nova_compute[192810]: 2025-09-30 22:09:32.121 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 22:09:32 compute-0 nova_compute[192810]: 2025-09-30 22:09:32.123 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 22:09:32 compute-0 nova_compute[192810]: 2025-09-30 22:09:32.123 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.178s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:09:34 compute-0 nova_compute[192810]: 2025-09-30 22:09:34.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:09:35 compute-0 nova_compute[192810]: 2025-09-30 22:09:35.124 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:09:35 compute-0 podman[259397]: 2025-09-30 22:09:35.304996269 +0000 UTC m=+0.047707326 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Sep 30 22:09:35 compute-0 podman[259398]: 2025-09-30 22:09:35.310744083 +0000 UTC m=+0.050480366 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-type=git, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 22:09:35 compute-0 nova_compute[192810]: 2025-09-30 22:09:35.542 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:09:36 compute-0 nova_compute[192810]: 2025-09-30 22:09:36.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:09:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 22:09:38.776 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:09:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 22:09:38.776 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:09:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 22:09:38.776 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:09:39 compute-0 nova_compute[192810]: 2025-09-30 22:09:39.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:09:40 compute-0 nova_compute[192810]: 2025-09-30 22:09:40.545 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:09:43 compute-0 podman[259444]: 2025-09-30 22:09:43.320405532 +0000 UTC m=+0.049815049 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 22:09:43 compute-0 podman[259443]: 2025-09-30 22:09:43.325371996 +0000 UTC m=+0.057449580 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=iscsid, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c)
Sep 30 22:09:43 compute-0 podman[259442]: 2025-09-30 22:09:43.325630473 +0000 UTC m=+0.059831950 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd)
Sep 30 22:09:44 compute-0 nova_compute[192810]: 2025-09-30 22:09:44.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:09:45 compute-0 nova_compute[192810]: 2025-09-30 22:09:45.437 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:09:45 compute-0 nova_compute[192810]: 2025-09-30 22:09:45.547 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:09:49 compute-0 nova_compute[192810]: 2025-09-30 22:09:49.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:09:50 compute-0 nova_compute[192810]: 2025-09-30 22:09:50.581 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:09:54 compute-0 nova_compute[192810]: 2025-09-30 22:09:54.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:09:55 compute-0 podman[259503]: 2025-09-30 22:09:55.322511666 +0000 UTC m=+0.052880186 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Sep 30 22:09:55 compute-0 podman[259502]: 2025-09-30 22:09:55.336027474 +0000 UTC m=+0.059405179 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Sep 30 22:09:55 compute-0 podman[259501]: 2025-09-30 22:09:55.354479287 +0000 UTC m=+0.090366145 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible)
Sep 30 22:09:55 compute-0 nova_compute[192810]: 2025-09-30 22:09:55.583 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:09:59 compute-0 nova_compute[192810]: 2025-09-30 22:09:59.275 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:10:00 compute-0 nova_compute[192810]: 2025-09-30 22:10:00.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:10:04 compute-0 nova_compute[192810]: 2025-09-30 22:10:04.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:10:05 compute-0 nova_compute[192810]: 2025-09-30 22:10:05.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:10:06 compute-0 podman[259564]: 2025-09-30 22:10:06.333468578 +0000 UTC m=+0.067033351 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 22:10:06 compute-0 podman[259565]: 2025-09-30 22:10:06.342507414 +0000 UTC m=+0.073495422 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, release=1755695350, io.openshift.expose-services=, container_name=openstack_network_exporter, name=ubi9-minimal, vcs-type=git)
Sep 30 22:10:09 compute-0 nova_compute[192810]: 2025-09-30 22:10:09.277 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:10:10 compute-0 nova_compute[192810]: 2025-09-30 22:10:10.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:10:11 compute-0 sshd[128205]: Timeout before authentication for connection from 113.240.110.90 to 38.102.83.69, pid = 258935
Sep 30 22:10:14 compute-0 nova_compute[192810]: 2025-09-30 22:10:14.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:10:14 compute-0 podman[259609]: 2025-09-30 22:10:14.313588077 +0000 UTC m=+0.048948217 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 22:10:14 compute-0 podman[259608]: 2025-09-30 22:10:14.32211021 +0000 UTC m=+0.060012834 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, maintainer=OpenStack Kubernetes Operator team)
Sep 30 22:10:14 compute-0 podman[259607]: 2025-09-30 22:10:14.341149387 +0000 UTC m=+0.083094812 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=36bccb96575468ec919301205d8daa2c, managed_by=edpm_ansible, org.label-schema.build-date=20250923, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=multipathd)
Sep 30 22:10:15 compute-0 nova_compute[192810]: 2025-09-30 22:10:15.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:10:17 compute-0 nova_compute[192810]: 2025-09-30 22:10:17.419 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:10:18 compute-0 nova_compute[192810]: 2025-09-30 22:10:18.819 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:10:19 compute-0 nova_compute[192810]: 2025-09-30 22:10:19.281 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:10:20 compute-0 nova_compute[192810]: 2025-09-30 22:10:20.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:10:21 compute-0 nova_compute[192810]: 2025-09-30 22:10:21.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:10:22 compute-0 nova_compute[192810]: 2025-09-30 22:10:22.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:10:22 compute-0 nova_compute[192810]: 2025-09-30 22:10:22.787 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Sep 30 22:10:24 compute-0 nova_compute[192810]: 2025-09-30 22:10:24.282 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:10:25 compute-0 nova_compute[192810]: 2025-09-30 22:10:25.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:10:25 compute-0 nova_compute[192810]: 2025-09-30 22:10:25.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:10:25 compute-0 nova_compute[192810]: 2025-09-30 22:10:25.787 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Sep 30 22:10:26 compute-0 podman[259668]: 2025-09-30 22:10:26.343488077 +0000 UTC m=+0.085768650 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 22:10:26 compute-0 podman[259669]: 2025-09-30 22:10:26.349943698 +0000 UTC m=+0.084601690 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Sep 30 22:10:26 compute-0 podman[259670]: 2025-09-30 22:10:26.351955559 +0000 UTC m=+0.087679258 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, 
tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Sep 30 22:10:28 compute-0 nova_compute[192810]: 2025-09-30 22:10:28.801 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:10:28 compute-0 nova_compute[192810]: 2025-09-30 22:10:28.802 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Sep 30 22:10:28 compute-0 nova_compute[192810]: 2025-09-30 22:10:28.802 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Sep 30 22:10:28 compute-0 nova_compute[192810]: 2025-09-30 22:10:28.823 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Sep 30 22:10:28 compute-0 nova_compute[192810]: 2025-09-30 22:10:28.824 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:10:29 compute-0 nova_compute[192810]: 2025-09-30 22:10:29.285 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:10:29 compute-0 nova_compute[192810]: 2025-09-30 22:10:29.805 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:10:30 compute-0 nova_compute[192810]: 2025-09-30 22:10:30.600 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:10:31 compute-0 nova_compute[192810]: 2025-09-30 22:10:31.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:10:31 compute-0 nova_compute[192810]: 2025-09-30 22:10:31.821 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:10:31 compute-0 nova_compute[192810]: 2025-09-30 22:10:31.821 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:10:31 compute-0 nova_compute[192810]: 2025-09-30 22:10:31.822 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:10:31 compute-0 nova_compute[192810]: 2025-09-30 22:10:31.822 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Sep 30 22:10:31 compute-0 nova_compute[192810]: 2025-09-30 22:10:31.986 2 WARNING nova.virt.libvirt.driver [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 22:10:31 compute-0 nova_compute[192810]: 2025-09-30 22:10:31.987 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5709MB free_disk=73.0948486328125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Sep 30 22:10:31 compute-0 nova_compute[192810]: 2025-09-30 22:10:31.988 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:10:31 compute-0 nova_compute[192810]: 2025-09-30 22:10:31.988 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:10:32 compute-0 nova_compute[192810]: 2025-09-30 22:10:32.045 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Sep 30 22:10:32 compute-0 nova_compute[192810]: 2025-09-30 22:10:32.046 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Sep 30 22:10:32 compute-0 nova_compute[192810]: 2025-09-30 22:10:32.072 2 DEBUG nova.compute.provider_tree [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed in ProviderTree for provider: fe423b93-de5a-41f7-97d1-9622ea46af54 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Sep 30 22:10:32 compute-0 nova_compute[192810]: 2025-09-30 22:10:32.089 2 DEBUG nova.scheduler.client.report [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Inventory has not changed for provider fe423b93-de5a-41f7-97d1-9622ea46af54 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Sep 30 22:10:32 compute-0 nova_compute[192810]: 2025-09-30 22:10:32.091 2 DEBUG nova.compute.resource_tracker [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Sep 30 22:10:32 compute-0 nova_compute[192810]: 2025-09-30 22:10:32.091 2 DEBUG oslo_concurrency.lockutils [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:10:32 compute-0 sshd[128205]: drop connection #0 from [113.240.110.90]:37064 on [38.102.83.69]:22 penalty: exceeded LoginGraceTime
Sep 30 22:10:33 compute-0 nova_compute[192810]: 2025-09-30 22:10:33.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:10:33 compute-0 nova_compute[192810]: 2025-09-30 22:10:33.788 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Sep 30 22:10:33 compute-0 nova_compute[192810]: 2025-09-30 22:10:33.817 2 DEBUG nova.compute.manager [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Sep 30 22:10:34 compute-0 nova_compute[192810]: 2025-09-30 22:10:34.288 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:10:35 compute-0 nova_compute[192810]: 2025-09-30 22:10:35.602 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:10:36 compute-0 nova_compute[192810]: 2025-09-30 22:10:36.817 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:10:37 compute-0 podman[259728]: 2025-09-30 22:10:37.313425691 +0000 UTC m=+0.048780334 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Sep 30 22:10:37 compute-0 podman[259729]: 2025-09-30 22:10:37.319291437 +0000 UTC m=+0.052335142 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., release=1755695350, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, version=9.6, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.buildah.version=1.33.7, vcs-type=git)
Sep 30 22:10:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 22:10:38.777 103867 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Sep 30 22:10:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 22:10:38.777 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Sep 30 22:10:38 compute-0 ovn_metadata_agent[103862]: 2025-09-30 22:10:38.777 103867 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Sep 30 22:10:38 compute-0 nova_compute[192810]: 2025-09-30 22:10:38.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:10:39 compute-0 nova_compute[192810]: 2025-09-30 22:10:39.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:10:40 compute-0 nova_compute[192810]: 2025-09-30 22:10:40.605 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:10:42 compute-0 nova_compute[192810]: 2025-09-30 22:10:42.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:10:43.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:10:43.916 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:10:43.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:10:43.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:10:43.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:10:43.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:10:43.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:10:43.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:10:43.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:10:43.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:10:43.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:10:43.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:10:43.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:10:43.917 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:10:43.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:10:43.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:10:43.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:10:43.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:10:43.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:10:43.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:10:43.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:10:43.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:10:43.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:10:43.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:10:43 compute-0 ceilometer_agent_compute[203619]: 2025-09-30 22:10:43.918 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Sep 30 22:10:44 compute-0 nova_compute[192810]: 2025-09-30 22:10:44.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:10:45 compute-0 podman[259771]: 2025-09-30 22:10:45.307337766 +0000 UTC m=+0.048808664 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Sep 30 22:10:45 compute-0 podman[259773]: 2025-09-30 22:10:45.311673364 +0000 UTC m=+0.045514521 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 22:10:45 compute-0 podman[259772]: 2025-09-30 22:10:45.33743591 +0000 UTC m=+0.075432161 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0)
Sep 30 22:10:45 compute-0 nova_compute[192810]: 2025-09-30 22:10:45.607 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:10:45 compute-0 nova_compute[192810]: 2025-09-30 22:10:45.799 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Sep 30 22:10:49 compute-0 nova_compute[192810]: 2025-09-30 22:10:49.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:10:50 compute-0 nova_compute[192810]: 2025-09-30 22:10:50.610 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:10:54 compute-0 nova_compute[192810]: 2025-09-30 22:10:54.296 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:10:55 compute-0 nova_compute[192810]: 2025-09-30 22:10:55.613 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:10:57 compute-0 podman[259836]: 2025-09-30 22:10:57.34102109 +0000 UTC m=+0.065611715 container health_status f859efe91c592c6841805117e134e0beb979e69c31622637c3a15d5be97cf110 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=36bccb96575468ec919301205d8daa2c, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 22:10:57 compute-0 podman[259835]: 2025-09-30 22:10:57.35417402 +0000 UTC m=+0.082001516 container health_status b178f15cf6c94a5e4a66a1bf611f06d08440bfd57f8358f4ed2bc5f5a1de6c84 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '509889af80292464c80ed237d31a2912cfbfae7a7fc6a98e3d5e94fcf1d4e04a'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 22:10:57 compute-0 podman[259834]: 2025-09-30 22:10:57.41128311 +0000 UTC m=+0.136028279 container health_status aa7541fef13a09b66063e5c74d699303fc7f8a144d20699f78a2bd075f242660 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.license=GPLv2, tcib_build_tag=36bccb96575468ec919301205d8daa2c, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Sep 30 22:10:59 compute-0 nova_compute[192810]: 2025-09-30 22:10:59.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:11:00 compute-0 nova_compute[192810]: 2025-09-30 22:11:00.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:11:04 compute-0 nova_compute[192810]: 2025-09-30 22:11:04.299 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:11:05 compute-0 nova_compute[192810]: 2025-09-30 22:11:05.617 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:11:08 compute-0 podman[207069]: time="2025-09-30T22:11:08Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 22:11:08 compute-0 podman[207069]: @ - - [30/Sep/2025:22:11:08 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 25332 "" "Go-http-client/1.1"
Sep 30 22:11:08 compute-0 podman[259896]: 2025-09-30 22:11:08.308099111 +0000 UTC m=+0.046261165 container health_status c0485d69635244e1e2bf6daa610f75e2b3fe2395d21c22a2e9ee176c38fa9cf4 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Sep 30 22:11:08 compute-0 podman[259897]: 2025-09-30 22:11:08.317227449 +0000 UTC m=+0.052374767 container health_status c720b73bf3dc5949557e8a4354f04d50be56053cc1bfab12f00bfa9c2b1ece3c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vcs-type=git, version=9.6, maintainer=Red Hat, Inc., name=ubi9-minimal, distribution-scope=public, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 22:11:09 compute-0 sshd-session[259941]: Accepted publickey for zuul from 192.168.122.10 port 40032 ssh2: ECDSA SHA256:SmCicXXyU0CyMnob1MNtb+B3Td3Ord5lbeuM/VGGA5o
Sep 30 22:11:09 compute-0 systemd-logind[792]: New session 45 of user zuul.
Sep 30 22:11:09 compute-0 systemd[1]: Started Session 45 of User zuul.
Sep 30 22:11:09 compute-0 sshd-session[259941]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 22:11:09 compute-0 nova_compute[192810]: 2025-09-30 22:11:09.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:11:09 compute-0 sudo[259945]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp -p container,openstack_edpm,system,storage,virt'
Sep 30 22:11:09 compute-0 sudo[259945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 22:11:10 compute-0 nova_compute[192810]: 2025-09-30 22:11:10.620 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:11:13 compute-0 ovs-vsctl[260116]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Sep 30 22:11:14 compute-0 nova_compute[192810]: 2025-09-30 22:11:14.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:11:14 compute-0 virtqemud[192233]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Sep 30 22:11:14 compute-0 virtqemud[192233]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Sep 30 22:11:14 compute-0 virtqemud[192233]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Sep 30 22:11:15 compute-0 crontab[260534]: (root) LIST (root)
Sep 30 22:11:15 compute-0 nova_compute[192810]: 2025-09-30 22:11:15.622 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:11:16 compute-0 podman[260609]: 2025-09-30 22:11:16.343066226 +0000 UTC m=+0.067756441 container health_status caaee5aa49dcc02345653070f30955af6b0321bedb00789a18cdcb016f1decf4 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 22:11:16 compute-0 podman[260605]: 2025-09-30 22:11:16.343163178 +0000 UTC m=+0.069525714 container health_status 5bef6b45fe2e651edb9ea94d2c9f47aac4adef45cd81c398adbaf49e8d583581 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250923, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_id=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Sep 30 22:11:16 compute-0 podman[260608]: 2025-09-30 22:11:16.363126076 +0000 UTC m=+0.090346444 container health_status bdb80554fad70f60ceadf4a142f255c4c6bd0d3fa19893cfc461170d254dc672 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=36bccb96575468ec919301205d8daa2c, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250923, org.label-schema.schema-version=1.0, config_id=iscsid, container_name=iscsid, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 22:11:17 compute-0 systemd[1]: Starting Hostname Service...
Sep 30 22:11:17 compute-0 systemd[1]: Started Hostname Service.
Sep 30 22:11:19 compute-0 nova_compute[192810]: 2025-09-30 22:11:19.304 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:11:20 compute-0 nova_compute[192810]: 2025-09-30 22:11:20.625 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Sep 30 22:11:20 compute-0 nova_compute[192810]: 2025-09-30 22:11:20.787 2 DEBUG oslo_service.periodic_task [None req-16241b1e-00d4-4853-9ca9-56481cccddca - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
